I’ve been playing with the largest models I can get running, and have been using LibreWolf or Firefox, but these use several gigabytes of system memory. What options exist that have less overhead? I mostly want to leave as much memory as possible for model training while I’m learning. The obvious solution is Python in a terminal, but I need a hiking trail, not free solo rock climbing.

    • duncesplayed@lemmy.one · 1 year ago

      Absolutely right. I just tried it on the browsers installed on my system, loading this page:

      Firefox: 560 MiB
      Epiphany (GNOME Web): 226 MiB
      elinks: 16 MiB
      lynx: 14 MiB

      Looks like lynx is the winner.

      (Sidenote: This isn’t really a fair fight for Firefox since it’s my daily driver, with extensions installed and a bunch of stuff cached. I’m guessing even a fresh install wouldn’t get below 300MiB, though)
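
      In case anyone wants to reproduce a comparison like this, here’s a minimal Python sketch of one way to do it. It assumes the psutil package (nothing in this thread actually uses it) and that the browsers show up under these process names, which vary between systems:

        # Sum resident memory (RSS) over every process whose name matches a
        # browser, then report the total in MiB.
        import psutil

        BROWSERS = ["firefox", "epiphany", "elinks", "lynx"]

        def rss_mib(name):
            total = 0
            for proc in psutil.process_iter(["name", "memory_info"]):
                try:
                    pname = proc.info["name"]
                    mem = proc.info["memory_info"]
                    if pname and mem and name in pname.lower():
                        total += mem.rss
                except (psutil.NoSuchProcess, psutil.AccessDenied):
                    continue  # process vanished or is off-limits; skip it
            return total / (1024 ** 2)

        for browser in BROWSERS:
            print(f"{browser}: {rss_mib(browser):.0f} MiB")

      Totals from something like this won’t match a browser’s own memory reporting exactly (RSS counts shared pages once per process), but it’s enough for a rough side-by-side.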

  • bamboo@lemm.ee · 1 year ago

    Something WebKit-based, probably. GNOME Web is likely the most accessible of those.

  • Bloody Harry@feddit.de · 1 year ago

    Not what you’re asking for, but how about putting the web browser and the page rendering on a different machine? That way your main machine can focus on the computation.

    Edit: If the pages are super simple, there are “web browsers” that work on the command line and can render simple pages in a very crude way.
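
    To give an idea of how crude that rendering is, here’s a rough Python sketch (standard library only, the URL is just a placeholder) of what such a browser does at its core: fetch the HTML and print only the visible text, with no CSS, no JavaScript and no layout.

      from html.parser import HTMLParser
      from urllib.request import urlopen

      class TextDumper(HTMLParser):
          # Tags whose contents should never be printed.
          SKIP = {"script", "style"}

          def __init__(self):
              super().__init__()
              self._skipping = 0

          def handle_starttag(self, tag, attrs):
              if tag in self.SKIP:
                  self._skipping += 1

          def handle_endtag(self, tag):
              if tag in self.SKIP and self._skipping:
                  self._skipping -= 1

          def handle_data(self, data):
              # Print visible text, one stripped chunk per line.
              if not self._skipping and data.strip():
                  print(data.strip())

      html = urlopen("https://example.com").read().decode("utf-8", errors="replace")
      TextDumper().feed(html)

    Real terminal browsers like lynx and elinks do far more than this (links, forms, tables), but the footprint stays small for the same reason: there is no JavaScript engine or rendering pipeline kept resident.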

  • AggressivelyPassive@feddit.de · 1 year ago

    There’s a reason these browsers use that much memory. Something is actually living in it, and it’s not just overhead. You can’t realistically reduce it by a meaningful amount just by switching to another browser while retaining the same functionality.