I can confirm that it does not run (at least not smoothly) on an NVIDIA 4080 12 GB. However, gemma2:27B runs pretty well. Do you think that if we added another graphics card, even a modest one, llama3.1:70B could run?
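As a rough, non-authoritative answer to the GPU question, here is a back-of-the-envelope sketch in Python. The bits-per-weight and overhead figures are assumptions based on typical Q4-style quantization, not exact Ollama numbers; they only illustrate why 70B is a stretch for two consumer cards.

```python
# Back-of-the-envelope VRAM estimate for quantized models served by Ollama.
# bits_per_weight and overhead_gb are rough assumptions (Q4-style quantization
# plus some KV cache/runtime headroom), not exact figures.

def estimate_vram_gb(params_b: float, bits_per_weight: float = 4.5,
                     overhead_gb: float = 2.0) -> float:
    """Rough memory needed for the quantized weights plus cache/overhead."""
    weights_gb = params_b * bits_per_weight / 8  # params_b is in billions
    return weights_gb + overhead_gb

for name, params in [("gemma2:27b", 27), ("llama3.1:70b", 70)]:
    print(f"{name}: ~{estimate_vram_gb(params):.0f} GB needed")

# Approximate output:
#   gemma2:27b: ~17 GB needed
#   llama3.1:70b: ~41 GB needed
```

On those assumptions, a 12 GB card plus a modest second card (8-12 GB) still totals only about 20-24 GB, so Ollama would keep offloading most of the 70B layers to system RAM; it should run, but slowly.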
What are your PC specifications for running llama3.1:70B smoothly in Ollama?
Many assertions without any proof. Could you at least point out the sources for such statements?
Thanks for the suggestion; however, a VPS is not an option for me. Despite its functionality, it is a subscription, and the storage would be too limited.
At least 12 TB. As for the budget… as cheap as possible :)
I was looking into Unraid OS, and it seems nice. Like you, I want to stay as far away from subscriptions as possible.
My concern with HDDs is their speed, noise, and energy consumption.
Thanks for answering.
That is exactly my fear. If I build my private streaming station, I will need a lot of storage (I still do not know exactly how much).
Furthermore, I want to run a Minecraft server for my kid on top of it.
Thanks for answering.
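To put a number on "I do not know how much storage", here is a minimal sketch; the bitrates, library sizes, and Minecraft allowance are placeholder assumptions to swap for your own figures, not recommendations.

```python
# Rough storage estimate for a home media library plus a small game server.
# All numbers below are assumed placeholders; plug in your own library sizes.

GB_PER_HOUR = {            # typical file sizes per hour of video (assumed)
    "1080p_h264": 4.0,
    "1080p_h265": 2.5,
    "4k_h265": 10.0,
}

def library_size_tb(hours: float, profile: str) -> float:
    """Total storage in TB for a given number of hours at one quality profile."""
    return hours * GB_PER_HOUR[profile] / 1000

movies = library_size_tb(hours=300 * 2, profile="1080p_h265")       # ~300 movies
shows = library_size_tb(hours=1500 * 0.75, profile="1080p_h264")    # ~1500 episodes
minecraft_world = 0.05   # a Minecraft world rarely exceeds a few tens of GB

total = movies + shows + minecraft_world
print(f"Estimated library: {total:.1f} TB (before parity/backup overhead)")
```

Whatever the estimate, add headroom for parity or backups; Unraid, for example, reserves one or two whole drives for parity on top of the data drives.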
Maybe Perplexity?
Great job! Thanks for sharing
I have a dynamic IP, and it’s being a pain in the @$$ for me. I simply cannot use my domain to access my home server because of this.
Is your script available on GitHub or similar platforms?
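For the dynamic-IP problem, a small dynamic-DNS updater is usually enough. The sketch below assumes the domain's DNS is hosted on Cloudflare; the token, zone ID, record ID, and hostname are placeholders, and other DNS providers expose similar APIs.

```python
# Minimal dynamic-DNS updater sketch (assumes Cloudflare-hosted DNS).
# API_TOKEN, ZONE_ID, RECORD_ID, and HOSTNAME are placeholders to fill in.
import requests

API_TOKEN = "your-cloudflare-api-token"   # placeholder
ZONE_ID = "your-zone-id"                  # placeholder
RECORD_ID = "your-a-record-id"            # placeholder
HOSTNAME = "home.example.com"             # placeholder

def current_public_ip() -> str:
    """Ask a public echo service for this network's current IPv4 address."""
    return requests.get("https://api.ipify.org", timeout=10).text.strip()

def update_dns_record(ip: str) -> None:
    """Point the A record at the current public IP via Cloudflare's API."""
    url = f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/dns_records/{RECORD_ID}"
    payload = {"type": "A", "name": HOSTNAME, "content": ip, "ttl": 300, "proxied": False}
    resp = requests.put(url, json=payload,
                        headers={"Authorization": f"Bearer {API_TOKEN}"},
                        timeout=10)
    resp.raise_for_status()

if __name__ == "__main__":
    ip = current_public_ip()
    update_dns_record(ip)
    print(f"{HOSTNAME} now points at {ip}")
```

Run from cron or a systemd timer every few minutes, the record follows the IP whenever the ISP changes it; hosted options like DuckDNS or the ddclient tool do essentially the same thing.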
Once you find a good solution, please let me know. It seems the good things are always doomed to disappear.