If you’re having to type out version numbers in your commands, something is broken.
I ended up having to roll my own shell script wrapper to bring some sanity to Python.
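The wrapper can be tiny. A minimal sketch of the idea, assuming the common convention of a project-local venv in `./.venv` (the function name and layout are my assumptions, not a standard):

```shell
#!/bin/sh
# Sketch only: prefer a project-local interpreter if one exists,
# otherwise fall back to the system python3. The ./.venv convention
# and the function name are my assumptions.
pick_python() {
    if [ -x ./.venv/bin/python ]; then
        echo ./.venv/bin/python    # project-local interpreter, if present
    else
        echo python3               # otherwise whatever the system provides
    fi
}

# Usage: "$(pick_python)" script.py
```

The point is that version resolution happens once, in the wrapper, instead of in every command you type.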
Yes yes, I know all that. The fact remains that a permanent IP associated with an individual is personally identifying information. Even the variety in browser requests counts as such according to the GDPR, and that is usually pooled with lots of other users. This is clearly a level above that. It’s why, for example, I would not use the VPS for proxy web browsing: zero privacy.
In an ideal world, creators would accept that only 10% of their viewers would contribute to them monetarily
Agreed.
(through patreon or donations)
and then you lapse into using “patreon” as if it’s a generic noun!
Not your intention I know, but this kind of corporate capture of minds has to end somehow.
What’s the downside you see from having a static IP address?
What’s the downside to having one’s phone number in the public directory? There’s no security risk and yet plenty of people opt out. It’s personally identifying information.
I don’t know if any companies provide reverse proxies without a CDN though.
Exactly.
You still need encryption between your CDN and your origin, ideally using a proper certificate.
It can be self-signed though, that’s what I’m doing and it’s partly to outsource the TLS maintenance. But the main reason I’m doing it is to get IP privacy. WHOIS domain privacy is fine, but to me it seems pretty sub-optimal for a personal site to be publicly associated with even a permanent IP address. A VPS is meant to be private, it’s in the name. This is something that doesn’t get talked about much. I don’t see any way to achieve this without a CDN, unfortunately.
I guess it’s popular because people already use Github and don’t want to look for other services?
Yes, and the general confusion between Git and Github, and between public things and private things. It’s everywhere today. Another example: saying “my Substack” as if blogging was just invented by this private company. So it’s worse than just laziness IMO. It’s a reflexive trusting of the private over the public.
I have some static sites that I just rsync to my VPS and serve using Nginx. That’s definitely a good option.
Agree. And hard to get security wrong cos no database.
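For anyone curious what the serving side looks like: the whole Nginx config for a static site can be a handful of lines (domain and paths here are placeholders, not from this thread):

```nginx
# Placeholder names; adjust domain and root to taste.
server {
    listen 80;
    server_name example.org;
    root /var/www/example.org;
    index index.html;
}
```

Deploying is then just something like `rsync -az --delete public/ you@vps:/var/www/example.org/`.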
If you want to make it faster by using a CDN and don’t want it to be too hard to set up, you’re going to have to use a CDN service.
Yes but this can just be a drop-in frontend for the VPS. Point the domain to Cloudflare and tell only Cloudflare where to find the site. This provides IP privacy and also TLS without having to deal with LetsEncrypt. It’s not ideal because… Cloudflare… but at least you’re using standard web tools. To ditch Cloudflare you just unplug them at the domain and you still have a website.
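Concretely, the self-signed origin cert is a one-liner (filenames and CN are illustrative):

```shell
# Illustrative names: a ten-year self-signed cert for the origin server.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout origin.key -out origin.crt \
  -days 3650 -subj "/CN=example.org"
```

Nginx then points `ssl_certificate` / `ssl_certificate_key` at these files, and the CDN is set to accept a self-signed origin (in Cloudflare's terms that's the "Full" TLS mode, as opposed to "Full (strict)", which wants a CA-signed cert).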
Perhaps it’s irrational, but I’m bothered by how many people seem to think that Github Pages is the only way to host a static website. I know that’s not your case.
This is a bit fuzzy. You seem to recommend a VPS but then suggest a bunch of page-hosting platforms.
If someone is using a static site generator, then they’re already running a web server, even if it’s on localhost. The friction of moving the web server to the VPS is basically zero, and that way they’re not worsening the web’s corporate centralization problem.
I host my sites on a VPS. Better internet connection and uptime, and you can get pretty good VPSes for less than $40/year.
I preferred this advice.
This whole subject is such a chestnut here. No messaging option is perfect, you will need to compromise. If a perfect option existed you would have heard of it already. And if you haven’t heard of it, then by definition it must be small with few users and even fewer maintainers to keep an eye on its codebase and security, which is risky in itself.
Cynicism is a self-fulfilling prophecy.
Can recommend Hetzner (German IP). Good value and so far solid.
Before that I used OVH (French IP) for years but it ended badly. First they locked me out of my account over a 2FA requirement I had never asked for or been told about, and would not provide any recourse except sending them a literal signed paper letter, which I had to do twice because they ignored the first one. A nightmare that went on for weeks. And then, cherry on the cake, my VPS literally went up in smoke when their Strasbourg data center burned down! Oops! Looks like your VPS is gone, sorry about that, here’s a voucher for six months of free hosting! Months later they discovered a backup, but the damage was done. Never again.
i dont want to learn 400 obscure keystrokes among other nonsense. we dont need to hear about your text editing stockholm-syndrome.
This reads like projected insecurity. Or maybe even… jealousy.
This doesn’t really compute. Society would collapse if nobody trusted “third parties”, and your second phrase is just hyperbole.
It’s more complex than that. The issue is money, and incentives, and how power is structured. A third party that you are paying, or whose income is uncoupled from the profit motive, and preferably one that has both private and institutional stakeholders: if we choose not to trust them, then basically we can’t trust anyone for anything. That would be a dark place to be.
always find myself needing to fire up a window manager just to get a browser eventually
A chromeless tiling WM is basically invisible and AFAIK has almost zero performance impact. That’s roughly what I do.
Launching it using the raw framebuffer means it blocks the screen until you close it, and there’s no way to do anything else except switch to another TTY, is that it?
Fair point about raw speed. I never found the keyboard-vs-mouse speed debate very interesting either.
But cognitive load is a double-edged sword. Sure, the first time you attempt a task, the abstraction of a GUI is really helpful. There’s nothing to remember, you just point and click around and eventually the task is done. But when you have a task with 7 steps which you have to do every 2 weeks, then the GUI becomes a PITA in my experience. GUIs are all but impossible to script, and so you’re gonna need a good memory if you want to get it done quickly and accurately. This is where CLI scripting becomes genuinely useful. Personally I have quite a few such tasks.
Great! I’d have guessed that going full framebuffer would be trickier than that. You’ve laid down a new challenge for me.
How much can you actually do without a windowing environment? […] Opening images in fbi, PDFs in fbpdf, listening to music in cmus, watching movies in mplayer
Maybe not an “environment” but it sounds like you’re at least using a window manager. The PDFs and videos, not to mention web browser, are gonna be hard to pull off from a raw shell. [Hard but not that hard, apparently!]
But that’s a detail. Otherwise I share your enthusiasm, I’ve been doing things this way for a while. Basically: tiling window manager + TUI file manager + scripts which do precisely what I want, if possible in the terminal, if necessary by launching a GUI app. In practice the GUI apps are Firefox, mapping app, and messaging apps.
The general discovery I made was this: for the small price of forgoing pretty colors and buttons and chrome, you can get a computer to do exactly what you want it to do much quicker. Assuming a willingness to learn a bit of shell scripting, of course.
For example: I have a button which runs a script with getmail that pulls in my email, then deploys ripmime and weasyprint to convert it to datestamped PDF files, which it dumps with any attachments directly into an inbox folder. In other words, I have made ranger into my email client and I never need to “download” anything, it’s already there.
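For the curious, the shape of that pipeline is roughly this. The directory layout, the naming scheme and the mail-to-HTML step (`msg2html` is a stand-in, not a real tool) are my guesses, not the actual script:

```shell
#!/bin/sh
# Rough sketch of a getmail -> ripmime -> weasyprint pipeline.
# Layout, naming and msg2html are assumptions, not the author's script.
set -eu
INBOX="${INBOX:-$HOME/inbox}"

# Build the datestamped PDF name, e.g. "2024-05-01-invoice.pdf".
stamped_name() { printf '%s-%s.pdf' "$1" "$2"; }   # $1=YYYY-MM-DD, $2=slug

run_pipeline() {
    mkdir -p "$INBOX"
    getmail                          # 1. pull new mail (reads ~/.getmail/getmailrc)
    for msg in "$HOME"/Maildir/new/*; do
        [ -e "$msg" ] || continue
        ripmime -i "$msg" -d "$INBOX"        # 2. extract any attachments
        # 3. render the body to PDF; msg2html is a stand-in for whatever
        #    converts the raw message to HTML before weasyprint sees it.
        msg2html "$msg" > /tmp/body.html
        weasyprint /tmp/body.html "$INBOX/$(stamped_name "$(date +%F)" mail)"
    done
}
```

From there the inbox folder is just files, which is what makes a file manager like ranger usable as the “client”.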
And those PDFs I can then manipulate with a bunch of shell scripts that use standard utilities, i.e. to split them, merge them, shrink them, clean them of metadata, even make them look like they come from photocopied paper (dumb bank!). All the stupid shit I once did with 10 manipulations hunting through menus with a pointer in a fiddly app, always forgetting how it was done. Now I just select the file in the terminal, hit a button and it’s done, I don’t even see the PDF.
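Those manipulations map onto common CLI tools. A sketch of the kind of wrappers meant here, using Ghostscript, poppler-utils and exiftool (the function names are mine):

```shell
# Sketch only: thin wrappers over standard PDF tools.
# Requires ghostscript, poppler-utils and exiftool to actually run.
shrink_pdf() { gs -q -sDEVICE=pdfwrite -dPDFSETTINGS=/ebook -o "$2" "$1"; }
merge_pdfs() { pdfunite "$@"; }                    # last argument is the output
split_pdf()  { pdfseparate "$1" "${1%.pdf}-%d.pdf"; }  # one file per page
strip_meta() { exiftool -all= -overwrite_original "$1"; }
```

Bind each one to a key in your file manager and the “10 manipulations” become one keystroke.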
Of course, it’s not for everyone, but this is the promise of free computing.
While I appreciate the amount of development those companies bring to the table, the moment they’re in control of the project they’ll try to find ways to profit from it at the expense of the community, and it almost always results in a poorer product.
Yes, hard to argue with this. Or indeed anything else you just said. I agree that for any project it’s crucial that there be a wide variety of stakeholders.
Believe it or not, I’m being gradually won over by the arguments deployed in this discussion! Incredible but true.
The issue is more general. When dealing with, say, apt, my experience is that nothing ever breaks and any false move is immediately recoverable. When dealing with Python, even seemingly trivial tasks inevitably turn into a broken mess of cryptic error messages and missing dependencies which takes hours of research to resolve. It’s a common complaint. The architecture seems fragile in some way. Of course, it’s possible it’s just because I am dumb and ignorant.