• 1 Post
  • 217 Comments
Joined 1 year ago
Cake day: June 16th, 2023

  • The issue is more general. When dealing with, say, apt, my experience is that nothing ever breaks and any false move is immediately recoverable. When dealing with Python, even seemingly trivial tasks inevitably turn into a broken mess of cryptic error messages and missing dependencies which requires hours of research to resolve. It’s a general complaint. The architecture seems fragile in some way. Of course, it’s possible it’s just because I am dumb and ignorant.






  • You still need encryption between your CDN and your origin, ideally using a proper certificate.

    It can be self-signed, though; that’s what I’m doing, partly to outsource the TLS maintenance. But the main reason is IP privacy. WHOIS domain privacy is fine, but it seems pretty sub-optimal for a personal site to be publicly tied to a permanent IP address. A VPS is meant to be private; it’s in the name. This is something that doesn’t get talked about much, and I don’t see any way to achieve it without a CDN, unfortunately.
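
    For the record, a long-lived self-signed origin certificate is a one-liner to generate. This is just a sketch with a placeholder domain and paths, and it assumes the CDN is set to accept self-signed origins (e.g. Cloudflare’s “Full” rather than “Full (strict)” mode):

        # generate a 10-year self-signed cert for the origin (placeholder paths/domain)
        openssl req -x509 -newkey rsa:4096 -sha256 -days 3650 -nodes \
            -keyout /etc/ssl/private/origin.key \
            -out /etc/ssl/certs/origin.crt \
            -subj '/CN=example.com'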

    I guess it’s popular because people already use Github and don’t want to look for other services?

    Yes, and the general confusion between Git and GitHub, and between public things and private things. It’s everywhere today. Another example: saying “my Substack” as if blogging had just been invented by that private company. So it’s worse than just laziness, IMO. It’s a reflexive trusting of the private over the public.


  • I have some static sites that I just rsync to my VPS and serve using Nginx. That’s definitely a good option.

    Agreed. And it’s hard to get security wrong when there’s no database.
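
    For what it’s worth, the whole “deploy” step is a single command along these lines (host and paths invented for the example):

        # push the generated site to the VPS, removing anything stale on the server
        rsync -avz --delete public/ user@vps.example.com:/var/www/mysite/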

    If you want to make it faster by using a CDN and don’t want it to be too hard to set up, you’re going to have to use a CDN service.

    Yes, but this can just be a drop-in frontend for the VPS: point the domain at Cloudflare and tell only Cloudflare where to find the site. That provides IP privacy, and also TLS without having to deal with Let’s Encrypt. It’s not ideal because… Cloudflare… but at least you’re using standard web tools. To ditch Cloudflare you just unplug them at the domain and you still have a website.

    Perhaps it’s irrational, but I’m bothered by how many people seem to think that GitHub Pages is the only way to host a static website. I know that’s not your case.


  • This is a bit fuzzy. You seem to recommend a VPS but then suggest a bunch of page-hosting platforms.

    If someone is using a static site generator, they’re already running a web server, even if only on localhost. The friction of moving it to a VPS is basically zero (see the sketch below), and that way they’re not worsening the web’s corporate centralization problem.

    I host my sites on a VPS. Better internet connection and uptime, and you can get pretty good VPSes for less than $40/year.

    I preferred this advice.
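
    To make “basically zero friction” concrete, here’s a rough sketch of the server side, assuming a Debian-style Nginx layout; the domain and paths are placeholders:

        # minimal Nginx vhost for a static site (Debian-style layout)
        cat > /etc/nginx/sites-available/mysite <<'EOF'
        server {
            listen 80;                   # TLS handled by the CDN or added later
            server_name example.com;     # placeholder domain
            root /var/www/mysite;
            index index.html;
        }
        EOF
        ln -s /etc/nginx/sites-available/mysite /etc/nginx/sites-enabled/
        nginx -t && systemctl reload nginx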




  • Can recommend Hetzner (German IP). Good value and so far solid.

    Before that I used OVH (French IP) for years, but it ended badly. First they locked me out of my account for violating a 2FA requirement I had never asked for or been told about, and the only recourse they offered was a literal signed paper letter, which I had to send twice because they ignored the first one. A nightmare that went on for weeks. And then, the cherry on top: my VPS literally went up in smoke when their Strasbourg data center burned down! Oops! Looks like your VPS is gone, sorry about that, here’s a voucher for six months of free hosting! Months later they discovered a backup, but the damage was done. Never again.






  • Fair point about raw speed. I never found the keyboard-vs-mouse speed debate very interesting either.

    But cognitive load is a double-edged sword. Sure, the first time you attempt a task, the abstraction of a GUI is really helpful: there’s nothing to remember, you just point and click around and eventually the task is done. But when you have a seven-step task that you have to do every two weeks, the GUI becomes a PITA in my experience. GUIs are all but impossible to script, so you’re gonna need a good memory if you want to get it done quickly and accurately. This is where CLI scripting becomes genuinely useful. Personally I have quite a few such tasks.
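
    To illustrate, a made-up example of that kind of task as a script (the URL and paths are invented, not one of mine):

        #!/bin/sh
        # hypothetical biweekly chore: fetch a statement and file it by month
        set -e
        url='https://example.com/statement.csv'      # invented URL
        dest=~/archive/$(date +%Y-%m)
        mkdir -p "$dest"
        curl -fsS "$url" -o "$dest/statement-$(date +%F).csv"
        echo "filed in $dest"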



  • How much can you actually do without a windowing environment? […] Opening images in fbi, PDFs in fbpdf, listening to music in cmus, watching movies in mplayer

    Maybe not an “environment”, but it sounds like you’re at least using a window manager. The PDFs and videos, not to mention a web browser, are gonna be hard to pull off from a raw shell. [Hard but not that hard, apparently!]

    But that’s a detail. Otherwise I share your enthusiasm, I’ve been doing things this way for a while. Basically: tiling window manager + TUI file manager + scripts which do precisely what I want, if possible in the terminal, if necessary by launching a GUI app. In practice the GUI apps are Firefox, mapping app, and messaging apps.

    The general discovery I made was this: for the small price of forgoing pretty colors and buttons and chrome, you can get a computer to do exactly what you want, much quicker. Assuming a willingness to learn a bit of shell scripting, of course.

    For example: I have a button which runs a script with getmail that pulls in my email and then deploys ripmime and weasyprint to convert it to datestamped PDF files, which it dumps with any attachments directly into an inbox folder. In other words, I have made ranger into my email client and I never need to “download” anything, it’s already there.
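
    The script is roughly as follows; consider it a sketch rather than the real thing, since ripmime’s output naming varies by message:

        #!/bin/sh
        # fetch mail, split out MIME parts, render bodies to datestamped PDFs
        set -e
        inbox=~/inbox
        getmail                              # reads ~/.getmail/getmailrc
        for msg in ~/Maildir/new/*; do
            [ -e "$msg" ] || continue
            dir="$inbox/$(date +%F-%H%M%S)"
            mkdir -p "$dir"
            ripmime -i "$msg" -d "$dir"      # body parts + attachments land in $dir
            for html in "$dir"/*.html; do    # output names vary; adjust the glob
                [ -e "$html" ] || continue
                weasyprint "$html" "${html%.html}.pdf"
            done
        done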

    And those PDFs I can then manipulate with a bunch of shell scripts built on standard utilities, e.g. to split them, merge them, shrink them, clean them of metadata, even make them look like they come from photocopied paper (dumb bank!). All the stupid shit I once did with ten manipulations, hunting through menus with a pointer in some fiddly app and always forgetting how it was done. Now I just select the file in the terminal, hit a button, and it’s done; I don’t even see the PDF.
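
    For reference, the underlying operations are all one-liners with standard utilities; the flags here are illustrative rather than my exact scripts:

        qpdf --empty --pages a.pdf b.pdf -- merged.pdf      # merge
        qpdf in.pdf --pages . 1-3 -- excerpt.pdf            # split out pages 1-3
        gs -sDEVICE=pdfwrite -dPDFSETTINGS=/ebook \
           -dNOPAUSE -dBATCH -sOutputFile=small.pdf in.pdf  # shrink
        exiftool -all= -overwrite_original in.pdf           # strip metadata
        magick -density 150 in.pdf -colorspace Gray \
            -attenuate 0.3 +noise Gaussian photocopy.pdf    # fake the photocopier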

    Of course, it’s not for everyone, but this is the promise of free computing.


  • While I appreciate the amount of development those companies bring to the table, the moment they’re in control of the project they’ll try to find ways to profit from it at the expense of the community, and it almost always results in a poorer product.

    Yes, hard to argue with this. Or indeed anything else you just said. I agree that for any project it’s crucial that there be a wide variety of stakeholders.