• 0 Posts
  • 124 Comments
Joined 1 year ago
Cake day: August 8th, 2023

  • I use it for sim racing sometimes and it’s amazing to feel like I’m in an F1 car or something. At least until I get nauseous after 15 minutes or so. It’s also a bit of a hassle to set up. That being said, maybe it would be cooler if I got into Beat Saber or something.

    Was it overhyped? Maybe. But it’s still a cool technology and I’d be sad to see it fade into nothingness. I don’t see a future where everyone is wearing VR glasses, but it’s still a very neat thing to enjoy every now and then.





  • I was just about to post the same thing. I’ve been using Linux for almost 10 years and never really understood the folder layout in this much detail. My assumption was always that /lib was system-wide and /usr/lib was for stuff installed just for me. That never made sense though, since there is only one /usr and not one per user. But I never really thought about it further; I just let it be.
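
    For what it’s worth, on distros that have done the /usr merge (Arch, Fedora, recent Debian and Ubuntu), /lib isn’t even a separate directory anymore, just a symlink into /usr. A rough way to check on your own machine (plain Python, nothing distro-specific assumed):

```python
from pathlib import Path

# Peek at how /lib relates to /usr/lib on this machine. On merged-/usr
# distros /lib is just a symlink to /usr/lib, so the split is historical,
# not "system-wide vs per-user".
for p in (Path("/lib"), Path("/usr/lib")):
    if p.is_symlink():
        print(f"{p} -> {p.resolve()}")
    else:
        print(f"{p} is a real directory")
```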





  • Sometimes I look at the memes around here and wonder wtf y’all are doing. Like, neither my code nor the code at the place I work is perfect, but I don’t think I’ve ever seen a merge do this. Maybe some of the most diverged merges temporarily had a lot of errors because of some refactoring, but then they were just a few find-and-replaces away from being fixed again. And those were merges where multiple teams had been working on both the original and the fork for years; even then it was usually pretty okay.






  • Machine learning and compression have always been closely tied together: both try to capture the “rules” that describe the data rather than memorizing all of it.

    I remember implementing a paper older than me in our “Information Theory” course at university that treated building a decision tree as compression. Their algorithm measured the cost of sending the decision tree itself plus all the exceptions to it. If a node in the tree increased the overall message size, it was simply pruned. This way they ensured you wouldn’t draw conclusions from very little data and would only keep the big patterns in the data.
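
    From memory, the pruning rule boiled down to something like this. This is a rough sketch with invented helper names and a hardcoded per-node cost, not the paper’s actual encoding:

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    """Tiny binary decision-tree node (structure invented for this sketch)."""
    labels: list            # training labels that reach this node
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def exception_bits(n: int, k: int) -> float:
    """Bits needed to point out which k of the n examples are misclassified."""
    return math.log2(math.comb(n, k)) if n else 0.0

def leaf_errors(labels) -> int:
    """Errors if this subtree is collapsed to a leaf predicting the majority label."""
    if not labels:
        return 0
    majority = max(set(labels), key=labels.count)
    return sum(1 for y in labels if y != majority)

def subtree_cost(node: Node, node_bits: float = 2.0) -> float:
    """Description length of a subtree: bits for the tree plus bits for its exceptions."""
    if node.left is None and node.right is None:
        return node_bits + exception_bits(len(node.labels), leaf_errors(node.labels))
    return node_bits + subtree_cost(node.left, node_bits) + subtree_cost(node.right, node_bits)

def mdl_prune(node: Node, node_bits: float = 2.0) -> Node:
    """Collapse any subtree into a leaf whenever that shortens the total message."""
    if node.left is not None and node.right is not None:
        mdl_prune(node.left, node_bits)
        mdl_prune(node.right, node_bits)
        keep = subtree_cost(node, node_bits)
        cut = node_bits + exception_bits(len(node.labels), leaf_errors(node.labels))
        if cut <= keep:
            node.left = node.right = None
    return node
```

    The per-node cost here is a made-up constant; if I remember right, the actual paper derives it from how the attributes and split choices are encoded, which is where most of the cleverness is.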

    Fundamentally it is just compression; it’s just a far better method of compression than all the models we had before.

    EDIT: The paper I’m talking about is “Inferring decision trees using the minimum description length principle” - J. Ross Quinlan & Ronald L. Rivest


  • I’m on Arch (actually a converted Antergos) and I have an NVIDIA card as well. My first attempt a few months ago was horrible, bricking my system and requiring a bootable USB and a whole evening to get Linux working again.

    My second attempt was more recent and went a lot better. X11 no longer seems to work, so I’m kinda stuck with Wayland, but it feels snappy as long as my second monitor is disconnected. I’ve yet to try some gaming. My main monitor is a VRR 144Hz panel with garbage-tier HDR. The HDR worked out of the box on KDE Plasma, with the same shitty quality as on Windows, so I immediately turned it off again.

    When my second monitor is connected I get terrible hitching: every second or so the screen just freezes for hundreds of milliseconds. Something about it (1280x1024, 75Hz, DVI) must not make Wayland happy. No settings seem to change anything; only physically disconnecting the monitor helps.