• 5 Posts
  • 644 Comments
Joined 11 months ago
Cake day: October 4th, 2023


  • tal@lemmy.today to Selfhosted@lemmy.world · Programmatic access to discord

    I get that.

    Honestly, though, I’m still a little puzzled as to why people initially got into Discord; I never did.

    I can understand why people wanted to use some systems. Twitter does massive-scale real-time indexing. That was a huge feature, really changed what one could do on the platform.

    Reddit provided a good syntax (Markdown), had a low barrier to entry (no email verification at a time when requiring that was common), and third-party client access. It solved the spam problem that was killing Usenet and permitted more-reasonable moderation.

    There were a whole host of services that aimed to lower the complexity bar to get a web page and some content online associated with someone’s identity; it was clear that the technical knowledge required to get stuff up was a real limiting factor for many people.

    But I just didn’t really get where Discord provided much of a win over stuff like IRC. I mean, I guess maybe it bundled a couple of services into one, which maybe lowered the bar to use a bit. IRC really seemed pretty fine to me. Reddit bundling image-hosting seems to have lowered the bar and been something that people wanted. Maybe Discord doing images and file-hosting made it more-accessible.

    I have no idea why a number of people who liked Cataclysm: Dark Days Ahead used Discord rather than Reddit; it seemed like a dramatically-worse system if one was aiming to create material for others to look back at and refer to.

    kagis

    https://old.reddit.com/r/RedditForGrownups/comments/t417q1/can_someone_please_explain_discord_to_me_like_im/

    It’s just modern day IRC with video.

    Ahaha, thanks. This is indeed an ELI60 response, although it doesn’t really explain how Discord suddenly got so popular. But if I couple this with /u/Healthy-Car-1860’s response, I’m kind of getting the picture.

    Got popular because it spread through the entire gamer/twitch community like wildfire due to actually being a more complete package and easier to use than anything prior. Online gamers have been struggling with voip software forever (Roger Wilco, Teamspeak, Ventrilo, Skype, and many others).

    Once it was rooted in the people who are on their computers all day every day it was bound to spread because the UX is incredibly easy compared to previous options for both chat and voip.

    Maybe that’s it. I never had a lot of interest in VoIP, especially group VoIP. When I was playing online games much, people used keyboards to communicate, not mics. There was definitely a period where people needed the ability to collaborate in games and games didn’t always provide that functionality. I remember people complaining about Teamspeak and Ventrilo. I briefly poked at Mumble – nice to have an open-source option – but I just had no reason to want to do VoIP with groups of people.

    But I suppose for a video game clan or something, that might be important functionality. And if it’s also a one-stop shop for some other things that you might want to do anyway, it maybe makes sense to just use that rather than multiple services.



  • If I need to do an emergency boot from a USB stick to repair something that can’t boot, which it sounds like is what you’re after, pretty much any Linux distro will do. I’d probably rather have a single, mainstream bootable OS than a handful.

    I’d use Debian, just because that’s what I use normally, so I’m most familiar with it. But it really doesn’t matter all that much.

    And honestly, while having an emergency bootable medium with a functioning system can simplify things, if you’re familiar with the boot process, you very rarely actually need emergency boot media on a Linux system. You have a pretty flexible bootloader in grub, and the Linux kernel can run and be usable enough to fix things on a pretty broken system. Pass something like init=/bin/sh to the kernel (maybe init=/bin/busybox for a really broken system), remount root read-write (mount -o rw,remount /), and know how to force syncs (echo s > /proc/sysrq-trigger) and reboots (echo b > /proc/sysrq-trigger).
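
    To make that concrete, the whole rescue sequence looks roughly like this – a sketch assuming the kernel came up with init=/bin/sh; what you actually fix will obviously vary:

    # At the grub menu, press 'e' on the boot entry, append init=/bin/sh to the
    # line starting with "linux", then boot the edited entry with Ctrl-x or F10.
    mount -o rw,remount /             # root comes up read-only; make it writable
    # ...fix whatever is broken: edit /etc/fstab, restore a config file, etc...
    echo s > /proc/sysrq-trigger      # force a sync of dirty data to disk
    echo b > /proc/sysrq-trigger      # force an immediate reboot (no clean shutdown)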

    I’ve killed ld.so and libc before and brought back systems without alternate boot media. The only time I think you’d likely really get into trouble truly requiring alternate boot media is (a) installing a new kernel that doesn’t work for some reason and removing all the old, working kernels before checking to see that your new one works, or (b) killing grub. Maybe if you hork up your partition table or root filesystem enough that grub can’t bring the kernel up, but in most of those cases, I’m not sure that you’re likely gonna be bringing things back up with rescue tools – you’re probably gonna need to reinstall your OS anyway.

    EDIT: Well, okay, if you wipe the partition table, I guess that you might be able to find the beginning of a filesystem partition based on magic strings or something and either manually reconstruct the partition table or at least extract a copy of the filesystem to somewhere else.


  • Internet Archive creates digital copies of print books and posts those copies on its website where users may access them in full, for free, in a service it calls the “Free Digital Library.” Other than a period in 2020, Internet Archive has maintained a one-to-one owned-to-loaned ratio for its digital books: Initially, it allowed only as many concurrent “checkouts” of a digital book as it has physical copies in its possession. Subsequently, Internet Archive expanded its Free Digital Library to include other libraries, thereby counting the number of physical copies of a book possessed by those libraries toward the total number of digital copies it makes available at any given time.

    This appeal presents the following question: Is it “fair use” for a nonprofit organization to scan copyright-protected print books in their entirety, and distribute those digital copies online, in full, for free, subject to a one-to-one owned-to-loaned ratio between its print copies and the digital copies it makes available at any given time, all without authorization from the copyright-holding publishers or authors? Applying the relevant provisions of the Copyright Act as well as binding Supreme Court and Second Circuit precedent, we conclude the answer is no. We therefore AFFIRM.

    Basically, there isn’t an intrinsic right under US fair use doctrine to take a print book, scan it, and then lend digital copies of the print book.

    My impression, from what little I’ve read in the past on this, is that this was probably going to be the expected outcome.

    And while I haven’t closely-monitored the case, and there are probably precedent issues that are interesting for various parties, my gut reaction is that I kind of wish that archive.org weren’t doing these fights. The problem I have is that they’re basically an indispensable, one-of-a-kind resource for recording the state of webpages at some point in time via their Wayback Machine service. They are pretty widely used as the way to cite a page on the Web.

    What I worry about is that they’re going to get into some huge fight over copyright on some not-directly-related issue, like print books or something, and then someone is going to sue them and get a ton of damages and it’s going to wipe out that other, critical aspect of their operations…like, some random publisher will get ownership of archive.org and all of their data and logs and services and whatnot.


  • tal@lemmy.today to Linux@lemmy.world · Any luck with Snapdragon Elite?

    I’d add that I was initially somewhat interested in ARM hardware, but I’ve cooled a lot on it.

    For me, and I suspect a number of others, power efficiency is the main appeal.

    First, even on Linux, where a lot of software is open-source and some distros have ARM builds, there’s a lot of closed-source software out there, like Steam games, that is x86-only and is never going to get ARM builds, and if you’re emulating x86, your power-efficiency benefits go away.

    Second, the ARM world is more SoC-oriented, so you don’t have the ecosystem of drivers for modular hardware that plays nicely, and a lot of SoC data isn’t available. This is not a minor issue. An ARM system is not just an x86 system with a slightly-different processor. Whole different world.

    Think this sums it up:

    https://www.reddit.com/r/linux/comments/1b0lbva/linux_for_arm/

    With Arm support, it greatly depends on the specific boards as well as its general popularity. It isn’t at all as well standardized like x86 is. Each device takes at least some amount of unique customization, even before you get into the video and other hardware drivers, often not open source. ie Qualcomm.

    An image for a Thinkpad X13s won’t necessarily work or even boot on a Lenovo Flex-5G, for example.

    The good news is that there are growing numbers of people hacking on these to get better support and usability.

    For the X13s, at least. My Flex 5G does not seem to be very popular, lol.

    I think this point is often underappreciated - if ARM is to take off for personal computing, there’s a lot of standardization work yet to be done.

    Third, for a lot of software, what matters is single-thread performance. And x86 is ahead there.

    Fourth, I was recently in a discussion with someone on here and they informed me that the power-efficiency gap has narrowed (at least on Apple’s ARMs, dunno about all ARM systems) since Apple’s M1 release, when it was more-significant. I haven’t looked into that, but given that that was the major selling point, it also gave me pause.

    EDIT: All that being said, I am totally onboard with wanting laptops with long battery life, and the state of things in 2024 is a favorite pet peeve. I would very much like to have a laptop with beefy batteries in the vein of some of the older Thinkpads. I had a T460 or something with two batteries, one of them removable, where one could get a larger battery that just hung out the back. That was fantastic. Getting a laptop with a single fixed 100Wh battery is very difficult these days, and pretty much nonexistent outside of power-hungry gaming systems that will burn through even the larger battery in short order. Getting a multi-battery system that can be expanded (even past 100Wh) is pretty much only the domain of a few very expensive “ruggedized” laptops like the Panasonic ToughBook.

    I’m not sure whether that’s because the typical consumer:

    • Cares way more about weight than I do.

    • Is way more price-sensitive than I am. 100Wh batteries aren’t that expensive, not in 2024. Here’s a device with a 146Wh battery and an inverter and charging hardware and a case on Amazon for $76.

    • Does not care about battery life, like, uses their laptop never far from a plug.

    • Just doesn’t consider battery life when buying a laptop.

    • Is willing to live with swapping USB PD power stations in. The problem here is that while there is actually a USB device class for power sources that permits a battery bank to report remaining capacity – I checked this in the USB spec the other day, though I don’t know whether Linux can use this and map it to a /sys/class/power_supply device the way it can ACPI batteries (see the example after this list) – I haven’t been able to find anyone who actually implements this on their power station and advertises it. No power stations that I own implement it. And I want things like my laptop telling me remaining time to keep working, which they cannot, absent that information.
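
    For reference, this is roughly what the existing ACPI battery interface already exposes under /sys/class/power_supply on a typical laptop – the BAT0 name and the exact attribute set vary by machine, so treat this as illustrative:

    cat /sys/class/power_supply/BAT0/capacity      # percent remaining
    cat /sys/class/power_supply/BAT0/status        # Charging / Discharging / Full
    cat /sys/class/power_supply/BAT0/energy_now    # µWh remaining (some batteries expose charge_now instead)
    cat /sys/class/power_supply/BAT0/energy_full   # µWh at full charge

    A USB PD bank that reported its state would presumably need to show up as another entry in that same directory for the usual battery tooling to pick it up.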


  • I do not, but I read an article within the last year or so saying that it’s doable on a few models – this one dealt with the Thinkpad you’re talking about – but has a lot of stumbling blocks.

    https://www.theregister.com/2023/09/08/linux_on_the_thinkpad_x13s/

    E.g.:

    Getting it to boot from SSD is an epic undertaking, involving entering a UEFI firmware shell and manually going through 30 or 40 entries to find and enable the right UEFI boot entry, but after hours of searching and countless reboots, it worked, and Debian would start. Unfortunately, when booting the installed OS, the screen blanked after just a few lines of output, never to return. The OS was running – for instance, pressing the power button led to a clean shutdown after a few seconds – but with no display, not even a text one, we couldn’t configure a Wi-Fi connection, and the machine has no built-in Ethernet port.


  • I figured I’d also add a bit of text about why I’m using wl-paste and clipman above, for anyone interested in clipboard management.

    So, the specific problem I’m trying to solve with this daemon: On Windows and (traditionally, at least – my MacOS knowledge is decades out of date, so dunno if this has changed) MacOS, the desktop environment maintains a persistent clipboard buffer. When you hit Control-C (or, on MacOS, Command-C), the program tells the OS to save a copy of the copied/cut content.

    This is not actually what X11 or Wayland do. Neither X11 nor Wayland maintain a persistent clipboard. What they do is act as a broker between two X11 or Wayland programs. So, if you launch Program A and select text (for PRIMARY) or hit Copy (for CLIPBOARD) and then in Program B, middle-click (for PRIMARY) or hit Paste (for CLIPBOARD), then Wayland tells Program B to go get data from Program A.

    For X11, this was particularly important in that the system was designed to run on very low-resource thin terminals. That terminal may not have the ability to store a whole lot of clipboard data.

    This becomes user-visible if someone closes Program A prior to pasting in Program B, because the copied content disappears along with it.

    Some people found this obnoxious, and introduced a solution many years back, reusing an approach used by software on Windows and MacOS to solve a different issue.

    On both Windows and MacOS, some people ran into the limitations of having a single clipboard. They didn’t want a “copy” to destroy whatever was already in their clipboard, wanted a history.

    Some software packages dealt with this – I believe Microsoft Office was among these – by introducing their own “clipboard history”. Emacs has its own sophisticated system with a history, the kill-ring, that’s been extended in all kinds of exotic ways (e.g. to be a “kill-torus”). But while these mitigate the problem for a particular important program, they are not system-wide.

    So what folks on MacOS and Windows did was introduce “clipboard managers”. The idea here is that you have a very small, simple program. It just sits and waits for the clipboard contents to change. When they do, it saves a copy. It typically saves some finite number of clipboard entries, and lets you go back in a time-ordered list and choose saved clipboard contents. Some provide more-sophisticated functionality, like searching through history. That’s nice if you just chose “copy” and realize that you just blew away some content that you’d copied. Based on a quick glance, MacOS doesn’t ship out-of-box with a clipboard manager in the base OS in 2024, and Windows only has an opt-in clipboard history (Win+V), but it’s a simple program to write, so people who want it don’t have trouble adding it on.

    X11 has three clipboards (PRIMARY, SECONDARY, CLIPBOARD) and Wayland can do at least PRIMARY and CLIPBOARD, dunno about SECONDARY. That’s a bit more state that can be retained, but they aren’t really intended to provide a “history”.

    Some people on X11 or Wayland also want that “clipboard manager” functionality. And a “clipboard manager” also has the nice side-benefit of providing clipboard persistence beyond the lifetime of the program from which you copied the data. You don’t have the “Program A was closed before pasting to Program B” issue, because what happens is that you copy in Program A, then the clipboard manager detects that the clipboard contents have changed and internally transfers the clipboard contents from Program A to its own memory, then announces that it has new clipboard contents, a copy of what was just stored, so when the user pastes in Program B, he’s actually asking the clipboard manager to send data to Program B. I don’t actually know how fast the clipboard managers detect new data in the clipboard – depending upon how the clipboard API works, I suppose that there might be a window for data loss, where someone copies in Program A and then immediately closes Program A – but it seems to work well-enough on a day-to-day basis.

    This is particularly obnoxious for software packages like xsel on X11 and wl-clipboard on Wayland. They’re command-line programs that you can use to “copy” text. They need to look, to the user, like any other command-line program that exits once it’s run. But the X11 and Wayland protocols don’t permit that – a program from which one is copying data has to stay alive long enough to send data to whatever program is requesting it. So xsel and wl-clipboard have to fork off a process to stay alive until the clipboard contents change, which is kind of a kludge.
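
    Concretely, the usage looks like any other shell filter, which is exactly why that background process has to exist (wl-clipboard provides wl-copy and wl-paste; xsel’s flags are -i/-o for input/output and -b for the CLIPBOARD selection):

    echo "some text" | wl-copy       # forks a child that stays alive to serve future pastes
    wl-paste                         # asks the current clipboard owner for its contents

    echo "some text" | xsel -ib      # same idea on X11, CLIPBOARD selection
    xsel -ob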

    I don’t personally need a clipboard manager. I don’t care about a clip history. I use emacs’ kill ring, but the overwhelming remainder of copy-pasting I do is very simple. And my experience has been that clipboard managers tend to come-and-go, and tend to be tied to a particular desktop environment or widget toolkit, and come with a bunch of library dependencies.

    What I do want, though, is clip persistence past the lifetime of a given program. I don’t like having to think about whether a program is still running or not. I want the clipboard to act like my X11 server or Wayland compositor contains an independent clip buffer, and when I choose “copy”, it saves a copy to the thing.

    The combination of clipman plus wl-paste --watch can be made to act as a very minimalist clipboard manager. It doesn’t use KDE or GNOME or GTK or Qt or anything like that. All it does is talk directly to Wayland. That fits my bill well. Note that it does store a copy of the clipboard on-disk, which some people may want (so that it lasts across sessions, which I don’t care about). That’s necessary because wl-paste doesn’t retain state and clipman doesn’t stay running. Some people may not like this in that they may not want clipboard contents sitting around on disk from session to session; stuff like passwords might be persisted there; just a heads-up. There are clipboard managers out there that won’t persist state to disk, if that’s a concern for anyone.
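
    For anyone wanting to set up the same thing, the invocation is roughly the following – this is clipman’s documented pattern, with wofi as an example picker (swap in whatever launcher you use, and run the first line once per session, e.g. from your compositor config):

    wl-paste -t text --watch clipman store    # re-runs "clipman store" every time the clipboard changes
    clipman pick -t wofi                      # later: pick an old clip back into the clipboard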






  • CIFS supports leases. That is, hosts will try to ask for exclusive access to a file, so that they can assume that it hasn’t changed.

    IIRC sshfs just doesn’t care much about cache coherency across hosts and just kind of assumes that things haven’t changed underfoot; it uses a timer to expire the cache.

    considers

    Honestly, with inotify, it’d probably be possible to make a newer sshfs that does support leases.

    I suspect that the Unixy thing to do is to use NFSv4 which also does cache coherency correctly.

    It is easy to deploy sshfs, though, so I do appreciate why people use it; I do so myself.
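
    To illustrate how low the bar is – all it takes is a working ssh login and the sshfs package; user, host, and paths here are placeholders:

    sshfs user@host:/srv/data /mnt/data    # mount a remote directory over ssh
    fusermount -u /mnt/data                # unmount when done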

    kagis to see if anyone has benchmarks

    https://blog.ja-ke.tech/2019/08/27/nas-performance-sshfs-nfs-smb.html

    Here are some 2019 benchmarks that show NFSv4 to generally be the most-performant.

    The really obnoxious thing about NFSv4, IMHO, is that ssh is pretty trivial to set up, and sshfs just requires a working ssh connection and sshfs software installed, whereas if you want secure NFSv4, you need to set up Kerberos. Setting up Kerberos is a pain. It’s great for large organizations, but for “I have three computers that I want to make talk together”, it’s just overkill.
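
    The NFS side itself isn’t much typing – the pain is the Kerberos KDC, principals, and keytabs behind it. A rough sketch with example hostnames, assuming all of that Kerberos infrastructure is already in place:

    # Server /etc/exports – sec=krb5p means Kerberos authentication plus integrity and encryption
    /srv/share  client.example.com(rw,sec=krb5p)

    # Client
    mount -t nfs4 -o sec=krb5p server.example.com:/srv/share /mnt/share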



  • While I generally agree, I think that there are some other ways that one could make games:

    • One is to just do games incrementally. Like, you buy a game that doesn’t have a whole lot of content, and then buy DLC as it comes out. That’s not necessarily a terrible way for things to work – it maybe means that games having trouble get cut off earlier, don’t do a Star Citizen. But it means that it’s harder to do a lot of engine development for the first release. Paradox’s games tend to look like this – they just keep putting out hundreds of dollars in expansion content for games, as long as players keep buying it. It also de-risks the game for the publisher – they don’t have so much riding on any one release. I think that that works better for some genres than others.

    • Another is live service games. I think that there are certain niches that that works for, but that that has drawbacks and on the whole, too many studios are already fighting for too few live service game players.

    • Another is just to scale down the ambition of games. I mean, maybe people don’t want really-high-production-cost games. There are good games out there that some guy made on his lonesome. Maybe people don’t want video cutscenes and such. Balatro’s a pretty good game, IMHO, and it didn’t have a huge budget.

    I do think, though, that there are always going to be at least some high-budget games out there. There’s just some stuff that you can’t do as well otherwise. If you want to create a big, open-world game with a lot of human labor involved in production, it’s just going to have a lot of content, going to be expensive to make that content. Even if we figure out how to automate some of that work, do it more-cheaply, there’ll be something new that requires human labor.


  • The price is reasonable

    I was going to say “$60 for re-releases of several older DS games doesn’t seem that amazing”, but then I realized that I was looking at the price for the bundle below it, which apparently includes all of:

    The games in this bundle:

    • Castlevania: Dawn of Sorrow
    • Castlevania: Portrait of Ruin
    • Castlevania: Order of Ecclesia
    • Haunted Castle Revisited
    • Haunted Castle

    Plus:

    • Castlevania: Circle of the Moon
    • Castlevania: Harmony of Dissonance
    • Castlevania: Aria of Sorrow
    • Castlevania: Dracula X
    • Castlevania
    • Castlevania II Simon’s Quest
    • Castlevania III Dracula’s Curse
    • Super Castlevania IV
    • Castlevania The Adventure
    • Castlevania II Belmont’s Revenge
    • Castlevania Bloodlines
    • Kid Dracula

    That’s…actually pretty darn good too, if you enjoy Metroidvanias.

    I do kind of wish that they’d figured out some sort of way to do higher-resolution graphics, but the games themselves ain’t bad.




  • I don’t really follow consoles, but I’ll take a guess based on what limited information about the thing is in the article.

    If you figure that PC and various console hardware has converged to a fair degree over the decades and that stuff is gonna get generally ported around anyway, it’s hard to differentiate yourself on game selection or hardware features. Plus you’ve got antitrust regulators going after console vendors buying games to be exclusives, and that also tamps down on that.

    So okay, say what you can compete on is in significant part how you run what is more or less the same set of games. Most games already have rendering code that can scale pretty well with hardware for the PC.

    It might make sense to make sure that you have faster rendering hardware so that it’s your version that looks the nicest (well, or at least second nicest, hard to compete with the PC’s hardware iteration time for users willing to buy the latest-and-greatest there).

    Let me extrapolate one further. It might even make sense, if that’s the direction of things, for console vendors to make some kind of cartridge containing the GPU, something with a durable, idiot-proof upgrade that you don’t have to open the console to do, and to let users upgrade their console to the next gen mid-lifecycle at a lower cost than getting a new console. Controllers haven’t changed all that much, and serial compute capabilities aren’t improving all that much annually. The thing that is improving at a good clip is parallel compute.


  • Another potential factor, I guess, is that what “top” means is not fixed. When I see a “top list”, I think “what are the best games in the context of the time that the list is written”. Like, advice on what games one should play today.

    But it’d also be legitimate to create a top list that just recognizes the studios that created a game, ranking them in the context of their time. Like, there are certainly games here that advanced the genre. I’d have an easier time swallowing some of this if the ranking is to be taken as a “context of the time” thing.

    I see a similar debate surrounding Citizen Kane in movie rankings. The movie is often featured very highly in some movie rankings, even at the top. However, a lot of people are not really that into watching it. Thing is, it introduced a lot of things that later movies then adopted, stuff that we kind of take for granted now. So if you’re looking to give credit to the movie’s creators, then it might rank very highly, but many people are looking for a ranking as to what movies to watch today, and don’t think that the movie ranks nearly as highly today.

    Example:

    https://www.cbr.com/citizen-kane-still-greatest-film-all-time/

    Citizen Kane Is Still the Greatest Film of All Time

    Entertainment enthusiasts love to debate what the best of all time in any given medium is, and in most cases, they conclude with the one that changed the game. When television fans discuss the greatest show ever, they usually narrow it down to The Sopranos, The Wire and Breaking Bad. All three were created decades after television existed, but they change the way people viewed TV shows. Citizen Kane released nearly 50 years after the birth of cinema, but it changed the way people watched and made movies. Other iconic films that changed the game were made around the time of Citizen Kane like The Wizard of Oz, Gone With The Wind, Snow White and the Seven Dwarfs and Casablanca, but Welles’ movie leaves the more lasting impression because it combines the Hollywood budget with the genius of a bold and young outsider.

    Citizen Kane may not be as entertaining as some that came after it nor does it tend to be on the top of everyone’s personal favorite movies list, but when talking about the most respected and influential movies ever made, it’s always near the top of that conversation.

    I think that maybe some of the problem is that we should just use different terms, to avoid confusion between the two types of lists. Like, instead of “top”, maybe “ground-breaking” for lists of the “innovative” category, and “best movies to watch today” for lists of the “what’s most enjoyable to watch in the present time” sort.

    I would not recommend Wolfenstein 3D or Doom as a first-person shooter in 2024. But if one asked me for a list of the most influential first-person shooter games…well, they might be pretty high on that list. If I wanted games like that, I’d be looking for games that introduced ideas or technical improvements that were then widely-adopted. For example, regenerating shields are now very common in first-person shooters. They solve a gameplay problem that plagued early first-person shooters where a person would save a game with very low health reserves and get caught in a very difficult situation; players didn’t like that. I think that it might have been Halo that popularized that mechanic. Is the original Halo the best FPS to play in 2024? Well, it’s playable – I played it the other day, in fact. But it’s probably not where I’d direct a new player to the genre asking me for the best game for them to play. But sure, it was influential.

    I remember that a lot of early computer RPGs followed many conventions introduced by Dungeons & Dragons from the time, like stat scales modeled on a 3d6 dice roll, skill progression decoupled from skill use (e.g. I can gain “experience points” from doing one thing that I can then apply to something else, which seems a bit unintuitive), the concept of discrete classes with equipment restrictions. Dungeons & Dragons was quite influential, introduced a lot of useful ideas, but some of its rules were designed around pencil-and-paper play – it needed to keep the math quick and simple. I remember being delighted by the fact that in the original Fallout, gaining a stat point really was something that you could feel, whereas in Dungeons & Dragons, it tended to be a smaller effect (and particular tiers were more important, where modifiers got changed). And it took some time for computer RPGs to shift away from those conventions. I think that games that introduced those different mechanics were important for the genre, but…that doesn’t necessarily make a given game itself something top-tier that I want to play in 2024.