• BlinkAndItsGone@lemm.ee · 1 year ago

    Here’s the most important part IMO:

    He admits that — in general — when AMD pays publishers to bundle their games with a new graphics card, AMD does expect them to prioritize AMD features in return. “Money absolutely exchanges hands,” he says. “When we do bundles, we ask them: ‘Are you willing to prioritize FSR?’”

    But Azor says that — in general — it’s a request rather than a demand. “If they ask us for DLSS support, we always tell them yes.”

    So developers aren’t forced contractually to exclude DLSS, but outside the contract language, they are pressured to ignore it in favor of FSR. That explains why these deals tend to result in DLSS being left out, and also why there are some exceptions (e.g. Sony games; I imagine Sony knows what features it wants its PC releases to have and has pushed back to keep DLSS in). I think AMD is being honest this time, and I’m surprised it admitted publicly that it’s doing this. Hopefully word about this will get out and more developers will insist on including DLSS.

    • rivalary@lemmy.ca · 1 year ago

      I wish Nvidia and AMD would work together to create these features as open standards.

      • sugar_in_your_tea@sh.itjust.works · 1 year ago

        Well, FSR is open, as is FreeSync and most other AMD tech; it’s just that NVIDIA is so dominant that there’s really no reason for them to use anything other than their own proprietary tech. If Intel can eat away at NVIDIA’s market share, maybe we’ll see some more openness.

        • conciselyverbose@kbin.social · 1 year ago

          I guess they could just use FSR as a wrapper for DLSS, but they made DLSS because there was nothing like it available, and it leverages the hardware to absolutely blow the doors off of FSR. They’re not comparable technologies.

          • sugar_in_your_tea@sh.itjust.works · 1 year ago

            Last I checked, DLSS requires integration work from developers to function properly, so it’s less “leveraging the hardware” and more “leveraging better data,” though maybe FSR 3 has a similar process.
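            To give a rough idea of what that developer work involves, here’s a small sketch of the per-frame data an engine has to feed a temporal upscaler like DLSS or FSR 2+. The names are made up for illustration, not any vendor’s actual API:

                // Per-frame inputs a temporal upscaler (DLSS, FSR 2+, XeSS) typically
                // needs from the engine. All names here are illustrative, not a real SDK.
                #include <cstdint>

                struct Texture;  // placeholder for an engine render-target handle

                struct UpscalerFrameInputs {
                    Texture* color;            // low-resolution, aliased color buffer
                    Texture* depth;            // matching depth buffer
                    Texture* motionVectors;    // per-pixel motion vectors from the engine
                    float    jitterX, jitterY; // sub-pixel camera jitter used this frame
                    uint32_t renderWidth, renderHeight;    // internal render resolution
                    uint32_t displayWidth, displayHeight;  // output (display) resolution
                    bool     resetHistory;     // true on camera cuts to drop stale history
                };

                // The engine has to produce all of this correctly every frame; it is not
                // something a driver can bolt on after the fact.
                void evaluateUpscaler(const UpscalerFrameInputs& in, Texture* upscaledOutput);

            That’s the “better data” part: the motion vectors, depth, and jitter have to come from the engine itself, and getting them wrong shows up as ghosting or shimmering no matter whose upscaler is doing the work.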

            • conciselyverbose@kbin.social · 1 year ago

              It’s a hardware-level feature, though. The reason DLSS doesn’t support hardware prior to RTX is that those cards don’t have the tensor cores to do the right math.

              FSR is substantially less capable because it can’t assume it has the right hardware to get the throughput DLSS needs to work. I know the “corporations suck” talking point is fun and there’s some truth to it, but most of the proprietary stuff nvidia does is either first or better by a significant margin. They use the marriage of hardware and software to do things you can’t do effectively with broad compatibility, because they exploit the architecture of the cards the feature is designed for (and the ones going forward) extremely effectively.

              • sugar_in_your_tea@sh.itjust.works · 1 year ago

                I think it’s more the other way around: they designed the feature around their new hardware as a form of competitive advantage. Most of the time, you can trade cross-platform compatibility for better performance.

                Look at CUDA vs OpenCL, for example. Instead of improving OpenCL or making CUDA an open standard, they double down on keeping it proprietary. They probably get a small performance advantage from this, but the main reason they do it is to secure their monopoly. The same goes for G-Sync vs FreeSync, though it seems they’re backing down there and supporting FreeSync as well.

                They want you to think it’s a pro-consumer move, but really it’s just a way to keep their competition one step behind.

                • conciselyverbose@kbin.social · 1 year ago

                  They can’t improve OpenCL. They can make suggestions or proposals, but because broad compatibility is the priority, most of them wouldn’t get added. They’d be stuck with a worse instruction set and tooling that spends half its time trying to figure out all the different hardware configurations it has to deal with.
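                  For what it’s worth, here’s roughly the discovery dance a portable OpenCL host program does before it can even pick a device and tune work sizes (standard OpenCL C API calls, trimmed of error handling); CUDA only ever targets one vendor’s GPUs, so there’s far less of this variability to handle:

                      #include <CL/cl.h>
                      #include <cstdio>

                      int main() {
                          // Ask the runtime which OpenCL platforms (vendor drivers) are installed.
                          cl_uint numPlatforms = 0;
                          clGetPlatformIDs(0, nullptr, &numPlatforms);
                          cl_platform_id platforms[16];
                          clGetPlatformIDs(numPlatforms < 16 ? numPlatforms : 16, platforms, nullptr);

                          for (cl_uint p = 0; p < numPlatforms && p < 16; ++p) {
                              // Each platform can expose any mix of GPUs, CPUs, and accelerators.
                              cl_uint numDevices = 0;
                              clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 0, nullptr, &numDevices);
                              if (numDevices == 0) continue;
                              cl_device_id devices[16];
                              clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL,
                                             numDevices < 16 ? numDevices : 16, devices, nullptr);

                              for (cl_uint d = 0; d < numDevices && d < 16; ++d) {
                                  char name[256] = {0};
                                  cl_uint computeUnits = 0;
                                  clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, nullptr);
                                  clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                                                  sizeof(computeUnits), &computeUnits, nullptr);
                                  // The application has to branch on capabilities like this before it
                                  // knows which kernels and work-group sizes are even safe to use.
                                  std::printf("device: %s, compute units: %u\n", name, computeUnits);
                              }
                          }
                          return 0;
                      }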

                  CUDA is better than OpenCL. G-Sync was better than FreeSync (though the gap has closed enough that FreeSync is viable now). DLSS is better than FSR. None of them are small advantages, and they were all created before there was anything comparable available to adopt, even if Nvidia had wanted to. Supporting the open alternatives in place of their own tech would have been a big step back and would have meant abandoning what they had just sold their customers.

                  It’s not “pro consumer”. It absolutely is “pro technology”, though. Nvidia has driven graphics and GPGPU massively forward. Open technology is nice, but it has limitations as well, and Nvidia’s approach has delivered constant, substantial improvement in what can be done.

      • BlinkAndItsGone@lemm.ee · 1 year ago

        Well, Nvidia isn’t directly involved here at all; they’ve only commented on the issue once (to say that they don’t block other companies’ upscaling). The objections tend to come from users, the majority of whom have Nvidia cards and want to use what is widely considered the superior upscaling technology.

  • AutoTL;DR@lemmings.world · 1 year ago

    This is the best summary I could come up with:


    He seemingly wants to say that AMD did not actually make Starfield, quite possibly the year’s biggest PC game, exclusively support AMD’s FSR upscaling technology at the expense of competitors like Nvidia DLSS.

    Instead, he repeatedly lands on this: “If they want to do DLSS, they have AMD’s full support.” He says there’s nothing blocking Bethesda from adding it to the game.

    Azor, a co-founder of Alienware, has had many open conversations with me over the years, and this is the only thing he’s been cagey about all afternoon.

    “If and when Bethesda wants to put DLSS into the game, they have our full support,” he reiterates.

    Bethesda didn’t reply to repeated questions about whether it’ll add DLSS to the game.

    It also includes “Native Anti-Aliasing,” an optional new mode that uses FSR techniques to anti-alias and sharpen game graphics instead of upscaling them from a lower resolution.


    The original article contains 432 words, the summary contains 148 words. Saved 66%. I’m a bot and I’m open source!

  • all-knight-party@kbin.cafe · 1 year ago

    Seems sort of weird. They say “when we make a deal we expect prioritization of AMD features,” but also that they don’t explicitly say you can’t add DLSS. I think that’s too much grey area to say anything for sure, especially when the one saying it is on one side of the deal.

    • geosoco@kbin.social · 1 year ago

      There was a recent announcement for an AMD-sponsored game that has both (I think it was the Avatar game?). I think it’s very likely many of these games are time- or budget-constrained, so when they’re given money from AMD they implement FSR first and, if they’ve got the time or existing code, add DLSS.

      This feels like the old console money that Sony & Microsoft would hand out, where developers focused some extra optimization or early engine design on one platform because of the extra funding. If I recall, Sony gave a bunch of money to cross-platform games.

  • AlbertScoot@kbin.social · 1 year ago

    I mean, this has always been the case; it’s a practice going back decades among video card manufacturers.