Yeah, I want to see actual tests before I'm convinced that they're on par with Nvidia. If AMD turns out to have caught up with Nvidia I will be very impressed; that cannot be easy to do while competing on the CPU side at the same time.



There are enough breakdowns I'm seeing where these new cards haven't really caught up with the newest RTX cards, because DLSS 2.0 is the big differentiator and features like Nvidia Broadcast just aren't there for AMD. The other suspicious comparison is AMD benchmarking against the RTX 2080 Ti when the newest RTX cards _all_ blow away the RTX 2080 Ti in almost every metric.

However, what I'm seeing out of this mess is that AMD is absolutely competitive on a performance-per-watt basis now. The other problem is that AMD is so far behind Nvidia in software (read: AI research mindshare) that it's not clear if we'll see that many future titles take on raytracing or adopt the work necessary to do ML-based upscaling with AMD as the baseline software stack rather than DLSS.
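
For anyone unfamiliar with the metric: performance-per-watt is just measured frame rate divided by measured board power. A quick sketch with made-up numbers, purely to illustrate the comparison, not actual benchmark results:

  # Hypothetical numbers for illustration only -- not measured results.
  amd_fps, amd_watts = 140.0, 300.0        # some 6000-series card in some title
  nvidia_fps, nvidia_watts = 150.0, 320.0  # some 30-series card in the same title

  amd_eff = amd_fps / amd_watts            # frames per second per watt
  nvidia_eff = nvidia_fps / nvidia_watts

  print(f"AMD: {amd_eff:.3f} fps/W, Nvidia: {nvidia_eff:.3f} fps/W")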


I suspect the RTX 2080 Ti benchmarks were used considering the RTX 3070 just launched and only the Founders Edition is available for purchase. Based on what I've heard, the RTX 3070 is basically on-par with the RTX 2080 Ti in terms of real-world performance in games, so it's still probably a useful comparison.


The RTX 3070 does not officially go on sale until tomorrow. You might be able to pre-order somewhere, though.


DLSS 2.0, even though it doesn't require per-game training like 1.0, still isn't a generic feature available to any game that puts in the effort to support it, because Nvidia locks it down with watermarks and a non-redistributable DLL until you get approval and are added to their whitelist. Only a handful of games are whitelisted, which puts indie games at a big disadvantage compared to AAA titles.


How many indie games are pushing 4K with ray tracing and eye candy, which is really what DLSS is for?


How many people spend $500-800 on a GPU just to play indie games?

And in any case, since Unreal and Unity are integrating DLSS and other "GameWorks RTX" features, more and more games will be able to implement them with essentially just a toggle.

The graphical fidelity in indie games has increased dramatically, mostly because Unreal Engine became very indie friendly and Unity has really stepped up their game with what they offer out of the box.

Eye candy is now easy because the game engines have a lot of these effects and the materials required for them built in, and you can also quite easily get really high-quality assets, particles, and materials from the engine marketplaces for very low cost.

Five years ago, developing a shader for physically accurate water rendering would probably have taken an indie dev months to complete and could've gotten them a speaking spot at GDC; today it's a toggle switch in the UE4 level editor.


> And in any case, since Unreal and Unity are integrating DLSS and other "GameWorks RTX" features, more and more games will be able to implement them with essentially just a toggle.

That integration (UE4) uses the watermarked DLL. It's unusable until you get explicitly whitelisted by Nvidia, and only a few dozen games have been whitelisted.


It's still in its development stages and NVIDIA essentially wants to guarantee that all games that launch with DLSS have a good enough implementation not to draw criticism.

The current rumour mill hints that 3.0 is when it goes GA, which is also when it should become part of Unity.

However, you can still grab the DLSS branch that NVIDIA maintains and work with it; if the result is good enough, I haven't seen any evidence that getting that watermark removed is particularly difficult.


It's a basic engine feature now with a toggle in the major engines.
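
To give a concrete idea of what that toggle looks like, here's a sketch of the config side in UE4. The console variable names are recalled from the NVIDIA DLSS plugin and may not be exact, so treat them as assumptions and verify against the plugin docs:

  ; DefaultEngine.ini -- cvar names assumed from the NVIDIA DLSS plugin for UE4
  [SystemSettings]
  ; turn DLSS on
  r.NGX.DLSS.Enable=1
  ; pick a quality/performance trade-off
  r.NGX.DLSS.Quality=1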


The point is that, looking at DLSS-supported games, the majority are AAA titles, which is what DLSS is for.

Who cares if your average indie game doesn't support it? It probably isn't useful in those cases since you're already running at 900 FPS.


4K and raytracing are just feature toggles with next to no added development time. As for assets with the texel density necessary to really take advantage of 4K, there are huge photogrammetry libraries accessible to indies, like Quixel Megascans. You probably have an outdated notion of where things stand, IMO.

> The point is that, looking at DLSS-supported games, the majority are AAA titles, which is what DLSS is for.

No, that's my point, and it's in part due to the lockout created by Nvidia's explicit whitelisting system.


Unless it's Dwarf Fortress


> it's not clear if we'll see that many future titles take on raytracing

Considering AMD's core role in next-gen consoles, it's likely that there'll be broad support for raytracing in games (especially cross-platform games). I'd say the question is more whether the 6000 series is anywhere close to RTX in performance for RT (which afaik wasn't shown today).


I have serious doubts about RT being a thing with the next gen consoles. Possibly a few cheap effects. Maybe the Series X+1 or whatever refresh they do in a year or two. Even powerful PCs can’t really pull off major RT without tanking performance.


Indeed, for AAA games it's mostly about enhancing the rasterized picture with accurate (rather than cheap) lighting. But that's kind of the optimal solution; rasterization is really good at producing the "base" picture.


Seems like this might be a Zen 1 kind of situation: not quite caught up with the competition but close enough that they're competitive.


On the software side, AI is a definite miss. The other is video editing acceleration support in Adobe Premiere and DaVinci Resolve. I haven't looked lately, but I think Nvidia completely dominates acceleration of post-processing effects and the like.


> because DLSS 2.0 is the big differentiator

Is it really, though? Consoles have been doing upscaling without it for years, and one has to assume they're still going to be innovating on that front on RDNA 2.0 with the new generation, too.

The DLSS 2.0 mode where it's used as a super-sampling replacement is kinda interesting, but realistically the TAA in most engines is also pretty solid. As a result, it seems like a fairly minor extra feature at the end of the day... Cool, but not game-changing at all.
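
For context on why the two end up close: the core of these temporal techniques (TAA, TAA-style upscaling, and the accumulation step DLSS builds on) is a per-pixel blend of reprojected history with the current jittered sample. A deliberately minimal sketch of that idea, not any engine's actual implementation:

  # Minimal illustration of temporal accumulation; names and alpha are arbitrary.
  # Real TAA/upscalers add motion-vector reprojection, history clamping and
  # sub-pixel jitter; DLSS replaces the hand-tuned blend with a learned one.
  def accumulate(history_color, current_color, alpha=0.1):
      # keep most of the accumulated history, mix in a little of the new sample
      return tuple((1.0 - alpha) * h + alpha * c
                   for h, c in zip(history_color, current_color))

  print(accumulate((0.5, 0.5, 0.5), (1.0, 0.0, 0.0)))  # -> (0.55, 0.45, 0.45)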

EDIT: although AMD did mention something called "Super Resolution" as part of their FidelityFX suite, which sounds like a DLSS competitor, but there are no real details yet. And of course the actual image quality results are far more important.



