Cygni - Monday, August 13, 2018 - link
My stream isn't working at all, guess I'll be reading your text play-by-play. :) (Assuming your vid doesn't die too!)
Hxx - Monday, August 13, 2018 - link
Let's see RTX and GTX cards :)
ikjadoon - Monday, August 13, 2018 - link
RT = ray tracing! :D Maybe?
ikjadoon - Monday, August 13, 2018 - link
Let's all pretend I was just testing everyone. *Because this is definitely it, lmao: https://developer.nvidia.com/rtx
olafgarten - Monday, August 13, 2018 - link
10 Gigarays per second = 10,000,000,000 Rays per second = 166,666,666 Rays per Frame @ 60 FPS = 33 MPix of the Star Wars Demo in real time?
mode_13h - Tuesday, August 14, 2018 - link
Not all rays are primary. In fact, most rays are not. Features like area lights, depth of field, motion blur, reflections, and global illumination chew up quite a lot of rays.
mode_13h - Tuesday, August 14, 2018 - link
All of that is to say that you need significantly more than 1 ray per rendered pixel.
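In code, the arithmetic in this exchange works out as below. The 10 Gigarays/s figure is NVIDIA's quoted number; the 5-rays-per-pixel budget is a purely illustrative assumption (one primary ray plus a handful of shadow/reflection/GI rays), not a figure NVIDIA has published.

```cpp
#include <cstdio>

int main() {
    // NVIDIA's quoted throughput: 10 Gigarays per second.
    const double rays_per_second = 10e9;
    const double fps = 60.0;
    const double rays_per_frame = rays_per_second / fps;  // ~166.7 million

    // Hypothetical budget: 1 primary ray per pixel plus shadow, reflection,
    // and GI rays -- call it 5 rays per pixel in total (illustrative only).
    const double rays_per_pixel = 5.0;
    const double pixels_per_frame = rays_per_frame / rays_per_pixel;

    std::printf("Rays per frame @ 60 fps: %.0f\n", rays_per_frame);
    std::printf("Pixels per frame @ %.0f rays/pixel: %.1f MPix\n",
                rays_per_pixel, pixels_per_frame / 1e6);  // ~33.3 MPix
    return 0;
}
```

At the assumed 5 rays per pixel this reproduces the 33 MPix estimate above, and it shows why the caveat about secondary rays matters: raise the per-pixel ray budget and the achievable resolution drops proportionally.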
Gothmoth - Monday, August 13, 2018 - link
Amazing today... yeah, and no real content for RT until 2020... nice try.
Gothmoth - Monday, August 13, 2018 - link
But NVIDIA can put one or two stupid RT features in a AAA title and slow down Titan and GTX 1080 cards so much that people will want RTX 2080 cards... just as they have done for years with GameWorks, HairWorks, etc., even when you barely notice these features in the games.
ikjadoon - Monday, August 13, 2018 - link
But it won't matter for their sales: the RTX 2080 (or whatever) will be better at typical raster-only games, too. NVIDIA can't give up on rasterization.
Crysis didn't push DX10 adoption through its visuals; rather, because new DX10 GPUs were also better at DX9, people bought first-gen DX10 cards anyway.
But.... *some* card has to be the first card to explicitly support real-time ray-tracing, so I don't really blame them.
ikjadoon - Monday, August 13, 2018 - link
But to clarify... RTX isn't working as a fully independent silo, AFAIK. RTX runs through DirectX Raytracing (DXR), which was recently added to DirectX 12: https://www.anandtech.com/show/12547/expanding-dir...
>For today’s reveal, NVIDIA is simultaneously announcing that they will support hardware acceleration of DXR through their new RTX Technology. RTX in turn combines previously-unannounced Volta architecture ray tracing features with optimized software routines to provide a complete DXR backend, while pre-Volta cards will use the DXR shader-based fallback option
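A minimal C++ sketch of how an application would pick between those two paths, assuming the DXR feature query that eventually shipped in the Windows 10 October 2018 SDK (this thread predates it, so treat this as an illustration of the DXR model rather than the exact experimental API available at the time):

```cpp
#include <cstdio>
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a D3D12 device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::fprintf(stderr, "No D3D12 device available\n");
        return 1;
    }

    // Ask the driver whether it exposes accelerated DXR. This is where an
    // RTX-capable driver surfaces its backend; older GPUs report
    // TIER_NOT_SUPPORTED and would use the shader-based fallback instead.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    const bool native_dxr =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    std::printf(native_dxr
        ? "Native DXR: build acceleration structures, DispatchRays()\n"
        : "No driver DXR: use the compute-based fallback layer (or rasterize)\n");
    return 0;
}
```

Either way the application programs against the same DXR interfaces; RTX is a backend for DXR rather than a separate API, which is the point being made above.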
smilingcrow - Monday, August 13, 2018 - link
Did you actually watch the video stream?
This is aimed at a wide range of industries now, with gaming not the focus at an event for Pros.
A clue, if you missed the live stream: the Quadro cards start at over $2k.
smilingcrow - Monday, August 13, 2018 - link
Interesting timing, with Nvidia releasing radical new tech which may well expand multiple industries on the same day that AMD release their chopped-down server CPUs for HEDT/workstation.
Why? Because Nvidia are looking at replacing even more CPU workloads with GPUs, so there's a battle with AMD fighting Intel for the current market share whilst Nvidia aim to reduce that market's size.
Not a good day for Intel overall.
mode_13h - Tuesday, August 14, 2018 - link
You're silly. AMD is going after the same market as Nvidia, but using their (sadly inferior) Vega GPUs.
The fact that they didn't add AVX-512 to Ryzen can be seen as evidence that they "got it" sooner than Intel, who will also be joining the party with their own GPUs (not Xeon Phi, as they previously hoped).
smilingcrow - Tuesday, August 14, 2018 - link
Sure, AMD would like to compete with NV in the data centre, but that may be harder than catching up with Intel, who have been asleep at the wheel for a while now, whereas NV seem very perky.
So it may not be silly at all; time will tell.
mode_13h - Tuesday, August 14, 2018 - link
Agreed. Nvidia seems to be a formidable competitor.
AMD's Vega looked competitive when it was announced, but was behind when it launched. By the time Vega 2 was announced, it was already behind. That is not a promising trend.
AMD's best hope is that their next GPU redesign will be the "Zen" of GPUs. If they can't get it right then, they will have to settle for being the kind of also-ran in the GPU space that their CPUs have been for much of the past decade.