26 Comments
MC-Sammer - Monday, September 13, 2010 - link
I'm really surprised at how good it looks, even if it is a low-quality (visually) game. Like, the shadow being cast on the car really blew me away.
Kardax - Monday, September 13, 2010 - link
Isn't it Wolfenstein?
I'm not too impressed with the visuals, actually. The refraction is nice, but the shadows are completely broken in most of the images. It's probably a compromise for performance, since most modern ray tracers have no problem with this... it just takes them a while to get it done.
It's an interesting demo, but it doesn't change the fact that ray-tracing is too computationally intensive for real-time stuff. The same amount of money in traditional rasterization hardware would buy 100x the performance.
MC-Sammer - Monday, September 13, 2010 - link
Well, it's no Crysis, but for a budding technology with those framerates it's not the worst thing Intel has ever done.
Kardax - Monday, September 13, 2010 - link
This technology has been "budding" for 10+ years, and has always been 8+ years behind rasterization in image quality, and they're getting further behind as time goes on. If you ignore refraction and reflection, real-time ray tracing has been at DirectX 6 levels of image quality for several years now.
This is a fundamental limitation of raytracing... the math to do it is much more complex than rasterization, which demands more hardware to deliver each real-time pixel. Intel canceled the Larrabee GPU because they knew it was a lost cause.
To say "it's no crysis" kinda proves the point: that game came out in 2007 and here it is 2010 and ray tracing is nowhere close.
Whizzard9992 - Monday, September 13, 2010 - link
Well, when ray tracing overtakes rasterization I'll be looking for your post with a picture of your foot in your mouth, though I'm sure you won't be there since you'll be too busy on Newegg picking up the hardware.
nVidia has demonstrated promising, HIGH-QUALITY ray-tracing demos for the GTX 400 series that aren't far off from game-capable.
Your negative comments are not productive. This article demonstrates not only an improvement in ray tracing, but also in architecture, with the rendering being performed in a cloud environment. That means when many-core chips come down in price and scale up further, it's very possible we'll see ray tracing in high-end machines. But you wouldn't know that, because you're too busy telling everyone what a stupid idea ray tracing is versus rasterization because of some "fundamental rule" that you can see but Intel, nVidia, and ATi/AMD obviously cannot. They've only been doing this for years, but you obviously know more than them.
You should quickly write them a letter telling them how they are wasting thousands of dollars demonstrating ray tracing on games, since ray tracing will always be behind rasterization. Be sure to write a letter to SSD manufacturers and quantum computing researchers as well. Hard drives and transistors have had too much of a head start; they should just give up now.
You've saved us all with your wisdom. However can we thank you?
Kardax - Monday, September 13, 2010 - link
I'm not just some poser on the internet. I've written ray tracers. It's pretty easy. I thought this demo was a clever way to show the same effect three different ways.
Reflections, refraction, and sub-cameras are all the same effect: after the ray hits the surface, a calculation is performed to shoot _another_ ray to determine what the object on the other side is supposed to be... it's simply a different calculation for reflection/refraction/picture-in-picture. Ray tracing does this very accurately since it calculates each pixel individually, but this is also precisely why its performance is so bad... GPU rasterization can process whole swaths of pixels in one pass with circuitry and caches aligned to do precisely that (and only that).
Likewise, lighting is calculated by shooting another ray from an object intersection point against all the lights in the scene... if it doesn't hit any, that pixel is part of a shadow. This yields mathematically perfect shadows, but the extra ray causes a massive speed penalty that gets worse as you add lights to a scene (which is why the lighting is so primitive in every real time ray tracing demo I've ever seen).
The fundamental performance problem with ray tracing is the rays themselves. They can go in wildly different directions on a per-pixel basis, generating a variable number of additional rays depending on what they hit and the number of light sources. You're forced to limit the use of these effects to preserve your speed, which raises the question of why you even bothered with ray tracing in the first place. It's a tease... ray tracing can do wonderful per-pixel effects but has no way to do it fast.
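To make the mechanics above concrete, here's a minimal sketch of that per-pixel recursion (C++, a toy sphere scene; every name is made up, and this is nobody's production code, certainly not what Intel is running): each light adds a shadow ray and each reflective surface adds a recursive bounce, so the number of rays per pixel is unpredictable.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 {
    float x = 0, y = 0, z = 0;
    Vec3 operator+(Vec3 b) const { return {x + b.x, y + b.y, z + b.z}; }
    Vec3 operator-(Vec3 b) const { return {x - b.x, y - b.y, z - b.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};
float dot(Vec3 a, Vec3 b)     { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3  normalize(Vec3 v)       { return v * (1.0f / std::sqrt(dot(v, v))); }
Vec3  reflect(Vec3 d, Vec3 n) { return d - n * (2.0f * dot(d, n)); }

struct Ray    { Vec3 origin, dir; };
struct Light  { Vec3 pos; };
struct Sphere { Vec3 center; float radius; Vec3 albedo; float reflectivity; };
struct Scene  { std::vector<Sphere> spheres; std::vector<Light> lights; };

// Nearest ray/sphere intersection distance, or -1 if the ray misses.
float hitSphere(const Sphere& s, const Ray& r) {
    Vec3  oc   = r.origin - s.center;
    float b    = dot(oc, r.dir);
    float disc = b * b - (dot(oc, oc) - s.radius * s.radius);
    if (disc < 0) return -1.0f;
    float t = -b - std::sqrt(disc);
    return (t > 1e-3f) ? t : -1.0f;
}

// One primary ray in, a variable number of extra rays out: one shadow ray
// per light, plus a recursive bounce per reflective hit. Refraction and
// picture-in-picture work the same way; only the new direction differs.
Vec3 trace(const Scene& scene, const Ray& ray, int depth) {
    const Sphere* nearest = nullptr;
    float tMin = 1e30f;
    for (const Sphere& s : scene.spheres) {
        float t = hitSphere(s, ray);
        if (t > 0 && t < tMin) { tMin = t; nearest = &s; }
    }
    if (!nearest) return {0.2f, 0.2f, 0.3f};          // background color

    Vec3 p = ray.origin + ray.dir * tMin;             // intersection point
    Vec3 n = normalize(p - nearest->center);          // surface normal

    // Shadow rays: shoot toward every light; a blocked ray means shadow.
    Vec3 color{};
    for (const Light& light : scene.lights) {
        Ray shadow{p, normalize(light.pos - p)};
        bool blocked = false;
        for (const Sphere& s : scene.spheres)
            if (hitSphere(s, shadow) > 0) { blocked = true; break; }
        if (!blocked)
            color = color + nearest->albedo * std::max(0.0f, dot(n, shadow.dir));
    }

    // Secondary ray: reflection is just "shoot another ray and recurse".
    if (depth > 0 && nearest->reflectivity > 0) {
        Ray bounce{p, reflect(ray.dir, n)};
        color = color + trace(scene, bounce, depth - 1) * nearest->reflectivity;
    }
    return color;
}
```

Note that every branch in there is per pixel: two neighboring pixels can hit different objects and spawn completely different ray trees, which is exactly why this parallelizes so much worse than rasterizing one triangle across a block of pixels.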
ssj4Gogeta - Tuesday, September 14, 2010 - link
Maybe it's just the textures that aren't so good in the demo?
mino - Tuesday, September 14, 2010 - link
You can be sure they would have gone for better detail had it been feasible.
B3an - Tuesday, September 14, 2010 - link
You have to remember this is a game that was never designed with ray tracing in mind. If a game was made from the ground up to be ray traced, it could very, very easily look far better than anything else that exists these days. Seeing as you've used ray tracing, you would know how easy it would be to do this. The ability to make it look better than anything else is already there; it's just that no one has done it, and the performance hit would be too demanding.
Guspaz - Tuesday, September 14, 2010 - link
Replying to your general thread: I completely agree. I look at these screenshots and, to me, it looks like pre-rendered imagery from a mid-90s game (no, that's not a compliment). The lighting is disturbingly uniform (did they disable most light sources and just go for global illumination?), which is probably the primary culprit.
The idea that ray tracing never WILL catch up still isn't sinking in for people. Think about it this way: rasterization can be done faster than ray tracing. You might say, "But in 5 years, hardware will be fast enough to do this at high quality in real time!"
To this, I respond, "Yeah, and imagine what rasterization will be able to do on the same hardware."
Mr Perfect - Wednesday, September 15, 2010 - link
Beyond3D has an interesting article on the pros and cons of ray tracing. After reading it, I'm simply not holding my breath for ray tracing.
http://beyond3d.com/content/articles/94/
Maybe a raster/ray tracing hybrid engine though. Best of both worlds, none of the drawbacks.
Brian Klug - Monday, September 13, 2010 - link
Oh god, that indeed is a typo. Apologies, fixed!
-Brian
Lifted - Monday, September 13, 2010 - link
That made me LOL when I realized it wasn't a typo.
hechacker1 - Monday, September 13, 2010 - link
It may not be practical today unless you are a 3D modeling/gaming company or movie studio, but at least this paves the way for ray-traced games that could actually be easier to develop for in terms of effects and realistic visuals, and for high-quality rendering throughout the editing process.
I imagine in a few years all of that hardware could be in a few GPU cards working in tandem.
Klinky1984 - Monday, September 13, 2010 - link
Why would I want to put multiple cards in my machine to get graphics that are inferior to what a single raster-based card can produce now? I think it's even worse that they had to use 128 cores just to get 40-50 fps.
Intel should just stick to x86 CPUs; all of their efforts at revolutionizing the gaming market have failed, and this is looking more and more like another failure. Perhaps if Intel can really push multi-core CPU tech to extreme heights, where you have 256 or 512 high-end cores, then we might be in a position to see this take off. That might be quite some time away, maybe 10 years from now, and it'll probably be terribly expensive when it comes out.
Klinky1984 - Monday, September 13, 2010 - link
Well, I didn't see that this Knights Ferry has unknown specs, so who knows what the comparison to a Core i or Core 2 based processor would be per core. I'd imagine Intel would have released that information if they weren't embarrassed by it, though.
phatboye - Monday, September 13, 2010 - link
It would have been nice if they posted a side-by-side comparative "rasterized" rendered screenshot so that those of us who don't own that game could see the difference between the ray-traced and raster renderers.
Kardax - Monday, September 13, 2010 - link
It would be hard to do such a comparison in an unbiased way.
You could compare it to the original game "Return to Castle Wolfenstein", but this was released in 2001. Surprisingly, the original still compares favorably outside of the specific things that a ray tracer does well (reflections, refractions, picture-in-picture, shadows), and it would obviously run on 2001-era hardware just fine while this demo required some massively expensive server hardware.
If this game were updated to run on the latest Unreal engine (for example), it would look far better and still run on cheap hardware.
jklappenbach - Monday, September 13, 2010 - link
After reading through the previous comments, it appears that the point of Intel's technology has been missed. This is a proof-of-concept for *cloud* based computing. This is not introducing new GPU technology, even though they highlight KC nodes as potential cloud participants. The computing resources could just as easily be i7 or even earlier generations of Intel architectures.
Rather, Intel has proposed a platform where commoditized CPU resources are combined together as a pool to react to input from the user and produce a video stream in response, which is then transmitted back to a thin client. In the near future, for some hourly fee, a video game participant could leverage hundreds (perhaps thousands) of threads to provide rendering power that would never be available to a single consumer platform. However, latency, not CPU power, will be the primary issue with this type of service. Still, it is promising!
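To make the shape of that pipeline concrete, here's a rough sketch of what the thin-client side could look like (C++; the struct names, the stubbed transport, and the loop are placeholders I made up, not Intel's actual protocol): the client does no rendering at all, so its hardware requirements collapse to "can decode a video stream".

```cpp
#include <cstdint>
#include <vector>

struct InputState   { float lookX = 0, lookY = 0; uint32_t buttons = 0; }; // player input this tick
struct EncodedFrame { std::vector<uint8_t> bytes; };                        // compressed video frame

// Placeholder hooks: a real client would use a network socket here and a
// hardware video decoder. Stubs keep the sketch self-contained.
InputState   pollLocalInput()                      { return {}; }
void         sendInput(const InputState&)          { /* serialize + send upstream */ }
EncodedFrame receiveFrame()                        { return {}; /* block on the socket */ }
void         decodeAndPresent(const EncodedFrame&) { /* decode the stream, blit to screen */ }

int main() {
    // The server pool renders and encodes; the client only forwards input
    // and presents video. Every displayed frame therefore pays render time
    // + encode time + a network round trip, which is why latency, not CPU
    // power, is the binding constraint.
    for (int frame = 0; frame < 1000; ++frame) {
        sendInput(pollLocalInput());
        decodeAndPresent(receiveFrame());
    }
    return 0;
}
```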
Kardax - Monday, September 13, 2010 - link
Latency is what I'm most impressed about here. All this hardware is synchronized well enough to do real-time ray tracing at dozens of FPS; at 40 fps that leaves only about 25 ms per frame for input transmission, distributed rendering, compositing, encoding, and the trip back. That takes some very serious network programming to accomplish.
I don't know how this is going to be used in the real world (ray-traced games won't be it), but I have to believe that ultra-low-latency distributed computing will find some problem it can solve better than anything else :)
FaaR - Wednesday, September 15, 2010 - link
Events occurring at dozens per second is slower than snail-paced, even for a single-digit-MHz computer from the 1980s, let alone anything money can buy today...
Not sure why you think this demo's so incredibly impressive; the cloud servers only draw what they're told, each frame in sequence in a predictable manner, and the client buffers the results and displays them.
chochosan - Tuesday, September 14, 2010 - link
That was exactly what I was going to say too. Everyone bickered about how ray tracing is so power hungry and super ineffective; what they all missed was that this demo was played on a measly laptop using cloud graphics. That, for me, is the future of gaming: you won't have a powerful computer, just something to connect to the cloud and play on demand. The TCO of a gaming platform will be hugely higher than this type of setup, especially if you factor in constant upgrades for your computer. Latency, of course, is vital for that.
dreamlane - Monday, September 13, 2010 - link
I think this raytracing stuff is really showing some potential!iwodo - Tuesday, September 14, 2010 - link
I am sorry, but even Nvidia admits that ray tracing won't be in games for another decade, and it would be another decade for hybrid ray tracing + rasterization to work out before we use ray tracing only. That is like 30 years; I hope I am still alive then.
There are fundamental problems with ray tracing. It may be perfect for reflections and such, but otherwise rasterization takes the lead.
Stas - Tuesday, September 14, 2010 - link
...ly sh*t O.O
Murloc - Tuesday, September 14, 2010 - link
But you're still playing on a huge server in the same room, because over an internet connection this can't work; they're too slow.