Intel is no stranger to raytracing - in the past we've seen demonstrations such as a port of Quake IV to an Intel-designed raytracer, along with a number of other demos. The promise of raytraced renderers over today's more conventional raster engines for games and desktop 3D has always been increased realism and theoretically near-linear scaling with core count. The problem until now has been that raytracers haven't been able to maintain playable framerates at desktop resolutions.
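That near-linear scaling follows from the fact that each ray is independent of every other, so the work divides cleanly across cores. The core operation of any raytracer is an intersection test; below is a minimal illustrative sketch of a ray-sphere test (our own example, not Intel's code):

```python
# Illustrative ray-sphere intersection via the quadratic formula.
# Every pixel fires at least one such independent test, which is why
# raytracing parallelizes so naturally across many cores.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance along the ray to the nearest hit, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]      # center -> origin vector
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c                          # discriminant
    if disc < 0:
        return None                                   # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)              # nearest root
    return t if t > 0 else None

# A ray fired along +z from the origin hits a unit sphere centered at (0, 0, 5)
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```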

Yesterday Intel demonstrated a new example of raytraced graphics on the desktop: a raytrace-rendered version of Wolfenstein. This time the demo was built around a cloud-centric model, with frames rendered on four servers, each based on a 32-core chip codenamed Knights Ferry.

Knights Ferry is the Many Integrated Core (MIC) architecture part Intel showed off at the International Supercomputing Conference this year, with 32 cores on a single chip.

We saw the Wolfenstein demo raytraced at 1280x720, averaging between 40 and 50 FPS, all rendered on the four Knights Ferry servers. Intel showed the game running on a Dell notebook acting as a thin client; all of the frames are sent to it over ordinary gigabit ethernet.
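The gigabit ethernet figure invites a quick back-of-envelope sanity check (our own arithmetic, not Intel's numbers):

```python
# Can gigabit ethernet carry uncompressed 1280x720 frames at the
# demo's framerate? Assumes 24-bit RGB and the midpoint of 40-50 FPS.
width, height = 1280, 720
bytes_per_pixel = 3          # assuming 24-bit RGB
fps = 45                     # midpoint of the reported 40-50 FPS

bits_per_second = width * height * bytes_per_pixel * fps * 8
print(f"{bits_per_second / 1e6:.0f} Mbit/s")  # → 995 Mbit/s
```

Uncompressed video would nearly saturate the gigabit link, so presumably at least some lightweight frame compression is in play between the servers and the notebook.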

Interesting visual features of the Wolfenstein raytracer include physically correct refractions and reflections at interfaces like glass and water - taking into account the material's actual index of refraction - as well as recursive effects like a surveillance station camera displaying the surveillance station it sits inside. Check out the gallery for those examples, screenshots, and more.
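As a rough illustration of what taking the index of refraction into account means, the bending of a transmitted ray at an interface follows Snell's law. The function below is a generic sketch, not the demo's actual code:

```python
# Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
# Typical indices of refraction: air ~1.0, water ~1.33, glass ~1.5.
import math

def refract_angle(theta_incident, n1, n2):
    """Angle (radians) of the transmitted ray, or None on total internal reflection."""
    s = (n1 / n2) * math.sin(theta_incident)
    if abs(s) > 1.0:
        return None  # total internal reflection: no transmitted ray
    return math.asin(s)

# A ray entering water from air at 45 degrees bends toward the normal
theta = refract_angle(math.radians(45), 1.0, 1.33)
print(round(math.degrees(theta), 1))  # → 32.1
```

Per-material indices like these are what let the raytracer bend rays correctly through the demo's glass and water surfaces.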

Update

Fixed a major typo (thanks everyone for catching that)! For more info, check out the project's homepage.

26 Comments


  • Mr Perfect - Wednesday, September 15, 2010 - link

    Beyond3D has an interesting article on the pros and cons of ray tracing. After reading it, I'm simply not holding my breath for ray tracing.

    http://beyond3d.com/content/articles/94/

    Maybe a raster/ray tracing hybrid engine though. Best of both worlds, none of the drawbacks.
    Reply
  • Brian Klug - Monday, September 13, 2010 - link

    Oh god, that indeed is a typo. Apologies, fixed!

    -Brian
    Reply
  • Lifted - Monday, September 13, 2010 - link

    That made me LOL when I realized it wasn't a typo.
    Reply
  • hechacker1 - Monday, September 13, 2010 - link

    It may not be practical today, unless you are a 3D modeling/gaming company or movie studio.

    But at least this paves the way for ray traced games that can actually be easier to develop for in terms of effects and realistic visuals. And high quality rendering throughout the editing process.

    I imagine in a few years all of that hardware could be in a few GPU cards working in tandem.
    Reply
  • Klinky1984 - Monday, September 13, 2010 - link

    Why would I want to put multiple cards in my machine to get graphics that are inferior to what a single raster-based card can produce now? I think it's even worse that they had to use 128 cores just to get 40-50 FPS.

    Intel should just stick to x86 CPUs, all of their efforts at revolutionizing the gaming market have failed & this is looking more and more like another failure. Perhaps if Intel can really push multi-core CPU tech to extreme heights where you have 256 or 512 high-end cores, then we might be in a position to see this take off. That might be quite some time away, maybe 10 years from now & it'll probably be terribly expensive when it comes out.
    Reply
  • Klinky1984 - Monday, September 13, 2010 - link

    Well I didn't see that this Knights Ferry has unknown specs, so who knows what the comparison to a Core i or Core2 based processor would be per core. I'd imagine Intel would have released that information if they weren't embarrassed by it though.
    Reply
  • phatboye - Monday, September 13, 2010 - link

    It would have been nice if they posted a side-by-side comparative "rasterized" rendered screenshot so that those of us who don't own the game could see the difference between the ray-traced and raster renderers.
    Reply
  • Kardax - Monday, September 13, 2010 - link

    It would be hard to do such a comparison in an unbiased way.

    You could compare it to the original game "Return to Castle Wolfenstein", but this was released in 2001. Surprisingly, the original still compares favorably outside of the specific things that a ray tracer does well (reflections, refractions, picture-in-picture, shadows), and it would obviously run on 2001-era hardware just fine while this demo required some massively expensive server hardware.

    If this game were updated to run on the latest Unreal engine (for example), it would look far better and still run on cheap hardware.
    Reply
  • jklappenbach - Monday, September 13, 2010 - link

    After reading through the previous comments, it appears that the point of Intel's technology has been missed. This is a proof-of-concept for *cloud* based computing. This is not introducing new GPU technology, even though they highlight KC nodes as potential cloud participants. The computing resources could just as easily be i7 or even earlier generations of Intel architectures.

    Rather, Intel has proposed a platform where commoditized CPU resources are combined together as a pool to react to input from the user and produce a video stream in response, which is then transmitted back to a thin client. In the near future, for some hourly fee, a video game participant could leverage hundreds (perhaps thousands) of threads to provide rendering power that would never be available to a single consumer platform. However, latency, not CPU power, will be the primary issue with this type of service. Still, it is promising!
    Reply
  • Kardax - Monday, September 13, 2010 - link

    Latency is what I'm most impressed about here. All this hardware is synchronized well enough to do real-time raytracing at dozens of FPS. That takes some very serious network programming to accomplish.

    I don't know how this is going to be used in the real world (ray traced games won't be it), but I have to believe that ultra-low-latency distributed computing will find some problem it can solve better than anything else :)
    Reply
