Ray Tracing On Xbox One?

A tweet from Microsoft Studios boss Phil Spencer has all my fellow tech enthusiasts intrigued. Spencer was asked about the possibility of using the cloud for real-time ray tracing.

[Screenshot: Phil Spencer's tweet, March 3, 2014]

First, what is ray tracing? To understand it, we must first understand how light behaves in reality. Light is made up of particles called photons. These photons are emitted from a light source and are “bounced” around the environment as they interact with the ground, people, objects, and so on.

When photons coming directly from a light source strike a surface, that is direct lighting. When photons bounce off one object and then onto another, that is indirect lighting. Indirect lighting is why an object sitting in the shadow of a tree is not pitch black: photons bouncing off the ground, the tree, and nearby objects still reach it.

Ray tracing effectively recreates these photon bounces in a virtual world: you trace rays of light as they move through the environment and interact with objects. Replicating this digitally takes an enormous amount of power, which is why Pixar has render farms with thousands upon thousands of processors. That works for film, because a finished film never changes.
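The core idea can be sketched in a few lines. This is a toy, not any production renderer: in practice renderers trace rays backward from the camera rather than photons forward from the light, and every name and number below is illustrative.

```python
import math

# Toy ray tracer: one sphere, one point light, Lambertian (diffuse) shading.

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance t, or None if the ray misses."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_pos):
    """Diffuse shading: brightness falls off with the angle to the light."""
    t = ray_sphere_hit(origin, direction, center, radius)
    if t is None:
        return 0.0  # ray escaped the scene: background
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = tuple((h - c) / radius for h, c in zip(hit, center))
    to_light = tuple(l - h for l, h in zip(light_pos, hit))
    norm = math.sqrt(sum(x * x for x in to_light))
    to_light = tuple(x / norm for x in to_light)
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))

# One ray fired straight down the z-axis at a sphere in front of the camera.
brightness = shade(origin=(0, 0, 0), direction=(0, 0, 1),
                   center=(0, 0, 5), radius=1.0, light_pos=(0, 10, 0))
```

A real scene multiplies this by millions of rays per frame, many bounces per ray, and thousands of objects per intersection test, which is where the "render farm" budgets come from.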

Video games, however, are a completely different story. A game reacts to the player, who controls what happens on screen, unlike a film. Rendering that, plus every other effect, in real time at 30 fps demands an insane amount of computational power. Games get around this with tricks like ambient occlusion, image-based lighting, and contact-hardening shadows, often through a deferred rendering pipeline. The end result looks convincing enough, at a fraction of the resource cost. For perspective, even four Titan Blacks are nowhere near powerful enough for real-time ray tracing.
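Ambient occlusion is a good example of such a trick. The idea, sketched below with a made-up stand-in scene (real engines approximate this in screen space rather than with actual rays), is to estimate how "open" the sky is above a surface point and darken enclosed areas accordingly.

```python
import math
import random

# Toy ambient occlusion: fire sample rays over the hemisphere above a surface
# point and count how many escape; the escaped fraction stands in for how much
# ambient light reaches that point. The half-blocked sky is a made-up scene.

def hemisphere_dir(rng):
    """Random direction with a non-negative y component (rejection sampling)."""
    while True:
        v = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
        length = math.sqrt(sum(c * c for c in v))
        if 0 < length <= 1:
            x, y, z = (c / length for c in v)
            return (x, abs(y), z)

def ambient_occlusion(samples=2000, seed=42):
    rng = random.Random(seed)
    # Stand-in occluder: everything in the x > 0 half of the sky is blocked,
    # so roughly half the sample rays escape.
    escaped = sum(1 for _ in range(samples) if hemisphere_dir(rng)[0] <= 0)
    return escaped / samples

ao = ambient_occlusion()  # ~0.5: half the hemisphere is open
```

It is an estimate, not a simulation of photon bounces, which is exactly why it is so much cheaper than true ray tracing.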

If this is true, and if something like ray tracing can be offloaded to the cloud, it could have industry-wide ramifications. Ray tracing is, in essence, pure computation, so in theory it could be done in the cloud. However, there are so many variables at play that it is hard to even begin the conversation; factors like bandwidth and the type of game come into play.
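A rough back-of-envelope calculation shows why bandwidth dominates that conversation. The resolution, frame rate, and uncompressed format below are my own assumptions for illustration, not figures from Microsoft:

```python
# Back-of-envelope: streaming a fully cloud-rendered lighting buffer to the
# console, assuming uncompressed 24-bit color at 1080p/30fps (real systems
# would compress, but the order of magnitude is the point).
width, height = 1920, 1080
bytes_per_pixel = 3           # 24-bit RGB, no compression
fps = 30

bytes_per_second = width * height * bytes_per_pixel * fps
gbps = bytes_per_second * 8 / 1e9
print(f"{gbps:.2f} Gbps")     # prints "1.49 Gbps"
```

That is more than a full gigabit fiber line for the lighting alone, before game traffic, latency, or multiple players enter the picture.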

Cloud-powered ray tracing might be achievable for single-player titles, where you don’t have to worry about other players. But what about a game like Battlefield 4, with scores of players? The lighting would have to be computed in real time for everyone: if someone knocks a building down, light needs to stream in through the rubble, adding still more computational load.

In my honest opinion, real-time ray tracing in games is years away, and I simply don’t believe Microsoft’s claims here. The internet needs to catch up, and client-side hardware needs to become much more powerful. Still, we can dream.


  • It’s not clear to me which of Microsoft’s claims you disbelieve. That the company has done experiments with real-time ray tracing? That there’s a ton of potential? Or that it can generate amazing visuals?

  • I agree with Fredelas that it’s not clear. As for cloud-based anything, the problem is bandwidth. Even with gigabit internet services like Google Fiber, which is only available in select neighborhoods in three cities around the US, the amount of data that would need to be piped back and forth is enormous. I agree with Shank that the technology is not ready, and it probably won’t be ready for at least a decade. The only way something like this comes sooner is if a) we demand that our ISPs provide better service and b) someone makes a breakthrough on the scale of the unified microprocessor.
