Tim Sweeney says photorealism may be achieved at 40 teraflops

Well, I'm kind of a noob obviously and not really capable of discussing this but things like this are interesting to me:

http://gamingbolt.com/photorealisti...ainable-with-40-tflops-says-epics-tim-sweeney

“You know, we’re getting to the point now where we can render photo-realistic static scenes without humans with static lighting,” Sweeney said in an interview with Gamespot. “Today’s hardware can do that, so part of that problem is solved. Getting to the point of photo-realistic dynamic environments, especially with very advanced shading models like wet scenes, or reflective scenes, or anisotropic paint, though…maybe forty Teraflops is the level where we can achieve all of that.”

Nvidia's newly announced Pascal monster is 11 TF. AMD will probably go above that with their next one (they were at 8.9 with Fury X). But it certainly puts the PS4's 1.8 in perspective.
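For what it's worth, those headline numbers are just shader count × clock × 2 (a fused multiply-add counts as two ops per shader per cycle). Quick sketch below; the core counts and clocks are my rough recollection of the launch specs, so treat them as approximate:

```python
# Rough sketch: theoretical FP32 TFLOPS = shaders * clock (GHz) * 2 / 1000
# (each shader can do one fused multiply-add, i.e. 2 ops, per cycle).
# Core counts and clocks are approximate launch specs from memory.
gpus = {
    "Titan X (Pascal)": (3584, 1.53),  # CUDA cores, boost clock in GHz
    "Fury X":           (4096, 1.05),
    "PS4":              (1152, 0.80),
}

for name, (shaders, clock_ghz) in gpus.items():
    tflops = shaders * clock_ghz * 2 / 1000
    print(f"{name}: ~{tflops:.1f} TFLOPS")
```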

It's funny because a long time ago Sweeney said something like 5000 teraflops was required to simulate reality, and I always kept that goal in my head: with each new high-FLOPS mark I'd mentally 4x it (for quad SLI/Crossfire, in theory). Well, now he's set a much, much closer goal (I guess "simulating reality" and photorealistic gfx aren't the same).
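Just for fun, here's the back-of-envelope on how much closer the new goal is, assuming (purely my assumption) single-GPU throughput keeps roughly doubling every couple of years from today's ~11 TF:

```python
import math

# Back-of-envelope, assuming (my assumption) single-GPU throughput roughly
# doubles every 2 years, starting from ~11 TF (a Titan X Pascal).
current_tf = 11.0
years_per_doubling = 2.0

for target_tf in (40.0, 5000.0):
    doublings = math.log2(target_tf / current_tf)
    years = doublings * years_per_doubling
    print(f"{target_tf:g} TF: ~{doublings:.1f} doublings, ~{years:.0f} years away")
```

On those assumptions, 40 TF is only a couple of doublings out, while 5000 TF is nearly nine, which is a decade-plus difference.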

It does seem fairly reasonable I guess, just judging by how close we are getting. Of course the other issue is the money that must be spent on production values to get there.
 
It's funny because a long time ago Sweeney said something like 5000 teraflops was required to simulate reality
"Simulating reality" may shoot at a higher quality level mark than "photorealism". I could imagine that advances in realtime CG and increased experience in the field has led to discoveries in new rendering techniques and hardware features that lets us fake photorealism more easily, which would let us do more work with less computing resources...
 
There was an old PDF from Nvidia about 10-12 years ago which stated how many FLOPS they thought they needed to achieve photorealism. I have it on an old HDD (one day I will transfer all my old HDDs onto a new SSD, I must have about 10 lying around), but perhaps someone has a link to it; it might be worth a laugh to see how wrong/right they were.
 
I'm sure he just pulled a number out of thin air at that moment.
 
Yeah, I think it's a lowball figure. If every game nowadays with static scenes and lighting were photorealistic, he might have a point. But dynamic GI and the like isn't a solved problem where we know we just need to amp up the FLOPS to hit a target. 40 TF will get you a photorealistic driving game or a novelty-visuals game (Pikmin), but a Final Fantasy or Uncharted that looks like a live-action movie is going to be well beyond that, I think.
 
Makes me wonder when we will get a single-chip GPU with that kind of power.
Titan X (P) has 11 TFLOPS on TSMC's 16nm, so if we're lucky the big-die chips on 7nm might get us there.
 
Four top Titan/Fury GPUs in a PC would theoretically get you to ~40 TF now. Is 4-way SLI or Crossfire even a thing? Regardless, it's probably not a very viable or efficient thing, and more importantly, that spec won't be targeted.
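To make that multi-GPU math explicit (and assuming perfect scaling, which SLI/Crossfire never actually delivers), something like:

```python
# Hedged sketch: aggregate throughput of an n-way multi-GPU setup at a given
# scaling efficiency. Real SLI/Crossfire scaling is well below 100%.
def aggregate_tflops(per_gpu_tf, n_gpus, efficiency=1.0):
    return per_gpu_tf * n_gpus * efficiency

print(aggregate_tflops(11.0, 4))        # ideal: 44 TF from four Titan-class cards
print(aggregate_tflops(11.0, 4, 0.7))   # at a more realistic ~70%: ~31 TF
```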

So GTA VII in 12 years then.

Probably. I'm still eagerly awaiting a current-gen GTA from a graphics perspective; I've said all along it's probably gonna be amazing. Those guys have more manpower than anybody and it always shows.
 
That's not altogether wrong. The GPU has evolved to become more like a CPU in terms of programmability - dedicated graphics in that respect is definitely falling.
 
That's not altogether wrong. The GPU has evolved to become more like a CPU in terms of programmability - dedicated graphics in that respect is definitely falling.
Except that wasn't Sweeney's prediction. He predicted the decline of the dedicated graphics card market because he saw no way for graphics cards to compete with the flexibility of the CPU. And basically the opposite happened. Here's the actual quote:
Tim Sweeney in 1999 said:
2006-7: CPU's become so fast and powerful that 3D hardware will be only marginally beneficial for rendering, relative to the limits of the human visual system, therefore 3D chips will likely be deemed a waste of silicon (and more expensive bus plumbing), so the world will transition back to software-driven rendering. And, at this point, there will be a new renaissance in non-traditional architectures such as voxel rendering and REYES-style microfacets, enabled by the generality of CPU's driving the rendering process. If this is the case, then the 3D hardware revolution sparked by 3dfx in 1997 will prove to only be a 10-year hiatus from the natural evolution of CPU-driven rendering.

Here's the :link:

That isn't what happened at all. In fact, it's pretty much the opposite of what happened. Graphics chips aren't seen as a waste of silicon. They are more than marginally beneficial to rendering, and in some ways we've seen less emphasis placed on the CPU than the GPU. The 3D hardware revolution is alive and well, despite the prediction to the contrary.

The idea that he was right because shaders are software or because GPUs are more general-purpose now doesn't really make sense. He predicted the fall of the market: graphics chips would be a waste of silicon, everything would be handled on the CPU, and we'd be using non-traditional renderers without APIs by 2007. None of that happened.
 