Epic's Tim Sweeney predicts photo-realistic graphics within 10 years

http://www.eurogamer.net/articles/2013-07-11-epics-tim-sweeney-predicts-photo-realistic-graphics-within-10-years

If Tim is just talking about the photorealism of an environment, I believe it. But if we take world interaction, physics and humanoids into consideration as well, that's a big personal NO.

The humanoids alone haven't yet reached Final Fantasy: The Spirits Within quality.

That was back in 2001, and you could tell the characters weren't real even at DVD resolution.
Current CGI humanoids in movies have improved since then, but they still lack complete realism, which is immediately obvious when you look at mouth movements. The only time I got fooled was during a trailer for TRON: Legacy, with the CGI character of CLU.

I didn't know it was supposed to look like a young Jeff Bridges, so I thought he was just some actor. I wasn't paying close attention and didn't realize he was CGI. That's pretty impressive, if you think about it.
But then I watched the movie when it came out on home video, and even in a low-resolution download I could spot the fakeness of CLU in some situations, mainly in the character animation and the close-ups.

So, if even millions of dollars of pre-rendered 2009 CGI still don't completely cross the uncanny valley, how can game developers expect to do it in 10 years when they haven't yet matched the CGI of a 12-year-old movie?
 
Seems like he says the same things you are saying in the article: that AI and animation are still big obstacles. The nice thing about photos is that they're totally still ;)
 
Dunno, he kinda lost me when he argued about doing everything Larrabee style... ignoring arguments about power/space efficiency etc.

Games already look good to me, and photorealistic on occasion, and I bet I will still find flaws in 10 years if I look for them.
 
Sweeney was predicting the end of discrete graphics and a convergence to CPU rendering ten years back. Instead, we've effectively had discrete graphics parts built into or alongside the CPU, with no sign of a return to purely CPU driven graphics.
 

I think GPUs survived solely thanks to Microsoft's Direct3D and fanboys. If you look at the evolution of flagship cards from (for example) the Riva TNT to the GeForce GTX Titan, you'll notice the growth in chip and PCB size, heatsink and fan dimensions, extra power connectors and the overall look of the thing; it makes you think these companies' customers are the same people who are into car tuning.
If Intel/AMD had focused on improving vector performance in their CPUs much earlier, there wouldn't be any need for video cards or IGPs.
Actually, Intel's AVX could already kill the need for an IGP, if it weren't for Direct3D.
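(As an aside, for anyone wondering what "vector performance" on a CPU actually looks like: below is a minimal, purely illustrative C sketch using the standard AVX intrinsics from <immintrin.h>, not anything from the article. One AVX instruction operates on eight single-precision floats at once; build with something like gcc -mavx.)

[CODE]
/* Illustrative only: eight single-precision adds issued as one AVX instruction. */
#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float c[8];

    __m256 va = _mm256_loadu_ps(a);    /* load 8 unaligned floats */
    __m256 vb = _mm256_loadu_ps(b);
    __m256 vc = _mm256_add_ps(va, vb); /* 8 additions in one instruction */
    _mm256_storeu_ps(c, vc);

    for (int i = 0; i < 8; i++)
        printf("%.0f ", c[i]);         /* prints: 9 9 9 9 9 9 9 9 */
    printf("\n");
    return 0;
}
[/CODE]

(Whether that kind of CPU-side SIMD scales to anything like GPU throughput is exactly what the replies below dispute.)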
 
I'm sorry, but that's ludicrous. CPUs can still only focus on one thing at a time, while GPUs are massively parallel, with gobs more execution resources and bandwidth than any CPU. Throwing enough vector hardware onto a CPU to rival a GPU would be doable in theory, but it would be a huge, hot (expensive!) chip that would draw far more power than any CPU up to now (except for AMD's new ridiculous 5GHz/220W monster), and you would still lack the bandwidth, the latency-hiding ability, the parallelism and so on. And if you dedicate your CPU to drawing 3D graphics, who's gonna do the physics, animations, particle systems and game logic, and run the OS on top of it all?

GPUs exist because they make fuxing sense. Not because of the legacy of D3D or - jeez! - fanboys.
 
If Intel/AMD had focused on improving vector performance in their CPUs much earlier, there wouldn't be any need for video cards or IGPs.
Actually, Intel's AVX could already kill the need for an IGP, if it weren't for Direct3D.

But they didn't do any of that; instead, the discrete card just got more powerful, and more necessary if you want to do any serious gaming at the ever-larger resolutions now available with bigger screens and multi-screen setups.

It happened that way because the graphics card companies did a better job for gaming graphics than anything Intel's or AMD's CPU divisions did.

I'm actually surprised Sweeney is still predicting a return to CPU-driven graphics. It's the same prediction he's been making for years, and it's still not coming true. You'd think he would have revised his opinion after all this time of his predicted path not materialising.
 
10-15 years ago you could seriously discuss this. Nowadays there really aren't any arguments left: Cell was a monster at vector processing, and it was still negligible compared to GPUs when doing pixel shading.
Nowadays it's not simply about fitting more and more transistors on a chip; power is the limiting factor. And the most efficient way to spend it is dedicated hardware, and it always will be.
In the future more and more different components will sit on one die, eliminating the slow buses between them, but there will still be specialised parts for graphics (and quite possibly other functions), all fighting to keep the power draw and heat down.
 
Physics doesn't need fixed-function units, yet it could be as demanding as graphics if developers wanted it to be. There was once a discrete physics card from Ageia, but it died fast because it lacked the two most important things that have kept Nvidia and AMD GPUs alive: Microsoft's monopolistic API and car-tuning fanboys.
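(To make the "physics is just general-purpose math" point concrete, here's a purely illustrative C sketch, not from any real engine: a naive explicit-Euler particle update is plain arithmetic over arrays, with no fixed-function hardware involved, only FLOPs and memory bandwidth.)

[CODE]
/* Illustrative only: naive explicit-Euler integration of particle positions. */
#include <stdio.h>
#include <stddef.h>

typedef struct { float x, y, z; } vec3;

static void integrate(vec3 *pos, vec3 *vel, const vec3 *acc, size_t n, float dt)
{
    for (size_t i = 0; i < n; i++) {
        vel[i].x += acc[i].x * dt;   /* v += a * dt */
        vel[i].y += acc[i].y * dt;
        vel[i].z += acc[i].z * dt;
        pos[i].x += vel[i].x * dt;   /* p += v * dt */
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

int main(void)
{
    vec3 pos[2] = {{0, 0, 0}, {1, 1, 1}};
    vec3 vel[2] = {{0, 0, 0}, {0, 0, 0}};
    vec3 acc[2] = {{0, -9.81f, 0}, {0, -9.81f, 0}};   /* gravity */

    integrate(pos, vel, acc, 2, 1.0f / 60.0f);        /* one 60 Hz step */
    printf("y after one step: %f %f\n", pos[0].y, pos[1].y);
    return 0;
}
[/CODE]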
 
I think GPUs survived solely thanks to Microsoft's Direct3D and fanboys.

Seriously, what is this? Anyone who wants anything you don't is a fanboy? I've seen a lot of this attitude recently.

There was once a discrete physics card from Ageia, but it died fast because it lacked the two most important things that have kept Nvidia and AMD GPUs alive: Microsoft's monopolistic API and car-tuning fanboys.
Fanboys again, eh? It died because Nvidia bought the company and stopped making the cards. If you're going to present an argument, don't base it on stuff you've made up.
 
Real-time physics-based animation of multiple clothing layers and hair for a dozen deformable-body characters with IK animation on screen, in a reasonable power budget, in 10 years? Optimistic. It's not like they're even remotely close to doing one now...
 

OK, so just a city with cars then, can we get that, pretty please? http://www.youtube.com/watch?v=FJLy-ci-RyY
10 years is a theoretical 100x improvement, applying Moore's law. Is there still physical room to apply Moore's law for the next 10 years, or would that imply splitting atoms?
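(A quick sanity check on that "100x" figure, assuming one common reading of Moore's law, a doubling every 18 months; this is only back-of-the-envelope arithmetic, not a real forecast.)

[CODE]
/* Illustrative only: 2^(10 years / 1.5 years per doubling) is roughly 100x. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double years = 10.0;
    double doubling_period = 1.5;   /* years per doubling, i.e. an 18-month cadence */
    double factor = pow(2.0, years / doubling_period);
    printf("%.0fx\n", factor);      /* prints roughly 102x */
    return 0;
}
[/CODE]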
 
The law is an economic observation rather than one about feature size or processing speed.

There are more ways than shrinking transistors to get twice as many of them for the same amount of money, hence the eagerness for larger wafers and various forms of chip stacking.
My interpretation of the consensus is that relying purely on shrinks will lead to the end of Moore's law before physical subdivision becomes impossible.

I'm also leaning towards the conclusion that the fraction of silicon products Moore's law applies to is itself shrinking over time, as the devices with the most transistors per dollar tend to sacrifice other attributes, which makes them less compelling for the markets they need to sell to.

Using "scales with Moore's law" as a substitute for saying there's an exponential curve is going to be less tenable going forward, once the traditional underpinnings of that law stop scaling exponentially.
 
OK, so just a city with cars then, can we get that, pretty please? http://www.youtube.com/watch?v=FJLy-ci-RyY

What is this rendered on? A GPU? Software rendering on a cluster of 48 computers?

This looks great anyway, but still with a few limitations: beyond displaying a pile of grains while moving (which at least keeps it from being a 320x200 tech demo), it's an environment like Mafia 1, which gives you a big bang for the buck (draw a cube and pretend it's a building, at worst :))
 

http://raytracey.blogspot.co.nz/2012/09/real-time-path-tracing-racing-game.html

Sam Lapere said...
Thanks. It's running on a couple of Geforce cards.
September 12, 2012 at 2:36 AM

Could be just about any model... but considering SLI, probably high end or last gen high-end. My guess would be 2 x 680 or 2 x 580.
 
So Sweeney has been wrong with his exaggerated predictions more than once, but it's all the result of a bunch of fanboys and Microsoft's big evil conspiracy?

If anything, it's those "fanboys" who provided jobs for gentlemen like Sweeney and for many others here who conveniently hide behind a garden-variety nickname.
 
I think GPUs survived solely thanks to Microsoft's Direct3D and fanboys. ... If Intel/AMD had focused on improving vector performance in their CPUs much earlier, there wouldn't be any need for video cards or IGPs.
Actually, Intel's AVX could already kill the need for an IGP, if it weren't for Direct3D.

Could you please explain to me why GPUs exist in the ARM ecosystem? No Microsoft and no DirectX there.
 
1993 - predicts photo-realistic graphics within 10 years!
2003 - predicts photo-realistic graphics within 10 years!!
2013 - predicts photo-realistic graphics within 10 years!!!

Although I have to admit that in 2013 we are pretty damn close. But I am going to bet we won't get there by 2023, simply because from 1993 to 2013 we were 100% invested in performance and quality. The money for development over the next 10 years will shift to mobile graphics, which means we are very power-constrained. Sure, graphics quality and performance will still improve, but at a much smaller rate.
 