Future console discussion and thoughts

I think rendering needs to be capable of photorealism, to enable the fantastic to appear more realistic.

I completely disagree with this statement :)
More often than not I find myself in a situation where an artist comes to me and tells me: "I don't care if this is realistic, it doesn't look good, change it".

And that's the whole point: any rendering engine must impose as few limitations as possible on the artists, and trying to be photorealistic _is_ a limitation on their creativity. Being believable is more important than being realistic, in my opinion.
If, for example, implementing some kind of global illumination makes a scene more believable and gives the artist more freedom, that's what I'll do. But that's not a goal, that's a tool.
 
That's true, but let's face it and not forget that both are needed. Racing sims, for example, will mainly pursue photorealism, but that doesn't automatically exclude a project from deliberately aiming for a cartoony look, to ensure the rich fantasy environment typical of comic books, cartoons or anime.
Other titles strongly need photorealism imo; sci-fi and horror, for example, deliver a lot more impact when photorealistic, to the point of almost making you jump back with your heart beating like a drum'n'bass track.
 
If, for example, implementing some kind of global illumination makes a scene more believable and gives the artist more freedom, that's what I'll do. But that's not a goal, that's a tool.
Taking realtime GI as an example of a desirable photorealism technique, are there any cases where realtime GI would have a negative impact on a game's looks? When would a game be better off with a flat ambient light value instead of secondary illumination? In my mind, you have stylistic renderers like Okami or cel shading, and you have realistic renderers which might be artistic in look but emulate a sort of real-world lighting. In all the games I can think of that aren't stylised, and in some of those that are, photorealistic techniques would be a bonus.
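
To make the distinction concrete, here is a toy sketch (made-up values in Python; neither function is any engine's actual lighting code) of a flat ambient term versus a GI-style gathered term:

Code:
# Toy contrast between a flat ambient term and a GI-style ambient term.
# Values are invented for illustration; this is not real engine code.

def flat_ambient(albedo, ambient=0.2):
    # Flat ambient: one constant everywhere, blind to the scene around it.
    return [c * ambient for c in albedo]

def gi_ambient(albedo, gathered_irradiance):
    # GI-style ambient: per-surface irradiance gathered from the
    # environment (e.g. a light probe), so nearby surfaces tint it.
    return [c * e for c, e in zip(albedo, gathered_irradiance)]

grey_wall = [0.5, 0.5, 0.5]
print(flat_ambient(grey_wall))                     # uniform, "sim-room" look
print(gi_ambient(grey_wall, [0.35, 0.15, 0.10]))   # warm bounce from a red floor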

This is different to real-world illumination though, to be clear on that. There are going to be cases where you don't want light to behave realistically because it harms the lighting, just like in films. Cinema throws in loads of fake lights, especially coloured fills, to make a better image. It's like how gameplay is often (always!) unrealistic, because real life has severe limits, like only being able to die once, and not being able to recover from a couple of sword wounds by eating a chicken. Things like unrealistic light attenuation etc. I can go with. I don't think people expect or mean photorealistic to be 'like real life.' I know it's not what I mean. As opposed to 'lighting that works like real life,' I think the term is more meant as 'lighting and IQ as good as a photo.' That's certainly what I think. And in that respect I can't see any negative side to pursuing that.
 
Do you really want photorealistic games? I mean, I can see photorealism 24/7; I'd like to play and feel something different, something I can't see in real life: an environment which is not real, merely believable, but which gives emotions that can't be felt in real life.

Photorealism is boring.

Of course I want photorealistic games.

I cannot afford a Ferrari Enzo, but I'd love to have the closest-to-life, most immersive experience possible while playing Forza Motorsport / Gran Turismo.

Photorealism does not (at least not in my book) mean that all games have to look exactly like real life. It's more of an overall IQ comparable to a photo or a movie. If there were a LOTR MMO that looked like the movie, I would want to play it...
 
For the next xgpu I agree with the DX11/SM6.0 estimations, on the presupposition of a late-2010 release date.
If we keep getting a new shader model every 2 years and a new DX API every 4 years, and if the Xbox3 comes out close to November 2010, then the situation will shape up rather like this:

Of course there is always the possibility that the next Xbox comes out in late 2009 instead of late 2010. In that case I think it will have a full SM 5.0 GPU with some SM 6.0 extensions.
For the next PlayStation I don't know what to think/suppose, because Sony tends to always be unpredictable :smile:

Now, speaking of photorealism, or what casual gamers perceive as p/r:
The only sure thing is that if an early SM4.0 game can come that close to p/r, as those screens demonstrate, then I'm surely pretty excited about the level of realism we can expect from the next consoles and their SM6.0 capabilities.


As for the PS3/Xbox360 CPUs, I would rather give them the next year to show me what they can achieve in their current form before I start praying for the next omg-50-core Cell CPU.
Whoa, that's what I call GRAPHICS.
 
I think it's a no-brainer the PS4 will support 4k displays out of the box, remember this: ;)
Izumi Kawanishi interview (SCEI corporate executive, software platform development division) @ AV Watch
http://www.watch.impress.co.jp/av/do...0511/rt003.htm
  • DVD upconversion and progressive conversion will be implemented
  • PS3 targets 4K x 2K video output too
MS may add it as a check-box feature later on, just like they did with 1080p, because they have no real interest in pushing high-end TV sets.

Of course this is not where mainstream games will be heading by 2011; the transition may occur even later in the life-cycle of the next-generation consoles, compared to when support for 1080p was introduced in this console generation.

Kutaragi last year presented a timeline for the development of new display resolutions (link).

Revealing a timeline for technological evolution, Kutaragi-san spoke of how over the next five years screen resolutions will take a gigantic leap from current levels through to 4k by 2k pixels running at 240 frames per second.
Which may be optimistic, but even if you add a few years, you are still at the beginning of the next generation of consoles.

People may argue that 4k resolution is heading for diminishing returns, but that is what some people kept saying about 1080p. A friend of mine who recently bought a 1080p 40W2000 Bravia is telling me a very different "wow" story about watching 1080p content, and that is just a 40" screen. The size of flat TV screens will just keep growing; screens between 50" and 100" will be common by 2011, and you can extrapolate the development another 5 years. Flat screens do not have a maximum size imposed by physical law the way CRTs did (which became too heavy, due to the thickness of the glass needed to avoid implosion). In the long run it will be the size of door openings that sets the maximum size of flat screens.

If people perceive a big improvement watching a 1080p movie compared to 720p on a high-quality 40" screen today, they will likely perceive a similar improvement when going to 4k resolution on a 60" screen and above. Besides watching 4k movies, those screens will be nice for viewing photographs; you can get cheap 8 MPixel cameras today, and by 2011 you will likely have an 8 MP camera in your mobile phone.

Of course mainstream gamers will not have 4k screens, but the same can be said about 1080p screens today, and still we see games giving 1080p output. Sony wants to provide content so people see a benefit in buying a new, larger screen; it's a chicken-and-egg situation: no content, no buyers.

Anyway, I don't think it's unrealistic to expect consoles to output 4 times more pixels within 5-6 years while at the same time improving pixel quality. And I don't think it is unrealistic to expect the same discussion as we see today about whether it's better to spend more shader time on fewer pixels or less shader time on a lot of pixels. :smile:
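
For a sense of scale, here is the back-of-the-envelope arithmetic behind those figures (reading "4k by 2k" as 3840x2160 is my assumption; the quote could equally mean 4096x2160):

Code:
# Back-of-the-envelope pixel math; treating "4k by 2k" as 3840x2160
# (four 1080p quadrants) is an assumption, not a confirmed spec.
hd_1080p = 1920 * 1080        # 2,073,600 pixels per frame
uhd_4k   = 3840 * 2160        # 8,294,400 pixels per frame

print(uhd_4k / hd_1080p)      # 4.0 -> exactly 4x the pixels of 1080p

# Throughput implied by the "4k x 2k at 240 fps" quote, versus 1080p60:
print((uhd_4k * 240) / (hd_1080p * 60))   # 16.0 -> a 16x fill-rate jump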
 
Anyway, I don't think it's unrealistic to expect consoles to output 4 times more pixels within 5-6 years while at the same time improving pixel quality. And I don't think it is unrealistic to expect the same discussion as we see today about whether it's better to spend more shader time on fewer pixels or less shader time on a lot of pixels. :smile:

There's more to the story than just whether they "can" produce 4k TV sets in 5 years. Of course they can. The problem is what people will be running through them. People are already seeing the limits of movies in 1080p, as many "older" films (pre-90s) were not filmed sharply enough to get the most out of a 1080p display and are closer to 480 (in focus) the majority of the time.

Also consider TV studios that were hesitant to invest in HD cameras for their broadcasts and have only recently made this (expensive) investment. A lot of content is still broadcast in SDTV. These things will be the biggest factors holding back any true adoption of 4k sets, along with the perceived "fulfilment" of HD resolution, with 1080p being what is advertised currently.

I think 1080p will be the standard for the majority of people over the next 10 years for these reasons.

Not to mention, if MS/Sony could stick with this resolution, they might get around to that pesky aliasing issue. :D

I'd love to see 16x AA standard on PS4/X720.
 
No photorealism. No graphics provide that; not even the best, most expensive pre-rendered CGI can do that.

Next-gen consoles will/should have:

New control/interaction interfaces beyond the control pad with two analog sticks, and beyond the Wii Remote's sensors as well; something that can recognize more gestures, maybe a VR glove with all kinds of tactile feedback and gesture recognition. But games will probably still be displayed on TVs, *not* head-mounted displays/full VR.

Microsoft and Sony will have a dozen or dozens of cores: probably a few main beefy, smart cores (full CPUs) combined with lots of dedicated, stupid-but-fast cores for floating point. Nintendo will go with a simpler CPU with probably two beefy cores, but nothing even remotely as complex or powerful as the Xbox3/PS4 CPUs.

GPUs that go one major step in rendering technology beyond shaders: better image quality, much higher complexity; graphics comparable to the pre-rendered CG intros and cutscenes in PS2/Cube/Xbox games, but nowhere near movie/film-grade, high-budget CG. Nintendo will have somewhat less powerful graphics, but still beyond the most powerful PC GPUs of 2006/2007.

Sony & Microsoft will probably go with 4 GB of RAM total, and a large amount of embedded memory (64-128 MB).

Nintendo will probably go with 1 GB of RAM and a reasonable amount of embedded memory. Graphics will not be its #1 priority, but there will have to be a huge leap up from GameCube and Wii.

I hope to God that optical discs are nowhere to be found in ANY of the next-gen consoles; I would love for everything to be solid-state and super fast, but it probably won't happen.
 
There's more to the story than just whether they "can" produce 4k TV sets in 5 years. Of course they can. The problem is what people will be running through them. People are already seeing the limits of movies in 1080p, as many "older" films (pre-90s) were not filmed sharply enough to get the most out of a 1080p display and are closer to 480 (in focus) the majority of the time.
Good point. To really get any advantage in films we need new cameras that are just about to go into production; that's why I think Kutaragi is optimistic about his timeline. But some people are happy just to upscale content anyway. :LOL:
 
People are already seeing the limits of movies in 1080p, as many "older" films (pre-90s) were not filmed sharply enough to get the most out of a 1080p display and are closer to 480 (in focus) the majority of the time.

35mm film negatives are capable of holding roughly a 3000x2000-pixel equivalent amount of detail under "good" conditions. In many circumstances the reason you see a lot of the older content looking more like 480 quality is that there may not have been a rescan of the original negative since the DVD was produced, and they're waiting for the SPECIAL EDITION when HD formats become commonplace. So the older 1080p content may be nothing more than the digital intermediate, nicely upscaled, from the time the DVD was authored. A lot of times there isn't enough $ incentive to do a quality scan of an old negative. 35mm film stocks really haven't changed much since the mid-80s.
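
For rough scale (treating the ~3000x2000 scan figure above as the given estimate; exact numbers vary by stock and exposure):

Code:
# Rough megapixel comparison using the ~3000x2000 negative-scan figure
# quoted above; all values are approximations, not exact formats.
film_scan = 3000 * 2000    # ~6.0 MP of usable detail on a good 35mm negative
hd_1080p  = 1920 * 1080    # ~2.1 MP
sd_480    = 720 * 480      # ~0.35 MP (DVD-class master)

print(film_scan / hd_1080p)   # ~2.9 -> a good negative out-resolves 1080p
print(film_scan / sd_480)     # ~17.4 -> and dwarfs a 480-line transfer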

Other reasons could be that they didn't store the negative correctly, or they under-exposed the film, or used super-high-speed film, which has less detail.
 
35mm film negatives are capable of holding roughly a 3000x2000-pixel equivalent amount of detail under "good" conditions. In many circumstances the reason you see a lot of the older content looking more like 480 quality is that there may not have been a rescan of the original negative since the DVD was produced, and they're waiting for the SPECIAL EDITION when HD formats become commonplace. So the older 1080p content may be nothing more than the digital intermediate, nicely upscaled, from the time the DVD was authored. A lot of times there isn't enough $ incentive to do a quality scan of an old negative.

Interesting. I was thinking (and it appeared to be the case) that the original films just weren't as sharply focused as the newer films, and while the older films certainly benefit from a higher-resolution format, the end result on a 1080p screen doesn't benefit as much as a new film because of the focus of the original. After renting quite a few old and new HD-DVDs, this was the one thing that really jumped out at me. Thinking about it logically, it makes sense that the cameras of today would be better than they were 15 years ago, but it's one of those things that doesn't hit you until you see the results in person.
 
For the cost, no, it would not happen.
Look at the N64's failure.

I think he meant download games only instead of optical discs - not carts.

I could see an online/download-only console in 5 years, but I also think they will have an optical version as well. Optical would probably still be the standard version, and the download version would probably cost a bit less (-$50), but I think in 5 years it will still be necessary for many people to have the option of going into a store and picking up a hard copy of a game. Those that don't need this option should be rewarded with a lower-cost console, as it enables the manufacturer to save on the drive and on packaging/shipping for their content. 10 years down the road they may be able to drop the drive altogether, but I still think they will have the option to buy perhaps an external disc drive, or to download the game at your local EB etc. and burn it to disc.
 
Interesting. I was thinking (and it appeared to be the case) that the original films just weren't as sharply focused as the newer films, and while the older films certainly benefit from a higher-resolution format, the end result on a 1080p screen doesn't benefit as much as a new film because of the focus of the original. After renting quite a few old and new HD-DVDs, this was the one thing that really jumped out at me. Thinking about it logically, it makes sense that the cameras of today would be better than they were 15 years ago, but it's one of those things that doesn't hit you until you see the results in person.

Cameras haven't changed, but lenses have. The camera portion simply grabs the film with two clamps, moves the film into the plane, exposes it for 1/24th of a second, and moves it away. I know you probably meant the lenses too, but my film-school days ingrained in me to never reference the lenses or loading magazine when referring to the camera. The lenses are where modern improvements are made; many cinematographers will bring their own lenses to sets and not think twice about the camera or loading mags. There is also the widespread adoption of the Super 35 format since the mid-90s (Super 35 allows the part of the negative which was reserved for the soundtrack on standard 35mm to be exposed, for increased image area on the negative, resulting in more detail; a quick size comparison follows below), which may also be a factor in the noticeable jump in IQ since around that time. But I still think 1080p could allow us to view a lot of older movies, especially some native 70mm films, unlike we have ever seen before.

But you have me thinking now: which is the bigger holdup with the older movies not looking as clear as newer films on HD media, poor negative transfers or the limitations of the lenses/optics of the time? Hmmm, maybe one day we can re-texturize them :)
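
As for what reclaiming the soundtrack strip buys in raw negative area, a quick calculation with commonly quoted aperture sizes (approximate figures of my own, not from the posts above):

Code:
# Approximate camera-aperture sizes in mm; commonly quoted figures,
# used here only to size the Super 35 gain mentioned above.
academy = (21.95, 16.00)   # standard 35mm (Academy) exposed area
super35 = (24.89, 18.66)   # Super 35: soundtrack strip reclaimed

area = lambda wh: wh[0] * wh[1]
print(area(super35) / area(academy))   # ~1.32 -> roughly a third more negative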
 
This is different to real-world illumination though, to be clear on that. There are going to be cases where you don't want light to behave realistically because it harms the lighting, just like in films. Cinema throws in loads of fake lights, especially coloured fills, to make a better image. It's like how gameplay is often (always!) unrealistic, because real life has severe limits, like only being able to die once, and not being able to recover from a couple of sword wounds by eating a chicken. Things like unrealistic light attenuation etc. I can go with. I don't think people expect or mean photorealistic to be 'like real life.' I know it's not what I mean. As opposed to 'lighting that works like real life,' I think the term is more meant as 'lighting and IQ as good as a photo.' That's certainly what I think. And in that respect I can't see any negative side to pursuing that.

Absolutely, that's exactly what I mean. There are many cases where I don't want light to behave realistically to achieve a certain result. Which means I don't want to be photorealistic.

An example where flat shading is better than GI? The training sim at the beginning of the latest Splinter Cell: a flat-shaded ambient, looking like a computer simulation, was the kind of look they wanted to achieve.
A racing game like PGR3, on the other hand, would require photorealism.

The negative side of pursuing photorealism when it's not needed is, put simply, less time available to write the technology the artists really require to create their vision. It obviously all depends on the game. My point is: photorealism is not the goal of a game rendering engine, it's one of the tools, and more often than not it's a tool that isn't needed.

But if, by photorealism, you mean "a believable scene" I agree with you: it's something we often want to achieve.
 
Kutaragi-san spoke of how over the next five years screen resolutions will take a gigantic leap from current levels through to 4k by 2k pixels running at 240 frames per second.

Kutaragi smokes a lot of crack too. None of that is going to happen.

He'll be lucky if, in five years, the majority of the developed world is up to 720p displays.
 
Kutaragi-san spoke of how over the next five years screen resolutions will take a gigantic leap from current levels through to 4k by 2k pixels running at 240 frames per second.

Kutaragi also spoke about how our PS3 will be rendering 1080p on TWO displays at once (twice as many pixels) @ 120fps.

I agree with Rangers; however, I think he might be smoking something stronger than crack as well.
 
Kutaragi also spoke about how our PS3 will be rendering 1080p on TWO displays at once (twice as many pixels) @ 120fps.

I agree with Rangers; however, I think he might be smoking something stronger than crack as well.

Fortunately, he never said that about games.
 