Predict: The Next Generation Console Tech

Hallelujah!
To both points. :)
Disregarding for the moment the trend towards mobile, the trend in the home is clearly towards the displays getting larger, and since rooms don't grow with them, that the angle of view covered by the screens is increasing. So resolutions need to increase to fill that view with information (and to reduce the visibility of various pixel-level artefacts).
Usage patterns determine needs, clearly, but the trends are obvious. As far as TV resolutions go, we'll probably be stable at 1080p for a fair amount of time - my cineast friends are already complaining though, and want digital cinema distribution quality.
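
To put some rough numbers on that (my own back-of-the-envelope sketch, assuming a 16:9 panel and the usual one-arcminute acuity figure; nothing authoritative):

    // Back-of-the-envelope: at what viewing distance does one pixel shrink to
    // about one arcminute (roughly the limit of 20/20 acuity)? Screen sizes and
    // the one-arcminute figure are illustrative assumptions only.
    #include <cstdio>
    #include <cmath>

    int main() {
        const double pi = 3.14159265358979;
        const double arcmin = (1.0 / 60.0) * pi / 180.0;   // one arcminute, in radians
        const double diagonals[] = { 42.0, 50.0, 60.0 };   // screen diagonal, inches
        const int widths[] = { 1280, 1920 };               // 720p and 1080p horizontal pixels

        for (double diag : diagonals) {
            // width of a 16:9 panel = diagonal * 16 / sqrt(16^2 + 9^2)
            double width_m = diag * 0.0254 * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
            for (int w : widths) {
                double pixel_m = width_m / w;
                double dist_m = pixel_m / std::tan(arcmin);  // where 1 pixel ~ 1 arcminute
                std::printf("%.0f\" panel, %d px wide: individual pixels blend together beyond ~%.1f m\n",
                            diag, w, dist_m);
            }
        }
        return 0;
    }

A 50" 1080p panel comes out at roughly 2 m, which is why couch distance matters so much in this argument.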

Yeah, I was quite surprised to see 42 inch 1024 x 768 plasmas commanding a price similar to a reasonable 42 (or higher) inch LCD. I was even more surprised to see how good the lower res plasmas looked, at least up until you got pretty close.

I prefer the look of plasmas, but in front of TVs I have a tendency to creep towards the screen as I play, almost ending up pressing my nose against the screen, drooling. I like to be closer than for tv and movies anyway. If next gen console games target 1920 x 1080 it would bother me knowing I was missing out.

It would be interesting to know what range of sizes and resolutions of screen gamers are playing on now, and seeing how that ties in with HDTV sales (how gaming lags behind sales). MS and Sony seem to want people to keep their consoles in their front room under their main telly (Blu Ray, HD movie streaming, Kinect needing lots of space, media centre extender etc) and as you say that means bigger and bigger screens, at least for the moment.
 
Let's say I completely disagree with the "we need more pixels" mantra.
The target should be to get rid of artifacts/aliasing, even if it takes away a bit of sharpness (that can be bothersome this gen, as texture quality and filtering are already low to begin with, but looking forward the trade-off should have less impact).
Computing power and extra memory should be directed to world simulation/physics/interactions, not only to frame buffers/render targets or higher quality textures (even though there is clearly a need for improvement there, on top of better filtering).
 
Let's say I completely disagree with the "we need more pixels" mantra.
The target should be to get rid of artifacts/aliasing, even if it takes away a bit of sharpness (that can be bothersome this gen, as texture quality and filtering are already low to begin with, but looking forward the trade-off should have less impact).
Computing power and extra memory should be directed to world simulation/physics/interactions, not only to frame buffers/render targets or higher quality textures (even though there is clearly a need for improvement there, on top of better filtering).
I agree with you about better texture filtering, shaders, etc. Maybe I misinterpreted the previous posts, but do you think, or want, next-generation console games to target up to 1080p at 60 fps native (not upscaled), or 720p 3D with 4x MSAA (for sub-pixels, plus FXAA for edges)?

Because if we keep only the "better pixels rather than just more pixels" concept (in contrast to the old PC-like approach), we will be stuck at 720p (which clearly should already have been the default resolution of the current generation...) and we will be even further from what we see on high-end PCs.

I'm not in any way wishing for or claiming that the next-gen consoles should have the same performance as 600-watt "high end" PCs, but at least the next consoles should not become extremely outdated and then have to stand still for five years....
 
Because if we keep only the "better pixels rather than just more pixels" concept (in contrast to the old PC-like approach), we will be stuck at 720p (which clearly should already have been the default resolution of the current generation...) and we will be even further from what we see on high-end PCs.

I think most people would be very happy with a clean 720p image sporting "better" pixels instead of more pixels (I know I would). 90% of the gaming populace most likely thinks the 360 is rendering at 1080p because that's what the dashboard says anyway.

I play on both PC and consoles. RE5 on my PS3 is struggling to maintain a steady 30 fps. On my PC I should get around 170 fps at 1080p (I didn't test it myself, but that's what all the benchmarks claim) with the very high image preset. That's an enormous difference. Still, unless I pretty much crawl into my TV screen, the perceivable visual impact is almost irrelevant (and I have a pretty big TV).
 
I think most people would be very happy with a clean 720p image sporting "better" pixels instead of more pixels (I know I would). 90% of the gaming populace most likely thinks the 360 is rendering at 1080p because that's what the dashboard says anyway.

I play on both PC and consoles. RE5 on my PS3 is struggling to maintain a steady 30 fps. On my PC I should get around 170 fps at 1080p (I didn't test it myself, but that's what all the benchmarks claim) with the very high image preset. That's an enormous difference. Still, unless I pretty much crawl into my TV screen, the perceivable visual impact is almost irrelevant (and I have a pretty big TV).


This is a really good point, and the vast majority don't notice the difference between 720p and 1080p, but maybe games with higher resolution textures (2048x2048 at least) and better shaders at native 1080p would have more impact.

In cases like RE5 at native 720p, the result on full-HD/1080p panels may not come across as more than a mere upscale, providing only a little extra sense of image quality for many.

I don't have large televisions (for me that means 58" and up), but I do have two 50" plasmas, one 768p and the other 1080p, plus a 40" 1080p LED LCD, and at the right distance/placement the difference in detail (depth) between 720p and 1080p in games and movies becomes quite noticeable.
 
What looks better, a DVD or 720p YouTube? :)

Having owned a Voodoo5, I had my fair share of fantastic-looking 800x600 gaming.
It did blur, but there is useful information in that blur (i.e. a visible power line, or more detail on a far-away Counter-Strike opponent), and an almost total lack of noise in the image.

Tweak the texture LOD and your textures are sharp, as if you're running high-end AF.
I would like similar quality in a modern game: MSAA for polygon edges, adaptive supersampling where there are old-style fences, power lines, bushes and leaves, and built-in supersampling in pixel shaders (just as shadows may be drawn using 4x or 16x samples).
Draw text fonts and HUDs at 1080p over the underlying 720p 3D rendering.

Of course, we're well into developer-responsibility territory here, and it's probable that they would be given a choice of which rendering resolution to use.
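
For illustration, a minimal OpenGL-flavoured sketch of that last idea, rendering the scene at 720p and compositing the HUD at native 1080p. sceneFbo, drawScene() and drawHud() are hypothetical placeholders, and context/extension-loader setup is omitted; it's only meant to show where the resolution split happens:

    // Sketch only: scene at 1280x720, upscale blit, HUD drawn at native 1920x1080.
    // Assumes a GL 3.0+ context and a colour-renderable FBO already created.
    // (Use a proper GL loader header; <GL/gl.h> alone won't declare GL3 entry
    // points on every platform.)
    #include <GL/gl.h>

    void drawScene();   // hypothetical: all the expensive 3D rendering
    void drawHud();     // hypothetical: fonts and HUD elements

    void renderFrame(GLuint sceneFbo) {
        // 1. Render the 3D scene into the 1280x720 off-screen framebuffer.
        glBindFramebuffer(GL_FRAMEBUFFER, sceneFbo);
        glViewport(0, 0, 1280, 720);
        drawScene();

        // 2. Upscale the 720p image to the 1920x1080 back buffer.
        glBindFramebuffer(GL_READ_FRAMEBUFFER, sceneFbo);
        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
        glBlitFramebuffer(0, 0, 1280, 720,       // source rectangle
                          0, 0, 1920, 1080,      // destination rectangle
                          GL_COLOR_BUFFER_BIT, GL_LINEAR);

        // 3. Draw text and HUD directly at native 1080p on top of the upscaled scene.
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glViewport(0, 0, 1920, 1080);
        drawHud();
    }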
 
Yeah, I was quite surprised to see 42 inch 1024 x 768 plasmas commanding a price similar to a reasonable 42 (or higher) inch LCD. I was even more surprised to see how good the lower res plasmas looked, at least up until you got pretty close.

I prefer the look of plasmas, but in front of TVs I have a tendency to creep towards the screen as I play, almost ending up pressing my nose against the screen, drooling. I like to be closer than for tv and movies anyway. If next gen console games target 1920 x 1080 it would bother me knowing I was missing out.

It would seem to me you are reacting to ANSI contrast ratio more than resolution. Plasma TVs tend to have a much higher ANSI contrast ratio than LCD TVs. When Samsung introduces OLED displays, the ANSI contrast ratio will be off the chart compared to LCD and plasma.


I was listening to NPR and they were having a discussion about stereoscopic vision. If I understood it correctly, they were saying the research shows that artists tend to have a dominant eye compared with the general population. So an artist may not like stereoscopic 3D images as much as the general population does. This left me wondering whether different stereoscopic frame-rate ratios should perhaps be introduced, something like 60 frames for the right eye and 30 for the left, or vice versa. There seems to be a lot of research about this going on right now.

Anyway, the bottom line is that a 1080p OLED display showing a game running at 60 fps at native 1080p is going to be the greatest leap in graphics quality in the history of video games for the average consumer, when combined with a modern GPU.

720p with additional filtering/shading doesn't make any sense.
 
I agree with you about better texture filtering, shaders, etc. Maybe I misinterpreted the previous posts, but do you think, or want, next-generation console games to target up to 1080p at 60 fps native (not upscaled), or 720p 3D with 4x MSAA (for sub-pixels, plus FXAA for edges)?
Assuming power and thermal constraints, I would choose the latter. Upscalers can do a really good job if they start from a "clean" picture.
Because if we keep only the "better pixels rather than just more pixels" concept (in contrast to the old PC-like approach), we will be stuck at 720p (which clearly should already have been the default resolution of the current generation...) and we will be even further from what we see on high-end PCs.
I don't think the main difference between high-end PCs and consoles is resolution; look at the FB2 engine or Crysis, there is more lacking than just a +100% pixel count.
I'm not in any way wishing for or claiming that the next-gen consoles should have the same performance as 600-watt "high end" PCs, but at least the next consoles should not become extremely outdated and then have to stand still for five years....
It could be like this gen, but actually better: the difference could still be resolution, but as long as the image is clean, the games receive the same assets, and the lighting quality, shadowing quality, LOD management, physics, AI, etc. are the same, it will be good enough for consoles.
If some game developers decide that they are better off with 1080p, they should be free to do so (they did in Sacred 2, for example), but I think that overall a trade-off on resolution is a good call as long as you get rid of enough artefacts. I would not call that being "stuck" at 720p, as I'm for complete freedom for the studio to make the call; 720p is an arbitrary format (so is 1080p), and they could choose anything between those two figures, as well as different form factors for their render targets and frame buffers (i.e. not 16:9), if they think it gives better results.

Realtime 3D rendering is about tricks, and in this regard I don't expect much change: console developers will have to rely on a bigger bag of tricks than PC developers. To help with this, I would want manufacturers to go with more compute-oriented GPUs than what will be available in the PC space by their launch.
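
On the upscaler point above, a minimal sketch (my own CPU-side illustration, assuming 32-bit RGBA pixels; real console scalers and shader-based upscalers use fancier kernels, but the principle is the same):

    // Minimal CPU bilinear upscale, just to illustrate the "720p in, 1080p out"
    // step that a console's scaler (or a shader pass) performs. Real scalers use
    // better kernels (multi-tap, Lanczos-style, edge-aware); this is only a sketch.
    #include <cstdint>
    #include <vector>
    #include <algorithm>

    std::vector<uint32_t> upscaleBilinear(const std::vector<uint32_t>& src,
                                          int sw, int sh, int dw, int dh) {
        std::vector<uint32_t> dst(static_cast<size_t>(dw) * dh);
        for (int y = 0; y < dh; ++y) {
            float fy = std::max(0.0f, (y + 0.5f) * sh / dh - 0.5f);
            int y0 = static_cast<int>(fy);
            int y1 = std::min(sh - 1, y0 + 1);
            float wy = fy - y0;
            for (int x = 0; x < dw; ++x) {
                float fx = std::max(0.0f, (x + 0.5f) * sw / dw - 0.5f);
                int x0 = static_cast<int>(fx);
                int x1 = std::min(sw - 1, x0 + 1);
                float wx = fx - x0;
                uint32_t out = 0;
                for (int c = 0; c < 4; ++c) {   // blend each 8-bit RGBA channel
                    auto ch = [&](int xx, int yy) {
                        return static_cast<float>((src[static_cast<size_t>(yy) * sw + xx] >> (8 * c)) & 0xFF);
                    };
                    float top = ch(x0, y0) * (1 - wx) + ch(x1, y0) * wx;
                    float bot = ch(x0, y1) * (1 - wx) + ch(x1, y1) * wx;
                    out |= static_cast<uint32_t>(top * (1 - wy) + bot * wy + 0.5f) << (8 * c);
                }
                dst[static_cast<size_t>(y) * dw + x] = out;
            }
        }
        return dst;
    }
    // usage: auto hd = upscaleBilinear(frame720, 1280, 720, 1920, 1080);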
 
Assuming power and thermal constraints, I would choose the latter. Upscalers can do a really good job if they start from a "clean" picture.
I don't think the main difference between high-end PCs and consoles is resolution; look at the FB2 engine or Crysis, there is more lacking than just a +100% pixel count.
It could be like this gen, but actually better: the difference could still be resolution, but as long as the image is clean, the games receive the same assets, and the lighting quality, shadowing quality, LOD management, physics, AI, etc. are the same, it will be good enough for consoles.
If some game developers decide that they are better off with 1080p, they should be free to do so (they did in Sacred 2, for example), but I think that overall a trade-off on resolution is a good call as long as you get rid of enough artefacts. I would not call that being "stuck" at 720p, as I'm for complete freedom for the studio to make the call; 720p is an arbitrary format (so is 1080p), and they could choose anything between those two figures, as well as different form factors for their render targets and frame buffers (i.e. not 16:9), if they think it gives better results.

Realtime 3D rendering is about tricks, and in this regard I don't expect much change: console developers will have to rely on a bigger bag of tricks than PC developers. To help with this, I would want manufacturers to go with more compute-oriented GPUs than what will be available in the PC space by their launch.

In essence we agree on almost everything... but as I said in an earlier post, there should be a balance between the two paradigms ("better pixels rather than more pixels" vs. "more pixels for better image quality"). I would also emphasise that ever since the PS1/PSX 3D era started around 1994, each generation has brought an increase of at least roughly three times the pixels on screen (PS1 = 256x224 to PS2 = 640x448 in-game), and that was on top of the other advances between generations (from no native shaders on the PS2 GPU/rasteriser to Shader Model 3.0+ on PS3, and from the Xbox's roughly SM 1.3-level shaders to the 360's 3.0+). If those additional pixels on screen don't come next generation, many may perceive a lack of progress.
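
For reference, the raw pixel counts behind that scaling argument (my own quick arithmetic, using typical in-game resolutions):

    // Quick arithmetic on typical in-game resolutions per generation,
    // to show the "roughly 3x more pixels each time" scaling mentioned above.
    #include <cstdio>

    int main() {
        struct Gen { const char* name; int w, h; } gens[] = {
            { "PS1 era",  256,  224 },   //  ~57k pixels
            { "PS2 era",  640,  448 },   // ~287k pixels (~5.0x)
            { "PS3/360",  1280, 720 },   // ~922k pixels (~3.2x)
            { "1080p",    1920, 1080 },  // ~2.07M pixels (only ~2.25x over 720p)
        };
        int prev = 0;
        for (const Gen& g : gens) {
            int px = g.w * g.h;
            std::printf("%-8s %4dx%-4d = %7d pixels", g.name, g.w, g.h, px);
            if (prev) std::printf("  (%.2fx previous)", static_cast<double>(px) / prev);
            std::printf("\n");
            prev = px;
        }
        return 0;
    }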

I think that despite the TDP/watt restrictions on the next-gen consoles, it would be quite possible for next-gen games to target 1080p with 2x MSAA, or native 720p 3D with 4x MSAA, because there is already hardware today* more than capable of delivering high quality. I think we need software to extract that power: there has been much talk about GPUs being under-utilised (see the Radeon HD 5870, launched nearly two years ago and still among the best on the market, and also note that several games are playable at 2560x1600). So even today I have the impression we have hardware powerful enough, with enough headroom, to show 1080p without great sacrifices, and game engines such as FB2 (a 158 MB frame buffer for 1080p) and others may work very well in a closed box (consoles generally don't have the API limitations of PCs and aren't general-purpose... yet).
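
To give a feel for why 1080p render targets eat memory so quickly, some rough arithmetic (an illustrative deferred-style target list of my own, not the actual FB2 layout):

    // Rough render-target memory arithmetic at 1080p. The target list below is an
    // illustrative deferred-style setup, not any specific engine's real layout.
    #include <cstdio>

    int main() {
        const double w = 1920, h = 1080, MB = 1024.0 * 1024.0;
        struct Target { const char* name; int bytesPerPixel; int samples; } targets[] = {
            { "depth/stencil (D24S8), 4x MSAA", 4, 4 },
            { "G-buffer RGBA8 x4, 4x MSAA",    16, 4 },
            { "HDR light buffer (RGBA16F)",     8, 1 },
            { "post/LDR buffer (RGBA8)",        4, 1 },
        };
        double total = 0;
        for (const Target& t : targets) {
            double mb = w * h * t.bytesPerPixel * t.samples / MB;
            std::printf("%-34s %6.1f MB\n", t.name, mb);
            total += mb;
        }
        std::printf("%-34s %6.1f MB\n", "total", total);
        return 0;
    }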

Note also that 1080p panels are already the majority of flat panels sold around the world (when the PS3/360 were released, the market was "HD Ready" CRTs and 768p LCDs/plasmas), and games would be better off rendering natively at 1080p than with the best upscaling that hardware or software can offer.

*
I have searched for some information and found that at less than 100 watts (45/40 nm) we already have hardware (notebook CPUs and GPUs) powerful enough to deliver at least 40-50% of what we see in "high end" PCs (processing power, bandwidth, pixel rate, texel rate, etc.).

CPU: AMD Phenom II Quad-Core Mobile N950, 45 nm, 2.1 GHz: 35 watts... (imagine this in a Krishna APU at 3 GHz on 32/28 nm, without the FPU and with the PC-specific bits customised away... less than 25 watts?)

http://www.cpu-world.com/CPUs/K10/AMD-Phenom II Quad-Core Mobile N950 - HMN950DCR42GM.html

GPU-wise... see the Radeon HD 6950M ("Barts", like the Radeon HD 6850) at 40 nm and 50 watts (hypothetically... imagine this in a Krishna APU at 550/600 MHz on 28 nm... 30 watts?)

http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units

Final note: I have the impression (the Xbox 360 S with its combined CPU/GPU and the less problematic PS3 revisions are both at sub-100 wattage) that the next-gen consoles will come in at 100 watts at most... manufacturers will not take risks on new species of three-red-light failures, YLODs, E74s, etc. again.
 
I think you are capping them far too low. I expect 150W ish.

Hopefully you're right; 150 watts is a good number to start with, and shrinks (and better fans) could bring it much lower within 18 months (from 32/28 nm to 22/20 nm). But unfortunately I see signs, the Wii phenomenon, billions of US dollars in repair costs, multi-year warranties (even the PS3 had hardware failure rates around 10%, above the ~5% average for electronics), stronger competition, and partners and stock-market shareholders demanding profits at every turn, that make me think we will not see a performance jump like before (15 times or more), nor TDP/wattage as high as the PS3 (the launch 60 GB model reached 190-210 watts) and the 360 (170-190 watts).
 
It would seem to me you are reacting to ANSI contrast ratio more than resolution. Plasma TVs tend to have a much higher ANSI contrast ratio than LCD TVs. When Samsung introduces OLED displays, the ANSI contrast ratio will be off the chart compared to LCD and plasma.

It's not like I think lower resolutions make things look better, just that until the extra detail becomes noticeable (because I'm close enough), the improved quality of the plasma image overrides the detail. I was excited about OLED screens, but it feels like that was about 50 years ago; the damn things are worse than Bulldozer for actually turning up! :(

I was listening to NPR and they were having a discussion about stereoscopic vision. If I understood it correctly, they were saying the research shows that artists tend to have a dominant eye compared with the general population. So an artist may not like stereoscopic 3D images as much as the general population does. This left me wondering whether different stereoscopic frame-rate ratios should perhaps be introduced, something like 60 frames for the right eye and 30 for the left, or vice versa. There seems to be a lot of research about this going on right now.

Couldn't this lead to a perceived judder of moving scenes between the eyes? The idea of, err ... temporal inequality (?) between my eyes makes me feel a bit uneasy, especially if it's on a huge screen taking up most of my fov.

Anyway, the bottom line is that a 1080p OLED display showing a game running at 60 fps at native 1080p is going to be the greatest leap in graphics quality in the history of video games for the average consumer, when combined with a modern GPU.

720p with additional filtering/shading doesn't make any sense.

Where frame rate is key, like in the 60 fps CoD games, I think it could still make sense to go for more frames instead of higher-res frames. This doesn't have to mean a ? x 720 frame buffer (same as it doesn't this generation), but I expect to see some non-native-1920x1080 games out there next gen.
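
In raw pixel-throughput terms the trade-off is roughly a wash, which is part of the appeal (quick arithmetic of mine, ignoring per-pixel cost differences, MSAA, post-processing, etc.):

    // Pixels shaded per second for the trade-offs discussed above:
    // 720p at 60 fps versus 1080p at 30 or 60 fps (raw pixel count only).
    #include <cstdio>

    int main() {
        const double p720_60  = 1280.0 * 720  * 60;  // ~55.3 Mpixels/s
        const double p1080_30 = 1920.0 * 1080 * 30;  // ~62.2 Mpixels/s
        const double p1080_60 = 1920.0 * 1080 * 60;  // ~124.4 Mpixels/s
        std::printf(" 720p @ 60: %5.1f Mpix/s\n", p720_60  / 1e6);
        std::printf("1080p @ 30: %5.1f Mpix/s\n", p1080_30 / 1e6);
        std::printf("1080p @ 60: %5.1f Mpix/s\n", p1080_60 / 1e6);
        return 0;
    }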
 
How 'bout LightPeak like the one on Sony's latest Z laptop ?
http://techreport.com/discussions.x/21204

The parts are expensive now but price will drop if adoption picks up.

More interesting still is the Power Media Dock, which hooks up to the Vaio Z via a dual-use port meant to accommodate both USB 3.0 and optical Light Peak connections. (Yes, Sony specifically says "optical," so this is altogether different from the Thunderbolt ports Apple has pioneered.) The Power Media Dock includes a DVD or Blu-ray drive, a Radeon HD 6650M graphics processor with 1GB of DDR3 RAM, and additional connectivity, including support for up to three auxiliary displays.

Kutaragi mumbled something about optical interconnect in the early daze.

EDIT: Need to confirm whether it's really optical or Thunderbolt (copper-based). In any case, it hooks up to an external GPU. ^_^
May not be as fast as on-board interconnect but hey, it could be interesting.
 
How 'bout LightPeak like the one on Sony's latest Z laptop ?
http://techreport.com/discussions.x/21204

The parts are expensive now but price will drop if adoption picks up.



Kutaragi mumbled something about optical interconnect in the early daze.

EDIT: Need to confirm whether it's really optical or Thunderbolt (copper-based). In any case, it hooks up to an external GPU. ^_^
May not be as fast as on-board interconnect but hey, it could be interesting.

I don't believe adoption will be high; basically it is only useful for media professionals, really high-end media professionals, and even then probably just for very high-end video projects.

I really don't see anyone else needing more than USB 3.0 (or even that much), much less optical Thunderbolt, anytime soon.

This will be the FireWire of the future: just for a few people and a few jobs (even if important ones)...
 
It's in MacBook Pro and Vaio Z now. *If* the technology allows people to upgrade their laptops (e.g., GPU + memory), then it may find more use beyond the high end professional market. Let's see.
 
It's in MacBook Pro and Vaio Z now. *If* the technology allows people to upgrade their laptops (e.g., GPU + memory), then it may find more use beyond the high end professional market. Let's see.


Maybe, but the tendency seems to be going the other way, namely integrated graphics that gives good enough performance.

It's wait and see; personally I wouldn't bet on it being of interest to most people anytime soon, and even USB 3.0 will be way more than enough for the time being.
 
The port is used to drive monitors, external drives, expanded memory and discrete GPU.

Engadget also mentions optical cable for Vaio Z's LightPeak port:
http://www.engadget.com/2011/06/28/sonys-new-vaio-z-ultraportable-laptop-with-power-media-dock-han/

As with most ultraportables, the VAIO Z only packs a grand total of two USB ports, and only one of them is USB 3.0-compatible. But here's a surprise: the latter port is also where Light Peak is implemented: the fiber optic cables feed data to and from the media dock, which we will touch on later.

...

Was told Apple sells assorted adaptors (HDMI, DVI, etc.) for its Thunderbolt port.
 
If I'm not mistaken, Lightpeak was originally conceived as an optically based interconnect (hence the original moniker). Apple's version, "Thunderbolt", is essentially a scaled back, electrical version of the standard.
 
It's in MacBook Pro and Vaio Z now. *If* the technology allows people to upgrade their laptops (e.g., GPU + memory), then it may find more use beyond the high end professional market. Let's see.

What's the maximum bandwidth for lightpeak?


If it's anything like XGP, it won't be widely used at all.

I'm still waiting for Acer to release the promised external graphics card for my 2 year-old Ferrari One (meaning: it's never coming out) -> even though AMD has shown my UMPC running with a Juniper + 3 monitors in several tech shows.

(Damn you AMD and Acer, you really let me down with this one..)
 
If I'm not mistaken, Lightpeak was originally conceived as an optically based interconnect (hence the original moniker). Apple's version, "Thunderbolt", is essentially a scaled back, electrical version of the standard.

Yap. Apparently, Sony stuck to the original spec.

http://en.wikipedia.org/wiki/Thunderbolt_(interface)

The interface was originally intended to run on an optical physical layer using components and flexible optical fiber cabling developed by Intel partners and at Intel's Silicon Photonics lab. The Intel technology at the time was marketed under the name Light Peak,[6] today (2011) referred to as Silicon Photonics Link.[7] However, conventional copper wiring turned out to be able to furnish the desired 10 Gb/s Thunderbolt bandwidth at lower cost. Later versions of Thunderbolt are still planned to introduce an optical physical layer based on Intel silicon photonics technology.




What's the maximum bandwidth for light peak?

10Gbps for Thunderbolt.
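
For the external-GPU angle, that 10 Gb/s figure works out to only a couple of PCIe 2.0 lanes' worth of bandwidth (my own rough conversion, assuming ~500 MB/s per PCIe 2.0 lane after 8b/10b encoding):

    // Rough conversion of the quoted 10 Gb/s figure into GB/s, compared with
    // PCIe 2.0 lane bandwidth (~500 MB/s per lane after 8b/10b encoding).
    #include <cstdio>

    int main() {
        const double tb_GBs     = 10.0 / 8.0;        // 10 Gb/s ~= 1.25 GB/s per direction
        const double pcie2_lane = 0.5;               // GB/s per PCIe 2.0 lane
        const double pcie2_x16  = 16 * pcie2_lane;   // ~8 GB/s, a typical desktop GPU slot
        std::printf("Thunderbolt/Light Peak: ~%.2f GB/s (~%.1f PCIe 2.0 lanes)\n",
                    tb_GBs, tb_GBs / pcie2_lane);
        std::printf("PCIe 2.0 x16 slot:      ~%.1f GB/s\n", pcie2_x16);
        return 0;
    }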

If it's anything like XGP, it won't be widely used at all.

I'm still waiting for Acer to release the promised external graphics card for my 2 year-old Ferrari One (meaning: it's never coming out) -> even though AMD has shown my UMPC running with a Juniper + 3 monitors in several tech shows.

(Damn you AMD and Acer, you really let me down with this one..)

XGP is GPU-vendor-specific? LightPeak is multi-vendor and can subsume all connected devices (HDDs, monitors, GPUs, memory, another CPU, etc.).


I don't really know where this will swing. It's a possible evolution path for consoles and PCs. As you can see from MacBook Pro and Vaio Z, it can be done. ^_^
 