Predict: The Next Generation Console Tech

Discussion in 'Console Technology' started by Acert93, Jun 12, 2006.

Thread Status:
Not open for further replies.
  1. Laa-Yosh

    Laa-Yosh I can has custom title?
    Legend Subscriber

    Joined:
    Feb 12, 2002
    Messages:
    9,568
    Likes Received:
    1,455
    Location:
    Budapest, Hungary
    60fps is far less likely to come, but 1080p does not sound unreasonable.
    People just don't care about high refresh rates anymore. It's the COD games, some racers and GOW that run higher than 30fps, and even some of those may choose to abandon this feature in the future.

    1080p on the other hand may help minimize aliasing artifacts of all sorts, and with increasing poly counts it'd probably help a bit with GPU efficiency as well (smaller triangles mean more wasted cycles). It's also easier to implement than 60fps, IMHO.
     
  2. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
    I don't understand this line of thinking. You're effectively asking for devs to stick to 2011 visuals for a ~2013-2020 console.

    And as far as "5770 being enough for 1080p", well... even that's dubious considering DX11 and more advanced shaders.
     
  3. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    Except hardware isn't particularly designed around a certain resolution and AA, and even if the hardware tries to do that (XB360 designed for 720p with tiling and 2x or 4x MSAA), the devs can ignore that design to do other things. The engineers can only provide devs with a balanced set of resources, and the devs choose how to use them: whether to go with more AA and lower resolution, or more framerate and less resolution, or more resolution and less of something else. Hence another thread to discuss resolution, while engineers instead look to make economical, efficient choices in CPU, GPU, RAM and BW to hand over to the developers to do whatever they will with.

    Personal hopes belong in the other thread.

    Right. And then whatever hardware they pick to power your 1080p60 games, devs will use to make 720p30 games that look better in terms of pixel quality. This really is a subject for the other thread, because resolution and framerate aren't dictated by the hardware but by the software choices. ;)
     
  4. Tea2

    Newcomer

    Joined:
    Feb 25, 2010
    Messages:
    87
    Likes Received:
    0
    Any chance we will see motion interpolation tech implemented in the next-gen consoles, on a hardware level (for "pseudo 60fps")? It's widely implemented in HDTVs nowadays, and I imagine the interpolation should be getting better. Or is it a no-go for some reason?
     
  5. rekator

    Regular

    Joined:
    Dec 21, 2006
    Messages:
    793
    Likes Received:
    30
    Location:
    France
    60fps in games is generally implemented to reduce input latency. Motion interpolation tech in HDTVs is meant to simulate a more fluid picture, a totally different goal.
     
  6. tunafish

    Regular

    Joined:
    Aug 19, 2011
    Messages:
    627
    Likes Received:
    414
    rekator got it right, but to elaborate more: motion interpolation always introduces more latency in the output path. For media consumption, this is a complete non-issue -- just remember to delay the audio by the same amount, and the viewer will be none the wiser. For games, this is very noticeable and annoying.
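    The latency point above can be illustrated with a back-of-the-envelope calculation: to build an in-between frame, the TV must first buffer the *next* real frame, so the output lags by at least one source-frame interval plus whatever processing time the interpolator needs. A minimal sketch (the frame rates and processing delay here are illustrative assumptions, not measured figures):

    ```python
    # Minimum latency added by motion interpolation: the interpolator
    # cannot emit a frame between N and N+1 until frame N+1 has arrived,
    # so the output is delayed by one source-frame interval + processing.

    def interpolation_latency_ms(source_fps: float, processing_ms: float = 0.0) -> float:
        """Lower bound on added latency, in milliseconds."""
        frame_interval_ms = 1000.0 / source_fps
        return frame_interval_ms + processing_ms

    # A 30 fps game interpolated to "pseudo 60 fps" adds at least ~33 ms
    # of input lag before any processing time is counted:
    added = interpolation_latency_ms(30.0)
    print(f"Added latency at 30 fps source: {added:.1f} ms")
    ```

    Which is why, as the post says, it is harmless for video (just delay the audio to match) but very noticeable on a controller.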
     
  7. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
    Well, when art is excessively expensive it does push the cost/benefit calculation the other way, towards supporting a faster frame time: it would probably cost less to implement, and the difference between 60Hz and 30Hz games wouldn't be as pronounced as a console marches up the diminishing-returns curve for graphics fidelity.

    That said, since 3D is probably a given, I'd propose that faster memory would be preferable to more memory, especially if fast flash is implemented; in other words, extra memory is out, and fast flash plus fast memory is in. I don't think it's going to be a 60fps/1080p tradeoff, they can do both, and anything that improves the overall user experience is a good thing. As Apple has proved, many of these supposedly 'intangible' hardware qualities are actually tangible to the end user.

    Edit: Ooops I think I just caused an infraction. I was reading back on the thread and I missed the mod comments.
    *gets down on knees and begs for forgiveness*

    Anyway, I remember a great piece from Ars about console streaming and how the PS2 was more efficient than the PC platform simply because of how it operated. Let us not forget this in our pursuit of console perfection: as we look towards the PC for guidance on what the next-generation consoles should be, we should remember it isn't the most PC-like console which is best, but the one which gives the most performance per dollar in a practical sense. So we shouldn't overlook the value of streaming speed over simply having large dumb caches which are used inefficiently. Once streaming is improved, the memory required for the same level of performance isn't as large, so at that point more efficient and faster memory solutions will be a better fit for a console's main use scenarios, which are games and media.
     
    #7667 Squilliam, Sep 11, 2011
    Last edited by a moderator: Sep 11, 2011
  8. GuestLV

    Regular

    Joined:
    Aug 24, 2007
    Messages:
    259
    Likes Received:
    7
  9. archangelmorph

    Veteran

    Joined:
    Jun 19, 2006
    Messages:
    1,551
    Likes Received:
    12
    Location:
    London
  10. GuestLV

    Regular

    Joined:
    Aug 24, 2007
    Messages:
    259
    Likes Received:
    7
    They need more SGX for Vita? Why?
     
  11. TheWretched

    Regular

    Joined:
    Oct 7, 2008
    Messages:
    830
    Likes Received:
    23
    I'd say cellphones. PS4 is a possibility too, though far-fetched at that.
     
  12. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    Or even TVs and players. Would be good to offer set-top boxes with PSS games and SonyNET media for added value, and unified hardware would help there.
     
  13. Rolf N

    Rolf N Recurring Membmare
    Veteran

    Joined:
    Aug 18, 2003
    Messages:
    2,494
    Likes Received:
    55
    Location:
    yes
    Even so, the 5770 is a 100W part. With probably only one full-node process shrink before the next console generation, that may be the best we can expect to see.
     
  14. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    596
    Likes Received:
    9
    Location:
    Rio de Janeiro,Brazil
    I have admired all of your posts (and Alstrong's too).

    Pardon my intrusion; I agree with all of you, but there have been much better AMD GPUs than the 5770 since July, namely the 6990M at 100 watts with 1120 stream processors, and the 7870 with 1536 SIMD/stream processors at 120 watts is still coming on the 28nm shrink by the end of this year. Perhaps some future customised "7870M" (with pipes more oriented to console purposes) may be possible for the next-gen consoles coming in 2013/2014.

    http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units

    7870 at 28nm with almost the same specs as the 6970, at only 120 watts.

    http://www.nordichardware.com/news/...-new-architecture-and-xdr2-rambus-memory.html
     
    #7674 Heinrich4, Sep 12, 2011
    Last edited by a moderator: Sep 12, 2011
  15. Proelite

    Veteran Subscriber

    Joined:
    Jul 3, 2006
    Messages:
    1,620
    Likes Received:
    1,106
    Location:
    Redmond
    I'll attempt to predict the next XGPU, assuming that the die space dedicated to the GPU is close to Xenos, that it is built on a 22nm fab process, and that it must draw less than 150 watts at peak.

    Looking back at AMD GPUs with die sizes between 250 mm2 and 260 mm2, the candidates are:

    10/1/2005   X1800  254 mm2   90nm  321M transistors   500 MHz  256 MB GDDR3   ~140 W  83 GFLOPS
    11/16/2005  Xenos  ~260 mm2  90nm  337M transistors   500 MHz  512 MB GDDR3   125? W  240 GFLOPS
    6/25/2008   4850   256 mm2   55nm  956M transistors   575 MHz  512 MB GDDR5   110 W   1000 GFLOPS
    10/22/2010  6850   256 mm2   40nm  1700M transistors  775 MHz  1024 MB GDDR5  127 W   1488 GFLOPS

    A rough extrapolation based on extremely dubious mathematics, namely squaring the ratio between fab processes, suggests the next XGPU looks like:

    Fall 2013  Next XGPU  ~260 mm2  22nm  ~5000M transistors  <1000 MHz  <=4096 MB GDDR5/XDR2  <150 W  ~4000 GFLOPS

    I am by no means an expert at this stuff, so any help with this crystal balling will be appreciated. :oops:
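    The "squaring of ratios between fab processes" in the post can be sketched numerically. The node sizes and the 6850's transistor count are taken from the figures above; the ideal-scaling assumption (density grows with the square of the feature-size ratio) is the dubious part, since real shrinks rarely achieve it:

    ```python
    # Ideal area scaling: transistor density grows as the square of the
    # ratio between feature sizes, so a 40nm -> 22nm shrink would give
    # (40/22)^2 ~= 3.3x the transistors in the same die area.

    def scaled_transistors(base_millions: float, old_nm: float, new_nm: float) -> float:
        """Transistor count (in millions) after an ideal shrink at equal die size."""
        return base_millions * (old_nm / new_nm) ** 2

    # 6850: 1700M transistors at 40nm in ~256 mm^2 (figures from the post)
    estimate = scaled_transistors(1700, 40, 22)
    print(f"Ideal 22nm estimate: ~{estimate:.0f}M transistors")
    ```

    Ideal scaling lands around 5600M, so the ~5000M figure in the post already builds in some real-world slack.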
     
    #7675 Proelite, Sep 13, 2011
    Last edited by a moderator: Sep 13, 2011
  16. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,400
    Location:
    Wrong thread
    The launch 360 drew about 190-200 W from the wall. Assuming an expensive 80-85% efficient PSU, you're looking at about 150-170 W for the entire system, with the CPU drawing much more power than the GPU.

    The figures I saw on the net around 2005 suggested 80 W peak for the CPU and 30 W for the GPU (possibly not including the daughter die). I can't find these figures now, and may have actually dreamt them up, but I remember them seeming to be from officialish sources.

    I don't think you'll be getting 120W GPUs in next gen systems!
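    The wall-to-system conversion above is just PSU arithmetic: DC power delivered equals wall draw times efficiency. A quick check of the ranges quoted in the post:

    ```python
    # DC power available to the system = wall draw * PSU efficiency.
    def dc_power(wall_watts: float, efficiency: float) -> float:
        return wall_watts * efficiency

    low = dc_power(190, 0.80)    # worst case: 190 W wall, 80% efficient
    high = dc_power(200, 0.85)   # best case: 200 W wall, 85% efficient
    print(f"System DC power: {low:.0f}-{high:.0f} W")
    ```

    Which reproduces the ~150-170 W system budget the post works from.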
     
  17. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    I think that number for the CPU is probably 40 watts on the high side, but I'm only guessing.
     
  18. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,400
    Location:
    Wrong thread
    Given how much beefier the heatsink for the CPU was (multiple heatpipes, massively higher surface area, and most of the airflow from the fans), I don't think there can be any doubt the CPU was dissipating vastly more heat than the GPU. 40 W for the CPU would also give it the kind of performance per watt that even Intel's very best laptop parts would have been envious of, and would mean the rest of the Xbox (excluding CPU and GPU) was burning 100 W+ (or something), which would be rather worrying!
     
  19. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    80 watts would put it very close to a much larger (219 mm2 and 243 million transistors @ 90nm) Athlon 64 X2 from 2005 (89 W TDP, probably less in use).

    And I think the video cards closest to Xenos were using much more than 30 watts. The X1800 was over 100 (not suggesting Xenos is anywhere near 100... just probably closer to 50 than 30).

    Anyway, it really doesn't matter unless you expect they will somehow be limited to following that power split. I expect a bit more weight on the GPU (rather than an even CPU/GPU split) for the coming generation. While a 120 W TDP is unlikely to be on the table for the GPU, I don't think that 75 is out of the question.
     
  20. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,400
    Location:
    Wrong thread
    A64s went up to 110 W (even the Opterons), and MS/IBM were pushing the frequency as high as they could; original specs showed 3.5 GHz for the CPU. Getting performance close to an 89/110 W A64 out of an 80 W chip wouldn't have been too bad an effort as it was, but considering it was a new chip, at very high frequency, rushed, and with no fallback uses for working but out-of-spec processors, I think they did okay. That really is quite a chunky heatsink on the CPU, and we know MS were cutting it thin with the cooling as it was. There's no way GPU heat output was coming close to CPU heat output. Even with the emergency heatsink on the self-destruct-o-thon GPU it still had far less cooling.

    I expect more power to go towards the GPU side of things next generation, especially if Llano is any indicator of the shape of things to come (and I hope it is).
     
