Predict: The Next Generation Console Tech

Discussion in 'Console Technology' started by Acert93, Jun 12, 2006.

Thread Status:
Not open for further replies.
  1. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
    Biggest problem with edram is how to connect it to the GPU over fast enough bus for it to make any sense. I'd dare to say anything less than 64GB/s would be a waste and double that would probably be a good solution. Stacking could solve the problem but won't be easy or cheap.
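A rough sketch of where figures like that come from: colour and Z traffic for a multisampled 1080p framebuffer. All the parameters here (MSAA level, overdraw, frame rate) are illustrative assumptions of mine, not anything stated in the post:

```python
def framebuffer_bw_gbs(width=1920, height=1080, msaa=4,
                       bytes_per_sample=16, overdraw=3, fps=60):
    """Estimated framebuffer bandwidth in GB/s.

    bytes_per_sample assumes a colour read + write and a Z read + write
    of 4 bytes each per sample; overdraw is average full-screen passes.
    """
    per_frame = width * height * msaa * bytes_per_sample * overdraw
    return per_frame * fps / 1e9

# At these assumptions a single 1080p 4xMSAA target already eats ~24 GB/s
# before textures, geometry or a second buffer touch the bus.
print(round(framebuffer_bw_gbs(), 1))
```

Scale the overdraw or frame rate up and tens of GB/s quickly becomes the floor, which is why a narrow bus to the eDRAM would defeat its purpose.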
     
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    I find that implausible. Where's the mention of Naomi 3?

    It's doable, but would it be useable? It's way more than needed for two 1080p buffers. It'd have to take on a new role as working space like PS2, which would be a step out of the ordinary. eDRAM in XB360 was there to enable a cheaper, slower main RAM bus. If they had 256 GB/s, eDRAM would be redundant unless they've decided RAM access is the bottleneck for their future renderers (while everyone else is looking at shader calcs as the bottlenecks).

    I haven't done the maths, but 100 MB does look good for deferred rendering where you could store all buffers in the eDRAM and process them to your heart's content without messing up system BW. Still overkill, but that's one case I think that much could be used.
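For what it's worth, the maths does work out. A sketch with a made-up but typical G-buffer layout (four 32-bit render targets plus a 32-bit depth buffer at 1080p; the layout is an assumption of mine, not from the post):

```python
def gbuffer_mb(width=1920, height=1080, targets=(4, 4, 4, 4), depth=4):
    """Total size in MB of a multiple-render-target G-buffer plus depth;
    each entry in `targets` is bytes per pixel of one render target."""
    bytes_total = width * height * (sum(targets) + depth)
    return bytes_total / (1024 * 1024)

# Four fat targets plus depth come to roughly 40 MB at 1080p,
# comfortably inside 100 MB of eDRAM with room for working buffers.
print(round(gbuffer_mb(), 1))
```

A 4x-multisampled version of the same layout would be about four times that and overflow 100 MB, so the headroom is generous but not unlimited.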
     
  3. Dominik D

    Regular

    Joined:
    Mar 23, 2007
    Messages:
    782
    Likes Received:
    22
    Location:
    Wroclaw, Poland
    I'm not sure if this has been discussed before:
    http://www.physorg.com/news/2011-11-calif-jury-rambus-antitrust.html
    http://www.pcworld.com/article/244068/rambus_considers_antitrust_appeal_as_stock_falls.html
    http://www.reuters.com/article/2011/11/17/us-rambus-micron-verdict-idUSTRE7AF1XL20111117

    Essentially Rambus has lost a ton of money due to the antitrust case. Rambus was responsible for some of the HW found in PS3. My question is: what's the impact of Rambus's valuation on PS4 HW plans? Does this make them "desperate" for a big deal, willing to sell inexpensive but powerful goods (obviously not below the cost of manufacturing - that'd kill them), or does it mean Sony (or MS) should not team with them for tech since it's dangerous? Anyone?
     
  4. tunafish

    Regular

    Joined:
    Aug 19, 2011
    Messages:
    627
    Likes Received:
    414
    Rambus does not manufacture anything. They are a pure IP shop. If you want Rambus memory, you need to get a license from Rambus, and then buy the chips from someone who is willing to make them for you.
     
  5. MBDF

    Newcomer

    Joined:
    Dec 29, 2004
    Messages:
    175
    Likes Received:
    0
    Location:
    North Vancouver, Canada
    3D?

    Also would tie in with dual GPUs on one die.
     
  6. almighty

    Banned

    Joined:
    Dec 17, 2006
    Messages:
    2,469
    Likes Received:
    5
    I reckon a quad-core CPU with SMT and performance roughly equal to a 2600K.

    2-4GB of total system memory

    GPU roughly a GTX570/6950 but die-shrunk

    When PS3 and 360 released they had GPUs that were level with the very highest-end PCs, but I can't see that happening next time round; if it did, you would be looking at GPU power equal to or greater than a 6990/GTX590.
     
  7. Dominik D

    Regular

    Joined:
    Mar 23, 2007
    Messages:
    782
    Likes Received:
    22
    Location:
    Wroclaw, Poland
    True, my bad. Still the question remains: does this impact the future of their tech?
     
  8. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,723
    Likes Received:
    242


    When PS3 launched it did not have the highest-end GPU, Nvidia had launched the G80/8800, while RSX was a cut down G70.
     
  9. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
    One could also claim that the highest-end at that time was an SLI setup.
     
  10. almighty

    Banned

    Joined:
    Dec 17, 2006
    Messages:
    2,469
    Likes Received:
    5
    8800GTX was released a few weeks after iirc

    And still my point remains: PS3 had a GPU that was much faster than those of 90%+ of the world's PC user base and had more grunt than all but the highest-end gaming PCs.

    Can you see the same thing happening next gen? Because I can't.
     
  11. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,661
    Likes Received:
    1,114
    Easily. The vast majority of GPUs sold today are weak integrated ones.

    Cheers
     
  12. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    According to Wiki/google the 8800GTX was released Nov 8 and PS3 Nov 17.

    Even before then I'm pretty sure they had G70s (?) clocked up to 650 MHz.

    I think you'd have a better argument with Xenos in 2005: it seems on par with RSX, which is on par with the 7800GTX, which was on par with the X1800XT, the fastest PC cards of late 2005, except for maybe a slightly lower clock.

    But I agree with the general thrust of your argument; last time, at least with Xenos, we had something near the highest end. I think this time it's pretty obvious we're going to get something from the mid-high ranks rather than the high-high, unless I'm surprised. The highest-end PC GPUs seem to be much bigger and hotter today than they were then.
     
  13. almighty

    Banned

    Joined:
    Dec 17, 2006
    Messages:
    2,469
    Likes Received:
    5
    Read what I said again...
     
  14. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    118
    Likes Received:
    11
    This would be great:

    Ivy Bridge quad core with HT @ 3GHz should have a TDP of ~50W (the i7-2600S has 65W @ 2.8GHz @ 32nm)
    2GB DDR3 quad channel
    AMD Radeon HD7850, est. <=100W @ 28nm
    1GB GDDR5

    Should fit in a PS3 power envelope (even that of the latest Slim) and would be a beast as an embedded system.

    Also looking forward to Piledriver (especially power draw); maybe the rumored hexacore in the Nextbox is a 3-module Piledriver (AMD promotes this as a hexa-core). Trinity has only 2 modules though IIRC, and I'm not sure about the number of shader units.
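Summing those estimates against a console power envelope: the CPU and GPU wattages below are the figures from the post; the RAM, board, and PSU-overhead numbers are rough guesses of mine:

```python
# Component power budget for the proposed spec (illustrative).
budget_w = {
    "cpu_ivy_quad": 50,   # ~50 W TDP estimate from the post
    "gpu_hd7850":   100,  # "<= 100 W @ 28nm" from the post
    "ddr3_2gb":     5,    # guess
    "gddr5_1gb":    10,   # guess
    "board_misc":   20,   # drive, fans, I/O -- guess
}
total = sum(budget_w.values())
print(total)  # compare against the ~200 W a launch PS3 drew under load
```

Even with generous overheads the total lands under what the original 2006 PS3 drew, which supports the "fits in a PS3 power envelope" claim.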
     
  15. almighty

    Banned

    Joined:
    Dec 17, 2006
    Messages:
    2,469
    Likes Received:
    5
    Exactly: with die shrinks, and taking power and heat into account, even an AMD 6950 would be out of reach.

    I can see a 6850/6870 being much more realistic. There are 6850s out there that don't even need a 6-pin PCIE power connector, as they draw all their power from the PCIE slot.

    Perfect card for the next generation, as it's power efficient and a good 4-5x more powerful than RSX?
     
    #8215 almighty, Nov 17, 2011
    Last edited by a moderator: Nov 17, 2011
  16. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,146
    Likes Received:
    8,533
    Location:
    ಠ_ಠ
  17. Prophecy2k

    Veteran

    Joined:
    Dec 17, 2007
    Messages:
    2,468
    Likes Received:
    379
    Location:
    The land that time forgot
    If they take the highest-end GPU of today and get rid of all the double-precision and other such doodads that aren't necessary for a console GPU, my question is how big and how hot such a customised GPU part would be. Small and cool enough to fit in a console box with a reasonable cooling solution?

    What if the CPU is much smaller in terms of die area than what XCPU and CELL were at their launch? With a fixed console power envelope, and more of an emphasis on the GPU in terms of the overall system design, and with a heavily customised GPU part, then perhaps it could be possible for us to get a console in 2012 with the highest end (single) GPU of today?
     
  18. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    Umm yeah, but remember we're dealing with 2013 at the earliest (2012 if you really must, but I say no chance of that).

    There will be a whole new generation of PC cards by then; they should hit in early 2012. Today's high end will be tomorrow's medium range, so I would certainly hope they can hit at least last gen's high end (6950-6970), if not better, with decent thermals/power/etc.

    I have no idea, but I don't think it's very simple at all. It will either have to be a custom part that's been in development for a while, or, I suspect, something a lot closer to off the shelf. Architecting a GPU is an enormous undertaking by now... so you'll probably have to use whatever they have for desktop as a base, even if parts of it aren't needed on a console... that's just my hunch.
     
  19. TheWretched

    Regular

    Joined:
    Oct 7, 2008
    Messages:
    830
    Likes Received:
    23
    Really? The PCIE slot is rated for 75 watts. I can't see any 6850 being that efficient. My 6870 actually needs two plugs, as it's rated for 151 watts... one watt above what the slot plus one 6-pin plug can deliver. My G80 beforehand only needed one plug for its 148 watts. I can't see a 6850 without one.
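The spec limits behind that connector arithmetic, for reference (slot and auxiliary-connector ceilings as defined by the PCI Express electromechanical specs):

```python
# PCIe power delivery ceilings in watts.
SLOT_W = 75    # x16 slot limit
PIN6_W = 75    # one 6-pin auxiliary connector
PIN8_W = 150   # one 8-pin auxiliary connector (for reference)

def max_board_power(six_pins=0, eight_pins=0):
    """Spec ceiling for a card with the given auxiliary connectors."""
    return SLOT_W + six_pins * PIN6_W + eight_pins * PIN8_W

print(max_board_power())             # slot only: 75 W
print(max_board_power(six_pins=1))   # slot + one 6-pin: 150 W
```

Hence a 151 W card is one watt over the slot-plus-one-6-pin ceiling and needs a second plug, exactly as described above, while a connector-less card must stay under 75 W.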
     
  20. Love_In_Rio

    Veteran

    Joined:
    Apr 21, 2004
    Messages:
    1,627
    Likes Received:
    226
    I have wondered that myself too. How much die area would be saved by getting rid of the DP logic in a GTX 580?
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.