Predict: The Next Generation Console Tech

Discussion in 'Console Technology' started by Acert93, Jun 12, 2006.

Thread Status:
Not open for further replies.
  1. Proelite

    Regular

    Joined:
    Jul 3, 2006
    Messages:
    816
    Likes Received:
    98
    Location:
    Redmond
    30 compute units in an APU? Is such a thing even possible?
     
  2. arijoytunir

    Regular

    Joined:
    Nov 13, 2012
    Messages:
    347
    Likes Received:
    12
    Sure! With the efficiency of the GCN 2.0 cores it's possible to include 30 CUs, but definitely split between the APU and a discrete GPU.
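    For scale, GCN's peak FP32 rate is CUs × 64 lanes × 2 FLOPs per clock. A minimal sketch of the arithmetic (the 800 MHz clock is purely my own illustrative assumption, not a rumored spec):

        # Peak FP32 throughput of a GCN-style GPU: CUs * 64 lanes * 2 FLOPs/clock.
        # The 800 MHz clock below is an assumed figure for illustration only.
        def peak_tflops(num_cus: int, clock_ghz: float) -> float:
            return num_cus * 64 * 2 * clock_ghz / 1000

        print(peak_tflops(30, 0.8))  # 3.072 TFLOPS for 30 CUs at 800 MHz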
     
  3. bkilian

    Veteran

    Joined:
    Apr 22, 2006
    Messages:
    1,539
    Likes Received:
    3
    No clue, it was just an example using arbitrary numbers. If it were done for real, it would probably be two units, a CPU and a GPU, like the current machines, although for performance reasons it would be better to integrate them into an APU from the start. When the 360 got merged into a single chip, they actually had to add silicon to simulate the slower behavior of the old copper traces and buses connecting the two-chip solution.
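    To make the idea concrete, here is a toy model of that latency padding (the cycle counts are invented for illustration, not the real 360 figures):

        # Toy model: the merged die pads its fast on-die link with wait states
        # so software-visible CPU<->GPU latency matches the old two-chip bus.
        LEGACY_BUS_LATENCY = 30  # cycles over the old external bus (hypothetical)
        ON_DIE_LATENCY = 8       # cycles the on-die link actually needs (hypothetical)

        def cpu_to_gpu_cycles() -> int:
            """Total cycles for a CPU->GPU transaction on the merged chip."""
            padding = max(0, LEGACY_BUS_LATENCY - ON_DIE_LATENCY)
            return ON_DIE_LATENCY + padding  # padded back up to legacy timing

        print(cpu_to_gpu_cycles())  # 30, identical to the discrete design

    That way, games tuned to the original timing behave identically on the cost-reduced chip.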
     
  4. fehu

    Veteran Regular

    Joined:
    Nov 15, 2006
    Messages:
    1,462
    Likes Received:
    395
    Location:
    Somewhere over the ocean
  5. RedVi

    Regular

    Joined:
    Sep 12, 2010
    Messages:
    387
    Likes Received:
    39
    Location:
    Australia
    PS4 launching last doesn't bode well for Sony, IMO. The only way they took advantage of it this gen was with Blu-ray, and I fear they won't have a hardware advantage this time either. Of course, they are in a very different position and have a different attitude now. I'd love to be proven wrong... If Sony were to use HMC and go for larger chips, with the intention of a quick shrink to 20nm in 6-12 months, it could be worth launching 6 months after MS. A couple of months would probably net them no performance advantage and have them miss the holiday season; a year would be a bit too long to justify any performance advantage...
     
  6. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0
    I totally agree with that. It is a crazy idea to have 2 asymmetric GPUs in a budget/TDP-limited console; the silicon budget would be better spent on a single, more powerful, feature-rich GPU, or on a more powerful CPU.
    Unless they go the Nintendo way, put all their effort into the GPU side, and give us a 1.24 GHz CPU :lol:
     
  7. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    I would claim they didn't put too much effort into either side. :roll:
     
  8. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0
    :lol: Everything is relative, my friend: compared to its CPU, the GPU in the Wii U is revolutionary, a real generational leap with huge risk taken by Nintendo :lol:

    I believe Nintendo took the extreme road of relying on the GPU, versus the other extreme road once taken by Sony with the PS3 (relying on the CPU). Maybe the best choice was Microsoft's with the Xbox 360: an equilibrium between CPU and GPU power/budget. I do believe, though, that both the PS4 and the Xbox Next will follow this approach.
     
  9. SedentaryJourney

    Regular

    Joined:
    Mar 13, 2003
    Messages:
    476
    Likes Received:
    27
    RSX is a customized version of Nvidia's high-end GPU from 2005. Just because Xenos outperforms it doesn't mean Sony cut corners the way Nintendo did.
     
  10. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0
    Of course Sony didn't cut corners on the PS3's GPU the way Nintendo did with the Wii U's CPU, that's obvious. But I am talking here about the philosophy of the hardware design: for Sony it was CPU-centric (we know the first plans were to drop the GPU altogether and replace it with another Cell; the RSX, even though powerful, was relatively an afterthought to recover lost time), and for the Nintendo Wii U it was GPU-centric (just look at the silicon budget). I consider both philosophies (CPU-centric vs. GPU-centric) extremes.
     
  11. SedentaryJourney

    Regular

    Joined:
    Mar 13, 2003
    Messages:
    476
    Likes Received:
    27
    Isn't that a myth? I recall this being debunked several times on this forum. There was a "reality synthesizer" GPU that was supposedly unconventional, but I doubt we'll ever get specifics on what the GPU was supposed to be before Nvidia got involved.
     
  12. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,322
    Likes Received:
    1,120
    I have a spotless record on console release dates (lol), and I'm almost certain you will see the PS4 in 2013 as well.

    Also, DDR4 should be doable in late '13. The more exotic memory technologies are not going to happen in either console anyway, imo.
     
  13. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0
    I totally agree: there are no tangible technological benefits, only commercial drawbacks, in releasing the PS4 months after the Xbox Next's expected release date in fall 2013.

    In terms of cost, I don't think it matters that much whether Sony and Microsoft use low-speed, inexpensive GDDR5 chips or the newly released but highest-performance DDR4 chips. It will depend on the availability of the fastest DDR4 chips in 2013, but in terms of bandwidth and quantity of RAM it won't make a huge difference; after all, we are talking about 4 GB of RAM here. (I don't believe the 8 GB scenario either, judging by console hardware throughout history.)
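    A quick back-of-the-envelope comparison (the bus width and data rates here are my own illustrative assumptions, not leaked specs):

        # Peak bandwidth in GB/s = bus width (bits) / 8 * data rate (GT/s).
        def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
            return bus_width_bits / 8 * data_rate_gtps

        # Low-speed GDDR5 (assumed 4 GT/s effective) on an assumed 256-bit bus:
        print(peak_bandwidth_gbs(256, 4.0))  # 128.0 GB/s
        # Fast DDR4 (assumed DDR4-2400, 2.4 GT/s) on the same 256-bit bus:
        print(peak_bandwidth_gbs(256, 2.4))  # 76.8 GB/s

    Same order of magnitude either way, which is why I don't see the choice as decisive.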
     
  14. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    Microsoft jumped from 64 MB to 512 MB in 4 years; now we are talking about 8 years since the 360's release. And the dev kit has 12 GB of RAM.
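    Naively extrapolating that growth rate (a toy calculation, not a prediction):

        # Xbox (2001) -> Xbox 360 (2005): RAM grew 8x in ~4 years.
        xbox_mb, x360_mb = 64, 512
        growth_per_gen = x360_mb / xbox_mb          # 8x per ~4 years

        # The same rate sustained over the ~8 years since the 360:
        print(x360_mb * growth_per_gen**2 / 1024)   # 32.0 GB
        # Even a single 8x step already gives:
        print(x360_mb * growth_per_gen / 1024)      # 4.0 GB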
     
  15. lefizz

    Newcomer

    Joined:
    Nov 7, 2005
    Messages:
    138
    Likes Received:
    2
    GDDR5 would always be a lot more expensive than DDR4 over the lifetime of the console. One is a commodity product that may be a little expensive to start with but will drop like a stone once volume production starts; the other is a niche product which has always been, and always will be, expensive.
     
  16. lefizz

    Newcomer

    Joined:
    Nov 7, 2005
    Messages:
    138
    Likes Received:
    2
    I see 8 GB as totally sane if it's just DDR4, but with anything more exotic it looks less likely. With HMC maybe it's doable, and even necessary, given the minimum number of dies required to build the stack.
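    The stack arithmetic works out neatly, too (die density and stack height are my assumptions; early HMC material talked about 4- and 8-die stacks):

        # Capacity of a DRAM stack: dies per stack * Gbit per die / 8 bits per byte.
        def stack_gb(dies_per_stack: int, gbit_per_die: int) -> float:
            return dies_per_stack * gbit_per_die / 8

        print(stack_gb(8, 4))      # 4.0 GB from one 8-die stack of 4 Gb DRAMs
        print(2 * stack_gb(8, 4))  # 8.0 GB from two such stacks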
     
  17. Prophecy2k

    Veteran

    Joined:
    Dec 17, 2007
    Messages:
    2,467
    Likes Received:
    377
    Location:
    The land that time forgot
    It doesn't have to be HMC, though; I'm quite sure that 3D stacking through TSVs won't be ready by the time the next-gen consoles ship. It'll be 2.5D with an interposer.
     
  18. DoctorFouad

    Newcomer

    Joined:
    Sep 30, 2011
    Messages:
    195
    Likes Received:
    0
    It jumped from 64 MB to 512 MB when PCs were using 2 GB as standard; nowadays the standard for PCs is 8 GB, so I doubt consoles would use (or need) the same amount. Add to this the fact that, generally speaking, consoles use high-bandwidth RAM to compensate for the lack of quantity... But I want to be positively surprised :wink:
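    One way to put numbers on that "less RAM, but faster" trade-off (the 360 figure is its published peak; the PC configuration is just my own example):

        # Bandwidth per gigabyte of capacity, in (GB/s) / GB.
        def bw_per_gb(bandwidth_gbs: float, capacity_gb: float) -> float:
            return bandwidth_gbs / capacity_gb

        # Xbox 360: 512 MB of GDDR3 with a 22.4 GB/s peak.
        print(bw_per_gb(22.4, 0.5))  # 44.8
        # Typical 2012 PC: 8 GB dual-channel DDR3-1333, ~21.3 GB/s peak.
        print(bw_per_gb(21.3, 8.0))  # ~2.7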
     
  19. lefizz

    Newcomer

    Joined:
    Nov 7, 2005
    Messages:
    138
    Likes Received:
    2
    But we are also moving towards longer life cycles, so chucking in a lot of dirt-cheap commodity RAM is an easy and relatively cheap way of adding some longevity to your platform. That's the way I see it, anyway.
     
  20. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    If you put one or two stacks of DRAM dies on an interposer (like the AMD/Amkor prototype we have seen), how do you think the dies are connected and the interfacing to the PCB is done? They will surely use TSVs in that case too (the prototypes mentioned did). Wire bonding does not appear viable given the relatively large dies involved and the fast communication one wants between them.
     