The Next-gen Situation discussion *spawn

Discussion in 'Console Industry' started by gongo, Apr 2, 2012.

  1. woundingchaney

    Regular

    Joined:
    Jan 29, 2006
    Messages:
    799
    Likes Received:
    1
    Location:
    Terre Haute, IN
    Perhaps I don't recall correctly, but I find this to be unlikely. MS went through various revisions with the 360 initially, but many of those were driven more by consumer hardware failures than by manufacturing refinement. I was under the impression that the first "true" hardware revision/refinement we saw came around the same time the Elite model was released.
     
  2. bgassassin

    Regular

    Joined:
    Aug 12, 2011
    Messages:
    507
    Likes Received:
    0
    It did, according to old reports. When talking about the hardware direction of the PS4 from a cost standpoint, I felt it would be similar to the 360's, but at the time I didn't realize either that the 360's hardware had turned things around that fast. So unless those reports are wrong, the hardware became profitable.

    It never changed from the alpha kits though. It's been the same in both alpha and beta. My take on that comment was that it was about features and not power.
     
  3. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,841
    Likes Received:
    1,160
    Location:
    Guess...
    I can see your picture perfectly well. I just happen to disagree.


    Both the Xbox 360 and PS3 have much better communication between the CPU and GPU than even the latest PCs today. Nevertheless, within a year of their launch the majority of cross-platform games were looking as good or better on PCs and running at faster framerates. Fast, efficient communication between CPU and GPU is obviously going to benefit the consoles this generation as it did the last, but it's not going to make up for a large raw power advantage. ERP mentioned earlier in this thread, I believe in response to my own comments about how HSA might affect PC ports, that there's not much that can't be overcome by doing things a different way on a more powerful machine.


    All that bandwidth will be used primarily for graphics rendering, so I've no idea why you keep trying to compare it to PC system memory, which is never used for graphics rendering. Obviously it should be compared to PC graphics memory in combination with PC system memory, since together they perform the same function as the unified memory in the PS4. So today you're looking at up to 6GB running at 288GB/s combined with, say, 8-16GB of system memory running at 25.6GB/s.

    Exactly how is a 3.5GB pool (512MB reserved for the OS) at 192GB/s of shared memory faster or larger than that? The fact that it's unified will be advantageous, but that's no different to the 360. Shared memory also brings its own issues, in that you have 2 (or even 3 in PS4's case) processors contending for the same bandwidth. Although the ease of development it brings to the table is a bigger advantage overall.

    There was nothing disappointing about the Xbox 360's memory architecture back in 2005. It had double the unified memory space that most PC GPUs had as graphics memory, with only the very highest-end GPUs matching it for size. Today we have PC GPUs which already have more graphics memory than the PS4, and we're still the best part of a year away from its launch.

    In terms of bandwidth it was 4x faster than PC system memory bandwidth and about half the speed of PC graphics memory bandwidth, but of course it also had the hugely fast 256GB/s pool of eDRAM, which was over 5x faster than anything a PC GPU was running.

    So with regards to memory size and bandwidth, the PS4 doesn't look to be in as good a position compared with PCs today (almost a year before it launches) as the 360 was in 2005 on the day it launched.
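    Spelling that comparison out as a quick back-of-envelope script (every figure is a rumoured/quoted 2013 number from this post - the 6GB/288GB/s card, the 8-16GB of DDR3 at 25.6GB/s, the 3.5GB usable PS4 pool at 192GB/s - not a benchmark):

```python
# Back-of-envelope totals for the memory comparison above.
# All numbers are the rumoured/quoted 2013 figures from the post.

pc_gpu_bw_gbps = 288.0   # top-end PC graphics card (GDDR5)
pc_sys_bw_gbps = 25.6    # dual-channel DDR3-1600 system memory
ps4_bw_gbps = 192.0      # rumoured PS4 unified GDDR5 pool

pc_total_bw = pc_gpu_bw_gbps + pc_sys_bw_gbps   # two split pools, summed

pc_gpu_mem_gb = 6.0      # quoted top-end card capacity
pc_sys_mem_gb = 8.0      # low end of the quoted 8-16GB range
ps4_usable_gb = 3.5      # 4GB minus the rumoured 512MB OS reserve

print(f"PC: {pc_gpu_mem_gb + pc_sys_mem_gb}GB across {pc_total_bw}GB/s aggregate")
print(f"PS4: {ps4_usable_gb}GB at {ps4_bw_gbps}GB/s, unified")
```

    The caveat, of course, is that the PC's two pools can't be freely pooled the way the PS4's unified memory can, which is the unified-memory advantage conceded above.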


    Yes it did. Not to the same extent as the PS4 with its system-on-chip design, but both PS3 and 360 had big advantages over the PC in terms of CPU <-> GPU communication. It wasn't a game changer though, just as it won't be this generation.


    Let's face it: your attempts at damage control notwithstanding, the PS4 will be much slower compared to PCs when it launches than even the PS3 was when it launched (and the PS3 came late to the party compared with the 360).

    PS3 had Cell, PCs had dual-core Conroes. They both had their advantages and disadvantages, but one could at least make an argument that Cell was faster - much faster at many tasks. PS4 will have 8 Jaguar cores at 1.6GHz. PCs will have quad-core Haswells sporting likely double the multithreaded performance, quadruple the single-threaded performance and more than quadruple the SIMD capability. If you throw in the GPGPU unit on the PS4, though, then at least the SIMD performance becomes more even (although Haswell may well still have the advantage based on the current rumours). But that's a far cry from Cell with its > 4x the SIMD performance of PC CPUs in mid 2006.
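    A rough sanity check on the "more than quadruple the SIMD capability" claim (the clock speed and per-cycle throughput figures here are my assumptions, not anything quoted in this thread):

```python
# Peak SIMD FLOPs sketch. Assumptions: Jaguar sustains one 128-bit FADD plus
# one 128-bit FMUL per core per cycle (8 single-precision flops); Haswell has
# two 256-bit FMA units per core (32 flops/cycle) at an assumed 3.5GHz clock.

jaguar_flops = 8 * 1.6e9 * 8      # 8 cores @ 1.6GHz
haswell_flops = 4 * 3.5e9 * 32    # 4 cores @ 3.5GHz

ratio = haswell_flops / jaguar_flops
print(f"Haswell SIMD advantage: ~{ratio:.1f}x")
```

    Under those assumptions the quad Haswell comes out a bit above 4x the Jaguar cluster on paper, consistent with the claim - before adding anything the PS4's GPGPU resources contribute back.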

    The GPU situation is better than the CPU situation but still worse than PS3's position in 2006. PS3 faced off against the 640MB GeForce 8800 GTX, a GPU which was 2-3x faster than RSX and had 25% more graphics memory than PS3 had total shared memory. The PS4 will have a GPU that is roughly half as fast as the top-end GPUs of today, but by the time it launches the 8xxx and 7xx series GPUs will be 6 months old. So not only will it be well under half as fast as the fastest single GPUs, but it will also be facing another round of GPU refreshes within 6 months. At least the PS3 had a full year before something faster than the 8800 GTX launched.

    Memory-wise, the PS3 had about 1/4 the total RAM that PCs had as system RAM. That's pretty much a match for PS4. The aggregate speed of that memory was around 8x faster than PC system RAM; again, same story with PS4. Compared to PC graphics memory, PS3 had 80% as much memory as the top-end GPU, running at 56% of the bandwidth. PS4 will have 58% as much graphics memory, assuming the top-end next-gen GPUs sport 6GB, running at 66% of the bandwidth of TODAY'S fastest GPUs. It will obviously be slower compared to the next generation of GPUs.
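    The PS4-side percentages in that paragraph follow directly from the figures already quoted in this post (all of them rumours at this point, to be clear):

```python
# Reproducing the PS4-vs-PC ratios quoted above (rumoured 2013 figures).

ps4_usable_gb = 3.5      # 4GB GDDR5 minus the rumoured 512MB OS reserve
next_gen_gpu_gb = 6.0    # assumed memory on a top-end next-gen card
ps4_bw = 192.0           # rumoured PS4 bandwidth, GB/s
top_gpu_bw = 288.0       # fastest current single-GPU card, GB/s

mem_ratio = ps4_usable_gb / next_gen_gpu_gb   # -> "58% as much graphics memory"
bw_ratio = ps4_bw / top_gpu_bw                # -> 0.667, quoted as "66%"

print(f"memory: {mem_ratio:.0%}, bandwidth: {bw_ratio:.0%}")
```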

    So put all that together and what do you have? You have a system that's in a weaker position in almost every respect than its predecessor was compared with PCs at the time of its launch. And comparing to the PS3 isn't really the best scenario, since it came out 6 months after the 360 with the same level of performance. If the above analysis were done using the 360, then the new generation of consoles would come out looking even worse in the comparison.

    I grant you the new consoles will have architectural advantages born of HSA and likely other secret sauce, but it's an inescapable fact that they're nowhere near as powerful this generation as they were last in comparison to PCs. Arguing otherwise is nothing more than wishful thinking.
     
    #1163 pjbliverpool, Jan 19, 2013
    Last edited by a moderator: Jan 19, 2013
  4. wco81

    Legend

    Joined:
    Mar 20, 2004
    Messages:
    6,330
    Likes Received:
    337
    Location:
    West Coast
    But what will be the performance level expressed in terms of the current gen as the baseline?

    10x? 5x? More than those? Less than those?
     
  5. Ruskie

    Veteran

    Joined:
    Mar 7, 2010
    Messages:
    1,291
    Likes Received:
    1
    What I find interesting about the rumored specs is the fact that Sony, even in as bad a financial state as they are, still manages to outspend MS on the hardware front. 4GB of GDDR5 is considerably more expensive and heat-generating than 8GB of DDR3. Also, Sony went with what seems a much faster GPU, a bigger transistor count and, again, more wattage. Both CPUs seem to be pretty much 8-core Jaguars, and not much else remains in these boxes.

    Kinda puts things in perspective as to how the two companies operate. It would be a damn shame if Sony goes under or stumbles into irrelevance. When they have vision, you kinda wish to give them more money for their products :oops:
     
  6. Carl B

    Carl B Friends call me xbd
    Moderator Legend

    Joined:
    Feb 20, 2005
    Messages:
    6,266
    Likes Received:
    63
    The Xenos project was one whereby MS was getting access to certain "advanced" GPU technologies in exchange for: a) the obvious role it has in determining DirectX specifications, and b) essentially funding the R&D that went into the architectural development.

    RSX is essentially a 2005 GPU as well though, so take that for what it's worth.
     
  7. Carl B

    Carl B Friends call me xbd
    Moderator Legend

    Joined:
    Feb 20, 2005
    Messages:
    6,266
    Likes Received:
    63
    When the 360 and PS3 were shaping up prior to their launches, there was mixed thinking. Some people felt that Microsoft, after their losses with the Xbox 1, would make a half-hearted attempt at the 360; that they could never hope to compete with the billions Sony was spending on aggregate PS3 development; and that Sony was a juggernaut that could not be stopped. Conversely, some people felt that Microsoft would be able to toss whatever amount at the problem and buy their way to victory.

    This is the lesson of those launches: that you can spend whatever you want on R&D, include whatever you want in the box, but if it is priced too high and the loss-leading too steep, you will lose your shirt.

    Sony has plenty of money in the year 2013 to design whatever state-of-the-art console they choose to. These guys are not broke. Likewise with Microsoft - they can do whatever they want. They can make the console of your dreams - but how much will it cost them, and by extension, you? Neither wants to repeat the expense of the current generation - after all, what did it yield them? They both absolutely need to make money (Sony more than MS), and they both need to act in support of their broader financial goals.

    There is a cost envelope both companies are trying to stay within. There is a price at retail both companies are looking to achieve. There are ancillary features both companies want to propagate and encourage. This is not a matter of who can spend more and build the best; it's a contest where the goal is "build the best console you can to retail at $400."

    PS - I invite anyone who was maybe not as engaged at the time to search for and browse through the old system-cost threads from 2006-2008 around here; there are a lot of them, and they are extremely in-depth. It will be a quick lesson in how much money both MS and Sony were bleeding at the time, and how great a gamble Sony especially had taken with the loss-leading nature of its console.
     
  8. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,995
    Please, not that again. :roll:
    Both companies will spend based on what makes the most business sense. If GDDR5 allows Sony to sell more consoles, keeps their fanbase happy, solidifies the PS brand, and yields more profit, that's what they'll do. Just like they spent billions to build CMOS sensor fabs and to buy out Ericsson's stake in Sony Ericsson (both of which have been very good moves). If they think they can have an edge, they'll spend the money in gaming too.
     
  9. Xenio

    Regular Banned

    Joined:
    Jan 18, 2013
    Messages:
    447
    Likes Received:
    0
    Right, it was a good move, and I think this time it will be the same again.
     
  10. Carl B

    Carl B Friends call me xbd
    Moderator Legend

    Joined:
    Feb 20, 2005
    Messages:
    6,266
    Likes Received:
    63
    Outspend where though? Components? R&D? Subsidization? Tools? Games?

    Let's wait for the dust to clear before we declare any champions of the people so far as spending goes. If the PS4 launches at $600 and the 720 at $200, a lot of the same people in this thread who are already wrapping up their conclusions will be singing a different tune.

    Not what I think will happen, but the point is that specs... above all, rumored specs... do not equate to financial commitment.
     
  11. Carl B

    Carl B Friends call me xbd
    Moderator Legend

    Joined:
    Feb 20, 2005
    Messages:
    6,266
    Likes Received:
    63
    Why do you think that?
     
  12. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,841
    Likes Received:
    1,160
    Location:
    Guess...
    Probably somewhere between 5 and 10x, I'd have thought. On paper it looks like this, assuming all the rumours are true and the GPU is identical to a 7970M with 18 CUs (which probably won't be the case):

    • 8x the total memory
    • 4.14x the memory bandwidth
    • 6.4x the fill rate (this may differ if they reduce the number of ROPs from 32)
    • 4.8x the texturing performance
    • 6.4x the geometry setup (but this will be greatly enhanced by the tessellation capabilities)
    • 7.94x the shader FLOPs
    You can add big efficiency gains onto most of those numbers as well, so overall around 10x seems a reasonable ballpark estimate, although shader-limited situations should see something closer to 15x, I'd have thought.

    The CPU's more difficult to estimate, but I'll hazard to speculate ~16x the non-SIMD performance of Cell and maybe 50% more SIMD performance if you include the GPGPU block.
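    To show where a figure like the shader-FLOPs multiplier can come from, here is one way to derive it under the post's 18-CU assumption. The ~232 GFLOPs RSX baseline is my assumption, chosen only to be consistent with the quoted ratio, not an official spec:

```python
# One way to arrive at the "7.94x shader FLOPs" figure in the list above.
# GPU assumed per the post: 7970M-class, 18 CUs at 800MHz; each GCN CU has
# 64 lanes doing 2 flops/cycle (fused multiply-add). The RSX baseline is an
# assumption chosen to match the quoted ratio.

ps4_gflops = 18 * 64 * 2 * 0.800   # CUs * lanes * flops/cycle * GHz = 1843.2
rsx_gflops = 232.2                 # assumed PS3 RSX shader throughput

ratio = ps4_gflops / rsx_gflops
print(f"PS4 vs PS3 shader FLOPs: ~{ratio:.2f}x")
```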
     
  13. Xenio

    Regular Banned

    Joined:
    Jan 18, 2013
    Messages:
    447
    Likes Received:
    0
    Common interest. Microsoft wants the more advanced technology, and AMD wants early support for its advanced technology in the next DirectX on Windows 8; it's a win-win situation from my point of view.

    Money can be a plus in this equation too.

    I'm pretty sure you sound like a childish fanboy. If anyone has a doubt about my identity, write me in private and I'll give you my personal site and Facebook profile.
    I'm just me, and please be mature.
     
  14. Carl B

    Carl B Friends call me xbd
    Moderator Legend

    Joined:
    Feb 20, 2005
    Messages:
    6,266
    Likes Received:
    63
    Except that none of the signs are in place this time as they were last time. Last time, unified shaders were a known point on the horizon - we were heading towards them, everyone knew they were coming, and DirectX specifications were being built around them.

    This time there is none of that. The present architectural arcs for both AMD and NVidia are relatively set, and there are no broader signs of a sea change in how rendering is to be approached. No VTEs. In fact, it would undermine what seems the much more credible notion that there is going to be a lot of cross-device interoperability across the MS ecosystem.
     
  15. upnorthsox

    Veteran

    Joined:
    May 7, 2008
    Messages:
    2,102
    Likes Received:
    378
    Middling PC ports targeted at mediocre DX10 level PCs?
     
  16. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    HSA?
     
  17. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,841
    Likes Received:
    1,160
    Location:
    Guess...
    If those games were developed specifically for those 2005 GPUs like they were for Xenos, they'd probably handle the games just fine. Obviously it depends exactly which 2005 GPU you're talking about. Each has its own strengths and weaknesses compared with Xenos.
     
  18. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    I'm talking about PC. Do you have a 2005 GPU (PC) that can run Crysis 3 and BF3?

    Sorry off-topic...
     
  19. Carl B

    Carl B Friends call me xbd
    Moderator Legend

    Joined:
    Feb 20, 2005
    Messages:
    6,266
    Likes Received:
    63
    RSX. That was a 2005 PC chip derivative.
     
  20. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    RSX is a console chip; I'm talking about PC graphics cards.

    EDIT: And it is off-topic, if you have a response, you can send me a priv msg.
     