NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

170GB/s (you have to count the eSRAM) vs 192GB/s, if we are to believe current rumors. Not exactly an earth-shattering advantage.
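For anyone who wants to sanity-check those two figures, here's a rough back-of-the-envelope sketch. It assumes the rumored configs (256-bit DDR3-2133 plus ~102 GB/s of eSRAM for Durango, 256-bit GDDR5 at 6 Gbps for Orbis), none of which is confirmed:

```python
# Back-of-envelope check of the two rumored bandwidth figures.
# Assumed (rumor-era, unconfirmed) configs:
#   Durango: 256-bit DDR3-2133 main memory + ~102.4 GB/s eSRAM
#   Orbis:   256-bit GDDR5 at 6.0 Gbps per pin

ddr3_bus_bits = 256
ddr3_rate_mtps = 2133                                   # mega-transfers per second
ddr3_gbs = ddr3_bus_bits / 8 * ddr3_rate_mtps / 1000    # ~68.3 GB/s

esram_gbs = 102.4                                       # rumored eSRAM bandwidth

gddr5_bus_bits = 256
gddr5_rate_gbps = 6.0                                   # per-pin data rate
gddr5_gbs = gddr5_bus_bits / 8 * gddr5_rate_gbps        # 192 GB/s

print(f"Durango combined: {ddr3_gbs + esram_gbs:.0f} GB/s")  # ~171
print(f"Orbis GDDR5:      {gddr5_gbs:.0f} GB/s")             # 192
```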
 
I'm interested in these data movers. Sweetvar26 mentioned that his friend said AMD is doing something special for Durango and that the team at AMD feels Durango is more powerful than Orbis, something akin to a supercomputer. Of course, that sounds generic, but it makes little sense for MS to just "handicap" the system with a considerably weaker GPU.
 
170GB/s (you have to count the eSRAM) vs 192GB/s, if we are to believe current rumors. Not exactly an earth-shattering advantage.

Not that there's any data on 4Gbit GDDR5, but I'd kind of wonder about the cost of both that & 6Gbps bins.

4Gbit chips were supposed to have shipped in Q4 '12, with higher speed bins coming this quarter, but... there's obviously no public data on the costs.
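Just to illustrate why the 4Gbit density matters for the rumored 4GB pool, a quick sketch assuming a plain 256-bit bus populated with x32 GDDR5 devices and no clamshell mode (all assumptions on my part):

```python
# Why 4Gbit density matters for a 4GB GDDR5 pool on a 256-bit bus
# (assumes x32 GDDR5 chips, no clamshell mode).

bus_width_bits = 256
chip_width_bits = 32
chips = bus_width_bits // chip_width_bits       # 8 chips fill the bus

capacity_gb_with_2gbit = chips * 2 / 8          # 8 x 2Gbit = 2 GB total
capacity_gb_with_4gbit = chips * 4 / 8          # 8 x 4Gbit = 4 GB total

print(chips, capacity_gb_with_2gbit, capacity_gb_with_4gbit)   # 8, 2.0, 4.0
```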
 
I'm interested in these data movers. Sweetvar26 mentioned that his friend said AMD is doing something special for Durango and that the team at AMD feels Durango is more powerful than Orbis, something akin to a supercomputer. Of course, that sounds generic, but it makes little sense for MS to just "handicap" the system with a considerably weaker GPU.

I would have thought AMD would have something like Chinese Walls in effect, so the Orbis and Durango teams would be cut off from one another. Maybe not. Maybe you can't do that effectively with an engineering team. Anyway, MS and Sony should have had absolutely no idea what the other was working on, unless of course the folks at AMD don't take their NDAs seriously, or they each have some spies in there. I'm not sure either could have known whether they had the weaker or stronger GPU.
 
I would have thought AMD would have something like Chinese Walls in effect, so the Orbis and Durango teams would be cut off from one another. Maybe not. Maybe you can't do that effectively with an engineering team. Anyway, MS and Sony should have had absolutely no idea what the other was working on, unless of course the folks at AMD don't take their NDAs seriously, or they each have some spies in there. I'm not sure either could have known whether they had the weaker or stronger GPU.

It is possible to compartmentalize engineering groups. IBM did it last gen, as well as ATI when it supplied Nintendo and Microsoft's graphics chips.

If it's not happening now, it's something AMD should have to answer for.
 
I would have thought AMD would have something like Chinese Walls in effect, so the Orbis and Durango teams would be cut off from one another. Maybe not. Maybe you can't do that effectively with an engineering team. Anyway, MS and Sony should have had absolutely no idea what the other was working on, unless of course the folks at AMD don't take their NDAs seriously, or they each have some spies in there. I'm not sure either could have known whether they had the weaker or stronger GPU.
That's what I thought, but AMD has been stretched too thin over the last year or two (Wii U, Durango, Orbis, PC). Obviously, it's hard to have that many good engineers working on these projects without them "bumping" into each other.

The only reason I believe this is that Sweetvar26 is basically the most legit "source". He said Sony dropped Steamroller for an 8-core Jaguar @ 1.6GHz, and that MS is also using Jaguar. Then he said the GPUs are based on Pitcairn and Cape Verde, and that AMD is doing something special for Durango. Then everything he posted was deleted from GAF and from the Google cache, so I think he was pretty spot on.
 
It is possible to compartmentalize engineering groups. IBM did it last gen, as well as ATI when it supplied Nintendo and Microsoft's graphics chips.

If it's not happening now, it's something AMD should have to answer for.

Well, the other way it happens is someone from MS or Sony with loose lips giving a little too much info to a vendor. Maybe they talk to RAM vendors and give a little bit too much information about what they're working on, and then their competitor happens to call up to talk RAM and ends up getting the scoop. That's why we're under strict orders at my work to never tell vendors anything.
 
It's definitely possible that the leaks sprung from different points in the pipeline further down from AMD. There are points of potential overlap besides the hardware designers.
 
It's definitely possible that the leaks sprung from different points in the pipeline further down from AMD. There are points of potential overlap besides the hardware designers.

And it may not have happened at all, to end my derailing of this thread. My point was just that they most likely wouldn't have any idea they were "handicapping" their console relative to the competition.
 
With the specs either company has set at this moment, I wouldn't say it's impossible for at least minor tweaks to be made, depending on how they've currently designed their systems, and more so if they are mostly going with "off the shelf" parts as we hear.

That aside, I think we can take his comment to mean that, at the very least, they just don't want to be one-upped tech-wise in general.

No...see below...

Kaz is probably referring to multiple configurations that exist in-house. Hardware companies always have more than one prototype in flight. Minor stuff like storage config, additional units, bundled stuff, ...

I wonder if the config with additional compute unit(s) was thrown in late. We didn't hear about it until Eurogamer's article. Or it could be misreporting...

If it's only the storage subsystem, bundled games, controllers, etc. then that isn't even anything notable. You can do that mid generation just like with past consoles.

Anything that potentially impacts game performance, however, will be a non-starter, unless you don't plan to launch with any games. To hit the launch window, a game would have had to have been in development for at least the past year (a small-budget, limited-scale game) to 2+ years (a large-scale AAA title).

Hence, anything of any significance will likely have been set in stone at least a year prior to launch, which is why both consoles are likely using Southern Islands based GPUs rather than anything newer.

I suppose it may be possible to bump up memory capacity or CPU/GPU speed without impacting ongoing game development, but I find that unlikely, as in most cases it also means redesigning and requalifying the PCB. Not only the PCB, but also the case, power supply, cooling solution, etc.

Regards,
SB
 
That's what I thought, but AMD has been stretched too thin over the last year or two (Wii U, Durango, Orbis, PC). Obviously, it's hard to have that many good engineers working on these projects without them "bumping" into each other.

The only reason I believe this is that Sweetvar26 is basically the most legit "source". He said Sony dropped Steamroller for an 8-core Jaguar @ 1.6GHz, and that MS is also using Jaguar. Then he said the GPUs are based on Pitcairn and Cape Verde, and that AMD is doing something special for Durango. Then everything he posted was deleted from GAF and from the Google cache, so I think he was pretty spot on.

Maybe that's because Sony can and do design chips themselves; maybe they only need AMD to provide the CPU and GPU, whereas MS aren't really known for designing chips.

We have already heard there are things in "Orbis" that boost performance much like these custom blocks in "Durango"
 
Maybe that's because Sony can and do design chips themselves; maybe they only need AMD to provide the CPU and GPU, whereas MS aren't really known for designing chips.

We have already heard there are things in "Orbis" that boost performance much like these custom blocks in "Durango"
Yes, and Sony designed Pitcairn and the Jaguar CPU that will, coincidentally, be used in Durango too, right?
 
I suppose it may be possible to bump up memory capacity or CPU/GPU speed without impacting ongoing game development, but I find that unlikely, as in most cases it also means redesigning and requalifying the PCB. Not only the PCB, but also the case, power supply, cooling solution, etc.

That's what I said too, that it's not impossible. Nothing about likelihood. And nobody knows the real launch dates anyway. CPU/GPU/memory bumps for a 2013 launch would be nice-to-haves at this point, however unlikely.

Maybe that's because Sony can and do design chips themselves; maybe they only need AMD to provide the CPU and GPU, whereas MS aren't really known for designing chips.

Pretty sure they're designing the package themselves, and choosing to more or less just buy the parts they need.
 
Maybe that's because Sony can and do design chips themselves; maybe they only need AMD to provide the CPU and GPU, whereas MS aren't really known for designing chips.

We have already heard there are things in "Orbis" that boost performance much like these custom blocks in "Durango"

What if this so-called special sauce in the PS4 was just the Broadband Engine from the PS3, used for physics and boosting AI, and then for BC? Is that even possible or cost effective?
 
What if this so-called special sauce in the PS4 was just the Broadband Engine from the PS3, used for physics and boosting AI, and then for BC? Is that even possible or cost effective?

Cell is done. Nobody really wants to code for it anymore. Whatever the special sauce is, it'll likely be FPGAs, DSPs, or just more CUs.
 
Can you read?

I think he meant that, outside the CPU, GPU, USB, WiFi, SATA, Bluetooth, RAM, and the probably already integrated memory controller, there aren't many chips for Sony to design that would significantly change their product. But I don't know, perhaps they have a kick-ass scaler this time (only for 4K televisions).
 
These 3/4 helper chips in Durango aren't part of either the CPU or the GPU. It's entirely possible, although not probable, that Sony hasn't implemented something doing exactly the same thing.
 
There is a higher chance of 30fps vs 60fps than many believe. A 4GB GDDR5 UMA and 50% more GPU is huge. It's not a wash.

Going by what little info we have, Microsoft opted for a heavily customized GPU aided in its tasks by a couple of modules. Apparently, Sony went with a straight PC GPU, similar to what they did with the PS3. If I were to place a bet, I wouldn't put my money on the PS4... :smile:
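For what it's worth, the "50% more GPU" in the quote above maps to the rumored CU counts roughly like this. The 18 vs 12 CU figures and the 800 MHz clock are leak-era assumptions, not confirmed specs:

```python
# Back-of-envelope GCN throughput for the rumored CU counts
# (assumes 18 CUs for Orbis vs 12 for Durango, both at 800 MHz).

def gcn_tflops(cus, clock_ghz):
    # Each GCN CU has 64 ALUs doing 2 FLOPs (FMA) per clock.
    return cus * 64 * 2 * clock_ghz / 1000

print(gcn_tflops(18, 0.8))   # ~1.84 TFLOPS (rumored Orbis)
print(gcn_tflops(12, 0.8))   # ~1.23 TFLOPS (rumored Durango)
```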
 