NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

Status
Not open for further replies.
But we also have certain rumors that have pointed to 12GB in Durango dev kits. So really, 8GB isn't "half". For what the console is designed to do with that OS, you're going to need that RAM.

Are you sure about that? I don't think this was mentioned anywhere.

It's only what I've heard
 
The RAM bit alone remains puzzling and suggests non-final hardware, where we also can't properly account for the devkit overhead (devkits typically have twice the memory of the target console anyway). 8GB of RAM with 2.2GB of VRAM suggests a non-unified, PC-style setup that would seem more likely for an early devkit than final hardware.

I think the VGleaks and Eurogamer information has to be right with regards to final hardware. The Kotaku piece was most definitely about an early dev kit.
 
Well Lherre is a Ubi dev and SuperDaE is a troll, you do the math :p

Even so, scently does make a good point in that clock speeds could have made some difference in evening out the raw performance gap. Changing any major parts at this point, however, is just untenable for a 2013 release date.
Never knew Lherre was a Ubi dev, but if that's the case then I have no reason not to believe him. I remember Ranger had a conversation with SuperDaE a while back regarding the Durango alpha kit, and SuperDaE sounded rather oblivious to the hardware.
 
Doesn't mean anything, just a statement and possibly where a rumor about Nvidia-based devkits came from.
Could have been MS trying to keep the vendor secret very early on, could have been more representative of the performance they expected; I have no idea and no inclination to speculate.
I believe Nvidia FLOPS offer better performance, flop for flop, compared to AMD. It makes sense, since Durango has such a low FLOP count but will most likely offer good performance, so it's better to use an Nvidia GPU to measure the performance.

Also, Aegies has stated that he heard from Microsoft that final Durango hardware should perform similar to a GTX 680, which is, again, an Nvidia GPU used for comparison. Probably no hidden meaning there.
 
[image: 2cegdhv.png]

interesting
 
Aegies was also wrong on things. I think the 720 GPU will not be held back in any significant way from what developers want to do next gen, but comparing it straight to a GTX 680 is pretty out there.
 
I seriously doubt the credibility of superdae.

What kind of people, with legitimate access to devkits, would openly sell it on the internet?
What kind of people, without legitimate access to devkits, can be counted on to provide the latest accurate information about Durango and Orbis?

What is superdae supposed to be? Ballmer's lovechild?
 
More 53rd-hand information, but I'm told that early on MS were telling developers to use Nvidia GPUs to get a good idea of final performance.

Yes, this might be possible. All I've heard is that a good way to tell if a rumormonger is full of BS is if they're claiming Durango had non-AMD CPUs or GPUs at one point in the kits.

Though apparently the rumours of Durango having 16 cores (which were circulating around at one point) stem from the alpha kits having 8 extra cores to try and emulate some of the custom hardware.
 
For what it's worth, the target specs of Durango are from late 2011. I don't think they have changed, but I do believe SuperDaE. He is a known seller of dev kits, not only Durango's, and he has Orbis info as well. He trolls because he can't straight out say what's on his mind, but trolling along while giving bits of info is probably not very damaging for either company.
 
What's the point of putting GPU-like processors on a CPU, then having them work to bolster a GPU?
Different processors have different pros/cons, and there are some graphics tasks SPEs will be better at than the DX9 shaders in RSX, such as post FX or (mildly) branchy shaders. SPEs are pretty ideal as vertex shaders, so PS3 lost nothing with silicon spent on SPEs used for that job over using fixed VSes in RSX. But there's a whole other thread for discussing the pros and cons of PS3's design.
 
Aegies was also wrong on things. I think the 720 GPU will not be held back in any significant way from what developers want to do next gen, but comparing it straight to a GTX 680 is pretty out there.

I think he's similar to DaE, he has access to some information but has trouble interpreting it (though DaE seems to both have trouble interpreting and also a predilection for 'trolling for teh lolz')
 
I think he's similar to DaE, he has access to some information but has trouble interpreting it (though DaE seems to both have trouble interpreting and also a predilection for 'trolling for teh lolz')
Not really. DaE is a known dev kit seller, not just of Durango kits. He obviously has ties with someone in the industry, and I seriously doubt he can't interpret something like "12 CUs, 1.2TF etc.".
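For what it's worth, the rumored "12 CUs, 1.2TF" pairing is just straightforward peak-throughput arithmetic. A minimal sanity check, assuming GCN-style compute units (64 ALU lanes each) and the rumored 800MHz clock from the leaks, neither of which is confirmed:

```python
# Rough sanity check of the rumored "12 CUs = 1.2 TF" figure.
# Assumes GCN-style CUs (64 ALU lanes each) and the rumored 800 MHz
# clock from the leaks; both are assumptions, not confirmed specs.
def peak_gflops(cus, lanes_per_cu=64, clock_ghz=0.8, ops_per_cycle=2):
    """Theoretical peak: lanes * clock * 2 (a fused multiply-add = 2 FLOPs)."""
    return cus * lanes_per_cu * clock_ghz * ops_per_cycle

print(peak_gflops(12))  # ~1228.8 GFLOPS, i.e. roughly 1.2 TFLOPS
```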
 
Richard Leadbetter said this about Durango:

"So the question of the hour is, just how accurate is the information? Based on our communications with the leaker, the data appears genuine - the only real question is how recent it is. The proof presented by the source suggests that the data is at most nine months old: factoring in how long it takes to create a console, the chances are that there will not be many changes implemented since then.
 
Richard Leadbetter said this about Durango:

"So the question of the hour is, just how accurate is the information? Based on our communications with the leaker, the data appears genuine - the only real question is how recent it is. The proof presented by the source suggests that the data is at most nine months old: factoring in how long it takes to create a console, the chances are that there will not be many changes implemented since then.
It's older than that. Late 2011, more accurately.
 
I believe Nvidia FLOPS offer better performance, flop for flop, compared to AMD.

Those days are long gone. Nvidia's 1D shader FLOPS were superior to AMD's 5D and 4D shader FLOPS, but GCN has 1D shaders too:

AMD Radeon HD5870 (VLIW5 architecture, 5D shader) has 2.7 TFLOPS theoretical peak performance
AMD Radeon HD7850 (GCN architecture, 1D shader) has 1.7 TFLOPS theoretical peak performance

Nevertheless, the HD7850 is easily faster than the HD5870 in real-world scenarios. So it definitely makes sense to use an Nvidia 1D shader GPU in a 2011 devkit for a console that will be using 1D shaders in 2013.
 
Where is that late 2011 info coming from? Also, if that info hadn't changed by Feb 2012, then the info is effectively also from Feb 2012 :)
Aegies said his info is from Feb 2012, but the 8-core CPU and the 1.2TF number came before that. The first person who actually had it was the guy that posted it on pastebin. I think it was Dec 2011.
 