Predict: The Next Generation Console Tech

Status
Not open for further replies.
Well, if the Orbis is supposed to be affordable the memory can't be 4GB @ GDDR5. There is no way that is possible. My Nvidia 670GTX has only 2GB of GDDR5 and it was $400. So ummmm no way...
Just curious - how do you derive the cost of GDDR5 memory from the price of your GPU? Newegg is showing 2GB GDDR5 GPUs at $135 at the moment; that's a fair margin of delta not related to the memory...
 
Well, if the Orbis is supposed to be affordable the memory can't be 4GB @ GDDR5. There is no way that is possible. My Nvidia 670GTX has only 2GB of GDDR5 and it was $400. So ummmm no way...

Put that in a fanboy dream. It might have 4GB of memory, but it's definitely not GDDR5.

I don't think 4GB of GDDR5 is impossible. A 7850 with 2GB of GDDR5 is below $200. Your 670 is a near-top-of-the-line GPU.
 
Well, this is interesting...

I was having a look at the Dec 2011 pastebin article Proelite mentioned a few pages back as being one of few legitimate leaks that bkilian might be referring to (it certainly got both codenames right)
http://forum.beyond3d.com/showpost.php?p=1690531&postcount=17279

http://pastebin.com/j4jVaUv0

This in particular caught my eye:
The X-Box 3 is going to have an 8-core 64-bit processor (assumedly an i7 or similar design) rated at 1.2 Teraflops.

Now obviously that can't refer to the CPU flops so it must be the GPU flops - or possibly combined GPU+CPU/APU flops.
(the source probably isn't knowledgeable enough to distinguish between the two)

So perhaps 1.2 TF is the magic number?
 
So perhaps 1.2 TF is the magic number?

Or maybe it is too old?

XB3 is specified to use 4 GB RAM, and the PS4 will be shipped with 2GB.
The PS4 will feature a 4-core 32-bit processor.
I guess this pastebin was the first to attach the Durango and Orbis names to the next-gen consoles.

EDIT:

Right now, programmers are pretty much looking at it and comparing the two consoles like one is a high-end gaming PC and the other's a piece of shit eMachines unit from Walmart's $200 special (including monitor).

Lol...
 
It might be old, but I don't think all the info is out of date. We know bgassassin said 1+ TF was what was in the kits, and lherre and BG said it was weaker than the PS4's GPU at 1.8 TF.

Bkilian also said all the info is out there in the wild (and he said that in answer to people begging for hints at a TF number).

So maybe 1.2 TF is what we're looking at, unless there's another rumour with similar provenance that also has a TF number in the 1 to 1.8 TF range. I, at least, can't find any others.
 
Just curious - how do you derive the cost of GDDR5 memory from the price of your GPU? Newegg is showing 2GB GDDR5 GPUs at $135 at the moment; that's a fair margin of delta not related to the memory...

4GB of GDDR5 has only ever appeared on high-end cards. If GDDR5 were cheap, 4GB would be on most video cards, and that isn't the case. GDDR5 isn't nearly as cheap as DDR3; that is very true. GDDR5 is expensive.
 
It might be old, but I don't think all the info is out of date. We know bgassassin said 1+ TF was what was in the kits, and lherre and BG said it was weaker than the PS4's GPU at 1.8 TF.

Bkilian also said all the info is out there in the wild (and he said that in answer to people begging for hints at a TF number).

So maybe 1.2 TF is what we're looking at, unless there's another rumour with similar provenance that also has a TF number in the 1 to 1.8 TF range. I, at least, can't find any others.

My guess is that pastebin is legit (it got the Durango and Orbis names right), but I think it's outdated info.

And, if the PS4 was "weak", maybe now both have the same power (2GB -> 4GB and a beefier GPU for the PS4).
 
My guess is that pastebin is legit (it got the Durango and Orbis names right), but I think it's outdated info.

And, if the PS4 was "weak", maybe now both have the same power (2GB -> 4GB and a beefier GPU for the PS4).

Yeah, it's possible things have changed: the PS4 is beefier now and Durango has 2x the memory.
But we've got no indication that the 1.2 has changed, and that seems to be about right for what was in the alpha kits at least.
 
From the dualpixes.com rumor posted before:

It will not be called Playstation 4, teams have started to call the final name as Omni. Omni will be “very capable” of doing modern day graphics compared to a Direct X 11 level of technology like Unreal Engine 4 and Frostbite 2. Compared to Wii U, it is better, but not the biggest leap in the world according to developer friends of mine.

– Microsoft is the most vague out of all of them. It’s slated to be the most powerful out of all the next gen consoles about 4 to 6 times more powerful than Wii U and 2 to 3 times more powerful than Playstation Omni.
Microsoft is also running into manufacturing issues

This is interesting:
In 2014, Microsoft will introduce the “LiveWall”, a omni-projection unit that will allow game environments to be projected nearly 360 degrees around the user.

Looks like the IllumiRoom tech shown at CES.
 
Just because you've used liquid nitrogen, it doesn't mean that your claims about Piledriver are any more or less correct! The two things are unrelated.

You learn an awful lot about a CPU and its architecture when you start going below 0°C.

Especially about clock scaling vs voltage vs cold

AMD chips have horrible leakage, one of the reasons why they love the cold!

As it happens, you're off the mark about Piledriver. Here is a Piledriver overclocked all the way up to 5 GHz and over-volted to a whopping great toasty 1.5 V.

http://www.techpowerup.com/reviews/AMD/FX-8350_Piledriver_Review/7.html

I'd better message Tom on Facebook then and tell him he's doing it wrong.

http://www.overclock3d.net/reviews/cpu_mainboard/amd_vishera_fx8350_piledriver_review/9

205W off the 8-pin and 254W for the entire system (presumably at the wall, or it wouldn't be the entire system). At 4.6 and undervolted you'd have to be looking at well under 150W.

(Edit: Although if they're testing using something puny like "noob stable" wprime then actual max could possibly be higher)

They don't say how they stressed the CPU; it could be 205W when just running the benchmark suite.

Bulldozer was a hot mess. Piledriver shows massive improvements in performance per watt though (on the same node and with a short turnaround), and 2 Piledriver modules in a console at about 3.5 GHz would be well under the current-gen launch power envelopes. I'm looking forward to seeing if Richland squeezes any more improvements out!

I think I would put a Phenom II in a console instead: better power draw and better IPC.
 
4GB of GDDR5 has only ever appeared on high-end cards. If GDDR5 were cheap, 4GB would be on most video cards, and that isn't the case. GDDR5 isn't nearly as cheap as DDR3; that is very true. GDDR5 is expensive.

So by your rationale GPU fans must also be expensive too.
 
So, I suppose I am wrong and I must have been ripped off. I only got 2GB and the most expensive ATI cards had 3GB of GDDR5.

For AMD's 384-bit memory configuration it would be either 3GB or 6GB; 6GB is really not needed, and that's why you don't see many SKUs with so much memory.

I just checked EVGA's EU webshop: a 2GB GTX 670 Superclocked+ goes for 406,90€ and the 4GB version for 429,90€. Other than the memory, those cards are identical.

Going from 2GB to 3GB costs about 12€ on the 660 series.
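For anyone who wants to sanity-check that, here's the arithmetic as a quick Python sketch. The prices are the ones quoted in this post; since these are retail figures, the delta also includes retailer margin, so treat the result as a rough upper bound on the memory cost per GB:

```python
def eur_per_gb(price_small, gb_small, price_large, gb_large):
    """Retail price delta per extra GB between two otherwise-identical SKUs."""
    return (price_large - price_small) / (gb_large - gb_small)

# 2GB GTX 670 SC+ at 406.90 EUR vs the 4GB version at 429.90 EUR
print(round(eur_per_gb(406.90, 2, 429.90, 4), 2))  # -> 11.5 EUR per extra GB
```

That lines up with the ~12€ per GB delta on the 660 series, and it's an order of magnitude below what the "$400 card, therefore expensive memory" argument implies.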
 
Can the 1.2 TFLOPS spec of Durango give more "real performance" than the 1.8 TFLOPS of the PS4? If the RAM is faster (GDDR5 vs DDR3) and the GPU is better (20-33%), how can Durango be the more powerful gaming hardware?

We have yet to see Sony packing 4 GB of GDDR5, and we don't know if MS is using DDR3 or DDR4. Four Piledriver cores (two modules) have less FLOPS throughput than eight Jaguar cores, and it's 4 threads vs 8 threads (although they are much faster ones). Overall I think we can call it even on the CPU side. As for the GPU part, there are no talks of particular optimization for the PS4 or of HSA support.
What I think is that Sony will launch a beefed-up Trinity, while MS will have a more balanced and future-oriented SoC. Having 8 GB of RAM may help with voxel-based rendering techniques.
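To put the "call it even" CPU comparison in numbers: theoretical peak is cores × clock × FLOPs/cycle. A rough sketch, where the clocks are guesses and the per-cycle figures (Piledriver's shared 2×128-bit FMAC per module counted as 16 SP FLOPs/cycle with FMA, Jaguar's 128-bit units as 8 per core) are assumptions for illustration, not confirmed console specs:

```python
def peak_gflops(units, ghz, flops_per_cycle):
    """Theoretical peak single-precision GFLOPS: units * clock * FLOPs/cycle."""
    return units * ghz * flops_per_cycle

# Two Piledriver modules vs eight Jaguar cores, at assumed clocks.
piledriver = peak_gflops(2, 3.2, 16)  # per module: shared 2x128-bit FMAC
jaguar = peak_gflops(8, 1.6, 8)       # per core: 128-bit FP units
print(piledriver, jaguar)  # roughly 102 GFLOPS each at these assumed clocks
```

At those (assumed) clocks the raw throughput comes out about even, which is why the thread-count and per-thread-speed trade-off matters more than peak FLOPS here.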
 
Well, if the Orbis is supposed to be affordable the memory can't be 4GB @ GDDR5. There is no way that is possible. My Nvidia 670GTX has only 2GB of GDDR5 and it was $400. So ummmm no way...

Put that in a fanboy dream. It might have 4GB of memory, but it's definitely not GDDR5.

And in late 2005, the X1800XT 512MB and 7800GTX 512MB went for $599 and $649 respectively. Yet, MS got the exact same memory in the 360 without taking too big of a loss on hardware. Clearly the GDDR3 wasn't a huge percentage of the cost of those cards, and that was very much a far worse scenario than today - they were cutting edge cards and the most expensive things out.

That 2GB of GDDR5 in your GTX 670 probably costs no more than $20. 4GB shouldn't be much more than $50, as of now, not 6 months from now.
 
This isn't GAF , where being ignorant and having blind faith in your faction of choice is a virtue.

If you'd actually read what I said, you'd know these are not my assumptions from that photo (which is legit; if you don't even know that much, you are way behind).

My assumptions from that photo were that there was a 2+ TF GPU in the kits.
Bkilian & lherre have both shot that down and so since the evidence no longer supports it, I have revised my position.
Which also ties in perfectly to what bgassassin was saying originally.

And yes, while it is weak, a 1.2 TF GPU like the 7770 would be about 8-10x the performance of Xenos (given the improvements in efficiency in modern GPU architectures). Plus, this is before you look at any MS-specific customisations.
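A quick back-of-envelope on that 8-10x figure. Xenos is commonly cited at ~240 GFLOPS peak, so 1.2 TF is 5x on paper; the 8-10x comes from assuming modern GCN-class architectures extract roughly 1.5-2x more real performance per theoretical FLOP than Xenos. Those multipliers are assumptions, not measured numbers:

```python
xenos_gflops = 240.0     # commonly cited Xenos peak
durango_gflops = 1200.0  # rumoured 1.2 TF figure
raw = durango_gflops / xenos_gflops
print(raw)  # 5.0x on paper
for eff in (1.5, 2.0):
    print(f"~{raw * eff:.1f}x effective at {eff}x per-FLOP efficiency")
# spans 7.5x to 10x, in line with the 8-10x ballpark
```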

I read your post and those of Bkilian (I've been following the thread for some time), and it still seems to me more a theory than concrete information, but then again anything is possible, and you're probably right.
Assuming the GPU will be 1.2 TFLOPS, I don't understand what they can do to increase performance.
All the customisations in the world can never transform a go-kart into a Ferrari.
Unfortunately, if the next Xbox and PS4 really are that underpowered, the ones who will suffer the most are the users who play on PC.
 
Wasn't RSX supposed to be a 1.8 TF part? I know the claim was the PS3 was 2 TF.

I suppose it's become a measure this time because AMD is providing both GPUs.
 