Predict: The Next Generation Console Tech

Status
Not open for further replies.
Despite having a 256-bit memory bus, the memory controllers on Pitcairn are quite small relative to the total die according to die photos. (Tahiti, however, is another matter.) There should still be space around the perimeter of a future die-shrunk GPU/APU to fit them in, especially if there is a sizeable CPU component on the same die:


I'm not an expert on these matters though. There still could be problems routing traces from the memory controllers to the board on a die shrunk GPU/APU.
I'm not an expert either by far. I believe that the issue is not the size of the memory controller but the physical IO (pins or bumps at the back of the chip).

I wouldn't mind making the thread, but it seems like the specs leaked by bg's source is the only one that matters. :p
If so I guess there is no need :)
Also, I agree with most of what you said but are we sure they are going to be starting off using a SoC? I always imagined they would start off with discrete chips and move to a SoC down the line for cost reduction.
Even with two chips, that's a lot of power.
I won't post a picture of an HD7850 cooling solution since everybody knows what it looks like, nor will I post pictures of PC CPU coolers, for the same reason.
The HD7850 runs reasonably cool, but its cooling system is substantial.
Even if Sony were to get its hands on a PD with, say, a TDP of 45 watts, that still means a pretty impressive cooling system (when you take both chips into account).
The measurements above are made inside a pretty big PC tower; if you pack all that into a significantly tinier box... I don't know, it sounds a bit on the high side to me.

Even with the SoC I was considering, I would not be surprised if the clock of the GPU had to be adjusted down. I'm not sure one would want more than 100 watts from a single chip to dissipate in a not-that-big box.
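To make the worry concrete, here is a rough power-budget sketch. All figures are the hypothetical ones from this discussion (a ~45W PD-class CPU, a ~100W single-chip GPU ceiling) plus a guessed overhead for the rest of the board, not real specs:

```python
# Rough console power budget -- every figure here is an assumption
# taken from the discussion above, not a confirmed spec.
cpu_tdp_w = 45        # hypothetical PD-class CPU
gpu_tdp_w = 100       # suggested ceiling for a single GPU chip
board_overhead_w = 15  # assumed: RAM, optical drive, VRM losses

total_w = cpu_tdp_w + gpu_tdp_w + board_overhead_w
print(total_w)  # 160 -- a lot of heat for a small box
```

Even with generous rounding, the total lands well above what a console-sized enclosure has traditionally dissipated, which is the point being made about down-clocking.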

As for the price reduction, especially through a shrink, I wonder. It seems to get costlier and costlier at every stage: R&D, wafers, etc. Actually, looking at TSMC's 28nm process after more than one year in production, it seems it's still not mature. Capacity might still be a bit problematic, with an impact on price. One may wonder if sticking to a process may save more money than pursuing shrinks. When is 22nm going to be available? I would bet that before production is mature and capacity is no longer constrained, it could be 3 years or more from now.
It's my bet, but Sony in their sucky financial situation should plan on being comfortable with the part. If they can shrink the chip after more than 4 years (for the second part of the product's life), all the best, but they have to be comfortable from the start.
 
Last edited by a moderator:
What would they be lying about? All is according to expectation.

By seven series GPU they meant RV700 not HD7000 if that's what you mean.



You're thinking of their article on DaE, not Eurogamer itself saying anything. That is only as good as you trust DAE. Who later changed to say it was an AMD GPU (but still Intel CPU) anyway.
Oh right, I mixed them up. And he changed to say it was an AMD GPU? Any link? (Well, I don't really believe his rumor anyway.)
 
Use some common sense for a minute. Do you really think it makes sense for archrivals who are directly competing against each other to use identical GPU hardware from the exact same GPU hardware vendor when there is equally good GPU hardware available from another experienced and established GPU hardware vendor? If Microsoft/Sony used different GPU hardware vendors for Xbox 360/PS3, then why would anyone in their right mind expect anything different for Xbox 720/PS4?

That isn't how these things work. You consult with the available vendors, come up with a proposal, and then evaluate the proposals. The fact that your competitor is using the same company is largely immaterial as long as you have reasonable confidence that the company will properly firewall everything. Both consoles using the same GPU vendor with the same feature set is largely advantageous for both companies.
 
It almost makes me think back to those 6670 rumors :/ Which is supposedly only 570 Gflops.

I've always wondered if they chose the 6670 in the devkit (assuming it's true) because it would have a shader count similar to what the final GPU may have.

6670 -> 480 shaders @ 800 MHz
hypothetical 720 GPU -> 512 shaders @ clock TBD by power dissipation
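For reference, theoretical peak shader throughput is shaders × 2 FLOPs per clock (multiply-add) × clock. A minimal sketch; the 512-shader part is the hypothetical 720 GPU above, and any clock below the 6670's stock 800 MHz is an assumption:

```python
def peak_gflops(shaders, clock_ghz, flops_per_clock=2):
    """Theoretical peak: shaders x FLOPs/clock x clock (GHz)."""
    return shaders * flops_per_clock * clock_ghz

print(peak_gflops(480, 0.8))  # 768.0 -- HD 6670 at its stock 800 MHz
# The ~570 GFLOPS figure mentioned above would imply a lower clock,
# e.g. a hypothetical 600 MHz devkit clock:
print(peak_gflops(480, 0.6))  # 576.0
```

Which is why the clock the final 512-shader part could sustain within its power budget matters more than the shader count alone.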
 
This isn't really the thread to compare the performance of rumoured specs. Maybe comparing platforms should start in a new thread when we actually know what the boxes are packing?
 
WOW I feel like this is heading nowhere and we've been there already on top of it ;)

How about we change topic?

Itsmydamnation posted this interesting chart about how Bobcat performed vs K8 and K10 parts:
[Chart: benchfin2.png — Bobcat vs K8/K10 benchmark results]

That's really interesting; Jaguar may very well outperform those cores. Pretty reassuring, ain't it?
 
What about the 8GB rumour for Durango? The devkit is supposed to have 12GB.

This new "rumor" (1TF, 4GB RAM) comes from proelite who got the info "in a PM from a dev". Though, I understand the info is a couple months old.

bg stands by his 8GB stance/info.

Then again, the dev kit may be x, y, or z, but there's some faction within MS who might know what the final target is. So there may be a newer target within MS than the dev kit info bg may have (or, not).

However the whole "6t, 4g" or whatever riddle in the other thread also makes sense to come out to 4GB for games, if it means anything...
 
Greetings to everyone. While none of you may know me, I feel very much a part of this community. After many years of quietly following these forums, I have finally decided to speak.

I read SemiAccurate's article today about Intel's Haswell processor and specifically its GPU. The chip has "special" memory - Crystalwell - it's an APU that will be less than 100W, and it's x86. Intel has the whole console. There's no AMD, no NVidia.

My 2 cents.
 
I thought the accepted wisdom was that Intel wasn't gonna leave their high margin cpu business for the low margin console cpu/gpu business?
 
Hi, I've opened a new thread whose only purpose is betting.
I tried to make a comprehensive form to give a normalized frame for everybody to place their bet.

The topic is here.

Going "on record" is imho entertaining, I'm still hesitating about what to post.

Please let me know if anything obvious is lacking from the form; so far nobody has answered, so editing the form or rules shouldn't be that troublesome :)
 
I'd be more prone to patronize the on record thread if we didn't have decently hard rumors about the specs already. That makes it kind of pointless to me. As I'm basically guesstimating which rumor I believe more.
 
I read SemiAccurate's article today about Intel's Haswell processor and specifically its GPU. The chip has "special" memory - Crystalwell - it's an APU that will be less than 100W, and it's x86. Intel has the whole console. There's no AMD, no NVidia.
Haswell, even with its external RAM module, is far too low-performant to be a next-gen console. Haswell maybe matches PS3 or the 360 in GPU performance, it's hard to know since there aren't any benchmarks out yet, but from the amount of silicon intel can reasonably squeeze in there and the amount of power it can draw (less than 100W for both CPU and GPU all together) it certainly won't be (much) more than that.

Also, Intel chips are friggin' expensive.
 
I'd be more prone to patronize the on record thread if we didn't have decently hard rumors about the specs already. That makes it kind of pointless to me. As I'm basically guesstimating which rumor I believe more.
Not to sound rude, but there is a lot more to patronize lately than the thread I opened, imho.

I obviously disagree with you; a few months ago, what we supposedly knew of Durango almost turned out to be "what we supposedly know of Orbis".
We went from "AMD has all the contracts" to rumors, a few days ago, of Durango dev kits including Intel and Nvidia parts.
We have those legit but really early leaks about the Yukon platform.
For Sony we don't hear that much.

Imo it's the perfect time for one to place a bet. After years, it seems that most people have come to their senses and we no longer hear of 4TFLOPS systems with xx CPU cores, etc.

There is more room than ever for educated bets.

Whether you want to make a bet is another matter altogether; I'm all for freedom :). But don't take it too seriously. The same applies to the rumors of late; they seem a bit too much all over the place for me. There seems to be an overall trend emerging, but details (as important as the CPU and GPU providers) keep changing at a fast pace.
 
Haswell, even with its external RAM module, is far too low-performant to be a next-gen console. Haswell maybe matches PS3 or the 360 in GPU performance, it's hard to know since there aren't any benchmarks out yet, but from the amount of silicon intel can reasonably squeeze in there and the amount of power it can draw (less than 100W for both CPU and GPU all together) it certainly won't be (much) more than that.

Also, Intel chips are friggin' expensive.
Well, as much as I would want Intel CPUs, I indeed can't believe that Intel would have an incentive to sell its CPUs at a bargain. One option I could believe is that they plan to have extra capacity on an older process (i.e. 45nm) for quite a while and are willing to sell quad-core SnB chips at a bargain. (/OT The scary part for AMD, if Haswell performs well enough, is that Intel could use fully functional SnB cores as its Pentium line.)
Not much hope though.
The next Atom could match some rumors if it supports hyper-threading. It seems designed to scale up to four physical cores; with hyper-threading, that's 8 logical cores. There are caveats though: they are set to release in Q4 2013, at the same time as Durango. I might see Intel shipping a lot of those; they use the same process as their high-performance products (Haswell), which I could see having quite some success too. I would be scared for production capacity; maybe I'm wrong and Intel can deliver.
Not to mention that, no matter the next Atom's prowess, depending on the resources MSFT plans to devote to the OS and services, 4 Atom cores could fall short of their performance goals.
 
I'm not really up on all the tech jargon regarding the new hardware, but can somebody explain to me why Microsoft can't go with cheaper hardware that, although maybe less impressive than the competition at launch, is part of a new architecture that gets regular updates similar to the smartphone model? So although it might not be better than PS4 at launch, with another update a year or two down the road it could be. I'm thinking along these lines due to Microsoft's relative success with Kinect & the Slim, and the idea that they may want to target more than just games, like more apps & possibly Windows 8 functionality. The leaked document even hinted at possibly licensing hardware. So the next Xbox might eventually turn into a sort of Windows Home Server. Creating a stagnant architecture that only gets updated once every 5 - 8 years may not be the best idea. Going with a simpler & cheaper scalable architecture sounds like a good way to hit their targets while leaving room for future growth. Tell me I'm crazy. ;)

Tommy McClain
 
By 'patronise' Ranger's means visit and partake in.
Oops, an English issue on my side. Sorry Ranger, I missed the point of your post; I thought you were trying to be a bit rude. Damn, it's bothersome; I've got to work harder on my English, it's so frustrating :(
--------------------------
By the way, I made my bets here and here. I think I took some risks on both systems, especially on the Microsoft side, as I bet on a complete Nvidia system; one has to live dangerously, and at least on the internet it's cheap.
I couldn't help it and put my bets on solutions that include a bit of exoticism:
an interposer for Sony, ESRAM for MSFT (scratchpad memory on the GPU).

As in the leaked paper, I expect MSFT to include a 32nm rendition of Xenon (not Valhalla, Xenon alone).

Overall I would expect MSFT's product to have sexier power characteristics, especially once they get rid of Xenon. The Nvidia GPU and the SoC should be easier to cool than the Sony SoC; Kepler has really good power characteristics, and I clocked the part pretty low (vs shipping products).

As opposed to quite a few people here, I think that the next generation won't be a battle of FLOPS but of fillrate (with blending), which still seems a critical metric. Sebbbi has spoken of the benefits of "free blending" on the 360 so many times, and vouches for it as a more relevant 360 strength than Xenos' unified shader architecture.
I bet that both companies will make different choices to try to push it as high as they can: Sony through an interposer and Wide IO, MSFT with on-die scratchpad memory. I pray for an interesting battle.
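As a yardstick for the fillrate argument, peak pixel fillrate is roughly ROPs × core clock. The Xenos figures (8 ROPs at 500 MHz) are public, and the eDRAM daughter die is what made blending on top of that effectively free of main-memory bandwidth cost:

```python
def fillrate_gpixels(rops, clock_ghz):
    """Peak pixel fillrate: ROPs x core clock (GHz) -> Gpixels/s."""
    return rops * clock_ghz

print(fillrate_gpixels(8, 0.5))  # 4.0 Gpixels/s for Xenos
```

Both the interposer/Wide IO and on-die ESRAM approaches attack the same bottleneck: keeping those blended pixels fed with bandwidth.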

I believe that Sony's will be cheaper to produce. The interposer may face issues at first (defective connections result in the waste of a SOC and two 4mb memory chips), but once that's ironed out it should be pretty cheap. I expect Sony's SOC to be around the same size as, or tinier than, MSFT's GPU (due to the ESRAM).
I expect MSFT's system to be more expensive and MSFT to subsidize it significantly. More chips, the Xenon overhead for BC, Kinect, then the OS and services overhead; all that translates into hardware requirements and ultimately money.


If interposers are not problematic, I could see Sony in the grey at $299; they may even push a $50 rebate during the holiday season if production capacity is not a concern. I hope Sony understood that their best bet is to lead on the price front this time around.

That's it for me. Come on guys, let's take some (cheap) risks.
It's funny, but the funniest case would be (assuming more people bet) that nobody ends up close to the real specs.
 