Predict: The Next Generation Console Tech

Angry Mod :devilish:

That's the last time I'm going to do any cleanup. I've moved the Durango vs Orbis discussion to the Durango vs Orbis thread. I've moved the 'why bother with consoles instead of PC' to the Next-gen discussion thread. I will heartlessly delete OT posts from here on in and temp-ban anyone who's been here long enough to know better. If some noob comes in taking the thread OT with talk about the market for these platforms, the correct response from the regulars is to point them to the appropriate thread and not carry on another line of discussion here. This thread is filling up several pages overnight for me to review, and it's almost entirely unrelated to determining the hardware of the new machines.
 
Well, if the Orbis is supposed to be affordable, the memory can't be 4GB of GDDR5. There is no way that is possible. My Nvidia 670GTX has only 2GB of GDDR5 and it was $400. So ummmm, no way...

Put that in a fanboy dream. It might have 4GB of memory, but it's definitely not GDDR5.

Your 670GTX cost Nvidia ~$150 to produce [fully assembled board], and the RAM was in the ~$20-30 range. Furthermore, no one is currently using 512MB GDDR5 chips; those are rumored to enter mass production in early 2013. Sony can only reach a 4GB unified GDDR5 pool if they use those chips. I can't see them reaching that with 256MB ones.
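
A rough sanity check on the chip count (assuming standard x32 GDDR5 devices on a 256-bit bus without clamshell mode; that layout is my assumption, nothing leaked):

Code:
# How much GDDR5 fits on a 256-bit bus with standard x32 devices (assumed layout)
bus_width_bits = 256
chip_width_bits = 32
chips = bus_width_bits // chip_width_bits          # 8 chips

for chip_mb in (256, 512):                         # 2Gbit vs the rumoured 4Gbit parts
    print(f"{chip_mb}MB chips x {chips} = {chips * chip_mb // 1024}GB")
# 256MB chips x 8 = 2GB, 512MB chips x 8 = 4GB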
 
The thread was locked for a bit there, you must have been an angry cookie :smile:.

An FX8350 at 4.6GHz will easily pull 200W+, undervolted or not.

I never said to put an 8-core 4.6GHz CPU in a console. I said you wouldn't need to drop the clocks by half. I can tell you my chip isn't pulling 200 watts; I don't have the cooling to handle that kind of load on my ESXi server.
 
Gossip from a Chinese site:

http://club.tgfcer.com/thread-6596213-4-1.html

Can't read Chinese, but apparently a former Ubisoft employee claims:

- AMD-CPU with 8 cores
- GPU from the HD 8000-series
- 8GB RAM
- 640 GB HDD
- Windows 8 kernel
 
Post from neogaf:
http://m.neogaf.com/showpost.php?p=46311321
http://m.neogaf.com/showpost.php?p=46312120

thuway: A source has recently come out and said the latest PS4 devkit has 4 GB GDDR5. He also mentions there will be no APU + GPU; it will be APU only. The rest of the specs match up almost exactly with the VGleaks. I could say something to get people fired, but the VGleaks page has the majority of things correct. Things can change from now till launch.

Originally Posted by DieH@rd:
Was there some new development [looks at last few pages]? New VGleak? Confirmation of some old vgleak?
Some good information has come through the grapevine that the VGleaks article was almost 100% correct and that Sony has upped the RAM to 4 GB. It is 192 GB/s unified :). Also, the Durango specs have been leaked and apparently some mystery sauce is at work.

Through PM I was just told I am almost right. This is your Orbis, folks :).

If this is true, then with 4GB GDDR5 at 192GB/s and an 18 CU GPU, how much performance can we expect from Orbis, and will it suffer compared to the 8GB DDR3/4 Durango?
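
Back-of-envelope on the compute side, assuming GCN-style CUs (64 ALUs each, 2 FLOPs per ALU per clock) and a guessed clock; the 800MHz below is purely my assumption, not from any leak:

Code:
# Ballpark GCN throughput: CUs * 64 ALUs * 2 FLOPs per clock * clock (assumed clock)
def gcn_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(gcn_tflops(18, 0.8))   # ~1.84 TFLOPS for 18 CUs at an assumed 800MHz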
 

I know some people at neogaf are insiders, but I just don't believe thuway. I think he is guessing.
Again, some of the numbers he put out earlier are different from what they are now.

I think 4GB of GDDR5 would be great of course, but I just think it's still too expensive no matter how you stack it.
 
Nothing about the processors right now is nearly as interesting as how everything is going to be connected. Such GFLOPS figures are meaningless. If they weren't, then Sony could just as well have stuck with the Cell, or left out the CPU altogether, because none of those CPUs have impressive GFLOPS figures (Cell was 228 GFLOPS or something like that, iirc).
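
For reference, the usual back-of-envelope arithmetic behind that Cell figure (standard peak numbers, not anything from this thread; the exact total depends on how many SPEs you count and whether you add the PPE):

Code:
# Single-precision peak per SPE at 3.2GHz: 4-wide SIMD * 2 (multiply-add) = 8 FLOPs/clock
clock_ghz = 3.2
spe_gflops = 8 * clock_ghz            # 25.6 GFLOPS per SPE
ppe_gflops = 8 * clock_ghz            # the PPE's VMX unit peaks at roughly the same rate

print(7 * spe_gflops)                 # ~179 GFLOPS from the 7 SPEs enabled on PS3
print(8 * spe_gflops + ppe_gflops)    # ~230 GFLOPS for a full 8-SPE Cell plus the PPE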

What will be really interesting is busses, bandwidth, who controls what memory, etc.
 
If this is true, then with 4GB GDDR5 at 192GB/s and an 18 CU GPU, how much performance can we expect from Orbis, and will it suffer compared to the 8GB DDR3/4 Durango?
There's our first victim enjoying a 2 week ban because they can't follow the simplest of instructions.
 
192GB/s GDDR5. What bus does that need? Could it be some stacked memory, or a move to that later?

This is what I thought, actually, especially as the price of GDDR5 goes up later in the gen. They could just wait for AMD to do all the R&D work on the memory controller redesign for using stacked DDR4 instead of GDDR5, and leverage that.
 
192GB/s GDDR5. What bus does that need? Could it be some stacked memory, or a move to that later?

A 256-bit bus with 6Gbps chips. If the chips are 4Gb in density then it's only 8 of them.

In theory you could replace the memory with something else down the road, as long as the memory bandwidth, latency, and anything else visible to the software look the same.
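
Spelling out that arithmetic (standard x32 GDDR5 parts assumed):

Code:
# 256-bit bus with 6Gbps/pin GDDR5, the configuration suggested above
bus_bits = 256
gbps_per_pin = 6
print(bus_bits * gbps_per_pin / 8)   # 192.0 GB/s

chips = bus_bits // 32               # x32 parts -> 8 chips on the bus
print(chips * 4 / 8)                 # 4Gbit per chip -> 4.0 GB total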
 
I'd better message Tom on Facebook then and tell him he's doing it wrong.

http://www.overclock3d.net/reviews/cpu_mainboard/amd_vishera_fx8350_piledriver_review/9

You said that "An FX8350 at 4.6GHz will easily pull 200W+, undervolted or not", which wasn't true, and it was that claim I was responding to. Posting an awful result by "Tom" doesn't actually change that.

Tom's results at Overclock3D are insane. I don't even. His VRMs must have been lit up like a Christmas tree. You probably *should* message Tom and tell him that 1) wPrime is a weak-sauce torture test and that 2) his results are insane.

Even Anand - who so couldn't be arsed that they didn't even list the overclocking voltage other than to say "less than 10% extra" - got to 4.8GHz under load with a total power draw of 294W at the wall. That's only 40W above TechPowerUp (and likely with higher voltage).

Piledriver is not the power hog that you think it is. Looking at Trinity, you can see AMD's mainstream desktop part with 4 cores at 3.4GHz / 4.0GHz turbo, complete with a pretty meaty integrated GPU, at 65W. That's a very console-friendly area, and those 4 cores would probably be preferable in programmer terms to 8 Jaguar cores. And a console would likely use something newer than Trinity: Richland should squeeze out a little more performance per watt, and Steamroller (if it makes it in time) would see another jump.
 
Just for fun:

"Romance in Durango by Bob Dylan"

Maybe that's where the codename comes from ;)

Anyway, Microsoft's mystery sauce must be really great, because we haven't heard developers screaming about a lack of power in Durango.

Rangers, you said there are 3 "external" custom hardware parts that can help the GPU. One is the audio DSP, I guess, and another is some sort of fast coherent cache between the two (the SRAM)... the third one, well, it may be the most interesting one. Did we dismiss the interposer for good? The SoC seems small; it could fit nicely on an interposer. Having hundreds of GB/s of bandwidth would surely impress developers, making up for the lower raw power.
 
Seems like it's pointing to the stacked RAM on a 2.5D interposer rumours being true, either in a 512-bit or even a 1024-bit configuration. Or Wide IO: is that even possible for GDDR5? I've only seen it referenced with regard to DDR3/4; would GDDR5 be too hot for stacking?

EDIT: the leak is in reference to the dev kit; it may be using GDDR5 even if that's not what is planned for the final console. Anything that can provide 192GB/s cost-effectively is on the table.
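
Just to put numbers on it, the per-pin rate needed for 192GB/s drops quickly as the interface widens (the widths are the rumoured ones above; the rates are simple arithmetic):

Code:
# Per-pin data rate needed to reach 192 GB/s at various interface widths
target_gbs = 192
for width_bits in (256, 512, 1024):
    print(f"{width_bits}-bit: {target_gbs * 8 / width_bits} Gbps per pin")
# 256-bit: 6.0, 512-bit: 3.0, 1024-bit: 1.5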
 
A 256-bit bus @ 6Gbps/pin - hot and expensive, IMHO.
Cheers

Seems like it's pointing to the stacked RAM on a 2.5D interposer rumours being true.
Well that's narrowed it down... :D

I feel inclined to think stacked RAM, given Vita's use of it and the costs of a 256-bit bus. That must be Sony's aim anyhow, even if the launch machine (end of this year, reportedly) has to be a conventional system.
 
Wouldn't the motherboard also be massively reduced in complexity with an APU and RAM all in one SoC? Maybe a big cost saving here could also offset some of the potential extra cost of being first with 2.5D/stacking.
 
Stacked memory requires significantly less power to run, and thus produces less heat. The interconnects are so much shorter that less energy is needed to drive them. The same principle applies to logic dies, so the idea of stacking in general is super efficient.
 