NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

Status
Not open for further replies.
My brother and I enjoyed many hours with Race Drivin' on the Genesis. It wasn't THAT bad. The 3D novelty never really wore off. It had a pretty cool track editor too.

Oh, the SNES version is specifically very slow (unplayable), because its 3.5MHz CPU is on its own, vs. a 7 or 8MHz one on the Genesis.

Well, we'd have each console better than the other in some areas and vice versa, with one maybe keeping the edge most of the time. It was the same thing with the X360 vs. PS3, but less dramatic, as sound and the number of colors shown on screen will be the same.

As for the Jaguar CPU, I've always been skeptical of the idea, but we don't know much anyway.
I find two Steamroller modules + 1 teraflop of GPU on the APU itself more believable, but I wonder which memory scheme feeds it.
 
I doubt we will see Cape Verde in Durango. There have been too many rumors about the Durango GPU being custom, and I'm sure MS would like to throw in as many shader units as possible and drop anything they don't need from the "base" that they will likely take.
 
What about just using some dedicated GCN CU's as a coprocessor instead.

8 Jaguar cores at 2GHz = 128 gigaflops
3 CUs (192 shaders) at 500MHz = an additional 192 single-precision gigaflops
Total: 320 gigaflops, and all probably within 25-30 watts.
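Those peak figures are easy to sanity-check. A minimal sketch, assuming 8 single-precision flops/cycle per Jaguar core and 2 flops/cycle (one fused multiply-add) per GCN shader lane:

```python
# Rough peak single-precision throughput check for the figures above.
# Assumptions: 8 flops/cycle per Jaguar core (2x 128-bit SP pipes),
# 2 flops/cycle per GCN ALU lane (one FMA counts as two flops).

def peak_gflops(units, flops_per_cycle, clock_ghz):
    """Peak single-precision gigaflops for a block of identical units."""
    return units * flops_per_cycle * clock_ghz

cpu_gflops = peak_gflops(units=8, flops_per_cycle=8, clock_ghz=2.0)
gpu_gflops = peak_gflops(units=192, flops_per_cycle=2, clock_ghz=0.5)

print(cpu_gflops)               # 128.0
print(gpu_gflops)               # 192.0
print(cpu_gflops + gpu_gflops)  # 320.0
```

So the 320 GFLOPS total holds up, given those per-unit throughput assumptions.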

When using the Xbox for streaming media/movies or just dashboard apps, the whole system could probably execute out of that APU alone. It could probably be passively cooled as well.

When gaming, the discrete GPU fires up and adds raw performance similar to a discrete Cape Verde's, 1+ TFLOPS.

Merits of keeping the GFLOPS generation on the x86 cores vs. on a separate array? I've always assumed the on-core approach involved less latency or something like that, because in the end it's the x86 cores that decode, send, and control instructions for the rest of the CPU, yes? So I've always believed in the "keep your friends close" idea when it comes to computers. Or is that a fallacy? I'm not an electrical engineer.
 
They won't use Cape Verde... wait, actually, who said MS will use Cape Verde in the first place?

Btw, I remember there was a rumor saying the PS4 will support 4K, right? Yeah, and not just the PS4; Durango will support 4K as well. I got that a few hours ago because my friend just visited a game company, and he also said something about next gen's... art design? Or graphic design? I don't really know what he was talking about (other than the 4K support), so I'll reply to it later once I know what it is. It should be about next gen's graphics.
 
I doubt we will see Cape Verde in Durango. There have been too many rumors about the Durango GPU being custom, and I'm sure MS would like to throw in as many shader units as possible and drop anything they don't need from the "base" that they will likely take.

Custom can just mean adding ESRAM and whatnot. In general I don't see a lot of payoff for "custom" designs, and the expense is far too great anyway. It worked for the X360, but that time has passed.
 
Yes, but instead of showing games, most of the stuff they DID show was non-game related, like IE and all the sports and movies stuff, i.e. everything that doesn't need fast hardware, since my Raspberry Pi can essentially do all of that (minus the IE part).
Not really, they mostly showed games. In the 90 minutes of E3 2012, the non-gaming announcements took about 30 minutes (probably even less). Yeah, I don't understand why people act like it was 80 minutes of non-gaming announcements.

About "home entertainment": even Nintendo is starting down this route now, Sony was the first to do it and is still doing it, and of course MS is doing it too.

Besides, MS has opened/bought 10+ game studios since 2010. Yeah, this year they cancelled one project from a new studio, but it was a Kinect family title ("Project Columbia"; they really like using that type of codename, btw).
 
Custom can just mean adding ESRAM and whatnot. In general I don't see a lot of payoff for "custom" designs, and the expense is far too great anyway. It worked for the X360, but that time has passed.
Xenos was custom, as was RSX and every other console GPU. What I'm talking about is that they likely know AMD's roadmap better than we do, and they know what architecture is going to be available in '13.

There is no need to take Cape Verde and down-clock it unless they are very interested in making money from day one in stores, which I don't think is the case. What I think they could do is take a "base", rather than an entire retail GPU, and add only what they consider necessary. That way they are building something that shouldn't be expensive in terms of R&D, but still gives them exactly what they need in their console.

If we go with this rumor from SA, it would be highly improbable for yields to be "catastrophic" like this if it were Cape Verde + ESRAM and Jaguar on a nice little die.
 
They won't use Cape Verde...wait,actually,who said MS will use Cape Verde first?

No one did, but it's always been speculated about, because Cape Verde has really good performance per watt per area and is a good starting point for a console GPU, imo.


If we go with this rumor from SA, it would be highly improbable for yields to be "catastrophic" like this if it were Cape Verde + ESRAM and Jaguar on a nice little die.

Agreed. Either the story is bunk, or MS is really cooking up a monster console. If by some miracle we were looking at a 400mm² or greater die, we'd probably be getting ~7970 levels of GPU performance. (I can dream! Before reality punches me in the face.)

Assuming SA is anything but bunk, just the ESRAM could be giving them problems or something.

Yeah, could be. Although I imagine it would have to be a ton of ESRAM, maybe even 64MB.
 
I'm just dropping in to say that going from experience (last gen) this discussion will be difficult enough even if we had all the facts on both processors, so the whole point of this thread is lost on me, and I'm not coming back in here until we have hard facts. ;)
 
I'm just dropping in to say that going from experience (last gen) this discussion will be difficult enough even if we had all the facts on both processors, so the whole point of this thread is lost on me, and I'm not coming back in here until we have hard stats. ;)

That's a long time to wait, but I understand.
 
We should expect a Radeon 8000 series type of GPU, just because it's tuned for FSA (better GPU computing); maybe other GPU differences are minimal.

What can we remove from a GPU? Probably not much. Historically, even the lowest-end GPUs (i.e. 8400GS, GF119, etc.) have used the full feature set, just with very few of the units you can scale back: SMs, ROPs, TMUs, memory controller(s). So in the end they use quite a lot of transistors, but you just swallow it up, thanks to using an advanced fab process.
Hence the thriving mobile GPU market, because high-performance GPUs don't scale low enough.


It will be custom, but no more than RSX was. RSX was amazingly un-custom; the differences were mostly memory/ROPs, the interface to the CPU, and maybe a video scaler.
Xenos was a one-of-a-kind GPU :)
 
Xenos offered major advantages we were on the cusp of at the time. I don't see anything out there like that currently (at least in GPU design proper, ignoring interposers and what have you).

For everything it did, Xenos still probably only beats RSX by 10-15%, if that. Microsoft could have done as well or better by simply dedicating a bit more silicon to a brute-force design.
 
I would say the difference is bigger than 10% for sure (at least from what I've read), especially when it comes to vertex processing.
 
You know, if someone sneaked into your house at night and secretly overclocked your console by 20%, you'd never notice.
 
I think you would, as a feel of games running smoother (if they're not framerate-locked). More noticeably, you'd have less tearing in some titles. (Assuming, of course, that overclocking a console doesn't break compatibility. ;)) There are also aspects of games where a 10% average difference equates to a much bigger real-world difference, due to significant differences in specific areas; e.g. the XB360 has a considerable advantage in most games regarding grass rendering, leaving the PS3 looking fairly barren. It's not enough of a difference to change people's buying choices, but it can be seen in games.

This is one of many reasons why the topic of "best console" is moronic at the best of times, even when the hardware is released, let alone when it's just rumours of a vague number difference. We could see one console have a 50% GPU flop-count advantage and yet the rival excel in other areas, or the higher-flop console be gimped by a hardware bottleneck.
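To put rough numbers on why a flat 20% clock bump is hard to feel, here's an illustrative sketch. The linear-scaling assumption is mine; real games are often bottlenecked elsewhere or frame-rate locked:

```python
# Illustrative only: assume frame rate scales linearly with clock speed
# (a best case; real games are often capped or bottlenecked elsewhere).

def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

base_fps = 30.0
oc_fps = base_fps * 1.2  # a 20% overclock, best case: 36 fps

# 33.3 ms -> 27.8 ms per frame, a difference of roughly 5.6 ms that a
# 30 fps cap swallows entirely, and tearing/judder can mask otherwise.
delta = frame_time_ms(base_fps) - frame_time_ms(oc_fps)
print(round(delta, 1))  # 5.6
```

Which is the point above: the raw gain is real but small enough that caps, locks, and bottlenecks can hide it completely.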
 
I think you would, as a feel of games running smoother (if they're not framerate-locked). More noticeably, you'd have less tearing in some titles. (Assuming, of course, that overclocking a console doesn't break compatibility. ;)) There are also aspects of games where a 10% average difference equates to a much bigger real-world difference, due to significant differences in specific areas; e.g. the XB360 has a considerable advantage in most games regarding grass rendering, leaving the PS3 looking fairly barren. It's not enough of a difference to change people's buying choices, but it can be seen in games.

This is one of many reasons why the topic of "best console" is moronic at the best of times, even when the hardware is released, let alone when it's just rumours of a vague number difference. We could see one console have a 50% GPU flop-count advantage and yet the rival excel in other areas, or the higher-flop console be gimped by a hardware bottleneck.


So by that logic, all consoles are essentially equal?

Was the Xbox equal to the PS2?

The Wii equal to the PS3/360? Obviously not.

In the case of PS3 vs. 360, sure, it's probably a little moronic (although people still care about the topic an incredible amount even when the difference is minuscule either way)...
 