NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

Is it too simple to say that the main reason for two instead of three (Durango) display engines ("paths") is that Microsoft foresees use cases where external video (TV, Blu-ray) is mixed with games (internally generated 3D graphics) and OSD? In other words, an STB-plus, while Sony sticks to a pure gaming device?

VGLeaks says that two of the three Durango display planes are reserved for the game and only one for the system:

The bottom and middle display planes are reserved for the running title. (...) The system reserves the top display plane for itself (...) VGLeaks

Or do you mean that a TV in the game (like in Alan Wake) uses IPTV or BluRay footage? That is probably possible, but I don't know.
 
The resolution upscaling from as low as 320x200 is interesting for lag-free Gaikai. A 3D cursor for each plane is interesting for... AR/VR. The system plane could also be used for an automatic 3D algorithm.
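For what it's worth, the leaked descriptions read like straightforward per-plane scaling followed by bottom-to-top alpha blending. Here is a minimal sketch of that idea, assuming nearest-neighbor scaling and straight-alpha "over" blending; the plane contents, sizes and blend maths are my own illustrative assumptions, not the actual hardware behaviour:

Code:
import numpy as np

def scale_nearest(img, out_h, out_w):
    """Nearest-neighbor scale of an RGBA image to the output resolution."""
    h, w = img.shape[:2]
    ys = np.arange(out_h) * h // out_h
    xs = np.arange(out_w) * w // out_w
    return img[ys][:, xs]

def composite(planes, out_h=720, out_w=1280):
    """Blend planes bottom-to-top with a straight-alpha 'over', each scaled independently."""
    out = np.zeros((out_h, out_w, 3), dtype=np.float32)
    for plane in planes:                       # order: bottom (title), middle (title), top (system)
        p = scale_nearest(plane, out_h, out_w)
        a = p[..., 3:4]                        # per-pixel alpha of this plane
        out = p[..., :3] * a + out * (1.0 - a)
    return out

# Purely illustrative planes: a low-res game image, a native-res HUD, an empty system overlay
game   = np.random.rand(200, 320, 4).astype(np.float32)   # e.g. a 320x200 source being upscaled
hud    = np.random.rand(720, 1280, 4).astype(np.float32)
system = np.zeros((720, 1280, 4), dtype=np.float32)        # fully transparent until the OS draws

frame = composite([game, hud, system])
print(frame.shape)   # (720, 1280, 3)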
 
Wouldn't it just require wider or more vector units per core, like they did with Xenon and the VMX128 units?

Despite using the same PPE as in Cell, Xenon had 50% more perf per core, a total of 115 GFLOPS for the 3 cores vs 25.6 GFLOPS for the PPU in Cell, all thanks to the upgraded VMX units.

Xenon was (just under) 77 GFLOPS total. The way 115 GFLOPS was obtained was not the way FLOPS have ever been calculated. Each Xenon core averages much less cache than the PPE in the PS3.
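For reference, the conventional peak-FLOPS arithmetic (clock x SIMD width x 2 FLOPs for a multiply-add, per core) reproduces both figures quoted above:

Code:
# Conventional peak-FLOPS estimate: clock * SIMD lanes * 2 (multiply-add) per core
clock_ghz = 3.2
simd_lanes = 4            # 4 single-precision lanes per VMX issue
fma_flops = 2             # a fused multiply-add counts as two FLOPs

per_core = clock_ghz * simd_lanes * fma_flops   # 25.6 GFLOPS (the Cell PPU figure)
xenon = 3 * per_core                            # 76.8 GFLOPS across Xenon's 3 cores

print(per_core, xenon)    # 25.6 76.8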
 
Maybe because Jaguar is already a modified Bobcat so to expect another major revision before it has even seen first production silicon is asking a little too much? :shrug

But hey, believe what you want. It's all speculation anyhow, at least until Wednesday.

If MS had requested that the Jaguar cores be pushed forward to meet full AVX2 support, I can see the difficulty; as others have noted, this would take a sizable reworking of the core.

That said, if MS was aiming for lower hanging fruit without full AVX2 compliance this could be akin to their investment in VMX128 with IBM. Sony invested heavily into the EIB and SPEs and their XDR memory interface, leaving the PPE alone for the most part, and MS invested in cache locking and VMX128.

AMD is fairly tied on the PC front to walking in step with Intel; in the console space, not so much. The problem in the console space is that someone OTHER than AMD is going to have to foot the R&D budget. If MS had an approach that would not require a completely new core, was simpler than going for full AVX2 support (which they don't need in the console space), and had the budget (money, people, time) to get it implemented, there is no reason it couldn't happen.

This is conjecture of course, but it would address the issue of why a base Jaguar might not have AVX2 but a more robust (2x wide) AVX derivative could have been created.
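To put rough numbers on what a hypothetical "2x wide" derivative would buy (the clock, core count and per-cycle rates below are commonly cited or rumored figures plus my own assumptions, not confirmed specs):

Code:
# Rough peak-FP comparison: stock Jaguar cores vs a hypothetical doubled FP datapath
clock_ghz = 1.6                 # rumored console clock (assumption)
cores = 8

stock_flops_per_cycle = 8       # two 128-bit pipes (FADD + FMUL), 4 SP lanes each
stock = clock_ghz * cores * stock_flops_per_cycle   # ~102 GFLOPS
doubled = stock * 2                                  # ~205 GFLOPS for a 256-bit-wide variant

print(stock, doubled)           # 102.4 204.8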
 
Maybe because Jaguar is already a modified Bobcat so to expect another major revision before it has even seen first production silicon is asking a little too much? :shrug

But hey, believe what you want. It's all speculation anyhow, at least until Wednesday.

And what was Cell? The Jaguar cores weren't designed specifically for consoles. Further customization of Jaguar to handle the specific needs of a console isn't far-fetched, especially given that standard off-the-shelf CPUs and GPUs aren't the norm.
 
Or can it?
Not because one is particularly more powerful than the other, but because one allows the brute-force approach while the other must be fine-tuned, and in time-constrained development that can make a difference.
So even if they can produce almost the same output, only half (just saying) of the developers will spend enough time on Durango to reach it.

But not knowing any real details, this is of course only my supposition.

Given MS's approach of exposing the hardware's performance as quickly and easily as possible, I doubt the fine tuning will be a problem.
 
Today I was thinking about the multiple SKUs and multiple chipsets rumored for Durango,
but still believing the rumors that talk about one CPU and one GPU...

The solution could be:
two free slots for future GPU modules (a chipset with some memory) to SLI with or replace the onboard GPU.
It would be upgradable, as some have rumored;
it would last 15 years or more (Xbox Infinity?);
it would be cheap to implement from the start (a lot cheaper than multiple-SKU projects);
it would not disappoint early adopters: you could buy it day one and upgrade every 5 years.

Is that a viable road? (Even if I think it will not happen in this next gen.)
 
Today I was thinking about the multiple SKUs and multiple chipsets rumored for Durango,
but still believing the rumors that talk about one CPU and one GPU...

The solution could be:
two free slots for future GPU modules (a chipset with some memory) to SLI with or replace the onboard GPU.
It would be upgradable, as some have rumored;
it would last 15 years or more (Xbox Infinity?);
it would be cheap to implement from the start (a lot cheaper than multiple-SKU projects);
it would not disappoint early adopters: you could buy it day one and upgrade every 5 years.

Is that a viable road? (Even if I think it will not happen in this next gen.)

Yes, if you don't mind developers effectively ditching the platform for PC.
Moving your specs every year or two and having the devs (and yourself) compensate and test for more and more versions is a nightmare.
More versions also mean more exposure to the failures and complications that arise across multiple SKUs.
Since when has an "upgradable console" ever taken off? It's not really a new idea.

Remember the famous KISS principle. Doing something like this is not logical.
 
Yes, if you don't mind developers effectively ditching the platform for PC.
Moving your specs every year or two and having the devs compensate and test for more and more versions is a nightmare.

Remember the famous KISS principle. Doing something like this is not logical.

As much as I despise the upgrading hardware scenario, developers won't abandon it as long as the install base is there. You could work it via a subscription model to ensure the install base has current hardware. Like your cable company giving you a new box.

I just threw up a little in my mouth.
 
Yes, if you don't mind developers effectively ditching the platform for PC.
Moving your specs every year or two and having the devs (and yourself) compensate and test for more and more versions is a nightmare.
More versions also mean more exposure to the failures and complications that arise across multiple SKUs.
Since when has an "upgradable console" ever taken off? It's not really a new idea.

Remember the famous KISS principle. Doing something like this is not logical.

I was thinking of a 5-year cycle: every 5 years you could put a 3-7x more powerful GPU chipset into the system.
Durango has a strong software layer, and hardly any developer will go close to the metal, so Microsoft could make the switch easy for developers via software/OS/DX.
 
And what was Cell?
I don't know, what was it?

The Jaguar cores weren't designed specifically for consoles. Further customization of Jaguar to handle the specific needs of a console isn't far-fetched, especially given that standard off-the-shelf CPUs and GPUs aren't the norm.

First, Jaguar isn't off the shelf because it hasn't even made it to the shelf yet.
And I wasn't responding to customizations of a console CPU implementation; that's expected.

What was being posited was that, without even seeing first silicon, they were going to double what they had already doubled, which was already double a CPU they didn't want from the start? How could they safely determine what they'd end up with from that? They, of course, being Microsoft, who are much too smart to go with a convoluted plan like that.
 
Or maybe it's a modified Jaguar, just like the PPC core in the Xbox 360 was a modified version of the one used in Cell. I really don't see why this is hard to believe.

The point is, what you are talking about (doubling the FP throughput) isn't a modification; it's the same kind of thing as going from K8 to K10, or Bobcat to Jaguar, cores that took 3-4 years to develop. It's not a modification, it's a completely new core. That's the bit that's nuts. If Microsoft had started with AMD at Bobcat four years ago and designed their next-gen console core for a boatload of cash, then fair enough. But you would think we would have heard something about it over those four years.
 
As much as I despise the upgrading hardware scenario, developers won't abandon it as long as the install base is there. You could work it via a subscription model to ensure the install base has current hardware. Like your cable company giving you a new box.

I just threw up a little in my mouth.

Cable boxes cost way less than consoles, and cable subscriptions are way higher than what anybody will pay for "console subscriptions". Users also don't care if they get a box that's 2 or 3 years old and/or is missing X and Y and is made by this company or that company. If it lets them watch content, record stuff, etc., most consumers couldn't care less about the variations. On consoles, you won't have the same leeway.

One HW update and you're looking at a major distribution problem.

Not only that, cable companies can basically recycle the cable boxes into lower-tiered services, but what are you going to do with the old consoles?

Don't get me started on the logistical issues (and probably a host of legal issues) of supplying every single consumer across different countries and retrieving consoles from cancelled subscriptions across different regions and areas.

There's a reason nobody's doing it. Usually because it doesn't work in one way or another.
 
I was thinking of a 5-year cycle: every 5 years you could put a 3-7x more powerful GPU chipset into the system.
Durango has a strong software layer, and hardly any developer will go close to the metal, so Microsoft could make the switch easy for developers via software/OS/DX.

5-year cycles are probably 1-2 years shorter than what we have now.
I don't see a difference between what you're proposing and what Nintendo is effectively trying to pull off with the Wii U.
 
VGLeaks says that two of the three Durango display planes are reserved for the game and only one for the system:

The bottom and middle display planes are reserved for the running title. (...) The system reserves the top display plane for itself (...) VGLeaks

Or do you mean that a TV in the game (like in Alan Wake) uses IPTV or BluRay footage? That is probably possible, but I don't know.

Possibly not "reserved for the game" but "reserved for games".
A possibility to enable multi-gaming as rumored?

To be honest, I really don't want to play two games at once and have them both loaded.
I understand how people would like to just pause a game and jump into another when a friend invites them to play something else, and then jump back when that session's over, but I wouldn't get too excited about such a feature.
 
I was thinking that if MS started very early on, Jaguar and Durango could have been developed in parallel as sister projects. Is it possible that the additional TDP and die size required for doubling the FP didn't fit AMD's targets for low-power SKUs, while they were fine for Durango? Hence one having doubled FP?
 
I'm having a hard time understanding what the big deal is with ROP "efficiency"; it's never been talked about for graphics cards. It's always been balanced on a need basis, and the 78xx has 32 ROPs for 153 GB/s. That was supposed to be a good balance. Is Durango expected to perform like a 77xx card? It has a LOT more bandwidth available than a 77xx, unless the DMEs are wasting half of it.

Actually, if you are a regular in the 3D Architecture & Chips section you'd have seen quite a lot of discussion of ROPs over the years for various GPU architectures. Especially when it comes to the number of ROPs versus the available bandwidth to feed them. As well as how effectively those ROPs are used. And how effectively those ROPs can use the available bandwidth. Oh, and the capabilities of those ROPs compared to the competition. ROPs have increased in capability over the years, adding more dedicated functions.
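To give a concrete sense of why that balance gets discussed, here is a rough fill-rate-versus-bandwidth estimate for a 78xx-class part (the clock and per-pixel byte counts are illustrative assumptions):

Code:
# How much bandwidth 32 ROPs can demand vs. what a 78xx-class card actually has
rops = 32
clock_ghz = 1.0                          # assumed engine clock
peak_fill_gpix = rops * clock_ghz        # 32 Gpixels/s theoretical peak

rgba8_write = 4                          # bytes/pixel, color write only
blend_rw = 8                             # read + write when alpha blending

bw_write_only = peak_fill_gpix * rgba8_write   # 128 GB/s
bw_blending = peak_fill_gpix * blend_rw        # 256 GB/s, well above the 153 GB/s available

print(bw_write_only, bw_blending)

So how effectively the ROPs use the bandwidth they are given (compression, caching, how often the peak can actually be sustained) matters at least as much as the raw ROP count.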

I kind of doubt that the consoles will be targeting anything but 1080p. Or, to put it another way, most games will be at 720p+ resolutions. Whether or not we see games that aren't quite 1920x1080 is another story, though I am willing to bet we will see a good portion which aren't quite there.

But I think both MS and Sony will apply pressure for devs to target higher resolutions.

Didn't MS, back in the day, have some kind of requirement that a game certified for the 360 be 720p? I mean, obviously a lot of games came out that weren't, but were those special exceptions, or was there no resolution requirement for the platform?

Of course, I may be biased coming from the PC side of things. Higher graphics performance doesn't equal higher resolution when talking about pushing the 3D graphics envelope. True, a game can be made to run at 60 fps at 2560x1600. But it won't ever look as good as a game made to run at 60 fps at 1920x1200 or 1920x1080.

Graphics hardware on PC is many times faster now than it was in 2005. But for a game that pushes the boundaries (fairly rare), smooth gameplay at max settings on hardware that is current at the time the game is released is generally only achievable at 1920x1080 or 1920x1200. And even then it may chug along at times and require turning down some settings. 1920x1080 or 1920x1200 is the PC equivalent of 720p for consoles.

So basically, what it comes down to is this: a console game optimized for 720p at 30 or 60 fps will almost always look better than a game optimized for 1080p at 30 or 60 fps, assuming the same genre, i.e. open world versus open world or corridor versus corridor, not open world versus corridor shooter.

Absolutely nothing stopped developers from making 1080p games on PS3 and X360, except for one thing. A 1080p 30 fps game would not look as good as a 720p 30 fps game.

The same will likely be true for Orbis and Durango. 1080p 30 fps may look a bit better than 720p 30 from the past generation but it won't look better than 720p 30 fps on the same hardware.
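The pixel-count arithmetic behind that argument (it ignores resolution-independent work such as geometry, shadow maps and simulation, so treat it as a rough bound):

Code:
# At a fixed GPU budget and frame rate, per-pixel shading budget scales inversely with pixel count
pixels_720p = 1280 * 720       # 921,600
pixels_1080p = 1920 * 1080     # 2,073,600

ratio = pixels_1080p / pixels_720p    # 2.25x more pixels at 1080p
per_pixel_budget = 1 / ratio          # ~44% of the per-pixel work you could afford at 720p

print(ratio, per_pixel_budget)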

I fully expect that some developers, maybe more than last generation, will target 1080p. But I believe that all the best looking games will be 720p or perhaps slightly higher.

So that basically means that PlayStation games will have the same resolution for interface and graphics, while Durango is able to render the interface and the graphics at different resolutions?

Not necessarily. It just means there is no dedicated hardware support for it. So it can be done, it just requires more GPU resources in order to do it.
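As a rough, assumption-laden estimate of that extra GPU cost: one additional full-screen "over" blend per frame to composite a native-resolution UI on top of the upscaled game image:

Code:
# Approximate cost of compositing a 1080p UI layer in software every frame (illustrative numbers)
width, height, fps = 1920, 1080, 60
bytes_per_pixel = 4                       # RGBA8

pixels_per_frame = width * height         # ~2.07 Mpixels touched by the blend pass
traffic_per_frame = pixels_per_frame * bytes_per_pixel * 3   # read UI + read dest + write dest

gb_per_second = traffic_per_frame * fps / 1e9   # ~1.5 GB/s of extra bandwidth
print(pixels_per_frame, gb_per_second)

Modest next to total system bandwidth, but it is fill rate and bandwidth that dedicated display-plane hardware would otherwise provide for free.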

Regards,
SB
 