NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

There is no valid documentation they could have seen that would have said "based on Jaguar".

LOL now you are just screwing with people. Do you enjoy this? :devilish:

So, too, Xbox fans mention twice the flops, and the rest of you grasp it like it's gospel; it reeks of desperation.

I doubt there will be any difference in the CPUs other than the name. And even if it were twice as powerful, it would be some 40 GFLOPS down because of the OS.

I know people want to believe Sony is reserving cores and CUs for the OS, but that is wishful thinking on their part, hoping Orbis is coming out weaker than it looks.

So, are you going to stand behind such big-talk rhetoric? Because if not, your post should be deleted. If it comes true, you can look like a bastion of common sense; if you are proven wrong at all, you should be banned, because if you are wrong and slink away from such talk, well, there is no place for that here on B3D.

It's more likely that Epic is targeting 1920x1200 or 1920x1080 on a 7970 or 680 class GPU. 1920x1080 being the most popular desktop resolution.

So I'd imagine that at 720p or sub 720p, Orbis should be able to do something similar. And since I'd say most developers targeting advanced rendering techniques are likely targeting 720p on next gen consoles, then that fits right in.

Regards,
SB

1080p may be the most popular resolution but 680/7970 class GPUs are not the most popular.

Going back to Epic: why would Epic target their next game at hardware that less than 5% of the discrete PC gaming market has? It is pretty certain that Epic would not want to abandon NV 4xx, 5xx, and low-end 6xx hardware; ditto AMD. There is just far too much market share there.

That being said, I do believe quality-level check boxes will probably drown even the 680/7970 hardware--set UE4 to 11 (insane texture resolution, or whatever they choose to call the new IQ level) and it will bring all hardware to its knees. I would even go further--these new engines often ship with corner-case features that KILL performance, even on the best hardware. So in these corner cases, going 1080p, all settings Ultra High, 4xMSAA with post-process AA, and every feature check box enabled (3D, etc.), yeah, watch the best GPUs drop well below 60 fps.

Those conditions aside, take their new engine/game, set it to 1080p, High settings, adjust AA based on bottlenecks, and don't enable the inefficient whiz-bang new/unoptimized lens flare effect, and I would guess a GTX 680 sings.

Epic had better hope so, or they didn't learn the lessons of Crysis 1. And from a practical standpoint, Epic probably isn't in a financial position to be adding features that require unique asset creation and testing on PCs when the consoles are fairly close to each other and hit a fairly high baseline of PC performance, well above the core PC demographic. You really have to wonder why Epic would create/test major game features that <5% of the PC market could use at 1080p30. But don't worry, there will be some nice compute check boxes and such that kill all PCs (though the same effects will have much more efficient implementations down the road anyway, so all they will really tell us is that Epic is throwing PC gamers a bone).
 
So I take it you are going to ban yourself when orbis has nothing reserved for the OS except the memory we already know about?

I have said many things in this thread that the likes of you have laughed at, yet slowly these things started to appear closer to the truth than the fantasies being peddled.

I may not have your technical knowledge but I do not make things up and repeat them once a day because I want it to be true.
 
If your job is verifying the electrical signalling of the IO, what difference does it make what the signals are for? Lots of things go into making a complex system, and most are way below the level where the games are relevant.
In that case, one would not comment on being a "super computer", right? That would show interest, within itself. That would require more than a low level, isolated look. You couldn't verify something as low level as electrical signalling of an IO and say "super computer", right? Anyone at a high enough level to say "super computer" should know what their equipment is for and to quantify that, right?
 
It's more likely that Epic is targeting 1920x1200 or 1920x1080 on a 7970 or 680 class GPU. 1920x1080 being the most popular desktop resolution.

So I'd imagine that at 720p or sub 720p, Orbis should be able to do something similar. And since I'd say most developers targeting advanced rendering techniques are likely targeting 720p on next gen consoles, then that fits right in.

Regards,
SB

Wow, so Durango will be doing 480p then? That's a lot worse than I thought.
 
In that case, one would not comment on being a "super computer", right? That would show interest, within itself. That would require more than a low level, isolated look. You couldn't verify something as low level as electrical signalling of an IO and say "super computer", right? Anyone at a high enough level to say "super computer" should know what their equipment is for and to quantify that, right?

I think you're trying way too hard to read anything into it. The relationship to super computer could mean a lot of things.
 
As it stands, with what I know, that's pretty much it. It seems there may be something akin to VMX128 in Durango's version. Still not quite sure. It could explain why Sony got "put to the side" by AMD in favor of MS.

The question is how much faster Durango's Jaguar is than Orbis's Jaguar, because Orbis's Jaguar is no vanilla part either; Durango's Jaguar being 100% stronger or faster than vanilla doesn't translate into the same advantage over Orbis's Jaguar.

Because vanilla Jaguar is 2 to 4 cores, not 8 like Orbis.
 
In that case, one would not comment on being a "super computer", right? That would show interest, within itself. That would require more than a low level, isolated look. You couldn't verify something as low level as electrical signalling of an IO and say "super computer", right? Anyone at a high enough level to say "super computer" should know what their equipment is for and to quantify that, right?

People comment on things they don't really know much about all the time.
Look at me...

Anyone can say the words "super" and "computer", and comparing what the rumors are coalescing towards with an actual supercomputer, I get a stronger impression that this source doesn't really know enough about the topic for a turn of phrase to be taken as gospel.
 
Wow, so Durango will be doing 480p then? That's alot worse than I thought.

Come on! All this negative talk of sub-720p... does anyone here actually believe we will be seeing that scenario, really?

Current-gen consoles play the same games as high-end PCs do. Obviously PCs look substantially better, but they are not worlds apart in the eyes of Joe Bloggs.

Durango looks to be the weakest, but it will still be something like 10x PS360 in real-world terms, maybe more once optimised, so how anybody thinks we are staying at 720p next gen is beyond me.
 
So I take it you are going to ban yourself when orbis has nothing reserved for the OS except the memory we already know about?

It would be a curious console if Orbis had no CPU or GPU resources reserved for the OS at all; magical, even.

I have said many things in this thread that the likes of you have laughed at, yet slowly these things started to appear closer to the truth than the fantasies being pedalled.

I may not have your technical knowledge but I do not make things up and repeat them once a day because I want it to be true.

Actually the problem with your quote is this part, "So too xbox fans mention twice the flops and the rest of you grasp it like it's gospel, it reeks of desperation."

I have no clue (nor have I made a value judgment) on whether the rumor is true. That said, bgass. seems to back it up; seeing as some of his other info has been correct, why not? We will wait and see. But the problem is your aggressive and downright trolling comments like "it reeks of desperation".

The difference between you and me is that I didn't tell others, when they interpreted the 14+4 rumors as indicating only a "small" impact on graphics performance, that it "reeks of desperation". I have repeated over and over that these are unverified leaks without context/architecture; many leaks have ambiguous wording. So why attack and insult others like you always do?

Sure, when the issue of the 14+4 CUs was discussed I mentioned I thought 3dil.'s theory of QoS (something MS has as well in their PDF leak) sounded like it jived (I also think the 2-CU Jaguar blocks have potential to explain the remarks in the leak), but I have not been evangelizing this position as a factual leak.

And I surely have not told those who have a different take that they "reek of desperation". So why would I need a ban?

Then again, telling people who disagree that they reek of desperation because they are taking a rumor seriously really just tells us what we always knew: you have never liked it when Sony's competitors look good. Hence your aggressive posturing and insults.

The funny part is the 2x Durango Jag flops may not even be true, and could be a misunderstanding :LOL: But it was sure fun watching you attack the posters who took a doubly "confirmed" leak seriously. Ahhh, the good old days!
 
I kind of doubt that the consoles will be targeting anything but 1080p. Or, to put it another way, most games will run at 720p or higher. Whether or not we see games that fall short of a full 1920x1080 is another story, though I am willing to bet we will see a good portion that aren't quite there.

But I think both MS and Sony will apply pressure for devs to target higher resolutions.

Didn't MS, back in the day, have some kind of requirement that a game certified for the 360 be 720p? I mean, obviously a lot of games came out that weren't, but were those special exceptions, or was there no resolution requirement for the platform?


I am very excited for the next gen. In particular Orbis, as I am a Playstation guy. So this coming Wed should be awesome!

I'm not a hardware guru like a lot of the chaps here, but I am quite certain when I say that it really doesn't matter a whole ton what current top-of-the-line PC innards are like vis-a-vis the rumored Orbis/Durango specs. Compare the PS3 and 360 to the best PCs of their time and you will see a largely similar theoretical performance disparity.

The difference, as has been pointed out at length in this and many other threads, is that consoles will always hit a far greater level of efficiency than any PC title can dream of. Every component in a console is placed there with due consideration, and net performance of the whole unit is the target for the engineers (within a budget).

So yes, it won't be as powerful as a $400 GPU, but it will output games that look mind-blowing and insane all the same. Proof? God of War 3/Ascension, Killzone 2/3, the Uncharted trilogy, GT5.

For 360 it'd be the Gears of War trilogy, Halo 3, 4, and Reach, and Forza.
 
but I am quite certain when I say that it really doesn't matter a whole ton what current top-of-the-line PC innards are like vis-a-vis the rumored Orbis/Durango specs. Compare the PS3 and 360 to the best PCs of their time and you will see a largely similar theoretical performance disparity.

No. When the 360 launched it was arguably 90% as powerful as the fastest PC GPU of the day while being more advanced (feature wise). This generation, Orbis will be level (or even behind) feature wise and roughly 35% as powerful. That's not the same situation at all.

The difference is, as has been pointed out at length in this and many other threads, is that consoles will always hit a far greater level of efficiency than any PC title can dream of. Every component in a console is placed there with due consideration and net performance of the whole unit is the target for the engineers (within a budget).

No one has ever ignored or tried to gloss over this fact. But the reality is that the best-case performance increase you could gain overall in the current generation (around 2x) would not be close to enough this coming generation to gain performance parity. And of course APIs have become thinner and more efficient on the PC since the last generation, so that 2x advantage is likely a fair bit lower now.
 
Didn't MS, back in the day, have some kind of requirement when they certified a game for the 360 to be 720p? I mean obviously alot of games came out that weren't, but were those special exceptions or was there no resolution requirement for the platform?

The TCR was 720p 4xMSAA as a standard, but it was dropped as quickly as it was mandated.
 
No. When the 360 launched it was arguably 90% as powerful as the fastest PC GPU of the day while being more advanced (feature wise). This generation, Orbis will be level (or even behind) feature wise and roughly 35% as powerful. That's not the same situation at all.



No one has ever ignored or tried to gloss over this fact. But the reality is that the best-case performance increase you could gain overall in the current generation (around 2x) would not be close to enough this coming generation to gain performance parity. And of course APIs have become thinner and more efficient on the PC since the last generation, so that 2x advantage is likely a fair bit lower now.


What? The fastest being something like the GTX 690 or the AMD equivalent? Those monster cards whose power consumption and heat generation are through the roof and which cost bucketloads of money?

Was there anything even like that kind of tier back then? The super enthusiast models of graphics card?

I don't know about you, but I don't consider that to be a fair comparison. How about a more reasonable target: how does the Orbis or Durango GPU compare, on paper, to a GTX 680 or the flagship single card from AMD (I don't really keep up with their lines; I'm an Nvidia guy)?

Unless this is what you were comparing it to the whole time.

I know that Orbis will certainly not be 1:1 with either of those cards, but is its GPU really only barely a third as powerful as a GTX 680? To be thorough: at the time of the 360's launch, what was the best single graphics card available? How much did it cost, and what were its power consumption, heat output, and specs?

From what I have read, the Orbis GPU does not appear to be only a third as powerful as this card, the HD 7950, which is second to the top of the line (HD 7970). It seems to be a lot more competitive than that.


http://www.amd.com/us/products/desktop/graphics/7000/7950/Pages/radeon-7950.aspx#3


Hell, for that matter (from a raw flops perspective), Orbis is about half as powerful as the HD 7970. It clocks in at 3.79 teraflops while Orbis is in the 1.8-teraflop ballpark, right?

Here's the 7970

http://www.amd.com/us/products/desktop/graphics/7000/7970/Pages/radeon-7970.aspx#3
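For what it's worth, the raw-flops figures being compared here all come from the same simple formula: peak = shader ALUs x 2 FLOPs (one fused multiply-add per cycle) x clock. A quick sketch, using the stock HD 7970's published configuration and the rumored (unconfirmed) Orbis layout of 18 CUs at 800 MHz:

```python
def peak_tflops(shader_alus, clock_ghz):
    """Peak single-precision throughput: each ALU can issue a fused
    multiply-add (2 FLOPs) per cycle, so peak = ALUs * 2 * clock."""
    return shader_alus * 2 * clock_ghz / 1000.0

hd7970 = peak_tflops(2048, 0.925)     # stock HD 7970: 2048 ALUs at 925 MHz
orbis = peak_tflops(18 * 64, 0.800)   # rumored Orbis: 18 CUs x 64 ALUs at 800 MHz
print(f"HD 7970: {hd7970:.2f} TFLOPS, Orbis: {orbis:.2f} TFLOPS")
```

This reproduces both numbers quoted in the thread: 3.79 TFLOPS for the stock 7970 and 1.84 TFLOPS for the Orbis rumor.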
 
If Durango were packing an 8-core / 4-module Steamroller (which I don't expect it is), then it would also likely be running at at least twice the clock speed of the Jaguars in Orbis, meaning that in the worst case it would match them in SIMD throughput and in the best case it would double it, with reality being somewhere in between.

Plus both single and multithreaded scalar performance would be well over double.

Well, the Opteron 3380 is an 8-core at a 2.6 GHz base clock with a 65 W TDP.

The Opteron 6328 is an 8-core at a 3.2 GHz base clock with a 115 W TDP. So there is no way we are going to see a 3.2 GHz Steamroller without massive power reductions.
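The clocks-versus-width trade being argued here is easy to put rough numbers on. A toy sketch with illustrative figures only (per-cycle issue rates are my assumptions, not confirmed specs): 8 Jaguar cores at 1.6 GHz issuing ~8 SP FLOPs per cycle each, versus 4 Steamroller modules at 3.2 GHz with two 128-bit FMACs (~16 SP FLOPs per cycle) each:

```python
def peak_sp_gflops(units, clock_ghz, flops_per_cycle):
    """Peak SIMD throughput = units * clock * FLOPs issued per cycle per unit."""
    return units * clock_ghz * flops_per_cycle

jaguar = peak_sp_gflops(8, 1.6, 8)        # 8 cores, 128-bit add + mul per cycle
steamroller = peak_sp_gflops(4, 3.2, 16)  # 4 modules, 2x 128-bit FMAC each
print(jaguar, steamroller)  # Steamroller lands at exactly 2x under these assumptions
```

Under these (assumed) figures, the doubled clock and FMA-capable units put the hypothetical Steamroller at twice the Jaguar's SIMD peak, which is the "best case" end of the range the post describes.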
 
An overly simplistic GPU flops comparison:

PS3: 367Gflops
7800 GTX 512: 400Gflops (fastest card available a year before release of the console), US$649 MSRP (~$700 retail), ~120W TDP.

PS4: 1.84Tflops
HD 7970 Ghz Edition: 4.3Tflops (fastest card available around a year before release of the console), US$499 MSRP (~$430 retail), 250W TDP.
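Expressed as ratios, the comparison above works out as follows (a throwaway sketch using only the numbers quoted in this post):

```python
# Console peak GFLOPS vs. the fastest single-GPU card ~1 year before launch,
# using the figures from the post above.
comparisons = {
    "PS3 vs 7800 GTX 512": (367, 400),
    "PS4 vs HD 7970 GHz Edition": (1840, 4300),
}
for label, (console, pc) in comparisons.items():
    print(f"{label}: console = {console / pc:.0%} of the PC card")
```

That is roughly 92% for the PS3 generation versus roughly 43% this time, which is the disparity the reply is pointing at.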
 
What? The fastest being something like the GTX 690 or the AMD equivalent? Those monster cards whose power consumption and heat generation are through the roof and which cost bucketloads of money?

Was there anything even like that kind of tier back then? The super enthusiast models of graphics card?

I don't know about you, but I don't consider that to be a fair comparison. How about a more reasonable target: how does the Orbis or Durango GPU compare, on paper, to a GTX 680 or the flagship single card from AMD (I don't really keep up with their lines; I'm an Nvidia guy)?

Unless this is what you were comparing it to the whole time.

I'm only comparing to single GPU cards, not dual GPU like the GTX 690.

I know that Orbis will certainly not be 1:1 with either of those cards, but is its GPU really only barely a third as powerful as a GTX 680?

I didn't compare to the 680, I compared to the fastest GPU at the launch time of the consoles. That won't be the 680. The 680 to Orbis/Durango is more like the 6800 Ultra to the PS360. When we first heard reliable rumours about the 360 they talked about a console at least as powerful as 6800 Ultra's in SLI. Clearly no-one is even entertaining the new consoles being as powerful as 680's in SLI.

Within a few weeks, the GTX Titan will be available. Before the launch of the consoles there's a very good chance both the 8xxx and 7xx series GPU's will also be available.

The current fastest GPU on the same architecture as Orbis (Tahiti) sports 4.3 TFLOPS of shader performance. That's 2.3x Orbis and is also in line with 680 performance. Titan, available in a few weeks should sport at least 40% more real world performance. That's 3.3x Orbis. I would expect the high end 8xxx series launching at the end of this year from AMD to sport similar performance.
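The multiples quoted here are just the TFLOPS figures divided out; a quick sanity check (the ~40% Titan uplift is this post's assumption, not a published spec):

```python
orbis_tf = 1.84              # rumored Orbis peak TFLOPS
tahiti_tf = 4.3              # HD 7970 GHz Edition peak TFLOPS
titan_tf = tahiti_tf * 1.4   # assumed ~40% more real-world performance than Tahiti

print(round(tahiti_tf / orbis_tf, 1))  # Tahiti relative to Orbis
print(round(titan_tf / orbis_tf, 1))   # projected Titan relative to Orbis
```

Dividing out gives 2.3x for Tahiti and 3.3x for the projected Titan, matching the figures in the post.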

To be thorough: at the time of the 360's launch, what was the best single graphics card available? How much did it cost, and what were its power consumption, heat output, and specs?

It was the 7800 GTX 512MB, which was ridiculously expensive and probably had a pretty large TDP for the day (certainly larger than either console, both of which were actually manufactured on a smaller process). Its specs in relation to the 360 were: 165% texturing, 220% fill rate, 55% geometry setup, and 118% total shader throughput (but only 86% when you include texture addressing), with 21% the bandwidth of the eDRAM or 240% the bandwidth of the main memory. So as you can see, my generalisation of the 360 being 90% of this card was a very rough average.

From what I have read, the Orbis GPU does not appear to be only a third as powerful as this card, the HD 7950, which is second to the top of the line (HD 7970). It seems to be a lot more competitive than that.

But that's not the highest-end GPU of today, never mind 8 months from now.
 
That's not to say, Brad, that there aren't other reasons for including those elements, irrespective of their role in addressing issues. ESRAM, for instance, has benefits that exceed simply adding bandwidth to the design, as very clearly called out in the leaked docs. The design is not a series of patches.
If you read between the lines, bkilian is saying Durango doesn't support patches, which are required for tessellation. So Durango doesn't support tessellation. :devilish:

People comment on things they don't really know much about all the time.
Look at me...
Honesty on the internet. I thought it was a myth.
 
You still seem to be getting confused. Hence why I keep stressing utilization. In the case that you proposed, Orbis still isn't likely to be getting full utilization of GPU resources, while Durango will be much closer to full utilization of its resources. Hell, in the case where Orbis somehow managed 100% GPU utilization, it's likely that Durango's GPU would also be at 100% utilization. The difference? Orbis would be faster, obviously, but utilization (efficient use of GPU resources) would be exactly the same.

And again, let me stress something that you miss every single time when replying to me. Higher utilization (efficient use) of hardware resources does not necessarily mean higher performance. Hence, lower efficiency on Orbis will still likely result in higher performance in real world rendering.

So, why keep banging on about increased utilization of available hardware resources? Because it means the performance differential is likely going to be smaller than the pure hardware specs would imply.

And that's ONLY going by the rumored leaks, which may not even contain the correct details. And certainly doesn't contain all the relevant details in order to make a truly informed decision about where absolute or relative performance for each console will be.

Additionally before you get all uptight that I'm claiming that your Orbis is inefficient... Please note that I fully expect Orbis to make more efficient use of its hardware than is currently the case on PC. And I've never gone around claiming that the GPUs in PCs are inefficient.

Just because speculation on what was added to otherwise-standard hardware, and why, suggests that the entire focus of the hardware choices in Durango was maximizing efficient use of available hardware, while Orbis isn't quite as aggressive with its more "traditional" layout and data flow, doesn't mean that Orbis is some incredibly inefficient design. It's just that, from what has been released thus far, Durango's design appears to put much more effort into increasing the efficient utilization of available hardware resources, above and beyond what you get with a relatively more standard configuration.

Will it bear out that way in the final console designs? Maybe, maybe not. This is speculation after all, based on incredibly incomplete information.

Regards,
SB

The problem is that utilization isn't the only measure of efficiency. An efficient design is as much about avoiding stalls, bottlenecks, and saturation as it is about high utilization. If the city planners in your town designed the sewer system to run at a high level of "utilization" on an average day, that might seem very efficient, but only until the first time it rained, every toilet in the city backed up at the same time, and millions of gallons of raw sewage were dumped into the local waterways. In that case, building excess capacity to cope with peak loads and worst-case scenarios is more efficient, which was my point about the ROPs.

The other problem is that when people say the Durango design is targeting efficiency, they assume both the goal AND the result is higher efficiency than the Orbis design can provide. But that is unfounded and utterly without support. I agree that the added hardware in Durango makes it efficient. But it is being compared to a design that was also designed to be very efficient, only the choices made in the Orbis design do not require specialized hardware to increase efficiency. The reason each company made those choices is immaterial to the analysis of each.

The fact that Microsoft's innovations in hardware can bring Durango in line with the inherent efficiency of an Orbis style architecture is a remarkable achievement. What we don't have is any evidence or indication that they catapult Durango to higher efficiency in anything but the most specialized, latency sensitive situations. Meanwhile there will certainly be other situations where Orbis is more efficient. My argument has always been that on average they offer very similar levels of efficiency and as a consequence that is not an avenue by which Durango can close the gap in power that exists on paper.
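The sewer analogy above is really a peak-versus-average provisioning argument, and it is easy to simulate. A toy sketch (all numbers invented for illustration): bursty demand with a mean of 100 units per step, comparing a system provisioned near the average (high "utilization") against one with headroom, counting how often demand exceeds capacity:

```python
import random

def overflow_steps(capacity, demand):
    """Count time steps where demand exceeds capacity (a stall,
    or in the analogy, a backed-up toilet)."""
    return sum(1 for d in demand if d > capacity)

random.seed(1)
demand = [random.gauss(100, 30) for _ in range(10_000)]  # bursty load, mean 100

tight = overflow_steps(110, demand)  # provisioned near average: high utilization
roomy = overflow_steps(180, demand)  # provisioned for peaks: lower utilization

print(tight, roomy)  # the tightly provisioned system overflows far more often
```

The tightly provisioned system runs "efficiently" by the utilization metric yet stalls on a large fraction of steps, while the over-provisioned one almost never does, which is the distinction the post is drawing.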
 