NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

The problem is utilization isn't the only measure of efficiency. And efficient design is as much about avoiding stalls, bottlenecks and saturation as it is about high utilization. If the city planners in your town designed the sewer system to run at a high level of "utilization" on an average day, that might seem very efficient, but only until the first time it rained and every toilet in the city backed up at the same time and millions of gallons of raw sewage were dumped into the local waterways. In that case building excess capacity to cope with peak loads and worst case scenarios is more efficient, which was my point about the ROPs.

What happens if it floods once in ten thousand years? Efficient in what sense? Utilization of money? Maybe it's better to just let it flood and rebuild/clean up afterwards. You see, like SB was trying to explain to you, efficiency isn't the same as capacity/performance. You are right, there are different kinds of efficiency. Efficiency is always in the context of the utilization of something. So when we're talking about a GPU, we're talking about keeping all of those units utilized.

In the (supposedly -- it's all rumors) Orbis's case, it can handle peaks a lot better. But it will spend a higher percentage of its time idle. That idle time is what is being referred to as inefficient. But hey, being inefficient isn't so bad... because I'd rather drive a Cobra than a Honda Fit (though I can't afford either -- too expensive to own a car in the city).
 
I'm only comparing to single GPU cards, not dual GPU like the GTX 690.

The overall point though is that the performance gulf between console and PC comes from doubling the TDP, which just isn't practical given the size and cost constraints consoles have to work within. Maybe someone else can explain to me why GPU size and power requirements have inflated the way they have. I personally haven't had a proper desktop since the Pentium 4 era so I'm probably not the best judge, but to me a console sized box is just about right for dedicated gaming hardware.
 
You moved from facts to the realm of subjectivity. I showed the statements were false. Those were, indeed, hardware additions and not removals. The end.

Which statement do you think you disproved? Because I don't see it. I was responding to a post touting Sony's skill at cost reduction using the fact that they so quickly introduced new, cheaper, revisions of the PS3.

One thing I said was that I found those cost reductions less impressive because they were partially accomplished by removing features and components and it is an indisputable fact that that is the case. Nothing subjective there and nothing you said disproves (or can disprove) that.

The other thing I said was that each iteration of the PS3 is less capable than the prior one. I didn't say the later revisions didn't bring any new capabilities. I was attempting to convey that the systems, overall, became less and less capable with each revision, which is indeed subjective and which, by definition, you can't disprove as it's my opinion.

If you were given a choice between keeping all of the PS3's original functionality at the expense of bigger hard drives and bitstreaming, but with the same price reductions, or what actually occurred, which would you prefer?

To be clear, I do think the later revisions of the PS3 represented a better value (which is why I own a slim). I'm not saying bad choices were made in what to cut and the sales numbers bear that out as getting the price down was critically important to the PS3 finally beginning to achieve some sales momentum. That doesn't mean that I agree that the PS3 cost reductions should be held up as proof of Sony's superior engineering prowess.
 
I think you're trying way too hard to read anything into it. The relationship to super computer could mean a lot of things.
Do you think so? For me, it seems quite obvious that something is out of place. It's like seeing three apples and then an orange. It seems to be less about trying hard to read into something than just about something sticking out.
 
Which statement do you think you disproved? Because I don't see it. I was responding to a post touting Sony's skill at cost reduction using the fact that they so quickly introduced new, cheaper, revisions of the PS3.

One thing I said was that I found those cost reductions less impressive because they were partially accomplished by removing features and components and it is an indisputable fact that that is the case. Nothing subjective there and nothing you said disproves (or can disprove) that.
"Partially" was not used. I would not have addressed the post if it had been used.

The other thing I said was that each iteration of the PS3 is less capable than the prior one. I didn't say the later revisions didn't bring any new capabilities. I was attempting to convey that the systems, overall, became less and less capable with each revision, which is indeed subjective and which, by definition, you can't disprove as it's my opinion.

If you were given a choice between keeping all of the PS3's original functionality at the expense of bigger hard drives and bitstreaming, but with the same price reductions, or what actually occurred, which would you prefer?

To be clear, I do think the later revisions of the PS3 represented a better value (which is why I own a slim). I'm not saying bad choices were made in what to cut and the sales numbers bear that out as getting the price down was critically important to the PS3 finally beginning to achieve some sales momentum. That doesn't mean that I agree that the PS3 cost reductions should be held up as proof of Sony's superior engineering prowess.
Can you provide evidence of hardware (hardware was the word used before) features that made the PS3 less capable with each revision? The original post I addressed said that Sony's PS3 hardware revisions were easier due to always taking something away. It didn't happen with every hardware revision and I showed that. Can you show otherwise? Please leave the subjectivity out of it (i.e. "which would you prefer").
 
Do you think so? For me, it seems quite obvious that something is out of place. It's like seeing three apples and then an orange. It seems to be less about trying hard to read into something than just about something sticking out.

Trying to conclude anything from an out of context term passed on second hand seems a fruitless endeavor. All I see is you trying way too hard to discredit something that has very little credibility to start with because it might offend your favored brand.
 
Agree, completely. The companies certainly want to achieve their business model and make it a success. They must keep looking over their shoulder at their competitors' specs and plans (and adapt if needed/if it is easy enough), but it is certainly not done in a fanboy-war way, like some of the posters on the internet are implying.

Frankly, I think we could more easily guess whether these specs are correct, or what is missing, if we knew more about the business model and what they want to do (casual gaming, core gaming; where will you invest more? do you want to invest in AR/Kinect/Move (or just chug it along)? what is your main market/demographics? etc.).

One thing I would love for sure: after we have the real specs for both the Durango and the Orbis, to debunk each and every one of these false insiders in a new thread, because in many postings you can see attention "prostitutes" or really bad fanboyism...

I love these threads; just thinking about the possibilities with DVR/TiVo-like stuff on the Durango or a share button on the Orbis (uploading gameplay to YouTube without any knowledge of capture hardware etc.) is really getting me excited, oh and getting better games helps too ;) . I am so glad that both Sony and MS have each other as competitors; just imagine what would happen if they eased up or only needed to best Nintendo's specs.

Cheers...

LOL! Nice, good way to put it!

This is very easy to tell in MANY postings, even among veterans, which is quite odd. Despite this being a technical forum, confirmation bias is a huge problem when dealing with new hardware, especially new console technology. Anywhere on the internet, really. I don't see many posters being objective; most favor being primarily subjective. In the case of the normal folks, it probably has something to do with being fans. In the case of the experts, though, I have a feeling that it's partially due to $$$, based on my past couple of employers. In some of their cases at least.

At this point it is safe to say that anything anonymous on the internet, no matter who it is attributed to, is NOT trustworthy, due to the "golden rule" -- and by that I mean the money. Every game publisher has a PR department that essentially seeds disinformation on EVERY influential forum or blog. They most certainly are doing it here, that is a fact. I know for a fact that EA posts here. Anybody who mentions "gen3" is very likely an industry shill, mostly EA, because that is not even normal industry jargon. Especially if they mention it in the context of EA franchises like Madden (heh! somebody here has already recently done that!)

Shills are everywhere but they give away their position if you know their tricks. Sophistry and rhetoric. Interpretive half truths and in general lawyer's tactics. The entire video gaming industry, from the hardware to the software, has become a mathematical formula for predicting consumer habits and making the most money possible, just like anything else. I don't expect a lot of regulars here or those socially awkward aspies to see it, but I am being 100% honest and you know it.
 
An overly simplistic GPU flops comparison:

PS3: 367 GFLOPS
7800 GTX 512: 400 GFLOPS (fastest card available a year before release of the console), US$649 MSRP (~$700 retail), ~120W TDP.

PS4: 1.84 TFLOPS
HD 7970 GHz Edition: 4.3 TFLOPS (fastest card available around a year before release of the console), US$499 MSRP (~$430 retail), 250W TDP.
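
A quick back-of-envelope of those ratios (using exactly the figures listed above, which are themselves approximate/rumored numbers):

```c
/* Console GPU flops vs. the fastest single-GPU card available roughly a year
 * before each console launched. Figures are the approximate ones quoted above. */
#include <stdio.h>

int main(void) {
    double ps3         = 367.0;   /* GFLOPS */
    double gtx7800_512 = 400.0;   /* GFLOPS */
    double ps4         = 1840.0;  /* GFLOPS (1.84 TFLOPS) */
    double hd7970_ghz  = 4300.0;  /* GFLOPS (4.3 TFLOPS)  */

    printf("PS3 vs 7800 GTX 512: %.0f%% of the PC card\n", 100.0 * ps3 / gtx7800_512);
    printf("PS4 vs HD 7970 GHz:  %.0f%% of the PC card\n", 100.0 * ps4 / hd7970_ghz);
    return 0;
}
```

That works out to roughly 92% of the contemporary PC card for the PS3 versus roughly 43% for the PS4, which is the widening gap being discussed.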

But you still have to consider the diminishing returns of DirectX 11-level graphics. In a modern PC game you can activate a single effect that you can barely see even in screenshots, but it will cost you 20, 30, or even 40 percent of your framerate. That's crazy.

If next gen games are 1080p (I assume that at least Sony's 1st parties will be 1080p) then I think that next gen console games will look very enjoyable compared to PC games. The PC always had the advantage of higher resolution; PS360 were inferior to the PC in image sharpness from the get-go: they had nice geometry and fancy lighting but the image was blurry as hell. But with the next gen, console and PC might be starting on equal terms. Seeing comparisons of next gen games on console and PC both in 1080p would be awesome.

I know many, many, many people who play on a PC only because of the sharper image and the better framerate. These people say they just can't enjoy current gen console games due to the blurry (sub) 720p image. I think there are only a very few gamers in the world who want to max out everything and will buy a $500 enthusiast card each and every year. The lion's share just wants to play good games with adequate image quality and framerate. Current gen consoles can't deliver that anymore. In fact they never could, and I guess that's one of the reasons why PC gaming is stronger than ever. I expect at least one of the next gen consoles to render games at 1080p and 60 FPS. Actually both Orbis and Durango could easily achieve 1080p and 60 FPS in any game; it's just a matter of principle. If next gen systems deliver the same crappy image quality and the same crappy framerates as PS360, then I expect the PC (and Valve) to become even stronger.

So for me it's totally clear in terms of visual capabilities: A good balance of graphics, framerate and image quality will win the core gamers.
 
The thing is how much faster the Durango Jaguar is than the Orbis Jaguar, because the Orbis Jaguar is no vanilla part either, so the Durango Jaguar being 100% stronger or faster than vanilla doesn't translate into the same advantage vs the Orbis Jaguar.

Because vanilla Jaguar is 2 to 4 cores, not 8 like Orbis.

Bg has said we're looking at around 200 GFLOPS for the Durango CPU.
http://www.neogaf.com/forum/showpost.php?p=47803660&postcount=1042

Only 6 cores are available to the running game though. This is his summary of the info he has on Durango:

I'm going to summarize what I know with regards to Xbox 3.

VGLeaks info has been accurate due to what the info is coming from.

- Always online/internet connection required
- No used games
- No BC
- CPU has increased performance over normal Jaguar cores.
- ~3GB and 2 CPU cores are reserved for the OS, apps, Kinect, etc.
- Xbox has a crazy (IMO) amount of hardware dedicated to audio tasks (e.g. gaming and Siri capabilities)
- Kinect is a pack in
- Same day digital releases for all games
http://www.neogaf.com/forum/showpost.php?p=47825093&postcount=1283

Most of these things are corroborated by Edge/Kotaku or earlier rumours.
And ERP was speculating that mandatory installs are necessary because MS are going to go with simultaneous digital releases for all games.

Oh and bkilian, Proelite and bgassassin are dishing some dirt on your beloved 'flops capacitor' (read: audio block), turns out it isn't chopped liver after all ;)

Well, not to dismiss your case, but Durango has an audio chip that needs 100 gflops of CPU power to emulate
http://www.neogaf.com/forum/showpost.php?p=47822384&postcount=1219


Originally Posted by Reiko:
Hmmmmmm...
It is true though I didn't have a FLOPs number as a reference. I wasn't going to say it first though (directly), haha.

There's a sizable amount of power put into the audio in Xbox 3. Kinect is at least partially to blame as I understand it. See Siri-like capabilities.
http://www.neogaf.com/forum/showpost.php?p=47822769&postcount=1225
 
The fact that Microsoft's innovations in hardware can bring Durango in line with the inherent efficiency of an Orbis style architecture is a remarkable achievement. What we don't have is any evidence or indication that they catapult Durango to higher efficiency in anything but the most specialized, latency sensitive situations. Meanwhile there will certainly be other situations where Orbis is more efficient. My argument has always been that on average they offer very similar levels of efficiency and as a consequence that is not an avenue by which Durango can close the gap in power that exists on paper.

I agree with this; the custom hardware in Durango will close the gap, but to believe that Orbis will somehow be inefficient, allowing Durango to pass it in performance, is silly. Sure, the real world performance and visual gap between the two may not be all that noticeable.

The key thing to keep in mind (and what I've been repeating over and over again) is that MS were going for 6-8x the power of the 360 (which Durango certainly is) and they didn't know what Sony was doing.

From everything we know so far, and bkilian's lament about the new direction the Xbox division is heading in, it's quite clear that a traditional, performance-focused console aimed largely at core gamers is not what MS had in mind.

In many ways, it represents the culmination of MS's strategy for the first Xbox, a trojan horse that would use gaming to gain MS control of the living room and the TV - their vaunted second screen after PCs (though they didn't foresee the rise of tablets and smartphones as third and fourth screens).
 
Is the talk about Durango being more efficient and generally having less idle time due to the DMEs? Cause I really don't see it as that big an advantage considering Sony has CUs sitting on the side that could theoretically be programmed to do the same thing, if a developer sees fit.

Especially considering they are still somewhat limited by Durango's bandwidth.
 
Is the talk about Durango being more efficient and generally having less idle time due to the DMEs? Cause I really don't see it as that big an advantage considering Sony has CUs sitting on the side that could theoretically be programmed to do the same thing, if a developer sees fit.

Especially considering they are still somewhat limited by Durango's bandwidth.

Yeah, specific custom DMEs are totally the same thing as general purpose CUs. :rolleyes:

Hell, why not just add a ton of CUs to replace all components! More CUs solves everything!
 
Yeah, specific custom DMEs are totally the same thing as general purpose CUs. :rolleyes:

Hell, why not just add a ton of CUs to replace all components! More CUs solves everything!

Totally appreciate the snarkiness, but I'd say I raise a valid point. The DMEs are basically just DMAs with some compression/decode hardware.

We've known since AMD released the GCN whitepapers about GCN's dual DMA engines (albeit designed for PCIe) and the slew of memory management tools, such as hardware/driver-level functions like virtual memory and full compatibility with the API functions in OpenCL and DirectCompute.

So again, considering Sony has supposedly separated 4 CUs from the rendering stack and given the dev explicit control over them, why couldn't they perform the same task as the DMEs? If it's even generally needed in the first place?
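
Just to illustrate the general idea of why dedicated copy engines are attractive at all, here is a plain CPU-side analogy, with a helper thread standing in for a DME; this has nothing to do with Durango's or GCN's actual interfaces, it only shows the overlap-transfer-with-compute pattern:

```c
/* CPU-side analogy only: a "move engine" is something that shuffles data
 * concurrently, so the compute resource never has to stop and do the copy
 * itself. A helper thread stands in for the DME; the main thread stands in
 * for the units doing real work. Build with -lpthread. */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

enum { BUF_BYTES = 64 * 1024 * 1024 };

static char *src, *dst;

static void *move_engine(void *arg) {
    (void)arg;
    memcpy(dst, src, BUF_BYTES);   /* the bulk transfer, done off to the side */
    return NULL;
}

int main(void) {
    src = malloc(BUF_BYTES);
    dst = malloc(BUF_BYTES);
    memset(src, 0xAB, BUF_BYTES);

    pthread_t dme;
    pthread_create(&dme, NULL, move_engine, NULL);  /* kick off the copy...   */

    double acc = 0.0;                               /* ...and keep computing  */
    for (long i = 0; i < 50 * 1000 * 1000; i++)
        acc += (double)i * 0.5;

    pthread_join(dme, NULL);
    printf("copy done, compute result %.0f, first dst byte %d\n",
           acc, (int)(unsigned char)dst[0]);
    free(src);
    free(dst);
    return 0;
}
```

Whether the reserved CUs could do that copy-plus-decompress job as well as fixed-function hardware is exactly the open question in this exchange; the sketch only shows why overlapping the transfer with compute matters in the first place.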
 
Bg has said we're looking at around 200 GFLOPS for the Durango CPU.
http://www.neogaf.com/forum/showpost.php?p=47803660&postcount=1042

I think we need to know what a "single core" and a "single cluster" do in GFlops first.

I went ahead and read the 2012 presentation

http://www.hotchips.org/wp-content/...4/HotChips24.Proceedings-revised-12-09-07.pdf

It's clear from the presentation that a Single Jaguar "cluster" has 4 Jaguar Cores.
The standard thus should be one cluster.

Reviewing what we had before.

Durango
- x64 Architecture
- 8 CPU cores running at 1.6 gigahertz (GHz)
- each CPU thread has its own 32 KB L1 instruction cache and 32 KB L1 data cache
- each module of four CPU cores has a 2 MB L2 cache resulting in a total of 4 MB of L2 cache
- each core has one fully independent hardware thread with no shared execution resources
- each hardware thread can issue two instructions per clock


Orbis

  • Orbis contains eight Jaguar cores at 1.6 Ghz, arranged as two “clusters”
  • Each cluster contains 4 cores and a shared 2MB L2 cache
  • 256-bit SIMD operations, 128-bit SIMD ALU
  • SSE up to SSE4, as well as Advanced Vector Extensions (AVX)
  • One hardware thread per core
  • Decodes, executes and retires at up to two instructions/cycle
  • Out of order execution
  • Per-core dedicated L1-I and L1-D cache (32 KB each)
  • Two pipes per core yield 12.8 GFlops performance
  • 102.4 GFlops for system

There is a clear discrepancy here.

If we take everything VGleaks is giving us at face value, as well as any current claims that Durango has ~200 GFlops, then AMD must have found some huge breakthrough in their Jaguar cores, or they've done something odd with the Durango.

For Durango's 8-core Jaguar to hit 200 GFlops, each core has to hit ~25 GFlops.
25 GFlops / 1.6 GHz = ~15.625 Flops/cycle; realistically this should probably be 16 Flops per cycle per core.
In comparison, the currently rumored Orbis cores are 102.4 GFlops / 8 cores / 1.6 GHz = 8 Flops per cycle per core.


So if Durango's CPU can achieve 200 GFlops and is Jaguar based, one of three things is true. I may be missing some scenarios, but nevertheless:

1. New Durango sports 16 Jaguar cores, arranged in 4 clusters, validating Orbis sporting 102.4 GFlops
2. New Durango sports Jaguar cores that can execute twice as many Flops as Orbis Jaguar Cores.
3. In fact Orbis also sports ~200 Gflops and Jaguar cores really do 16 Flops per cycle, and somebody misinterpreted the Orbis specs somewhere along the line.

Thoughts?
Correct me if I'm wrong.
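
Making the arithmetic above explicit (same rumored figures, nothing new added): peak GFLOPS = cores × FLOPs per cycle × clock in GHz.

```c
/* Peak-FLOPS arithmetic from the post above. All inputs are the rumored
 * figures, not confirmed specs. */
#include <stdio.h>

static double peak_gflops(int cores, double flops_per_cycle, double ghz) {
    return cores * flops_per_cycle * ghz;
}

int main(void) {
    /* Orbis as leaked: 8 Jaguar cores, 8 FLOPs/cycle, 1.6 GHz */
    printf("Orbis (leak):               %.1f GFLOPS\n", peak_gflops(8, 8.0, 1.6));   /* 102.4 */

    /* FLOPs/cycle an 8-core 1.6 GHz part would need to reach ~200 GFLOPS */
    printf("FLOPs/cycle for 200:        %.3f\n", 200.0 / (8 * 1.6));                 /* 15.625 */

    /* Scenario 1: 16 vanilla Jaguar cores */
    printf("16 vanilla cores:           %.1f GFLOPS\n", peak_gflops(16, 8.0, 1.6));  /* 204.8 */

    /* Scenario 2: 8 cores with doubled FLOPs/cycle (wider/more FP units) */
    printf("8 doubled-throughput cores: %.1f GFLOPS\n", peak_gflops(8, 16.0, 1.6));  /* 204.8 */
    return 0;
}
```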
 
Sure, the real world performance and visual gap between the two may not be all that noticeable.

Or can it?
Not because one is particularly more powerful than the other, but because one allows the brute force approach while the other must be fine-tuned, and in time-constrained developments that can make a difference.
So even if they can produce almost the same output, only half (just saying) of the developers will spend enough time on Durango to reach it.

But not knowing any real details, this is of course only my supposition.
 
So if Durango's CPU can achieve 200 GFlops and is Jaguar based, one of three things is true. I may be missing some scenarios, but nevertheless:

1. New Durango sports 16 Jaguar cores, arranged in 4 clusters, validating Orbis sporting 102.4 GFlops
2. New Durango sports Jaguar cores that can execute twice as many Flops as Orbis Jaguar Cores.
3. In fact Orbis also sports ~200 Gflops and Jaguar cores really do 16 Flops per cycle, and somebody misinterpreted the Orbis specs somewhere along the line.

Thoughts?
Correct me if I'm wrong.

Wouldn't it just require wider or more vector units per core, like they did with Xenon and the VMX128 units.

Despite using the same PPE as in Cell, Xenon had 50% more perf per core, a total of 115 GFLOPS for the 3 cores vs 25.6 GFLOPS for the PPU in Cell, all thanks to the upgraded VMX units.
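
Running the same per-core, per-cycle check on those Xenon/PPE numbers (both chips clocked at 3.2 GHz; the FLOPS figures are the ones quoted above):

```c
/* Per-core, per-cycle check for the Xenon vs. Cell PPE comparison above. */
#include <stdio.h>

int main(void) {
    double clock_ghz      = 3.2;
    double xenon_per_core = 115.0 / 3.0;  /* ~38.3 GFLOPS per core  */
    double cell_ppe       = 25.6;         /* the single PPE in Cell */

    printf("Xenon core: %.1f FLOPs/cycle\n", xenon_per_core / clock_ghz);  /* ~12  */
    printf("Cell PPE:   %.1f FLOPs/cycle\n", cell_ppe / clock_ghz);        /*  8   */
    printf("Ratio:      %.2fx\n", xenon_per_core / cell_ppe);              /* ~1.5 */
    return 0;
}
```

So the quoted 50% per-core advantage is consistent with the upgraded VMX units sustaining roughly 12 FLOPs per cycle instead of 8.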
 
Or can it?
Not because one is particularly more powerful than the other, but because one allows the brute force approach while the other must be fine-tuned, and in time-constrained developments that can make a difference.
So even if they can produce almost the same output, only half (just saying) of the developers will spend enough time on Durango to reach it.

But not knowing any real details, this is of course only my supposition.

I'm saying it'll be like GTA4, Black Ops, Skyrim, RDR etc running on PS3 vs 360.

720 might lose DF Face Offs, but to the average gamer, these differences won't be readily apparent.
 
Wouldn't it just require wider or more vector units per core, like they did with Xenon and the VMX128 units.

Despite using the same PPE as in Cell, Xenon had 50% more perf per core, a total of 115 GFLOPS for the 3 cores vs 25.6 GFLOPS for the PPU in Cell, all thanks to the upgraded VMX units.

So Jaguar can decode 2 ops a cycle, has 256-bit load / 128-bit store, has buffers/queues designed for "optimal" 128-bit 2-FPU throughput, and can retire two ops a cycle. In Jaguar's uarch, how much are you going to buy by just adding more FPU (either number or width) and nothing else? I would expect not very much, for a whole bunch of power.
 
I wonder if replacing the FPMUL unit with an FMA unit would help, not necessarily for increasing flop count, but for efficiency. I would expect that in most workloads multiply-accumulate is the type of operation that will be done. It would seem that an FMA unit could help with register pressure and latency compared with scheduling the separate multiply/add instructions currently. In theory it could also help with instruction decode, since it eliminates an instruction from the stream.
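
You can't see register pressure from C, but the "one operation instead of two" point (and the single rounding that comes with fusion) is easy to show with the standard C99 fma() from math.h; nothing Jaguar-specific here, just a sketch of what fusing buys:

```c
/* Separate multiply-then-add vs. a single fused multiply-add. The fused form
 * rounds once, and on hardware with FMA units it is also a single instruction,
 * which is the decode/register-pressure argument above. (Whether the compiler
 * emits a real FMA instruction depends on the target; fma() may be a libm
 * call. Build with -lm.) */
#include <math.h>
#include <stdio.h>

int main(void) {
    double a = 1.0 + ldexp(1.0, -30);     /* 1 + 2^-30    */
    double b = -(1.0 + ldexp(1.0, -29));  /* -(1 + 2^-29) */

    volatile double prod = a * a;         /* rounded product: the 2^-60 term is lost */
    double separate = prod + b;           /* two operations -> 0.0                    */
    double fused    = fma(a, a, b);       /* one fused operation -> 2^-60 survives    */

    printf("separate: %g\n", separate);
    printf("fused:    %g\n", fused);
    return 0;
}
```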
 