NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

Actually, if you are a regular in the 3D Architecture & Chips section you'd have seen quite a lot of discussion of ROPs over the years for various GPU architectures: the number of ROPs versus the bandwidth available to feed them, how effectively those ROPs are used, how effectively they can use that bandwidth, and how their capabilities compare to the competition's. ROPs have gained more dedicated functions and grown in capability over the years.



Of course, I may be biased coming from the PC side of things. Higher graphics performance doesn't automatically mean higher resolution when you're talking about pushing the 3D graphics envelope. True, a game can be made to run at 60 fps at 2560x1600, but it won't ever look as good as a game made to run at 60 fps at 1920x1200 or 1920x1080 on the same hardware.

Graphics hardware on PC is many times faster now than it was in 2005. But for a game that pushes the boundaries (fairly rare) running on hardware that is current at the time of release, smooth gameplay at max settings is generally only achievable at 1920x1080. Even then it may chug at times and require turning down some settings. 1920x1080 or 1920x1200 is the PC equivalent of 720p on consoles.

So basically what it comes down to is this: a console game optimized for 720p at 30 or 60 fps will almost always look better than one optimized for 1080p at 30 or 60 fps, assuming the same genre, i.e. open world versus open world or corridor versus corridor, not open world versus corridor shooter.
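
For scale, the raw pixel counts behind that trade-off (plain arithmetic, nothing console-specific):

```python
# Per-pixel budget comparison between 720p and 1080p at the same frame rate.
res_720p = 1280 * 720       # 921,600 pixels
res_1080p = 1920 * 1080     # 2,073,600 pixels

ratio = res_1080p / res_720p
print(f"1080p pushes {ratio:.2f}x the pixels of 720p")                  # 2.25x
print(f"so at 720p each pixel gets ~{ratio:.2f}x the shading budget")
```

In other words, dropping to 720p frees up roughly 2.25x the per-pixel budget to spend on lighting, shading and post-processing.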

Absolutely nothing stopped developers from making 1080p games on PS3 and X360, except for one thing. A 1080p 30 fps game would not look as good as a 720p 30 fps game.

The same will likely be true for Orbis and Durango. 1080p 30 fps may look a bit better than 720p 30 fps from the past generation, but it won't look better than 720p 30 fps on the same hardware.

I fully expect that some developers, maybe more than last generation, will target 1080p. But I believe that all the best looking games will be 720p or perhaps slightly higher.



Not necessarily. It just means there is no dedicated hardware support for it. So it can be done, it just requires more GPU resources in order to do it.

Regards,
SB


I'm a PC gamer too. The performance hit (and the cost of the screens) of rendering at post-1080p resolutions just doesn't really interest me. I'd need a very compelling reason to buy such a screen and render at it, and right now I'm kind of 'meh' towards it.

I understand fully that rendering at a lower resolution provides performance benefits. But I think the console makers and the developers would very much like to hit 1080p for fidelity reasons, and to make the most of the HDTVs their owners have.

I think 1080p may have been a bit far out of reach for current generation consoles for memory and bandwidth reasons. In that scenario it just didn't make a whole lot of sense, since 720p provided an adequate resolution jump from prior generations while still allowing them to get good mileage from the hardware.

I don't see why there wouldn't be a similar jump with this new hardware.
 
So that basically means that PlayStation games will have to render the interface and the game at the same resolution, while Durango is able to render the interface and the game at different resolutions?

That's what it sounds like, which would imply Durango has the advantage of being able to use its planes for foreground/background display with the OS overlay on top, while Orbis can only do a game plane plus overlay. I'd conjecture that if devs are able to use foreground/background planes (displaying foreground content in the HUD plane along with the HUD), that could make a big difference in how they can optimize their visuals.
 
I'm a PC gamer too. The performance hit (and the cost of the screens) of rendering at post-1080p resolutions just doesn't really interest me. I'd need a very compelling reason to buy such a screen and render at it, and right now I'm kind of 'meh' towards it.

I understand fully that rendering at a lower resolution provides performance benefits. But I think the console makers and the developers would very much like to hit 1080p for fidelity reasons, and to make the most of the HDTVs their owners have.

I think 1080p may have been a bit far out of reach for current generation consoles for memory and bandwidth reasons. In that scenario it just didn't make a whole lot of sense, since 720p provided an adequate resolution jump from prior generations while still allowing them to get good mileage from the hardware.

I don't see why there wouldn't be a similar jump with this new hardware.

The thing is, at typical desktop monitor distances (15-30 inches, average 26.7 inches I believe) it's very easy to distinguish between 720p and 1080p, as well as to see the artifacts from scaling 720p up to 1080p.

At typical TV viewing distances for the relevant screen size, it's significantly more difficult to differentiate between native 1080p and upscaled 720p unless your TV has an absolutely horrendous scaler.

For example, my 24" monitor (using that instead of my 30" to keep the pixel w:h similar) at about 2.5-3 feet viewing distance is roughly equivalent to a 110" HDTV at my TV viewing distance (10-12 feet away). My TV, like most people's, is much smaller than that. Mine happens to be 55", which is larger than what many of my friends have at that same distance. For a desktop monitor that would be like viewing a 10-12" screen from 2.5-3 feet away and trying to distinguish between native 1080p and upscaled 720p. Unlikely to happen.

Or to put it another way: I would need a 110" TV in my living room to get close to being able to distinguish between 720p and 1080p the way I can on a 24" desktop monitor.
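
To make that equivalence concrete, here's a minimal sketch using simple small-angle scaling (apparent size goes roughly with size/distance); the exact figure depends on which end of each distance range you pick:

```python
def equivalent_size(diagonal_in, view_dist_ft, new_dist_ft):
    """Screen diagonal that subtends roughly the same visual angle at a new distance."""
    return diagonal_in * (new_dist_ft / view_dist_ft)

# 24" desktop monitor viewed from 2.5-3 ft, pushed out to a 10-12 ft couch distance
for monitor_dist in (2.5, 3.0):
    for tv_dist in (10.0, 12.0):
        tv = equivalent_size(24, monitor_dist, tv_dist)
        print(f'24" at {monitor_dist} ft ~ {tv:.0f}" TV at {tv_dist} ft')
```

That works out to roughly 80-115" depending on the distances chosen, which is the same ballpark as the 110" figure above.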

Hence why people cannot easily tell 540p with some AA from native 1080p with light or no AA on a TV screen at TV viewing distances. The only thing that could give it away is the jaggies from aliasing. At 720p even that is much less significant, and even without AA it is difficult to tell.

It's also why I do all of my PC gaming at 720p on the TV instead of 1080p. There is no noticeable difference in quality at the same graphical settings, but 720p allows me to bump up the rendering effects significantly, making the 720p rendering far superior to the 1080p rendering. And all without needing a monster (noisy and power-hungry) GPU for that PC.

That certainly isn't the case for my desktop PC however, where the close proximity of the monitor doesn't allow for something like that.

Regards,
SB
 
I don't know, what was it?



First, Jaguar isn't off the shelf because it hasn't even made the shelf yet.
And I wasn't objecting to customizations of a console CPU implementation; that's expected.

What was being posited was that, without even seeing first silicon, they were going to double what they had already doubled, which was itself double a CPU they didn't want from the start. How could they safely determine what they'd end up with from that? "They" of course being Microsoft, who are much too smart to go with a convoluted plan like that.
What's so convoluted about replacing the 128bit ALUs with 256bit ALUs? That would give them the required performance, right? I understand it's not trivial, but neither was adding VMX128 to the 360.
 
This is true. Distance is a significant factor in how we perceive detail. Still, that would kind of suck for console gamers (like myself) who play their games on PC monitors lol.

Then again, I could just hook it up to my HD TV like it is meant to be and call it a day. Hrm.


Unrelated: if the above is true, why is there even a push for XHD resolutions? What advantage would they convey? It seems like it would be a tough sell for manufacturers to get consumers to buy them with no discernible advantages.

Yeah, that would be a bummer for people playing on a desktop monitor. But how likely is it that Sony, Microsoft, or console game developers are going to make concessions for those people?

XHD's main benefit is that it is "new" and would (the TV manufacturers hope, anyway) allow them to charge a premium price with the associated high margins. It's what they had hoped would happen with 3D TVs, but that market crashed due to low demand and they had to slash margins after the first couple of years in order to get people to buy them.

The other benefit is that they hope it will get someone to upgrade sooner than the 10-15 years many people hang onto their TV for.

Another benefit will mostly be seen on the PC front, where the pixels will become far less easy to distinguish, as they already are with HDTVs (again, at typical viewing distances). At the other end of the spectrum, instead of higher PPI they could go with the same or slightly higher PPI but a much larger desktop workspace. I.e., a 24" UHD monitor would have a PPI high enough to make individual pixels incredibly hard to see, or you could go with a 30-40" UHD monitor for a large desktop workspace at a usable resolution.
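
For reference, the PPI numbers behind that comparison (the monitor sizes are just illustrative):

```python
import math

def ppi(h_pixels, v_pixels, diagonal_in):
    """Pixels per inch for a given resolution and screen diagonal."""
    return math.hypot(h_pixels, v_pixels) / diagonal_in

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')            # ~92 PPI baseline
for size in (24, 32, 40):
    print(f'{size}" UHD (3840x2160): {ppi(3840, 2160, size):.0f} PPI')
```

A 24" UHD panel lands around 184 PPI, while a 40" UHD panel sits near 110 PPI, i.e. only modestly denser than today's 24" 1080p monitors but with four times the workspace.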

Very large TVs will benefit as well. But again, that plays right into the high-margin premium sets that TV manufacturers would love to sell.

I used to think that higher resolution would be needed for computing on TVs, but I've since found that it's largely irrelevant. At typical viewing distances the pixels aren't nearly as apparent, and you usually end up having to zoom/magnify the desktop a bit anyway to make small text readable.

Regards,
SB
 
Unrelated: if the above is true, why is there even a push for XHD resolutions? What advantage would they convey? It seems like it would be a tough sell for manufacturers to get consumers to buy them with no discernible advantages.

Because Sony, Samsung and others want to sell you a new TV, and studios want to sell you new copies of the movies you already own. But the fact is that it's becoming a hard sell. The uptake of Blu-ray has been really slow compared to DVD (although there are other factors, such as digital distribution). I seriously doubt you'll see 4K take off (I use the term loosely; it took 10 years for HD to surpass SDTV, and no doubt even that was helped along by the change in form factor) like HDTV did, unless they can introduce paper-thin wireless displays that you can stick to the wall with a piece of gum.
 
What's so convoluted about replacing the 128bit ALUs with 256bit ALUs? That would give them the required performance, right? I understand it's not trivial, but neither was adding VMX128 to the 360.

If it were just 128-bit to 256-bit then I'd agree, but it's not.

It's 64-bit 2-core to 128-bit 4-core, then 128-bit 4-core to 128-bit 8-core, then 128-bit 8-core to 256-bit 8-core.

This doesn't seem convoluted to you?

Also, along the way you go from no AVX to AVX to AVX2, and it's doubtful you'd use any of it.

On top of that you are going from 40nm to 28nm.

And why, if you were targeting 200 GFLOPS all along, would you start with something that's not even a tenth of your target?
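
For context on those numbers, here is the back-of-the-envelope peak-FLOPS arithmetic; the 1.6 GHz clock and the one-add-plus-one-multiply-per-lane-per-cycle figure are assumptions made purely for illustration, not confirmed specs:

```python
# Illustrative single-precision peak FLOPS for the configurations being argued about.
def peak_gflops(cores, ghz, simd_bits, flops_per_lane_per_cycle=2):
    lanes = simd_bits // 32                      # 32-bit float lanes per SIMD unit
    return cores * ghz * lanes * flops_per_lane_per_cycle

print(peak_gflops(2, 1.6, 64))     # 2-core, 64-bit FP:      ~12.8 GFLOPS
print(peak_gflops(8, 1.6, 128))    # 8-core, 128-bit ALUs:  ~102.4 GFLOPS
print(peak_gflops(8, 1.6, 256))    # 8-core, 256-bit ALUs:  ~204.8 GFLOPS
```

Under those assumptions the 2-core starting point does indeed come in below a tenth of a ~200 GFLOPS target.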
 
Because Sony, Samsung and others want to sell you a new TV, and studios want to sell you new copies of the movies you already own. But the fact is that it's becoming a hard sell. The uptake of Blu-ray has been really slow compared to DVD (although there are other factors, such as digital distribution). I seriously doubt you'll see 4K take off (I use the term loosely; it took 10 years for HD to surpass SDTV, and no doubt even that was helped along by the change in form factor) like HDTV did, unless they can introduce paper-thin wireless displays that you can stick to the wall with a piece of gum.
I was really hoping the PS4 would support a new 4K format, but when Sony themselves said there probably won't be a disc format for 4K, the chances of it ever happening got slimmer still. Too many companies must agree and move together, and they never do; there's always one loner company trying to sabotage any attempt.

Unlike Blu-ray, 4K will certainly remain a niche. But even then, a great upscaler integrated into the display is enough to sell me a 4K projector right now. Some kind of 4K support in the PS4 would have been appreciated, considering it doesn't require much of an expense in hardware. All it needs is an industry standard and a portable format without online DRM hell. We got rid of DIVX to save DVD; I don't want its ghost back.
 
That's what it sounds like, which would imply Durango has the advantage of being able to use its planes for foreground/background display with the OS overlay on top, while Orbis can only do a game plane plus overlay. I'd conjecture that if devs are able to use foreground/background planes (displaying foreground content in the HUD plane along with the HUD), that could make a big difference in how they can optimize their visuals.

This has been discussed to death, in this thread even.

The amount of power required to implement the display planes on Orbis is trivial: IIRC it's about 1 CU for scaling at 1080p 60 fps, plus ~4 1080p writes to memory per frame (which is not a large amount considering how many it can do per frame, IIRC ~200 or more).
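
To put those figures in perspective, a rough bandwidth estimate (32-bit pixels and 60 fps assumed, purely illustrative):

```python
# Cost of "~4 extra 1080p writes per frame" at 60 fps.
bytes_per_surface = 1920 * 1080 * 4              # one 1080p surface, ~8.3 MB
writes_per_frame = 4
fps = 60

extra_bw_gb_s = bytes_per_surface * writes_per_frame * fps / 1e9
print(f"~{extra_bw_gb_s:.1f} GB/s for the extra plane writes")    # ~2 GB/s

# For comparison, the ~200 full-frame writes per frame quoted as the budget
# would be on the order of 200 * 8.3 MB * 60 ~ 100 GB/s.
```

So the plane compositing traffic is on the order of 2 GB/s, a small slice of the quoted per-frame write budget.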
 
If it were just 128-bit to 256-bit then I'd agree, but it's not.

It's 64-bit 2-core to 128-bit 4-core, then 128-bit 4-core to 128-bit 8-core, then 128-bit 8-core to 256-bit 8-core.

This doesn't seem convoluted to you?

Also, along the way you go from no AVX to AVX to AVX2, and it's doubtful you'd use any of it.

On top of that you are going from 40nm to 28nm.

And why, if you were targeting 200 GFLOPS all along, would you start with something that's not even a tenth of your target?

What are you talking about? Normal Jaguar is 128-bit, and MS were going to use 8 cores since the alpha kits. It's the PS4 that switched from Steamroller to Jaguar.
 
Totally appreciate the snarkiness, but I'd say I raise a valid point. The DMEs are basically just DMA engines with some compression/decode hardware.

We've known since AMD released the GCN whitepapers about GCN's dual DMA engines (albeit designed for PCIe) and its slew of memory management tools, such as hardware/driver-level features like virtual memory and full compatibility with the API functions in OpenCL and DirectCompute.

So again, considering Sony has supposedly separated 4 CUs from the rendering stack and given the dev explicit control over them, why couldn't they perform the same task as the DMEs, if it's even needed in the first place?

Actually, the whole purpose of the DMEs is to move data back and forth between the main memory in Durango and the embedded memory. Orbis does not need to devote shader resources to replicate this function since it only has a single unified memory pool. When decompression is required Orbis has a dedicated unit for that, and when it does need to move data into the GPU's L2 cache, it will rely on the two standard DMA units it should also have.
 
Actually, the whole purpose of the DMEs is to move data back and forth between the main memory in Durango and the embedded memory. Orbis does not need to devote shader resources to replicate this function since it only has a single unified memory pool. When decompression is required Orbis has a dedicated unit for that, and when it does need to move data into the GPU's L2 cache, it will rely on the two standard DMA units it should also have.
Could you detail that? The only compression/decompression specific hardware I've heard mentioned is for zlib, and that's fairly standard and not really comparable to what the DMEs offer, at least, as I understand it.
 
Could you detail that? The only compression/decompression specific hardware I've heard mentioned is for zlib, and that's fairly standard and not really comparable to what the DMEs offer, at least, as I understand it.

The DMEs offer either the zip stuff or JPEG depending on what mode you want. I suspect the DMA units on the GPU in Orbis will provide tiling/untiling support, because AFAIK modern GPU DMA units already do this.
 
"Partially" was not used. I would not have addressed the post, if it was used.

I know that component cost reductions, most notably the falling cost of the Blu-ray drive components and the CPU and GPU moving to a smaller process, also contributed. This, though, was nothing remarkable, so I didn't mention it; it was what anyone with sufficient understanding of the subject expected to happen. In fact, the Blu-ray drive specifically was singled out by posters here from launch as a component that would become much cheaper and allow for significant cost reductions.

Can you provide evidence of hardware (hardware was the word used before) features that made the PS3 less capable with each revision? The original post I addressed said that Sony's PS3 hardware revisions were easier due to always taking something away. It didn't happen with every hardware revision and I showed that. Can you show otherwise? Please leave the subjectivity out of it (i.e. "which would you prefer").


I found a reference and it turns out I did misremember the timing of some of the hardware cut-downs. The USB ports and the optical pickup (disabling SACD playback) were both cut prior to the Slim, which is when I thought one or both of those cuts were made. So the Slim didn't include any significant hardware cuts, AFAICT.

I could argue for Linux being a feature removal, but I won't. I don't believe it was removed for cost reduction purposes and it's removed from all PS3s (with updated firmware) now.

Nonetheless, even after this correction I still don't think Sony did anything exceptional when cost-reducing the PS3. And I still think the 60GB launch PS3 was the best model. So, nothing really changed. I am happy to have the correct facts, though, for their own sake.
 
Could you detail that? The only compression/decompression specific hardware I've heard mentioned is for zlib, and that's fairly standard and not really comparable to what the DMEs offer, at least, as I understand it.

The LZ compression used by the DMEs is fairly standard, too. In fact, AFAIK zlib is just a variant of LZ. The only interesting thing they do is decompress the data in flight, from one memory pool to the other. Since Orbis only has the one pool it doesn't need that, just something to lessen the burden on the CPU.
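
For illustration, this is roughly the CPU-side work that a dedicated decompression unit (or a DME decompressing in flight) takes off the cores; a toy sketch using stock zlib, not any console API:

```python
import zlib

def cpu_decompress_into(dst: bytearray, compressed: bytes) -> int:
    """Toy illustration: inflate a zlib/LZ blob and land it in a destination buffer."""
    data = zlib.decompress(compressed)
    dst[:len(data)] = data
    return len(data)

# usage sketch with a made-up payload and destination
payload = zlib.compress(b"texture or geometry data " * 1000)
dst = bytearray(1 << 20)                 # stand-in for the destination memory region
n = cpu_decompress_into(dst, payload)
print(f"decompressed {n} bytes on the CPU")
```

On Durango the DME would do this while moving the data between pools; on Orbis the dedicated unit just has to spare the CPU the inflate step within the single pool.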
 
If it were just 128-bit to 256-bit then I'd agree, but it's not.

It's 64-bit 2-core to 128-bit 4-core, then 128-bit 4-core to 128-bit 8-core, then 128-bit 8-core to 256-bit 8-core.

This doesn't seem convoluted to you?

Also, along the way you go from no AVX to AVX to AVX2, and it's doubtful you'd use any of it.

On top of that you are going from 40nm to 28nm.

And why, if you were targeting 200 GFLOPS all along, would you start with something that's not even a tenth of your target?
I have no idea what you're asking. Jaguar has been on AMD's roadmap for a while now; when asking for a CPU for 2013 or so, it would have been the first thing offered by the company. If the only changes being asked for are putting two modules on a chip and widening the ALUs to 256-bit, that would be less customization than the last couple of generations of consoles saw. You can't start from Bobcat and assume that all the work to get to Jaguar is part of the console customization process. That's like saying that because the Core i7 is a descendant of the Pentium 3, it would be impossible for Apple to put it in a Mac mini because of the number of changes they'd have to make to the P3 to get there.

Also, AVX2 is not needed for this. AVX is 256 bit all by itself, and was in the planned Jaguar from the start.
 