NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

All I meant was:

If your design has an "off-the-shelf" 7770, then you can change that for a 7850 with a certain degree of pain.

But if your design has a modified 7770, then you can't replace it with a 7850 without re-implementing all of your modifications from the 7770 onto the 7850...

The original rumours were that the 2 GPUs were very close... Did Sony upgrade their GPU, whilst Microsoft found themselves unable to do that?
 
If your design has an "off-the-shelf" 7770, then you can change that for a 7850 with a certain degree of pain.
But if your design has a modified 7770, then you can't replace it with a 7850 without re-implementing all of your modifications from the 7770 onto the 7850...

what?
 
All I meant was:

If your design has an "off-the-shelf" 7770, then you can change that for a 7850 with a certain degree of pain.

But if your design has a modified 7770, then you can't replace it with a 7850 without re-implementing all of your modifications from the 7770 onto the 7850...

The original rumours were that the 2 GPUs were very close... Did Sony upgrade their GPU, whilst Microsoft found themselves unable to do that?

Which rumours? We've heard rumours saying that the two systems are close or have comparable performance. Whether that was the GPU or not is anyone's guess.

It's quite frankly unlikely that Sony changed their specs and upgraded their GPU and RAM. The only rumours about changed specs I've heard were subtle and vague hints from journos on NeoGAF who claimed that MS upped their specs after devs asked for "more" (be that RAM, GPU or CPU performance; I don't think anyone who isn't NDA'd to hell and back can say).

The original pastebin rumour for the PS4 is pretty much the same as the current most popular and sourced ones; the later leaks have just added minor details and filled in the blanks. I think it's a bit useless to speculate over a rumour about either company changing their specs when there's next to no way we'd ever be able to verify such claims.
 
Many years ago we were talking about NURBS rendering (PSP's reported NURBS hardware IIRC) and there were lots of issues back then. Has there really been no progress in realtime SDS rendering such that we can't ditch the triangle meshes and go with the root models in the first place? :(

The problems are fundamentally the same: control mesh densities have to be very high to approximate the actual curvature of real objects, so high that you only end up tessellating in extreme close-up.
IMO the trade-offs just aren't worthwhile.
I think you'll see it used, but largely for things where the actual curvature isn't important, or is well defined, as Laa-Yosh states.
If engines and toolchains start to support it early this gen then it's possible it might see broader use, but I'm not sold on it for a lot of things.
 
The problems are fundamentally the same: control mesh densities have to be very high to approximate the actual curvature of real objects, so high that you only end up tessellating in extreme close-up.
IMO the trade-offs just aren't worthwhile.
I think you'll see it used, but largely for things where the actual curvature isn't important, or is well defined, as Laa-Yosh states.
If engines and toolchains start to support it early this gen then it's possible it might see broader use, but I'm not sold on it for a lot of things.
I was thinking more for broad curved surfaces like domes and wheels and stuff. SDS/HOS are in theory a godsend for LOD when you have hardware tessellation, or that's how it seems to me, and I'm surprised we're not actually progressing along that route to any degree. It sounds like that tech won't really develop any time soon and we'll have to stick to triangles, making tessellation hardware a lot more niche than it ought to be. If only technical issues didn't keep getting in the way!
 
I was thinking more for broad curved surfaces like domes and wheels and stuff. SDS/HOS are in theory a godsend for LOD when you have hardware tessellation, or that's how it seems to me, and I'm surprised we're not actually progressing along that route to any degree. It sounds like that tech won't really develop any time soon and we'll have to stick to triangles, making tessellation hardware a lot more niche than it ought to be. If only technical issues didn't keep getting in the way!

What is the difference between NURBS and normal mapping? When you talk about NURBS, what exactly do you mean? I understand that typical polygon meshes (tris/quads) are basically flat-faced (except quads, which can curve slightly because of their four independent corner points). So by NURBS do you mean curved polygons? I always thought that normal mapping was exactly this: a reconstruction of curved geometry overlaid on the flat-faced polys to mimic the curved geometry.
 
I was thinking more for broad curved surfaces like domes and wheels and stuff.

Those kinds of things are actually very rare in practically any environment, and thus it's not worth developing a separate pipeline and content workflow for them.
Also, you say wheels, but have you actually looked at a car tire up close? :) It's so complex that even in offline CG it's a perfectly valid approach to just build a highres model with no tessellation at all.

Of course it's possible to design your content around smooth surfaces - but then it'd be a very unique and limited art style, and probably non-photorealistic. IMHO in that case it wouldn't be an efficient use of silicon to accommodate such limited uses.

SDS/HOS are in theory a godsend for LOD when you have hardware tessellation, or that's how it seems to me, and I'm surprised we're not actually progressing along that route to any degree.

They could be that, but then you'd require a full, almost micropolygon-level implementation to be able to rely on displacement alone. I don't think the hardware is advanced enough for that in this upcoming generation yet.
And again, even if you have HOS and displacement, they won't help in all cases and they won't necessarily make traditional discrete LOD obsolete either. Our average character nowadays is still 50k-80k polygons without subdivision, and for far shots with large crowds we build completely separate models instead. And, again, hard-surface (armored or robotic) characters would still not work with a low-res base plus tessellation and displacement.
 
It's quite frankly unlikely that Sony changed their specs and upgraded their GPU and RAM. The only rumours about changed specs I've heard were subtle and vague hints from journos on NeoGAF who claimed that MS upped their specs after devs asked for "more" (be that RAM, GPU or CPU performance; I don't think anyone who isn't NDA'd to hell and back can say).

The original pastebin rumour for the PS4 is pretty much the same as the current most popular and sourced ones; the later leaks have just added minor details and filled in the blanks. I think it's a bit useless to speculate over a rumour about either company changing their specs when there's next to no way we'd ever be able to verify such claims.

(Speculation) Secret first/second-party dev kits that weren't leaked. Aegies also said there was a 'major hardware change' for Orbis last summer. Afterwards we have vg247 with the 8 and 16GB kit rumour. The picture is certainly less clear for Orbis than for Durango. And then there's lherre, no longer willing to comment on performance comparisons, but at the same time willing to say Durango alpha = beta = final.
 
What is the difference between NURBS and normal mapping? When you talk about NURBS, what exactly do you mean? I understand that typical polygon meshes (tris/quads) are basically flat-faced (except quads, which can curve slightly because of their four independent corner points). So by NURBS do you mean curved polygons? I always thought that normal mapping was exactly this: a reconstruction of curved geometry overlaid on the flat-faced polys to mimic the curved geometry.

Normal mapping takes a flat surface and changes the lighting as if the light were hitting bumps (displacement mapping takes it a step further and shifts the pixel being rendered, but the underlying geometry is still flat). NURBS and other HOS (higher-order surfaces) define surfaces with curves. You'd combine both normal maps and NURBS. E.g. consider a golf ball. In the polygon case, you need a lot of triangles to create an approximate sphere onto which you'll add dimples by mapping; the golf-ball model will be quite large, with lots of vertices. A HOS can describe a perfect sphere with just a few values (such as the corners of its bounding cube), onto which you'd add the dimples, and tessellation would mean a perfect curve on screen no matter how close you view the ball (although the dimples will be limited by texture resolution).
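
To make the golf-ball comparison concrete, here's a minimal C++ sketch (my own illustration, not from the post above; all names are invented): the parametric description stores only a centre and radius, and triangles are generated at whatever density the current view needs, whereas a baked mesh has to ship every vertex up front.

```cpp
// Purely illustrative sketch (names made up): a baked triangle mesh vs. a
// parametric ("higher order") description of the same sphere.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// The HOS-style description: an exact point on the sphere for (u, v) in [0, 1].
// Only the centre and radius are stored; curvature is never approximated here.
Vec3 evalSphere(const Vec3& centre, float radius, float u, float v) {
    const float pi = 3.14159265f;
    float theta = u * 2.0f * pi;  // longitude
    float phi   = v * pi;         // latitude
    return { centre.x + radius * std::sin(phi) * std::cos(theta),
             centre.y + radius * std::cos(phi),
             centre.z + radius * std::sin(phi) * std::sin(theta) };
}

// Generate vertices at whatever density the current view needs.
// A baked golf-ball mesh would instead ship all of these vertices in the asset.
std::vector<Vec3> tessellate(const Vec3& centre, float radius, int rings, int segments) {
    std::vector<Vec3> verts;
    for (int r = 0; r <= rings; ++r)
        for (int s = 0; s <= segments; ++s)
            verts.push_back(evalSphere(centre, radius,
                                       float(s) / segments, float(r) / rings));
    return verts;
}

int main() {
    Vec3 c{0.0f, 0.0f, 0.0f};
    // Far away a coarse grid is plenty; close up, regenerate a denser one from
    // the same handful of stored values instead of storing a huge mesh up front.
    std::printf("far  LOD vertices: %zu\n", tessellate(c, 1.0f, 8, 16).size());
    std::printf("near LOD vertices: %zu\n", tessellate(c, 1.0f, 64, 128).size());
    return 0;
}
```

The same idea carries over to real higher-order surfaces such as NURBS or subdivision surfaces, where a small set of control points plays the role of the centre/radius pair.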
 
I always thought that normal mapping was exactly this: a reconstruction of curved geometry overlaid on the flat-faced polys to mimic the curved geometry.

Normal mapping is a cheat: you only modify the surface normal. The actual volume and silhouette of the object remain those of the low-res polygonal mesh, which is also why volumetric and raytraced shadows give it away easily.

NURBS and other higher-order surfaces are what you had in mind: a (theoretically) infinitely smooth surface expressed by a limited number of control points, approximated by tessellating it into a large number of triangles. Tessellation itself can be discrete or view-dependent.
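
As a rough, hypothetical sketch of that last distinction (again my own illustration; the function names and the pixels-per-triangle target are assumptions, not anyone's actual implementation): a discrete scheme picks from a few fixed subdivision levels, while a view-dependent one derives the factor from how large a patch edge appears on screen.

```cpp
// Illustrative sketch only: discrete vs. view-dependent tessellation factors.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Discrete: the author picks a handful of fixed subdivision levels up front.
int discreteFactor(float distance) {
    if (distance < 10.0f) return 32;
    if (distance < 50.0f) return 8;
    return 2;
}

// View-dependent: derive the factor from how large a patch edge appears on screen.
// edgeWorldLength and distance in world units, fovY in radians, screenHeight in pixels.
int viewDependentFactor(float edgeWorldLength, float distance, float fovY,
                        float screenHeight, float targetPixelsPerTri = 8.0f) {
    // Approximate projected length of the edge in pixels at this distance.
    float pixels = edgeWorldLength / (2.0f * distance * std::tan(fovY * 0.5f)) * screenHeight;
    int factor = static_cast<int>(std::ceil(pixels / targetPixelsPerTri));
    return std::clamp(factor, 1, 64);  // hardware tessellators cap the factor
}

int main() {
    for (float d : {5.0f, 25.0f, 100.0f})
        std::printf("distance %6.1f  discrete %2d  view-dependent %2d\n",
                    d, discreteFactor(d), viewDependentFactor(1.0f, d, 1.0472f, 1080.0f));
    return 0;
}
```

The clamp at the end reflects the fact that hardware tessellators have a maximum factor (64 in Direct3D 11), which is one practical limit on how finely a single patch can be subdivided.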
 
Which rumours? We've heard rumours saying that the two systems are close or have comparable performance. Whether that was the GPU or not is anyone's guess.

True. But given that both have ended up with processors which appear to be rather similar, it would make sense that the GPU performance was also 'similar'. But the rumours were never specific on individual components.

It's quite frankly unlikely that Sony changed their specs and upgraded their GPU and RAM. The only rumours about changed specs I've heard were subtle and vague hints from journos on NeoGAF who claimed that MS upped their specs after devs asked for "more" (be that RAM, GPU or CPU performance; I don't think anyone who isn't NDA'd to hell and back can say).

I think the surprisingly accurate pastebin rumour/source tends to imply the opposite:
http://pastebin.com/j4jVaUv0

The X-Box 3 is going to have an 8-core 64-bit processor (assumedly an i7 or similar design) rated at 1.2 Teraflops. The PS4 will feature a 4-core 32-bit processor. XB3 will be using a GPU running support for DirectX 11, while the PS4 will be using an OpenGL 4 GPU. XB3 is specified to use 4 GB RAM, and the PS4 will be shipped with 2GB.

Specifications state that the PS4 will have *NO* hard drive by default, while the XB3 will come with a 500GB Hard Drive with 8GB of Flash memory.
 
People need to call it something to differentiate from truly custom hardware like Xenos or EE+GS, and the natural opposite is 'off-the-shelf'. I think that's a language we'll just have to accept.

One way that I generally think of it is...

If it is based on a readily available consumer PC part then it's off the shelf. Modifications may slightly change or add capabilities, but they don't turn it into something unrecognizable from the item it is based on.

If it is based on something either not readily available in a PC, or on something custom-created, then I view it as more custom.

Cell, for instance, is obviously custom. Xenon straddles the line, but since PPC-based CPUs aren't common in the desktop PC space (even Apple abandoned that architecture tree many years ago), I wouldn't call it off the shelf. Xenos could have been an off-the-shelf part if the rumors are to be believed that it was based on a project that never made it into a shipping consumer part. But considering that project never shipped, it's fair to say Xenos was pretty custom.

Jaguar, being an x86 CPU that's coming to PCs in the near future, would be off the shelf, even with some minor modifications. RSX is another example. The Orbis and Durango GPUs, even if there have been some modifications, are still based on the currently available GCN GPUs with similar specs.

One could argue that it's custom in that it will be an SoC. But in my head I view that as more of an extension of the currently available APUs that AMD puts out: another "off the shelf" technology design.

I suppose since there is no formal definition, each person is likely to come up with their own. But it basically comes down to how much the hardware is based on common, or relatively common, products that are available in the consumer space.

Regards,
SB
 
Yes, as far as we know now, both of them are quite off the shelf. Even in Durango I can't see any great customization, at least no more than in the Wii U GPU: a GPU with eDRAM attached.
 
It's clearly a custom APU, i.e. we're unlikely to ever see anything like it in the PC space (8 Jaguar cores, 18 GCN CUs and a GDDR5 memory interface), but at the same time it uses mostly existing PC technology for the individual components.

The way the rumored Orbis system conforms to standard APU and GPU system organization is interesting to me in a theoretical sense, because if it's an SOC or SOP that can remove some of the custom IO and Sony-specific IP (if there is something that matches that descriptor) and instead put in a 16x PCIe link, that's actually something that could almost be a next-gen mid-range discrete card.
Having CPUs on-board along with a decent memory pool could make even a discrete card useful for client compute tasks, since it moves both endpoints in shared GPU/CPU work onto the same side of that slow expansion bus.

If it weren't for the proprietary nature of the implementation and fears of opening the system up to an uncontrolled and open platform, just the SOC and its RAM and maybe a front of case port box/optical drive would allow a PC/PS4.
If Microsoft got in the same game, there'd be a way to plug a PS4 card and XBOX720 card into your PC, and you could have a XPCS4720.

I'm not saying that this is in any way likely, just that the shared architectures and the extent of silicon integration make it physically and technically possible.
 
From the rumors, the modifications on both sides are just a different memory controller and some added accelerators (audio, LZ, JPEG). If those aren't considered part of "the architecture", what do we expect to be different about the actual architecture?
 
Physical implementation can make a lot of things different.
High-level unit counts hint strongly at a lot of the fundamental design elements being the same. However, let's say Durango or Orbis went to some of the extra contortions that the WiiU chip did for area savings. It would make a chip with all the same unit counts and checkboxes, but it wouldn't be a chip that would exist anywhere else.
 
But if your design has a modified 7770, then you can't replace it with a 7850 without re-implementing all of your modifications from the 7770 onto the 7850...
The reason companies like AMD can put out an entire family of chips in a short period of time is that features like the deltas between the 7770 and 7850 are configurable. So in most cases any customizations applied to a 7770 configuration are simultaneously applied to a 7850 configuration.
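As a purely software-flavoured analogy for that configurability (illustrative only; no real AMD tooling, names or numbers are implied), the point is that a customisation written once against the configurable baseline can be applied to any SKU of the family:

```cpp
// Illustrative analogy only: if the family deltas are just parameters, a
// customisation expressed once applies to any baseline in that family.
#include <cstdio>
#include <string>
#include <vector>

struct GpuConfig {
    std::string name;
    int computeUnits;
    int memoryBusBits;
    std::vector<std::string> extraBlocks;  // customer-specific additions
};

// The customisation is expressed once, against the configurable parts...
GpuConfig customise(GpuConfig base) {
    base.extraBlocks.push_back("audio block");
    base.extraBlocks.push_back("modified memory controller");
    return base;
}

int main() {
    // ...and applied unchanged to whichever baseline you pick.
    GpuConfig small{"7770-class", 10, 128, {}};
    GpuConfig big{"7850-class", 16, 256, {}};
    for (const GpuConfig& g : {customise(small), customise(big)})
        std::printf("%s: %d CUs, %d-bit bus, %zu extra blocks\n",
                    g.name.c_str(), g.computeUnits, g.memoryBusBits, g.extraBlocks.size());
    return 0;
}
```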
 
(Speculation) Secret first/second-party dev kits that weren't leaked. Aegies also said there was a 'major hardware change' for Orbis last summer. Afterwards we have vg247 with the 8 and 16GB kit rumour. The picture is certainly less clear for Orbis than for Durango. And then there's lherre, no longer willing to comment on performance comparisons, but at the same time willing to say Durango alpha = beta = final.

I don't believe a word the abovementioned says. I don't consider him a reliable source of information, as his NeoGAF posts have been all over the place for a while. It's also very clear that he misinterprets or misunderstands the things he sees and hears, and that's aside from everything that comes out of his mouth being extremely coloured or slanted by his own personal biases.

As to the rest of what you post, again, none of it is verifiable, and it probably never will be. The only things we can take as reasonable in light of the many rumours are that Sony changed the RAM from 2 to 4GB and seems to have lowered the target memory bandwidth from 192GB/s to 176GB/s. These have had more consistent rumours behind them.

As they're all rumours, and nothing is official yet, I think it's more prudent to take the dubious, singular and unconfirmed ones (i.e. those with only one source: forum posts by randomz rather than press reveals) with a healthier pinch of salt (or sauce, if you will ;-)).
 
I don't believe a word the abovementioned says. I don't consider him a reliable source of information, as his NeoGAF posts have been all over the place for a while. It's also very clear that he misinterprets or misunderstands the things he sees and hears, and that's aside from everything that comes out of his mouth being extremely coloured or slanted by his own personal biases.

Very true. He was part of TeamXbox, after all. He definitely has some access to spec docs, but no technical clue.

As to the rest of what you post, again, none of it is verifiable, and it probably never will be. The only things we can take as reasonable in light of the many rumours are that Sony changed the RAM from 2 to 4GB and seems to have lowered the target memory bandwidth from 192GB/s to 176GB/s. These have had more consistent rumours behind them.

Everything is unverifiable, by us common folk at least. But the fun is in playing detective. Oh, and the Edge 8GB rumour bumps validity up by 2 points :D
 