CONFIRMED: PS3 to use "Nvidia-based Graphics processor"

Status
Not open for further replies.
accidentalsuccess said:
the fact is that it will be time/cost prohibitive to develop ultra-highrez assets to "truly" take advantage of it for most game projects. That's where I see M$'s XNA to truly be the trump card, which of course is their plan.

You do realise that if your only concern is developing hi-res art assets etc. etc. a PS3 developer can easily create all their art content using XNA on windows and then transfer the assets to a PS3 dev kit and use the results in their own PS3 engines? :)
 
Titanio said:
You do realise that if your only concern is developing hi-res art assets etc. etc. a PS3 developer can easily create all their art content using XNA on windows and then transfer the assets to a PS3 dev kit and use the results in their own PS3 engines? :)
Nothing a contract and a watermark couldn't prevent.
 
nelg said:
Titanio said:
You do realise that if your only concern is developing hi-res art assets etc. etc. a PS3 developer can easily create all their art content using XNA on windows and then transfer the assets to a PS3 dev kit and use the results in their own PS3 engines? :)
Nothing a contract and a watermark couldn't prevent.

True, Microsoft might like that, but would middleware developers agree? A lot of companies making modelling and animation tools, for example, don't make them for each console. They make them for PC, and developers use them on PC to create their art assets. I doubt such tools companies would want to engage with XNA if it limited the use of their products' output to Windows.

Of course, I'm not saying everything about XNA will be useful for people working with platforms other than windows. Obviously any code that calls XNA-related functions etc. to bind two engines together, for example, isn't going to work outside of windows. But on the content/art creation side, there's potential for anyone to take advantage, regardless of what platform that content/art ends up on.

Also, how do you watermark a 3D mesh (for example)? (Honest question, I'm not suggesting it's impossible.) And how does the program know you are going to take it and use it on another platform? It can't watermark all output.
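For what it's worth, published mesh-watermarking schemes do exist. Here's a deliberately minimal Python sketch of the basic idea (all names are illustrative, and the perturbation size is assumed to sit below modelling tolerance): embed a bit string by nudging vertex coordinates, then recover it by comparing against the original mesh.

```python
# Toy mesh watermark: encode one bit per vertex in a tiny x-offset.
EPS = 1e-4  # perturbation magnitude, assumed below visual/tooling tolerance

def embed(vertices, bits):
    """Nudge each vertex's x coordinate up (bit 1) or down (bit 0)."""
    out = []
    for (x, y, z), b in zip(vertices, bits):
        out.append((x + (EPS if b else -EPS), y, z))
    return out

def extract(watermarked, original):
    """Recover the bits by comparing against the unmarked reference mesh."""
    return [1 if wx > ox else 0
            for (wx, _, _), (ox, _, _) in zip(watermarked, original)]

mesh = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
marked = embed(mesh, [1, 0, 1])
print(extract(marked, mesh))  # -> [1, 0, 1]
```

This naive version breaks under any scaling or re-export; real schemes embed the mark in geometric invariants to survive such transforms, but the principle is the same.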
 
Rambus memory is too slow for a GPU (random access), so it must have a big embedded memory (32 MB?).
But that embedded memory is too small for textures and geometry.
Hence no vertex processing in the GPU, only on the CPU.
 
Gubbi said:
CELL APUs can only speak to memory through DMA, so you have to set up source, destination and range for each memory transaction.
I don't think it will be that hard to directly expose that capability in the APU ISA.
There's much more to it than that if you read the patent. APUs can make DMA requests and group them, checking later for completion of the group(s) of DMA requests.

Luckily it appears that CELL has a DMA queue, so instead of stalling, the APU can go do some other stuff until the result turns up. But it's certain to be slower (higher latency) than a simple load/store to a cache hierarchy.
SPUs have an L1 cache! And we really don't know how slow external memory references will be.
The real problem (me and Faf are going to repeat that 1000 times...) is what APUs can do while waiting for DMA requests to complete.

ciao,
Marco
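One standard answer to "what can the APU do while waiting" is double buffering: kick off the DMA for the next block of data, compute on the current block, and only wait when the transfer lags behind. A toy Python sketch of the pattern (the thread pool stands in for the DMA engine; all function names are illustrative, not any real Cell API):

```python
from concurrent.futures import ThreadPoolExecutor

def dma_fetch(block_id):          # stand-in for an asynchronous DMA request
    return [block_id] * 4         # pretend this is data arriving from memory

def process(data):                # stand-in for the APU's compute kernel
    return sum(data)

def stream(n_blocks):
    results = []
    with ThreadPoolExecutor(max_workers=1) as dma:
        pending = dma.submit(dma_fetch, 0)              # prime the pipeline
        for i in range(n_blocks):
            data = pending.result()                     # wait only if DMA lags
            if i + 1 < n_blocks:
                pending = dma.submit(dma_fetch, i + 1)  # overlap the next fetch
            results.append(process(data))               # compute on current block
    return results

print(stream(3))  # -> [0, 4, 8]
```

The pattern hides DMA latency entirely whenever each block's compute time exceeds its transfer time, which is presumably what the grouped-request/queue machinery in the patent is there to enable.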
 
nAo said:
Anyway I know that someone is developing a custom architecture for physics simulation..

Funny, I do too. Are they in the midwest by chance?
 
PVR_Extremist said:
Out of interest how do MS and Sony stack up in terms of financial muscle?

All I'm alluding to is that it's not wise to count MS out of the "living room" PC experience. They are SERIOUSLY after "Windows xxx" in every living room; in reality, Xbox 1 and 2 are just stepping stones to what they really want: Windows everywhere, on virtually every device. I don't really think they are bothered about what hardware runs it so long as their software IP is on it.

Of course Sony have the same ultimate goal too (world domination :devilish: ) so I'm shit scared of any one of them achieving any kind of pre-eminence. What Sony don't have though is OS software experience (afaics), so will that be a hindrance to them?

Yes and no... I believe this is why they want their console to be compatible with Linux instead, leaving Linux to do that job for them. One thing's for certain, though: this is where Microsoft are at their strongest.
 
Here's something to take a look at
http://www.eet.com/semi/news/showAr...BCCKH0CJUMEKJVN?articleId=54200580&pgno=2

but more specifically.....
But UNC's Zimmons has his doubts. "I believe that while theoretically having a large number of transistors enables teraflops-class performance, the PS3 [Playstation 3] will not be able to deliver this kind of power to the consumer," he wrote in response to an e-mail query from EE Times. "The PS3 memory is rumored to be able to transfer around 100 Gbytes/second, which would mean it could process new data at roughly 25 Gflops (at 32 bits) — far from the 1-Tflops number."
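The arithmetic behind that quote is simple to check. Assuming, as Zimmons implicitly does, one floating-point operation per 32-bit word streamed from main memory (figures below are the rumoured ones from the quote, not confirmed specs):

```python
# Bandwidth-bound throughput estimate for a streaming workload.
bandwidth_bytes_per_s = 100e9   # ~100 GB/s rumoured memory bandwidth
word_size_bytes = 4             # one 32-bit float per operand
words_per_s = bandwidth_bytes_per_s / word_size_bytes
print(words_per_s / 1e9)        # -> 25.0, i.e. ~25 Gflops if every operand comes from memory
```

Note the 1-Tflops figure assumes heavy on-chip reuse of data (registers and local store) rather than fetching every operand from external memory, so the two numbers describe different workloads rather than contradicting each other.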
 
Khronus said:
"The PS3 memory is rumored to be able to transfer around 100 Gbytes/second, which would mean it could process new data at roughly 25 Gflops (at 32 bits) — far from the 1-Tflops number."

Please, not again. :?

Fredi
 
london-boy said:
Pugger said:
You've got to hand it to Sony, they really know how to gain loads of free PR by saying so little. On the technical side of things, can someone explain in layman's terms what the difference will be between bolting the next-gen Nvidia card into the PS3 and bolting in a next-gen Nvidia GPU based on Cell? Would Cell alone make the GPU better? Or am I totally missing the point? If it's just the new Nvidia GPU, wouldn't that mean that the differences between, say, the XB2 and Revolution will be minimal?

No one knows for sure, but I would expect a "normal" Nvidia GPU not to be 100% compatible or familiar with the CPU it is going to be attached to.
A Cell-based Nvidia GPU might interact better with a Cell-based CPU. But we don't know yet...

Which is probably why Sony has taken part in helping Nvidia with the GPU.
 
Just to address the earlier comment of MS using their $40+ billion in cash to buy Sony. It's not really feasible due to anti-trust laws, and it would be more efficient to simply buy Rockstar, Konami, Capcom, and Square-Enix for around $8 billion. Taking away GTA, MGS, RE + DMC, and FF/DQ would be the easiest way to hurt PS3. It's still a difficult proposition though.
 
Guden Oden:
To me you're focusing on the problems at the moment, not the gains from solving them; virtually all those issues are fairly easily solvable.

Why bother solving them? Because (just as in Cell) these types of operations are suited to particular special-purpose hardware. The simple fact is (as both Cell APUs and GPUs show) that it's easier to scale n special-purpose units (SPUs or GPU ALUs) than general-purpose CPUs. We are looking at systems in a couple of years capable of 1000 (real) FLOPs per cycle! That's worth trying to adapt certain algorithms to use that power.

GPUs and Cell are solutions to the computation problem: graphics were the first thing that needed it, sound and physics second, and AI will be third. Being able to switch where we compute things allows us to overcome bottlenecks and produce higher-quality games.
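To put "1000 (real) FLOPs per cycle" in perspective, a quick back-of-the-envelope conversion (the clock speed here is my assumption for illustration, not a figure from the post):

```python
flops_per_cycle = 1000   # n special-purpose ALUs, a few flops each
clock_hz = 3e9           # assumed ~3 GHz clock
peak_tflops = flops_per_cycle * clock_hz / 1e12
print(peak_tflops)       # -> 3.0 (TFLOPS peak, if every unit is kept busy)
```

As with any peak figure, sustaining it depends on keeping all those units fed, which is exactly the bandwidth problem discussed earlier in the thread.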
 
IGN David Roman Interview:

Firstly, how long has NVIDIA been working with Sony on this collaboration?

David Roman: NVIDIA has been working on aspects of Sony Computer Entertainment Inc.'s next generation system for the past 2 years.

How did NVIDIA become involved with working with Sony? Which party approached whom?

David Roman: Difficult to say. We have been talking to Sony about very many different projects from the early days of starting NVIDIA.

The Xbox GPU is essentially a beefy version of the GeForce 3. Will the PS3 (or whatever it will be called) GPU be based on forthcoming desktop GPU architecture or will it be its own entity entirely?

David Roman: It is a custom version of our next generation GPU

Up to this point, due to its proclaimed power, we had assumed that the Cell processor would be doing all of the processing in the PS3, both the generalized (AI, physics, etc.) and all of the graphical work. Will NVIDIA's GPU work in the terms that we're currently used to and handle the graphics entirely, or will it work with Cell in ways that current GPUs don't and let Cell handle some of the work (say, vertex transformation, for example)?

David Roman: I don't have that information. I know that this is a custom chip and we are working on its development with Sony Computer Entertainment

Will NVIDIA's GPU work be tied into the Cell architecture, or will it be a separate chip in the PS3?

David Roman: It will be a separate chip

S-Mart or Quickie Mart?

David Roman: …

Source: IGN
 
SegaR&D said:
1. MS wrote VertexShader and PixelShader specification.
Hmmm, so MS invented PS/VS before everybody else. Never mind the initial 3DLabs OpenGL 2.0 SL proposal and the Stanford shading language - both predate the MS HLSL spec.

2. VS and PS are optimized for DirectX/XNA, not OpenGL.
VS/PS in OpenGL and DirectX are nearly identical apart from some small semantic differences. There's really nothing DirectX- or OpenGL-specific about either spec.

3. But PSX3 cannot use DirectX. Must use Embedded OpenGL.
4. Embedded OpenGL that SCEI hopes to use is still in early stage.
5. Compared to Embedded OpenGL, DirectX is a mature technology.
Embedded OpenGL will be based on OpenGL 2.0 minus some things not needed in a console. The MS API will be based on DX10. Is DX10 mature?

6. Developers have been coding DX shaders for four years now. Developers are unfamiliar with Embedded OpenGL shaders.
The shading languages are nearly identical. I doubt having to learn slightly different syntax will hinder the developer's abilities to write shaders.

7. PSX3 no longer enjoys a rendering performance advantage over Xbox Next.
Yeah, whatever...
 
ManuVlad3.0 said:
IGN David Roman Interview:

The Xbox GPU is essentially a beefy version of the GeForce 3. Will the PS3 (or whatever it will be called) GPU be based on forthcoming desktop GPU architecture or will it be its own entity entirely?

David Roman: It is a custom version of our next generation GPU

Will NVIDIA's GPU work be tied into the Cell architecture, or will it be a separate chip in the PS3?

David Roman: It will be a separate chip
Source: IGN

In some ways I kind of expected this, though a part of me is a bit disappointed. I guess we don't really know what Nvidia's next GPU will be like, but I don't think it will be nearly as software-oriented an approach as I was hoping for. Still, the silver lining is that it probably means Nvidia's Linux drivers for the Cell workstation and for PCs will be very similar and should share a lot of common code. The Linux drivers will probably end up being very, very good (which is much more than I can say for ATI's drivers at this point).

Nite_Hawk
 
Will NVIDIA's GPU work be tied into the Cell architecture, or will it be a separate chip in the PS3?

David Roman: It will be a separate chip

Well, there goes the Visualizer. That means the PS3 will have an Nvidia graphics chip using their architecture, instead of the Visualizer architecture.
 
version said:
Rambus memory is too slow for a GPU (random access), so it must have a big embedded memory (32 MB?).
But that embedded memory is too small for textures and geometry.
Hence no vertex processing in the GPU, only on the CPU.

Please stop eating the brown acid, it kills brain cells.

What prompted this utterly random (and pretty much completely false) post? Do you even know anything about rambus memory except whatever (dis)information you've read posted by other clueless people like yourself on the web? :D

As for your silly talk about why eDRAM would be too small, you needn't worry; 3D rendering hardware doesn't store all the geometry in on-chip/on-board memory anyway, so your concerns are all moot.
 
nelg said:
Titanio said:
You do realise that if your only concern is developing hi-res art assets etc. etc. a PS3 developer can easily create all their art content using XNA on windows and then transfer the assets to a PS3 dev kit and use the results in their own PS3 engines? :)
Nothing a contract and a watermark couldn't prevent.

Beat me to it. heh.

Sure, art assets can be transferred but we'll see how it all plays out.

Back to more on-topic questions: having a custom, beefed-up "next gen" nvXX chip bolted onto the Cell architecture would require custom, probably hardware, interfacing, right? Or (vice versa) it would require custom bridging for PC architecture? There could be significant problems with an efficient bridge between two architectures, right?

I guess I don't understand how you can do a "custom chip based on next gen tech" that is compatible. This keeps getting more and more interesting.
 