Why not let PowerVR build RSX?

That was tit-for-tat.
Just because Img Tech doesn't compete in the desktop PC space doesn't mean they don't have the technology to render shiny pixels for a closed box. Likewise, just because ATI/AMD and NVIDIA dominate the PC graphics space doesn't mean they can outdo everyone else in every space at every design point.

The PC market is very different from closed boxes. Observing that PowerVR has not (re-)entered the PC market and extrapolating from there that they don't have a competitive hardware architecture is short-sighted.
You sure? I've been under the impression that Sony fabs RSX and can have a peek inside, do shrinks, etc., though obviously with some agreement in place that they will only use it to build the PS3.

PS3 may have set specs, but it's a hell of a lot closer to a PC than it is to a cell phone. Your argument would be pretty good for, say, a PSP2 (though the PSP already looks like it outdid PowerVR's handheld graphics, coming from a company with no real experience in the field), but the PS3's RSX is at worst at a midrange PC's performance level, with performance several orders of magnitude higher than anything PowerVR has ever done, and likely a much larger feature set as well.

What benefit does PowerVR get from PS3 being a closed box that Nvidia didn't? We know things were cut from RSX that were in the geforce 7 series, so it's not like nvidia was tied to every single feature its pc gpus had. I'd imagine it was a heck of a lot easier for nvidia to slightly scale down their chip for a console than it would be for powervr to take a cell phone chip, or maybe some geforce 6 level design hidden away in a lab, and scale it up for the ps3. Not to mention nvidia has a much better history of timely execution than PowerVR. I assume PowerVR wouldn't be building a custom design, because assuming even equal talent and past designs to work from, it shouldn't be any cheaper for powervr (with less manpower) to build a custom design than it would be for nvidia.
 
PS3 may have set specs, but it's a hell of a lot closer to a PC than it is to a cell phone. Your argument would be pretty good for, say, a PSP2 (though the PSP already looks like it outdid PowerVR's handheld graphics, coming from a company with no real experience in the field), but the PS3's RSX is at worst at a midrange PC's performance level, with performance several orders of magnitude higher than anything PowerVR has ever done, and likely a much larger feature set as well.
The SGX series has a really competitive feature set, with unified shaders (= high-performance vertex texture fetch), branching, and absurd program lengths. Or to quote the blurb directly: "exceeds the requirements of OGL 2.0 and Microsoft Shader Model 3".
It's been available for a long time, too. It's a mystery, of course, how far it could have been scaled in performance within the window of time where this would have been relevant.
Fox5 said:
What benefit does PowerVR get from PS3 being a closed box that Nvidia didn't?
PowerVR doesn't gain a benefit, but NVIDIA loses a benefit: the driver legacy. The knowledge and accumulated code stacks to support twelve years of Win32 software history. That's what killed S3 Graphics and XGI when they tried to take part in the PC market, not the hardware, and "my favourite game doesn't run [well] on this POS!!" would be PowerVR's greatest challenge too, since they bowed out so long ago.

If you start your software base over from scratch (and that's what's happening with the PS3: even the software emulation for previous generations, whenever it releases, will be new software), nobody will care whether or not AVP still runs as-is.
 

Common sense, that's how. If SONY didn't suck more, they wouldn't have hired Toshiba to design a GPU; they would've designed it themselves. Make sense?


And KK also said they had examined the feasibility of a CELL based GPU which turned out to be not very competitive vs a dedicated GPU.

As far as PowerVR goes, I would say that SONY didn't want to risk going with them for various reasons. I don't even think they talked to Imagination Technologies about a GPU for PS3. PowerVR is considered an "underdog".
 
Common sense, that's how. If SONY didn't suck more, they wouldn't have hired Toshiba to design a GPU; they would've designed it themselves. Make sense?

Nope. By your logic, MS sucks the most since it hired ATI and IBM to do Xbox 360's GPU and CPU.
 
Common sense, that's how. If SONY didn't suck more, they wouldn't have hired Toshiba to design a GPU; they would've designed it themselves. Make sense?

I highly doubt they "hired" Toshiba; it would most likely have been a partnership once again.

I don't even think they talked to Imagination Technologies for a GPU for PS3.

Who knows; from what I have heard, Sony did look around before going with NVIDIA.
 
Thanks for the info, but you're basically emphasizing my point. NVidia is not going to invest millions of dollars in the design until it does indeed become material. They'll make proposals, and Sony will mull over them for a while wondering if they can make a better or cheaper design in house, but NVidia is not going to seriously work on it until they get the contract.

Agreed, but my thrust was that work having commenced or not, it would have been on the table well ahead of time whether an NVidia contract would have meant a 'custom' architecture or not; and in terms of implementation, they would have had ultimately no less time than MS/ATI themselves had, so no more or less 'rushed' either.

I agree with Nonamer that NVidia would probably just have been a little reluctant to get too wild with the design; after all, they probably already felt they were making concessions to the console paradigm well beyond what they had for XBox. ATI on the other hand was in a position where MS' R&D dollars would provide an advance test of their unified architecture, and where the pursuit of eDRAM may have ironically stemmed from the expectation that Sony would use it in their own console (not to mention experience with such via GC).

I'm personally of the mind that ultimately dev API familiarity and shader flexibility is what led Sony down such a 'conventional' path with RSX. Conventional in quotes because, well, we really should note that there are important system considerations in place, however minor on a hardware level they may be.

Remember also that at the time, Sony was ready to dedicate as much RAM to RSX as MS was willing to dedicate to their entire console; a bold move relative to the strategies of their then contemporaries, although of course mooted once MS yielded to developer requests for 512MB.
 
Somewhere on this forum, in the discussion of Sony's choice of nVidia (which is a different topic from why they didn't choose PowerVR), it was said that there was a GPU from Toshiba alongside nVidia's offering. I don't remember the details. All I know is this info has been talked about before. Best people go looking for it, I think. This thread ought to be comments on why not to use PowerVR. I'd say the most obvious reason is that PowerVR haven't managed to produce a competitive part for desktop devices for years. Even Lindbergh chose nVidia over PowerVR, after lots of expectation from some parties that Lindbergh would incorporate a high-and-mighty ATI+nVidia-pwning GPU that'd outclass a G80 in 25 million transistors or somesuch ;)

If their tech is that suitable for large-scale power use, shouldn't it be being shown in large-scale power systems?
 
I don't think the PowerVR tech was the problem. I think Nvidia was simply the safer bet. It's like gambling, you don't put your money on the underdog (PowerVR).
 
The SGX series has a really competitive feature set, with unified shaders (= high-performance vertex texture fetch), branching, and absurd program lengths. Or to quote the blurb directly: "exceeds the requirements of OGL 2.0 and Microsoft Shader Model 3".
It's been available for a long time, too. It's a mystery, of course, how far it could have been scaled in performance within the window of time where this would have been relevant.
PowerVR doesn't gain a benefit, but NVIDIA loses a benefit: the driver legacy. The knowledge and accumulated code stacks to support twelve years of Win32 software history. That's what killed S3 Graphics and XGI when they tried to take part in the PC market, not the hardware, and "my favourite game doesn't run [well] on this POS!!" would be PowerVR's greatest challenge too, since they bowed out so long ago.

If you start your software base over from scratch (and that's what's happening with the PS3: even the software emulation for previous generations, whenever it releases, will be new software), nobody will care whether or not AVP still runs as-is.

You know, PowerVR was in PC graphics from the start, and they still couldn't compete driver-wise with nvidia.
Also, most companies at least targeted a few popular benchmark games and made those run well; no game ran well on XGI or S3 hardware, and XGI at least had tons of hacks in their drivers to try to make games run better.
 
I don't think the PowerVR tech was the problem. I think Nvidia was simply the safer bet. It's like gambling, you don't put your money on the underdog (PowerVR).

Also the IP and toolset, as mentioned, to avoid a repeat of the "PS2 is difficult to program" fiasco.
 
Agreed, but my thrust was that work having commenced or not, it would have been on the table well ahead of time whether an NVidia contract would have meant a 'custom' architecture or not; and in terms of implementation, they would have had ultimately no less time than MS/ATI themselves had, so no more or less 'rushed' either.
I doubt they had the same amount of time. AFAIK Sony did not intend to release PS3 as late as it did, so NVidia's deadline was probably only a few months after ATI's. However, ATI put out their press release 16 months earlier than NV. The design time for Xenos was already surprisingly short for such a radical architecture, so expecting NVidia to do something completely custom in a year less is unreasonable, IMO. To give you some examples, NVidia said G80 was 4 years in the making. I heard about ATI's unified shader architecture back in 2001. GPUs aren't easy things to create.

Suppose NVidia had a better design, e.g. leveraging eDRAM and G80. Even if they had a design proposal ready and put a decent amount of thought into the architecture, there's a lot of expensive design, engineering, and testing to be done before the part will be ready. The original PS3 schedule wouldn't allow it, so this kind of advanced design wasn't possible. That's what all of us mean when we say "last-minute" or "rushed".

This is not to say G7x technology is a bad choice. Even today, I'd say it was the best choice of all existing PC GPU architectures out there if you look at them as is. It's just that there are aspects of console requirements that suggest a few modifications would have been better.
 
Suppose NVidia had a better design, e.g. leveraging eDRAM and G80. Even if they had a design proposal ready and put a decent amount of thought into the architecture, there's a lot of expensive design, engineering, and testing to be done before the part will be ready. The original PS3 schedule wouldn't allow it, so this kind of advanced design wasn't possible. That's what all of us mean when we say "last-minute" or "rushed".

Hmmm, see, but you and I don't agree on when NVidia entered the picture - the press release means little IMO. I know what "all of you" mean when you say 'last minute' and 'rushed,' so there's no confusion there. :) It's simply that since I subscribe to a different scenario, I hope you understand why I would think the situation was not rushed. Anyway, it is what it is. Ultimately it doesn't much matter - ATI simply has a better track record as a partner in being willing to work around their clients' needs rather than having their clients contort around their own present offerings a la NVidia. I have little doubt though that if Sony and NVidia continue on with PS4, it will be a "real" effort for the next go around.

Now, how the scene will look five years from now in the graphics space specifically - let alone computing in general - makes me wonder exactly what (and who) will be constituting the GPUs of both MS' and Sony's next consoles.
 
Now, how the scene will look five years from now in the graphics space specifically - let alone computing in general - makes me wonder exactly what (and who) will be constituting the GPUs of both MS' and Sony's next consoles.

yep

I have little doubt though that if Sony and NVidia continue on with PS4, it will be a "real" effort for the next go around.

Are you implying that nvidia while having enough time to do something "real" decided instead to shortchange Sony with RSX? ;)

jk - don't answer that. :smile:
 
Suppose NVidia had a better design, e.g. leveraging eDRAM and G80. Even if they had a design proposal ready and put a decent amount of thought into the architecture, there's a lot of expensive design, engineering, and testing to be done before the part will be ready. The original PS3 schedule wouldn't allow it, so this kind of advanced design wasn't possible. That's what all of us mean when we say "last-minute" or "rushed".

I see what you mean, but even given the same time as ATI, I don't see how it's possible for nVidia to do a G80 for PS3 since it's radically different from their previous products. As you highlighted, more testing would at least be needed (Not to mention scaling down and customizing for console requirements). Even today, they seem to be rushing to fix their Vista driver issues.

In my understanding, ATI was on the verge of bringing out Unified Architecture, hence MS was able to capitalize on it *at the right time* for Xbox 360 first. Perhaps, ATI should be credited for its ability to come up with UA early and Xenos within a short time, rather than nVidia being rushed.

In retrospect, I find it interesting that nVidia ends up with an Unified Architecture part on the PC earlier than ATI. It's just different priority by different companies.

The G7x architecture was/is the best option for PS3. Sony further added the Cell <--> RSX interaction in line with its vision (Remember there is still a SCC somewhere in the mix). I hope people who attend the RSX session can update us on its low-down after GDC.
 
The less effective performance of devices with graphics processors competing against CLX2, Kyro 2, or MBX demonstrates the ignorance such manufacturers have of PowerVR's real-world benefits. PSP is no different: with probably whole multiples more silicon and higher power consumption than MBX, the PSP's GPU still doesn't perform as well for AA, color, and 3D depth, and lacks competitive functionality for MBX's DOT3 per-pixel lighting, programmable vertex shading, and fractional tessellation plus variable LOD for curved surface rendering.

Imgtec claims that PowerVR's TBDR with SGX sustains a higher level of real-world performance because it's more latency tolerant, and it's certainly been more efficient in the past due to more orderly memory access, single contiguous writes to the framebuffer, decoupling of vertex and pixel processing, etc.
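For anyone unfamiliar with why those claims are plausible, here's a toy sketch of the TBDR flow described above: geometry is binned into screen tiles (decoupling vertex from pixel work), visibility is resolved per tile before any shading happens, and each finished tile is flushed to the framebuffer in one go. Everything here (rectangle "primitives", the tile size, the names) is invented purely for illustration; real hardware does this with triangles and on-chip tile memory.

```python
# Toy tile-based deferred rendering (TBDR) sketch.
TILE = 4          # tile edge in pixels
W, H = 8, 8       # framebuffer size

# A primitive: (x0, y0, x1, y1, depth, shade). Lower depth = nearer.
prims = [
    (0, 0, 8, 8, 0.9, "sky"),     # far background covering everything
    (2, 2, 6, 6, 0.5, "wall"),
    (3, 3, 5, 5, 0.1, "player"),  # nearest
]

# Pass 1 (geometry): bin each primitive into the tiles it overlaps.
bins = {}
for p in prims:
    x0, y0, x1, y1, depth, shade = p
    for ty in range(y0 // TILE, (y1 - 1) // TILE + 1):
        for tx in range(x0 // TILE, (x1 - 1) // TILE + 1):
            bins.setdefault((tx, ty), []).append(p)

# Pass 2 (per tile): resolve visibility "on chip", shade each pixel once,
# then flush the finished tile to the framebuffer in contiguous writes.
framebuffer = [[None] * W for _ in range(H)]
for (tx, ty), tile_prims in bins.items():
    tile = [[None] * TILE for _ in range(TILE)]
    for py in range(TILE):
        for px in range(TILE):
            x, y = tx * TILE + px, ty * TILE + py
            # Hidden-surface removal before shading: only the nearest
            # primitive covering this pixel is shaded; occluded fragments
            # never cost any pixel work.
            covering = [p for p in tile_prims
                        if p[0] <= x < p[2] and p[1] <= y < p[3]]
            if covering:
                tile[py][px] = min(covering, key=lambda p: p[4])[5]
    for py in range(TILE):  # one contiguous copy-out per tile row
        framebuffer[ty * TILE + py][tx * TILE:(tx + 1) * TILE] = tile[py]

print(framebuffer[4][4])  # center pixel: nearest primitive wins
```

An immediate-mode renderer would instead shade all three overlapping primitives at the center pixel and let the depth test discard two of them afterwards; the deferred scheme above shades once per pixel, which is where the bandwidth and efficiency claims come from.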
 
The less effective performance of devices with graphics processors competing against CLX2, Kyro 2, or MBX demonstrates the ignorance such manufacturers have of PowerVR's real-world benefits.
*sigh*

Every time I tell you why you're wrong, you ignore my points repeatedly. Go read this thread again, especially this post. Please stop this "TBDR destroys IMR" nonsense.
 
You know, PowerVR was in PC graphics from the start, and they still couldn't compete driver wise with nvidia.
1. I totally disagree.
2. PC drivers are irrelevant for a closed box. That was my point actually.
Fox5 said:
Also, most companies at least targeted a few popular benchmark games and made those run well, no game ran well on XGI or S3 hardware, and XGI at least had tons of hacks in their drivers to try and make games run better
Yes. PC drivers are hard work to write and maintain because so many different applications already exist. If you have to start somewhere, you're well advised to go for something that makes an impact on the marketability of your product, i.e. popular benchmarks.

"Benchmarks first" is a real-world symptom of the point I have been riding on all the time: the PC grraphics market is a pita to enter, and no complete indicator for closed-box competitiveness.
 
1. I totally disagree.
2. PC drivers are irrelevant for a closed box. That was my point actually.
Yes. PC drivers are hard work to write and maintain because so many different applications already exist. If you have to start somewhere, you're well advised to go for something that makes an impact on the marketability of your product, i.e. popular benchmarks.

"Benchmarks first" is a real-world symptom of the point I have been riding on all the time: the PC grraphics market is a pita to enter, and no complete indicator for closed-box competitiveness.

You appear to have missed my points.

Whether you agree or not, you can't deny that PowerVR was one of the old boys in the 3D card business, at least from the time period that mattered; they had their first card not long after 3dfx.

Also, my point with the benchmarks was more of a reference to PowerVR's drivers back in the day: up until the Kyro cards, they'd pick one or two benchmark games, optimize for those, perform great in them, and otherwise not be so hot. XGI couldn't make their cards look competitive in even a single, completely static benchmark case while significantly lowering image quality, i.e., it wasn't the drivers alone that made them fail in the market. If they had had competitive performance in at least one case, they would have lasted a bit longer than they did. PowerVR was around for a good few years off of just that.
 
PowerVR was great in the late 1990s when Nvidia was 'growing up', but since Nvidia basically absorbed much of SGI (the parts ATI didn't get), Nvidia became king and then co-king with ATI, while PowerVR has not had a high-end desktop offering this decade. PowerVR has proven technology but was either unable or unwilling to go into the high end after PMX / PVR250 / Neon250.


*waits for Lazy8s to correct me*

PowerVR might have the tech today to take a huge dump on everyone else, but they're the poster guys for not using what they've got on a large scale.

Sure, they've got Series 6, 7 and 8 in development or on the roadmap, but when are we gonna see something other than parts for handhelds/mobile devices?

Even Sega's low-end Aurora board with MBX technology is nowhere to be seen, never mind a high-end NAOMI 3.
 