Is PS3's scaler working now?

One other theory is that the chip in the original PS3 was a stop-gap: a thrown-in, oversized processor with most of its functions ignored, and the new chip isn't a 65nm variant but a cut-down processor that has lost all the unused transistors from the original chip. In that case the ratio of shrinkage would just be coincidental. That raises the question of why such a stop-gap chip was used, though.

Cost and time.

Developing any chip on a high-end process is very expensive ($15 million+) and takes a long time. If Toshiba were developing an I/O chip anyway (it looks perfect for a TV), then Sony could use it and put their efforts into developing a cost-optimised 65nm part. They may be taking a financial hit by using it, but the 90nm PS3 was probably never meant to be produced in large numbers anyway, i.e. 6 million isn't very big compared to the 100M the PS2 sold...

Many embedded processors have all manner of weird and wonderful I/O ports, and it's highly doubtful that anyone uses all of them. It's cheaper to develop a single one-size-fits-all part than a series of specific chips.
 
One other theory is that the chip in the original PS3 was a stop-gap: a thrown-in, oversized processor with most of its functions ignored, and the new chip isn't a 65nm variant but a cut-down processor that has lost all the unused transistors from the original chip. In that case the ratio of shrinkage would just be coincidental. That raises the question of why such a stop-gap chip was used, though.

The SCC springs to mind (made by Toshiba, same size, etc.). I think Adex's reasoning makes sense as well, but DeanA is very persistent in shooting down the SCC theory. It all begs the question: what is all that silicon needed for?
 
What does it mean that the chip was shrunk in the PAL version? Was this the first chip to be moved to 65nm?

While we don't have official confirmation, we're fairly certain it is indeed the first chip to move to 65nm (the size reduction just works out perfectly).
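As a sanity check on the "works out perfectly" claim, here is the back-of-the-envelope arithmetic (illustrative only): an ideal optical shrink from 90nm to 65nm scales linear dimensions by the ratio of the nodes, so die area scales by the square of that ratio.

```python
# Ideal die shrink from 90 nm to 65 nm: linear dimensions scale by
# 65/90, so area scales by the square of that ratio.
linear = 65 / 90
area = linear ** 2   # ~0.52, i.e. roughly half the original die area
```

Real shrinks rarely hit the ideal exactly (pads and analogue blocks don't scale), which is why a measured size reduction that closely matches this ratio is taken as evidence of a straight process shrink rather than a redesigned, cut-down chip.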
 
Is there any word on what process Toshiba are using for SCC? I couldn't find anything about a 65nm SCC with a quick Google.
 
Here are a few things that are known about the SCC:

[Image attachment: sccyr6.jpg]


A lot of stuff in there, quite a lot of redundant stuff from the PS3 point of view.

[Image attachment: scc1yt7.jpg]


It may be worth noting that the die picture is from June 2005; a move to 65nm by now is not unreasonable.

Edit: The northbridge may be the first choice to move to 65nm, as it is the smallest and very likely runs at a lower frequency than the RSX (the SCC runs at 333 MHz), i.e. it should be the easiest one to achieve acceptable yields with.
 
I found this article as well, which indicates that Toshiba designed the SCC as a high-volume part to be included in a variety of products:

"Toshiba believes that nothing is more important to achieving reasonable development costs and shorter time to market with complex SOC’s, than the IP-core interface scheme. For this reason we have chosen OCP as the standard for Toshiba in the SCC," said Takashi Yoshimori, Technology Executive SOC Design Toshiba Corporation. "The SCC is another example of a high-volume production product based on the OCP specification."
 
Don't nearly all notebooks do scaling on the GPU? I know it says that in NV's control panel, and I'm pretty sure the Radeon GPUs do the same. I don't think there has ever been a performance hit due to this functionality. And it certainly has improved in quality over the years, most obviously compared to stuff from the '90s with horrible aliasing issues. It's just something else that got integrated and improved a long time ago.

So I still wonder why doing it on RSX would be shocking at all, or even bad? It seems to be a hardware function that Xenos does through its "display pipeline".
 
So I still wonder why doing it on RSX would be shocking at all, or even bad? It seems to be a hardware function that Xenos does through its "display pipeline".
It's not shocking at all as long as you have a dedicated unit for it; otherwise you have to sacrifice some rendering time and memory to scale your image.
 
So we're no closer at all to solving the great scaler mysteries.

What I'm getting is 360 has some form of dedicated hardware, but it's not in ANA/HANA, it's in the GPU.

PS3 most likely doesn't have dedicated hardware? (This is what I get from nAo being coy..)
 
Don't try to read into my words; I was simply explaining why a 'real hw scaler' is preferable to a pure sw solution.
 
So we're no closer at all to solving the great scaler mysteries.

What I'm getting is 360 has some form of dedicated hardware, but it's not in ANA/HANA, it's in the GPU.

PS3 most likely doesn't have dedicated hardware? (This is what I get from nAo being coy..)

There is a thread discussing this on AVS Forum in which Darknight (someone very knowledgeable about the PS3, who looks to be either a developer or an insider) claims there is indeed a hardware scaler in the PS3. I believe him and the information he provided.

Link

It's funny how I mentioned before the PS3 launched that there was no guarantee that games would be scaled. In fact, it was up to the developer what resolution the game would output, regardless of what the dashboard was set to. Nobody believed me then and insisted they would scale like the 360. Then everyone shouted that there's no hardware scaler, and I insisted there was. Now look what has come out. I don't know how much more obvious it can be that I do know what I'm talking about.

There is a much more logical reasoning as to what is going on here, which unfortunately I cannot divulge, since I will not release direct info found on the development site, but nobody seems to have hit on it yet. Maybe it's too obvious, but there is a logical reasoning behind all this. Maybe it's obvious to me since I know why, but I'm surprised nobody has pointed it out.
 
What possible logical reasons are there to have a working scaling chip in PS3 and yet not use it? :???:

Various theories exist - this is Sony after all. One popular theory is that Sony didn't want developers making native 720p games and using the scaler to get cheap 1080p results.
 
That makes sod-all sense though, given the trouble with video output to various TVs. Hoping to sell TVs on the strength of not supporting scaling for existing TV owners is ludicrous, even more so when they've enabled half the scaling! You get a half-baked upscaling solution, so TVs are supported with a bit of developer faff, but at lower quality than true scaling support would give.
 
Various theories exist - this is Sony after all. One popular theory is that Sony didn't want developers making native 720p games and using the scaler to get cheap 1080p results.
Where is this theory popular? Insanityland?
 
Where is this theory popular? Insanityland?

Well, it's been known for some time that Howard Stringer and Ken Kutaragi both vacation on Insanityland.

As for the theory, it was one of the more popular ideas presented when we had this discussion on AVS Forums. There were a lot of theories bandied about. Another was that the firmware was immature and didn't have the functionality fully exposed. I don't think anyone would dispute that there is a lot of hardware that has yet to be tapped by the firmware/OS.
 
I've always heard that it was put in place when Cell was originally supposed to handle all graphics functions. When they later added the RSX, it sat further along the data path, and to take advantage of the scaler the data would have to go all the way back to Cell and come back out again.
 
Various theories exist - this is Sony after all. One popular theory is that Sony didn't want developers making native 720p games and using the scaler to get cheap 1080p results.
People with 1080p TVs are just going to have their TV do the scaling, so I doubt Sony would even care. Chances are most of these TVs will have scalers comparable to the one that might be in the PS3, so they probably wouldn't worry about the image quality.
 