MS's "secret weapon" against the PS3 (Arstechnica)

1. I don't know.
2. Yes, presumably.
3. Right now I think the ANA handles the digital to analog conversion as well as the scaling? I don't know at what level the ANA operates.
4. I'm sure Microsoft can create an HDMI supporting 360 in the future if they want to. They will wait for a while though, as it will cost them a significant amount of money. As long as they think they can get by without HDMI, they will.
5. Scalers are meant to improve IQ, so you'd hope it does affect IQ. ;) However, if developers are relying on the scaler for their games, results may vary.
Thanks for the answers ...

My 5th question was about DVD upscaling, I forgot to mention it... Doesn't upscaling DVDs reduce image quality? Does higher resolution always mean better quality? [As I said, still a noobie on technical issues :cry: ]

BTW, can any devs answer/confirm these? [Forgive me Arwin if you are a dev :D ] ...
 
Regardless of the fact that Faf was replying to zeckensack's comment related to automatic supersampling, the statement that "the majority of 360 titles he's seen to date run 480 native" is what swanlee was commenting on. If Faf's meaning was that 360's games can be output natively at 480p, as opposed to PS3's, then that is a different statement, and one that does not come across clearly, FWIW. No disrespect intended towards Faf in either case, because technically he may be correct.
It was clear to me. Zeckensack was saying PS3 didn't have supersampling for SDTV. Faf replied that XB360 renders for SDTV at native resolution without supersampling, whereas a lot of PS3 games are supersampling (not rendering 480p native, but rendering at higher resolutions and downsampling) without the addition of scaling hardware.

Which for me was one of the biggest advantages of a scaler, and at this point I wonder if XB360's is actually broken? Why not downsample, rather than render in grotto-vision? There were excuses for the early titles, but a year on they're still rendering to SDTVs at SDTV resolutions, no?
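To make the supersample-then-downsample idea concrete, here is a minimal sketch (illustrative Python, not anything from either console's actual pipeline) of the averaging step that turns a higher-resolution buffer into an SDTV-sized one:

```python
# Sketch: 2x2 box-filter downsample -- the "render high, average down to
# 480 lines" step under discussion. Purely illustrative; real hardware
# uses dedicated filter logic, not per-pixel Python.

def downsample_2x2(pixels, width, height):
    """Average each 2x2 block of a grayscale image (flat list of ints)."""
    out = []
    for y in range(0, height, 2):
        for x in range(0, width, 2):
            block = (pixels[y * width + x]
                     + pixels[y * width + x + 1]
                     + pixels[(y + 1) * width + x]
                     + pixels[(y + 1) * width + x + 1])
            out.append(block // 4)
    return out

# A 4x2 image downsampled to 2x1:
print(downsample_2x2([0, 4, 8, 12, 0, 4, 8, 12], 4, 2))  # -> [2, 10]
```

The point of the averaging is that each output pixel carries information from four rendered samples, which is exactly the antialiasing benefit being credited to the supersampled PS3 titles above.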
 
I think this point is debatable; there are plenty of us early HD adopters that understand why Sony is pursuing this route, and give them a pass on it based on the other things they have included in the system in its place. On the other hand, obviously it is causing consternation among some.

Agreed, in the larger scheme it is not that big of a deal, but...

when taken within the context of being a "HiDef solution to gaming", it seems not including out-of-the-box scaling is a pretty big oversight. Most every other hidef device, including my cable box, has a scaler (the one in my SA 8300 is noticeably better than the one on my 1080i CRT in fact, as is the scaler in the 360).

Again... this will probably be forgotten over time, but at the moment it is an issue to some who (rightfully so) don't want to have to think about or deal with this kind of issue when they get their brand new $500 box home.
 
How about the cost?
How about it?

Scaling doesn't imply any extra cost, whether you have HDMI or not.

Does it mean the scaler has no analog part at all?
Scaling can be implemented just as part of the display pipeline - as I previously pointed out with the diagram of the AVIVO display pipeline:

[Image: pipes.jpg - AVIVO display pipeline diagram]


Note that there are effectively two parts in that diagram - the "display pipelines" then the "display outputs". As the display pipelines are before the outputs, you can see that these are a digital process; you can tack any type of display output you want after that (i.e. replace one of those TMDS with HDMI in that diagram). The display pipeline does not need to be part of the graphics chip at all; it can be a completely separate ASIC(s) (as, indeed, it is with GeForce 8800) and you can bung in whatever output types you want.

Scaling itself is pretty much like texturing, or even resizing an image in a photo editor software package - you can apply certain filters to it in order to best stretch/shrink the input pixels to the desired output resolution.
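As a rough illustration of that "apply filters to stretch the input pixels" idea, here is a toy linear-filter resample of a single scanline (plain Python, purely illustrative - real scalers use more sophisticated multi-tap filters):

```python
# Sketch: linear-filtered resize of a 1D scanline. Each output sample
# blends the two nearest input samples, exactly the texturing-style
# filtering described above. Illustrative only.

def resize_linear(src, out_len):
    """Resample a list of sample values to out_len entries with linear filtering."""
    out = []
    scale = (len(src) - 1) / (out_len - 1) if out_len > 1 else 0.0
    for i in range(out_len):
        pos = i * scale               # position in the source scanline
        left = int(pos)
        right = min(left + 1, len(src) - 1)
        frac = pos - left             # blend weight between neighbours
        out.append(src[left] * (1 - frac) + src[right] * frac)
    return out

# Stretch 3 samples to 5 (think: a slice of a 720 -> 1080 scanline):
print(resize_linear([0, 10, 20], 5))  # -> [0.0, 5.0, 10.0, 15.0, 20.0]
```

The same loop run "backwards" (more input samples than output samples) shrinks instead of stretches, which is why one filter block can cover both directions.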

Indeed. Given the very specific audio/video output requirements of PS3 (HDMI/HDCP, multi A/V out, optical out etc), it shouldn't come as a surprise that a different display output system to that found on nVidia G70 boards is used on PS3..

There's no reason why that has to be the case though.

Probably the biggest advantage of the 360's scaler right now is for game-programmers, who can achieve better frame-rates by 'cheating' and using a lower resolution. If you can't tell that the image has been upscaled this way, then who's complaining?
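Back-of-envelope numbers for that "cheat": taking the roughly 1064x600 figure quoted elsewhere in the thread as an example, the pixel-count (and hence rough fill/shading) saving versus full 720p works out like this:

```python
# Rough arithmetic behind rendering low and upscaling. Pixel count is
# only a first-order proxy for fill/shading cost, but it shows the scale
# of the saving. The 1064x600 figure is the thread's CoD3 estimate.
full_720p = 1280 * 720            # 921,600 pixels
reduced = 1064 * 600              # 638,400 pixels
saving = 1 - reduced / full_720p
print(f"{reduced / full_720p:.0%} of the pixels, ~{saving:.0%} less fill work")
```

About a third less per-pixel work per frame, which is real headroom for frame rate or extra effects.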

From an end-user perspective it's quite nice that my TV isn't flicking between resolutions each time I use a source with a different resolution, though.
 
Scaling itself is pretty much like texturing, or even resizing an image in a photo editor software package - you can apply certain filters to it in order to best stretch/shrink the input pixels to the desired output resolution.

Sounds like something CELL should be good at ... why isn't it implemented as such in games separately?
 
IMO this talk about the missing scaler is redundant without some sort of basis for how many are affected by this problem.

The only people seriously affected are those that have a TV that only accepts a 1080i or 480p signal. I'm not sure about the US, but in Europe these TVs are clearly not "HD-Ready" - something that is obviously a requirement if you want to enjoy full definition on PS3.

1080i is an HD resolution *worldwide* and in fact it was the highest resolution nominally available prior to the introduction and general release of consumer 1080P panels in 2006.

If I had the choice, I'm not really sure I would choose a scaler over a technical requirement for each piece of software to support at least a few resolutions, especially when the market you are selling to features equipment with a good-enough scaler of its own to do the job.

It may have been a price thing. Based on the rasterization demos by Cell early in its life... I'm sure that it is a surmountable problem... very soon also. However, since when does flexibility and developer choice not appear good? MS also had the 720P "standard" and the good that did...

Only nerds and geeks investigated the actual resolution of PGR3 and poo-pooed an awesome looking and playing game. Do you not get your money's worth if it's not at a certain resolution? If the dev chose to lower the resolution to have some flexibility with other things, that doesn't sound so bad... maybe there is a time or art constraint as opposed to a technical constraint, who knows?

What is the actual native resolution of Gears? That should indicate to us if these consoles can really output at their stated resolutions as opposed to using a "scaling trick." If Gears is actually 720p native, then whatever wasn't before it had other limitations not related to hardware...

The flexibility and complexity of using SPE's for aiding graphic creation is good, but a rigid fairly inflexible output requirement is also good? :???: Something about that seems off...

In any case, I predict that this will be something that won't be mentioned in 1 or 2 years anymore and will be long forgotten (with or without a software solution by Sony)...

Agreed and agreed. Xbd was on point with his comment.
 
"What happens with these TVs if you view a TV station that signals at 720p?"

That's the issue. HD cable boxes have scalers built in, so say you're watching Fox HD on a 1080i-only HDTV; Fox broadcasts HD in 720p, and you can set your cable box to scale all HD content to 1080i. My Comcast HD cable box does this perfectly fine.

Nearly every HD output device has some sort of scaler to deal with 1080i-only HDTVs, or HDTVs with bad scalers in them, so you can feed each one its native resolution.

Sony is an HDTV maker; there is no excuse for them not to know that HD output devices should have a scaler in them.
 
That would presuppose a) there is a display pipeline in RSX, and b) if there is a display pipeline, that it is being used.

IIRC, comments in the "RSX Vertex limited" thread mentioned that the AA downsampling at the DAC that is used on NVIDIA PC chips isn't in operation, so I would guess that NVIDIA's display pipeline isn't in operation.
Alright! I get it..I think. :cool:

That really does make some sense then. Thanks for the advice. I try to learn about this stuff and your post helps a lot.

Anyway, I don't see the point in scaling the image all over the place. Surely if someone has a 1080p TV the built-in scaler ought to be good enough. My TV is 16:10 and I have my 360 set to 720p. It works great. I don't even notice the image is a little stretched (I hate black bars). Outputting in native 720p without faffing about so much just seems good enough for me.

Peace.
 
1080i is an HD resolution *worldwide* and in fact it was the highest resolution nominally available prior to the introduction and general release of consumer 1080P panels in 2006.

I should have worded that more carefully (the quotation marks weren't enough, I guess) -> "HD-Ready" is a standard or certificate here in Europe (I wasn't implying that 1080i is not an HD resolution). I'm not sure of all the bullet points, but from what I recall (I'm sure London-Boy could be more specific about it, and I'm too lazy to go find a link on the certificate), it requires the display to have at least an HDMI or DVI input, and the display must be able to support 720p and 1080i signals.

It may have been a price thing. Based on the rasterization demos by Cell early in its life... I'm sure that it is a surmountable problem... very soon also. However, since when does flexibility and developer choice not appear good? MS also had the 720P "standard" and the good that did...

Only nerds and geeks investigated the actual resolution of PGR3 and poo-pooed an awesome looking and playing game. Do you not get your money's worth if it's not at a certain resolution? If the dev chose to lower the resolution to have some flexibility with other things, that doesn't sound so bad... maybe there is a time or art constraint as opposed to a technical constraint, who knows?

What is the actual native resolution of Gears? That should indicate to us if these consoles can really output at their stated resolutions as opposed to using a "scaling trick." If Gears is actually 720p native, then whatever wasn't before it had other limitations not related to hardware...

The flexibility and complexity of using SPE's for aiding graphic creation is good, but a rigid fairly inflexible output requirement is also good? :???: Something about that seems off...

I'm not sure how accurate I am, since I haven't visited B3D for a while and am not quite up to date with the latest, but I do seem to recall that there is some sort of scaler in the PS3 that isn't allowed to be used. I presume it's not allowed to be used since it's going to be phased out in future chip revisions to lower the price of the hardware.
 
"Nearly every HD output device has some sort of scaler to deal with 1080i only HDTV's or HDTV with bad scalers in them so you feed it it's native resolution.

Sony is an HDTV maker their is no excuse for them not to know that HD output devices should have a scaler in them.


I agree 100%. I hope Sony can at least talk about this problem soon. Then fix it this year.
 
Any answers on my '1080p games' question? Are there games that are locked into 1080p the way Resistance is locked into 720p? What would happen if one were to play these games on a 720p display that can't accept a 1080p input? Would they display at 480p?
 
No, there are no games like that. If a game runs well at 1080p then implementing a 720p rendering path is trivial.

And from way back:
Has that ever been confirmed in regards to the resolution CoD3 runs on the 360?
I haven't seen any confirmation as to the exact resolution, but I rented the game over Christmas and saw that it is quite clearly running at somewhere right around 1064x600.
 
Oh yeah:
What is the actual native resolution of Gears? That should indicate to us if these consoles can really output at their stated resolutions as opposed to using a "scaling trick." If Gears is actually 720p native, then whatever wasn't before it had other limitations not related to hardware...
Gears is 1280x720 in anything other than 480-line modes, and 960x720 in 480-line modes. But that is just a matter of how they chose to do their rendering, whereas Condemned came out at launch rendering at 1280x720 regardless of what output mode you choose.
 
I haven't seen any confirmation as to the exact resolution, but I rented the game over Christmas and saw that it is quite clearly running at somewhere right around 1064x600.

Being as I have already acknowledged your eagle graphics eye :smile:, I believe you know what you're talking about, but how on earth can you possibly tell that, when it is being output (scaled or rendered) at 720p or 1080i/p depending on your settings?
 
There's no reason why that has to be the case though.
Depends on your point of view. I would expect that from SCE's point of view, having those outputs would be a requirement though. As would supporting output from the combined EE+GS hardware in current PS3 revisions. If nVidia had stuff that handled this that they could tack onto the RSX without having to worry about increased development time, more $$$s, more die space requirements, and supporting a separate GS path, then sure.. maybe it would have made sense. But in my view (regardless of the whole scaling issue, which I can't really comment on) separating the display controller logic from RSX was/is the sane thing to do.

Cheers,
Dean
 
Actually, Faf did say exactly that. And there is NOTHING in his statement that says anything about his comment being particular to SDTVs. That seems to be just editorial on your part... for whatever reason. Be accurate in your corrections and criticism at least.

Blakjedi, if you look back at what Faf was replying to, it was a response regarding 480p SDTV quality (downscaled vs. non-downscaled). He was just being concise. I don't think he was saying most 360 games are upscaled from 480p; at least, that isn't what he intended to say.

Indeed. Given the very specific audio/video output requirements of PS3 (HDMI/HDCP, multi A/V out, optical out etc), it shouldn't come as a surprise that a different display output system to that found on nVidia G70 boards is used on PS3..

Cheers,
Dean

Thanks DeanA, I think that answers my question. :smile:

Sounds like something CELL should be good at ... why isn't it implemented as such in games separately?

Yes, let's use a couple of SPEs for vertex shaders, another for scaling the image, another for the OS... oh wait, what do we have left for the game? Cell is a great architecture, but if you are constantly calling on Cell to do the work of other components out of necessity (and not by choice), I see that as a problem. Obviously using them for VS work is a choice, but the idea that Cell should just pick up the slack means less of Cell left for the important things... like game code and new techniques to razzle and dazzle us.
 
being as I have already acknowledged your Eagle-graphics-eye :smile:, I believe you know what you're talking about, but how on earth can you possibly tell that when it is being output (scaled or rendered) at 720p or 1080i/p depending on your settings?
Over the years I've grown accustomed to the differences in the look of PC games at various resolutions, which of course I set myself when playing PC games, so those same differences in rendering resolutions are easy to spot when playing games on the 360. As for the scaling, all sources get rescaled to my fixed-pixel display in the long run, so whether they get upscaled a bit by the 360 first makes little difference in the resulting fidelity.
 
Yes, let's use a couple of SPEs for vertex shaders, another for scaling the image, another for the OS... oh wait, what do we have left for the game?

I'm assuming it won't take up a whole SPE (and perhaps it will even be trivial), and you're assuming it will. I freely admit I have nothing concrete to back up my assumption; what about you?
 
I've no clue how much of an SPE it would take to scale one resolution to another, but it would also take some bus bandwidth and RAM as well, which could well cause issues with games that were or are being developed without such limitations in mind.
 
Sounds like something CELL should be good at ... why isn't it implemented as such in games separately?
The actual logic of the scaling would be fairly easy; it's more a case that you'd be bouncing the framebuffer data all over the place. RSX has to render the frame; when the frame is complete, it would have to be passed back across FlexIO, possibly to system memory, processed, possibly sent back to system memory, and then passed back out to the display controller (assuming Cell has no direct path to it). You are wasting bandwidth and resources.
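A rough, purely illustrative estimate of what that bouncing costs in bus traffic, assuming a 1280x720 32-bit framebuffer at 60 fps and two extra trips per frame (RSX to memory, then memory back out toward the display):

```python
# Back-of-envelope bandwidth cost of a Cell-side scaling pass.
# All figures are illustrative assumptions, not measured numbers.
width, height, bytes_per_pixel, fps = 1280, 720, 4, 60
frame_bytes = width * height * bytes_per_pixel   # one RGBA 720p frame
trips = 2                                        # in to Cell, back out again
traffic_gb_per_s = trips * frame_bytes * fps / 1e9
print(f"~{traffic_gb_per_s:.2f} GB/s of extra bus traffic")
```

Not a huge slice of FlexIO's headline bandwidth on its own, but it's sustained traffic contending with everything else the game is already pushing over the same buses, which is the objection being made above.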

Depends on your point of view. I would expect that from SCE's point of view, having those outputs would be a requirement though. As would supporting output from the combined EE+GS hardware in current PS3 revisions. If nVidia had stuff that handled this that they could tack onto the RSX without having to worry about increased development time, more $$$s, more die space requirements, and supporting a separate GS path, then sure.. maybe it would have made sense.
Oh, I mean even with the display outputs that have actually been selected, it can still be part of the graphics processor. Most graphics processors have a DVO or digital input as well, for passing in video data or even data from a second graphics board (SLI/Crossfire), so there's no reason why the graphics controller couldn't be used for the EE+GS output as well (and still scale and process it). Die size would be similar, as the analogue circuitry of the output side of things doesn't really shrink well (in fact, you may even be pad-limited in the second-chip case, so you may be wasting silicon), and you would save on having to manage extra chip inventory, packaging, etc.

I suspect the reason it wasn't done was that NVIDIA's current pipeline, and even their roadmapped pipeline, doesn't match the exact requirements wanted, and it would have cost more to implement it. It may also have been a case of time: a separate display controller would be quicker/easier to bring up independently. I'd heard that the dev boxes that went out used a Silicon Image chip for HDMI output; however, I fail to see this on the PS3 motherboard pics, so I would guess that the display controller was one of the last chips to have been done.
 