Kaz Hirai (SCEA) Interview - 1up/EGM

Bobbler said:
On a side note: I don't see any reason that Xenos couldn't output at 1080p (outside of the output chip that doesn't support it)
Incorrect. The Xenos GPU can quite happily render 1080p; presumably it can render any size, given its tile-based rendering. It's the output chip MS have chosen, sourced from elsewhere than ATI, that doesn't support 1080p output. This could easily be changed. I guess the decision was made for cost reasons, though I don't know the price difference between 720p/1080i output and 1080p output.
 
Qroach said:
what makes you think 1080p will be used in movies? Movies only need to be 30 fps, and they don't need 1080p for that.

I guess you'd only need 720p for games, then, and should disable it while watching movies. Even better, people are paying too much for their 720p sets when they could have settled for 480p/1080i ones.

Going progressive should reduce flicker artifacts regardless of source framerate.
 
your GPU supports MSAA with HDR. But regardless of the costs, the question is: does it matter? Are the visuals at 1080p on a PS3 that much inferior to 720p images that devs would be better off supporting the lower res?
Well, now that's a big problem for the RSX if it's like the G70. However, HDR at 1080p will require more resources and memory bandwidth than HDR at 720p.

As for the visuals, they could be. 1080p takes more fillrate than 720p, which means that in more complex games, like say Kameo, they would not be able to draw as many characters on screen at once at 1080p as at 720p.

Having seen HS, I can't say I'm desperately disappointed at a lack of creatures on screen. At the moment you're only speculating that 1080p will look worse and so is a waste of time, but maybe that's not the case. If it's a choice between 10 baddies at 1080p and 20 at 720p, I'd hate to see 1080p used. But if it's a difference between 500 and 600 baddies, who cares?!
I do. It's not just the number of creatures, it's the number of shaders. 1080p has a lot of advantages, but it also has disadvantages.

But it's not wasted, as you called it. Maybe it could be used more efficiently, but if the games still look amazing at 1080p, how much more difference would targeting 720p instead make? Let's take HS as a real-world example. What could be done differently to benefit that game now if they dropped from a 1080p image? Especially when the game engine has been expanded to fully use the Cell.

Of course it's wasted.

The majority of users for the next few years will still be on SD. A small number will have 720p/1080i displays, and a very, very tiny number of people will be at 1080p.

Most people that use the PS3, over the majority of its lifetime, will never see a 1080p image at its native res.

 
But by that reasoning, MS forcing 720p is a waste of resources that most users will never see. At least Sony give the option (so far) of outputting a lower frame size and providing even more shader power etc., along with the option of higher quality output for those games that aren't thrashing the shader pipes, which would be nice for those who splashed out on the better quality screens.
 
Shifty Geezer said:
But by that reasoning, MS forcing 720p is a waste of resources that most users will never see. At least Sony give the option (so far) of outputting a lower frame size and providing even more shader power etc., along with the option of higher quality output for those games that aren't thrashing the shader pipes, which would be nice for those who splashed out on the better quality screens.

You can look at it that way.

I look at it as a good middle ground :)

Anyway, there is no point in 1080p unless Sony makes it mandatory.

Edit: I should say that I also want MS to make 720p with 4x FSAA mandatory.
 
I don't know if anyone thinks there's gonna be lots. The argument here is 'what's wrong with including it as an option for devs to decide whether or not to use it?'
 
onanie said:
blakjedi said:
Just to clarify

1080p on the PS3 is not a negative; it's just not as useful as some people think. 720p is a more useful resolution in general to aim for, as is 1080i. Of course the PS3 can do those resolutions, so it's a wash of an argument.

You are the one extending this argument with your "clarification". Or is it your continued insistence that people should look at things your way? There will be people whose reasons for believing 1080p is desirable are as good as yours for believing otherwise. You should respect that.

Onanie, I don't know you, but I will say this: you seem to have been "born" just yesterday, and seemingly just for this thread, so I'm not really gonna pay attention to you or your ridiculous comments.

The reason why I'm even bothering is that you quote a fair and balanced part of my post as if it were disrespectful. I'm sure your reading comprehension is pretty good... why YOU don't get the part of my post you quoted and understand it is your problem, not mine.

Oh lookee there the thread went onto page 9 without my help... :rolleyes:
 
jvd said:
But those resources at 1080p = AA at lower levels, so they're not 'wasted'. Alternatively HS could be written for 720p and written to have 2x SSAA...

Err, you can most likely do 4x FSAA at 720p for less of a performance cost than going to 1080p. Remember, all your shader work increases at 1080p versus 720p. Also, 1080p is not double the res of 720p:

1920 x 1080 vs
1280 x 720

To get 2x supersampling at 720p you would need to render at 2560x1440 and downsample to 720p. So it's not really 2x, and calling it that is a bit misleading. It's more like 1.5x.

2xSSAA at 720p is 2x the 'samples' being 'downsampled', NOT 2x the horizontal res AND 2x the vertical res. The latter would be 4x the 'samples' being 'downsampled'.

Because 1080p is ~2 megapixels and 720p is ~1 megapixel, downsampling 1080p to 720p is 'equivalent' to roughly 2x SSAA.
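
For those following along, the raw sample counts are easy to check (plain arithmetic, nothing console-specific assumed):

[code]
# Pixel/sample counts behind the 720p vs 1080p vs SSAA comparison above.
p720  = 1280 * 720    # 921,600 px   (~1 megapixel)
p1080 = 1920 * 1080   # 2,073,600 px (~2 megapixels)
quad  = 2560 * 1440   # 3,686,400 px

print(f"1080p vs 720p:     {p1080 / p720:.2f}x the pixels")   # 2.25x
print(f"2560x1440 vs 720p: {quad / p720:.0f}x the samples")   # 4x, i.e. 4xSSAA
print(f"2x SSAA at 720p:   {2 * p720:,} samples")             # 1,843,200
[/code]

So rendering 2560x1440 and downsampling is 4x supersampling, while a 1080p frame carries 2.25x the samples of a 720p one, slightly more than a 2x SSAA buffer.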

jvd said:
I'm rather skeptical as to what sacrifices have to be made to support 1080p vs. 720p. What difference will be noticeable in the shaders or poly counts? And we don't know how well PS3 handles 1080p in the first place! For all we know, 1080p on PS3 looks 10x better than 720p on XB360!
Less work per pixel. Fewer creatures on screen at once, less complex levels and designs.

You need more of every kind of power for 1080p over 720p. There is no getting around that.

It's 'pixel shader' work per pixel that you're trading against the number of pixels on screen. Basically, 2 megapixels means ~half the shader work per pixel compared with 1 megapixel. Fewer creatures on screen at once and less complex levels and designs are orthogonal to the number of pixels on screen.

Now if you don't trade the 'pixel shader' work per pixel for those extra pixels on screen at 1080p, then you need to make trade-offs elsewhere. This would be up to the dev, depending on the game's requirements, on a game-by-game basis...
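
A minimal sketch of that trade-off, assuming a purely made-up fixed shader budget per frame (the constant is invented; only the scaling is the point):

[code]
# Hypothetical fixed shader budget per frame, spread over however many pixels
# the chosen resolution puts on screen. The budget figure is invented purely
# to show the scaling; it is not a real RSX or Xenos number.
SHADER_OPS_PER_FRAME = 2_000_000_000

for name, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080)]:
    pixels = w * h
    print(f"{name}: {SHADER_OPS_PER_FRAME / pixels:,.0f} ops/pixel")

# 720p:  ~2,170 ops/pixel
# 1080p: ~965 ops/pixel -- the same budget spread 2.25x thinner
[/code]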
 
Shifty Geezer said:
Bobbler said:
On a side note: I don't see any reason that Xenos couldn't output at 1080p (outside of the output chip that doesn't support it)
Incorrect. The Xenos GPU can quite happily render 1080p; presumably it can render any size, given its tile-based rendering. It's the output chip MS have chosen, sourced from elsewhere than ATI, that doesn't support 1080p output. This could easily be changed. I guess the decision was made for cost reasons, though I don't know the price difference between 720p/1080i output and 1080p output.

That is what I said ;)
The output chip is what is causing the problem. Xenos itself should be able to do it fine.
 
Sorry :oops:. You mentioned the output chip along with Xenos, rather than
On a side note: I don't see any reason that XB360 couldn't output at 1080p (outside of the output chip that doesn't support it)
So I read it to mean that Xenos and the output chip on it can't support 1080p.
 
Shifty Geezer said:
Sorry :oops:. You mentioned the output chip along with Xenos, rather than
On a side note: I don't see any reason that XB360 couldn't output at 1080p (outside of the output chip that doesn't support it)
So I read it to mean that Xenos and the output chip on it can't support 1080p.

Ah, I considered Xenos to be the shader core and the eDRAM (+logic) only...

At this point, I wonder if MS has considered replacing the actual chip that is forcing the 720p/1080i limit? It seems like it would be a good idea, if only on the marketing side of things. It can't be very costly these days -- it's not like we haven't had 2-megapixel+ resolutions for several years now (on PC video cards, that is).
 
No one can say that it's somehow NOT a good thing to allow a machine 1080p support.

But I will say this: I think MS made a much smarter move by aiming at a realistic standard and then making it mandatory.

To think devs are going to release games at these super-high resolutions without being forced to is overly optimistic, to say the least. Look at past history if you think this doesn't matter; I've said it before and I'll say it again: ~2% of Xbox games used 720p.

In other words, I think the argument should really be this: would you rather have a system with the 'capability' to output 1080p, but with many games at a non-HD 480p? Or a system with a required 720p minimum, ensuring 100% of games are HD, although it doesn't output for the <1% of 1080p owners?

Shifty - if there were great benefits in going to a higher resolution for improved AA, we sure didn't see it last gen. Why not? Around 20 games out of 1100 used 720p, around 2%, so obviously the trade-off was not worth it for the vast majority of developers.
 
Well, for one reason, downscaling on the existing hardware is a costly process compared with having a specialist downscaler on the output hardware, so maybe there was a substantial performance hit? I'm quite shocked at how little AA there was this gen, even in XB titles (from screenshots at least)!

I agree 720p downscaled is a good level of performance. I just think the argument against 1080p, that it offers inferior performance to 720p, is rather countered by the same principle: 720p will eat up shader resources that won't benefit SDTV users. From my POV, the upcoming hardware has enough oomph to cater for 720p that there isn't any more that could be done at lower resolutions to any noticeable benefit (though not everyone agrees with me. Who was it who said at SDTV we can do anything we want, but HDTV places us in the position we are now?), and I can't comment on 1080p performance until I see it in action. I'm being a boring fence sitter and reserving judgement ;)
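
As an aside, here's a toy sketch of the kind of resolve/downscale step being talked about, a simple box filter over integer factors (real output scalers use better filters, and 720p-to-SD isn't an integer ratio, so treat this purely as an illustration):

[code]
import numpy as np

def box_downscale(frame: np.ndarray, fy: int, fx: int) -> np.ndarray:
    """Average each fy x fx block of pixels -- the resolve step a
    supersampled buffer needs before scan-out. Integer factors only."""
    h, w, c = frame.shape
    return frame.reshape(h // fy, fy, w // fx, fx, c).mean(axis=(1, 3))

# Resolving a 2560x1440 (4-sample) buffer down to 1280x720:
ss_buffer = np.zeros((1440, 2560, 3), dtype=np.float32)
print(box_downscale(ss_buffer, 2, 2).shape)  # (720, 1280, 3)
[/code]

Doing this as a full-screen pass on the rendering hardware costs fillrate and bandwidth every frame, which is exactly why a dedicated scaler on the output side is attractive.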
 
scooby_dooby said:
In other words, I think the argument should really be this: would you rather have a system with the 'capability' to output 1080p, but with many games at a non-HD 480p? Or a system with a required 720p minimum, ensuring 100% of games are HD, although it doesn't output for the <1% of 1080p owners?

I think that is possibly the silliest thing I've read in a long time. Did you actually read it after you typed it?

Some of you need to wake up -- the only reason to use a lower resolution is because the system can't handle it. Why would you ever assume there will be games in 480p? You realize how hard you'd have to push the hardware to bring it to its knees rendering a lousy 300k pixels?

The reason to use a higher resolution is because your game can handle it -- it wouldn't even be feasible (without doing stupid and pointless things) to make a 480p game that maximizes the system (PS3 or Xbox 360).

The thought that, because Sony hasn't mandated a resolution, there will be games in 480p and none in 1080p is ridiculous. The fact that MS has mandated a resolution is far more detrimental to games than Sony's approach -- every game must be 720p, and therefore they WILL be wasting the GPU if they don't graphically push it to the limit with shaders (there is no option to use up the extra power by upping the resolution). Meanwhile, on the PS3 a "simpler" game like HS can be made and still stress the system to its fullest. While neither case is a huge deal (we'll see beautiful games regardless of the system), some of you are demonizing Sony for actually giving the choice to the developers (where it arguably should be left).
 
onanie said:
Acert93 said:
It won't be as easy as you suggest. Comparing current PC games and how they run at 1600x1200 is really shaky ground for this analogy.

This comparison is not so shaky - it is an 8% difference in pixel count, after all.

In questioning the validity of using PC benchmarks, your inclusion of Quake 1 as an example is perplexing. Today's benchmarks for the 7800 are DX9, not the DX8 that you suggest people have been using.

1. You missed the point of the Quake 1 example. Basically, YOU CANNOT LOOK AT HOW OLD GAMES USE HARDWARE. Just because the G70 runs PC games from 2003 and 2004 at 1600x1200 with 4xAA and 16xAF does not mean new games will run the same way, specifically next-gen quality games (note: Sony's render targets at E3 clearly show they are aiming much, MUCH higher than D3, Far Cry, or HL2).

Simplified explanation: just because a TNT2 could play Quake 1 at 1600x1200 does not mean a TNT2 can play Quake 3 at 1600x1200.

Similarly, a 6800 Ultra could play BF:V (Spring 2004 release) at 1600x1200 with 4xAA @ 60fps, but can only play BF2 (Spring 2005 release) at 1280x960 with the same settings (and that is without 16xAF, or any new features like HDR). Even though it has ~36% fewer pixels to deal with, a more advanced game takes a significant performance cut.
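
The pixel arithmetic behind that figure, for anyone checking:

[code]
# Pixel counts for the BF:V vs BF2 settings quoted above.
full = 1600 * 1200  # 1,920,000 px
drop = 1280 * 960   # 1,228,800 px
print(f"{1 - drop / full:.0%} fewer pixels at 1280x960")  # 36% fewer
[/code]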

So the idea that the G70 performing well at 1600x1200 validates PS3 performance at 1080p is not as easy or as straightforward as you are suggesting.

2. Most modern PC games are designed around DX7, or at the least DX8. They are not developed to take full advantage of DX9 because they are meant to strip away features to work on older hardware.

So comparing how the G70 performs in HL2 (DX7 with some pretty DX9 shaders added and high-resolution textures for DX9 cards) or D3 (built with the DX8 featureset in mind) to how it will perform in SM 3.0+ games is a non sequitur.

Basically, while modern games support DX9, we won't begin seeing games built around DX9+ features as a baseline until ~2006. UE3 games are an example of games that will use the DX9 featureset as the bare minimum.

Now if a G70 can run, say, the final version of Unreal 2007 at 1600x1200 with 4xAA, 16xAF, HDR and all settings on HIGH at 60fps, THAT would be relevant, because Unreal 2007 is a good middle ground of what to expect in first-generation console games.

But comparing current-gen games on a G70 to how the PS3 will run next-gen quality games is shaky at best.

If your analogy held true we would have enjoyed consoles that could generate HD images in games for the last 7 or 8 years.

You are also assuming a static "install base" for HDTV users, by forgetting to mention the Consumer Electronics Association's estimate of 50% prevalence in American households by 2007 (such enthusiasm from the CEA).

Where did I EVER assume such? Never. I never suggested such, so please do not set up straw-man arguments I do not believe.

The hard numbers I have seen were 10M HD TVs installed in the US at the end of 2004. 0% supporting 1080p.

And an estimated 15M HD TVs installed in the US at the end of 2005, with a very nominal percentage supporting 1080p.

And if the CEA, by saying "by 2007", means the beginning of 2007 (end of 2006), they are looking at a huge uphill battle. Shipping to 35M new homes in 2006 (which would be more like 45-50M total sales, due to the fact that people with more money do/have bought more than one) would be an 8-10x increase in sales over 2005.

Even if they mean the end of 2007, you are looking at a huge increase in sales--and that also means production.

Considering how long HD TVs sat in limbo (they were being hyped big BEFORE the N64 came out... broadcasts were supposed to start way back in the mid '90s!!) and how companies have continually over-projected HD TV sales and market penetration, I would be in the pessimist group.

Over 50M homes in the US in 2007 is very ambitious. That is total sales in the 65M-70M range. Call me a skeptic.
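
A back-of-the-envelope check using only the figures quoted in this post (a sketch, not a forecast):

[code]
# Figures quoted above: ~10M HD TVs installed in the US at end of 2004,
# ~15M at end of 2005, and a CEA target of 50% of households (~50M homes)
# by 2007.
installed_2004 = 10_000_000
installed_2005 = 15_000_000
target_homes   = 50_000_000

sold_2005   = installed_2005 - installed_2004   # ~5M sets sold in 2005
homes_to_go = target_homes - installed_2005     # ~35M homes still to reach

# Reaching all of them in a single year would take roughly:
print(f"~{homes_to_go / sold_2005:.0f}x 2005's sales rate")  # ~7x by homes
# Counting multi-set households (45-50M total sets), it is more like 9-10x.
[/code]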

Btw, this still ignores that CONSOLES are GLOBAL. HD TVs are almost non-existent in Europe, and with no HD broadcasting and HD optical media still not in the mainstream, it looks like it could be a while longer there too. That fact surely has an influence on the debate. When 30% of the market looks to be out of the HD TV equation almost completely, it makes you re-evaluate your marketing angle.

Even using your "online gaming" analogy, we are comparing 10% of Xbox consumers (despite the 215M broadband accounts) against 15% of American households in uptake of HDTV. And you say "HDTV" is less important?

1. I never said it was less important; I am a BIG fan of HD TV.

What I have tried to convey is that online via broadband is a mass-market feature this generation; HD TV will be a mass-market feature next gen.

2. You oversimplify the numbers. Yes, ~10% of Xbox users are on Live. But I gave a number of reasons why that will change. Some off the top of my head:

a. When Live launched 3 years ago, broadband penetration was MUCH smaller than it will be by the end of 2005. Just some examples:

US 2003 ~ 25M
US 2004 ~ 34M
US 2005 Projection ~ 40M

World Wide 2004 ~ 165M
World Wide 2005 Projection ~ 215M

Broadband is on the "upswing" and beginning to hit mass market penetration.

b. The Xbox did not get its first truly "killer" Live app until 2004 (Halo 2).

c. Cause and effect. The hardest part of establishing a service is getting the ball rolling. Once you begin getting customers the community grows, and when the community grows it becomes more appealing to consumers.

d. Live was not packaged with the Xbox. Live will be packaged with the Xbox 360. Out of the box, headset included. Online is a major feature designed into the Hardware AND Software.

e. 15% of homes do NOT have HD TVs. ~15M projected HD TV sales by the end of 2005 != 15% market share/penetration. That is comparing apples and oranges. The straight-off-the-top numbers would be ~40M broadband connections vs. ~15M HD TV sales. The major difference being that most homes do not have more than one broadband connection, while many homes do have more than one TV.


Overall, I think you have missed the point of what I have said; the misquotes and mixing up of numbers give me this impression.

I am not knocking HD TVs.

Anyone on the forum knows that HD is a buying criteria for ME this generation and I very much enjoy HD media.

But the issue is separating my bias/preferences as a hardcore gamer, early adopter, and enthusiast from Mass Market Realities (tm). Being able to separate what I like/dislike from market conditions is important when talking about the industry. If you judge everything based on your console preference or your own buying needs you will surely misinterpret the market.

That is how I can say that I believe broadband online gaming is for the mainstream *now*; and

HD mass-market penetration is for *next* gen. That does not mean a lot of consumers won't get to enjoy HD; I have actually stated MORE will enjoy HD this gen than enjoyed online play with the Xbox/PS2.

So I am not downplaying HD; I am just recognizing the fact that it is still growing and won't hit the worldwide market in full force until the 4th 3D gen.

And this is important to note, as it dovetails directly back into the original debate. Because HD (720p/1080i) is JUST BEGINNING to get a foothold this gen, why would developers stretch themselves even thinner (1080p) when a marginal % of consumers can support it?

As I originally stated, based on MARKET CONDITIONS, I would expect most developers creating modern/cutting-edge games to focus on 1080i/720p. They can, in theory, create games 2x as complex/"pretty" at those resolutions compared to 1080p.

A person in a business-making position is going to look at the sales and market penetration and realize that the sweet spot is 720p/1080i. If going for 1080p means sacrificing the visual punch for 98% of the market, well, that is just STUPID business sense.

There won't be enough 1080p TV sales to merit 1080p as a resolution goal for most cutting-edge games until next gen. That is assuming the hardware can even handle a cutting-edge game with AA, AF, HDR, and all the other bells and whistles at 1080p.

That said, some games that are less GPU-intensive will support 1080p. But that is neither here nor there when discussing next-gen quality software.
 
Qroach said:
what makes you think 1080p will be used in movies? Movies only need to be 30 fps, and they don't need 1080p for that.

For movies, there would be a 1080p24 mode.

Actually, it's somewhat surprising that the PS3 would output 1080p movies because there are fears the studios want to restrict output to 1080i at most.

The first announced HD-DVD players from Toshiba have only 1080i output.
 
wco81 said:
Qroach said:
what makes you think 1080p will be used in movies? Movies only need to be 30 fps, and they don't need 1080p for that.

For movies, there would be a 1080p24 mode.

Actually, it's somewhat surprising that the PS3 would output 1080p movies because there are fears the studios want to restrict output to 1080i at most.

The first announced HD-DVD players from Toshiba have only 1080i output.

The MSFT guy in avsforum said that HD-DVDs using VC-1 will most assuredly be 1080p, so I would imagine that BD-ROM will be 1080p as well.

Let us not forget, for us HDTV owners, that just because something says it supports a certain res does not mean it will be any good, a la The Matrix: running that game at 1080i showed me no improvement over 480p or 720p. In fact, I would say it looked worse... what a tragic game.

AGAIN, I am not knocking 1080p; I have already stated my selfish reasons for 1080i... although I am looking at 1080p sets now as well.

I don't see any fundamental flaws with the PS3, however, I don't see any with the 360 either. I just want to get rid of the damn jaggies!
 
Shifty Geezer said:
I just think the argument against 1080p, that it offers inferior performance to 720p, is rather countered by the same principle: 720p will eat up shader resources that won't benefit SDTV users. ;)

The main difference I see is that 720p/1080i actually has a fairly substantial market, one that by 2008-2009 should be almost mainstream, in North America anyway.

1080p has no real market for the foreseeable future. Of the small fraction of 1080p owners, how many play video games?

I think the 1080p argument just distracts from the real discussion, which IMO should be about why Sony has no minimum resolution. I think we can all agree, judging from past history, that there won't be many games at the maximum resolution, so isn't it much more relevant to discuss what 95% of the games will be running at?

It just seems weird to me that people slam the X360 saying it doesn't have "true" HD, while ignoring the fact the PS3 has no established standard. The 360 is the only system where EVERY game will be HD.

Maybe I'm just biased because I have a 1080i TV, but I would much rather see a set standard (720p/1080i) for every game than lofty specs with no standards, the end result of which will be many games upscaled from crappy 480p resolutions.
 
Shifty Geezer said:
I agree 720p downscaled is a good level of performance. I just think the argument against 1080p, that it offers inferior performance to 720p, is rather countered by the same principle: 720p will eat up shader resources that won't benefit SDTV users. From my POV, the upcoming hardware has enough oomph to cater for 720p that there isn't any more that could be done at lower resolutions to any noticeable benefit (though not everyone agrees with me. Who was it who said at SDTV we can do anything we want, but HDTV places us in the position we are now?), and I can't comment on 1080p performance until I see it in action. I'm being a boring fence sitter and reserving judgement ;)

Shifty, I would slightly disagree with part of your statement. HD has the benefit of helping even SD TVs. Watch a station that is broadcasting in HD on an SD set and you should still see a visual improvement over traditional broadcasts, especially on larger screens where SD really, really sucks (although I only have a 30"). So 720p and 1080p are great for everyone.
Now let me scold you for being "a boring fence sitter..." ;-)

While I agree the more relevant argument will be what the minimum is, I find it difficult to fault Sony (can't believe I said that) for not disclosing anything; it is still relatively early. I would imagine we'll hear more at TGS, or in the week before/week of the release of the X360.
 