Have MS been digging themselves into a hole?

blakjedi said:
z said:
-PS3 offers THE best resolution ever;
you say X2 is better because we don’t have higher res than what it offers. You say that knowing that PS3 offers ‘exactly’ what X2 does ‘and’ then some more.

720p with 2x or 4x AA is MUCH better visually than 1080p, if only because to SEE the difference in resolution you would need about a 100" screen and would have to sit about 8-9 feet from it.

Technically 1080p is better. Visually, well-antialiased 720p is better and "freer".

That's your opinion. You can't make up with AA for the extra detail that higher resolution gives.
1080p will have more detail than 720p, it's that simple.
Will PS3 render its games at 1080p as standard? Hardly.
Both consoles will be rendering predominantly at 720p or 1080i. PS3 might have a few titles rendered at 1080p, but I don't think there will be many.

The point stands: more pixels give you more detail, AA does not.
 
london-boy said:
That's your opinion. You can't make up with AA for the extra detail that higher resolution gives.
1080p will have more detail than 720p, it's that simple.

On the other hand, at 720p you will be able to run roughly twice the number of shader instructions per pixel, or roughly double the framerate. Also, if you use the same textures at 1080p and 720p, the amount of detail can be limited by texture resolution rather than display resolution when you get up close. Higher resolution is a tradeoff: you do less work per pixel in exchange for more pixels. Sometimes it's a win, sometimes it's not.
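Since the per-pixel budget point keeps coming up, here is the back-of-envelope arithmetic as a short Python sketch; only the resolutions are real numbers, the shader budget is invented purely for illustration:

```python
# Pixel counts behind the 720p vs 1080p tradeoff. The shader budget below is
# a made-up number for illustration; only the resolutions are real.
pixels_720p  = 1280 * 720     #   921,600 pixels per frame
pixels_1080p = 1920 * 1080    # 2,073,600 pixels per frame

print(f"1080p/720p pixel ratio: {pixels_1080p / pixels_720p:.2f}x")    # 2.25x

# With a fixed shading budget per frame, the per-pixel budget shrinks by the
# same factor when you step up to the higher resolution.
budget_ops_per_frame = 100_000_000   # hypothetical budget, not any real GPU spec
print(f"720p : {budget_ops_per_frame / pixels_720p:.0f} ops/pixel")    # ~109
print(f"1080p: {budget_ops_per_frame / pixels_1080p:.0f} ops/pixel")   # ~48
```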
 
I disagree. As I said somewhere else on this forum, I can perceive a single hair on a 720x576 interlaced display with loads of antialiasing. At twice the res with less AA, there'd be all sorts of shimmer and artefacts.

The human 'eye' isn't exact. What you see is largely interpretation by your brain. Present subtle variations between pixels and your brain will fill in the details with approximations based on real-world observation.

I've seen this plenty in raytracing. Create a blue and white chequered floor stretching to the horizon. Without AA (or filtering - same difference) you get interference patterns and it looks a mess. Add AA and you lose technical clarity (pixels are no longer either blue or white, but shades in between), yet the image is a lot better.

Given a choice between 720p at 4x AA or 1080p with no AA, I'd choose 720p. If PS3 can't handle AA too well, XB360 might well have the visual quality advantage.
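To make the chequered-floor example concrete, here is a tiny stand-alone Python sketch (no raytracer involved, just a checker function whose frequency rises towards a 'horizon'): one sample per pixel aliases into a mess, while a few sub-pixel samples average out to stable greys. The pattern, resolution and sample counts are arbitrary illustrations.

```python
import math

def checker(u, v):
    """1.0 or 0.0 depending on which square of a checkerboard (u, v) lands in."""
    return float((math.floor(u) + math.floor(v)) % 2)

def floor_uv(x, y):
    """Map screen coords (0..1) to floor coords; the checker frequency rises
    sharply towards small y, like a plane receding to a horizon."""
    depth = 1.0 / max(y, 1e-3)
    return x * depth * 8.0, depth * 8.0

WIDTH, HEIGHT = 64, 16

def render(samples_per_pixel):
    rows = []
    for j in range(HEIGHT):
        row = ""
        for i in range(WIDTH):
            total = 0.0
            for s in range(samples_per_pixel):
                off = (s + 0.5) / samples_per_pixel   # spread sub-samples inside the pixel
                u, v = floor_uv((i + off) / WIDTH, (j + off) / HEIGHT)
                total += checker(u, v)
            shade = total / samples_per_pixel          # 0.0 .. 1.0 coverage
            row += " .:-=+*#"[int(shade * 7.99)]       # crude ASCII greyscale
        rows.append(row)
    return "\n".join(rows)

print("1 sample per pixel (aliased):\n" + render(1))
print("\n4 samples per pixel (antialiased):\n" + render(4))
```

The single-sample image flips between pure black and white almost at random near the 'horizon', which is exactly the shimmer being described; the 4-sample version settles into greys.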
 
Tim said:
london-boy said:
That's your opinion. You can't make up with AA for the extra detail that higher resolution gives.
1080p will have more detail than 720p, it's that simple.

On the other hand, at 720p you will be able to run roughly twice the number of shader instructions per pixel, or roughly double the framerate. Also, if you use the same textures at 1080p and 720p, the amount of detail can be limited by texture resolution rather than display resolution when you get up close. Higher resolution is a tradeoff: you do less work per pixel in exchange for more pixels. Sometimes it's a win, sometimes it's not.

I'm very aware of that; that's why in the same post I also said:


I said:
Will PS3 render its games at 1080p as standard? Hardly.
Both consoles will be rendering predominantly at 720p or 1080i. PS3 might have a few titles rendered at 1080p, but I don't think there will be many.
 
Shifty Geezer said:
I disagree. As I said somewhere else on this forum, I can perceive a single hair on a 720x576 interlaced display with loads of antialiasing. At twice the res with less AA, there'd be all sorts of shimmer and artefacts.

The human 'eye' isn't exact. What you see is largely interpretation by your brain. Present subtle variations between pixels and your brain will fill in the details with approximations based on real-world observation.

I've seen this plenty in raytracing. Create a blue and white chequered floor stretching to the horizon. Without AA (or filtering - same difference) you get interference patterns and it looks a mess. Add AA and you lose technical clarity (pixels are no longer either blue or white, but shades in between), yet the image is a lot better.

Given a choice between 720p at 4x AA or 1080p with no AA, I'd choose 720p. If PS3 can't handle AA too well, XB360 might well have the visual quality advantage.

Nonsense. 1080p has more than double the number of pixels of 720p (2.25x, in fact). That alone is enough to do away with AA for a while.

It's like photography: I'll always take a 2-megapixel picture over a 1-megapixel picture with bloody AA. And I'm not sure how anyone could say otherwise.
 
Shifty Geezer said:
I disagree. As I said somewhere else on this forum, I can perceive a single hair on a 720x576 interlaced display with loads of antialiasing. At twice the res with less AA, there'd be all sorts of shimmer and artefacts.

The human 'eye' isn't exact. What you see is largely interpretation by your brain. Present subtle variations between pixels and your brain will fill in the details with approximations based on real-world observation.

I've seen this plenty in raytracing. Create a blue and white chequered floor stretching to the horizon. Without AA (or filtering - same difference) you get interference patterns and it looks a mess. Add AA and you lose technical clarity (pixels are no longer either blue or white, but shades in between), yet the image is a lot better.

Given a choice between 720p at 4x AA or 1080p with no AA, I'd choose 720p. If PS3 can't handle AA too well, XB360 might well have the visual quality advantage.

Thank you for clarifying my point, SG.
 
Shifty Geezer:

What if you assume the picture is output to a TV with a fixed 1080 resolution? Then native 1080p/i is definitely better than a 720p anti-aliased, upscaled picture, IMO. Conversely, on a 720p fixed-pixel display a 1080 picture is downscaled anyway to fit the 720 resolution, so you'll be getting some free AA there too. I'm not so sure the 4x-AA'd native 720p would look much better in that case.
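For the 'free AA from downscaling' point, here is a rough plain-Python sketch of what a 1080-to-720 scaler effectively does: an area-averaging (box) filter at a 3:2 ratio, so each output pixel blends 1.5 source pixels. Real TV scalers use fancier filters; this only shows the idea, and the stripe pattern is invented for illustration.

```python
def box_downscale(src, dst_len):
    """Area-average a 1-D row of samples down to dst_len pixels."""
    ratio = len(src) / dst_len            # 1080 / 720 = 1.5
    out = []
    for i in range(dst_len):
        start, end = i * ratio, (i + 1) * ratio
        total, weight, j = 0.0, 0.0, int(start)
        while j < end:
            cover = min(j + 1, end) - max(j, start)   # fraction of src[j] inside this output pixel
            total += src[j] * cover
            weight += cover
            j += 1
        out.append(total / weight)
    return out

# Harsh 2-pixel-wide black/white stripes 'rendered' at 1080 width...
row_1080 = [1.0 if (x // 2) % 2 else 0.0 for x in range(1080)]
row_720 = box_downscale(row_1080, 720)

# ...come out with in-between grey values after the 3:2 downscale - exactly the
# kind of edge smoothing AA would otherwise have to provide.
print(sorted(set(round(v, 2) for v in row_720)))
```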
 
I don't get it.

There's actually an argument going on about 1080p?

Why? The Xbox handles 720p (and 1080i). How many games have actually been released that handle that resolution? I've looked and certainly can't find any that are even remotely interesting.

So the question is, if the Xbox handles this higher HD standard, why haven't games been made in that resolution over the past year or two?

And if developers (for whatever reason you give to the above question) didn't produce games taking full advantage of the Xbox's ability to render superior definition this past cycle, why would they in the next cycle?
 
london-boy said:
It's like photography: I'll always take a 2-megapixel picture over a 1-megapixel picture with bloody AA. And I'm not sure how anyone could say otherwise.
Photography IS antialiased! With your camera, each pixel is an area average of a portion of the scene. More pixels = more fidelity with the same amount of AA, so obviously that's better. It's different with computer-generated imagery.

Here are three pics showing, in order: a 1080-scale image without AA, a 720-scale image at 2x AA upscaled to 1080 scale with filtering, and the same thing at 4x AA...

1080_noAA.png


720_2xAA_filtered.png


720_4xAA_filtered.png


The AA'd scenes are rendered at a lower resolution and then upscaled to the same size as the 1080-scale image, as would happen with a TV displaying the image at a fixed screen size (1080 or 720 on a 42" screen, say).

The bottom image shows a degree of 'fuzz', lacking the clarity of the non-antialiased image, but it also avoids jaggies and shimmer to a large degree. In a less abrupt scene, with natural lighting, shading and materials, the fuzz would be far less noticeable, whereas shimmer and jaggies would still remain on a non-AA'd display. Just think FFX... At 16x AA things start to look really sweet, but that's beyond next-gen.
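For anyone wanting to put together a comparison along these lines, here is a hedged sketch using the Pillow library (not how the images above were actually produced): draw a simple jaggy-prone test scene, stand in for 2x/4x AA by rendering larger and box-filtering down to 720 scale, then bilinear-upscale to 1080 scale as a fixed-pixel TV would. The filenames, test scene and scale factors are invented for illustration.

```python
# Requires Pillow. Everything here is illustrative, not the original pipeline.
from PIL import Image, ImageDraw

W1080, H1080 = 1920, 1080
W720,  H720  = 1280, 720

def draw_scene(w, h):
    """A few thin, nearly horizontal lines on white: a classic jaggies/shimmer magnet."""
    img = Image.new("L", (w, h), 255)
    d = ImageDraw.Draw(img)
    stroke = max(round(w / 1280), 1)     # keep line thickness proportional to render size
    for k in range(12):
        d.line([(0, k * h // 12), (w - 1, k * h // 12 + h // 30)], fill=0, width=stroke)
    return img

# 1080-scale render with no AA at all.
draw_scene(W1080, H1080).save("1080_noAA.png")

# 720-scale renders with supersampled AA (render larger, box-filter down),
# then upscaled back to 1080 scale with bilinear filtering, as a TV scaler might.
for factor, name in [(2, "720_2xAA_filtered.png"), (4, "720_4xAA_filtered.png")]:
    oversized = draw_scene(W720 * factor, H720 * factor)
    aa_720 = oversized.resize((W720, H720), Image.BOX)       # stands in for 2x/4x AA
    aa_720.resize((W1080, H1080), Image.BILINEAR).save(name)
```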
 
Shifty Geezer said:
london-boy said:
It's like photography: I'll always take a 2-megapixel picture over a 1-megapixel picture with bloody AA. And I'm not sure how anyone could say otherwise.
Photography IS antialiased! With your camera, each pixel is an area average of a portion of the scene. More pixels = more fidelity with the same amount of AA, so obviously that's better. It's different with computer-generated imagery.

Here are three pics showing, in order: a 1080-scale image without AA, a 720-scale image at 2x AA upscaled to 1080 scale with filtering, and the same thing at 4x AA...

1080_noAA.png


720_2xAA_filtered.png


720_4xAA_filtered.png


The AA'd scenes are rendered at a lower resolution and then upscaled to the same size as the 1080-scale image, as would happen with a TV displaying the image at a fixed screen size (1080 or 720 on a 42" screen, say).

The bottom image shows a degree of 'fuzz', lacking the clarity of the non-antialiased image, but it also avoids jaggies and shimmer to a large degree. In a less abrupt scene, with natural lighting, shading and materials, the fuzz would be far less noticeable, whereas shimmer and jaggies would still remain on a non-AA'd display. Just think FFX... At 16x AA things start to look really sweet, but that's beyond next-gen.


Well, this whole discussion is full of "to a certain point" and "within boundaries".
In the end, if a chip internally renders at 1080p, no matter what the output is, it will look damn fine. If it's output at 720, there will be free AA in there.
It's different from a chip rendering at 720p and outputting at whatever resolution with AA, because the rendering itself will have roughly half the detail of the 1080p one.
 
Yes, if PS3 renders at 1080 and downscales, with smart filtering the results could look very good. The only real take-home point for me is that Sony really needs some sort of AA options. If not (and it's not something they've addressed at all so far), I think they'll suffer on IQ.
 
Also worth noting: PS3 will have 1080p, which is higher than 1080i, and it's almost a given that it will have some sort of AA.

P.S. To the one who asked: GT4 is in 1080i. As for the Xbox, I'm not sure - Halo 2, maybe?
 
Staying on topic, MS has made all the right decisions with X360. They are supporting what's out there to the highest degree possible while keeping costs down. Good business model.
 
Still, 7 isn't even sides, and a sports game doesn't lend itself to uneven teams or teams with uneven numbers of human players.

Doesn't lend itself? I've played more times with uneven numbers of human players (whether it's been 3, 5, or 7) than even (regardless of whether we had even numbers of controllers available), because that's basically all the folks we could scrounge up at the time. You just deal with it. Besides, the *teams* aren't actually uneven, just the number of human players (the rest are usually computer controlled).

We could have a cheaper PS3 if Blu-Ray wasn't in it, and personally I don't want Blu-Ray.

1.) There's no guarantee the PS3 would be cheaper sans Blu-Ray. Sony could just use a cheap DVD-ROM drive, keep the same price, and cut losses.
2.) Microsoft was getting killed on Xbox costs, but that didn't stop them from slashing Xbox prices to compete with Sony and Nintendo. The 360 and PS3 really should be no different.
3.) While I can respect it, I find it difficult to fathom your reason for not wanting it. What would you rather have? Just a plain old DVD?

At least not until after the format war. An HDD will always be a hard drive; it won't suddenly become a dead end or invalid. A Blu-Ray drive can become both of those rather quickly.

The format war is really irrelevant in this regard. Blu-Ray is the chosen format for PS3 games regardless of what happens with the next generation of video disc formats. If Blu-Ray becomes the next video disc format, then yay! Bonus burger! If it doesn't, then oh darn, you've got a GameCube now...

Hmmm, maybe. Then again, you can always up the number of controllers on the X360 with the USB ports (and the same on the PS3). However, I think it will be used about as much as 8-player was on the PS2 (or whatever the number was).

Sadly, with the emphasis on 4 ports this current gen, gaming took a step backwards, and many of the sports and puzzle games that used to support 8-10 controllers on the Saturn and PlayStation got cut down to 4. Pissed me and a lot of coworkers off...

Quite a few? I don't think you proved the point I made wrong, even in the slightest. Like I said, there are hardly any HDTVs on the market that support 1080p, and the ones that do are doing so before the standard has been set.

First of all, before we get more nitpicky, by "market" are we referring to the existing market of TV owners, or the percentage of current new TV models? If we're talking about the whole TV market, then the HD discussion is moot, as HD-capable set penetration is roughly 13% right now in the US, and only 2-3% of HD set owners even have the digital tuners capable of receiving HD broadcasts (just as an anecdote, even my 50" two-year-old LCD TV didn't come with an ATSC tuner).

If the discussion revolves around new sets, then yes, 1080-native-res sets represent a small percentage of new sets (how small I don't know, although one could walk into a decent AV store and find at least a half-dozen models). Since the discussion revolves around resolution, that rules out CRT-based sets (which have no native resolution; most don't even do 720p to the glass, instead upconverting to 1080i, and some don't even fully resolve that). The fact of the matter is that 1080 panels are becoming more common. It's where the industry has been trying to go for the past couple of years.

Finally, the comment about supporting 1080p before the standard has been set is a pile of bull. Apparently you haven't properly read your A/54 doc, otherwise you'd know that 1080p represents 2 of the 18 formats specified by the ATSC for digital OTA broadcast.

I think you missed my point; that point is that you can't get a pixel-exact 1080i movie unless you're watching on your computer. Even right now there's only a handful of 720p films available, and those are all on the PC. Do you really expect movies to start coming in at 1080p any time soon after we have HD DVD players? I certainly don't.

Again, BS. I can get 1080i movies all the time OTA. I can't get pixel-exact on my current TV since its native resolution is 1366x768, but it downsamples nicely. And I can watch them pixel-exact at work (on 3 different Qualias), along with a bunch of pre-release/test Blu-Ray content (all SPE movies have been mastered at 1080 for several years now and are essentially waiting to be released) or stuff recorded OTA to Blu-Ray.

I'm not talking about regular TVs, I'm talking about fixed-pixel displays, and they really do suck at downscaling.
Mine does just fine (and it's using a rather old version of the WEGA Engine vs. the newer ones). Upscaling is the nastier problem to deal with (although mine does have programmable filters, and if you're more hardcore - perhaps like DemoCoder - you can drop down into the service mode and tweak the snot out of the TV).

Where have you seen that television stations are going to support 1080i instead of 720p? You're right about bandwidth being an issue; however, I heard only a few months ago that by 2008 all television stations in North America will be switching to high def. Well, even if you're right, that's yet another reason not to bother with a 1080p TV, right? Considering you won't get a benefit out of it and can certainly find a cheaper television that only supports 1080i.

Not sure where you get your information from, but it's awfully bad... Pretty much all OTA TV stations are broadcasting 1080i, with the exception of Fox and ABC. I get 14 in my area and only 2 are 720p. Even when you factor in digital satellite and cable, you're only adding ESPN-HD to the 720p list.

Also, there is no mandate for a switch to HiDef in 2008. The FCC mandate is for analog broadcasts to cease at the end of 2006 (or beginning of 2007, if you prefer), at which point all OTA networks will be broadcasting digital. Pretty much all the OTA networks where I live are already broadcasting digital; they'll just shut down the analog broadcasts when the time comes. Notice I'm emphasizing digital, because that doesn't mean everything is going HiDef. It'll be no different from how it is right now, where you watch your digital station and most of the content is typically 480p until an HD show starts, or they broadcast HD with a bunch of SD content interpolated across an HD frame.

As for HDTV standards, 1080p is already in the spec, but only at 24 and 30 fps. The equipment just isn't widely available, nor is there any broadcaster support as of now... but it is in the standard. We can only hope that the standard is extended at some point in the future to include 60 fps. Otherwise it would be a shame to come all the way out to 1080 and not have 60p to go along with it.

You probably won't see 1080/60p anytime soon; it doesn't fit within the 6 MHz carrier signal. At least not OTA - you may see it on satellite or cable, however...
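The raw pixel-rate arithmetic behind that bandwidth point, as a quick Python sketch; the ~19.4 Mbit/s ATSC channel figure is standard, but the 'roughly twice the bitrate' conclusion is an assumption of comparable MPEG-2 efficiency rather than a hard number:

```python
# Raw pixel rates of the common broadcast formats; only these are hard numbers.
formats = {
    "720p60":  1280 * 720  * 60,   # ~55 Mpixels/s
    "1080i60": 1920 * 540  * 60,   # 60 fields/s of 540 lines, ~62 Mpixels/s
    "1080p30": 1920 * 1080 * 30,   # ~62 Mpixels/s, same raw rate as 1080i60
    "1080p60": 1920 * 1080 * 60,   # ~124 Mpixels/s, double 1080i60
}

for name, rate in formats.items():
    print(f"{name}: {rate / 1e6:6.1f} Mpixels/s")

# Assuming comparable MPEG-2 efficiency, double the raw pixel rate needs
# roughly double the bitrate - which overflows the ~19.4 Mbit/s that a 6 MHz
# ATSC channel provides and that 1080i60/720p60 broadcasts already fill.
```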

That was a neat bit about some 1920 not being a true 1920. I hadn't heard of that, but it's a neat tidbit. That kind of makes it nice that there is a good lump of additional performance in the max spec that we can look forward to enjoying at some later point...

Well, that just applies to HD over digital cable. All my local OTA stations that are broadcasting 1080i are broadcasting true/square-pixel 1920 as well. DirecTV was broadcasting true 1920 for a while too, but lately I've heard they've been cutting down to 1280x1080 (which is kinda weird, but hey) because of crowded transponders. Which sorta makes sense, and is why they've already launched one of many MPEG4 satellites...

Anyway, this whole 1080p discussion is pretty silly, since it's going to be transmitted over HDMI (and possibly component), and any TV with an HDMI port can accept a 1080p signal.

Does FFTSW at 640x480 look better than Doom 3 at 2400x1800? Yes.

Is FFTSW rendered at 640x480? No...
 
Sony is obviously pushing 1080p because they've learned that the Xbox doesn't support it and Nvidia's RSX does (according to ATI, due to its PC video card origins). It's like a war, or a fistfight - as soon as you see a weakness in your opponent's defense, attack... ;)
 
But again... what's the point if nobody actually develops 1080p games?

Very few developers made 720p or 1080i games for the Xbox, which could handle both. Most "HD" games (note: not really HD) were just 480p.

So there must be some cost involved at the developer level in going to 720p or 1080i, considering nobody really seemed to want to do it this generation.

So why would developers want to pay that extra cost (whatever it is - I don't know, I'm asking) to produce 1080p games?

Because, despite what anybody in the Sony camp wants to say, 1080p will not be a widely adopted standard by the end of the PS3's lifecycle. Hopefully 720p will have reached large market saturation by then. But 1080p? No way. It'll take half of the PS3's lifecycle just for 1080p televisions to fall to prices where the general public will even consider making such a purchase.

Sure, it's great that Sony is offering it. But I sure don't see it as an advantage in terms of sales.
 
The format war is really irrelevant in this regard. Blu-Ray is the chosen format for PS3 games regardless of what happens with the next generation of video disc formats. If Blu-Ray becomes the next video disc format, then yay! Bonus burger! If it doesn't, then oh darn, you've got a GameCube now...
Say that to someone who buys more than just game media and finds out a year or two later that they can only watch those movies on a PS3 because Blu-Ray is dead.


Sadly, with the emphasis on 4 ports this current gen, gaming took a step backwards, and many of the sports and puzzle games that used to support 8-10 controllers on the Saturn and PlayStation got cut down to 4. Pissed me and a lot of coworkers off...
With the internet you can hook up two boxes easily and play those games, and you can set them up in different areas so you don't have 7 or 8 people crowded in front of one TV. For those with a 60-inch TV in a big room that may be fine, but for many it's not something that's going to get much use - much like the dual-TV setups.
 
Eh, having the option to use or not use something is never a bad thing. If the price point is comparable, you can't possibly spin having more features as being "bad".
 
Based on current DVD player prices, the best bet is to wait until the HD format is chosen and becomes widespread. You don't always have to be an early adopter, like (IIRC) DemoCoder :)
 