N5 expected to use cutting edge processor and graphics tech

Qroach:

> The 720p standard is being phased out.

Nonsense. 720p is part of a standard that will last many years. Compliant sets will support all resolutions.

> It won't be long until you can't buy any new TV set with 720p support.

Nonsense. 1080i/p-capable flat panels support 720p as well. CRTs are optimized for specific resolutions, which is why a 1080i-capable CRT set may not support 720p.
 
wco81 said:
While there are more pixels, 720p apparently requires more expensive components so the manufacturers put out 1080i hardware with scalers to push 720p up to 1080i.

If it is converting 720p/60 to 1080i, I would find "mutilating" more apt than "pushing up".
 
wco81:

> Some early CRT-based sets did support 720p but CBS, which
> broadcasts in 1080i, may have influenced the manufacturers to produce
> sets with only 1080i support.

Forget CRT. It's old technology that's being phased out.

> However, these are small portions of the market because of the price of
> the hardware.

In two years prices will be similar to those of CRT-based sets. Yes, in the US there will be more CRT-based HDTVs when the new systems launch, but flat panels are the future, and with digital connections you have to factor in computer displays, like Nintendo is doing.

In Europe most HD capable displays are flat panels and that won't change.

> As for 1080p, we'd all like to see it but without any 1080p sources on
> the horizon

If I'm not mistaken there are WMV HD capable DVD players out now.

BRD and HD-DVD aren't far off either.
 
Well Intel has promised their LCOS chips would lead to 50-inch displays under $2000. If they really deliver this, then CRT market share of new display sales will decrease.

However, you're talking about 100 million households in the US, with somewhere between 200-300 million sets. Most of those sets were purchased for well under $1000, probably under $500.

So no, I don't think CRTs are going away any time soon. There are cheap 20-inch LCDs for around $700, which are really the lowest of the low end in flat panels. If you get name-brand 30-inch LCDs for under $1000 (they are upwards of $2500-4000), then the mass market for something other than CRT will develop. Until then...

As for Blu-ray and HD DVD, neither camp has really published the resolutions they'd support yet. Not sure what the Blu-ray recorders in Japan do, but one would imagine that since they record HDTV they support 1080i. Still, it wouldn't surprise me if prerecorded Blu-ray and HD DVD center around 720p.
 
Mfa said:
Only if they aren't pushing the hardware.
Maintaining a constant and high framerate is more a matter of careful game design than pushing (or not pushing) the hardware, though.
If only titles that made good use of hw potential had framerate problems, we wouldn't be getting titles like Love Smash Tennis, Kabuki Warriors etc.

If it is converting 720p/60 to 1080i, I would find "mutilating" more apt than "pushing up".
Not if 1080i refreshes at 120Hz :p
 
Fafalada said:
Mfa said:
Only if they aren't pushing the hardware.
Maintaining a constant and high framerate is more a matter of careful game design than pushing (or not pushing) the hardware, though.
If only titles that made good use of hw potential had framerate problems, we wouldn't be getting titles like Love Smash Tennis, Kabuki Warriors etc.

And nothing concentrates the mind like field rendering :)

To those who don't know: if you drop a frame in field rendering, the actual screen resolution will drop (i.e. 480i will visually become 240p if you drop a frame).
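A tiny sketch of the effect described above (the helper function is hypothetical, just illustrating the arithmetic, not anything from a console SDK):

```python
# Why a dropped frame hurts field rendering: each field carries only half
# the scanlines, so both fields must be unique to build the full picture.

def effective_vertical_res(full_lines, dropped_frame):
    """Vertical resolution the viewer effectively perceives.

    full_lines    -- scanlines of the interlaced mode (e.g. 480 for 480i)
    dropped_frame -- True if the renderer missed the field deadline and
                     the previous field's data is shown again
    """
    lines_per_field = full_lines // 2
    # With two unique fields you perceive the full resolution; if a field
    # repeats, you only ever see one set of scanlines.
    return lines_per_field if dropped_frame else full_lines

print(effective_vertical_res(480, dropped_frame=False))  # 480 (looks like 480i)
print(effective_vertical_res(480, dropped_frame=True))   # 240 (looks like 240p)
```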

In the early PS2 days (before progressive scan was important) the joke was that it was easy to see who the ex-PC devs were, as they used frame mode, the idea of guaranteeing framerate being completely alien to them :devilish:

It's actually a relatively new idea that you can drop a frame. On many earlier systems it wasn't an option. It was fairly common to run with a single buffer; you had to sync not to the vertical but to the horizontal.

The term 'chasing the raster' is still used (at least in the UK dev scene) for very time-critical rendering. It comes from when you were physically chasing the electron beam as it scanned the image out of memory: you could cleanly write only after it had moved past.
 
wco81:

> If they really deliver this, then CRT market share of new display sales
> will decrease.

They're already decreasing. It's just a matter of time before flat panels overtake CRT displays.

> However, you're talking about 100 million households in the US, with
> somewhere between 200-300 million sets.

The vast majority of those sets are regular NTSC (no progressive) sets and as such aren't relevant when discussing HDTV.

> So no, I don't think CRTs are going away any time soon.

Of course they aren't going away completely but manufacturers are already looking at dumping CRT production. In the long run they just aren't cost effective and consumers are increasingly favoring flat panels over CRTs.

> As for Blue Ray and HD DVD, neither camp has really published the
> resolutions they'd support yet.

Both formats will support all HDTV resolutions.

> Not sure what the BR recorders in Japan do

So far there's only one (for the consumer market), the Sony BDZ-S77. It's based on an old version of the BRD specification and as such is an utterly irrelevant product.
 
Meh, if game designers have to design the game to keep the content of a view stable enough to keep rendering completely deterministic, they are going to be pretty constrained in their freedom... I doubt they honestly go looking for pain like that if they have the choice.
 
cybamerc said:
In two years prices will be similar to those of CRT-based sets. Yes, in the US there will be more CRT-based HDTVs when the new systems launch, but flat panels are the future, and with digital connections you have to factor in computer displays, like Nintendo is doing.

Next-gen consoles won't use a digital video connector; VGA or component are the options you're likely to get. Take your pick, but no HDMI or DVI: they're both too expensive.
 
cybamerc said:
wco81:

> If they really deliver this, then CRT market share of new display sales
> will decrease.

They're already decreasing. It's just a matter of time before flat panels overtake CRT displays.

> However, you're talking about 100 million households in the US, with
> somewhere between 200-300 million sets.

The vast majority of those sets are regular NTSC (no progressive) sets and as such aren't relevant when discussing HDTV.

> So no, I don't think CRTs are going away any time soon.

Of course they aren't going away completely but manufacturers are already looking at dumping CRT production. In the long run they just aren't cost effective and consumers are increasingly favoring flat panels over CRTs.

> As for Blue Ray and HD DVD, neither camp has really published the
> resolutions they'd support yet.

Both formats will support all HDTV resolutions.

> Not sure what the BR recorders in Japan do

So far there's only one (for the consumer market), the Sony BDZ-S77. It's based on an old version of the BRD specification and as such is an utterly irrelevant product.

Are you sure you're not confusing the computer monitor display market with the television display market? Yes, manufacturers are investing billions to bring flat panel display plants up. And flat panel TV sales are very healthy and growing.

But they represent the high-end of the market. That is why it's very relevant that most Americans have CRT sets that they paid $500 or less for. This is where most of the market is and the segment which the manufacturers will have to address.

It would be nice if we could all have $500 flat panel LCDs with black levels, contrast ratios and viewing angles equivalent to CRTs, but that isn't going to happen right away. That is why TVs sold at mass-market prices ($250-$800) are predominantly CRTs, and there are widescreen CRT monitors which have only just started to drop into this price range.

But I like your optimism about cheaper flat panels, no more CRTs (a good ecological outcome if nothing else) and 1080p everywhere.
 
MfA said:
Meh, if game designers have to design the game to keep the content of a view stable enough to keep rendering completely deterministic, they are going to be pretty constrained in their freedom... I doubt they honestly go looking for pain like that if they have the choice.

By that logic it's all right for the developer to have the freedom to push the framerate to any level. A first-person shooter at 1fps, anybody? We work in real-time; we all know about limitations. At the moment we usually aim worst-case at 30fps; with next-gen it seems fairly reasonable to aim worst-case at 60fps.

Console devs are used to working within limits; it's what working on an embedded system is all about.
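For reference, the per-frame time budgets implied by those worst-case targets (plain arithmetic, nothing console-specific):

```python
# Frame budget in milliseconds for a given worst-case framerate target.

def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

for fps in (30, 60, 120):
    print(f"{fps:3d} fps -> {frame_budget_ms(fps):5.2f} ms per frame")
# 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, 120 fps -> 8.33 ms
```

Moving the worst case from 30fps to 60fps halves the budget: everything in the heaviest frame has to fit in 16.67 ms instead of 33.33 ms.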
 
DeanoC:

> Next gen consoles won't use a digital video connector, VGA or
> component is the options your likely to get. Take your pick but no HDMI
> or DVI, there both too expensive.

I have no idea what it costs to implement.

But if a £20 gfx card...
http://uk.pricerunner.com/computing/components/graphics-cards/204284/details

... and a sub-£150 DVD player (two newer models are similarly priced):
http://uk.pricerunner.com/sound-and-vision/vision/dvd-players/175289/prices

... can have it, I'm sure it's not out of reach for a console. Especially two years from now, when it will be a requirement anyway (VGA is being phased out and component always was a temporary solution). If they're really cheap they can always offer an adapter like M$ is doing this gen.
 
DeanoC said:
By that logic it's all right for the developer to have the freedom to push the framerate to any level.

Yes, that they have the freedom to do so is a good thing.

A first-person shooter at 1fps, anybody? We work in real-time; we all know about limitations. At the moment we usually aim worst-case at 30fps; with next-gen it seems fairly reasonable to aim worst-case at 60fps.

Well, there is soft and hard real-time. You aim for a worst-case of 30fps, but you can still afford to miss... how much bigger would you have to make your margins to make sure your aim is good enough for field-based rendering?

Not to mention that as the complexity of the scenes goes up, predictability goes down (unless you want to stick to the kind of scenes you have now, just with more tris for each object).
 
wco81:

> Are you sure you're not confusing computer monitor display market
> with television display market?

No. What part of my post makes you think that I am?

But make no mistake, computer displays and TV sets share the same basic technology, flat panels in particular. Financially it makes sense to unify the production of those products as much as possible.

> That is why it's very relevant that most Americans have CRT sets that
> they paid $500 or less for.

It's not relevant when discussing HDTV because those TVs do not support such high resolutions. When making a decision about whether to support 720p or 1080p, or to support DVI or VGA, that market is irrelevant.
 
Mfa said:
Meh, if game designers have to design the game to keep the content of a view stable enough to keep rendering completely deterministic they are going to be pretty constricted in their freedom ... I doubt they honestly go looking for pain like that if they have the choice.
Naturally there's no such thing as completely deterministic (and not just for drawing but for a great many other things that affect framerate).
But the fact is that a lot of effort is spent to control the amount of dynamics, visible content, and other processing-intensive components - within a certain acceptable interval - in pretty much every game.
People who find that too constrictive shouldn't be working on realtime applications.

Well there is soft and hard real-time. You aim for a worst-case of 30fps, but you can still afford to miss ... how much bigger would you have to make your margins to make sure your aim is good enough for field based rendering?
It's not like field-rendered games can never afford to miss a frame - I doubt you'll find a lot of people who are even aware of GT3's 'resolution halving' during slowdowns in replay.
Flicker filters also tend to make the artifacts less obvious than one would think (you don't actually halve the resolution on a filtered display).

Granted though - 1080i rendering would no longer have the bandwidth advantages over 1080p if you want to filter it - only the memory requirement would remain lower...
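A quick back-of-the-envelope comparison of the fill saving being traded away here (illustrative arithmetic only; a flicker filter needs lines from the other field, which is what erodes the interlaced advantage):

```python
# Pixels written per second for interlaced vs progressive rendering at the
# same refresh rate: interlaced only touches half the lines per refresh.

def pixels_per_second(width, height, hz, interlaced):
    lines = height // 2 if interlaced else height
    return width * lines * hz

p_1080p = pixels_per_second(1920, 1080, 60, interlaced=False)
p_1080i = pixels_per_second(1920, 1080, 60, interlaced=True)
print(p_1080p, p_1080i)      # 124416000 62208000
print(p_1080p // p_1080i)    # 2 -- field rendering writes half the pixels
```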
 
cybamerc said:
wco81:
It's not relevant when discussing HDTV because those TVs do not support such high resolutions. When making a decision about whether to support 720p or 1080p, or to support DVI or VGA, that market is irrelevant.

Ultimately, the spectrum in the US and other countries is going to support analog or digital, not both. And the spectrum currently being used for analog, even at NTSC resolutions, is equivalent to the spectrum it takes to support HDTV resolutions.

The FCC and Congress want that analog spectrum back. But they can't take it back until the people with those poor analog sets have the ability to get DTV reception. That is why the FCC "proposed" a schedule whereby all sets sold in this country must have DTV tuners, even those in the mass-market price range.

The manufacturers are under pressure to deliver DTV (not necessarily HDTV) sets, including tuners, at prices most people can afford. They may determine that the only way to deliver at those prices is via 480p sets and tuners.
 
cybamerc said:
... can have it I'm sure it's not out of reach for a console. Especially two years from now where it will be a requirement anyway (VGA is being phased out and component always was a temporary solution). If they're really cheap they can always offer an adapter like M$ is doing this gen.

Pure digital interfaces require different data paths; this can be complicated if the DACs are doing extra work beyond just analog conversion. As with most things, just because it's cheap for one set of hardware doesn't mean it is for another.

BTW:
I don't fully understand why digital interfaces are out, just repeating what I've heard from a VERY good source.
 
wco81:

> They may determine that the only way to deliver at those prices is via
> 480p sets and tuners.

But again, those people aren't relevant when discussing HDTV.

All the systems will support 480i and have at least composite out for people without HD sets. But as far as HDTV support is concerned only that market is relevant.
 
Fafalada said:
Granted though - 1080i rendering would no longer have the bandwidth advantages over 1080p if you want to filter it - only the memory requirement would remain lower...

But with the memory bandwidths next-gen systems will have available to them, video scan-out will basically be an insignificant factor; particularly for any hardware with on-chip framebuffer memory...
 
Guden Oden said:
Fafalada said:
Granted though - 1080i rendering would no longer have the bandwidth advantages over 1080p if you want to filter it - only the memory requirement would remain lower...

But with the memory bandwidths next-gen systems will have available to them, video scan-out will basically be an insignificant factor; particularly for any hardware with on-chip framebuffer memory...

Because memory bandwidth and eDRAM are both cheap resources that we will have plenty of :rolleyes:

1080p will either require massively more eDRAM (like 50% extra) if scanning out of eDRAM, or will steal up to 5% of total memory bandwidth. Neither is insignificant.

8 MB or 1 GB/s is a lot even for next-gen systems.
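For anyone checking the arithmetic, this is roughly where figures of that size come from (assuming a 32-bit colour buffer; the exact bandwidth depends on buffering and any filtering passes, so the 1 GB/s above would be about two passes over the buffer):

```python
# Size of a 1920x1080 framebuffer and the bandwidth needed just to scan
# it out at 60 Hz, assuming 4 bytes (32-bit colour) per pixel.

BYTES_PER_PIXEL = 4

def framebuffer_mb(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 * 1024)

def scanout_gb_per_s(width, height, hz):
    return width * height * BYTES_PER_PIXEL * hz / 1e9

print(f"{framebuffer_mb(1920, 1080):.1f} MB")          # 7.9 MB -- the '8 MB'
print(f"{scanout_gb_per_s(1920, 1080, 60):.2f} GB/s")  # 0.50 GB/s per read pass
```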
 