Is DirectX throttling Xbox 360 performance?

We already paid for it...

Yes and you paid for 1080p with Lair as well ...

Doesn't change the fact that I'd like to see Halo3 in 720p with 4xAA, and it shouldn't cost them a ton to get it done. $5 should more than cover the costs, and it shouldn't break online compatibility with those who have it on xb360.
 
Doesn't Halo have the ability to render at 480p?

Last I checked, the official definition of HD includes 480i/p, 720i/p, and 1080i/p.

Regards,
SB

Where do you check those things?
480i/p (along with 576i/p) are SD.

HD means 720i/p and above, unless you live in the '30s or something.
 
Ah, you're correct. I got mixed up and counted 480p among the HD resolutions because 480p was introduced as part of the HDTV rollout to the marketplace...

Standard definition TV only has a 480i specification; the 480p came later with the advent of HDTV.

http://www.soundadviceblog.com/?p=168

But 480p was categorized as SDTV when HDTVs were introduced, even though SDTVs cannot display a 480p picture.

Regards,
SB
 
Doesn't change the fact that I'd like to see Halo3 in 720p with 4xAA, and it shouldn't cost them a ton to get it done. $5 should more than cover the costs, and it shouldn't break online compatibility with those who have it on xb360.

Are you sure that it's so easy? Bungie is filling the 10 MB of eDRAM with two 1152x640 render buffers. What you're suggesting would require five tiles. I'm not sure that's as trivial as you seem to think. I'm not even sure that MSAA is compatible with the way Bungie blends the buffers.
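
For anyone wanting the back-of-the-envelope on the tile count: a quick sketch, assuming the usual 32-bit color and 32-bit depth per sample (the resolution, sample count, and two color buffers come from the discussion above; the byte sizes are my assumption):

[code]
// Rough eDRAM tile count for 720p + 4xMSAA with two color buffers.
// Assumes 32-bit color and 32-bit depth per sample.
#include <cstdio>
#include <cmath>

int main() {
    const double edram        = 10.0 * 1024 * 1024; // Xenos embedded framebuffer
    const int    w = 1280, h  = 720;                // 720p target
    const int    samples      = 4;                  // 4xMSAA
    const int    colorBuffers = 2;                  // Halo 3's two render targets
    const int    bytesPerSamp = 4;                  // 32-bit color or depth

    // Each sample stores one depth value plus one value per color buffer.
    double bytes = double(w) * h * samples * bytesPerSamp * (colorBuffers + 1);
    int tiles = int(std::ceil(bytes / edram));
    std::printf("%.1f MB -> %d tiles\n", bytes / (1024 * 1024), tiles); // ~42.2 MB -> 5 tiles
    return 0;
}
[/code]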
 
Are you sure that it's so easy? Bungie is filling the 10 MB of eDRAM with two 1152x640 render buffers. What you're suggesting would require five tiles. I'm not sure that's as trivial as you seem to think. I'm not even sure that MSAA is compatible with the way Bungie blends the buffers.


Keep in mind that the reason they went with the current implementation was due to the lack of FP16 with alpha blending in Xenos. What would be interesting is changing the actual framebuffer setup.
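
To illustrate the dual-buffer workaround in the abstract, here's a toy sketch of reconstructing an extended-range value from two 8-bit targets written at different exposures. The function and the 8x exposure ratio are illustrative assumptions, not Bungie's actual setup:

[code]
// Toy sketch: fake HDR without FP16 blending by writing the scene into two
// LDR targets at different exposures and reconstructing afterwards.
// reconstruct() and the 8x range are illustrative assumptions only.
float reconstruct(float ldr, float wide) {     // both inputs in [0,1]
    const float kRange = 8.0f;                 // assumed exposure ratio
    // Prefer the full-precision LDR copy while it isn't clipped, then
    // fall back to the coarser wide-range copy for bright pixels.
    return (ldr < 1.0f) ? ldr : wide * kRange;
}
[/code]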
 
Can someone explain to me the purpose of upscaling?

The native resolution indicates the number of pixels, so that resolution should determine the image quality.

Any full HD TV can display any resolution up to its native 1080p. A 720p image will run full screen on a large 1080p set. What is upscaling supposed to do extra for that display? It won't add more pixels. It won't enlarge the image, because it already runs at full screen.

So what the heck is upscaling supposed to do and what is the purpose of it?

For example, my 42'' TV can do 1080p. My 360 games are upscaled to 1080p. I checked Gears 2 at both 720p and 1080p and didn't notice any difference. The only difference was possibly the distance of the camera, and even that is questionable.

In most cases upscaling might even slightly blur the image, which defeats the purpose of the higher resolution. So what on earth is the benefit?
 
Can someone explain to me the purpose of upscaling?

It is simple: if your console does not do it, your TV will upscale the image internally.

So why do it?
A) The upscaler in your TV might be worse than the one provided by your console. That was especially an issue with early HDTVs.

B) Uniform user experience. The upscaler in your TV might use a different upscaling algorithm, including postprocessing etc., so the result might look different.

So in the end the basic underlying process is the same regardless of whether the TV or the console handles it.
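
If it helps to see it concretely, here's a minimal sketch of what a scaler fundamentally does: bilinear resampling of a single-channel image, e.g. 1280x720 up to 1920x1080. This is a bare-bones illustration of my own; real scalers add filtering, sharpening, deinterlacing and so on:

[code]
// Minimal bilinear upscaler for a single-channel image. The four-pixel
// blend at the end is exactly where the slight softness/blur comes from.
#include <algorithm>
#include <cstdint>
#include <vector>

std::vector<uint8_t> upscale(const std::vector<uint8_t>& src,
                             int sw, int sh, int dw, int dh) {
    std::vector<uint8_t> dst(size_t(dw) * dh);
    for (int y = 0; y < dh; ++y) {
        for (int x = 0; x < dw; ++x) {
            // Map each destination pixel back into source coordinates.
            float fx = std::max(0.0f, (x + 0.5f) * sw / dw - 0.5f);
            float fy = std::max(0.0f, (y + 0.5f) * sh / dh - 0.5f);
            int x0 = std::min(int(fx), sw - 2);
            int y0 = std::min(int(fy), sh - 2);
            float tx = fx - x0, ty = fy - y0;
            // Weighted blend of the four nearest source pixels.
            float top = src[y0 * sw + x0] * (1 - tx) + src[y0 * sw + x0 + 1] * tx;
            float bot = src[(y0 + 1) * sw + x0] * (1 - tx) + src[(y0 + 1) * sw + x0 + 1] * tx;
            dst[size_t(y) * dw + x] = uint8_t(top * (1 - ty) + bot * ty + 0.5f);
        }
    }
    return dst;
}
// e.g. upscale(frame, 1280, 720, 1920, 1080) for a 720p-to-1080p pass.
[/code]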
 
Generally speaking, if the console doesn't upscale the game, your TV will. In general the console will do a better job at upscaling than most HDTVs.

Good TVs generally have good scalers. Cheap TVs generally skimp on them as part of the cost savings.

Regards,
SB
 
If your TV upscales, it introduces additional latency. As best as I can measure it, my aged but still excellent Dell 2405FPW monitor adds 33ms of latency (two frames on a 60fps camera recording, compared to a CRT) if I feed it a 1080p signal. On top of that, an additional 16ms of latency is added by feeding it a 720p signal that it then has to scale.
 
What about the lag from the console upscaling? Is there any, or is it squeezed out in time for each refresh?
 
I saw no difference in frame rate between CoD4 on 360 running at 720p and 1080p, but I'm willing to bet there's a small overhead.
 
A 720p image will run full screen on a large 1080p set. What is upscaling supposed to do extra for that display? It won't add more pixels. It won't enlarge the image, because it already runs at full screen.

So what the heck is upscaling supposed to do and what is the purpose of it?

A 720p image does not run full screen unless it's scaled to fit the native 1080p screen. Your X360 scales it, or the TV scales it if you uncheck the 1080p option in your Xbox settings. If there were no scaling, you would get large black borders: a 1280x720 image centered on a 1920x1080 panel leaves 320-pixel bars on the left and right and 180-pixel bars on the top and bottom.
 
Are you sure that it's so easy? Bungie is filling the 10 MB of eDRAM with two 1152x640 render buffers. What you're suggesting would require five tiles. I'm not sure that's as trivial as you seem to think. I'm not even sure that MSAA is compatible with the way Bungie blends the buffers.


Yeah this off-shoot got a bit out of hand.

I was talking about running Halo3 on xb3.0.

The context of this whole Halo2&3 720 discussion was for what future console BC might be able to offer and at what cost to whom.

The conclusion was that not even MS decided to give us a higher-resolution-rendered Halo2 via their BC on the xb360 for free. With DLC becoming widespread, I offered a solution to the issue of pubs/devs not wanting to cannibalize new sales with old software: pubs/devs should do the up-rezzed BC code, maybe with a texture pack, and charge a small fee for the DLC that runs it on the NEW console.
 
Sorry to disturb the nice Halo backwards compatibility thread :) Here's something related to the OP:

http://blog.gamedeff.com/?p=235

Brief translation: by dropping D3DX/FX and going to manually assembled precompiled command buffers, this guy achieves a significant increase in draw call throughput, reaching an absolutely astronomical 1 million draw calls/sec. Can't wait to hear if this passes certification...
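
To make the idea concrete, here's a mocked-up sketch of the record-once/replay-many pattern the post describes. Every type and call below is an illustrative stand-in (mocked so it compiles and runs), not a real D3D or Xbox 360 API:

[code]
// Record-once / replay-many, mocked up. The per-call CPU work (validation,
// state translation) is paid while recording; replay is nearly free.
#include <cstdio>
#include <vector>

struct Command { int id; };                       // stand-in for raw GPU packets
struct CommandBuffer { std::vector<Command> cmds; };

struct Device {
    CommandBuffer* rec = nullptr;
    void begin(CommandBuffer* cb) { rec = cb; }
    void end()                    { rec = nullptr; }
    void draw(int id) {                           // expensive if done per frame...
        if (rec) rec->cmds.push_back({id});       // ...but packed exactly once here
    }
    void run(const CommandBuffer& cb) {           // replay the prebuilt packets
        std::printf("replayed %zu prebuilt draws\n", cb.cmds.size());
    }
};

int main() {
    Device dev;
    CommandBuffer cb;
    dev.begin(&cb);
    for (int i = 0; i < 33000; ++i)  // 1M draws/sec works out to ~33k per 30fps frame
        dev.draw(i);
    dev.end();
    for (int frame = 0; frame < 3; ++frame)
        dev.run(cb);                 // per-frame CPU cost is just the replay
    return 0;
}
[/code]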
 
Brief translation: by dropping D3DX/FX and going to manually assembled precompiled command buffers, this guy achieves a significant increase in draw call throughput, reaching an absolutely astronomical 1 million draw calls/sec. Can't wait to hear if this passes certification...

Any idea how much the same would have cost going through DX?
 