More high-res Xbox shots

DemoCoder said:
Why is there a VGA connector shown, but no DVI or HDMI?

Because there is currently no digital support. I.e., it can support analog VGA but none of the digital formats, because the scaler chip takes the digital image from memory and outputs it as analog.

Now WHY they did not bother to make the chip output a digital signal... well, who knows. I know I do not have the answer for that :(

Edit: PS - Outside of game selection, the VGA cable and Live are big features for me right now. I don't own a TV, let alone an HDTV, and my best friend, who lives 2,000 miles away, wants to game online with me. Sounds stupid, but online service/games and simple VGA-out support from day one on 100% of games are going to be important to me.

They're kind of the swing features... both platforms have good games... so it comes down to the online experience and whichever can work in 720p on my 8ms LCDs!
 
As long as your monitor is capable of the res, there shouldn't be a reason why it won't work on all games.
 
jvd said:
As long as your monitor is capable of the res, there shouldn't be a reason why it won't work on all games.

This is what I am worried about.

I have a monitor, let's say a 21" 1600x1200 CRT. That is a 4:3 aspect ratio.

I hook up an Xbox 360... what res will it output? What aspect?

Ideally, for ME, I would want 720p 16:9.

Ditto my 1280x1024 LCD. It has analog inputs... but to look good it would have to auto-letterbox and be 720p (1280x720).

I am curious as to how they plan to implement all of this. Getting 720p and letterboxing is important to me... so knowing what resolutions and formats the VGA cable will support is important.

Yeah, I know it CAN do it... the question is whether MS/devs will make it standard for the VGA cable. I am assuming it takes a little more work to letterbox it; that is why I am worried. A 720p TV is naturally 720p, so no worries. A monitor is 4:3 (or 5:4 for my LCD), so I don't want it stretched! (See the quick letterbox math below.)
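
For what it's worth, here's the kind of letterbox math I mean. This is a purely hypothetical sketch, not anything MS has announced; it assumes the scaler fills the monitor's full width with the 16:9 image and centers it vertically with black bars:

```python
# Letterboxing a 1280x720 (16:9) image onto 4:3 / 5:4 monitors.
# Assumes the scaler fills the full screen width and centers the
# image vertically with black bars (my assumption, not a confirmed spec).
def letterbox(screen_w, screen_h, aspect_w=16, aspect_h=9):
    image_h = screen_w * aspect_h // aspect_w  # height of the 16:9 image
    bar = (screen_h - image_h) // 2            # black bar, top and bottom
    return image_h, bar

for w, h in [(1600, 1200), (1280, 1024)]:
    image_h, bar = letterbox(w, h)
    print(f"{w}x{h}: image area {w}x{image_h}, {bar}px bars top/bottom")

# Output:
# 1600x1200: image area 1600x900, 150px bars top/bottom
# 1280x1024: image area 1280x720, 152px bars top/bottom
```

Note the 1280x1024 case works out to a native, unscaled 1280x720 image with 152-pixel bars, which is exactly why I want letterboxing rather than stretching.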
 
Well, no digital out sucks donkey ass. I have 2 HDTVs, one a DLP RPTV and another an LCD PJ. I own some of the most expensive analog cables you can buy, and the result is that there is still "snow" and "haze" apparent compared to DVI/HDMI. DVI/HDMI is mega-stable. Yes, if you use analog and never compare side by side, you are unlikely to notice the "haze" I am talking about, but trust me, component/VGA does exhibit an instability/noisiness that is apparent when you compare them. It is very subtle.

How the FSCK can Microsoft design a next-generation "era of HD" console, and not have digital out? This, and the amount of eDRAM which can ideally fit 480p completely in the buffer, leads me to conclude that the XB360 was *NOT* designed "for the era of HD", but this was PR b*llsh*t tacked on afterwards, and not an original design requirement of the system.

PS3 will undoubtedly be my primary system, with the XB360 sitting off to the side, since first and foremost it will output a rock-solid digital signal to my displays, and secondly, because I will be watching BRDVDs on it. Microsoft is insane if they think I'm gonna buy WMVHDs on DVDROM or some other proprietary format and watch them through an inferior connection.
 
DemoCoder, the only HDTVs that I own, and that my friends and family own, have component and VGA connectors.

So while a DVI option would be nice, it's still pretty niche.

Although I really don't see why they can't bypass the scaler chip.
 
DemoCoder said:
How the FSCK can Microsoft design a next-generation "era of HD" console, and not have digital out? This, and the amount of eDRAM which can ideally fit 480p completely in the buffer, leads me to conclude that the XB360 was *NOT* designed "for the era of HD", but this was PR b*llsh*t tacked on afterwards, and not an original design requirement of the system.

The 10MB eDRAM is an indication that early on they planned to hit 480p hard... but they had to know that some users had widescreen sets and HDTVs (even the original Xbox supported those).

The tiling solution is obviously in hardware, so they must have reasoned that 1) most people would be at 480p, and 2) since tiling worked with very little penalty at 720p, it would be a fair trade-off for high-end users.

So it would seem that 480p was the original goal. If not, why did they not make the eDRAM big enough to fit a 1080i image in 3 tiles? They would have needed ~10.5MB for that. Based on the Xenos article, it seems 1080i with 4x MSAA needs 4 tiles (although maybe the eDRAM really is 10.5MB?). The rough math is sketched below.
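
To put rough numbers on that, here's a back-of-the-envelope sketch. The 32-bit color + 32-bit Z per sample and the 1080i-as-1920x540-field figures are my assumptions based on the discussion here, not confirmed specs:

```python
# Back-of-the-envelope framebuffer sizes vs. Xenos' 10MB eDRAM.
# Assumes 4 bytes color + 4 bytes Z per sample and 4x MSAA (assumptions,
# not confirmed specs), with 1080i counted as a 1920x540 field.
import math

EDRAM_MB = 10
BYTES_PER_SAMPLE = 4 + 4  # 32-bit color + 32-bit Z

def buffer_mb(w, h, msaa=4):
    return w * h * msaa * BYTES_PER_SAMPLE / (1024 * 1024)

for name, w, h in [("480p", 640, 480), ("720p", 1280, 720), ("1080i field", 1920, 540)]:
    mb = buffer_mb(w, h)
    tiles = math.ceil(mb / EDRAM_MB)
    print(f"{name}: {mb:.2f} MB -> {tiles} tile(s)")

# Output:
# 480p: 9.38 MB -> 1 tile(s)         (fits entirely in 10MB)
# 720p: 28.12 MB -> 3 tile(s)
# 1080i field: 31.64 MB -> 4 tile(s) (3 tiles would need ~10.55MB each)
```

Under those assumptions the numbers line up: 480p with 4x MSAA fits completely in the 10MB, 720p needs 3 tiles, and 1080i needs 4 tiles unless the eDRAM were ~10.5MB.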

As for the output... two theories:

1. As a software company, they were not gonna place any bets on what inputs would be standard. They KNOW that component is standard and works with everything, so why bet the house on HDMI when you have no control over its adoption? Sony obviously has a little more control in this area... and HDMI has really taken off in the last year and had some momentum 2 years ago... but before that?

MS seems to have avoided all gambles with the hardware and kept out as much "unused" stuff as possible. WiFi? Only a percentage will use it, so you gotta pay. HDD? Well, only a percentage of games used it, so you gotta pay. 4 controller ports? Nope, gotta buy expensive wireless controllers. Everyone gets component, so if you want HDMI...

2. You gotta pay... for their Xbox 360 HD-DVD player!

Yeah, I can see it now... HD-DVD (which looks about dead at this point... might as well do BR, MS!) + HDMI.

So 3 SKUs!

Ballmer and Allard have hinted in this direction, so I have to believe the digital out was kept out so it could be put into a more expensive SKU.

So techies who are streaming or who want the best-quality video will have to get the better 360... and, as the PStwo slim shows, early adopters are FREQUENTLY willing to buy a new unit when theirs gets old.

A smaller Xbox 360 with HD-DVD? Yeah, I see it in the cards... and they will make it BLACK o_O
 
I'm still of half a mind to think that we may see two different PS3s also. Perhaps one with a lot of ports missing, and a second with DVI missing.
 
Well, HD-WMVs aren't that bad... well, on LCD monitors they're not... the brightness and contrast on LCDs and such can usually overcome a lot of the problems with HD-WMV... or so I suspect; I know they're OK on PC monitors, dunno how much that changes going to HDTVs. Anyway, HD-WMVs pretty much suck otherwise, but they're not all bad.
 
DVI/HDMI are the standard, period. They have won. You can't bet on everyone having component either. RCA/S-Video are the lowest common denominator.

But Microsoft could have hedged their bets by including a digital output in their multi-output port, just like DVI can carry both analog and digital. It looks like true digital out is IMPOSSIBLE on the XB360 because the GPU is incapable of it. At best, I can see some kind of USB2 hack that dumps the framebuffer over a USB2->DVI converter, but that's dubious. There's no 1394 connector either. Some media center this thing is, since 99% of digital camcorders won't plug into it.

JVD, niche my ass. Every DLP, LCD, PJ, and PDP sold in the last few years has digital input. Your relatives don't own "HDTVs"; they own CRT EDTVs branded as "HD Ready", like Sony XBRs, etc. These sets are incapable of full HDTV resolution (especially horizontally), so the digital inputs are moot.
 
DemoCoder said:
JVD, niche my ass. Every DLP, LCD, PJ, and PDP sold in the last few years has digital input. Your relatives don't own "HDTVs"; they own CRT EDTVs branded as "HD Ready", like Sony XBRs, etc. These sets are incapable of full HDTV resolution (especially horizontally), so the digital inputs are moot.

My uncle has a nice Samsung (I believe it might be a Sony) 40-inch LCD screen that he got from Best Buy last year that does 720p natively and doesn't have DVI or HDMI.

My sister's Sony CRT does 1080i and has VGA and component, no HDMI or DVI.

So you're pretty much wrong.
 
Mefisutoferesu said:
Well, HD-WMVs aren't that bad... well, on LCD monitors they're not... the brightness and contrast on LCDs and such can usually overcome a lot of the problems with HD-WMV... or so I suspect; I know they're OK on PC monitors, dunno how much that changes going to HDTVs. Anyway, HD-WMVs pretty much suck otherwise, but they're not all bad.

Yeah, but are you going to buy your movie collection in a proprietary format that won't play anywhere but your Xbox 360 and Media Center PC? And frankly, in a media format that is doomed from the start. The writing is on the wall: in the content wars, Blu-ray has most of the major movie studios onboard. MGM alone accounts for 4100+ movies.

I speak as an owner of WMVHD discs, namely Terminator 2: Extreme Edition and Step Into Liquid. They make cool demos, but I am not about to build up a huge collection of them. We need a standard, and WMVHD+DVDROM isn't it.
 
DemoCoder said:
It looks like true digital out is IMPOSSIBLE on the XB360 because the GPU is incapable of it.

I know we have been through this before. No point getting upset at this same issue again :D

As for the above snip, technically the GPU cannot output anything to a screen. The scaler chip does this and it is separate from the GPU and can be changed at a later time (this is from the interview with the ATI guy).

There's no 1394 connector either. Some media center this thing is, since 99% of digital camcorders won't plug into it.

Good thing they are not promoting it as a media center then, huh ;)

If you want those features, get a PS3 or a WMC PC, because MS surely is not targeting this as a media center... only as an extender.
 
True... the WMVHD+DVD setup simply isn't practical. I remember watching T2 and having to sit there as the movie unzipped itself or whatever it was, then having to unblock Windows Media Player on my firewall (I don't trust it a single bit) so it could update the DRM. Really, just a pain. Anyway, you're right, Blu-ray will win; it's only a matter of time.
 
DemoCoder said:
Well, no digital out sucks donkey ass. I have 2 HDTVs, one a DLP RPTV and another an LCD PJ. I own some of the most expensive analog cables you can buy, and the result is that there is still "snow" and "haze" apparent compared to DVI/HDMI. DVI/HDMI is mega-stable. Yes, if you use analog and never compare side by side, you are unlikely to notice the "haze" I am talking about, but trust me, component/VGA does exhibit an instability/noisiness that is apparent when you compare them. It is very subtle.

How the FSCK can Microsoft design a next-generation "era of HD" console, and not have digital out?
If it is "very subtle," maybe you just answered your own question. I'm certainly not knowledgeable on the topic, but I did find this. Maybe MS research found similar results:
http://forum.ecoustics.com/bbs/messages/34579/122868.html

Actually, I might go so far as to say that people having errant brightness/contrast/etc. settings and various other poor settings on their TVs are a much bigger concern than component vs. HDMI.
 
jvd said:
My uncle has a nice Samsung (I believe it might be a Sony) 40-inch LCD screen that he got from Best Buy last year that does 720p natively and doesn't have DVI or HDMI.
Model # please. The vast majority of LCD/DLP/PDP sets sold since 2002 have DVI. HDTV sales have been doubling everywhere. There are an estimated 16 million DTVs now, and 10 million will be sold this year alone. That means the 10 million DVI/HDMI-equipped HDTVs sold this year, plus the 5 million from last year, easily put over 50% of the HDTV market on DVI/HDMI by November (quick math below).
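
A quick sanity check on that share claim. All of these figures are the rough estimates quoted above, not verified sales data:

```python
# Rough market-share arithmetic using the estimates quoted above
# (assumed figures, not verified sales data).
existing_dtvs = 16_000_000  # estimated installed base of DTVs now
sold_2005 = 10_000_000      # sets expected to sell this year, with DVI/HDMI
dvi_2004 = 5_000_000        # last year's DVI/HDMI-equipped sets

share = (sold_2005 + dvi_2004) / (existing_dtvs + sold_2005)
print(f"DVI/HDMI share by end of 2005: {share:.0%}")  # -> 58%
```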

My sister's Sony CRT does 1080i and has VGA and component, no HDMI or DVI.

So what. My DLP will take a 1080i signal as well, even though it is 1280x720. Even the best Sony CRT "HD" TVs can't display more than about 1400 pixels horizontally due to inherent limits in the number of phosphor triads, period. Thus, 1080i as displayed on a Sony CRT is more like 1400x700i, *if that*. And most HD CRTs fall far short of that (800-1000 pixels horizontally).


So you're pretty much wrong.

Or, you pretty much didn't understand my point, and as usual, you don't know what the hell you are talking about.
 
I'm not going to bother responding, given that "I'm the greatest ever" attitude you seem to have even when others are telling you that you're wrong. Paying for fancy equipment doesn't automatically make you an expert on the subject.

Welcome to the block list
 
You're the one who got snippy with "you're pretty much wrong", when in fact YOU are the one who is wrong. CRT sets are limited in the resolution they can resolve. They can scan many horizontal frequencies, just like digital displays can rescale the image, but they still have limits. Analog != infinite resolution. Sony's published data shows that the max claimed horizontal resolution on their best CRT sets is 1400. AVSFORUM.COM keeps track of HD CRT resolvable resolution. 1080i has 1920 pixels horizontally; ergo, your family's Sony CRT can't fully display 1080i. It can *ACCEPT* a 1080i signal, but accepting a signal means squat. EDTV PDPs that are 854x480 accept 720p and 1080i component and DVI signals and "display" them.

Why do you feel compelled to butt into every thread and comment on areas that you don't know about, telling other people that they are wrong, when 5 minutes of Googling would have told you your statement was wrong?
 
Inane_Dork said:
If it is "very subtle," maybe you just answered your own question. I'm certainly not knowledgeable on the topic, but I did find this. Maybe MS research found similar results:
http://forum.ecoustics.com/bbs/messages/34579/122868.html

Well, the noise is like aliasing. Once you notice it, you will forever be annoyed by it. Just like once you know what a good contrast ratio and good blacks look like, you will be annoyed by washed out displays. And man oh man does SDTV look blurry compared to HDTV when you switch back and forth between HD and SD broadcasts.

DVI/HDMI are not meant for running 50 feet of cable, but if your DVD player/console is 50-200 feet from your TV, you've got other problems. My projector happens to be 30 feet from my equipment. I ran a $200 HDMI cable 30 feet and got flickery pixels (dropout errors), which were even more annoying than the "noise" in component. I fixed it by running HDMI over ethernet using HDMI->ethernet/ethernet->HDMI converter boxes. I can run the video 1000 feet now if I want. But most people don't have 120" screens and projectors 23 feet behind their A/V rack.


Actually, I might go so far as to say that people having errant brightness/contrast/etc. settings and various other poor settings on their TVs are a much bigger concern than component vs. HDMI.

Well, calibration is another issue. I can't stand miscalibrated displays or noise in the image. The first thing I do when getting a new display is load up AVIA/DVE and break out a light meter.
 