Post-XMAS HD DVD Sales Surge: #8 ranked HD DVD surpasses #1 Blu-Ray in sales

No, from what I understand, you haven't just got a series of flipping pictures when you are talking about the VC-1 or AVC codecs. Both employ motion compensation (temporal compression) with variable block sizes. To put it simply, the parts of the picture that change slowly do not need to be updated so frequently, but need high resolution. The parts that change quickly need to be updated frequently, but don't need such high resolution because you can't see the detail anyway due to motion blur. The VC-1 and AVC codecs interpolate at intervals between the encoded frame data, and use variable block sizes to achieve this, which is how they get roughly double the compression of MPEG-2 and why they require so much more processing power to decode. If you have 1080p encoded data on a disc, convert that to 1080i and then back to 1080p, you will lose something on the conversion with moving pictures, because the compression is not lossless and the motion compensation computations are quite intensive and difficult to do, although on a static picture, as Mintmaster said, you would not lose anything.
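To make the motion-compensation idea concrete, here is a toy block-matching search in Python. It is only my own illustration of the principle (real VC-1/AVC encoders add variable block sizes, sub-pixel precision and much smarter search strategies):

import numpy as np

def best_motion_vector(ref, cur, bx, by, bsize=16, search=8):
    # Find the offset into the reference frame that best predicts one
    # block of the current frame (minimum sum of absolute differences).
    # The encoder then stores just this vector plus a small residual
    # instead of the raw pixels: that is the temporal compression.
    block = cur[by:by + bsize, bx:bx + bsize].astype(int)
    best, best_sad = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bsize > ref.shape[0] or x + bsize > ref.shape[1]:
                continue
            sad = int(np.abs(ref[y:y + bsize, x:x + bsize].astype(int) - block).sum())
            if sad < best_sad:
                best_sad, best = sad, (dx, dy)
    return best, best_sad

Even this exhaustive search over a mere 17x17 window for every 16x16 block hints at why encoding is so expensive; the decoder only has to apply the vectors, but at 1080p that is still a lot of blocks every frame.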

Of course I am not an expert on this, but this is what my reading on the Internet seems to suggest.

But the player still resolves the VC-1 or AVC stream to 24 frames every second (the original movie frames).

These frames are then transmitted to the display raw. For 1080p they are transmitted as frames, one line after the other. For 1080i they are transmitted as interlaced fields. But the image is not "converted"; the player just scans out the even lines in the first field and the odd lines in the next field. Since the cadence of 1080i is 60Hz, the player employs 3:2 pulldown and sends the odd frames as 3 fields (with the 3rd field being completely redundant) and the even frames as two fields.

The display panel takes the 3 fields of the odd frames, de-interlaces (merges) the first two fields and discards the 3rd. Even frames are just merged from two fields.

In essence it is just swizzling of data prior to and after transmission, but the data is all there.
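As a toy sketch of that swizzling (Python; the (frame, parity) tags are just labels for illustration, not real pixel data):

def pulldown_32(frames):
    # Convert a 24 fps frame sequence into a 60 Hz field sequence via
    # 3:2 pulldown: frames alternately contribute 3 and 2 fields, and
    # field parity (top/bottom) keeps alternating across frames.
    fields, top = [], True
    for i, frame in enumerate(frames):
        for _ in range(3 if i % 2 == 0 else 2):
            fields.append((frame, "top" if top else "bottom"))
            top = not top
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# 10 fields from 4 frames, i.e. 24 frames/s becomes 60 fields/s; the
# third field of each "3" frame repeats data that was already sent.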

1080p makes a difference in games where you have twice the video bandwidth (twice the line-frequency).

- And could make a difference if we ever see 1080p60 material on disc, but that requires a complete change of the tool chain in the movie industry and is probably quite a while off.

Cheers
 
The links in my previous post above also seem to confirm Microsoft's position that 720p gives a better picture quality than 1080i.

720p@60 is better than 1080i@30. However, 720p@24 or 720p@30 is not necessarily better than 1080i@30.
 
It sounds as though MS are stepping up to the plate in terms of pushing HD DVD a little more directly now:

http://www.dvdtown.com/news/microsoftcontinuestosupporthddvd/4257

Microsoft used the HD-DVD press event at this year's CES to make it clear that they are fully behind HD-DVD.

They will continue to sell the XBOX 360 HD-DVD add-on drive for $199.99. They will also continue to include the remote and "King Kong." They also said they had sold 10.4 million consoles worldwide to date.

Also at CES today, Microsoft and Broadcom Corp. announced a joint effort to support a hardware and software reference design for more cost-efficient HD DVD playback. The new platform uses Microsoft Windows CE 6.0 and Broadcom's BCM7440 system-on-chip solution, allowing consumer electronics manufacturers, original design manufacturers and systems integrators to more easily and affordably deliver HD DVD playback. Several of the more innovative consumer electronics companies plan to use this new hardware and software platform to speed the production of HD DVD players, including Lite-On IT Corp., one of the most experienced systems integrators backing high-volume consumer electronics manufacturers, and Zhenjiang Jiangkui Group Co. Ltd./ED Digital, one of the largest, high-volume manufacturers of DVD players in China.

In addition, Microsoft is working with Meridian Audio Ltd. to build high-end HD DVD players for the enthusiast market.
 
Yeah, I saw that (see the PR forum). Tho wtf they were thinking, burying that Broadcom bit so far down the PR, is beyond me.
 
It seems BR is getting pretty close to HD DVD now on amazon... the last couple of days have had the top BR title ahead of the top HD DVD title, whenever I've looked at least (the rest of the top 10 and 100 were more in favor of HD DVD, but the gap is closing rather quickly).

With recent title announcements and the general lack of exclusive titles on HD DVD announced at CES, it looks like '07 is BR's for the taking. BDA really does need to get a cheaper player out... PS3 is rather awesome as a player, but not everyone wants something like that in their HT.
 
720p@60 is better than 1080i@30. However, 720p@24 or 720p@30 is not necessarily better than 1080i@30.

The main divergence in the US is between 720p @ 60 and 1080i @ 60 (which is basically 1080p @ 30), no? I'd think ABC/ESPN went with 720p b/c they figured 60fps was more important for sports.

That doesn't stop 1080i from looking spectacular on a "720p" plasma, though it can be jerky. Dunno if that's a 3:2 pull-down artifact or something else, as I don't know DiscoveryHD's source.
 
The main divergence in the US is between 720p @ 60 and 1080i @ 60 (which is basically 1080p @ 30), no? I'd think ABC/ESPN went with 720p b/c they figured 60fps was more important for sports.

That doesn't stop 1080i from looking spectacular on a "720p" plasma, though it can be jerky. Dunno if that's a 3:2 pull-down artifact or something else, as I don't know DiscoveryHD's source.

I have been looking at what type of HDTV I should buy for the last week or two. Here is what I have arrived at (although it may not necessarily all be correct).

Interlaced HD standards like 1080i are really intended for CRTs, and I have seen some articles questioning whether they should really have been introduced at a time when everything is moving from CRT to LCD/plasma for display and to CCD for cameras. I think the difference is small, but 1080p is definitely better for technical reasons.

There are three separate issues here.
1) 1080p HDTVs vs 1080i HDTVs:
1080p-capable HDTVs will have definitely better image quality than 1080i HDTVs (when playing movies). But you probably need a big screen and fast motion to see it. The reason is that movies are stored as 1080p/24 (film movies are captured at 24 frames per sec) and are converted to 1080i/60 using 3:2 pulldown (http://en.wikipedia.org/wiki/Telecine) if your TV can't do it. Converting 1080p/24 to 1080i/60 requires 2 film frames to be mapped to 5 fields (3:2 pulldown). Doing this gives a different weighting (2:3) to the odd and even frames, which creates flicker artifacts in areas that are changing between frames. Also, the intermediate frames are created by interlacing two 1080p frames which are 1/24 of a second out of sync. This can create aliasing artifacts in areas that are changing rapidly. 1080p HDTVs are supposed to cost more than 1080i HDTVs, although I am not sure why they should, since the only extra cost is the upscaler to convert first-gen HD-DVD player 1080i output to 1080p.

2) 1080p movie player output (like the PS3) vs 1080i movie player output (like the Toshiba HD-DVD player): As Mintmaster pointed out, inverse telecine can be used to perfectly re-create the original (http://en.wikipedia.org/wiki/Telecine#Reverse_telecine_.28a.k.a._IVTC.2Finverse_telecine.29). If your 1080p HDTV is able to convert 1080i to 1080p (as all 1080p HDTVs should), then you would get exactly the same picture from the Toshiba movie player as from the PS3. However, there can be problems with this. To convert 1080i to 1080p, the TV has to detect which are the even and odd fields, drop the interlaced frame created by merging two 1080p frames, and output the original 1080p at 24 frames per second. I am not sure why any movie player should not also output 1080p if that is the native storage format, but the first-gen HD-DVD players all seem to output only 1080i, while the second-generation ones output 1080p.
http://en.wikipedia.org/wiki/Deinterlacing
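A minimal sketch of that inverse telecine step, assuming the fields still carry (source frame, parity) tags; a real player has to detect which fields belong together by comparing pixel data, which is exactly the even/odd detection problem mentioned above:

def inverse_telecine(fields):
    # Group consecutive fields that came from the same film frame,
    # weave the first top/bottom pair back into a progressive frame,
    # and drop the redundant third field of every "3" frame.
    frames, i = [], 0
    while i < len(fields):
        src, j = fields[i][0], i
        while j < len(fields) and fields[j][0] == src:
            j += 1  # consume the 2 or 3 fields of this source frame
        frames.append(src)
        i = j
    return frames

cadence = [("A", "t"), ("A", "b"), ("A", "t"), ("B", "b"), ("B", "t"),
           ("C", "b"), ("C", "t"), ("C", "b"), ("D", "t"), ("D", "b")]
print(inverse_telecine(cadence))  # ['A', 'B', 'C', 'D'], a 24 fps sequence again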

3) 1080i vs 1080p in games and TV output.
All HD broadcasts at the moment seem to be 1080i/60 or /50 (30 or 25 interlaced frames per sec). Converting from 1080i to 1080p as in movie players probably isn't possible for broadcasts, because it shouldn't be done anyway for video camera broadcasts scanned natively in 1080i format. On top of that, some sources say that video editing can sometimes switch the even and odd fields in 1080i, leading to faulty detection of even/odd fields and reducing image quality. If it is possible to deinterlace 1080i movie broadcasts to get 1080p, then a 1080p TV would give you better image quality, as explained in 1) above.

Also, I think 1080p/30 or /25 and 1080p/60 or /50 will eventually come. The only reason I can see for not doing 1080p/30 or /25 now is that the broadcasters use cameras that output 1080i/60 or /50 (30 or 25 frames per sec). The broadcast bandwidth would be the same, and 1080p can easily be converted to 1080i in a set-top box for those who have a 1080i-only TV.

For games, I don't think 720p, 1080i, or 1080p matters, since game image quality is less than perfect anyway.


Personally I have decided to buy on the "more the merrier" principle. An HDTV supporting 1080p supports more standards and is more future-proof, and I can't see why it should cost a whole lot more than 1080i LCD and plasma TVs, since they have the same number of pixels, unless 1080i is cheaper in order to offload old stock. I am going to wait for prices to drop a bit before buying. I have also decided on the PS3 as a movie player. People have complained that the first-gen Toshiba player is very slow to start up, whereas the PS3 is fast. The second-gen Toshiba player rectifies this, but is quite a bit more expensive than the PS3. Also, I am not sure if HD-DVD will survive as a format in the long term. The same may be true for the PS3, but you get a games machine and a computer for the money even if BD fails. Again, I will wait for the price drop with the 65nm fab before buying.
 
If you care about HD DVD or BR for the long run, then get a 1080p display that can accept a 1080p/24 input. This will give you the "ideal" image.

1080p in itself doesn't guarantee anything, though. You can get a Westinghouse 1080p set, but a Pioneer at only 720p/1080i will still have a better picture. There is a lot more to PQ than resolution. Broadcast quality leaves much to be desired and 1080p native content isn't coming anytime soon, so outside of PC use the only thing 1080p is good for is hi-def DVDs; you might as well get the best picture possible there. As you pointed out, games aren't exactly perfect, so 720p/1080i/1080p is all a crapshoot and does not guarantee a great game. Hell, 480p video content looks much better than 1080p games :)
 
I have been looking at what type of HDTV I should buy for the last week or two. Here is what I have arrived at (although it may not necessarily all be correct).

Interlaced HD standards like 1080i are really intended for CRTs, and I have seen some articles questioning whether they should really have been introduced at a time when everything is moving from CRT to LCD/plasma for display and to CCD for cameras. I think the difference is small, but 1080p is definitely better for technical reasons.

There are three separate issues here.
1) 1080p HDTVs vs 1080i HDTVs:
1080p-capable HDTVs will have definitely better image quality than 1080i HDTVs (when playing movies). But you probably need a big screen and fast motion to see it. The reason is that movies are stored as 1080p/24 (film movies are captured at 24 frames per sec) and are converted to 1080i/60 using 3:2 pulldown (http://en.wikipedia.org/wiki/Telecine) if your TV can't do it. Converting 1080p/24 to 1080i/60 requires 2 film frames to be mapped to 5 fields (3:2 pulldown). Doing this gives a different weighting (2:3) to the odd and even frames, which creates flicker artifacts in areas that are changing between frames. Also, the intermediate frames are created by interlacing two 1080p frames which are 1/24 of a second out of sync. This can create aliasing artifacts in areas that are changing rapidly. 1080p HDTVs are supposed to cost more than 1080i HDTVs, although I am not sure why they should, since the only extra cost is the upscaler to convert first-gen HD-DVD player 1080i output to 1080p.

Repetition: For movies it's a non-issue.

First off, 3:2 pulldown, if untreated, causes judder, not flicker. Judder is the uneven cadence of frames: one frame is shown for 33.3 ms, the next for 50 ms, where each should be shown for 41.7 ms (1/24th of a second). Note that you'd need to do 3:2 pulldown on a 1080p60 set too; the frames just aren't telecined into fields in this scenario.
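The arithmetic behind those numbers, as a quick check:

field_ms = 1000 / 60     # one 60 Hz field lasts 16.7 ms
print(3 * field_ms)      # a "3" frame is held for 50.0 ms
print(2 * field_ms)      # a "2" frame is held for 33.3 ms
print(1000 / 24)         # each frame should get 41.7 ms: hence judder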

That is, if it is left untreated. However, basically all modern TVs offer anti-3:2-pulldown modes.

Sharp calls this TruD
Pioneer calls it Pure Cinema
Sony calls it Cinemotion
Panasonic calls it Progressive Cinema Scan
Samsung calls it Cinema Smooth

It's a solved problem. Of course, most modern sets also accept 1280x720p24 and 1920x1080p24 input too, which makes the point completely moot.

3) 1080i vs 1080p in games and TV output.
All HD broadcasts at the moment seem to be 1080i/60 or /50 (30 or 25 interlaced frames per sec). Converting from 1080i to 1080p as in movie players probably isn't possible for broadcasts, because it shouldn't be done anyway for video camera broadcasts scanned natively in 1080i format.

I'm pretty sure most digital HD cameras can output progressive scan. The reason broadcasters don't broadcast in 1080p is that it uses twice the bandwidth (and hence twice the cost); 1280x720p60 uses roughly the same bandwidth as 1920x1080i60.
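A quick back-of-the-envelope check of the raw pixel rates (uncompressed, blanking ignored):

print(1280 * 720 * 60)        # 720p60:  55.3 Mpixels/s
print(1920 * 1080 * 60 // 2)  # 1080i60: 62.2 Mpixels/s (a field is half a frame)
print(1920 * 1080 * 30)       # 1080p30: 62.2 Mpixels/s (same as 1080i60)
print(1920 * 1080 * 60)       # 1080p60: 124.4 Mpixels/s (twice 1080i60)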

For games, I don't think 720p, 1080i, or 1080p matters, since game image quality is less than perfect anyway.
Since this is the only source for 60Hz material right now, this is the only place 1080p makes a difference.

Personally I have decided to buy on the "more the merrier" principle. An HDTV supporting 1080p supports more standards and is more future-proof, and I can't see why it should cost a whole lot more than 1080i LCD and plasma TVs, since they have the same number of pixels, unless 1080i is cheaper in order to offload old stock. I am going to wait for prices to drop a bit before buying. I have also decided on the PS3 as a movie player. People have complained that the first-gen Toshiba player is very slow to start up, whereas the PS3 is fast. The second-gen Toshiba player rectifies this, but is quite a bit more expensive than the PS3. Also, I am not sure if HD-DVD will survive as a format in the long term. The same may be true for the PS3, but you get a games machine and a computer for the money even if BD fails. Again, I will wait for the price drop with the 65nm fab before buying.

Good luck on your purchase. You should set your target now, on features and price, because there'll always be something better coming out in 3 months :)

Cheers
 
Repetition: For movies it's a non-issue.

It is a non-issue for movies if you have a 1080p TV set.

First off, 3:2 pulldown, if untreated, causes judder, not flicker. Judder is the uneven cadence of frames: one frame is shown for 33.3 ms, the next for 50 ms, where each should be shown for 41.7 ms (1/24th of a second).

Yes, that is what I meant: telecine judder is the technically correct term for what I described, e.g. a receding edge between a dark and a light area flickering at 15 Hz when a high-def movie player is connected to a 1080i display, because of the uneven cadence of the frames.

Note that you'd need to do 3:2 pulldown on a 1080p60 set too; the frames just aren't telecined into fields in this scenario.

No, you never need 3:2 pulldown (telecine) on a 1080p TV set (you can do inverse telecine to convert 1080i to 1080p and correct the telecine artifacts, though). This is because 1080p (LCD or plasma) TV sets match the frequencies, including 1080p/24. 1080i, on the other hand, is intended for CRT TV sets, and CRT TVs are not able to alter frequencies as well, so you are stuck with 1080i/60, which has to be converted from other formats using 3:2 pulldown with a consequent loss of image quality.
http://en.wikipedia.org/wiki/High-definition_television

Standard frame or field rates

* 23.976p (allows easy conversion to NTSC)
* 24p (cinematic film)
* 25p (PAL, SECAM DTV progressive material)
* 30p (NTSC DTV progressive material)
* 50p (PAL, SECAM DTV progressive material)
* 60p (NTSC DTV progressive material)
* 50i (PAL & SECAM)
* 60i (NTSC, PAL-M)

That is, if it is left untreated. However, basically all modern TVs offer anti-3:2-pulldown modes.

A 1080p TV should be able to display 1080p input natively and won't need 3:2 pulldown. Only a 1080p TV would be able to reverse-telecine the 1080i output and correct the telecine artifacts to display it as 1080p/24 (if the original source was 1080p/24). A 1080i TV would just have to display 1080i/30, telecine artifacts and all.

It's a solved problem. Of course, most modern sets also accept 1280x720p24 and 1920x1080p24 input too, which makes the point completely moot.

Solved if you have a 1080p set. Not solved if you have a 1080i set, since you will get telecine artifacts on it (unless your 1080i TV is in fact a 1080p/24 TV without a 1080p input).

I'm pretty sure most digital HD cameras can output progressive scan. The reason broadcasters don't broadcast in 1080p is that it uses twice the bandwidth (and hence twice the cost); 1280x720p60 uses roughly the same bandwidth as 1920x1080i60.

No, the frequency usually refers to the scan rate (except in Europe, where it usually refers to the frame rate). 1080p/30 gives you 30 full frames per second. 1080i/60 gives you 30 frames per second in two interlaced scans, each carrying half the information of a full frame, at 60 Hz. Therefore the bandwidth of 1080p/30 and 1080i/60 is the same (actually 1080p/30 is slightly better because the compression works slightly better). 1080p frames are for LCD/plasma TVs, which have persistent pixels, so a 60 Hz interlaced scan rate is not needed.

Since this is the only source for 60Hz material right now, this is the only place 1080p makes a difference.

1080p/60 is for future compatibility. If you are spending $1000+ on an HDTV you surely want it to last 5+ years, technology-wise.

Good luck on your purchase. You should set your target now, on features and price, because there'll always be something better coming out in 3 months :)

True, but for an LCD/plasma TV I don't see why 1080i should cost less than 1080p, since no additional circuitry is required to make LCD/plasma displays persistent. 1080i is for CRT TVs and so is the tail end of the last-gen HDTVs, so my base spec would be a 1080p set.
 
Personally I have decided to buy on the "more the merrier" principle. An HDTV supporting 1080p supports more standards and is more future-proof, and I can't see why it should cost a whole lot more than a 1080i LCD and plasma TVs since they have the same number of pixels, unless 1080i is cheaper in order to offload old stock.
Normally it's a good principle, but I'd be more concerned with quality than quantity of pixels. I have a hard time discerning a difference in resolution b/w "720p" and 1080p sets at a CircuitCity when they're all multiplexing a satellite HD source, but I'm guessing the difference will be more evident in a more controlled home environment and especially with a less compressed high-def source.

But there's always a tradeoff, and I'm suspicious that 1080p might be too many pixels to process as well as 720p. I'm probably wrong, at least by now. I saw l-b's Sony V2500 thread, and the good reviews it's garnered. I may have an incorrect suspicion from the cheap Westinghouse 1080p days; I'm sure Sony brings better pixel massaging to the table, but I still think 1080p's a lot to massage in real-time.

I'm not sure what you mean by a 1080i LCD or plasma, as I don't believe there's such a thing, unless you're referring to 1080p sets that only accept 1080i inputs (if such a thing even exists).

As for buying an HDTV, you've just got to bite the bullet and accept that your TV will cost 50% less in a year. :) There were some great Black Friday deals, though, with the Pana 42" plasma going for ~$1100, IIRC. I think even the Sony V2500 has dropped considerably in just a few months. I'm sure the 100-120Hz 1080p LCDs are just around the corner. :devilish:
 
It is a non-issue for movies if you have a 1080p TV set.

You keep confusing this. The 1080i and 1080p designations only describe what type of transmission the TV can accept as input; how the display panel drives/displays the frames depends entirely on the internals. LCDs and PDPs both drive the panel progressively. CRTs mostly do double scan (100Hz TVs in PAL-land), which requires sophisticated assembly of fields into frames that are then displayed interlaced at twice the original scan rate.

No, you never need 3:2 pulldown (telecine) on a 1080p TV set (you can do inverse telecine to convert 1080i to 1080p and correct the telecine artifacts, though). This is because 1080p (LCD or plasma) TV sets match the frequencies, including 1080p/24. 1080i, on the other hand, is intended for CRT TV sets, and CRT TVs are not able to alter frequencies as well, so you are stuck with 1080i/60, which has to be converted from other formats using 3:2 pulldown with a consequent loss of image quality.

Well, if your TV only supports 1080p @ 60Hz you bloody well need pulldown. But as I said in my post above (which you ignored), it's a moot point, since most TVs today which accept 1080i (60Hz) input also accept 1080p24 (which is what you meant, I think), because the pixel bandwidth needed is lower than for 1080i60 and it is thus easily implemented in 1080i sets.


A 1080p TV should be able to display 1080p input natively and won't need 3:2 pulldown. Only a 1080p TV would be able to reverse-telecine the 1080i output and correct the telecine artifacts to display it as 1080p/24 (if the original source was 1080p/24). A 1080i TV would just have to display 1080i/30, telecine artifacts and all.

Again, you fail to grasp the difference between transmission of video and displaying of video. 1080i and 1080p refers to how the video is fed to the display, not how the display operates internally.

No, the frequency usually refers to the scan rate

For ancient CRTs perhaps, but all modern TVs (even CRTs) have the driving of the display decoupled from the actual input.

1080p/60 is for future compatibility. If you are spending $1000+ on an HDTV you surely want it to last 5+ years, technology-wise.

Fine, if you're going to run games in 1080p on your PS3/360 or hook an HTPC up to it. For movies it is completely unnecessary.

True, but for an LCD/plasma TV I don't see why 1080i should cost less than 1080p, since no additional circuitry is required to make LCD/plasma displays persistent. 1080i is for CRT TVs and so is the tail end of the last-gen HDTVs, so my base spec would be a 1080p set.
It costs less because the video processing circuitry of a 1080i set has to do half the work of a 1080p set (62.2 Mpixels/s vs 124.4 Mpixels/s). This price differential is enhanced by mass-market mechanics: since 1080i sets are cheaper, and 1080p is useless for >90% of people buying an HDTV set (all broadcasters transmit in 720p or 1080i, and movies don't need it), 1080i is what sells the most and hence enjoys an economies-of-scale advantage.

Of course with the progression of video processing electronics this price differential will diminish.

Cheers
 
It costs less because the video processing circuitry of a 1080i set has to do half the work of a 1080p set (62.2 Mpixels/s vs 124.4 Mpixels/s). This price differential is enhanced by mass-market mechanics: since 1080i sets are cheaper, and 1080p is useless for >90% of people buying an HDTV set (all broadcasters transmit in 720p or 1080i, and movies don't need it), 1080i is what sells the most and hence enjoys an economies-of-scale advantage.

Of course with the progression of video processing electronics this price differential will diminish.

Cheers

Don't your arguments only hold up if you have a TV that has a 1920x1080-pixel screen but nevertheless only supports a 1080i input signal? Those, I am sure, are definitely in the minority and do not benefit from 'economies of scale'. Most sets will be 720p and accept a 1080i signal but then scale it to 720p. That's quite a difference from a 1080p set that accepts 1080p natively or upscales a 1080i signal.

Or am I missing something?

I have just recently bought a 720p TV (well, 768 lines, since I live in Europe), and my general principle is that I spend a max of 1000 euros on a TV (give or take a bit of inflation), a strategy I started with my previous TV. But I do have to say that if I hadn't needed only a 32" TV (40" would be overkill at the 2.5m we sit from the TV), I would have preferred to wait until 1080p TVs are affordable, because I do start noticing the difference between a 1080p and a 720p TV from 40" and up. Sure, not for every kind of signal, but for enough of them. ;)
 
Don't your arguments only hold up if you have a TV that has a 1920x1080-pixel screen but nevertheless only supports a 1080i input signal? Those, I am sure, are definitely in the minority and do not benefit from 'economies of scale'. Most sets will be 720p and accept a 1080i signal but then scale it to 720p. That's quite a difference from a 1080p set that accepts 1080p natively or upscales a 1080i signal.

Or am I missing something?

The whole discussion was centered around getting the right cadence of movie frames (24Hz material) on a TV regardless of stream mode (1080i or 1080p). This is really disjoint from the resolution of the display itself; scaling happens after inverse 3:2 pulldown, as the sketch below shows.
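In sketch form (Python, with made-up placeholder functions; this shows only the ordering, not any particular TV's pipeline):

def ivtc(fields):
    # Placeholder for inverse 3:2 pulldown (cadence detection + weave):
    # 10 fields of 3:2 cadence collapse back into 4 film frames.
    return ["frame"] * (len(fields) * 2 // 5)

def scale(frame, res):
    # Placeholder for the set's scaler (bicubic resampling or similar).
    return (frame, res)

def display_pipeline(fields, panel_res=(1366, 768)):
    frames = ivtc(fields)                         # 1. recover the 24 Hz cadence
    return [scale(f, panel_res) for f in frames]  # 2. then scale to the panel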

I have just recently bought a 720p TV (well, 768 lines, since I live in Europe), and my general principle is that I spend a max of 1000 euros on a TV (give or take a bit of inflation), a strategy I started with my previous TV. But I do have to say that if I hadn't needed only a 32" TV (40" would be overkill at the 2.5m we sit from the TV), I would have preferred to wait until 1080p TVs are affordable, because I do start noticing the difference between a 1080p and a 720p TV from 40" and up. Sure, not for every kind of signal, but for enough of them. ;)

That's really odd. I got a 50" PDP (1366x768); my viewing distance is 3-4 meters (depends on who gets the good chair). Three meters is still beyond the watershed mark for making out individual pixels (which is at approximately one arc-minute per pixel).
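For what it's worth, here is that one-arc-minute rule worked out (my own back-of-the-envelope helper, assuming a 16:9 panel):

import math

def max_resolvable_distance_m(diagonal_in, h_pixels):
    # Distance beyond which a single pixel subtends less than one
    # arc-minute, i.e. individual pixels can no longer be made out.
    width_m = diagonal_in * 0.0254 * (16 / math.hypot(16, 9))
    pixel_m = width_m / h_pixels
    return pixel_m / math.radians(1 / 60)  # small-angle approximation

print(max_resolvable_distance_m(50, 1366))  # ~2.8 m for a 50" 1366x768 PDP
print(max_resolvable_distance_m(40, 1920))  # ~1.6 m for a 40" 1080p LCD

So at three meters a 50" 768-line panel is indeed past the point where the extra pixels of 1080p would be visible.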

A friend of mine bought a Sony KDL-40V2000 (40" 1920x1080 LCD), which is what convinced me to go with the Pioneer PDP. Colours, contrast (blacks!) and motion are better on the PDP.

He sits closer to his TV though and uses it with a HTPC (as essentially a really big monitor, for normal work), so makes perfect sense for him to go with LCD and the higher res.

Depends on use I'd say.

Cheers
 
http://www.eproductwars.com/dvd/

Just a day or so after Christmas, HD DVD has experienced a sales surge on Amazon.com, with its #8-ranked title having a better sales rank than the #1-ranked Blu-Ray title.
<...>
It might be over soon. Same link.
Currently BR's #3 out-ranks HD-DVD's #1. The average of the top ten ranks, whatever its relevance, is also in Blu-Ray's favor. While Amazon sales ranks are not an absolute metric, the numbers for Blu-Ray have grown considerably since pre-Xmas versus rather constant HD DVD statistics.
 
It might be over soon. Same link.
Currently BR's #3 out-ranks HD-DVD's #1. The average of the top ten ranks, whatever its relevance, is also in Blu-Ray's favor. While Amazon sales ranks are not an absolute metric, the numbers for Blu-Ray have grown considerably since pre-Xmas versus rather constant HD DVD statistics.

You're putting far too much faith in those graphs. Today it shows blu-ray outselling HD-DVD; yesterday it was the opposite.

Nothing is going to be over for a while yet.
 
The new pre-orders and new releases will fluctuate those numbers a fair bit now that the "war" is close. For the record, I pay zero attention to those numbers, since I have yet to buy a title from amazon. All of mine come from B&M stores, for both formats.

I believe Videoscan, which tracks retailers, gets the better all-around numbers and ratios. Some insiders generally have access to those. If I come across them, I'll update this thread.
 
If I'm in the BR camp, I'd be moderately concerned that we're nearly to February and they haven't opened a clear and rising separation from HD DVD yet in those numbers. "Being ahead" isn't nearly enough to deliver a death blow.
 
If I'm in the BR camp, I'd be moderately concerned that we're nearly to February and they haven't opened a clear and rising separation from HD DVD yet in those numbers. "Being ahead" isn't nearly enough to deliver a death blow.

I dunno, if I was in the BR camp and I saw the progress BR has made over the last two months, I think I'd be pretty happy.

Once the major titles come out over the next few months, I think you'll start to see the gap widen (since HD-DVD simply isn't realistically going to be able to compete in title volume with only Universal backing them exclusively). Right now BR has gone from being sucky in sales to matching HD-DVD, and it's staying up there too. It'll take a couple more months, I think, for BR to noticeably take the lead, position-wise, but it looks like it will (especially going by the amazon numbers; as a few have noted here already, it's easy to move sales-rank positions by quite a bit when you're in the thousands but hard when you're <500, so it may well be that the difference in sales between #2k and #1k is smaller than the difference between #100 and #200).

At DVD Empire, it's gone from HD-DVD taking quite the majority to BR taking nearly 60% of the sales weekly (and apparently growing a bit each week).

If BR hasn't pulled away noticeably over the next couple of months (especially after March), then I'd begin to wonder if we'll see a resolution to this war any time soon (like at all over the next several years). It also depends on what happens at '08 CES: if studios stay exclusive or go neutral on either side, that may make the war continue indefinitely or quicken its end. Regardless, this war is still 2 years from a conclusion at the soonest, even if one side gains quite a bit over the other this year.
 
You're putting far too much faith in those graphs. Today it shows blu-ray outselling HD-DVD; yesterday it was the opposite.

Nothing is going to be over for a while yet.
The source and data were apparently good enough to get a lengthy thread going when they still spoke clearly in HD DVD's favor (see thread title). It doesn't need to be the win this week; it just shows a trend of progress.

The "too much" I put into these graphs now is that Blu-Ray sales rank as tracked by Amazon has been improving a lot while HD-DVD hasn't, and that this is a trend that will likely continue.
It should be pretty obvious what the reason for this growth is and if you think it involves faith to expect it to continue, okay, you can have that: I believe that people will prefer the Blu-ray version of a movie, if they already have a device that can play BRs, over the DVD version. I believe that the data presented there shows that the demographic I just described has been growing and is indeed generating sales (on Amazon), and that it will continue to grow.

If we are to dismiss that salesrank tracking stuff, what would be a better source of information?
 