Someone said that any cheapo PC has the grunt to decode H.264 files, and I have to disagree. As far as I know, it's about as CPU intensive to decode as WMV9, and I know from experience that (for instance) Athlon 64s are shite at decoding WMV9 HD content. Something like an A64 3000+ is only just capable of decoding, say, a 10Mbit 1080p WMV9 file, whereas a 3.0GHz P4 will chew it up at an average of around 50% CPU time. A 3.8GHz P4 absolutely eats WMV9 stuff for breakfast. AFAIK, it's down to the P4's superior streaming SIMD performance, which is more clockspeed-bound than other kinds of computation. I'm expecting H.264 decode to behave the same way. It's also worth noting that only the very fastest 2.13GHz Pentium Ms are capable of decoding very high bitrate WMV9 stuff.

So, decode support from a video card will be key, both to enable decode on notebooks and low-power PCs, and to free up resources. Multi-core CPUs make that less of an issue, but as a journalist I'll be including an HD H.264 decode test in my system/CPU/whatever reviews as soon as Apple releases QT7 for Windows, or as soon as another large player releases a quality H.264 decoder. Even if there's not much content now, for me it's very nice to know how a system or CPU handles decoding H.264. It's clearly going to be the dominant standard for HD content.
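
For what it's worth, a rough sketch of the kind of decode test I mean: time how long a software decoder takes to chew through a clip with no display or disk overhead, then compare that against the clip's running time. This assumes ffmpeg is installed and on the PATH, and "clip_1080p.mp4" is just a hypothetical test file, not anything from the post above.

import subprocess
import time

CLIP = "clip_1080p.mp4"  # hypothetical 1080p H.264 test clip

start = time.perf_counter()
# Decode only: send the frames to the null muxer so writing/displaying them
# doesn't skew the measurement of raw decode speed.
subprocess.run(
    ["ffmpeg", "-loglevel", "error", "-i", CLIP, "-f", "null", "-"],
    check=True,
)
elapsed = time.perf_counter() - start
print(f"Decoded {CLIP} in {elapsed:.1f} s")

If the elapsed time is well under the clip's duration, the CPU has headroom for real-time playback; if it's close to or over it, you're looking at dropped frames.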