
Shifty Geezer said:
ShootMyMonkey said:
I don't follow HD as well as people who know what money looks like, but I don't think there are 60 fps progressive scan modes.
I was of the impression HDTV was 60 fps, hence the arguments over PS3's 1080 output (is it 30 or 60 fps?). Certainly, from what little I've gleaned, 720p should be a full screen refresh every 1/60th of a second. I think the same is true of 480p.


NOT AGAIN!!

ALL video sources refresh the screen at 60 HERTZ, from 480i to 1080p.

The content (videos, movies, sports) being broadcast by TV stations is all ~30 FRAMES PER SECOND.

Games can be 30 or 60 FPS, depending on the game. Nothing is stopping PS3 - or anything else - from outputting 1080p at 60fps, apart from obvious performance limitations.

TV stations are not tackling 1080p because it just uses too much bandwidth, not because 1080p inherently means 30Hz/30fps.
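
To put rough numbers on that bandwidth point (back-of-the-envelope figures of my own, not anything from a broadcast spec), here's a quick sketch of raw pixel rates:

```python
# Uncompressed pixel rates, just to show why broadcasters stop at 720p/1080i.
# Rough illustrative figures only.
modes = {
    "480i60":  (720, 480, 30),    # 60 fields/s ~= 30 full frames of pixels per second
    "720p60":  (1280, 720, 60),
    "1080i60": (1920, 1080, 30),  # interlaced: half the lines every 1/60 s
    "1080p60": (1920, 1080, 60),
}

for name, (w, h, fps) in modes.items():
    print(f"{name}: {w * h * fps / 1e6:6.1f} Mpixel/s")

# 1080p60 pushes roughly twice the raw pixel rate of 1080i60 (or 720p60),
# so at a comparable compression ratio it needs roughly twice the bitrate.
```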
 
ShootMyMonkey said:
You can tell a difference between 30fps and 60fps. Some video cameras can record at 60fps and it's noticeable.
Sure... if you have an interlaced video mode that will actually scan the fields to render at 60 Hz. If not for that, you wouldn't notice 60 fps in a console game played on an ordinary TV (half the frames would be dropped)... With an interlaced scan mode, you'll at least get half your scanlines from a new frame each time, so the motion will look smoother. I don't follow HD as well as people who know what money looks like, but I don't think there are 60 fps progressive scan modes.

People make the assumption that framerate and smoothness are inherently linked, which isn't really true. Smoothness has more to do with the amount of information that is gathered by the eye, and the problem with in-game graphics is that a given frame only has information about position and not motion.

You most definitely *CAN* tell the difference between 30fps and 60fps, even on a standard TV. At least I can, anyways.

Regular interlaced TV is showing one field every 1/60th of a second (where a field is half of a full frame, done on every other line). A 30fps game is showing the same image for two successive fields, then it updates and shows the next image for the next two fields.

A 60fps game on an interlaced TV is showing a new image for each field. Even though a field is only half of your screen pixels and you are "losing" half of each image, you are still seeing a new, updated on-screen image every 1/60th of a second. So yes, motion appears "smoother" at 60fps even on an interlaced display. Very fast motion will also show interlacing artifacts, but they don't look any worse than what you see in a fast-moving television show on the same display.

Progressive scan displays (480p, 720p) refresh the entire screen every 1/60th of a second, so 60fps games (or video) look especially good on them, since you are no longer dropping any information.
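
A tiny sketch of the frame-to-refresh mapping being described here (my own illustration, nothing official):

```python
REFRESH_HZ = 60  # NTSC-style output: 60 fields/s interlaced, or 60 frames/s progressive

def frames_shown(game_fps, refreshes=12):
    """Index of the game frame visible at each 1/60 s screen refresh."""
    return [int(t * game_fps / REFRESH_HZ) for t in range(refreshes)]

print("30fps game:", frames_shown(30))  # [0, 0, 1, 1, 2, 2, ...] - each frame held for two refreshes
print("60fps game:", frames_shown(60))  # [0, 1, 2, 3, 4, 5, ...] - a new frame every refresh

# On an interlaced set each refresh is only half the scanlines (one field),
# but the 60fps game still gets new motion information to the screen every
# 1/60th of a second, which is why it reads as smoother.
```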

To see the differences yourself, you can find any internet video of an always-60fps Xbox game, like Dead or Alive 3. They are almost always recorded at 30fps, and you can check the video statistics to verify. Then watch the actual game being played on a regular interlaced TV, and it will appear to be more fluid and smoother than the online 30fps video appeared to be. Then if you watch it on a progressive display in 480p mode, it will be eye-popping.

You can see the difference between 30fps and 60fps (without interlacing) on your PC too. Just download the 482 MB version of the DOA4 trailer, and then download the 292 MB version from microsoft.com above. The first file is 60fps, although slightly lower in resolution, while the second file is 30fps. If your computer is fast enough to display them at full size without dropping any frames (you need to be upwards of 2.5 GHz with a decently fast video card), you should be able to tell.
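
If you'd rather not rely on the player's statistics display, one quick way to check a downloaded clip's recorded frame rate (my own suggestion, assuming OpenCV is installed; the filename is just a placeholder):

```python
import cv2

cap = cv2.VideoCapture("doa4_trailer.wmv")  # placeholder name, not the real file
if not cap.isOpened():
    raise SystemExit("could not open the file")

fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
cap.release()

print(f"{width}x{height} @ {fps:.2f} fps")
```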
 
I can't tell which games are 60 and which are 30 unless someone tells me.

I've also never heard a single person I know IRL complain about framerates (except slow-down, which we're not discussing). It's only online, on message boards, and with PC gamers that you hear this argument.

I dunno, to each his own, but I've never noticed the difference on consoles, and I really think most of it is just the mind making people think they see a difference when they really don't.

I haven't watched the PC videos you've mentioned, but I don't see how watching video playback on my PC applies to a console game being played on my TV anyways.

I think the vast majority of non-PC gamers have no clue about fps, and they don't care. And the reason it's garnered so much discussion is that it's one of the few advantages of PS2 over XBOX, which has made it a hot topic with many. Anyways, that's my opinion.
 
As an IRL case: it was very apparent to me and my friends, playing Lego Star Wars, that it was better quality at 60fps. Everyone remarked (including non-gamers who'd seen other games and not remarked on those) on how smooth it looked.

I for one certainly notice the difference.
 
You most definitely *CAN* tell the difference between 30fps and 60fps, even on a standard TV. At least I can, anyways.
Try reading the sentence again. I said "If not for that [interlacing], you wouldn't be able to tell the difference... on a standard TV"

If your computer is fast enough to display them at full size without dropping any frames (you need to be upwards of 2.5 GHz with a decently fast video card), you should be able to tell.
Which I've already tried, and it definitely is not fast enough. The 482 MB version plays at around 3 fps on my work machine (3.0 GHz w/ an X800), and around 24 spf at home (550 MHz w/ a Quadro). I even tried farming the playback over 3 machines, and it still ran at about 10 fps. I believe I can tell the difference between 30 fps and 10 fps. ;)

And again, you're completely missing the point I was making about the amount of information the eye collects. The reason you can see the difference between the videos is because of the lack of information per frame with in-game graphics. If that information is re-collected in the process of making the video, encoding at 60 fps becomes unnecessary. In-game graphics at 60 fps is a different matter because you'd have to re-render several times to get the same effect, but FMV is freely open to all sorts of post-processing, which is why I brought up the question of why you'd need to encode FMV at 60 fps.
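
As a toy illustration of that "information per frame" point (mine, not anything from the actual DOA4 pipeline): if you render sub-frames at a high rate and average them down, each 30fps output frame carries motion-blur information that a single crisp game frame doesn't.

```python
import numpy as np

def blurred_output_frames(subframes, n_per_frame):
    """Average every n_per_frame consecutive sub-frames into one output frame."""
    usable = len(subframes) - len(subframes) % n_per_frame
    groups = np.asarray(subframes[:usable]).reshape(-1, n_per_frame, *subframes[0].shape)
    return groups.mean(axis=1)

# Fake 120 sub-frames of an 8x8 "scene" with an object moving one column per sub-frame.
subframes = []
for t in range(120):
    img = np.zeros((8, 8))
    img[:, t % 8] = 1.0
    subframes.append(img)

out30 = blurred_output_frames(subframes, n_per_frame=4)  # 120 sub-frames -> 30 output frames
print(out30.shape)   # (30, 8, 8)
print(out30[0])      # the object's energy is smeared across 4 columns: motion blur
```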

ALL video sources refresh the screen at 60 HERTZ, from 480i to 1080p.

The content (videos, movies, sports) being broadcast by TV stations is all ~30 FRAMES PER SECOND.
I wasn't talking about refresh rates; I was talking about content streams. I.e., is there a sync mode for delivery of content over component/HDMI/whatever that updates new full frames at 60 fps, or does the unit simply display the previous field twice, with an update sent over the cable once every other frame? And bearing in mind that we're confining the discussion to HDTV (i.e., PC monitors don't count), that means it'd have to be a mode that is inherently supported by the unit.

That too. I'd imagine 60 fps is beyond certain display technologies simply because the swing time is too slow.
 