The Framerate Analysis Thread part 2

Skate demo on 360 is a perceptual 60fps (ie 60fps with frame drops, but still very very smooth) and looks great. No screen tear. Bearing in mind how poor the PS3 version of the original was, I'm looking forward to seeing the equivalent demo next week.
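The "perceptual 60fps" idea, counting duplicate frames in a 60Hz capture, can be sketched in a few lines of Python. This is a toy model of the general technique, not the actual capture tooling:

```python
# Toy model: at a fixed 60Hz capture rate, every captured frame that exactly
# repeats its predecessor is a dropped frame, so the effective framerate is
# the number of unique frames per second of capture.
def fps_from_capture(frames):
    """frames: list of hashable frame identifiers captured at 60Hz."""
    unique = 1 + sum(1 for prev, cur in zip(frames, frames[1:]) if cur != prev)
    return unique * 60.0 / len(frames)

# One second of capture in which two frames missed their 16.7ms deadline,
# so the previous image was scanned out again:
capture = []
for i in range(58):
    capture.append(i)
    if i in (10, 29):        # two missed deadlines
        capture.append(i)    # the old image repeats
print(fps_from_capture(capture))  # 58 unique frames over 1.0s -> 58.0
```

A game that reads 58fps this way still feels very smooth, which is what "perceptual 60fps" is getting at.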
 
I've updated my original Call of Duty 4 analysis here with frame rate graphs on the videos. It's quite illuminating to see how many scenes impact frame rate at the same points on both systems - a phenomenon that's not quite so clear cut in CoD: WaW (see here.)

A pretty nice-looking YouTube HD compilation of the CoD4 videos can be seen here.
 
Thanks grandmaster, it was a very interesting read.

I haven't picked up World at War yet (may skip it), but I'm very interested to see if the rumors you heard about COD5 come true.
 
I've updated my original Call of Duty 4 analysis here with frame rate graphs on the videos. It's quite illuminating to see how many scenes impact frame rate at the same points on both systems - a phenomenon that's not quite so clear cut in CoD: WaW (see here.)

A pretty nice-looking YouTube HD compilation of the CoD4 videos can be seen here.
Well done!
PS3 drops frames mostly in the heavy alpha-blended scenes. Looks like Xenos's eDRAM is keeping things smooth in this regard.
 
I'm going to start work on a "Does the Xenos scaler impact performance at all?" feature and would welcome suggestions from those who think that certain 360 games perform worse at 1080p. I've done some Googling and Lost Planet/Dead Rising come up, and I think someone here mentioned GTA IV 'in the rain' tearing more at 1080p, but are there any other games or performance tests anyone would care for me to try?

Surely a v-locked 60fps or near-60fps game like CoD: WaW or CoD4 would be worth testing, as any performance hit would be more noticeable?

As it is, I captured the initial intro to Lost Planet at both 720p and 1080p and - would you believe it - the game tears on pretty much identical frames (albeit with a minor variation in the location of the tear).
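For the curious, locating a tear line in captured frames can be sketched like this. It's a hypothetical illustration, not how the actual capture analysis works, and it assumes the previous refresh showed a complete frame: rows above the tear still match the previous capture, while rows below come from the newly flipped buffer.

```python
import numpy as np

def find_tear_row(prev_frame, cur_frame, threshold=0.0):
    """Both frames are (height, width) luma arrays captured on consecutive
    refreshes. Returns the first row where cur_frame stops matching
    prev_frame (the tear location), or None if the frame is untorn."""
    diff = np.abs(cur_frame.astype(int) - prev_frame.astype(int))
    row_diff = diff.mean(axis=1)                 # average change per scanline
    changed = np.nonzero(row_diff > threshold)[0]
    return int(changed[0]) if changed.size else None
```

Comparing the tear row frame-by-frame between a 720p run and a 1080p run is essentially the "identical frames, minor variation in tear location" observation above.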
 
It seems the Killzone 2 preview build has an average fps of about 20-25; the source is an Italian forum, forumeye.it, and an Italian review in Game Republic. I'm really worried. Strangely, no other site has reported this.
 
I'm going to start work on a "Does the Xenos scaler impact performance at all" feature and would welcome suggestions from those who think that certain 360 games perform worse at 1080p.

Are you able to capture via the VGA output? I have also heard this mentioned as a culprit.

Thanks again for your efforts. :)
 
Anyone check Gears 2? I thought the game was going to crash at times when I played; the frame rate gets really bad on my system. Split-screen co-op is much worse. It feels like it runs much worse than Gears 1 overall.
 
Are you able to capture via the VGA output? I have also heard this mentioned as a culprit.

Thanks again for your efforts. :)

Yes, our hardware has support for VGA and HDMI at any VESA or HDTV resolution up to 1080p. We've also just added support for component, making it the only capture hardware that can acquire full raster 1080p on all three major inputs :)

I'm also including a performance review of the 5:4 1280x1024 mode, which has thrown up some interesting results.
 
I've updated my original Call of Duty 4 analysis here with frame rate graphs on the videos. It's quite illuminating to see how many scenes impact frame rate at the same points on both systems - a phenomenon that's not quite so clear cut in CoD: WaW (see here.)

A pretty nice-looking YouTube HD compilation of the CoD4 videos can be seen here.
Nice work.

A couple notes:
- I think you mixed up the numbers in your final average in test #9 here:
http://www.digitalfoundry.org/blog/?page_id=186

- You are using a moving average for the blue and green lines, right? For your commentary, you may want to note that V-shaped changes correspond to single-frame stalls rather than a gradual drop in framerate (say, from reading the HDD or DVD/Blu-ray), and V-shaped changes with a flat bottom (like \_/) correspond to a couple of frames at a low rate with everything else fast.

- As an example, look at the V dip for the 360 in test #2 at 14 seconds here:
http://www.digitalfoundry.org/blog/?page_id=324
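The single-frame-stall point is easy to demonstrate with a toy centred moving average (an assumed smoothing; the blog may use a different window):

```python
# A centred moving average over per-refresh framerate samples. A single
# stalled frame in an otherwise solid 60fps run drags down every window
# that contains it, producing the dip described above.
def moving_average(samples, window=5):
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

solid = [60.0] * 20
solid[10] = 30.0               # one stalled frame
smoothed = moving_average(solid)
print(min(smoothed))           # (60*4 + 30) / 5 = 54.0
```

A run of several slow frames, by contrast, pins the average at the low rate for the whole run, which is the flat-bottomed \_/ shape.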
 
In the RE5 X360 demo there's a lot of tearing in 1080p but not in 720p, even though it's the same native resolution and AA, just upscaled. It's strange... is the X360 upscale not free?
Select the first level, don't move, just rotate the camera and watch for the tearing.
 
In the RE5 X360 demo there's a lot of tearing in 1080p but not in 720p, even though it's the same native resolution and AA, just upscaled. It's strange... is the X360 upscale not free?
Select the first level, don't move, just rotate the camera and watch for the tearing.

I reported this a while ago. People said I was nuts.
 
Grandmaster, will you have some of those awesome framerate analyses for Killzone 2 in about a month?

Also, have you found any more sources of 1080p60 video? The only one I have is the Wipeout HD one you link to.
 
There's the PC intro to Devil May Cry 4 in the TrueHD 1080p Workstation section of www.digitalfoundry.org. That's the cleanest 1080p60 video produced for PS3 to date. I am also building a new PC based on the GTX295 and will be making some Crysis and Crysis Warhead 1080p60 videos. Encoding these videos is a bit of a fine art, but I've got better at it since I did the DMC4 one.

Yes, Killzone will be deconstructed big-style once I have the code.

With regards RE5, I'll take a look at that in the next day or so. My capture hardware can downscale 1080p to 720p on the fly, so a comparison video based on the bit Quaz points out should be very quick to do and bung up online.

The real test of whether Xenos scaling is free or not would - I suspect - come from a game like Call of Duty 4/WaW: I'd imagine a v-locked 60fps game that already drops frames would drop more of them, and that would be far more noticeable.
 
There is another thread somewhere about scaling possibly affecting performance, but as far as I remember nobody could explain how the Xenos scaler works in practice. If it doesn't have some small block of RAM of its own, acting as a FIFO queue that is always used when fetching data for video output regardless of whether scaling happens, then I think (although I could be mistaken) scaling the video output could affect performance, due to the additional memory accesses required to sample surrounding pixels. I would guess the video output hardware has the highest priority of access to the RAM, and so, when scaling is required, it would take precious bandwidth from the rest of the hardware subsystems - possibly causing problems if that bandwidth was already near its maximum without scaling going on.

It would be interesting to know exactly how Xenos performs its data fetches, and from where, when it performs its scaling operations.
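A back-of-envelope version of the bandwidth argument, with purely illustrative assumptions (32bpp scanout, bilinear filtering, and no documented knowledge of how Xenos actually buffers its fetches):

```python
# Rough scanout-fetch bandwidth, under assumed (not documented) figures.
def fetch_mb_s(pixels_per_s, texels_per_pixel, bytes_per_texel=4):
    return pixels_per_s * texels_per_pixel * bytes_per_texel / 1e6

out_rate = 1920 * 1080 * 60   # output pixels/s when scanning out 1080p60
src_rate = 1280 * 720 * 60    # source pixels/s for an upscaled 720p buffer

# With a line-buffer/FIFO, each source pixel only needs fetching about once:
buffered = fetch_mb_s(src_rate, 1)
# Without one, bilinear sampling refetches ~4 texels per output pixel:
unbuffered = fetch_mb_s(out_rate, 4)
print(round(buffered), round(unbuffered))  # ~221 vs ~1991 MB/s
```

The gap between those two numbers is why the presence or absence of a small dedicated buffer matters so much to whether scaling steals meaningful bandwidth from the rest of the system.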
 
I am also building a new PC based on the GTX295 and will be making some Crysis and Crysis Warhead 1080p60 videos.

Yes, please do this!! I got a Core i7 with a GTX260 (EVGA SSC 55nm, stock OC'd to 675MHz) and 6GB of RAM about two weeks ago, and now, having finished Crysis and toying with some visual mods and stuff - boy, was I craving this game - I can only say that, as someone who has always loved computer graphics, it's so good to be playing on the PC again. That game is untouchable, technically. It's also a pretty good game. :) Perhaps in 2 years (or even just 1) we will have single-GPU graphics cards capable of a truly constant 1080p60 in it.

For what it's worth, I played it in XP64 and a bit in Vista64, most of the time with the cudaats mod installed (level 5), but by the end I was just using the ToD patch that comes with it, and eventually I even reverted back to no mods due to some visual glitches (which, curiously, I only got in XP, not Vista). The config I played most in was XP64 at 1080p with 2xAA and most things set to high, with a few custom (higher) settings, and I mainly got around 30-something fps. Sometimes it would go higher, in interiors for example, or lower in some cutscenes. It's funny how playing with a mouse seems to lessen the impact of framerate drops, though.

I also played a bit, I think in Vista, either with cudaats at level 5 or on Very High (definitely Vista if so), but at 1280x720 with 16xAA, and it was very good too, with similar framerates. Dropping the AA made it quite a bit smoother, obviously. Eventually I decided I preferred the crispness of 1080p, since it was mostly quite smooth.

Oh well, I would talk a bit more about other games but I just realized this is the Console section and this is already off-topic enough, sorry mods :LOL:. To end slightly more on-topic: it's pretty safe to say that any game that runs at 720p30 on consoles will run at 1080p60 on a mid-to-high-end PC with the settings cranked way up. Compare the FEAR 2 demo, for example: it's smooth with _everything_ maxed out. Far Cry 2 will drop a bit below 60, though never below 30, in Ultra with insane AA and AF.
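The raw pixel arithmetic behind that rule of thumb (ignoring settings increases, which obviously cost more):

```python
# Pixels per second at each target: 1080p60 pushes exactly 4.5x the
# pixel rate of 720p30.
ratio = (1920 * 1080 * 60) / (1280 * 720 * 30)
print(ratio)  # 4.5
```

So a PC with roughly 4.5x the console's effective fill/shading throughput is the ballpark needed just to match settings, before cranking anything up.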
 
I was toying with the idea of making a 1080p60 5.1 surround video. But then where would one upload such a huge video?
 