1080p 30fps vs 720p 60fps - The 2012 poll

What's your preference?

  • Total voters: 105
  • Poll closed.
Just to take the discussion in another direction. Going by the rumored specs, and going by the expertise you guys have, what are the chances of a 60 vs. 30 frames per second scenario?

By that I mean: 3rd party devs will no doubt have to code to the 'lowest common denominator'. If devs were forced to target 30 frames on Durango, would there be enough power on Orbis to push the same code at 60? I'm talking no magic here: assuming both consoles are equally 'efficient', and the rumored specs are completely true, is there enough 'juice', to put it in scientific terms, in the more powerful GPU, and enough of a bandwidth edge in RAM, to make something like that possible?

I'm big on frame rates. I loathe 30 frames. And yeah, I know I need to build a PC, yadda yadda, but most games I play aren't on PC, so that's not an option. To me, a dream scenario would be 60 frames standard on one of the consoles. I know I'm a dreamer... what say you, B3D?
 
Although, as I'm on PC, I can have both high resolution and a high frame rate, in the past I've always chosen eye candy over 60fps+.

I think the average Joe gamer needs to see a difference in order to justify an upgrade, and I think choosing 1080p@60fps over 720p@30fps with prettier pixels will limit the ways people can justify making one.
 
Although, as I'm on PC, I can have both high resolution and a high frame rate, in the past I've always chosen eye candy over 60fps+.

I think the average Joe gamer needs to see a difference in order to justify an upgrade, and I think choosing 1080p@60fps over 720p@30fps with prettier pixels will limit the ways people can justify making one.

Average Joe gamer might not know what a "frame per second" is, but they immediately see and feel the difference when they play Call of Duty and then play Halo, whether they understand the technical jargon behind it or not.
 
If devs were forced to target 30 frames on Durango, would there be enough power on Orbis to push the same code at 60?

Not very likely unless the "30fps-locked" game were already hitting much closer to 60fps without v-sync & the 30fps limiter. Mind you, the CPUs for each are rumoured to consist of 8 Jaguar cores @ similar clockspeeds if not the same.

Assuming the game itself were not CPU or vertex limited at all, then you'd still need considerably more bandwidth & ROPs to push twice the pixel rate. Texturing and ALU are only parts of the rendering equation, so it'll depend on the game itself, and +50% of both only gets you so far.
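The frame-budget arithmetic behind that answer can be sketched in a few lines (a simplified model: it assumes the frame is entirely GPU-limited, which real games rarely are):

```python
# Going from 30fps to 60fps halves the frame budget (~33.3ms -> ~16.7ms),
# so a purely GPU-limited frame needs 2x the throughput, not +50%.

def required_speedup(current_fps, target_fps):
    """Factor by which per-frame work must speed up to hit the target."""
    return target_fps / current_fps

print(required_speedup(30, 60))  # 2.0

# With only +50% ALU/texture rate and nothing else changed, a locked-30fps
# workload would land around:
print(30 * 1.5)  # 45.0 fps -- short of 60 even in this best case
```

And that's the optimistic case; any CPU or vertex bottleneck shared between the two machines drags the achievable gain below even that.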
 
Average Joe gamer might not know what a "frame per second" is, but they immediately see and feel the difference when they play Call of Duty and then play Halo, whether they understand the technical jargon behind it or not.

If the average Joe gamer doesn't know what frames per second are, then why should they give a crap about 60fps?

And COD is 35fps and is no better than Halo or any other 30fps FPS.
 
Just to take the discussion in another direction. Going by the rumored specs, and going by the expertise you guys have, what are the chances of a 60 vs. 30 frames per second scenario?
I'll go one step further than AlStrong and say, 'no'. 1080p vs 720p is 2.25x the pixel drawing and shading, requiring 2.25x the bandwidth and shader power, all things being equal. No-one has that advantage if the underlying tech is very similar. The difference will be a slightly more stable framerate, similar to 360 vs. PS3.
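For reference, the raw pixel counts behind that comparison:

```python
# Pixel counts for the two resolutions under discussion.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_720p = 1280 * 720    #   921,600

print(pixels_1080p / pixels_720p)  # 2.25 -- the fill/shading multiplier
```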
 
If the average Joe gamer doesn't know what frames per second are, then why should they give a crap about 60fps?

They might not know what "frames per second" are (in that they've never thought about it) but they'll certainly be able to perceive a difference.

And COD is 35fps and is no better than Halo or any other 30fps FPS.

That's really not true at all. In the DF face-off you can see that the 360 version spends most of its time in the 50s even during combat, and even the really hectic bits rarely drop below 50, and then only briefly. And 30fps games like Far Cry and Crysis 2 often drop under 30fps - sometimes well under.

So you're vastly underestimating CoD and vastly overestimating "any other 30fps FPS". Here's the DF face-off where you can see how CoD actually performs:

http://www.eurogamer.net/articles/digitalfoundry-black-ops-2-wii-u-face-off
 
I'll go one step further than AlStrong and say, 'no'. 1080p vs 720p is 2.25x the pixel drawing and shading, requiring 2.25x the bandwidth and shader power, all things being equal. No-one has that advantage if the underlying tech is very similar. The difference will be a slightly more stable framerate, similar to 360 vs. PS3.

Even if the nextbox has only 2/3 of the pixel shading power, you could just run at 2/3 of the resolution (or 3/4, or whatever the performance envelope allows) and be happy in the knowledge that most people won't notice or care.

1920x1080 vs 1600x900 would be indistinguishable to most people on most TVs, especially with a good scaler and with overscan turned on.
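That trade-off is easy to put numbers on. A quick sketch, assuming pixel cost scales linearly with pixel count and a fixed 16:9 aspect ratio:

```python
# Given a fraction of the full-res shading budget, find the largest 16:9
# resolution that fits: pixel count scales with area, so each axis scales
# by the square root of the budget fraction.

def scaled_res(budget_fraction, base_w=1920, base_h=1080):
    scale = budget_fraction ** 0.5
    return round(base_w * scale), round(base_h * scale)

print(scaled_res(2 / 3))  # (1568, 882) -- close to the 1600x900 example

# And 1600x900 itself costs about 69% of 1080p's pixels:
print((1600 * 900) / (1920 * 1080))  # ~0.694
```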
 
That's really not true at all. In the DF face-off you can see that the 360 version spends most of its time in the 50s even during combat, and even the really hectic bits rarely drop below 50, and then only briefly. And 30fps games like Far Cry and Crysis 2 often drop under 30fps - sometimes well under.

You must have seen a different video to me; it's around 35fps... during the most important part...
 
The Dreamcast was a great system, loved it; still love my Dreamcast, in fact. The difference between the DOA2 cut-scenes in the PS2 and Dreamcast versions was very noticeable side by side.

I never knew why the Dreamcast version had the cut-scenes at 30fps and the PS2 had them at 60fps.

The Dreamcast was great; I miss it (we had one and it died; the optical drive wouldn't do anything).
It gave a clean 640x480 at 60fps mostly, RGB output that gave the best a TV could do, 50/60Hz selection for every game, and a good, reliable controller (it seemed).
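As a sanity check on that 640x480@60 output, here's the pixel-clock arithmetic from standard VGA timing (a generic VGA figure, not anything Dreamcast-specific): the visible 640x480 raster sits inside a total 800x525 raster once blanking is included.

```python
# Standard VGA timing for 640x480@60Hz: the visible raster is padded with
# horizontal and vertical blanking, giving a total raster of 800x525.
total_w, total_h, refresh = 800, 525, 60

pixel_clock = total_w * total_h * refresh
print(pixel_clock)  # 25200000 -- the canonical 25.175MHz clock comes from
                    # the slightly lower 59.94Hz refresh actually used
```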
 
You must have seen a different video to me; it's around 35fps... during the most important part...

Which is the most important part? And what time is the most important part in the video?

You said the game runs at 35fps, and as a generalisation that's demonstrably false. In the most popular version, on the 360, it averages in the 50s with occasional dips lower for brief periods. By comparison, games like Crysis 2 drop into the low 20s.

By this method of judging framerate, games on a Radeon 7950 with those gammy drivers run at about 10fps.
 
320x200@30Hz and 16 bits.
It worked for multiplayer Quake without hardware acceleration back in the day... You guys just like luxury. ;p
 
Quake was 8-bit (the reason for that famous brown).
Mode X is good! Otherwise there's VESA. I was surprised to run DOS Quake on a netbook that had a 1024x600 VESA mode, and to see the worse-than-Atom netbook run it at native res, smoothly.
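One reason 320x200 at 8 bits per pixel was such a convenient mode: the whole framebuffer fits in a single 64KB real-mode segment (the VGA window at A000h), which a couple of lines of arithmetic confirm.

```python
# 320x200 with a 256-colour palette is exactly one byte per pixel.
fb_8bit = 320 * 200 * 1
print(fb_8bit)        # 64000 bytes -- fits in the 65536-byte segment

# A 16-bit framebuffer at the same resolution would not fit:
fb_16bit = 320 * 200 * 2
print(fb_16bit)       # 128000 bytes -- needs banking or VESA modes
```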
 
Which is the most important part? And what time is the most important part in the video?

You said the game runs at 35fps, and as a generalisation that's demonstrably false. In the most popular version, on the 360, it averages in the 50s with occasional dips lower for brief periods. By comparison, games like Crysis 2 drop into the low 20s.

By this method of judging framerate, games on a Radeon 7950 with those gammy drivers run at about 10fps.

The most important part is how it acts in intense action, and in those cases it's only really ~30-35fps.

The 360 is generally better than the PS3, but both games are still far from being 60fps.

I also don't gauge PC performance on average frame rate; I go by the minimum, and I always will.
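The average-vs-minimum distinction is easy to illustrate with a handful of frame times (illustrative made-up numbers, not measurements from any game):

```python
# Frame times in milliseconds; the 33-34ms spikes are the "intense" bits.
frame_times = [17, 16, 18, 17, 33, 34, 17, 16]

avg_fps = 1000 / (sum(frame_times) / len(frame_times))  # mean frame time
min_fps = 1000 / max(frame_times)                       # worst single frame

print(round(avg_fps, 1))  # 47.6 -- looks healthy on average
print(round(min_fps, 1))  # 29.4 -- what you actually feel in the spikes
```

The same run can honestly be described as "high 40s on average" or "about 30 in the action", which is exactly the disagreement in this thread.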
 
The most important part is how it acts in intense action, and in those cases it's only really ~30-35fps.

That's an even lower range. Where are you getting this data on frame rates from? In the really intense stuff on DF it's in the 50s, with the occasional dip into the 40s for a fraction of a second.

You made reference to the video - where in the video are you getting these figures of 30-35fps in action scenes from?
 
That's an even lower range. Where are you getting this data on frame rates from? In the really intense stuff on DF it's in the 50s, with the occasional dip into the 40s for a fraction of a second.

You made reference to the video - where in the video are you getting these figures of 30-35fps in action scenes from?

I'll find the video when I get back in...
 
Cheers.

If you were talking about the Wii U I could understand, as there are transparency- and enemy-heavy bits where it does dip into the 30s, but the 360 fares rather better.
 