120hz technology in future consoles?

I think you also need to consider the lighting. Directors are just so used to setting the stage for 24 fps. At a higher frame rate the picture quality might not be as good, or it might be more difficult to obtain.

For CG though, I think it's just economy.
 
For CG though, I think it's just economy.

If economy includes time and computing power, then yes indeed. :p Rendering more than double the frames... yikes. These movies take long enough as it is for a 90-minute feature.
 
How does the new 3D stuff work? Am I right in thinking there are two projectors displaying alternating frames to get the 3D pop?
 
If economy includes time and computing power, then yes indeed. :p Rendering more than double the frames... yikes. These movies take long enough as it is for a 90-minute feature.

You would also need more animation, which is what really takes time and money...
 
If economy includes time and computing power, then yes indeed. :p Rendering more than double the frames... yikes. These movies take long enough as it is for a 90-minute feature.
I would guess that character animation would become harder as well due to the increased temporal resolution.
 
Here you go, 3D Blu-ray and TV relying on 120hz input:
http://techon.nikkeibp.co.jp/article/HONSHI/20081030/160508/

The technologies proposed by Panasonic for 3D image storage, transfer, etc. all utilize existing standard technology. Image encoding uses the two-channel encoding function implemented in Moving Picture Experts Group Phase 4 Advanced Video Coding (MPEG-4 AVC, H.264). The second channel stores only the data that differs from channel one, holding the increase in data volume to about 1.5 times. The HDMI standard is used to transfer data from the player to the television, with left- and right-eye images alternated in single-field (single-frame) units.
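The dependent-view idea is easy to picture. Below is a toy sketch in Python (purely illustrative, not Panasonic's actual MVC bitstream format): channel one carries the left-eye picture in full, channel two carries only the difference against it, and the decoder adds the residual back. Because the two eye views are highly correlated, the residual compresses far better than a second full picture would, which is why the overall data grows by roughly 1.5x rather than 2x.

```python
# Toy sketch of dependent-view (delta) coding for a stereo pair.
# Not the real MPEG-4 AVC/MVC bitstream; just the core idea.
import numpy as np

def encode_stereo(left, right):
    """Channel 1 = full left view; channel 2 = right view stored only as a residual."""
    residual = right.astype(np.int16) - left.astype(np.int16)
    return left, residual

def decode_stereo(base_view, residual):
    """Rebuild the right-eye image from the base view plus the residual."""
    return (base_view.astype(np.int16) + residual).clip(0, 255).astype(np.uint8)

# Hypothetical 8-bit stereo pair: the right eye here is just the left image
# with a few pixels of horizontal parallax (a crude stand-in for a real shot).
left = np.random.default_rng(0).integers(0, 256, size=(1080, 1920)).astype(np.uint8)
right = np.roll(left, 4, axis=1)

base, residual = encode_stereo(left, right)
assert np.array_equal(decode_stereo(base, residual), right)
```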

Extended HDMI for 3D:
http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=212001657

Not sure whether and how this will affect 3D graphics. Given that both console players are somewhat limited by HD, they will probably go slow on this one.
 
I didn't see this posted here.

But the mind-blowing didn't end there. For part two of its tech demonstration, PD showed off a playable demonstration of GT5 on a Sony developed Nano-Spindt Field Emissions Display (FED), which is capable of frame rates of 240fps. Again four PS3's were used in the demo to achieve the 240fps mark, which is four times that of a single PS3 console. In case you're wondering, 240fps is actually faster than the human eye can comprehend, and the resulting effect on the game produced an image that was described as "following a real world event happening right in front of your face with your own eyes....any and all flickering in the movement of the vehicle, in the smoke from the tires, etc. are completely gone and you are almost tricked into believing you are watching something in real life."

http://gear.ign.com/articles/932/932927p1.html
 
I was wondering, just as an interesting aside: would anyone care to guesstimate how much graphical power, computing power, and bandwidth it would take to run Crysis Warhead at 1920×1080, Very High quality, with 4xAA, at the magical 120 Hz? (Yes, I know it doesn't scale very well, so please just assume they made it play nice.)
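Purely as a back-of-envelope exercise (the baseline figures below are made-up placeholders, and the scaling is assumed to be perfectly linear, as the post asks), one way to frame the guesstimate:

```python
# Back-of-envelope only: baseline numbers are hypothetical placeholders, and we
# assume perfectly linear scaling with framerate (no CPU or other bottleneck).
BASELINE_FPS = 30          # assumed fps of a hypothetical high-end GPU at
                           # 1920x1080, Very High, 4xAA
BASELINE_GFLOPS = 1000     # assumed shader throughput of that hypothetical GPU
BASELINE_BANDWIDTH = 110   # assumed memory bandwidth of that GPU, in GB/s

TARGET_FPS = 120
scale = TARGET_FPS / BASELINE_FPS   # 4x more frames per second to produce

print(f"~{BASELINE_GFLOPS * scale:.0f} GFLOPS of shader throughput")
print(f"~{BASELINE_BANDWIDTH * scale:.0f} GB/s of memory bandwidth")
```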
 
Isn't the limit of what eyes can detect somewhere around 72fps? I don't think anyone can see any difference above 80fps, but maybe I'm mistaken - possibly when objects move really fast, it makes a difference after all.
 
Isn't the limit of what eyes can detect somewhere around 72fps? I don't think anyone can see any difference above 80fps, but maybe I'm mistaken - possibly when objects move really fast, it makes a difference after all.
Just consider a real life (TM) scene with a moving object that's lit by a strobe light. If you increase the rate of the flashes, you can still make the individual "frames" noticeable by increasing the speed of the object.
 
So could the frame interpolation technology used in many 120 Hz TVs be used by game developers as a cheap software-based solution for obtaining 60 fps in games on non-120 Hz TVs? :p
 
James Cameron:
http://www.variety.com/article/VR1117983864.html?categoryid=1009&cs=1

"I've run tests on 48 frame per second stereo and it is stunning. The cameras can do it, the projectors can (with a small modification) do it. So why aren't we doing it, as an industry?"

"But 4K doesn't solve the curse of 24 frames per second. In fact it tends to stand in the way of the solutions to that more fundamental problem."

What a brilliant interview, thanks for sharing it. This quotation illuminates this discussion nicely.

James Cameron said:
The DLP chip in our current generation of digital projectors can currently run up to 144 frames per second, and they are still being improved. The maximum data rate currently supports stereo at 24 frames per second or 2-D at 48 frames per second. So right now, today, we could be shooting 2-D movies at 48 frames and running them at that speed. This alone would make 2-D movies look astonishingly clear and sharp, at very little extra cost, with equipment that's already installed or being installed.

And...

James Cameron said:
People have been asking the wrong question for years. They have been so focused on resolution, and counting pixels and lines, that they have forgotten about frame rate. Perceived resolution = pixels x replacement rate. A 2K image at 48 frames per second looks as sharp as a 4K image at 24 frames per second ... with one fundamental difference: the 4K/24 image will judder miserably during a panning shot, and the 2K/48 won't. Higher pixel counts only preserve motion artifacts like strobing with greater fidelity. They don't solve them at all.
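To put a rough number on the judder point (with an arbitrary pan speed chosen for illustration, not one taken from the interview), consider how far the image has to jump between frames during a pan:

```python
# Illustrative only: size of the per-frame image jump during a horizontal pan.
FRAME_WIDTH_PX = 2048      # "2K" horizontal resolution
PAN_DURATION_S = 5.0       # assumed: the pan sweeps one full frame width in 5 s

for fps in (24, 48):
    jump_px = FRAME_WIDTH_PX / (PAN_DURATION_S * fps)
    print(f"{fps} fps: the image steps ~{jump_px:.1f} px per frame")
# At 24 fps those ~17 px steps read as strobing/judder; 48 fps halves the step,
# which is the motion-clarity gain Cameron is describing.
```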
 
Yeah, I really liked that interview too, just finished reading it, thanks LeGreg.

Makes me think that you could see Sony do a project where they make a PS3 with everything doubled up (like the experiment for GT5 Prologue), but this time to do 3D rendering. Then use that on all existing Sony PS3 games to start with, and launch a DS-like intermediate PS3 variant, maybe for arcades initially, but who knows when 4 Cells and 4 RSXs will be small and cheap enough...

Ah, I'm rambling. Still, that GT5 Prologue experiment makes a lot of sense in the context of this article ...
 
Then they could just simply downsample/downconvert the framerate to 24p to compensate for those lower-end theater projectors. See, problem solved.

Directors and filmmakers with a vision for their dream project aren't choosing 24p just because there's a low-end, lowest common denominator to account for. That would just be asinine. I mean, if that were the case, then why stop there? Why not shoot their films in 720x480p since that's the resolution that most people will ultimately be watching their DVDs in anyway?

24p is an artistic consideration, period. Don't argue with me on this.

http://en.wikipedia.org/wiki/24p

No problem, but you may want to have a little chat with Mr. Cameron there ;)
 
Moviegoers and filmmakers alike are accustomed to the look and feel of the motion of 24 Hz film; this is a long-held aesthetic preference that has led to 24 fps being the longstanding, unshakable standard for everything you see on TV and in theaters today. I've found that videogamers are generally SO oblivious and naive about this fact; they all think that "more is better" or "bigger numbers are better" on an absolute basis. This is just not the case for me, and MANY others.

Unfortunately, this is complete bullshit, because you are ignoring the fact that 24 fps film imagery is built of frames with the inevitable motion blur from a non-zero camera shutter speed. Computer game imagery is perfectly, absolutely crisp and therefore does not blend in the viewer's eye/mind the same way. I am comfortable with 60 fps game speed; that's my threshold for liquid smooth, and I want that as the minimum (not even the average, let alone the highest) framerate: I'll need the best visibility precisely when the going gets tough and the scene is full of characters to render trying to chop or zap my sorry virtual arse... So for the record, I am one of those oblivious and naive buggers who think it's essential to get 100 fps average framerates.

Peter Jackson, Steven Soderbergh, George Lucas, Bryan Singer (Superman), and others have all the funds they need. When you're operating at the budgetary level they are, a difference of a factor of 2.5x in digital disk space isn't going to mean that much to them.

Maybe they wanted to stick to 24 fps because they tend to have lots of computer-generated imagery across a very large number of frames to create, and there was a budget and a schedule for the planning, asset creation, direction, animation, and rendering of that imagery? Shooting Frodo walking through a forest in funny clothes with a camera was the quick, easy part there.

Edit: Shouldn't have stopped reading other replies by page three, all of this got addressed already, but here goes/stays regardless. :p
 
Which the vast majority of gamers wouldn't even notice. Most LCDTVs out there don't have a game mode, and have extreme lag.
This 120 Hz technology, I guess, would add a lot of lag. If some people don't believe lag is important, try playing a music game!

While playing Rock Band, once I learned what "Game Mode" meant (from reading the Harmonix FAQs), I switched to it and started beating all my scores in a natural, not forced, way.
 
It would add one frame of lag. The current true frame would sit in the TV's memory while the TV interpolates between it and the previous frame across the period of one game frame. So if the game is outputting 30 fps, the added lag will be 1/30th of a second, with the screen updating four times in that period at 120 Hz.
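A minimal sketch of that timing, assuming the simple model described above (the TV buffers the newest game frame and blends from the previous one toward it over one game-frame period):

```python
# Simplified timing model for TV-side frame interpolation: 30 fps game output,
# 120 Hz panel, newest true frame held back while the blend catches up to it.
GAME_FPS = 30
TV_HZ = 120

added_lag_s = 1.0 / GAME_FPS     # newest frame is only fully shown after one game-frame period
steps = TV_HZ // GAME_FPS        # 4 panel refreshes per game frame

for i in range(1, steps + 1):
    w = i / steps                # blend weight toward the newest frame: 0.25, 0.50, 0.75, 1.00
    print(f"refresh {i}: {1 - w:.2f} * previous + {w:.2f} * current")

print(f"added lag ≈ {added_lag_s * 1000:.1f} ms")
```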
 
This 120 Hz technology, I guess, would add a lot of lag. If some people don't believe lag is important, try playing a music game!

While playing Rock Band, once I learned what "Game Mode" meant (from reading the Harmonix FAQs), I switched to it and started beating all my scores in a natural, not forced, way.

But don't those music games have some sort of tool to calibrate and compensate for lag? I can feel the lag in an FPS, but to me it feels like turning on mouse smoothing in a PC game.
 
Image-based motion compensation for games can be done, but as stated earlier it produces additional lag, and there are always some quality issues (image artifacts) present.

However, much better algorithms can be developed by game developers: as opposed to video footage, the game image is composed of transformed 3D primitives. This gives the game much more information to speed up motion compensation (or estimation) and to increase its quality (a rough sketch of the idea follows after the list):

- The game knows which object each rendered pixel belongs to.
- The game knows the transformation matrix of the camera and of every object; it also knows the last frame's matrices, and therefore the velocity of these entities.
- By using the physics engine's integrator (or just linear interpolation), the game can predict the object and camera matrices pretty well. Because of this, the additional input lag can be reduced (or completely eliminated) by various methods.
- The game can easily motion-compensate only the game camera view (the 3D viewport). All translucent UI elements, for example, can be rendered on top of every frame without motion compensation. Image-based methods in TV sets cannot motion-compensate alpha-blended geometry at all, and struggle with small UI components and text floating on top of a moving background.
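As a rough illustration of the kind of information a game has that a TV does not, here is a small sketch (the names and numpy usage are my own, not from the post) of computing a screen-space motion vector for a surface point from the previous and current view-projection matrices. A renderer would do this per pixel or per vertex on the GPU and use the resulting vectors to place interpolated frames; translucent UI would simply be composited afterwards.

```python
# Sketch: screen-space motion vector of a world-space point between two frames,
# given each frame's 4x4 view-projection matrix. Purely illustrative.
import numpy as np

def project_to_pixels(view_proj, point_ws, width, height):
    """Project a world-space point to pixel coordinates."""
    clip = view_proj @ np.append(point_ws, 1.0)       # homogeneous clip space
    ndc = clip[:2] / clip[3]                          # perspective divide -> [-1, 1]
    return np.array([(ndc[0] * 0.5 + 0.5) * width,
                     (0.5 - ndc[1] * 0.5) * height])  # flip y for pixel coords

def motion_vector(vp_prev, vp_curr, point_prev_ws, point_curr_ws, width, height):
    """How far (in pixels) this surface point moved on screen since the last frame."""
    prev_px = project_to_pixels(vp_prev, point_prev_ws, width, height)
    curr_px = project_to_pixels(vp_curr, point_curr_ws, width, height)
    return curr_px - prev_px   # an interpolated frame shifts the pixel along a fraction of this
```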
 