For CG though, I think it's just economy.
If economy includes time and computing power, then yes indeed. Rendering more than double the frames... yikes. These movies take long enough to produce as it is for a 90-minute feature.
I would guess that character animation would become harder as well, due to the increased temporal resolution.
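To put the render-cost worry in rough numbers: frame count scales linearly with frame rate, so doubling 24fps doubles the farm time. A toy calculation (the hours-per-frame figure is made up purely for illustration):

```python
# Back-of-envelope frame counts for a 90-minute CG feature at 24 vs 48 fps.
# HOURS_PER_FRAME is a hypothetical render-farm figure, not a real studio number.
RUNTIME_MINUTES = 90
HOURS_PER_FRAME = 8

for fps in (24, 48):
    frames = RUNTIME_MINUTES * 60 * fps
    print(f"{fps} fps: {frames:,} frames, ~{frames * HOURS_PER_FRAME:,} node-hours")
# 24 fps: 129,600 frames, ~1,036,800 node-hours
# 48 fps: 259,200 frames, ~2,073,600 node-hours
```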
The technologies proposed by Panasonic for 3D imagery storage, transfer, etc., all utilize existing standard technology. Image encoding uses the two-channel encoding function implemented in the Moving Picture Experts Group's MPEG-4 Advanced Video Coding (AVC) / H.264 standard. The second channel stores only the data that differs from channel one, holding the increase in data volume to about 1.5 times. The HDMI standard is used to transfer data from the player to the television, with left- and right-eye images alternating in single-field (single-frame) units.
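If it helps to see the difference-channel idea concretely, here's a toy sketch of storing one eye in full and the other as a sparse delta against it. This is just the intuition, not the actual AVC bitstream logic:

```python
# Illustrative sketch of difference-channel stereo encoding: store the left eye
# in full, and for the right eye store only where it differs from the left.
# NOT the real MPEG-4 AVC implementation, just the idea behind it.

def encode_stereo(left_frame: list[int], right_frame: list[int]):
    """Return the full left channel plus a sparse right-eye delta."""
    delta = [(i, r - l) for i, (l, r) in enumerate(zip(left_frame, right_frame)) if r != l]
    return left_frame, delta  # stereo pairs differ little, so the delta stays small

def decode_right(left_frame: list[int], delta: list[tuple[int, int]]) -> list[int]:
    right = list(left_frame)
    for i, d in delta:
        right[i] += d
    return right

left = [10, 20, 30, 40]
right = [10, 22, 30, 40]  # small parallax difference between the eyes
base, delta = encode_stereo(left, right)
assert decode_right(base, delta) == right
```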
But the mind-blowing didn't end there. For part two of its tech demonstration, Polyphony Digital (PD) showed off a playable demonstration of GT5 on a Sony-developed Nano-Spindt Field Emission Display (FED), which is capable of frame rates of 240fps. Again, four PS3s were used in the demo to achieve the 240fps mark, four times that of a single PS3 console. In case you're wondering, 240fps is actually faster than the human eye can comprehend, and the resulting effect on the game produced an image that was described as "following a real world event happening right in front of your face with your own eyes... any and all flickering in the movement of the vehicle, in the smoke from the tires, etc. are completely gone and you are almost tricked into believing you are watching something in real life."
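The write-up doesn't say how the four consoles were combined, but the natural reading of "240fps from four PS3s" is round-robin interleaving: each machine renders every fourth frame at 60fps and the display draws them in sequence. A toy scheduler under that assumption:

```python
# Toy round-robin frame scheduler: num_sources machines each rendering at some
# base rate, interleaved into one stream at num_sources times that rate.
# Purely illustrative; the actual PD/Sony demo setup is not documented here.
def interleave_schedule(num_sources: int, total_frames: int):
    return [(frame, frame % num_sources) for frame in range(total_frames)]

for frame, source in interleave_schedule(num_sources=4, total_frames=8):
    print(f"display frame {frame} <- console {source}")
```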
Isn't the limit of what eyes can detect somewhere around 72fps? I don't think anyone can see any difference above 80fps, but maybe I'm mistaken; possibly when objects move really fast, it makes a difference after all.

Just consider a real-life (TM) scene with a moving object that's lit by a strobe light. If you increase the rate of the flashes, you can still make the individual "frames" noticeable by increasing the speed of the object.
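That strobe intuition is just arithmetic: the gap between the object's apparent positions is speed divided by flash rate, so doubling the rate is undone by doubling the speed. For instance:

```python
# Per-flash displacement of an object lit by a strobe: gap = speed / flash_rate.
# Whatever the flash rate, a fast enough object keeps the gaps visible.
for speed_m_s, flash_hz in [(10, 72), (10, 144), (20, 144)]:
    gap_cm = speed_m_s / flash_hz * 100
    print(f"{speed_m_s} m/s at {flash_hz} Hz -> {gap_cm:.1f} cm between flashes")
# 10 m/s @  72 Hz -> 13.9 cm gaps
# 10 m/s @ 144 Hz ->  6.9 cm gaps (finer)
# 20 m/s @ 144 Hz -> 13.9 cm gaps again: doubling speed undoes doubling the rate
```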
James Cameron:
http://www.variety.com/article/VR1117983864.html?categoryid=1009&cs=1
"I've run tests on 48 frame per second stereo and it is stunning. The cameras can do it, the projectors can (with a small modification) do it. So why aren't we doing it, as an industry?"
"But 4K doesn't solve the curse of 24 frames per second. In fact it tends to stand in the way of the solutions to that more fundamental problem."
James Cameron said: The DLP chip in our current generation of digital projectors can currently run up to 144 frames per second, and they are still being improved. The maximum data rate currently supports stereo at 24 frames per second or 2-D at 48 frames per second. So right now, today, we could be shooting 2-D movies at 48 frames and running them at that speed. This alone would make 2-D movies look astonishingly clear and sharp, at very little extra cost, with equipment that's already installed or being installed.
James Cameron said: People have been asking the wrong question for years. They have been so focused on resolution, and counting pixels and lines, that they have forgotten about frame rate. Perceived resolution = pixels x replacement rate. A 2K image at 48 frames per second looks as sharp as a 4K image at 24 frames per second ... with one fundamental difference: the 4K/24 image will judder miserably during a panning shot, and the 2K/48 won't. Higher pixel counts only preserve motion artifacts like strobing with greater fidelity. They don't solve them at all.
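Cameron's heuristic checks out as plain arithmetic if you take the horizontal pixel counts (2K = 2048, 4K = 4096):

```python
# Cameron's heuristic: perceived sharpness ~ pixels * replacement rate.
print(2048 * 48)  # 2K at 48 fps -> 98304
print(4096 * 24)  # 4K at 24 fps -> 98304, exactly the same product
```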
Then they could simply downconvert the frame rate to 24p to accommodate those lower-end theater projectors. See, problem solved.
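And for an even 2:1 ratio like 48p to 24p, the downconversion really is trivial: keep every other frame. A minimal sketch (non-integer ratios like 30p to 24p would need pulldown or blending instead; this assumes the easy 2:1 case):

```python
# Minimal 48p -> 24p downconversion: an integer 2:1 ratio just drops
# every other frame. Frame objects here are placeholder strings.
def decimate(frames: list, factor: int = 2) -> list:
    return frames[::factor]

source_48p = [f"frame_{i}" for i in range(8)]
print(decimate(source_48p))  # ['frame_0', 'frame_2', 'frame_4', 'frame_6']
```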
Directors and filmmakers with a vision for their dream project aren't choosing 24p just because there's a low-end, lowest common denominator to account for. That would just be asinine. I mean, if that were the case, then why stop there? Why not shoot their films in 720x480p since that's the resolution that most people will ultimately be watching their DVDs in anyway?
24p is an artistic consideration, period. Don't argue with me on this.
http://en.wikipedia.org/wiki/24p
Moviegoers and filmmakers alike are accustomed to the look and feel of the motion of a 24fps film; this long-held aesthetic preference is why 24fps remains the longstanding, unshakable standard for theatrical films and most filmed television today. I've found that videogamers are generally SO oblivious and naive about this fact; they all think that "more is better" or "bigger numbers are better" on an absolute basis. This is just not the case for me, and MANY others.
Peter Jackson, Steven Soderbergh, George Lucas, Bryan Singer (Superman), and others have all the funds they need. When you're operating at the budgetary level they are, a factor of 2.5x in digital disk space isn't going to mean that much to them.
This 120 Hz technology, I guess, would add a lot of lag. If some people don't believe lag is important, try playing a music game!

Which the vast majority of gamers wouldn't even notice. Most LCD TVs out there don't have a game mode, and have extreme lag.
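For what it's worth, the lag isn't mysterious: motion interpolation has to buffer future source frames before it can synthesize in-betweens, so there's a hard lower bound on the added delay. Rough numbers, assuming a 60Hz input (the buffer depths are illustrative, not measurements of any particular TV):

```python
# Lower bound on motion-interpolation lag: the TV must hold future source
# frames before it can synthesize in-between frames. Illustrative numbers only.
def min_lag_ms(source_fps: float, buffered_frames: int) -> float:
    return buffered_frames / source_fps * 1000

for n in (1, 2, 3):
    print(f"{n} buffered 60 Hz frame(s) -> at least {min_lag_ms(60, n):.1f} ms added")
# 1 -> 16.7 ms, 2 -> 33.3 ms, 3 -> 50.0 ms, before any processing time at all
```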
While playing Rock Band, once I learned what "Game Mode" meant (from reading the Harmonix FAQs), I switched to it and started beating all my scores in a natural, not forced way.