1080p Dilemma

Well, the exact number of pixels is a bit different, but 2K and 1080p are basically the same.

edit: VFX 2K is usually 2048*1536, which is then letterboxed or scaled down; it depends on the material (like for anamorphic stuff).

4K is only used for IMAX most of the time; there have been a few cases where movie VFX was rendered at higher resolutions, but they're the minority. I think one case was the space scenes in Solaris, where there was a lot of fine geometry on the space station, but it was a 'cheap' render - simple scenes, shaders and lighting. Transformers 2, on the other hand, really taxed ILM on the 4K IMAX scenes.
Oh and just because you can watch a movie in IMAX, it does not automatically mean that it's a 4K render either ;) Many times it's just upscaled 2K material.
 
Not Ok! NOk!
http://www.youtube.com/watch?v=ZQ_Q2b_aXjk#t=6s

:3

 
If you want to do IMAX 3D then your render times are going to get 8 times higher, and that's a LOT. Even if your movie is stereo from the start, it's still a 4x increase. This will affect the entire production as well, as you absolutely have to do all your dailies at that res to catch any problems, and you'll have to increase asset quality for close-ups and so on. Very few studios have the computing capacity to accommodate that.
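A rough back-of-the-envelope of where those 4x/8x figures come from, assuming render cost scales more or less linearly with pixel count (it doesn't exactly, but it's close enough to make the point); the frame sizes below are just illustrative DCI-style numbers, not any studio's actual spec:

```python
# Assumption: render cost ~ pixel count; frame sizes are illustrative.
px_2k = 2048 * 1080            # one mono "2K" frame
px_4k = 4096 * 2160            # one "4K" frame = 4x the pixels

mono_2k   = px_2k              # the usual baseline
stereo_4k = 2 * px_4k          # IMAX 3D: two eyes, both at 4K

print(px_4k / px_2k)             # 4.0 -> 4x from the resolution bump alone
print(stereo_4k / mono_2k)       # 8.0 -> 8x vs. a mono 2K production
print(stereo_4k / (2 * px_2k))   # 4.0 -> "only" 4x if you were already stereo
```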

Oh and there's HFR as well ;) although I don't think it can be paired with IMAX. Still, going above 2K for movie VFX and CG animation is a very rare thing.
 
Oh and just because you can watch a movie in IMAX, it does not automatically mean that it's a 4K render either ;) Many times it's just upscaled 2K material.
Err, aren't a lot of "digital IMAX" theaters using 2K projectors anyway? Or am I mistaken...

Such nonsense doesn't surprise me, though; all I know for sure is that IMAX has experienced some of the most hilariously steep brand dilution ever in the "digital IMAX" age. What used to be called "IMAX" is mind-blowingly cool; what gets called "IMAX" today feels like little more than a basic multiplex room using a brand name to boost ticket costs.
 
You're right about that; the local theatre has also 'upgraded' from analog to 2K digital, although they promise to get up to 4K within a year or two.
 
You're right about that; the local theatre has also 'upgraded' from analog to 2K digital, although they promise to get up to 4K within a year or two.

Maybe MS/343i should do a limited release run of the Halo productions in theatres. *cough*
 
Well I did see Halo CE played in a movie theater many years ago ;) At 640*480 and all! :D
 
Would be nice to require 1080@60 or 720@60; that's the kind of thing a console maker could put in place that players would appreciate. Ideally they'd make 1080@60 mandatory, or both 1080@30 & 720@60 with the player picking whichever he prefers.

Exactly. Console makers should just force 1080p/60Hz, vsync'd.

Starting from that, devs would scale their engines and games accordingly. I would love such a gaming world... but DF would have no job!

Console makers could really differentiate their product from PC/laptop gaming by enforcing a fixed spatial and temporal resolution.
 
Actually that's not how it's done; brute-force supersampling wouldn't be good enough, as movie VFX and CG animation cannot have any aliasing at all.
Offline renderers use sophisticated antialiasing methods - they do supersample, but much more cleverly. Pixar's RenderMan in particular (used on most movies up until a few years ago) is especially elaborate, as it decouples shading from sampling and uses stochastic patterns; most other renderers have all kinds of trickery as well, like adaptive supersampling and so on.
Some info on PRMan:
http://www.hradec.com/ebooks/CGI/RMS_1.0/mtor/rendering/Renderman_Globals/rg-reyes.html
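To make the "cleverer than brute force" point a bit more concrete, here's a toy Python sketch of adaptive, stochastic sampling. It's purely illustrative (it is not PRMan's actual REYES pipeline, and the shade() function is a made-up stand-in): jittered samples get added only where the existing ones disagree, so edge pixels receive many samples while flat regions stay cheap.

```python
import random

def shade(x, y):
    """Stand-in for an expensive shading function: a hard-edged disc."""
    return 1.0 if (x * x + y * y) < 0.25 else 0.0

def pixel_colour(px, py, base=4, max_samples=64, threshold=0.05):
    # start with a few stochastic (jittered) samples inside the pixel
    samples = [shade(px + random.random(), py + random.random())
               for _ in range(base)]
    # adaptively add samples only while the pixel shows high contrast
    while max(samples) - min(samples) > threshold and len(samples) < max_samples:
        samples.append(shade(px + random.random(), py + random.random()))
    return sum(samples) / len(samples)

print(pixel_colour(2.0, 2.0))    # flat region: stops after the base samples
print(pixel_colour(0.0, -0.5))   # pixel on the disc's edge: usually refines a lot
```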

Most movie material is rendered at about 2000 pixels of horizontal resolution (referred to as 2K) but has practically no aliasing at all. When CG is composited into live action, there's also some slight blur, DOF, and film grain applied in post, but no artifacts are left in the source image despite that.

You are right though that when it's converted to DVD resolution - 720 * 576 or so - the process kind of acts like an additional step of supersampling.
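Just to illustrate that last bit, a trivial sketch of why downscaling behaves like an extra supersampling pass; it assumes a clean 2:1 reduction for simplicity, which the real 2K-to-DVD conversion is not:

```python
def downscale_2x(image):
    """image: list of rows of grey values; returns a half-size image."""
    out = []
    for y in range(0, len(image) - 1, 2):
        row = []
        for x in range(0, len(image[y]) - 1, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4.0)   # each output pixel averages 4 source samples
        out.append(row)
    return out

src = [[0, 1, 0, 1],
       [1, 0, 1, 0],
       [0, 0, 1, 1],
       [1, 1, 0, 0]]
print(downscale_2x(src))   # [[0.5, 0.5], [0.5, 0.5]] - the jaggies average out
```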
When did classic/older movies start to be rendered/recorded at high resolutions? I mean, many "old" movies look very nice on Blu-ray despite being from yesteryear.

I wonder if we will ever see technologies like that in videogames, in realtime. sebbbi mentioned several times that taking advantage of recurring pixels could also be key; I guess luminance and chroma separation might help a lot to achieve that?

Well I did see Halo CE played in a movie theater many years ago ;) At 640*480 and all! :D
I wonder how good it looked.
 
When did classic/older movies start to be rendered/recorded at high resolutions? I mean, many "old" movies look very nice on Blu-ray despite being from yesteryear.
What's "high resolution" to you? There aren't really clear cutoffs in when things changed. Quality of film and recording tended to improve over time, but degradation is often a bigger issue than the original recording quality. Many old films that are artifact-laden garbage with flickering colors today were pristine and crystal-clear when first viewed in theaters.
 
What's "high resolution" to you? There aren't really clear cutoffs in when things changed. Quality of film and recording tended to improve over time, but degradation is often a bigger issue than the original recording quality.
What Laa-Yosh defined as films being rendered at 2K. I mean, even movies made in the golden era of the DVD and older can look good on a Blu-ray these days, and I wonder if the original footage was recorded at high resolutions, as if it were future-proof.

Old videogames can be rendered at superbly high resolutions now, and sometimes you can see the great tech they used at the time. I played F-Zero GX on my laptop at 2600K (the fps were so poor though, 5-10 fps on average) but I captured some images and it was crazy how good it looked.

Many old films that are artifact-laden garbage with flickering colors today were pristine and crystal-clear when first viewed in theaters.
Was that because of the original footage or because of the equipment used in cinemas or recording material back then? That's the question. I love to watch old western movies -scripts are usually very good- and some of them have been re-coloured with fine results.
 
What Laa-Yosh defined as films being rendered at 2K. I mean, even movies made in the golden era of the DVD and older can look good on a Blu-ray these days, and I wonder if the original footage was recorded at high resolutions, as if it were future-proof.

Old videogames can be rendered at superbly high resolutions now, and sometimes you can see the great tech they used at the time. I played F-Zero GX on my laptop at 2600K (the fps were so poor though, 5-10 fps on average) but I captured some images and it was crazy how good it looked.

Was that because of the original footage or because of the equipment used in cinemas or recording material back then? That's the question. I love to watch old western movies -scripts are usually very good- and some of them have been re-coloured with fine results.

If I'm not mistaken, the film used in motion pictures of yesteryear is quite high resolution.
The problems that could occur that affected quality were dust in the lenses of projectors and damage to the film. That's why movies from the 60s on, or even before that, can easily be released on Blu-ray. I could be mistaken.
 
If I'm not mistaken, the film used in motion pictures of yesteryear is quite high resolution.
The problems that could occur that affected quality were dust in the lenses of projectors and damage to the film. That's why movies from the 60s on, or even before that, can easily be released on Blu-ray. I could be mistaken.

There are some interesting articles out there on how much resolution can be resolved from various film stocks. This article, for example:

http://www.kenrockwell.com/tech/film-resolution.htm

...puts Fuji Velvia 35mm film stock at around 175 megapixels in digital equivalent, albeit with film grain and presumably hideously expensive lenses being used.
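For what it's worth, the back-of-the-envelope math behind such megapixel-equivalent claims looks roughly like this, assuming a 36mm x 24mm still frame and about two pixels per resolvable line pair; the lp/mm values are illustrative assumptions, not measurements of any particular stock:

```python
FRAME_W_MM, FRAME_H_MM = 36, 24     # full-frame 35mm still image

def megapixels(lp_per_mm):
    px_per_mm = 2 * lp_per_mm       # ~2 pixels per resolvable line pair
    return (FRAME_W_MM * px_per_mm) * (FRAME_H_MM * px_per_mm) / 1e6

for lp in (50, 80, 160):
    print(f"{lp} lp/mm -> ~{megapixels(lp):.0f} MP")
# 50 lp/mm  -> ~9 MP    (closer to a typical in-camera result with decent glass)
# 80 lp/mm  -> ~22 MP
# 160 lp/mm -> ~88 MP   (high-contrast lab-chart territory)
```

Even the optimistic bottom row lands well under the 175 megapixel headline figure, which gives a feel for how idealized that number is.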
 
When did classic/older movies start to be rendered/recorded at high resolutions? I mean, many "old" movies look very nice on Blu-ray despite being from yesteryear.

Movie VFX was always rendered at 2K res at least, and 35mm film stock has about as much detail in analog form as well.
Film stock is usually very high quality stuff and can be preserved for a very long time when stored properly (especially compared to digital data storage like DVD or tape, which deteriorates pretty fast).
That said, older material is usually processed before BR releases - colors, sharpness, etc. can all be significantly enhanced digitally compared to their actual analog state. And in some cases the original material wasn't stored well and absolutely required restoration, like the original negatives of SW IV.

I wonder if we will ever see technologies like that in videogames, in realtime. sebbbi mentioned several times that taking advantage of recurring pixels could also be key; I guess luminance and chroma separation might help a lot to achieve that?

Sebbbi is talking about different things, although the principle of separating samples for various aspects of an image is sort of the same. But he also suggests making many more trade-offs and sacrificing precision wherever it's not as noticeable, similar to lossy compression techniques. Movie VFX has none of that; in fact, almost everything is done at much higher precision levels, and I don't see that going away. For example, there's now research into moving beyond RGB colors to a spectral representation.
 
What Laa-Yosh defined as films being rendered at 2K. I mean, even movies made in the golden era of the DVD and older can look good on a Blu-ray these days, and I wonder if the original footage was recorded at high resolutions, as if it were future-proof.
The resolution of the film stock was excellent, but lenses and the reproduction process (film-to-film copies) reduced clarity. And even though film potentially offers higher base resolution, it's a pain in the butt to work with. Which would you rather use - a 16 MP digital SLR where you can capture 200 shots on a card, review them immediately, and edit/print all in the same session, or a 150 megapixel* film SLR where you have to keep burning through expensive rolls of 36 shots, then send them to the lab to see what's what, then scan them in to edit and print?

(* that'd be an ideal case, too. The real-world limits of film in use put its resolving power well below the theoretical maximum.)

Hence the world has moved to digital, which means digital cameras. And Joker's link is out of date. There's better imaging tech now such as separate RGB sensors.
 
Yeah, understand that film has analogue "resolution" so it's very hard to compare to digitally rendered images. With proper preservation, it can also keep its quality and detail for a hundred years easily.
 
What Laa-Yosh defined as films being rendered at 2K. I mean, even movies made in the golden era of the DVD and older can look good on a Blu-ray these days, and I wonder if the original footage was recorded at high resolutions, as if it were future-proof.

As Laa-Yosh mentioned in the meantime, lots of footage was/is recorded on analog film. Analog film offers a lot of detail over digital, where you have a fixed number of 'pixels'. Just realize that it took a long time for digital cameras to catch up to analog cameras when actually printing at 300dpi. Because we tend to consume and view most of our digital media on digital screens with low resolution (= lower than 300dpi), it hasn't mattered much, however.

Just take any camera from the '80s, or pictures that you used to print out, and realize how high the print resolution is. With digital cameras getting to 36Mpixels, we are getting close to the detail that analog film of the same size used to/can capture (assuming 35mm film).
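A quick way to sanity-check that 300dpi argument, assuming a 3:2 frame (the numbers are purely illustrative):

```python
def max_print_size_inches(megapixels, dpi=300, aspect=(3, 2)):
    """Largest print (w, h) in inches before an image drops below `dpi`."""
    aw, ah = aspect
    h_px = (megapixels * 1e6 * ah / aw) ** 0.5   # solve w*h = MP with w/h = aw/ah
    w_px = h_px * aw / ah
    return w_px / dpi, h_px / dpi

for mp in (6, 16, 36):
    w, h = max_print_size_inches(mp)
    print(f"{mp} MP -> about {w:.1f} x {h:.1f} inches at 300 dpi")
# 6 MP  -> about 10.0 x 6.7 inches
# 16 MP -> about 16.3 x 10.9 inches
# 36 MP -> about 24.5 x 16.3 inches
```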
 