35 mm movie film cameras vs HD cameras

suryad

Veteran
I don't know if this is the right place to start this thread, but since 35 mm is supposed to have better resolution, I was wondering what the difference is between that and an HD camera. Maybe I am missing something obvious, but I don't get it.

I guess HD can be recorded directly to a hard drive or something like that, resulting in a tapeless workflow... sorta like the P2 system that Panasonic has. I can see that being the only real advantage, because according to Wikipedia, 35 mm film provides more dynamic range and picks up finer detail. Anyone care to enlighten me? :) Thanks!
 
I have been told that there is no such thing as a fixed resolution for film, as there is a multitude of different stocks with a multitude of different properties. Secondly, that digital HD is now "good enough" to replace film, but that there will be a significant delay in the turnover due to *people*. People who carry a century of accumulated knowledge of film. Who know the differences inherent to different stocks, know what results they'll yield in different settings, know how to light accordingly, and so on...

That's the kind of knowledge that isn't made obsolete by the digital revolution, so these days the choice for a particular project basically boils down to the kind of people who are making it.

Eventually, though, I'm sure digital will mostly supersede film. As a new generation of people gets better at making it look "right", the workflow benefits will outweigh the experience of those who went before.
 
How many pixels (or whatever the proper name for them is) are in an HD camera CCD?
No idea,
but 1 gram of the light-sensitive stuff in film would contain 600,000,000,000,000,000,000,000/189 molecules. You do the math :D
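For rough numbers, that's just Avogadro's number over a molar mass (the /189 presumably being an approximate molar mass in g/mol for the silver halide); a quick sketch of the arithmetic:

```python
# Rough count of light-sensitive molecules per gram of silver halide,
# using the figures from the post above: Avogadro's number / ~189 g/mol.
AVOGADRO = 6.022e23            # molecules per mole
MOLAR_MASS_G_PER_MOL = 189.0   # approximate molar mass assumed above

molecules_per_gram = AVOGADRO / MOLAR_MASS_G_PER_MOL
print(f"{molecules_per_gram:.2e} molecules per gram")   # ~3.2e21
```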
 
If every molecule were a pixel that would be a valid comparison, but it's not. The light-sensitive grains come in rather large chunks, AFAIK roughly comparable to the pixel sizes in CCDs. Besides, a single molecule cannot possibly represent more than a single bit.
The most limiting factor is likely the optics anyway, though. Camera manufacturers have gone way too far in the megapixel race IMHO, at least on the consumer side.
 
How many pixels (or whatever the proper name for them is) are in an HD camera CCD?
Current high-end equipment is 4520×2540, I believe, with proponents of digital video gunning for an upcoming 7680×4320 standard. Digital projection standards for the end result currently sit at 2048×1080 and 4096×2160.
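For comparison, a quick tally of the raw pixel counts of those formats (1920×1080 added for reference):

```python
# Raw pixel counts of the resolutions mentioned above, in megapixels.
formats = {
    "high-end digital cinema sensor": (4520, 2540),
    "proposed 7680x4320 standard":    (7680, 4320),
    "2K digital projection":          (2048, 1080),
    "4K digital projection":          (4096, 2160),
    "1080p HD (for reference)":       (1920, 1080),
}

for name, (w, h) in formats.items():
    print(f"{name:32s} {w}x{h} = {w * h / 1e6:5.2f} MP")
```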
 
I don't know if this is the right place to start this thread, but since 35 mm is supposed to have better resolution, I was wondering what the difference is between that and an HD camera. Maybe I am missing something obvious, but I don't get it.

I guess HD can be recorded directly to a hard drive or something like that, resulting in a tapeless workflow... sorta like the P2 system that Panasonic has. I can see that being the only real advantage, because according to Wikipedia, 35 mm film provides more dynamic range and picks up finer detail. Anyone care to enlighten me? :) Thanks!
This may be off topic:
http://www.cst.fr/IMG/pdf/35mm_resolution_english.pdf
 
Most digital "HD" cams pretty much suck imo. Their actual resolution is less than 1080p because some technical shit (filters, nyquist theory, which also applies to scanned film) and they typically have much worse dynamic range. I believe the only reason they are used is because you can just shoot something and then easily import into a PC and look at it and edit it
Raw 35MM film has much higher resolution than 1080P but most,if not film is scanned at 2K so it's resolution technically is also a bit less than 1080P.
I have heard of some blurays coming out in a bit that are supposed to have been scanned at up to 6K so they should look utterly amazing.
 
but most, if not all, film is scanned at 2K, so its resolution is technically also a bit less than 1080p.
"2K" as in scanning resolution/digital projection resolution is referring to the vertical max resolution of 2048(×1080), so it'll in fact be a bit higher than 1080P (though, effectively only for aspect ratios wider than 16:9).
 
It's probably scanned and processed at 2K anyway even if it's shot on film... so meh, 2K was good enough for LotR.
 
Most digital "HD" cams pretty much suck imo. Their actual resolution is less than 1080p because some technical shit (filters, nyquist theory, which also applies to scanned film) and they typically have much worse dynamic range. I believe the only reason they are used is because you can just shoot something and then easily import into a PC and look at it and edit it
Raw 35MM film has much higher resolution than 1080P but most,if not film is scanned at 2K so it's resolution technically is also a bit less than 1080P.
I have heard of some blurays coming out in a bit that are supposed to have been scanned at up to 6K so they should look utterly amazing.

So since the IMAX-format portions of The Dark Knight were, I guess, downsampled to 1080p, is that why the IMAX bits look a lot better than the non-IMAX bits in the movie? Talking about the Blu-ray version here...
 
"2K" as in scanning resolution/digital projection resolution is referring to the vertical max resolution of 2048(×1080), so it'll in fact be a bit higher than 1080P (though, effectively only for aspect ratios wider than 16:9).
You are not understanding what I am saying.
Google Nyquist theory.
So since the IMAX-format portions of The Dark Knight were, I guess, downsampled to 1080p, is that why the IMAX bits look a lot better than the non-IMAX bits in the movie? Talking about the Blu-ray version here...
Well, IMAX film is 65mm and I believe it is scanned at 4K.
Btw, The Dark Knight's 35mm scenes also look better than most (if not all) other movies because they never used a digital intermediate for them. Apparently most movies get color correction, CGI and all that shit done at a lower resolution than the photography; not so with The Dark Knight, it went straight to the negative.
http://www.moviemaker.com/cinematog...ght_christopher_nolan_wally_pfister_20080714/
I only wish I had a better display :oops: One that actually did blacks..
 
If every molecule were a pixel that would be a valid comparison, but it's not. The light-sensitive grains come in rather large chunks, AFAIK roughly comparable to the pixel sizes in CCDs. Besides, a single molecule cannot possibly represent more than a single bit.
The most limiting factor is likely the optics anyway, though. Camera manufacturers have gone way too far in the megapixel race IMHO, at least on the consumer side.
I think the main problem is not the optics, but the sensor:

1. In recent generations of products the "pixels" are too small and cause a lot of digital noise, which is removed by various processing techniques that remove fine detail too.

2. The Bayer mask is a very limiting factor: it causes a lot of aliasing artifacts, and the demosaic filter tends to blur.

3. The optical low-pass filter, which prevents artifacting from the Bayer mask, destroys fine detail (image source: Zeiss):
[image: effect of the optical low-pass filter on fine detail, source: Zeiss]


4. As for film, this shot was taken using cheap supermarket ISO 200 35mm film:

[image: full frame shot on cheap ISO 200 35mm film]


and this crop is from a 40MP scan:

[image: crop from the 40MP scan of the same frame]


I don't know of any digital consumer product that would be able to show this level of per-pixel detail at a 40MP enlargement, which is why I think the main limiting factor in most situations is the digital sensor with its Bayer mask, not the optics...

Unfortunately, the majority of new digital cameras use a single sensor with a Bayer mask instead of a decent 3-CCD full-color setup. It's cheaper, and it's easier to achieve "HD" resolution. No one cares that 2/3 of the image is created by simple digital interpolation...
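To put a number on that interpolation claim: with an RGGB Bayer layout each photosite measures one of the three colour channels, and the demosaic step estimates the other two from neighbours, so roughly two thirds of the colour values in the output are interpolated. A minimal tally (plain RGGB pattern assumed):

```python
# Measured vs interpolated colour values on an RGGB Bayer sensor:
# each photosite records one channel; demosaicing fills in the rest.
width, height = 1920, 1080
photosites = width * height

measured_samples = {
    "R": photosites // 4,   # one red sample per 2x2 block
    "G": photosites // 2,   # two green samples per 2x2 block
    "B": photosites // 4,   # one blue sample per 2x2 block
}

values_needed  = 3 * photosites                 # full RGB at every pixel
values_sampled = sum(measured_samples.values())
print(f"measured:     {values_sampled / values_needed:.0%}")      # ~33%
print(f"interpolated: {1 - values_sampled / values_needed:.0%}")  # ~67%
```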
 
You are not understanding what I am saying.
Google Nyquist theory.
Snark much? What you were saying implies that film scanners today can't have higher optical resolutions than the "2K" output resolution. They do. At the very least enough lines for a 4K scan. Unless they were constrained for speed during production of the print or simply didn't care, I'd assume they were using the oversampled resolution.

Edit: Well, considering 2K digital editing of feature films is now, what, 12-14 years old, there is probably still equipment out there that doesn't oversample. A quick Google search revealed tech data for a "breakthrough" 1996 high-speed Philips scanner stating its specs as 1920×1792 (half colour). The point still stands, though. Saying that a film scan today at 2048×1080 is "a bit less than 1080p" by simply referring to Nyquist isn't accurate.
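To spell out the sampling side of the argument: whatever does the sampling can resolve at most half as many line pairs as it has samples across the frame, and real-world filtering pushes the usable figure lower, which is exactly why oversampling at the scanner matters. A rough sketch (the 0.7 "practical" factor is purely an illustrative assumption, not a spec):

```python
# Nyquist limit: N samples across the frame resolve at most N/2 line pairs.
# The practical_factor below is an illustrative assumption, not a measurement.
def resolvable_line_pairs(samples, practical_factor=1.0):
    return samples / 2 * practical_factor

for name, samples in [("1080p output (1920 wide)", 1920),
                      ("2K output   (2048 wide)", 2048),
                      ("4K output   (4096 wide)", 4096)]:
    ideal = resolvable_line_pairs(samples)
    rough = resolvable_line_pairs(samples, practical_factor=0.7)
    print(f"{name}: ideal {ideal:.0f} lp, ~practical {rough:.0f} lp")
```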
 
The highest-resolution DSLR is 24.5MP; HD movie cameras have much lower resolution. Film is equivalent to around 16MP tops, but it has much higher dynamic range than digital. Digital is much cheaper to operate; film is very expensive.
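That "~16MP" equivalence depends entirely on what resolving power you assume for the stock and lens. A hedged back-of-the-envelope version, assuming a Super 35 full-aperture frame of roughly 24.9×18.7 mm and a few illustrative lp/mm figures (assumptions, not measurements):

```python
# Rough "megapixel equivalent" of a 35mm motion picture frame.
# Frame size assumed ~24.9 x 18.7 mm (Super 35 full aperture);
# the lp/mm figures are illustrative assumptions, not measured data.
FRAME_W_MM, FRAME_H_MM = 24.9, 18.7

for lp_per_mm in (50, 80, 100):
    px_w = FRAME_W_MM * lp_per_mm * 2   # 2 pixels per line pair (Nyquist)
    px_h = FRAME_H_MM * lp_per_mm * 2
    print(f"{lp_per_mm:3d} lp/mm -> {px_w:.0f} x {px_h:.0f} "
          f"= {px_w * px_h / 1e6:.1f} MP")
```

Somewhere around 80-100 lp/mm of delivered detail is what it takes to land near that 16MP figure.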
 
Cameras we bought: http://catalog2.panasonic.com/webap...ns-_-Right Hand Promo-_-New Product AJ-HPX170

and

http://catalog2.panasonic.com/webap...d=112115&catGroupId=34401&surfModel=AG-HPX500

and this lens for the big camera http://www.fujinon.com/Broadcast/Product.aspx?cat=1029&id=1078

P2 is an awesome workflow and an awesome codec, but we bought it because we wanted HD right out of the camera. The images it makes are quite stunning. I will post some clips once the filming is done. But whatever we tested, it looked badass.

I am more used to the HD side of things and understand that, but film, I guess, is mysterious to me.
 
It's probably scanned and processed at 2K anyway even if it's shot on film... so meh, 2K was good enough for LotR.

LotR is an awesome movie, but I don't think it is a great example for clarity and detail and all that in a screenshot. It had an awful lot of grain in the DVD releases. It's not sharp at all, but I guess that adds to the immersiveness of a fantasy story. I would definitely like to see the Blu-ray release of it to see if it still has that blurry, sandy, grainy look to it.

That is another thing... when they make these Blu-ray releases, do they go back to the source footage that was created after editing and all that and just downscale from there to the 1080p version? So that would mean going from 2K to 1080p, right? Or is it something different?
 
Snark much? What you were saying implies that film scanners today can't have higher optical resolutions than the "2K" output resolution. They do. At the very least enough lines for a 4K scan. Unless they were constrained for speed during production of the print or simply didn't care, I'd assume they were using the oversampled resolution.

Edit: Well, considering 2K digital editing of feature films is now, what, 12-14 years old, there is probably still equipment out there that doesn't oversample. A quick Google search revealed tech data for a "breakthrough" 1996 high-speed Philips scanner stating its specs as 1920×1792 (half colour). The point still stands, though. Saying that a film scan today at 2048×1080 is "a bit less than 1080p" by simply referring to Nyquist isn't accurate.
It is totally accurate; maybe you should search AVSForum or something.
I never said today's film scanners weren't good enough, quit putting words in my mouth. What I said is that 2K scans aren't 100% good enough for 1080p; 4K is. That's why upcoming films will start using 4K scans (there's even one that uses 6K), not because some new 2160p HD shit is coming out, but because it will look better.
Here's a link to a quality thread btw: http://www.avsforum.com/avs-vb/showthread.php?t=825993&page=1
LotR is an awesome movie, but I don't think it is a great example for clarity and detail and all that in a screenshot. It had an awful lot of grain in the DVD releases. It's not sharp at all, but I guess that adds to the immersiveness of a fantasy story. I would definitely like to see the Blu-ray release of it to see if it still has that blurry, sandy, grainy look to it.

That is another thing... when they make these Blu-ray releases, do they go back to the source footage that was created after editing and all that and just downscale from there to the 1080p version? So that would mean going from 2K to 1080p, right? Or is it something different?

+1 LOTR looks like shite.
 
what I said is that 2K scans aren't 100% good enough for 1080p; 4K is.
And I'm saying you're mixing apples and oranges. Nyquist applies to the sampling resolution, and both "2K" and 1080p are output resolutions. Stating that one output resolution is lesser than another output resolution with the same number of lines is just nonsense. They're the same. A "2K" scan today may very well be oversampled 4× or more, just as the 1996 machine needed 2× colour interpolation. Same "2K" output resolution, different sample resolution.
 
And I'm saying you're mixing apples and oranges. Nyquist applies to the sampling resolution, and both "2K" and 1080p are output resolutions. Stating that one output resolution is lesser than another output resolution with the same number of lines is just nonsense. They're the same. A "2K" scan today may very well be oversampled 4× or more, just as the 1996 machine needed 2× colour interpolation. Same "2K" output resolution, different sample resolution.
You aren't understanding me, I even gave you a link... I don't know what more I can do.
2K would be perfect if film scanners were perfect, but they're not.
 
Look. Take a 2MP still camera and shoot a resolution test. Compare that to a 6MP camera (with the same quality optics and CCD size) shooting a 2MP image, and to the latter camera shooting a 6MP image that you later downscale yourself to 2MP.

A 2K film scan with modern equipment these days is the second example, and it will have captured the same effective resolution as the output of the last one. If you're not going to do any resolution-degrading post-processing and you're not going to use more than 2MP in your final print, the latter way will only cost you more time and processing power for no increase in visual fidelity.
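A toy 1-D version of that comparison, purely to show the principle (numpy; the signal and sampling counts are made up for illustration): sampling right at the target resolution aliases detail above the Nyquist limit into a false low-frequency pattern, while sampling at 3× and averaging back down keeps the same output size but knocks the alias down.

```python
import numpy as np

# Toy 1-D version of the "6MP camera downscaled to 2MP" comparison.
# The scene has fine detail at 1500 cycles, above the Nyquist limit
# (1000 cycles) of a 2000-sample output grid.
def scene(x):
    return np.sin(2 * np.pi * 1500 * x)

target = 2000                                       # "2MP" output samples
direct = scene(np.arange(target) / target)          # sampled at output res

oversampled = scene(np.arange(3 * target) / (3 * target))   # sampled at 3x
downscaled  = oversampled.reshape(target, 3).mean(axis=1)   # 3x box filter

def amplitude_at(sig, cycles):
    # Magnitude of one frequency component via the DFT (record spans [0, 1)).
    return abs(np.fft.rfft(sig)[cycles]) * 2 / len(sig)

# The 1500-cycle detail folds back to 500 cycles on a 2000-sample grid.
print("false 500-cycle alias, direct sampling:",
      round(amplitude_at(direct, 500), 3))        # ~1.0 (full amplitude)
print("false 500-cycle alias, 3x oversampled :",
      round(amplitude_at(downscaled, 500), 3))    # ~0.33 (attenuated)
```

A plain box average is a crude anti-alias filter; a real scanner or resizer would do better, but even this shows why oversampling and then downscaling to the same output resolution captures the scene more faithfully.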
 