1080p Dilemma

Just take any camera from the 80s, or pictures that you used to print out, and realize how high the print resolution is. With digital cameras reaching 36 Mpixels, we are getting close to the detail that analog film of the same size can capture (assuming 35mm film).
Actually, many of one's own prints from the 80s will be pretty crap. It was a period of cheap compact cameras with cheap compact lenses and little resolving power, regardless of the film. Decent quality needed a decent camera (SLR).

A 6 MP digital camera from 2003 (Canon EOS 300D) could produce far better printed images on a decent printer than a 1980s compact, although you get even better results now with the same 2003 digital photos. Fuji's digital print process produces results indistinguishable from film prints. If you look under a magnifying glass at a digital or film print, you get the same structure.

Edit: Talking 6x4 glossies. But then for larger prints, film uses medium/large format anyway, so it's not like 35mm was providing enough resolution for a 10" glossy, in professional terms. And at 300 dpi, a 10"x8" photo is 3000x2400 pixels, or only 7 megapixels. Of course at that resolution, you're actually dealing with the issue of Bayer filtering. Take your sensor up to 21/28 MP and you get higher fidelity within the same print density (although I question whether anyone could notice the difference without being an experienced expert!).
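
For reference, the print arithmetic above in one place (a minimal sketch; the dpi and print sizes are just the examples quoted in this post):

```python
# Rough print-resolution arithmetic for the figures quoted above.
def pixels_for_print(width_in, height_in, dpi=300):
    """Return (width_px, height_px, megapixels) for a print at a given dpi."""
    w_px = int(width_in * dpi)
    h_px = int(height_in * dpi)
    return w_px, h_px, (w_px * h_px) / 1e6

print(pixels_for_print(6, 4))    # 6x4 glossy  -> (1800, 1200, ~2.2 MP)
print(pixels_for_print(10, 8))   # 10x8 print  -> (3000, 2400, ~7.2 MP)
```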
 
Shifty, I wasn't exactly referring to compact cameras when I was referring to cameras. It was more of a like-for-like comparison - compact with compact etc. Comparing an EOS 300D to whatever you considered to be compact in the 1980s of course isn't exactly fair, since you are adding the factor of the lens among others.

The post was more directed at comparing digital capture through a CMOS sensor (at finite resolution) to analog film, assuming somewhat comparable lenses/equipment etc. :p
 
Shifty, I wasn't exactly referring to compact cameras when I was referring to cameras. It was more of a like-for-like comparison - compact with compact etc. Comparing an EOS 300D to whatever you considered to be compact in the 1980s of course isn't exactly fair, since you are adding the factor of the lens among others.
Precisely. The lens is more important than the film or sensor in most cases. You have to have a very nice optical setup to get a sharp image. Even high quality lenses have a sweet spot with maximum resolving power.

The post was more directed at comparing digital capture through a CMOS sensor (at finite resolution) to analog film, assuming somewhat comparable lenses/equipment etc. :p
Okay, so the "Just take any camera from the 80ties or pictures that you used to print out..." wasn't quite what you were saying, unless you were talking to professional photographers from the 80s! ;)
 
Perhaps you should re-read my post, in its entirety, which was directed at Cyan, who was questioning why his classic/older movies that he is now enjoying in HD on Blu-ray have excellent resolution and detail. When I said he could take any analog camera from the 80s and some printed-out photos, it was to point out that analog film offers more than enough detail to match a Full HD resolution (a measly ~2 Mpixels), which by any measure is rather 'low res' in photography or printing at 300dpi or greater. Even if we assume that 35mm film on average produces roughly the equivalent detail of a (higher quality) 6 Mpixel camera, it's still more than enough to cover the spectrum of a Full HD resolution.

The talk here isn't about lenses - it's about the medium, that being analog film vs. a CMOS sensor/digital.
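
A quick sanity check on those figures (using the nominal Full HD frame and the 3072x2048 output of a typical ~6 MP sensor such as the 300D mentioned earlier; illustrative numbers only):

```python
# Illustrative pixel counts behind the "~2 Mpixel" Full HD figure above.
full_hd = 1920 * 1080          # ~2.07 million pixels per frame
six_mp  = 3072 * 2048          # ~6.3 MP, e.g. an early-2000s DSLR

print(full_hd / 1e6)           # ~2.07 MP
print(six_mp / full_hd)        # ~3x the pixel count of a Full HD frame
```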
 
AI and physics might have to be limited and environments will likely be sparse or enclosed. Accommodate complex AI and physics and expansive complex environments together in your 16ms refresh and you could end up with incredibly basic looking games which simply aren't very marketable.

We've had 3D games at 60fps en masse way back on the PS2, and since then, with every new console generation, we've gotten hardware that is orders of magnitude better than the generation before. That games would be "incredibly basic looking" is subjective and relative at best - and if anything, TLoU: Remastered and the Tomb Raider DE (not to mention any other game that targets 60fps, like BF4 or any of the CoD games) show that they are anything but, and stack up rather well, even against games that target a lower framerate.

The only reason why they wouldn't be very marketable (I prefer to say they probably wouldn't be as marketable) is precisely because we have games targeting 30fps and as a result have higher expectations, knowing what would be possible with that trade-off. If a console maker were to mandate a target resolution and framerate of, say, 1080p60, we would simply appreciate what developers can do within that target. It's all about expectations. And if a game, or the vision of a developer, isn't possible at that resolution and framerate, perhaps we just have to wait another generation. There will always be a threshold on what is doable and what isn't with finite hardware resources. It's nothing new - and no, 30fps isn't the holy grail that makes anything possible. It's all relative.
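
For reference, the frame-time budgets these targets imply (plain arithmetic, independent of any particular engine):

```python
# Per-frame time budget at the two common console targets.
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
# 30 fps -> 33.33 ms per frame
# 60 fps -> 16.67 ms per frame (the "16ms refresh" discussed above)
```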
 
Perhaps you should re-read my post, in its entirety, which was directed at Cyan, who was questioning why his classic/older movies that he is now enjoying in HD on Blu-ray have excellent resolution and detail. When I said he could take any analog camera from the 80s and some printed-out photos, it was to point out that analog film offers more than enough detail to match a Full HD resolution (a measly ~2 Mpixels), which by any measure is rather 'low res' in photography or printing at 300dpi or greater. Even if we assume that 35mm film on average produces roughly the equivalent detail of a (higher quality) 6 Mpixel camera, it's still more than enough to cover the spectrum of a Full HD resolution.

The talk here isn't about lenses - it's about the medium, that being analog film vs. a CMOS sensor/digital.

I remember reading somewhere it would take ~ 200Mpixel to capture all the detail possible in a 35mm photo using Velvia 50.
 
It might sound a bit strange, but sometimes moving processing to GPU saves BOTH CPU and GPU cycles.

That is a helpful concept. The GPU can be quite effective at determining what doesn't get rendered as well as what does, since both tasks are essentially in the same "parallel/math" domain. From this perspective the trick is to keep as much of the scene and its frame-to-frame transformations resident on the GPU, leaving the CPU to do as little as possible in that transformation. Less "heat" from the CPU and memory controller, and more from the GPU.

Kind of reminds me a bit of Mario Andretti's line: "You'd be surprised how many drivers - including Formula One drivers - still think the brakes are for stopping the car." It's about adjusting vectors in small increments at the limits of adhesion, not creating acute angles that burn up tires and brakes and oil etc. It's thinking about CPU time and memory transactions as resources to be managed, like keeping momentum in a turn.
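
As a rough illustration of that idea (not anyone's actual engine code): the win comes from treating visibility as one big data-parallel test over the whole scene rather than a per-object loop on the CPU. The NumPy sketch below does a simple sphere-vs-plane cull over an array of bounding spheres; in a real engine this kind of pass would live in a compute shader so the scene data never leaves the GPU - NumPy just stands in for the "parallel/math" domain described above.

```python
# Hypothetical sketch: batch frustum culling expressed as array math.
# On real hardware this would be a GPU compute pass over GPU-resident buffers.
import numpy as np

def cull_spheres(centers, radii, planes):
    """centers: (N,3), radii: (N,), planes: (P,4) as (nx, ny, nz, d) with
    inward-facing normals. Returns a boolean mask of spheres to keep."""
    # Signed distance of every sphere centre to every plane: shape (N, P).
    dist = centers @ planes[:, :3].T + planes[:, 3]
    # Keep a sphere only if it is not entirely behind any plane.
    return np.all(dist >= -radii[:, None], axis=1)

# Toy data: 100k objects and a single "near plane" facing +z at z = 0.
rng = np.random.default_rng(0)
centers = rng.uniform(-100, 100, size=(100_000, 3))
radii   = rng.uniform(0.5, 5.0, size=100_000)
planes  = np.array([[0.0, 0.0, 1.0, 0.0]])

visible = cull_spheres(centers, radii, planes)
print(visible.sum(), "of", len(visible), "objects survive the cull")
```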
 
When did classic/older movies start to be rendered/recorded at high resolutions? I mean, many "old" movies look very nice on Blu-ray despite being from yesteryear.

Yeah, that's the beauty of high quality film stock. Depending on who does the restoration you can resolve far more detail from the negative than what equipment at the time was able to display.

Baraka (shot on 70mm) was the first film to pioneer an ultra-high-resolution digital transfer (8K) from film, and the results were pretty damn amazing when I first saw it, almost like they re-shot the entire thing. Since then there have been some excellent remasters, like Lawrence of Arabia for example.
 
Oh and just because you can watch a movie in IMAX, it does not automatically mean that it's a 4K render either ;) Many times it's just upscaled 2K material.

4K is not actually quite good enough for IMAX screens either, as the equivalent resolution of IMAX 70mm film is something like 18K.

With regards to 1080p, I actually think that for the majority of people (who have screens <50 inches) developers would be better off using 900p with good quality AA and more expensive shaders/effects.

I mean last gen, Alan Wake was one of the best looking games on 360 despite being only 540p (with 4x MSAA). It definitely looked much better than it would have had it been 720p with no AA.
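
To put rough numbers on that trade-off (just pixel counts of the resolutions mentioned, ignoring the cost of the AA itself):

```python
# Pixel counts of the render resolutions mentioned above.
resolutions = {
    "540p (Alan Wake)": (960, 540),
    "720p":             (1280, 720),
    "900p":             (1600, 900),
    "1080p":            (1920, 1080),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MP, {px / base:.0%} of 1080p")
# 900p shades roughly 69% of the pixels of 1080p, leaving headroom
# for better AA and more expensive shaders/effects.
```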
 
The resolution of the film stock was excellent, but lenses and the reproduction process (film-to-film copies) reduced clarity. And even though film potentially offers higher base resolution, it's a pain in the butt to work with. Which would you rather use - a 16 MP digital SLR where you can capture 200 shots on a card, review them immediately, and edit/print all in the same session, or a 150 megapixel* film SLR where you have to keep burning through expensive rolls of 36 shots, then send them to the lab to see what's what, then scan them in to edit and print?

(* that'd be an ideal case too. The real world limits of film in use put its resolving power well lower than theoretical maximums)

Hence the world has moved to digital, which means digital cameras. And Joker's link is out of date. There's better imaging tech now such as separate RGB sensors.
I think Phil is right, from what I could gather reading his words. I mean... shooting a movie in the past doesn't apparently have much to do with the final image quality of that movie, because analog cameras were allegedly so good at recording superb high-resolution material that the results have passed the test of time.

This is another link which I found following the original link joker454 has shared. :smile2:

You can find the actual comparison here by rolling the mouse over the image, a la Digital Foundry, in another section of the webpage called "Why we love film":

http://www.kenrockwell.com/tech/why-we-love-film.htm

a 52-year-old camera still has more sharpness and resolution. The Retina camera and its meter both work without batteries.
This Kodak Retina IIIc cost me only $75 over eBay, complete with German lens and working meter. So it goes. Film: The Immortal Medium of the Masters. Digital: Profit center for large foreign corporations.
I was shooting a 1956 Kodak Retina IIIc, and out of curiosity had some automated 5,035 x 3,339 pixel scans made at NCPS. Just for laughs, since these scans looked so good, I shot the same thing with my 2008 state-of-the-art Nikon D3 (that cost me $5,000) and 24-70mm f/2.8 AFS lens (that cost me $1,700) to compare to the Kodak.

The 1956 camera, with its fixed 50mm f/2 Schneider Retina-Xenar on Fuji Velvia 50 and a good automated scan, has better resolution than Nikon's state-of-the-art digital!
This is a shot from a 58-year-old camera, much sharper.

[image: crop from the film scan]


This is the same shot from a modern digital camera - the sharp detail is lost.

[image: crop from the digital capture]
 
By fixing games to 1/60th of a second for a refresh you are limiting the scope of what a game can do, what it can be. The compromises will be too great IMO.

AI and physics might have to be limited and environments will likely be sparse or enclosed. Accommodate complex AI and physics and expansive complex environments together in your 16ms refresh and you could end up with incredibly basic looking games which simply aren't very marketable.
Actually, look at games like Halo 2 Anniversary: the game runs at 60 fps and looks awesome, maybe because the original was so well optimised as a classic that now, with a more powerful machine, they could use the original assets and do justice to the developers who originally worked on the game.

And at a flawless 60 fps!

I think that with classic games we have a similar situation as with classic films. The details were there, the developers were limited by power and tried to squeeze every ounce of power from those consoles, and now that's paying off!

The original assets can be used to great effect adding on top of that excellent shading and new techniques without exceeding the 16ms budget.

If you watch the Halo 2 Anniversary: Remaking the Legend documentary, you can see the differences between Halo 2 original and Halo 2 remastered, in real-time, and that's... crazy.

Movie VFX was always rendered at 2K res at least, and 35mm film stock has about as much detail in analog form as well.
Film stock is very high quality stuff usually and can be preserved for a very long time when stored properly (especially compared to digital data storage like DVD or tape, which deteriorate pretty fast).
Although older material is usually processed before BR releases - colors, sharpness etc. can all be significantly enhanced digitally, compared to their actual analog state. And in some cases the original material wasn't stored well and absolutely required restoration, like the original negatives of SW IV.

Sebbi is talking about different things, although the principle of separating samples for various aspects of an image is sort of the same. But he also suggests making many more trade-offs and sacrificing precision wherever it's not as noticeable, similar to lossy compression techniques. Movie VFX has none of that; in fact almost everything is done at much higher precision levels, and I don't see that going away. For example, there's now research into moving beyond RGB colors to a spectral representation.
Thanks for the detailed explanation.

As for material requiring restoration, there is a teacher where I live who recorded some footage in the 60s!! That's so rare here, people were "poor" back then, and couldn't afford cameras.

I asked him for a copy of that footage and he told me it was recorded on either Betamax or Super 8 - iirc, it was Super 8 - :smile2: and said that he would try to find and recover it (I told him it'd be so easy to upload that footage to youtube or similar).

He said that the problem was that most of the footage was lost or deteriorated, :cry: that back then, Super 8 (or similar) footage tended to deteriorate, and damaged parts of the reel had to be cut out with scissors and spliced back together with adhesive just so it could be played. :cry:
 
I think Phil is right, from what I could gather reading his words. I mean... shooting a movie in the past doesn't apparently have much to do with the final image quality of that movie, because analog cameras were allegedly so good at recording superb high-resolution material that the results have passed the test of time.

This is another link which I found following the original link joker454 has shared. :smile2:

You can find the actual comparison here by rolling the mouse over the image, a la Digital Foundry, in another section of the webpage called "Why we love film":

http://www.kenrockwell.com/tech/why-we-love-film.htm

This is a shot from a 58-year-old camera, much sharper.

[image: crop from the film scan]


This is the same shot from a modern digital camera - the sharp detail is lost.

[image: crop from the digital capture]


Really? Comparing a 2007 D3? In 2014?

Why don't we do that comparison with a 2014 model?
 
Actually, look at games like Halo 2 Anniversary: the game runs at 60 fps and looks awesome, maybe because the original was so well optimised as a classic that now, with a more powerful machine, they could use the original assets and do justice to the developers who originally worked on the game.

And at a flawless 60 fps!

I think that with classic games we have a similar situation as with classic films. The details were there, the developers were limited by power and tried to squeeze every ounce of power from those consoles, and now that's paying off!

The original assets can be used to great effect adding on top of that excellent shading and new techniques without exceeding the 16ms budget.
H2A isn't using the original assets. The models are being tied to the old animations and such for gameplay reasons, but the geometry and texturing are new.

The original Xbox had 64MB of RAM (1/128th as much as XB1's main pool) and games typically weighed in at around a tenth the size of today's games so that they could fit on DVDs. It would be generous to say that Halo 2's OG assets are making full use of composite NTSC, let alone shining in a modern game sent to a 1080p panel over HDMI.

H2A isn't 60fps because they're using old assets, it's 60fps because 343i decided to allow for a low graphical baseline (by eighth-gen AAA standards) in favour of targeting 60fps.
(Not that I think this is a bad thing.)
 
I think Phil is right
What Phil said about film having increased resolution is correct (and no different to what I said earlier). The claim that any camera from the 80s or any of your photos would look good wasn't right, which is what I was arguing against, but that wasn't his point.

You can find the actual comparison here by rolling the mouse over the image, a la Digital Foundry, in another section of the webpage called "Why we love film":
Not a fair comparison (it's clearly not trying to be a balanced piece, but a pro-film piece). 1) He's comparing a prime lens versus a zoom. Primes are always better. 2) As I've already mentioned, professional camera tech has moved on. A three-sensor camera avoids the issues that need mathematical filtering and maintains ideal sharpness (roughly 3x the 'resolution' of a Bayer-filtered digital camera, which can sort of be considered half or a third of its stated resolution). 3) He appears to have upscaled the digital camera image? It says 118% zoom, so to match the analogue image size, he's upscaled the digital image and introduced blur.

Point being, that one page isn't proof, but then it doesn't need to be. Film has better resolving power than a several-megapixel CCD/CMOS sensor - that's never been disputed! However, working with film requires prints to be made, and analogue copies degrade. So what you watched projected on the movie screen, a copy of a copy, was blurrier than the master used to create the BRD. And when you pull out the family photo album from the 80s, the photos will be a blurry mess unless you shelled out on some decent lenses, regardless of how good the resolving power of the film was. So a 12 MP phone camera can take better pictures than a 1980s compact.
 
Really? Comparing a 2007 D3? In 2014?

Why don't we do that comparison with a 2014 model?

To be fair, the D3 is an excellent piece of kit, even by today's standards. Its 12 Mpixel CMOS sensor produces very high quality captures, with very little noise. The problem with comparing digital to analog, however, is that digital is a fixed resolution. So enlarging it beyond its fixed resolution will always result in blur. 12 Mpixels are what they are (though in the D3's case, pretty much as good as you can get, I'd say). If you take a current D810, obviously, you will get much higher resolution (36 Mpixels), but at that resolution you are reaching the limits of very good prime lenses. I would think a very good 35mm film would be similar, if you had the equipment to capture what the medium can resolve.


Shifty, I know it's all relative, but to be honest, I'm a bit doubtful any phone camera could resolve as much detail as even what you may consider compact cameras from the 80s (unless we're talking about throw-away cameras here). Truthfully, I haven't tested every single phone camera that is out there - I hear the Sony Xperia phones do quite well here - but the quality I'm used to from these mobile devices is a far cry from the picture above demonstrating a digitized high-resolution scan from a "52 year old camera". Not even in the best-case scenario with good lighting can I see any device with such a tiny sensor resolve even half the amount (probably not even a quarter) of detail that the megapixel rating implies. In my experience, mobile phone devices take good enough pictures to share over the internet, where you don't even need HD resolution, but they are still quite a far way off what digital compact cameras with similar Mpixel ratings achieve, and a far cry from what any decent digital SLR does - and that's not even including full-frame SLRs.

Also, in regards to the comparison: the Kodak has a fixed 50mm f/2.0 lens vs. the D3 with the 24-70mm f/2.8 Nikkor. I assume he used the maximum focal length on the D3, and because the scan of the 35mm frame produced a higher resolution capture (5035 x 3339), he enlarged the D3 capture slightly to match it up to the digitized scan from the Kodak. The D3 together with the 24-70mm is an excellent (and expensive) piece of kit, even by today's standards. A more recent comparison with a D810 might yield different results, but that would be comparing 35mm film to a 36 Mpixel camera.
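
A rough back-of-the-envelope way to frame that comparison (the sensor pixel counts are the published ones for the D3 and D810; the film figure assumes an often-quoted ~160 lp/mm high-contrast resolving power for Velvia 50, which varies a lot with contrast and test method):

```python
# Crude pixel-pitch / film-resolution comparison over a 36 x 24 mm frame.
frame_w_mm, frame_h_mm = 36.0, 24.0

sensors = {"Nikon D3 (12 MP)": 4256, "Nikon D810 (36 MP)": 7360}  # pixels across
for name, px_w in sensors.items():
    print(f"{name}: ~{frame_w_mm / px_w * 1000:.1f} um pixel pitch")

# Assumed figure: ~160 line pairs/mm for Velvia 50 at very high contrast.
lp_per_mm = 160
px_per_mm = 2 * lp_per_mm                  # Nyquist: 2 samples per line pair
w_px, h_px = frame_w_mm * px_per_mm, frame_h_mm * px_per_mm
print(f"35mm at {lp_per_mm} lp/mm ~ {w_px * h_px / 1e6:.0f} MP equivalent")
# Real-world scans land well below this ideal, which is why quoted
# 'film megapixel' numbers range anywhere from ~10 MP to 200+ MP.
```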
 
Shifty, I know it's all relative, but to be honest, I'm a bit doubtful any phone camera could resolve as much detail as even what you may consider compact cameras from the 80s
Some modern phones are pretty good, while some 1980s cameras were pretty poor. For one thing, plenty of people were taking snaps on 16 mm 110 film! With a really cheap lens. Here's a 1980s photo.

Truthfully, I haven't tested every single phone camera that is out there - I hear the Sony Xperia phones do quite well here - but the quality I'm used to from these mobile devices is a far cry from the picture above demonstrating a digitized high-resolution scan from a "52 year old camera".
The age of the camera doesn't matter, because the camera is basically just a light-sealed box with a controllable aperture. As long as the lens is good and the film is good and perfectly aligned, a cardboard box would suffice! Note I'm not saying all mobile phones >> all 1980s cameras. I'm saying a 1980s film camera isn't inherently better than every mobile phone just because it uses film and film > digital. It depends on the camera, both which 1980s camera and which mobile phone camera. The same will be true of digital cameras. One can't assume that a movie shot on film in the 1950s will have more clarity in the master than one shot in 2014 on digital.

A more recent comparison with a D810 might yield different results, but that would be comparing a 35mm film to a 36Mpixel camera.
Which is a fair comparison to be made if we're comparing what digital can do versus film.
 
Shifty Geezer said:
I'm saying a 1980s film camera isn't inherently better than every mobile phone just because it uses film and film > digital

I guess I'm fixating on the medium because it's the biggest bottleneck on mobile devices. Assuming the 'box', as you call it, is comparable, the limiting factor becomes the quality/size of the sensor on a digital device vs. the quality of the 35mm film. Effectively, we're comparing capturing light on a ~16mm^2 area (a mobile device) vs. 864mm^2 (35mm film or full frame). It's only logical that the latter will resolve a lot more detail. Which is why I'm quite skeptical that there are mobile phone devices that supposedly have 'good quality' for anything other than online sharing...
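
To put numbers on that area comparison (the phone sensor size here is an assumed typical ~1/3-inch sensor; actual phone sensors vary):

```python
# Light-gathering area: assumed ~1/3" phone sensor vs. 35mm film / full frame.
phone_area_mm2 = 4.8 * 3.6      # ~17 mm^2, a typical small phone sensor (assumed)
ff_area_mm2    = 36.0 * 24.0    # 864 mm^2

print(f"area ratio: ~{ff_area_mm2 / phone_area_mm2:.0f}x")
# At similar megapixel counts, each full-frame pixel (or film grain) also
# gathers roughly that many times more light, which is where the detail
# and noise advantage of the larger medium comes from.
```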

It has taken a long time for digital sensors to catch up to what film has been able to resolve, even at full-frame sensor sizes. And even if today's state-of-the-art full-frame sensors do a good job - it's still arguable if they have caught up in every sense - there are other factors beyond mere resolution, like colour depth for instance.

I'm not advocating analog cameras or film in any sense btw; I exclusively do my photography on a modern digital SLR (currently a Nikon D800). That an analog camera with 35mm film can even be compared to a modern semi-professional/professional DSLR is, IMO, quite a feat.
 
But the advantage of digital, which very much caught up with film a while ago, was always what you get from it in terms of convenience: the ability to see results straight away, the ability to work with the image and modify it as you see fit, and, well, everything else we can do now that we could only have dreamt of with film.

I tend to disagree with the idea that digital has not caught up with film in terms of detail capture.

I also find it quite interesting that those still professing the superiority of film are looking at those results on a computer monitor, which means that the film picture went through a process of digital capture - scanned with unknown equipment and unknown quality settings. How can you compare two pictures like that? And if you're actually comparing a digital capture to a printed film capture (how? with a magnifying lens?), then the quality of the print also comes into question.

Moral of the story, results can always be skewed to show what one wants to prove.

Film died because of the endless convenience factors of digital, and quality has very much caught up with it a long time ago.

There are of course exceptions, like Interstellar, which I think was shot on 70mm film. But 70mm is MASSIVE. Imagine a sensor that big on a digital camera, or just imagine a Sony sensor with the quality and pixel density of its latest A7 full-frame sensors, but at 70mm size... We're talking about a level of image quality that is just immense. Those sensors are what, 36Mpx? And those are GOOD 36Mpx. 70mm is roughly 4 times the area of 35mm, so we're talking about 144Mpx. Immense. It's the level where the lenses are just as important as the sensor, and that's a completely different discussion.
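
Following that scaling argument with rough numbers (the 4x area factor is the assumption made in the post above; actual 65/70mm frame areas depend on the exact format):

```python
# Scaling the argument: keep the ~36 MP full-frame pixel density,
# grow the frame to an (assumed) 4x larger 70mm area.
ff_mp       = 36.0     # ~36 MP full-frame sensor (e.g. an A7R-class sensor)
area_factor = 4.0      # assumed 70mm-vs-35mm area ratio from the post

print(ff_mp * area_factor, "MP at the same pixel density")   # 144.0 MP
```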
 