*split* multiplatform console-world problems + Image Quality Debate

I was under the impression, from my days in the HD DVD group, that most movies are filmed on bog standard 35mm film. Even if not, they are transferred to 35mm for projection.

Really depends on the movie. There's a pretty wide variety of film formats in use. 135 is the most common and is what you are referring to (it is 35mm). IMAX is 15/70, which is 65mm stock if I remember correctly. You even get 16mm for lower-budget films. That is for the moving portion of the film. I was thinking more of the single-image plate cameras that get used when a still shot is included. I believe the digital cameras used for those tend to be things like the 5D from Canon, which are in the 20MP range.

Like tkf said though, the MP argument around these formats gets nearly religious at times. Film records pictures differently, and even the same size of film from different manufacturers performs differently. As others have said, the quality of your lens, the skill of the photographer, the quality of the camera body, and the sensor quality matter a whole lot more than MP.
 
I wouldn't bet too much on the "average joe" not noticing, though. The average joe of today has come a long way in educating himself on technical differences. Nowadays, every phone is bought and compared by its resolution (and other spec-sheet numbers). Resolution is playing a big role in everyday devices, so I wouldn't bet on the average gaming public being as ignorant about resolution differences on consoles as it was a couple of years ago.

That assumes resolution is important to them, of course. Since you mention small devices used close to one's eyes, let's take the iPad Mini as an example. It's the lowest-resolution of all the iPads, yet it outsells the others even though it's not "retina". That's probably because resolution isn't their #1 priority when making a purchase, even though they will be looking at the thing just 15 inches from their eyes.

Anyways, given that most people don't read forums like this, and that most game boxes will say 1080p on the back, as will their TVs, I doubt most people will even know that they aren't playing a 1080p game. I certainly still see that with today's educated gamer. When I game, it's from the couch with a 360 controller, so people who don't know me well think I'm actually playing on an Xbox 360. Of course that's because they don't see the PC that is tucked away in the closet, and because they can't tell the 1080p resolution even on my 65" screen that is just 10 feet away. They just can't see it. It's aggravating to me, to be quite honest, to have a visual difference that is so large to me seem to go unnoticed by others. I really don't understand it, but it is what it is.

I'm dealing with the same thing with one of my video cameras, the Sony NX30, which I sometimes use with a wide conversion lens, and it drives me nuts that the wide conversion lens adds some blur and resolution loss to the final image. I show samples of this resolution loss and blur to others to explain why I'm trying to find a higher-quality wide conversion lens, and by and large they think I'm mad. Somehow they just don't see the image degradation, something which to me is roughly as obvious as a punch in the head. But then I remember the DVD vs Blu-ray tests I did many years ago, when many couldn't see the difference, and I remember forums like Beyond3D where people went on about how 1080p on PC was a non-visible improvement over the 720p of consoles, or how game X running on one console looks the same as the other console version even though one has blur and lower resolution, and then it becomes more clear. Yes, people often just can't see it; there's a limit to what they can perceive as an appreciable difference. While it may be aggravating to me, and maybe to you or others, it's just the reality we have to deal with as 1%'ers when it comes to visuals and easily seeing the more minute differences.

Of course, these same people that aggravate me when it comes to visuals likely feel the same way about me when it comes to audio. I'm really not sensitive to audio at all, and I have gotten "wtf?" stares when I've listened to two audio sources and thought them more or less the same. I guess that's karma for you.
 
I was under the impression, from my days in the HD DVD group, that most movies are filmed on bog standard 35mm film. Even if not, they are transferred to 35mm for projection.

Digital movie making has essentially taken over:
http://en.wikipedia.org/wiki/List_of_films_shot_in_digital

In many ways it makes post production, checking the daily recordings and a lot of other stuff so much easier. The challenge (imho) is to keep the video look at bay and avoid the "perfect" clean picture quality that can sometimes drag the realism of a movie down. Life is gritty and grainy, and so is the world :)
 
Yeah, I know all of that. That's why I used the Souls games as an example of why "higher resolution will actually make the jaggies and shimmering less noticeable, I thought everyone understood this" is not true IMO: just by looking at the res of a game you can't tell if the IQ is good or not, because there are other factors like the quality of the AA (plus things like shader aliasing) and AF that are really important, plus smaller things like motion blur and DoF that can affect IQ.

So where exactly were you disagreeing with me?
I don't know if I'm disagreeing; I'm pointing out that 1024x720 with 2xAA has a 'higher resolution' depth buffer than 1280x720 with 1xAA (roughly 60% more samples). Hence the polygon edges will look better due to the higher-resolution depth buffer.
i.e. the reason it looks better is because it's higher resolution.
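
For anyone who wants to check that arithmetic, here is a minimal sketch (assuming one depth/coverage sample per pixel per MSAA sample, which is how plain MSAA stores them):

```python
# Back-of-the-envelope check of the depth-buffer sample counts being compared.
# Assumes one depth/coverage sample per pixel per MSAA sample (plain MSAA).

def samples(width, height, msaa):
    """Total depth-buffer samples for a render resolution and MSAA level."""
    return width * height * msaa

a = samples(1024, 720, 2)  # 1024x720 with 2xMSAA
b = samples(1280, 720, 1)  # 1280x720 with no MSAA

print(a, b, f"{a / b - 1:.0%} more samples")
# -> 1474560 921600 60% more samples
```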
 
Got to take back my praise... it's 720p, so I would expect Pixar graphics, and obviously this game doesn't have them.
It's not 2005 anymore.
I think I'd rather have Disney graphics, with more effects and stuff going on, at a rather decent resolution than at 1080p.

I kiiiiiinda like having games run at native 1080p, but you should build your game and your engine around that resolution (plus the console's hardware); then it can be nice. Think of Forza 5...

Some TVs offer 1:1 pixel mapping, so you could definitely bypass the TV's internal scaler regardless of the input resolution.
Yes, that's what I do, not only to completely bypass the TV's internal scaler but also to get a 1:1 pixel ratio and thus the full FOV.

You never get to see the full image your console is actually outputting without 1:1 pixel mapping (maybe at 4:3, but then the image looks squashed and you can't see a thing).

The setting is called "Unscaled" on my TV. On Samsung TVs I remember it being called "Just Scan".
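
To put a rough number on what overscan costs, here is a sketch assuming a typical ~5% crop per axis (a common ballpark, not a spec; actual TVs vary):

```python
# Rough estimate of how much of a 1080p frame falls outside the visible panel
# when a TV applies overscan. The 5% per-axis figure is an assumed ballpark,
# not a spec; the remaining picture is then rescaled, so no pixel stays 1:1.

width, height = 1920, 1080
overscan = 0.05  # assumed fraction cropped per axis

visible_w = round(width * (1 - overscan))
visible_h = round(height * (1 - overscan))
lost = 1 - (visible_w * visible_h) / (width * height)

print(f"Visible region: {visible_w}x{visible_h} "
      f"({lost:.0%} of the source pixels pushed off-screen)")
# -> Visible region: 1824x1026 (10% of the source pixels pushed off-screen)
```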
 
Digital movie making has essentially taken over:
http://en.wikipedia.org/wiki/List_of_films_shot_in_digital

In many ways it makes post production, checking the daily recordings and a lot of other stuff so much easier. The challenge (imho) is to keep the video look at bay and avoid the "perfect" clean picture quality that can sometimes drag the realism of a movie down. Life is gritty and grainy, and so is the world :)
From what I can tell, the best digital film cameras are about 19MP. I was questioning the statement that movies are filmed at a 100MP equivalent nowadays.

The new Miami Vice movie was filmed in digital, and they added film grain to make it gritty. Unfortunately, they used digital noise for the grain instead of a proper film grain pattern. It made it exceedingly hard to compress well for inclusion on the blue laser formats. Ended up with lots of compression artifacts, even at relatively high bandwidths.
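
As an aside on why noise is so hostile to compression, here is a tiny illustration on synthetic data (nothing to do with the actual Miami Vice master): per-pixel random noise is high-frequency detail that prediction and transform coding can't remove, so the encoder must either spend bits on it or smear it into artifacts.

```python
# Synthetic illustration: a smooth frame compresses far better than the same
# frame with mild per-pixel noise, because the noise is unpredictable detail.
import zlib
import numpy as np

rng = np.random.default_rng(0)

# A smooth 8-bit "frame": easy to predict, compresses well.
smooth = np.tile(np.arange(256, dtype=np.uint8), (256, 1))

# The same frame with mild per-pixel noise layered on top.
noisy = np.clip(smooth.astype(int) + rng.integers(-8, 9, smooth.shape),
                0, 255).astype(np.uint8)

for name, frame in (("smooth", smooth), ("noisy", noisy)):
    size = len(zlib.compress(frame.tobytes(), level=9))
    print(f"{name}: {size} bytes after lossless compression")
# The noisy frame typically ends up several times larger than the smooth one.
```

Real video codecs are lossy and block-based rather than zlib, but the underlying problem is the same: random grain has to be either coded or thrown away.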
 
As I already said, if the assets are identical with the same or comparable AA and AF (like BF4 on PS4 and XB1), of course the version that runs at a higher resolution will have the better IQ. My post was aimed more at the posts I see in a lot of gaming forums lately saying that IQ at 720p or 900p is unacceptable (when, for example, Ryse looks really good IQ-wise), when there is more than resolution behind whether a game can look smooth and clean.
They are probably complaining because Ryse could've been at a higher resolution on PS4. Let's be honest here. I don't believe Ryse started as an exclusive title. Ryse doesn't have the most technically demanding gameplay elements. The other graphical features seem to be more than covered in KZ:SF.
 
*AHEM* Gameplay does not factor into this thread discussion even if this has been a cluster of a thread.
 
But not all TVs have a 1:1 option, and not all TV "game modes" can present the image without the TV's post-processing.

Do these next-gen consoles still not have a "scaling option"? It's needed to kill the overscan on my Samsung TV connected to a Radeon PC.
 
I don't know if I'm disagreeing; I'm pointing out that 1024x720 with 2xAA has a 'higher resolution' depth buffer than 1280x720 with 1xAA (roughly 60% more samples). Hence the polygon edges will look better due to the higher-resolution depth buffer.
i.e. the reason it looks better is because it's higher resolution.

So you don't know if you're disagreeing with me now, and when you talked about higher resolution you meant a "higher resolution" depth buffer, when the discussion was clearly about how much geometry resolution affects IQ? Yeah, right...

They are probably complaining because Ryse could've been at a higher resolution on PS4. Let's be honest here. I don't believe Ryse started as an exclusive title. Ryse doesn't have the most technically demanding gameplay elements. The other graphical features seem to be more than covered in KZ:SF.

Ryse started as a 360 exclusive, as stated above, and is probably funded by MS, so what kind of logic is this exactly? If the PS3 exclusives were on PC they would run at a higher resolution, a higher frame rate and with much better AA too; does that make them less impressive for the hardware they were on? Of course not.
 
But not all TVs have a 1:1 option
I've never met one that didn't. They tend to call it different names. My first HD set called it "Normal" (versus "Zoom", which used overscan). My current set calls it "Just Scan" (while its "Normal" option actually has the overscan, the complete opposite of my first set). You might want to dig through your options again. It's probably in there, just under a name that doesn't necessarily indicate what it is.
 
Still find it very funny how fanboys watching heavily compressed YouTube videos get confused by:

- Video compression and the filtering inherent to it
- Different contrast settings on the capture set-up
- Extra post-processing / sharpening on Xbox One in its upscaling stage. Most people perceive sharpening as a better image, while IMHO it should be turned off as much as possible, especially for digital content.

Still, when you also read the MS propaganda on The Verge, you can clearly see that the XB1 is a US-focused living-room hub and the PS4 is just a gaming console.
 
Can we please ban the word "propaganda"? Or introduce a forum feature where I can pick words and whoever uses them goes on ignore automatically?
 
From what I can tell, the best digital film cameras are about 19MP. I was questioning the statement that movies are filmed at a 100MP equivalent nowadays.

The new Miami Vice movie was filmed in digital, and they added film grain to make it gritty. Unfortunately, they used digital noise for the grain instead of a proper film grain pattern. It made it exceedingly hard to compress well for inclusion on the blue laser formats. Ended up with lots of compression artifacts, even at relatively high bandwidths.

Oddly enough, I can jump in on this, having actually worked with film and currently serving as a Director of Photography for a commercial production house.

Sensors in high-end digital cine cameras are in the 16-19MP ballpark. These are cameras shooting Hollywood-level productions (Arri Alexa, Red Epic, Sony F series). That sensor size and megapixel density is not much greater than in mid-to-high-end digital photo cameras, which can shoot JPGs at huge resolutions (many times it's the same sensor). One of the reasons this sensor size is used is that 1) it provides adequate resolution for 1080p-4K needs, and 2) the sensor size nearly matches the film plane of traditional Super 35mm film, so you retain a lot of the look (like the depth of field) and the ability to use 'cinema' lenses such as Panavision anamorphics. Many of these lenses have been around for decades and are what we associate with a cinematic look. Too big a sensor and it's not covered by the lens: you get vignetting, a black circle around your image. Too small a sensor and you get a crop factor: you're zoomed in too much.

Anyway, I've written too much. But suffice to say, sensors are only one aspect of a cine camera. There's the technology in front of the sensor (the lenses), and most importantly the tech behind the sensor that captures and processes the image, handles 12 stops of dynamic range and 4:2:2 color, writes it into a RAW stream, etc. That is really what separates the high-end cameras.

35mm film is an organic tech, so there are a lot of parts to that soup, but in ideal situations it can retain close to 4K worth of detail. It just depends on many factors: lenses, film stocks, negative transfer tech.
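
For reference, the delivery resolutions being thrown around here convert to megapixels as below (plain width x height arithmetic; it ignores debayering and anamorphic squeeze, so it understates what a sensor needs to truly resolve these formats):

```python
# Rough pixel-count arithmetic for common delivery formats. A Bayer-pattern
# sensor needs more photosites than this to fully resolve a given output
# resolution, so treat these numbers as lower bounds.
formats = {
    "1080p (1920x1080)": (1920, 1080),
    "2K DCI (2048x1080)": (2048, 1080),
    "4K UHD (3840x2160)": (3840, 2160),
    "4K DCI (4096x2160)": (4096, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")
# ~2.1, ~2.2, ~8.3 and ~8.8 MP respectively, all well under the 16-19MP
# sensors mentioned above and nowhere near a 100MP equivalent.
```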
 
From what I can tell, the best digital film cameras are about 19MP. I was questioning the statement that movies are filmed at a 100MP equivalent nowadays.

The new Miami Vice movie was filmed in digital, and they added film grain to make it gritty. Unfortunately, they used digital noise for the grain instead of a proper film grain pattern. It made it exceedingly hard to compress well for inclusion on the blue laser formats. Ended up with lots of compression artifacts, even at relatively high bandwidths.

Apart from being a crappy movie with no real connection to the TV series, they also fumbled the film look. I recall several scenes looking very video-like, so it comes as no surprise they thought digital noise = grain.

Another example of "resolution isn't everything" comes from Avatar: shot on a 2K camera, it still looked marvellous, though I must admit I only ever saw it in 3D, never in 2D. It was shot on this camera: http://pro.sony.com/bbsccms/ext/cinealta/shoot/hdcf950.shtml, with the resolution discussed here:
http://petavoxel.wordpress.com/2010/01/25/avatar-mp/

Of course, having a true 4:4:4 camera does make a difference in "pixel quality".

With film you would like to "oversample" in order to get the best look: the better the res, the better the result when you post-process it. Blade Runner is a 30+ year old movie, and the 4K scan and restoration is imho a perfect example of just what can be achieved with film, with the right tools and craft.
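
On the 4:4:4 point, a quick sample-counting sketch of how much chroma information the common subsampling schemes keep relative to a true 4:4:4 capture (plain arithmetic, nothing camera-specific):

```python
# Chroma samples per frame (per chroma plane) for common subsampling schemes,
# relative to full 4:4:4 color. Plain arithmetic, nothing camera-specific.
def chroma_samples(width, height, scheme):
    # Fraction of chroma samples kept vs 4:4:4 for each scheme.
    keep = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    return int(width * height * keep)

w, h = 1920, 1080
for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    n = chroma_samples(w, h, scheme)
    print(f"{scheme}: {n:,} chroma samples per plane "
          f"({n / (w * h):.0%} of full color resolution)")
# 4:2:2 halves the horizontal chroma resolution; 4:2:0 also halves it
# vertically, keeping a quarter of the chroma samples of a 4:4:4 capture.
```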
 