Will Warner support Blu-ray?

iknowall said:
D-cinema doesn't have the same bitrate limitation.
At high bitrates MPEG2 looks better.

Did you look at this doc? http://www.fastvdo.com/spie04/spie0... and http://www.avicatech.com/jpeg2000.html#q2

from Avica's web site:

AVICA said:
Q. What is used now and why?

A. The codec currently approved by the major Hollywood studios for digital cinema use is HD MPEG2 at high bit-rates. There are a number of other codecs that are proprietary or have never been approved for major motion picture releases.

Avica - and our interoperability partners - use HD MPEG2 at 80Mb/sec MP@HL.

Among the benefits of using HD MPEG 2 is that it is a widely accepted compression standard and is commonly used in many industries. Therefore MPEG2 services can be easily (and cheaply) accessed from third party providers primarily for the provision of alternative and advertising content.
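
For a sense of scale on that 80 Mb/sec figure, here is a rough back-of-the-envelope sketch (the two-hour running time and the comparison bitrates are assumptions for illustration, not Avica's numbers):

# Rough storage math for an 80 Mb/s MPEG2 d-cinema stream vs. lower bitrates.
# Assumptions (mine, for illustration): a 2-hour feature, 1 GB = 10**9 bytes.

def gigabytes_for(bitrate_mbps, hours=2.0):
    """Return approximate storage in GB for a stream at the given bitrate."""
    bits = bitrate_mbps * 1_000_000 * hours * 3600
    return bits / 8 / 1_000_000_000

for label, mbps in [("d-cinema MPEG2 @ 80 Mb/s", 80),
                    ("HD MPEG2 @ 24 Mb/s", 24),
                    ("H.264 @ 8 Mb/s", 8)]:
    print(f"{label}: ~{gigabytes_for(mbps):.0f} GB for a 2-hour feature")

# d-cinema MPEG2 @ 80 Mb/s: ~72 GB for a 2-hour feature
# HD MPEG2 @ 24 Mb/s: ~22 GB for a 2-hour feature
# H.264 @ 8 Mb/s: ~7 GB for a 2-hour feature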

Like I said earlier, MPEG2 was used because it was an industry standard...not because of its quality. If that were the case, the Digital Cinema Initiatives would not have selected Motion JPEG 2000 as the new standard for the Hollywood studios. H.264 can go toe-to-toe with JPEG 2000, so the notion that MPEG2 is better than H.264 even at a higher bitrate is kind of funny.

Since JPEG2000 is wavelet-based, it's convenient to get multiple sizes for different cinemas, but I'm sure there are other reasons why DCI selected JPEG2000 over H.264.

http://www.ee.ucla.edu/~ipl/intra_frame_jpeg_2000_vs._inter_frame_compression_comparison.pdf

Of course, we're talking about 4K resolution, which is mainly used in high-end theaters. In that application, JPEG2000 is clearly better. However, in consumer HD devices H.264 has its advantages.
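
On the "multiple sizes" point: JPEG2000's wavelet decomposition is dyadic, so each decomposition level gives a half-resolution picture of the same frame. A tiny sketch of that resolution ladder (the 4096 x 2160 DCI container size is the only real number here; the loop itself is purely illustrative):

# JPEG2000 uses a dyadic wavelet decomposition: every level halves width and height,
# so a 4K master naturally contains a 2K (and smaller) proxy without re-encoding.
width, height = 4096, 2160  # DCI 4K container size
for level in range(4):
    print(f"level {level}: {width >> level} x {height >> level}")
# level 0: 4096 x 2160
# level 1: 2048 x 1080   <- DCI 2K falls out of the same codestream
# level 2: 1024 x 540
# level 3: 512 x 270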

iknowall said:
Because you have so much space, compressing the audio makes no sense at all.

Uncompressed audio is a d-cinema feature; it gives you the best quality, and audio doesn't take that much space.

I guess that makes sense...
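
A quick sanity check on "audio doesn't take that much space", assuming uncompressed multichannel PCM (the 5.1 / 24-bit / 48 kHz parameters and two-hour running time are assumed typical values, not a quoted spec):

# Uncompressed PCM audio size for a 2-hour feature (assumed parameters).
channels, bits, sample_rate = 6, 24, 48_000   # 5.1 audio, 24-bit, 48 kHz
seconds = 2 * 3600
bytes_total = channels * (bits // 8) * sample_rate * seconds
print(f"~{bytes_total / 1e9:.1f} GB")  # ~6.2 GB -- small next to an 80 Mb/s video stream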
 
TrungGap said:
Did you look at this doc? http://www.fastvdo.com/spie04/spie0... and http://www.avicatech.com/jpeg2000.html#q2

from Avica's web site:


Like I said earlier, MPEG2 was used because it was an industry standard...not because of its quality.


MPEG2 was ONE of many industry standards that could have been used for d-cinema.

A lot of standards could have been used instead of MPEG2 HD.

MPEG2 HD is the standard for digital cinema because it gives you the best result, and people chose it for that reason.

A lot of people tested it and came to the conclusion that the quality is better.

This is the reason other people I know, myself included, want to master from HD-D5 only to MPEG2 HD for digital cinema.

No one forces you to use MPEG2 HD; you can master to VC-1 or MPEG4, but no one does this.
Why? Cost? No, mastering is a small expense for a movie company.

Technical people with years of video-encoding experience actually chose MPEG2 HD and think it is the better choice.

And they all agree, and every one of them uses only this standard. Every film mastered in the d-cinema format uses MPEG2 HD.

I have also seen how VC-1, MPEG4 and MPEG2 HD look, so I don't need to debate something that is simply a fact to my eyes.

You have never used a d-cinema mastering post-production service, but you claim to know this job better than people who actually do it?

That's just nonsense.

Like I said, if you don't want to believe me, just go ask anyone in the industry.
If that were the case, the Digital Cinema Initiatives would not have selected Motion JPEG 2000 as the new standard for the Hollywood studios. H.264 can go toe-to-toe with JPEG 2000, so the notion that MPEG2 is better than H.264 even at a higher bitrate is kind of funny.

Sorry, but right now no such standard exists. Currently the most used format is the Avica mastering format, but we are far from switching to another standard.


Since JPEG2000 is wavelet-based, it's convenient to get multiple sizes for different cinemas, but I'm sure there are other reasons why DCI selected JPEG2000 over H.264.

http://www.ee.ucla.edu/~ipl/intra_frame_jpeg_2000_vs._inter_frame_compression_comparison.pdf

Despite the fact that we are far from having a unified standard.

Also: "It is important to bear in mind that we are only at the beginning of the process and that the announcement does not in any way constitute a final outcome, but the first step in a process of defining a new codec for the industry."

That says it all: this is nothing more than the start of an attempt to settle on something common.

Everyone is still in an experimental phase.

And even:

Avica believes that high bit-rate HD MPEG2 will remain a popular standard for many years to come - if only because of the large installed base of HD MPEG2 equipment and expertise globally.

Of course, we're talking about 4K resolution, which is mainly used in high-end theaters.

Of course, in your dreams, since right now there is no digital projector capable of 4K resolution; the best you can have today is 2K.

In that application, JPEG2000 is clearly better. However, in consumer HD devices H.264 has its advantages.

It's so clearly better that right now no one uses it.

No one knows how JPEG2000 looks, since no one uses it for digital cinema right now, and as the site says, "Avica believes that high bit-rate HD MPEG2 will remain a popular standard for many years to come."

I think that says it all.




I guess that makes sense...

It is a fact; feel free to verify it yourself.
 
iknowall said:
MPEG4 has more powerful compression, compression you are not going to need with all the space you have on a Blu-ray disc.

MPEG4 gives a better result at a 5 Mbps bitrate? So what? I will never use a 5 Mbps bitrate with a 100GB disc, so all the points you make are useless.

The point is you really don't need 100GB of storage. SL BR disc isn't going to give you 100GB...
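
The arithmetic behind that point, assuming the nominal 25 GB single-layer and 50 GB dual-layer Blu-ray capacities and a two-hour feature (the 3 GB allowance for audio and overhead is just an illustrative assumption):

# What average video bitrate fits on a Blu-ray disc for a 2-hour feature?
def avg_video_mbps(disc_gb, hours=2.0, audio_overhead_gb=3.0):
    """Approximate sustainable average video bitrate in Mb/s (rough sketch)."""
    usable_bits = (disc_gb - audio_overhead_gb) * 8e9
    return usable_bits / (hours * 3600) / 1e6

print(f"25 GB single layer: ~{avg_video_mbps(25):.0f} Mb/s average")
print(f"50 GB dual layer:   ~{avg_video_mbps(50):.0f} Mb/s average")
# 25 GB single layer: ~24 Mb/s average
# 50 GB dual layer:   ~52 Mb/s average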

iknowall said:
The PSNR value correlates with compression quality, but this metric does not reflect the presence of visual artefacts. You can't assess the quality of the artefacts produced by a codec, or detect "snow" artefacts (strong flickering of isolated pixels) in compressed video, using only the PSNR metric.

I agree; that's why I've pointed to perceptual testing that has shown H.264 @ 8 Mbps is equal to MPEG2 @ 24 Mbps.
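
For reference, PSNR is only a pixel-wise error measure, which is exactly why it can underplay localized artifacts like the "snow" described above. A minimal sketch (NumPy assumed; the test frames are contrived for illustration):

import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between two 8-bit frames."""
    ref = reference.astype(np.float64)
    tst = test.astype(np.float64)
    mse = np.mean((ref - tst) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

# PSNR only sees the average squared error, so a handful of blown-out
# "snow" pixels can score about as well as mild noise spread over the
# whole frame, even though the snow is far more visible to a viewer.
ref = np.full((64, 64), 128, dtype=np.uint8)
mild = np.clip(ref.astype(np.int16) + np.random.randint(-8, 9, ref.shape), 0, 255).astype(np.uint8)
snow = ref.copy()
snow[::16, ::16] = 255  # 16 isolated bright pixels out of 4096
print(f"mild noise: {psnr(ref, mild):.1f} dB, snow: {psnr(ref, snow):.1f} dB")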

iknowall said:
Lol, higher resolution of 4K :LOL: ? Do you have a clue what you are talking about?

Hmm, yeah, you talk about over-4K resolution despite the fact that right now you can't even get a real 4K resolution.

I said "higher resolution of 4K", not higher resolution than 4K, meaning they're using the higher resolution of 4K. I guess the wording can easily be confused; sorry about that.

iknowall said:
I have also seen how VC-1, MPEG4 and MPEG2 HD look, so I don't need to debate something that is simply a fact to my eyes.

Care to explain to us? The only reason you gave us why MPEG2 is better than H.264 is that it's used in digital cinema. However, as I pointed out, that's not what the Avica web site says. The fact that MPEG2 software and hardware already exist doesn't mean newer compression such as H.264 and JPEG2000 isn't better than MPEG2. All it proves is that MPEG2 is more widely used...probably because it was the best you could get at the time. I've given you plenty of research showing that H.264 offers the same quality at a lower bitrate. Unless you can provide a further explanation, other than "we use it, therefore it has to be better", I think it's pointless to continue this discussion.

iknowall said:
Of course, in your dreams, since right now there is no digital projector capable of 4K resolution; the best you can have today is 2K.

Sony has already unveiled a market-ready 4K projector. But unfortunately, everything must line up in order to get true 4K. Movies must be 4K from creation.

Spider-Man 2 was the first movie to be 4K...meaning the film was scanned at 4K. CGI and production work were also done at 4K. So there's a push to 4K. At 4K, what are you going to do about video compression?

iknowall said:
No one knows how JPEG2000 looks, since no one uses it for digital cinema right now, and as the site says, "Avica believes that high bit-rate HD MPEG2 will remain a popular standard for many years to come."

I think that says it all.

What do you mean no one knows how JPEG2000 will look? It's been researched, tested and looked at already, and not at consumer resolution but at 4K. Whether it's commercially available is a different issue. It's like saying no one knows whether a BR disc will hold more than a DVD.

iknowall said:
It is a fact; feel free to verify it yourself.

Eh, why are you so confrontational? I asked a question, and got a response that to me made sense...why are you assuming otherwise?
 
I'm sorry to slightly derail the thread, but is it just me, or do I get different "outputs" depending on which cinema I go to?

Let me explain.

I went to see Charlie and the Chocolate Factory when it came out this summer at a Vue Cinema (the top cinema in London, I think) and hell, it looked SHARP. Much sharper than LOTR (which I saw at a UCL, one of the cheaper cinemas) and even the latest Harry Potter, which I also saw at one of the cheaper megaplexes.

I have very trained eyes and I could tell the difference was about half the detail. Also the movement and everything seemed much better in Charlie. I don't know, it almost felt like Charlie was shown with a VERY high-res digital projector and the others were using "something else"..........

I just really don't know much about cinema projectors and such.
 
TrungGap said:
The point is you really don't need 100GB of storage. SL BR disc isn't going to give you 100GB...

A Blu-ray disc is going to give you up to 200GB of storage. TDK has already demonstrated a working 4-layer 100GB Blu-ray disc.

I agree; that's why I've pointed to perceptual testing that has shown H.264 @ 8 Mbps is equal to MPEG2 @ 24 Mbps.

We all know that MPEG4 can achieve a better compression ratio. I pointed out that the PSNR parameter you used does not take the presence of artifacts into account, so you may have a good PSNR, but that means nothing if artifacts appear.
So you agree with what, exactly?

I said "higher resolution of 4K", not higher resolution than 4K, meaning they're using the higher resolution of 4K. I guess the wording can easily be confused; sorry about that.


Whether you say "of 4K" or "than 4K", if you put "higher" in front it means more than 4K.

Btw, giving you the benefit of the doubt, what do you mean: a resolution equal, superior, or inferior to 4K?

Care to explain to us?

I regularly see the same HD master compressed with many different codecs.

The less compressed the codec I use, the better the quality I get from the video.

The more I compress the video, the more I see its sharpness and detail going down.

So, to give you an example: the DVCPRO codec > MPEG2 > MPEG4.

And the MPEG2 HD file is always bigger, and with better quality, than the MPEG4 one.

The only reason you gave us why MPEG2 is better than H.264 is that it's used in digital cinema.

I gave tons of reasons explaining why MPEG2 is better. I gave tons of examples of what happens when a codec compresses a video, downsampling the image, color, chroma, etc., explaining why compression makes the quality go down.

So MPEG4 is more compressed than MPEG2, and for the reasons I listed this means MPEG2 has better quality.

But if you don't understand, I will repeat the concept one more time:

1) The less compressed the video, the less quality you lose compared to the master.

2) Less compression gives you a higher bitrate, and you have less chance of artifacts.

Which codec has less compression? MPEG2.

The fact that MPEG2 HD is used for d-cinema just proves that MPEG2 HD is the codec best suited for encoding high bit-rate HD video. Otherwise no one would use it.
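
To put rough numbers on the compression-ratio argument above (a sketch; the 1080p, 24 fps, 4:2:2, 10-bit source format is an assumption for illustration):

# Compression ratios implied by various delivery bitrates for a 1080p24 4:2:2 10-bit source.
raw_bps = 1920 * 1080 * 2 * 10 * 24        # (luma + chroma samples per pixel) * bits * fps
for mbps in (80, 24, 8):
    print(f"{mbps:>3} Mb/s -> about {raw_bps / (mbps * 1e6):.0f}:1 compression")
#  80 Mb/s -> about 12:1 compression
#  24 Mb/s -> about 41:1 compression
#   8 Mb/s -> about 124:1 compression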

However, as I pointed out, that's not what the Avica web site says.

What you say is not what is written on the Avica site, a site you only know about because I told you about the Avica technology.

What I say is what the people who actually use the Avica technology with me say.

What you say is just an opinion with no relation to reality.

The fact that MPEG2 software and hardware already exist doesn't mean newer compression such as H.264 and JPEG2000 isn't better than MPEG2.

But the fact that NO ONE WANTS TO USE H.264 INSTEAD OF MPEG2 HD COMPRESSION means that no one thinks it is better.

You seem to have a problem understanding.

I repeat: NO ONE WANTS TO USE H.264 INSTEAD OF MPEG2 HD COMPRESSION.

You can use H.264 and VC-1 for digital mastering, and VC-1 is also trying to get into digital cinema, but NO ONE WANTS TO USE H.264 OR VC-1 when you have to make the master.

Do you understand now?


All it proves is that MPEG2 is more widely used...probably because it was the best you could get at the time.

Stop making assumptions about a subject you have absolutely no experience with.

During what time? Right now NO ONE WANTS TO USE H.264 INSTEAD OF MPEG2 HD COMPRESSION, even though you can use H.264 and VC-1 for digital mastering and VC-1 is also trying to get into digital cinema; NO ONE WANTS TO USE H.264 OR VC-1 when you have to make the master.


I've given you plenty of research showing that H.264 offers the same quality at a lower bitrate.

This is simply false and plain wrong.

OK, now please tell me: how many times do I have to explain that more compression = less quality?

How many times do I have to explain to you that when you compress a video you downsample color, chroma, resolution, and other things?

I just gave the example of what you lose with a professional codec like DVCPRO HD:

When you compress a 720p video with the DVCPRO HD codec, it downsamples the image from the original 1280 x 720 to 960 x 720, and the chroma gets downsampled from 640 to 480 samples per line.

The quality gets worse.

And this is with a professional codec that uses a small compression ratio of about 6:1, so just imagine how much quality you lose when you use a highly compressed codec like MPEG4.
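
Putting sample counts on that DVCPRO HD example (the raster figures are the ones quoted above; treating it as a simple 8-bit 4:2:2 sample count is my own simplification):

# Sample-count comparison for 720p before and after DVCPRO HD's horizontal downsampling.
# Assumes 8-bit samples and the 4:2:2 chroma structure described above.
def samples_per_frame(luma_w, luma_h, chroma_w):
    return luma_w * luma_h + 2 * chroma_w * luma_h   # Y plane + Cb + Cr planes

full = samples_per_frame(1280, 720, 640)   # full-raster 4:2:2 source
dvc  = samples_per_frame(960, 720, 480)    # DVCPRO HD internal raster
print(full, dvc, f"{dvc / full:.0%} of the original samples")
# 1843200 1382400 75% of the original samples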

Unless you can provide a further explanation, other than "we use it, therefore it has to be better", I think it's pointless to continue this discussion.

How many times do I have to explain that compression = downsampling = loss of quality?

I hope you are not going to ask me the same thing again.


Sony has already unveiled a market-ready 4K projector.

Right now no 4K projector exists in any theater. Right now you cannot buy a 4K projector.
I don't doubt that in the future there will be an upgrade; it's just that, right now, 2K is the best standard available.

But unfortunately, everything must line up in order to get true 4K. Movies must be 4K from creation.

Which leads me to a question: do you have a clue what the difference is between a project originated on film and a project originated on HD?

You talk like you don't know the difference.

Spider-Man 2 was the first movie to be 4K...meaning the film was scanned at 4K.

You have no clue.
It was scanned at 4K just to make the digital intermediate and then go back out to film.
You do this so you don't lose film resolution.

CGI and production work were also done at 4K.
So there's a push to 4K. At 4K, what are you going to do about video compression?

You know that Spider-Man 2 was made on film, right? You know film is an organic material, not digital data, right?

4K is not the resolution of a video; 4K is the resolution you finally get back on film.

You are going to project the film in the theater, not a digital video.

What do you mean no one knows how JPEG2000 will look? It's been researched, tested and looked at already, and not at consumer resolution but at 4K. Whether it's commercially available is a different issue. It's like saying no one knows whether a BR disc will hold more than a DVD.

Since you have no clue what it looks like, what's the point of speculating about it?

Right now no one uses this technology, and Avica itself thinks MPEG2 HD will still be used for years to come.


Eh, why are you so confrontational? I asked a question, and got a response that to me made sense...why are you assuming otherwise?

I just pointed out that I'm not forcing you to believe me; you can ask anyone who works in the industry about what I'm saying.
 
london-boy said:
I'm sorry to slightly derail the thread, but is it just me, or do I get different "outputs" depending on which cinema I go to?

Let me explain.

I went to see Charlie and the Chocolate Factory when it came out this summer at a Vue Cinema (the top cinema in London, I think) and hell, it looked SHARP. Much sharper than LOTR (which I saw at a UCL, one of the cheaper cinemas) and even the latest Harry Potter, which I also saw at one of the cheaper megaplexes.

I have very trained eyes and I could tell the difference was about half the detail. Also the movement and everything seemed much better in Charlie. I don't know, it almost felt like Charlie was shown with a VERY high-res digital projector and the others were using "something else"..........

I just really don't know much about cinema projectors and such.

That's because Charlie and the Chocolate Factory is a d-cinema film.

Probably the theater was equipped with a digital cinema system and you saw it in that format.

Outstanding quality, I know; that's the quality you get from MPEG2 HD compressed films.
 
iknowall said:
That's because Charlie and the Chocolate Factory is a d-cinema film.

Probably the theater was equipped with a digital cinema system and you saw it in that format.

Outstanding quality, I know; that's the quality you get from MPEG2 HD compressed films.


Cool, thanks! Really, from the first few frames I was like "WOAH WHAT THE..." It was much sharper than anything else I had seen before.
 
london-boy said:
Cool, thanks! Really, from the first few frames I was like "WOAH WHAT THE..." It was much sharper than anything else I had seen before.

I know, it is outstanding. You tasted the power of d-cinema, powered by MPEG2 HD :LOL:

Here is a link about Charlie and d-cinema:

http://www.dcinematoday.com/dc/pr.aspx?newsID=308

"Technicolor Digital Cinema Reaches Industry Milestone With Release of 100th Digital Cinema Title
Leader in Digital Cinema Services and Technologies Prepares and Distributes International Digital Version of Warner Bros. Pictures Summer Blockbuster Hit, Charlie and the Chocolate Factory...."
 
Marvellous. So I take it most big titles on that same screen should be of the same quality?
Even up close it was just SHARP. Usually things get blurry (like Harry Potter, the latest one)... Actually, I wonder, if I went to see the new Harry Potter again at a Vue Cinema (the same one where I saw Charlie), whether it would be digital too. The thing that really bugged me is that HP was quite blurry, and these days my eyes are too trained not to be bothered by that kind of thing.
 
london-boy said:
Marvellous. So I take it most big titles on that same screen should be of the same quality?
Even up close it was just SHARP.

Actually, when I saw Constantine in a d-cinema it was the digital presentation that impressed me the most....I had never seen color like that before...even more than Star Wars Episode 3, which I also saw in the d-cinema version....

I also remember seeing the digital version of The Day After Tomorrow, and I, Robot was a d-cinema film too.

Actually, I wonder, if I went to see the new Harry Potter again at a Vue Cinema (the same one where I saw Charlie), whether it would be digital too. The thing that really bugged me is that HP was quite blurry, and these days my eyes are too trained not to be bothered by that kind of thing.


Yes, I know for sure that Harry Potter and the Goblet of Fire is a d-cinema film, because it is actually projected digitally at the d-cinema I usually go to, so yes, if you go back to see it at a d-cinema theater you will see a huge difference in quality.

The film is also transferred from digital to traditional 35mm film, so that it can be projected with a traditional 35mm projector; you have to remember that most theaters don't have a d-cinema system and can only project 35mm film.
 
iknowall said:
That's because the film is also transferred from digital to traditional 35mm film, so that it can be projected with a traditional 35mm projector; you have to remember that most theaters don't have a d-cinema system and can only project 35mm film.

[Follow me here, I really know very little about cinemas.]
But I thought "real" film had pretty much infinite resolution, so to speak, so in theory it should be more detailed? I guess it's all in the projector, isn't it...
 
That's weird. I'm sure I saw in various 'making of HP' TV programs that they were using film cameras. Is it the reproduction to cinema film that's causing the downgrade in quality? It certainly can't be digital capture that's producing the better quality, because HP isn't digitally captured (AFAIK).
 
london-boy said:
[Follow me here, I really know very little about cinemas.]
But I thought "real" film had pretty much infinite resolution, so to speak, so in theory it should be more detailed? I guess it's all in the projector, isn't it...


There are different types of film stock, so you can find 6K or 4K film negative, but the fact is that when you develop and print it you lose a lot of resolution, so in the end the real resolution of the film is less than 4K; it is about 3K or 2K, which is of course a lot compared to video.

Yes, film has a lot of resolution, but with 1080p digital HD resolution you are near the 2K standard and the detail you get on film.

Actually, 1920 x 1080 is the maximum resolution you can get from the best HD digital cinema cameras available, like the Sony CineAlta:

http://www.cinealta.com

So even if, in theory, when you see a d-cinema movie it has less resolution than a 35mm print, in practice the digital version looks better because the colors are much more vivid, the image is sharper, the detail is more evident, and the projector technology is more powerful.
 
london-boy said:
[Follow me here, I really know very little about cinemas.]
But I thought "real" film had pretty much infinite resolution, so to speak, so in theory it should be more detailed? I guess it's all in the projector, isn't it...
Certainly not infinite. It's given by the crystal size and density of the photosensitive pigments. These vary from film to film. The resolving power of film depends on the speed of the film (ISO value) being used, and that's selected based on how much light there is and what shutter speed is being used. I'm not sure how these vary in movie films. From what I've read, I'd guess a 35mm film frame would be about 10 megapixels digital equivalent. Certainly the best digital film cameras shouldn't be able to compete with a film camera at the moment. I can only guess the better quality in the digital projection comes from the losses in transferring from the film master to the cinema print. A digital transfer would be lossless (save compression), whereas a film-to-film transfer is going to have a loss of resolving power.
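
One common way to turn resolving power into a rough pixel-equivalent figure (the 50 lp/mm resolving power and the 24 x 18 mm frame below are illustrative assumptions, not measured values):

# Pixel-equivalent of a film frame from an assumed resolving power.
lp_per_mm = 50            # assumed resolving power (line pairs per mm)
frame_w_mm, frame_h_mm = 24.0, 18.0   # roughly a 35mm motion-picture frame (assumed)
px_per_mm = 2 * lp_per_mm # two pixels per line pair (Nyquist)
megapixels = (frame_w_mm * px_per_mm) * (frame_h_mm * px_per_mm) / 1e6
print(f"~{megapixels:.0f} MP equivalent")   # ~4 MP at these assumptions
# At an assumed 80 lp/mm the same frame works out to roughly 11 MP,
# which shows how different assumptions produce very different estimates.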
 
iknowall said:
There are different types of film stock, so you can find 6K or 4K film negative, but the fact is that when you develop and print it you lose a lot of resolution, so in the end the real resolution of the film is less than 4K; it is about 3K or 2K, which is of course a lot compared to video.

Yes, film has a lot of resolution, but with 1080p digital HD resolution you are near the 2K standard and the detail you get on film.

Hmmm, now you're losing credibility with me.

Anyone is free to type "35mm film megapixels" into Google and come up with answers like...

The theoretical peak resolution of fine-grained 35mm film is, indeed, something like 50 megapixels, as shown by internationally recognised authority me in this diagram. That's right at the bleeding edge, though. In the real world, even very serious 35mm photographers have a hard time beating 25-megapixel-equivalent quality - you can use a fabulously expensive scanner at outrageous resolution to make much bigger files from 35mm, but all they give you is a larger view of the grain and the blur. So I think it's perfectly fair to say that a truly excellently sharp picture on 35mm film is about 25MP-equivalent. As Ken says. But as this diagram (also from my old D60 review) points out, it's a rare 35mm shot that contains more than 10 megapixels of actual detail. No amount of photographic wizardliness on your part will let you capture even 25 megapixels, if you're not shooting super-fine-grained film (which means pretty slow film - even 200-speed film doesn't cut it, and all-purpose 400-speed happy snapper film certainly doesn't), with a good lens, a steady camera, and (importantly) a subject that actually has detail there for the film to capture.
from http://www.dansdata.com/20d_res.htm
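
To relate megapixel figures like those back to frame dimensions (purely illustrative arithmetic; the 3:2 still-frame aspect ratio is an assumption):

# What frame size corresponds to a given megapixel count at a 3:2 still-frame aspect ratio?
def frame_size(megapixels, aspect=3/2):
    pixels = megapixels * 1e6
    height = round((pixels / aspect) ** 0.5)
    return round(height * aspect), height

for mp in (10, 25, 50):
    w, h = frame_size(mp)
    print(f"{mp} MP ~ {w} x {h}")
# 10 MP ~ 3873 x 2582
# 25 MP ~ 6123 x 4082
# 50 MP ~ 8661 x 5774
# For comparison, 1920 x 1080 is about 2.1 MP.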
 
Shifty Geezer said:
That's weird. I'm sure I saw in various 'making of HP' TV programs that they were using film cameras. Is it the reproduction to cinema film that's causing the downgrade in quality? It certainly can't be digital capture that's producing the better quality, because HP isn't digitally captured (AFAIK).

Well, it seems Harry Potter is being released in an IMAX digital version, but I can't tell whether it was originated on film and then mastered digitally, or shot directly on HD; the article is not clear:

http://sev.prnewswire.com/film-motion-picture/20051114/NYM05414112005-1.html
 
Shifty Geezer said:
Hmmm, now you're losing credibility with me.

About which I could not care less.

Anyone is free to type "35mm film megapixels" into Google and come up with answers like...

Lol, you don't know how wrong you are. Well, maybe it's my fault because I didn't explain the concept properly.

You can't use megapixels to define film resolution.

Megapixels are a digital measure; film is not digital, so you can't use megapixels to define film resolution.

If you scan the film with a 2K scanner you get a video with 2K resolution; if you scan at 4K you get a digital video with 4K resolution. This absolutely does not mean that film has a 2K or 4K resolution. There are different types of film stock, some better in low-light conditions, and the better the stock, the more detailed the image; the film format itself is standard. There are also different film formats: Super 35mm gives you a larger frame area with extra detail. So in the end, with good film stock you get a nice picture, and the more detail there is, the higher the resolution you can scan at without just seeing film grain.
 
iknowall said:
You can't use megapixels to define film resolution.

You can use them as an equivalency, though. How many megapixels do you need to get the same sort of resolution as a 35mm film frame is a fair question.

Megapixels are a digital measure; film is not digital, so you can't use megapixels to define film resolution.

But you do use similar measures, like lines (or line pairs) per inch, so it's not a totally unrelated measure.

If you scan the film with a 2K scanner you get a video with 2K resolution; if you scan at 4K you get a digital video with 4K resolution. This absolutely does not mean that film has a 2K or 4K resolution.

And if you scan it with a 10K scanner, you get a 10K resolution, and if you scan it with a 100K scanner, you get blobs because the pixel size is smaller than the grain size (not that such a scanner exists!). There's certainly nothing stopping films from using a 10-megapixel scan to get 10-megapixel frames, as there's certainly that much information in a slide. Which is in stark contrast to your assertion that 1080p is close to the resolution of a film frame.

Yes, film has a lot of resolution, but with 1080p digital HD resolution you are near the 2K standard and the detail you get on film.
I'd expect a film-like resolution in a digital camera to be something like 4000x2000, a lot more than 1920x1080.
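
For reference, the pixel counts behind the resolutions being compared here (DCI container sizes assumed for 2K and 4K):

# Pixel counts for the resolutions being compared in this thread.
for name, (w, h) in [("1080p HD", (1920, 1080)),
                     ("DCI 2K", (2048, 1080)),
                     ("'film-like' guess", (4000, 2000)),
                     ("DCI 4K", (4096, 2160))]:
    print(f"{name}: {w * h / 1e6:.1f} megapixels")
# 1080p HD: 2.1 megapixels
# DCI 2K: 2.2 megapixels
# 'film-like' guess: 8.0 megapixels
# DCI 4K: 8.8 megapixels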
 
Shifty Geezer said:
And if you scan it with a 10K scanner, you get a 10K resolution, and if you scan it with a 100K scanner, you get blobs because the pixel size is smaller than the grain size (not that such a scanner exists!). There's certainly nothing stopping films from using a 10-megapixel scan to get 10-megapixel frames, as there's certainly that much information in a slide.

Shifty, I think the point (which I think is less obvious given the way it was worded) is that a 35mm film doesn't have an infinite scope of detail, meaning it's not necessarily dependent on the scanner but on the source material. You can't take pictures on 35mm film and expect to be able to print them the size of a house without seeing the limits of the film. While I can't vouch for his claim, I think his point was that the limit of the 35mm film material would be near the 2-million-pixel mark (1080p resolution).
 
Shifty Geezer said:
You can use them as an equivalency, though. How many megapixels do you need to get the same sort of resolution as a 35mm film frame is a fair question.
But you do use similar measures, like lines (or line pairs) per inch, so it's not a totally unrelated measure.

No, it doesn't make sense. I also asked this question of a friend who handles film post-production, and she said that using a digital term like megapixels doesn't make sense to her.

Because the 35mm format is standard, what determines the result is how good the shooting conditions are and how well you shoot: that's what gives you a clearer, more detailed final image.

And if you scan it with a 10K scanner, you get a 10K resolution, and if you scan it with a 100K scanner, you get blobs because the pixel size is smaller than the grain size (not that such a scanner exists!). There's certainly nothing stopping films from using a 10-megapixel scan to get 10-megapixel frames, as there's certainly that much information in a slide.

Why talk about film scanners if you don't have a clue?

I am not sure whether even today a 6K telecine scanner actually exists; talking about a 10K film scanner doesn't make sense. Like I said, the numbers that make sense are 2K and 4K.

Which is in stark contrast to your assertion that 1080p is close to the resolution of a film frame. I'd expect a film-like resolution in a digital camera to be something like 4000x2000, a lot more than 1920x1080.

At 2K it's impossible to see any lines unless you put the negative under a microscope.

So how many video-to-film transfers have you done, to know whether 1920 x 1080 is enough to get a good result when you transfer to film?

How many? So what's the point of talking to you about something you have never done?
 