Will Warner support Blu-ray?

Phil said:
Shifty, I think the point (which I think is less obvious given the way it was worded) is that a 35mm film doesn't have an infinite scope of detail
I recognise that, and said as much in my reply to LB
me (hooray!) said:
Certainly not infinite. It's given by the crystal size and density of the photosensitive pigments.
While I can't vouch for his claim, I think his point was that the limit of the 35mm film material would be near the 2 million pixel mark (1080p resolution).
Which is what I disagree with. Do any number of searches and you'll find everything from scientific investigations putting the crystal distribution in a 35mm frame at 50 million pixels' worth of data, down to personal investigations finding you're unlikely to get more than 10 megapixels' worth of data from a scan. But it's definitely a lot more than 2 megapixels' worth. Otherwise a 2 megapixel camera would produce pictures as good as a 35mm SLR's, able to be blown up to a 10x6. Which they're not ;)
 
Shifty Geezer said:
I recognise that, and said as much in my reply to LB

Which is what I disagree with. Do any number of searches and you'll find everything from scientific investigations putting the crystal distribution in a 35mm frame at 50 million pixels' worth of data, down to personal investigations finding you're unlikely to get more than 10 megapixels' worth of data from a scan. But it's definitely a lot more than 2 megapixels' worth. Otherwise a 2 megapixel camera would produce pictures as good as a 35mm SLR's, able to be blown up to a 10x6. Which they're not ;)

If you have Superman vision, maybe you can tell the difference between a 4K scanned image and a 2K scanned image transferred to film.

There are many reasons why a film projector cannot resolve any better: the lens, the registration, the contact printing method, etc. So using a higher resolution with low-resolution source material doesn't gain you very much.

Fact is, the best resolution you can get with an HD camera is 1920x1080, and when you master it at 2K and transfer it to film, you get a result where no one can tell you whether the project was originally shot on film or not.

You can't tell that the transfer originated on tape.
 
iknowall said:
Why talk about film scanners if you don't have a clue?
Maybe not cinema film scanners, but normal 35mm scanners, yeah I know a bit.

At 2K it's impossible to see any lines unless you put the negative under a microscope.
Yes, but when that negative is blown up to a 30' image on a cinema screen...! If a 35mm frame has 4000 lines per inch, then on a 10 ft high screen you get 33 lines per inch on the screen. A 1080p image projected onto a 10 ft high screen gives 9 lines per inch. Even if both look identical when viewed as a tiny little slide, 35mm film has a lot more information, which becomes apparent when you do enlargements.
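To make that arithmetic explicit, here's a quick back-of-envelope calc (a sketch only; the ~1 inch frame height and 4000 lines/inch are just the round figures used above):

Code:
# Back-of-envelope: resolvable lines per inch on a 10 ft high screen.
# Assumed round numbers: a ~1 inch high 35mm frame resolving
# ~4000 lines per inch, projected to a screen 10 ft (120 in) high.
FRAME_HEIGHT_IN = 1.0
FILM_LINES_PER_IN = 4000
SCREEN_HEIGHT_IN = 10 * 12

film_lines = FRAME_HEIGHT_IN * FILM_LINES_PER_IN   # ~4000 lines in the frame
print(film_lines / SCREEN_HEIGHT_IN)               # ~33.3 lines/inch on screen
print(1080 / SCREEN_HEIGHT_IN)                     # 9.0 lines/inch for 1080p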
So how many video-to-film transfers did you actually do, to know whether 1920x1080 is enough to get a good result when you transfer to film?
How many? So what sense is there in talking to you about something you've never done before?
I don't know how many 35mm-to-digital cine film scanners manage >1920x1080 images. But I do know 35mm scanners can go well beyond 10 megapixel captures from a 35mm slide. Unless cinematography film is exceptionally poor quality stuff, you could scan a 35mm slide at 10 megapixel resolution and get more detail than you would scanning it at 2 megapixels. Hence your argument that 1080p is about as much detail as you get on film is nonsense. Maybe that's as good as is currently possible given the limitations of current film-scanning technology, but in the context of LB's question, a 35mm slide has a greater information density than a 1080p image, and when projected onto a cinema screen a 35mm slide should look better. The reason it doesn't look better isn't because 35mm only has as much information as a 2 megapixel digital image. 1080p is not approaching the limits of film resolution.

Unless when you said 'developing' you also meant the duplication from 35mm source to 35mm projection rolls, which is what I already said. If that's true, you didn't word yourself at all clearly, understandable as English isn't your native language. But you don't half get annoying going around telling people they don't know what they're talking about. How about, instead of assuming anyone who disagrees with you is an ignorant fool with no knowledge or experience, you discuss matters more politely?
 
Shifty Geezer said:
Maybe not cinema film scanners, but normal 35mm scanners, yeah I know a bit.

Well, actually the machine that records digital frames onto film is called a film recorder.
Do you know about film recorders?

Yes, but when that negative is blown up to a 30' image on a cinema screen...! If a 35mm frame has 4000 lines per inch, then on a 10 ft high screen you get 33 lines per inch on the screen. A 1080p image projected onto a 10 ft high screen gives 9 lines per inch. Even if both look identical when viewed as a tiny little slide, 35mm film has a lot more information, which becomes apparent when you do enlargements.

First, you can't blow up the negative; you have to develop the negative through a process that loses almost half of the original resolution by the time you have the positive copy.

But, getting back to the question of whether a 2K image transferred to film can have film quality...


If you have Superman vision, maybe you can tell the difference between a 4K scanned image and a 2K scanned image transferred to film.

No one can tell that the 35mm version of Star Wars Episode 3 originated on tape if you don't already know it.

Unless you have Superman vision.

How many video-to-film transfers have you seen screened?


I don't know how many 35mm-to-digital cine film scanners manage >1920x1080 images. But I do know 35mm scanners can go well beyond 10 megapixel captures from a 35mm slide. Unless cinematography film is exceptionally poor quality stuff, you could scan a 35mm slide at 10 megapixel resolution and get more detail than you would scanning it at 2 megapixels.

Your reading comprehension = not good.

I asked you, "How many video-to-film transfers did you do, to know whether 1920x1080 is enough to get a good result when you transfer to film?"

If we are talking about video-to-film transfer, we are not talking about scanning a 35mm film; we are talking about transferring a 2K image to film.


Hence your argument that 1080p is about as much detail as you get on film is nonsense

You are the one talking nonsense, because from the internegative to the final copy you lose almost half the resolution. That's because you have to deal with the film's organic losses, and you won't get more than 2-3K on the final positive 35mm copy.

But, getting back to the question of whether a 2K image transferred to film can have film quality...

Have you ever seen a 1080p video transferred to film where you could tell it originated on tape?

I have seen it with my own eyes, and you can't tell the difference when you project the movie.

If you see the 35mm print of Star Wars Episode 3, you can't tell it originated on tape.

Maybe that's as good as is currently possible given the limitations of current film-scanning technology, but in the context of LB's question, a 35mm slide has a greater information density than a 1080p image, and when projected onto a cinema screen a 35mm slide should look better. The reason it doesn't look better isn't because 35mm only has as much information as a 2 megapixel digital image. 1080p is not approaching the limits of film resolution.

Again: we are not talking about scanning a 35mm film, we are talking about recording 1080p video onto film.

Your theory is fun, but the truth is against you.

Like I said, when you master it at 2K and transfer it to film, you get a result where no one can tell you whether the project was originally shot on film or not.


Unless when you said 'developing' you also meant the duplication from 35mm source to 35mm projection rolls, which is what I already said. If that's true, you didn't word yourself at all clearly, understandable as English isn't your native language. But you don't half get annoying going around telling people they don't know what they're talking about. How about, instead of assuming anyone who disagrees with you is an ignorant fool with no knowledge or experience, you discuss matters more politely?


I assume that you don't know about the 35mm duplication process.

You start with an internegative and the optical sound which, combined together, give you the negative.

The negative gives you the positive copy.

From the internegative to the final copy you lose almost half the resolution; that's because you have to deal with the film's organic losses.

So that's why I said the final positive copy will end up nearer to 2-3K.

So you are the one talking nonsense, saying that 2K resolution is not comparable to what you get on a final 35mm copy.

The funny thing is that the article you quoted talked about photography, not filmmaking, and with filmmaking the film goes through a lot of different development cycles.
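Run as arithmetic, the generation-loss claim looks like this (the per-stage retention factor is an illustrative assumption, not a measured figure):

Code:
# Generation-loss sketch for the duplication chain described above.
# The 70% of resolution retained per duplication stage is an assumption
# chosen only to illustrate that losses compound multiplicatively.
res = 4000                                   # assumed camera-negative lines
for stage in ("internegative", "release positive"):
    res *= 0.7                               # assumed resolution kept per stage
    print(f"{stage}: ~{res:.0f} lines")      # ~2800, then ~1960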
 
AVS Forum has a thread with the MS VP of the Windows Media division claiming VC-1 is 2-3 times as efficient as MPEG2. But it sounds like SPE is using MPEG2 because its encoders are more mature than those for H.264 and VC-1; apparently MS is the only one with a VC-1 encoder?

There have also been claims that H.264 and VC-1 license fees are much, much less than MPEG2's. But you need expensive hardware to run real-time encoders for the new codecs.
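For scale, here's what the claimed efficiency would mean for disc runtime (a sketch; the 2.5x factor and bitrates are illustrative, and the 25 GB figure assumes a single-layer Blu-ray):

Code:
# Rough runtime math for the "VC-1 is 2-3x as efficient" claim.
DISC_GB = 25                                 # assumed single-layer Blu-ray
disc_bits = DISC_GB * 8e9
for codec, mbps in [("MPEG2", 24.0), ("VC-1 at ~2.5x", 24.0 / 2.5)]:
    hours = disc_bits / (mbps * 1e6) / 3600
    print(f"{codec}: ~{hours:.1f} h at {mbps:.1f} Mbps")   # ~2.3 h vs ~5.8 h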

Isn't digital cinema more about cutting costs than necessarily delivering better picture quality? Studios would save millions if they didn't have to make thousands of prints a year. But theater owners are resistant because they don't want to spend over $100k to help the studios cut costs. Supposedly the UK has some kind of govt. program to help upgrade theaters for digital.

BTW, what's the deal with Europeans and Hollywood? Have they completely surrendered? I see on Via Condotti big promos for that rotten Mr. and Mrs. Smith movie. Big cutouts of Brad and Angelina enhanced to look much younger. Then on one corner, they're setting up TV lights like they're having some premiere event. Guess I could have hung out to see if Angelina showed up.

Then in some shopping gallery, they have a big Harry Potter display. Guess Hollywood has the money to run these promotions.
 
iknowall said:
If we are talking about video-to-film transfer, we are not talking about scanning a 35mm film; we are talking about transferring a 2K image to film.
No, we're not. LB asked about how the quality differs between digital and analogue projection, saying film was 'infinite' resolution. You replied saying film has a resolution of no more than 2-3K once printed. I explained otherwise.
So you are the one talking nonsense, saying that 2K resolution is not comparable to what you get on a final 35mm copy.
No, I was talking about the inherent resolution of film after you said it was only 2K.

In what you're saying, yes, a digital image projected onto 35mm film of course isn't going to be any better than a digital projection. But you've failed to follow the course of the discussion. I was bringing you up on your assertion about the resolution of 35mm film (the same technology is used in stills cameras, so my knowledge crosses over perfectly well, thank you very much), which I think was just poor wording on your part. Of course you'd rather attribute that to my reading comprehension skills :rolleyes:

But heck, who cares. You're a rude, obnoxious, arrogant prat. Another worthless addition to the forum whose attitude gets in the way of anything constructive you could bring. Certainly accusing me of bad comprehension, and saying anyone who doesn't work in your field of experience doesn't know anything about anything, as though you're the sole expert on the subject, is completely the wrong attitude for constructive discussion. Mix-ups with the topic can and do happen, and sometimes someone doesn't get a point being made, but you needn't call them out as a dunce because of it.

And as a last point, cinematography has sod all to do with console tech anyway! This is another of those weird 'what's it doing here' threads!
 
As a tiny tangent, I expect the really high-end HD debates to be worthless. People go for features. MPEG4 at 10 Mbits is plenty to convince most people that it's really HD. And I would expect Star Wars Episodes 1, 2 and 3 on one disc to be much more appealing than the draw of "better" quality that most people will not see (and most setups cannot show).

Yeah, maybe if you could do a direct comparison by switching them quickly people might see. But when will that happen? At Best Buy, where no two TVs have the same color tone?
 
Shifty Geezer said:
No, we're not. LB asked about how the quality differs between digital and analogue projection, saying film was 'infinite' resolution. You replied saying film has a resolution of no more than 2-3K once printed. I explained otherwise.

Which is true, because the final copy has almost half the resolution of the internegative.

No, I was talking about the inherent resolution of film after you said it was only 2K.

No, that's my reply: "So even if theoretically, when you see a d-cinema movie, the movie has less resolution compared to a 35mm movie, in practice it looks better digitally because the colors are much more vivid and sharper, the detail is more evident, and the projector technology is more powerful."

That's what I said.

In what you're saying, yes, a digital image projected onto 35mm film of course isn't going to be any better than a digital projection.

No, I am not even saying that. I am saying that a 2K image transferred to film will look as good as an image originally shot on film.

But you've failed to follow the course of the discussion. I was bringing you up on your assertion about the resolution of 35mm film (the same technology is used in stills cameras, so my knowledge crosses over perfectly well, thank you very much), which I think was just poor wording on your part. Of course you'd rather attribute that to my reading comprehension skills :rolleyes:

That's because I ended up talking about video-to-film transfer and you were talking about 35mm scanning, the opposite thing.


But heck, who cares. You're a rude, obnoxious, arrogant prat. Another worthless addition to the forum whose attitude gets in the way of anything constructive you could bring. Certainly accusing me of bad comprehension, and saying anyone who doesn't work in your field of experience doesn't know anything about anything, as though you're the sole expert on the subject, is completely the wrong attitude for constructive discussion. Mix-ups with the topic can and do happen, and sometimes someone doesn't get a point being made, but you needn't call them out as a dunce because of it.

I was rude because of how you started with your first post:

"Hmmm, now you're losing credibility with me."
 
Inane_Dork said:
As a tiny tangent, I expect the really high-end HD debates to be worthless. People go for features. MPEG4 at 10 Mbits is plenty to convince most people that it's really HD. And I would expect Star Wars Episodes 1, 2 and 3 on one disc to be much more appealing than the draw of "better" quality that most people will not see (and most setups cannot show).

Yeah, maybe if you could do a direct comparison by switching them quickly people might see. But when will that happen? At Best Buy, where no two TVs have the same color tone?

Well, this actually makes sense, and it is a smart comment about the MPEG4 thing.
 
> NO ONE WANTS TO USE H.264 INSTEAD OF MPEG2HD COMPRESSION

DirecTV does, but what do they know?

BTW, Warner's HD-DVDs will be in VC-1.

Hong.
 
hongcho said:
> NO ONE WANTS TO USE H.264 INSTEAD OF MPEG2HD COMPRESSION

DirecTV does, but what do they know?
BTW, Warner's HD-DVDs will be in VC-1.
Hong.


I was not talking in general; I was talking only about d-cinema compression, where everyone uses only MPEG2HD.

I don't doubt that VC-1 and H.264 will be used.
 
iknowall said:
No, that's my reply: "So even if theoretically, when you see a d-cinema movie, the movie has less resolution compared to a 35mm movie, in practice it looks better digitally because the colors are much more vivid and sharper, the detail is more evident, and the projector technology is more powerful."

That's what I said.
The bit I replied to was
There are different types of film quality, so you can find 6K film or a 4K film negative, but the fact is that when you develop it you lose a lot of resolution, so in the end the real res of the film is less than 4K; it is about 2 or 3K, which is a lot of course compared to video.
Yes, film has a lot of resolution, but with 1080p digital HD resolution you are near the 2K standard quality and the detail you get on film.
No, I am not even saying that. I am saying that a 2K image transferred to film will look as good as an image originally shot on film.
Which is what I was talking about but you said we weren't talking about that! :???:

A 35mm film camera can capture more than 1080p's worth of information per frame
That's because I ended up talking about video-to-film transfer and you were talking about 35mm scanning, the opposite thing.
I was and always have been talking about the inherent capabilities of photographic film to capture data. Whether a 35mm film is exposed to light in an SLR camera or a movie camera makes no odds to the fact that the photographic pigment particles are of a density such that there are enough particles per inch to represent 10+ megapixels worth of data.
I was rude because of how you started with your first post:
Which I started with after following your attitude towards others in this thread. I was willing to accept your expert opinion on the MPEG4 point, but as others have contributed I've found you less willing to argue intelligently, instead just saying 'I'm experienced, you're not, so everyone should believe me', coupled with a poor ability to express yourself in the English tongue. You being a newbie, I haven't yet formed an opinion on what you will be contributing (useful information or poor, unintelligent arguments) and was (am) trying to figure that out. The credibility thing came up because you were making claims about expertise, but what you say about 35mm film having little more resolution than an HD camera is, to my mind, misinformation, which suggests the knowledge you have on MPEG4 might also be off. Now I don't know whether you really think a cinematic 35mm film image has no more detail once developed than a 1920x1080 digital capture, because one minute you seem to be saying one thing, and the next saying the other. I can't be bothered to try to make sense of it any more.

London-Boy: Dunno what you've got out of this, or even what you actually wanted to know! I will tell you that a 35mm film frame using silver halide and equivalent light-sensitive pigments, as used in cinema cameras and SLR film cameras and developed by the likes of Fujifilm and Kodak, can capture 10+ million pixels' worth of image data. That's something like 4000 dots per inch. As to what you see on the screen in the cinema, I don't know. As I say, I think the processing from master to projector copy is quite lossy. If you've ever made a print from a print, for example, you know how quality suffers. If you record your material digitally, the digital resolution WILL be lower than what the 35mm frame can capture, but when you transfer the digital images onto analogue film the quality could be affected by all sorts of factors that prevent a perfect reproduction, so a film projection of a digital capture might well not be as good as a digital projection of a digital capture. And that's me done here.
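To put numbers on that '10+ megapixels' figure (a sketch; the frame dimensions are nominal assumptions, and 4000 dpi is the round figure above):

Code:
# What ~4000 dpi of usable detail implies for per-frame pixel counts.
# Frame sizes are nominal assumptions: cine ~22x16 mm, SLR still 36x24 mm.
MM_PER_IN = 25.4
DPI = 4000

def megapixels(w_mm, h_mm, dpi=DPI):
    return (w_mm / MM_PER_IN * dpi) * (h_mm / MM_PER_IN * dpi) / 1e6

print(f"35mm cine frame:  ~{megapixels(22, 16):.1f} MP")  # ~8.7 MP
print(f"35mm still frame: ~{megapixels(36, 24):.1f} MP")  # ~21.4 MP
print(f"1080p frame:       {1920 * 1080 / 1e6:.1f} MP")   # ~2.1 MP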
 
Shifty Geezer said:
The bit I replied to was

Which is what I was talking about but you said we weren't talking about that! :???:

OK, read here if you don't believe what I say is true:

"Theoretically, based on the grain structure of the emulsion, film could be pegged as high as 6K. Practically, however, this is only true for first-generation camera original and only under ideal conditions. In practice, negative film is typically assumed to have a maximum resolution of about 4K. Release prints from an internegative, depending who you talk to, are said to have a resolution of well under 1,800 pixels across, and the projected image may actually be worse because lamps can be misaligned, and lenses can be dirty or out of focus."

So you see, the final detail can get even worse than 2K.

http://www.editorsguild.com/newsletter/MayJun02/digital_intermediate.html


A 35mm film camera can capture more than 1080p's worth of information per frame
I was and always have been talking about the inherent capabilities of photographic film to capture data.
Whether a 35mm film is exposed to light in an SLR camera or a movie camera makes no odds to the fact that the photographic pigment particles are of a density such that there are enough particles per inch to represent 10+ megapixels worth of data.
Which I started with after following your attitude towards others in this thread.


"Theoretically, based on the grain structure of the emulsion, film could be pegged as high as 6K. Practically, however, this is only true for first-generation camera original and only under ideal conditions. In practice, negative film is typically assumed to have a maximum resolution of about 4K. Release prints from an internegative, depending who you talk to, are said to have a resolution of well under 1,800 pixels across, and the projected image may actually be worse because lamps can be misaligned, and lenses can be dirty or out of focus."

http://www.editorsguild.com/newsletter/MayJun02/digital_intermediate.html


I was willing to accept your expert opinion on the MPEG4 point, but as others have contributed I've found you less willing to argue intelligently, instead just saying 'I'm experienced, you're not, so everyone should believe me', coupled with a poor ability to express yourself in the English tongue.

My opinion is that if you have enough space MPEG2HD is the best codec, and I explained why: you have less compressed video with a higher bitrate.


And to validate my point that MPEG2HD gives an outstanding result, you also have the post from london-boy, who saw the d-cinema version of Charlie.


You being a newbie, I haven't yet formed an opinion on what you will be contributing (useful information or poor, unintelligent arguments) and was (am) trying to figure that out. The credibility thing came up because you were making claims about expertise, but what you say about 35mm film having little more resolution than an HD camera is, to my mind, misinformation, which suggests the knowledge you have on MPEG4 might also be off.

My credibility is not in question; I was always right. Here is the proof:


Now I don't know whether you really think a cinematic 35mm film image has no more detail once developed than a 1920x1080 digital capture, because one minute you seem to be saying one thing, and the next saying the other. I can't be bothered to try to make sense of it any more.

I gave you the proof that what I say about film resolution is the truth.
 
iknowall said:
"Theoretically, based on the grain structure of the emulsion, film could be pegged as high as 6K. Practically, however, this is only true for first-generation camera original and only under ideal conditions. In practice, negative film is typically assumed to have a maximum resolution of about 4K.
Which says exactly what I was saying. A 35mm film image from source has about 4000 lines across, about 4000 DPI, about 10 megapixels' worth of data. The intermediaries made from that source are downgraded in quality, which is what I said way back in my initial reply to LB.
me (hooray!) said:
I can only guess better quality in the digital projection comes from the losses transferring from film master to cinema projection.
Case closed.
 
Shifty Geezer said:
Which says exactly what I was saying. A 35mm film image from source has about 4000 lines across, about 4000 DPI, about 10 megapixels' worth of data. The intermediaries made from that source are downgraded in quality, which is what I said way back in my initial reply to LB.

Case closed.

Sorry, but are you kidding me now?

I said this:

"There are different types of film quality, so you can find 6K film or a 4K film negative, but the fact is that when you develop it you lose a lot of resolution, so in the end the real res of the film is less than 4K; it is about 2 or 3K, which is a lot of course compared to video.

Yes, film has a lot of resolution, but with 1080p digital HD resolution you are near the 2K standard quality and the detail you get on film."


Which is exactly confirmed here:

"Release prints from an internegative, depending who you talk to, are said to have a resolution of well under 1,800 pixels across, and the projected image may actually be worse because lamps can be misaligned, and lenses can be dirty or out of focus."

So I was always right, but you stated that I was not.
 
IKnowAll,

What in the h*ll are you talking about? H.264 supports chroma formats of 4:2:0, 4:2:2 and 4:4:4; what does MPEG2 support? 4:2:0 and 4:2:2. When you're compressing with H.264 you're not resampling the resolution; you're not discarding any more information than you would with MPEG2. Is it hard to understand that H.264 is more efficient than MPEG2 at storing the same visual information? One of the problems with H.264 and JPEG2000 is that they're very demanding on both encoders and decoders.
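To see what those chroma formats mean in raw sample counts (a sketch; 8 bits per sample assumed, before any codec gets involved):

Code:
# Uncompressed size of one 1080p frame under each chroma format,
# at an assumed 8 bits per sample. Subsampling, not the codec choice,
# is what discards chroma samples.
W, H = 1920, 1080
luma = W * H
for fmt, frac in [("4:2:0", 0.25), ("4:2:2", 0.5), ("4:4:4", 1.0)]:
    samples = luma + 2 * frac * luma          # Y plane plus two chroma planes
    print(f"{fmt}: {samples / 1e6:.1f} MB/frame")   # 3.1 / 4.1 / 6.2 MB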

To answer l-b's question: Charlie was filmed on traditional film, scanned at 4K and then resized to 2K. There's an upside to scanning at 4K and resizing to 2K: you get better quality, since you have more samples to work with, versus something like the Star Wars prequels, which were filmed in 2K digital.

Regarding film vs. digital: Shifty is right, film gets you more detail. To maintain the higher quality some will do a 4K scan, such as Spider-Man 2. It was filmed using traditional film and scanned in at 4K. Traditional scanners usually scan at 2k x 1.5k; however, most recent systems do 2k x 1k (slightly better than 1080p). Most people agree that 5K digital is not discernible from (good quality) 35mm film. Anyway, throughout the entire process everything is 4K. Last January, Sony did a special screening of Spider-Man 2 with its new 4K digital projector.

http://www.dcinematoday.com/dc/pr.aspx?newsID=198
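On the 4K-scan-resized-to-2K point, here's a minimal sketch of why the extra samples help (assumes numpy; a plain 2x2 box average stands in for a real resampling filter, and the array is just random stand-in data):

Code:
# Downsizing a 4K scan to 2K: each output pixel averages a 2x2 block,
# which suppresses grain and scanner noise relative to a native 2K scan.
import numpy as np

scan_4k = np.random.rand(3112, 4096)         # stand-in for a 4K film scan
h, w = scan_4k.shape
scan_2k = scan_4k.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
print(scan_2k.shape)                         # (1556, 2048): a "2K" frame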
 
TrungGap said:
IKnowAll,

What in the h*ll are you talking about? H.264 supports chroma formats of 4:2:0, 4:2:2 and 4:4:4; what does MPEG2 support? 4:2:0 and 4:2:2.

MPEG2 supports 4:4:4 also; I already stated that and I already gave a link about it.
You'd better read the whole thread before posting.

When you're compressing with H.264 you're not resampling the resolution; you're not discarding any more information than you would with MPEG2.

Wrong and false, because you have less bitrate, and the more compression you use, the more you need to downsample.

Sorry, I am tired of saying the same thing again and again, so I will give you a link about it:

"A higher compression factor leads to lower video quality with more artifacts. "

http://www.xs4all.nl/~brw/ds_products/hot_math.html

Is it hard to understand that H.264 is more efficient than MPEG2 at storing the same visual information?

It is really, really hard to understand when you have less bitrate and more compression.

""A higher compression factor leads to lower video quality with more artifacts. "
 
TrungGap said:
IKnowAll,



To answer l-b's question: Charlie was filmed on traditional film, scanned at 4K and then resized to 2K. There's an upside to scanning at 4K and resizing to 2K: you get better quality, since you have more samples to work with, versus something like the Star Wars prequels, which were filmed in 2K digital.

http://www.dcinematoday.com/dc/pr.aspx?newsID=198


Constantine, I, Robot, and Star Wars are filmed with HD cinema cameras.

Star Wars uses the Sony CineAlta.
 
iknowall said:
Wrong and false, because you have less bitrate, and the more compression you use, the more you need to downsample.

Bullshit. It was bullshit 5 pages ago in this thread when you first spouted it and it's not smelling any better today. Get over it.
 
iknowall said:
So I was always right, but you stated that I was not.
Because I was talking about 35mm (film master) resolution, whereas you were talking about copy resolutions, not source (film master) resolutions. Only, that wasn't clear in how you worded your response to LB. From how you worded it, you were saying film in general was nearer 2K, HDTV resolution.
 