Business ramifications of a 2014 MS/Sony next gen *spawn

First 2k and then 4k were requested by effects and editing houses to cut down on artefacts when editing movies; these resolutions were never intended for consumer use, and on a 40" screen the difference probably wouldn't be noticeable. We are suddenly getting a lot of media attention on 4k displays because, for the first time, they are becoming readily available; previously HD and 2k monitors were used during editing even though the source material and processing were in 4k. I'm sure there will be a few wealthy AV enthusiasts who will buy these very large 4k studio monitors and watch upscaled HD, but it's going to be very niche.
 
That's my point though, jeff. Games will keep on targeting 1920x1080, alongside 1920x1080 in 3D mode. The fact that 1920x1080 in 3D can sometimes be better delivered by a panel capable of 4k doesn't necessarily change this. Perhaps at some point non-3D content could start using 4k as well, but this won't be a standard for console games, and it will take a very long time before we get there on any significant level; right now almost nobody even gets their normal TV channels at 1920x1080 yet.

Make no mistake though - I'm quite confident that new consoles released in 2014 or later will be able to output 4k resolutions, but I highly doubt they will do anything meaningful with it in the first five years.

We're still in the process of transitioning to 1920x1080p, with the transition to 3D yet to follow. It will be a long time before we're ready for another transition.
 
First 2k and then 4k were requested by effects and editing houses to cut down on artefacts when editing movies; these resolutions were never intended for consumer use, and on a 40" screen the difference probably wouldn't be noticeable. We are suddenly getting a lot of media attention on 4k displays because, for the first time, they are becoming readily available; previously HD and 2k monitors were used during editing even though the source material and processing were in 4k. I'm sure there will be a few wealthy AV enthusiasts who will buy these very large 4k studio monitors and watch upscaled HD, but it's going to be very niche.

Agreed, but niche for how long?
 
Agreed, but niche for how long?

The question you have to ask yourself is: what benefits does it give you? Particularly in the size range of displays that you'll find in the average home. Remember, we're not talking about PC CAD work where you're close to the screen; these are games and movies. Who's going to buy one and pay all that money for no real gain? Why would the movie studios introduce another movie format, either on disc or one soaking up vast amounts of space on their servers and requiring a vast increase in broadband speeds, all for no appreciable gain? It's just throwing money at a problem that doesn't exist.
 
First 2k and then 4k were requested by effects and editing houses to cut down on artefacts when editing movies; these resolutions were never intended for consumer use.

VFX houses generally don't like 4K, as it roughly multiplies rendering times by four just on its own (and that's before upping the quality of assets such as CG models and live stock footage).
They're already spending a lot on special equipment to handle the necessary data storage and computing power - Weta has something like 10,000 CPUs, extra cooling equipment and petabytes of online disk capacity. Moving to 4K would require enormous investments from these studios just to keep their current production capacity, yet they're already expected to up the quality year after year; not to mention that 3D already requires double the work for the two eyes, and now Cameron's talking about doubling the framerate too.

So no, CGI studios actually want to avoid 4K, they're having enough trouble already...
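To put some rough numbers on how those requirements stack, here's a back-of-the-envelope sketch in Python. It's pure pixel and frame counting under the assumption that render cost scales linearly with both, which real renderers only approximate:

```python
# Rough multipliers on render cost relative to a 2K, single-eye, 24 fps baseline.
# Pure pixel/frame counting; assumes cost scales linearly, which is optimistic.
baseline = 1.0
four_k   = baseline * 4   # 4K has ~4x the pixels of 2K
stereo   = four_k * 2     # stereo 3D means a separate render per eye
hfr      = stereo * 2     # 48 fps doubles the frame count again

print(four_k, stereo, hfr)  # 4.0 8.0 16.0 times the baseline work
```

Even before any increase in asset quality, that's an order-of-magnitude jump in raw rendering work.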
 
VFX houses generally don't like 4K, as it roughly multiplies rendering times by four just on its own (and that's before upping the quality of assets such as CG models and live stock footage).
They're already spending a lot on special equipment to handle the necessary data storage and computing power - Weta has something like 10,000 CPUs, extra cooling equipment and petabytes of online disk capacity. Moving to 4K would require enormous investments from these studios just to keep their current production capacity, yet they're already expected to up the quality year after year; not to mention that 3D already requires double the work for the two eyes, and now Cameron's talking about doubling the framerate too.

So no, CGI studios actually want to avoid 4K, they're having enough trouble already...

For CGI I agree with all you've said. It's the other areas of effects work and editing where 4k becomes useful. I work for a company that designs editing systems, and it has been a common customer request for some time now. I work on the hardware side, and it's been a few years since I attended a lecture on the pros of 4k, so I wouldn't like to quote anything half remembered. Certainly a lot of the demo material and rushes of AAA blockbuster movies we're given are 4k.
 
Realtime manipulation of 2K uncompressed data has been available for more than a decade now, so I suppose it's possible to create an editing suite that has four times the bandwidth to disk.
But unless they shoot with IMAX, the actual image sequences are probably still not as detailed IMHO. Even Avatar was simply upscaled for IMAX projections, very few shots were rendered at ~3K resolution.
 
Realtime manipulation of 2K uncompressed data has been available for more than a decade now, so I suppose it's possible to create an editing suite that has four times the bandwidth to disk.
But unless they shoot with IMAX, the actual image sequences are probably still not as detailed IMHO. Even Avatar was simply upscaled for IMAX projections, very few shots were rendered at ~3K resolution.

The Sony Cell BE and RSX are being used in a commercial product offered by Sony to edit 4K. The quote from their information page reads: "Now that commercials and Films are going to 4K..."
 
Supersampling happens at render time AFAIK, sampling at subpixel resolution. You don't render the whole frame at 40,000 x 30,000 and then downsize it; you render each pixel by taking lots of samples within the footprint of that pixel, accumulating and averaging the result.
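A minimal sketch of that idea in Python, assuming a hypothetical shade(x, y) function that stands in for the renderer's actual scene evaluation (real renderers use smarter sample patterns than plain jitter):

```python
import random

def shade(x, y):
    """Hypothetical scene evaluation at continuous image coordinates.
    Returns an (r, g, b) tuple; a placeholder for a real renderer's sampler."""
    # Simple gradient so the sketch runs on its own.
    return (x % 1.0, y % 1.0, 0.5)

def render_pixel(px, py, samples=16):
    """Average many jittered samples taken inside the footprint of one pixel,
    rather than rendering a huge frame and downsizing it afterwards."""
    r = g = b = 0.0
    for _ in range(samples):
        sx = px + random.random()  # jittered position within the pixel
        sy = py + random.random()
        cr, cg, cb = shade(sx, sy)
        r, g, b = r + cr, g + cg, b + cb
    return (r / samples, g / samples, b / samples)

# Render a tiny 4x4 image at final resolution, supersampled per pixel.
image = [[render_pixel(x, y) for x in range(4)] for y in range(4)]
```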
 
Honestly... 4K uncompressed data is a problem when you try to load it into RAM.

A single 4000x2000 frame at 32 bits is ~30MB, and you need 24 per second for realtime playback, which is ~720MB/s. You need a very expensive disk array to get that kind of speed, but even a cheap Intel-based system should be perfectly able to play it back once you have the data in memory.

Compositing is where it gets even more complicated, because you absolutely need floats for the math, and something like the EXR file format to store all the various layers, masks and other channels. And of course your thousands of render nodes need to write all that out too, so you need a pretty hardcore network infrastructure to support it. That's already a burden at 2K for movie studios, and 4K is a nightmare with its 4x increase in all image data.
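The arithmetic behind those figures, as a small Python sketch; the 32-bit and 24 fps numbers come from the post above, while the float-compositing comparison at the end is an illustrative assumption:

```python
def frame_bytes(width, height, bytes_per_pixel):
    """Uncompressed size of a single frame in bytes."""
    return width * height * bytes_per_pixel

# 4000x2000 frame at 32 bits (4 bytes) per pixel, as quoted above.
frame = frame_bytes(4000, 2000, 4)
print(frame / 2**20)        # ~30.5 MB per frame
print(frame * 24 / 2**20)   # ~730 MB/s for 24 fps realtime playback

# Float compositing: three 32-bit float channels per pixel, 2K vs 4K.
for w, h in [(2048, 1080), (4096, 2160)]:
    per_sec = frame_bytes(w, h, 3 * 4) * 24
    print(w, h, round(per_sec / 2**20), "MB/s")   # 4K needs ~4x the 2K rate
```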
 
Honestly... 4K uncompressed data is a problem when you try to load it into RAM.

A single 4000x2000 frame at 32 bits is ~30MB, and you need 24 per second for realtime playback, which is ~720MB/s. You need a very expensive disk array to get that kind of speed, but even a cheap Intel-based system should be perfectly able to play it back once you have the data in memory.
Well, it's not cheap but it's not super (enterprise-level) expensive either.
http://www.newegg.com/Product/Product.aspx?Item=N82E16820227517&cm_re=z-drive-_-20-227-517-_-Product
Wouldn't this be perfect for 4k editing?
 
Realtime manipulation of 2K uncompressed data has been available for more than a decade now, so I suppose it's possible to create an editing suite that has four times the bandwidth to disk.
But unless they shoot with IMAX, the actual image sequences are probably still not as detailed IMHO. Even Avatar was simply upscaled for IMAX projections, very few shots were rendered at ~3K resolution.

Yes, the real-time editing of uncompressed 4k isn't an issue nowadays. These systems are resolution independent, so you can be working with a 2k or lower CGI layer over 4k live background footage without any up- or down-ressing, do your grading and DI business, and then spit it out at the end at whatever resolution you want to store and project. It's the layering together where the extra resolution helps, and 4k cameras are becoming cheap enough for this to be the norm now. I agree with you that films aren't usually projected at 4k.
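As a very rough illustration of that resolution-independent layering, here's a NumPy sketch that composites a lower-resolution CG element over a higher-resolution plate; the nearest-neighbour scaling and simple 'over' operation are simplifications for brevity, not how a real DI/grading system is implemented:

```python
import numpy as np

def scale_to(layer, height, width):
    """Nearest-neighbour resample to the target size (real pipelines would
    use properly filtered resampling; this is just for illustration)."""
    ys = np.arange(height) * layer.shape[0] // height
    xs = np.arange(width) * layer.shape[1] // width
    return layer[ys][:, xs]

def comp_over(fg_rgb, fg_alpha, bg_rgb):
    """Standard 'over' composite: foreground onto background using the fg alpha."""
    h, w = bg_rgb.shape[:2]
    fg = scale_to(fg_rgb, h, w)
    a = scale_to(fg_alpha, h, w)[..., None]
    return fg * a + bg_rgb * (1.0 - a)

# 2K CG element (with alpha) composited over a 4K live-action background plate.
cg_rgb   = np.random.rand(1080, 2048, 3).astype(np.float32)
cg_alpha = np.random.rand(1080, 2048).astype(np.float32)
plate    = np.random.rand(2160, 4096, 3).astype(np.float32)
final_4k = comp_over(cg_rgb, cg_alpha, plate)   # output stays at the plate's 4K size
```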

Apologies. This has nothing to do with the thread topic.
 
"Screen Generation" Sorry, I'm using terms that apply to the CE industry. We even have "engines" in LCD and DLP TVs. If you will do a Google search using the terms screen generation and rendering you will find many references to "generator" as in computer generation of video. You may be more familiar with "render". As Ray tracing and OpenGL becomes more popular to reduce the cost of producing content rather than rasterization, my understanding of how the terms are used in my industry would have generation of the screen rather than render as more accurate. I have read and understand that the OpenGL process descriptions do use the term render.

The same could have been said for 1080P: low demand and expensive for the first three years. The rate of acceptance for 4K is debatable because we have only just recently reached the point where 1080P is in just about every TV over 32 inches.

Have you done any research into 4K TVs? Most of the 4K LCD TVs offer polarized 1080P 3-D along with 4K, rather than shutter glasses; the idea is to make 3-D more attractive. 4K is easily and cheaply possible on the player end, as everything is already in place. If you have the hardware for 3-D, you just about have the hardware for 4K. The HDMI 1.4 specs reflect this.

The display end is another matter, and there can't be any cheating like there was early on with 720P TVs accepting 1080P input but displaying at 720P. For 1080P 3-D to use polarized glasses, it's necessary to have the full 4K display.
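A quick sanity check of that claim, assuming row-interleaved passive (polarized) 3D, which is how these LCD panels typically split the image between the eyes:

```python
def rows_per_eye(panel_rows):
    """Row-interleaved polarized 3D gives each eye every other line of the panel."""
    return panel_rows // 2

print(rows_per_eye(1080))  # 540 lines per eye on a 1080P panel: only half-resolution 3D
print(rows_per_eye(2160))  # 1080 lines per eye on a 4K panel: full 1080P per eye
```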

For projectors and rear-projection DLP, it's possible to have a much less expensive 4K TV, only slightly more expensive than a 1080P one.

And yes, "So there's absolutely no incentive on anyone's side to pursue another large jump in resolution." that is probably true for developers as well as Sony and MS and is possibly why the next generation Game machines have been shelved, it's mentioned in the article cited by Shifty that started this thread.

4K TVs will be released and will become a standard within the 10-year life of a PS4, just as 1080P did with the PS3. Consumers will expect support for their new TV during those 10 years, and any next-generation console will have to support it eventually.

Thanks for the explanation. It's almost embarrassing that someone attacked your credibility and you had to go to this length to defend your integrity...and they ignore your response. Pathetic, really.
 
Supersampling happens at render time AFAIK, sampling at subpixel resolution. You don't render the whole frame at 40,000 x 30,000 and then downsize it; you render each pixel by taking lots of samples within the footprint of that pixel, accumulating and averaging the result.

Exactly. Depending on the renderer there are many different implementations of the actual sampling, and you also need to sample a lot of things (GI, motion blur, glossy reflections etc.), but the principle is just that.

However, a typical rendering node outputs a LOT of data. There's a beauty pass, which is the raw render, but every big studio also saves out individual passes: color, specular, normal, reflection, one for each direct light, one for indirect light, subsurface, hair, various masks to aid in compositing (post processing) and so on.
And the entire scene is broken down into many layers, like ground, sky, individual characters, foreground, background and so on.

These are all full 2K-sized float images, but nowadays most studios put everything into a single EXR file per frame for each layer. It's a huge amount of data to move around, and everything is re-rendered many times because the CGI process is a highly iterative one.
I've just checked and we have 18MB EXRs, but I'm not sure if it's 720p or 1080p; we're in the middle of moving from one to the other.
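A toy estimate of how that pass and layer breakdown multiplies the data a render node writes out. The pass list, channel counts and half-float assumption are illustrative only, and these are uncompressed sizes (real EXRs are compressed, which is why the 18MB figure above is much smaller):

```python
# Illustrative pass and layer lists; real shows differ per shot and per studio.
passes = ["beauty", "diffuse", "specular", "normal", "reflection",
          "indirect", "subsurface", "hair", "matte_a", "matte_b"]
layers = ["background", "ground", "sky", "hero_character", "foreground"]

def layer_frame_bytes(width, height, channels_per_pass=3, bytes_per_channel=2):
    """Uncompressed size of one multi-pass EXR frame for a single layer.
    Half-float (2 bytes per channel) is a common choice; full float doubles it."""
    return width * height * channels_per_pass * len(passes) * bytes_per_channel

per_layer_2k = layer_frame_bytes(2048, 1080)
per_frame_2k = per_layer_2k * len(layers)
print(per_layer_2k / 2**20)        # ~127 MB per layer per frame at 2K
print(per_frame_2k / 2**20)        # ~630 MB for all layers of one frame
print(per_frame_2k * 4 / 2**20)    # the same frame at 4K: roughly 4x as much
```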
 
Thanks for the explanation. It's almost embarrassing that someone attacked your credibility and you had to go to this length to defend your integrity...and they ignore your response. Pathetic, really.

Hey, I don't know everything and appreciate correction/explanations if I goof. Criticism is only worthless when it doesn't point out where I went wrong.

Discussions on this board between and from professionals are a fantastic learning experience for me.
 
I don't have a strong desire for 4K res. I heard some studio execs like it because they can control distribution to the digital (4K) cinemas better, among other tech advantages. Essentially, they can release and play a movie remotely/centrally without it going through third parties' hands.
 
Thanks for the explanation. It's almost embarrassing that someone attacked your credibility and you had to go to this length to defend your integrity...and they ignore your response. Pathetic, really.

What he described is basically the TV's internal electronics upscaling content to 4K res. That's not gonna be enough to sell a new, expensive TV to anyone. The rest doesn't make sense.

And we all get that you don't like me.
 
The NGP GPU only updates what has changed on screen, which results in power savings and adds to performance.
The NGP GPU doesn't work that way. In fact, no GPU that I know of works that way.

Imagine what a PowerVR could do if it was designed without power limitations and ran at 6 GHz (14nm) instead of 1+.
Worse than what I could do if I had $20B to spare. :cool:
 