4K displays announced at CES 2011

That's for productivity though, on a desktop monitor you sit close to. What's the point of 4k for movies and games except on extra-large screens where the resolution will actually be perceptible?

For movies, wouldn't, say, 4x 1080P panels work better than one 4K panel? I always wondered whether the TVs of the future would come in a few standard sizes but be modular, up to a certain number of modules, six perhaps?
 
They did indeed demo 4k GT5. Perhaps, in combination with the recent patent for networked rendering, PS4 will render 4k with four PS4s in a cluster? :D

You're seeing it too... :LOL:

CES has come and gone, and all major manufacturers were showing 4K TVs, along with several prototype consumer 4K cameras with 2K 3-D.

Sony has this commercial product:

BCU-100 Computing Unit with Cell/B.E. and RSX for editing 4K video.

http://pro.sony.com/bbsccms/ext/ZEGO/files/BCU-100_Whitepaper.pdf

With resolution for movies and commercials now jumping from 2K (2048×1080) to 4K (4096×2160), the time required for post production work has risen significantly, driving a growing demand for more efficient workflows to speed up the process.

To date, post production work has typically been carried out on general purpose workstations or cluster servers. Sony’s efforts to accelerate the process have now resulted in the development and deployment of the new BCU-100 Computing Unit.

The new unit, which incorporates Cell/B.E. and RSX technologies, employs a “heterogeneous multicore processor” architecture that is specifically designed to optimize a particular type of application. Under current plans, the BCU-100 will be available with a software package that includes mental ray® from mental images GmbH and Houdini® Batch from Side Effects Software, Inc., in order to accelerate the efficiency of CG production workflows, with the software implemented so as to take full advantage of the Cell/B.E. and RSX architecture. The BCU-100 incorporates a low-power consumption design that offers the added benefit of reduced running costs when many units are run together in a clustered configuration.

2. Processor Configuration
The BCU-100 consists of a high-speed Cell/B.E. microprocessor (running at 230 GFLOPS), an RSX graphics engine, a southbridge Super Companion Chip (SCC), and various peripherals connected to the SCC (see Fig. 1). High-speed buses enable fast data transfers: the Cell/B.E. communicates with the XDR main memory at 25.6 GB/s, with the SCC at 5 GB/s, and with the RSX at 20 GB/s (outgoing) and 15 GB/s (incoming). Total I/O speed of the RSX is in excess of 4 GB/s.

By bringing the RSX onto the motherboard, the BCU-100 implements a 1U server form-factor unit with its own GPU. (See Fig. 2.) Because the Cell/B.E. and RSX share memory space, the combination of Cell/B.E. and RSX can be considered as a single large virtual heterogeneous processor.

So a PS3 is essentially being used for 4K editing, which means a PS3 can display 4K. HDMI 1.4 can output 4K. And distributed processing is being supported in this product too.
 
Even if the capability comes to PS3, the best it can do is 4K at 24 Hz. And I'm expecting mass-produced OLED TVs before any manufacturer mass produces 4K TVs. :)

And when you consider the majority of households in the US still don't have an HDTV (720p or better), 4k screens are a distant pipedream for home mass adoption and thus not at all practical to target with a console. Although if Sony really wants a repeat of PS3 with a feature making PS4 far too expensive for most people at launch, I'm sure their stockholders will be very happy...NOT. :)

Regards,
SB
 

There won't be any new news on 4K till the end of 2011, when 28nm foundries start producing faster, more economical silicon to allow 4K to be practical for the consumer market. The display of choice will of course be above 50 inches, and the technology used is still to be decided. CES 2012 should see the introduction of more 4K products.

The PS3 has problems with 1080P games in part because of Sony's choice to use only 256 meg of general purpose RAM. Arwin called me on this: even with more RAM some games could still not achieve 1080P, though some games would benefit enough to display at 1080P.

The PS3 is scheduled (according to developers) for 32nm soon. No mention has been made of 28nm, but that might be next, as there has been talk that shrinking to 32nm needs a redesign for pin placement. Since 28nm is so close to 32nm, they might skip 32nm. So in 2012, a $199 PS3 at 32nm or possibly 28nm, which may be passively cooled.

I expect the same as the PS3, where few games reach 1080P: few games on the PS4 will reach 4K, the majority will be 1080P, with 3-D 1080P a standard.

With 28nm silicon available in 2012, next-generation game consoles will be released soon after. By 2014, 4K video cameras and TVs will be an accepted standard.

My opinion
 
http://www.hdmi.org/manufacturer/hdmi_1_4/hdmi_1_4_faq.aspx#21

4K is a term used to describe displays with resolutions that are essentially four times that of a 1080p device – or roughly 4,000 lines wide by 2,000 lines high. The HDMI 1.4 specification supports multiple 4K formats:

3840 pixels wide by 2160 pixels high @ 24Hz | 25Hz | 30Hz
4096 pixels wide by 2160 pixels high @ 24Hz
What kind of cable will I need to use for a 4K display?
A High Speed HDMI Cable (with or without Ethernet).

Are there any 4K displays available today? What about 4K content?
The first 4K displays were showcased at the 2009 Consumer Electronics Show. We expect them to be more widely available by the end of 2009, and we hope to see 4K source devices, such as up-scaling Blu-ray Disc players, introduced in roughly the same time frame.

The PS3 and other (3-D) Blu-ray players already have the frame buffer memory needed to do the above. 24 Hz is the Blu-ray standard frame rate. Double-frame 3-D devices at 24 Hz should also be able to output 4K in 2D. The TVs cited by me support the exact same standards: 1080P 3-D and 4K 2D (interlaced?)
 
The PS3 has problems with 1080P games because of Sony's choice to use only 256 meg of general purpose RAM, not because the silicon is not fast enough.

Where did you get that wisdom? Why, for instance, does a game like Wipeout HD run in 1920x1080 with a trick to dynamically downscale to 1440/1280/960x1080 when it is under heavy load? And what does that have to do with 'general purpose RAM'?
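(For anyone wondering how that trick works, here's a minimal sketch of dynamic horizontal scaling driven by GPU frame time. Only the 1920/1440/1280/960 width steps come from the Wipeout HD example above; the frame-time thresholds and the decision logic are my own invention for illustration, not how the actual game does it.)

```python
# Illustrative only: pick a horizontal render width based on how close the
# previous frame's GPU time came to the 16.6 ms (60 fps) budget.
# Width steps match the Wipeout HD example; the thresholds are invented.

WIDTHS = [1920, 1440, 1280, 960]   # rendered width, always output at 1920x1080
FRAME_BUDGET_MS = 1000.0 / 60.0    # 60 fps target

def choose_width(current_width: int, last_gpu_time_ms: float) -> int:
    """Drop one width step when over budget, climb back when well under it."""
    i = WIDTHS.index(current_width)
    if last_gpu_time_ms > FRAME_BUDGET_MS and i < len(WIDTHS) - 1:
        return WIDTHS[i + 1]               # under heavy load: render narrower
    if last_gpu_time_ms < 0.85 * FRAME_BUDGET_MS and i > 0:
        return WIDTHS[i - 1]               # plenty of headroom: render wider
    return current_width

# Example: a spike to 18 ms drops 1920 -> 1440; calm frames climb back up.
w = 1920
for gpu_ms in (18.0, 17.2, 12.0, 11.5):
    w = choose_width(w, gpu_ms)
    print(gpu_ms, "ms ->", w, "x 1080")
```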
 
RAM is rarely the bottleneck for resolution. Bandwidth is the main memory resource that increases with resolution, along with shader power and pixel fill-rate. PS3's RAM setup isn't the problem with 1080p. It also isn't an issue for 4k: 4k is roughly 24 megabytes a frame, so no worries fitting a double-buffered movie and interface.
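A quick back-of-envelope behind those figures (my arithmetic; a 24-bit movie frame vs a 32-bit render target, single buffer, no AA):

```python
# Back-of-envelope frame sizes (single buffer, no AA); my arithmetic.
RESOLUTIONS = {"720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}

for name, (w, h) in RESOLUTIONS.items():
    for bpp in (3, 4):          # 24-bit video frame vs 32-bit render target
        mb = w * h * bpp / (1024 * 1024)
        print(f"{name}: {w}x{h}, {bpp} B/px -> {mb:.1f} MB")
```

So the ~24 MB figure is a 24-bit 4K frame; even double-buffered at 32 bits per pixel you're well under 70 MB.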
 
Well, if it needs to dynamically downscale then it certainly does have problems with 1080p.

Yes, but not because of the RAM, that was my point.

That's not to say that RAM is never an issue - I remember that Insomniac discussed they were attempting to support 1080p for Resistance 1, but ran into issues with storing the textures because they needed bigger framebuffers for 1080p, something like that.

What I'm trying to say though is that RAM isn't an inherent issue. Bandwidth as Shifty says is by far the more common bottleneck, and the main reason why so many games support 1280x720p rather than 1080p.
 
That's not to say that RAM is never an issue - I remember that Insomniac discussed they were attempting to support 1080p for Resistance 1, but ran into issues with storing the textures because they needed bigger framebuffers for 1080p, something like that.
Even then, texture resolution isn't directly tied to framebuffer resolution. You can still render 1080p with lower quality textures, and it's not like one texel per pixel is the standard even at lower resolutions, so lower texture fidelity than screen resolution should be expected. Any PC game rendering to 4k won't have specialist 4k textures to make the most of it.
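To illustrate the distinction (my numbers; DXT1 at 0.5 bytes per texel is just an example format, not anything specific to the games mentioned): texture memory is set by the texture's own resolution and format, while only the framebuffer scales with output resolution.

```python
# Texture memory depends on texture resolution/format, not output resolution.
# DXT1 block compression (0.5 bytes per texel) is used as an example format.
def dxt1_mb(width: int, height: int) -> float:
    return width * height * 0.5 / (1024 * 1024)

def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / (1024 * 1024)

print("1024x1024 DXT1 texture:", dxt1_mb(1024, 1024), "MB at any output resolution")
print("720p framebuffer: ", round(framebuffer_mb(1280, 720), 1), "MB")
print("1080p framebuffer:", round(framebuffer_mb(1920, 1080), 1), "MB")
```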
 
Heh, looking through that brought me to something else I'd rather have. :D

http://www.force-dynamics.com/index.php

I don't even want to know how much those cost. :p

Regards,
SB

Seriously, who the hell wants to look at 2D, slow-as-hell liquid crystal Dell monitors once you have FFB like that?

That's like the opposite of this, where the CRT monitor is clearly held back by the "UI": http://www.youtube.com/watch?v=vunngNAnF8Y
 
That looks seriously cool :), just like those kind of Apple II typewriters in the movie Brazil, or the old arcade game Starblade.

I would like to try some gaming on that.

I don't even... :LOL:

I think I want a magnifier for my Sony FW900, and I also obtained a Mitsubishi Diamondtron 4:3, to "cross-calibrate" :LOL: Do you have any idea if these magnifiers are accessible and "3D ready"?
 
I had further comments on it, like why people assume that 1920x1080 will still be a valid rendering resolution in 2020, and not 4K

Unless they start including 4k TVs with the purchase of a box of cereal in the near future, it's safe to assume that very few people will be using it. And 1080 will still be supported in 2020, and probably for quite a while beyond, regardless of what new technology is on the market. Things do not move that fast in the CE industry.
 

you probably said the same about 1080p, ;) or 3d. Nowadays almost every new tv sold in europe is a 3D tv
 

If you had said either of those, you would have been right. I'm sure "almost every" new TV sold in Europe is a 3D TV, except that the adoption rate is about 2%.

http://good3dtv.com/3d-tv/3d-tv-adoption-rate-not-looking-bright-for-the-near-future/

When was the first HDTV launched, 1998? IIRC adoption rates in 2008 were about 16%. 1080p alone might get you a little further, 2007 being when the first real 1080p TVs arrived if I remember right, but four years later the majority of TVs out there are still 1080i or 720p. Diminishing returns once again: at the average TV size, how many people can really tell 1080p from 4k? How many really care?
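For what it's worth, here's the back-of-envelope I'd use on the "can you tell" question, assuming the common ~1 arcminute (60 pixels per degree) rule of thumb for 20/20 vision; the acuity figure and the screen sizes are assumptions for illustration, not hard limits:

```python
import math

# Rough limit: the farthest distance at which one pixel still subtends a full
# arcminute (~20/20 acuity rule of thumb). Sit further away and extra pixels
# stop being resolvable.
def max_useful_distance_m(diagonal_in: float, h_pixels: int, v_pixels: int) -> float:
    aspect = h_pixels / v_pixels
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)   # screen width from diagonal
    pixel_in = width_in / h_pixels
    one_arcmin = math.radians(1.0 / 60.0)
    return pixel_in * 0.0254 / math.tan(one_arcmin)             # inches -> metres

for diag in (42, 50, 65):
    d1080 = max_useful_distance_m(diag, 1920, 1080)
    d4k = max_useful_distance_m(diag, 3840, 2160)
    print(f'{diag}": 1080p resolvable within ~{d1080:.1f} m, 4K within ~{d4k:.1f} m')
```

By that rule, on a 50" set you'd have to sit within roughly a metre before 4K resolves detail that 1080p can't, which is much closer than typical living-room viewing distances.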
 

what part of "nowadays" and "sold" don't you understand?
 
Any evidence of that? A quick look at Argos shows that, out of a total of about 200 models for sale, 35 are 3D capable, and Googling the subject finds nothing suggestive of a majority of TV sales being 3D capable.

Secondly, the point is that, regardless of what proportion of current sales are 3D capable, it's the actual adoption rate that matters. E.g. if 100% of TV sales are now 3D, but sales of TVs stick at only 1% of the national install base for years because everyone's happy with the HDTV they bought recently, the actual proportion of 3D users would remain extremely niche for ages.
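To put toy numbers on that (the replacement rates are assumptions picked for illustration; only the ~2% starting share comes from the linked article):

```python
# Toy model: even if 100% of new sets sold are 3D capable, the installed base
# only converts as fast as sets get replaced. Replacement rates are assumptions.
share_3d = 0.02                              # ~2% of homes today, per the article

for replacement_rate in (0.01, 0.10):        # 1% (the hypothetical above) vs 10% per year
    share = share_3d
    for year in range(2011, 2016):
        share += (1 - share) * replacement_rate   # assume every set sold is 3D
    print(f"{replacement_rate:.0%} of homes replacing per year -> ~{share:.0%} 3D homes by 2015")
```

Even in the generous case where every single set sold is 3D and a tenth of homes replace their TV every year, the installed base is still well short of a majority years later.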

Lots of people have recently upgraded their CRT to a slim HDTV, the final adoption of a standard introduced over a decade ago. This improved resolution represents approximately the limit of visual fidelity for a lot of homes given screen sizes and viewing distances. 3D adoption, the latest standard, has seen predictions like 25% of UK homes could have a 3D TV in 2015. Given that:

1) this generation of consoles had trouble hitting 1080p. The first doubling of power from this gen to next will be used up just to hit 1080p, ignoring the power requirements of rendering better visuals (see the quick arithmetic at the end of this post)

2) 4k is a resolution that very few people could actually benefit from, so they have little incentive to buy a 4k screen

3) the storage requirements of 4k data are extreme, while essentially no consumer movie content is delivered at that resolution, so there's not much to be gained with movies

4) people have only recently (in the past ten years) upgraded their TVs to HD sets, with very few also upgrading to 3D; 3D is the current big push, so people buying new 3D sets now are less likely to go and buy yet another new set that offers 4k

5) there's a worldwide recession impacting people's buying habits

...how likely is it, really, that 4k is going to be a target for consoles launching in the next few years?
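On point 1, the raw pixel counts show the scale of each jump (simple arithmetic only, nothing assumed about specific console hardware):

```python
# Pixel counts behind point 1: the common 720p target to 1080p is ~2.25x the
# pixels; 1080p to 4K is another 4x on top of that.
res = {"720p": 1280 * 720, "1080p": 1920 * 1080, "4K": 3840 * 2160}

print(f"1080p / 720p = {res['1080p'] / res['720p']:.2f}x pixels")
print(f"4K / 1080p   = {res['4K'] / res['1080p']:.2f}x pixels")
print(f"4K / 720p    = {res['4K'] / res['720p']:.1f}x pixels")
```

So before 4K even enters the picture, simply moving the typical render target from 720p to 1080p soaks up roughly a 2.25x increase in per-pixel work.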
 
you probably said the same about 1080p, ;) or 3d. Nowadays almost every new tv sold in europe is a 3D tv

1) if i did i was right. 1080p was where 4k is now, in the 1980s.
2) i'd bet that is not even close to true as most value hdtvs are still not 3d. Not that it matters, because 3d doesn't equate to 4k.
 