3D Gaming

Not really console related, but since remaking old films in 3D has been discussed here before, I thought this was interesting. Apparently, after the success of Avatar, they are planning to re-release Star Wars and LOTR in 3D (endless milking).

Bobby Jaffe, the chairman of Legend Films in San Diego, a company which converts films from 2D to 3D, said several Hollywood studios had been in contact to revive popular films into the format.

“We can turn an older film into 3-D in around 16 weeks,” he told The Sunday Times.

“It mostly suits action films, such as Top Gun or The Matrix, but Avatar proved it’s best to use the technology to immerse the audience in the story rather than throw things at them. This is the new, more sophisticated era of 3-D.”

That's pretty fast, in my opinion.

Another possible use of the shutter glasses struck me yesterday: if you sync both lenses of one pair to the same (even) frames and have another pair where both lenses are synced to the odd frames, two players could effectively watch two different screens. Bye bye, split screen. It would be great in co-op, competitive shooters and racing; I hate the squeezed image and the distraction of the other player's screen.
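Just to make the idea concrete, here's a toy sketch of the scheduling. Nothing in it is a real API; the Glasses struct and the 120Hz display are assumptions purely for illustration.

    #include <cstdio>
    #include <string>

    // Toy model of the "two players, one screen" idea: the display runs at
    // 120Hz and shows player A's view on even refreshes and player B's view
    // on odd refreshes. Player A's glasses open BOTH shutters on even frames,
    // player B's on odd frames, so each player sees a full-screen 60Hz image.
    struct Glasses {
        int parity;   // 0 = opens on even refreshes, 1 = on odd ones
        bool open(int refresh) const { return (refresh & 1) == parity; }
    };

    int main() {
        const Glasses playerA{0}, playerB{1};
        for (int refresh = 0; refresh < 8; ++refresh) {
            // The game renders the view of whichever player "owns" this refresh.
            std::string view = (refresh & 1) == 0 ? "player A's camera" : "player B's camera";
            std::printf("refresh %d shows %-17s  A sees it: %-3s  B sees it: %s\n",
                        refresh, view.c_str(),
                        playerA.open(refresh) ? "yes" : "no",
                        playerB.open(refresh) ? "yes" : "no");
        }
    }

Each player would get a full-screen 60Hz view; the costs are the same as stereo: halved brightness, and two views to render per refresh pair (and obviously no 3D for either player).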
 
Nvidia's shutter glasses work great. It only gets a little problematic when games are overly bright (Mirror's Edge is a prime example), as the shutters don't block the light 100% and you get a bit of a ghost image bleeding into the wrong eye.

Same exact issue I had with the e-dimensional glasses (late 90s hardware ;)).
One of the best games was GLQuake: simple graphics, great effect, fun play with the grenade launcher.
Torch lights appeared doubled.

It sucks to hear the problem is still there, but it's inherent to the tech.
The image also gets dimmed when using 3D tech; that's why the stereo drivers offer additional gamma correction.
The dimming/loss of contrast can be distracting. In Quake it's not bad, as the game was dim already :p.
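That gamma lift is basically a power-law curve applied to the output; a minimal sketch of what the boost looks like, with an exponent of 0.8 picked arbitrarily for illustration (not a value any actual driver uses):

    #include <cmath>
    #include <cstdio>

    // Rough illustration of the kind of gamma lift a stereo driver can apply to
    // compensate for the light lost through the shutters. Values are normalised
    // 0..1; the exponent is an arbitrary example, not a real driver setting.
    double gamma_boost(double value, double exponent = 0.8) {
        return std::pow(value, exponent);   // exponent < 1 brightens the mid-tones
    }

    int main() {
        const double samples[] = {0.1, 0.25, 0.5, 0.75, 1.0};
        for (double v : samples)
            std::printf("in %.2f -> out %.2f\n", v, gamma_boost(v));
    }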

The Nvidia goggles look tons better than the old H3D / e-dimensional ones, though. I'd love to try them (and I wonder if ghosting got reduced by using faster LCD shutters).
 
I've never used shutter glasses of any kind before, so I have no idea whether things have improved or not, but I played a bit of TR Underworld at my buddy's house with them and man, was it a stunning experience. Can't wait to try mine. (PC is on the way, whee!)
 
Resolution alone, plus screen size, probably makes it much better now. My experience was at 800x600 with no anti-aliasing (I believe AA now works with it), and the overbright nature of LCDs probably becomes useful :).

Of equal importance is 3D being launched across the markets: cinema, home and gaming. You can expect game support now. I believe the next-gen consoles will support it too, as the only thing required there is an HDMI 1.4 connector.

Thus I believe 3D takes off somewhat this time. There's no real new tech, but there is now industry support. Maybe it will be relevant in 2012 with new consoles and the Olympics.
 
You know about polarised light, right? I'll assume so, though just ask if not. In the cinema, they can project two images with different polarisations. The glasses filter the light, so each eye only sees one image. For polarised light to work in a TV, you need a polarising display, which adds cost. I've read (not extensively!) different things about how much this can add. One solution worked with just a film behind the LCD, I think, but results were said to be poor.

In contrast, shutter tech can work with any display. You just want a faster refresh to eliminate flicker, but I used 25fps-per-eye shutter glasses (50fps total) on a SEGA Master System and, though flickery, it worked well and I became acclimatised.

So you can have 3D on any TV? Why is it said that we need to buy 3D TVs, then? Or does a 3D TV open up 3D to games that can't render two frames at 60fps each?
 
The key is what Arstechnica has already pointed out:

Because every frame that the active shutter displays push is full-resolution, a 3D TV can be used just like a normal TV with a very fast refresh rate. This simple fact is the key to why 3D TV is such a safe bet for the electronics industry: if consumers dislike anything about the 3D experience, they can just opt not to use it. Eventually, when all of the TV panels produced by the panel-makers are 3D-capable due to economies of scale, you'll have as hard a time finding a non-3D-capable display as you do finding a non-HD display today. So you may or may not choose to don some glasses and watch the big game in 3D, but your TV and your broadcaster will support it.
 
The key is what Arstechnica has already pointed out:

I see. So it's just a TV with a high refresh rate. My TV won't be able to display it, as it's only capable of 60Hz... but anything at 120Hz or above will be able to show 3D content in some form.

Interesting. So I guess they'll call all TVs at 120Hz or above "3D Ready".
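The arithmetic behind that: frame-sequential 3D just alternates eyes, so each eye gets half the display refresh. A quick sketch (the "flicker-free" threshold is my own rough rule of thumb, not a spec number):

    #include <cstdio>

    // Frame-sequential 3D halves the refresh per eye: the display alternates
    // left/right images, so each eye sees displayHz / 2 images per second.
    // 50Hz matches the old Master System shutter setup mentioned above.
    int main() {
        const int displayHz[] = {50, 60, 100, 120, 200};
        for (int hz : displayHz) {
            int perEye = hz / 2;
            std::printf("display %3dHz -> %3dHz per eye  %s\n",
                        hz, perEye,
                        perEye >= 60 ? "(flicker-free for most people)"
                                     : "(visible flicker likely)");
        }
    }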
 
The TV must also be able to sync up with the shutter glasses so that the correct image is always shown to the correct eye.

The Nvidia glasses sync with the graphics card/PC, I believe, so you can get 3D on non-3D Ready TVs.
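In other words, the PC side is a frame-sequential loop: render one eye, present on vsync, flip the shutters, repeat. A skeleton of that loop; the function names are placeholders I've made up, not Nvidia's actual API (3D Vision does all of this inside the driver):

    #include <cstdio>

    // Skeleton of a frame-sequential stereo loop. The render/present/emitter
    // calls are stand-ins for whatever the real graphics API and glasses
    // driver provide; only the structure matters here.
    enum class Eye { Left, Right };

    const char* name(Eye eye)   { return eye == Eye::Left ? "left" : "right"; }
    void renderScene(Eye eye)   { std::printf("render %s-eye view\n", name(eye)); }
    void presentOnVBlank()      { std::printf("  present at vblank\n"); }   // swap tied to vsync
    void signalGlasses(Eye eye) { std::printf("  emitter: open %s shutter\n", name(eye)); }

    int main() {
        Eye eye = Eye::Left;
        for (int refresh = 0; refresh < 6; ++refresh) {
            renderScene(eye);       // camera offset by half the eye separation
            presentOnVBlank();      // one eye's image per display refresh
            signalGlasses(eye);     // shutters must flip in lockstep with the swap
            eye = (eye == Eye::Left) ? Eye::Right : Eye::Left;
        }
    }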
 
I see. So it's just a TV with a high refresh rate. My TV won't be able to display it, as it's only capable of 60Hz... but anything at 120Hz or above will be able to show 3D content in some form.

Interesting. So I guess they'll call all TVs at 120Hz or above "3D Ready".

The TV would also need the capability to process the 3D signal from HDMI. This processing is missing from (all?) current TVs. I doubt many TVs would even accept signals with a higher than 60Hz refresh rate (PC monitors excluded) if the player tried to send higher-framerate frame-alternating 3D via HDMI 1.3.
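For what it's worth, HDMI 1.4 doesn't solve this by raising the refresh rate: its frame-packing modes stack the two eye views into one taller frame sent at the normal rate, which is exactly why a 1.3-era TV has no mode to interpret it. A rough sketch of the commonly quoted mandatory 3D formats (figures from memory, so treat them as illustrative):

    #include <cstdio>

    // HDMI 1.4 "frame packing": left and right images are stacked vertically
    // into one oversized frame with a strip of active space between them, sent
    // at the normal refresh rate. Figures below are the usually quoted
    // mandatory 3D formats; treat them as illustrative, not authoritative.
    struct PackedMode {
        const char* name;
        int width, linesPerEye, activeSpace, refreshHz;
    };

    int main() {
        const PackedMode modes[] = {
            {"1080p24 frame packed", 1920, 1080, 45, 24},
            {"720p60 frame packed",  1280,  720, 30, 60},
        };
        for (const PackedMode& m : modes) {
            int totalLines = 2 * m.linesPerEye + m.activeSpace;
            std::printf("%s: %dx%d per eye -> %dx%d on the wire at %dHz\n",
                        m.name, m.width, m.linesPerEye, m.width, totalLines, m.refreshHz);
        }
    }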
 
The TV would also need the capability to process the 3D signal from HDMI. This processing is missing from (all?) current TVs. I doubt many TVs would even accept signals with a higher than 60Hz refresh rate (PC monitors excluded) if the player tried to send higher-framerate frame-alternating 3D via HDMI 1.3.

So it can't be done on any old high-framerate TV with shutter glasses. Lots of conflicting statements going on in this thread.
 
Just curious, how exactly is this done? (Given display latency and all.) Do you have to do that manually with a test pattern or something?

I have no idea; it's something I've often wondered myself. While the glasses will know when a new frame is passed to the TV, they would have no idea when the frame is actually displayed. I would guess there is some sort of configurable delay.
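If there is such a delay, the tuning would presumably be a matter of sliding an offset until ghosting disappears. A purely hypothetical sketch of that idea; the latency value and the "half a frame" tolerance are invented for illustration, not how any real emitter is calibrated:

    #include <cstdio>

    // Hypothetical calibration of shutter timing against display latency: the
    // emitter fires a fixed offset after the frame is handed to the TV, and
    // the user nudges that offset until left/right ghosting goes away.
    int main() {
        const double frameTimeMs = 1000.0 / 120.0;   // 120Hz panel
        const double panelLatencyMs = 33.0;          // unknown to the PC; example value
        for (double offsetMs = 0.0; offsetMs <= 40.0; offsetMs += 4.0) {
            double error = panelLatencyMs - offsetMs;    // mismatch between shutter flip
            if (error < 0) error = -error;               // and the panel actually updating
            std::printf("shutter offset %4.1fms -> timing error %4.1fms %s\n",
                        offsetMs, error,
                        error < frameTimeMs / 2 ? "(little ghosting)" : "(heavy ghosting)");
        }
    }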
 
I thought the glasses used IR to sync themselves with the display. So the glasses get the required refresh rate through an initial sync operation, and I would assume the IR emitted from the TV is constant, to allow anyone who comes in later to use their glasses as well.

I don't particularly see this technology in its current state taking off and being adopted by the masses. Relying on shutter glasses is very restrictive and costly; my feeling is that the majority of people who end up purchasing a 3D display will unfortunately rarely use the 3D tech and will instead use it as a regular TV.

Once TVs offer actual 3D without the need for glasses, or incorporate some form of hologram technology, I would expect 3D to become mainstream; until then, however, it will in my opinion remain a niche market.
 
Once TVs offer actual 3D without the need for glasses, or incorporate some form of hologram technology, I would expect 3D to become mainstream; until then, however, it will in my opinion remain a niche market.

3D TVs might crawl out of the niche sooner than expected. There are sports broadcasts being produced in 3D (such as the 2010 World Cup). Then there's the consumer 3D DSLR camera coming from Sony, and of course PS4/Xbox 720 3D gaming (and the current-gen console and PC 3D games too). There is plenty of incentive to consume part of your media in 3D and part in 2D.
 
I thought the glasses used IR to sync themselves with the display. So the glasses get the required refresh rate through an initial sync operation, and I would assume the IR emitted from the TV is constant, to allow anyone who comes in later to use their glasses as well.

That is how the majority of 3D Ready TVs work, I believe. We were discussing Nvidia's tech, which works on non-3D Ready TVs/monitors that wouldn't have the IR sync modules.
 
Isn't the video card driving the monitor's refreshes on the PC? Our old GF2s had the glasses connected to them with wires, so they were able to sync with the display.
 
Isn't the video card driving the monitor's refreshes on the PC? Our old GF2s had the glasses connected to them with wires, so they were able to sync with the display.

Is the refresh instant, though? I'd expect there to be varying degrees of lag between when the signal for a new frame is received and when it actually appears on screen.
 
Well, the GF2 was back in the CRT days, but communication with the display should be bidirectional anyway, right? Maybe the display can tell the video card how much that delay actually is...
 
Since you need to sync the shutter glasses to the TV, would games have to be vsynced and maintain a stable framerate for this to be effective?

Tearing and dropped frames would really kill the 3D, I'd think.
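Dropped frames are the nastier of the two: because the glasses blindly alternate every refresh, a single repeated frame puts the eyes and images out of step until something resyncs them. A toy illustration (the resync step is omitted; the frame labels are made up):

    #include <cstdio>

    // Toy illustration of why frame-sequential 3D wants vsync and a steady
    // framerate: the glasses alternate every refresh, so if the game misses a
    // refresh and the display repeats the previous image, eye and image stay
    // swapped until something resynchronises them.
    int main() {
        const char* queued[] = {"L0", "R0", "L1", "R1", "L2", "R2", "L3", "R3"};
        int next = 0;
        const char* onScreen = queued[0];
        for (int refresh = 0; refresh < 8; ++refresh) {
            bool missed = (refresh == 3);             // the game misses one refresh
            if (!missed) onScreen = queued[next++];   // otherwise show the next queued image
            const char* eyeOpen = (refresh % 2 == 0) ? "left " : "right";
            std::printf("refresh %d: %s eye open, showing %s%s\n",
                        refresh, eyeOpen, onScreen,
                        missed ? "  <- repeated frame, eyes now swapped" : "");
        }
    }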
 