Rift, Vive, and Virtual Reality

Rift for me too is regaining a bit of an edge. HTC Vive I'd want to wait until the next-gen version, perhaps. Too expensive for me right now too. If games start supporting a PS4 controller for the Rift, it'll be hard for me to resist one when I get a chance to buy one. I also like the Rift's motion controllers better, and the headset seems more comfortable.


Sent from my iPhone using Tapatalk

That just reminded me of one of the bigger differences between the two headsets. Due to the way head tracking is done, the Vive has to have more hardware in the headset itself. And that's also why it needs more power than USB can currently provide. All of which means more weight and more bulk. Tested mentioned they were both comfortable to wear for long periods, just that the Rift was more comfortable.

If I had to choose one, that would weigh heavily in my personal decision. Lighthouse also makes the Vive unsuitable for my area. I live near a hospital, so whenever a helicopter goes by it shakes the whole house a bit. I can imagine that doing some really funky things with the view in the headset, as the lighthouses would shake even if wall mounted (the whole building shakes).

On the other hand, if you want "room scale" right now, Vive is the only choice until later this year, when Oculus releases their controllers and starts allowing people to buy more camera stands.

But I really do like how the Rift doesn't require any external power other than what is delivered through the USB connection to the computer. If this continues to be a thing, I'll be interested to see what progress both companies make on 2nd- and 3rd-generation devices.

Regards,
SB
 
VR will be next year for me at the earliest, as I'm not upgrading my gfx card until 14nm has settled down. Today, though, I'd plump for the Rift, as it's a better-fitting headset and Touch are the more ergonomic controllers by the sounds of it.

Next year, though, who knows? We haven't got wind of any other SteamVR headsets yet, but it can't be too long before other manufacturers jump on board. Based on HTC's phones, they could probably do a better & cheaper job than the Vive.

I think I'm likely to go SteamVR in the end, purely based on the fact it uses lazors, and lazors are awesome.
 
If I had to choose one, that would weigh heavily in my personal decision. Lighthouse also makes the Vive unsuitable for my area. I live near a hospital, so whenever a helicopter goes by it shakes the whole house a bit. I can imagine that doing some really funky things with the view in the headset, as the lighthouses would shake even if wall mounted (the whole building shakes).

I've seen it mentioned somewhere that people had issues with putting the Lighthouse emitters near doors or attached to walls with adjacent doors (not because of the door knocking into it, but because of the vibration through the floor/wall from shutting the door). Having stands on soft carpeted floors with nearby foot traffic might also be problematic. Some devs have talked about using tension rods from floor to ceiling for a more secure mount.
 
There's also another minor issue with Lighthouse in that it makes a noise. It's quiet, but having to turn them off if you have a bedroom or studio-flat setup is a minor quality-of-life PITA.
 
The Vive is much heavier, and if you choose to use high-end earphones the difference is staggering.

But the lighthouses are really nice, and you don't need to run cables back to your computer, vs. the Rift sensors, which each need a USB 3.0 port.

The Touch controllers feel much better than the Vive wands, however. They are much lighter and just feel better in your hand. Also, the grip button on the Vive is just weird.
 
I use a DS4 controller with DS4Windows. Very easy to set up and switch between native mode and xbox controller emulation.

Does this take control of the whole Bluetooth driver stack, like SCP does for the DualShock 3?
I'd love to use the DS4 with my Surface, but I don't want to lose its entire Bluetooth functionality to the gamepad.
 
Anybody know what the focal distance of the Rift, Vive, and PSVR is?

My -3 diopter eyes can still see just fine for a few meters...

Does this take control of the whole Bluetooth driver stack, like SCP does for the DualShock 3?
I'd love to use the DS4 with my Surface, but I don't want to lose its entire Bluetooth functionality to the gamepad.

It presents the wireless controller to the system as an Xbox controller.

The Bluetooth still works fine. At least it's like that when using InputMapper.
 
NVIDIA Demos Zero Latency Display Running at 1700Hz

Nvidia is demonstrating a display with a true refresh rate of 1700Hz. It's a prototype called the zero latency display, and it can display imagery stably even when the screen is shaken, which is handy for VR.

According to NVIDIA Vice President of Graphics Research David Luebke, this is possible due to the high refresh rate, which is roughly 20x higher than what current VR goggles are using. At 90Hz an image is displayed every 11ms; at 1700Hz that's roughly every 0.59ms. 90Hz is more than sufficient for a comfortable VR experience, but Luebke says that ever-higher refresh rates could improve the VR experience by further reducing latency.
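Quick sanity check on those frame-time numbers, as a two-line back-of-envelope in Python:

```python
# Frame time in milliseconds is just 1000 / refresh rate.
for hz in (90, 1700):
    print(f"{hz} Hz -> {1000.0 / hz:.2f} ms per frame")
# 90 Hz  -> 11.11 ms per frame
# 1700 Hz -> 0.59 ms per frame
```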

http://www.guru3d.com/news-story/nvidia-demos-zero-latency-display-running-at-1700hz.html
 
So a ~16x refresh rate increase to go along with our needed 16x panel resolution increase (1k x 1k -> 4k x 4k per eye). Heh.
Honestly, at this point I'm not sure that the latency would be that big of a benefit for that sort of brute-force refresh rate. We're not that far from getting to those negligible sub-1ms ranges with prediction. Or getting there with less elegant brute force: using older SDKs (where the HMD was still treated as a monitor), you could disable vsync, and with a very (very) undemanding scene you could pump frames out fast enough to get a stream of swap tearing, producing a crude form of racing the beam. At that point you're effectively bound by the IMU polling rate, and darn close to 1ms latency.

The big things we're short on right now are resolution and HDR (and also "black" on OLED panels), and I wonder if you could leverage that refresh rate in interesting ways for both. If you were to oscillate the panels and adjust your sub-sampling in sync, you could produce an effectively higher resolution, and if you were to vary the persistence strobe duration of each refresh, you could produce an effectively wider brightness range.
 
So a ~16x refresh rate increase to go along with our needed 16x panel resolution increase (1k x 1k -> 4k x 4k per eye). Heh.
We need a 256x increase in power to render current quality graphics in true VR. :eek:
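The 256x is just the two ~16x factors multiplied together (assuming render cost scales linearly with pixel count and frame rate), as a trivial sketch shows:

```python
# Back-of-envelope for the 256x figure: ~16x the pixels per eye
# times ~16x the frame rate.
res_scale = (4000 * 4000) // (1000 * 1000)   # 1k x 1k -> 4k x 4k per eye = 16x
rate_scale = 16                              # 90 Hz -> ~1700 Hz, roughly 16x
print(res_scale * rate_scale)                # 256
```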

This news will please @Globalisateur et al though, as there's the true promise of ditching faked motion blur and just rendering everything at stupid framerates for real moblur.
 
We need a 256x increase in power to render current quality graphics in true VR. :eek:

The images probably don't need to be updated at 1700fps, though. The GPU can produce two spherical (or hemispherical, if certain assumptions are made) images and let the headset produce images at the current viewing angle. That means the latency for head turning will be extremely good while the images are updated at a slower rate.
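A toy sketch of that idea (function name and layout are mine, assuming the GPU hands over an equirectangular panorama): the headset can re-sample the pre-rendered image at the display rate using only the latest yaw, with no re-render.

```python
import numpy as np

def reproject_yaw(equirect, yaw_rad):
    """Shift an equirectangular image horizontally by a yaw angle.
    equirect: H x W array covering 360 degrees horizontally."""
    h, w = equirect.shape[:2]
    shift = int(round(yaw_rad / (2 * np.pi) * w))  # fraction of a full turn, in columns
    return np.roll(equirect, -shift, axis=1)       # pure rotation, no re-render

frame = np.arange(12).reshape(3, 4)       # toy 3x4 "panorama"
print(reproject_yaw(frame, np.pi / 2))    # a quarter turn rolls the image by one column
```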
 
The images probably don't need to be updated at 1700fps, though. The GPU can produce two spherical (or hemispherical, if certain assumptions are made) images and let the headset produce images at the current viewing angle. That means the latency for head turning will be extremely good while the images are updated at a slower rate.

That would only help in a situation where your head is locked in place and only allowed to rotate. If you were to, for example, shake your head back and forth that would ruin the illusion as there would be no inherent parallax due to objects being at different distances from the viewer until the GPU updated the scene.

Combine motion of the head with rotation of the head (like moving your head forward and rotating it like peeking around a corner) and the illusion of depth would be further destroyed.

In your example solution, you'd rotate your view just fine, but you wouldn't get closer to the edge of whatever you're peeking around until the GPU was able to render a new spherical image for the head's new position.

So, you'd have the view rotating at 1700 (or whatever) FPS, but your head's location in the world would only update at 90 (or whatever) FPS. That'd lead to a rather huge disconnect between what your head and eyes are doing and seeing.
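To put rough numbers on that (the 5 cm of head travel is an assumed value): if rotation-only reprojection ignores a sideways head move dx, an object at distance d appears in the wrong direction by about atan(dx / d), which is very noticeable up close.

```python
import math

# Angular parallax error from an un-compensated lateral head move.
dx = 0.05  # metres of head translation between full GPU updates (assumed)
for dist in (0.5, 2.0, 10.0):
    err_deg = math.degrees(math.atan2(dx, dist))
    print(f"object at {dist:4.1f} m -> {err_deg:.2f} degrees of parallax error")
```

Nearby objects get several degrees of error, while distant ones barely move, which is exactly why rotation-only reprojection looks fine for a skybox but not for a desk in front of you.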

Regards,
SB
 
That would only help in a situation where your head is locked in place and only allowed to rotate. If you were to, for example, shake your head back and forth that would ruin the illusion as there would be no inherent parallax due to objects being at different distances from the viewer until the GPU updated the scene.

Combine motion of the head with rotation of the head (like moving your head forward and rotating it like peeking around a corner) and the illusion of depth would be further destroyed.

In your example solution, you'd rotate your view just fine, but you wouldn't get closer to the edge of whatever you're peeking around until the GPU was able to render a new spherical image for the head's new position.

So, you'd have the view rotating at 1700 (or whatever) FPS, but your head's location in the world would only update at 90 (or whatever) FPS. That'd lead to a rather huge disconnect between what your head and eyes are doing and seeing.

I think it's still possible to interpolate between small changes in location, especially with Z information. If the movement is too large to interpolate, then at worst it just falls back to the ordinary 90fps.
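A sketch of how the Z information helps (names and numbers are mine): with a depth value per pixel, a small lateral head move dx shifts each pixel by roughly focal_px * dx / z, so near pixels slide more than far ones, giving crude parallax without a full re-render.

```python
import numpy as np

def positional_shift(depth, dx, focal_px):
    """Per-pixel horizontal shift (in pixels) for a lateral head move dx.
    depth: per-pixel distances in metres; focal_px: focal length in pixels."""
    return focal_px * dx / depth

depth = np.array([0.5, 2.0, 10.0])         # metres, near to far
print(positional_shift(depth, 0.05, 600))  # near pixel shifts 60 px, far one only 3 px
```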
 