Parhelia experience

But this is REALLY GREAT in Surround Gaming - it just gets slow as hell if 3-4 people meet for a big rumble... ;)

http://hrc.webzeppelin.hu/-=Utils=-/ut2k3sg.jpg
 
Surround gaming - it seems like the extra two monitors are mostly for playing games, but I'm sure they could be of use for other things. I am more interested in playing games life-size on a big HDTV set at the highest possible resolution with a high-quality 5.1 sound system. To me that would be more realistic overall. A 65" widescreen HDTV would make a 3-monitor setup look kind of puny. Anyway, I haven't seen too many people go there yet. I wonder if the Radeon 9700 can use the component output adapter that was designed for the Radeon 8500?
 
A 16:9 widescreen cannot offer the same peripheral vision as three displays.

Additionally, one of the big problems with having the display view so incredibly wide is distortion. Unless you use different transform matrices for each monitor, things can get quite distorted.

The reason is simple, really. The 3D transform matrices are designed to display on a flat surface. The optimal configuration for a 3-display system would be to have the outer displays "wrap around" the viewer. Since the display is no longer flat, a single transform matrix just no longer works.
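To make that concrete, here's a minimal sketch (my own illustration in Python/numpy, not anything a driver actually does) of per-monitor matrices: each display gets the same perspective projection, and the side views are simply the centre view rotated by one display's worth of field of view, so the three frusta tile around the viewer instead of one ultra-wide frustum being stretched across all three screens. The FOV, aspect ratio, and clip planes are assumed values.

import numpy as np

def perspective(h_fov_deg, aspect, near, far):
    # Standard perspective projection, parameterized by horizontal FOV.
    fx = 1.0 / np.tan(np.radians(h_fov_deg) / 2.0)
    fy = fx * aspect  # aspect = width / height
    return np.array([
        [fx, 0.0, 0.0, 0.0],
        [0.0, fy, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ])

def yaw(deg):
    # Rotation about the vertical axis, used to angle the side views inward.
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([
        [c, 0.0, s, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [-s, 0.0, c, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])

h_fov = 45.0  # assumed horizontal FOV of each display
proj = perspective(h_fov, 4.0 / 3.0, 0.1, 1000.0)

# One view-projection per monitor: the side frusta are the centre frustum
# rotated by +/- one monitor's FOV, so together they wrap around the viewer.
per_monitor = {
    "left": proj @ yaw(+h_fov),
    "center": proj,
    "right": proj @ yaw(-h_fov),
}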

Now, think about why a large flat display isn't a good idea.

Any display could be thought of as being meant to simulate what the eye sees. As a consequence, an optimal display surface would actually be spherical (note that I'm talking about the display surface, not alternate display technologies such as holographic ones...). A single monitor can be seen as an approximation of a piece of this optimal spherical display, and multiple monitors (in a 3D game, of course) as further approximations of the same sphere.

In sum, what this means is that the larger a display gets, relative to how far it is from the eye, the worse an approximation it becomes. Using three displays, with the outer ones angled inward, allows for a much better approximation of the "optimal sphere."
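A quick way to put numbers on that (my own assumed figures, just for illustration): the angle a flat screen subtends is 2·atan(width / (2·distance)), which grows more slowly than the screen does, while angled side monitors each add their full field of view.

import math

def screen_fov_deg(width, distance):
    # Horizontal angle subtended by a flat screen viewed head-on.
    return math.degrees(2.0 * math.atan(width / (2.0 * distance)))

w, d = 0.36, 0.60  # assumed: a ~36 cm wide screen viewed from 60 cm

single = screen_fov_deg(w, d)         # one monitor: ~33 degrees
flat_wide = screen_fov_deg(3 * w, d)  # one flat screen 3x as wide: ~84 degrees
angled = 3 * single                   # three monitors angled at the viewer: ~100 degrees

print(single, flat_wide, angled)
# The flat triple-width screen gains less angle than it gains width, and its
# outer thirds are viewed obliquely; angling the outer monitors keeps every
# screen roughly perpendicular to the line of sight.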
 
Chalnoth said:
The reason is simple, really. The 3D transform matrices are designed to display on a flat surface. The optimal configuration for a 3-display system would be to have the outer displays "wrap around" the viewer. Since the display is no longer flat, a single transform matrix just no longer works.

Exactly...and you can sort of do this with LCDs, but not really with CRTs (yes, I'm sure it is technically possible, but not practical).

I'm interested in more LCD feedback...it seems like many people swear by CRTs and say LCDs suck, but then a lot of people who own LCDs have no problems with them.
 
Typedef Enum said:
What about DVD quality? I didn't have a chance to check it out, but I think I already know the answer...

DVD playback is my major use of these high-end cards, gaming is second.

I thought the 8500 was already very good for DVD. The 9700 PRO is a bit better in my own comparison (I also have the 8500): the colors of the 9700 PRO are smoother in my experience, while the vibrancy is about the same.

A friend of mine upgraded his Radeon 8500 to a Parhelia-512 a couple of weeks ago (well, under my influence, since he said he wants quality, ha ha). He told me that the difference is significant enough to be spotted immediately, even just on the desktop. DVD is the same case, although the CPU utilization is higher since there is no iDCT on the Parhelia.

After reading all this, it looks like I should borrow his Parhelia and see if the DVD playback is really that much better.
 
DVD playback shouldn't even be an issue to consider. DVDs look the same on all these cards... The DVD playback acceleration of the Radeon line is nice, but with 1 GHz (or even 500 MHz+) computers it's just unnecessary. The only reason a DVD would look better or worse on any of these cards is the 2D image quality.
 
Those can be done in software as well. I like hardware acceleration, but every enhancement that has been done in hardware for DVD can be done with adequate speed in software. It's unlike software rasterization, where even a 1 GHz CPU can't render a decent DX8 title at adequate framerates. Software DVD decoding on the desktop can be done easily on modern CPUs.
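As a rough sanity check on that (assumed, round numbers rather than measurements): DVD-resolution MPEG-2 just doesn't involve that many macroblocks per second.

# Back-of-the-envelope only, with assumed round numbers:
# how much work is software MPEG-2 decode at DVD resolution?
width, height, fps = 720, 480, 30                         # NTSC DVD video
macroblocks_per_frame = (width // 16) * (height // 16)    # 45 * 30 = 1350
macroblocks_per_second = macroblocks_per_frame * fps      # 40,500
cycles_per_macroblock = 3000  # generous guess for IDCT + motion compensation

cycles_per_second = macroblocks_per_second * cycles_per_macroblock
print(f"~{cycles_per_second / 1e6:.0f} million cycles/s")  # ~122 M cycles/s
# Even with that pessimistic per-block cost, a 500 MHz CPU has plenty of
# headroom left over for the rest of the player.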

Now for mobile devices, HW acceleration is a different story.
 
I was thinking that as I typed, and wondering which current DVD players, if any, offer enhanced de-blocking. There seems to be no reason they couldn't, but the issue was a comparison between the things listed, not what was theoretically possible.

But it's not as if "accelerated" decoding isn't still, at heart, software decoding in any case, at least for 3D cards, so the analogy is a bit confusing (at least to me) anyway.
 
Nagorak said:
DVD playback shouldn't even be an issue to consider. DVDs look the same on all these cards... The DVD playback acceleration of the Radeon line is nice, but with 1 GHz (or even 500 MHz+) computers it's just unnecessary. The only reason a DVD would look better or worse on any of these cards is the 2D image quality.

I have tried them all (GF3, GF4 Ti4600, R8500LE, R9700 PRO); I have all the top-end cards available (except the Matrox Parhelia).

I have the popular DVD playback software (WinDVD 4, PowerDVD XP 4.0), and have spent many hours watching many DVDs on my PC (well, it is my room entertainment system) with them.

The difference is there: ATI is definitely better than the GF3/4 when hardware acceleration is on. I have mentioned before that the GF3/4's motion compensation seems broken in my experience; software playback is better than hardware on the GF3/4. Without hardware acceleration, it comes down to the 2D and scaling quality of the display cards. But I can tell you that when hardware acceleration is on, the difference between the R8500/9700 PRO and the GF3/4 is bigger than without.

I use 2 displays, a Compaq P110 21" CRT monitor and a Hitachi 15" LCD panel.
 
demalion said:
I was thinking that as I typed, and wondering which current DVD players, if any, offer enhanced de-blocking. There seems to be no reason they couldn't, but the issue was a comparison between the things listed, not what was theoretically possible.

But it's not as if "accelerated" decoding isn't still, at heart, software decoding in any case, at least for 3D cards, so the analogy is a bit confusing (at least to me) anyway.

Because DVDs aren't significantly blocky to begin with. Blocky videos occur with higher compression rates generally found on stuff compressed to DivX or low quality MPEG-2 to go onto a CD. There's also really low quality downloaded video (like game demo movies, etc).

A DVD holds 8 or 10 times as much data as a CD. Deblocking a DVD would just be a colossal waste of time to program and a waste of processor resources to use (those cycles might as well just go to waste). You'd end up with a picture that was blurrier than the original, with basically no real improvement in image quality. The quality of most DVDs is limited more by the film than anything else (grainy image quality can be seen especially on older films). Deblocking won't do jack to improve that.
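For what it's worth, the bit-budget gap is easy to put rough numbers on (typical media capacities and a 2-hour runtime assumed here):

# Rough arithmetic behind the bit-budget difference:
def average_mbps(capacity_gb, runtime_hours):
    bits = capacity_gb * 8e9                  # decimal GB for simplicity
    return bits / (runtime_hours * 3600.0) / 1e6

for name, gb in [("CD-R (700 MB)", 0.7),
                 ("DVD-5 (4.7 GB)", 4.7),
                 ("DVD-9 (8.5 GB)", 8.5)]:
    print(f"{name:15s} ~{average_mbps(gb, 2.0):.1f} Mbit/s")
# Roughly 0.8 Mbit/s for a movie squeezed onto a CD versus 5-9 Mbit/s on DVD,
# which is why block artifacts are mostly a problem for the former.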
 
For me, it's more a factor of desk space. Sure, I might be able to fit three 14" LCD displays on this desk, but then I'd have a hard time placing my speakers. Regardless, one of these days I'm probably going to get a DVI->VGA adapter and purchase a second monitor.

As for the "personal cave" idea, interesting, but I don't particularly like thinking about having to mount a projector, or the idea of having to deal with distortion (since the projector just simply could not point directly at the screens...the person at the computer would be in the way...). Besides, it detracts from one of the huge benefits of having multiple displays: larger desktop.

To date, the best "peripheral vision" display I've ever seen was at the Pax River military base (it used to be just the Navy, but now I think all the armed forces are there...). One of the F-18 simulators consisted of an F-18 cockpit placed in the center of a huge dome, which had a projected display in all directions (except directly behind, where the door was...). Now that's what I would call a large degree of peripheral vision...
 
maskrider said:
I have tried them all (GF3, GF4 Ti4600, R8500LE, R9700 PRO); I have all the top-end cards available (except the Matrox Parhelia).

I have the popular DVD playback software (WinDVD 4, PowerDVD XP 4.0), and have spent many hours watching many DVDs on my PC (well, it is my room entertainment system) with them.

The difference is there: ATI is definitely better than the GF3/4 when hardware acceleration is on. I have mentioned before that the GF3/4's motion compensation seems broken in my experience; software playback is better than hardware on the GF3/4. Without hardware acceleration, it comes down to the 2D and scaling quality of the display cards. But I can tell you that when hardware acceleration is on, the difference between the R8500/9700 PRO and the GF3/4 is bigger than without.

I use 2 displays, a Compaq P110 21" CRT monitor and a Hitachi 15" LCD panel.

Well, if you've tried them all then your experience is probably better than mine. I just realized that all my cards (AIW Radeon, R8500 x2) support hardware iDCT, so I guess I was wrong to post my limited observations. The reason was that I thought WinDVD didn't support hardware motion compensation (and it certainly didn't look worse than the ATi DVD player, which definitely does), but I see now that I was wrong about that. So, just ignore my original comments.
 
Nagorak said:
Well, if you've tried them all then your experience is probably better than mine. I just realized that all my cards (AIW Radeon, R8500 x2) support hardware iDCT, so I guess I was wrong to post my limited observations. The reason was that I thought WinDVD didn't support hardware motion compensation (and it certainly didn't look worse than the ATi DVD player, which definitely does), but I see now that I was wrong about that. So, just ignore my original comments.

I can only upgrade the things in my room, not the living room; that's why my focus is on my PC rather than the real Home Theatre in the living room. I am banned by my wife from replacing/changing anything there until something breaks (or unless plasma TVs get cheap enough by her standards) :-?.

BTW, to support hardware acceleration, WinDVD 3 has to use DirectShow, enabled through DVD Genie or through the registry; motion compensation can be selected with DVD Genie. WinDVD 4 has built-in support in its options, like PowerDVD XP 4.0.

I have probably talked too much about non-3D and non-Parhelia related stuff; I'd better rest my comments until I get the Parhelia from my friend.
 
Chalnoth:
The distortion compensation is a "solve once and reuse" problem: it can cancel the distortion completely, and it won't cost you any performance (except the obvious part that you need to draw the scene once for each projector screen area, plus once for the monitor). If you let the user change the projector and screen geometry (different angles and sizes, and different projector placement), you can still make an easy-to-use calibration.

The projector is supposed to be placed so that the user's head would cast a shadow on the monitor. But since the projector should "display" a black area over the monitor, that wouldn't matter.
There would of course be a problem with having someone sitting next to you.
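Here's a minimal sketch of the "solve once and reuse" idea (my own illustration, assuming the distortion over each flat screen area is well approximated by a planar projective warp; the calibration coordinates below are made up): fit a 3x3 homography from a few measured points during calibration, then pre-warp every frame with its inverse so the projected result lands where it should.

import numpy as np

def fit_homography(src, dst):
    # Solve once: 3x3 projective warp taking each source point to the
    # position where it was actually observed on the screen (standard DLT).
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def apply_warp(h, x, y):
    # Reuse every frame: map a point through the precomputed warp.
    p = h @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Calibration step (coordinates are made up for the example): where the four
# corners of the rendered image actually land on the angled screen.
rendered = [(0, 0), (640, 0), (640, 480), (0, 480)]
observed = [(12, 8), (630, 22), (615, 470), (5, 455)]

# Pre-warping by the inverse cancels the projector/screen distortion, and the
# matrix never has to be recomputed unless the geometry changes.
prewarp = np.linalg.inv(fit_homography(rendered, observed))
print(apply_warp(prewarp, 320, 240))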

If you want a multi-monitor 2D desktop, this is of course not the way. But hey, we're talking about extreme gaming here. :D
 