When do you think Dual DVI will arrive from ATi or Nvidia?

Solomon

Newcomer
I don't know about you guys, but there is no way I would drop $500.00 for a video card when it doesn't even support Dual DVI. It's actually quite comical. ATi and Nvidia incorporate the latest 3D technology, but neither of them wants to jump out of the gate and start using Dual DVI. They just include a DVI-to-VGA adapter, so I'm still waiting for either of these companies to release a consumer card with such capabilities.

Matrox has been doing it for the last 2 years, for crying out loud! The only way to get Dual DVI is to buy a Quadro FX (completely overpriced) or a FireGL from ATi.

Anyone besides me wanting Dual DVI? :D
 
Yes, I want dual DVI, too. But it's mostly up to the card manufacturers to offer this, not the chip manufacturers.
 
Both my monitors use DVI connectors. A DVI-to-VGA adapter shouldn't cost that much to include, and with the proliferation of LCDs now, there isn't a reason not to have dual DVI with an adapter included for CRTs.
 
I've seen you stating this all over the place. I'll say you should keep up the crusade, as you can only hasten its occurrence (LCD popularity is already providing some incentive for it). From what I recall about early Radeons, the DVI-to-VGA adapter was fairly expensive, and at some point I was left with the impression that they cut back on DVI in some way.

I'm pretty sure OLED, or some further LCD improvements, will kill CRTs and remove any concern about adapter cost fairly soon, allowing the burden of adapters to be placed on CRT makers or hold-out consumers... sales figures seem to be pointing in that direction, I think.

Death to Analog!
 
I thought only LCDs and such used DVI? I could be wrong, but to me the extra cost wouldn't outweigh the gain when 75% of people still use CRTs. I still haven't seen an LCD good enough to make me want to buy one, especially for gaming.
 
vrecan said:
I thought only LCDs and such used DVI? I could be wrong, but to me the extra cost wouldn't outweigh the gain when 75% of people still use CRTs. I still haven't seen an LCD good enough to make me want to buy one, especially for gaming.

The thing is, vrecan, video card manufacturers already include the DVI-to-VGA adapter. So why not progress to dual DVI instead of one DVI and one analog? :)

I'm using a Parhelia card now. I was on a Radeon 9700 Pro, but VGA versus DVI on that card was horrible on the LCD monitors (Hitachi CML174SXW B) I'm using. On one output it seemed like built-in FSAA was on all the time in 2D. Heh.

There really is no extra cost. Matrox has been doing this for quite some time. The workstation cards from Nvidia, the Quadro FX 2000 and FX 1000, and ATi's FireGL X1 are all equipped with dual DVI. It's just a shame that they don't come out with a dual DVI gaming card, especially since the top-of-the-line ones are so expensive to begin with.

Think about it: if you are shelling out $500.00 for the top-of-the-line gaming card, you would think those would be dual DVI at least. I guess time will tell. More and more LCD monitors are being released with 16ms panels nowadays. Viewsonic, Solarism, Hitachi (and soon Samsung) all have, or will soon have, 16ms LCD monitors. That means 62.5fps with no smearing.
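That 62.5fps figure is just the reciprocal of the quoted response time; a quick sanity check of the arithmetic:

```python
# A panel that needs 16 ms to complete a pixel transition can show
# at most 1000/16 distinct frames per second without smearing.
response_time_ms = 16
max_fps = 1000 / response_time_ms
print(max_fps)  # 62.5
```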

Bambers said:
I thought there were a few normal gf4s that had dual dvi?

MSI and Asus had versions, but they were not really available. I tried to get the MSI Ti4600 version, but MSI's vendors never even carried it, nor were they able to get it. :? I'm just hoping that the next generation will start to lean toward adopting the dual DVI scenario. Matrox has, and never looked back.
 
Well, if it's true that it's not much of an extra cost, then I can see where you are coming from. I was under the impression that it was somewhat expensive, though: an extra few bucks per card, added up across a million cards or some similarly large number, is a lot of wasted money when most of your audience isn't even going to use it. One thing that doesn't make sense to me is why no manufacturer has decided to offer at least one card with dual DVI since the GF4, which, as you said, was really difficult to find. Unless there really isn't a big enough audience for it.
 
Solomon said:
Bambers said:
I thought there were a few normal gf4s that had dual dvi?

MSI and Asus had versions, but they were not really available. I tried to get the MSI Ti4600 version, but MSI's vendors never even carried it, nor were they able to get it. :? I'm just hoping that the next generation will start to lean toward adopting the dual DVI scenario. Matrox has, and never looked back.

Gainward's Ti4600 offered this and it was available in quantity, although at a premium.
 
Sounds like simple economics to me. Consider most people are still using analog CRTs. Of the fraction using DFPs, most still use analog VGA connectors. Now of the small fraction with DFPs with DVI support, only those with two such monitors would need a dual DVI card. Now out of the tiny fraction of people willing and able to afford two DVI DFPs, you can eliminate the professional market who would buy a workstation card rather than a gaming card. I think it's pretty safe to say that the market you're left with is pretty microscopic (no offense to those 10 of you reading this who are in it ;) )

Of course, you could just sell a dual DVI board with two DVI-to-VGA adapters, but then you're asking 99% of the customers for such a product to pay for something they're never going to use. Until the market penetration of DVI DFPs gets much higher than it is now, it's just not going to make much sense. Unless you're Matrox, and you enjoy losing money :)
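The multiplicative filtering in that argument can be made concrete. The shares below are purely illustrative numbers I've made up for the sketch, not real market data:

```python
# Each filter multiplies away most of the remaining market.
# All of these shares are invented for illustration only.
buyers          = 1_000_000  # hypothetical high-end card buyers
dfp_share       = 0.25       # own a flat panel at all
dvi_share       = 0.40       # of those, the panel has a DVI input
dual_monitor    = 0.10       # of those, run two such panels
not_workstation = 0.50       # of those, want a gaming card, not a Quadro/FireGL

market = buyers * dfp_share * dvi_share * dual_monitor * not_workstation
print(int(market))  # 5000 buyers left under these assumed shares
```

Even with generous guesses at each stage, four modest fractions in a row leave a tiny sliver of the original market, which is the point being made above.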
 
Well, since DVI-I supports both digital and analog, and DVI-D supports digital alone, I don't see why they don't make all screens with a DVI-I connector and a DVI-I-to-VGA adapter...

That would be smart for once.
 
Not sure about that 'small fraction'.

While the installed base of LCDs is pretty small, as a percentage of new sales it's up around the 30% mark IIRC, and expected to overtake CRT this year or next year. I'm not sure what the balance in the gaming sector is - I would guess somewhat more towards CRT because of the response time issues - but all the high-end systems sold in the UK now come with an LCD by default; CRT is just an option.

As someone who's tried to work with 2 monitors on his desk, it really just isn't a good idea with CRT unless you have a huge amount of space. With LCD it's fine.
 
Re: When do you think Dual DVI will arrive from ATi or Nvidia

Solomon said:
Anyone besides me wanting Dual DVI? :D
I know I do.

Can't wait to throw that Gainward GF4 Ti4600 out of my system! I know I would've upgraded to an R3xx over the last 6 months if they'd offered the option, but no ... not a single company seems to want my money. (yeah, it's only €450-500, after all :rolleyes: )

When I saw the PowerColor dual-DVI R300 (at CeBIT?), it got my hopes up that we'd at least get a limited edition of some sort, but no luck. At least it showed that it isn't the chip that's the problem, but rather the manufacturers, who don't seem too confident there's a market for such a beast.

Even though I don't consider nVidia an option today, it's still sad that they decided to drop the dual-DVI option from the high-end reference designs. I'm sure I'm not going to replace my card with a 5600 or something. (some of those do have dual DVI, don't they?)

cu

incurable :cry: :cry:
 
Calavaro said:
Solomon said:
Bambers said:
I thought there were a few normal gf4s that had dual dvi?

MSI and Asus had versions, but they were not really available. I tried to get the MSI Ti4600 version, but MSI's vendors never even carried it, nor were they able to get it. :? I'm just hoping that the next generation will start to lean toward adopting the dual DVI scenario. Matrox has, and never looked back.

Gainward's Ti4600 offered this and it was available in quantity, although at a premium.

Also, Asus dual-DVI versions are quite readily available, at least here in Europe. One shop I sometimes buy from currently carries 11 different Asus/Nvidia cards, 4 of them dual DVI-I: 2 in stock (GF4 MX440 8x 64MB, GF4 Ti4200 8x 128MB) and 2 not yet available (FX 5200, FX 5600). They don't really seem to cost more than the single-DVI versions (hard to say, as the shop doesn't carry the same versions with only one DVI-I port).
However, I haven't seen any ATI consumer board with 2 DVI-I connectors. Up to now, ATI has used only 1 internal TMDS transmitter (even on the newest cards), which means the card manufacturer would have to add an external one. (Not that it is expensive, but it also requires a slightly different board layout.)
 
There is also the issue of how many people have two DVI monitors.
Most people only have one monitor, and whilst the price is coming down, LCD monitors are still expensive; for the cost of two rather low-end LCD monitors you could get a very nice CRT.

The extra cost is not just in the DVI-to-VGA adapter; there is also the cost of the extra TMDS encoder needed to drive a second digital display, which would also require extra pins on the chip package, pushing packaging costs up. So it really does make it a lot more expensive. I think Matrox gets round some of it by having their second DAC externally (don't quote me on it). And on the G400 (well, the one I used), the DVI connector was an add-on board that plugged into the side of the main board, which you had to pay for, hence offloading the cost onto the people who want the extra functionality. That's the way it should be done (although the connector between the boards looked very iffy, in my opinion).

CC
 
The other issue is that ATI, at least, integrates the TMDS transmitter. To support a second DVI output they would have to place an external transmitter on the board, further increasing the cost. I should imagine there is nothing stopping their board vendors from doing this, and in fact the unpopulated pin areas on the boards can house a daughter board for it (look at the FireGL versions).
 
I can certainly understand chip makers (ATI, nVidia, and whoever else) not integrating dual DVI on chip. Hell, I'd bet the majority of LCD panels out there still have analog connectors on them.

What I would actually like to see happen is for analog CRT monitors to have their OWN internal RAMDACs. That is, put the DVI connector on the CRTs (much like recent HDTV sets). In other words, drop the D-SUB connector on all monitors, which would make supporting DVI "only" on video cards more feasible.

Of course, that'll never happen. So we'll be stuck having multiple interfaces for another 10 years....
 
Oh, that's one of MY old whinges. I have no idea why we can't use digital connections for CRTs, except for the Iiyama.

(Actually I do: it's because DVI doesn't have the bandwidth of analogue VGA yet. I whinged here on that topic about 6 months ago...)
 
Dio said:
Oh, that's one of MY old whinges. I have no idea why we can't use digital connections for CRTs, except for the Iiyama.

(Actually I do: it's because DVI doesn't have the bandwidth of analogue VGA yet. I whinged here on that topic about 6 months ago...)


Out of curiosity, where does DVI top out?

Is 1600x1200x32bpp @ 85Hz near or at the limit?
 
Well above, I think.

IIRC it's 1280x1024x60Hz - which is why an 18" flat panel's about the limit at the moment. However, I believe there is a protocol for tying pairs of DVI links together (one DVI connector can carry two links) to raise the resolution.
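For reference, a single TMDS link in DVI is limited to a 165 MHz pixel clock, so whether a mode fits comes down to total pixels (active plus blanking) times refresh rate. A rough sketch, using standard VESA timing totals for the 60Hz modes and reusing the same blanking for the 85Hz case as an estimate:

```python
# Single-link DVI carries one TMDS link, capped at a 165 MHz
# pixel clock. Pixel clock = total H x total V x refresh rate,
# where the totals include blanking, not just visible pixels.
SINGLE_LINK_MHZ = 165

# name -> (total H incl. blanking, total V incl. blanking, refresh Hz)
# The 60Hz totals are standard VESA timings; the 85Hz entry reuses
# the 60Hz blanking as a rough estimate.
modes = {
    "1280x1024@60": (1688, 1066, 60),
    "1600x1200@60": (2160, 1250, 60),
    "1600x1200@85": (2160, 1250, 85),
}

for name, (htotal, vtotal, hz) in modes.items():
    clock_mhz = htotal * vtotal * hz / 1e6
    verdict = "fits" if clock_mhz <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{name}: {clock_mhz:.1f} MHz -> {verdict}")
```

So 1600x1200@60 squeezes in at 162 MHz, while 1600x1200@85 is well past the single-link ceiling, which is where the dual-link pairing mentioned above comes in.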
 