Nvidia/ATI - GPU roadmap

Natoma said:
T2k said:
Chalnoth said:
Is PCI Express really set to be ready by early next year?

It will be interesting to see how video card manufacturers decide to handle the switch to PCI express....

No way, it's BS.

First of all, it's still in paper-phase: http://www.pcisig.com/specifications/pciexpress/

Huh. Then how do you explain PCI-Express being in Intel's Grantsdale chipset for 1H'04?

Excuse me: did you say 1H04? C'mon... do you seriously think ANY VGA manufacturer will adopt it BEFORE it's released? :oops: No way. On the other hand, it's still Intel-only.

Actually, specifically 2Q'04, but there are rumors that if AMD's Athlon 64 puts up some heavy competition in the desktop space, Intel will move Grantsdale and Tejas up to 1Q'04. But that's hearsay.

Still not a market for mass-produced video cards.

1066-FSB (eventually 1200-FSB), Serial ATA/USB 2.0/Firewire ports out the arse, PCI-Express x16 (aggregate bandwidth of 32 Gbit/sec, 2 Gbit/sec per channel), DDR-I and DDR-II support, plus a few more nifty things.

That's what I'm waiting for. :)

Not before 2005. Mark my words.

PS: I mean not in normal, everyday mass production and actual use...
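
For reference, the x16 numbers above do check out. PCI-Express 1.0 signals at 2.5 Gbit/s per channel in each direction, and the 8b/10b encoding leaves 2 Gbit/s usable, so sixteen channels give 32 Gbit/s (4 GB/s) per direction. A quick sketch of the arithmetic:

Code:
#include <cstdio>

int main()
{
    // PCI-Express 1.0: 2.5 Gbit/s raw per channel, per direction;
    // 8b/10b encoding leaves 8/10 of that as usable bandwidth.
    const double raw_per_channel    = 2.5;                          // Gbit/s
    const double usable_per_channel = raw_per_channel * 8.0 / 10.0; // 2.0 Gbit/s

    const int channels = 16;                                        // x16 graphics slot
    const double aggregate_gbit  = usable_per_channel * channels;   // 32 Gbit/s
    const double aggregate_gbyte = aggregate_gbit / 8.0;            // 4 GB/s per direction

    printf("x16: %.0f Gbit/s = %.0f GB/s per direction\n",
           aggregate_gbit, aggregate_gbyte);
    return 0;
}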
 
Chalnoth said:
T2k said:
No way, it's BS.

First of all, it's still in paper-phase: http://www.pcisig.com/specifications/pciexpress/
Yeah, that's basically what I thought.

From what I've heard previously with PCI Express, here's what I basically expect:

1. First step will be to integrate the lowest-bandwidth form of PCI Express. From what I read, this could be implemented as a small bracket between the PCI slot and the edge of the motherboard, allowing the motherboard to support both types of cards.

2. Once adoption of PCI Express is high enough, start moving toward higher-bandwidth PCI Express forms.

3. In the meantime, motherboards may contain one high-bandwidth PCI Express slot below the AGP slot, primarily for graphics cards.

4. How high-end graphics cards will handle the change I still don't know. I expect that at first, most will be available in both formats, but within a year after adoption, most of the brand-new graphics cards will only be available in PCI Express format.

Agreed with all of this except No. 4.
If the 2-3 main mobo manufacturers equip their boards with PCI-Express, then ATI and NV and all the big players can move immediately... ah, and don't forget: both of them (ATI, NV) are chipset makers too... ;)
 
gordon said:
Inq reports that Intel's next platform for the next-generation Prescott and Tejas processors, to be released 2Q'04, will use it.
Heh, I don't care what the Inquirer reports. That news site has zero journalistic integrity.
 
Actually, they're consistent on one thing: consistently writing their "articles" :rolleyes: based upon this forum... :D :LOL:
 
Chalnoth said:
gordon said:
Inq reports that Intel's next platform for the next-generation Prescott and Tejas processors, to be released 2Q'04, will use it.
Heh, I don't care what the Inquirer reports. That news site has zero journalistic integrity.

Well, the information is pretty much common knowledge and was confirmed by Intel earlier this year, but I quoted the Inquirer since I heard it from them last year.
 
Kalbaz said:
One thing I would like to ask regarding the supposed move to PCI-Express boards in the near future, according to the comparison chart: the limitation of AGP currently means we can only have one high-bandwidth video card in the PC at one time. With PCI-Express, I assume this means we can have multiple display adapters running concurrently?

With this in mind, could we possibly see a trend towards support for multiple monitors in games? Of course Matrox tried this, but given the lack of cards in the market, developers basically either didn't care or couldn't justify the added development costs for such a small market, hence so few titles with the capability.

But with the promise of multiple high-bandwidth GPUs in future systems, hopefully we will see more development of multiple display support once PCI-Express becomes more mainstream?

kalbaz

The problem with multi-monitor games isn't the number of adapters, it's the number of monitors. :) Many (most?) cards already support dual displays, but how many people have more than one monitor? I don't see this changing, because monitors take space. Even flat ones. And having two of them is typically not worth it for most users. That won't change by allowing the users to plug in more cards.

There's also the issue of how easy it is for developers. With DX9, it's relatively easy to support several monitors on the same card. It's less easy to support displays on different cards. Because of this, developers might support multi-monitors in DX9 games, but only on the same card. So allowing multiple cards in a system isn't likely to be supported as much as several monitors on the same card.
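
To make the same-card vs. different-cards point concrete, here's a minimal sketch of the standard DX9 enumeration calls (nothing beyond the stock SDK): every monitor output appears as an adapter, and the device caps reveal which adapters sit on the same physical card - the case that's easy to support with one multi-head device.

Code:
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // One adapter ordinal per monitor output in the system.
    UINT count = d3d->GetAdapterCount();
    for (UINT i = 0; i < count; ++i)
    {
        D3DADAPTER_IDENTIFIER9 id;
        D3DCAPS9 caps;
        d3d->GetAdapterIdentifier(i, 0, &id);
        d3d->GetDeviceCaps(i, D3DDEVTYPE_HAL, &caps);

        // Heads on the same physical card share a master ordinal; a single
        // multi-head device can drive all of them, which is why same-card
        // multi-monitor is the easy case for developers.
        printf("adapter %u: %s (master %u, %u head(s) in group)\n",
               i, id.Description,
               caps.MasterAdapterOrdinal, caps.NumberOfAdaptersInGroup);
    }
    d3d->Release();
    return 0;
}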
 
I would consider that the kind of people who would have the power for multi-display games (e.g. GPU/system power) would firstly have more than one PC, and secondly would have the space available. I know myself that if I could do it effectively and at a decent speed, I would. I have 2 Trinitron 19" displays for my workstations and I wouldn't mind switching one over for playing certain games. Hell, switchboxes aren't that expensive, though you must still spend a bit to retain a certain level of quality.

If not multi-display cards, then what's the option with single-display cards? We already push the current high-end cards to the limit with high resolution and AA/AF, so how will these cards cope with more than one display output? Are we talking about half the performance? 3/4?

Of course, with the latest chipsets and the cost involved in producing them, the exorbitant price of dual-chip cards would limit them to only the extreme purchaser. So I guess this concept will die again, despite the Parhelia trying to push this tech even more.
 
I've had a dual 21" config and it was very helpful. When I got my Parhelia, I had three 21" - but that was a lotta space, man! :LOL:

Later I sold them off - now I have one widescreen 24" and more free space on my desk. ;)

But: I would love Parhelia-style three or more monitors with the speed of current single-monitor gaming! That would be the real fun!
 
Well for the Parhelia the list of supported games is already quite long, and grows weekly :) I enjoy plenty of these:

http://www.matrox.com/mga/3d_gaming/surrgame.cfm

The only major problem so far has been those games coded with older DX interfaces (DX7 or less, IIRC), meaning the max res is 1920x480 (I think that no dimension can be > 2048 pixels). But those are getting to be a smaller proportion of the whole.

I think that purely stretched games, using 1 viewport, a wider field of view and a 12:3 aspect ratio, are very easy to implement with most common 3D engines. 3 viewports (each monitor showing a different camera) are harder, hence less common, but SS:SE and IL-2 FB are some of the noteworthy examples that use this approach.

So for very little effort from the developer, they can at least have something that REALLY impresses at e.g. tradeshows!
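
For the stretched single-viewport approach, about the only thing the engine has to change is the projection: keep the vertical field of view and widen the horizontal one to match the 12:3 aspect. A small sketch of the standard math (the 60-degree figure is just an example, not from any particular game):

Code:
#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979;

// Derive the horizontal FOV from a fixed vertical FOV and the display
// aspect ratio - standard projection math, nothing Parhelia-specific.
double horizontal_fov_deg(double vertical_fov_deg, double aspect)
{
    double v = vertical_fov_deg * PI / 180.0;
    return 2.0 * std::atan(std::tan(v / 2.0) * aspect) * 180.0 / PI;
}

int main()
{
    double vfov = 60.0;  // example vertical FOV, same as the 4:3 game would use
    printf("4:3  hfov: %.1f deg\n", horizontal_fov_deg(vfov, 4.0 / 3.0));   // ~75 deg
    printf("12:3 hfov: %.1f deg\n", horizontal_fov_deg(vfov, 12.0 / 3.0));  // ~133 deg
    return 0;
}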

Monitor space is a problem, but performance doesn't seem to be - going from 1024x768 to 3072x768 only gives about a 30% or so hit in my experience. Not too shabby really for the extra "something" it brings to a game.

2 monitors just won't work for most games (Total Annihilation is a notable exception), but three is great. So I do see a future for this feature, but only if ATi/Nvidia have the guts to do it.
 
with the speed of current single-monitor gaming!

Yes, this is the issue I can see stalling wider support for such tech.

2 monitors just won't work for most games (Total Annihilation is a notable exception), but three is great. So I do see a future for this feature, but only if ATi/Nvidia have the guts to do it.

Well yes, 2 monitors are only useful for RTS or simulation games where overhead maps or alternate viewports may be useful. FPS games of course will require 3 monitors. One can easily buy 3 17" monitors for the price of a 21" and you will have more (Windows) desktop space with the 3-monitor setup. I don't think something as simple as how much room you have on your desk should be the limiting factor.
 
Gnep makes a good point about 3072x768 performing fairly well. It's not that many more pixels than 1600x1200 and most monitors look good when displaying 1024x768.
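
The arithmetic backs that up: tripling the width triples the pixel count versus one screen, but the total is only modestly above a high single-monitor resolution.

Code:
#include <cstdio>

int main()
{
    const long triple   = 3072L * 768;   // 2,359,296 pixels
    const long single   = 1024L * 768;   //   786,432 pixels
    const long high_end = 1600L * 1200;  // 1,920,000 pixels

    printf("3072x768 = %.1fx the pixels of 1024x768\n",
           (double)triple / single);                 // 3.0x
    printf("3072x768 = %.0f%% more pixels than 1600x1200\n",
           100.0 * (triple - high_end) / high_end);  // ~23%
    return 0;
}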
 
Gnep said:
Monitor space is a problem, but performance doesn't seem to be - going from 1024x768 to 3072x768 only gives about a 30% or so hit in my experience. Not too shabby really for the extra "something" it brings to a game.

Only 30% on Parhelia? C'mon... noooo way. You've gotta be kidding... turn on everything (AA, AF) and start playing UT2003 on three monitors...

2 monitors just won't work for most games (Total Annihilation is a notable exception), but three is great.

100% agreed.

So I do see a future for this feature, but only if ATi/Nvidia have the guts to do it.

Well said: if they're not gonna implement it, nobody or only a very few developers will spend extra time coding just for Parhelias...
 
T2k said:
Natoma said:
T2k said:
Chalnoth said:
Is PCI Express really set to be ready by early next year?

It will be interesting to see how video card manufacturers decide to handle the switch to PCI express....

No way, it's BS.

First of all, it's still in paper-phase: http://www.pcisig.com/specifications/pciexpress/

Huh. Then how do you explain PCI-Express being in Intel's Grantsdale chipset for 1H'04?

Excuse me: did you say 1H04? C'mon... do you seriously think ANY VGA manufacturer will adopt it BEFORE it's released? :oops: No way. On the other hand, it's still Intel-only.

Errr, you don't think the VGA manufacturers know what the PCI-Express specs will be? :)

Besides, it'll be the same situation as when AGP 1x first came out. Most cards were PCI, but they were easily convertible to AGP 1x. They didn't really take advantage of the faster bus, but they were slot-compliant. I can see the same situation occurring here quite easily.

T2k said:
Actually, specifically 2Q'04, but there are rumors that if AMD's Athlon 64 puts up some heavy competition in the desktop space, Intel will move Grantsdale and Tejas up to 1Q'04. But that's hearsay.

Still not a market for mass-produced video cards.

Same situation as the transition from PCI to AGP imo.

T2k said:
1066-FSB (eventually 1200-FSB), Serial ATA/USB 2.0/Firewire ports out the arse, PCI-Express x16 (aggregate bandwidth of 32 Gbit/sec, 2 Gbit/sec per channel), DDR-I and DDR-II support, plus a few more nifty things.

That's what I'm waiting for. :)

Not before 2005. Mark my words.

PS: I mean not in normal, everyday mass production and actual use...

Intel would disagree with you. ;)
 
T2k said:
Gnep said:
Monitor space is a problem, but performance doesn't seem to be - going from 1024x768 to 3072x768 only gives about a 30% or so hit in my experience. Not too shabby really for the extra "something" it brings to a game.

Only 30% on Parhelia? C'mon... noooo way. You've gotta be kidding... turn on everything (AA, AF) and start playing UT2003 on three monitors...

Well, keep in mind 16x FAA has a surprisingly small performance hit.

And if you turn on AA or AF, the same % hit will be applied to both resolutions, meaning the performance difference will remain the same, ~30%.
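
Put differently: if AA/AF multiplies both framerates by the same factor, the factor cancels out of the relative gap. A toy example with made-up numbers:

Code:
#include <cstdio>

int main()
{
    // Made-up numbers purely for illustration.
    double fps_single = 100.0;  // hypothetical 1024x768 framerate
    double fps_triple = 70.0;   // ~30% lower at 3072x768
    double aa_factor  = 0.6;    // hypothetical: AA/AF keeps 60% of the framerate

    // The multiplicative hit cancels out of the ratio.
    printf("before AA/AF: gap = %.0f%%\n",
           100.0 * (1.0 - fps_triple / fps_single));
    printf("after  AA/AF: %.0f vs %.0f fps, gap = %.0f%%\n",
           fps_single * aa_factor, fps_triple * aa_factor,
           100.0 * (1.0 - (fps_triple * aa_factor) / (fps_single * aa_factor)));
    return 0;
}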
 
Basic said:
Kalbaz:
If I haven't got it completely wrong, the typical PCI-Express motherboard would still only have one AGP-like slot. The typical configuration would be a couple of single-channel slots (PCI replacements), and one 16-channel slot (AGP replacement). The market is probably too small to make chipsets with two 16-channel ports.
It will all depend on the motherboard manufacturers, as there are many uses for PCI-Express outside of video cards. How about RAID arrays, gigabit Ethernet, or high-quality audio cards? They could all use the extra bandwidth.
I would love to see the ability to have more than one video card on a multi-PCI-Express motherboard.
later,
 
ahhh the Parhelia.... I had actually preordered one until I saw Carmack update his .plan :( Oh well, the professional app performance is utter crap, sooo....
 
epicstruggle said:
I would love to see the ability to have more than one video card on a multi-PCI-Express motherboard.
later,
There will only be a single 16x tunnel available for the first round of desktop/workstation boards. The other slots will be 1x. Server boards will employ a mix of 8x & 4x slots for higher bandwidth devices such as GB LAN & SCSI/SATA RAID. Other interesting devices such as multi-channel/multi-stream 24/192 audio boards & HDTV decoders will be handled adequately by 1x slots. I am not certain whether PCI-E graphics must be exclusively installed in a 16x slot. FWIR, the 16x slot will offer additional power required by these cards.
 
Sage said:
ahhh the Parhelia.... I had actually preordered one until I saw Carmack update his .plan :( Oh well, the professional app performance is utter crap, sooo....
I've heard from several people that performance in professional apps has gone way up recently, especially with all the certified workstation drivers they've been putting out. (I don't have any links, it's just something I heard...)
 
Tagrineth said:
T2k said:
Only 30% on Parhelia? C'mon... noooo way. You've gotta be kidding... turn on everything (AA, AF) and start playing UT2003 on three monitors...

Well, keep in mind 16x FAA has a surprisingly small performance hit.

And if you turn on AA or AF, the same % hit will be applied to both resolutions, meaning the performance difference will remain the same, ~30%.
30% for tripling the work done?
I don't buy it.
Either that, or the game was CPU-bound before you enabled SG.
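
A toy frame-time model shows why a small hit hints at a CPU limit (all numbers made up for illustration): the framerate is set by whichever of the CPU or GPU is slower per frame, so if the CPU was already the bottleneck at 1024x768, tripling the GPU's pixel load doesn't come close to tripling the frame time.

Code:
#include <algorithm>
#include <cstdio>

int main()
{
    const double cpu_ms        = 12.0;  // per-frame CPU work, resolution-independent
    const double gpu_ms_single = 5.0;   // GPU time at 1024x768
    const double gpu_ms_triple = gpu_ms_single * 3.0;  // 3x the pixels at 3072x768

    // Framerate is limited by the slower of the two.
    double fps_single = 1000.0 / std::max(cpu_ms, gpu_ms_single);  // ~83 fps
    double fps_triple = 1000.0 / std::max(cpu_ms, gpu_ms_triple);  // ~67 fps

    // Tripling GPU work only cost ~20% here, because the CPU was the bottleneck.
    printf("%.0f fps -> %.0f fps\n", fps_single, fps_triple);
    return 0;
}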
 
stevem said:
There will only be a single 16x tunnel available for the first round of desktop/workstation boards. The other slots will be 1x. Server boards will employ a mix of 8x & 4x slots for higher bandwidth devices such as GB LAN & SCSI/SATA RAID.....
How sure are you of this? I'd like to think that some of the vendors (Abit, Asus, ...) would love to add to their press releases how they can do multiple video cards with their PCI-E motherboards. I hope someone tries it. Or is the problem that the current version of PCI-E only has the ability to have one 16x tunnel? That brings up the question as to whether an 8x tunnel could be used for the second video card.

later,
 