tEd said:The truth can sometimes be FUD.
Bjorn said:http://www.ati.com/products/PCIexpress/index.html
Other graphics companies have cards that are compatible with PCI Express, but they are still only AGP cards that are "bridged" by a second chip to be physically compatible with PCI Express slots on the motherboard. This architecture can only work at AGP speeds, and is more vulnerable to failure, performance bottlenecks and incompatibility with software applications.
According to NVidia, the bridged cards can work at twice the speed of standard 8X AGP. And why would a bridged card automatically be more vulnerable to failure...?
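Worth pinning down what "twice the speed" means in raw numbers. A back-of-the-envelope sketch in Python, using the published theoretical peaks (the "AGP x16" figure is just nVidia's claimed doubling of AGP 8x, not a real standard, and the PCIe numbers assume PCI Express 1.x signalling):

# Theoretical peak bandwidths, back-of-the-envelope only.
# AGP moves 4 bytes per transfer on a 66 MHz base clock, times the
# signalling multiplier (8 transfers per clock for AGP 8x).
AGP_CLOCK_HZ = 66_000_000  # 66 MHz base AGP clock
AGP_BUS_BYTES = 4          # 32-bit bus

def agp_peak_gbs(multiplier):
    """Peak AGP bandwidth in GB/s for a given signalling multiplier."""
    return AGP_CLOCK_HZ * AGP_BUS_BYTES * multiplier / 1e9

# PCI Express 1.x: 2.5 GT/s per lane with 8b/10b encoding = 250 MB/s
# per lane, per direction.
def pcie_peak_gbs(lanes, directions=1):
    return 0.25 * lanes * directions

print(f"AGP 8x:                  {agp_peak_gbs(8):.1f} GB/s, shared both ways")
print(f"'AGP x16' (claimed):     {agp_peak_gbs(16):.1f} GB/s, shared both ways")
print(f"PCIe x16, one direction: {pcie_peak_gbs(16):.1f} GB/s")
print(f"PCIe x16, aggregate:     {pcie_peak_gbs(16, 2):.1f} GB/s")

So even taking the doubled figure at face value, the bridged link roughly matches PCIex16 in one direction but still shares a single pipe between upstream and downstream traffic.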
The Baron said:A bridge chip is just another thing that can fail. I see absolutely no reason it would cause compatibility issues (except for reduced speed in some software), since it will appear to the motherboard chipset to be a normal PEG card. It will not function internally as a PEG card, but that should have no ramifications for compatibility with motherboards at all.
Then again, I wonder about latency.
rwolf said:Say, didn't ATI say there was something big coming in their drivers in a couple of months?
The Baron said:Hopefully, they've totally revamped their interface for controlling IQ. Right now, it sucks.
micron said:I thought I was the only one who believed that...
The Baron said:Nope. It's the most totally unintuitive, clumsy, and all-around stupid way to manage IQ. They seem to do it a HELL of a lot better on their Mac stuff--why can't we get some of that (supersampling not included)?
volt said:If so, does XP carry PEG support?
Kombatant said:First thing that comes to mind is that a bridge chip is.. well.. a bridge, to translate AGP to PCI Express. I sincerely doubt that would enhance performance, since all it does is "connect" the two interfaces. Your speed can only be as fast as your source is. And if your source is AGP, you don't get faster than that.
Unknown Soldier said:Kombatant said:First thing that comes to mind is that a bridge chip is.. well.. a bridge, to translate AGP to PCI Express. I sincerely doubt that would enhance performance, since all it does is "connect" the two interfaces. Your speed can only be as fast as your source is. And if your source is AGP, you don't get faster than that.
I think you should've rather said "Your speed can only be as fast as your slowest link.. which is AGP 8x."
US
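That "slowest link" framing is easy to make concrete. A minimal Python sketch (the GB/s figures are theoretical peaks, and the two-hop chain is just an assumed model of a bridged card):

# End-to-end throughput through a chain of interfaces is bounded by the
# slowest hop -- the "slowest link" point above. Figures are theoretical
# peaks in GB/s; the chain is an assumed model of a bridged card.
def effective_peak(links):
    """Peak throughput of a chain of links is that of its slowest member."""
    return min(links.values())

bridged_card = {
    "GPU <-> bridge (AGP 8x side)": 2.1,
    "bridge <-> board (PCIe x16, one direction)": 4.0,
}

print(effective_peak(bridged_card))  # 2.1 -- the AGP side caps the chain

(For NV40-era parts the GPU-to-bridge side is reportedly the faster "AGP x16" mode rather than plain 8x, which is exactly what the posts below argue about.)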
WaltC said:There is no "AGP x16" standard from Intel. The bridge-chip implementation nVidia speaks of here is common only to nVidia--motherboards will support AGPx8 or PCIex16; you won't see any "AGP x16" motherboard support out there (core logic chipsets, etc.)
Why does this have any relevance? The "AGP x16" will only be active when the bridge chip is in use. The data transfer mode is there to make better use of the bridge than nVidia's older graphics processors will be able to. It was described as being able to emulate the bi-directional mode of PCI Express by very rapidly switching between sending and receiving data.
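A toy model of that switching idea shows why it still isn't full duplex: however you time-slice one shared pipe, send rate plus receive rate can never exceed the pipe's bandwidth, while PCI Express runs both directions at full rate simultaneously. (The numbers and the 50/50 split are assumptions for illustration, not how the bridge actually arbitrates.)

# Toy model: a half-duplex link that rapidly alternates between sending
# and receiving splits its single pipe between the two directions; a
# true full-duplex PCIe link runs both directions at full rate at once.
HALF_DUPLEX_GBS = 4.2  # hypothetical "AGP x16" bridge link, one shared pipe
FULL_DUPLEX_GBS = 4.0  # PCIe x16 theoretical peak, per direction

def time_sliced_rates(send_fraction, bandwidth=HALF_DUPLEX_GBS):
    """Send/receive rates when one shared pipe is time-sliced."""
    return send_fraction * bandwidth, (1 - send_fraction) * bandwidth

tx, rx = time_sliced_rates(0.5)  # assume an even 50/50 split
print(f"bridged link, 50/50:  {tx:.1f} GB/s out, {rx:.1f} GB/s in")
print(f"PCIe x16 full duplex: {FULL_DUPLEX_GBS:.1f} GB/s out, {FULL_DUPLEX_GBS:.1f} GB/s in")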
WaltC said:What I believe ATi is simply saying here is that the fact that nVidia's bridge-chip implementation won't be either AGP x8 or PCIex16, but a custom "in-between mode," *might* prove problematic in certain situations.
Now that is FUD, plain and simple.
I don't think that information would classify as FUD, because it is accurate.
It's FUD because of the "might."
I saw a good argument on one of the previews (forget which one, unfortunately): If you are going to try and make the case that nVidia's "AGP x16" is "just as good" as PCIex16, then why doesn't nVidia simply follow ATi and do a native PCIex16 pcb itself and forego "AGP x16" completely as ATi has done?
Indeed, I fully expect nVidia to eventually natively support PCIex16 just as ATi is doing initially, so the "AGP x16" bridge chip is but a temporary solution and therefore becomes harder to ultimately justify, doesn't it?...
Yes, with the rest of the NV4x lineup. They will use the bridge chip too, however, to operate on the AGP bus.
pax said:Heck, how many games out there or in the works use PS 2.0... I haven't read any web page or review of the 6800 that even mentions games that fully use PS 2.0. I bet we won't care until the R500 is out with PS 3.0...