Vortigern_red said:
Interesting indeed. Too early to make any final judgements, though, especially since both cards showed really low performance compared to what they should be able to achieve.
jvd said:
Chalnoth said: Well, whatever reasons there may be to have high readback speeds, they're not going to be seen in games for some time. Regardless, the NV45 has purportedly taped out, and should be out pretty soon, so I don't know if the 6800 Ultra PCIe numbers are really that important.
Was it not discovered that the NV45s are just the current NV40s with the bridge in the same package as the GPU instead of in a separate package elsewhere on the PCB?
AlphaWolf said:
jvd said: Was it not discovered that the NV45s are just the current NV40s with the bridge in the same package as the GPU instead of in a separate package elsewhere on the PCB?
Yup, NV45 is just NV40 with the HSI in the package.
So the NV45 that Chalnoth is referring to is going to actually be the same as the 6800 Ultra PCIe, except the bridge will be on the chip package rather than the PCB... it'll be pretty much exactly the same, in other words? (I'm not trying to be a smart ass, I'm just checking to make sure I understand this right.)
ramfart said: Just how low will NVIDIA stoop to try and sell their slot-blocking leaf blower? I never thought I would see Nvidia resort to such desperate measures. It's a shame they had to lower themselves to this. I know the two companies trade punches, but this is a disgrace.
Let me explain for all you guys who don't know what the big deal is here.
ATi says it uses a native PCI Express interface because a bridge chip will increase latencies and thus decrease performance, so they are bashing nvidia because nv chose a bridge-chip solution.
Now it looks like ATi does not use a native PCI Express interface but a bridge chip in the core package, or a PCIE-AGP bridge. So one can wonder why ATi is bashing someone for something they also do.
What is wrong with nv bringing this to the public? Because so many ATi fanboys' dreams are shattered? Because ATi can do no wrong?
This reminds me... with the release of the X800, ATi told reviewers how to review the X800 and, more so, how to compare it to the 6800U. They gave directions in their docs on how to disable the brilinear optimisations in the 6800U and how to force true trilinear filtering on all stages on the 6800U. Now, with the recent discoveries, ATi itself has no true trilinear filtering on all stages. OK, with no IQ degradation, but that's not the point.
ATi is consistently deceiving the public, accusing nvidia of doing things ATi also does.
That is the big deal. It is not that hard to understand.
Let me elaborate on your post...
HaLDoL said: Let me explain for all you guys who don't know what the big deal is here.
HaLDoL said: ATi says it uses a native PCI express interface because a bridge chip will increase latencies and thus decrease performance,
If you are an electrical engineer, you would know that this turns out to be true... the latency can be measured, but the impact may be too small for anyone to notice.
HaLDoL said: so they are bashing nvidia because nv chose a bridge chip solution.
I would not consider it bashing; it's just a fundamental understanding of electronics.
HaLDoL said: Now it looks like ATi does not use a native PCI express interface but a bridge chip in the core package
You are mistaking a "chip" for an interface; a chip would imply that it is not in the GPU core.
HaLDoL said: or a PCIE-AGP bridge. So one can wonder why ATi is bashing someone for something they also do.
Refer to the previous comment.
HaLDoL said: What is wrong with nv bringing this to the public?
They stated that it was a bridge (think libel)... it could be a bridge, but then again it may not be.
HaLDoL said: Because so many ATi fanboys' dreams are shattered? Because ATi can do no wrong?
I find it funny that most people resort to this argument (for either IHV).
HaLDoL said: This reminds me... with the release of the X800, ATi told reviewers how to review the X800 and how to compare it to the 6800U. They gave directions in their docs on how to disable the brilinear optimisations in the 6800U and how to force true trilinear filtering on all stages on the 6800U. Now, with the recent discoveries, ATi itself has no true trilinear filtering on all stages. OK, with no IQ degradation, but that's not the point.
I would agree with you to a point here, but you seem to be refuting your own argument.
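For what it's worth, the "measurable but probably unnoticeable" latency point can be put into rough numbers. The figures below are illustrative assumptions, not measurements: a few hundred nanoseconds of added hop latency is a plausible order of magnitude for a bridge chip, and the round-trip count per frame is a guess.

```python
# Back-of-envelope: how much could a bridge-chip hop cost per frame?
# All figures are illustrative assumptions, not measurements.

bridge_hop_s = 200e-9        # assumed extra latency per round trip through a bridge
round_trips_per_frame = 100  # assumed latency-sensitive round trips per frame
frame_time_s = 1.0 / 60.0    # one frame at 60 fps, ~16.7 ms

added_s = bridge_hop_s * round_trips_per_frame
fraction = added_s / frame_time_s

print(f"Added latency per frame: {added_s * 1e6:.1f} us")
print(f"Fraction of a 60 fps frame: {fraction * 100:.4f}%")
```

On these assumptions the bridge costs tens of microseconds out of a ~16.7 ms frame, well under a percent, which is consistent with the point that the latency is real but hard to notice in games.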
Sandwich said: I believe ATi here. All Nvidia has done is admit they can't figure out how ATi's solution works even though they tried.
No, that's not what NVIDIA is saying. What NVIDIA is trying to do is sow the seeds of doubt and distract people from their own bridged solution.
HaLDoL said:
ramfart said: Just how low will NVIDIA stoop to try and sell their slot-blocking leaf blower? I never thought I would see Nvidia resort to such desperate measures. It's a shame they had to lower themselves to this. I know the two companies trade punches, but this is a disgrace.
Let me explain for all you guys who don't know what the big deal is here.
ATi says it uses a native PCI Express interface because a bridge chip will increase latencies and thus decrease performance, so they are bashing nvidia because nv chose a bridge-chip solution.
Now it looks like ATi does not use a native PCI Express interface but a bridge chip in the core package, or a PCIE-AGP bridge. So one can wonder why ATi is bashing someone for something they also do.
What is wrong with nv bringing this to the public? Because so many ATi fanboys' dreams are shattered? Because ATi can do no wrong?
This reminds me... with the release of the X800, ATi told reviewers how to review the X800 and how to compare it to the 6800U. They gave directions in their docs on how to disable the brilinear optimisations in the 6800U and how to force true trilinear filtering on all stages on the 6800U. Now, with the recent discoveries, ATi itself has no true trilinear filtering on all stages. OK, with no IQ degradation, but that's not the point.
ATi is consistently deceiving the public, accusing nvidia of doing things ATi also does.
That is the big deal. It is not that hard to understand.
It's hard as hell to understand your spin on it for me, it really is.
trinibwoy said: OK, I'm trying to sift through all the rubble here. What's the big deal about readback speeds? Are there any applications or games that make use of this, or will there be sometime soon? Or is this just fodder for people to nail Nvidia to the cross yet again for false advertising?
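On the readback question: readback matters mainly to applications that pull rendered or computed frames back to system memory (video capture and encoding, early GPGPU-style work), not typical games. A quick sketch of why the advertised difference is significant in principle; the bandwidth figures are theoretical or commonly cited ceilings, not benchmarks (PCIe x16 is roughly 4 GB/s per direction; AGP readback was often limited to a few hundred MB/s).

```python
# How long does it take to read one frame back to system memory?
# Bandwidths are theoretical / commonly cited figures, not benchmarks.

frame_bytes = 1280 * 1024 * 4   # 1280x1024, 32-bit colour: ~5 MB
agp_readback_bps = 266e6        # commonly cited AGP readback ceiling (assumption)
pcie_x16_bps = 4e9              # PCIe x16 theoretical per-direction bandwidth

agp_ms = frame_bytes / agp_readback_bps * 1e3
pcie_ms = frame_bytes / pcie_x16_bps * 1e3

print(f"AGP readback:  {agp_ms:.1f} ms per frame")
print(f"PCIe readback: {pcie_ms:.2f} ms per frame")
```

On these figures, AGP readback alone eats ~20 ms per frame (capping any capture-style workload near 50 fps before rendering even starts), while PCIe leaves plenty of headroom, so the spec difference is real even if games don't exercise it.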