NVIDIA just can't stop

At the moment, is there a really compelling reason for ATI or NVIDIA to increase GPU readback speeds? (GPGPU apps are still a minority.) What about collision detection on the GPU?
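(For anyone who wants to check their own card, here's a minimal sketch of a readback timer. It assumes a GLUT/OpenGL setup; the resolution, iteration count, and crude clock() timer are arbitrary choices, so treat the figure as a ballpark only.)

[code]
/* Minimal readback-bandwidth sketch. Assumes GLUT/OpenGL; the
 * resolution, iteration count, and crude clock() timer are
 * arbitrary choices, so treat the result as a ballpark figure. */
#include <GL/glut.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define W     1024
#define H     1024
#define ITERS 50

static void display(void)
{
    unsigned char *buf = malloc((size_t)W * H * 4);

    glClear(GL_COLOR_BUFFER_BIT);
    glFinish();  /* make sure rendering is finished before timing */

    clock_t t0 = clock();
    for (int i = 0; i < ITERS; i++)
        glReadPixels(0, 0, W, H, GL_RGBA, GL_UNSIGNED_BYTE, buf);
    clock_t t1 = clock();

    double secs = (double)(t1 - t0) / CLOCKS_PER_SEC;
    double mb   = (double)ITERS * W * H * 4 / (1024.0 * 1024.0);
    printf("readback: ~%.0f MB/s\n", mb / secs);

    free(buf);
    exit(0);
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE);
    glutInitWindowSize(W, H);
    glutCreateWindow("readback test");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
[/code]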
 
Well, whatever reasons there may be to have high readback speeds, they're not going to be seen in games for some time. Regardless, the NV45 has purportedly taped out, and should be out pretty soon, so I don't know if the 6800 Ultra PCIe numbers are really that important.
 
Chalnoth said:
Well, whatever reasons there may be to have high readback speeds, they're not going to be seen in games for some time. Regardless, the NV45 has purportedly taped out, and should be out pretty soon, so I don't know if the 6800 Ultra PCIe numbers are really that important.

Wasn't it discovered that the NV45 is just the current NV40 with the bridge on the same package as the GPU, instead of in a separate package elsewhere on the PCB?
 
jvd said:
Chalnoth said:
Well, whatever reasons there may be to have high readback speeds, they're not going to be seen in games for some time. Regardless, the NV45 has purportedly taped out, and should be out pretty soon, so I don't know if the 6800 Ultra PCIe numbers are really that important.

Wasn't it discovered that the NV45 is just the current NV40 with the bridge on the same package as the GPU, instead of in a separate package elsewhere on the PCB?

Yup, the NV45 is just the NV40 with the HSI in the package.
 
Come on NVIDIA, you went from first class to trash with this

Just how low will NVIDIA stoop to try and sell their slot-blocking leaf blower? I never thought I would see NVIDIA resort to such desperate measures. It's a shame they had to lower themselves to this. I know the two companies trade punches, but this is a disgrace. :(
 
AlphaWolf said:
jvd said:
Chalnoth said:
Well, whatever reasons there may be to have high readback speeds, they're not going to be seen in games for some time. Regardless, the NV45 has purportedly taped out, and should be out pretty soon, so I don't know if the 6800 Ultra PCIe numbers are really that important.

Wasn't it discovered that the NV45 is just the current NV40 with the bridge on the same package as the GPU, instead of in a separate package elsewhere on the PCB?

Yup, the NV45 is just the NV40 with the HSI in the package.
So the NV45 that Chalnoth is referring to is actually going to be the same as the 6800 Ultra PCIe, except the bridge will be on the chip package rather than on the PCB... it'll be pretty much exactly the same, in other words? (I'm not trying to be a smart ass, I'm just checking to make sure I understand this right.)
 
So the NV45 that Chalnoth is referring to is actually going to be the same as the 6800 Ultra PCIe, except the bridge will be on the chip package rather than on the PCB... it'll be pretty much exactly the same, in other words? (I'm not trying to be a smart ass, I'm just checking to make sure I understand this right.)

That's how I read it in the other thread, though I have no clue where it was.

But as you can see, I'm not the only one to read it that way.

It's already taped out from what I understand, and will come out in late summer.

The NV48 or whatever will be the real refresh and will clock higher (maybe the NV45 will too, but I don't think that's the main goal of it).
 
Re: Come on NVIDIA, you went from first class to trash with this

ramfart said:
Just how low will NVIDIA stoop to try and sell their slot-blocking leaf blower? I never thought I would see NVIDIA resort to such desperate measures. It's a shame they had to lower themselves to this. I know the two companies trade punches, but this is a disgrace. :(

Let me explain, for all you guys who don't know, what the big deal is here.

ATi says it uses a native PCI Express interface because a bridge chip will increase latencies and thus decrease performance, so they are bashing NVIDIA because NV chose a bridge chip solution.
Now it looks like ATi does not use a native PCI Express interface but a bridge chip in the core package, or a PCIE-AGP bridge. So one can wonder why ATi is bashing someone for something they also do.

What is wrong with NV bringing this to the public? Because so many ATi fanboys' dreams are shattered? Because ATi can do no wrong?

This reminds me... with the release of the X800, ATi told the reviewers how to review the X800 and, more so, told them how to compare it to the 6800U. They gave directions in their docs on how to disable the brilinear optimisations in the 6800U and how to force true trilinear filtering on all stages on the 6800U. Now, with the recent discoveries, ATi itself has no true trilinear filtering on all stages. OK, with no IQ degradation, but that's not the point.
ATi is consistently deceiving the public, accusing NVIDIA of doing things ATi also does.

That is the big deal. It is not that hard to understand.
 
You're dreaming if you think these images are "proof" of what NVIDIA is claiming, since NVIDIA can no more interpret exactly what ATI is doing from these images than you or I can. Plus, the performance numbers linked to in this thread prove it's bunk: NVIDIA supposedly has a bridge that works at AGP 16x, but they are telling us ATI has an internal AGP 8x bridge, so NVIDIA's numbers should be faster, and yet it's ATI that has nearly twice the transfer rates. Looks like they shot themselves in the foot again.
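For reference, here's a back-of-the-envelope sketch of the theoretical peaks being argued about. These come straight from the published bus specs; nothing here is measured from either card:

[code]
/* Back-of-the-envelope peak rates from the published bus specs;
 * nothing here is measured from either card. */
#include <stdio.h>

int main(void)
{
    /* AGP: 32-bit (4-byte) bus on a ~66.66 MHz base clock,
     * times the transfer multiplier. */
    double agp8x  = 66.66e6 * 8 * 4 / 1e6;  /* ~2133 MB/s */
    double agp16x = agp8x * 2;              /* ~4266 MB/s, the "16x"
                                               rate claimed for the HSI */
    /* PCI Express x16: 250 MB/s per lane, per direction. */
    double pcie   = 250.0 * 16;             /* 4000 MB/s each way */

    printf("AGP 8x  peak:  ~%.0f MB/s (shared, one direction at a time)\n", agp8x);
    printf("AGP 16x peak:  ~%.0f MB/s (NVIDIA's claim for its bridge)\n", agp16x);
    printf("PCIe x16 peak:  %.0f MB/s in each direction simultaneously\n", pcie);
    return 0;
}
[/code]

If NVIDIA's bridge really runs at the 16x rate while ATI's alleged internal bridge runs at 8x, NVIDIA's readback ceiling should be roughly double ATI's, which is the opposite of what the linked numbers show.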
 
Your argument is stupid.

You are telling us not to look at this point because we should look at that other point. Why not look at both? Because one is not so good for your favorite IHV?

HaLDoL said:
Let me explain, for all you guys who don't know, what the big deal is here.

ATi says it uses a native PCI Express interface because a bridge chip will increase latencies and thus decrease performance, so they are bashing NVIDIA because NV chose a bridge chip solution.
Now it looks like ATi does not use a native PCI Express interface but a bridge chip in the core package, or a PCIE-AGP bridge. So one can wonder why ATi is bashing someone for something they also do.

What is wrong with NV bringing this to the public? Because so many ATi fanboys' dreams are shattered? Because ATi can do no wrong?

This reminds me... with the release of the X800, ATi told the reviewers how to review the X800 and, more so, told them how to compare it to the 6800U. They gave directions in their docs on how to disable the brilinear optimisations in the 6800U and how to force true trilinear filtering on all stages on the 6800U. Now, with the recent discoveries, ATi itself has no true trilinear filtering on all stages. OK, with no IQ degradation, but that's not the point.
ATi is consistently deceiving the public, accusing NVIDIA of doing things ATi also does.

That is the big deal. It is not that hard to understand.
 
Re: Come on NVIDIA, you went from first class to trash with this

HaLDoL said:
Let me explain, for all you guys who don't know, what the big deal is here.
Let me elaborate on your post...
ATi says it uses a native PCI Express interface because a bridge chip will increase latencies and thus decrease performance,
If you were an electrical engineer, you would know that this turns out to be true... the latency can be measured, but the impact may be too small for anyone to notice.
so they are bashing NVIDIA because NV chose a bridge chip solution.
I would not consider it bashing; it's just a fundamental understanding of electronics.
Now it looks like ATi does not use a native PCI Express interface but a bridge chip in the core package
You are mistaking a "chip" for an interface; a chip would imply that it is not in the GPU core.
or a PCIE-AGP bridge. So one can wonder why ATi is bashing someone for something they also do.
Refer to the previous comment.
What is wrong with NV bringing this to the public?
They stated that it was a bridge (think libel)... it could be a bridge, but then again it may not be.
Because so many ATi fanboys' dreams are shattered? Because ATi can do no wrong?
:rolleyes: I find it funny that most people resort to this argument (for either IHV).
This reminds me... with the release of the X800, ATi told the reviewers how to review the X800 and, more so, told them how to compare it to the 6800U. They gave directions in their docs on how to disable the brilinear optimisations in the 6800U and how to force true trilinear filtering on all stages on the 6800U. Now, with the recent discoveries, ATi itself has no true trilinear filtering on all stages. OK, with no IQ degradation, but that's not the point.
I would agree with you to a point here, but you seem to be refuting your own argument.
 
I believe ATi here. All NVIDIA has done is admit they can't figure out how ATi's solution works, even though they tried.
 
Sandwich said:
I believe ATi here. All NVIDIA has done is admit they can't figure out how ATi's solution works, even though they tried.
No, that's not what NVIDIA is saying. What NVIDIA is trying to do is sow the seeds of doubt, and distract people from their own bridged solution.

FUD in its purest form.

-FUDie
 
OK, I'm trying to sift through all the rubble here. What's the big deal about readback speeds? Are there any applications or games that make use of this, or will there be some time soon? Or is this just fodder for people to nail NVIDIA to the cross yet again for false advertising?
 
Re: Come on NVIDIA, you went from first class to trash with this

HaLDoL said:
ramfart said:
Just how low will NVIDIA stoop to try and sell their slot-blocking leaf blower? I never thought I would see NVIDIA resort to such desperate measures. It's a shame they had to lower themselves to this. I know the two companies trade punches, but this is a disgrace. :(

Let me explain, for all you guys who don't know, what the big deal is here.

ATi says it uses a native PCI Express interface because a bridge chip will increase latencies and thus decrease performance, so they are bashing NVIDIA because NV chose a bridge chip solution.
Now it looks like ATi does not use a native PCI Express interface but a bridge chip in the core package, or a PCIE-AGP bridge. So one can wonder why ATi is bashing someone for something they also do.

What is wrong with NV bringing this to the public? Because so many ATi fanboys' dreams are shattered? Because ATi can do no wrong?

This reminds me... with the release of the X800, ATi told the reviewers how to review the X800 and, more so, told them how to compare it to the 6800U. They gave directions in their docs on how to disable the brilinear optimisations in the 6800U and how to force true trilinear filtering on all stages on the 6800U. Now, with the recent discoveries, ATi itself has no true trilinear filtering on all stages. OK, with no IQ degradation, but that's not the point.
ATi is consistently deceiving the public, accusing NVIDIA of doing things ATi also does.

That is the big deal. It is not that hard to understand.
It's hard as hell for me to understand your spin on it, it really is. :rolleyes:
 
trinibwoy said:
OK, I'm trying to sift through all the rubble here. What's the big deal about readback speeds? Are there any applications or games that make use of this, or will there be some time soon? Or is this just fodder for people to nail NVIDIA to the cross yet again for false advertising?

Currently there isn't really anything, but applications could certainly be written to offload tasks onto the GPU, which might prevent the CPU from being a bottleneck.

I doubt you will see a game anytime soon which makes any use of it, but other applications tend to adapt much more quickly. If you check ATI/Pinnacle's performance figures for HDTV editing, you can see where the bandwidth helps. They're at the end of ATi's whitepaper on PCI Express; I'm too lazy to link it atm. :p
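To put rough numbers on that (the two readback rates below are made up purely for illustration; only the frame-size arithmetic is real):

[code]
/* Break-even sketch for GPU offload of an HDTV effect. The two
 * readback rates are hypothetical, picked only to illustrate why
 * the bus speed decides whether the offload pays off. */
#include <stdio.h>

int main(void)
{
    /* One 1920x1080 RGBA frame is about 7.9 MB. */
    double frame_mb = 1920.0 * 1080 * 4 / (1024 * 1024);
    double needed   = frame_mb * 30.0;  /* ~237 MB/s to stream
                                           results back at 30 fps */
    double slow_bus = 150.0;   /* hypothetical AGP-era readback  */
    double fast_bus = 1000.0;  /* hypothetical native-PCIe path  */

    printf("30 fps HDTV offload needs ~%.0f MB/s of readback\n", needed);
    printf("at %4.0f MB/s: %.1fx real time\n", slow_bus, slow_bus / needed);
    printf("at %4.0f MB/s: %.1fx real time\n", fast_bus, fast_bus / needed);
    return 0;
}
[/code]

In other words, at AGP-era readback rates the effect can't even stream back in real time, while a genuinely fast upstream path leaves headroom to spare.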
 