Anyone got any ideas (if it's true, of course...)?
http://www.theinquirer.net/17060213.htm
Our chap took a large amount of graphics data and sent it directly to the graphics processor (GPU), just to see what might happen.
What you would usually expect is that the GPU would write data out to memory while running its calculations. The chip ordinarily parks some data in graphics memory, since that memory can hold intermediate states of the calculations before they are returned to the GPU to produce a final result.
But the result of the experiment was quite surprising; at least it surprised us a lot. Our investigator didn't see any data transfer from chip to memory and there were no intermediate states. The result came straight from the chip.
There is only one logical conclusion we can draw from this experiment.
If you don't see any intermediate calculation states going out to memory, then the only explanation is that this particular GPU must have some kind of buffer - some memory built into the processor itself. So GeForce GPUs have some amount of memory inside the chip that is used as a buffer of sorts, and maybe as a cache as well.
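As an aside, the usual way to probe for an on-chip buffer like this is a working-set sweep: time repeated passes over buffers of increasing size and look for the point where the timing suddenly gets worse, which marks the capacity of the internal memory. Below is a minimal CPU-side sketch of that principle - my own illustration, not the Inquirer's actual test; the sizes, pass count and access pattern are all assumptions.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    /* Working-set sizes from 4 KB up to 4 MB, doubling each step. */
    for (size_t ws = 4 * 1024; ws <= 4 * 1024 * 1024; ws *= 2) {
        size_t n = ws / sizeof(size_t);
        size_t *buf = malloc(n * sizeof(size_t));
        if (buf == NULL)
            return 1;

        /* Link each slot to the next so a chase visits every element once
         * per pass.  (A randomised order would defeat hardware prefetch and
         * make the "knee" sharper, but this keeps the sketch short.) */
        for (size_t i = 0; i < n; i++)
            buf[i] = (i + 1) % n;

        /* Chase the chain for a fixed number of passes and time it. */
        clock_t t0 = clock();
        size_t idx = 0;
        for (int pass = 0; pass < 100; pass++)
            for (size_t i = 0; i < n; i++)
                idx = buf[idx];
        clock_t t1 = clock();

        /* Printing idx also stops the compiler optimising the loop away. */
        printf("working set %8zu bytes: %.3f s (idx=%zu)\n",
               ws, (double)(t1 - t0) / CLOCKS_PER_SEC, idx);
        free(buf);
    }
    return 0;
}

On a real GPU you would do the equivalent with textures or vertex data and watch the bus traffic instead of a timer, but the idea is the same: if nothing spills below a certain working-set size, that data never left the chip.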
Many Nvidia chaps we have talked to have said that we have been misled and that there is no memory buffer or cache on the processors themselves, but we have heard this sort of denial many times before.
Father of all GeForce cards, Nvidia chief scientist David Kirk, confirmed to us that GeForce 4 does have memory inside the chip for caching certain things, but he said that there is only a few KB of it (16 KB, we think he said). But I would say that 16 KB is not enough. I don't doubt that there is 16 KB for some calculations, but you need more than that for serious calculations, like the 1 MB used in the experiment here.
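To put my own numbers on that (my arithmetic, not the article's): 1 MB is 1024 KB, and 1024 KB / 16 KB = 64, so a 16 KB on-chip cache could hold only about a sixty-fourth of that data at any one time and would have to be spilled and refilled dozens of times - which is exactly the chip-to-memory traffic the experiment supposedly never saw.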