NVIDIA's Acquisition of MediaQ Discussion
I thought I'd discuss the exact implications of this deal here, and I'm sure many of you will do so too.
First of all, let me remind you that a PDA chip *is* in the works at nVidia ( and no, that isn't the NV33 AFAIK - or it might be, who knows, hehe ) - it's fairly obvious considering this acquisition, but I posted it at GPU:RW before that press release.
So, obviously, nVidia needs their low-power technology. But... Is the need for that low-power technology limited to PDAs and other devices?
It's fairly obvious nVidia is limited by heat in the NV35 ( and the NV30, since it needed Flow FX ) - and heat is very closely related to power consumption, as you hopefully all know...
So, let's look at this in an IP perspective: what patents does MediaQ currently own?
1. Method and apparatus to power up an integrated device from a low power state: http://patft.uspto.gov/netacgi/nph-P...diaQ&RS=MediaQ
2. Graphics engine FIFO interface architecture: http://patft.uspto.gov/netacgi/nph-P...diaQ&RS=MediaQ
3. Parsing graphics data structure into command and data queues: http://patft.uspto.gov/netacgi/nph-P...diaQ&RS=MediaQ
4. Programmable and flexible power management unit: http://patft.uspto.gov/netacgi/nph-P...diaQ&RS=MediaQ
Okay, so what do we see here?
2 patents related to general GPU stuff, things nVidia doesn't need *at all*.
2 patents related to power consumption, not limited to GPUs.
I believe the most interesting of these patents is the last one, the programmable and flexible power management unit.
Obviously, in current architectures, the use is clear: if a part of the pipeline ( VS, PS or Triangle Setup ) is stalled, do something like disable half of its units. Of course, that's the brute-force approach, and smarter techniques would work, well, better.
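To make that brute-force idea concrete, here's a toy sketch of such a policy. This is purely illustrative - the patent describes a *programmable* power management unit, not this exact rule, and all the names and numbers below are made up by me:

```python
# Toy model of the brute-force gating policy: when a pipeline stage
# (vertex shader, pixel shader, triangle setup) reports a stall, power
# down half of its execution units; restore them all when work resumes.
# Hypothetical names/values - not from the actual patent.

class PipelineStage:
    def __init__(self, name, total_units):
        self.name = name
        self.total_units = total_units
        self.active_units = total_units
        self.stalled = False

def apply_power_policy(stage):
    """Brute-force policy: halve the active units while a stage is stalled."""
    if stage.stalled:
        # Keep at least one unit powered so the stage can resume quickly.
        stage.active_units = max(1, stage.total_units // 2)
    else:
        stage.active_units = stage.total_units
    return stage.active_units

ps = PipelineStage("pixel_shader", 8)
ps.stalled = True
print(apply_power_policy(ps))   # 4 units powered while stalled
ps.stalled = False
print(apply_power_policy(ps))   # back to all 8
```

A smarter policy would scale units with measured utilization rather than flipping between two states, which is presumably where the "programmable" part of the patent comes in.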
In the future, however, we're going to see unified Pixel Shader, Vertex Shader, and so on. So, will this remain useful? Absolutely!
Moving to instruction-level parallelism does not prevent you from having unused circuits. For efficiency purposes, I believe we're also likely not to be able to run more than 3 or 4 programs at once in the GPU ( VS, PS and PPP programs, mostly, I suppose ) - so, good luck always using ALL units!
And there must be other uses I fail to grasp too, I guess. Which is obviously why feedback is appreciated!
Also, the first patent I linked here covers reducing/increasing clock speed. That might also be used to automatically increase/decrease the clock based on a goal FPS value. Such an implementation might actually be tricky, and it would be good to be able to deactivate it in the driver panel. But sometimes, I believe it could be a quite nifty feature, particularly for old games I guess...
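Roughly, a driver-side loop for this could look like the sketch below. Again, this is just my illustration of the idea, with made-up clock limits and step sizes, not anything from the patent or nVidia's drivers:

```python
# Hypothetical driver-side clock control toward a target FPS: if we render
# comfortably faster than the goal, step the clock down to save power; if
# slower, step it up. Clamped to the chip's rated range. All values invented.

MIN_CLOCK_MHZ = 150
MAX_CLOCK_MHZ = 450
STEP_MHZ = 25

def adjust_clock(current_mhz, measured_fps, target_fps):
    if measured_fps > target_fps * 1.1:   # well above goal: save power
        return max(MIN_CLOCK_MHZ, current_mhz - STEP_MHZ)
    if measured_fps < target_fps * 0.9:   # below goal: raise the clock
        return min(MAX_CLOCK_MHZ, current_mhz + STEP_MHZ)
    return current_mhz                    # within a 10% band: hold steady

# An old game running at 200 fps against a 60 fps goal: the clock steps
# down each frame until it hits the floor.
clock = 450
for _ in range(20):
    clock = adjust_clock(clock, 200, 60)
print(clock)  # 150
```

The dead band around the target is there to avoid the clock oscillating every frame, which is exactly the kind of "strangeness" you'd want a driver-panel switch to turn off.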
Obviously, MediaQ is also very important for PDAs and other small devices, and that's the primary point of the acquisition. They've got quite a bit of very nifty technology IMO, and I must admit that in the long term, I think nVidia made a worthwhile acquisition here. But then, who am I to say - an analyst?
Feedback, comments, flames, whatever?