It's squiggly lines with no particular sign of context or understanding; it makes comparisons to other architectures that it has neglected to profile, and above all else it failed to log something as fundamental as what clock and voltage steps were being used. It's hard to credit the squiggles to something they didn't keep track of. We certainly don't need to care about this as consumers, but it sure is interesting.
In particular, I wonder if previous GPUs exhibited this much variance within a single millisecond. It's very likely that there is a lot of microsecond-scale variation, but why stop at GPUs? Heck, why stop at chips made in this decade?
For reference, AMD's marketing for the 290 pointed out that its power control method could operate in 10 usec increments, so what are the odds that Hawaii would show the same signs of variation on an oscilloscope?
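To put a rough number on that, here's a minimal Python sketch; only the 10 usec granularity comes from AMD's marketing, and the one-millisecond window is simply the timescale under discussion, not anything measured:

    # Only the 10 usec control granularity is from AMD's 290 marketing claim;
    # the 1 ms window is just the timescale being discussed above.
    control_step_us = 10                        # claimed power-control granularity
    window_us = 1000                            # one millisecond, in microseconds
    adjustments = window_us // control_step_us  # chances to move the operating point
    print(adjustments)                          # -> 100 adjustments per millisecond

A controller with on the order of a hundred chances per millisecond to move the operating point would be expected to leave visible wiggles in a sub-millisecond scope trace.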
One additional nitpick: what's with the "die shot" being bandied about?
Is there entertainment value in putting some grayscale anonymous chip as a base layer and then playing space invaders on top?
Is this somehow preferable to AMD's method of *nothing*?