Could this be the worst GPU article ever written?

kyetech

I'm so sorry to degrade this fine forum, and I don't want to sensationalise the title of this thread either, but I'm shocked and appalled by this:

http://www.theinquirer.net/articles/postComment/gb/inquirer/news/2007/12/18/gpus-dying

I wanted to highlight how bad this article is and let people here just rip it apart; it deserves nothing less... good god!

I left my comment on the site (not sure if it will be allowed), but this is the comment I submitted on that article:

-------------------------------
The whole premise of this article is a joke; it sounds like the applied logic of a child (or a very ignorant adult).

It's so flawed on so many levels that I can't even begin to pull it apart. I know what I'll do: I'll start a thread on Beyond3D and let the laughs begin. I can't wait to see the responses from over there about this ridiculous article.

The only problem is, I might get flamed for starting a thread about a subject so technically incompetent that I'll be accused of degrading the quality of the forum.

This truly is a low point for the Inq.
----------------------------------
 
I'm not sure why Charlie brought up the bit precision argument concerning the performance of GPUs.
I could make a programmable calculator perform calculations at a greater bit precision than the eye can discern.

I think he's seriously underestimating the gap between what we have now and the "good enough" far future he's contemplating.

As for the complaint about the "upgrade treadmill", where each new GPU has to find new ways to be better than the last one and not lag behind the competition: welcome to the world of semiconductors as a profit-making venture.
Seriously, what does he think Intel's been doing for decades?

edit:
Maybe this is some kind of modern art in text form. It really strikes me as a rather superficial argument. Perhaps he's using irony in a subtle characterization of historical arguments, like the patent official in the 1800s who declared all invention over.
How else can one argue that something is doomed due to its sheer awesomeness?
Maybe he's trying to stir up discussion, most notably about himself, as opposed to the numbing silence after he got the Barcelona stuff horridly wrong.
 
Agreed, and I can safely say I wasn't that kind in my e-mail correspondence with him. Oh, and he's dead serious about everything he said in there (he does admit to having made a few minor mistakes, but claims I'm the one not seeing the 'big picture'). Ugh...
 
If a modern GPU works in 32-bit depth most of the time, with some capable of 64, 128-bit, aka extreme overkill is a factor of four GPU power away. To put that in doubling periods, it is two generations away to get from 32b with current frame rates to 128b with the same frame rates. Based on the numbers above, that would be a year if GPU makers wanted to go there and end that argument once and for all.
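Just to spell out the arithmetic being leaned on there, here's a rough sketch in Python that takes the quoted assumptions at face value (a doubling of GPU power per generation, two generations to a year) rather than endorsing them:

import math

# The passage's own claims, not measured facts:
#   32-bit -> 128-bit supposedly needs "a factor of four GPU power",
#   GPU power doubles once per generation,
#   and two generations fit in roughly a year (~6 months each).
start_bits, target_bits = 32, 128
factor = target_bits / start_bits            # 4.0x
doublings = math.log2(factor)                # 2 doubling periods
months = doublings * 6                       # ~12 months, i.e. "a year"
print(f"{factor:.0f}x -> {doublings:.0f} doublings -> ~{months:.0f} months")

Of course the whole thing hinges on treating bit depth as if it scaled one-for-one with required GPU power, which seems to be exactly the leap being laughed at here.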

OMG!
 
I love Charlie. Reading his articles is like being at the circus while enjoying the comfort of my home or office.
 


What do you want from a guy who sleeps with ATI/AMD?
 
Someone needs to get Charlie an SLI rig with Crysis... STAT!

I'm sure the article will quickly be pulled after that experience. :D
 
The thing about publications like The Inquirer or News of the World is that creating new and controversial content is the only thing that really counts. A lot of website writers are freelancers paid on a (low) per-word commission, so quality is completely optional.

Once in a while he's able to hide his technical ignorance, but this article was a nice reminder of it, and highly entertaining at that. I'd say Charlie earned his commission.
 
Charlie's stories usually indicate a certain level of technical familiarity, at least when it comes to CPUs.

This story is unusually shallow in its statements, despite the amount of text devoted to it.

As for Charlie's AMD bias (possibly to the exclusion of ATI, going by this story), I think there may be an asymmetry in his sources. He looked like a sap because of the Barcelona/Phenom stepping debacle a few months back, though nothing of a similar inside nature has occurred with Intel.
He was probably getting jerked around by internal politics and rivalries at AMD, filtered through one low-level or peripheral source. It seems Charlie's sources at Intel, if there are any, are either more circumspect or not as close to the action in the larger company.

Perhaps Charlie's AMD source was someone in marketing, though possibly someone in the CPU division.
If anyone has a grudge against GPUs, I'd bet it's someone in that group, or someone threatened by ATI: perhaps a marketer afraid of being supplanted by an ATI counterpart, some unhappy exec (or assistant), or (tongue-in-cheek, I hope) someone on AMD's CPU mask team who just found his workload doubled by Fusion.
 