X-Box hacker interview

Don't we have anything better to say about this interview than to talk about Nintendo?

I mean, this guy says a lot of interesting things, and for sure a lot of controversial ones...

And most of you talk about framerates, his supposed agenda (of course, he MUST have one for saying such things...), Nintendo, fanb*yism...

Doesn't he deserve more respect than being treated as a casual fanb*y?

Is this interview so disturbing for your minds?
 
Disturbing? No. The guy just doesn't know what he's talking about and is throwing his nonsensical opinions around.

Why would YOU take his opinion on anything when there's so much information to the contrary from actual game developers? What reason would you have for defending anything he's said in that article, unless you WANTED to believe in what he's saying?

People who don't know what they are talking about will usually not receive any "respect". Respect is earned, not given. This bloke hasn't earned anything with this article.
 
It's funny that even after all the benchmarks and all the writing by developers to the contrary, people will jump right back on the "Xo8x si teh suxor" bandwagon at any opportunity.
 
It's a good interview, but what bothers me both about this interview and about Bunnie's original security paper is Bunnie's habit of arguing "fact A is demonstrably true, therefore unrelated assertion B, based on my personal feelings, must also be true".

Bunnie's a good hardware hacker, but why should I be persuaded by his opinions about console system design? Or Microsoft? Or about OS software?

When he's arguing, Bunnie tends to make things up, like "...they cost nVidia untold millions of dollars in scrapped chips...". But there's no evidence that any chips were scrapped. It took six months from when the Xbox was first hacked until the new Xboxes appeared on the market. That's plenty of time for the stockpile of old chips to be used up.

I'm really surprised that Bunnie could get a PhD at a place like MIT, with its high academic standards, and still be so intellectually lazy and misleading in his arguments. Surely he knows better.
 
duffer said:
When he's arguing, Bunnie tends to make things up, like "...they cost nVidia untold millions of dollars in scrapped chips...". But there's no evidence that any chips were scrapped. It took six months from when the Xbox was first hacked until the new Xboxes appeared on the market. That's plenty of time for the stockpile of old chips to be used up.

Ehmm, didn't NVIDIA actually say this themselves?
 
duffer said:
When he's arguing, Bunnie tends to make things up, like "...they cost nVidia untold millions of dollars in scrapped chips...". But there's no evidence that any chips were scrapped. It took six months from when the Xbox was first hacked until the new Xboxes appeared on the market. That's plenty of time for the stockpile of old chips to be used up.
Actually, I clearly remember NVIDIA saying in some press release or quarterly report that MS had caused them huge losses by forcing them to scrap a lot of chips. I don't even know where to start looking for that one, though.
 
Nah, you guys don't know how to listen to what companies say. :)

NVIDIA said something like "We lost X million dollars this quarter, partially due to scrapping Xbox parts." But they were being disingenuous, because they didn't break out what percentage of the loss was due to the Xbox, and what percentage was due to other, more embarrassing factors, such as scrapping their inventory of poorly selling nForce PC motherboard chipsets.
 
duffer said:
Nah, you guys don't know how to listen to what companies say. :)

NVIDIA said something like "We lost X million dollars this quarter, partially due to scrapping Xbox parts." But they were being disingenuous, because they didn't break out what percentage of the loss was due to the Xbox, and what percentage was due to other, more embarrassing factors, such as scrapping their inventory of poorly selling nForce PC motherboard chipsets.

Ahh, you might be the man I have been looking for. Since you have insider information, maybe you can tell me the number of chips sold in the different NVIDIA generations, starting with the GeForce 1.

I'm VERY interested in knowing how many gaming rigs/PCs are actually out there.
 
Who says I have insider information? I just surf the web like anyone else.

I think you might be able to buy that information in the form of a report from Microprocessor Reports or Gartner Group or somebody like that.

You might also figure it out yourself by reading the NVIDIA annual reports and doing some calculations. You would have to make some assumptions about average selling price, but I bet you could come pretty close to the truth.
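That back-of-envelope calculation might look something like this; note that the revenue and ASP figures below are made-up placeholders purely for illustration, not actual NVIDIA numbers.

```python
# Rough estimate of units shipped from reported segment revenue and an
# assumed average selling price (ASP). All figures are hypothetical
# placeholders, not real NVIDIA financials.

def estimate_units(segment_revenue_usd: float, assumed_asp_usd: float) -> float:
    """Units shipped is roughly segment revenue divided by ASP."""
    return segment_revenue_usd / assumed_asp_usd

# Hypothetical example: $400M of GPU revenue at an assumed $40 ASP.
units = estimate_units(400e6, 40.0)
print(f"~{units / 1e6:.1f} million chips")  # ~10.0 million chips
```

The answer is obviously only as good as the ASP assumption, which is why the annual-report approach can only get you "pretty close to the truth" rather than exact figures.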
 
The guy clearly doesn't understand economies of scale either.

Part of the reason for including the inefficient x86 design is that it's mass-produced to such a high degree that MS could effectively include much higher-performing x86 parts than they could if they custom-built their processor.

Quality/price is the key factor that MS optimized for. Not quality, not price, but the ratio of the two.

So when we see competitor parts with much lower clocks, though far more efficient, one wonders who actually ended up with the last laugh.

See benchmarks for details.
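The quality/price argument above can be sketched as picking the part with the best performance-per-dollar rather than the best raw score; the part names and numbers below are entirely hypothetical.

```python
# Illustrates optimizing for the performance/price ratio rather than for
# raw performance or raw price alone. All parts, scores, and costs are
# invented for illustration.

parts = [
    # (name, benchmark score, unit cost in dollars)
    ("custom_lowclock", 100, 90),  # more efficient, but costly custom silicon
    ("commodity_x86",    90, 30),  # mass-produced part, cheap at scale
]

def perf_per_dollar(part):
    name, score, cost = part
    return score / cost

best = max(parts, key=perf_per_dollar)
print(best[0])  # the commodity part wins on the ratio despite a lower score
```

Under these assumed numbers the mass-produced part "loses" on raw performance but wins by a factor of nearly three on the ratio, which is the point the post is making about economies of scale.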
 
Because he is a hardware hacker, this guy is supposed to be an expert on gaming?

I hardly play any games on my video game consoles--
the most challenging and addictive game for me is hacking them.

He doesn't have an axe to grind with MS?

If you believe the historical documentaries such as "Hackers" by Steven Levy, Bill is almost single-handedly responsible for destroying any real innovation in the software world, in a manner very similar to the way Intel destroyed any real innovation in computer architecture. (I had a long flame here to back up that point, but I cut it because I've already written too much...)

It also really irks me that Microsoft released such a sub-par piece of hardware for the Xbox. Almost every person I've talked to who does circuit boards or consumer products has agreed that the Xbox is really a steaming pile of dung.

It makes me sad that Microsoft is buying its way into such an important market and lowering the bar like this.

This is despite the very clear hardware advantage that the Xbox has proven it has. In terms of them losing money, who does that hurt? MS. Who does it help? The consumer, if anyone. When you buy an Xbox you get more in terms of hardware than you do buying the other two consoles.

I personally know dozens of unhappy Xbox users who have had to return their units, or have bought "new" units that were actually refurbs. In contrast, I know of not a single Gamecube or Playstation2 owner who has had to return their console.

Being a hardware hacker at MIT, explaining why people try to hack the Xbox because of its capability to do more, and then claiming that he knows of people who returned theirs... should it surprise anyone?

This guy is very clearly biased against MS, and the Xbox for that matter. His largest complaint with the box is the design philosophy used, not the end results.
 
A successful console creates a pretty large-scale market in its own right.

The timeframe for release determined what they could use, and a knock-off of the x86 PC architecture was really the only thing compatible with it.

Even if it was the right decision for this gen (which I think it was, IF you assume the console had to be developed in such a short timeframe), using slightly customised existing IP meant for x86 PCs will not work as well for their console next time. Sony made a lot of mistakes with the PS2, which I doubt they will repeat with the PS3. This time embedded memory will not force them to use a castrated rasterizer, whereas PC graphics architectures still won't be able to use embedded memory due to the vastly higher resolutions they need.

Unless Sony ballses it up bigtime, a competitive architecture from the PC realm will need to be much more costly, unless we see a shift to tilers there (maybe IMG will finally get off its ass and get with the cutting edge ;). Also, the further we go in transistor counts, the more diminishing returns hurt serial architectures relative to concurrent architectures, so there is a larger hurdle to overcome for the main processor as well.

If they plan to just pull the same stunt again next time and ask the market "who can cobble together some of their existing nearly complete designs into a console for us in 1-2 years", they are going to have to take even bigger losses on the hardware to be competitive against Sony (in the market; assuming Sony leads them again in release, that means that, just like this time, it has to be a lot better... not just a little).
 
randycat99 said:
How does he get that latency is 10x slower than the competition? That is a pretty specific statement. Is it really true?

It might be 10x worse, in theory, than the 1T-SRAM in the GC, but it's probably better than the PS2's.

The Xbox has a decent-sized L2 cache, and that probably also helps hide a lot of the stalls.
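The "cache hides the stalls" point is just the classic average memory access time (AMAT) calculation; the cycle counts below are assumed round numbers for illustration, not measured Xbox figures.

```python
# Average memory access time: a cache with a decent hit rate hides most
# of main-memory latency. Cycle counts here are illustrative assumptions,
# not measured Xbox numbers.

def amat(hit_time: float, miss_rate: float, miss_penalty: float) -> float:
    """Classic AMAT formula: hit time + miss rate * miss penalty."""
    return hit_time + miss_rate * miss_penalty

# Assumed: 10-cycle L2 hit, 150-cycle round trip to UMA main memory.
no_cache   = amat(0, 1.0, 150)    # every access pays the full memory latency
with_cache = amat(10, 0.05, 150)  # 95% of accesses hit in L2
print(no_cache, with_cache)  # 150.0 vs 17.5 cycles on average
```

Even a mediocre hit rate cuts the average dramatically, which is why a modest L2 can mask a lot of raw memory-latency disadvantage.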
 
It's been proven by many HW sites that DDR has much lower latency than Rambus. Even if the UMA architecture creates stalls, think about the PS2 and its DMAing of stuff (assuming it's properly debugged) across multiple buses with piddly 4 KB / 16 KB caches.

Bunnie's a smart guy, but he's not a game developer. He's made a number of posts over at xboxhacker.net, and he's prefaced many of them with "...since my lawyers don't want me talking about X or Y...". Yeah, right, no grudge with MS :LOL:

Bunnie's ideal console would probably include all sorts of whacked-out theoretical hardware hand-picked out of textbooks and lectures that developers wouldn't have a clue where to start with.

zurich
 
Actually, I've heard the dual Rambus memory channels in the PS2 are actually quite good in throughput and low in latency. This is unlike a typical i850 chipset arrangement, where dozens of Rambus modules must be interconnected and switched to: an entirely different situation as far as latency goes.

Also, I was under the impression that the CPU in the Xbox has an L2 size akin to a Celeron's: 128 KB, which is hardly generous, and about as tiny as you can get away with, barring not having one at all.
 
randycat99 said:
Also, I was under the impression that the CPU in the Xbox has an L2 size akin to a Celeron. That is- 128 KB, which is hardly generous, and more aptly about as tiny as you can get away with, barring not having one at all.

It may be small, but from Faf's complaints that I've read elsewhere, the PS2 is even worse off in this respect regarding CPU and memory access.
 
NV2A is supposed to have 256 KB of L2 cache too, no?

And even if the PS2's incarnation of Rambus is done well, its serial nature should still make it a dog for latency. (eDRAM is another matter entirely.)

zurich
 
Hey, I guess Faf would do well to have a talk with the Bunnie then, right?

I noticed that no one here has bothered to give this "hacker" guy his due props, just excerpts from the article and bashes, simply because he isn't stroking your Xbox egos (as if that is a prerequisite to be listened to these days). This guy is a PhD grad from MIT. He worked at SGI. I'd say he tends to know something. His word on the subject of computer architectures is certainly worth something over your average poster here who just luvs their Xbox. I know there are a lot of developers here as well; reasonably, you should hold them both at a similar level of credibility.

He isn't just some hacker guy living in Mom's basement who thinks he is badass cuz he can build his own computer for cheap, can tweak Windows to run like a top, and can be a master fragger in Quake. If you read the article in its entirety (not just the Xbox parts, as some of you have quoted exclusively to make a point), you get the impression that this is a more than reasonable person who has his fair share of eccentricities as a computer architecture/hardware genius. He is easily as "hardcore" as some of you consider yourselves to be.

So what if he shows a preference for OS X as a platform? You know he has seen everything in his area of work. He still does his major work using Windows tools. He didn't just up and decide to hate M$ one day. You would think that if a person can have that much exposure, yet still hold a fond eye for a certain Unix OS, that means something. You may not agree with him, but he is certainly entitled to his own preferences.

You also get the impression that M$ themselves have a certain level of respect for him and what he knows. They work with him instead of just nailing him to the wall for messing with them. They know that he would be the wrong guy to piss off and cause to form a grudge by dealing with him harshly. A hacker is M$'s worst enemy. A master hacker with a PhD is really the wrong person for them to piss off.
 
zurich said:
...its serial nature should still make it a dog for latency. (eDRAM is another matter entirely)

zurich

What is that based on? Is such a broad generalization even worthwhile over just not saying anything at all (the fact is, it seems to work just fine in practice in a PS2, so what's the big deal)? Should we caption everything with the prevailing generalization: Celerons are cut-down versions of a "real" CPU, parallel memory schemes are doomed at elevated clock rates and packaging, your rickety, overly complex Windows install is about a month away from needing a reinstall to work right as usual, all 4-banger engines are weak, all V8s are low revvers, American cars can't turn... or, in the end, does it all depend on the implementation?
 