bit-tech Richard Huddy Interview - Good Read

You mean compared with other PhysX (TWIMTBP) titles?

No. They just wanted to make Batman fancier, with more PhysX effects than exist in the shipping game. Time just ran out.

Chris
 
Got a good explanation for why PhysX doesn't even max out a single CPU?

-Charlie

For the same reasons that games without PhysX don't max out a single CPU.
1. The game/engine is not limited by the CPU, and therefore spends time (which isn't shown as processor time) waiting for the blocking component. Most likely the GPU.
2. Bad multi-core code that blocks execution in an unfortunate way. Threads that are blocked are not shown as processor time either.
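Both cases come down to the same thing: a thread the OS has parked while it waits accrues wall-clock time but almost no CPU time, so it never shows up in a CPU-usage graph. A minimal Python sketch (illustrative only; the names are made up, and the event here stands in for whatever the engine is actually blocking on, e.g. a GPU fence):

```python
import threading
import time

def blocked_worker(evt):
    # wait() parks the thread in the OS: it burns no CPU cycles while blocked
    evt.wait()

evt = threading.Event()
t = threading.Thread(target=blocked_worker, args=(evt,))

cpu_before = time.process_time()    # CPU time charged to this process
wall_before = time.monotonic()      # wall-clock time
t.start()
time.sleep(0.5)                     # main thread is blocked too
evt.set()
t.join()
cpu_used = time.process_time() - cpu_before
wall_used = time.monotonic() - wall_before

# Roughly half a second of wall time passes, yet almost no CPU time is charged
print(f"wall: {wall_used:.2f}s, cpu: {cpu_used:.3f}s")
```

So a game can feel "slow" while Task Manager shows a mostly idle CPU: the time is spent blocked, not computing.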
 
PhysX = proprietary NVIDIA
Havok = proprietary Intel (which is why AMD dropped it in the end... after much PR, again, and nothing to show)

Then we have Bullet Physics, which from what I gather has only been used in "indie games"... and is now the hope of AMD... (until they pick something else).

Except that as far as we know, ATI is still assisting Havok with porting to OpenCL. So they haven't exactly dropped it.

[Edit] Replying to your later post: it's business. Businesses usually aren't so petty as to snub their competition when the competition has something they can work on together. MS and Sony still work on projects together even though they are direct competitors. MS still works with Apple. And AMD is still working with Intel (especially with regard to Havok).

And to the one above this one: yup, voting with the wallet. Not a single person I know in real life has purchased Batman AA, due to the vendor lock-in. And many have stated to me that they will be skipping at least one generation (or more) of Nvidia hardware, even if it completely blows ATI away, in the hope that Nvidia gets the message that they would prefer open standards/stuff that works cross-vendor.

It's a bit of a sea change around here, as many of these same people have never owned anything other than Nvidia cards.

Regards,
SB
 
Except that as far as we know, ATI is still assisting Havok with porting to OpenCL. So they haven't exactly dropped it.

They still talk, yes... but after Intel canned "Larrabee" they have no interest (ATM) in GPU physics.
The problem is that they have been talking since 2006... and that is it.

[Edit] Replying to your later post: it's business. Businesses usually aren't so petty as to snub their competition when the competition has something they can work on together. MS and Sony still work on projects together even though they are direct competitors. MS still works with Apple. And AMD is still working with Intel (especially with regard to Havok).

I know that the IT-world is not black&white.

But Intel is still a competitor to AMD, and with "Larrabee" on the back burner... and AMD's "Fusion" on the horizon, you can easily see why Intel would let AMD hang.

And to the one above this one: yup, voting with the wallet. Not a single person I know in real life has purchased Batman AA, due to the vendor lock-in. And many have stated to me that they will be skipping at least one generation (or more) of Nvidia hardware, even if it completely blows ATI away, in the hope that Nvidia gets the message that they would prefer open standards/stuff that works cross-vendor.

It's a bit of a sea change around here, as many of these same people have never owned anything other than Nvidia cards.

Regards,
SB

I fail to see the relevance to this topic (which is AMD PR, even making false claims)?
 
I fail to see the relevance to this topic (which is AMD PR, even making false claims)?

I should have put that into a new post; it was a reply to a post above yours, where the person urged people to vote with their wallets rather than complain if a company is doing things some people view as harmful to the industry.

As to Havok: it is a wholly owned subsidiary of Intel as far as I'm aware, which means it makes its own profits regardless of what Intel does. Whether Larrabee is ever released or not, Havok as a company still makes a profit, and for Intel to abandon it would be to throw away their investment as well as a continuing revenue stream.

If Intel truly felt Havok was no longer of use to them, then they'd be better off spinning it off, finding a new buyer, or letting Havok buy itself out.

Regards,
SB
 
I should have put that into a new post; it was a reply to a post above yours, where the person urged people to vote with their wallets rather than complain if a company is doing things some people view as harmful to the industry.

As to Havok: it is a wholly owned subsidiary of Intel as far as I'm aware, which means it makes its own profits regardless of what Intel does. Whether Larrabee is ever released or not, Havok as a company still makes a profit, and for Intel to abandon it would be to throw away their investment as well as a continuing revenue stream.

If Intel truly felt Havok was no longer of use to them, then they'd be better off spinning it off, finding a new buyer, or letting Havok buy itself out.

Regards,
SB

They put "Larrabee" on the back burner... they haven't scrapped their plans to eventually go into the GPU market altogether. (AMD's "Fusion" is the reason for that.)

Their venture into the GPU market is why they acquired Havok in the first place, and I'll bet they hold on tight... the Intel way.
 
Since Intel acquired Havok (and just let AMD hang in the wind... they are not friends, you know)... and since AMD announced this last year:
http://www.amd.com/us/press-releases/Pages/amd-announces-new-levels-of-realism-2009sept30.aspx
But we are still where we were in 2006... all words and nothing to show.

Which is why Huddy really shouldn't have bent the facts like that.

So the demo of Havok running on ATI hardware at GDC last spring (March 2009?) was not real? Damn, I wonder how they worked together to port that then? In fact, I wonder why the Intel people in the audience were so friendly and happy?

Intel and ATI are working very closely on Havok last time I looked, and it makes a lot of sense. The last round of Physics announcements I read had both Havok and Bullet in them, but there could have been others. I have not heard about a falling out from either side, it is in their best interests to work together.

-Charlie

Edit: Found the link I was looking for.
http://www.theinquirer.net/inquirer/news/1051673/ati-physics-argument

And the official line:
http://www.amd.com/us/press-releases/Pages/amd_demonstrates_optimized-2009mar26.aspx

You either are so out of the loop it isn't funny, don't bother to look for contradictory info, or have an axe to grind.
 
Even Nvidia and Amd work together sometimes

"In a normal class certification hearing, the plaintiffs revealed an email between Nvidia senior VP of marketing, Dan Vivoli, and ATI’s president and COO, Dave Orton, which points to inflated prices and collusion. It reads: “I really think we should work harder together on the marketing front. As you and I have talked about, even though we are competitors, we have the common goal of making our category a well positioned, respected playing field. $5 and $8 stocks are a result of no respect.”"
 
Even Nvidia and Amd work together sometimes

"In a normal class certification hearing, the plaintiffs revealed an email between Nvidia senior VP of marketing, Dan Vivoli, and ATI’s president and COO, Dave Orton, which points to inflated prices and collusion. It reads: “I really think we should work harder together on the marketing front. As you and I have talked about, even though we are competitors, we have the common goal of making our category a well positioned, respected playing field. $5 and $8 stocks are a result of no respect.”"

Another Dave? I've got the 'Clone Wars' vibe going on here. So is it multiple clones of the one 'Dave' or is it one 'Dave' doing multiple jobs?

Anyway, is it legal to work together on industry-wide marketing like that so long as they don't cooperate on price? Other industries have worked together on standards, and that's fine so long as it's not anticompetitive, right?
 
It's amazing that you continue to completely ignore the fact that Havok on its own is a profitable venture. If it were costing Intel money, I could see them shuttering it or, far more likely, trying to find a buyer for it.

But Intel is a business. They make money because they generally follow good business practices.

Havok is providing a positive revenue stream. Intel won't be shuttering it or otherwise changing any aspect of Havok, especially since Havok is relatively independent of Intel.

No one will argue that Intel was most likely hoping to use Havok to help push Larrabee. But to think that Intel will suddenly attempt to sabotage Havok in order to spite the rest of the industry? That's just so far out in left field, I don't even know how to comment on it.

In fact, if you are going to use emotions to guide business practices, as you seem to be advocating, Intel would be far more likely to partner with AMD with regard to Havok in order to spite Nvidia, who have been directly provoking Intel. Fortunately, Intel is a business. And things like that won't factor into it.

Havok makes money. Havok continues to make money. Thus business as usual.

Regards,
SB
 
They make money because they generally follow good business practices.

Curiously, the FTC, the EU, AMD, and Nvidia seem to be in rare agreement that the above statement is questionable.

Havok is providing a postive revenue stream.

Like software modems, sound, vertex processing and virtualisation, Havok is just another reason to sell quad-core CPUs.
 
Intel's only (real) interest in Havok is "Larrabee"-related.
Funny how all your arguments are pre-"Larra-gate".
It's not only Larrabee-related. They want Havok to maintain dominance in the physics market, because then they can use the CPU implementation to their advantage over AMD's CPUs.

If that means making their CPU look weak compared to a top of the line AMD GPU in certain situations, so be it. For the majority of the market, though, they only care about making their CPU look better than AMD's CPU.
 
This is good stuff. nVidia just keeps stepping in it.

http://www.xbitlabs.com/news/multim...ling_Multi_Core_CPU_Support_in_PhysX_API.html

But in nVidia's defense... at least they have something. The following video is from '06, running on an X1600. The 58xx series should be a monster at physics :cry: Epic FAIL at the end of the video with the "end of this year" promise. We don't even have a flipping demo yet...

Yeah, unfortunately ATI is epic at making promises and never fulfilling them... but hey, they haven't joined the likes of DNF/BB... yet.
 
I don't get this part of the interview:
R.Huddy said:
bit-tech: Given Intel's approach to using Intel Architecture (IA) in Larrabee, and as an x86 company yourself, do you think it's because Intel are using IA specifically that it's the problem?

RH: They really have a whole host of problems: some of which I wouldn't want to describe in too much detail because it points them in the right direction and they've got their own engineers to see that. The x86 instruction set is absolutely not ideal for graphics and by insisting on supporting the whole of that instruction set they have silicon which is sitting around doing nothing for a significant amount of time.

By building an in-order CPU core - which is what they are doing on Larrabee - they are going to get pretty poor serial performance. When the code is not running completely parallel their serial performance will be pretty dismal compared to modern CPUs. As far as I can tell they haven't done a fantastic job of latency hiding either - it's hyperthreaded like mad and has a huge, totally conventional CPU cache. Well it shouldn't come as a big surprise that it's simply not what we do. Our GPU caches simply don't work like CPU caches and they are for flushing data out of the pipeline at one end and preparing it to be inserted at the other - a read and write cache to keep all the GPU cores filled. One large cache and lots of local caches for the cores is not a great design. On top of which it doesn't actually have enough fixed function hardware to take on the problem as it's set out at the moment, so it needs to be rearchitected if Intel is to have a decent chance of competing.
(The bold part) I don't get it: isn't it obvious? Would it even be possible to preserve throughput and improve serial performance in a chip the same size as Larrabee? Is it a bit of free bashing, or is he suggesting that since Larrabee was never going to compete with today's GPUs (and Intel should have known it), it might have made sense to sacrifice some throughput for a more balanced chip (which, by the way, would make the choice of a plain x86 ISA more relevant)?
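For a sense of why Huddy harps on serial performance, here is a quick Amdahl's-law estimate (a back-of-the-envelope sketch, not anything from the interview; the 10% serial fraction and 32-core count are made-up illustrative numbers):

```python
# Amdahl's law: overall speedup = 1 / (s + (1 - s) / n),
# where s is the serial fraction of the work and n is the core count.
def speedup(serial_fraction, n_cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# 32 cores with a 10% serial fraction: speedup caps out near 7.8x, not 32x.
print(round(speedup(0.10, 32), 1))   # 7.8

# If slow in-order cores make the serial part take twice as long,
# the effective serial fraction roughly doubles and the ceiling drops again.
print(round(speedup(0.20, 32), 1))   # 4.4
```

So "pretty dismal" serial performance doesn't just slow down the serial bits; it lowers the ceiling on what all those parallel cores can deliver.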
 
It's amazing that you continue to completely ignore the fact that Havok on its own is a profitable venture. If it were costing Intel money, I could see them shuttering it or, far more likely, trying to find a buyer for it.

But Intel is a business. They make money because they generally follow good business practices.

Havok is providing a positive revenue stream. Intel won't be shuttering it or otherwise changing any aspect of Havok, especially since Havok is relatively independent of Intel.

No one will argue that Intel was most likely hoping to use Havok to help push Larrabee. But to think that Intel will suddenly attempt to sabotage Havok in order to spite the rest of the industry? That's just so far out in left field, I don't even know how to comment on it.

In fact, if you are going to use emotions to guide business practices, as you seem to be advocating, Intel would be far more likely to partner with AMD with regard to Havok in order to spite Nvidia, who have been directly provoking Intel. Fortunately, Intel is a business. And things like that won't factor into it.

Havok makes money. Havok continues to make money. Thus business as usual.

Regards,
SB

I just find a few oddities in that: first AMD starts talking Bullet physics after having talked (keyword: talked) Havok GPU physics for years (since 2006), and then three months later Intel spills the beans about "Larra-gate" (the delay of "Larrabee").

And Havok still makes money from its CPU software business... but that is not what I am talking about (the GPU side of things).
 
Take it to PM, or to the thread I made for those that want to solve their issues with Charlie. Please.
 