Nvidia losing influence due to PS3 involvement?

russo121 said:
Drak said:
I'm just saying that NVIDIA does not have to suck up to Microsoft so much. When Microsoft jumps to DXNext, it will need the support of the only two main hardware players left.

I haven't asked this in a long time, but here we go: "Have you been smoking some hallucinogens?"

Microsoft needs support from ATI and Nvidia? Or is it the other way around? I think ATI and Nvidia are the ones licking Microsoft's boots all the time... Do you know how many computers in the world have Windows installed? As said before, ~99%...
Just a personal question... Do you have Linux installed, and is it your primary OS? :rolleyes:

The point was (I think) that Microsoft does need ATI's and NVIDIA's support for their new 3D tech. That is a fact. If they do not support it, developers will not use it. I mean, what alternatives do MS, developers and consumers have? S3? XGI? Yeah, right!

While MS has enormous power over ATI and NVIDIA, the two companies also have enormous power over MS, at least where 3D is concerned. The thing that saves MS is the fact that ATI and NV are fierce competitors. If one of them decided not to support the latest MS tech, it wouldn't achieve much; it would just end up hurting the company in question, since the other company has more than enough know-how and market share to make that tech successful. But if both of them agreed not to support some new MS tech, it would be MS that got screwed. Or are you SERIOUSLY suggesting that XGI or S3 have the necessary know-how and market share to achieve anything? I doubt it. Why would consumers buy anything from S3 or XGI, when the performance, drivers and image quality would suck royally?

And yes, I do have Linux installed. Right alongside W2K. Does that fact somehow make my comment not valid?
 
nVidia will always continue to support DirectX. It's how they got to be where they are today.

The Riva128 was popular with OEMs because it worked better with DirectX than other chips at the time.

Combine that with feature leadership and strong performance (sometimes best-in-class performance, but not always) and you have a winning combination.

Anyone who thinks nVidia will depart from this winning combo without good reason is, quite frankly, crazy.

The good thing about nVidia is that they provide not only feature leadership, but also great OpenGL support and great platform support.
 
Nemesis77 said:
The point was (I think) that Microsoft does need ATI's and NVIDIA's support for their new 3D tech. That is a fact. If they do not support it, developers will not use it. I mean, what alternatives do MS, developers and consumers have? S3? XGI? Yeah, right!

Microsoft only needs *one* of those companies to support any new hardware/API requirements. That's why MS can play Nvidia and ATI off against one another. If either one fails to toe the Microsoft line, well I'm sure the other one will be happy to take up the slack and supply all consumers with cards that work in all the DXN and WGF machines that Dell and other OEMs will be shipping.

Developers will simply code for all the cards that support DXN/WGF as these will eventually be the largest user base and what MS supports. This was clearly demonstrated when Nvidia was having problems with NV30 and ATI was supplying the only viable high end graphics card in the form of R300. Many developers shifted away from Nvidia as their primary development platform, with ATI gaining a lot of that ground.

The same thing will happen if either ATI or Nvidia stops supplying cards that work with DXN/WGF: you won't be able to run games, and then you won't even be able to run Windows. It just won't happen, because neither ATI nor Nvidia is large enough to push the market/OEMs/developers in a direction that Microsoft doesn't agree with.
 
You guys are giving MS too much credit here, I think. It's not like MS is 'driving' everything and IHVs have to follow it or go bust. It's more like MS and the IHVs are working together to bring out new hardware with adequate software support.

So I'd say that DXNext (not DX10 specifically, but any future DX) is created something like this:

1. The IHVs (all of them, but the biggest have major influence) describe to the DX team what they would like to see in the next DX.

2. If there are conflicting points (one wants this, the other wants that, and they are mutually exclusive), then the DX team does some research and goes with one of the proposed variants. At this point HW design is far from final, and the IHV whose proposal was declined can still change its hardware.

3. Everything is launched :)

One thing you all have to remember is that NV is not something like XGI; it controls a major part of the market. And this time WGF is needed not only for games but for the OS itself. So dumping NV from WGF development (which is possible, given MS's bias towards ATI in DX9 and the work being done on Xbox2) would eventually kick back at MS, with Longhorn being shit on NV's hardware. Do they want this? Somehow I doubt that :)

Another thing is that MS would surely prefer to have several key players in the graphics market that it can balance between and lead, instead of ending up with another Intel, which pretty much controls everything and gives MS no chance to influence CPU development ;)

As for unified shader units - I think we're reading too much into Kirk's comments here. They are in fact somewhat old and may relate to something like NV47. Anyway, why is everybody so sure that unified shader units are 100% better than dedicated ones? :) To me it looks like trying to fix badly written software (not balanced properly) with really expensive hardware, instead of just fixing the software, which is much easier...
 
I'm thinking Longhorn and WGF will be a fairly major success among ordinary consumers, for one simple reason.

With a 3D desktop rather than a 2D one, you finally break the "800x600" barrier that exists for a lot of users - i.e. you can have a desktop that is 1600x1200 pixels and still have the easy readability of 800x600. That readability is a major reason why lots of people, even with expensive monitors, stay at that resolution - people don't like trying to read small text on monitors.
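The arithmetic behind that is simple enough. A quick sketch of the idea (the resolutions and font sizes here are just illustrative numbers, not anything from MS):

Code:
# With a composited 3D desktop, the UI can be rendered at any scale, so
# text keeps its comfortable apparent size while the screen gains detail.
REFERENCE = (800, 600)    # the desktop size people find comfortably readable
NATIVE = (1600, 1200)     # the monitor's actual resolution

scale = NATIVE[0] / REFERENCE[0]            # 2.0
font_px_reference = 12                      # a 12 px font at 800x600...
font_px_native = font_px_reference * scale  # ...drawn as 24 px at 1600x1200
print(scale, font_px_native)                # same apparent size, twice the detail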

I see unified shaders as being like a 386 vs current x86 CPUs. The basic x86 architecture isn't very efficient at all, and you certainly can't trust programmers to write efficient code (just like current DX9 code is woefully inefficient), so you really need to create your own efficiencies and squeeze every ounce of performance out of your own hardware if you want to get anywhere. More generally, the more unified the chip is the better, since less of the chip is forced to sit idle. If your pipelines can tackle almost any task thrown at them, there is no need to keep them idle or stalled unnecessarily - just give them another task instead of waiting on the bottleneck.
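To put some rough numbers on the idle-unit argument, here's a toy simulation (the unit counts and workload ranges are completely made up, purely to show the utilization math):

Code:
import random

# Toy model: each frame has some vertex work and some pixel work.
# Dedicated chip: 4 vertex units + 12 pixel units.
# Unified chip: 16 units that can do either kind of work.
VERTEX_UNITS, PIXEL_UNITS, UNIFIED_UNITS = 4, 12, 16

def frame_time_dedicated(v_work, p_work):
    # Each pool only handles its own kind of work, so the frame waits on
    # whichever pool is the bottleneck while the other pool sits idle.
    return max(v_work / VERTEX_UNITS, p_work / PIXEL_UNITS)

def frame_time_unified(v_work, p_work):
    # Any unit takes any task, so nothing idles on the "wrong" kind of work.
    return (v_work + p_work) / UNIFIED_UNITS

random.seed(1)
dedicated_total = unified_total = 0.0
for _ in range(1000):
    # The vertex/pixel mix swings from frame to frame
    # (geometry-heavy scenes vs fill-heavy scenes).
    v = random.uniform(10, 100)
    p = random.uniform(10, 100)
    dedicated_total += frame_time_dedicated(v, p)
    unified_total += frame_time_unified(v, p)

print(f"dedicated: {dedicated_total:.0f}  unified: {unified_total:.0f}")
# Unified only ties dedicated when the mix exactly matches the 4:12
# hardware split; any other mix leaves some dedicated units idle.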
 
DegustatoR said:
You guys are giving MS too much credit here, I think. It's not like MS is 'driving' everything and IHVs have to follow it or go bust. It's more like MS and the IHVs are working together to bring out new hardware with adequate software support.

MS does have the final say. If you don't follow them, you run into major trouble, especially when you're talking about gaming. Decide you're not going to release a DX9 card when your competitors have, and see how the competition will eat you alive.

Look at what happened when Nvidia walked away from DX9 and tried to push Cg instead. Compare that with what happened when Nvidia was in favour with MS and directly influenced DX because of their work on Xbox.

Look at what happened when MS decided it didn't want OGL on their OS. Sure you have Carmack as the one last holdout, powerful enough to make it stick, but that may not be the case an engine or two down the line.

Making a card that doesn't comply with DX/Windows makes it very unfriendly to OEMs and developers. Without support from developers and OEMs, you put yourself in the position of 3dfx, which is not a nice place to be.
 
radar1200gs said:
With a 3D desktop rather than a 2D one, you finally break the "800x600" barrier that exists for a lot of users - i.e. you can have a desktop that is 1600x1200 pixels and still have the easy readability of 800x600. That readability is a major reason why lots of people, even with expensive monitors, stay at that resolution - people don't like trying to read small text on monitors.
Now there's a blast from the past - 800x600 died along with the last 15" CRT. You may still find a few leftover sites optimized for 800 pixel width, but as a desktop resolution, it's virtually nonexistent.
 
anaqer said:
Now there's a blast from the past - 800x600 died along with the last 15" CRT. You may still find a few leftover sites optimized for 800 pixel width, but as a desktop resolution, it's virtually nonexistent.

Only for us geeks. My ISP did a survey last year, and 800x600 is apparently still the most popular screen size.

When I built a PC last year for someone to access the internet and do their digital photography, I had to show them the benefits of higher resolutions on 17+ inch monitors, but they didn't like the small fonts. I had to show them how to raise the font sizes. I bet there are a lot of people out there who still don't know how to do this.
 
Bouncing Zabaglione Bros. said:
Nemesis77 said:
The point was (I think) that Microsoft does need ATI's and NVIDIA's support for their new 3D tech. That is a fact. If they do not support it, developers will not use it. I mean, what alternatives do MS, developers and consumers have? S3? XGI? Yeah, right!

Microsoft only needs *one* of those companies to support any new hardware/API requirements.

I did say that in my post.

Developers will simply code for all the cards that support DXN/WGF as these will eventually be the largest user base and what MS supports.

S3 and XGI would not have the largest user base. Not by far. That would be ATI and NV (and Intel, if you include integrated stuff). And even today's games work just fine with DX8 hardware.

It just won't happen, because neither ATI nor Nvidia is large enough to push the market/OEMs/developers in a direction that Microsoft doesn't agree with.

If (and this is a large "if") both ATI and NVIDIA publicly said "no, we will not support DirectX Next; we will support DX9 and OpenGL only", developers would listen. Why should those developers waste their time writing software for technology that was only supported by a few niche players (XGI, S3, etc.), and even then wouldn't work very well?
 
karlotta said:
Nemesis77 said:
...And even today's games work just fine with DX8 hardware.
No they don't.
Sure they do, provided you're willing to lower your resolution. It'll be a little while yet before any games really require higher than DX8-level graphics.
 
Nemesis77 said:
If (and this is a large "if") both ATI and NVIDIA publicly said "no, we will not support DirectX Next; we will support DX9 and OpenGL only", developers would listen. Why should those developers waste their time writing software for technology that was only supported by a few niche players (XGI, S3, etc.), and even then wouldn't work very well?

As you said, it's a "big if". It's just fantasy to suggest that the two biggest players in the market will refuse to support the overwhelmingly dominant OS maker in the markets that make them the most money.

It won't happen because:

(a) if one of the big two were to refuse to support DXN, the other would step in and have an unassailable grip on the market.

(b) if you don't support DXN, your card won't appear in any OEM machines, and no one will buy it because you won't be able to play games or eventually even run the windowing system in your nice shiny new Longhorn PC. You will go out of business as no one who has a PC buys your products.

As I said, Microsoft only needs either Nvidia or ATI, and that's why neither can afford to try to lead the market away from Microsoft - because the other one will step in and take the market by linking itself strongly to MS, Windows, and DXN/WGF.

Look at where ATI was relative to Nvidia before DX9. Nvidia walked away from MS and tried to take the market with it, and gave ATI the opening to come in, catch up, and overtake Nvidia.

With Xbox2 on its way from ATI, who do you think is going to influence the future direction of DXN more? ATI and MS working closely together, or Nvidia with Sony? Of course it doesn't matter for Sony, but Nvidia still has to sell chips into the OEM and consumer space for the PC market, and that is overwhelmingly MS and Windows with DX and its future variants.
 
nVidia and Microsoft have also been working closely together for a number of years now. There appears to have been some contention between Microsoft and nVidia over the DirectX 9 specification, but the NV3x really was a broken architecture anyway, so I don't think one can draw any definite conclusions here.

That is to say, given the apparently large number of respins and whatnot of the NV30 before final release, I don't think we can be certain that it was actually the core nVidia originally intended to release. The extremely high transistor count for the featureset seems a testament to this: with the NV35, nVidia doubled the NV30's FP power without much changing its integer power, and hardly increased the transistor count at all. Furthermore, there were rumors that partial precision itself was a late addition to the DX9 spec, which would seem to indicate that nVidia didn't originally intend to support partial precision at all with the NV30. But I don't think we can know this for certain.
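For a feel of what partial precision costs numerically, here's a quick sketch using numpy's IEEE float16 as a stand-in for FP16 (the actual NV3x register formats obviously aren't exposed on a CPU, so this is only an analogy):

Code:
import numpy as np

# FP32 ("full precision") carries 23 mantissa bits; FP16 ("partial
# precision", the _pp hint) carries only 10, roughly 3 decimal digits.
print(np.float32(1.0) + np.float32(0.0003))  # 1.0003 - representable in FP32
print(np.float16(1.0) + np.float16(0.0003))  # 1.0    - lost below FP16 precision

# Accumulated error over many shader-style operations is where it hurts:
acc16, acc32 = np.float16(0.0), np.float32(0.0)
for _ in range(10000):
    acc16 += np.float16(0.0001)
    acc32 += np.float32(0.0001)
print(acc32, acc16)  # ~1.0 vs ~0.25: the FP16 sum stalls once the increment
                     # drops below half a unit-in-last-place of the total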
 
Bouncing Zabaglione Bros. said:
As you said, it's a "big if". It's just fantasy to suggest that the two biggest players in the market will refuse to support the overwhelmingly dominant OS maker in the markets that make them the most money.

It won't happen because:

(a) if one of the big two were to refuse to support DXN, the other would step in and have an unassailable grip on the market.

(b) if you don't support DXN, your card won't appear in any OEM machines, and no one will buy it because you won't be able to play games or eventually even run the windowing system in your nice shiny new Longhorn PC. You will go out of business as no one who has a PC buys your products.

c) And for sure you'll turn your PC into a console that some developers may (big "may") try to do something for.

d) What's more, Microsoft has the money and power to become the next IHV if they wanted to, but I don't think they want to start from scratch.
 
Chalnoth said:
Furthermore, there were rumors that partial precision itself was a late addition to the DX9 spec, which would seem to indicate that nVidia didn't originally intend to support partial precision at all with the NV30. But I don't think we can know this for certain.
Did the rumor say if they were looking at supporting just FP16 or FP32?
 
Chalnoth said:
Furthermore, there were rumors that partial precision itself was a late addition to the DX9 spec, which would seem to indicate that nVidia didn't originally intend to support partial precision at all with the NV30. But I don't think we can know this for certain.

I thought the rumours were the exact opposite. Nvidia built 32-bit for quality (remember Cinematic Computing?) and then tried to claw back speed with 16-bit. MS went the ATI route of a 24-bit midpoint compromise between speed, quality and transistor budget. Nvidia then had to lobby madly to get PP included in one of the DX9 revisions.

Nvidia's route probably *needed* the (then) brand-new .13 micron process, which is one of the reasons they came unstuck. They would have been designing NV30 for a couple of years, and there was no way they could have just levered in PP six or twelve months before launch.
 
russo121 said:
d) What's more, Microsoft has the money and power to become the next IHV if they wanted to, but I don't think they want to start from scratch.

Microsoft would probably just buy ATI or Nvidia, which they can easily afford. I doubt MS would though, as it would probably leave them open to monopoly investigations.
 