Integrated graphics passes 50% mark.

Is this good or bad?

According to Taiwan's Market Intelligence Center (MIC), integrated core-logic sets occupied 31% of the market in the first quarter, but skyrocketed to 39% and 48% in the second and third quarters respectively after Intel announced its i845G series and rolled out its new Celeron CPUs. In the fourth quarter of 2002 the market share of integrated core-logic products reached 52%, overtaking discrete solutions.
I'm not sure how I should feel about this. Either A) add-in-board sales should increase because of this flood of the crappy integrated video, or B) PC game sales should decrease because people are too cheap to buy a new video card to play new games.
 
Wouldn't lots of these computers ship without having an AGP slot for an add-in board?

I think we might soon find ourselves at a time where PC game makers can only afford to target what Intel is building, and consumer-priced high-end graphics will appear only on consoles.
 
There's a difference between what's going into retail and what OEMs are pushing out. OEMs are cost-conscious, so if a part increases cost, it gets cut.
 
Re: Is this good or bad?

KnightBreed said:
I'm not sure how I should feel about this. Either A) add-in-board sales should increase because of this flood of the crappy integrated video, or B) PC game sales should decrease because people are too cheap to buy a new video card to play new games.

A) Assuming people want better graphics than they get.

B) Assuming people want to play games on their PC.

I'm assuming that "people" are of varying age groups and that only a small subset is interested in full-time gaming. These are the people who seek out the high-end boards in the first place, so they fall out of the group that buys these integrated boards. (We will ignore children whose parents buy them a computer. If they truly wish to play games, they will buy their own specific hardware when they get the opportunity.)

The rest of the population (which I'll arbitrarily define as those over 30 years old) typically doesn't give a damn about playing games on the computer. The extent to which they play games is typically limited to card games, pinball games, and freeware/shareware fare. The older gamer is still a rarity, statistically speaking.

So, considering that the bulk of the home market isn't gamers (or at least the buyer of the machine isn't a gamer) and that the whole of the business sector doesn't need gaming solutions (ignoring very special cases), we arrive at the same place we started ten years ago: gamers will buy special systems and hardware and push the envelope while the casual users and business users buy average systems that simply do what they need to do.
 
flf said:
So, considering that the bulk of the home market isn't gamers (or at least the buyer of the machine isn't a gamer) and that the whole of the business sector doesn't need gaming solutions (ignoring very special cases), we arrive at the same place we started ten years ago: gamers will buy special systems and hardware and push the envelope while the casual users and business users buy average systems that simply do what they need to do.

The difference being that games are now much more expensive to produce, especially if they target any high-end graphics features (such as shaders). Also, as technology advances, there is a point of diminishing returns in visual quality. Does Morrowind really look that much better than Ultima IX, even though it was targeted at hardware three (or more) years more advanced?

The high-end PC market now may simply not be large enough to support development of games that have a graphics quality beyond that offered by console games or console ports (remember, Doom 3 is a console game). If that is the case, there will be less reason for gamers to upgrade their PCs instead of getting a console for considerably less money.
 
From some limited experience, OEMs' favourite chipset of the last year was the SiS 650, which comes without an AGP slot.

However, some OEMs design and build PCs around the idea that the user may wish to upgrade their system (eMachines springs to mind).
 
Graphics card prices have climbed over the last couple of years in Canada and are now hitting close to $800.00; by the looks of it, the NV30 Ultra could be the first card to hit $1000.00... which is insane.

When a video card costs more than a motherboard, CPU and RAM combined, it's time to start weighing the benefits of the card... current games are mostly platform-limited, and users would benefit more from upgrading the core PC (motherboard, CPU and RAM) than from spending $1000 on a card...

I welcome cards like the 9500 Pro: reasonably priced, fast, and with up-to-date DX9 features... people just don't want to spend that kind of money anymore, as the return is minimal.

Playing Dungeon Siege over the LAN last night: my upstairs computer has an original Radeon, downstairs is an 8500... was there any graphical difference? Nope. I was getting 55 fps vs 41 fps... otherwise the game looked the same.

I realise the costs involved in high-end PC graphics cards are high; the point is whether the card is going to return anything worthwhile during its lifespan to justify $800-1000.
 
These numbers are misleading. Fact is, more motherboards ship with some form of onboard video these days than without. However, those numbers don't reflect the percentage of installed onboard video that is actually being used and not being supplanted by an add-in card. And I know you're going to say "only gamers or enthusiasts add an additional board", but that's not true either. My office received 10 new Dells just recently, and all had onboard video but STILL came with Radeon 7200s as the primary video adapter.
 
That's very rare, Johnny. There are 600 workstations where I work and they all use the Intel integrated video... all Dell OptiPlex PCs, too.
We have a few high-end machines with add-in video cards for CAD... that's it.
 
This could relate in an interesting way to changes Intel is expected to make to its motherboards by the end of next year. Apologies in advance if I get this wrong, but IIRC Intel is 1) moving to PCI 66 for its PCI slots and 2) moving to an even more advanced format further down the road.

Just considering PCI 66, is it possible that this will revive sales of PCI gaming cards? Would it offer enough of a performance gain over PCI 33 that vendors will offer medium/high-end products for it?
 
PCI 66 would only be as fast as AGP 1x, and the bus is still shared with other cards. So no, PCI 66 won't help PCI compete against AGP. Anyway, I'd be surprised if Intel actually went for PCI 66 before 3GIO.

If Intel does go for PCI 66 before moving to 3GIO, the primary benefit won't be for video. It will essentially be exactly like running your shiny AGP video card with AGP disabled (which makes the card run at PCI 66 speeds). The primary benefits of PCI 66 will be for input and output, which are currently stressing the PCI bus quite heavily.

For example, plain PCI is capable of 133MB/sec transfers. Strap on a Gigabit Ethernet card or an ATA/133 hard disk, and you're already maxing out your bus (not all the time, obviously, but sometimes...). Due to the multi-device nature of the PCI bus, you really don't want any one device ever attempting to command nearly all of the available bandwidth.
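To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python (peak theoretical figures only; real-world throughput is lower due to bus overhead, and the helper name and device list are just illustrative):

# Peak bandwidth of a parallel bus: bytes per clock * clock rate.
def bus_bandwidth_mb_s(width_bits, clock_mhz):
    return width_bits / 8 * clock_mhz

buses = {
    "PCI 32-bit/33MHz": bus_bandwidth_mb_s(32, 33),      # ~133 MB/s
    "PCI 32-bit/66MHz": bus_bandwidth_mb_s(32, 66),      # ~266 MB/s, same as AGP 1x
    "PCI 64-bit/66MHz": bus_bandwidth_mb_s(64, 66),      # ~533 MB/s (server boards)
    "AGP 4x":           bus_bandwidth_mb_s(32, 66) * 4,  # ~1066 MB/s
}

# Peak demands of devices that all share a single plain PCI bus:
devices = {
    "Gigabit Ethernet": 1000 / 8,  # ~125 MB/s at wire speed
    "ATA/133 disk":     133,       # ~133 MB/s burst
}

for name, mb_s in {**buses, **devices}.items():
    print(f"{name:18s} ~{mb_s:.0f} MB/s")

Two such devices bursting at the same time already want more than plain PCI's ~133 MB/s, which is exactly the contention problem described above; doubling the clock to PCI 66 relieves the I/O side but still only matches AGP 1x for video.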

Still, the primary benefit for going PCI 66 is that it's already here. PCI 66 and 64-bit PCI have been in the server market for some time now. I don't think 64-bit PCI will ever make it to the consumer sector (too expensive).
 
flf said:
The rest of the population (which I'll arbitrarily define as those over 30 years old) typically doesn't give a damn about playing games on the computer. The extent to which they play games is typically limited to card games, pinball games, and freeware/shareware fare.

Limited to card games!! Don't give a damn about playing!!?? HUH? grrrrrr! Oh you said over 30, I guess being over 40 and having a mid life crisis brings me back into the gaming arena ;) Any babes out there who want to swing? :p
 
Doomtrooper said:
That's very rare, Johnny. There are 600 workstations where I work and they all use the Intel integrated video... all Dell OptiPlex PCs, too.
We have a few high-end machines with add-in video cards for CAD... that's it.

Yeah, but so what? No one is going to play games on those machines either way. I'd say this comes back to the numbers being misleading...
 
What is interesting is the trend. Discrete graphics cards are being replaced by on-board video options.

This isn't news to the traditional chipset manufacturers, nor to ATI or nVidia, of course; they have seen this coming for a long time, otherwise they wouldn't both be offering northbridges with integrated graphics today.

The question is how far down this road the industry will go, and what consequences it will have.

Ultimately, though these things take time to penetrate the market fully, integrated graphics is going to erode the low-end gfx-card market completely. Integrated graphics increases northbridge costs by a few dollars, and there is no way a separate card solution can compete with that, or with the assembly benefit of removing card installation entirely. In time, only customers who explicitly desire and are willing to pay for a separate gfx-card are going to get one.
It is difficult to say how large a part of the market that represents. Pretty much no professionally/publicly used computers will use add-in cards. A large part of home computers won't either, though this is harder to judge.
The data quoted in this article would seem to imply that the outlook for game-driven gfx-card sales in PC-space is rather grim.
http://www.startribune.com/stories/535/3524128.html

How will this affect gfx-card manufacturers and content providers?

Well, the gfx-card manufacturers will lose their low-end revenue stream. This is significant but not disastrous, judging by, for instance, recent ATI statements. While the low-end volume has been large, the margins have been correspondingly small, so the absence of their contribution to total revenue and profits shouldn't be lethal.

Further, depending on how well the integrated solutions perform, even some of those who today would go with a separate card "just to be safe" may no longer feel the need. That would erode some of the occasional-gamer part of the home market as well.

Will the gfx-card manufacturers make up this loss with sales of gfx-enabled northbridges? Nope. No way. Whereas competition is free as far as AGP cards are concerned, that is not the case for chipsets. Intel owns the necessary IP and decides both whom it licenses to and at what price. Intel competes in that market, and controls the terms of that competition completely. It's a no-win situation for anyone else. With AMD the situation is different, but the market share is much smaller.

So what do the card manufacturers need to do? In order to keep revenues coming in, gfx-card manufacturers have to keep pulling rabbits out of their hats in terms of card abilities and performance. Furthermore, they have to push the adoption of higher-end features by games developers, or the incentive to buy will not be there. This is actually in the interests of PC-oriented developers too, or they will continue to lose ground to consoles. PCs offer faster CPUs, more memory, and higher-performance graphics. However, developers have to actually leverage those advantages to create compelling content. (Microsoft and Intel should have an interest in driving such development as well, against anything that erodes the position of the PC platform.)

So, IMHO, within the next year or two we are likely to see PC graphics mimic the development of PC sound and split into two clearly distinct parts - one integrated and one for add-on cards - with the integrated graphics volume being substantially larger. Overall, the add-on card market is going to shift upwards. We may or may not see PC game content follow this, but the odds are reasonable that it will.

Longer term?
Clearer crystal balls than mine are required. The chips can fall in many ways.

Entropy
 
The thing is, integrated graphics has nearly caught up with the progression of gaming software, so there isn't a need for the mass market to go purchase a high-end gaming card. This gap will only continue to get smaller, squeezing ATI and Nvidia out of the low-end add-on card market. If Intel's integrated graphics chipsets are good enough for average gamers to play 80% of the games currently on the market, then there's only 20% left for the niche high-end market. If the majority of game developers continue to target the mass market, then the situation will only get better for Intel, at least on the desktop. If Nvidia and ATI want market share in the low end, they need to keep improving integrated chipsets like the nForce.
 
Entropy said:
I was surprised to see no one had linked these data.
http://www.xbitlabs.com/news/story.html?id=1041245055

I'm only bouncing past the computer, no time for a considered analysis right now. But the numbers are significant if you are at all interested in the future of desktop graphics.

Entropy

If this is true, I don't think it means much except that the market's getting larger and more boxes are commodities. If anything, the raging fiscal success of ATI's 9700 Pro series should indicate a very healthy sector in the add-in card market, and a very profitable one, too. 10-15 years ago, integrated was most if not all of the graphics market. "We've come a long way, baby," and I don't think there's any going back - integrated will always serve its market, but I don't see the specialized 3D market doing anything but growing vigorously for a long time to come. (He says hopefully...)
 