Dedicated Graphics Card Sales Slow Down in Q2, NVIDIA Holds Market Share Lead

Jon Peddie Research is going into "turbo" mode with all these statistics!

Jon Peddie Research (JPR), the industry's research and consulting firm for graphics and multimedia, announced estimated graphics add-in-board (AIB) shipments and suppliers' market share for Q2 2014.
The quarter in general
JPR found that AIB shipments during Q2 2014 behaved according to past years with regard to seasonality, but the decrease was larger than the 10-year average. The news was disappointing: quarter-to-quarter, the market dropped 17.5% (compared to the desktop PC market, which increased 1.3%).

  • Total AIB shipments decreased quarter-to-quarter to 11.5 million units.
  • AMD's quarter-to-quarter total desktop AIB unit shipments decreased 10.7%.
  • Nvidia's quarter-to-quarter unit shipments decreased 21%.
  • Nvidia continues to hold a dominant market share position at 62%.
  • Figures for the other suppliers were flat to declining.
However, in spite of the overall decline, which has been caused in part by tablets and embedded graphics, PC gaming momentum continues to build and is the bright spot in the AIB market.
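As a back-of-envelope check of those figures (assuming the 17.5% drop is measured against the prior quarter's total, which the press release doesn't spell out), the implied Q1 shipment number can be recovered:

```python
# Implied prior-quarter shipments, assuming the 17.5% quarter-to-quarter
# drop is relative to Q1 2014's total (an assumption, not stated above).
q2_2014 = 11.5e6   # AIB units shipped in Q2 2014
drop = 0.175       # quarter-to-quarter decline

q1_2014 = q2_2014 / (1 - drop)
print(f"Implied Q1 2014 shipments: {q1_2014 / 1e6:.1f}M units")  # ~13.9M
```

So the market shed roughly 2.4 million units in a single quarter, which puts the "more than the 10-year average" remark in perspective.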

http://www.guru3d.com/news-story/de...down-in-q2nvidia-holds-market-share-lead.html
 
Ah, so this is why the PS4 is selling so hot. All those people buying PS4s stopped buying graphics cards, and they will buy graphics cards again a few years later when "New Video Game Graphic Killer X" is released.

On a serious note,
I believe a new DirectX release usually drives a new push in graphics card sales. But with DirectX 12, I don't see how they can market it as having much better graphics than 11... so, no push for selling more graphics cards.
 
"Real" next gen games, 4K, VR and DX12 should all be pretty big graphics card movers over the next 2-3 years. Hopefully this slow down will encourage NV and AMD to re-consider their recent ridiculous pricing though.
 
"Real" next gen games, 4K, VR and DX12 should all be pretty big graphics card movers over the next 2-3 years. Hopefully this slow down will encourage NV and AMD to re-consider their recent ridiculous pricing though.

We are in an odd place right now where even $250-$300 GPUs are coming with 2GB VRAM. Best card on the market = HD7950 AKA R9 280. You can score one for just over $200 with the Gold game bundle. That extra 1GB of memory will be very welcome in future titles (and is already useful in some games at 1080p).

Funny, I got my 7950 in early 2013 for the exact same price, with the same game bundle. And don't get me started on the 780Ti shipping with 3GB... lol.
 
I agree. If I were a cynical person, I might see the low graphics memory amounts as a deliberate way to keep otherwise vastly superior GPUs from being 100% future-proof, thus ensuring future upgrade cycles...
 
Nvidia and AMD should team up and fund a purely PC-centric game dev team. You know... like a 'first party' dev in the console space, just for their GPUs.

There is really no great need to upgrade your GPU when a) most games run at the highest settings, or close to it, and b) the difference between high and ultra settings is often subtle rather than in-your-face.

A dedicated PC-centric powerhouse dev could justify investment in those expensive GPUs...
 
I need to upgrade my GPU going from single to triple screen. My 7870 just doesn't cut it at 5040x1050.
 

This. It makes very little sense to ship the R9 285 with 2GB when it is supposed to replace a three-year-old chip that never shipped below 3GB. It should have at least 4GB, with 8GB models as an option. The GTX 780 Ti should have 6GB by default, with 12GB options. It is a $600+ GPU.
 

I completely agree with this. GPUs are primarily 3D accelerators for... games!!

A few good PC titles with the ability to push current GPU technology to the brink could really rekindle the dying fire and the need for these so-called video cards in the first place. dGPUs aren't going anywhere as long as there is demand for them, and what better way than to invest in the source of that demand?

The industry is suffering from a strong case of consolitis, but the demand for more graphical settings and games that push the envelope is still there, as long as it's a decent game.

I can't for the life of me remember the last time I played using low/medium settings on a PC. Let alone whether there was any difference between the settings, or whether the settings were even there in the first place!
 
I think some of you guys forget that PC gaming is diverse. Few people are obsessed with 60 fps, and certainly not 4k. What I think is happening is lots of people are using old hardware. You can still game just fine on an 8800GT in many cases! That's unheard of longevity. Intel HD and AMD APUs are adequate in a lot of cases too.


Yeah I've played all the hot releases on my 6950 or 560 Ti from almost 4 years ago now. I think the time of 2GB VRAM is gradually coming to an end though. I am looking forward to 20nm GPUs with HBM. Well, aside from the likely price tag anyway.

Something else I think about is how, after all the hoopla and hype, the R9 290X and Titan are actually only about 2x the speed of an old 6970. Talk about diminishing returns on hardware.
 

Maybe not an 8800GT, but yes, I agree with you, most people have old GPUs... 5770, GTX460...

I still use 2x HD7970s, and I don't see anything yet that can make me upgrade. Before, I would upgrade just for benchmarking, but I don't feel compelled to anymore... I play at 1440p with max settings in all games... why would I change? GCN 1.0 in CrossFire is extremely strong (with a good custom watercooling system, anyway)...
 
It's actually more like 3x, at least at higher resolutions:

http://www.guru3d.com/articles_pages/amd_radeon_r9_285_review,16.html

So that's a 3x increase in 3 years (from launch of 6970 to launch of 290x). Hopefully Volcanic Islands will soon bump that up to 4x.
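Whether it's 2x or 3x, compounded over the roughly three years between those launches the annual gain is fairly modest. A quick sketch of the implied per-year growth (just arithmetic, using the two totals claimed in this thread):

```python
# Annualized performance gain implied by a total speedup over ~3 years
# (3x per the Guru3D comparison, 2x per the earlier estimate in-thread).
years = 3
for total_gain in (3.0, 2.0):
    annual = total_gain ** (1 / years) - 1
    print(f"{total_gain:.0f}x over {years} years ~= {annual * 100:.0f}% per year")
```

Roughly 44% per year at the optimistic end, 26% at the pessimistic end — either way, well below the doubling-every-generation pace people remember.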


It is more like 2x in the games here.
http://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed

I had a 290X for a while but couldn't stand the heat it creates. I also found that I really didn't need the speed. Games need to demand more. More resolution isn't exciting enough.

I figure I will get in on Maxwell or Volcanic Islands when I get a game that demands it.
 

I can understand that. I think VR will be a big driver for extra performance, though. In VR, resolutions above 1080p have a huge impact on the visual experience, while frame rates below a locked 60fps are virtually a deal-breaker. I'd expect the consumer version of the Oculus Rift to require at least 1440p vsynced at 75fps to max it out. That's going to require a huge amount of power with the new generation of games at those settings.
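To put rough numbers on that (the 1440p@75 target is this post's guess, not a confirmed Rift spec), raw pixel throughput already more than doubles versus today's common baseline:

```python
# Raw pixel throughput at a guessed VR target vs. plain 1080p/60.
def pixels_per_second(width, height, hz):
    return width * height * hz

vr_guess = pixels_per_second(2560, 1440, 75)   # hypothetical consumer Rift target
baseline = pixels_per_second(1920, 1080, 60)   # typical desktop gaming today
print(f"1440p@75Hz needs {vr_guess / baseline:.1f}x the fill rate of 1080p@60Hz")
```

And that's before accounting for rendering the scene once per eye, which VR headsets require.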
 
More than 3x if you take the frame-time analysis into account; average fps is just one metric.

One of the games I tried when I had the 290X was Far Cry 3. My first impression was that while it was obviously running much faster, it was actually stuttering more than my old 6970. Maybe things have improved since then, but it was a curious thing. Perhaps it was also related to the higher framerate providing more opportunity for visible stutter.

It was also a nice demo of how shitty AMD's cooler is: listening to the hair dryer ramp up and down depending on where I was looking in the game (differing GPU load). To stand owning that card, I would have needed an expensive aftermarket cooler. And it would still heat up the room. I have a Thermalright cooler on my 6970 and it is very quiet, but it couldn't handle Hawaii. I didn't feel like spending even more.
 

Yeah I am watching VR. Seems far off yet unfortunately.
 

In a silent room my GTX670 squeals like a piggy in some gaming situations. It's not the fan, I think it's called "coil whine". I only notice in game menus and stuff where there is no background noise. Is that because of shitty electrical components on the card or something on my end? The PSU is a nice 520W Seasonic.
 
I've had cards with coil whine. It's caused by the inductors in the card's power circuitry resonating at various frequencies, AFAIK. If the conditions are just right you can hear it, and yeah, it's annoying. It seems to vary from card to card, even within the same product line. And yup, when you are at a menu with vsync disabled and the card is rendering 2000 fps, that seems to be a solid cause of it. A high load like Furmark seems to do it too.

My 8800GTX spewed RFI/EMI into the system and it was audible on my sound card! :)
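A common workaround for menu-screen whine is simply capping the frame rate so the card isn't spinning at thousands of fps. Games do this internally on the GPU side, but the principle is just sleeping away the leftover frame budget; a toy Python sketch of the idea (the function names are made up for illustration):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def render_menu_frame():
    """Stand-in for drawing a trivially cheap menu frame."""
    pass

def run_menu(num_frames):
    # Sleep away the leftover frame budget so the GPU idles between
    # frames instead of rendering the menu at 2000 fps.
    for _ in range(num_frames):
        start = time.perf_counter()
        render_menu_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)

run_menu(30)  # ~0.5 seconds: 30 frames at a capped 60 fps
```

In practice a driver-side frame limiter (RivaTuner and the like) or forcing vsync achieves the same thing without touching the game.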
 

I'd expect the consumer Oculus Rift to launch next year. And given it's supposed to support at least 1440p at 90Hz (and in VR those things really do matter, a lot), I can see the requirements for high-end GPUs going up dramatically by the end of the year.
 