AMD / ATI ponderings at HKEPC

http://www.hkepc.com/bbs/itnews.php?tid=656056&starttime=1156204800&endtime=1156291200

Sounds like job relocations/rationalization, and also a concentration on IGP chipsets rather than the sexier high-end stuff.
Sure, but that's just how a typical acquisition moves forward. You concentrate on the stuff that gives you the most in return and try to conquer new markets where you didn't really have anything to add before (think in value or execution terms). AMD always had plans on their desk for different things, but until recently they couldn't execute a lot of them, since practically their whole staff worked solely on CPUs, and they certainly did a hell of a good job there if you ask me (vs. Intel).

What I like about it is that AMD will certainly strengthen their position if they can keep a good balance between mainstream IGP (notebooks) and discrete stuff. What's even more important is that they now have ATi staff working on the chipset side, which addresses one of the weaknesses they've had to deal with over the years: top-to-bottom execution in a reasonable amount of time.

However, AMD/ATi would be pretty stupid to give up their high end, and I don't think that will happen (for the foreseeable future). They've put so much into it, not least because they practically changed the industry from R300 onward, so they will continue until the point where GPUs and CPUs basically merge in terms of feature set and/or capabilities, helped along by further process-technology advancements. I wouldn't dare give a timetable for that, though, since there are too many unknowns involved in a market that moves pretty fast overall and will certainly accelerate even more over the years, so that remains to be seen.

AMD will now be defined not solely by their excellence and efficiency in CPU architectures and markets (Sempron, Athlon64, FX series, Opteron and Turion) but also by their combined strengths, that is, ATi's knowledge, patents and workforce, which will almost certainly not be limited to "low-end" stuff only. You wouldn't really buy a high-end FX together with an X1600; anything lower than an X1800 wouldn't make a lot of sense.

Hopefully they will also improve their marketing to the point where it's more consumer/customer-driven (think NV) rather than company-driven, so they won't make too many promises but will instead deliver their products on time.
 
Well, it's worth remembering a couple of things focus-wise. One, they apparently will be shifting the Intel chipset team. Two, they are nearing the end of two nearly simultaneous console chip developments (Xenos and Hollywood). So they can greatly increase the number of engineers on AMD IGPs without having to take them from elsewhere.
 
What I find scary is that it might only take one person at AMD, high enough up, deciding that high-end GPUs are not cost-effective, and wahey, there goes NVIDIA's only competitor in that market.

Weirder things have happened... but the way AMD has restructured things, it is unlikely this scenario will see the light of day. The two Daves would save the day ;)
 
Uhm... instead of forecasting DOOOOOOOOOM for everyone (world domination for nV), what would happen if DAAMIT raised the bar for IGPs to the point where a separate video card would prove to be a very uneconomical option?

Hell, it's already uneconomical, but as soon as your $400 CPU performs just as well as a $200 CPU plus a $200 G/VPU, the balance of the money-making market (low and mid-low end) shifts completely to integrated graphics.
 
Problem is, it's very uneconomical to do that as well, lol. Where are you going to get GDDR4 or GDDR3 RAM at the cost of GDDR2 RAM? Or are you suggesting there will be a separate socket for VRAM just for the graphics core? That will increase the cost of the motherboard, on top of the extra circuitry for the extra RAM sockets. Then if you want to upgrade the CPU or graphics card, you will need a new motherboard; it just gets a bit hairy.

As for the integrated graphics core in a CPU you are suggesting: what if the graphics core has problems like we have with discrete chips, where a quad has to be disabled or the core has to be underclocked? How many CPU SKUs are there going to be for this? Business-wise it will make it harder to sell what you make with a hundred different SKUs (internal competition), it will confuse consumers, and in the end the cores that don't make it to the expected final chip will just cause a loss in revenue. That would take a nice overall profit on CPUs and turn it into a drop in revenue. Now, I can see it possibly happening for very low-end integrated GPUs (it has to be played safe), but that's it, at least for the next two years. Then maybe the performance gets upped a little, but then again, by the time AMD and Intel start using GDDR3 memory, discrete solutions will be using GDDR4 or 5.
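
To make the binning argument above concrete, here is a rough, purely illustrative Python sketch; every yield, die count and price in it is a made-up assumption, not a real AMD/ATI figure. It only shows how GPU-side defects on a fused CPU+GPU die spawn extra SKUs and eat into per-wafer revenue compared with a CPU-only die:

# Purely illustrative sketch: every yield, die count and price below is a
# made-up assumption, not a real AMD/ATI figure.

def wafer_revenue(candidate_dies, bins):
    """Expected revenue per wafer given (probability, selling price) bins."""
    return candidate_dies * sum(p * price for p, price in bins)

# CPU-only die: one selling SKU, the rest is scrap.
cpu_only = [
    (0.80, 200.0),   # fully working, sold as the normal CPU SKU
    (0.20,   0.0),   # defective, scrapped
]

# Fused CPU+GPU die: larger die -> fewer candidates per wafer, and
# GPU-side defects (disabled quad, underclocked core) become extra,
# cheaper SKUs instead of full-price parts.
fused = [
    (0.55, 230.0),   # CPU and GPU both fully functional
    (0.15, 190.0),   # one GPU quad disabled -> lower graphics SKU
    (0.10, 170.0),   # GPU underclocked -> yet another SKU
    (0.20,   0.0),   # defective, scrapped
]

print("CPU-only wafer :", wafer_revenue(400, cpu_only))   # 400 hypothetical dies
print("Fused-die wafer:", wafer_revenue(300, fused))      # 300 (larger die)

Under these invented numbers, the fused die needs three selling SKUs instead of one and still brings in less per wafer, which is essentially the point being made above.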
 
I personally don't think the integrated and discrete video markets cross over in any meaningful manner in the PC market.

As Razor1 already stated, a discrete GPU is fast because of its other parts, like fast memory; there is also the thermal design of these monsters to consider.

A high-end integrated solution that costs $400 for graphics and CPU, not including the motherboard, is a dead end market-wise, now and in the future. This is where the mid-to-high end lives right now.

Reduce the above solution to $40 and we may be onto something... for mobile devices, handhelds etc. This is an area AMD and ATI are going to be the leaders in, in my opinion. I just hope they won't forget the high-end GPU market in the future.
 
I don't care what chip would be integrated into my CPU or chipset, just like the rest of the consumers.

I'm not talking about high-end graphics here, and not talking about DDR3/4 here... we're talking about low-end or low-midrange stuff that doesn't need it.
Don't forget you're talking about a different number of chips you have to make. If the IGP bar were raised to X1600 levels this cycle (instead of X700) by DAAMIT, what would that do to NV's low-end market? Sure it costs money, but no one ever won a war saving his pennies.
 
If the IGP bar were raised to X1600 levels this cycle (instead of X700)

The IGP of the RS600 incorporates X700 technology but comes nowhere near the speed of a discrete X700.

Again, if we had X1600-type tech in an integrated solution on the motherboard or in the CPU from ATI/AMD, this solution would not compete with a discrete X1600 or 7600GS etc... it would be much, much slower in gaming.

I assume we are moving on to talking about gaming (3D) power and features at the low end now...
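
To put a rough number on the bandwidth gap described above, here is a small back-of-the-envelope Python sketch. The memory configurations in it are assumptions typical of the period (dual-channel DDR2-800 shared system memory for the IGP, a 128-bit GDDR3 interface at roughly 1.38 GT/s for a discrete X1600 XT-class card), not exact board specs:

# Back-of-the-envelope sketch; memory speeds are rough, typical-of-the-era
# figures, not exact board specs.

def peak_bandwidth_gb_s(bus_width_bits, transfers_per_second):
    """Peak memory bandwidth in GB/s: bytes per transfer times data rate."""
    return (bus_width_bits / 8) * transfers_per_second / 1e9

# IGP sharing dual-channel DDR2-800 system memory (2 x 64-bit) with the CPU
igp_shared = peak_bandwidth_gb_s(128, 800e6)

# Discrete X1600 XT-class card with its own 128-bit GDDR3 at ~1.38 GT/s
discrete = peak_bandwidth_gb_s(128, 1.38e9)

print(f"IGP, shared system RAM : ~{igp_shared:.1f} GB/s (minus whatever the CPU uses)")
print(f"Discrete X1600-class   : ~{discrete:.1f} GB/s, all dedicated to the GPU")

And the IGP's roughly 12.8 GB/s is shared with the CPU, while the discrete card keeps its roughly 22 GB/s to itself, before even considering thermals or clocks.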
 
I don't care what chip would be integrated into my CPU or chipset, just like the rest of the consumers.

I'm not talking about high-end graphics here, and not talking about DDR3/4 here... we're talking about low-end or low-midrange stuff that doesn't need it.
Don't forget you're talking about a different number of chips you have to make. If the IGP bar were raised to X1600 levels this cycle (instead of X700) by DAAMIT, what would that do to NV's low-end market? Sure it costs money, but no one ever won a war saving his pennies.


Think about it this way: if it were possible, nV and ATi probably would have made a motherboard that was upgradable with a socket already; it would have driven their cost down while increasing their IGP chip market, at least for their own chips. It would be nice to just slap in a 7300 where that 6200 was; they are pin-compatible, and the motherboard features haven't changed that much either. As it is, they don't make much money per unit sold on integrated graphics motherboards, so this would have helped them substantially.
 
I agree with both razor/tahir and neliz on this. The IGP setup does not appear capable of supplanting discrete graphics at any time in the near (2-3 year) future. The current high-end designs are just too hot, not to mention BIG. Also, for high-end video enthusiasts, the GPU replacement cycle is still much faster than for the CPU. So IGP in this sense would really be less economical for enthusiasts. It makes no sense to me.

However, the advantages of IGP are obvious, and these advantages might extend to an all-in-one chip design; I'm not sure. If that's what people at AMD seem to think, I'd take their word for it. Some obvious advantages would be price and size, and anything that gets my semi-capable games machine down to mini-ITX size is a huge win for me. I can handle gaming at 800x600, and even the diehards could handle this in their cars and mobiles.

The fear for me is that ATi drops its high-end research, not because of where this leaves the market (I think I can live without better gfx parts. If nV wants to slow down the cycle a touch and bring me some kick-ass $100 parts, then great. Maybe they could even afford to stop cheating on AA and filtering that way ;p [Edit: or do I stand corrected?]). I just think it seems like a bad idea to slow down their high-end research.

I could be mistaken, but it seems that if nV controls the future of high-end DX, then all those ATi IGPs will steadily fall behind, just like those of Intel, SiS and VIA (and arguably the discrete parts of Matrox and PVR). First it's just a few "meaningless" features, but then after a couple of gens, one of them turns out to be the 2008 version of hardware transform & lighting. Not to mention the "high-end prestige" marketing model both companies have been pursuing. Frankly I wish that would die, but they seem to be right: the kids buy what they think they've heard of. :rolleyes:
 
Also...: ATi to discontinue Intel chipset development.

Still in the rumor phase, it seems, though in the DigiTimes article that led me there, they claimed:
digitimes said:
The first impact of AMD's acquisition of ATI Technologies on Taiwan Semiconductor Manufacturing Company (TSMC) and United Microelectronics Corporation (UMC) has surfaced, as ATI has cancelled launch of the company's 65nm chipsets for the Intel platform, the RD700, RS700 and RC710, which were scheduled to be released during the second half of 2007, according to the Chinese-language Commercial Times. TSMC and UMC may lose the secured contract orders, indicated the paper.
 
Hey now! We were there three days earlier! http://www.beyond3d.com/forum/showthread.php?t=32836 Really, we have a front page!

Though there have been enough of these little confirming hints over the last several days, starting with the Italian guy's piece, to make one wonder if there's a bit of a concerted effort to make the message plain just now.

6K posts...
 
Well, if there are things that ATI are doing now that don't make money, they probably shouldn't be doing them, regardless of AMD. But I do think AMD has done that quite a bit themselves, so I wouldn't worry on that front. If you want to think gloom and doom, consider AMD taking what it wants from ATI and selling off the rest, in a progression of sales leading to Bob's Fish and ATI Chips down the road... :)

As for integrated gfx, no, I don't see 2GB of GDDR4 on motherboards any time soon... :) Although that might be very nice, come to think of it, at least for the gfx side; I don't know how that kind of RAM would work for random access. Just think of a laptop using graphics-card memory types for system RAM. 1GB would be double what's on a gfx card these days, but the speed/bandwidth vs DDR2 could be compelling.
 
I see GPU/CPU/northbridge on one chip, with a high-speed memory interface and a southbridge. You eliminate two chips and you could eliminate PCI Express slots. This would allow you to build really low-cost PCs that would fit markets like India and China. With all the expansion options available for PCs, and with sound, NIC, USB and FireWire built in, the need for slots in low-end PCs is becoming moot.
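
As a rough illustration of the chip-count argument just made, here is a tiny Python sketch. The part lists and dollar figures are invented placeholders, not real bill-of-materials numbers; the point is only that folding the GPU and northbridge into the CPU die removes two chips (and, in this sketch, the graphics slot) from a low-cost board:

# Purely illustrative: hypothetical part lists and prices, not real BOM figures.

conventional = {
    "CPU": 60.0,
    "Northbridge": 15.0,
    "Low-end graphics chip": 25.0,
    "Southbridge": 8.0,
}

single_chip = {
    "CPU + GPU + northbridge (one die)": 85.0,  # assumed price for the combined part
    "Southbridge": 8.0,
    # PCI Express graphics slot and its routing are gone as well in this sketch
}

def summarize(name, parts):
    print(f"{name}: {len(parts)} chips, ~${sum(parts.values()):.0f} in silicon")

summarize("Conventional low-end platform", conventional)
summarize("Single-chip platform", single_chip)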
 
Uhm... instead of forecasting DOOOOOOOOOM for everyone (world domination for nV), what would happen if DAAMIT raised the bar for IGPs to the point where a separate video card would prove to be a very uneconomical option?

Hell, it's already uneconomical, but as soon as your $400 CPU performs just as well as a $200 CPU plus a $200 G/VPU, the balance of the money-making market (low and mid-low end) shifts completely to integrated graphics.

The problem is that it seems someone has to be doomed by this: ATI or NVIDIA. This is too great a change, IMO, for there not to be some major repercussions to the equilibrium the market has reached. If IGP becomes the de facto thing, then it might well kill NVIDIA...
 