Haswell vs Kaveri

I think it's a question of "what do we do with these cores?" As process nodes become harder to reach, and single threaded performance gains slow down, chip manufacturers have effectively "built sideways" instead of "building upwards".

With their transistor budgets, they could build eight-core chips with no graphics, but both AMD and Intel have decided to use some of that budget for graphics, believing this creates a better product that is more attractive to a larger portion of the market.

Selling large numbers of CPU cores, several of which sit idle, seems a less attractive proposition than selling both CPU and GPU functionality in one package. AMD getting a single chip that provides both functions into all the new consoles seems a good example of that.

The compromise of converging CPU and GPU into one chip may not be suitable for the likes of us here on B3D, but you can imagine it seems a pretty good idea for casual users and the companies that build computing devices for them.
 
Anyways, SoCs are indisputably going to destroy discrete in laptops and anything smaller, but it's not clear if anyone cares in desktops. If SoCs were to become popular in larger form factors I imagine it's going to be in all-in-ones, not conventional sockets.
I agree gaming and workstation desktops will stay discrete, but I assume many business desktops will stop having low end discrete cards if they haven't stopped already. They might have gone IGP years ago though.
 
Almost all business desktops have an integrated GPU in one form or another.
Even the i810 was considered enough in that segment.
 
Business users tend to only need the GUI rendered, and the 810 was a perfectly adequate GUI accelerator. In retrospect, its 3D was not that bad either; it would have been impressive the year before. The main problem was that Intel didn't give it AGP capability, which made enthusiasts angry.
 
I agree gaming and workstation desktops will stay discrete, but I assume many business desktops will stop having low end discrete cards if they haven't stopped already. They might have gone IGP years ago though.
Indeed, but honestly a lot of businesses have gone to laptops/docking stations as well. I imagine some will go to AIOs to save space. And I doubt the ones that stick with socketed desktops need anything more than absolutely minimal graphics power.
 
Almost all business desktops have an integrated GPU in one form or another.
Even the i810 was considered enough in that segment.

Don't know how accurate this is. At my workplace (a largish automotive supplier), for example, we used to have low-end discrete HD 4xxx series cards and now have some low-end Quadros. We don't need them in the slightest with the Ivy Bridge CPUs we have.

I think our IT department just happily orders whatever HP throws at us and calls it a day.
 
I think our IT department just happily orders whatever HP throws at us and calls it a day.

This is what happens everywhere I've ever known, except it's usually Dell that gets the business here in the UK. Most of them haven't a clue what they are getting but just assume it must be good because of the brand.
 
Anyways, SoCs are indisputably going to destroy discrete in laptops and anything smaller, but it's not clear if anyone cares in desktops. If SoCs were to become popular in larger form factors I imagine it's going to be in all-in-ones, not conventional sockets.

SoCs will severely gut the low and up to possibly mid-range discrete market, however.

That leaves some fraction of the mid-range, plus the performance and enthusiast SKUs, with whatever massive price tag gets attached to a GPU at the highest end.

Even if desktop users don't care, the cannibalization of discrete volumes to SoCs could put sand in the gears of any development that is done solely for the discrete market.

That's why I was a bit intrigued by an unsubstantiated rumor a while back that Kaveri could have a PCI client mode. That would bring additional volume to a discrete segment that would ebb and flow based on how competitive SoCs were. If one of those SoCs was Kaveri itself, so much the better.
 
Business users tend to only need the GUI rendered, and the 810 was a perfectly adequate GUI accelerator. In retrospect, its 3D was not that bad either; it would have been impressive the year before. The main problem was that Intel didn't give it AGP capability, which made enthusiasts angry.


Back when it was released, I think the 810 was more competitive with high-end cards than the Haswell Iris Pro 5200 is now.

But yes, even the lowest HD Graphics still feels adequate to me. I decided to test the HD 2000 on my CPU a few months ago and the experience was very positive.
 
Back when it was released, I think the 810 was more competitive with high-end cards than the Haswell Iris Pro 5200 is now.

...
An 810? Not even an 810e+ was close to a Radeon or Nvidia card. Maybe you could make a case for this if you were using a small CRT, like really small. I think it's the same difference (Haswell HD) relative to the discrete offerings of today. But it's good enough. Just not at 1440p...
 
When the i810 came out it was still almost a novelty to have something 3D-accelerated running at 640x480. It was more 3D-capable than proto-GPUs like the S3 Virge, and better than an ATI Rage Pro. But those were the times of terrible drivers that might give you inconsistent performance, games that wouldn't run at all, or image quality problems. You also got quite funny bandwidth starvation running on a 66 or 100 MHz FSB.
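
To put rough numbers on that bandwidth starvation, here's a back-of-the-envelope sketch in Python; the clocks and bus widths are from memory, so treat them as assumptions rather than datasheet values:

```python
# Rough peak-bandwidth comparison: an IGP sharing the system SDRAM bus vs a
# contemporary discrete card with dedicated memory. All numbers assumed.

def bus_bw_mb_s(clock_mhz, width_bits):
    return clock_mhz * width_bits / 8          # single data rate, in MB/s

print(f"i810 on 66 MHz FSB : {bus_bw_mb_s(66, 64):.0f} MB/s, shared with the CPU")
print(f"i810 on 100 MHz FSB: {bus_bw_mb_s(100, 64):.0f} MB/s, shared with the CPU")
print(f"Riva TNT local RAM : {bus_bw_mb_s(110, 128):.0f} MB/s, all for the GPU")
```

So even before driver quality enters into it, the IGP was fighting the CPU for a fraction of the bandwidth a discrete card had to itself.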

I would say that nowadays you have fewer worries about drivers and whether your game will run at all, and less worry about the game being unplayably slow or the whole computer slowing down because of the memory subsystem.
 
An 810? Not even an 810e+ was close to a Radeon or Nvidia card. Maybe you could make a case for this if you were using a small CRT, like really small. I think it's the same difference (Haswell HD) relative to the discrete offerings of today. But it's good enough. Just not at 1440p...
Radeons didn't exist when the 810 came out. The competition was the likes of the Nvidia TNT and Rage 128 (I don't think the Voodoo3 was out, and the Voodoo2 was 3D-only); both were faster, but feature-wise all of them were bad :).
But yes, it became obsolete pretty quickly once the Radeons and GeForces appeared shortly afterwards.
 
Unless my memory fails me, comparing the 810 with the Virge and Rage Pro is not particularly useful, as it came out in an age in which 3dfx was already well established, and you had things like the Matrox G200/G400, the S3 Savage or the Riva TNT. Not to say that the i740 (which is what the 810 was pretty much, IIRC, albeit Intel renamed it to i752) wasn't an interesting piece of kit, but it was not a scorcher in 1999. Also, I think we are drifting off-topic at an accelerated pace.:cry:
 
Unless my memory fails me, comparing the 810 with the Virge and Rage Pro is not particularly useful, as it came out in an age in which 3dfx was already well established, and you had things like the Matrox G200/G400, the S3 Savage or the Riva TNT. Not to say that the i740 (which is what the 810 was pretty much, IIRC, albeit Intel renamed it to i752) wasn't an interesting piece of kit, but it was not a scorcher in 1999. Also, I think we are drifting off-topic at an accelerated pace.:cry:


Let's say all of that is past history.

Anyway, have you been following the Top500 HPC list? Aieee, Intel on top with its coprocessors... I wouldn't have thought they would take the number one spot with them.
 
Mediocre gains as expected. They were already at the TDP limit and doubling execution units isn't doing a lot for them. The MBA actually loses in some games to an Acer with HD 4400.

It's ok for the same node I guess but the area cost makes it absolutely not worth it.
 
Mediocre gains as expected. They were already at the TDP limit and doubling execution units isn't doing a lot for them. The MBA actually loses in some games to an Acer with HD 4400.

It's ok for the same node I guess but the area cost makes it absolutely not worth it.
HD 5000's base clock rate is halved (200 MHz) compared to the HD 4000 series (400 MHz). With twice as many EUs, the GPU can run at half the clocks and still provide the same performance. As everyone here knows, double clocks consume more than double power (4x is closer in general). So Intel actually traded die area for power savings. I think it was worth it, as the MBA 2013 has an outstanding 12-hour battery life (+5 hours compared to last year's model, AND slightly better performance).
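
A toy version of that trade-off, with invented voltages and unit counts (it's just the C*V^2*f relationship, not Intel's actual operating points):

```python
# Wide-and-slow vs narrow-and-fast at the same theoretical throughput
# (width * freq held constant). Dynamic power ~ width * V^2 * f, and higher
# clocks need higher voltage, which is why 2x clocks costs closer to 4x power.

def dyn_power(width, volts, freq):
    return width * volts**2 * freq             # arbitrary units

p_fast = dyn_power(width=1, volts=1.00, freq=1000)   # narrow, high clock
p_slow = dyn_power(width=2, volts=0.80, freq=500)    # 2x EUs at half clock,
                                                     # assumed lower voltage
print(p_fast, p_slow)                                # 1000.0 vs 640.0
```

Same work per second, roughly a third less power, paid for in die area.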

HD 5000 (GT3) can still turbo clock up to 1100-1300 MHz in cases where the (15W) TDP allows it. But that shouldn't occur as often as it did in the past (but at 650 MHz it should already offer similar performance). Also without Crystalwell the GPU will be severely bandwidth bound at maximum clocks, lowering the potential performance gains even further. Anand's tests with GT3e showed that GT3 is TDP bound even at 47W. He increased the TDP to 55W (using Intel Extreme Tuning Utility) and it brought noticeable gains in games (but not that much in pure GPU synthetic benchmarks, as the CPU half is pretty much idling in those and gives its TDP to the GPU).
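
The TDP-sharing effect Anand measured is easy to caricature with the same kind of model (all constants invented; real turbo binning is far more complicated):

```python
# Sketch of TDP-bound turbo: the GPU clocks up until its modeled power hits
# whatever share of the package TDP the CPU isn't using.

def gpu_power_w(freq_mhz, k=6e-9):
    return k * freq_mhz**3          # assume voltage tracks frequency: ~f^3

def max_turbo_mhz(package_tdp_w, cpu_draw_w, f_min=200, f_max=1300):
    budget = package_tdp_w - cpu_draw_w
    f = f_max
    while f > f_min and gpu_power_w(f) > budget:
        f -= 50                     # step down in 50 MHz bins
    return f

# GPU synthetic benchmark: CPU nearly idle, so the GPU gets most of the 15W
print(max_turbo_mhz(15, cpu_draw_w=3))     # -> 1250
# Real game: the CPU is busy too, so the GPU throttles well below max turbo
print(max_turbo_mhz(15, cpu_draw_w=10))    # -> 900
```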
 
The MacBook Air with HD 5000 is running in single-channel RAM mode AFAIK, at least the 4GB version which Anand tested. The 8GB version should run in dual-channel mode (2x4GB). What do you guys expect from single channel? Did Anand compare HD 5000 single-channel against HD 4000/HD 4400 dual-channel?
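
For reference, the peak numbers at stake (a quick sketch, assuming the 2013 MBA's LPDDR3-1600):

```python
# Peak DRAM bandwidth per channel count: 64 bits (8 bytes) per transfer.

def peak_gb_s(mt_per_s, channels, bytes_per_transfer=8):
    return mt_per_s * 1e6 * channels * bytes_per_transfer / 1e9

print(f"single channel: {peak_gb_s(1600, 1):.1f} GB/s")   # 12.8 GB/s
print(f"dual channel  : {peak_gb_s(1600, 2):.1f} GB/s")   # 25.6 GB/s
```

Halving the bandwidth of a GPU that's already bandwidth-bound in dual-channel mode would go a long way toward explaining it losing to a dual-channel HD 4400.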
 