EU cripples future graphics cards

fellix

Veteran
NordicHardware has seen exclusive information about a new energy law that will apply within the EU. The law requires that both discrete and integrated graphics cards live up to certain energy standards. AMD is worried that this will affect next-generation graphics cards and could see them barred from sale in the EU.

There are standards that ensure pre-built computers, as well as discrete components, achieve a certain level of energy efficiency. Exactly how much depends on a number of criteria. These standards also include simple things, such as requiring the computer to enter sleep mode after a certain amount of idle time.
Read more...

:oops:
 
At last... I've been waiting for that for years, it's about time we stop wasting energy like that and force people to be a little more responsible.
The goal of graphics hardware is not to produce heat anyway.
 
The article is a complete fucking mishmash of poor linking and poor reading, as the guy in the comments points out (I sent them a mail to amend the article; leaving it up as-is is just plain irresponsible). The actual relevant doc is this:
http://www.eup-network.de/fileadmin/user_upload/Computers-Draft-Regulation-subject-to-ISC.PDF

The 320 GB/s is actually a lower limit for a temporary exemption of the SLEEP/IDLE/OFF power consumption requirements. There are no requirements for the ON state power consumption whatsoever.

Which is not to say that the incredibly short time hardware developers get to react to this can't be disastrous ... if AMD by fluke just can't meet the requirements but NVIDIA can ... well, bye bye hardware competition in the EU.
 
MfA said:
The article is a complete fucking mishmash of poor linking and poor reading, as the guy in the comments points out. The actual relevant doc is this:
http://www.eup-network.de/fileadmin/user_upload/Computers-Draft-Regulation-subject-to-ISC.PDF

The 320 GB/s is actually a lower limit for a temporary exemption of the SLEEP/IDLE/OFF power consumption requirements. There are no requirements for the ON state power consumption whatsoever.

That's disappointing :(
Better than nothing I guess.
 
Rodéric said:
At last... I've been waiting for that for years, it's about time we stop wasting energy like that and force people to be a little more responsible.
The goal of graphics hardware is not to produce heat anyway.
Why do you think regulation is necessary? The market has been pushing the semiconductor and computer industries to be power efficient for the past few years and there has been progress with each generation of hardware. This is a jobs bill for lawyers.
 
Rodéric said:
The goal of graphics hardware is not to produce heat anyway.
The goal of graphics hardware is to transform energy into something else that's useful to the user. Heat is the necessary and inevitable byproduct, the way it is for any other transformation process in existence.

What's your point?
 
Rodéric said:
The goal of graphics hardware is not to produce heat anyway.
Well, games cause excitement and heat production in humans, not to mention extra CO2... Crackdown on Wii and Kinect next? Or maybe futbol...
 
EU cripples future graphics cards (Exclusive)

http://www.nordichardware.com/news/71-graphics/46718-eu-cripples-future-graphics-ca

There are currently seven specifications for graphics cards - G1, G2, G3, G4, G5, G6 and G7. Graphics cards of the G7 classification have a bandwidth of 128 GB/s (gigabytes per second) or more, with no upper limit today.
...

The commission wants to stop dedicated graphics cards of group G7 from going above 320 GB/s - that is, in theory, a 384-bit memory bus connected to memory operating at 6667 MHz, or 512-bit with 5001 MHz. This is definitely within reach for the next generation of graphics cards. The Radeon HD 7970 GHz Edition currently has a bandwidth of 288 GB/s with a 384-bit memory bus and 6000 MHz memory. For notebooks the limit will be only 225 GB/s.

...

Earlier today there was talk about the new restrictions going into effect in early 2013, but now it looks like it will be 2014. This will put nearly unrealistic demands on both AMD and Nvidia. Besides the fact that the standardization is not very logical, since memory bandwidth does not translate into performance that easily, we see it as a great obstacle for future graphics cards...
WOW, so much rubbish in one article.
WTH are they smoking? Must be really bad... If true, this will stop the progress of video cards: no faster memory, no higher performance. Surely, memory bandwidth does translate into higher performance.
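For what it's worth, the bandwidth arithmetic quoted above does check out. A minimal sketch in Python (the bus widths and data rates are the article's figures; the article's "MHz" values are effective transfer rates, i.e. MT/s):

```python
def bandwidth_gbs(bus_width_bits, data_rate_mtps):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second.

    data_rate_mtps is the effective data rate in MT/s, which is what GPU
    specs usually quote as "memory clock" in MHz.
    """
    return bus_width_bits / 8 * data_rate_mtps / 1000

print(bandwidth_gbs(384, 6667))  # ~320 GB/s, the quoted G7 threshold
print(bandwidth_gbs(512, 5001))  # ~320 GB/s via a wider, slower bus
print(bandwidth_gbs(384, 6000))  # 288 GB/s, Radeon HD 7970 GHz Edition
```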

Luckily, there are commenters who give a better idea of the new regulation.

We've heard of this directly from AMD, who is seriously concerned about this as it could limit their future generations of GPUs.
Whatever you heard, you didn't understand it. The regulations (the draft is linked in the comments above, not this report on the amended directive you provide: http://www.eup-network.de/fileadmin/user_upload/Computers-Draft-Regulation-subject-to-ISC.PDF) divide computers into energy classes (like refrigerators, for example). The GPU classes serve this purpose, and the numbers are the same in the draft; there won't be an upper limit.
The numbers you provided - 320 and 225 GB/s - also appear in the draft, and you completely misunderstood them. Those are the numbers ABOVE which high-end computers (which just wouldn't fit into the energy requirements) are EXEMPT from the regulation. And the problem the industry has with this is that those numbers are too high: http://www.digitaleurope.org/Portals/0/Documents/ENV/EcoDesign/DIGITALEUROPE%20Response%20Draft%20Regulation%20ErP%20Lot%203_20120801.pdf
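Read that way, the exemption test is just a threshold check. A minimal sketch under that reading (the 320/225 GB/s figures are the draft numbers quoted above; the function name and the inclusive boundary are my assumptions):

```python
def bandwidth_exempt(bandwidth_gbs, notebook=False):
    """True if a discrete GPU's bandwidth clears the draft's temporary
    exemption threshold (desktop: 320 GB/s, notebook: 225 GB/s).

    Whether the boundary itself is inclusive isn't clear from the thread.
    """
    threshold = 225.0 if notebook else 320.0
    return bandwidth_gbs >= threshold

print(bandwidth_exempt(288.0))  # False: an HD 7970 GHz Edition would still be covered
print(bandwidth_exempt(336.0))  # True: a faster next-gen card would be exempt
```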
 
UniversalTruth said:
If true, this will stop the progress of video cards: no faster memory, no higher performance.
No it won't. The EU will just get down-clocked crippled versions.

You know F1 cars only get about 4 miles per gallon, maybe next they can bump that up to about 35.
 
What's your point?

I think Rodéric's point is that AMD and Nvidia have pushed power consumption to absurd levels trying to get the highest absolute performance with no regard for electricity usage, causing excessive heat production and the associated noise.

I remember a time when computer ICs didn't have heat sinks at all (not even passive ones) and computers were silent.

Cheers
 
I think Rodéric's point is that AMD and Nvidia have pushed power consumption to absurd levels trying to get the highest absolute performance with no regard for electricity usage, causing excessive heat production and the associated noise.

I remember a time when computer ICs didn't have heat sinks at all (not even passive ones) and computers were silent.

Cheers

Nothing like a 4 MHz Z-80 to show those pixels a thing or two :)
 
Highlights from my reading of the document MfA linked (a rough allowance calculator is sketched after the lists below):

  • Cat. D desktops (e.g. 4 cores, 4GB RAM) can use around 66W maximum at idle, before a discrete GPU is considered.
  • Cat. B desktops (e.g. dual core, 2GB RAM) can use around 50W idle instead.
  • G7 GPUs (192-bit or higher, up to 320GB/s) can add around 64W idle. (They can add more if the rest of the system is below its limit.) This drops to around 38W after 30 months.
  • G4 GPUs (64-96 GB/s, like the 8770 or 650) can add around 28W instead. This drops to 17W in 30 months.
  • Limits are different if adding a second GPU. Small adjustments for extra memory, storage devices, etc.

There are no limits for a computer that has all of the below:
  • 6+ cores
  • discrete GPU with 320GB/s+ bandwidth
  • 16GB+ RAM
  • 1kW+ PSU
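To make those numbers concrete, here's the rough allowance calculator mentioned above (a sketch only: the baselines and adders are my reading of the draft as summarized in this post, not official values, and every name here is hypothetical):

```python
# Idle-power allowance per the figures summarized above. Baselines and
# GPU adders are this post's reading of the draft, not official values.
BASE_IDLE_W = {"B": 50.0, "D": 66.0}   # desktop category baselines, watts
GPU_ADDER_W = {
    "G4": (28.0, 17.0),                # (initial, after 30 months)
    "G7": (64.0, 38.0),
}

def idle_allowance(category, gpu_class=None, after_30_months=False):
    """Total idle allowance in watts for a desktop plus one discrete GPU."""
    watts = BASE_IDLE_W[category]
    if gpu_class is not None:
        watts += GPU_ADDER_W[gpu_class][1 if after_30_months else 0]
    return watts

print(idle_allowance("D", "G7"))                        # 130.0 W at first
print(idle_allowance("D", "G7", after_30_months=True))  # 104.0 W later
```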
Comments:

The initial wattages are high, IMHO. Even the lowered wattages are higher than the idle wattages most reviews report for GPUs in those categories. The exception is if multiple monitors are connected - I hope that's part of the testing procedure, but it doesn't say.

I didn't examine the mobile requirements.
 
Nothing like a 4 MHz Z-80 to show those pixels a thing or two :)

Hey!

My first workstation dedicated to Wolfenstein and Doom^W^W^W software development had a 486 that used less than 4W; it had a *tiny* heat sink.

Luckily it looks like the tablet form factor is forcing power consumption down.

Cheers
 
I remember a time when computer ICs didn't have heat sinks at all (not even passive ones) and computers were silent.
I remember that time too, like yesterday. In fact, not just yesterday, today too. They're called iPad and Nexus 7 and the like. They're also 1000x faster than the one you're probably remembering. There are also MacBook Airs and similar machines that are totally silent, or near silent at maximum capacity.

But it's still pointless to make any such comparison.

If you want to encourage sensible energy usage, fine: increase energy prices any way you want. But imposing technical limits on a field that's still evolving at breakneck speed is plain stupid. You can do it for cars or air conditioners or your fridge, which have had 50 years to evolve to a point where all improvements are very incremental. Doing it for a field where you'll still see a 10x perf improvement a couple of years from now is incredibly short-sighted. It's also unnecessary: GPUs have been running into the power wall already anyway. AMD and Nvidia are forced to self-regulate on this point by themselves.
 
Of course, if Europe is willing to put constraints on its global technology competitiveness this way, that's their choice to make. It's not as if they're not already missing the boat on cloud computing for similar regulatory reasons either...
 
The 320 GB/s is actually a lower limit for a temporary exemption of the SLEEP/IDLE/OFF power consumption requirements. There are no requirements for the ON state power consumption whatsoever.

Whose incredibly "genius" idea was it to put a regulation there? And WTH does it have to do with power requirements?
 
I don't quite understand what you mean. Do you mean why they put the GB/s limit for the exemption in? It's not really that bad a way to separate performance classes ... really, the regulation is pretty decently written; the only problem seems to be that the time frames for full compliance are too short.

Also silent guy clearly needs a higher resolution monitor if he couldn't read the third comment on the thread :)
 