Ivy Bridge to have 77W max TDP?

http://vr-zone.com/articles/ivy-bri...d-forwards-compatibility-explained/13754.html

A fair few new details of Intel's upcoming Ivy Bridge CPUs and accompanying platforms have appeared over on a Chinese forum and although we'd take this with a pinch of salt, the roadmaps do look like the real deal. If proven to be correct, then we're looking at a peak TDP of 77W for the high-end models, down from the current 95W for Sandy Bridge.

77W TDP for a quad-core 3500K(2500K equivalent) and 3700K(2600K/2700K equivalent)? That's almost 20W TDP lower than Sandy Bridge. Looks like another win for Intel, increased performance and lower power consumption.
 
Nothing new really, they are using 3D tri-gate transistors in IB, and even their first slides showed it would be 20-30% more power efficient, from what I remember.
 
I regard Ivy Bridge as a kind of "environmentally friendly demonstration CPU" with low power consumption... but actually I believe this is happening because of Bulldozer's poor performance.
As we all know, dynamic power follows the formula P = 1/2·C·V^2·f. If the new technology lowers power, then Intel could instead set a higher frequency to get better performance... Maybe, given Bulldozer's poor showing, Intel thinks a conservative strategy won't do them any harm.
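The trade-off in that formula is easy to sanity-check numerically. Here's a minimal sketch (the capacitance, voltage, and clock figures are made up for illustration, not real Sandy/Ivy Bridge numbers):

```python
def dynamic_power(c_farads, v_volts, f_hz):
    """Dynamic switching power: P = 1/2 * C * V^2 * f."""
    return 0.5 * c_farads * v_volts**2 * f_hz

c = 1e-9  # effective switched capacitance (illustrative)

# Baseline operating point (illustrative): 1.2 V at 3.4 GHz.
base = dynamic_power(c, 1.2, 3.4e9)

# Drop voltage 10% at the same clock: power falls ~19%,
# because power scales with the *square* of voltage.
lower_v = dynamic_power(c, 1.08, 3.4e9)
print(round(lower_v / base, 3))  # -> 0.81
```

This is why a process that lets the chip run at lower voltage can either cut TDP at the same frequency or buy frequency headroom at the same TDP, which is exactly the choice the post is talking about.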
 
They've been very conservative since Core 2. See the huge clock headroom their chips have. They totally control the definition of the high end.
 
Even at 77W Ivy Bridge will be faster than Sandy Bridge. Nothing to complain about if you're a gamer, since SNB is already fast enough for every game and will be for a long time.

This is actually a really good thing for PC gaming, since massive CPU power will be available in low power laptops, which means gaming-grade CPUs will enjoy a much wider market penetration.
 
News like this makes me so mad...

I can only imagine what a 77W TDP chip coupled with my 300W single-stage unit could do...

Damn you Intel and your stupid cold bugs :(
 
But again, for gaming there's no need for a >77W Intel CPU. They have been more than fast enough for a while now. Making them smaller means they can be used in more places, means more gaming capable PCs, means better for PC gaming.

And make no mistake. 22nm Intel CPUs will be more capable than next-gen console CPUs. And high-end single GPUs like the GF580 are more capable than the next-gen console GPUs. We don't have to worry about being outdated any more. For maybe the first time ever.
 
And make no mistake. 22nm Intel CPUs will be more capable than next-gen console CPUs. And high-end single GPUs like the GF580 are more capable than the next-gen console GPUs. We don't have to worry about being outdated any more. For maybe the first time ever.
Yeah well, unfortunately there aren't (m)any PC graphics blitz titles coming along to prove it, and the inefficiencies of Windows will surely cause deficiencies compared to the new game-tuned console hardware/software. :cry:
 
I hear so much about the inefficiencies of Windows, but see little evidence. Games don't look 10x better on PC because devs don't spend the money on PC titles. We get upgraded console ports. Even the PC flagship Battlefield 3 was designed to run on current gen consoles.

And still, an oldschool 8800GT will run console ports with way better IQ than the consoles. If Windows was such a hindrance, would this be the case?
 
All you need to do is look around for comments from people who work on all of the platforms. At this point PC hardware is so far beyond the console hardware that it is simply ridiculous, and yet all we can argue about is some visual quality aspects that are essentially superficial. We can't even get the same fluidity sometimes. Sure, it's probably partly because developers don't spend a zillion hours working around Windows issues, driver quirks and various hardware interactions. But I'm not sure I remember a time when PC gaming was this refined anyway. Actually, I think right now PC gaming is working better than ever before.

If you go back to DOS you can see PC hardware really leveraged for all it was worth. Of course that era was a pain for many reasons, but it was down to the metal, which is partly what some industry figures are now asking to see a return to.

I think Windows could use some desktop gaming OS competition. I think people assume it's tuned and optimal because there is no way to see it done another way besides those extremely efficient 6 year old consoles.
 
Half the hindrance to PCs is DX9. That will be dead and buried by the time the PC starts seeing next-gen console ports. It'll be DX11+ all the way, and that should even the playing field quite a bit.

Plus don't forget that those superficial improvements equate to pushing 4x the pixels at twice the framerate in some cases (1080p @ 60fps). That's 8x the power requirement right there, before you consider the other graphical improvements seen in PC games. Seems to me that if PCs were so inefficient compared to consoles, we'd need a lot more than the ~10x the power offered by high-end PC GPUs to achieve that.
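The 8x figure checks out arithmetically. A quick sketch, taking a quarter-of-1080p console render target as the baseline (960x540 is illustrative; actual console render resolutions varied by game):

```python
# Console baseline: a quarter of 1080p (illustrative sub-720p target).
console_pixels = 960 * 540       # 518,400 pixels per frame
pc_pixels = 1920 * 1080          # 2,073,600 pixels per frame

pixel_ratio = pc_pixels / console_pixels  # 4x the pixels
fps_ratio = 60 / 30                       # at twice the framerate

print(pixel_ratio * fps_ratio)  # -> 8.0 (pixels pushed per second)
```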

Certainly no game has an issue achieving the same fluidity as its console counterpart on a sufficiently powerful PC unless there's a genuine issue with the engine. The problem is achieving the same fluidity at massively increased settings. It really doesn't take much of a PC to manage a locked 30fps with vsync at sub-720p and low-medium settings, even in the highest-end games.
 
Half the hindrance to PCs is DX9. That will be dead and buried by the time the PC starts seeing next-gen console ports. It'll be DX11+ all the way, and that should even the playing field quite a bit.

Plus don't forget that those superficial improvements equate to pushing 4x the pixels at twice the framerate in some cases (1080p @ 60fps). That's 8x the power requirement right there, before you consider the other graphical improvements seen in PC games. Seems to me that if PCs were so inefficient compared to consoles, we'd need a lot more than the ~10x the power offered by high-end PC GPUs to achieve that.

Certainly no game has an issue achieving the same fluidity as its console counterpart on a sufficiently powerful PC unless there's a genuine issue with the engine. The problem is achieving the same fluidity at massively increased settings. It really doesn't take much of a PC to manage a locked 30fps with vsync at sub-720p and low-medium settings, even in the highest-end games.

If the pixel density on monitors was significantly higher than on TVs, then people would see the improvement in IQ.
 
If the pixel density on monitors was significantly higher than on TVs, then people would see the improvement in IQ.

It effectively is, though. How many games truly output at 720p? Yeah, the television might be a 1080p unit, but it's upscaled at best, meaning blur or else hard pixel edges. Plugging my PC into my HDTV provides an instantly noticeable upgrade in picture quality versus my various consoles, because my PC can drive all ~2M of the pixels on my TV, versus less than half of them + upscaling.
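The "less than half" claim is just pixel counting. A quick check:

```python
pixels_720p = 1280 * 720     # 921,600 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels (~2M, as the post says)

# Even a full 720p render fills less than half of a 1080p panel;
# the rest comes from upscaling.
print(round(pixels_720p / pixels_1080p, 3))  # -> 0.444
```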
 
People also sit a long way away from the TV. Poor IQ is not as noticeable as it is on a monitor from 24 inches.

Also, 720p can provide stellar image quality. We need better AA.
 
If the pixel density on monitors was significantly higher than on TVs, then people would see the improvement in IQ.

I'm not sure I understand the point, improvement in image quality of what to what? If it's TV vs monitor then I don't disagree. But I'm not talking about the output device, I'm talking about the source.

Regardless of whether the display is a 1080p TV or a 1080p monitor, a 1080p source looks a lot better than a 600p source.

Although, for the record, the pixel density of monitors is much higher than that of large-screen TVs, because they pack the same number of pixels into a smaller space. But that's countered by how far the user sits from the screen.
 
Half the hindrance to PCs is DX9. That will be dead and buried by the time the PC starts seeing next-gen console ports. It'll be DX11+ all the way, and that should even the playing field quite a bit.
Yeah but it's not unprecedented. 2005 had the same thing with DX9 and 360/PS3. Prior to that we had a lot of DX7/8-ish multiplatform games from PS2 and Xbox 1. Before that generation we had ports of N64/PS1/Dreamcast stuff too. I imagine whichever API is the new reset button will last 8 years or so again and it will become the new boat anchor eventually.
 
Yeah but it's not unprecedented. 2005 had the same thing with DX9 and 360/PS3. Prior to that we had a lot of DX7/8-ish multiplatform games from PS2 and Xbox 1. Before that generation we had ports of N64/PS1/Dreamcast stuff too. I imagine whichever API is the new reset button will last 8 years or so again and it will become the new boat anchor eventually.

Possibly, but each API increases programmability and gets developers closer to the metal on the PC, so it's a genuine improvement each generation. Consoles, by comparison, are more limited in how much they can improve, since they already let you as close to the metal as is reasonably feasible, and there's only so far you can go towards total programmability (the PS3 was arguably most of the way there already this generation).
 