Sandy Bridge review thread.

I'm so tempted. I just wish AMD would get some Bulldozer benchmarks out there so I could write it off and get SB (or not).

I suggest you read the Bulldozer thread here and the Realworldtech article on Bulldozer. It will tell you everything you need to know about it in terms of overall performance.

If you're lazy, I can sum it up for you. :p
 
I suggest you read the Bulldozer thread here and the Realworldtech article on Bulldozer. It will tell you everything you need to know about it in terms of overall performance.

If you're lazy, I can sum it up for you. :p

I have read the thread and skimmed the Realworldtech article, and my summary is that Bulldozer's performance depends on achievable clocks and buffer/branch efficiencies, which loosely translates into "we don't know yet."

[Edit] That and claims that an insider has said Bulldozer will be 50% faster than a 980.

The thing is we know people have working hardware in hand now (or there are some good fake CPUID shots) so I would love a leak.
 
I installed my new 2500K yesterday. Damn it's fast!! I had no idea how much my old C2D 6600 (2.4GHz) was holding back my HD 4890, but since swapping CPUs, tons of my games are significantly faster and smoother. I want to try out the real CPU killer - GTA4 - but my DVD drive's down atm so I can't. Can't wait to wipe the smug grin off that game's face :)
 
Congratulations on the upgrade! That brings back memories of when I moved from an Opteron 165 (Socket 939) to a Core 2 Duo E8400 -- noticeable performance boost and smooth work all over the place, both in games and desktop applications, even without running benchmarks. ;)
 
I recall when I upgraded from a Northwood-based Celeron to a Conroe-based Pentium Dual-Core OCed to 3 GHz. Daaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaamn that was a massive difference.
 
Athlon X2 3800 to 3.2GHz E6750 was an astounding upgrade for Team Fortress 2. Unfortunately I'm still using the E6750 :(
 
Really? I went from a 1.2GHz T-Bird to a P4 3GHz to a Q6600,
and in every case everything ran fine before the upgrade and everything ran fine after the upgrade.
Noticeable difference, but not much :(

And TF2 ran bad on an X2, really?
 
And TF2 ran bad on an X2, really?

On 32 player servers the X2 didn't cope very well. My C2D maintains a pretty consistent 60fps where the X2 would hover around 30fps during moderate action, and in big fights would fall well below that.

Also the X2 would totally fail in X3:Reunion during large battles, where the C2D can keep things at least playable.
 
I upgraded from the Q6600 to a 2600K just last week. Was going to get the 2500K but they were out of stock. One thing I notice is that my computer starts up darn fast :) Before, the BIOS checks and whatnot took almost 15 secs and now it just skips through them. But overall, in programming and regular desktop use the difference is quite small. I have an SSD which had already made everything stupidly fast.

I suppose my GPU (5850) is somewhat holding me back now regarding gaming, but I think I'm waiting for the next generation before upgrading. Most games seem to run nicely at 1920x1200, but I'm missing the horsepower required to run them at 2560x1440.
 
The more I test the 2500K, the more I realise how much the C2D was holding my system back. Games I've tested so far where I've noticed or recorded a game-changing speed-up (in some cases this is just the ability to maintain a solid 60fps where I couldn't before):

Assassin's Creed 2
Lost Planet 2
RE5 Benchmark - 60fps
Street Fighter 4 Benchmark - 60fps
DMC4 Benchmark - 60 fps
GTA4/EFLC - assumed, as I haven't tested these yet, but EFLC was near enough unplayable on the C2D.

EDIT: I also had fun with that old Intel quad core demo - Ice Storm Fighters which my C2D couldn't come close to handling at high detail but the 2500K just laughs at it :)
 
I can safely say that most modern games are CPU limited on my 3.2GHz C2D. Now I only have a GTX260 but I also only have a 1440x900 monitor so the GPU is usually not the problem. Didn't start to notice this until Far Cry 2, which doesn't go much above 30fps at high settings and graphics options like AA don't make much difference at all.

I've noticed that console ports are especially likely to be CPU limited, which is funny considering how vastly more capable the C2D should be than the Xbox CPU.
 
I can safely say that most modern games are CPU limited on my 3.2GHz C2D. Now I only have a GTX260 but I also only have a 1440x900 monitor so the GPU is usually not the problem. Didn't start to notice this until Far Cry 2, which doesn't go much above 30fps at high settings and graphics options like AA don't make much difference at all.

I've noticed that console ports are especially likely to be CPU limited, which is funny considering how vastly more capable the C2D should be than the Xbox CPU.

Yes, that's what I've noticed as well. I guess it's because on the PC developers rely more on the CPU than they do on the consoles, since CPU capabilities are always completely generalised but you can never guarantee the GPU's capabilities. Hence PC CPUs end up doing more work than console CPUs in the same game. And that's before you get into platform-specific optimisations and the fact that PCs have to run Windows etc...

That said, I'd be surprised if a 3.2GHz C2D would ever fall behind console performance. It's a third faster than my old CPU, which aside from GTA4 and Saints Row 2 (both very poorly CPU-optimised) could pretty much match console performance in any game. Far Cry 2 in fact was always more than fast enough, definitely 30fps+.
 
Methinks some developers do a straight recompile for Windows and call it a day. With proper optimization I can't see the Xbox CPU ever approaching a 3200MHz Conroe in a video game.
 
Street Fighter 4 Benchmark - 60fps

Are you sure you're not suffering from the placebo effect? Or vsync?


A couple of RE5 results:

E8400 @ 3.8GHz
4GB DDR2-1066
GTX 260 Core 216 @ stock clocks
Windows 7 build 7600 x64

DX10 mode, 1920x1200, 4x AA, vsync on, everything else maxed

59.9 FPS constantly through the entire test

DX 9
All High or on
C8xQAA
1600x900

Average 62.1 fps

E6750 @ 3.5GHz
2GB RAM
8800GTS G92

Vista, dual-core Athlon 64 X2 5600+ @ 2.8GHz
Nvidia 8800 GT 512MB
DX10
I average around 57 fps, for a B rank on the variable benchmark. 34fps for the fixed benchmark.
 
Athlon X2 3800 to 3.2GHz E6750 was an astounding upgrade for Team Fortress 2. Unfortunately I'm still using the E6750 :(

A nice upgrade I did was from an XP 2400+ to the lowest-end 65nm Sempron. It was much faster with 100MHz less, and would overclock until I half-borked the mobo.
It was derided back then as if it were a 486SX, and it was well under the line, but it was my first CPU that was both fast and low power.

A modern, but not as cheap, equivalent would be the Pentium G620; it scores ridiculously high in hardware reviews, if the game isn't too thread-heavy.
 
Methinks some developers do a straight recompile for Windows and call it a day. With proper optimization I can't see the Xbox CPU ever approaching a 3200MHz Conroe in a video game.

There's the API overhead, especially draw calls, so there's stuff that's cheap on the consoles (particles, flying newspapers and debris) but not so much on PC.

What's more, maybe the framerate is already higher on the PC, and the shader workload should be relatively low if you're running a "bad ass" GPU (anything that trounces consoles).
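To make the draw-call point concrete, here's a toy model of CPU-side submission cost. The per-call overhead figure is an illustrative assumption (not a measurement), but the shape of the problem is real: thousands of tiny draw submissions eat CPU time regardless of how fast the GPU is.

```python
def frame_submit_cost_ms(draw_calls, per_call_overhead_us=25.0):
    # Toy model: CPU submission cost = number of API calls times a fixed
    # per-call overhead (driver validation, state checks, kernel work).
    # The 25us figure is a made-up illustrative number, not a benchmark.
    return draw_calls * per_call_overhead_us / 1000.0

# 5000 bits of debris drawn one-by-one vs. the same geometry in 50 batches
naive = frame_submit_cost_ms(5000)   # 125.0 ms of CPU time per frame
batched = frame_submit_cost_ms(50)   # 1.25 ms per frame
```

Under this model the naive path alone caps you below 10fps no matter the GPU, which is why flying newspapers and particles that cost almost nothing on a console can be CPU killers in a straight port.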
 
I can't see spending the extra $$$ on anything past the 2500K. Would rather get a bigger SSD or faster GPU than a few hundred extra MHz that you'll likely never notice, at least in games.
 
Are you sure you're not suffering from the placebo effect? Or vsync?

By 60fps I meant a fixed 60 fps, i.e. vsync on and the game never deviates from 60fps. I can get a much higher average with vsync off.

Bear in mind my old CPU was only a 2.4GHz Conroe, so that's quite a lot slower than any of the CPUs you're getting a solid 60fps with there. In fact it's not much faster than the 5600+ where you're averaging 34 fps.
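For anyone puzzled by the fixed-60 vs. average-fps distinction: with vsync on (double buffered, no triple buffering), a frame that misses a refresh waits for the next one, so the frame rate snaps to integer divisors of the refresh rate. A rough sketch of that behaviour:

```python
import math

REFRESH_HZ = 60  # a typical 60Hz monitor

def vsync_fps(render_ms):
    # With vsync on, the effective frame time rounds up to a whole number
    # of refresh intervals, so achievable rates are 60, 30, 20, 15, ... fps.
    refresh_ms = 1000.0 / REFRESH_HZ
    intervals = math.ceil(render_ms / refresh_ms)
    return REFRESH_HZ / intervals

vsync_fps(10.0)  # 60.0 -> a "solid 60"
vsync_fps(20.0)  # 30.0 -> one missed refresh halves the frame rate
```

So a CPU that renders a frame in 15ms and one that does it in 5ms both show "a fixed 60fps" with vsync on; the gap between them only shows up with vsync off, which is why the uncapped average is much higher.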
 