Intel's Conroe benchmarked!

Druga Runda said:
(attached screenshot: amdbios.jpg)
I didn't even notice this "unknown processor" :rolleyes:

Well, hard to believe we were all so sloppy, but this "Crossfire bug" is odds-on responsible for the huge Intel platform gain in games.



Regardless, in the other bits he pretty much casts quite decent doubt on the whole presentation. Overall, thumbs down to Intel.

I am sure it is a capable chip, very likely faster than current AMD stuff given real-life Pentium M numbers, but Intel's PR department needs to help us all and go on a well-deserved holiday.

Anyone able to find the reference to the Crossfire bug? (A link to the relevant info would be appreciated.)

In any case, the ATI "driver fix" seems rather suspicious as well...

I did find it strange for the BIOS to "not recognize" the processor... though it could mean something else is at work: an old BIOS, or a genuinely new processor.

I'll wait until real benchmarks are out. I don't mind Intel making improvements, but sabotaging your competition like that doesn't help their cause. Too little control over the setup environment creates enough doubt in itself...
 
Last edited by a moderator:
Shame on Hexus and Anand and the others for not noticing this BIOS revision and related bugs. :(
 
overclocked_enthusiasm said:
Looks like ATI wrote a special BIOS revision for Conroe on the RD480 (or was it a driver revision on Catalyst?) for the benchmarking.
Intel couldn't/wouldn't have written it themselves, right?

The RD480 is an AMD board; the Conroe was running on an Intel board.
ATI modified the Catalyst package so that it could recognize the new Intel CPU, but the same driver was used on both machines.
No, Intel usually doesn't write ATI's drivers. ;)

overclocked_enthusiasm said:
With the recent Skype handicap on any AMD core via a driver penalty that only allows Intel cores to do 10 way CC instead of 5 way CC for AMD, I am leery on Intel's motives and actions here.

It has nothing to do with drivers. It was an application decision (still a bad decision, I think), and I don't see how Skype has anything to do with this matter.

overclocked_enthusiasm said:
Maybe everything was on the up and up and they simply built a 65 nm beast that will slay AMD. Count me in the skeptical camp for now. With DDR2 at 800 MHz does AMD have a chance based upon these benches or do they need to do some crazy cache or clock pushing?

It's probably true that the AMD platform wasn't optimized by Intel, but I doubt an optimized one would get anywhere near 40% faster.
My old Pentium M is already faster clock for clock than an A64, so I don't see it as impossible that Conroe's technology advantages over the old family of processors (65 nm, 1066 FSB, DDR2, 4 MB of cache, more instructions per clock) make it that much faster.
 
EasyRaider said:
ANova said:
It won't change. BTX allows for better cooling in general, not just for the CPU but for the whole system. As such, even if the CPU isn't a heater, that just means a much quieter computer is a better possibility.
Not much quieter, just a small edge (at most, I'd say). More important for quiet air cooling are things like slow fans, efficient heatsinks and good airflow management (as few obstructions near fans as possible, etc.). The placement of individual components is less critical; you can already get silent systems with ATX (although not with a Prescott, obviously).
I'm with EasyRaider on this one. The BTX form factor's benefits mostly apply if the CPU is the dominant source of heat. If not, it's a toss-up in practice, I'd say.

Otherwise, the overall main determinant for system noise is total heat output. This is what has to be dissipated, slow large fans and all the rest are just ways to deal with a problem that is already there. What Intel (and perhaps AMD as well) are doing is that they are attacking the problem at its core, and lowering the total power draw/heat output of their products. This is good.

It is long overdue IMHO that power draw joins absolute performance as a prime consideration. At some point even the gfx manufacturers are going to follow suit; there is no way I would ever stick that 190 W peak power draw X1900XT in any of my computers, particularly not if my future CPU draws less than 50 W. Advanced thermal management, noise reduction measures, bulky cases (I've got a P180 now for the main PC) et cetera are all possible, but they are also a major pain in the ass, and most folks who don't go through the cost and tedium are stuck with the drone and whine.
It's just downright lousy ergonomics, and I for one am happy that Intel is doing very visible work to break the trend.
 
Entropy said:
I'm with EasyRaider on this one. [...] there is no way I would ever stick that 190W peak powerdraw X1900XT in any of my computers [...]
A 1900XTX doesn't use 190 watts :???:
http://www.xbitlabs.com/articles/video/display/gpu-consumption2006_4.html
 
EasyRaider said:
Not much quieter, just a small edge (at most, I'd say). More important for quiet air cooling are things like slow fans, efficient heatsinks and good airflow management (as few obstructions near fans as possible, etc.). The placement of individual components is less critical; you can already get silent systems with ATX (although not with a Prescott, obviously).
BTX is designed with good airflow in mind, and combined with more efficient, cooler-running processors, that means smaller passive heatsinks. One 120 mm intake fan blowing over it would be sufficient and extremely quiet. Placement also matters, since heat rises.

Tahir2 said:
http://www.beyond3d.com/forum/showpo...8&postcount=84

Already made my comment, and just like J-Lo, that is a big but for me.

The Conroe/Merom are supposed to include x86-64 (or IA-32e, as Intel calls it). I personally wouldn't use that as a determining factor, though.
 
radeonic2 said:
A 1900XTX doesn't use 190 watts :???:
The 190 Watt peak figure is from ATI themselves.
While the effort xbitlabs have put into their measurement scheme is commendable, it should be noted that it consistently produces low figures for power draw. Part of that comes from not including PSU losses, which most other sites do, but I also suspect that their testing methodology doesn't generate maximum power draw. This matters little if you use their numbers for internal comparisons of different cards, which incidentally is how xbitlabs uses their data.

I admit that the peak number is unlikely to be comparable to Intel's TDP; I used it for contrast. While it remains to be seen just how low the CPU manufacturers dare to go with their power draws, it is clear that they have broken their trend of constantly pushing up power draw to achieve higher clocks and performance. Maintaining low power draw and low noise, and enabling smaller-footprint computers, are perceived as bringing greater overall user value than another xx% of absolute performance. The same reasoning is not automatically applicable to GPUs, because the IHVs can and do offer different chips for different market strata, but it is also obvious that thus far, IHVs have largely seen power draw as irrelevant for GPUs compared to pushing up performance.

It is also clear that top-end GPUs will draw a lot more power than the CPUs. I wonder if OEMs are willing to carry the cost for that in terms of basic design. Will Dell want to carry the extra burden in terms of PSU capabilities, thermal design complexity and cooling capabilities, and the constraints it places on suitable case volumes, et cetera, for the eventuality that the customer might want to put a high-end gfx card into the box? I doubt it. Even among people who play games, high-end gaming rigs for the hardware fetishists are a narrowing niche that may or may not be self-sustainable. I feel that the industry, by pushing the limits too far, cheered on by hardware review sites, is hurting the PC gaming business as a whole.

As far as I'm concerned, if some people want to race towards the power draw precipice with their systems, that's fine with me. But I don't want to pay the price for it in terms of development costs and industry-standards limitations. Those folks can damn well carry their own costs, and my wallet will support alternative priorities. Now, my wallet isn't big enough to make a major difference, unfortunately :). But if your interests clash with the interests of every administrative computer, every business computer, every home surfing computer and so on, the writing really is on the wall.
 
ANova said:
BTX is designed with good airflow in mind, and combined with more efficient, cooler-running processors, that means smaller passive heatsinks. One 120 mm intake fan blowing over it would be sufficient and extremely quiet. Placement also matters, since heat rises.
One 120 mm fan would be sufficient with ATX, too. And really, what good does it do to swap the positions of the CPU and GPU, moving one up and the other down?
 
ANova said:
The purpose of the test was to show relative performance between the two at stock settings; since the Conroe won't be out for another few months, they chose AMD's latest and greatest and even overclocked it. The Athlon 64 is designed to run with DDR400 RAM, and they even set it to run at optimal performance with 2-2-2-5 1T timings. You call that crippled?
Yes, running the AMD system with an old BIOS which can't run an FX-60 or better CPU with 2-2-2-5 1T timings is crippling, no? ;)

Anyway, I'd expect someone here to get an FX-60 system with a decent MB, update the BIOS :), then run a few benchmarks with 2 x X1900... that should be quite decisive, no?

PS: BTW, I don't think that 20% on average, as claimed by Intel, is too much.
Suppose AMD can get 10% from going to DDR2-800 and another 5-10% from doubling the cache (the test compared a Conroe EE with 4MB shared cache against an FX with 2MB total L2).
Although I'd prefer AMD to lower the prices for the X2 3800+ ;)
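A quick back-of-the-envelope sketch (in Python, purely illustrative; the 10% and 5-10% figures are the guesses above, not measurements) of how those gains would compound:

```python
# Compound the guessed per-component speedups multiplicatively.
ddr2_gain = 0.10                               # guessed gain from moving to DDR2-800
cache_gain_low, cache_gain_high = 0.05, 0.10   # guessed gain from doubling L2 cache

low = (1 + ddr2_gain) * (1 + cache_gain_low) - 1
high = (1 + ddr2_gain) * (1 + cache_gain_high) - 1
print(f"combined hypothetical gain: {low:.1%} to {high:.1%}")
# -> combined hypothetical gain: 15.5% to 21.0%
```

Which lands right around the ~20% average Intel claimed, so the guess is at least self-consistent.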
 
Well, another funny result for all those Intel fans out there:
There was also a demo benchmark of Call of Duty 2, in which an overclocked 4.1 GHz Pentium Extreme Edition computer tried to keep up with Conroe, and trailed its successor 90 fps to 110 fps.

That should certainly put to rest the notion in anyone's mind that the Extreme Edition could ever be worth anything as a gaming processor.
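For reference, a trivial check of what those two quoted frame rates work out to (using only the 90 and 110 fps figures above):

```python
# Relative lead of the Conroe system over the overclocked 4.1 GHz Pentium EE,
# using the frame rates quoted in the post above.
ee_fps, conroe_fps = 90, 110
lead = conroe_fps / ee_fps - 1
print(f"Conroe leads by {lead:.1%}")
# -> Conroe leads by 22.2%
```

So the pre-production chip is roughly 22% ahead of a heavily overclocked EE in that demo.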
 
NocturnDragon said:
It wasn't an EE (those will have a 1333 bus and higher frequencies). All the Conroes will have 4MB of cache.
Yes, not the EE version, though I read somewhere that there will be versions with 2MB (mainstream) and 1MB L2 cache (value). My point was that 4 > 2, and we can wonder whether the K8 can gain performance from increased L2 cache...
AMD may release an FX-EE version just for PR reasons (like the 7800-512 from NV :) )
 
What is with the Intel hype?

They run benchmarks of AMD's current CPUs against future Intel CPUs that are still 6-12 months away, and now Intel is "king"? Shouldn't we wait and benchmark AMD's future CPUs, coming out in 6-12 months, against these new Intels?

The benchmarks are impressive, but nothing that justifies a title like "Intel Regains the Performance Crown," as AnandTech.com put it.

I'll wait until I can draw a fair conclusion.

lol
 
EasyRaider said:
One 120 mm fan would be sufficient with ATX, too. And really, what good does it do to swap the positions of the CPU and GPU, moving one up and the other down?
No... it wouldn't. And it makes perfect sense; re-read what I said. The CPU is moved closer to the front of the case and the GPU is moved higher up, plus it allows for better airflow.

Yes, running the AMD system with an old BIOS which can't run an FX-60 or better CPU with 2-2-2-5 1T timings is crippling, no?

Anyway, I'd expect someone here to get an FX-60 system with a decent MB, update the BIOS, then run a few benchmarks with 2 x X1900... that should be quite decisive, no?

I see, so the old BIOS can't run the FX-60. How did they even manage to run the computer to get those benchmark scores, then? Whether or not it detects the CPU correctly has no bearing on memory settings, nor on performance in most cases.

PS: BTW, I don't think that 20% on average, as claimed by Intel, is too much.
Suppose AMD can get 10% from going to DDR2-800 and another 5-10% from doubling the cache (the test compared a Conroe EE with 4MB shared cache against an FX with 2MB total L2).
Although I'd prefer AMD to lower the prices for the X2 3800+

10% from going to DDR2 and an extra 5-10% from doubling the cache, huh? I'm sure AMD would love it if those numbers were true. Tell me, if cache is so important, why is it that the 1 MB cache A64s beat the 2 MB cache P4s in gaming by as much as 10%?

Follow-up benchmarks from Anandtech. They also address the BIOS/Crossfire bug issue.

http://www.anandtech.com/tradeshows/showdoc.aspx?i=2716

Where's that popcorn smiley?
 
AMD's exclusive cache design isn't going to see much gain at 2MB. Look at the difference between 1MB and 512K. The L2 on AMD's chips is very slow compared to Intel's. L2 is critical for Intel for various reasons, like their tiny L1 cache. This is one of those areas where the performance curve slopes off sharply. Diminished returns.
 