R300 clocked at 315MHz!!!

Doomtrooper said:
Don't base your decision on the very political 3Dmark Slash Screen Edition
I think you spelled some words wrong there DT! Shouldn't it say:

Do base your decision on the very politically correct 3Dmark Second Edition

Right? ;)

And could you please elaborate what the "Slash Screen Edition" is by the way? Or did you just miss the p key (Splash)?

misae said:
..in a synthetic benchmark loved by this industry a little too much.
In Finnish I'd say Höpö Höpö! ;)
 
From the Inquirer article:

...we learned that the first reviews will appear after the official presentation...

Hmmm, and again according to the Inquirer (big grain of salt), the R300 will be announced on July 17th, which means reviews soon? Huh? Seems like we'd know if review cards were out, or were coming anytime soon. Or maybe the announcement isn't the "official presentation". :rolleyes:
 
worm[MadOnion.com] said:
Doomtrooper said:
Don't base your decision on the very political 3Dmark Slash Screen Edition
I think you spelled some words wrong there DT! Shouldn't it say:

Do base your decision on the very politically correct 3Dmark Second Edition

Right? ;)

And could you please elaborate what the "Slash Screen Edition" is by the way? Or did you just miss the p key (Splash)?

misae said:
..in a synthetic benchmark loved by this industry a little too much.
In Finnish I'd say Höpö Höpö! ;)

Doh sorry worm, I'll fix it...The very political 3Dmark 2002 $plash Screen Edition.

Sorry about that ;)
 
Nappe1 said:
hmmh... does anybody else find that 3DMark result a bit low for an 8-pipeline chip with a 256-bit memory interface and a core clocked at 315MHz?? Because AFAIK, here in Finland some overclockers have surpassed 16000 3DMarks with a GF4 Ti4600...

http://gamershq.madonion.com/hardware/halloffame/
I'm jumping in because I want to dispel some myths about those 3DMark scores.

First, run your own tests with 3DMark 2001 on a fast CPU, say a 2 GHz P4, and compare it to a 2.4 GHz P4. I think you will see that on fast GPUs, only the Nature scene does not improve much, if at all.

Second, those scores of 16000 3DMarks on a GeForce 4 clocked at 390+ MHz are definitely suspect. Why? Nature went from 45 on a non-overclocked GeForce 4, which is 300 MHz if I am not mistaken, to 88 at about 400 MHz. That doesn't seem reasonable at all, as it is a 95% improvement for only a 33% increase in core clock speed, and the memory speed increase was even smaller. Notice that the multitexture fill rate numbers (when they give them!) have only increased in line with the clock speed increases.
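The mismatch being pointed out here is easy to verify. A minimal sketch (plain ratio arithmetic on the figures quoted above; nothing here is 3DMark code):

```python
# Scaling sanity check using the Nature figures quoted above:
# 45 fps at a ~300 MHz core vs. 88 fps at about 400 MHz.
stock_fps, oc_fps = 45.0, 88.0
stock_mhz, oc_mhz = 300.0, 400.0

perf_gain = (oc_fps / stock_fps - 1) * 100   # % frame rate improvement
clock_gain = (oc_mhz / stock_mhz - 1) * 100  # % core clock improvement

print(f"frame rate gain: {perf_gain:.0f}%")  # ~96%
print(f"core clock gain: {clock_gain:.0f}%") # ~33%
```

A roughly 96% frame rate gain from a 33% clock bump is the disproportion the post is calling suspect in a fill-limited scene.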

People who look at benchmark numbers like these should take them with a grain of salt... People complain enough about numbers from "reputable" sites like Tom's and Anandtech, yet people take these numbers as reasonable? Weird.
 
RE: macci, DJ and all the other hardcore overclockers

The scores are 100% legit... talk to macci sometime... when you have pics of your hardware that match the specs, it's kinda absurd to say they faked it (Digital Jesus too). You need to get your ass in the know regarding fake and non-fake scores. If you take a look at the theoretical scores, they match almost perfectly with the clock speed, and considering the many variables, they match 100%.
K? :rolleyes:
 
Re: RE: macci, DJ and all the other hardcore overclockers

radeonic2 said:
The scores are 100% legit... talk to macci sometime... when you have pics of your hardware that match the specs, it's kinda absurd to say they faked it (Digital Jesus too). You need to get your ass in the know regarding fake and non-fake scores. If you take a look at the theoretical scores, they match almost perfectly with the clock speed, and considering the many variables, they match 100%.
K? :rolleyes:
Can you reproduce the results?
Do you know how the LOD was set?
Did you read my post?
:rolleyes:

Nature is FILL LIMITED. 26% increase in FILLRATE doesn't gain you 93% in performance.

QED.
:rolleyes:

P.S. Unless there's some sort of "splash screen" trickery going on in the driver ;)
 
OpenGL guy said:
People who look at benchmark numbers like these should take them with a grain of salt... People complain enough about numbers from "reputable" sites like Tom's and Anandtech, yet people take these numbers as reasonable? Weird.
Macci's scores are "legit" and reproducible. He has a wide variety of uploaded projects (with different setups) in our database, and we haven't found one which looks "fake" or cheated. His systems are modified in many ways to get the best performance out of them. Not only overclocking the CPU and GPU, or lowering some LOD or disabling filtering. There are other tweaks that improve performance too, you know. ;)

Actually, we will soon (as we get more features done) start trying to get websites to post compare URLs in their reviews. That way they simply cannot just type in whatever number they want and make a nice graph. At the moment nobody needs to be logged in or registered with the ORB to view a project. Yes, you need to be logged in in order to make any searches, but if you, for example, click on my compare URL, you will be able to see it without even having to register. Let's hope review sites start to spend the 5-10 seconds after running 3DMark to upload the score and get a compare URL. That would IMHO be a good thing..
 
8)

I remember how, some years ago, you could inflate the 3DMark score with the SoftFSB trick*; I wonder if it could still be done. In that case the compare URLs won't help stop cheating.


*The SoftFSB exe, by H. Oda, let you change the FSB on the fly, without rebooting. To cheat 3DMark you did the following:

1. Downclock your FSB to the lowest possible setting
2. Start 3DMark, but don't run the benchmark
3. Change your FSB to the highest possible setting
4. Run the benchmark, and
5. Look at that score!
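For what it's worth, one way such a trick *could* work (this is my own guess at a mechanism; the post doesn't explain why the score inflates) is a benchmark that converts cycle counts to seconds using a ticks-per-second value sampled once at startup. A stale calibration then skews every later time measurement:

```python
# Hypothetical sketch, NOT 3DMark's actual code: a benchmark that calibrates
# a cycle-counter timer once at startup mis-measures time if the clock is
# changed afterwards, by the factor f_actual / f_calibrated.

def measured_seconds(real_seconds, f_calib_mhz, f_actual_mhz):
    """Time as seen by a benchmark that converts cycle counts to seconds
    using a ticks-per-second value sampled once, at startup."""
    ticks = real_seconds * f_actual_mhz * 1e6   # cycles actually elapsed
    return ticks / (f_calib_mhz * 1e6)          # divided by the stale rate

frames = 600
real_run = 10.0  # the scene really took 10 seconds -> 60 fps

# Calibrated at a high clock, then run at a low one: elapsed time is
# under-measured, so the reported frame rate comes out inflated.
cheat_fps = frames / measured_seconds(real_run, f_calib_mhz=800,
                                      f_actual_mhz=400)
print(frames / real_run)  # 60.0 (true)
print(cheat_fps)          # 120.0 (reported)
```

Note that the direction of the skew depends on which clock the calibration saw; what 3DMark actually did internally isn't documented in this thread.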


:p
 
Doomtrooper said:
ATI Radeon GPUs historically operate at higher frequencies with less heat, I would not be surprised if this is true in the least :p

ATI Radeon GPUs, when reaching retail, historically operate at lower frequencies than hyped, or even than officially stated beforehand. I would not be surprised at all if review samples were to operate at higher frequencies than their retail counterparts.
 
AFAIK it was reduced from 300MHz to 275MHz. Initial reviews used 300MHz samples, while most retail cards were clocked at 275 (OEM cards even lower I think), which caused some annoyance ...
 
Gollum said:
AFAIK it was reduced from 300MHz to 275MHz. Initial reviews used 300MHz samples, while most retail cards were clocked at 275 (OEM cards even lower I think), which caused some annoyance ...

Wrong. Initial reviews were at 250MHz:

http://www.anandtech.com/showdoc.html?i=1517&p=2

The 0.15-micron core is clocked at 250MHz, a full 37% higher than the 183MHz clock of its 0.18-micron predecessor. Unlike the R100 core, the R200 features four rendering pipelines (instead of two), giving it a 1 Gigapixel/s fill rate vs. the 366 Megapixels/s fill rate of the R100.

AFAIK ATI PR subsequently said they were aiming for 300MHz (at the time of writing) but eventually got 275MHz - OEM boards were clocked at 250MHz and were subsequently renamed LEs.
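The fill-rate figures in the Anandtech quote follow directly from pipelines times core clock (one pixel per pipeline per clock), which is a quick way to check numbers like these:

```python
# Peak pixel fill rate = rendering pipelines * core clock,
# assuming one pixel output per pipeline per clock cycle.
def fillrate_mpixels(pipelines, core_mhz):
    return pipelines * core_mhz  # megapixels per second

r200 = fillrate_mpixels(pipelines=4, core_mhz=250)  # 1000 -> 1 Gpix/s
r100 = fillrate_mpixels(pipelines=2, core_mhz=183)  # 366 Mpix/s
clock_gain = (250 / 183 - 1) * 100                  # the "37% higher" figure

print(r200, r100, round(clock_gain))  # 1000 366 37
```

All three numbers reproduce the quote: 1 Gigapixel/s vs. 366 Megapixels/s, and a 37% clock increase over 183MHz.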
 
I don't recall any review or preview that stated 300 Mhz clock. The very first preview that was published that I'm aware of (Anandtech), had stated the Core / Memory clock of the R-200 to be 250/275.

http://www.anandtech.com/showdoc.html?i=1517&p=2

Other than some confusing OEM vs. Retail clock speed issues, I don't recall ATI ever releasing a product that had a lower core or memory clock than they initially publicly stated.

EDIT: Lol...iRC beat me to it. ;)
 
If anything, the 8500 had the opposite effect. Most people assumed it would be clocked under Titanium levels (< 240-250 MHz)...and when they announced the final shipping speeds, I know it really took me by surprise.
 
Thanks for correcting, I mainly remembered the OEM/Retail confusion, guess I was wrong on the 300MHz thingy then... :)
 
I am sorry that I am the forum grouch lately, but these 3DMark discussions in general are ticking me off. 3DMark certainly seems like a nice tool to compare how your system should be running. This means that yes, the Kyro has a lower 3DMark score than the GeForce2 MX, but that doesn't mean the GeForce is faster than the Kyro. The only thing 3DMark should be used for is to gauge the performance of your exact system compared to a similarly equipped system, to see if your setup is optimal for what you have.

Frankly, I wouldn't mind if we made 3DMark taboo altogether on these boards, as it seems to produce nothing.
 
The only thing 3DMark should be used for is to gauge the performance of your exact system compared to a similarly equipped system, to see if your setup is optimal for what you have.

Me and others (like Sharkfood, as he has said the same) agree! It's the ONLY thing 3DMark is good at. Predicting how games run on certain hardware? I think my Magic 8-Ball has just as good a chance :)
 
worm[MadOnion.com] said:
OpenGL guy said:
People who look at benchmark numbers like these should take them with a grain of salt... People complain enough about numbers from "reputable" sites like Tom's and Anandtech, yet people take these numbers as reasonable? Weird.
Macci's scores are "legit" and reproducible. He has a wide variety of uploaded projects (with different setups) in our database, and we haven't found one which looks "fake" or cheated. His systems are modified in many ways to get the best performance out of them. Not only overclocking the CPU and GPU, or lowering some LOD or disabling filtering. There are other tweaks that improve performance too, you know. ;)
What possible good does it do to compare numbers where you aren't rendering things as they were intended? Tweaking LOD, disabling filtering, etc. is stupid as then you can't compare your results to anyone else's.

That is why I claim these scores are not "legit". Having these "legit" scores as the top scores on your site doesn't make me feel any better. Now, when some product comes along and scores X, people will compare it to these "legit" numbers and say "Look, you can't even beat what's already out there!" :rolleyes:

I have some "legit" scores as well: I can completely disable rendering and get a score well above anything you've seen (52000 megatexels on my lowly 1.8 GHz P4!). According to what you are saying, those are valid results. However, they are completely useless when comparing against results from other hardware.
 
jb said:
Me and others (like Sharkfood, as he has said the same) agree! It's the ONLY thing 3DMark is good at. Predicting how games run on certain hardware? I think my Magic 8-Ball has just as good a chance :)

Look at the scores for different cards in 3DMark, then look at the same cards in UT2003.

See any similarities?

I do. So I would definitely not say that 3DMark says nothing about how certain cards will run a certain game.

By the same rule, any benchmark using anything but the game you want to play is useless. Even games using the same engine (e.g. Quake 3) can give different results, so even those comparisons aren't foolproof.
 