Doom3 benches revisited.

Borsti said:
You say that I should stop using 3DM03 because NV left the beta program??

Actually, you should stop using 3DM03 with nVidia hardware until the cheating issue is resolved one way or another.

Otherwise, you should not stop using 3DMark03, because the benchmark is publicly available to anyone. nVidia won't be "surprised" to see 3DMark03 benchmarks appear in any web reviews...so if they choose not to provide drivers that optimize for it, that's their decision.
 
Borsti said:
Just like nvidia was working closely with FutureMark on 3D Mark 2003, until last December. Would you say that nvidia is still working closely with FutureMark just because they did in the past? There's a serious flaw in your logic.
You say that I should stop using 3DM03 because NV left the beta program??
How did you come to that conclusion based on what I wrote? If you're going to speak for me, I won't respond at all.

You said:
Borsti said:
Yes, I think that ATI has access to Doom 3. They did in the past (http://www.ati.com/companyinfo/press/2002/4495.html) and they always say that they are working closely with them. Last QuakeCon ATI provided the card to show Doom III (http://www.bluesnews.com/plans/6/)
So you're basing your "thoughts" on things that happened last year. What else has happened in the last year regarding Doom 3? Maybe a leak that many people have attributed to ATI? You think that may have affected ATI's access to Doom 3?

The point I was making in the original quote above was that you can't always judge current relationships between companies based on the past. How closely do you think Carmack is working with 3dfx these days?

-FUDie
 
Borsti said:
So what? You think I can't set up a system?

Define "setting up". Did you install WinXP yourself? Was it the naked hardware NVIDIA provided to you?

Shame on ATI

For spending their engineering time on performance optimisations for available games instead of unreleased game alphas?

Detecting cheats was not a problem. I had enough time during the benchmark runs to compare quality.

Did you really? Did you detect the Serious Sam or 3DMark03 cheats, for example?

Wrong. All numbers are absolutely correct. I never said anything like that.

Sorry Borsti, your 4X AA scores in your review are incorrect, as you yourself have stated several times.

OK... tell me what was going on with Cat 3.2. Enlighten us!

Did I publish a RADEON 9800 vs. GeForce FX 5900 review or did you?

Interesting. What might be indicative in your eyes?

You're aware of the basic relationship between CPU and fillrate limits and the final fps score, aren't you?
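As a rough illustration of that relationship: in the simplest model, the frame rate is capped by whichever of the CPU and the GPU takes longer per frame, so a fill-rate advantage stops showing up in the final fps once the CPU becomes the bottleneck. A minimal sketch of that idea in plain C, with made-up numbers (nothing here comes from the benchmarks in this thread):

#include <stdio.h>

/* Simplest possible model: a frame takes as long as the slower of the two
   sides, so fps is capped by whichever limit you hit first. */
static double effective_fps(double cpu_ms_per_frame, double gpu_fill_ms_per_frame)
{
    double frame_ms = cpu_ms_per_frame > gpu_fill_ms_per_frame
                          ? cpu_ms_per_frame : gpu_fill_ms_per_frame;
    return 1000.0 / frame_ms;
}

int main(void)
{
    /* Hypothetical numbers: a card with less fill work per frame (8 ms vs 12 ms)
       gains nothing once the CPU needs 15 ms per frame either way. */
    printf("card A: %.1f fps\n", effective_fps(15.0, 12.0)); /* 66.7 fps */
    printf("card B: %.1f fps\n", effective_fps(15.0,  8.0)); /* 66.7 fps */
    return 0;
}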

Understand my reaction: after recognizing such serious flaws in the way you (and probably the other Doom 3 testers too) test 3D graphics cards (benchmarking in a rush, not being aware of what driver settings affect, publishing wrong numbers, not being aware of what game settings mean), I got more than a bit worried that there are more "issues" behind all these Doom 3 tests.

Nevertheless, at least you seem to "learn".
 
http://www.rage3d.com/board/showthread.php?s=&threadid=33685071&perpage=20&pagenumber=2

my thoughts of the day
Just wasted a few brain cells reading this post. I see the only guy on my ignore list is up to his usual stuff.

Anyways.... Doom III.
Interesting little game, even more interesting that reviews would appear on an unreleased game. All I can say at this point is that we have not had that particular benchmark before the review sites started using it. What a shame that people are getting to look at something that we haven't had a chance to play around with a bit.

Anyways, please don't pay any attention to that benchmark. It is on an unfinished product that we have not looked at yet. Wait till it really comes out and we will be ready to rock.

Terry Makedon above sums it up well. Websites saw a chance to get some hits through co-operation with Nvidia and jumped at it.
Nobody is pulling the wool over this old timer's eyes, and it shows what I've been preaching about for a while: journalistic integrity sucks... big time :!: :!:

It's all about money.
 
Doomtrooper said:
http://www.rage3d.com/board/showthread.php?s=&threadid=33685071&perpage=20&pagenumber=2

Terry Makedon above sums it up well. Websites saw a chance to get some hits through co-operation with Nvidia and jumped at it.
Nobody is pulling the wool over this old timer's eyes, and it shows what I've been preaching about for a while: journalistic integrity sucks... big time :!: :!:

It's all about money.

Yep - just pimping traffic and hits. Not a problem though. I just do not waste my bandwidth on sites like this anymore.
 
In short, based on the old and incorrect labels, the NV35 performs better at minimal AND maximum quality.

For HQ that's correct. I did not test minimal.

Clarification: did you find a big difference in performance, or in quality (or both)? I'll assume performance, because that's what I said Anand did not find all that different.

I'll have to refute that, though. According to your tests, the GeForce FX took a significant performance hit when going from medium to high quality (83 to 55 fps at 1024). However, the Radeon did not; it only went from 68 to 61 fps.

The last one is correct. If you change the Doom III setting from MediumQuality to HighQuality it will use aniso automatically. I can't tell what level of aniso, but I'll ask JC about that. It will use aniso no matter whether you disabled aniso in the driver or not. That's what I meant when I said that there are issues with the driver settings. And that's the reason why I didn't want to post AF numbers (meaning driver-forced AF settings). I ran some tests with the NV35. With 8x AF (Quality forced in the driver) the performance dropped from 83 to 80.8 (in medium quality). And it dropped from 55.0 to 54.5 in HQ (all in 10x7). So there's more going on than only aniso. That's why I did not post more results on that.

Again, this is my point. Based on your numbers, the FX takes a major performance penalty when going from medium to high quality, and the Radeon 9800 does not.

That's correct. But I'm not sure about the reasons for that. NV says it must be a driver bug... maybe because of the uncompressed textures. But I still wanted to post those results since they show the Radeon in the lead.

I don't know what you're trying to say here. The app calls for 8X aniso....how do you know there is a performance difference between what is asked for, vs. what is delivered? What is your baseline for measurement?

See above. A slight performance drop between "standard" AF quality (as requested by the game) and "forced" aniso.

Again, I don't follow you.

Let's see it with numbers:

Medium Quality, 10x7
NV 35 no AF: 83.0
NV 35 forced 8x AF Quality in the drivers: 80.8

High Quality, 10x7
NV 35 no AF (in the drivers): 55.0
NV 35 forced 8x AF Quality in the drivers: 54.5

So it looks like the performance drop of the NV35 in Quality mode has nothing to do with aniso at all. Seems to be trouble with the textures or something. I feel very bad that I did not run more HQ tests with the R350. That would make things much clearer now.
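To put those figures in perspective, here is a quick sketch in plain C (the frame rates are taken straight from the post above; nothing else is measured or assumed) that works out the relative drops: forcing 8x AF costs under 3% in medium quality and under 1% in high quality, while the medium-to-high step by itself costs roughly a third of the frame rate.

#include <stdio.h>

/* Percentage slowdown going from "before" fps to "after" fps. */
static double drop_pct(double before, double after)
{
    return 100.0 * (before - after) / before;
}

int main(void)
{
    printf("Medium, forced 8x AF:   %.1f%%\n", drop_pct(83.0, 80.8)); /* ~2.7%  */
    printf("High,   forced 8x AF:   %.1f%%\n", drop_pct(55.0, 54.5)); /* ~0.9%  */
    printf("Medium -> High Quality: %.1f%%\n", drop_pct(83.0, 55.0)); /* ~33.7% */
    return 0;
}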

To be clear, I don't doubt that Carmack believes that HIS ENGINE CODE is pretty near final, with only minor tweaks left. What about ATI's drivers, though? Does he have ATI driver builds that the public does not? Does he know how much headroom, if any, is left in whatever ATI drivers he does have?

I'll ask him about that!

I know that ATI feels handicapped here. But it's not my fault that they haven't optimized the driver yet (as they say).

The publicly available drivers? Maybe, maybe not.

You think there's parallel development in the drivers? There are MANY games out there... this would make compatibility testing almost impossible. I know that NV already has optimized code for not-yet-released games in their drivers. I would be surprised if ATI does it differently.

Kudos to them. If they aren't ready, then they shouldn't do it.

That's right. But id said that they are ready.

I wouldn't either, if all vendors had ample notice that the benchmarks were going to be done, to make sure that any driver optimisations they may have in house made it into the testing drivers.

If the game developer says they're ready for a performance evaluation, there's no reason not to do it. If one company has so far neglected its optimization for a certain game - well, shame on them!

Shame on you.

If one company knows that the game / demo is going to be released by date X, THEN shame on that company if they don't get drivers out to support it. If one company is blindsided by an unknown public display, SHAME ON ANYONE involved. This includes ID, NVIDIA, and yes, you.

I see this a little bit differently. :)

Shame on you for assuming that they don't have drivers in house that are not released to the public.

Well, that's what they say.

Because your readers expect benchmarks to be done on as fair and level a playing field as possible, perhaps :?:

If id says that's the case... !?

You seem to basically miss the premise that publicly released drivers are not the same as in-house / development drivers. There's no sense in publicly releasing drivers that have optimizations for a game or benchmark that is not available, if the suite of optimizations has not been tested enough to ensure it doesn't cause other problems for already-shipping titles and benchmarks.

As I said, I'll ask JC about that.

Let's take HL2 as a (hypothetical) example. Say you want to buy a new card right now and you're waiting for that game. If I run benchmarks at this time with an alpha, the ATI cards might look better. So what's the conclusion for that guy? He'll buy an ATI card. Is that unfair?

That all depends on the circumstances surrounding the test!

What circumstances?

I'll try to get some information from JC on that. But as I said: if id feels ready for performance testing, why should I refuse it? That ATI's marketing is not pleased with what happened is no surprise to me. They have to find something against the NV35. That's their job!

I'll let you guys know if I get some info from JC on the ATI driver.

Lars
 
Sabastian said:
http://biz.yahoo.com/rc/030512/tech_graphics_1.html

They used the combination of Doom III and the 3DMark2003 cheat to manipulate the perception that the GeForce FX 5900 Ultra is faster than the Radeon 9800 Pro. We all know that these benchmarks are used by OEMs to determine which video card goes in the next model .... right?

Sabastian - that right there is the article I was talking about earlier. Yep - people took the scores and ran with it.

I guess the amateurs needed a startling statistic for an attention-getter...
 
Borsti said:
For HQ that's correct. I did not test minimal.

I know you didn't test minimal, I misspoke. ;)

The last one is correct. If you change the Doom III setting from MediumQuality to HighQuality it will use aniso automatically. I can't tell what level of aniso, but I'll ask JC about that. It will use aniso no matter whether you disabled aniso in the driver or not. That's what I meant when I said that there are issues with the driver settings.

Then you don't understand how these driver settings work. These driver settings don't force OFF aniso. They either force on a certain setting, or they let the application choose the filtering method.

The only question is, if you force on one type of setting, "performance 4x" for example, and you also "turn on aniso" in the game, what happens? That can vary depending on the driver.
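As a rough illustration of the "application controlled" side of this, here is a minimal sketch in plain OpenGL/C. It is an assumption about how a typical GL game would request aniso per texture via GL_EXT_texture_filter_anisotropic, not anything taken from Doom 3's code; the point is that a control-panel "force" setting is applied by the driver outside this API path, which is why the two can interact differently from one driver revision to the next.

#include <GL/gl.h>
#include <GL/glext.h>  /* GL_TEXTURE_MAX_ANISOTROPY_EXT, GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT */

/* Ask for up to 'level'x anisotropic filtering on one texture.
   With the driver set to "application controlled", this request is honored;
   with a level forced in the control panel, the driver overrides it behind
   the API's back and the application never sees the difference. */
void request_aniso(GLuint texture, GLfloat level)
{
    GLfloat max_supported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_supported);
    if (level > max_supported)
        level = max_supported;

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, level);
}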

And that's the reason why I didn't want to post AF numbers (meaning driver-forced AF settings).

You should be safe posting "forced" AF numbers, as long as you ran Doom3 in medium quality mode. (Though you should be checking the quality of the Aniso that results.)

I ran some tests with the NV35. With 8x AF (Quality forced in the driver) the performance dropped from 83 to 80.8 (in medium quality). And it dropped from 55.0 to 54.5 in HQ (all in 10x7). So there's more going on than only aniso. That's why I did not post more results on that.

Don't follow you. There IS more going on than aniso when going from medium to the HQ game setting: texture compression (TC) is turned off. The fact that the Radeon doesn't take nearly as large a performance hit when going from medium to high quality suggests any number of things:

1) ATI simply handles uncompressed textures and aniso with less performance impact.
2) There's a bug in nVidia's drivers causing slower than expected performance in high quality mode.
3) There's a bug in ATI's drivers causing incorrect (lower) quality, and faster than expected performance, in high quality mode.
4) There's a bug in nVidia's drivers causing faster than expected performance in medium quality mode.
5) There's a bug in ATI's drivers causing slower than expected performance in medium quality mode.

I just don't see why option 2 (the most favorable to nVidia) was laid out as "the" possibility in your article, and not option 1...or any of the other options the data supports. In other words, your article reads as if the high quality scores are an anomaly, when the truth is we don't know.

Let's see it with numbers:

Medium Quality, 10x7
NV 35 no AF: 83.0
NV 35 forced 8x AF Quality in the drivers: 80.8

High Quality, 10x7
NV 35 no AF (in the drivers): 55.0
NV 35 forced 8x AF Quality in the drivers: 54.5

For reference, do you have benchmarks with ATI hardware with the same settings (forced on or not?)

So it looks like the performance drop of the NV35 in Quality mode has nothing to do with aniso at all.

Possibly....or that driver forcing on Aniso with the GeForce doesn't do anything at all in Doom3. (Did you check the image quality?)

It is very surprising to see almost no performance drop with 8X "quality" aniso on the FX. This is unlike any other situation I know of. Look at your own UT benchmarks.

In short... looking at your data (the medium quality numbers), I would more suspect that nVidia has a driver bug that DOESN'T ACTUALLY TURN ON aniso from the control panel, or perhaps turns on a different setting (performance) than selected. And that the performance drop between the medium and high quality Doom3 settings is in fact a combination of proper aniso actually being turned on, and higher quality (more bandwidth-sucking) textures.

Seems to be trouble with the textures or something. I feel very bad that I did not run more HQ tests with the R350. That would make things much clearer now.

Again, I more suspect that "forcing on" Aniso isn't properly working with Doom3.

I'll ask him about that!

Specifically, I would ask him what ATI driver build he was basing his "should be representative of performance" comments on.

You think there's parallel development in the drivers? There are MANY games out there...

For a game/benchmark as important as Doom3? Wouldn't surprise me in the least.

If one company knows that the game / demo is going to be released by date X, THEN shame on that company if they don't get drivers out to support it. If one company is blindsided by an unknown public display, SHAME ON ANYONE involved. This includes ID, NVIDIA, and yes, you.

I see this a little bit differently. :)

Apparently! :)

If id says that's the case... !?

That's right. Even if ID says that's the case. Unless you know that ID has the same drivers that are available to the public. There's a very easy retort to your argument:

Fact 1: Latest ATI drivers are 3.4
Fact 2: 3.4 drivers suck for Doom3

There is no way you can reconcile those two facts, and believe that Carmack only has access to the latest public drivers, and also believes performance comparisons are fair.

That all depends on the circumstances surrounding the test!

What circumstances?

The circumstances fully outlined earlier: like, are all parties aware that there will be a public benchmark release?

I'll try to get some information from JC on that. But as I said: if id feels ready for performance testing, why should I refuse it?

See above.

Ask yourself: why did you not just use the 3.4 cats then? ID said everything was cool....so there's "no reason" then to have any issue with the Cat 3.4 drivers, right?

That ATI's marketing is not pleased with what happened is no surprise to me. They have to find something against the NV35. That's their job!

You are deluded by your own results, which is the problem. (EDIT: Inserted smiley here! :)) From what I can gather...the 9800 Pro is ALREADY good enough to go up against the NV35. They are pretty much equal in terms of performance, with the possible exception of Doom3.

But the Doom3 scores are basically useless because of how the benchmarking was sponsored and done.

So I certainly haven't concluded that ATI needs to "find something"...and the problem is, that's pretty much what every review that tested Doom3 concluded. Check the reviews that didn't bench Doom3, and it's a much different conclusion. Usually along the lines of "in some cases NV35 is faster, but not by much, and the overall image quality of the 9800 makes it an overall better deal."

To be clear...if the Doom3 benchmarks are in fact truly representative, then there is a clear case to be made for NV35 superiority. (It comes down to a preference between performance or image quality.) Problem is, we really have no idea if they are representative or not.

I'll let you guys know if I get some info from JC on the ATI driver.

Lars

Thanks...it is appreciated! :)
 
Joe DeFuria said:
You seem to basically miss the premise that publicly released drivers are not the same as in-house / development drivers. There's no sense in publicly releasing drivers that have optimizations for a game or benchmark that is not available, if the suite of optimizations has not been tested enough to ensure it doesn't cause other problems for already-shipping titles and benchmarks.


One could argue that there might be an optimization that's just not implemented in the public drivers yet. But why would they make such a silly decision?

SILLY?!

I explained it above, and someone else also explained it. Why SHOULD a company release drivers to the public that have optimizations for a game that doesn't (or shouldn't) exist yet!? Possible upside? NONE. Possible downside...negative effects for other games.

Just felt like dropping my $0.02 on this...

What about people beta testing a game?
There are quite a few people out there who are testing games that aren't anywhere near completion... How do you think they'd feel if the game they're testing was limited to 10 fps in the drivers?

I'm quite sure that there's plenty of support for upcoming games in ATI's publicly released drivers, as sending out different driver sets to the people doing beta testing would be way too time-consuming and would play merry hell with re-integration into the public drivers...

Wouldn't it be easier to do it all with the "unified" driver code, and do your tests on that, instead of needing to test several different code builds for different games, and re-testing once you've gotten it all integrated into a "unified" driver?

You might argue that having support for a game not-yet-released doesn't mean they have optimized for it... But that leaves the question, when do you start optimizing for the game?

Most likely ATI will have had dev rel helping the game producer for quite some time making sure it'll work on ATI hardware...
(Resolving driver issues vs. game issues, suggesting ways to increase IQ, etc.)
Wouldn't it be logical to optimize the drivers at the same time that the game is being beta tested, instead of just before it goes gold?

This would also give you the benefit of having the game beta testers test whatever optimizations you do, to make sure they don't degrade the IQ...

The point I'm trying to make here is that having optimizations for games not-yet-released in the drivers might not be such a bad idea after all (maybe even a good idea)...

Oh, and to whoever talked about "ATI is doing the right thing by optimizing their drivers for games currently out", I'm not so sure that's "the right thing to do"....
Why haven't ATI optimized their drivers for these games already?

Yes, continuing to optimize for games after they're released is a Good Thing, but having no optimizations when the game is launched is a Bad Thing in my book...

Does this make any sense to anyone but me?
 
MrGaribaldi said:
There are quite a few people out there who are testing games that aren't anywhere near completion... How do you think they'd feel if the game they're testing was limited to 10 fps in the drivers?

Right...so how can anyone like Carmack say that the Catalyst drivers, which are limited to 10 FPS, are representative of game performance?

The only logical answer is: Carmack could not be referring to the Cat 3.4 drivers. The question is, are the Cat 3.2s representative of what Carmack has? The Cat 3.2s are a couple of months old as a PUBLIC release; who knows how far they are behind the latest dev release.

I'm quite sure that there's plenty of support for upcoming games in ATI's publicly released drivers, as sending out different driver sets to the people doing beta testing would be way too time-consuming and would play merry hell with re-integration into the public drivers...

I agree with that. Though I also agree that ID and Doom3 is a special case.

The point I'm trying to make here is that having optimizations for games not-yet-released in the drivers might not be such a bad idea after all (maybe even a good idea)...

I do understand your point...but that still doesn't reconcile with the Catalyst 3.4 release. We know that the Cat 3.4s don't represent final performance in Doom. It does not follow that the Cat 3.2s (which only utilize 128 MB of memory) should be representative of final performance.

Yes, continuing to optimize for games after they're released is a Good Thing, but having no optimizations when the game is launched is a Bad Thing in my book...

AGREED. The entire point in this case is: Doom3 is not launched. I would have no issues at all if ATI had been given the heads-up a month or so ago that this benchmark was coming.

Does this make any sense to anyone but me?

To be clear, it does make sense to have as little "parallel" driver development as possible. I do believe though, that if there is ever a reason to have a special parallel development version, it would be for Doom3.

In any case, the fact that catalyst 3.4 is "broken" with Doom3 certainly means that Carmack was not basing his "representative" comments on that particular driver version.
 
There are optimizations for OpenGL...that's part of writing the driver. What there might not be is specific optimizations for the unique way Doom 3 is doing things.

Exactly how many games using ARB_fragment_program are there that are out right now? This is new territory, AFAIK.
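For reference, here is a bare-bones sketch of what using ARB_fragment_program looks like on the application side, in plain C. It assumes the extension entry points have already been resolved (e.g. via GLEW), and the trivial pass-through program is made up for the example; it illustrates the kind of path the post is talking about, not Doom 3's actual code.

#include <stdio.h>
#include <string.h>
#include <GL/glew.h>  /* resolves the ARB_fragment_program entry points */

/* Trivial ARBfp1.0 program: pass the interpolated color straight through. */
static const char *fp_src =
    "!!ARBfp1.0\n"
    "MOV result.color, fragment.color;\n"
    "END\n";

GLuint load_fragment_program(void)
{
    GLuint prog;
    GLint err_pos;

    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(fp_src), fp_src);

    /* -1 means the program compiled; anything else is the offset of the error. */
    glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &err_pos);
    if (err_pos != -1)
        fprintf(stderr, "fragment program error: %s\n",
                (const char *)glGetString(GL_PROGRAM_ERROR_STRING_ARB));

    glEnable(GL_FRAGMENT_PROGRAM_ARB);
    return prog;
}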

One hypothetical example: what if ATI was having issues implementing F-Buffer support? OpenGL is the place they're likely to do it first, and getting it working would be more important than getting it working at high speed initially (it is primarily a development feature, not a game feature). It seems quite likely that such a decision would be made for a shipping driver, with no idea that Doom 3 benchmarking would occur.

There are plenty of other possibilities related to the uniqueness of Doom 3 and the fact that it won't be shipping for several months. None of them seems to have been considered when proposing "Shame on ATI". <- An unacceptable proposition (about ATI not optimizing enough :!:) from a reviewer, for issues they've introduced in their own self-interest, IMO.
 
Joe DeFuria said:
MrGaribaldi said:
There are quite a few people out there who are testing games that aren't anywhere near completion... How do you think they'd feel if the game they're testing was limited to 10 fps in the drivers?

Right...so how can anyone like Carmack say that the Catalyst drivers, which are limited to 10 FPS, are representative of game performance?

The only logical answer is: Carmack could not be referring to the Cat 3.4 drivers. The question is, are the Cat 3.2s representative of what Carmack has? The Cat 3.2s are a couple of months old as a PUBLIC release; who knows how far they are behind the latest dev release.

Agreed :)

But then again, he could be referring to the unreleased Cat 3.3.
The Cat 3.4 seems to be hard-locked into producing 10 fps in the code, so it wouldn't be too unreasonable to think that he was thinking about the previous version of the drivers...

To him that would logically be Cat 3.3, as they were supposed to be the previous release... That they didn't make it to the public is another matter...
(No, I don't think Carmack thought about the fact that the 3.3s weren't released, since ATI had trumpeted them for quite some time...)

Joe DeFuria said:
The point I'm trying to make here is that having optimizations for games not-yet-released in the drivers might not be such a bad idea after all (maybe even a good idea)...

I do understand your point...but that still doesn't reconcile with the Catalyst 3.4 release. We know that the Cat 3.4s don't represent final performance in Doom. It does not follow that the Cat 3.2s (which only utilize 128 MB of memory) should be representative of final performance.

See above for the first part...

As for the second, well, it should have given us some clues about the performance of the 9800 Pro 128 MB version... (if what I said above is correct)

Joe DeFuria said:
Yes, continuing to optimize for games after they're released is a Good Thing, but having no optimizations when the game is launched is a Bad Thing in my book...

AGREED. The entire point in this case is: Doom3 is not launched. I would have no issues at all if ATI had been given the heads-up a month or so ago that this benchmark was coming.

Fully agreed! I do not think that the websites with the D3 demo did the right thing, since they included cards from IHVs other than Nvidia...


Joe DeFuria said:
To be clear, it does make sense to have as little "parallel" driver development as possible. I do believe though, that if there is ever a reason to have a special parallel development version, it would be for Doom3.

I'm not too sure about there being a reason for having a special driver build for D3, as I think they should've incorporated the optimizations/bugfixes into the official drivers before they beta test it, but that's just my opinion :)
 
Joe DeFuria said:
Possibly....or that driver forcing on Aniso with the GeForce doesn't do anything at all in Doom3. (Did you check the image quality?)

This is one possible thing I am worried about. I also mentioned it in JF_Aidan_Pryde's thread with regard to his request for suggestions on benchmarking the NV35. In light of recent events with Nvidia's Inflateonator drivers, I think it would be prudent for any reviewer to examine images to ensure that the settings that are requested are in fact produced. Furthermore, with respect to paths, it was mentioned in an interview with J.C. that the ARB path (which the 9800 uses) is of higher quality than the NV30 path which the NV35 uses. I know that J.C. said that the differences would be slight, but perhaps, as a service to readers who may spend $500 on a video card, Kyle, Anand or Lars could elaborate. J.C. may be a great coder but I know nothing of his eyesight. ;)
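For what it's worth, here is a rough sketch in plain C of the kind of check being suggested: diff two screenshots of the same scene, one taken with the setting forced in the driver and one without. The filenames and the headerless raw-RGB capture format are made up for the example; it's an illustration of the idea, not a reviewer's actual tool.

#include <stdio.h>

int main(void)
{
    /* Hypothetical captures of the same frame, same resolution, saved as raw RGB. */
    const char *file_a = "shot_app_controlled.raw";
    const char *file_b = "shot_driver_forced.raw";

    FILE *fa = fopen(file_a, "rb");
    FILE *fb = fopen(file_b, "rb");
    if (!fa || !fb) { perror("fopen"); return 1; }

    long long diff_sum = 0, diff_bytes = 0, total = 0;
    int ca, cb;
    while ((ca = fgetc(fa)) != EOF && (cb = fgetc(fb)) != EOF) {
        int d = ca > cb ? ca - cb : cb - ca;
        diff_sum += d;
        if (d) diff_bytes++;
        total++;
    }
    fclose(fa);
    fclose(fb);

    printf("bytes compared: %lld, differing: %lld, mean abs diff: %.3f\n",
           total, diff_bytes, total ? (double)diff_sum / total : 0.0);
    /* If the two captures are bit-identical, the "forced" setting almost
       certainly did nothing in this scene. */
    return 0;
}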
 
In light of recent events with Nvidia's Inflateonator drivers, I think it would be prudent for any reviewer to examine images to ensure that the settings that are requested are in fact produced.

I've seen two reviews of the NV35 that benched Splinter Cell with AA yet didn't have AA on for the FX cards. Do these reviewers not pay any attention?
 
jjayb said:
Do these reviewers not pay any attention?

That's just a NO. Sad but true.

For example, I know that Anand got many emails explaining the SC AA problem to him, but he didn't seem to care, because he still hasn't changed his wrong Nvidia SC AA numbers in his review, and he still claims that the rendering artifacts occurring when AA is used in SC are solely an ATI driver problem -> it's sad, really

SC=SplinterCell
 
Joe DeFuria said:
Actually, you should stop using 3DM03 with nVidia hardware until the cheating issue is resolved one way or another.
I'm OK with their hardware, just not their drivers. ;)

Borsti said:
Shame on ATI.
For what? For ruining your scoop so you couldn't show how a 9800P stacked up to a 5900U? Are you that self-centered? Because I can't think of another reason for you to say this. How can you be disappointed that ATi drivers haven't been optimized for a game when that game isn't available to be played anywhere yet? I can understand once it's been released, but not months before. I'm not sure it's appropriate for you to keep your Doom 3 scores up while you don't know whether ATi's current drivers are representative of their final performance. This seems to me something you and Anand and [H] should have clarified with ATi before, not after, benchmarking. A disappointing lack of professionalism on all your parts, IMO.
 
For example, I know that Anand got many emails explaining the SC AA problem to him, but he didn't seem to care, because he still hasn't changed his wrong Nvidia SC AA numbers in his review

One of their hardware reviewers posted in their forum last night that they are looking into it. That was a little under 24 hours ago.

I also see that Lars still hasn't updated his review yet. Guess it takes a lot longer than I thought to change "high quality" to "medium quality" in a picture.

Don't know what's worse. Sloppy reviews or dragging your feet fixing the mistakes in sloppy reviews.
 
HMMMMM??

I wonder if Omega or someone competent can undo the hacks in the NV drivers
and release a no-cheat driver to check out the True performance of zee beast (NV35).

Would sure be nice, wouldn't it??? ;)
 
tEd said:
jjayb said:
Do these reviewers not pay any attention?

That's just a NO. Sad but true.

For example, I know that Anand got many emails explaining the SC AA problem to him, but he didn't seem to care, because he still hasn't changed his wrong Nvidia SC AA numbers in his review, and he still claims that the rendering artifacts occurring when AA is used in SC are solely an ATI driver problem -> it's sad, really

SC=SplinterCell

A lot of these reviewers would change their tune if they woke up in the morning, checked the newspaper, and read "winning numbers in last night's lottery".

They continue reading, check their numbers... Anand/Borsti see they've got a "winner".
They jump in their BMW and roar down to the lottery center to pick up their 10 million, but when they get there they find their numbers are wrong.

There was a typo in the newspaper.
 