What was your first post on B3D?

A fascinating thread. I'm not sure what my first post was, but I seem to recall I came across B3D (or whatever it was called then) during my placement year (1997/98); I'm not sure if it existed in any form that long ago. I'm not even sure when I became a member. I'm sure I previously had an alternative login but lost the password and email access; in any case, I read the site for years before my current membership date.

I read it mainly for the wealth of knowledge on the board, and it was a great place for gathering all the snippets of info regarding upcoming products.

The quality reviews of videocards on the main site were an attraction as well.
 
My first post was:

"Hello All!

Its my first post here at Beyond 3D, although being a lurker for many time.
Im going here on a limb and guess GF100 is NOT same chip as FERMI compute.
Why?

Because of these posts at Xtreme Systems:

http://www.xtremesystems.org/Forums/showpost.php?p=4243892&postcount=458

http://www.xtremesystems.org/Forums/showpost.php?p=4244268&postcount=491

http://www.xtremesystems.org/Forums/showpost.php?p=4244316&postcount=498

Gemini is always viewed as Dual Fermi. But Gemini is also word for Twin. And there are false twins. So my guess is GF100 is GT300 twin, but not at 100%. Thats why they were presented at two different time. And internally at nVIDIA it might be the word for GF100.

With this in mind:
- GF100 might really be a smaller chip. Remember the guy that claimed to be an ex-nvidia employee said GF100 doesn't like to be called fatty ;)
- Maybe software is really what is delaying GF100 launch. Parallelizing geometry probably is not easy.

Cheers
"

Well, I was very, very wrong. But the parting of ways between graphics and pure compute parts did eventually come true with Pascal xD
My English was quite a bit worse at the time as well...
EDIT - Well, the distributed geometry / interconnect was indeed part of the problem, just not only in software but at the transistor level, so there is that...
 
A lot easier to find one's first post when you have a very low post count!

Hi
I think I'm not alone in experiencing the following situation: gaming happily without much action going on on the screen, the framerate is smooth and nice, but as soon as the enemies start to fill the screen and things get a bit tough, the framerate drops to less satisfying levels. So my question is: would it be possible to have an adaptive level of anti-aliasing, or for that matter any other image enhancement?

I for one would really like it, because I usually use less AA in games than I would like, to ensure that I have enough FPS in the critical moments (not that my current X700 allows much AA in newer games, but still :smile: ). Some sort of adaptive amount of AA would hopefully lead to a more stable and consistent frame rate.

Not too different from dynamic resolution, so at least my idea turned out not to be completely off in the long run, but I quickly went back to lurking after this post :)
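For anyone curious how that kind of adaptive quality tends to be done in practice, here is a minimal C++ sketch of a frame-time-driven controller: measure the previous frame's time, step a render-scale / MSAA level down when over budget, and step it back up when there is headroom. The QualityController class, the quality table and the 16.6 ms / threshold values are purely illustrative assumptions, not taken from any real engine.

[code]
// Minimal sketch of frame-time-driven quality scaling (illustrative only).
#include <algorithm>
#include <cstdio>

struct QualitySettings {
    float renderScale;   // fraction of native resolution (0.5 .. 1.0)
    int   msaaSamples;   // 1, 2, 4 or 8
};

class QualityController {
public:
    explicit QualityController(float targetMs) : targetMs_(targetMs) {}

    // Call once per frame with the previous frame's time in milliseconds.
    QualitySettings update(float frameMs) {
        // Hysteresis: only react when clearly over or under budget,
        // so quality does not flicker up and down every frame.
        if (frameMs > targetMs_ * 1.10f) {
            level_ = std::max(0, level_ - 1);   // too slow: drop a level
        } else if (frameMs < targetMs_ * 0.80f) {
            level_ = std::min(3, level_ + 1);   // headroom: raise a level
        }
        static const QualitySettings table[4] = {
            {0.50f, 1}, {0.75f, 2}, {0.90f, 4}, {1.00f, 8},
        };
        return table[level_];
    }

private:
    float targetMs_;
    int   level_ = 3;   // start at full quality
};

int main() {
    QualityController ctrl(16.6f);            // ~60 fps budget (assumed)
    const float frameTimes[] = {14.f, 15.f, 22.f, 25.f, 24.f, 15.f, 12.f};
    for (float ms : frameTimes) {
        QualitySettings q = ctrl.update(ms);
        std::printf("frame %5.1f ms -> scale %.2f, %dx MSAA\n",
                    ms, q.renderScale, q.msaaSamples);
    }
    return 0;
}
[/code]

The hysteresis band (reacting only when clearly over or under budget) is the part that matters; without it the controller oscillates between levels and the quality change becomes visible every frame.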
 
Wasn't there a forerunner site to B3D as well? I have a feeling Kristof ran one - Dimension3D, perhaps? Or did that have no linkage to what became B3D?

I have an email from a certain kristof@beyond3d.com from July 1997, so there may be a clue there :)


I believe it was Tommy McClain (AKA azbat) who ran this one:
[attached logo: DIM3D.GIF]

For those of a certain age, there were also PVR-Net and PowerVR Revolution.

[attached logos: Pvrnet_logo.jpg, pvrrev.jpg]
 
DX 10 vs DX 9 features in Crysis
Aren't there at least a couple of effects in Crysis that don't work properly, or at all, under DX9? Object-based motion blur, for example, is glitchy when forced under DX9, and I seem to recall Crytek stating this was due to an actual technical limitation of the API.
I was right about that btw. Object motion blur is super glitchy if forced in DX9. :)

P.S. I see the post count every day but scrolling through all that made me realize how many countless hours I've wasted talking about videogames and shit :LOL:
 