Fast writes, they are actually useful!?!

Ok, so I'm helping develop a 2D sprite-based engine. I decided to add some code to test the speed of rendering, and the speeds are pretty good for me on my Radeon: 'only' 4 ms to draw 600 objects, running at about 75 fps.

Of course, I started getting reports from others that they were lucky to be getting even 30 FPS on similar machines. Their drawing times were triple or more what I was getting. The main difference was their graphics cards: one had a V5 and the other a TNT2.

So what was causing this? Well, I tried my V5 and its drawing speed was 12 ms, three times longer than my Radeon's. I couldn't think of an obvious reason why. Then it occurred to me: the Radeon supports and uses fast writes, the V5 does not. So I disabled fast writes on the Radeon and the drawing time went to... 12 ms.

So, there you are. For me, at least, fast writes made drawing 3 times faster. :)
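
For anyone curious, measuring those millisecond figures is just a matter of wrapping the paint loop with the high-resolution counter. A minimal sketch, assuming Win32's QueryPerformanceCounter; DrawAllSprites is a made-up stand-in for the engine's real draw call:

Code:
#include <windows.h>

// Hypothetical stand-in for the engine's real draw call; declaration only.
void DrawAllSprites();

double TimeDrawMs()
{
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq);   // counter ticks per second
    QueryPerformanceCounter(&start);

    DrawAllSprites();                   // paint the ~600 objects

    QueryPerformanceCounter(&stop);
    return (stop.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
}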

I guess that 'would' have been useful, oh, 10 years ago when software renderers were all the rage, but as it is, we get them when we need them the least. :)
 
I never had any trouble getting over 30 FPS on a TNT2 for my software renderer... Could you do some tests which only stress fillrate? Do you use transparency and if so do you use a back buffer in system memory or video memory?
 
Nick said:
I never had any trouble getting over 30 FPS on a TNT2 for my software renderer... Could you do some tests which only stress fillrate? Do you use transparency and if so do you use a back buffer in system memory or video memory?

He's doing a 2D engine, not 3D.
 
Can't do any tests that only stress fillrate; I don't draw anything that could be used to test it. The drawing is of RLE-compressed sprites, not triangles, quads or any such primitives. Everything is sprites. In theory occluded objects are supposed to be culled, but that doesn't quite work yet.

Some transparency is used, but I haven't done any tests. It's not very common.

As for the buffer setup, using a software buffer gives me a painting time of 8 ms.
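
To give an idea of the inner loop, here's a rough sketch of drawing one scanline of an RLE-compressed sprite into a 32-bit software buffer. The run format used here (a transparent-skip count, then a literal pixel count) is only an assumption for illustration, not the engine's actual format:

Code:
#include <cstdint>

// Rough sketch: expand one scanline of an RLE sprite into a 32-bit buffer.
// Assumed run format (illustration only): [skip][count][count palette indices]...
// repeated, terminated by a skip byte of 0xFF.
void DrawRleLine(const uint8_t* src, uint32_t* dst, const uint32_t* palette)
{
    for (;;) {
        uint8_t skip = *src++;
        if (skip == 0xFF)                 // assumed end-of-line marker
            break;
        dst += skip;                      // transparent run: just advance, no writes
        uint8_t count = *src++;
        while (count--)
            *dst++ = palette[*src++];     // literal run: expand palettized pixels
    }
}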
 
I imagine that keeping the framebuffer in system RAM, and keeping it small enough (or cutting it into chunks) so the data set stays in cache, would go a long way toward keeping your paint time low.
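
Roughly what I mean, sketched out below; the strip height, ComposeStrip and the linear framebuffer pitch are all placeholder assumptions, the point is just that the working buffer stays cache-sized:

Code:
#include <cstdint>
#include <cstring>

// Sketch: paint into a small system-memory strip that fits in cache,
// then copy the finished strip out to the framebuffer in one go.
const int WIDTH = 640, HEIGHT = 480, STRIP_H = 16;

void ComposeStrip(uint32_t* strip, int y0, int rows);   // hypothetical sprite painter

void PaintFrame(uint32_t* framebuffer)
{
    static uint32_t strip[WIDTH * STRIP_H];              // ~40 KB at 640 wide, 32 bpp
    for (int y = 0; y < HEIGHT; y += STRIP_H) {
        ComposeStrip(strip, y, STRIP_H);                  // reads/writes stay in cache
        std::memcpy(framebuffer + y * WIDTH, strip, sizeof(strip));
    }
}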
 
Can't the video card's blitter be used to draw sprites? Surely that must be much faster than doing it in software, or is OS overhead so incredibly high these days that having a 200+ MHz, 128-bit wide blitter simply isn't worth it at all?

*G*
 
Is there any standard way to DMA a framebuffer from system to video memory? (directdraw? opengl textures?) As that would give you full AGP speed without tying up the processor while moving the data ...
 
arjan de lumens said:
Is there any standard way to DMA a framebuffer from system to video memory? (directdraw? opengl textures?) As that would give you full AGP speed without tying up the processor while moving the data ...

DirectDraw?
 
arjan de lumens said:
Is there any standard way to DMA a framebuffer from system to video memory? (directdraw? opengl textures?) As that would give you full AGP speed without tying up the processor while moving the data ...
I think those are called Blts :D
 
OpenGL guy said:
arjan de lumens said:
Is there any standard way to DMA a framebuffer from system to video memory? (directdraw? opengl textures?) As that would give you full AGP speed without tying up the processor while moving the data ...
I think those are called Blts :D

Blits they are called indeed, but the problem is that (at least during my time of digging into it, i.e. in the PCI era) there were practically no cards that supported device-driven system-to-local blits (and PCI had a proto-GART-like mechanism, so it was possible - my PCI SCSI card does it). So has anything changed in this respect since then? BTW, it's quite a different question which would give better throughput - host-driven blitting or device-driven blitting to video memory. Come to think of it, with fast writes & write combining (they can be used simultaneously, no?) host-driven blits could pretty much be on par (at least).
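
For comparison, the host-driven path through DirectDraw is just a Blt from a system-memory surface to the back buffer; whether the driver turns that into a CPU copy or a device-driven transfer is up to it. A bare-bones sketch, error handling omitted and the DirectDraw object and back buffer assumed to be created elsewhere:

Code:
#include <windows.h>
#include <ddraw.h>   // link against ddraw.lib

// System-memory surface to decompress sprites into, then blit to the back buffer.
LPDIRECTDRAWSURFACE7 CreateSysmemSurface(LPDIRECTDRAW7 dd, DWORD w, DWORD h)
{
    DDSURFACEDESC2 desc = {};
    desc.dwSize         = sizeof(desc);
    desc.dwFlags        = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT;
    desc.dwWidth        = w;
    desc.dwHeight       = h;
    desc.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN | DDSCAPS_SYSTEMMEMORY;

    LPDIRECTDRAWSURFACE7 surf = NULL;
    dd->CreateSurface(&desc, &surf, NULL);
    return surf;
}

void CopyToBackBuffer(LPDIRECTDRAWSURFACE7 backBuffer, LPDIRECTDRAWSURFACE7 sysSurf)
{
    // How the transfer actually happens (host writes vs. device DMA) is the driver's call.
    backBuffer->Blt(NULL, sysSurf, NULL, DDBLT_WAIT, NULL);
}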
 
People, people, people, I welcome your concern but it's not really a problem. I was just pointing out that fast writes can make a difference. Quite possibly rendering speeds will be the last of our concerns once the bytecode interpreter is up and running.

Anyway, blits aren't really an option. The memory required for decompressed sprites is huge; some of the sprites have well in excess of 1000 frames. With all that said, I have been working on an OpenGL renderer for it. Obviously it is really fast, but texture memory can cause issues.
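
For the curious, the OpenGL path boils down to keeping each decompressed frame in a texture and drawing it as a textured quad; the texture-memory pressure comes from how many of those frames stay resident. A minimal, era-typical sketch (2D ortho projection and the texture upload via glTexImage2D are assumed to happen elsewhere):

Code:
#include <windows.h>
#include <GL/gl.h>

// Draw one sprite frame, already uploaded as texture 'tex', as a quad at (x, y).
void DrawSprite(GLuint tex, float x, float y, float w, float h)
{
    glEnable(GL_TEXTURE_2D);
    glEnable(GL_BLEND);                                  // for the transparent parts
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glBindTexture(GL_TEXTURE_2D, tex);

    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(x,     y);
    glTexCoord2f(1, 0); glVertex2f(x + w, y);
    glTexCoord2f(1, 1); glVertex2f(x + w, y + h);
    glTexCoord2f(0, 1); glVertex2f(x,     y + h);
    glEnd();
}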
 
What's the reason to use 2d instead of 3d? The storage space required for all those sprites seems rather ridiculous. Look at Baldur's Gate 2 and Diablo 2... and both of those are low-res and low-framerate.
 
Baldur's Gate 2 supports any resolution your monitor supports, and the artwork is certainly very high resolution.

And Diablo II is very low-resolution (especially the artwork), but it tends to, at least in my experience, have quite a high framerate.

So, I guess I'm not sure what your point is.
 
BoddoZerg said:
What's the reason to use 2d instead of 3d?
There sure is one: we're re-creating a game engine, and we use the data sets of the old games.

BoddoZerg said:
The storage space required for all those sprites seems rather ridiculous.

Depending on the game, the compressed sprites take between 15 and 60 MB.
 
Sounds like uncompressing them in system (but AGP-able) memory, and using DirectDraw and/or Direct3D, might be the way to go.

Sure, people scoffed at AGP texturing, but here's the perfect example (a large data set that doesn't require a lot of fill rate). Of course it's to power a game that came out 10 years ago, but still...
 
Chalnoth said:
And Diablo II is very low-resolution (especially the artwork), but it tends to, at least in my experience, have quite a high framerate.

Depends on your vid card, vid mode, and CPU speed.

The game attempts to lock single-player at 24fps at all times, and skips frames to do so.

On a Voodoo5 5500 in GLide (fastest possible mode, no IQ loss compared to other modes as far as I'm concerned), I was pushing 24fps with over 300 skipped frames. Online I'd hover between 200 and 400fps. It was - excuse me but I have to say it - So Fast it's Kind of Ridiculous. :LOL:

On my Voodoo2 SLI right now, it's very unstable, but in single-card mode I can run 800x600 at ~40-50fps in GLide online.

However, on my cousin's GeForce2 Go with a P4/Celeron 1.5GHz (Dell screwed up their order; should've been a Northwood), the game sustains 60fps-ish in towns and in minor action, but once the effects start piling on, Direct3D just dies (~10fps)... which is a real shame.
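
As an aside, that kind of fixed-rate lock with frame skipping is usually just a fixed-timestep loop that drops the render whenever it falls behind. A generic sketch, nothing to do with Diablo II's actual code (UpdateGame and RenderFrame are placeholders):

Code:
#include <windows.h>

// Generic fixed-rate loop: simulate at 24 Hz, skip rendering when behind schedule.
void UpdateGame();     // placeholder
void RenderFrame();    // placeholder

void RunAt24Fps()
{
    const DWORD stepMs = 1000 / 24;
    DWORD next = GetTickCount();
    for (;;) {
        UpdateGame();
        next += stepMs;
        if (GetTickCount() <= next) {
            RenderFrame();                        // on schedule: draw this frame
            while (GetTickCount() < next)
                Sleep(1);
        }
        // otherwise we're behind: skip the render and catch up on simulation
    }
}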
 
Tagrineth said:
Chalnoth said:
And Diablo II is very low-resolution (especially the artwork), but it tends to, at least in my experience, have quite a high framerate.

Depends on your vid card, vid mode, and CPU speed.

The game attempts to lock single-player at 24fps at all times, and skips frames to do so.

On a Voodoo5 5500 in GLide (fastest possible mode, no IQ loss compared to other modes as far as I'm concerned), I was pushing 24fps with over 300 skipped frames. Online I'd hover between 200 and 400fps. It was - excuse me but I have to say it - So Fast it's Kind of Ridiculous. :LOL:

On my Voodoo2 SLI right now, it's very unstable, but in single-card mode I can run 800x600 at ~40-50fps in GLide online.

However, on my cousin's GeForce2 Go with a P4/Celeron 1.5GHz (Dell screwed up their order; should've been a Northwood), the game sustains 60fps-ish in towns and in minor action, but once the effects start piling on, Direct3D just dies (~10fps)... which is a real shame.

Yeah, Diablo 2's Direct3D is God-awful. You need a GeForce4 to be able to run it smoothly, which is ridiculous considering how low-resolution the game is.

Anyways, I was just wondering why anyone would use 2D for a game on modern PCs. I can understand 2D for handhelds, but on anything with any 3D rendering power it seems like the storage space required for high-framerate, high-resolution 2D art totally outweighs any other reason to use 2D graphics. I haven't seen an impressive-looking totally-2D game engine since Starcraft.
 
...Which is pretty funny, since Starcraft is anything BUT impressive graphically. Diablo2 is much better with at least simulated 3D parallax, vertex lighting, heavy use of transparencies etc.

I don't see why everything has to be 3D all of a sudden. I'm probably just too old-school to understand. :rolleyes:


*G*
 
Grall said:
I don't see why everything has to be 3D all of a sudden. I'm probably just too old-school to understand. :rolleyes:

Nice to know that I'm not the only one who still prefers 2D in a lot of game types.

Was quite saddened when Bioware chose not to continue developing/upgrading the Infinity engine used in BG I and II, IWD I and II, etc.

-Neutrality-
 