Acceleon in action!

Nappe1

Here is the video, but so as not to make it too easy, it's your turn to figure out how to watch it. ;)

http://rp-design.totalnfs.net/Asm2003/Video008.3gp

The presentation was, as Onslaught said, "pretty impressive", and for the basics nAo can tell you more about what I thought of it right after the presentation.

Thanks to hkultala for taking this video with an <NDA!> device. :)

More pictures and comments may follow if and when we have the time / motivation / whatever to comment on things. :)
 
Nappe1 said:
Here is the video, but so as not to make it too easy, it's your turn to figure out how to watch it. ;)

http://rp-design.totalnfs.net/Asm2003/Video008.3gp

The presentation was, as Onslaught said, "pretty impressive", and for the basics nAo can tell you more about what I thought of it right after the presentation.

Thanks to hkultala for taking this video with an <NDA!> device. :)

The device is not under NDA anymore (if it were, it would not have left my workplace); it's released, just not in stores yet (the 6650).
 
The seminar and the demonstration were very good. I have a hard time imagining that no mobile phone company will pick some of these chips up. The graphics were beautiful, I think. When I asked him at the seminar, Petri Nordlund stated that the chip won't be the reason a phone battery goes dead. I will say that it all sounds good. Although I do have faith in Bitboys, we all know, of course, that companies can twist the truth about the abilities of their products; but as I said, it looked good and sounded good. 8)
 
For one, you can turn the clock down and let the hardware handle it for a fraction of the power consumption.
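
The reason is that CMOS switching power scales roughly linearly with clock frequency and with the square of supply voltage (P ≈ C·V²·f), and a small dedicated unit needs far fewer cycles per pixel than a general-purpose CPU. A back-of-the-envelope sketch; every number in it is an illustrative assumption, not a measured figure for any chip discussed here:

Code:
# Rough CMOS dynamic-power model: P = C * V^2 * f
# Every number below is an illustrative assumption, not vendor data.

def dynamic_power(cap_farads, voltage, freq_hz):
    """Switching power of a CMOS block (watts)."""
    return cap_farads * voltage ** 2 * freq_hz

# Hypothetical CPU doing software rendering at full clock and voltage...
cpu_full = dynamic_power(cap_farads=1.0e-9, voltage=1.5, freq_hz=400e6)

# ...versus a small dedicated unit clocked (and volted) down, assuming it
# needs far fewer cycles for the same pixel work.
gpu_slow = dynamic_power(cap_farads=0.5e-9, voltage=1.0, freq_hz=50e6)

print(f"CPU rendering:  {cpu_full * 1e3:.0f} mW")
print(f"Dedicated unit: {gpu_slow * 1e3:.0f} mW")
print(f"Ratio:          {cpu_full / gpu_slow:.0f}x")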
 
Nick said:
With 200 MHz CPUs and 320x200 screens you have more than 3000 clock cycles per pixel, which is gigantic (3D Graphics on Mobile Devices). I don't see the need for hardware rendering...

OK, let's add the must-have AA on such screens, let's add some depth complexity, some vertex processing, an actual game engine running... not to mention the complexity that filtering brings, or the bandwidth.

Using your argument, you could say today's 3.2 GHz CPUs should be plenty to do graphics at 1024x768, since you'd have over 4000 clocks per pixel... err, think again ;)
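
For what it's worth, both "clocks per pixel" figures being thrown around come from dividing the clock rate by the pixel count alone; divide by a real frame rate and the budget shrinks fast. A quick sanity check (pure arithmetic, no claims about any particular chip):

Code:
def cycles_per_pixel(clock_hz, width, height, fps):
    """CPU clock cycles available per pixel per frame."""
    return clock_hz / (width * height * fps)

# The mobile case: 200 MHz CPU, 320x200 screen.
for fps in (1, 15, 30):
    print(f"200 MHz @ 320x200,  {fps:>2} fps: "
          f"{cycles_per_pixel(200e6, 320, 200, fps):7.1f} cycles/pixel")

# The desktop analogy: 3.2 GHz CPU, 1024x768 screen.
for fps in (1, 30, 60):
    print(f"3.2 GHz @ 1024x768, {fps:>2} fps: "
          f"{cycles_per_pixel(3.2e9, 1024, 768, fps):7.1f} cycles/pixel")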

And voxels... euuuwh... nice in theory, but all the games I have seen that tried to use them failed miserably, unless they were only used for the terrain, and even then... but opinions differ.

K-
 
Kristof said:
OK, let's add the must-have AA on such screens, let's add some depth complexity, some vertex processing, an actual game engine running... not to mention the complexity that filtering brings, or the bandwidth.
Do you really think people are interested in playing, say, Unreal Tournament on a 6 by 5 cm screen at 320x200?
Using your argument, you could say today's 3.2 GHz CPUs should be plenty to do graphics at 1024x768, since you'd have over 4000 clocks per pixel... err, think again ;)
Well, my Real Virtuality tech demo runs at 640x480 resolution at 20 FPS on my Celeron 1200. A little calculation shows that a 3.2 GHz CPU could run it at 1024x768. It's not fully optimized yet, and Hyper-Threading could give a big gain too. But you don't actually need high framerates and resolutions to make a game fun.
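
That "little calculation" is presumably just scaling pixel count against clock speed; under that admittedly naive assumption (ignoring memory bandwidth, IPC and cache differences), the numbers do line up:

Code:
# Naive scaling check: assume frame time is proportional to pixel count and
# inversely proportional to clock speed (ignores memory bandwidth, IPC
# differences, Hyper-Threading, and everything else that matters in practice).
base_clock, base_w, base_h, base_fps = 1.2e9, 640, 480, 20
new_clock, new_w, new_h = 3.2e9, 1024, 768

scaled_fps = base_fps * (new_clock / base_clock) * (base_w * base_h) / (new_w * new_h)
print(f"Estimated: {scaled_fps:.1f} FPS at {new_w}x{new_h}")   # ~20.8 FPS
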
And voxels... euuuwh... nice in theory, but all the games I have seen that tried to use them failed miserably, unless they were only used for the terrain, and even then... but opinions differ.
I think Delta Force 2 and Comanche 3 were excellent games.
 
People are interested in getting the best quality at the right price. Our eyes are plenty good enough to recognise large polygons, aliasing and poor texture filtering on such a screen.

The only way you can mix traditional voxel terrain with polygons is if you keep the engine 4-DOF (tilting and pitching with traditional voxel terrain are hacks whose results do not correspond to what you would get with correct camera transforms). This limits the applications a bit; aliasing and the rather coarse sampling were pretty obvious in DF and Comanche too.
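
For reference, here is a minimal sketch of the classic column-based heightmap ("voxel terrain") renderer of the Comanche/Delta Force kind; the function and its parameters are made up for illustration and have nothing to do with any engine mentioned in this thread. The point is that yaw is the only real rotation, while "pitch" is faked by sliding the horizon line, which is exactly the 4-DOF limitation described above:

Code:
import math
import numpy as np

def render_terrain(heightmap, colormap, cam_x, cam_y, cam_alt, yaw,
                   horizon, height_scale, max_dist, screen_w, screen_h):
    """Classic column-based heightmap renderer (Comanche-style).

    Degrees of freedom: x, y, altitude and yaw. The 'horizon' parameter
    merely shifts the projected horizon up or down -- a fake pitch, not a
    real camera rotation -- which is where the 4-DOF limit comes from.
    """
    screen = np.zeros((screen_h, screen_w, 3), dtype=np.uint8)
    y_buffer = np.full(screen_w, screen_h, dtype=int)  # topmost drawn pixel per column
    sin_y, cos_y = math.sin(yaw), math.cos(yaw)

    z = 1.0
    while z < max_dist:
        # Left and right ends of the sample line at depth z (yaw rotation only).
        left_x  = cam_x + (-cos_y - sin_y) * z
        left_y  = cam_y + ( sin_y - cos_y) * z
        right_x = cam_x + ( cos_y - sin_y) * z
        right_y = cam_y + (-sin_y - cos_y) * z
        dx = (right_x - left_x) / screen_w
        dy = (right_y - left_y) / screen_w

        for col in range(screen_w):
            hx = int(left_x) % heightmap.shape[0]
            hy = int(left_y) % heightmap.shape[1]
            # Perspective divide by z; 'horizon' only offsets the result vertically.
            top = int((cam_alt - heightmap[hx, hy]) / z * height_scale + horizon)
            top = max(top, 0)
            if top < y_buffer[col]:
                screen[top:y_buffer[col], col] = colormap[hx, hy]
                y_buffer[col] = top
            left_x += dx
            left_y += dy
        z += 1.0
    return screen

# Tiny smoke test on random terrain (purely illustrative).
hm = np.random.randint(0, 64, (256, 256))
cm = np.random.randint(0, 255, (256, 256, 3), dtype=np.uint8)
frame = render_terrain(hm, cm, 128, 128, 100, 0.0, 60, 120, 300, 160, 120)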
 
Nick said:
Do you really think people are interested in playing, say, Unreal Tournament on a 6 by 5 cm screen at 320x200?

You'd be surprised when you see one of these new displays. I have seen several of the new small phone displays, and the quality they deliver is stunning. And as we all know, size does not matter. If kids can be hooked on their Game Boy, they will definitely be hooked when they can play UT on a similar device... at least IMHO.

Comanche 3 might have been an enjoyable game, but you'll note that Comanche 4 got rid of the excellent voxels... what happens to your engine's performance when you move up to trilinear filtering?

Also, on an LCD platform you want a solid 30 fps (minimum). Do you think the 400 MHz mobile-platform CPUs can deliver that? (Remember, they are not like Intel's or AMD's latest CPUs with advanced SSE and other optimised media instruction sets, nor do they have dual bus interfaces to high-speed DDR memory or huge caches.)

K-
 
Kristof said:
You'd be surprised when you see one of these new displays. I have seen several of the new small phone displays, and the quality they deliver is stunning. And as we all know, size does not matter. If kids can be hooked on their Game Boy, they will definitely be hooked when they can play UT on a similar device... at least IMHO.
I'm sure their quality is stunning. But I'd rather not have to squint to read my health level, unless you want to use 1/3 of the screen for the HUD? And with those pixel sizes you probably also have to hold it at 10 cm to see the individual pixels. Did you play Jakko's game in the small or the big window? I even think the small one was beyond scale...
Comanche 3 might have been an enjoyable game, but you'll note that Comanche 4 got rid of the excellent voxels... what happens to your engine's performance when you move up to trilinear filtering?
Voxels are, in my opinion, better at displaying tiny detail and giving the impression of real vegetation. After all, it's just the impression that counts. Polygon landscapes are not the way to go; they look way too flat.

And you don't need trilinear filtering to enjoy a game. Anyway, the performance hit is less than 10% (keeping the lightmap filter at bilinear). Besides, none of the Acceleon products support it, and I would be surprised if their mipmapping were per-pixel like mine.

And like I said, there's still a lot of opportunity for optimization. Notably the HSR isn't optimal for software rendering. A portal clipper could reduce the overdraw to zero and might keep everything at 30 FPS minimum.
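
For the curious, a portal clipper of the kind hinted at here boils down to walking sectors through their portals and narrowing the visible screen span as you go. A bare-bones 1D (screen-column) sketch with made-up data structures, not anything from the Real Virtuality code:

Code:
from dataclasses import dataclass, field

@dataclass
class Portal:
    target: int      # index of the sector seen through this portal
    x0: int          # leftmost screen column the portal projects to
    x1: int          # rightmost screen column (exclusive)

@dataclass
class Sector:
    portals: list = field(default_factory=list)

def visible_spans(sectors, start, span=(0, 320), out=None, depth=0):
    """Collect (sector, column span) pairs reachable from 'start'.

    Each recursion narrows the span to the portal's on-screen projection;
    in a full renderer the parent sector would only draw its walls outside
    those portal spans, which is what drives overdraw to zero.
    """
    if out is None:
        out = []
    if depth > 32:                      # guard against cyclic portal graphs
        return out
    left, right = span
    if left >= right:
        return out
    out.append((start, (left, right)))
    for p in sectors[start].portals:
        clipped = (max(left, p.x0), min(right, p.x1))
        visible_spans(sectors, p.target, clipped, out, depth + 1)
    return out

# Toy world: sector 0 sees sector 1 through columns 100..220, and sector 1
# sees sector 2 through columns 180..260 (clipped to 180..220 on screen).
world = [Sector([Portal(1, 100, 220)]), Sector([Portal(2, 180, 260)]), Sector()]
print(visible_spans(world, start=0))
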
Also, on an LCD platform you want a solid 30 fps (minimum). Do you think the 400 MHz mobile-platform CPUs can deliver that? (Remember, they are not like Intel's or AMD's latest CPUs with advanced SSE and other optimised media instruction sets, nor do they have dual bus interfaces to high-speed DDR memory or huge caches.)
You're spoiled if you -need- 30 FPS. Unreal Tournament is perfectly playable on my 300 MHz laptop at 18-25 FPS. And I would rather invest in a floating-point unit that is useful for the whole game than in an extra chip.

Just an example: Game Boy Advance has a 240x160 display, 6 by 4 cm. The processor is a 16 MHz RISC. Still, it is capable of 3D games that look like Quake, GTA, NFS, Soccer, etc. A quick calculation shows that we have 20 clock cycles per pixel if we target 20 FPS. Now imagine what a 400 MHz CPU with a floating-point unit could do...
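
Plugging those handheld numbers into the same kind of per-pixel budget (pure arithmetic; the 176x208 phone resolution is an assumed example, not a quoted spec):

Code:
# Per-pixel cycle budgets implied by the numbers above (pure arithmetic).
gba = 16e6 / (240 * 160 * 20)             # GBA: 16 MHz, 240x160, 20 FPS
phone = 400e6 / (176 * 208 * 30)          # assumed 176x208 phone screen at 30 FPS
print(f"GBA, 16 MHz, 240x160 @ 20 FPS:       {gba:6.1f} cycles/pixel")
print(f"400 MHz, 176x208 @ 30 FPS (assumed): {phone:6.1f} cycles/pixel")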
 
Kristof said:
OK, let's add the must-have AA on such screens.
Kristof said:
You'd be surprised when you see one of these new displays. I have seen several of the new small phone displays, and the quality they deliver is stunning. And as we all know, size does not matter. If kids can be hooked on their Game Boy, they will definitely be hooked when they can play UT on a similar device... at least IMHO.

I would submit that high-quality AA and filtering are much more important on these devices than on the desktop.
1. They inherently have very high pixel-to-pixel contrast (compared to CRTs).
2. Their linear resolution is 3-5 times lower than that of typical computer screens today. (Roughly one tenth the information content.)

One way of looking at number 2 is that high levels of AA/filtering manage to cram more information into the same pixel area. So these devices really cry out for higher-quality AA than is typical on the desktop today. It would be interesting to hear whether you've experimented with very high sampling levels, Kristof. It sounds like you have.
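
To make point 2 concrete: rendering at several times the resolution and box-filtering down packs some of that extra information into the same small pixel grid. A minimal ordered-grid supersampling sketch; the code is generic and not tied to any hardware discussed here:

Code:
import numpy as np

def supersample_downfilter(render, width, height, factor=4):
    """Render at factor^2 times the sample count, then box-filter down.

    'render(w, h)' is any callback returning an (h, w, 3) float image.
    The averaged result packs the detail of the oversized render into
    the small LCD-sized grid.
    """
    big = render(width * factor, height * factor)       # (H*f, W*f, 3)
    big = big.reshape(height, factor, width, factor, 3)
    return big.mean(axis=(1, 3))                         # (H, W, 3)

# Toy 'renderer': a high-frequency checkerboard that aliases badly at 1x.
def checker(w, h):
    ys, xs = np.mgrid[0:h, 0:w]
    c = ((xs // 3 + ys // 3) % 2).astype(float)
    return np.repeat(c[:, :, None], 3, axis=2)

aa_frame = supersample_downfilter(checker, 176, 208, factor=4)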

Entropy
 
Nick said:
And you don't need trilinear filtering to enjoy a game. Anyway, the performance hit is less than 10% (keeping the lightmap filter at bilinear). Besides, none of the Acceleon products support it, and I would be surprised if their mipmapping were per-pixel like mine.

You're spoiled if you -need- 30 FPS. Unreal Tournament is perfectly playable on my 300 MHz laptop at 18-25 FPS. And I would rather invest in a floating-point unit that is useful for the whole game than in an extra chip.

I don't care about what Acceleon supports or does not support ;)
MBX supports all the way up to anisotropic filtering if you check the ARM documents.

And the second point quoted is ... well... interesting. Do note that nobody is talking about extra chips... more like extra units within the already existing core chip.

I do believe it's fairly foolish to keep promoting the "CPU alone is good enough" ideology... since it has fallen flat on its face in the PC market and will do the same in the mobile market. The inherent parallelism and caching of 3D graphics is simply not exploited by a standard CPU architecture and hence will be inefficient, and inefficiency is a huge no-no on mobile platforms for obvious reasons (heat, power consumption and bandwidth).

And all those Game Boy 3D titles... well... erhm... they either run very slowly or look poor, with hardly anything happening on the screen. 2D is something else, since there is special hardware there to improve sprite handling and memory copies.

K-
 
Kristof said:
And the second point quoted is ... well... interesting. Do note that nobody is talking about extra chips... more like extra units within the already existing core chip.
Well, my point remains the same: add a floating-point unit that is useful for more than just (3D) games.
I do believe it's fairly foolish to keep promoting the "CPU alone is good enough" ideology... since it has fallen flat on its face in the PC market and will do the same in the mobile market. The inherent parallelism and caching of 3D graphics is simply not exploited by a standard CPU architecture and hence will be inefficient, and inefficiency is a huge no-no on mobile platforms for obvious reasons (heat, power consumption and bandwidth).
I'd rather have a more powerful CPU that also allows me to do other multimedia applications.
And all those Game Boy 3D titles... well... erhm... they either run very slowly or look poor, with hardly anything happening on the screen. 2D is something else, since there is special hardware there to improve sprite handling and memory copies.
You are greatly underestimating the CPU. When Quake was released in 1996, a 400 MHz processor was unimaginable. Here's an interesting quote from one of the reviewers back then, who ran it on a Pentium 75 (inferior to current RISC architectures):
All this is backed by graphics that are awesome in their own right. The creatures that fill the game’s four worlds and 28 levels are, as you might expect, sick, twisted, and perverse. The visceral effect of the bloody grimaces and entropic bioforms is intensified by animation that’s unusually smooth and utterly convincing.
So, now that we have graphics cards with anti-aliasing, anisotropic filtering and per-pixel lighting, this suddenly isn't true any more? That's why I say people today are spoiled. They no longer realize that you really don't need all that extra eye candy to have an impressive and enjoyable game. So a 400 MHz CPU is definitely going to fulfil the needs of the average customer. Otherwise, you've got fine laptops with DirectX 9 compatible graphics cards and nice resolutions nowadays.
 
And the reason why there's "hardly anything happening" in current 3D GBA games is that the screen is simply too small. A mobile phone's screen is usually even smaller!

Also, how are you going to play a fast-paced game on a handheld device? With a mouse that fits under your pinkie? 3D hardware acceleration isn't worth anything if you can't create a playable game with it.
 
You are greatly underestimating the CPU. When Quake was released in 1996, a 400 MHz processor was unimaginable. Here's an interesting quote from one of the reviewers back then, who ran it on a Pentium 75 (inferior to current RISC architectures):
And not much later Quake 2 was launched, which looked great at 320x200 in software mode. Until I saw what Quake 2 looked like on a 3dfx Voodoo:

- 640x480 resolution
- bilinear texture filtering
- real transparency (as opposed to dithering)
- colored lights

And all this ran much more smoothly than in software mode.

No sir, the naysayers who are against hardware-accelerated graphics are just sticking their heads in the sand about what could be possible.
 
I'd rather have a more powerful CPU that also allows me to do other multimedia applications.

To play a 3D game, whatever the complexity of the application, you ideally need a strong CPU combined with a strong graphics chip. Image-quality features or not (i.e. AA or AF), a game that relies on software rendering looks, plain and simple, like Ass(tm).

All Kristof said is that there has to be a balance between CPU and graphics power, which has applied to the PC desktop market for years now too, ever since we abandoned software rendering.

Combine a strong CPU with a weak graphics chip and the results should be pretty predictable. The CPU just gets swamped by all the extra load.

At The Inquirer I saw a BB official claiming roughly better-than-PS1 performance (dunno which of the three models he meant). By contrast, the MBX PRO at 120 MHz delivers up to 3.75M polys/sec, while the optional VGP goes up to 480 MFLOPS.

Multimedia applications? Who told you that the above solutions are weaker at those than just a strong CPU, anyway?

So, now that we have graphics cards with anti-aliasing, anisotropic filtering and per-pixel lighting, this suddenly isn't true any more? That's why I say people today are spoiled. They no longer realize that you really don't need all that extra eye candy to have an impressive and enjoyable game. So a 400 MHz CPU is definitely going to fulfil the needs of the average customer. Otherwise, you've got fine laptops with DirectX 9 compatible graphics cards and nice resolutions nowadays.

In the case of a 400 MHz CPU and a game that can run both hardware-accelerated and in software rendering, let's say the original UT: even if you ignore the high resolutions you can render at on today's PCs, the high-resolution textures, and the AA/AF you can add, software rendering doesn't just look like A**(tm), it's also dog slow.

Back when 400 MHz CPUs were high end, it was always better to have a Voodoo running it in Glide than the pathetic software rendering mode.

Once you acknowledge that people are supposedly "spoiled", you've also identified what the market actually wants and needs ;)

As far as antialiasing goes, Acceleon supports AA too. It's "that" unnecessary. :rolleyes:
 
Nick said:
Time will tell...

Time has already told its story in the PC and console graphics markets. Apparently we'll have to go through the same stupid cycles before some people understand, although it shouldn't be necessary at all.
 