Question about using OpenGL/OpenGL ES and 2D animation

Flux

Regular
I am going to develop a game on Android and iOS that uses OpenGL with animated textured quads. I have some questions about rendering 2D animations in OpenGL.


Can you render a texture on a quad, clear the color and alpha buffers, then render another texture in the next frame?

Because OpenGL limits you to only one Begin() and End(), can you render multiple textures to a quad and then use the alpha channel to make one animation visible and the others transparent?

Are there better, more efficient ways to do 2D animation on Android than this?


Thank you for your reply.
 
Since you've not had a reply, I have decided that you should benefit from my wisdom. Yes, you're not worthy, but I'll make an exception.
185 views and no replies. I can't believe the clever people here don't have an answer for you, so maybe you've phrased the question in a way people don't get.
So what are you trying to achieve exactly, in plain English? (I'm thinking some sort of sprite animation.)
Also, why OpenGL (cross-platform portability)? Don't Android et al. have their own APIs? http://developer.android.com/guide/topics/graphics/2d-graphics.html
When I think 2D animation I think Flash (yes, it's the spawn of Satan). Is that an option?
 
I hardly qualify as clever people, at least as far as OpenGL is concerned, but this sure seems doable. You should be able to bind a different texture to your quad for each frame.

Or you could bind several textures from the beginning, pass a uniform parameter to your shader, test this parameter and, according to the result, decide to sample from the correct texture. That may be a stupid way of doing it, though.

I don't think messing with transparency is necessary or advisable in this case (unless you want semi-transparency) but I don't know for sure.
 
So you could...

1) Bind a texture to a quad
2) Clear the color/depth/alpha buffers
3) Repeat


Is this practical in OpenGL/OpenGL ES?
 