I am rendering multiple 3D objects with textures that have an alpha channel. All the textures load fine, but when I try to render them in front of each other I get the following:
Left is what I have. Right is what it should be. The grid is just to help visualize the perspective.
The texture in front of the red circle texture is clipped. I searched around for an answer, and the usual advice is to use:
GLES20.glEnable( GLES20.GL_BLEND );
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA );
But I am already using that, and it still isn't working. My setup, which I placed in the onSurfaceCreated() function, is:
GLES20.glClearColor( 0.75f, 0.85f, 1f, 1.0f );
GLES20.glEnable( GLES20.GL_BLEND );
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA );
GLES20.glEnable( GLES20.GL_DEPTH_TEST );
GLES20.glDepthFunc( GLES20.GL_LEQUAL );
GLES20.glDepthMask( true );
GLES20.glClearDepthf( 1f );
My fragment shader is:
precision mediump float; // GLSL ES fragment shaders require a default float precision
uniform sampler2D texture;
varying vec2 texCoord;
void main() {
    gl_FragColor = texture2D(texture, texCoord);
}
Do I have to include anything in the Android manifest to enable alpha blending? I do not want to end up having to manually sort my polygons or use an alpha discard(), because I need and want some pixels to be translucent.
How do I get alpha transparency to work with the depth buffer in 3D?
Here is an overview of a few methods to render with transparency in OpenGL, with advantages and disadvantages for each.
The first option is alpha testing. This is a very limited method, but it is sufficient for the specific case the poster asked about. The example shown does not really need translucency, because everything is either fully opaque or fully transparent (alpha = 1.0 or alpha = 0.0).
There used to be a fixed-function alpha test for exactly this purpose in OpenGL, but it is deprecated on the desktop and is not available in ES 2.0. You can emulate the same thing in your fragment shader, which will look something like this:
precision mediump float;
uniform sampler2D tex;
varying vec2 texCoord;
void main() {
    vec4 val = texture2D(tex, texCoord);
    if (val.a > 0.5) {
        gl_FragColor = val;   // opaque enough: keep the fragment
    } else {
        discard;              // transparent: write neither color nor depth
    }
}
Advantages:
- Very simple, and no changes to the rendering order are needed; the depth buffer keeps working normally.
- No blending state is required, so opaque and "transparent" objects can be drawn in any order.

Disadvantages:
- Transparency is all or nothing: fragments with intermediate alpha values cannot be rendered as translucent, and edges tend to look hard and aliased.
- discard can hurt performance on some GPUs (particularly tile-based mobile GPUs), because it interferes with early depth optimizations.
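With this method, blending does not even need to be enabled; the depth test alone resolves visibility in any draw order. A minimal GLES20 sketch of the matching state setup:

GLES20.glDisable(GLES20.GL_BLEND);      // discard removes transparent fragments, so no blending is needed
GLES20.glEnable(GLES20.GL_DEPTH_TEST);  // the depth buffer handles visibility as usual
GLES20.glDepthMask(true);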
The second option is sorting plus blending. Rendering transparency is a primary use case for blending: the most common approach is to set the blend function to (SRC_ALPHA, ONE_MINUS_SRC_ALPHA), enable blending, and render with the alpha component of the rendered fragments containing the desired opacity.
If the scene contains a mixture of fully opaque objects and objects with transparency, the fully opaque objects can be rendered first, without a need for them to be sorted. Only the objects with transparency need to be sorted. The sequence is then (a GLES20 sketch follows below):

1. Render all fully opaque objects, with depth testing and depth writes enabled.
2. Sort the objects with transparency back to front.
3. Render the sorted transparent objects with blending enabled and depth writes disabled, so they are still tested against the opaque geometry but do not clip each other.
Advantages:
- Handles true translucency: any alpha value between 0.0 and 1.0 produces a correct result, as long as the sorting is correct.
- Needs only a single rendering pass over the scene.

Disadvantages:
- The transparent objects have to be re-sorted whenever the camera or the objects move, which costs CPU time.
- Per-object sorting breaks down for intersecting or interleaved geometry; fully correct results can require splitting polygons.
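A minimal GLES20 sketch of this sequence; drawOpaqueObjects() and drawTransparentObjectsBackToFront() are hypothetical placeholders for your own draw code:

// Opaque pass: any draw order, the depth buffer sorts it out.
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glDepthMask(true);
drawOpaqueObjects();

// Transparent pass: back-to-front order, blending on, depth test on,
// depth writes off so the transparent surfaces do not clip each other.
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
GLES20.glDepthMask(false);
drawTransparentObjectsBackToFront();

// Restore state for the next frame.
GLES20.glDepthMask(true);
GLES20.glDisable(GLES20.GL_BLEND);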
The third option is a depth pre-pass for the transparent geometry. This is a very clever use of OpenGL features, IMHO, and can be a good practical solution. It does require multiple rendering passes. The simple form requires 3 passes (a GLES20 sketch follows below):

1. Render all fully opaque geometry as usual.
2. Render the transparent geometry with color writes disabled, so that the depth buffer ends up holding the depth of the frontmost transparent surface at each pixel.
3. Render the transparent geometry again with blending enabled and the depth test set to GL_EQUAL, so that only the frontmost transparent surface per pixel passes the test and gets blended.
A minimal shader can be used for pass 2, since it does not need to produce any valid fragment colors.
Advantages:
- No sorting of any kind is required; the result is order-independent.
- Uses only the regular depth buffer, with no special hardware features.

Disadvantages:
- The transparent geometry has to be rendered twice, on top of the opaque pass.
- In this simple form, only the frontmost transparent surface is blended at each pixel; transparent surfaces behind it are not visible through it.
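One possible GLES20 realization of the three passes; drawOpaque() and drawTransparent() stand in for your own draw code:

// Pass 1: opaque geometry, regular depth testing.
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glDepthFunc(GLES20.GL_LEQUAL);
GLES20.glDepthMask(true);
drawOpaque();

// Pass 2: depth only. With color writes disabled, the depth buffer
// ends up holding the depth of the frontmost transparent surface.
// A minimal shader is sufficient here.
GLES20.glColorMask(false, false, false, false);
drawTransparent();

// Pass 3: blend exactly the fragments that match the stored depth,
// i.e. the frontmost transparent surface at each pixel.
GLES20.glColorMask(true, true, true, true);
GLES20.glDepthFunc(GLES20.GL_EQUAL);
GLES20.glDepthMask(false);
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
drawTransparent();

// Restore defaults.
GLES20.glDepthFunc(GLES20.GL_LEQUAL);
GLES20.glDepthMask(true);
GLES20.glDisable(GLES20.GL_BLEND);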
The fourth option is alpha to coverage. I haven't used this myself, so the following is based on my limited theoretical understanding, but it looks like another interesting method. It requires multi-sampled rendering. The feature is enabled with glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE), which translates the alpha value of each fragment into a coverage mask, so that only part of the samples are written, depending on the alpha value. This produces a transparency effect when the multi-sample buffer is downsampled to the final color buffer.
Advantages:
- Order-independent and needs only a single pass; it cooperates with the depth buffer without any sorting.

Disadvantages:
- Requires multi-sampled rendering, with the associated memory and bandwidth cost.
- The number of distinct opacity levels is limited by the sample count (for example, only a handful of levels with 4x MSAA), which can show up as visible dithering.
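A short GLES20 sketch, assuming the EGL surface was created with multi-sampling; on Android that is requested when the GLSurfaceView's EGL config is chosen (the exact config chooser depends on your setup):

// Only effective on a multi-sampled surface. Each fragment's alpha is
// turned into a sample coverage mask; no sorting or blending required.
GLES20.glEnable(GLES20.GL_SAMPLE_ALPHA_TO_COVERAGE);
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
drawScene(); // hypothetical call that draws all geometry in any order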