One of the topics we’ve handily avoided so far in our exploration of Metal is rendering of materials that are not opaque. In this post, we’ll explore a couple of related techniques for achieving transparency and translucency: alpha testing and alpha blending.
The sample scene for this post is a desert containing many palm trees and a few pools of water. The leaves of the palm trees consist of a few polygons each, textured with a partially-transparent texture, and the water is rendered as a translucent surface via alpha blending, which we’ll discuss in detail below.
What is Alpha, Anyway?
Alpha is used to represent the opacity (or coverage) of a color value. The higher the alpha value, the more opaque the color. When speaking in terms of fragment color, the alpha component indicates how much the scene behind the fragment shows through.
An excellent paper by Andrew Glassner explores the complementary notions of opacity and coverage in far greater depth than I can here.
Alpha Testing

The first technique we’ll use to render partially-transparent surfaces is alpha testing.
Alpha testing allows us to determine whether a fragment should contribute to the renderbuffer by comparing its opacity against a threshold, called the reference value. Alpha testing is used to make surfaces selectively transparent.
One common application of alpha testing is the rendering of foliage, where relatively few polygons can be used to describe the shape of leaves, and the alpha channel of the leaf texture can be used to determine which pixels are drawn and which are not.
Alpha Testing in Metal
Alpha testing in Metal is implemented in the fragment function. The shader file contains a global reference value (0.5 in the sample app), against which our alpha values will be compared. The fragment function samples the diffuse texture of the palm tree, which has an alpha value of 0 for the texels that should be transparent. The sampled color is then used to decide whether the fragment should be visible, described in the sections that follow.
The ‘discard_fragment’ function
In order to indicate to Metal that we don’t want to provide a color for a fragment, we cannot simply set the returned alpha value to 0. If we did, the fragment depth would still be written into the depth buffer, causing any geometry behind the “transparent” point to be obscured.
Instead, we need to call a special function, discard_fragment, that avoids specifying a color value for the fragment entirely. Calling this function prevents Metal from writing the computed depth and color values of the fragment into the renderbuffer, which allows the scene behind the fragment to show through.
The Texture Alpha Channel
To perform the per-fragment alpha test, we need a specially-constructed texture that contains suitable coverage information in its alpha channel. The figure below shows the palm leaf texture used in the sample app.
Performing the Alpha Test
The implementation of alpha testing in the fragment shader is very straightforward. Bringing together the techniques we just discussed, we test the sampled alpha value against the threshold, and discard any fragments that fail the alpha test:
float4 textureColor = texture.sample(texSampler, vert.texCoords);
if (textureColor.a < kAlphaTestReferenceValue)
    discard_fragment();
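Since Metal Shading Language is C++-based, the same logic can be sketched in plain C++ for readers who want to experiment outside a shader. The Color type and function name here are illustrative stand-ins, not the sample app's actual code:

```cpp
#include <optional>

struct Color { float r, g, b, a; };

// Threshold mirroring the sample app's reference value of 0.5.
constexpr float kAlphaTestReferenceValue = 0.5f;

// Returns the fragment's color, or an empty optional when the fragment
// fails the alpha test -- the analogue of calling discard_fragment(),
// which writes neither color nor depth for that fragment.
std::optional<Color> alphaTestedColor(Color textureColor) {
    if (textureColor.a < kAlphaTestReferenceValue)
        return std::nullopt; // "discarded": renderbuffer left untouched
    return textureColor;
}
```

A texel sampled from a fully transparent region of the leaf texture (alpha 0) is discarded; an opaque texel passes through unchanged.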
A Note on Performance (Optional)
discard_fragment has performance implications. In particular, it prevents the hardware from performing an optimization called early depth test (sometimes called early-z).
Oftentimes, the hardware can determine whether a fragment will contribute to the renderbuffer before calling the fragment function, since fragment depth is computed in the rasterizer. This means that it can avoid shading blocks of fragments that are known to be obscured by other geometry.
On the other hand, if a fragment is going to be shaded with a function that contains conditional discard_fragment calls, this optimization cannot be applied, and the hardware must invoke the shader for every potentially-visible fragment.
In the sample code for this post, we have separate fragment functions for geometry that is rendered with the alpha test and geometry that is rendered without it. The alpha testing function should only be used on geometry that actually needs it, as overuse of discard_fragment can have a strongly negative performance impact.
A full explanation of early depth-testing is well beyond the scope of this post, but you can read Fabien Giesen’s fantastic blog post on the subject for all of the details.
Alpha Blending

Another useful technique for implementing transparency is alpha blending.
Alpha blending is achieved by interpolating between the color already in the renderbuffer (the destination) and the fragment currently being shaded (the source). The exact formula used depends on the desired effect, but in this article we will use the following equation:

c_f = α_s × c_s + (1 − α_s) × c_d

where c_s and c_d are the RGB components of the source and destination colors, respectively; α_s is the source color’s opacity; and c_f is the final blended color value for the fragment.
Expressed in words: we multiply the source color’s opacity by the source color’s RGB components, producing the contribution of the fragment to the final color. To this we add the color already in the renderbuffer, weighted by the complement of the source opacity (one minus the alpha). This “blends” the two colors to create the new color that is written to the renderbuffer.
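The blend can be written out as a few lines of C++ (the struct and function names are mine, for illustration):

```cpp
struct Color { float r, g, b, a; };

// Source-over blend: weight the source color by its alpha and the
// destination color by the complement (1 - alpha), then sum the two.
Color blendOver(Color src, Color dst) {
    float inv = 1.0f - src.a;
    return { src.a * src.r + inv * dst.r,
             src.a * src.g + inv * dst.g,
             src.a * src.b + inv * dst.b,
             src.a * src.a + inv * dst.a };
}
```

For example, blending 40%-opaque white over an opaque black background yields the mid gray (0.4, 0.4, 0.4).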
Alpha Blending in Metal
In the sample project for this article, we use alpha blending to make the surface of the water translucent. This effect is computationally cheap, and it can be enhanced with partial reflection and an animation that slides the texture over the water surface or physically displaces it.
Enabling Blending in the Pipeline State
There are at least two ways of achieving blending in Metal: fixed-function and programmable. We will discuss fixed-function blending in this section. Fixed-function blending consists of setting properties on the color attachment descriptor of the render pipeline descriptor. These properties determine how the color returned by the fragment function is combined with the pixel’s existing color to produce the final color.
To make the next several steps more concise, we save a reference to the color attachment:
MTLRenderPipelineColorAttachmentDescriptor *renderbufferAttachment = pipelineDescriptor.colorAttachments[0];
To enable blending, we set the corresponding property on the color attachment:
renderbufferAttachment.blendingEnabled = YES;
Next, we select the operation that is used to combine the weighted color and alpha components. Here, we choose addition for both:
renderbufferAttachment.rgbBlendOperation = MTLBlendOperationAdd;
renderbufferAttachment.alphaBlendOperation = MTLBlendOperationAdd;
Now we need to specify the weighting factors for the source color and alpha. We select the SourceAlpha factor to match the formula given above.
renderbufferAttachment.sourceRGBBlendFactor = MTLBlendFactorSourceAlpha;
renderbufferAttachment.sourceAlphaBlendFactor = MTLBlendFactorSourceAlpha;
Finally, we choose the weighting factors for the destination color and alpha. These weight the destination by one minus the source alpha, the complement of the source factors:
renderbufferAttachment.destinationRGBBlendFactor = MTLBlendFactorOneMinusSourceAlpha;
renderbufferAttachment.destinationAlphaBlendFactor = MTLBlendFactorOneMinusSourceAlpha;
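To see that this configuration reproduces the blend equation given earlier, here is a toy C++ model of the fixed-function blend state. The enum and function names loosely mirror Metal's MTLBlendFactor and MTLBlendOperationAdd, but this is not the Metal API:

```cpp
enum class BlendFactor { Zero, One, SourceAlpha, OneMinusSourceAlpha };

// Weight contributed by a blend factor, given the source alpha.
float factorWeight(BlendFactor f, float srcAlpha) {
    switch (f) {
        case BlendFactor::Zero:                return 0.0f;
        case BlendFactor::One:                 return 1.0f;
        case BlendFactor::SourceAlpha:         return srcAlpha;
        case BlendFactor::OneMinusSourceAlpha: return 1.0f - srcAlpha;
    }
    return 0.0f;
}

// MTLBlendOperationAdd: weighted source plus weighted destination,
// applied per channel.
float blendAdd(float src, float dst, float srcAlpha,
               BlendFactor srcFactor, BlendFactor dstFactor) {
    return factorWeight(srcFactor, srcAlpha) * src
         + factorWeight(dstFactor, srcAlpha) * dst;
}
```

With SourceAlpha for the source and OneMinusSourceAlpha for the destination, blendAdd computes α_s × c_s + (1 − α_s) × c_d per channel, which is exactly the formula we set out to implement.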
The Alpha Blending Fragment Shader
The fragment shader is responsible for returning a suitable alpha value for blending. This value is then used as the “source” alpha value in the blending equation configured on the rendering pipeline state.
In the sample code, we multiply the sampled diffuse texture color with the current vertex color to get a source color for the fragment. Since the texture is opaque, the vertex color’s alpha component becomes the opacity of the water. Passing this value as the alpha of the fragment color’s return value produces alpha blending according to how the color attachment was previously configured.
Order-Dependent Blending versus Order-Independent Blending
When rendering translucent surfaces, order matters. In order for alpha blending to produce a correct image, translucent objects should be rendered after all opaque objects. Additionally, they should be drawn from back to front (in order of decreasing distance from the camera). Since draw order depends on the view position and orientation, objects need to be re-sorted every time their relative order changes, which in practice means whenever the camera moves.
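A minimal back-to-front sort might look like the following C++ sketch. The Draw struct and the use of a single center point per object are hypothetical simplifications; a real renderer would sort whatever per-object records it keeps:

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

// Squared distance orders points the same way as true distance,
// without the cost of a square root.
float distanceSquared(Vec3 a, Vec3 b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

struct Draw {
    Vec3 center; // representative point, e.g. the bounding-box center
    // mesh, pipeline state, uniforms, ...
};

// Sort translucent draws so the farthest object is rendered first; each
// blend then reads a destination color that already contains everything
// behind the current surface.
void sortBackToFront(std::vector<Draw>& draws, Vec3 cameraPosition) {
    std::sort(draws.begin(), draws.end(),
              [&](const Draw& a, const Draw& b) {
                  return distanceSquared(a.center, cameraPosition)
                       > distanceSquared(b.center, cameraPosition);
              });
}
```

Note that sorting whole objects by a single point is only an approximation: intersecting or interpenetrating translucent surfaces cannot be ordered correctly this way, which is part of what motivates the order-independent techniques discussed below.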
We avoid this problem entirely in the sample app by having only a single, convex, translucent surface in the scene: the water. Other applications won’t have it so easy.
There has been much research in the last several years on ways to avoid this expensive sorting step. One technique uses a so-called A-buffer to maintain a list of color and depth values for the fragments that contribute to each pixel. These fragments are then sorted and blended in a second pass.
More recently, research has placed an emphasis on the mathematics of alpha compositing itself, in an attempt to derive an order-independent transparency solution. Refer to McGuire and Bavoil’s paper on weighted, blended order-independent transparency for the full details on this technique.
In this article, we have seen two ways of rendering non-opaque materials: alpha testing and alpha blending. These two techniques can be used side-by-side to add extra realism to your virtual scenes.