Metal by Tutorials

Third Edition · macOS 12 · iOS 15 · Swift 5.5 · Xcode 13


20. Fragment Post-Processing
Written by Caroline Begbie & Marius Horga


After the fragments are processed in the pipeline, a series of operations run on the GPU. These operations are sometimes referred to as Per-sample Processing (https://www.khronos.org/opengl/wiki/Per-Sample_Processing) and include alpha testing, depth testing, stencil testing, scissor testing, blending and antialiasing. You’ve already encountered a few of these operations in earlier chapters, such as depth testing and stencil testing. Now it’s time to revisit those concepts while also learning about the others.

The Starter App

➤ In Xcode, open the starter app for this chapter, and build and run the app.

The starter app

The standard forward renderer renders the scene using the PBR shader. This scene has a tree and ground plane, along with an extra window model that you’ll add later in this chapter. You can use the options at the top-left of the screen to toggle the post-processing effects. Those effects aren’t active yet, but they will be soon!

Submesh and Model now accept an optional texture to use as an opacity map. Later in this chapter, you’ll update the PBR shader function to take a model’s opacity into account. If you need help adding textures to your renderer, review Chapter 11, “Maps & Materials”.

Using Booleans in a C Header File

In Renderer.swift, updateUniforms(scene:) saves the screen options into Params, which the fragment shader uses to determine which post-processing effects to apply. While the Metal Shading Language includes a Boolean type (bool), that type isn’t available in plain C header files. The Shaders group included with this starter project contains stdbool.h, which defines bool. Common.h imports this file and uses bool to define the Boolean parameters in Params.
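As a rough sketch of what that ends up looking like, here is a hypothetical Params with the fields this chapter’s code refers to. The field names come from the listings later in the chapter; the exact types and layout in the starter project’s Common.h may differ.

// Common.h (sketch): stdbool.h supplies bool so this header
// works from both Swift and the Metal shaders.
#include <stdbool.h>
#include <stdint.h>

typedef struct {
  uint32_t width;
  uint32_t height;
  uint32_t tiling;
  bool alphaTesting;
  bool alphaBlending;
  bool scissorTesting;
  bool transparency;
  bool antialiasing;
  bool fog;
} Params;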

Alpha Testing

Move closer to the tree using the scroll wheel or the two-finger gesture on your trackpad, and you’ll notice the leaves look a little odd.

Opaque edge around leaves

Tree-color texture

float4 color = baseColorTexture.sample(
  textureSampler,
  in.uv * params.tiling);
// Discard fragments that are almost fully transparent
if (params.alphaTesting && color.a < 0.1) {
  discard_fragment();
  return 0;
}
material.baseColor = color.rgb;
Alpha testing

Depth Testing

Depth testing compares the depth value of the current fragment to one stored in the framebuffer. If a fragment is farther away than the current depth value, this fragment fails the depth test and is discarded since it’s occluded by another fragment. You learned about depth testing in Chapter 7, “The Fragment Function”.
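As a refresher, here’s a minimal sketch of enabling the depth test, assuming device is your MTLDevice. The starter project’s forward render pass already sets this up for you.

// Sketch: a typical depth-stencil state with depth testing enabled
let depthDescriptor = MTLDepthStencilDescriptor()
// Keep fragments that are closer than the stored depth value
depthDescriptor.depthCompareFunction = .less
// Write the surviving fragment's depth back to the depth attachment
depthDescriptor.isDepthWriteEnabled = true
let depthStencilState =
  device.makeDepthStencilState(descriptor: depthDescriptor)
// Later, in the render pass:
// renderEncoder.setDepthStencilState(depthStencilState)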

Stencil Testing

Stencil testing compares the value stored in a stencil attachment to a masked reference value. If a fragment makes it through the mask, it’s kept; otherwise, it’s discarded. You learned about stencil testing in Chapter 15, “Tile-Based Deferred Rendering”.
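For reference, here’s a minimal sketch of a stencil test that keeps only fragments whose stored stencil value equals the reference value. The names and values are illustrative, not the starter project’s; device is assumed to be your MTLDevice.

let stencil = MTLStencilDescriptor()
// Pass only where (stored & readMask) == (reference & readMask)
stencil.stencilCompareFunction = .equal
stencil.readMask = 0xFF
stencil.writeMask = 0xFF
let descriptor = MTLDepthStencilDescriptor()
descriptor.frontFaceStencil = stencil
descriptor.backFaceStencil = stencil
let stencilState = device.makeDepthStencilState(descriptor: descriptor)
// Later, in the render pass:
// renderEncoder.setStencilReferenceValue(1)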

Scissor Testing

If you only want to render part of the screen, you can tell the GPU to render only within a particular rectangle. This is much more efficient than rendering the entire screen. The scissor test checks whether a fragment is inside a defined 2D area known as the scissor rectangle. If the fragment falls outside of this rectangle, it’s discarded.

if params.scissorTesting {
  // Render only to a centered rectangle half the size of the drawable
  let marginWidth = Int(params.width) / 4
  let marginHeight = Int(params.height) / 4
  let width = Int(params.width) / 2
  let height = Int(params.height) / 2
  let rect = MTLScissorRect(
    x: marginWidth, y: marginHeight, width: width, height: height)
  renderEncoder.setScissorRect(rect)
}
Scissor testing

Alpha Blending

Alpha blending is different from alpha testing in that the latter only works with total transparency. In that case, all you have to do is discard fragments. For translucent or partially transparent objects, discarding fragments isn’t the best solution, because you want the fragment color to contribute, to some extent, to the existing framebuffer color rather than replace it entirely. You had a taste of blending in Chapter 14, “Deferred Rendering”, when you blended the result of your point lights.

window.position = [0, 3, -1]
models = [window, ground, tree]
The window in the scene

Opacity

To define transparency in models, you either create a grayscale texture known as an opacity map, or you define opacity in the submesh’s material. The window’s glass group has an opacity map where white means fully opaque, and black means fully transparent.

The window's opacity map
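If a model has no opacity map, the value can come from the submesh’s material instead. Here’s a hypothetical sketch of reading that value with Model I/O; the starter project’s Submesh may wire this up differently.

import ModelIO

// Read a scalar opacity from an MDLMaterial, defaulting to fully opaque
func opacity(from material: MDLMaterial?) -> Float {
  guard
    let property = material?.property(with: .opacity),
    property.type == .float
  else { return 1.0 }
  return property.floatValue
}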

Blending

To implement blending, you need a second pipeline state in your render pass. You’ll still use the same shader functions, but you’ll turn on blending in the GPU.

// 1: Blending is configured per color attachment
let attachment = pipelineDescriptor.colorAttachments[0]
// 2: Turn on blending for this attachment
attachment?.isBlendingEnabled = true
// 3: Add the weighted source and destination colors together
attachment?.rgbBlendOperation = .add
// 4: Weight the incoming fragment color by its alpha
attachment?.sourceRGBBlendFactor = .sourceAlpha
// 5: Weight the existing framebuffer color by one minus the source alpha
attachment?.destinationRGBBlendFactor = .oneMinusSourceAlpha
var transparentPSO: MTLRenderPipelineState
transparentPSO = PipelineStates.createForwardTransparentPSO()
renderEncoder.setRenderPipelineState(transparentPSO)
if (params.alphaBlending) {
  // Use the opacity map's red channel when the model provides one
  if (!is_null_texture(opacityTexture)) {
    material.opacity =
      opacityTexture.sample(textureSampler, in.uv).r;
  }
}
return float4(diffuseColor + specularColor, material.opacity);
Opacity not working

// Render the window last, so the ground and tree are already in the framebuffer when the glass blends
models = [ground, tree, window]
Opacity is working

Transparent Mesh Rendering Order

Blending order is important: anything you need to see through a transparent surface must be rendered first. However, it isn’t always convenient to work out exactly which models require blending. In addition, a pipeline state that blends is slower than one that doesn’t.

// Submesh: a submesh is transparent if it has an opacity map
// or a material opacity below 1
var transparency: Bool {
  return textures.opacity != nil || material.opacity < 1.0
}
// Model: true when any submesh is transparent
let hasTransparency: Bool
hasTransparency = meshes.contains { mesh in
  mesh.submeshes.contains { $0.transparency }
}
// In the submesh render loop: draw only submeshes that match the current pass
if submesh.transparency != params.transparency { continue }
// In Common.h, add the matching flag to Params
bool transparency;
// Opaque pass: use the standard pipeline state and mark params as non-transparent
renderEncoder.setRenderPipelineState(pipelineState)
var params = params
params.transparency = false
// transparent mesh
renderEncoder.pushDebugGroup("Transparency")
// Transparent pass: render only the models that contain transparency
let models = scene.models.filter {
  $0.hasTransparency
}
params.transparency = true
if params.alphaBlending {
  renderEncoder.setRenderPipelineState(transparentPSO)
}
for model in models {
  model.render(
    encoder: renderEncoder,
    uniforms: uniforms,
    params: params)
}
renderEncoder.popDebugGroup()
Alpha blending

Alpha blending turned off

Antialiasing

Often, rendered models show slightly jagged edges that are visible when you zoom in. This is known as aliasing, and it’s caused by the rasterizer when it generates fragments.

Rasterizing a triangle

// The multisampled pipeline states need a matching sample count
pipelineDescriptor.sampleCount = 4
// Extra pipeline state properties, used when antialiasing is on
var pipelineState_MSAA: MTLRenderPipelineState
var transparentPSO_MSAA: MTLRenderPipelineState
pipelineState_MSAA = PipelineStates.createForwardPSO_MSAA()
transparentPSO_MSAA =
  PipelineStates.createForwardTransparentPSO_MSAA()
// In the render pass, choose the pipeline states that match the option
let pipelineState = params.antialiasing ?
  pipelineState_MSAA : self.pipelineState
let transparentPSO = params.antialiasing ?
  transparentPSO_MSAA : self.transparentPSO
// The view's drawable must use the same sample count
view.sampleCount = options.antialiasing ? 4 : 1
Antialiasing
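A sample count of 4 is supported on all current Metal GPUs, but if you’d rather not hard-code the number, you can ask the device first. A quick sketch, assuming you have access to the MTLDevice:

// Fall back to no multisampling if the device doesn't support 4x MSAA
let sampleCount = device.supportsTextureSampleCount(4) ? 4 : 1
view.sampleCount = options.antialiasing ? sampleCount : 1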

Fog

Let’s have a bit more fun and add some fog to the scene!

float4 fog(float4 position, float4 color) {
  // 1: Use the fragment's depth as a measure of distance from the camera
  float distance = position.z / position.w;
  // 2: Exponential falloff: the fog thickens with distance
  float density = 0.2;
  float fog = 1.0 - clamp(exp(-density * distance), 0.0, 1.0);
  // 3: Blend the fragment color toward the fog color
  float4 fogColor = float4(1.0);
  color = mix(color, fogColor, fog);
  return color;
}
float4 color =
  float4(diffuseColor + specularColor, material.opacity);
if (params.fog) {
  color = fog(in.position, color);
}
return color;
Fog

Key Points

  • Per-sample processing takes place in the GPU pipeline after the GPU processes fragments.
  • Using discard_fragment() in the fragment function halts further processing on the fragment.
  • To render only part of the texture, you can define a 2D scissor rectangle. The GPU discards any fragments outside of this rectangle.
  • You set up the pipeline state object with blending when you require transparency. You can then set the alpha value of the fragment in the fragment function. Without blending in the pipeline state object, all fragments are fully opaque, no matter their alpha value.
  • Multisample antialiasing improves render quality. You set up MSAA with the sampleCount in the pipeline state descriptor.
  • You can add fog with some clever distance shading in the fragment function.

Where to Go From Here?

Programmable antialiasing is possible via programmable sample positions, which let you set custom sample positions for different render passes. This is different from fixed-function antialiasing, where the same sample positions apply to all render passes. For further reading, review Apple’s Positioning Samples Programmatically article.
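As a taste of what that looks like, here’s a sketch that sets four custom sample positions on a render pass descriptor. Both device and descriptor are assumed to be in scope, and you should check for support before relying on this.

// Sketch: custom sample positions for a 4-sample render pass
if device.areProgrammableSamplePositionsSupported {
  let positions: [MTLSamplePosition] = [
    MTLSamplePosition(x: 0.25, y: 0.25),
    MTLSamplePosition(x: 0.75, y: 0.25),
    MTLSamplePosition(x: 0.25, y: 0.75),
    MTLSamplePosition(x: 0.75, y: 0.75)
  ]
  descriptor.setSamplePositions(positions, count: positions.count)
}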

Have a technical question? Want to report a bug? You can ask questions and report bugs to the book authors in our official book forum here.
© 2024 Kodeco Inc.
