Metal by Tutorials

Second Edition · iOS 13 · Swift 5.1 · Xcode 11

7. Maps & Materials
Written by Caroline Begbie


In the previous chapter, using Model I/O, you imported and rendered a simple house with a flat color texture. But if you look at the objects around you, you’ll notice how their basic color changes according to how light falls on them. Some objects have a smooth surface, and some have a rough surface. Heck, some might even be shiny metal!

In this chapter, you’ll find out how to use material groups to describe a surface, and how to design textures for micro detail. This is also the final chapter on how to render still models.

Normal maps

The following example best describes normal maps:

On the left, there’s a lit cube with a color texture. On the right, there’s the same low poly cube with the same color texture and lighting; however, it also has a second texture applied to it, called a normal map.

With the normal map, it looks as if the cube is a high poly cube with all of the nooks and crannies modeled into the object. But this is just an illusion!

For this illusion to work, it needs a texture, like this:

All models have normals that stick out perpendicular to each face. A cube has six faces, and each face’s normal points in a different direction. Also, each face is flat. If you want to create the illusion of bumpiness, you need to change the normals in the fragment shader.

In the following image, on the left is a flat surface whose fragment normals all point the same way. On the right, you see perturbed normals. The texels in a normal map supply the direction vectors of these normals through the RGB channels.

Take a look at this single brick split out into the red, green and blue channels that make up an RGB image.

Each channel has a value between 0 and 1, and you generally visualize channels in grayscale, as it’s easier to read the values. For example, in the red channel, a value of 0 is no red at all, while a value of 1 is full red. When you convert 0 to an RGB color (0, 0, 0), the result is black. At the opposite end of the spectrum, (1, 1, 1) is white, and in the middle you have (0.5, 0.5, 0.5), which is mid-gray. In grayscale, all three RGB values are the same, so you only need to refer to a grayscale value by a single float.

Take a closer look at the edges of the red channel’s brick. Look at the left and right edges in the grayscale image. The red channel has the darkest color where the normal values of that fragment should point left (-X, 0, 0), and the lightest color where it should point right (+X, 0, 0).

Now look at the green channel. The left and right edges have equal values, but the top and bottom edges of the brick differ. In the grayscale image, the green channel is darkest where the normal should point down (0, -Y, 0) and lightest where it should point up (0, +Y, 0).

Finally, the blue channel is mostly white in the grayscale image because the brick, except for a few irregularities in the texture, points outward. The edges of the brick are the only places where the normals should point somewhere other than straight out.

Note: Normal maps can be either right-handed or left-handed. Your renderer will expect positive y to be up, but some apps will generate normal maps with positive y down. To fix this, you can take the normal map into Photoshop and invert the green channel.

The base color of a normal map — where all normals are “normal” (orthogonal to the face) — is (0.5, 0.5, 1).

This is an attractive color but was not chosen arbitrarily. RGB colors have values between 0 and 1, whereas a model’s normal values are between -1 and 1. A color value of 0.5 in a normal map translates to a model normal of 0.

Reading a texel from a flat area of a normal map should give a z value of 1 and x and y values of 0. Converting these values (0, 0, 1) into the color space of a normal map results in the color (0.5, 0.5, 1). This is why most normal maps appear bluish.
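To make the conversion concrete, here’s a minimal sketch in Swift using simd. The function names are just for illustration; in the renderer itself, the decode happens in the fragment shader with the same * 2 - 1 math you’ll add later in this chapter.

import simd

// Encode a unit normal (components in -1...1) as a normal map color (0...1).
func encode(normal: SIMD3<Float>) -> SIMD3<Float> {
  normal * 0.5 + 0.5
}

// Decode a normal map color (0...1) back into a unit-length normal (-1...1).
func decode(color: SIMD3<Float>) -> SIMD3<Float> {
  simd_normalize(color * 2 - 1)
}

let flatNormal = SIMD3<Float>(0, 0, 1)
print(encode(normal: flatNormal))  // (0.5, 0.5, 1.0): the bluish base color

// For a left-handed normal map, invert the green channel before decoding:
// color.y = 1 - color.y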

Creating normal maps

To create successful normal maps, you need a specialized app. In the previous chapter, you learned about texturing apps, such as Substance Designer and Mari. Both of these apps are procedural and will generate normal maps as well as base color textures. In fact, the brick texture in the image at the start of the chapter was created in Substance Designer.

Tangent space

You send the normal map to the fragment function in the same way as a color texture, and you extract the normal values using the same UVs. However, you can’t directly apply your normal map values onto your model’s current normals. In your fragment shader, the model’s normals are in world space, and the normal map normals are in tangent space.
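As a rough sketch of the idea, using made-up values: you build a matrix whose columns are the world-space tangent, bitangent and normal, then multiply the sampled tangent-space normal by it. The real conversion happens in the fragment shader, in section 4, Calculate the new normal, below.

import simd

// Hypothetical values. In the renderer, worldTangent, worldBitangent and
// worldNormal arrive from the vertex function, and sampledNormal is the
// decoded value from the normal map.
let worldTangent   = SIMD3<Float>(1, 0, 0)
let worldBitangent = SIMD3<Float>(0, 1, 0)
let worldNormal    = SIMD3<Float>(0, 0, 1)
let sampledNormal  = simd_normalize(SIMD3<Float>(0.2, 0.1, 0.9))

// The columns of the TBN matrix are the tangent-space basis vectors,
// expressed in world space.
let tbn = float3x3([worldTangent, worldBitangent, worldNormal])
let perturbedNormal = simd_normalize(tbn * sampledNormal)
// perturbedNormal is what the lighting calculation would then use.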

Using normal maps

Open up the starter project for this chapter. Note that there are a few changes from the previous chapter’s final code.

return float4(normalValue, 1);  // return the sampled normal as the fragment color, to check the normal map visually

1. Load tangents and bitangents

Open VertexDescriptor.swift and look at defaultVertexDescriptor. Model I/O currently reads the normal values from the .obj file into the vertex buffer, and the vertex descriptor declares those normals in the attribute named MDLVertexAttributeNormal.
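In case vertex descriptors aren’t fresh in your mind, a declaration along these lines is what that refers to. This is a simplified sketch rather than the project’s actual defaultVertexDescriptor, and the offsets and stride are placeholders for a float3 position followed by a float3 normal.

import ModelIO

let vertexDescriptor = MDLVertexDescriptor()
vertexDescriptor.attributes[0] = MDLVertexAttribute(
  name: MDLVertexAttributePosition,
  format: .float3,
  offset: 0,
  bufferIndex: 0)
// The attribute Model I/O fills with the model's normals:
vertexDescriptor.attributes[1] = MDLVertexAttribute(
  name: MDLVertexAttributeNormal,
  format: .float3,
  offset: 12,
  bufferIndex: 0)
vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: 24)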

In Model.swift, the meshes are currently loaded in a single call:

let (mdlMeshes, mtkMeshes) = 
     try! MTKMesh.newMeshes(asset: asset,
                            device: Renderer.device)

To add the tangent basis, load the MDLMeshes first, so you can process each one before creating its MTKMesh:

var mtkMeshes: [MTKMesh] = []
let mdlMeshes = 
     asset.childObjects(of: MDLMesh.self) as! [MDLMesh]
_ = mdlMeshes.map { mdlMesh in
  // generate normals for the mesh
  mdlMesh.addNormals(withAttributeNamed: 
                        MDLVertexAttributeNormal,
                     creaseThreshold: 1.0)
  mtkMeshes.append(try! MTKMesh(mesh: mdlMesh, 
                                device: Renderer.device))
}

After the addNormals call, Model I/O can generate tangents and bitangents from the normals and texture coordinates:

mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal,
                   creaseThreshold: 1.0)
mdlMesh.addTangentBasis(forTextureCoordinateAttributeNamed: 
            MDLVertexAttributeTextureCoordinate,
          tangentAttributeNamed: MDLVertexAttributeTangent,
          bitangentAttributeNamed: MDLVertexAttributeBitangent)

Because addTangentBasis adds tangent and bitangent attributes, the loaded mesh’s own vertex descriptor now describes the complete layout. Model stores that descriptor in a static property and reads it back wherever the vertex layout is needed, such as when creating the pipeline state:

static var vertexDescriptor: MDLVertexDescriptor =
  MDLVertexDescriptor.defaultVertexDescriptor
Model.vertexDescriptor = mdlMesh.vertexDescriptor
let vertexDescriptor = Model.vertexDescriptor

2. Send tangent and bitangent values to the GPU

In Renderer.swift, in draw(in:), locate // render multiple buffers and these lines of code:

let vertexBuffer = mesh.mtkMesh.vertexBuffers[0].buffer
renderEncoder.setVertexBuffer(vertexBuffer, offset: 0,
                      index: Int(BufferIndexVertices.rawValue))

Currently only the first vertex buffer is sent to the GPU. Model I/O places the generated tangents and bitangents in additional vertex buffers, so bind all of the mesh’s buffers instead:

for (index, vertexBuffer) in 
      mesh.mtkMesh.vertexBuffers.enumerated() {
  renderEncoder.setVertexBuffer(vertexBuffer.buffer,
                                offset: 0, index: index)
}

With the vertex buffers now occupying the low buffer indices, one index per buffer, the other buffers in Common.h need to move out of the way. The current indices:

typedef enum {
  BufferIndexVertices = 0,
  BufferIndexUniforms = 1,
  BufferIndexLights = 2,
  BufferIndexFragmentUniforms = 3
} BufferIndices;

become:

typedef enum {
  BufferIndexVertices = 0, 
  BufferIndexUniforms = 11,
  BufferIndexLights = 12,
  BufferIndexFragmentUniforms = 13
} BufferIndices;

3. Convert tangent and bitangent values to world space

Just as you converted the model’s normals to world space, you need to convert the tangents and bitangents to world space in the vertex function.

Add indices for the two new attributes to the existing attribute enum:

Tangent = 3,
Bitangent = 4

Read them in the vertex function’s input struct:

float3 tangent [[attribute(Tangent)]];
float3 bitangent [[attribute(Bitangent)]];

Add world-space versions to the struct that’s passed from the vertex function to the fragment function:

float3 worldTangent;
float3 worldBitangent;

And set them in the vertex function, converting to world space with the same normal matrix you use for the normal:

.worldTangent = uniforms.normalMatrix * vertexIn.tangent,
.worldBitangent = uniforms.normalMatrix * vertexIn.bitangent,

4. Calculate the new normal

Now that you have everything in place, it’ll be a simple matter to calculate the new normal.

In the fragment function, after sampling the normal map, convert the value from color space back to normal space:

normalValue = normalValue * 2 - 1;

Then, instead of using the world-space vertex normal directly:

float3 normalDirection = normalize(in.worldNormal);

build the new normal from the tangent-space value and the world-space basis:

float3 normalDirection = float3x3(in.worldTangent, 
                                  in.worldBitangent, 
                                  in.worldNormal) * normalValue;
normalDirection = normalize(normalDirection);

Other texture map types

Normal maps are not the only way of changing a model’s surface. There are other texture map types, such as the roughness, metallic and ambient occlusion maps you’ll meet later in this chapter when you move on to physically based rendering.

Materials

Not all models have textures. For example, the train you rendered earlier in the book has different material groups that specify a color instead of using a texture.

This C struct holds the material values, so Swift and the Metal shaders can share the same layout:

typedef struct {
  vector_float3 baseColor;
  vector_float3 specularColor;
  float roughness;
  float metallic;
  vector_float3 ambientOcclusion;
  float shininess;
} Material;

Each submesh holds its own material:

let material: Material
The initializer starts from an empty Material and copies across whichever properties the Model I/O material provides:

private extension Material {
  init(material: MDLMaterial?) {
    self.init()
    if let baseColor = material?.property(with: .baseColor),
      baseColor.type == .float3 {
      self.baseColor = baseColor.float3Value
    }
  }
}

Add the specular color and shininess to the same initializer:

if let specular = material?.property(with: .specular),
  specular.type == .float3 {
  self.specularColor = specular.float3Value
}
if let shininess = material?.property(with: .specularExponent),
  shininess.type == .float {
  self.shininess = shininess.floatValue
}

Then, when building each submesh, create its material from the submesh’s MDLMaterial:

material = Material(material: mdlSubmesh.material)
Add a buffer index for the material:

BufferIndexMaterials = 14

In Renderer.swift, send the material to the fragment function. Because the struct is small, setFragmentBytes avoids creating a dedicated MTLBuffer:

var material = submesh.material
renderEncoder.setFragmentBytes(&material,
              length: MemoryLayout<Material>.stride,
              index: Int(BufferIndexMaterials.rawValue))

Receive it in the fragment function and read the values from it:

constant Material &material [[buffer(BufferIndexMaterials)]],
float3 baseColor = material.baseColor;
float3 materialSpecularColor = material.specularColor;
float materialShininess = material.shininess;

Function specialization

Over the years there has been much discussion about how to render different materials. Should you create separate short fragment shaders for the differences? Or should you have one long “uber” shader with all of the possibilities listed conditionally? Function specialization deals with this problem, and allows you to create one shader that the compiler turns into separate shaders.

In Submesh.swift, a new function builds one Boolean constant per optional texture, based on whether the submesh actually has that texture:

static func makeFunctionConstants(textures: Textures) 
                                  -> MTLFunctionConstantValues {
  let functionConstants = MTLFunctionConstantValues()
  var property = textures.baseColor != nil
  functionConstants.setConstantValue(&property, 
                                     type: .bool, index: 0)
  property = textures.normal != nil
  functionConstants.setConstantValue(&property, 
                                     type: .bool, index: 1)
  return functionConstants
}
In makePipelineState(textures:), build the constants and create the fragment function from them. The compiler produces a specialized version of the shader for each distinct combination of constant values:

let functionConstants = 
      makeFunctionConstants(textures: textures)
let fragmentFunction: MTLFunction?
do {
  fragmentFunction = 
         try library?.makeFunction(name: "fragment_main",
                            constantValues: functionConstants)
} catch {
  fatalError("No Metal function exists")
}
On the shader side, declare the matching function constants:

constant bool hasColorTexture [[function_constant(0)]];
constant bool hasNormalTexture [[function_constant(1)]];

Then, in the fragment function, the unconditional read of the material’s base color:

float3 baseColor = material.baseColor;

becomes a conditional that the compiler resolves when it specializes the function, so there’s no branch at run time:

float3 baseColor;
if (hasColorTexture) {
  baseColor = baseColorTexture.sample(textureSampler,
                       in.uv * fragmentUniforms.tiling).rgb;
} else {
  baseColor = material.baseColor;
}
Do the same for the normal value. When there’s no normal map, the fragment falls back to the world-space normal:

float3 normalValue;
if (hasNormalTexture) {
  normalValue = normalTexture.sample(textureSampler,
                       in.uv * fragmentUniforms.tiling).rgb;
  normalValue = normalValue * 2 - 1;
} else {
  normalValue = in.worldNormal;
}
normalValue = normalize(normalValue);
The texture parameters themselves are also guarded by the function constants, so a specialized shader only declares the textures it uses:

texture2d<float> baseColorTexture [[texture(BaseColorTexture), 
                          function_constant(hasColorTexture)]],
texture2d<float> normalTexture [[texture(NormalTexture), 
                          function_constant(hasNormalTexture)]],

To check that the constants work, you can temporarily return a solid color: red for submeshes that have a color texture, green for those that don’t:

if (hasColorTexture) {
  return float4(1, 0, 0, 1);
}
return float4(0, 1, 0, 1);

Physically based rendering

To achieve spectacular scenes, you need good textures, but lighting plays an even more significant role. In recent years, the concept of physically based rendering (PBR) has become much more popular than the simplistic Phong shading model. As its name suggests, PBR attempts to model the physically realistic interaction of light with surfaces. Now that Augmented Reality has become part of our lives, it’s even more important to render your models so they match their physical surroundings.

PBR workflow

First, change the fragment function to use the PBR calculations. In Submesh.swift, in makePipelineState(textures:), change the name of the referenced fragment function from "fragment_main" to "fragment_mainPBR".
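Assuming the makeFunction call from the function specialization section above, that’s a one-word change:

// In Submesh.makePipelineState(textures:), only the function name changes:
fragmentFunction = 
       try library?.makeFunction(name: "fragment_mainPBR",
                          constantValues: functionConstants)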

The submesh’s Textures type gains a roughness texture:

let roughness: MTLTexture?
roughness = property(with: .roughness)

If a submesh has no roughness texture, the material’s own roughness value is the fallback:

if let roughness = material?.property(with: .roughness),
  roughness.type == .float3 {
  self.roughness = roughness.floatValue
}

In makeFunctionConstants(textures:), add a constant for the roughness texture, plus two more that stay false for now:

property = textures.roughness != nil
functionConstants.setConstantValue(&property, 
                                   type: .bool, index: 2)
property = false
functionConstants.setConstantValue(&property, 
                                   type: .bool, index: 3)
functionConstants.setConstantValue(&property, 
                                   type: .bool, index: 4)

Send the roughness texture to the fragment function:

renderEncoder.setFragmentTexture(submesh.textures.roughness,
                                 index: 2)

Finally, adjust the camera:

camera.distance = 3
camera.target = [0, 0, 0]

Channel packing

Later on, you’ll again be using the PBR fragment function for rendering. Even if you don’t understand the mathematics, understand the layout of the function and the concepts it uses. One of those concepts is channel packing: rather than storing several grayscale maps as separate textures, you can pack them into the individual channels of a single texture, for example roughness in red, metalness in green and ambient occlusion in blue, and sample only the channel you need:

roughness = roughnessTexture.sample(textureSampler, in.uv).r;

Challenge

In the Resources folder for this chapter is a fabulous treasure chest model from Demeter Dzadik at Sketchfab.com. Your challenge is to render this model! There are three textures that you’ll load into the asset catalog. Don’t forget to change Interpretation from Color to Data, so the textures don’t load as sRGB.
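If you need a reminder of the mechanism, here’s a minimal sketch of loading an asset catalog texture by name with MTKTextureLoader; the texture name is a placeholder, and in the project you’d go through the existing texture-loading code rather than calling this directly:

import MetalKit

// "chest-color" is a hypothetical asset catalog name.
let textureLoader = MTKTextureLoader(device: Renderer.device)
let chestColor = try? textureLoader.newTexture(
  name: "chest-color",
  scaleFactor: 1.0,
  bundle: Bundle.main,
  options: nil)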

Where to go from here?

The sky’s the limit! Now that you’ve whetted your appetite for physically based rendering, explore the fantastic links in references.markdown, which you’ll find in the Resources folder. Some of the links are highly mathematical, while others explain the concepts with gorgeous photo-like images.

Have a technical question? Want to report a bug? You can ask questions and report bugs to the book authors in our official book forum here.
© 2024 Kodeco Inc.
