Metal by Tutorials

Third Edition · macOS 12 · iOS 15 · Swift 5.5 · Xcode 13


8. Textures
Written by Marius Horga & Caroline Begbie


So far, you’ve learned how to use fragment functions and shaders to add colors and details to your models. Another option is to use image textures, which you’ll learn how to do in this chapter. More specifically, you’ll learn about:

  • UV coordinates: How to unwrap a mesh so that you can apply a texture to it.
  • Texturing a model: How to read the texture in a fragment shader.
  • Samplers: Different ways you can read (sample) a texture.
  • Mipmaps: Multiple levels of detail so that texture resolutions match the display size and take up less memory.
  • Asset catalog: How to organize your textures.

Textures and UV Maps

The following image shows a house model with twelve vertices. The wireframe is on the left (showing the vertices), and the textured model is on the right.

A low poly house

Note: If you want a closer look at this model, you’ll find the Blender and .obj files in the resources/LowPolyHouse folder for this chapter.

To texture a model, you first have to flatten that model using a process known as UV unwrapping. UV unwrapping creates a UV map by unfolding the model. To unfold the model, you mark and cut seams using a modeling app. The following image shows the result of UV unwrapping the house model in Blender and exporting its UV map.

The house UV map

Notice that the roof and walls have marked seams. Seams are what make it possible for this model to lie flat: if you print and cut out this UV map, you can easily fold it back into a house. Blender unwraps the model by cutting the mesh along these seams, and you have complete control over where the seams go and how your mesh is cut up. If necessary, you can also move vertices in the UV Unwrap window to better suit your texture.

Now that you have a flattened map, you can “paint” onto it by using the UV map exported from Blender as a guide. The following image shows the house texture (made in Photoshop) that was created by cutting up a photo of a real house.

Low poly house color texture

Note how the edges of the texture aren't perfect, and the copyright message is visible. In the spaces the UV map doesn't cover, you can add whatever you want, since it won't show up on the model.

Note: It's a good idea not to match the UV edges exactly, but to let the color bleed past them, because floating-point imprecision during sampling can pull in texels from just outside a UV island.

You then import that image into Blender and assign it to the model to get the textured house that you saw above.

When you export a UV mapped model to an .obj file, Blender adds the UV coordinates to the file. Each vertex has a two-dimensional coordinate to place it on the 2D texture plane. The top-left is (0, 1) and the bottom-right is (1, 0).

The following diagram indicates some of the house vertices, with the matching coordinates from the .obj file. You can look at the contents of the .obj file using TextEdit.
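For reference, UV coordinates appear in the .obj file as vt lines, and face definitions pair each position index with a UV index. Here's a hypothetical excerpt (not copied from the actual LowPolyHouse.obj, and with normals omitted for brevity), just to show the layout:

# Hypothetical .obj excerpt.
# "v" lines are 3D positions; "vt" lines are 2D UV coordinates;
# "f" lines build faces from position/UV index pairs (v/vt).
v 1.0 0.0 1.0
v 1.0 2.0 1.0
v -1.0 2.0 1.0
vt 0.75 0.25
vt 0.75 0.5
vt 0.5 0.5
f 1/1 2/2 3/3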

UV coordinates

One of the advantages of mapping from 0 to 1 is that you can swap in lower or higher resolution textures. If you’re only viewing a model from a distance, you don’t need a highly detailed texture.

This house is easy to unwrap, but imagine how complex unwrapping curved surfaces might be. The following image shows the UV map of the train (which is still a simple model):

The train's UV map

Photoshop, naturally, is not the only solution for texturing a model. You can use any image editor for painting on a flat texture. In the last few years, several other apps that allow painting directly on the model have become mainstream:

  • Blender (free)
  • Substance Designer and Substance Painter by Adobe ($$): In Designer, you can create complex materials procedurally. Using Substance Painter, you can paint these materials on the model.
  • 3DCoat by 3Dcoat.com ($$)
  • Mudbox by Autodesk ($$)
  • Mari by Foundry ($$$)

In addition to texturing, Blender, 3DCoat and Mudbox let you sculpt models in a similar fashion to ZBrush and create low poly models from the high poly sculpt. As you'll find out later, color isn't the only texture you can paint with these apps, so having a specialized texturing app is invaluable.

The Starter App

➤ Open the starter project for this chapter, and build and run the app.

The starter app

1. Loading the Texture

A model typically has several submeshes that reference one or more textures. Since you don't want to load the same texture more than once, you'll create a central TextureController to hold your textures.

import MetalKit

// TextureController.swift: a central store so that each texture is loaded only once.
enum TextureController {
  static var textures: [String: MTLTexture] = [:]

  static func loadTexture(filename: String) throws -> MTLTexture? {
    // 1 - create a texture loader using the app's GPU device
    let textureLoader = MTKTextureLoader(device: Renderer.device)
    // 2 - place the texture origin at the bottom-left so it matches the UV origin
    let textureLoaderOptions: [MTKTextureLoader.Option: Any] =
      [.origin: MTKTextureLoader.Origin.bottomLeft]
    // 3 - default to a png extension when the filename doesn't include one
    let fileExtension =
      URL(fileURLWithPath: filename).pathExtension.isEmpty ?
        "png" : nil
    // 4 - locate the image file in the main bundle
    guard let url = Bundle.main.url(
      forResource: filename,
      withExtension: fileExtension)
      else {
        print("Failed to load \(filename)")
        return nil
    }
    let texture = try textureLoader.newTexture(
      URL: url,
      options: textureLoaderOptions)
    print("loaded texture: \(url.lastPathComponent)")
    return texture
  }

  // Return a cached texture, loading and caching it on first request.
  static func texture(filename: String) -> MTLTexture? {
    if let texture = textures[filename] {
      return texture
    }
    let texture = try? loadTexture(filename: filename)
    if texture != nil {
      textures[filename] = texture
    }
    return texture
  }
}
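As a quick sanity check (not part of the project code), you could fetch a texture through the controller like this. The filename is the chapter's house texture; the first call loads and caches it, subsequent calls return the cached MTLTexture:

// Hypothetical usage of TextureController.
let texture = TextureController.texture(filename: "lowpoly-house-color")
print(texture?.width ?? 0, texture?.height ?? 0)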

Loading the Submesh Texture

Each submesh of a model's mesh has its own material characteristics, such as roughness, base color and metallic content. For now, you'll focus only on the base color texture. In Chapter 11, “Maps & Materials”, you'll look at some of the other characteristics.

The .obj's material file

// In Submesh.swift, add a nested type that holds the submesh's textures:
struct Textures {
  let baseColor: MTLTexture?
}

// Give Submesh a textures property:
let textures: Textures

// Initialize the textures by looking up the material property with the baseColor semantic:
private extension Submesh.Textures {
  init(material: MDLMaterial?) {
    func property(with semantic: MDLMaterialSemantic)
      -> MTLTexture? {
      guard let property = material?.property(with: semantic),
        property.type == .string,
        let filename = property.stringValue,
        let texture =
          TextureController.texture(filename: filename)
      else { return nil }
      return texture
    }
    baseColor = property(with: MDLMaterialSemantic.baseColor)
  }
}

// In Submesh's initializer, create the textures from the MDLSubmesh's material:
textures = Textures(material: mdlSubmesh.material)
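To see what that initializer is doing, here's a hedged sketch (not part of the project, and ignoring the extension's private access level) that builds an MDLMaterial carrying a base color filename as a string property, then reads it back through the same semantic:

import ModelIO

// Sketch: a material whose baseColor semantic stores a texture filename.
let scatteringFunction = MDLScatteringFunction()
let material = MDLMaterial(
  name: "houseMaterial",
  scatteringFunction: scatteringFunction)
material.setProperty(
  MDLMaterialProperty(
    name: "baseColor",
    semantic: .baseColor,
    string: "lowpoly-house-color"))

// Submesh.Textures(material:) looks up the .baseColor semantic, reads the
// string value and asks TextureController for a texture with that filename.
let textures = Submesh.Textures(material: material)
print(textures.baseColor != nil)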

2. Passing the Loaded Texture to the Fragment Function

In a later chapter, you’ll learn about several other texture types and how to send them to the fragment function using different indices.

// In Common.h, add an index for binding textures to the fragment function:
typedef enum {
  BaseColor = 0
} TextureIndices;

// In Swift, add a convenience for converting the bridged C enum to an Int:
extension TextureIndices {
  var index: Int {
    return Int(self.rawValue)
  }
}

// Before drawing each submesh, bind its base color texture at that index:
encoder.setFragmentTexture(
  submesh.textures.baseColor,
  index: BaseColor.index)
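For context, here's a hedged sketch of where that call typically sits in this project's Model rendering code; exact property names may differ slightly from your starter project:

// Sketch of the submesh loop in Model.render(encoder:uniforms:params:):
// bind each submesh's base color texture right before its draw call.
// (Vertex buffer and uniform setup omitted for brevity.)
for mesh in meshes {
  for submesh in mesh.submeshes {
    encoder.setFragmentTexture(
      submesh.textures.baseColor,
      index: BaseColor.index)
    encoder.drawIndexedPrimitives(
      type: .triangle,
      indexCount: submesh.indexCount,
      indexType: submesh.indexType,
      indexBuffer: submesh.indexBuffer.buffer,
      indexBufferOffset: submesh.indexBuffer.offset)
  }
}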

3. Updating the Fragment Function

➤ Open Shaders.metal, and add the following new argument to fragment_main, immediately after the VertexOut in [[stage_in]] parameter (remember to separate the two parameters with a comma):

// The new fragment function parameter:
texture2d<float> baseColorTexture [[texture(BaseColor)]]

// Inside fragment_main, sample the texture with the interpolated UV coordinates
// and return the result as the fragment color:
constexpr sampler textureSampler;
float3 baseColor = baseColorTexture.sample(
  textureSampler,
  in.uv).rgb;
return float4(baseColor, 1);
The textured house

sRGB Color Space

You'll notice that the rendered texture looks much darker than the original image. This happens because lowpoly-house-color.png is an sRGB texture: when the GPU samples it, it converts the stored sRGB values to linear values, and because the render target isn't an sRGB target, those darker linear values are displayed as-is. sRGB is a standard color format that compromises between how cathode ray tube monitors worked and how the human eye perceives brightness. As you can see in the following example of grayscale values from 0 to 1, sRGB values are not spaced linearly: more of the range is devoted to darker tones, where human eyes are better at telling values apart.

// The approximate relationship between sRGB and linear values:
sRGBcolor = pow(linearColor, 1.0/2.2);

// In TextureController.loadTexture(filename:), find the texture loader options:
let textureLoaderOptions: [MTKTextureLoader.Option: Any] =
  [.origin: MTKTextureLoader.Origin.bottomLeft]

// Replace them with options that load the texture without tagging it as sRGB:
let textureLoaderOptions: [MTKTextureLoader.Option: Any] = [
  .origin: MTKTextureLoader.Origin.bottomLeft,
  .SRGB: false
]
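To see how non-linear the encoding is, you can evaluate the formula above for mid-gray. This quick check isn't part of the project:

import Foundation

// Linear mid-gray is stored as a much higher sRGB value.
let linear = 0.5
let srgb = pow(linear, 1.0 / 2.2)
print(srgb)  // ≈ 0.73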
Linear workflow

Capture GPU Workload

There’s an easy way to find out what format your texture is in on the GPU, and also to look at all the other Metal buffers currently residing there: the Capture GPU workload tool (also called the GPU Debugger).

A GPU trace

Resources on the GPU

Samplers

When sampling your texture in the fragment function, you use a default sampler. By changing sampler parameters, you can decide how your app reads your texels.

// Add a lazily created ground model:
lazy var ground: Model = {
  Model(name: "plane.obj")
}()

// Scale the ground up, rotate it over time, and render it along with the other models:
ground.scale = 40
ground.rotation.y = sin(timer)
ground.render(
  encoder: renderEncoder,
  uniforms: uniforms,
  params: params)
A stretched texture

// Replace the default sampler with one that uses linear filtering:
constexpr sampler textureSampler(filter::linear);
A smoothed texture

Filtering

// Add a repeat address mode so that UV coordinates outside 0...1 wrap around:
constexpr sampler textureSampler(
  filter::linear,
  address::repeat);

// Multiply the UVs to tile the texture 16 times across the ground:
float3 baseColor = baseColorTexture.sample(
  textureSampler,
  in.uv * 16).rgb;
The sampler address mode
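The constexpr samplers above live in the shader, but you can also build an equivalent sampler on the CPU with an MTLSamplerDescriptor and bind it with setFragmentSamplerState. Here's a minimal sketch of that alternative, not used by this chapter's project:

// Sketch: building a sampler state in Swift instead of declaring it in MSL.
let descriptor = MTLSamplerDescriptor()
descriptor.minFilter = .linear
descriptor.magFilter = .linear
descriptor.sAddressMode = .repeat
descriptor.tAddressMode = .repeat
let samplerState =
  Renderer.device.makeSamplerState(descriptor: descriptor)

// At render time, bind it to a sampler index; the shader would then declare
// `sampler textureSampler [[sampler(0)]]` as a fragment function parameter.
encoder.setFragmentSamplerState(samplerState, index: 0)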

Texture tiling

// In Common.h, add a tiling value to the Params struct:
uint tiling;

// Give each Model its own tiling property:
var tiling: UInt32 = 1

// In Model.render, pass the model's tiling value to the GPU:
params.tiling = tiling

// Set the ground's tiling to 16:
lazy var ground: Model = {
  var ground = Model(name: "plane.obj")
  ground.tiling = 16
  return ground
}()

// In the fragment function, use the per-model tiling value instead of the hard-coded 16:
float3 baseColor = baseColorTexture.sample(
  textureSampler,
  in.uv * params.tiling).rgb;
Corrected tiling

A moiré example

Mipmaps

Check out the relative sizes of the roof texture and how it appears on the screen.

Size of texture compared to on-screen viewing

Mipmaps

Mipmap comparison

// In TextureController.loadTexture(filename:), ask the texture loader to generate mipmaps:
let textureLoaderOptions: [MTKTextureLoader.Option: Any] = [
  .origin: MTKTextureLoader.Origin.bottomLeft,
  .SRGB: false,
  .generateMipmaps: NSNumber(value: true)
]

// In Shaders.metal, add a mip filter to the sampler so the GPU blends between mipmap levels:
mip_filter::linear
Mipmaps added

Mipmap level 4 example
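The .generateMipmaps loader option handles mipmap generation for you. If you ever create textures another way (for example, from raw pixel data), you can generate mipmaps yourself with a blit encoder. A minimal sketch, assuming a Renderer.commandQueue like the one set up in earlier chapters and a texture created with more than one mipmap level:

// Sketch: GPU-side mipmap generation for an existing texture.
// Assumes `texture` was created with mipmapLevelCount > 1.
if let commandBuffer = Renderer.commandQueue.makeCommandBuffer(),
   let blitEncoder = commandBuffer.makeBlitCommandEncoder() {
  blitEncoder.generateMipmaps(for: texture)
  blitEncoder.endEncoding()
  commandBuffer.commit()
}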

Anisotropy

Your rendered ground looks a bit muddy and blurred in the distance. This is an anisotropy problem. Anisotropic surfaces change depending on the angle at which you view them, and when the GPU samples a texture projected at an oblique angle, standard mipmapping causes aliasing and over-blurring.

// Add anisotropic filtering to the sampler, taking up to eight samples per fragment:
max_anisotropy(8)
Anisotropy

The Asset Catalog

As its name suggests, the asset catalog can hold all of your assets, whether they be data, images, textures or even colors. You’ve probably used the catalog for app icons and images. Textures differ from images in that the GPU uses them, and thus they have different attributes in the catalog. To create textures, you add a new texture set to the asset catalog.

Texture options in the asset catalog

Mipmap slots

// In TextureController.loadTexture(filename:), after creating the texture loader,
// try the asset catalog first and fall back to loading a file from the bundle:
if let texture = try? textureLoader.newTexture(
  name: filename,
  scaleFactor: 1.0,
  bundle: Bundle.main,
  options: nil) {
  print("loaded texture: \(filename)")
  return texture
}

// The ground plane's .mtl file currently references a texture file:
map_Kd ground.png

// Comment that line out:
#map_Kd ground.png

// And reference the asset catalog texture set by name instead:
map_Kd grass
SRGB rendering

Convert texture to data

Corrected color space

The Right Texture for the Right Job

Using asset catalogs gives you complete control over how to deliver your textures. Currently, you only have two color textures. However, if you’re supporting a wide variety of devices with different capabilities, you’ll likely want to have specific textures for each circumstance. On devices with less RAM, you’d want smaller graphics.

Custom textures in the asset catalog

Texture Compression

In recent years, much effort has gone into compressing textures to save both CPU and GPU memory. There are various formats you can use, such as ETC and PVRTC. Apple has embraced ASTC as the highest-quality compressed format; it's available on devices with the A8 chip and newer.
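If you want to verify ASTC support at runtime, one hedged option is a GPU family check; the Apple2 family and later (A8 and newer) support the ASTC pixel formats:

// Sketch: runtime check for ASTC texture support.
let supportsASTC = Renderer.device.supportsFamily(.apple2)
print("ASTC supported:", supportsASTC)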

App memory usage

Compressed texture comparison

Key Points

  • UVs, also known as texture coordinates, match vertices to the location in a texture.
  • During the modeling process, you flatten the model by marking seams. You can then paint on a texture that matches the flattened model map.
  • You can load textures using either the MTKTextureLoader or the asset catalog.
  • A model may be split into groups of vertices known as submeshes. Each submesh can reference one or more textures.
  • The fragment function reads from the texture using the model’s UV coordinates passed on from the vertex function.
  • The sRGB color space is the default color gamut. Modern Apple monitors and devices can extend their color space to P3 or wide color.
  • Capture GPU workload is a useful debugging tool. Use it regularly to inspect what’s happening on the GPU.
  • Mipmaps are resized textures that match the fragment sampling. If a fragment is a long way away, it will sample from a smaller mipmap texture.
  • The asset catalog is a great place to store all of your textures. Later, you’ll have multiple textures per model, and it’s better to keep them all in one place. Customization for different devices is easy using the asset catalog.
  • Topics such as color and compression are huge. In the resources folder for this chapter, in references.markdown, you’ll find some recommended articles to read further.
© 2024 Kodeco Inc.
