How To Export Blender Models to OpenGL ES: Part 3/3

In this third part of our Blender to OpenGL ES tutorial series, learn how to implement a simple shader to showcase your model! By Ricardo Rendon Cepeda.

Welcome back to the three-part tutorial series that teaches you how to make an awesome 3D model viewer for iOS by exporting your Blender models to OpenGL ES.

Here’s an overview of the series:

  • Part 1: In the first part, you learned all about the OBJ geometry definition and file format, and used this new knowledge to create a command line tool to parse a simple Blender cube into suitable arrays for OpenGL ES. You also created a simple iOS OpenGL ES app that displayed your model.
  • Part 2: In the second part, you learned all about the MTL material definition and file format, which allowed you to add Blender materials to your model.
  • Part 3: You are here! Get ready to implement a simple lighting model for your 3D scene by writing your own OpenGL ES shaders! A shader is a dedicated program that instructs a graphics processor how to render a scene.

The Return Of The King, The Dark Knight Rises, The Last Crusade… all great trilogy conclusions, but none like this one: GLBlender3!

Getting Started

First, download the starter pack for this tutorial. Here’s a quick overview of each component—you can find more information in their dedicated tutorial sections later on. The contents are a bit different from those in Parts 1 and 2:

  • /Blender/: This folder contains the Blender scene for your new model (starship.blend). You won’t have to open it in this tutorial, but it’s there if you want to explore or modify it.
  • /Code/: Here you’ll find the Xcode project for your iOS app (GLBlender3). The iOS app is the same as GLBlender2 from Part 2, but with a different model (and additional resources). The project also has an extra class called Shader that you’ll use later on. Your command line tool is gone since you completed it in Part 2 and won’t be needing it here.
  • /Maya/: This folder contains the Maya scene for your new model (starship.mb). The model was built in Maya and then exported to Blender, so this is the original copy. Again, you won’t have to open it in this tutorial, but it’s there if you want to explore or modify it—if you have Autodesk Maya, of course.
  • /Resources/: This folder contains two subfolders: /cube/ and /starship/. They both contain all of your models’ files required by OpenGL ES and a few extras. Inside /starship/ you’ll find starship.obj and starship.mtl, which were exported from Blender and then processed by blender2opengles (from Part 2) to create starship.h and starship.c. starship_decal.png is a texture for your model that you’ll use later on. You can find similar files for your cube model from Part 2 in /cube/.

This time you won’t be accessing or modifying your directory as much, but it doesn’t hurt to be neat and tidy.

Your New Model: The Star (Fox) of the Show

Meet your new model! Based on the Arwing from Star Fox for SNES, I’m pleased to introduce starship. Here’s what the model looks like in Blender and Maya:

b_Starship

m_Starship

I chose this model for its popularity, simplicity and historical significance in 3D computer graphics. If you open starship.obj and starship.mtl with a text editor, you should be able to analyze them quickly to understand the model a bit better. You could do the same with starship.h and starship.c, but these might be a little more daunting.

I’ve done all of the file exporting and processing for you already, so let’s jump straight into the app!

The Starship Model Viewer

Launch Xcode and open your GLBlender3 project.

Build and run! You’ll see your new model in all its polygonal glory.

s_Run1

Let the animation run for a little while and you’ll witness some odd behavior, most noticeably when the fins (blue) overlap with the wings (gray). GL_CULL_FACE worked well with your previous cube model because no two front-facing triangles ever overlapped each other. With this new starship model, though, that happens many times. Culling is still very useful, but now you must also introduce a depth buffer.

Rendering With Depth

You’re going to use your storyboard to modify your rendering, so open MainStoryboard.storyboard and click on GLKit View. In the Utilities sidebar, click on the Attributes inspector tab and select 16 from the Depth Format drop-down menu. This creates a depth buffer for your OpenGL ES scene with a 16-bit entry for each pixel.

While you’re here, it’s not a bad idea to improve the quality of your rendering. In the same sidebar, select 4X from the Multisample drop-down menu. You should be careful with multisampling because it incurs a huge increase in memory/processing requirements, though with the simple starship model, that’s not an issue. With 4X enabled, you are essentially rendering to a frame buffer object (FBO) four times the size of your screen (2x width, 2x height) and then scaling the resulting image down to the proper screen size.

Take a look at the setup for both properties below:

s_GLKView
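If you’d rather set these drawable properties in code instead of the storyboard, GLKView exposes them directly. Here’s a minimal sketch, assuming self.view is the GLKView wired up in your storyboard and that you place it near the top of viewDidLoad:

// Equivalent setup in code; assumes self.view is the storyboard's GLKView.
GLKView *glkView = (GLKView *)self.view;
glkView.drawableDepthFormat = GLKViewDrawableDepthFormat16;   // 16-bit depth buffer
glkView.drawableMultisample = GLKViewDrawableMultisample4X;   // 4X multisampling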

Your view is set, but you also need to tell OpenGL ES that your scene has depth. Open MainViewController.m and add the following line to viewDidLoad, at the end of your OpenGL ES settings:

glEnable(GL_DEPTH_TEST);

Next, scroll down to glkView:drawInRect: and add the depth buffer to your glClear() parameters, like so:

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

Build and run! Your model now renders properly and looks much better, too.

s_Run2

Implementing Shaders

You’re about to take a deep dive into 3D graphics programming by writing your own shaders! You learned a little bit about shaders in Part 2 and now you’ll get to see them in action. The gist is that they are programs that instruct the GPU how to draw your scene.

Shaders work in tandem, with the two components being the vertex shader and the fragment shader. You may have encountered them in other tutorials before, but in case you missed them, here’s a quick overview:

  • A vertex shader is called once per vertex in your scene. In this tutorial, the vertex shader’s main job is to compute a value for gl_Position, which contains the position of the current vertex.
  • A fragment shader is called once per fragment in your scene. Fragments contain the raw data necessary to generate a pixel, including color, depth and transparency, whereas pixels are the actual points on your screen/image. The terms are used interchangeably, but it’s definitely worth knowing the difference. In this tutorial, the fragment shader’s main job is to compute a value for gl_FragColor, which contains the color of the current fragment.

Note: Both shaders get called on every frame, but one more than the other.

Your starship has 66 vertices, so your vertex shader gets called 66 times per frame.

For an iPhone 5, your scene contains 640×1136 pixels multi-sampled at 4X, meaning your fragment shader could get called up to 2,908,160 times—holy macaroni!
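Here’s where that number comes from: 640 × 1136 = 727,040 pixels on screen, and 727,040 × 4 samples = 2,908,160 potential fragment shader invocations.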

That’s a lot more than 66, although in this tutorial your fragment shader only gets called for every fragment on screen occupied by your model. Remember when I said multisampling is expensive? Well, there you go…

OK, it’s time for you to add some shaders to your project!

In Xcode, go to File\New\File…, choose the iOS\Other\Empty template and click Next. Name your new file Phong.vsh, uncheck the box next to your GLBlender3 target, place the new file in your Shaders group and click Create, as shown in the screenshot below:

g_Shaders

Repeat this process once more, but name this second file Phong.fsh.

Next, open Phong.vsh and add the following code inside:

// Vertex Shader

static const char* PhongVSH = STRINGIFY
(

void main(void)
{
    gl_Position = vec4(0.0, 0.0, 0.0, 1.0);
}

);

Similarly, open up Phong.fsh and add the following:

// Fragment Shader

static const char* PhongFSH = STRINGIFY
(
 
void main(void)
{
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}
 
);

To get that nice syntax coloring, go to Editor\Syntax Coloring\GLSL; do this now for both files. As you can see, shaders are short programs written in a C-like language called GLSL (OpenGL Shading Language). OpenGL ES compiles them from source strings at runtime, which is why they are wrapped inside STRINGIFY, a macro (which you’ll define later on) that converts its argument into a C string.
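If you’re curious what that macro usually looks like, here’s a minimal sketch; the exact definition and where it lives in this project may differ, but the classic pattern is a one-liner built on the preprocessor’s # operator:

// Sketch of a typical STRINGIFY macro: the # operator turns the macro's
// argument into a string literal, so the GLSL source above becomes a C string.
#define STRINGIFY(A) #A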

As mentioned before, the vertex shader writes out data to gl_Position, which expects an XYZW coordinate. In this tutorial, you only need to worry about the XYZ part, since W will always be 1.0; in homogeneous coordinates, a W of 1.0 represents a point rather than a direction vector.

The fragment shader writes out data to gl_FragColor, which expects an RGBA color. For the most part, you’re only concerned with the RGB channels, but A will be useful later on when you implement transparency.
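To make those vec4 values a bit more concrete, here’s a hedged GLSL sketch of the usual construction pattern; the aPosition attribute is a hypothetical name, not something your shaders declare yet:

// Hypothetical vertex shader sketch: append W = 1.0 to an XYZ position,
// marking it as a point rather than a direction.
attribute vec3 aPosition;

void main(void)
{
    gl_Position = vec4(aPosition, 1.0);
}

The fragment side follows the same idea: a constructor like vec4(color, 1.0) yields a fully opaque color, and you’ll lower that fourth component once you get to transparency.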
