Creating a Texturing Shader

Introduction

In the previous few tutorials, we have done quite a bit with lighting. We are now going to move beyond that and do texturing with a shader. Texturing is actually quite simple to do. Because texturing has long been such an important part of graphics and video games, a lot of work has gone into optimizing it and making it easy to do. By the time you get done with this tutorial, I think you'll agree with me that texturing is a simpler task than diffuse and specular lighting.

More About Texturing

A texture is simply an image that is drawn on a 3D model. The only tricky part to understand about textures is how they are applied to a model. I think it's important to spend a little time discussing how texturing works before we attempt to do it ourselves, so that's where we'll start.

The first thing we need to discuss is how locations in a texture are referenced. Every point in a texture can be defined by what percent across the texture the point is, and what percent down the texture the point is. This is essentially an x- and y-coordinate for the point. However, these coordinates aren't usually called x- and y-coordinates. We are already using x-, y-, and z-coordinates for positions in our 3D world, so to help us not get confused, the coordinates in a texture are usually called u- and v-coordinates. Sometimes they are combined into one term and called uv-coordinates. The u-coordinate indicates how far across the image the point is, and the v-coordinate indicates how far down the image the point is. So, for instance, a uv-coordinate of (0.25, 0.5) refers to a point that is 1/4 of the way across the image and halfway down it. Also, note that this measurement doesn't depend on how big the image is. A value of 0.5 will be halfway across, regardless of whether the image is 5 pixels wide or 5000.
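To make this concrete, here is a small Python sketch (not part of the shader, just an illustration) that maps a uv-coordinate onto pixel indices. It assumes one common convention: (0, 0) is the top-left pixel and (1, 1) is the bottom-right pixel of the image.

```python
def uv_to_pixel(u, v, width, height):
    """Map a uv-coordinate (fractions of the image) to pixel indices.
    The same uv-coordinate lands at the "same place" in the image,
    no matter what its resolution is."""
    return (round(u * (width - 1)), round(v * (height - 1)))

# (0.25, 0.5) is 1/4 of the way across and halfway down,
# whether the image is 5x3 pixels or 640x480.
print(uv_to_pixel(0.25, 0.5, 5, 3))
print(uv_to_pixel(1.0, 1.0, 640, 480))
```

The exact rounding convention varies between graphics APIs, but the key point survives: uv-coordinates are resolution-independent.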

Every vertex in the model is assigned a texture coordinate. This coordinate is stored in the model file and loaded into XNA. It will be available in our shaders if we want to use it. When we get to our pixel shader, we can use the interpolated texture coordinate to look up a color in our texture and color our pixel accordingly.

A Texturing Shader

Once again, we will be starting with the shader that we used last time and building on it. You could easily start with the diffuse shader from a couple of tutorials ago, and you could probably even get away with starting from the ambient light tutorial that we did at first, but it won't look as nice. If you don't have the HLSL code, or you want to make sure you start from something that is working, you can copy the HLSL code below:

float4x4 World;
float4x4 View;
float4x4 Projection;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float4x4 WorldInverseTranspose;

float3 DiffuseLightDirection = float3(1, 0, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 1.0;

float Shininess = 200;
float4 SpecularColor = float4(1, 1, 1, 1);    
float SpecularIntensity = 1;
float3 ViewVector = float3(1, 0, 0);

struct VertexShaderInput
{
    float4 Position : POSITION0;    
    float4 Normal : NORMAL0;   
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0; 
    float3 Normal : TEXCOORD0;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float4 normal = normalize(mul(input.Normal, WorldInverseTranspose));
    float lightIntensity = dot(normal, DiffuseLightDirection);
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);

    output.Normal = normal;

    return output;
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 r = normalize(2 * dot(light, normal) * normal - light);
    float3 v = normalize(mul(normalize(ViewVector), World));

    float dotProduct = dot(r, v);
    float4 specular = SpecularIntensity * SpecularColor * max(pow(dotProduct, Shininess), 0) * length(input.Color);

    return saturate(input.Color + AmbientColor * AmbientIntensity + specular);
}

technique Specular
{
    pass Pass1
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 PixelShaderFunction();
    }
}

Of course, you can also just continue on where you left off with the last tutorial if you want, but once again, I'd recommend that you make a copy of the old specular shader so that you have it to fall back to.

Alright. We're ready to start doing texturing. We will want to add a couple of variables to our effect file like we have done before. So at the top of your code, with the rest of your variables, add the following:

texture ModelTexture;
sampler2D textureSampler = sampler_state {
    Texture = (ModelTexture);
    MagFilter = Linear;
    MinFilter = Linear;
    AddressU = Clamp;
    AddressV = Clamp;
};

The first line declares a reference to the texture we are going to use for our model. The rest of this is what is called a sampler. A sampler tells the graphics card exactly how it should extract a color from the texture. The body of the sampler lists five properties that control this. The first one, Texture = (ModelTexture);, says which texture to sample from. The next two properties, MagFilter and MinFilter, say how to handle the situation where the texture is magnified or "minified". For instance, if the texture is stretched out across the model, there will be times when we need texture coordinates that fall between pixels, and the sampler needs to know how to interpolate between the nearest actual pixels. Linear is a pretty decent filter, which blends between the nearest colors. There are others, though, including None, Point, and Anisotropic.
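To see what Linear filtering actually does, here is a rough Python sketch (an illustration, not real sampler code) of bilinear interpolation on a tiny grayscale "texture" stored as a 2D list. This mimics the idea behind MinFilter/MagFilter = Linear: blend the four nearest texels based on where the coordinate falls between them.

```python
def bilinear_sample(texture, u, v):
    """Sample a grayscale texture (list of rows of floats) at a
    uv-coordinate, blending the four nearest texels, roughly the way
    Linear filtering does in hardware."""
    h = len(texture)
    w = len(texture[0])
    # Map uv in [0, 1] onto texel indices.
    x = u * (w - 1)
    y = v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    # Blend horizontally along the top and bottom rows, then vertically.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

# A 2x2 texture that is black on the left and white on the right:
tex = [[0.0, 1.0],
       [0.0, 1.0]]
print(bilinear_sample(tex, 0.5, 0.0))   # halfway between black and white
```

Point filtering, by contrast, would just snap to the single nearest texel instead of blending.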

The last two properties here, AddressU and AddressV, say how the sampler should respond if it gets a coordinate beyond the normal range of 0 to 1. In almost all models, this isn't going to happen, but the sampler should know how to handle it in case it comes up, because it could happen accidentally, or even intentionally. In our case, we are going to use Clamp, which says that if the value is less than 0, just use the value at 0 instead, and if the value is more than 1, just use the texture color at 1. Once again, there are other choices. The value Border means that if a coordinate beyond 0 or 1 is used, the "border color" is used instead, which is often just black. Wrap means that if the value goes over 1, it should just start over and repeat at 0, so a value of 1.25 samples the same place as 0.25. The same is true if you go below 0. There is also a Mirror value, which means that if you go beyond 1, the coordinate starts going back the other direction instead, so 1.1 maps to 0.9. Note that you can use different modes for the u-coordinate and the v-coordinate.
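The three repeating address modes are easy to pin down as tiny Python functions (again, just an illustration of the math, not actual sampler code):

```python
def clamp(t):
    """AddressU/V = Clamp: pin out-of-range coordinates to the edge."""
    return min(max(t, 0.0), 1.0)

def wrap(t):
    """AddressU/V = Wrap: repeat the texture, so 1.25 acts like 0.25."""
    return t % 1.0

def mirror(t):
    """AddressU/V = Mirror: reflect back at each integer boundary,
    so 1.1 acts like 0.9."""
    m = t % 2.0
    return 2.0 - m if m > 1.0 else m

print(clamp(1.3))     # pinned to 1.0
print(wrap(1.25))     # repeats: same as 0.25
print(mirror(1.1))    # reflected: same as 0.9
```

Note that wrap also handles coordinates below 0 (a value of -0.25 samples at 0.75), matching the "same is true if you go below 0" behavior described above.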

The next thing we need to do now is update our data structures so that they can store texture information. Update (or replace) the existing data structures so that they have a texture coordinate, like the ones below:

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float4 Normal : NORMAL0;
    float2 TextureCoordinate : TEXCOORD0; // this is new
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
    float3 Normal : TEXCOORD0;
    float2 TextureCoordinate : TEXCOORD1; // this is new
};

Notice that in the output structure, the texture coordinate uses the TEXCOORD1 semantic, because we are already using TEXCOORD0 for the normal.

The next thing to do is to update our vertex shader, which will be simple. All we need to do is modify it so that it will transfer the texture coordinate over from the input to the output. This can be done by adding the following line to your vertex shader:

output.TextureCoordinate = input.TextureCoordinate;

So now your vertex shader should look like this:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
 
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
 
    float4 normal = normalize(mul(input.Normal, WorldInverseTranspose));
    float lightIntensity = dot(normal, DiffuseLightDirection);
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
 
    output.Normal = normal;
    output.TextureCoordinate = input.TextureCoordinate;
 
    return output;
}

Next we need to update the pixel shader, which will be a little more work, but it won't be too bad. We will need to tell our texture sampler to give us the color value for the texture at the current uv-coordinate and then factor that in with our diffuse light. We will do this in two steps. First, add the following code to the pixel shader, just before it returns. This will calculate the color of the pixel, based on the texture.

float4 textureColor = tex2D(textureSampler, input.TextureCoordinate);
textureColor.a = 1;

Notice that we specifically set the alpha value for the color in the second line, so that the texture's alpha channel can't accidentally make the model transparent. The second part is to modify the final pixel color (which is what's returned) to include our texture color. To do this, we want to remove the previous return value, which says:

return saturate(input.Color + AmbientColor * AmbientIntensity + specular);

and replace it with:

return saturate(textureColor * (input.Color) + AmbientColor * AmbientIntensity + specular);
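To see what this return statement computes, here is a component-wise Python sketch of the same combination, with a hypothetical saturate helper standing in for HLSL's built-in (colors are [r, g, b, a] lists of floats):

```python
def saturate(color):
    """Mimic HLSL's saturate(): clamp each component to [0, 1]."""
    return [min(max(c, 0.0), 1.0) for c in color]

def final_color(texture_color, diffuse_color, ambient_color,
                ambient_intensity, specular):
    """Mirrors: saturate(textureColor * input.Color
                         + AmbientColor * AmbientIntensity + specular)"""
    return saturate([t * d + a * ambient_intensity + s
                     for t, d, a, s in zip(texture_color, diffuse_color,
                                           ambient_color, specular)])

# A fully lit orange-ish texel with white ambient at 0.1 and no specular:
print(final_color([1.0, 0.5, 0.0, 1.0],   # textureColor
                  [1.0, 1.0, 1.0, 1.0],   # input.Color (diffuse)
                  [1.0, 1.0, 1.0, 1.0],   # AmbientColor
                  0.1,                    # AmbientIntensity
                  [0.0, 0.0, 0.0, 0.0]))  # specular
```

The key point is that the texture color is multiplied by the diffuse term (so shading darkens the texture), while the ambient and specular terms are added on top.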

Also, you will probably want to change the name of your technique from "Specular" to "Textured" or something like that.

Our final texture shader code should look something like this:

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
 
float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;
 
float3 DiffuseLightDirection = float3(1, 0, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 1.0;
 
float Shininess = 200;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 1;
float3 ViewVector = float3(1, 0, 0);
 
texture ModelTexture;
sampler2D textureSampler = sampler_state {
    Texture = (ModelTexture);
    MinFilter = Linear;
    MagFilter = Linear;
    AddressU = Clamp;
    AddressV = Clamp;
};
 
struct VertexShaderInput
{
    float4 Position : POSITION0;
    float4 Normal : NORMAL0;
    float2 TextureCoordinate : TEXCOORD0;
};
 
struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
    float3 Normal : TEXCOORD0;
    float2 TextureCoordinate : TEXCOORD1;
};
 
VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
 
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
 
    float4 normal = normalize(mul(input.Normal, WorldInverseTranspose));
    float lightIntensity = dot(normal, DiffuseLightDirection);
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
 
    output.Normal = normal;
 
    output.TextureCoordinate = input.TextureCoordinate;
    return output;
}
 
float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 r = normalize(2 * dot(light, normal) * normal - light);
    float3 v = normalize(mul(normalize(ViewVector), World));
    float dotProduct = dot(r, v);
 
    float4 specular = SpecularIntensity * SpecularColor * max(pow(dotProduct, Shininess), 0) * length(input.Color);
 
    float4 textureColor = tex2D(textureSampler, input.TextureCoordinate);
    textureColor.a = 1;
 
    return saturate(textureColor * (input.Color) + AmbientColor * AmbientIntensity + specular);
}
 
technique Textured
{
    pass Pass1
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 PixelShaderFunction();
    }
}

XNA Code

There are only a couple of things that we will need to do to get our shader going. We just need to give it a texture to draw with. Usually, if your model is textured, you will already have the needed texture files. Right now, the textures are probably located in the same directory as your model, but not included in your project. We will want to include the textures in our project so that we can load them in our game, but if you just include them as they are, you will likely get an error saying you are trying to load the texture in twice. (The model will try to load it by default, and now you are also trying to load it yourself.) There are a lot of ways to get around this small problem, but often, I just copy the texture to another directory inside the game's Content directory (usually I call it "Textures") and tell it to read that one in.

So I added an instance variable to my class with this line:

Texture2D texture;

Then in the LoadContent() method, I added a line that would load the texture in:

texture = Content.Load<Texture2D>("Textures/helicopterTexture");

Finally, in the DrawModelWithEffect() method, where we set all of the other parameters, I set the texture with this line:

effect.Parameters["ModelTexture"].SetValue(texture);

At this point, you should be able to run your game and see your shader with texturing!

[screenshot1.png: the model rendered with the new texturing shader]

What's Next?

Now we have done ambient, diffuse, and specular lighting, and also texturing. That covers most of the things that BasicEffect can do. From here, we will try out a few other things that the BasicEffect class doesn't have support for, including bump mapping and environment mapping.


Having problems with this tutorial? Try the troubleshooting page!