This is the last step of our journey: learning the technical aspects of light sources and applying them to our objects to achieve a more realistic look.


The more advanced features of OpenGL are concerned, for the most part, with the simulation of natural phenomena such as light and materials. Through continuous development, the library has been given the characteristics needed to render scenes ever closer to reality.

Let’s turn some light on

Up to this moment, when drawing primitives or creating solid objects, we have only been able to associate them with a given RGB color. In the real world, however, we do not see things according to an absolute color. As we know, color is nothing more than our visual organs' interpretation of a particular range of radiation, i.e., light. Usually light arises from a well-defined source, and the color of objects varies depending on their position with respect to it. OpenGL can approximate the conditions of the real world through the use of three types of illumination:

  1. The ‘ambient light’, which has a brightness that does not come from a specific direction, despite having a source. When an ‘ambient light’ is created, all the polygons of a scene are illuminated in the same manner.
  2. The ‘scattered light’ (diffuse light), which represents the kind of light that is reflected in a uniform manner. A surface becomes brighter as the rays emitted by the source strike it closer to perpendicularly.
  3. The ‘specular light’, which is similar to the previous one except that, in this case, the rays from the source are reflected in a particular direction. This effect usually generates a bright spot on the reflecting surface, as sunlight can on the bodywork of a car.

Light, however, represents only one component of the equation. Indeed, we know that the color and brightness of an object also depend on the material it is made of, and on how that material reacts to light radiation.

When using illumination we will no longer talk about the color of a polygon in terms of RGB levels, but in terms of the manner in which it reflects or absorbs certain wavelengths. We could create a blue ball, or a metallic-looking cube, or one more similar to plastic, simply by setting the properties of their constituent materials. Let's see how to put it all together. To enable the use of lighting, the following OpenGL call is sufficient

glEnable(GL_LIGHTING);
which enables the color calculation of each vertex according to the material and to the type and position of the light source. Obviously, in the absence of the latter the scene would remain dark. First you need to set the three types of lighting via the function

void glLightfv(GLenum light, GLenum pname, const GLfloat * param)

The first parameter of the function identifies the light source to which the settings apply. OpenGL provides a maximum of 8 light sources: from GL_LIGHT0 to GL_LIGHT7. The second parameter specifies which property to set, i.e.

  1. GL_AMBIENT to set the values of the ambient light
  2. GL_DIFFUSE to set the values of the diffused light
  3. GL_SPECULAR to set the values of the specular highlight
  4. GL_POSITION to set the position of the light

Finally, the function takes an array of 4 floats which contains the settings of the source; the first three values represent the intensity of the components, between 0.0f and 1.0f. For the position, a fourth value of 1.0f causes the light rays to spread in all directions from that point, while 0.0f treats the light as coming from an infinite distance, so that the rays are processed as if parallel to each other. In order to use the source, it must be activated by calling

glEnable(GL_LIGHT0);
Now it is necessary to make sure that the polygons have a material with which to react to the lighting. We enable the management of materials by

glEnable(GL_COLOR_MATERIAL);
then we inform OpenGL about the properties of the polygons' faces with the function

void glColorMaterial(GLenum face, GLenum mode)

The first parameter indicates the face to which we want to apply the setting. The possible values are GL_FRONT and GL_BACK, for front and back faces, while GL_FRONT_AND_BACK applies it to both. The second parameter takes one of the values GL_EMISSION, GL_AMBIENT, GL_DIFFUSE, GL_SPECULAR, GL_AMBIENT_AND_DIFFUSE; it says that the material must react accordingly to the current color for the indicated light component. After that, we just have to give the material its specular (mirror) properties with the function

void glMaterialfv(GLenum face, GLenum pname, const GLfloat * params)

we make the call in this form

GLfloat reflectivity[] = {1.0f, 1.0f, 1.0f, 1.0f};
glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, reflectivity);

and set its level of reflectivity with

glMateriali(GL_FRONT, GL_SHININESS, 128);

The higher the value of the last parameter (it can go up to 128), the more “focused” the mirror effect appears, i.e., the generated spot is smaller and more defined. Let's see a practical example: the following listing shows a possible initialization of the GL_LIGHT0 light source and of the material in a generic program.

// Array for the lights settings
GLfloat ambientLight[] = {0.1f, 0.1f, 0.1f, 1.0f};
GLfloat diffuseLight[] = {0.45f, 0.2f, 0.4f, 1.0f};
GLfloat specularLight[] = {1.0f, 1.0f, 0.4f, 1.0f};
GLfloat light0Position[] = {0.0f, 0.0f, 0.0f, 1.0f};

// Array for reflectivity settings
GLfloat reflectivity[] = {1.0f, 1.0f, 1.0f, 1.0f};

void SetupRC()
{
    // Activates lighting
    glEnable(GL_LIGHTING);

    // Sets LIGHT0 components
    glLightfv(GL_LIGHT0, GL_AMBIENT, ambientLight);
    glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuseLight);
    glLightfv(GL_LIGHT0, GL_SPECULAR, specularLight);
    glLightfv(GL_LIGHT0, GL_POSITION, light0Position);

    // Activates LIGHT0
    glEnable(GL_LIGHT0);

    // Enables materials
    glEnable(GL_COLOR_MATERIAL);

    // Sets the material reactivity to the ambient
    // and scattered light
    glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE);

    // Sets the reflectivity values
    glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, reflectivity);

    // Sets the reflectivity level
    glMateriali(GL_FRONT_AND_BACK, GL_SHININESS, 32);
}

The sample program Luci.exe shows in practice the application of the rules learned so far; Figure 1 is an image taken from the example. Now everything is ready and OpenGL is able to properly manage the light source, but we have not put any object in the scene yet. Even if we had, the result would not have been the intended one, because we have not yet introduced an additional concept: normals.

Figure 1 – Example of lighting on an object

A normal is nothing but the vector perpendicular to the plane on which a polygon lies, and it is used to determine the angle between the polygon and a light source. Clearly, if this angle is zero the light strikes the figure perpendicularly, so the polygon will be illuminated the most. As the angle increases, the quantity of reflected light decreases.

There are no specific OpenGL functions for calculating the normal of a polygon so, without dwelling too much on the subject, I recommend using the ‘gltools’ library that you will find on the FTP site. It provides the function

gltGetNormalVector(GLTVector VP1, GLTVector VP2, GLTVector VP3, GLTVector vNormal)

The GLTVector type is an array of three floats containing the coordinates of one vertex of the polygon. The parameter vNormal receives the result of the calculation and should then be passed to the function

glNormal3fv(const GLfloat * v)

which accepts an array of floats representing the normal vector. Look at the following listing for a clearer idea of how to implement proper handling of normals.

// Create the walls by storing them in a display list
void drawWalls()
{
    GLint i, j, k;
    GLTVector3 vNormal;

    // Creates a new list
    glNewList(1, GL_COMPILE);

    // Sets the color for the bottom wall
    glColor3f(1.0f, 1.0f, 1.0f);

    // Bottom wall
    for (i = -HALF_WALL_SIZE; i < HALF_WALL_SIZE - STEP; i += STEP)
        for (j = -HALF_WALL_SIZE; j < HALF_WALL_SIZE - STEP; j += STEP)
        {
            // Creates a vector array containing
            // the coordinates of the wall's vertices
            GLTVector3 points[4] = {{j, 0, i+STEP}, {j+STEP, 0, i+STEP}, {j+STEP, 0, i}, {j, 0, i}};

            // Calculates the normal from three of the points
            gltGetNormalVector(points[0], points[1], points[2], vNormal);

            // Sets the normal
            glNormal3fv(vNormal);

            // Sets the vertex values
            glBegin(GL_QUADS);
                for (k = 0; k < 4; k++)
                    glVertex3fv(points[k]);
            glEnd();
        }

    // Right lateral wall
    for (i = 0; i < WALL_SIZE; i += STEP)
        for (j = -HALF_WALL_SIZE; j < HALF_WALL_SIZE - STEP; j += STEP)
        {
            // Creates a vector array containing
            // the coordinates of the wall's vertices
            GLTVector3 points[4] = {{j, i, -HALF_WALL_SIZE}, {j+STEP, i, -HALF_WALL_SIZE}, {j+STEP, i+STEP, -HALF_WALL_SIZE}, {j, i+STEP, -HALF_WALL_SIZE}};

            // Calculates the normal from three of the points
            gltGetNormalVector(points[0], points[1], points[2], vNormal);

            // Sets the normal
            glNormal3fv(vNormal);

            // Sets the vertex values
            glBegin(GL_QUADS);
                for (k = 0; k < 4; k++)
                    glVertex3fv(points[k]);
            glEnd();
        }

    // Left lateral wall
    for (i = 0; i < WALL_SIZE; i += STEP)
        for (j = -HALF_WALL_SIZE; j < HALF_WALL_SIZE - STEP; j += STEP)
        {
            // Creates a vector array containing
            // the coordinates of the wall's vertices
            GLTVector3 points[4] = {{HALF_WALL_SIZE-STEP, i, j+STEP}, {HALF_WALL_SIZE-STEP, i+STEP, j+STEP}, {HALF_WALL_SIZE-STEP, i+STEP, j}, {HALF_WALL_SIZE-STEP, i, j}};

            // Calculates the normal from three of the points
            gltGetNormalVector(points[0], points[1], points[2], vNormal);

            // Sets the normal
            glNormal3fv(vNormal);

            // Sets the vertex values
            glBegin(GL_QUADS);
                for (k = 0; k < 4; k++)
                    glVertex3fv(points[k]);
            glEnd();
        }

    // Closes the list
    glEndList();
}

You'll also notice the function

glNewList(1, GL_COMPILE)

This instruction causes OpenGL to compile, for fast later execution, all the commands it encounters up to the call

glEndList()

Whenever you need the list, just call the function

glCallList(1)

Texture mapping

Filling a polygon with a simple color is often not sufficient to meet our requirements. An important feature that every 3D graphics library provides is the ability to use an image to fill polygons. This technique is called ‘texture mapping’, and with it we can create very convincing scenes, giving objects a near-real appearance. We have already seen an example of texture mapping in the first article.

The first thing we must do to use textures in a program is to load the images. To this end we will use the function

void glTexImage2D(GLenum target, GLint level, GLint internalformat, GLsizei width, GLsizei height, GLint border, GLenum format, GLenum type, const GLvoid * data)

This enormous number of parameters is needed to provide all the information required to correctly interpret the data pointed to by ‘data’. Let's look at the meaning of each one:

  • target must be set to GL_TEXTURE_2D and represents the type of texture; we will not cover the other types.
  • level specifies the mipmap level; set it to 0. We will not cover this topic.
  • internalformat specifies the format used to organize the data in memory once loaded. It can be set to GL_ALPHA, GL_LUMINANCE, GL_LUMINANCE_ALPHA, GL_RGB or GL_RGBA. Usually this parameter is set to GL_RGB or GL_RGBA.
  • width, height represent the dimensions of the image. It is very important to know that these dimensions must be powers of 2, e.g. 4×2, 64×128, 4096×1024 and so on.
  • border serves to create a border for the image. Set it to 0.
  • format indicates the format of the provided image data. There are many values that this parameter can assume, but the most common are GL_RGB and GL_RGBA.
  • type is the data type used to express the image data. For 24 or 32-bit images the correct setting is GL_UNSIGNED_BYTE.
  • data is the array containing the image.

As one might expect, OpenGL has no function for loading images from files. We will use the function contained in the file LoadTGA.c.

Before using a texture we must set some other parameters. First we must define what happens when the texture is resized, i.e., when the distance between the object and the observer changes:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

In short, we make OpenGL use a filter that slightly blurs the pixels when they are enlarged, or when the objects move far away, avoiding unsightly jagged edges. Texture management must also be enabled:

glEnable(GL_TEXTURE_2D);
Once the image has been loaded, we have to map it onto the target polygon. This must be done during the vertex specification phase by calling

glTexCoord2f(GLfloat s, GLfloat t)

Each coordinate pair is associated with the following vertex and can vary between 0.0 and 1.0 regardless of the size of the image. The following code associates a generic image with a triangle:

    glTexCoord2f(0.0f, 0.0f);
    glVertex3f(0.0f, 0.0f, 0.0f);
    glTexCoord2f(0.5f, 1.0f);
    glVertex3f(5.0f, 10.0f, 0.0f);
    glTexCoord2f(1.0f, 0.0f);
    glVertex3f(10.0f, 0.0f, 0.0f);

In Figure 2 you can see the output of the program Texture.exe. By now you should be familiar with our programs, so studying the code should not be a difficult undertaking.

Figure 2 – On the scene there are two objects, the teapot and the horizontal plane, each associated with its own texture

The fog effect

A very simple effect to use in OpenGL is fog, which is applied at the end of all the color-related operations. It determines how to mix the color of the geometry present in the scene with the color chosen for the mist. It is also possible to vary the density values and the equations that generate the effect, according to the distance of the objects from the point of observation. To use the effect, simply follow these steps:

// Initialize an array of four GLfloat
// containing the fog color
GLfloat fogColor[] = {0.7f, 0.7f, 0.7f, 1.0f};

// Enables the effect
glEnable(GL_FOG);
// Sets the fog color
glFogfv(GL_FOG_COLOR, fogColor);

// Sets the distance from the observer, the
// start of the fog bank
glFogf(GL_FOG_START, 5.0f);

// Sets the distance where this ends
glFogf(GL_FOG_END, 30.0f);

// Indicates the equation of fog densification
glFogi(GL_FOG_MODE, GL_LINEAR);
// Sets the density of the fog
glFogf(GL_FOG_DENSITY, 0.5f);

That's it. The only observation to make about the last statements is that OpenGL allows you to change the method by which the fog thickening is calculated as a function of the distance from the observer. Indeed, for GL_FOG_MODE we can specify, in addition to GL_LINEAR, the values GL_EXP and GL_EXP2. Choosing these equations we get an exponential thickening, more pronounced at great distances.


Blending

We know by now that images are drawn into the color buffer by overwriting the existing values. We may instead be interested in enabling blending

glEnable(GL_BLEND);
so that the new figures in the process of being drawn inside the color buffer will be combined with the existing ones.
There are several ways to combine the color of two images and each one can generate a particular effect, such as, for example, the transparency.

To proceed, you must know the terminology commonly used. The color already in the color buffer is called the destination, while the one being written is the source. To determine how the two colors will be combined we use the following equation

Cf = (Cs * S) + (Cd * D)

where Cf is the final color, Cs is the source color and Cd the destination color. S and D are the factors that scale the source and destination colors. These factors are specified via

glBlendFunc(GLenum S, GLenum D)

Below are the possible values for these parameters:

Function                     RGB Blend Factor              Alpha Blend Factor
GL_ZERO                      (0, 0, 0)                     0
GL_ONE                       (1, 1, 1)                     1
GL_SRC_COLOR                 (Rs, Gs, Bs)                  As
GL_ONE_MINUS_SRC_COLOR       (1, 1, 1) - (Rs, Gs, Bs)      1 - As
GL_DST_COLOR                 (Rd, Gd, Bd)                  Ad
GL_ONE_MINUS_DST_COLOR       (1, 1, 1) - (Rd, Gd, Bd)      1 - Ad
GL_SRC_ALPHA                 (As, As, As)                  As
GL_ONE_MINUS_SRC_ALPHA       (1, 1, 1) - (As, As, As)      1 - As
GL_DST_ALPHA                 (Ad, Ad, Ad)                  Ad
GL_ONE_MINUS_DST_ALPHA       (1, 1, 1) - (Ad, Ad, Ad)      1 - Ad
GL_ONE_MINUS_CONSTANT_COLOR  (1, 1, 1) - (Rc, Gc, Bc)      1 - Ac
GL_ONE_MINUS_CONSTANT_ALPHA  (1, 1, 1) - (Ac, Ac, Ac)      1 - Ac

Let's now see a classic example. This is the most common setting used to generate a transparency effect

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Here's what happens if the buffer contains red {1.0f, 0.0f, 0.0f, 1.0f}, while the color to be drawn is blue with an alpha value of 0.5f, i.e. {0.0f, 0.0f, 1.0f, 0.5f}. We have

Cd = {1.0f, 0.0f, 0.0f, 1.0f}
Cs = {0.0f, 0.0f, 1.0f, 0.5f}
S = source level of alpha = 0.5f
D = 1 - 0.5f = 0.5f

The equation takes the form

Cf = (Cs * 0.5f) + (Cd * 0.5f) = {0.0f, 0.0f, 0.5f, 0.25f} + {0.5f, 0.0f, 0.0f, 0.5f} = {0.5f, 0.0f, 0.5f, 0.75f}

generating a dark purple.


I hope I have been sufficiently clear and comprehensive, because for such a complex subject even an entire book would not suffice. It is very important, therefore, that you refer to the official OpenGL specification (which can be found inside the Demo package) to learn more about the functions and the parameters they accept.

There are, in fact, many features I have not had a chance to show, for obvious reasons. Perhaps I will return to topics related to OpenGL programming, but that also depends on you. Write and participate with questions and suggestions, and let us know whether our work is useful or in some way interesting to you.


