Device Coordinates and Object Coordinates

All of the OpenGL posts so far have defined objects (points, lines, and triangles) in two dimensions with values in device coordinates. Using device coordinates, values that are displayed are limited to the following ranges:

  • x between [-1, +1]
  • y between [-1, +1]

where the square brackets indicate that the edge values are included in the range.

If you remember the post where we drew a circle at the origin, the vertices of the square containing the circle were specified in these device coordinates. However, the fragment shader provides the coordinates of the fragment it is working on as pixel coordinates. The graphics pipeline determines the pixel coordinate of each fragment inside each object before calling the fragment shader.

That is fine, but device coordinates are not always the most convenient coordinates to work in. The universe you are describing in your program may naturally fit into a different set of coordinates; for example, it may be more convenient to use pixel coordinates when specifying the vertices of the triangles that make up your objects. You may do so, but you must also provide the conversion factor from your object coordinates to device coordinates.

To do that, you must specify each vertex as a set of 4 values. In OpenGL these values are referred to as x, y, z, and w. In your shaders, these values are contained in a vec4. Your object coordinates are converted to the following device coordinates: x/w, y/w, and z/w. This conversion is handled for you in the graphics pipeline.
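As a sketch of the division the pipeline performs (the function name here is hypothetical, not part of OpenGL), the mapping from a 4-component vertex to device coordinates looks like this:

```cpp
#include <array>

// Hypothetical illustration of the pipeline's perspective division:
// an (x, y, z, w) vertex maps to device coordinates (x/w, y/w, z/w).
std::array<float, 3> toDeviceCoordinates(float x, float y, float z, float w)
{
    return { x / w, y / w, z / w };
}
```

With w set to 400, for example, the pixel-coordinate vertex (-80, -80, 0) maps to the device coordinates (-0.2, -0.2, 0) that we will use below.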

Let’s look at an example: the program that draws a circle at the origin in device coordinates. This program is described in the Drawing Circles With OpenGL post. In that program, the vertices of the square that contains the circle were defined using device coordinates as:

float points[] = {
    -0.2f, -0.2f,
    0.2f, -0.2f,
    0.2f, 0.2f,
    -0.2f, 0.2f
};

To convert these values to canvas-relative pixel coordinates we need to know the size of the canvas (the wxGLCanvas object). We defined that as 800 by 800 pixels when it was created in the window object that contains the canvas. Therefore, the viewable dimensions are -400 to +400 in both x and y directions. That means that we can change the coordinates of the vertices to:

float points[] = {
    -80.0f, -80.0f,
    80.0f, -80.0f,
    80.0f, 80.0f,
    -80.0f, 80.0f
};

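A quick sanity check on the arithmetic: with a half-width of 400 pixels, the device coordinate ±0.2 corresponds to ±80 pixels, and dividing a pixel value by the half-width recovers the device coordinate.

```cpp
// Pixel <-> device coordinate conversion for an 800 x 800 pixel canvas.
const float halfWidth = 800.0f / 2.0f;       // 400 pixels

// The device-coordinate vertex -0.2 scales to -80 pixels,
// and dividing by the half-width recovers the device coordinate.
const float pixelX  = -0.2f * halfWidth;     // -80 pixels
const float deviceX = pixelX / halfWidth;    // back to -0.2
```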
The only problem with this definition is that we have not specified the conversion value w. To do that, we must specify each point in 4-D, with the fourth value as w. It is also necessary to specify the z-coordinate of each vertex when using anything other than device coordinates. Since we are dealing with a two-dimensional world, we will set the z-coordinate of every vertex to 0.0f. That makes our vertex values:

float points[] = {
    -80.0f, -80.0f, 0.0f, w,
    80.0f, -80.0f, 0.0f, w,
    80.0f, 80.0f, 0.0f, w,
    -80.0f, 80.0f, 0.0f, w
};

In fact, in the original program, we added the z (0.0) and w (1.0) values in the vertex shader. That is the only thing that the vertex shader did.

Now all we need is the value of w. We could either hard-code the value to 400.0f, or we could take the value as half the width of the canvas (800/2).
We will use the latter method. Add the following code before the definition of points:

wxSize canvasSize = GetSize();
float w = static_cast<float>(canvasSize.x) / 2.0f;

The only other changes needed to the program are to define outerRadius in terms of pixels, to modify the vertex shader to accept the vertex as a vec4 rather than a vec2, to modify the fragment shader to accept the radius in pixels, and to change the glVertexAttribPointer call to indicate that each vertex is specified as 4 floating point values. In CirclesAndRotatorsCanvas::OnPaint, change the circle's outer radius value to 80.0f:

glUniform1f(m_circleOuterRadius, 80.0f);

The vertex shader program becomes:

const GLchar* vertexSource =
    "#version 330 core\n"
    "in vec4 position;"
    "void main()"
    "{"
    "    gl_Position = position;"
    "}";

The fragment shader program changes to:

const GLchar* fragmentSource =
    "#version 330 core\n"
    "uniform vec2 viewDimensions;"
    "uniform float outerRadius;"
    "out vec4 outColor;"
    "void main()"
    "{"
    "   float x = gl_FragCoord.x - viewDimensions.x / 2.0f;"
    "   float y = gl_FragCoord.y - viewDimensions.y / 2.0f;"
    // discard the fragment if it lies outside the circle
    "   float len = sqrt(x * x + y * y);"
    "   if (len > outerRadius) {"
    "       discard;"
    "   }"
    // otherwise set its colour to green
    "   outColor = vec4(0.0, 1.0, 0.0, 1.0);"
    "}";

The only changes are the calculations of x and y.

In CirclesAndRotatorsCanvas::BuildCircleShaderProgram, change the glVertexAttribPointer call to:

glVertexAttribPointer(posAttrib, 4, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), 0);

If you compile and run the program, you will again get a green circle displayed at the centre of the window.

This example modified the program to create an object using pixel coordinates instead of device coordinates, but you could use any object coordinates that make sense in your universe. You just have to change the definitions of your vertices, possibly your vertex shader, and your fragment shader to use values in those object coordinates.

The source code, as modified, is available in the objectview branch on GitHub.

