OpenGL space coordinate transformation

First, you need to understand matrices, since programs express coordinate transformations as matrices.
The common matrices are:

1. Scaling matrix

2. Translation (displacement) matrix

One extra dimension: Homogeneous Coordinates
Why they are useful:
1. They allow us to translate a 3D vector (without the w component a vector cannot be translated by a matrix)
2. Dividing the x, y and z coordinates by the w coordinate (perspective division) is what creates the 3D perspective effect

If the w component of a homogeneous coordinate is 0, the coordinate is a direction vector (Direction Vector), which translation leaves unchanged (see the sketch after this list)

3. Rotation matrix

Rotation matrices for different axes

Composite matrix for arbitrary axis rotation


Note: different matrices can be combined, e.g. scaling + rotation, but matrix multiplication is not commutative, so pay attention to the order.
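
To make both points concrete (w = 0 vs. w = 1, and why the order matters), here is a minimal standalone sketch; the vectors and factors are arbitrary example values:

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <iostream>

int main()
{
    glm::mat4 T = glm::translate(glm::mat4(1.0f), glm::vec3(1.0f, 1.0f, 0.0f));
    glm::mat4 S = glm::scale(glm::mat4(1.0f), glm::vec3(0.5f));

    glm::vec4 point(1.0f, 0.0f, 0.0f, 1.0f);     // w = 1: a position
    glm::vec4 direction(1.0f, 0.0f, 0.0f, 0.0f); // w = 0: a direction

    glm::vec4 p = T * point;     // (2, 1, 0, 1): the translation is applied
    glm::vec4 d = T * direction; // (1, 0, 0, 0): directions ignore translation

    glm::vec4 a = (T * S) * point; // scale first, then translate: (1.5, 1, 0, 1)
    glm::vec4 b = (S * T) * point; // translate first, then scale: (1, 0.5, 0, 1)

    std::cout << p.x << " " << d.x << " " << a.x << " " << b.x << std::endl;
    return 0;
}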

How does OpenGL use matrices?
Library used: OpenGL Mathematics (GLM)

How the matrices are used (see the code comments for details):

//Include the related headers
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtc/type_ptr.hpp>
#include <iostream>

//Matrix definition; by default its contents are uninitialized garbage and must be initialized
glm::mat4 trans;

//Initialization method 1: the identity matrix
//trans = glm::mat4(1.0f);

//The 4x4 elements of mat4 are stored column by column (column-major),
//mainly because OpenGL stores matrices column-major in video memory, so the prepared mat4 data can be copied there directly.
//The [] operator accesses a column of mat4, and a second [] accesses an element within that column.

//Initialization method 2: set every element by hand (useful for getting familiar with the memory layout)
trans[0][0] = 1.0f; trans[1][0] = 0.0f; trans[2][0] = 0.0f; trans[3][0] = 0.0f;
trans[0][1] = 0.0f; trans[1][1] = 1.0f; trans[2][1] = 0.0f; trans[3][1] = 0.0f;
trans[0][2] = 0.0f; trans[1][2] = 0.0f; trans[2][2] = 1.0f; trans[3][2] = 0.0f;
trans[0][3] = 0.0f; trans[1][3] = 0.0f; trans[2][3] = 0.0f; trans[3][3] = 1.0f;

//Translate by (1, 1, 0): trans now holds the translation matrix T
trans = glm::translate(trans, glm::vec3(1.0f, 1.0f, 0.0f));

//Scale by 0.5 on every axis: trans now holds T * S
//The scale matrix S on its own, applied to (1, 0, 0, 1), would give (0.5, 0, 0, 1):
//0.5 0.0 0.0 0.0     0.5 * 1 = 0.5
//0.0 0.5 0.0 0.0     0.5 * 0 = 0
//0.0 0.0 0.5 0.0     0.5 * 0 = 0
//0.0 0.0 0.0 1.0     1.0 * 1 = 1
trans = glm::scale(trans, glm::vec3(0.5f, 0.5f, 0.5f));

//Rotate 90 degrees around the z-axis (optional, left commented out here)
//trans = glm::rotate(trans, glm::radians(90.0f), glm::vec3(0.0f, 0.0f, 1.0f));

glm::vec4 vec(1.0f, 0.0f, 0.0f, 1.0f); //w is 1, so this is a position and the translation takes effect

//The combined matrix trans = T * S, applied to (1, 0, 0, 1):
//0.5 0.0 0.0 1.0     0.5 * 1 + 1.0 * 1 = 1.5
//0.0 0.5 0.0 1.0     0.5 * 0 + 1.0 * 1 = 1
//0.0 0.0 0.5 0.0     0.5 * 0           = 0
//0.0 0.0 0.0 1.0     1.0 * 1           = 1
vec = trans * vec;
std::cout << vec.x << "," << vec.y << "," << vec.z << std::endl;

//Right multiplication is read from right to left: the vector is scaled first, then translated
//Final output =====>> 1.5,1,0

A vertex passes through several different coordinate spaces before it is finally turned into a fragment on screen

What are the different coordinate systems?
1. Local Space (also known as Object Space)
2. World Space
3. View Space (also known as Eye Space)
4. Clip Space
5. Screen Space

(Diagram: local space → world space → view space → clip space → screen space)

What needs attention:
The projection matrix used to reach clip space comes in two forms:
a) Orthographic Projection Matrix
b) Perspective Projection Matrix

OpenGL expects the visible coordinates to end up in NDC space after each run of the vertex shader.
However, the shader does not do this itself: OpenGL automatically performs clipping and perspective division (dividing by w), and then the viewport transform (which depends on the screen resolution).

So the gl_Position the vertex shader returns is actually in the clip space described above.
The formula is as follows:

V_clip = M_projection * M_view * M_model * V_local

V represents the vertex and M a matrix; since this is right multiplication, the formula is read from right to left.
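
A minimal sketch of that math, reproducing by hand what OpenGL does after gl_Position is written (the helper name toNdc is purely illustrative):

#include <glm/glm.hpp>

// Multiply by the MVP matrices to get clip space, then divide by w to get NDC;
// OpenGL performs this division (and the later viewport transform) automatically.
glm::vec3 toNdc(const glm::mat4 &projection, const glm::mat4 &view,
                const glm::mat4 &model, const glm::vec3 &localPos)
{
    glm::vec4 clip = projection * view * model * glm::vec4(localPos, 1.0f);
    return glm::vec3(clip) / clip.w; // divide x, y and z by w
}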

How is the code for drawing a 3D object implemented?
Assume for now that there is no camera and the viewpoint is fixed; define the MVP matrices yourself.

1. Model matrix
Combines translation, scaling and rotation; it is applied to all of an object's vertices to transform them from local space into the global world space

//Define the model matrix
glm::mat4 model = glm::mat4(1.0f);
//Translate (cubePositions[i] comes from the render loop further below)
model = glm::translate(model, cubePositions[i]);
//Rotate over time around the (1.0, 0.3, 0.5) axis; glm::radians needs a floating-point argument
model = glm::rotate(model, time * glm::radians(20.0f), glm::vec3(1.0f, 0.3f, 0.5f));

2. View matrix
Move the whole scene slightly backwards so that the object becomes visible (equivalently, the camera moves back and the object ends up in front of it).
OpenGL uses a right-handed coordinate system (Right-handed System), so "backwards" is the negative z direction; the camera's own movement has to be inverted and applied to the scene.

//Define view matrix
glm::mat4 view = glm::mat4(1.0f);
view = glm::translate(view, glm::vec3(0.0f, 0.0f, -3.0f));
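
The same view matrix can also be read as the inverse of a camera transform. A rough sketch, reusing the GLM headers included earlier (camera and viewFromCamera are purely illustrative names):

//Placing a virtual camera at z = +3 and inverting its transform
//gives the same matrix as translating the scene by (0, 0, -3).
glm::mat4 camera = glm::translate(glm::mat4(1.0f), glm::vec3(0.0f, 0.0f, 3.0f));
glm::mat4 viewFromCamera = glm::inverse(camera); //equal to the view matrix above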

3. Projection matrix
Take the perspective projection as the example (the orthographic projection is simpler, since it needs no perspective division).
Both are conveniently wrapped by GLM.

//Define the projection matrix
glm::mat4 projection = glm::mat4(1.0f);
//glm::perspective creates a perspective projection matrix
// Parameter 1: FOV in radians; 45 degrees is the usual choice and looks close to reality
// Parameter 2: aspect ratio (width / height)
// Parameter 3: near clipping plane
// Parameter 4: far clipping plane
projection = glm::perspective(glm::radians(45.0f), (float)SCR_WIDTH / (float)SCR_HEIGHT, 0.1f, 100.0f);
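
For comparison, the orthographic projection mentioned above can be built with glm::ortho; the bounds below are just example values reusing SCR_WIDTH and SCR_HEIGHT:

//Orthographic projection: a box-shaped view volume, no perspective division needed (w stays 1)
// Parameters: left, right, bottom, top, near plane, far plane
glm::mat4 orthoProjection = glm::ortho(0.0f, (float)SCR_WIDTH, 0.0f, (float)SCR_HEIGHT, 0.1f, 100.0f);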

4. Pass the matrices to the vertex shader

//Helper function added to Shader.cpp
void Shader::setM4(const std::string &name, const glm::mat4 &m4) const
{
    auto location = glGetUniformLocation(ID, name.c_str());
    //Upload the matrix value
    // Parameter 1: uniform location
    // Parameter 2: how many matrices
    // Parameter 3: whether to transpose (GLM is already column-major, so GL_FALSE)
    // Parameter 4: pointer to the matrix data (&m4[0][0], equivalent to glm::value_ptr(m4))
    glUniformMatrix4fv(location, 1, GL_FALSE, &m4[0][0]);
}

//Set each matrix through the helper (i.e. through glUniformMatrix4fv)
testShader.setM4("model", model);
testShader.setM4("view", view);
testShader.setM4("projection", projection);

5. The shader performs the coordinate transformation

#version 330 core

layout (location=0) in vec3 aPos;      //attribute 0 is the vertex position
layout (location=1) in vec2 aTexcoord; //attribute 1 is the UV coordinate

uniform mat4 model;      //model matrix
uniform mat4 view;       //view matrix
uniform mat4 projection; //projection matrix

out vec2 Texcoord; //UV passed on to the fragment shader

void main()
{
    //Right-multiply by the transformation matrices (applied right to left: model, then view, then projection)
    vec4 pos = projection * view * model * vec4(aPos, 1.0f);

    //Output the clip-space coordinates
    gl_Position = pos;

    //Pass the UV through
    Texcoord = aTexcoord;
}

6. Draw a cube

//Cube vertices: 36 vertices (6 faces × 2 triangles × 3 vertices), each laid out as x, y, z, u, v
float vertices[] = {
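// face at z = -0.5 (back)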
-0.5f, -0.5f, -0.5f, 0.0f, 0.0f,
0.5f, -0.5f, -0.5f, 1.0f, 0.0f,
0.5f, 0.5f, -0.5f, 1.0f, 1.0f,
0.5f, 0.5f, -0.5f, 1.0f, 1.0f,
-0.5f, 0.5f, -0.5f, 0.0f, 1.0f,
-0.5f, -0.5f, -0.5f, 0.0f, 0.0f,

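// face at z = +0.5 (front)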
-0.5f, -0.5f, 0.5f, 0.0f, 0.0f,
0.5f, -0.5f, 0.5f, 1.0f, 0.0f,
0.5f, 0.5f, 0.5f, 1.0f, 1.0f,
0.5f, 0.5f, 0.5f, 1.0f, 1.0f,
-0.5f, 0.5f, 0.5f, 0.0f, 1.0f,
-0.5f, -0.5f, 0.5f, 0.0f, 0.0f,

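// face at x = -0.5 (left)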
-0.5f, 0.5f, 0.5f, 1.0f, 0.0f,
-0.5f, 0.5f, -0.5f, 1.0f, 1.0f,
-0.5f, -0.5f, -0.5f, 0.0f, 1.0f,
-0.5f, -0.5f, -0.5f, 0.0f, 1.0f,
-0.5f, -0.5f, 0.5f, 0.0f, 0.0f,
-0.5f, 0.5f, 0.5f, 1.0f, 0.0f,

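// face at x = +0.5 (right)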
0.5f, 0.5f, 0.5f, 1.0f, 0.0f,
0.5f, 0.5f, -0.5f, 1.0f, 1.0f,
0.5f, -0.5f, -0.5f, 0.0f, 1.0f,
0.5f, -0.5f, -0.5f, 0.0f, 1.0f,
0.5f, -0.5f, 0.5f, 0.0f, 0.0f,
0.5f, 0.5f, 0.5f, 1.0f, 0.0f,

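// face at y = -0.5 (bottom)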
-0.5f, -0.5f, -0.5f, 0.0f, 1.0f,
0.5f, -0.5f, -0.5f, 1.0f, 1.0f,
0.5f, -0.5f, 0.5f, 1.0f, 0.0f,
0.5f, -0.5f, 0.5f, 1.0f, 0.0f,
-0.5f, -0.5f, 0.5f, 0.0f, 0.0f,
-0.5f, -0.5f, -0.5f, 0.0f, 1.0f,

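// face at y = +0.5 (top)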
-0.5f, 0.5f, -0.5f, 0.0f, 1.0f,
0.5f, 0.5f, -0.5f, 1.0f, 1.0f,
0.5f, 0.5f, 0.5f, 1.0f, 0.0f,
0.5f, 0.5f, 0.5f, 1.0f, 0.0f,
-0.5f, 0.5f, 0.5f, 0.0f, 0.0f,
-0.5f, 0.5f, -0.5f, 0.0f, 1.0f
};
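
Before glDrawArrays can consume these vertices, the five-floats-per-vertex layout has to be described to OpenGL when the VAO/VBO are created. A minimal setup sketch, assuming an OpenGL context and function loader are already initialized (the names VAO and VBO match the pseudocode further below):

//One-time setup: upload the vertex data and describe the layout
unsigned int VAO, VBO;
glGenVertexArrays(1, &VAO);
glGenBuffers(1, &VBO);

glBindVertexArray(VAO);
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

//location 0: position, 3 floats, stride of 5 floats, offset 0
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);
//location 1: UV, 2 floats, stride of 5 floats, offset of 3 floats
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)(3 * sizeof(float)));
glEnableVertexAttribArray(1);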

//Draw the cube with GL_TRIANGLES: 36 vertices = 6 faces × 2 triangles × 3 vertices
glDrawArrays(GL_TRIANGLES, 0, 36);

With that, the cube can be drawn.

Notes:
1. Depth testing must be enabled in 3D
OpenGL stores all of its depth information in a z-buffer, also called the Depth Buffer.
GLFW creates this buffer automatically; the depth value is stored per fragment (as the fragment's z value).
When a fragment wants to output its color, OpenGL compares its depth value against the z-buffer:
if the current fragment lies behind another fragment it is discarded, otherwise the stored value is overwritten.

//Enable depth testing
glEnable(GL_DEPTH_TEST);

//Clear the depth buffer at the beginning of each rendering loop
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

2. To render multiple objects, only the model matrix needs to change per object

//Multiple cubes: their positions in world space
glm::vec3 cubePositions[] = {
glm::vec3(0.0f, 0.0f, 0.0f),
glm::vec3(2.0f, 5.0f, -15.0f),
glm::vec3(-1.5f, -2.2f, -2.5f),
glm::vec3(-3.8f, -2.0f, -12.3f),
glm::vec3(2.4f, -0.4f, -3.5f),
glm::vec3(-1.7f, 3.0f, -7.5f),
glm::vec3(1.3f, -2.0f, -2.5f),
glm::vec3(1.5f, 2.0f, -2.5f),
glm::vec3(1.5f, 0.2f, -1.5f),
glm::vec3(-1.3f, 1.0f, -1.5f)
};

// Pseudocode, inside the render loop...

//view and projection are the same for every cube
testShader.setM4("view", view);
testShader.setM4("projection", projection);

glBindVertexArray(VAO);
for (unsigned int i = 0; i < 10; i++)
{
    //Model matrix: one per cube
    glm::mat4 model = glm::mat4(1.0f);
    model = glm::translate(model, cubePositions[i]);

    float angle = 20.0f * i;
    model = glm::rotate(model, time * glm::radians(angle), glm::vec3(1.0f, 0.3f, 0.5f));
    testShader.setM4("model", model);

    glDrawArrays(GL_TRIANGLES, 0, 36);
}

Rendering result: the ten cubes at the positions defined above, each rotated by a different angle around the (1.0, 0.3, 0.5) axis.