Android audio and video development – implementing orthographic projection in OpenGL ES

This article shares the code of an OpenGL ES orthographic projection demo for your reference. The details are as follows.

Draw a square

The hexagon we drew at the beginning seemed easy enough, and it worked without problems. Next, let's forget the earlier hexagon code and draw a simple square based on our own understanding.

As I understand it, displaying a square in the middle of the screen should look like the figure below.

[Figure: the intended square, centered on the screen]

The vertex data we should create is shown in the figure below.

[Figure: the four vertex coordinates of the square]

That is, the vertex data passed to the rendering pipeline is as follows:

float[] vertexArray = new float[] {
   -0.5f, -0.5f, 0f,
    0.5f, -0.5f, 0f,
   -0.5f,  0.5f, 0f,
    0.5f,  0.5f, 0f
  };

So the code looks roughly like this (code irrelevant to the topic is omitted). The square is filled with a solid color specified in the fragment shader, and the usual series of matrix transformations is also omitted: the vertex shader passes the vertices straight through to the rendering pipeline, and the fragment shader sets a fixed red color for every fragment.

Rectangle.java

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import android.opengl.GLES20;

public class Rectangle {
 private FloatBuffer mVertexBuffer;
 private int mProgram;
 private int mPositionHandle;

 public Rectangle(float r) {
  initVertexData(r);
 }

 public void initVertexData(float r) {
  // Initialize vertex coordinates
  float[] vertexArray = new float[] {
   -0.5f, -0.5f, 0f,
    0.5f, -0.5f, 0f,
   -0.5f,  0.5f, 0f,
    0.5f,  0.5f, 0f
  };

  // 4 bytes per float, in native byte order
  ByteBuffer buffer = ByteBuffer.allocateDirect(vertexArray.length * 4);
  buffer.order(ByteOrder.nativeOrder());
  mVertexBuffer = buffer.asFloatBuffer();
  mVertexBuffer.put(vertexArray);
  mVertexBuffer.position(0);

  int vertexShader = loaderShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
  int fragmentShader = loaderShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);

  mProgram = GLES20.glCreateProgram();
  GLES20.glAttachShader(mProgram, vertexShader);
  GLES20.glAttachShader(mProgram, fragmentShader);
  GLES20.glLinkProgram(mProgram);

  mPositionHandle = GLES20.glGetAttribLocation(mProgram, "aPosition");
 }

 public void draw() {
  GLES20.glUseProgram(mProgram);
  // Pass the vertex data to the pipeline (vertex shader)
  GLES20.glVertexAttribPointer(mPositionHandle, 3, GLES20.GL_FLOAT, false, 0, mVertexBuffer);
  GLES20.glEnableVertexAttribArray(mPositionHandle);
  // Draw the primitives: a triangle strip of 4 vertices forms the rectangle
  GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
 }

 private int loaderShader(int type, String shaderCode) {
  int shader = GLES20.glCreateShader(type);
  GLES20.glShaderSource(shader, shaderCode);
  GLES20.glCompileShader(shader);
  return shader;
 }

 private String vertexShaderCode = "attribute vec3 aPosition;"
    + "void main(){"
    + "gl_Position = vec4(aPosition,1.0);"
    + "}";

 // Fixed fragment color: opaque red
 private String fragmentShaderCode = "precision mediump float;"
    + "void main(){"
    + "gl_FragColor = vec4(1.0,0.0,0.0,1.0);"
    + "}";

}

RectangleView.java

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.content.Context;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;

public class RectangleView extends GLSurfaceView {

 public RectangleView(Context context) {
  super(context);
  setEGLContextClientVersion(2); // use OpenGL ES 2.0
  setRenderer(new MyRender());
 }

 class MyRender implements GLSurfaceView.Renderer {
  private Rectangle rectangle;

  @Override
  public void onSurfaceCreated(GL10 gl, EGLConfig config) {
   GLES20.glClearColor(0.5f, 0.5f, 0.5f, 1);
   rectangle = new Rectangle(0.5f);
   GLES20.glEnable(GLES20.GL_DEPTH_TEST);
  }

  @Override
  public void onSurfaceChanged(GL10 gl, int width, int height) {
   GLES20.glViewport(0, 0, width, height);
  }

  @Override
  public void onDrawFrame(GL10 gl) {
   GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
   rectangle.draw();
  }
 }

}

The resulting effect is shown below. In fact, these are not the actual screen coordinates: what we drew above is expressed in normalized device coordinates, which are mapped to the physical phone screen by a formula we will meet later.

[Figure: the actual render, stretched into a rectangle]

The actual effect differs from what I imagined. My intention was to display a square, but the result is a rectangle stretched along the y-axis, and the situation is similar in landscape orientation. Interestingly, if you place coordinate axes at the center of the screen, you will find that the four vertices of this rectangle sit at the midpoints of the x and y ranges of the axes, which span [-1,1].

In fact, every object to be displayed is mapped onto the phone screen, into the [-1,1] range on the x, y, and z axes. Coordinates in this range are called normalized device coordinates, and they are independent of the actual size and shape of the screen.

Under such rules it is awkward to draw a square, because we must account for the phone's aspect ratio, which complicates the data we pass in: we can no longer think purely from the point of view of the object being drawn. That is, to draw a square in the example above, the y-coordinates of the incoming vertex data must be scaled by the aspect ratio; for a 16:9 portrait screen, multiplying each y-coordinate by 9/16 would do. But you will then find that in landscape orientation it is the x-coordinates that must be adjusted instead. Clearly this is not a good solution; a minimal sketch of the manual workaround follows.
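For illustration only, a sketch of that manual scaling, assuming a 1080×1920 portrait surface (the hard-coded ratio is exactly what makes this approach fragile):

// Pre-scale the y-coordinates by width/height = 9/16 so the square
// survives the stretch into normalized device coordinates.
// Hard-coding the ratio breaks as soon as the orientation changes.
float ratio = 1080f / 1920f; // = 9/16 on this particular screen
float[] vertexArray = new float[] {
   -0.5f, -0.5f * ratio, 0f,
    0.5f, -0.5f * ratio, 0f,
   -0.5f,  0.5f * ratio, 0f,
    0.5f,  0.5f * ratio, 0f
};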

Introducing projection

In fact, every object has its own coordinates. This space is called object space: the coordinate space used when the object is designed, with the object's geometric center at the coordinate origin. After normalization the coordinates lie within [-1,1], with the same scale on the x and y axes.

When objects in this space are drawn directly into the phone screen's normalized coordinates, the screen's aspect ratio distorts the expected result. So we only need to apply a mapping to the object-space coordinates.

Orthographic projection solves exactly this problem, via Matrix.orthoM:

public static void orthoM(float[] m, int mOffset,
  float left, float right, float bottom, float top,
  float near, float far)

The math behind orthographic projection

The matrix generated by orthoM maps every point between left and right, bottom and top, and near and far into normalized device coordinates.

The meaning of each parameter is shown in the figure

[Figure: the orthoM parameters defining the viewing volume]

Orthographic projection is a parallel projection: the projection lines are parallel, and its viewing volume is a cuboid. Only objects whose coordinates lie inside the viewing volume are visible, and the part of the object projected onto the near plane of the viewing volume is what finally appears in the screen's viewport (more on the viewport later).

The following matrix will be generated. The negative sign on the z term inverts the z-coordinate, because normalized device coordinates form a left-handed system while the coordinate systems in OpenGL ES are right-handed. This also involves the w component of the vertex coordinates, which we won't cover yet.

[Figure: the orthographic projection matrix generated by orthoM]
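For reference, this is the standard OpenGL orthographic projection matrix, writing $l, r, b, t, n, f$ for left, right, bottom, top, near, far; note the $-\frac{2}{f-n}$ entry that performs the z flip just mentioned:

$$
M_{ortho} =
\begin{pmatrix}
\frac{2}{r-l} & 0 & 0 & -\frac{r+l}{r-l} \\
0 & \frac{2}{t-b} & 0 & -\frac{t+b}{t-b} \\
0 & 0 & -\frac{2}{f-n} & -\frac{f+n}{f-n} \\
0 & 0 & 0 & 1
\end{pmatrix}
$$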

Using this matrix, coordinates in the [-1,1] range of object space can be mapped into the [-1,1] range of the screen's normalized device coordinates. The origin of normalized device coordinates is at the center of the screen, with x positive to the right and y positive upward (after the z flip above, z points into the screen, making it a left-handed system). Take portrait orientation as an example and set left=-1, right=1, bottom=-height/width, top=height/width. My phone's resolution is 1080×1920, so height/width ≈ 1.8, and the square's vertex (0.5, 0.5) above is mapped to roughly (0.5, 0.28).
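The y-coordinate is simply divided by top, which is what the $\frac{2}{t-b}$ entry of the matrix reduces to when $b = -t$:

$$
y_{ndc} = \frac{2y}{t-b} = \frac{y}{top} = \frac{0.5}{1.8} \approx 0.28
$$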

[Figure: the square rendered correctly after orthographic projection]

In the screen's normalized device coordinates this no longer looks square, but on the physical screen it does: the y-axis range [-1,1] spans a visibly longer physical distance than the x-axis range, so a y extent of 0.28 corresponds to the same physical length as an x extent of 0.5.

The code above needs the following modifications. First, add this code in onSurfaceChanged:

@Override
  public void onSurfaceChanged(GL10 gl, int width, int height) {
   GLES20.glViewport(0, 0, width, height);
   // Set the projection matrix according to the screen orientation.
   // Requires android.opengl.Matrix; mProjectionMatrix is a float[16] field of the renderer.
   float ratio = width > height ? (float) width / height : (float) height / width;
   if (width > height) {
    // landscape
    Matrix.orthoM(mProjectionMatrix, 0, -ratio, ratio, -1, 1, 0, 5);
   } else {
    // portrait
    Matrix.orthoM(mProjectionMatrix, 0, -1, 1, -ratio, ratio, 0, 5);
   }
  }

Then multiply the vertex by the projection matrix in the vertex shader

private String vertexShaderCode = "uniform mat4 uProjectionMatrix;" // add this line
    + "attribute vec3 aPosition;"
    + "void main(){"
    + "gl_Position = uProjectionMatrix * vec4(aPosition,1.0);" // multiply by the projection matrix instead of assigning directly
    + "}";

Finally, add the code that fetches uProjectionMatrix from the shader and passes in the matrix value, as sketched below. Whether the screen is in portrait or landscape, the final effect is the square we expect.
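The article does not show that step, so here is a minimal sketch, assuming the renderer's projection matrix is handed to Rectangle (e.g., via a setter) and stored in a float[16] field mProjectionMatrix; the handle name mProjectionMatrixHandle is likewise illustrative:

// In initVertexData(), after glLinkProgram:
mProjectionMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uProjectionMatrix");

// In draw(), after glUseProgram and before glDrawArrays:
GLES20.glUniformMatrix4fv(mProjectionMatrixHandle, 1, false, mProjectionMatrix, 0);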

Camera Settings

One thing to add: the near and far parameters above are distances from the viewpoint. We have not touched the viewpoint so far; it is the position of the camera, just like viewing an object through a camera in real life. Photos of the same object shot from different angles and positions are certainly different, and the camera position is specified with the setLookAtM function.

public static void setLookAtM(float[] rm, // the generated camera (view) matrix
        int rmOffset,
        float eyeX, float eyeY, float eyeZ,          // camera position
        float centerX, float centerY, float centerZ, // position of the observed target point;
                                                     // together with the camera position this determines the viewing direction
        float upX, float upY, float upZ              // components of the up vector on the x, y, z axes,
                                                     // which should generally be perpendicular to the viewing direction
       )

The viewing volume determined above is tied to the camera position and viewing direction specified by this function. The camera's default position is (0,0,0). Under the settings above, if the corrected square is translated by 1 unit along the positive z-axis, it will no longer be displayed on the screen, because it has moved outside the configured viewing volume.
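As a minimal sketch (field names assumed), an explicit camera matching that default can be created with setLookAtM and combined with the projection matrix; the combined matrix is then what gets passed to the shader:

float[] mViewMatrix = new float[16];
float[] mMVPMatrix = new float[16];

// Camera at the origin, looking down the negative z-axis, with y as up
Matrix.setLookAtM(mViewMatrix, 0,
        0f, 0f, 0f,    // eye position
        0f, 0f, -1f,   // target point
        0f, 1f, 0f);   // up vector

// Combined matrix: projection * view
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);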

Be careful with the camera parameters and the projection near and far parameters; they must not be set arbitrarily. The camera's position and direction together with the projection matrix determine the final placement of the viewing volume, and if they are set improperly the object may fall outside the viewing volume and will not be displayed on the screen.

Viewport

As mentioned earlier, objects in the viewing volume are projected onto the near plane and finally displayed in the viewport, which is set in onSurfaceChanged.

public static native void glViewport(
  int x,
  int y,
  int width,
  int height
 );

The meaning of each parameter in the viewport

[Figure: the meaning of the glViewport parameters]

The origin of the screen coordinate system used by the viewport is not at the upper-left corner of the screen but at the lower-left, with x pointing right and y pointing up. Strictly speaking, the viewport origin is at the lower-left corner of the View, because a GLSurfaceView does not always occupy the whole screen.
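A quick, purely illustrative way to see this convention is to shrink the viewport in onSurfaceChanged; with the call below the scene renders in the bottom-left quarter of the View, confirming that (0,0) is the lower-left corner:

// Render into the lower-left quarter of the surface (illustrative only)
GLES20.glViewport(0, 0, width / 2, height / 2);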

That is the whole content of this article; I hope it is helpful to your study.