Implementing camera preview on Android with the Camera2 API and GLSurfaceView

GLSurfaceView and SurfaceView are two view classes Android provides for displaying image content. They differ in how they are implemented and in the scenarios they suit.

  • Implementation method: GLSurfaceView is built on OpenGL ES and renders its content on a dedicated GL thread. SurfaceView provides a drawing surface that is composited separately from the view hierarchy and is typically drawn to with a Canvas from a background thread.
  • Performance: Because GLSurfaceView renders through OpenGL ES, it can fully exploit the GPU, so it generally performs better for complex images and animation. Canvas drawing on a SurfaceView is usually done in software on the CPU, so its performance can be comparatively low.
  • Usage scenarios: GLSurfaceView is the better choice for complex graphics drawing, image processing, or animation, because it exposes the full OpenGL ES pipeline and can be integrated with other OpenGL ES libraries and tools. SurfaceView is more common in simple display scenarios, such as showing pictures or playing video.
  • Usage complexity: GLSurfaceView requires you to write shader programs for rendering and to manage the OpenGL ES context. SurfaceView is comparatively simple: subclass it and implement your own drawing logic.

It should be noted that because GLSurfaceView is built on OpenGL ES, it demands more of the developer: you need to be familiar with OpenGL ES concepts and programming techniques. SurfaceView is easier to use and understand in simple scenarios.

In short, GLSurfaceView is suitable for scenes that require complex graphics rendering and animation, while SurfaceView is suitable for general image display and simple drawing needs. Which class to choose depends on your specific requirements and experience.

  1. Add the camera permission to the AndroidManifest.xml file:

    <uses-permission android:name="android.permission.CAMERA" />
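If the app cannot work without a camera, it is also worth declaring the camera hardware feature alongside the permission, so app stores can filter out devices that lack one (set `android:required="false"` if the camera is optional). This is a suggested addition, not part of the original steps:

```xml
<!-- Declare that the app uses camera hardware; required="true" hides the app
     from devices without a camera on Google Play -->
<uses-feature
    android:name="android.hardware.camera"
    android:required="true" />
```

Note that on API 23+ the runtime permission request is still needed; that part is handled in the activity below.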
    
  2. Create a layout containing a GLSurfaceView for the camera preview:

     <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
        xmlns:app="http://schemas.android.com/apk/res-auto"
        xmlns:tools="http://schemas.android.com/tools"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        tools:context=".CameraActivity">
        <android.opengl.GLSurfaceView
            android:id="@+id/glsurfaceview"
            android:layout_width="match_parent"
            android:layout_height="match_parent" />
    </RelativeLayout>
    
  3. Create a camera preview activity to manage the camera and the OpenGL drawing:

     package com.test.jnitest
     
     import android.Manifest
     import android.content.Context
     import android.content.pm.PackageManager
     import android.graphics.SurfaceTexture
     import android.hardware.camera2.CameraCaptureSession
     import android.hardware.camera2.CameraDevice
     import android.hardware.camera2.CameraManager
     import android.hardware.camera2.CaptureRequest
     import android.opengl.GLSurfaceView
     import android.os.Bundle
     import android.util.Size
     import android.view.Surface
     import android.view.WindowManager
     import androidx.appcompat.app.AppCompatActivity
     import androidx.core.app.ActivityCompat
     import com.test.jnitest.databinding.ActivityCameraBinding
     import java.util.*
     
     class CameraActivity : AppCompatActivity() {
         var mGLSurfaceView: GLSurfaceView? = null
         var mRenderer: CameraRenderer? = null
         var cameraManager: CameraManager? = null
         var mCameraDevice: CameraDevice? = null
         var mCaptureSession: CameraCaptureSession? = null
         var mRequestBuild: CaptureRequest.Builder? = null
         var size = Size(1920, 1080)
         lateinit var mContext: Context
         lateinit var binding: ActivityCameraBinding
         override fun onCreate(savedInstanceState: Bundle?) {
             super.onCreate(savedInstanceState)
             binding = ActivityCameraBinding.inflate(layoutInflater)
             setContentView(binding.root)
             //Set the status bar to be transparent
             window.addFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_STATUS)
             //Set the navigation bar to be transparent
             window.addFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION)
             mContext = this
             mGLSurfaceView = binding.glsurfaceview
             mGLSurfaceView?.setEGLContextClientVersion(2)
             //Create and set up the camera renderer
             mRenderer = CameraRenderer(mGLSurfaceView!!)
             mGLSurfaceView?.setRenderer(mRenderer)
             mGLSurfaceView?.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY)
             // Get the camera manager
             cameraManager = getSystemService(Context.CAMERA_SERVICE) as CameraManager
             if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                 requestPermissions(arrayOf(Manifest.permission.CAMERA), 200)
                 return
             }
             // Open the first available camera; pick a specific ID via
             // CameraCharacteristics (e.g. LENS_FACING) if you need a particular lens
             cameraManager?.cameraIdList?.firstOrNull()?.let {
                 cameraManager?.openCamera(it, mCameraStateCallback, null)
             }
         }
     
         override fun onResume() {
             super.onResume()
             mGLSurfaceView?.onResume()
         }

         override fun onPause() {
             // GLSurfaceView requires onPause() to be paired with onResume()
             mGLSurfaceView?.onPause()
             super.onPause()
         }

         override fun onDestroy() {
             super.onDestroy()
             closeCamera()
         }
         // Camera state callback
         var mCameraStateCallback = object : CameraDevice.StateCallback() {
             override fun onOpened(p0: CameraDevice) {
                 mCameraDevice = p0
                 // Create the preview session targeting the renderer's SurfaceTexture
                 val surfaceTexture = mRenderer?.mSurfaceTexture
                 surfaceTexture?.setDefaultBufferSize(size.width, size.height)
                 val surface = Surface(surfaceTexture)
                 mRequestBuild = mCameraDevice?.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
                 mRequestBuild?.addTarget(surface)
                 val surfaces = listOf(surface)
                 mCameraDevice?.createCaptureSession(surfaces, mCaptureCallback, null)
             }
     
             override fun onDisconnected(p0: CameraDevice) {
                 p0.close()
             }
     
             override fun onError(p0: CameraDevice, p1: Int) {
                 p0.close()
             }
     
         }
         // Capture session state callback
         var mCaptureCallback = object : CameraCaptureSession.StateCallback() {
             override fun onConfigured(p0: CameraCaptureSession) {
                 mCaptureSession = p0
                 mRequestBuild?.build()?.let { mCaptureSession?.setRepeatingRequest(it,null,null) }
             }
     
             override fun onConfigureFailed(p0: CameraCaptureSession) {
                 p0.close()
                 mCaptureSession = null
             }
     
         }
     
         // Close camera
         private fun closeCamera() {
             mCaptureSession?.close()
             mCaptureSession = null
             mCameraDevice?.close()
             mCameraDevice = null
         }
     }
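The activity above hardcodes `size = Size(1920, 1080)`. A real app should query the sizes the camera actually supports (via `CameraCharacteristics` and `StreamConfigurationMap.getOutputSizes()`) and pick the best match for the view. The selection itself is plain arithmetic; here is a sketch using (width, height) pairs so it stays free of Android classes — `chooseBestSize` is an illustrative name, not a framework API:

```kotlin
import kotlin.math.abs

// Hypothetical helper (not part of the tutorial code): pick the camera output
// size whose aspect ratio is closest to the view's, preferring larger areas on ties.
fun chooseBestSize(
    supported: List<Pair<Int, Int>>,  // (width, height) pairs, e.g. from getOutputSizes()
    viewWidth: Int,
    viewHeight: Int
): Pair<Int, Int> {
    val target = viewWidth.toDouble() / viewHeight
    val comparator = compareBy<Pair<Int, Int>> { (w, h) ->
        abs(w.toDouble() / h - target)        // closest aspect ratio wins
    }.thenByDescending { (w, h) -> w * h }    // tie-break: larger area wins
    return supported.minWithOrNull(comparator)
        ?: error("camera reported no output sizes")
}

fun main() {
    val sizes = listOf(1920 to 1080, 1280 to 720, 640 to 480)
    // For a 16:9 view, both 16:9 sizes tie on ratio; the larger one is chosen.
    println(chooseBestSize(sizes, 1920, 1080))  // (1920, 1080)
}
```

The chosen pair would then replace the hardcoded `Size(1920, 1080)` before calling `setDefaultBufferSize()`.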
    
  4. Create a camera renderer: a class that implements GLSurfaceView.Renderer to handle the OpenGL drawing and the interaction with the camera.

     package com.test.jnitest
     
     import android.content.Context
     import android.graphics.SurfaceTexture
     import android.graphics.SurfaceTexture.OnFrameAvailableListener
     import android.opengl.GLES11Ext
     import android.opengl.GLES20
     import android.opengl.GLSurfaceView
     import java.nio.ByteBuffer
     import java.nio.ByteOrder
     import java.nio.FloatBuffer
     import javax.microedition.khronos.egl.EGLConfig
     import javax.microedition.khronos.opengles.GL10
     
     class CameraRenderer(var mGLSurfaceView: GLSurfaceView) : GLSurfaceView.Renderer, OnFrameAvailableListener {
         // Texture ID of the camera image (an external OES texture)
         var textureId: Int = 0
         var mSurfaceTexture: SurfaceTexture? = null
         private val COORDS_PER_VERTEX = 2
         private val TEXTURE_COORDS_PER_VERTEX = 2
         //vertex shader
         var vertexShaderCode = """attribute vec4 a_position;
             attribute vec2 a_textureCoord;
             varying vec2 v_textureCoord;
             void main() {
               gl_Position = a_position;
               v_textureCoord = a_textureCoord;
             }
             """
         // fragment shader
         var fragmentShaderCode = """#extension GL_OES_EGL_image_external : require
             precision mediump float;
             uniform samplerExternalOES u_texture;
             varying vec2 v_textureCoord;
             void main() {
               gl_FragColor = texture2D(u_texture, v_textureCoord);
             }
             """
     
         //Vertex coordinate data, indicating the position and size of the preview image.
         private val VERTEX_COORDS = floatArrayOf(
             -1.0f, -1.0f,
             1.0f, -1.0f,
             -1.0f, 1.0f,
             1.0f, 1.0f
         )
         //Texture coordinate data represents the mapping relationship of the camera image in the preview area.
         private val TEXTURE_COORDS = floatArrayOf(
             0f, 1f,
             1f, 1f,
             0f, 0f,
             1f, 0f
         )
         //The ID of the shader program
         private var programId = 0
         //Handle to vertex attributes
         private var positionHandle = 0
         private var textureCoordHandle = 0
     
         init {
             // No GL context is current on the thread that runs this constructor,
             // so GL calls are not valid here. Create the SurfaceTexture in
             // detached mode (API 26+) and attach it to the GL texture later,
             // on the GL thread, in onSurfaceCreated().
             mSurfaceTexture = SurfaceTexture(false)
             mSurfaceTexture?.setOnFrameAvailableListener(this)
         }

         /**
          * Initialize OpenGL on the GL thread: create the OES texture, attach the
          * SurfaceTexture to it, compile and link the shaders, and look up the
          * vertex-attribute handles.
          */
         override fun onSurfaceCreated(p0: GL10?, p1: EGLConfig?) {
             // Set the clear color to opaque black
             GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f)
             // Create the external OES texture and attach the SurfaceTexture to it
             textureId = createTexture()
             mSurfaceTexture?.attachToGLContext(textureId)
             // Load the vertex shader and fragment shader
             val vertexShader: Int = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode)
             val fragmentShader: Int = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode)
             // Create a shader program and attach both shaders to it
             programId = GLES20.glCreateProgram()
             GLES20.glAttachShader(programId, vertexShader)
             GLES20.glAttachShader(programId, fragmentShader)
             // Link the shader program and check whether the link succeeded
             GLES20.glLinkProgram(programId)
             val linkStatus = IntArray(1)
             GLES20.glGetProgramiv(programId, GLES20.GL_LINK_STATUS, linkStatus, 0)
             if (linkStatus[0] == 0) {
                 throw RuntimeException("Program link failed: " + GLES20.glGetProgramInfoLog(programId))
             }
             // Get the locations of the vertex and texture coordinate attributes
             positionHandle = GLES20.glGetAttribLocation(programId, "a_position")
             textureCoordHandle = GLES20.glGetAttribLocation(programId, "a_textureCoord")
             // Use the shader program
             GLES20.glUseProgram(programId)
         }
     
         override fun onSurfaceChanged(p0: GL10?, p1: Int, p2: Int) {
             // Respond to GLSurfaceView size changes here, e.g. update the viewport
             GLES20.glViewport(0, 0, p1, p2)
         }
     
         /**
          * Draw each frame, perform actual drawing operations here, such as clearing the screen, drawing textures, etc.
          */
         override fun onDrawFrame(p0: GL10?) {
             // Update the texture with the latest camera frame
             mSurfaceTexture?.updateTexImage()
             // Clear the color buffer
             GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT)
             // Set the vertex coordinate attribute and enable it
             GLES20.glVertexAttribPointer(positionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, 0, floatBufferFromArray(VERTEX_COORDS))
             GLES20.glEnableVertexAttribArray(positionHandle)
             // Set the texture coordinate attribute and enable it
             GLES20.glVertexAttribPointer(textureCoordHandle, TEXTURE_COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, 0, floatBufferFromArray(TEXTURE_COORDS))
             GLES20.glEnableVertexAttribArray(textureCoordHandle)
             // Activate texture unit 0 and bind the camera texture to the external OES target
             GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
             GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId)
             // Draw the quad as a triangle strip
             GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, VERTEX_COORDS.size / COORDS_PER_VERTEX)
         }
     
         /**
          * Create camera texture
          */
         private fun createTexture(): Int {
             //Create an array to store texture IDs
             val textureIds = IntArray(1)
             // Generate a texture object and store the texture ID into an array
             GLES20.glGenTextures(1, textureIds, 0)
             // Bind the current texture to the OpenGL ES texture target (external OES texture)
             GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureIds[0])
             //Set the wrapping mode of the texture S axis to GL_CLAMP_TO_EDGE, that is, the texture coordinates beyond the boundary will be intercepted to the texels on the boundary
             GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE)
             //Set the wrapping mode of the texture T axis to GL_CLAMP_TO_EDGE
             GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE)
             //Set the texture reduction filter to GL_NEAREST, which uses nearest neighbor sampling for texture reduction.
             GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST)
             //Set the texture amplification filter to GL_NEAREST
             GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST)
             return textureIds[0]
         }
         /**
          * Load the shader, accept the shader type and shader code as parameters, and return the ID of the compiled shader object
          * @param type shader type, such as GLES20.GL_VERTEX_SHADER or GLES20.GL_FRAGMENT_SHADER
          * @param shaderCode shader code
          * @return shader ID
          */
         private fun loadShader(type: Int, shaderCode: String): Int {
             // Create a new shader object
             val shader = GLES20.glCreateShader(type)
             // Load the shader source into the shader object and compile it
             GLES20.glShaderSource(shader, shaderCode)
             GLES20.glCompileShader(shader)
             // Check the compile status; a silently broken shader renders nothing
             val compiled = IntArray(1)
             GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0)
             if (compiled[0] == 0) {
                 throw RuntimeException("Shader compile failed: " + GLES20.glGetShaderInfoLog(shader))
             }
             return shader
         }
     
         private fun floatBufferFromArray(array: FloatArray): FloatBuffer {
             // Vertex data must live in a direct buffer in native byte order
             val byteBuffer: ByteBuffer = ByteBuffer.allocateDirect(array.size * 4)
             byteBuffer.order(ByteOrder.nativeOrder())
             val floatBuffer: FloatBuffer = byteBuffer.asFloatBuffer()
             floatBuffer.put(array)
             floatBuffer.position(0)
             return floatBuffer
         }
     
         override fun onFrameAvailable(p0: SurfaceTexture?) {
             // Called back when a new frame is available from the camera, some processing can be done here
             mGLSurfaceView.requestRender()
         }
     }
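A note on `floatBufferFromArray()`: `GLES20.glVertexAttribPointer` reads vertex data straight from native memory, so the buffer must be allocated direct and in native byte order, and it must be rewound to position 0 before the draw call. (Since `onDrawFrame()` allocates new buffers on every frame, caching the two buffers as fields would avoid per-frame garbage.) The pattern can be checked with plain java.nio, no GL required — `packFloats` below is an illustrative stand-in for the tutorial's helper:

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.FloatBuffer

// Same pattern as floatBufferFromArray: direct allocation, native byte order,
// rewind to position 0 so consumers read from the first element.
fun packFloats(array: FloatArray): FloatBuffer =
    ByteBuffer.allocateDirect(array.size * 4)   // 4 bytes per float
        .order(ByteOrder.nativeOrder())         // match the native/driver byte order
        .asFloatBuffer()
        .apply { put(array); position(0) }

fun main() {
    val buf = packFloats(floatArrayOf(-1f, -1f, 1f, -1f))
    println(buf.isDirect)     // true: direct buffers are required by GLES pointer calls
    println(buf.position())   // 0: rewound, ready to be read from the start
    println(buf.get(2))       // 1.0
}
```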
    

Through the above steps, you have a working camera preview built on the Camera2 API and GLSurfaceView. CameraActivity opens the camera through the Camera2 API and creates a preview capture session, the camera delivers frames into the SurfaceTexture owned by CameraRenderer, and CameraRenderer draws each preview frame's texture in its onDrawFrame() method.
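As a closing detail, the quad is drawn with GL_TRIANGLE_STRIP: a strip of n vertices yields n − 2 triangles, each new vertex forming a triangle with the previous two, so the four corner vertices in VERTEX_COORDS cover the screen with exactly two triangles. The enumeration rule is easy to verify without any GL context (`stripTriangles` is just an illustration, not a GL API):

```kotlin
// For GL_TRIANGLE_STRIP, triangle i uses vertices (i, i+1, i+2).
// (GL also alternates the winding of every other triangle; the vertex
// membership shown here is what matters for screen coverage.)
fun stripTriangles(vertexCount: Int): List<Triple<Int, Int, Int>> =
    (0..vertexCount - 3).map { i -> Triple(i, i + 1, i + 2) }

fun main() {
    // The 4 corners of the preview quad produce exactly 2 triangles.
    println(stripTriangles(4))   // [(0, 1, 2), (1, 2, 3)]
}
```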