How clearing the camera's last preview frame works when using TextureView

Recently, while helping with a camera-related project, I ran into an interesting problem; here are my notes on it.

The original problem: when previewing the camera with a TextureView, the last frame stays on screen after the camera is closed and needs to be cleared. My first approach was to lock the Surface's Canvas and paint the whole canvas black to clear the screen:

Canvas canvas = mPreviewSurface.lockCanvas(null);
canvas.drawColor(Color.BLACK);
mPreviewSurface.unlockCanvasAndPost(canvas);

But I ran into the following problems:

  1. Calling lockCanvas before CameraDevice.close throws an IllegalArgumentException.
  2. After CameraDevice.close, lockCanvas can clear the screen, but opening the camera again and calling CameraDevice.createCaptureSession fails, calling back onConfigureFailed.

After searching online, I found fadden's explanation on Stack Overflow:

You can't do this, due to a limitation of the Android app framework (as of Android 4.4 at least).

The SurfaceTexture that underlies the TextureView is a buffer consumer. The MediaPlayer is one example of a buffer producer, Canvas is another. Once you attach a producer, you have to detach it before you can attach a second producer.

The trouble is that there is no way to detach a software-based (Canvas) buffer producer. There could be, but isn't. So once you draw with Canvas, you're stuck. (There's a note to that effect here.)

You can detach a GLES producer. For example, in one of Grafika's video player classes you can find a clearSurface() method that clears the surface to black using GLES. Note the EGL context and window are created and explicitly released within the scope of the method. You could expand the method to show an image instead.

The gist is that the TextureView, as an image consumer, can be bound to different image producers (Canvas is one; MediaPlayer and the camera can also act as producers). Once one producer is connected, no other producer can connect, and Canvas is a rather crude producer that offers no way to unbind. So once a TextureView has been bound to Canvas, MediaPlayer and the camera can no longer use that Surface to display images.

The solution he suggested was to follow Grafika's example and use OpenGL to do the clearing.
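
For reference, below is a minimal sketch of that approach, my own rough adaptation rather than Grafika's exact clearSurface code, using EGL14/GLES20 (from android.opengl) to draw a single black frame into the Surface and then tear the EGL objects down. As we'll see later, it is the eglDestroySurface call at the end that disconnects the GLES producer, so the camera can connect to the Surface again afterwards.

// Rough sketch: clear a Surface to black with GLES, then release the EGL objects.
// Uses android.opengl.{EGL14, EGLConfig, EGLContext, EGLDisplay, EGLSurface, GLES20}
// and android.view.Surface.
private void clearSurface(Surface surface) {
    // Set up a throwaway EGL display/context/window surface just for one frame.
    EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
    int[] version = new int[2];
    EGL14.eglInitialize(display, version, 0, version, 1);

    int[] configAttribs = {
            EGL14.EGL_RED_SIZE, 8, EGL14.EGL_GREEN_SIZE, 8, EGL14.EGL_BLUE_SIZE, 8,
            EGL14.EGL_ALPHA_SIZE, 8,
            EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
            EGL14.EGL_NONE };
    EGLConfig[] configs = new EGLConfig[1];
    int[] numConfigs = new int[1];
    EGL14.eglChooseConfig(display, configAttribs, 0, configs, 0, 1, numConfigs, 0);

    int[] contextAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
    EGLContext context = EGL14.eglCreateContext(
            display, configs[0], EGL14.EGL_NO_CONTEXT, contextAttribs, 0);
    EGLSurface eglSurface = EGL14.eglCreateWindowSurface(
            display, configs[0], surface, new int[] { EGL14.EGL_NONE }, 0);

    // Connect as the GLES producer, draw one black frame and post it.
    EGL14.eglMakeCurrent(display, eglSurface, eglSurface, context);
    GLES20.glClearColor(0f, 0f, 0f, 1f);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    EGL14.eglSwapBuffers(display, eglSurface);

    // Destroying the EGL window surface is what disconnects the producer again,
    // so a later camera connection on this Surface can succeed.
    EGL14.eglMakeCurrent(display, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
            EGL14.EGL_NO_CONTEXT);
    EGL14.eglDestroySurface(display, eglSurface);
    EGL14.eglDestroyContext(display, context);
    EGL14.eglTerminate(display);
}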

Producer-consumer model

The producer-consumer model is an important part of the Android graphics system. The introduction in the official documentation gives a rough picture of the whole workflow:

[Figure: BufferQueue workflow (bufferqueue.png), from the official Android graphics documentation]

  • Producers such as the camera, video decoders, OpenGL ES, and Canvas call dequeue to obtain an empty Buffer from the BufferQueue and draw into it; when drawing is finished they call queue to return the filled Buffer to the BufferQueue.
  • Consumers such as SurfaceFlinger call acquire to obtain a filled Buffer from the BufferQueue and render it; when rendering is finished they call release to return the Buffer to the BufferQueue as an empty Buffer.

Canvas canvas = mPreviewSurface.lockCanvas(null);
canvas.drawColor(Color.BLACK);
mPreviewSurface.unlockCanvasAndPost(canvas);

Take the lockCanvas call above as an example. In the code, the producer is concretely the IGraphicBufferProducer interface, which is passed in when the Surface is constructed and hooked up in Surface::connect:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/Surface.cpp
Surface::Surface(const sp<IGraphicBufferProducer>& bufferProducer, bool controlledByApp,
                 const sp<IBinder>& surfaceControlHandle)
      : mGraphicBufferProducer(bufferProducer),
      ...

int Surface::connect(
        int api, const sp<IProducerListener>& listener, bool reportBufferRemoval) {
    ...
    int err = mGraphicBufferProducer->connect(listener, api, mProducerControlledByApp, &output);
    ...
}

Surface.lockCanvas then calls nativeLockCanvas in the native layer, which uses Surface::lock to dequeueBuffer a Buffer and hand it to the Canvas for drawing:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/base/core/jni/android_view_Surface.cpp
static jlong nativeLockCanvas(JNIEnv* env, jclass clazz,
        jlong nativeObject, jobject canvasObj, jobject dirtyRectObj) {
    ANativeWindow_Buffer buffer;
    status_t err = surface->lock(&buffer, dirtyRectPtr);
    ...
    graphics::Canvas canvas(env, canvasObj);
    canvas.setBuffer(&buffer, static_cast<int32_t>(surface->getBuffersDataSpace()));
    ...
}

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/Surface.cpp
status_t Surface::lock(
        ANativeWindow_Buffer* outBuffer, ARect* inOutDirtyBounds)
{
    ...
    status_t err = dequeueBuffer(&out, &fenceFd);
    ...
}

int Surface::dequeueBuffer(android_native_buffer_t** buffer, int* fenceFd) {
    ...
    status_t result = mGraphicBufferProducer->dequeueBuffer(&buf, &fence, dqInput.width,
                                                            dqInput.height, dqInput.format,
                                                            dqInput.usage, &mBufferAge,
                                                            dqInput.getTimestamps ?
                                                                    &frameTimestamps : nullptr);
    ...
}

Surface.unlockCanvasAndPost calls nativeUnlockCanvasAndPost in the native layer, which calls Surface::unlockAndPost to queueBuffer:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/base/core/jni/android_view_Surface.cpp
static void nativeUnlockCanvasAndPost(JNIEnv* env, jclass clazz,
        jlong nativeObject, jobject canvasObj) {
    ...
    //detach the canvas from the surface
    graphics::Canvas canvas(env, canvasObj);
    canvas.setBuffer(nullptr, ADATASPACE_UNKNOWN);

    // unlock surface
    status_t err = surface->unlockAndPost();
    ...
}

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/Surface.cpp
status_t Surface::unlockAndPost()
{
    ...
    err = queueBuffer(mLockedBuffer.get(), fd);
    ...
}

int Surface::queueBuffer(android_native_buffer_t* buffer, int fenceFd) {
    ...
    status_t err = mGraphicBufferProducer->queueBuffer(i, input, &output);
    ...
}

In this way, the loop of lockCanvas, drawing on the Canvas, and unlockCanvasAndPost keeps feeding images to the consumer (SurfaceFlinger) for rendering.
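
As a concrete (hypothetical) illustration of that loop from the app's point of view, with the matching BufferQueue operations noted in comments:

// Each iteration dequeues a buffer, draws into it, and queues it back so the
// consumer (SurfaceFlinger) can pick it up.
void drawBlackFrames(Surface surface, int frames) {
    for (int i = 0; i < frames && surface.isValid(); i++) {
        Canvas canvas = surface.lockCanvas(null);   // Surface::lock -> dequeueBuffer
        canvas.drawColor(Color.BLACK);              // draw into the dequeued buffer
        surface.unlockCanvasAndPost(canvas);        // Surface::unlockAndPost -> queueBuffer
    }
}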

Although the general cause and the solution were now clear, I still had three questions:

  1. Why can lockCanvas only be called after CameraDevice.close? Does that mean the unbinding happens inside CameraDevice.close?
  2. Why doesn't calling unlockCanvasAndPost unbind the Canvas producer?
  3. GLES can be unbound, but how does that unbinding actually happen?

Why lockCanvas only works after CameraDevice.close

I couldn't find the answers online, so I had to dig into the source code myself. Let's start with the log from the createCaptureSession failure after unlockCanvasAndPost and see whether it offers any clues:

11-04 18:55:13.130 28137 25285 E BufferQueueProducer: [SurfaceTexture-0-28137-0](id:6de900000001,api:2,p:28137,c:28137) connect: already connected (cur=2 req=4)
11-04 18:55:13.130 1905 8873 E Camera3-OutputStream: configureConsumerQueueLocked: Unable to connect to native window for stream 0
11-04 18:55:13.130 1905 8873 E Camera3-Stream: finishConfiguration: Unable to configure stream 0 queue: Invalid argument (-22)
11-04 18:55:13.130 1905 8873 E Camera3-Device: Camera 0: configureStreamsLocked: Can't finish configuring output stream 0: Invalid argument (-22)
11-04 18:55:13.130 1047 1365 E minksocket: MinkIPC_QRTR_Service: client with node 1 port 6838 went down
11-04 18:55:13.130 1905 8873 D CameraService: CameraPerf: setpriority success, tid is 8873, priority is 0
11-04 18:55:13.130 1905 8873 E CameraDeviceClient: endConfigure: Camera 0: Unsupported set of inputs/outputs provided

From the log we can see that the failure happens when Camera3OutputStream::configureConsumerQueueLocked calls Surface::connect.

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/av/services/camera/libcameraservice/device3/Camera3OutputStream.h
sp<Surface> mConsumer;

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/av/services/camera/libcameraservice/device3/Camera3OutputStream.cpp
status_t Camera3OutputStream::configureConsumerQueueLocked(bool allowPreviewRespace) {
    ...
    // Configure consumer-side ANativeWindow interface. The listener may be used
    // to notify buffer manager (if it is used) of the returned buffers.
    res = mConsumer->connect(NATIVE_WINDOW_API_CAMERA,
            /*reportBufferRemoval*/true,
            /*listener*/mBufferProducerListener);
    if (res != OK) {
        ALOGE("%s: Unable to connect to native window for stream %d",
                __FUNCTION__, mId);
        return res;
    }
    ...
}

Surface::connect in turn calls BufferQueueProducer::connect.

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/Surface.cpp
int Surface::connect(int api) {
    static sp<IProducerListener> listener = new StubProducerListener();
    return connect(api, listener);
}

int Surface::connect(int api, const sp<IProducerListener>& listener) {
    return connect(api, listener, false);
}

int Surface::connect(
        int api, const sp<IProducerListener>& listener, bool reportBufferRemoval) {
    ...
    int err = mGraphicBufferProducer->connect(listener, api, mProducerControlledByApp, &output);
    ...
}


BufferQueueProducer::connect checks whether mCore->mConnectedApi is something other than BufferQueueCore::NO_CONNECTED_API (i.e. a producer is already connected), and if so it refuses the new connection:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/include/gui/BufferQueueProducer.h
sp<BufferQueueCore> mCore;

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/BufferQueueProducer.cpp
status_t BufferQueueProducer::connect(const sp<IProducerListener>& listener,
        int api, bool producerControlledByApp, QueueBufferOutput *output) {
    ...
    if (mCore->mConnectedApi != BufferQueueCore::NO_CONNECTED_API) {
        BQ_LOGE("connect: already connected (cur=%d req=%d)",
                mCore->mConnectedApi, api);
        return BAD_VALUE;
    }
    ...
    mCore->mConnectedApi = api;
    ...
}

So the "connect: already connected" line we saw in the log is printed from here:

11-04 18:55:13.130 28137 25285 E BufferQueueProducer: [SurfaceTexture-0-28137-0](id:6de900000001,api:2,p:28137,c:28137) connect: already connected (cur=2 req=4)

The connect api values are listed below. From the log (cur=2 req=4) we can work out that the SurfaceTexture is already connected as NATIVE_WINDOW_API_CPU, so it can no longer be connected as NATIVE_WINDOW_API_CAMERA:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/nativewindow/include/system/window.h
/* parameter for NATIVE_WINDOW_[API_][DIS]CONNECT */
enum {
    /* Buffers will be queued by EGL via eglSwapBuffers after being filled using
     * OpenGL ES.
     */
    NATIVE_WINDOW_API_EGL = 1,

    /* Buffers will be queued after being filled using the CPU
     */
    NATIVE_WINDOW_API_CPU = 2,

    /* Buffers will be queued by Stagefright after being filled by a video
     * decoder. The video decoder can either be a software or hardware decoder.
     */
    NATIVE_WINDOW_API_MEDIA = 3,

    /* Buffers will be queued by the the camera HAL.
     */
    NATIVE_WINDOW_API_CAMERA = 4,
};

CameraDevice.close leads to Camera3OutputStream::disconnectLocked, which eventually reaches BufferQueueProducer::disconnect, where mCore->mConnectedApi is set back to BufferQueueCore::NO_CONNECTED_API:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/av/services/camera/libcameraservice/device3/Camera3OutputStream.cpp
status_t Camera3OutputStream::disconnectLocked() {
    ...
    ALOGV("%s: disconnecting stream %d from native window", __FUNCTION__, getId());

    res = native_window_api_disconnect(mConsumer.get(),
                                       NATIVE_WINDOW_API_CAMERA);
    ...
}

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/nativewindow/include/system/window.h
static inline int native_window_api_disconnect(
        struct ANativeWindow* window, int api)
{
    return window->perform(window, NATIVE_WINDOW_API_DISCONNECT, api);
}

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/Surface.cpp
int Surface::perform(int operation, va_list args)
{
    ...
    case NATIVE_WINDOW_API_DISCONNECT:
        res = dispatchDisconnect(args);
        break;
    ...
}

int Surface::dispatchDisconnect(va_list args) {
    int api = va_arg(args, int);
    return disconnect(api);
}

int Surface::disconnect(int api, IGraphicBufferProducer::DisconnectMode mode) {
    ...
    int err = mGraphicBufferProducer->disconnect(api, mode);
    ...
}

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/BufferQueueProducer.cpp
status_t BufferQueueProducer::disconnect(int api, DisconnectMode mode) {
    ...
    mCore->mConnectedApi = BufferQueueCore::NO_CONNECTED_API;
    ...
}


So after CameraDevice.close, mCore->mConnectedApi has been reset to BufferQueueCore::NO_CONNECTED_API, and lockCanvas no longer fails when it reaches BufferQueueProducer::connect again.
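
In app terms that matches the observed behaviour. Here is a hedged sketch of the ordering (mPreviewSurface is the Surface from the first snippet; in practice it is safest to wait for CameraDevice.StateCallback.onClosed, since close() completes asynchronously):

cameraDevice.close();                               // -> Camera3OutputStream::disconnectLocked
                                                    // -> BufferQueueProducer::disconnect (api 4)

Canvas canvas = mPreviewSurface.lockCanvas(null);   // connect(NATIVE_WINDOW_API_CPU) now succeeds
canvas.drawColor(Color.BLACK);
mPreviewSurface.unlockCanvasAndPost(canvas);

// But as the next section shows, the CPU connection now sticks to this Surface,
// so a later createCaptureSession() with it will still fail.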

lockCanvas & unlockCanvasAndPost

Surface.lockCanvas eventually reaches Surface::lock, which calls Surface::connect(NATIVE_WINDOW_API_CPU):

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/base/core/jni/android_view_Surface.cpp
static jlong nativeLockCanvas(JNIEnv* env, jclass clazz,
        jlong nativeObject, jobject canvasObj, jobject dirtyRectObj) {
    sp<Surface> surface(reinterpret_cast<Surface *>(nativeObject));
    ...
    status_t err = surface->lock(&buffer, dirtyRectPtr);
    ...
}

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/Surface.cpp
status_t Surface::lock(
        ANativeWindow_Buffer* outBuffer, ARect* inOutDirtyBounds)
{
    ...
    if (!mConnectedToCpu) {
        int err = Surface::connect(NATIVE_WINDOW_API_CPU);
        if (err) {
            return err;
        }
        // we're intending to do software rendering from this point
        setUsage(GRALLOC_USAGE_SW_READ_OFTEN | GRALLOC_USAGE_SW_WRITE_OFTEN);
    }
    ...
}

The rest of the flow is similar to the Surface::connect call made in Camera3OutputStream::configureConsumerQueueLocked: eventually BufferQueueProducer::connect is called and mCore->mConnectedApi is set to NATIVE_WINDOW_API_CPU. The small difference is that Surface::connect also checks the connect api and sets mConnectedToCpu to true:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/Surface.cpp
int Surface::connect(
        int api, const sp<IProducerListener>& listener, bool reportBufferRemoval) {
    int err = mGraphicBufferProducer->connect(listener, api, mProducerControlledByApp, &output);
    ...
    if (!err && api == NATIVE_WINDOW_API_CPU) {
        mConnectedToCpu = true;
        // Clear the dirty region in case we're switching from a non-CPU API
        mDirtyRegion.clear();
    }
    ...
}

Thanks to that mConnectedToCpu flag, calling Surface.lockCanvas again does not attempt a second Surface::connect(NATIVE_WINDOW_API_CPU); and unlockCanvasAndPost does not disconnect from the BufferQueueProducer at all:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/base/core/jni/android_view_Surface.cpp
static void nativeUnlockCanvasAndPost(JNIEnv* env, jclass clazz,
        jlong nativeObject, jobject canvasObj) {
    sp<Surface> surface(reinterpret_cast<Surface *>(nativeObject));
    if (!isSurfaceValid(surface)) {
        return;
    }

    //detach the canvas from the surface
    graphics::Canvas canvas(env, canvasObj);
    canvas.setBuffer(nullptr, ADATASPACE_UNKNOWN);

    // unlock surface
    status_t err = surface->unlockAndPost();
    if (err < 0) {
        jniThrowException(env, IllegalArgumentException, NULL);
    }
}

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/Surface.cpp
status_t Surface::unlockAndPost()
{
    if (mLockedBuffer == nullptr) {
        ALOGE("Surface::unlockAndPost failed, no locked buffer");
        return INVALID_OPERATION;
    }

    int fd = -1;
    status_t err = mLockedBuffer->unlockAsync(&fd);
    ALOGE_IF(err, "failed unlocking buffer (%p)", mLockedBuffer->handle);

    err = queueBuffer(mLockedBuffer.get(), fd);
    ALOGE_IF(err, "queueBuffer (handle=%p) failed (%s)",
            mLockedBuffer->handle, strerror(-err));

    mPostedBuffer = mLockedBuffer;
    mLockedBuffer = nullptr;
    return err;
}

The code above also shows that Surface.unlockCanvasAndPost only detaches the Canvas from the Surface; the BufferQueueProducer is not disconnected, and its mCore->mConnectedApi remains NATIVE_WINDOW_API_CPU. That is why a later attempt to connect the camera as NATIVE_WINDOW_API_CAMERA fails.

The NATIVE_WINDOW_API_CPU connection is only torn down when the Surface itself is destructed:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/Surface.cpp
Surface::~Surface() {
    if (mConnectedToCpu) {
        Surface::disconnect(NATIVE_WINDOW_API_CPU);
    }
}
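
A practical consequence, sketched here as an assumption rather than something the sources above spell out: since only destruction drops the CPU connection, the Surface that was drawn on with Canvas has to be released and recreated before the camera can use the TextureView again (mTextureView is a hypothetical field; this also assumes nothing else still holds a reference to the native Surface):

mPreviewSurface.release();   // last reference gone -> ~Surface() -> disconnect(NATIVE_WINDOW_API_CPU)
mPreviewSurface = new Surface(mTextureView.getSurfaceTexture());  // fresh producer side, not yet connected
// The camera can now connect this new Surface as NATIVE_WINDOW_API_CAMERA.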

GLES disconnect

In fact, for GLES it is EGL14.eglDestroySurface that ends up calling BufferQueueProducer::disconnect. If it is never called, connecting the camera again fails in the same way:

11-04 20:13:59.940 29586 25849 E BufferQueueProducer: [SurfaceTexture-0-29586-0](id:739200000001,api:1,p:29586,c:29586) connect: already connected (cur=1 req=4)

This time the producer is already connected as NATIVE_WINDOW_API_EGL (cur=1), and the request to connect as NATIVE_WINDOW_API_CAMERA (req=4) fails.

Why connect distinguishes the api

Why does connect need to distinguish the api at all? Because different api types may require different handling. For example, BufferQueueProducer::queueBuffer has special throttling logic for NATIVE_WINDOW_API_EGL:

// https://cs.android.com/android/platform/superproject/ + /android-13.0.0_r8:frameworks/native/libs/gui/BufferQueueProducer.cpp
status_t BufferQueueProducer::queueBuffer(int slot,
        const QueueBufferInput& input, QueueBufferOutput *output) {
    ...
    // Wait without lock held
    if (connectedApi == NATIVE_WINDOW_API_EGL) {
        // Waiting here allows for two full buffers to be queued but not a
        // third. In the event that frames take varying time, this makes a
        // small trade-off in favor of latency rather than throughput.
        lastQueuedFence->waitForever("Throttling EGL Production");
    }
    ...
}
