
[WebRTC] Android SurfaceTextureHelper

When WebRTC creates a media stream, it creates tracks; for a video track, a SurfaceTextureHelper is used so that buffers called VideoFrame can be delivered.

Camera1Session and Camera2Session call startListening() to start the loop that receives camera data, and inside this loop SurfaceTexture's updateTexImage() is called repeatedly to pull in new data.
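For orientation, here is a hedged sketch of how these pieces are typically wired together in a capture pipeline. eglBase, factory (a PeerConnectionFactory), capturer (a CameraVideoCapturer), and appContext are assumed to already exist; this is illustrative, not the exact WebRTC sample code.

// Hedged sketch: wiring SurfaceTextureHelper into a capture pipeline.
SurfaceTextureHelper helper =
    SurfaceTextureHelper.create("CaptureThread", eglBase.getEglBaseContext());
VideoSource videoSource = factory.createVideoSource(/* isScreencast= */ false);

// The capturer renders camera frames into the helper's SurfaceTexture and
// forwards the resulting VideoFrames to the source's CapturerObserver.
capturer.initialize(helper, appContext, videoSource.getCapturerObserver());
capturer.startCapture(/* width= */ 1280, /* height= */ 720, /* framerate= */ 30);

VideoTrack videoTrack = factory.createVideoTrack("video0", videoSource);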


Processing order (steps 1-4 are sketched right after this list)
1. Create a texture: an OES-type texture
2. Create a SurfaceTexture from that texture
3. Hand the SurfaceTexture to the output of the camera module (or another producer)
4. Receive the SurfaceTexture's onFrameAvailable() callback
5. Create a VideoFrame.TextureBuffer
6. Create a VideoFrame from the TextureBuffer
7. Deliver the VideoFrame
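Steps 1 through 4 are plain Android/OpenGL; a minimal self-contained sketch, independent of WebRTC (the real helper posts updateTexImage() to its handler thread instead of calling it inline):

// Minimal sketch of steps 1-4 with plain Android APIs. Assumes a current
// EGL context on this thread.
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);                                 // 1. OES texture
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);      // 2. SurfaceTexture
surfaceTexture.setDefaultBufferSize(1280, 720);

Surface surface = new Surface(surfaceTexture);                   // 3. give it to the producer
// e.g. add 'surface' as an output target of the camera capture session

surfaceTexture.setOnFrameAvailableListener(st -> {               // 4. frame callback
    // Simplification: updateTexImage() must run on the EGL-owning thread.
    st.updateTexImage();  // latch the newest camera frame into the OES texture
});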

These are the basics you need to understand before processing video frames; to start, this post walks through the SurfaceTextureHelper class.


Static helper methods for creation

A SurfaceTextureHelper is created from the name of the thread it will run on, a shared EGL context, and a YuvConverter (the YUV conversion shader).

private final Handler handler;
private final EglBase eglBase;
private final SurfaceTexture surfaceTexture;
private final int oesTextureId;

private final YuvConverter yuvConverter;
@Nullable private final TimestampAligner timestampAligner;



public static SurfaceTextureHelper create(final String threadName, final EglBase.Context sharedContext) {
    return create(threadName, sharedContext, /* alignTimestamps= */ false, new YuvConverter());
}

public static SurfaceTextureHelper create(final String threadName, final EglBase.Context sharedContext, boolean alignTimestamps) {
    return create(threadName, sharedContext, alignTimestamps, new YuvConverter());
}

public static SurfaceTextureHelper create(final String threadName, final EglBase.Context sharedContext, boolean alignTimestamps, final YuvConverter yuvConverter) {
    // Create the dedicated handler thread
    final HandlerThread thread = new HandlerThread(threadName);
    thread.start();

    final Handler handler = new Handler(thread.getLooper());

    // ThreadUtils: construct the helper on the handler thread itself, so that
    // every field is written on the thread that will later use it
    return ThreadUtils.invokeAtFrontUninterruptibly(handler, new Callable<SurfaceTextureHelper>() {
        @Nullable
        @Override
        public SurfaceTextureHelper call() {
            try {
                return new SurfaceTextureHelper(sharedContext, handler, alignTimestamps, yuvConverter);
            } catch (RuntimeException e) {
                Logging.e(TAG, threadName + " create failure", e);
                return null;
            }
        }
    });
}
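ThreadUtils.invokeAtFrontUninterruptibly posts the Callable to the front of the handler's queue and blocks the caller (ignoring interrupts) until it completes, so create() returns a fully constructed helper even though construction runs on the new thread. A minimal sketch of the same pattern, with a hypothetical MyGlResource standing in for SurfaceTextureHelper:

// Hypothetical example of the construct-on-owner-thread pattern used by create().
final HandlerThread glThread = new HandlerThread("GlOwnerThread");
glThread.start();
final Handler glHandler = new Handler(glThread.getLooper());

// Blocks until the Callable has run on glThread; the returned object's fields
// were all written on glThread, so reads on that thread need no extra locking.
MyGlResource resource = ThreadUtils.invokeAtFrontUninterruptibly(
    glHandler, () -> new MyGlResource(glHandler));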


Constructor

Create a dummy pbuffer surface
Create an external OES texture
Create a SurfaceTexture from the texture handle
Register SurfaceTexture.setOnFrameAvailableListener

The constructor creates the EGL context, the texture, and the SurfaceTexture. An Android Surface is needed to receive the camera data, so a SurfaceTexture is created from an OpenGL ES texture.
Because a SurfaceTexture is used, the texture must be bound as GL_TEXTURE_EXTERNAL_OES.

private SurfaceTextureHelper(EglBase.Context sharedContext, Handler handler, boolean alignTimestamps, YuvConverter yuvConverter) {
    
    // Verify we are on the handler thread
    if (handler.getLooper().getThread() != Thread.currentThread()) {
        throw new IllegalStateException("SurfaceTextureHelper must be created on the handler thread");
    }
    
    
    this.handler = handler;
    this.timestampAligner = alignTimestamps ? new TimestampAligner() : null;
    this.yuvConverter = yuvConverter;


    // EGL setup
    eglBase = EglBase.create(sharedContext, EglBase.CONFIG_PIXEL_BUFFER);
    try {
        // Create a pbuffer surface and make the context current on this thread
        eglBase.createDummyPbufferSurface();
        eglBase.makeCurrent();
    } catch (RuntimeException e) {
        // Clean up before rethrowing the exception.
        eglBase.release();
        handler.getLooper().quit();
        throw e;
    }


    // Create the texture: see generateTexture() in GlUtil.java
    int target = GLES11Ext.GL_TEXTURE_EXTERNAL_OES;
    
    final int textureArray[] = new int[1];
    GLES20.glGenTextures(1, textureArray, 0);
    final int textureId = textureArray[0];
    GLES20.glBindTexture(target, textureId);
    GLES20.glTexParameterf(target, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameterf(target, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameterf(target, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameterf(target, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    
    
    oesTextureId = textureId;
    
    // Create the SurfaceTexture
    surfaceTexture = new SurfaceTexture(oesTextureId);
    
    // Register the SurfaceTexture listener.
    // Whenever the SurfaceTexture has new data, run the routine that pulls it in.
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        surfaceTexture.setOnFrameAvailableListener((SurfaceTexture st)->{
            hasPendingTexture=true;
            tryDeliverTextureFrame();
        }, handler);
    } else {
        // The documentation states that the listener will be called on an arbitrary thread, but in
        // practice, it is always the thread on which the SurfaceTexture was constructed. There are
        // assertions in place in case this ever changes. For API >= 21, we use the new API to
        // explicitly specify the handler.
        surfaceTexture.setOnFrameAvailableListener((SurfaceTexture st)-> {
            hasPendingTexture = true;
            tryDeliverTextureFrame();
        });
    }

}
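A note on GL_TEXTURE_EXTERNAL_OES: the image behind a SurfaceTexture lives in an EGLImage, so a shader cannot sample it through a plain sampler2D. A minimal fragment shader, shown as a Java constant in the style of WebRTC's drawer classes (this is illustrative, not WebRTC's actual shader source):

// Illustrative fragment shader for sampling an external OES texture.
private static final String OES_FRAGMENT_SHADER =
    "#extension GL_OES_EGL_image_external : require\n"
    + "precision mediump float;\n"
    + "varying vec2 tc;\n"
    + "uniform samplerExternalOES tex;\n"
    + "void main() {\n"
    + "  gl_FragColor = texture2D(tex, tc);\n"
    + "}\n";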


Handling the onFrameAvailable callback

surfaceTexture.getTransformMatrix
surfaceTexture.getTimestamp
Create a VideoFrame.TextureBuffer
Create a VideoFrame from it
VideoSink.onFrame

private void tryDeliverTextureFrame() {
    if (handler.getLooper().getThread() != Thread.currentThread()) {
        throw new IllegalStateException("Wrong thread.");
    }
    
    if (isQuitting || !hasPendingTexture || isTextureInUse || listener == null) {
        return;
    }

    if (textureWidth == 0 || textureHeight == 0) {
        // Information about the resolution needs to be provided by a call to setTextureSize() before
        // frames are produced.
        Logging.w(TAG, "Texture size has not been set.");
        return;
    }
    
    isTextureInUse = true;
    hasPendingTexture = false;

    synchronized( EglBase.lock ) {
        surfaceTexture.updateTexImage();
    }

    final float[] transformMatrix = new float[16];
    surfaceTexture.getTransformMatrix(transformMatrix);
    
    long timestampNs = surfaceTexture.getTimestamp();
    if (timestampAligner != null) {
        timestampNs = timestampAligner.translateTimestamp(timestampNs);
    }
    
    final VideoFrame.TextureBuffer buffer =
        new TextureBufferImpl(textureWidth, 
                              textureHeight, 
                              TextureBuffer.Type.OES, 
                              oesTextureId,
                              RendererCommon.convertMatrixToAndroidGraphicsMatrix(transformMatrix), 
                              handler,
                              yuvConverter,
                              // Note: in older WebRTC releases this last argument is a
                              // plain Runnable (e.g. this::returnTextureFrame); newer
                              // releases pass a RefCountMonitor, as quoted here:
                              textureRefCountMonitor);
    
    if (frameRefMonitor != null) {
      frameRefMonitor.onNewBuffer(buffer);
    }
    
    final VideoFrame frame = new VideoFrame(buffer, frameRotation, timestampNs);
    listener.onFrame(frame);
    frame.release();
}
// RendererCommon.java
public static android.graphics.Matrix convertMatrixToAndroidGraphicsMatrix(float[] matrix4x4) {
    float[] values = {
        matrix4x4[0 * 4 + 0], matrix4x4[1 * 4 + 0], matrix4x4[3 * 4 + 0],
        matrix4x4[0 * 4 + 1], matrix4x4[1 * 4 + 1], matrix4x4[3 * 4 + 1],
        matrix4x4[0 * 4 + 3], matrix4x4[1 * 4 + 3], matrix4x4[3 * 4 + 3],
    };
    
    android.graphics.Matrix matrix = new android.graphics.Matrix();
    matrix.setValues(values);
    return matrix;
}
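The index pattern above does two things at once: it drops the z row and column (texture coordinates are 2D, so z is unused) and transposes from OpenGL's column-major layout to the row-major order that android.graphics.Matrix.setValues() expects. A hedged sanity-check sketch ('surfaceTexture' is assumed to have a latched frame):

// Both representations should map a texture coordinate to the same place.
float[] gl = new float[16];
surfaceTexture.getTransformMatrix(gl);  // column-major 4x4 from SurfaceTexture
android.graphics.Matrix m =
    RendererCommon.convertMatrixToAndroidGraphicsMatrix(gl);

float[] uv = {0.5f, 0.5f};
m.mapPoints(uv);  // uv now holds the transformed texture coordinate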


After the VideoFrame is created, the registered listener.onFrame(frame) is called.
The interface looks like this:

public interface VideoSink {
    /**
     * Implementations should call frame.retain() if they need to hold a reference to the frame after
     * this function returns. Each call to retain() should be followed by a call to frame.release()
     * when the reference is no longer needed.
     */
    @CalledByNative void onFrame(VideoFrame frame);
}
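A hedged sketch of a VideoSink that follows this contract by handing frames to another thread (ForwardingSink and 'worker' are illustrative names, not WebRTC classes):

// Retain before onFrame() returns, release when the async work is done.
class ForwardingSink implements VideoSink {
    private final Handler worker;

    ForwardingSink(Handler worker) {
        this.worker = worker;
    }

    @Override
    public void onFrame(VideoFrame frame) {
        frame.retain();  // keep the underlying buffer alive past onFrame()
        worker.post(() -> {
            // ... render or encode the frame here ...
            frame.release();  // balances the retain() above
        });
    }
}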


setListenerRunnable is posted to the handler when startListening() is called; it installs the pending listener and, by calling surfaceTexture.updateTexImage(), drops any frame still pending from a previous capture session. The per-frame loop itself is driven by the onFrameAvailable callback plus the frame's release: once the delivered VideoFrame's reference count reaches zero, the helper clears isTextureInUse and calls tryDeliverTextureFrame() again. (In the end it is just a loop, structured this way because everything has to run on the handler thread.)

// These variables are only accessed from the |handler| thread.
@Nullable private VideoSink listener;
@Nullable private VideoSink pendingListener;

private boolean hasPendingTexture;

// Posted from startListening(); installs the listener on the handler thread.
final Runnable setListenerRunnable = new Runnable() {
    @Override
    public void run() {
        Logging.d(TAG, "Setting listener to " + pendingListener);
        listener = pendingListener;
        pendingListener = null;
        
        // May have a pending frame from the previous capture session - drop it.
        // hasPendingTexture is set to true whenever the SurfaceTexture's
        // onFrameAvailableListener fires.
        if (hasPendingTexture) {
            // Latch (and discard) the pending texture image
            synchronized( EglBase.lock ) {
                surfaceTexture.updateTexImage();
            }
            hasPendingTexture = false;
        }
    }
};


Starting and stopping the listener

public void startListening(final VideoSink listener) {
    if (this.listener != null || this.pendingListener != null) {
        throw new IllegalStateException("SurfaceTextureHelper listener has already been set.");
    }
    
    this.pendingListener = listener;
    handler.post(setListenerRunnable);
}

public void stopListening() {
    Logging.d(TAG, "stopListening()");
    handler.removeCallbacks(setListenerRunnable);
    ThreadUtils.invokeAtFrontUninterruptibly(handler, () -> {
        listener = null;
        pendingListener = null;
    });
}
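A hedged sketch of how a capture session typically drives this API (the observer wiring is an assumption, not a quote from Camera1Session/Camera2Session):

// The texture size must be set before frames can be delivered; otherwise
// tryDeliverTextureFrame() logs "Texture size has not been set." and returns.
helper.setTextureSize(1280, 720);
helper.startListening(frame -> capturerObserver.onFrameCaptured(frame));

// ... capture runs ...

helper.stopListening();  // safe to call from any thread; blocks until done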



Texture buffer

TextureBuffer supports the OES (GLES11Ext.GL_TEXTURE_EXTERNAL_OES) and RGB (GLES20.GL_TEXTURE_2D) texture types.
YuvConverter is used to convert texture data into the I420Buffer (YUV 420) format, i.e. an RGB-to-YUV conversion done on the GPU.
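A hedged sketch of this conversion path from a delivered frame ('frame' is assumed to be a VideoFrame whose buffer is texture-backed):

// toI420() hops to the toI420Handler thread and runs the YuvConverter there.
VideoFrame.Buffer buffer = frame.getBuffer();
if (buffer instanceof VideoFrame.TextureBuffer) {
    VideoFrame.I420Buffer i420 = buffer.toI420();
    // ... read i420.getDataY() / getDataU() / getDataV() here ...
    i420.release();  // the converted copy has its own reference count
}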

TextureBufferImpl

    private final int unscaledWidth;
    private final int unscaledHeight;
    // This is the resolution that has been applied after cropAndScale().
    private final int width;
    private final int height;
    private final Type type;
    private final int id;
    private final Matrix transformMatrix;
    private final Handler toI420Handler;
    private final YuvConverter yuvConverter;
    private final RefCountDelegate refCountDelegate;

    public TextureBufferImpl( int width,
                              int height,
                              Type type,
                              int id,
                              Matrix transformMatrix,
                              Handler toI420Handler,
                              YuvConverter yuvConverter,
                              @Nullable Runnable releaseCallback) {
        this.unscaledWidth = width;
        this.unscaledHeight = height;
        this.width = width;
        this.height = height;
        this.type = type;
        this.id = id;
        this.transformMatrix = transformMatrix;
        this.toI420Handler = toI420Handler;
        this.yuvConverter = yuvConverter;
        this.refCountDelegate = new RefCountDelegate(releaseCallback);
    }

    private TextureBufferImpl( int unscaledWidth,
                               int unscaledHeight,
                               int width,
                               int height,
                               Type type,
                               int id,
                               Matrix transformMatrix,
                               Handler toI420Handler,
                               YuvConverter yuvConverter,
                               @Nullable Runnable releaseCallback) {
        this.unscaledWidth = unscaledWidth;
        this.unscaledHeight = unscaledHeight;
        this.width = width;
        this.height = height;
        this.type = type;
        this.id = id;
        this.transformMatrix = transformMatrix;
        this.toI420Handler = toI420Handler;
        this.yuvConverter = yuvConverter;
        this.refCountDelegate = new RefCountDelegate(releaseCallback);
    }

    @Override
    public VideoFrame.TextureBuffer.Type getType() {
        return type;
    }

    @Override
    public int getTextureId() {
        return id;
    }

    @Override
    public Matrix getTransformMatrix() {
        return transformMatrix;
    }

    @Override
    public int getWidth() {
        return width;
    }

    @Override
    public int getHeight() {
        return height;
    }

    @Override
    public VideoFrame.I420Buffer toI420() {
        return ThreadUtils.invokeAtFrontUninterruptibly(
            toI420Handler, () -> yuvConverter.convert(this));
    }

    @Override
    public void retain() {
        refCountDelegate.retain();
    }

    @Override
    public void release() {
        refCountDelegate.release();
    }

    @Override
    public VideoFrame.Buffer cropAndScale( int cropX,
                                           int cropY,
                                           int cropWidth,
                                           int cropHeight,
                                           int scaleWidth,
                                           int scaleHeight) {
        final Matrix cropAndScaleMatrix = new Matrix();
        // In WebRTC, Y=0 is the top row, while in OpenGL Y=0 is the bottom row. This means that the Y
        // direction is effectively reversed.
        final int cropYFromBottom = height - (cropY + cropHeight);
        cropAndScaleMatrix.preTranslate(cropX / (float) width, cropYFromBottom / (float) height);
        cropAndScaleMatrix.preScale(cropWidth / (float) width, cropHeight / (float) height);

        return applyTransformMatrix(cropAndScaleMatrix,
            (int) Math.round(unscaledWidth * cropWidth / (float) width),
            (int) Math.round(unscaledHeight * cropHeight / (float) height), scaleWidth, scaleHeight);
    }

    /**
     * Returns the width of the texture in memory. This should only be used for downscaling, and you
     * should still respect the width from getWidth().
     */
    public int getUnscaledWidth() {
        return unscaledWidth;
    }

    /**
     * Returns the height of the texture in memory. This should only be used for downscaling, and you
     * should still respect the height from getHeight().
     */
    public int getUnscaledHeight() {
        return unscaledHeight;
    }

    public Handler getToI420Handler() {
        return toI420Handler;
    }

    public YuvConverter getYuvConverter() {
        return yuvConverter;
    }

    /**
     * Create a new TextureBufferImpl with an applied transform matrix and a new size. The
     * existing buffer is unchanged. The given transform matrix is applied first when texture
     * coordinates are still in the unmodified [0, 1] range.
     */
    public TextureBufferImpl applyTransformMatrix( Matrix transformMatrix, int newWidth, int newHeight) {
        return applyTransformMatrix(transformMatrix, 
            /* unscaledWidth= */ newWidth,
            /* unscaledHeight= */ newHeight,
            /* scaledWidth= */ newWidth,
            /* scaledHeight= */ newHeight);
    }

    private TextureBufferImpl applyTransformMatrix(Matrix transformMatrix,
                                                   int unscaledWidth,
                                                   int unscaledHeight,
                                                   int scaledWidth,
                                                   int scaledHeight) {
        final Matrix newMatrix = new Matrix(this.transformMatrix);
        newMatrix.preConcat(transformMatrix);
        retain();
        return new TextureBufferImpl(unscaledWidth,
                                     unscaledHeight,
                                     scaledWidth,
                                     scaledHeight,
                                     type,
                                     id,
                                     newMatrix,
                                     toI420Handler,
                                     yuvConverter,
                                     this::release);
    }
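A hedged usage sketch of cropAndScale(): center-crop a 1280x720 texture buffer to 4:3 and scale it to 640x480. Because applyTransformMatrix() retains the source buffer and releases it from the new buffer's release callback, the original still needs its own release():

// 'textureBuffer' is assumed to be a 1280x720 TextureBufferImpl.
VideoFrame.Buffer cropped = textureBuffer.cropAndScale(
    /* cropX= */ 160, /* cropY= */ 0,
    /* cropWidth= */ 960, /* cropHeight= */ 720,
    /* scaleWidth= */ 640, /* scaleHeight= */ 480);

// ... use 'cropped' ...
cropped.release();        // releases the crop, which in turn drops its
                          // reference on the source via this::release
textureBuffer.release();  // drop our own original reference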