This article collects code examples for the Java class org.webrtc.YuvConverter and shows how the class is used in practice. The examples were extracted from selected projects hosted on platforms such as GitHub, Stack Overflow and Maven, so they should serve as useful references. Details of the YuvConverter class:
Package path: org.webrtc.YuvConverter
Class name: YuvConverter
Class for converting OES textures to a YUV ByteBuffer. It should be constructed on a thread with an active EGL context, and only be used from that thread.
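Before the individual excerpts, here is a minimal end-to-end sketch of that lifecycle: a dedicated thread gets an EGL context, the converter is created on that thread, each OES texture is converted to an I420 ByteBuffer on that same thread, and everything is released there as well. The sketch assumes the older convert(ByteBuffer, width, height, stride, textureId, transformMatrix) signature used by the snippets below (recent WebRTC releases replace it with a convert(TextureBuffer) overload that returns an I420Buffer); the class name TextureToYuvExample and the methods init/convertFrame/release are made up for illustration.

import java.nio.ByteBuffer;

import android.os.Handler;
import android.os.HandlerThread;

import org.webrtc.EglBase;
import org.webrtc.YuvConverter;

public class TextureToYuvExample {
  private HandlerThread renderThread;
  private Handler renderThreadHandler;
  private EglBase eglBase;
  private YuvConverter yuvConverter;

  // Set up an EGL context on a dedicated thread and create the converter there.
  public void init(final EglBase.Context sharedContext) {
    renderThread = new HandlerThread("YuvConverterThread");
    renderThread.start();
    renderThreadHandler = new Handler(renderThread.getLooper());
    renderThreadHandler.post(new Runnable() {
      @Override
      public void run() {
        eglBase = EglBase.create(sharedContext, EglBase.CONFIG_PIXEL_BUFFER);
        eglBase.createDummyPbufferSurface();
        eglBase.makeCurrent();
        yuvConverter = new YuvConverter();
      }
    });
  }

  // Convert one OES texture to I420 on the same thread that created the converter.
  public void convertFrame(final int oesTextureId, final int width, final int height,
      final float[] transformMatrix) {
    renderThreadHandler.post(new Runnable() {
      @Override
      public void run() {
        // I420 needs width * height * 3 / 2 bytes when the stride equals the width.
        ByteBuffer yuvBuffer = ByteBuffer.allocateDirect(width * height * 3 / 2);
        yuvConverter.convert(yuvBuffer, width, height, /* stride= */ width,
            oesTextureId, transformMatrix);
        // ... hand yuvBuffer to an encoder, file writer, etc.
      }
    });
  }

  // Release GL resources on the same thread, then stop it.
  public void release() {
    renderThreadHandler.post(new Runnable() {
      @Override
      public void run() {
        yuvConverter.release();
        eglBase.release();
        renderThread.quit();
      }
    });
  }
}

The excerpts that follow show the same three phases (create, convert, release) as they appear in real projects.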
Code example source: origin: linsir6/WebRTC-Voice
@Override
public void run() {
  if (yuvConverter == null) {
    yuvConverter = new YuvConverter();
  }
  yuvConverter.convert(buf, width, height, stride, textureId, transformMatrix);
}
});
Code example source: origin: twilio/video-quickstart-android
YuvConverter yuvConverter = new YuvConverter();
// The remaining convert() arguments (stride, texture id and transform matrix) are truncated in this excerpt.
yuvConverter.convert(outputFrameBuffer,
    width,
    height,
    ...);
yuvConverter.release();
Code example source: origin: linsir6/WebRTC-Voice
@Override
public void run() {
  try {
    videoOutFile.close();
  } catch (IOException e) {
    Logging.d(TAG, "Error closing output video file");
  }
  yuvConverter.release();
  eglBase.release();
  renderThread.quit();
  cleanupBarrier.countDown();
}
});
Code example source: origin: linsir6/WebRTC-Voice
@Override
public void run() {
  eglBase = EglBase.create(sharedContext, EglBase.CONFIG_PIXEL_BUFFER);
  eglBase.createDummyPbufferSurface();
  eglBase.makeCurrent();
  yuvConverter = new YuvConverter();
}
});
Code example source: origin: linsir6/WebRTC-Voice
videoOutFile.write("FRAME\n".getBytes());
if (!frame.yuvFrame) {
  yuvConverter.convert(outputFrameBuffer, outputFileWidth, outputFileHeight, outputFileWidth,
      frame.textureId, texMatrix);
  // ... (rest of the frame-writing path is truncated in this excerpt)
Code example source: origin: linsir6/WebRTC-Voice
private void release() {
  if (handler.getLooper().getThread() != Thread.currentThread()) {
    throw new IllegalStateException("Wrong thread.");
  }
  if (isTextureInUse || !isQuitting) {
    throw new IllegalStateException("Unexpected release.");
  }
  if (yuvConverter != null) {
    yuvConverter.release();
  }
  GLES20.glDeleteTextures(1, new int[] {oesTextureId}, 0);
  surfaceTexture.release();
  eglBase.release();
  handler.getLooper().quit();
}
}