I have a C++ application with a GStreamer pipeline that looks like this:
appsrc do-timestamp=TRUE is-live=TRUE caps=
"video/x-h264, stream-format=(string)byte-stream, alignment=(string)none, framerate=(fraction)0/1" min-latency=300000000 ! h264parse ! video/x-h264, stream-format=(string)avc, alignment=(string)au ! tee name=t \
t. ! queue ! valve drop=FALSE ! decodebin ! glupload ! glcolorconvert ! qtsink sync=FALSE \
t. ! queue ! valve drop=FALSE ! mp4mux reserved-max-duration=3600000000000 reserved-moov-update-period=10000000000 ! filesink sync=FALSE location="....../out.mp4"
The appsrc injects video coming from a drone's USB wireless video receiver into the pipeline.
Some more background:
- The USB receiver hardware hands us non-timestamped raw Annex B H.264 video in 512-byte chunks.
- The frame rate should be 60 fps, but in practice it rarely keeps up and varies with signal strength (hence framerate=(fraction)0/1, which is also why neither qtsink nor filesink syncs to the pipeline clock (sync=FALSE)).
- The hardware introduces a minimum latency of 300 ms, as set on the appsrc (min-latency).
- appsrc timestamps my buffers automatically (do-timestamp=TRUE).
- I use mp4mux's reserved-max-duration and reserved-moov-update-period so that an application crash doesn't corrupt the MP4 file.
- I'm using GStreamer 1.18.4 for Android.
Video recording works fine while the drone is not flying. But once it takes off, after about 15 seconds of correctly recorded video the mp4mux element fails with the message "Buffer has no PTS". Unfortunately some users keep reporting this, and I can't reproduce it myself (it requires flying a drone I don't have), which doesn't make much sense to me. My guess so far is that at that particular moment the wireless video link suffers some congestion, some video packets arrive a few milliseconds late, and that somehow causes the trouble.
Here is the (simplified) code that creates the appsrc:
_pAppSrc = gst_element_factory_make("appsrc", "artosyn_source");
gpointer pAppSrc = static_cast<gpointer>(_pAppSrc);

// Retain one more ref, so the source is destroyed
// in a controlled way
gst_object_ref(_pAppSrc);

pCaps = gst_caps_from_string("video/x-h264, stream-format=(string)byte-stream, alignment=(string)none, framerate=(fraction)0/1");

g_object_set(G_OBJECT(pAppSrc), "caps", pCaps,
                                "is-live", TRUE,
                                "min-latency", G_GINT64_CONSTANT(300000000),
                                "format", GST_FORMAT_TIME,
                                "do-timestamp", TRUE,
                                nullptr);

_pBufferPool = gst_buffer_pool_new();
pConfig = gst_buffer_pool_get_config(_pBufferPool);

static const guint kBufferSize  = 512;       // one USB transfer chunk
static const guint kPoolSize    = 0x400000;  // 4 MiB minimum
static const guint kPoolSizeMax = 0x600000;  // 6 MiB maximum

qsizetype nBuffersMin = kPoolSize / kBufferSize;
qsizetype nBuffersMax = kPoolSizeMax / kBufferSize;

gst_buffer_pool_config_set_params(pConfig, pCaps, kBufferSize, nBuffersMin, nBuffersMax);
gst_buffer_pool_set_config(_pBufferPool, pConfig);
gst_buffer_pool_set_active(_pBufferPool, TRUE);

gst_caps_unref(pCaps);
When the USB driver fills a new buffer, it is pushed into the pipeline like this:
bool unref = true;

gst_buffer_unmap(b->pBuffer, &b->mapInfo);
gst_buffer_set_size(b->pBuffer, xfer.pXfer->actual_length);

if(result == LIBUSB_TRANSFER_COMPLETED)
{
    //-- DROP DATA IF NOT IN PLAYING STATE --
    GstState st, pend;
    GstStateChangeReturn scr = gst_element_get_state(GST_ELEMENT(_pAppSrc), &st, &pend, GST_CLOCK_TIME_NONE);
    Q_UNUSED(scr)
    bool drop = (st != GST_STATE_PLAYING);

    if(!drop)
    {
        // Push into pipeline
        GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(_pAppSrc), b->pBuffer);

        if(ret != GST_FLOW_OK)
            qCDebug(MMCVideoLog()) << "Can't push buffer to the pipeline (" << ret << ")";
        else
            unref = false; // Don't unref: gst_app_src_push_buffer() steals our reference and takes ownership
    }
} else if(result == LIBUSB_TRANSFER_CANCELLED) {
    qCDebug(MMCVideoLog()) << "! Buffer canceled";
} else {
    qCDebug(MMCVideoLog()) << "? Buffer result = " << result;
}

if(unref)
    gst_buffer_unref(b->pBuffer);
This is what I got from Android logcat on an affected device:
[07-22 18:37:45.753 17414:18734 E/QGroundControl]
VideoReceiverLog: GStreamer error: [element ' "mp4mux0" '] Could not multiplex stream.
[07-22 18:37:45.753 17414:18734 E/QGroundControl]
VideoReceiverLog: Details: ../gst/isomp4/gstqtmux.c(5010): gst_qt_mux_add_buffer (): /GstPipeline:receiver/GstBin:sinkbin/GstMP4Mux:mp4mux0:
Buffer has no PTS.
What I have tried:
- Setting GstBaseParse pts_interpolation to TRUE, and infer_ts to TRUE.
So, my questions are:
Can you see anything wrong with my code? Am I missing something?
Can I rely on matroskamux to work around the problem for now, until the real cause is found?