iOS: extracting data from a CMSampleBuffer to create a deep copy

dojqjjoe, posted 2022-11-26 in iOS

I am trying to create a copy of the CMSampleBuffer returned by captureOutput in an AVCaptureVideoDataOutputSampleBufferDelegate.

Because the CMSampleBuffers come from a preallocated pool of (15) buffers, if I attach a reference to them they cannot be recycled, which causes all remaining frames to be dropped. The documentation explains:

To maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped.

If your application is causing samples to be dropped by retaining the provided CMSampleBufferRef objects for too long, but it needs access to the sample data for a long period of time, consider copying the data into a new buffer and then releasing the sample buffer (if it was previously retained) so that the memory it references can be reused.

Obviously I have to copy the CMSampleBuffer, but CMSampleBufferCreateCopy() only creates a shallow copy. So I concluded that I must use CMSampleBufferCreate(). I filled in the 12 parameters the constructor requires, but ran into the problem that my CMSampleBuffers don't contain a blockBuffer (not entirely sure what that is, but it seems important).
This question has been asked several times before, but not answered:

  • Deep Copy of CMImageBuffer or CVImageBuffer
  • Create a copy of CMSampleBuffer in Swift 2.0

One possible answer is "I finally figured out how to use this to create a deep clone. All of the copy methods reused the data in the heap, which would keep the AVCaptureSession locked up. So I had to pull the data into an NSMutableData object and then created a new sample buffer."

In case you're interested, this is the output of print(sampleBuffer). There is no mention of a blockBuffer, i.e. CMSampleBufferGetDataBuffer returns nil. There is an imageBuffer, but creating a "copy" using CMSampleBufferCreateForImageBuffer doesn't seem to free up the CMSampleBuffer either.
EDIT: Since this question was posted, I have been trying even more ways of copying the memory.

I did the same thing that user Kametrixom attempted. This is my attempt at the same idea: first copy the CVPixelBuffer, then use CMSampleBufferCreateForImageBuffer to create the final sample buffer. However, this results in one of two errors:

  • An EXC_BAD_ACCESS on the memcpy instruction, i.e. a segfault from trying to access memory outside of the application.
  • Or, the memory copies successfully, but CMSampleBufferCreateReadyWithImageBuffer() fails with result code -12743, which "Indicates that the format of the given media does not match the given format description. For example, a format description paired with a CVImageBuffer that fails CMVideoFormatDescriptionMatchesImageBuffer."

As you can see, both Kametrixom and I used CMSampleBufferGetFormatDescription(sampleBuffer) to copy the source buffer's format description, so I'm not sure why the format of the given media would not match the given format description.
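One way to sidestep that -12743 mismatch (a sketch of my own, not from the question or the answers below: the helper name and parameters are assumptions) is to derive a fresh format description from the copied pixel buffer instead of reusing the source's, since CMVideoFormatDescriptionCreateForImageBuffer captures the copy's actual attributes:

```swift
import CoreMedia
import CoreVideo

// Hypothetical helper: wrap an already-copied CVPixelBuffer in a new
// CMSampleBuffer, deriving the format description from the copy itself so
// CMVideoFormatDescriptionMatchesImageBuffer cannot fail.
func makeSampleBuffer(from copy: CVPixelBuffer,
                      timing: CMSampleTimingInfo) -> CMSampleBuffer? {
    var format: CMVideoFormatDescription?
    guard CMVideoFormatDescriptionCreateForImageBuffer(
            allocator: kCFAllocatorDefault,
            imageBuffer: copy,
            formatDescriptionOut: &format) == noErr,
          let format = format else { return nil }

    var timing = timing
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: copy,
        formatDescription: format,
        sampleTiming: &timing,
        sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}
```

The trade-off is that any format-description extensions present on the source (but not derivable from the pixel buffer) are lost, so this is a diagnostic workaround rather than a guaranteed fix.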


bqujaahr1#

OK, I think I finally figured it out. I created a helper extension to make a full copy of a CVPixelBuffer:

extension CVPixelBuffer {
    func copy() -> CVPixelBuffer {
        precondition(CFGetTypeID(self) == CVPixelBufferGetTypeID(), "copy() cannot be called on a non-CVPixelBuffer")

        var _copy : CVPixelBuffer?
        CVPixelBufferCreate(
            nil,
            CVPixelBufferGetWidth(self),
            CVPixelBufferGetHeight(self),
            CVPixelBufferGetPixelFormatType(self),
            CVBufferGetAttachments(self, kCVAttachmentMode_ShouldPropagate)?.takeUnretainedValue(),
            &_copy)

        guard let copy = _copy else { fatalError() }

        CVPixelBufferLockBaseAddress(self, kCVPixelBufferLock_ReadOnly)
        CVPixelBufferLockBaseAddress(copy, 0)

        for plane in 0..<CVPixelBufferGetPlaneCount(self) {
            let dest = CVPixelBufferGetBaseAddressOfPlane(copy, plane)
            let source = CVPixelBufferGetBaseAddressOfPlane(self, plane)
            let height = CVPixelBufferGetHeightOfPlane(self, plane)
            let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(self, plane)

            memcpy(dest, source, height * bytesPerRow)
        }

        CVPixelBufferUnlockBaseAddress(copy, 0)
        CVPixelBufferUnlockBaseAddress(self, kCVPixelBufferLock_ReadOnly)

        return copy
    }
}

This can now be used in the didOutputSampleBuffer method:

guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

let copy = pixelBuffer.copy()

toProcess.append(copy)

Be aware however that one such pixel buffer takes up about 3MB of memory (1080p), which means that at 100 frames you've already got about 300MB, which is around the point where the iPhone says STAHP (and crashes).

Note that you don't actually want to copy the CMSampleBuffer itself, since it really only contains a CVPixelBuffer, because it represents an image.
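To avoid that crash, one option (a sketch of my own, not part of the answer; the toProcess array and the maxBufferedFrames limit are assumptions) is to bound how many deep copies you retain at once:

```swift
import CoreVideo

// Hypothetical bounded queue: drop the oldest copy once a cap is reached,
// so retained deep copies (~3 MB each at 1080p NV12) cannot grow unbounded.
let maxBufferedFrames = 30   // roughly 90 MB at 1080p; tune per device
var toProcess: [CVPixelBuffer] = []

func enqueue(_ copy: CVPixelBuffer) {
    toProcess.append(copy)
    if toProcess.count > maxBufferedFrames {
        toProcess.removeFirst()  // releases the oldest CVPixelBuffer
    }
}
```

Dropping the oldest frame trades completeness for stability; if every frame matters, the alternative is to process and release copies faster than they arrive.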


kr98yfug2#

This is a Swift 3 version of the top-rated answer's solution.

extension CVPixelBuffer {
    func copy() -> CVPixelBuffer {
        precondition(CFGetTypeID(self) == CVPixelBufferGetTypeID(), "copy() cannot be called on a non-CVPixelBuffer")

        var _copy : CVPixelBuffer?
        CVPixelBufferCreate(
            kCFAllocatorDefault,
            CVPixelBufferGetWidth(self),
            CVPixelBufferGetHeight(self),
            CVPixelBufferGetPixelFormatType(self),
            nil,
            &_copy)

        guard let copy = _copy else { fatalError() }

        CVPixelBufferLockBaseAddress(self, CVPixelBufferLockFlags.readOnly)
        CVPixelBufferLockBaseAddress(copy, CVPixelBufferLockFlags(rawValue: 0))

        let copyBaseAddress = CVPixelBufferGetBaseAddress(copy)
        let currBaseAddress = CVPixelBufferGetBaseAddress(self)

        memcpy(copyBaseAddress, currBaseAddress, CVPixelBufferGetDataSize(self))

        CVPixelBufferUnlockBaseAddress(copy, CVPixelBufferLockFlags(rawValue: 0))
        CVPixelBufferUnlockBaseAddress(self, CVPixelBufferLockFlags.readOnly)

        return copy
    }
}

juud5qan3#

I spent a few hours trying to get this to work. It turned out that both the attachments from the original CVPixelBuffer and the IOSurface options alongside kCVPixelBufferMetalCompatibilityKey were necessary.

(FYI, unfortunately VideoToolbox's VTPixelTransferSession is only available on macOS.)

Here's what I ended up with. I left a couple of lines at the end that let you verify the memcpy by comparing the average color of the original and copied CVPixelBuffers. It definitely slows things down, so it should be removed once you're confident your copy() is working correctly. The CIImage.averageColour extension is adapted from this code.

extension CVPixelBuffer {
    func copy() -> CVPixelBuffer {
        precondition(CFGetTypeID(self) == CVPixelBufferGetTypeID(), "copy() cannot be called on a non-CVPixelBuffer")

        let ioSurfaceProps = [
            "IOSurfaceOpenGLESFBOCompatibility": true as CFBoolean,
            "IOSurfaceOpenGLESTextureCompatibility": true as CFBoolean,
            "IOSurfaceCoreAnimationCompatibility": true as CFBoolean
        ] as CFDictionary

        let options = [
            String(kCVPixelBufferMetalCompatibilityKey): true as CFBoolean,
            String(kCVPixelBufferIOSurfacePropertiesKey): ioSurfaceProps
        ] as CFDictionary

        var _copy : CVPixelBuffer?
        CVPixelBufferCreate(
            nil,
            CVPixelBufferGetWidth(self),
            CVPixelBufferGetHeight(self),
            CVPixelBufferGetPixelFormatType(self),
            options,
            &_copy)

        guard let copy = _copy else { fatalError() }

        CVBufferPropagateAttachments(self as CVBuffer, copy as CVBuffer)

        CVPixelBufferLockBaseAddress(self, CVPixelBufferLockFlags.readOnly)
        CVPixelBufferLockBaseAddress(copy, CVPixelBufferLockFlags(rawValue: 0))

        let copyBaseAddress = CVPixelBufferGetBaseAddress(copy)
        let currBaseAddress = CVPixelBufferGetBaseAddress(self)

        memcpy(copyBaseAddress, currBaseAddress, CVPixelBufferGetDataSize(self))

        CVPixelBufferUnlockBaseAddress(copy, CVPixelBufferLockFlags(rawValue: 0))
        CVPixelBufferUnlockBaseAddress(self, CVPixelBufferLockFlags.readOnly)

        // let's make sure they have the same average color
//        let originalImage = CIImage(cvPixelBuffer: self)
//        let copiedImage = CIImage(cvPixelBuffer: copy)
//
//        let averageColorOriginal = originalImage.averageColour()
//        let averageColorCopy = copiedImage.averageColour()
//
//        assert(averageColorCopy == averageColorOriginal)
//        debugPrint("average frame color: \(averageColorCopy)")

        return copy
    }
}
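The CIImage.averageColour extension referenced above isn't shown in the answer; a minimal sketch of my own using Core Image's built-in CIAreaAverage filter (the method name and byte-array return type are assumptions) might look like:

```swift
import CoreImage

extension CIImage {
    // Reduce the image to a single averaged pixel and return its RGBA bytes.
    // Returns nil if the CIAreaAverage filter cannot be constructed.
    func averageColour() -> [UInt8]? {
        guard let filter = CIFilter(
                name: "CIAreaAverage",
                parameters: [kCIInputImageKey: self,
                             kCIInputExtentKey: CIVector(cgRect: extent)]),
              let output = filter.outputImage else { return nil }

        // Render the 1x1 result into a 4-byte RGBA bitmap.
        var pixel = [UInt8](repeating: 0, count: 4)
        let context = CIContext()
        context.render(output,
                       toBitmap: &pixel,
                       rowBytes: 4,
                       bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                       format: .RGBA8,
                       colorSpace: nil)
        return pixel
    }
}
```

Comparing the two four-byte arrays then approximates the equality check in the commented-out lines above.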

uz75evzq4#

I believe that in VideoToolbox.framework you can use a VTPixelTransferSession to copy pixel buffers. In fact, that's the only thing this class does.

Reference: https://developer.apple.com/documentation/videotoolbox/vtpixeltransfersession-7cg
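As a sketch (my own, not from the answer; note that an earlier answer reports this API as macOS-only at the time of writing, and the helper name is an assumption), using a VTPixelTransferSession to blit one pixel buffer into another could look like:

```swift
import VideoToolbox
import CoreVideo

// Hypothetical helper: copy `source` into a freshly allocated buffer of the
// same size and pixel format via VTPixelTransferSession. Returns nil on error.
func transferCopy(of source: CVPixelBuffer) -> CVPixelBuffer? {
    var session: VTPixelTransferSession?
    guard VTPixelTransferSessionCreate(allocator: kCFAllocatorDefault,
                                       pixelTransferSessionOut: &session) == noErr,
          let session = session else { return nil }
    defer { VTPixelTransferSessionInvalidate(session) }

    var destination: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(source),
                        CVPixelBufferGetHeight(source),
                        CVPixelBufferGetPixelFormatType(source),
                        nil,
                        &destination)
    guard let destination = destination else { return nil }

    // The session handles locking, planar layouts, and any needed conversion.
    guard VTPixelTransferSessionTransferImage(session,
                                              from: source,
                                              to: destination) == noErr
    else { return nil }
    return destination
}
```

Because the session performs the copy itself, no manual base-address locking or memcpy is needed, which avoids the planar-layout pitfalls of the other answers.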


46scxncf5#

Spent some time trying to pull this together. I needed a function that could create a deep copy of a whole CMSampleBuffer with the CVPixelBuffer inside it, not just a pixel buffer copy.

Here's what I came up with; it works for me on iOS 15, Swift 5:

extension CVPixelBuffer {
    func copy() -> CVPixelBuffer {
        precondition(CFGetTypeID(self) == CVPixelBufferGetTypeID(), "copy() cannot be called on a non-CVPixelBuffer")

        var _copy : CVPixelBuffer?
        CVPixelBufferCreate(
            nil,
            CVPixelBufferGetWidth(self),
            CVPixelBufferGetHeight(self),
            CVPixelBufferGetPixelFormatType(self),
            nil,
            &_copy)
        guard let copy = _copy else { fatalError() }

        // Carry over colorimetry and other buffer attachments.
        CVBufferPropagateAttachments(self, copy)

        CVPixelBufferLockBaseAddress(self, .readOnly)
        CVPixelBufferLockBaseAddress(copy, CVPixelBufferLockFlags())

        if CVPixelBufferIsPlanar(self) {
            // Copy each plane separately (e.g. the Y and CbCr planes of NV12).
            for plane in 0 ..< CVPixelBufferGetPlaneCount(self) {
                let dest = CVPixelBufferGetBaseAddressOfPlane(copy, plane)
                let source = CVPixelBufferGetBaseAddressOfPlane(self, plane)
                let height = CVPixelBufferGetHeightOfPlane(self, plane)
                let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(self, plane)

                memcpy(dest, source, height * bytesPerRow)
            }
        } else {
            // Non-planar formats (e.g. BGRA) report a plane count of 0, so the
            // loop above would copy nothing; copy the single plane directly.
            memcpy(CVPixelBufferGetBaseAddress(copy),
                   CVPixelBufferGetBaseAddress(self),
                   CVPixelBufferGetBytesPerRow(self) * CVPixelBufferGetHeight(self))
        }

        CVPixelBufferUnlockBaseAddress(copy, CVPixelBufferLockFlags())
        CVPixelBufferUnlockBaseAddress(self, .readOnly)

        return copy
    }
}

And the sample buffer extension:

extension CMSampleBuffer {
    func copy() -> CMSampleBuffer {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(self) else { fatalError("CMSampleBuffer copy: get image buffer") }

        let copiedPixelBuffer = pixelBuffer.copy()
        let format = CMSampleBufferGetFormatDescription(self)!
        var timing = CMSampleTimingInfo()
        CMSampleBufferGetSampleTimingInfo(self, at: 0, timingInfoOut: &timing)

        var copiedSampleBuffer : CMSampleBuffer?

        let status = CMSampleBufferCreateReadyWithImageBuffer(
            allocator: nil,
            imageBuffer: copiedPixelBuffer,
            formatDescription: format,
            sampleTiming: &timing,
            sampleBufferOut: &copiedSampleBuffer)
        guard let bufOut = copiedSampleBuffer else { fatalError("CMSampleBuffer copy: CreateReady \(status)") }
        return bufOut
    }
}

And finally, the call to copy():

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    let bufferCopy = sampleBuffer.copy()
    // free to use bufferCopy as it won't stop video recording
    // don't forget to put it into autoreleasepool { } if used in non-main thread
}
