Extracting frames from a video in Swift

6ojccjat · published 2022-12-21 · in Swift
Follow (0) | Answers (4) | Views (188)

I'm trying to extract frames from a video as UIImages in Swift. I've found several Objective-C solutions, but nothing in Swift. Assuming the following is correct, can someone help me convert it to Swift, or share their own approach?
来源:Grabbing the first frame of a video from UIImagePickerController?

- (UIImage *)imageFromVideo:(NSURL *)videoURL atTime:(NSTimeInterval)time {
    
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    NSParameterAssert(asset);
    AVAssetImageGenerator *assetIG =
    [[AVAssetImageGenerator alloc] initWithAsset:asset];
    assetIG.appliesPreferredTrackTransform = YES;
    assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;
    
    CGImageRef thumbnailImageRef = NULL;
    CFTimeInterval thumbnailImageTime = time;
    NSError *igError = nil;
    thumbnailImageRef =
    [assetIG copyCGImageAtTime:CMTimeMake(thumbnailImageTime, 60)
                    actualTime:NULL
                         error:&igError];
    
    if (!thumbnailImageRef)
        NSLog(@"thumbnailImageGenerationError %@", igError );
    
    UIImage *image = thumbnailImageRef
    ? [[UIImage alloc] initWithCGImage:thumbnailImageRef]
    : nil;
    
    return image;
}

xdnvmnnf1#

This works.

func imageFromVideo(url: URL, at time: TimeInterval) -> UIImage? {
    let asset = AVURLAsset(url: url)

    let assetIG = AVAssetImageGenerator(asset: asset)
    assetIG.appliesPreferredTrackTransform = true
    assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels

    let cmTime = CMTime(seconds: time, preferredTimescale: 60)
    let thumbnailImageRef: CGImage
    do {
        thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
    } catch let error {
        print("Error: \(error)")
        return nil
    }

    return UIImage(cgImage: thumbnailImageRef)
}

But keep in mind that this function is synchronous, so it's best not to call it on the main queue.
You can do either of the following:

DispatchQueue.global(qos: .background).async {
    let image = self.imageFromVideo(url: url, at: 0)

    DispatchQueue.main.async {
        self.imageView.image = image
    }
}

Or use generateCGImagesAsynchronously instead of copyCGImage.
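The asynchronous variant batches several frame requests into a single call and invokes the handler once per requested time. A minimal sketch of that approach (the `videoURL` parameter and the evenly-spaced-offsets helper are illustrative assumptions, not part of the original answer):

```swift
import AVFoundation
import UIKit

/// Returns `count` evenly spaced sample offsets (in seconds) across `duration`.
func sampleOffsets(duration: Double, count: Int) -> [Double] {
    guard count > 0, duration > 0 else { return [] }
    let step = duration / Double(count)
    return (0..<count).map { Double($0) * step }
}

/// Requests `count` frames spread across the video and hands each one back as a UIImage.
func extractFrames(from videoURL: URL, count: Int, handler: @escaping (UIImage) -> Void) {
    let asset = AVURLAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true

    let duration = CMTimeGetSeconds(asset.duration)
    let times = sampleOffsets(duration: duration, count: count)
        .map { NSValue(time: CMTime(seconds: $0, preferredTimescale: 600)) }

    // One call schedules all requests; the completion handler fires once per time.
    generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, error in
        guard result == .succeeded, let cgImage = cgImage, error == nil else { return }
        handler(UIImage(cgImage: cgImage))
    }
}
```

Note that the handler runs on an arbitrary background queue, so dispatch to the main queue before touching any UI.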


c7rzv4ha2#

Here is a Swift 5 alternative to Dmitry's solution, so you don't have to worry about which queue you're on:

public func imageFromVideo(url: URL, at time: TimeInterval, completion: @escaping (UIImage?) -> Void) {
    DispatchQueue.global(qos: .background).async {
        let asset = AVURLAsset(url: url)

        let assetIG = AVAssetImageGenerator(asset: asset)
        assetIG.appliesPreferredTrackTransform = true
        assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels

        let cmTime = CMTime(seconds: time, preferredTimescale: 60)
        let thumbnailImageRef: CGImage
        do {
            thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
        } catch let error {
            print("Error: \(error)")
            return completion(nil)
        }

        DispatchQueue.main.async {
            completion(UIImage(cgImage: thumbnailImageRef))
        }
    }
}

Now, to use it:

imageFromVideo(url: videoUrl, at: 0) { image in
   // Do something with the image here
}

wnvonmuf3#

You can do this easily on iOS. Below is a code snippet showing how to do it in Swift.

let url = Bundle.main.url(forResource: "video_name", withExtension: "mp4")
let videoAsset = AVAsset(url: url!)

let t1 = CMTime(value: 1, timescale: 1)
let t2 = CMTime(value: 4, timescale: 1)
let t3 = CMTime(value: 8, timescale: 1)
let timesArray = [
    NSValue(time: t1),
    NSValue(time: t2),
    NSValue(time: t3)
]

let generator = AVAssetImageGenerator(asset: videoAsset)
generator.requestedTimeToleranceBefore = .zero
generator.requestedTimeToleranceAfter = .zero

generator.generateCGImagesAsynchronously(forTimes: timesArray) { requestedTime, image, actualTime, result, error in
    // Guard against failed requests instead of force-unwrapping the image
    guard result == .succeeded, let cgImage = image, error == nil else { return }
    let img = UIImage(cgImage: cgImage)
    // Use img here (dispatch to the main queue before updating UI)
}

You can find the demo code here and a Medium article here.


lbsnaicq4#

Here is an async/await version of @Dmitry's answer, for those who don't like completion handlers:

func imageFromVideo(url: URL, at time: TimeInterval) async throws -> UIImage {
    try await withCheckedThrowingContinuation { continuation in
        DispatchQueue.global(qos: .background).async {
            let asset = AVURLAsset(url: url)

            let assetIG = AVAssetImageGenerator(asset: asset)
            assetIG.appliesPreferredTrackTransform = true
            assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels

            let cmTime = CMTime(seconds: time, preferredTimescale: 60)
            let thumbnailImageRef: CGImage
            do {
                thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
            } catch {
                continuation.resume(throwing: error)
                return
            }
            continuation.resume(returning: UIImage(cgImage: thumbnailImageRef))
        }
    }
}

Usage:

let vidUrl = <#your url#>
do {
    let firstFrame = try await imageFromVideo(url: vidUrl, at: 0)
    // do something with image
} catch {
    // handle error
}

Or, if you're already inside a throwing function (it must also be `async`, since `imageFromVideo` is awaited):

func someThrowingFunc() async throws {
    let vidUrl = <#your url#>
    let firstFrame = try await imageFromVideo(url: vidUrl, at: 0)
    // do something with image
}
