This article covers a fix for the problem that extracting bitmaps from AVPlayer is highly non-deterministic; it may be a useful reference for readers facing the same issue.
Problem description
I want to extract a UIImage from a video player at a specific point in time and run some processing on that image.
I have the following code:
```swift
pausePlayer()
let time = player!.currentTime()
imageFromVideo(url: fileURL(for: movieName!), at: time.seconds) { image in
    let result = self.detectLines(image: image!)
    if let result = result {
        // Display results by handing off to the InferenceViewController.
        DispatchQueue.main.async {
            self.drawNewResults(result: result)
        }
    }
}
```

and
```swift
public func imageFromVideo(url: URL, at time: TimeInterval, completion: @escaping (UIImage?) -> Void) {
    DispatchQueue.global(qos: .background).async {
        let asset = AVURLAsset(url: url)
        let assetIG = AVAssetImageGenerator(asset: asset)
        assetIG.appliesPreferredTrackTransform = true
        assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels

        let cmTime = CMTime(seconds: time, preferredTimescale: CMTimeScale(30.0))
        var newTime = CMTime(seconds: time, preferredTimescale: CMTimeScale(30.0))
        let thumbnailImageRef: CGImage
        do {
            thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: &newTime)
        } catch let error {
            print("Error: \(error)")
            return completion(nil)
        }

        print("Time on click: %f", cmTime.seconds)
        print("Actual time: %f", newTime.seconds)

        DispatchQueue.main.async {
            completion(UIImage(cgImage: thumbnailImageRef))
        }
    }
}
```

The problem is that there is a huge gap between the image I want and the image I get. It is usually around 0.2s to 0.4s, roughly an offset of 7 to 14 frames. So the processing I run and display is wrong, because I am not processing the frame that was actually observed.
Example:
```
Time on click: %f 0.8333333333333334
Actual time: %f 1.0666666666666667
Time on click: %f 1.6333333333333333
Actual time: %f 2.1666666666666665
```

Recommended answer

As suggested here: swift: How to take screenshot of AVPlayerLayer()
```swift
assetIG.requestedTimeToleranceAfter = CMTime.zero
assetIG.requestedTimeToleranceBefore = CMTime.zero
```

did the trick.
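For context, here is a minimal sketch of how those two tolerance properties fit into the original extraction function. By default, AVAssetImageGenerator is free to return the nearest keyframe instead of the exact requested frame, which explains the 0.2s to 0.4s offset; setting both tolerances to zero forces frame-accurate decoding at the cost of speed. The function name `exactFrame` and the 600 timescale are illustrative choices, not part of the original code:

```swift
import AVFoundation
import UIKit

/// Extracts the frame at `time` from a local video file,
/// forcing frame-accurate (rather than keyframe-nearest) seeking.
func exactFrame(from url: URL, at time: TimeInterval,
                completion: @escaping (UIImage?) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let generator = AVAssetImageGenerator(asset: AVURLAsset(url: url))
        generator.appliesPreferredTrackTransform = true
        // Zero tolerances tell the generator to decode the exact requested
        // frame instead of snapping to the nearest keyframe. Slower, but
        // `actualTime` will match the request.
        generator.requestedTimeToleranceBefore = .zero
        generator.requestedTimeToleranceAfter = .zero

        let requested = CMTime(seconds: time, preferredTimescale: 600)
        var actual = CMTime.zero
        guard let cgImage = try? generator.copyCGImage(at: requested,
                                                       actualTime: &actual) else {
            DispatchQueue.main.async { completion(nil) }
            return
        }
        DispatchQueue.main.async { completion(UIImage(cgImage: cgImage)) }
    }
}
```

The trade-off is worth noting: with non-zero tolerances the generator can reuse already-decoded keyframes, while zero tolerances require decoding from the preceding keyframe forward, so extraction takes noticeably longer on long GOP content.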