
What is the best way to record a video with augmented reality?

swift

What is the best way to record a video with augmented reality? (adding text and image logos to the frames coming from the iPhone/iPad camera)

Previously, I tried to figure out how to draw into a CIImage: converting the CMSampleBuffer to a CIImage, compositing onto it, and converting the CIImage back into a CMSampleBuffer.
I got almost everything working; my only remaining problem was recording the video from the new CMSampleBuffer with AVAssetWriterInput.

In any case, that solution is not great: it eats a lot of CPU while converting the CIImage to a CVPixelBuffer (ciContext.render(ciImage!, to: aBuffer)).

So I want to stop here and find some other way to record a video with augmented reality (for example, dynamically adding (drawing) text into the frames while the video is being encoded into an mp4 file).

Here is what I have already tried and no longer want to use…

// convert original CMSampleBuffer to CIImage, 
// combine multiple `CIImage`s into one (adding augmented reality -  
// text or some additional images)
let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
let ciimage : CIImage = CIImage(cvPixelBuffer: pixelBuffer)
var outputImage: CIImage?
let images : Array<CIImage> = [ciimage, ciimageSec!] // add all your CIImages that you'd like to combine
for image in images {
    outputImage = outputImage == nil ? image : image.composited(over: outputImage!)
}

// allocate this class variable once         
if pixelBufferNew == nil {
    CVPixelBufferCreate(kCFAllocatorSystemDefault, CVPixelBufferGetWidth(pixelBuffer),  CVPixelBufferGetHeight(pixelBuffer), kCVPixelFormatType_32BGRA, nil, &pixelBufferNew)
}

// convert CIImage to CVPixelBuffer
let ciContext = CIContext(options: nil)
if let aBuffer = pixelBufferNew {
    ciContext.render(outputImage!, to: aBuffer) // >>> this render call eats a lot of CPU <<<
}

// convert new CVPixelBuffer to new CMSampleBuffer
var sampleTime = CMSampleTimingInfo()
sampleTime.duration = CMSampleBufferGetDuration(sampleBuffer)
sampleTime.presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
sampleTime.decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
var videoInfo: CMVideoFormatDescription? = nil
CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault, imageBuffer: pixelBufferNew!, formatDescriptionOut: &videoInfo)
var oBuf: CMSampleBuffer?
CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault, imageBuffer: pixelBufferNew!, dataReady: true, makeDataReadyCallback: nil, refcon: nil, formatDescription: videoInfo!, sampleTiming: &sampleTime, sampleBufferOut: &oBuf)

/*
 Then try to append the new CMSampleBuffer to a file (.mp4) using
 AVAssetWriter & AVAssetWriterInput... (I ran into errors here; the original buffer works fine
 when it comes straight from func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection))
*/
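
For reference, the append step described in the comment above would look roughly like the sketch below. This is only a sketch: it assumes an AVAssetWriter named assetWriter and a video AVAssetWriterInput named assetWriterInput that were created and started elsewhere; both names are placeholders, not part of the code above.

// sketch of the append step - assetWriter / assetWriterInput are assumed to exist
if let newBuffer = oBuf, assetWriterInput.isReadyForMoreMediaData {
    // append(_:) returns false on failure; assetWriter.error then describes why
    if !assetWriterInput.append(newBuffer) {
        print("Failed to append sample buffer: \(String(describing: assetWriter.error))")
    }
}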

Is there any better solution?


2020-07-07

1 Answer


Now I will answer my own question.

The best way is to use an Objective-C++ class (.mm): with OpenCV we can easily and quickly convert a CMSampleBuffer to cv::Mat and back to a CMSampleBuffer after processing.

And we can easily call Objective-C++ functions from Swift.
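
A rough sketch of how that could be wired up is below. All names here are hypothetical; the OpenCV work itself lives in the .mm implementation, which is exposed to Swift through the project's bridging header.

import AVFoundation

// Inside the class that acts as the AVCaptureVideoDataOutputSampleBufferDelegate.
//
// FrameProcessor is a hypothetical Objective-C++ wrapper (FrameProcessor.h / FrameProcessor.mm)
// made visible to Swift via the bridging header. Its .mm implementation converts the
// CMSampleBuffer to cv::Mat, draws the text / logo with OpenCV, and converts back.
// Assumed Objective-C declaration:
//   @interface FrameProcessor : NSObject
//   - (nullable CMSampleBufferRef)processedBufferFrom:(CMSampleBufferRef)sampleBuffer CF_RETURNS_RETAINED;
//   @end
let frameProcessor = FrameProcessor()

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // the wrapper returns a new CMSampleBuffer with the augmented-reality overlay drawn in
    if let processed = frameProcessor.processedBuffer(from: sampleBuffer) {
        // append `processed` to the AVAssetWriterInput as usual
    }
}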

2020-07-07