
Recording a square video with AVFoundation and adding a watermark

swift

[Illustration of what I'm trying to do]

I'm trying to do the following:

  • Play music
  • Record a square video (there is a container in the view showing what is being recorded)
  • Add a label at the top and the app's icon and name at the bottom of that square video.

So far I've managed to play the music, display the AVCaptureVideoPreviewLayer inside a square container in another view, and save the video to the camera roll.

The thing is, I can hardly find any tutorials on using AVFoundation, and even those are vague; this is my first app, which makes everything pretty hard.

I managed to get this far, but I still don't understand how AVFoundation works. The documentation is vague for a beginner, I haven't found a tutorial for my specific needs, and trying to stitch together several tutorials (written in Obj-C) made it impossible. My problems are the following:

  1. The video isn't saved as a square. (Worth mentioning that the app doesn't support landscape orientation.)
  2. The video has no audio. (I suppose I should add some kind of audio input in addition to the video one.)
  3. How do I add the watermark to the video?
  4. I have a bug: I created a view (messageView; see it in the code) with text and an image to let the user know the video was saved to the camera roll. But if I start recording a second time, the view shows up while the video is recording, not after it. I suspect it has to do with every video being given the same name. (A possible fix is sketched right after this list.)
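
If that suspicion is right, one fix might be to give every recording its own file name instead of reusing "output.mov". A rough, untested sketch of what I mean (the NSUUID-based name is just an example):

// Idea only: a unique file name per recording, so a later recording can't
// collide with the previous clip or its completion handling
let uniqueName = NSUUID().UUIDString + ".mov"
let outputPath = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent(uniqueName)
let outputFileURL = NSURL(fileURLWithPath: outputPath)
videoFileOutput?.startRecordingToOutputFileURL(outputFileURL, recordingDelegate: self)

// ...and in the delegate, use the outputFileURL parameter that is passed in
// instead of rebuilding the "output.mov" path:
// if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputFileURL.path!) { ... }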

So here's my setup:

override func viewDidLoad() {
        super.viewDidLoad()

        // Preset For High Quality
        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        // Get available devices capable of recording video
        let devices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) as! [AVCaptureDevice]

        // Get back camera
        for device in devices
        {
            if device.position == AVCaptureDevicePosition.Back
            {
                currentDevice = device
            }
        }

        // Set Input
        let captureDeviceInput: AVCaptureDeviceInput
        do
        {
            captureDeviceInput = try AVCaptureDeviceInput(device: currentDevice)
        }
        catch
        {
            print(error)
            return
        }

        // Set Output
        videoFileOutput = AVCaptureMovieFileOutput()

        // Configure Session w/ Input & Output Devices
        captureSession.addInput(captureDeviceInput)
        captureSession.addOutput(videoFileOutput)

        // Show Camera Preview
        cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        view.layer.addSublayer(cameraPreviewLayer!)
        cameraPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        let width = view.bounds.width*0.85
        cameraPreviewLayer?.frame = CGRectMake(0, 0, width, width)

        // Bring Record Button To Front
        view.bringSubviewToFront(recordButton)
        captureSession.startRunning()

//        // Bring Message To Front
//        view.bringSubviewToFront(messageView)
//        view.bringSubviewToFront(messageText)
//        view.bringSubviewToFront(messageImage)
    }

Then, when I press the record button:

@IBAction func capture(sender: AnyObject) {
    if !isRecording
    {
        isRecording = true

        UIView.animateWithDuration(0.5, delay: 0.0, options: [.Repeat, .Autoreverse, .AllowUserInteraction], animations: { () -> Void in
            self.recordButton.transform = CGAffineTransformMakeScale(0.5, 0.5)
            }, completion: nil)

        let outputPath = NSTemporaryDirectory() + "output.mov"
        let outputFileURL = NSURL(fileURLWithPath: outputPath)
        videoFileOutput?.startRecordingToOutputFileURL(outputFileURL, recordingDelegate: self)
    }
    else
    {
        isRecording = false

        UIView.animateWithDuration(0.5, delay: 0, options: [], animations: { () -> Void in
            self.recordButton.transform = CGAffineTransformMakeScale(1.0, 1.0)
            }, completion: nil)
        recordButton.layer.removeAllAnimations()
        videoFileOutput?.stopRecording()
    }
}

And after the video has been recorded:

func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) {
    let outputPath = NSTemporaryDirectory() + "output.mov"
    if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputPath)
    {
        UISaveVideoAtPathToSavedPhotosAlbum(outputPath, self, nil, nil)
        // Show Success Message
        UIView.animateWithDuration(0.4, delay: 0, options: [], animations: {
            self.messageView.alpha = 0.8
            }, completion: nil)
        UIView.animateWithDuration(0.4, delay: 0, options: [], animations: {
            self.messageText.alpha = 1.0
            }, completion: nil)
        UIView.animateWithDuration(0.4, delay: 0, options: [], animations: {
            self.messageImage.alpha = 1.0
            }, completion: nil)
        // Hide Message
        UIView.animateWithDuration(0.4, delay: 1, options: [], animations: {
            self.messageView.alpha = 0
            }, completion: nil)
        UIView.animateWithDuration(0.4, delay: 1, options: [], animations: {
            self.messageText.alpha = 0
            }, completion: nil)
        UIView.animateWithDuration(0.4, delay: 1, options: [], animations: {
            self.messageImage.alpha = 0
            }, completion: nil)
    }
}

So how do I fix these? I've been searching and looking at tutorials, but I can't figure it out… I read about adding watermarks and learned that it has something to do with adding CALayers on top of the video. But obviously I can't do that, since I don't even know how to make the video square or add audio.



1 Answer


A few things:

As far as audio goes, you're adding the video (camera) input but not an audio input. Do this to get sound (here sourceAVFoundation.captureSession is the answerer's own wrapper; it corresponds to the captureSession in your code):

    let audioInputDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)

    do {
        let input = try AVCaptureDeviceInput(device: audioInputDevice)

        if sourceAVFoundation.captureSession.canAddInput(input) {
            sourceAVFoundation.captureSession.addInput(input)
        } else {
            NSLog("ERROR: Can't add audio input")
        }
    } catch let error {
        NSLog("ERROR: Getting input device: \(error)")
    }

To make the video square, you're going to have to use AVAssetWriter instead of AVCaptureFileOutput. It's more complex, but you get more "power". You've already created an AVCaptureSession, which is great; to hook up the AssetWriter you'll need to do something like this:

    let fileManager = NSFileManager.defaultManager()
    let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    guard let documentDirectory: NSURL = urls.first else {
        print("Video Controller: getAssetWriter: documentDir Error")
        return nil
    }

    let local_video_name = NSUUID().UUIDString + ".mp4"
    self.videoOutputURL = documentDirectory.URLByAppendingPathComponent(local_video_name)

    guard let url = self.videoOutputURL else {
        return nil
    }


    self.assetWriter = try? AVAssetWriter(URL: url, fileType: AVFileTypeMPEG4)

    guard let writer = self.assetWriter else {
        return nil
    }

    //TODO: Set your desired video size here! 
    let videoSettings: [String : AnyObject] = [
        AVVideoCodecKey  : AVVideoCodecH264,
        AVVideoWidthKey  : captureSize.width,
        AVVideoHeightKey : captureSize.height,
        AVVideoCompressionPropertiesKey : [
            AVVideoAverageBitRateKey : 200000,
            AVVideoProfileLevelKey : AVVideoProfileLevelH264Baseline41,
            AVVideoMaxKeyFrameIntervalKey : 90,
        ],
    ]

    assetWriterInputCamera = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
    assetWriterInputCamera?.expectsMediaDataInRealTime = true
    writer.addInput(assetWriterInputCamera!)

    let audioSettings : [String : AnyObject] = [
        AVFormatIDKey : NSInteger(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey : 2,
        AVSampleRateKey : NSNumber(double: 44100.0)
    ]

    assetWriterInputAudio = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: audioSettings)
    assetWriterInputAudio?.expectsMediaDataInRealTime = true
    writer.addInput(assetWriterInputAudio!)
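
For the square shape itself (point 1 of the question), one option not shown above is to give the writer input equal width and height and let it crop the incoming camera frames, i.e. something along these lines instead of the videoSettings dictionary above (the 720-pixel side length is just an example):

    // Example only: equal width/height plus aspect-fill scaling makes the writer
    // crop the camera's rectangular frames into a square
    let squareVideoSettings: [String : AnyObject] = [
        AVVideoCodecKey  : AVVideoCodecH264,
        AVVideoWidthKey  : 720,
        AVVideoHeightKey : 720,
        AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill
    ]
    assetWriterInputCamera = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: squareVideoSettings)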

With the AssetWriter set up, hook up some outputs for the video and audio:

    let bufferAudioQueue = dispatch_queue_create("audio buffer delegate", DISPATCH_QUEUE_SERIAL)
    let audioOutput = AVCaptureAudioDataOutput()
    audioOutput.setSampleBufferDelegate(self, queue: bufferAudioQueue)
    captureSession.addOutput(audioOutput)

    // Always add video last...
    let bufferVideoQueue = dispatch_queue_create("video buffer delegate", DISPATCH_QUEUE_SERIAL)
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.setSampleBufferDelegate(self, queue: bufferVideoQueue)
    captureSession.addOutput(videoOutput)
    if let connection = videoOutput.connectionWithMediaType(AVMediaTypeVideo) {
        if connection.supportsVideoOrientation {
            // Force recording to portrait
            connection.videoOrientation = AVCaptureVideoOrientation.Portrait
        }

        self.outputConnection = connection
    }


    captureSession.startRunning()

Finally, you need to capture the buffers and process all that stuff… make sure your class is a delegate of both AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate:

//MARK: Implementation for AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

    if !self.isRecordingStarted {
        return
    }

    if let audio = self.assetWriterInputAudio where connection.audioChannels.count > 0 && audio.readyForMoreMediaData {

        dispatch_async(audioQueue!) {
            audio.appendSampleBuffer(sampleBuffer)
        }
        return
    }

    if let camera = self.assetWriterInputCamera where camera.readyForMoreMediaData {
        dispatch_async(videoQueue!) {
            camera.appendSampleBuffer(sampleBuffer)
        }
    }
}

There are some bits and pieces missing, but hopefully this, together with the documentation, is enough to get you going.
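
One of the missing pieces worth spelling out is starting and finishing the writer session. A rough sketch of what that could look like, using the property names from the code above (when exactly you call these depends on your recording flow):

// Sketch only: start the writer once, at the first buffer's timestamp,
// so the audio and video tracks share one timeline
func startAssetWriting(firstBuffer: CMSampleBuffer) {
    guard let writer = assetWriter else { return }
    if writer.status == .Unknown {
        writer.startWriting()
        writer.startSessionAtSourceTime(CMSampleBufferGetPresentationTimeStamp(firstBuffer))
    }
}

// Sketch only: mark the inputs as finished and let the writer close the file
func finishAssetWriting() {
    assetWriterInputCamera?.markAsFinished()
    assetWriterInputAudio?.markAsFinished()
    assetWriter?.finishWritingWithCompletionHandler {
        // The finished movie is now at videoOutputURL (save or share it from here)
        print("Finished writing to \(self.videoOutputURL)")
    }
}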

Finally, if you want to add the watermark, there are a number of ways to do it in real time, but one possible way is to modify the sampleBuffer and write the watermark into the image there. You'll find other questions on StackOverflow dealing with that.
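
If it doesn't have to happen in real time, another route (closer to the CALayer idea mentioned in the question) is to burn the watermark in after recording, using AVVideoCompositionCoreAnimationTool and an export session. A rough, untested sketch; recordedFileURL, watermarkedFileURL and the "watermark" image are placeholders:

// Sketch only: overlay a watermark layer on an already-recorded movie, then export it
let asset = AVURLAsset(URL: recordedFileURL, options: nil)            // placeholder URL
let videoComposition = AVMutableVideoComposition(propertiesOfAsset: asset)

let renderFrame = CGRect(origin: CGPointZero, size: videoComposition.renderSize)
let videoLayer = CALayer()
let overlayLayer = CALayer()
let parentLayer = CALayer()
videoLayer.frame = renderFrame
overlayLayer.frame = renderFrame
parentLayer.frame = renderFrame
overlayLayer.contents = UIImage(named: "watermark")?.CGImage          // placeholder image
overlayLayer.contentsGravity = kCAGravityBottomRight

parentLayer.addSublayer(videoLayer)
parentLayer.addSublayer(overlayLayer)
videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
    postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)

let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)
exporter?.videoComposition = videoComposition
exporter?.outputURL = watermarkedFileURL                              // placeholder URL
exporter?.outputFileType = AVFileTypeQuickTimeMovie
exporter?.exportAsynchronouslyWithCompletionHandler {
    print("Watermark export finished with status: \(exporter?.status.rawValue)")
}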
