一尘不染

Setting grayscale on the output of AVCaptureDevice in iOS

swift

I want to implement a custom camera in my application, so I am creating the camera with AVCaptureDevice.

Now I only want grayscale output in my custom camera, so I am trying to use setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains: with AVCaptureWhiteBalanceGains. For this I am following AVCamManual: Extending AVCam to Use Manual Capture.

- (void)setWhiteBalanceGains:(AVCaptureWhiteBalanceGains)gains
{
    NSError *error = nil;

    if ( [videoDevice lockForConfiguration:&error] ) {
        AVCaptureWhiteBalanceGains normalizedGains = [self normalizedGains:gains]; // Conversion can yield out-of-bound values, cap to limits
        [videoDevice setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalizedGains completionHandler:nil];
        [videoDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}

But for this I have to pass RGB gain values between 1 and 4, so I created this method to clamp them between the MIN and MAX values.

- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains) gains
{
    AVCaptureWhiteBalanceGains g = gains;

    // Clamp each gain to the valid range [1.0, maxWhiteBalanceGain].
    g.redGain = MAX( 1.0, g.redGain );
    g.greenGain = MAX( 1.0, g.greenGain );
    g.blueGain = MAX( 1.0, g.blueGain );

    g.redGain = MIN( videoDevice.maxWhiteBalanceGain, g.redGain );
    g.greenGain = MIN( videoDevice.maxWhiteBalanceGain, g.greenGain );
    g.blueGain = MIN( videoDevice.maxWhiteBalanceGain, g.blueGain );

    return g;
}

I have also tried to get different effects, for example by passing static RGB gain values:

- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains) gains
{
    AVCaptureWhiteBalanceGains g = gains;
    g.redGain = 3;
    g.greenGain = 2;
    g.blueGain = 1;
    return g;
}

Now I want to apply grayscale to my custom camera using this formula: pixel = 0.30078125f * R + 0.5859375f * G + 0.11328125f * B. This is what I have tried:

- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains) gains
{
    AVCaptureWhiteBalanceGains g = gains;

    g.redGain = g.redGain * 0.30078125;
    g.greenGain = g.greenGain * 0.5859375;
    g.blueGain = g.blueGain * 0.11328125;

    float grayScale = g.redGain + g.greenGain + g.blueGain;

    g.redGain = MAX( 1.0, grayScale );
    g.greenGain = MAX( 1.0, grayScale );
    g.blueGain = MAX( 1.0, grayScale );

    g.redGain = MIN( videoDevice.maxWhiteBalanceGain, g.redGain );
    g.greenGain = MIN( videoDevice.maxWhiteBalanceGain, g.greenGain);
    g.blueGain = MIN( videoDevice.maxWhiteBalanceGain, g.blueGain );

    return g;
}

So, how can I pass these values between 1 and 4?

Is there any method or scale by which I can compare these things?

Any help would be appreciated.


2020-07-07

1 Answer

一尘不染

Core Image provides a large number of filters for adjusting images on the GPU, and they can be used efficiently with video data coming from a camera feed or a video file.

There is an article on objc.io that describes how to do this. The examples are in Objective-C, but the explanation should be clear enough to follow.

The basic steps are:

  1. Create an EAGLContext configured to use OpenGL ES 2.
  2. Create a GLKView to display the rendered output, using the EAGLContext.
  3. Create a CIContext, using the same EAGLContext.
  4. Create a CIFilter using the CIColorMonochrome Core Image filter.
  5. Create an AVCaptureSession with an AVCaptureVideoDataOutput.
  6. In the AVCaptureVideoDataOutput delegate method, convert the CMSampleBuffer to a CIImage, apply the CIFilter to the image, and draw the filtered image into the CIContext.

This pipeline keeps the video pixel buffers on the GPU all the way from the camera to the display and avoids moving data to the CPU, which preserves real-time performance.

To save the filtered video, implement an AVAssetWriter and append the sample buffers in the same AVCaptureVideoDataOutputSampleBufferDelegate method where the filtering is done.
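Below is a minimal sketch of that recording path. It is not part of the original answer: the function and parameter names are illustrative, it follows the same Swift 2-era API style as the example further down, and session start/stop, error handling, and rendering the filtered CIImage into the CVPixelBuffer (for example with CIContext's render(_:toCVPixelBuffer:)) are left out.

import AVFoundation

// Illustrative sketch: create the writer once, before recording starts.
func makeWriter(outputURL: NSURL) throws -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInputPixelBufferAdaptor) {
    let writer = try AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie)

    // Hypothetical output settings; match these to your capture resolution.
    let settings: [String : AnyObject] = [
        AVVideoCodecKey  : AVVideoCodecH264,
        AVVideoWidthKey  : 1080,
        AVVideoHeightKey : 1920
    ]

    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: settings)
    writerInput.expectsMediaDataInRealTime = true

    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: writerInput,
        sourcePixelBufferAttributes: nil
    )

    writer.addInput(writerInput)
    return (writer, writerInput, adaptor)
}

// Illustrative sketch: called from captureOutput(_:didOutputSampleBuffer:fromConnection:)
// after the filter has been applied. filteredPixelBuffer is assumed to already
// contain the filtered frame.
func appendFilteredFrame(filteredPixelBuffer: CVPixelBuffer,
                         sampleBuffer: CMSampleBuffer,
                         writer: AVAssetWriter,
                         writerInput: AVAssetWriterInput,
                         adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    // Start the writer session at the first frame's timestamp.
    if writer.status == .Unknown {
        writer.startWriting()
        writer.startSessionAtSourceTime(time)
    }

    if writerInput.readyForMoreMediaData {
        adaptor.appendPixelBuffer(filteredPixelBuffer, withPresentationTime: time)
    }
}

// When recording ends:
//   writerInput.markAsFinished()
//   writer.finishWritingWithCompletionHandler { /* handle completion */ }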

Here is an example in Swift.

Example on GitHub

import UIKit
import GLKit
import AVFoundation

// The camera delivers buffers in landscape orientation; rotate each frame by -90° for portrait display.
private let rotationTransform = CGAffineTransformMakeRotation(CGFloat(-M_PI * 0.5))

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    private var context: CIContext!
    private var targetRect: CGRect!
    private var session: AVCaptureSession!
    private var filter: CIFilter!

    @IBOutlet var glView: GLKView!

    override func prefersStatusBarHidden() -> Bool {
        return true
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)

        let whiteColor = CIColor(
            red: 1.0,
            green: 1.0,
            blue: 1.0
        )

        // CIColorMonochrome with a white input color and full intensity produces a grayscale image.
        filter = CIFilter(
            name: "CIColorMonochrome",
            withInputParameters: [
                "inputColor" : whiteColor,
                "inputIntensity" : 1.0
            ]
        )

        // GL context

        let glContext = EAGLContext(
            API: .OpenGLES2
        )

        glView.context = glContext
        glView.enableSetNeedsDisplay = false

        // Passing NSNull for the color space options disables Core Image color management for speed.
        context = CIContext(
            EAGLContext: glContext,
            options: [
                kCIContextOutputColorSpace: NSNull(),
                kCIContextWorkingColorSpace: NSNull(),
            ]
        )

        let screenSize = UIScreen.mainScreen().bounds.size
        let screenScale = UIScreen.mainScreen().scale

        targetRect = CGRect(
            x: 0,
            y: 0,
            width: screenSize.width * screenScale,
            height: screenSize.height * screenScale
        )

        // Setup capture session.

        let cameraDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

        let videoInput = try? AVCaptureDeviceInput(
            device: cameraDevice
        )

        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: dispatch_get_main_queue())

        session = AVCaptureSession()
        session.beginConfiguration()
        session.addInput(videoInput)
        session.addOutput(videoOutput)
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return
        }

        let originalImage = CIImage(
            CVPixelBuffer: pixelBuffer,
            options: [
                kCIImageColorSpace: NSNull()
            ]
        )

        let rotatedImage = originalImage.imageByApplyingTransform(rotationTransform)

        filter.setValue(rotatedImage, forKey: kCIInputImageKey)

        guard let filteredImage = filter.outputImage else {
            return
        }

        // Draw the filtered image into the GLKView's framebuffer via the shared EAGLContext.
        context.drawImage(filteredImage, inRect: targetRect, fromRect: filteredImage.extent)

        glView.display()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        let seconds = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        print("dropped sample buffer: \(seconds)")
    }
}
2020-07-07