didOutputSampleBuffer dropping frames

Date: 2016-04-20 09:36:15

Tags: swift avfoundation cifilter ciimage cmsamplebuffer

I'm writing an app that captures long-exposure images.

I use func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) to get each CMSampleBuffer and apply a CIFilter using CILightenBlendMode.

The problem is that the blending takes too long and frames get dropped. I tried copying the buffer:

var copiedBuffer:CMSampleBuffer?
CMSampleBufferCreateCopy(nil, sampleBuffer, &copiedBuffer)
blendImages(copiedBuffer!)

But that didn't help; frames are still being dropped.

Full code:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // Only process frames while a long-exposure capture is running
    if(CameraService.longExposureRunning){
        // Copy the sample buffer so the original can be handed back to the capture pipeline
        var copiedBuffer:CMSampleBuffer?
        CMSampleBufferCreateCopy(nil, sampleBuffer, &copiedBuffer)
        blendImages(copiedBuffer!)
    }
}

func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // Called whenever the session has to discard a late frame
    print("Dropped")
}


func blendImages(buffer:CMSampleBuffer){

    let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
    dispatch_async(dispatch_get_global_queue(priority, 0)){
        // Wrap the frame's pixel buffer in a CIImage
        let pixelBuffer = CMSampleBufferGetImageBuffer(buffer)
        let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

        if let backgroundImage = self.lastImage{
            // Blend the new frame over the accumulated image so far
            let blendEffect = CIFilter(name: "CILightenBlendMode")
            blendEffect?.setValue(backgroundImage, forKey: kCIInputBackgroundImageKey)
            blendEffect?.setValue(cameraImage, forKey: kCIInputImageKey)
            self.lastImage = blendEffect?.outputImage
            print("Blending")
        }else{
            // First frame: nothing to blend with yet
            self.lastImage = cameraImage
        }

        let filteredImage = UIImage(CIImage: self.lastImage!)
        dispatch_async(dispatch_get_main_queue())
        {
            self.imageView.image = filteredImage
        }
    }
}

2 Answers:

Answer 0 (score: 1)

I suspect Core Image is concatenating all of the frames into one enormous kernel. You may find CIImageAccumulator helps, but I was able to get your code working by forcing Core Image to render its chain and start afresh with each frame.
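(Editor's note, not part of the original answer: a minimal sketch of what the CIImageAccumulator route might look like, assuming the Swift 2 / iOS 9 bridging of its extent/format initializer; the extent and the accumulate helper here are purely illustrative.)

// The accumulator stores an already-rendered composite, so each new frame is
// blended against a flat image instead of an ever-growing filter chain.
let accumulator = CIImageAccumulator(
    extent: CGRect(x: 0, y: 0, width: 1920, height: 1080),
    format: kCIFormatARGB8)

func accumulate(cameraImage: CIImage) {
    let blend = CIFilter(name: "CILightenBlendMode")!
    blend.setValue(accumulator.image(), forKey: kCIInputBackgroundImageKey)
    blend.setValue(cameraImage, forKey: kCIInputImageKey)
    // setImage renders and stores the blend; accumulator.image() is then ready for display
    accumulator.setImage(blend.outputImage!)
}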

I've changed the type of the lastImage variable to an optional UIImage, and added a constant CIContext named context. With those changes, this works nicely:

Use let context: CIContext = CIContext(options: [kCIContextUseSoftwareRenderer: false]) so that rendering happens on the GPU rather than the CPU.
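For context, the two declarations the code below relies on would look something like this (only the context line is given in the answer; the lastImage declaration is implied by the code):

let context = CIContext(options: [kCIContextUseSoftwareRenderer: false]) // GPU-backed rendering
var lastImage: UIImage?                                                  // rendered composite of the frames so far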

func blendImages(buffer:CMSampleBuffer){

    let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
    dispatch_async(dispatch_get_global_queue(priority, 0)){
        let pixelBuffer = CMSampleBufferGetImageBuffer(buffer)
        let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

        if let backgroundImage = self.lastImage {
            let blendEffect = CIFilter(name: "CILightenBlendMode")!

            blendEffect.setValue(
                CIImage(image: backgroundImage),
                forKey: kCIInputBackgroundImageKey)

            blendEffect.setValue(
                cameraImage,
                forKey: kCIInputImageKey)

            // Render the blend immediately so the filter chain never grows
            let imageRef = self.context.createCGImage(
                blendEffect.outputImage!,
                fromRect: blendEffect.outputImage!.extent)

            self.lastImage = UIImage(CGImage: imageRef)
            print("Blending")
        }else{
            // First frame: just render it and store it as the starting image
            let imageRef = self.context.createCGImage(
                cameraImage,
                fromRect: cameraImage.extent)

            self.lastImage = UIImage(CGImage: imageRef)
        }

        let filteredImage = self.lastImage
        dispatch_async(dispatch_get_main_queue())
        {
            self.imageView.image = filteredImage
        }
    }
}

Funky effect!

Simon

Answer 1 (score: 0)

The most obvious thing I can think of is to check your output settings.

Make sure expectsMediaDataInRealTime is set to true on your AVAssetWriterInput.

https://developer.apple.com/library/tvos/documentation/AVFoundation/Reference/AVAssetWriterInput_Class/index.html#//apple_ref/occ/instp/AVAssetWriterInput/expectsMediaDataInRealTime
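
A minimal sketch of that setting, assuming the frames are also being written out with an AVAssetWriter (the question only shows the preview path, so the writerInput here is hypothetical; nil output settings means samples are appended without re-encoding):

let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: nil)

// Required for live capture: the input must not stall the capture pipeline waiting for data
writerInput.expectsMediaDataInRealTime = true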
