How to avoid dropped frames in an AVCaptureSession using AVCaptureVideoDataOutput

Asked: 2018-07-12 21:26:25

Tags: c# ios xamarin avcapturesession

I have set up an AVCaptureSession in a Xamarin.iOS project, and I see a high dropped-frame rate no matter what frame rate I set on the device. Even when I simplify the session to use just the back camera and a single output (an AVCaptureVideoDataOutput), my delegate's DidOutputSampleBuffer method is called for only about 2/3 of the frames. That remains true even when I set the minimum and maximum frame durations to 1/3 of a second and strip the delegate method down so it does nothing but increment a counter. Apologies for the wall of code below, but I can't see where things might be going wrong.

Capture session setup:

AVCaptureDeviceType[] deviceTypes = new AVCaptureDeviceType[] {
    AVCaptureDeviceType.BuiltInTrueDepthCamera,
    AVCaptureDeviceType.BuiltInDualCamera,
    AVCaptureDeviceType.BuiltInWideAngleCamera };
AVCaptureDeviceDiscoverySession discoverySession =
    AVCaptureDeviceDiscoverySession.Create(deviceTypes,
        AVMediaType.Video, AVCaptureDevicePosition.Back);
AVCaptureDevice[] devices = discoverySession.Devices;
AVCaptureDevice device = devices.FirstOrDefault();
if (device == null)
{
    return false;
}

previewView.VideoPreviewLayer.Session = null;
m_CaptureSession?.Dispose();
m_CaptureSession = new AVCaptureSession();

NSError error;
AVCaptureDeviceInput input = new AVCaptureDeviceInput(device, out error);
if (error == null)
{
    if (m_CaptureSession.CanAddInput(input))
    {
        m_CaptureSession.AddInput(input);
    }

    // note: will try for 120fps on my iPhone 5S, but I've
    // also overridden the settings with a hard-coded 3fps
    // and get the same results (dropping a large percentage
    // of frames)
    ConfigureCameraForHighestFrameRate(device);

    AVCaptureVideoDataOutput output = new AVCaptureVideoDataOutput();
    AVVideoSettingsUncompressed settings =
        new AVVideoSettingsUncompressed();
    settings.PixelFormatType = CVPixelFormatType.CV32BGRA;
    output.UncompressedVideoSetting = settings;

    m_OutputQueue = new DispatchQueue("outputQueue", false);
    m_OutputRecorder = new OutputRecorder(device);
    output.SetSampleBufferDelegateQueue(m_OutputRecorder, m_OutputQueue);
    if (m_CaptureSession.CanAddOutput(output))
    {
        m_CaptureSession.AddOutput(output);
    }

    // don't hook up preview view, so we can test the video data
    // output on its own...
    // previewView.VideoPreviewLayer.Session = m_CaptureSession;

    m_CaptureSession.StartRunning();
}
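
The question calls ConfigureCameraForHighestFrameRate without showing it. For context, a plausible sketch of such a method (an assumption, not the asker's actual code) follows Apple's documented pattern: scan the device formats for the highest supported frame rate, then lock the device and apply that format and frame duration.

```csharp
// Hypothetical sketch of ConfigureCameraForHighestFrameRate; the asker's
// real implementation is not shown in the question.
void ConfigureCameraForHighestFrameRate(AVCaptureDevice device)
{
    AVCaptureDeviceFormat bestFormat = null;
    AVFrameRateRange bestRange = null;

    // Find the format/range pair with the highest maximum frame rate.
    foreach (AVCaptureDeviceFormat format in device.Formats)
    {
        foreach (AVFrameRateRange range in format.VideoSupportedFrameRateRanges)
        {
            if (bestRange == null || range.MaxFrameRate > bestRange.MaxFrameRate)
            {
                bestFormat = format;
                bestRange = range;
            }
        }
    }

    if (bestFormat == null)
    {
        return;
    }

    NSError error;
    if (device.LockForConfiguration(out error))
    {
        device.ActiveFormat = bestFormat;
        device.ActiveVideoMinFrameDuration = bestRange.MinFrameDuration;
        device.ActiveVideoMaxFrameDuration = bestRange.MinFrameDuration;
        device.UnlockForConfiguration();
    }
}
```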

Delegate methods:

public override void DidOutputSampleBuffer(AVCaptureOutput captureOutput,
    CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    lock (m_Mutex)
    {
        if (!m_Recording)
        {
            return;
        }
        m_FrameCount++;
        return;
    }
}

public override void DidDropSampleBuffer(AVCaptureOutput captureOutput,
    CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    lock (m_Mutex)
    {
        if (!m_Recording)
        {
            return;
        }
        m_DroppedFrameCount++;
    }
}

Is there anything obvious that I'm doing wrong?

1 Answer:

Answer 0 (score: 2)

So the problem was apparently in the disposal of the sampleBuffer, which is the delegate's responsibility. I was disposing of the buffers for frames produced while recording, but not for frames captured before recording started. The solution is to make sure the sampleBuffer is disposed on every callback, whether or not the app is recording. This may be a Xamarin-specific problem and solution; I don't know whether any of this is useful to Swift developers.
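
In code form, the fix described above amounts to disposing the buffer unconditionally in the delegate, for example (a sketch based on the question's delegate, not the answerer's verbatim code):

```csharp
public override void DidOutputSampleBuffer(AVCaptureOutput captureOutput,
    CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    try
    {
        lock (m_Mutex)
        {
            if (m_Recording)
            {
                m_FrameCount++;
            }
        }
    }
    finally
    {
        // Dispose the buffer on every callback, recording or not.
        // In Xamarin.iOS, holding the CMSampleBuffer keeps it out of the
        // capture pipeline's small buffer pool, which starves the session
        // and causes dropped frames.
        sampleBuffer.Dispose();
    }
}
```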
