Unable to capture a photo with AVCapturePhotoOutput (Swift + Xcode)

Asked: 2016-12-13 14:18:43

Tags: ios swift xcode swift3 avfoundation

I am building a custom camera app. The tutorial I followed uses AVCaptureStillImageOutput, which is deprecated in iOS 10. I have the camera set up, but I am now stuck on how to actually capture a photo.

Here is the complete view in which I have the camera:

import UIKit
import AVFoundation

var cameraPos = "back"

class View3: UIViewController,UIImagePickerControllerDelegate,UINavigationControllerDelegate {


@IBOutlet weak var clickButton: UIButton!
@IBOutlet var cameraView: UIView!
var session: AVCaptureSession?
var stillImageOutput: AVCapturePhotoOutput?
var videoPreviewLayer: AVCaptureVideoPreviewLayer?

override func viewDidLoad() {
    super.viewDidLoad()        
}

override func didReceiveMemoryWarning() {
    super.didReceiveMemoryWarning()
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    clickButton.center.x = cameraView.bounds.width/2
    loadCamera()
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
 }

@IBAction func clickCapture(_ sender: UIButton) {

    if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
        // This is where I need help
    }
}

@IBAction func changeDevice(_ sender: UIButton) {
    if cameraPos == "back" {
        cameraPos = "front"
    } else {
        cameraPos = "back"
    }


    loadCamera()
}

func loadCamera()
{
    session?.stopRunning()
    videoPreviewLayer?.removeFromSuperlayer()

    session = AVCaptureSession()
    session!.sessionPreset = AVCaptureSessionPresetPhoto

    var backCamera = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .front)

    if cameraPos == "back"
    {
        backCamera = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .back)
    }

    var error: NSError?
    var input: AVCaptureDeviceInput!
    do {
        input = try AVCaptureDeviceInput(device: backCamera)
    } catch let error1 as NSError {
        error = error1
        input = nil
        print(error!.localizedDescription)
    }

    if error == nil && session!.canAddInput(input) {
        session!.addInput(input)

        stillImageOutput = AVCapturePhotoOutput()

        if session!.canAddOutput(stillImageOutput) {
            session!.addOutput(stillImageOutput)
            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
            videoPreviewLayer?.frame = cameraView.bounds
            videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
            videoPreviewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait

            cameraView.layer.addSublayer(videoPreviewLayer!)
            session!.startRunning()
        }
    }
}
}

This is where I need help:

@IBAction func clickCapture(_ sender: UIButton) {

    if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
        // This is where I need help
    }
}

I found an answer here: How to use AVCapturePhotoOutput, but I don't understand how to incorporate that code into mine, because it involves declaring a new class.

1 Answer:

Answer 0 (score: 5)

You are almost there.

For output as AVCapturePhotoOutput

Check the AVCapturePhotoOutput documentation for more help.

These are the steps to capture a photo:

  1. Create an AVCapturePhotoOutput object. Use its properties to determine supported capture settings and to enable certain features (for example, whether to capture Live Photos).
  2. Create and configure an AVCapturePhotoSettings object to choose features and settings for a specific capture (for example, whether to enable image stabilization or flash).
  3. Capture an image by passing your photo settings object to the capturePhoto(with:delegate:) method along with a delegate object implementing the AVCapturePhotoCaptureDelegate protocol. The photo capture output then calls your delegate to notify you of significant events during the capture process.
  4. Use the following code in your clickCapture method, and don't forget to conform to and implement the delegate in your class (a sketch of the delegate callback follows the snippet below).

    let settings = AVCapturePhotoSettings()
    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                         kCVPixelBufferWidthKey as String: 160,
                         kCVPixelBufferHeightKey as String: 160]
    settings.previewPhotoFormat = previewFormat
    // cameraOutput refers to the AVCapturePhotoOutput instance (stillImageOutput in your code)
    self.cameraOutput.capturePhoto(with: settings, delegate: self)
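
Here is a minimal sketch of the delegate side, based on the iOS 10 / Swift 3 AVCapturePhotoCaptureDelegate API. It assumes your view controller (View3 above) also adopts AVCapturePhotoCaptureDelegate; the print statements are placeholders, so adapt the error handling and what you do with the resulting UIImage to your app.

    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        // Bail out if the capture failed or no sample buffer was delivered.
        guard error == nil, let sampleBuffer = photoSampleBuffer else {
            print("Photo capture failed: \(String(describing: error))")
            return
        }
        // Convert the captured sample buffer to JPEG data, then to a UIImage.
        if let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer),
           let image = UIImage(data: imageData) {
            // Use the image here, e.g. show it in an image view or save it to the photo library.
            print("Captured image with size \(image.size)")
        }
    }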
    

For output as AVCaptureStillImageOutput

If you intend to snap a photo from the video connection, you can follow these steps.

Step 1: Get the connection

    if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
      // ...
      // Code for photo capture goes here...
    }
    

Step 2: Capture the photo

• Call the captureStillImageAsynchronously(from:completionHandler:) method on stillImageOutput.
• The sampleBuffer passed to the completion handler represents the captured data.

    stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
      // ...
      // Process the image data (sampleBuffer) here to get an image file we can put in our captureImageView
    })
    

Step 3: Process the image data

• We need to take a few steps to process the image data in sampleBuffer so that we end up with a UIImage we can assign to captureImageView and easily use elsewhere in our app.

    if sampleBuffer != nil {
      let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer!)
      let dataProvider = CGDataProvider(data: imageData! as CFData)
      let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
      let image = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: .right)
      // ...
      // Add the image to captureImageView here...
    }
    

Step 4: Save the image

Save the image to the photo library or display it in an image view, whichever your app needs (a minimal sketch follows below).
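
As a minimal sketch (assuming image is the UIImage produced in Step 3 and the app's Info.plist already contains an NSPhotoLibraryUsageDescription entry), saving to the photo library can be as simple as:

    // Save the captured image to the photo library.
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)

    // Or display it in an image view instead, e.g. the captureImageView
    // referred to in the steps above:
    // captureImageView.image = image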

For more details, check the Create custom camera view guide under Snapping a Photo.