How to record a video with AVFoundation in Swift?

Garret Kaye · Oct 21, 2015 · Viewed 16.1k times

I am trying to figure out how to record a video using AVFoundation in Swift. I have gotten as far as creating a custom camera, but I have only figured out how to take still pictures with it, and I can't figure out how to record video. From what I understand, you have to use AVCaptureVideoDataOutput to get the data from the recording, but I can't figure out how to start the recording or implement the delegate methods.

The whole AVFoundation Programming Guide / Still and Video Media Capture section is in Objective-C, and I can't seem to decipher it. Here's my attempt to accomplish this task:

First, I set up the camera/capture session:

override func viewDidLoad() {
    super.viewDidLoad()

    captureSession.sessionPreset = AVCaptureSessionPresetHigh
    let devices = AVCaptureDevice.devices()
    for device in devices {
        if (device.hasMediaType(AVMediaTypeVideo)) {
            if(device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
                if captureDevice != nil {
                    beginSession()
                }
            }
        }
    }

}

Then, once beginSession() is called, I set up the live feed:

func beginSession() {
    var err : NSError? = nil
    captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err))
    if err != nil {
        println("error: \(err?.localizedDescription)")
    }
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    self.cameraView.layer.addSublayer(previewLayer)
    self.cameraView.bringSubviewToFront(takePhotoButton)
    self.cameraView.bringSubviewToFront(self.snappedPicture)
    self.cameraView.bringSubviewToFront(self.backButton)
    previewLayer?.frame = self.cameraView.layer.frame
    captureSession.startRunning()
}

This is where I get stuck. When the user presses record, this should actually record and capture a video:

@IBAction func takeVideoAction(sender: AnyObject) {

    var recordingDelegate:AVCaptureFileOutputRecordingDelegate? = self

    var videoFileOutput = AVCaptureMovieFileOutput()
    self.captureSession.addOutput(videoFileOutput)

    let filePath = NSURL(fileURLWithPath: "filePath")

    videoFileOutput.startRecordingToOutputFileURL(filePath, recordingDelegate: recordingDelegate)

}

Then I call self.videoFileOutput.stopRecording().
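
For that self.videoFileOutput.stopRecording() call to compile, the movie file output presumably has to live in a stored property rather than the local variable created inside takeVideoAction. A minimal sketch of that arrangement (the property and the stop action here are hypothetical names, not part of the original code):

let videoFileOutput = AVCaptureMovieFileOutput()

@IBAction func stopVideoAction(sender: AnyObject) {
    // Stops writing the movie file; the didFinishRecording... delegate
    // method is then called with the output file URL.
    self.videoFileOutput.stopRecording()
}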

And lastly, the delegate methods:

func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) {
    return
}

func captureOutput(captureOutput: AVCaptureFileOutput!, didStartRecordingToOutputFileAtURL fileURL: NSURL!, fromConnections connections: [AnyObject]!) {
    return
}

And, as you can imagine, nothing happens, because I can't figure out how to implement it correctly.

Answer

liuyaodong · Oct 23, 2015

I have tried your code. When a meaningful filePath is set, everything works fine. Change your filePath like this and the delegate methods will be invoked:

    let documentsURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0]
    let filePath = documentsURL.URLByAppendingPathComponent("temp")

    // Do recording and save the output to the `filePath`
    videoFileOutput.startRecordingToOutputFileURL(filePath, recordingDelegate: recordingDelegate)
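
To confirm that the recording actually happened, the delegate methods from the question can simply log what they receive. A minimal sketch, reusing the same Swift 1.x-style delegate signatures shown in the question (the println calls are only illustrative):

    func captureOutput(captureOutput: AVCaptureFileOutput!, didStartRecordingToOutputFileAtURL fileURL: NSURL!, fromConnections connections: [AnyObject]!) {
        // Recording has begun; fileURL is the URL that was passed to
        // startRecordingToOutputFileURL.
        println("Started recording to \(fileURL)")
    }

    func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) {
        if error != nil {
            println("Recording finished with error: \(error.localizedDescription)")
        } else {
            // The finished movie is at outputFileURL, i.e. the file in the
            // documents directory created above.
            println("Recording finished successfully: \(outputFileURL)")
        }
    }

Logging here also makes it obvious whether stopRecording() was actually reached and whether the output URL is writable.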