Get output from AVCaptureSession in Swift to send to server

iamjonesy · Oct 5, 2014 · Viewed 8.3k times

I've managed to write some code that opens the camera and previews the video. I now want to capture the frames from the output and send them to a server, ideally encoded as H.264.

Here's what I've got:

import UIKit
import AVFoundation

class ViewController: UIViewController {

    let captureSession = AVCaptureSession()
    var previewLayer : AVCaptureVideoPreviewLayer?

    // If we find a device we'll store it here for later use
    var captureDevice : AVCaptureDevice?

    override func viewDidLoad() {
        super.viewDidLoad()            
        // Do any additional setup after loading the view, typically from a nib.

        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        let devices = AVCaptureDevice.devices()

        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the back camera
                if(device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                    if captureDevice != nil {
                        println("Capture device found")
                        beginSession()
                    }
                }
            }
        }

    }

    func beginSession() {

        // Create the input first so the error can be checked before
        // attaching it to the session
        var err : NSError? = nil
        let deviceInput = AVCaptureDeviceInput(device: captureDevice, error: &err)

        if err != nil {
            println("error: \(err?.localizedDescription)")
            return
        }

        if captureSession.canAddInput(deviceInput) {
            captureSession.addInput(deviceInput)
        }

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        // Size the layer in the superlayer's coordinate space (bounds, not frame),
        // and unwrap it -- addSublayer expects a non-optional CALayer
        previewLayer!.frame = self.view.layer.bounds
        self.view.layer.addSublayer(previewLayer!)

        captureSession.startRunning()

    }

}

This opens the camera successfully, and I can preview the footage.

I've found this Objective-C code that looks like it gets the output, but I don't know how to convert it to Swift. It uses AVCaptureVideoDataOutput, AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write frames out to an H.264-encoded movie file.
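
For reference, the writer pipeline that code describes would look roughly like this in Swift. This is a sketch only, in the Swift syntax of the era; the 1280x720 dimensions and the output URL parameter are assumptions, not values from the original code:

func setUpWriter(outputURL: NSURL) -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInputPixelBufferAdaptor)? {
    var err : NSError? = nil
    let writer = AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie, error: &err)
    if err != nil {
        println("writer error: \(err?.localizedDescription)")
        return nil
    }

    // H.264 compression settings; width and height are assumed values
    let settings : [NSObject : AnyObject] = [
        AVVideoCodecKey  : AVVideoCodecH264,
        AVVideoWidthKey  : 1280,
        AVVideoHeightKey : 720
    ]

    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: settings)
    writerInput.expectsMediaDataInRealTime = true

    // The adaptor is what lets you append raw CVPixelBuffers with timestamps
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
        sourcePixelBufferAttributes: nil)

    writer.addInput(writerInput)
    return (writer, writerInput, adaptor)
}

// Start once, before the first frame arrives:
//     writer.startWriting()
//     writer.startSessionAtSourceTime(kCMTimeZero)
// Then, for each captured frame:
//     if writerInput.readyForMoreMediaData { writerInput.appendSampleBuffer(sampleBuffer) }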

Can I use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?

Can someone help convert it, or give me pointers on how to get the frames out of my current code?
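
To get raw frames out of the session above, the usual route is an AVCaptureVideoDataOutput with a sample buffer delegate. A minimal sketch, again in era-appropriate Swift, that could be added to the ViewController above (the queue name and the BGRA pixel format are assumptions):

// Declare conformance in the class line:
// class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

func addVideoOutput() {
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_32BGRA)]
    videoOutput.alwaysDiscardsLateVideoFrames = true

    // Deliver frames on a background serial queue (nil attributes = serial)
    let queue = dispatch_queue_create("videoQueue", nil)
    videoOutput.setSampleBufferDelegate(self, queue: queue)

    if captureSession.canAddOutput(videoOutput) {
        captureSession.addOutput(videoOutput)
    }
}

// Called once per captured frame
func captureOutput(captureOutput: AVCaptureOutput!,
    didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
    fromConnection connection: AVCaptureConnection!) {
    // sampleBuffer holds one raw video frame; hand it to an AVAssetWriterInput
    // (or your own encoder) here, or pull out the pixel buffer:
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
}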

Answer

pteofil · May 13, 2015

Apple has a sample project, AVCam, in Objective-C that works with these things.

Here's another question on SO about using AVCam in Swift.

I personally used https://github.com/alex-chan/AVCamSwift, and it works well. I only had to convert it to the latest Swift syntax in Xcode.

Another suggestion is to take the Objective-C code that you found and import it into your Swift code through a bridging header.
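
A rough sketch of that route, with hypothetical names throughout (H264Encoder stands in for whatever class the found Objective-C code actually defines):

// In the bridging header (e.g. MyApp-Bridging-Header.h, set under
// Build Settings > Objective-C Bridging Header), you would add:
//
//     #import "H264Encoder.h"
//
// The Objective-C class then becomes visible to Swift with no import statement:

let encoder = H264Encoder()               // hypothetical Objective-C class
encoder.encodeSampleBuffer(sampleBuffer)  // hypothetical method name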