I'm trying to get depth data from the camera in iOS 11 with AVDepthData, but when I set up a photoOutput with the AVCapturePhotoCaptureDelegate, photo.depthData is nil.
So I tried setting up an AVCaptureDepthDataOutput with the AVCaptureDepthDataOutputDelegate, but I don't know how to actually capture the depth data from it.
Has anyone ever gotten an image from AVDepthData?
Edit:
Here's the code I tried:
// delegates: AVCapturePhotoCaptureDelegate & AVCaptureDepthDataOutputDelegate

@IBOutlet var image_view: UIImageView!
@IBOutlet var capture_button: UIButton!

var captureSession: AVCaptureSession?
var sessionOutput: AVCapturePhotoOutput?
var depthOutput: AVCaptureDepthDataOutput?
var previewLayer: AVCaptureVideoPreviewLayer?

@IBAction func capture(_ sender: Any) {
    self.sessionOutput?.capturePhoto(with: AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg]), delegate: self)
}

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    self.previewLayer?.removeFromSuperlayer()
    self.image_view.image = UIImage(data: photo.fileDataRepresentation()!)

    let depth_map = photo.depthData?.depthDataMap
    print("depth_map:", depth_map) // is nil
}

func depthDataOutput(_ output: AVCaptureDepthDataOutput, didOutput depthData: AVDepthData, timestamp: CMTime, connection: AVCaptureConnection) {
    print("depth data") // never called
}

override func viewDidLoad() {
    super.viewDidLoad()

    self.captureSession = AVCaptureSession()
    self.captureSession?.sessionPreset = .photo

    self.sessionOutput = AVCapturePhotoOutput()
    self.depthOutput = AVCaptureDepthDataOutput()
    self.depthOutput?.setDelegate(self, callbackQueue: DispatchQueue(label: "depth queue"))

    do {
        let device = AVCaptureDevice.default(for: .video)
        let input = try AVCaptureDeviceInput(device: device!)

        if (self.captureSession?.canAddInput(input))! {
            self.captureSession?.addInput(input)

            if (self.captureSession?.canAddOutput(self.sessionOutput!))! {
                self.captureSession?.addOutput(self.sessionOutput!)

                if (self.captureSession?.canAddOutput(self.depthOutput!))! {
                    self.captureSession?.addOutput(self.depthOutput!)

                    self.previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession!)
                    self.previewLayer?.frame = self.image_view.bounds
                    self.previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
                    self.previewLayer?.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
                    self.image_view.layer.addSublayer(self.previewLayer!)
                }
            }
        }
    } catch {}

    self.captureSession?.startRunning()
}
I'm trying two approaches: one where photo.depthData comes back nil, and one where the depth delegate method is never called.
Does anyone know what I'm missing?
First, you need to use the dual camera; otherwise you won't get any depth data:
let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)
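For the still-photo path in your question (photo.depthData), you'll also most likely need to request depth delivery in the per-capture settings, not just on the photo output (which is enabled further down). A minimal sketch of what your capture action could look like under that assumption:

@IBAction func capture(_ sender: Any) {
    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
    // Request depth for this capture. This only works when the photo output has
    // isDepthDataDeliveryEnabled = true and the input is the dual camera;
    // otherwise capturing with these settings will raise an exception.
    settings.isDepthDataDeliveryEnabled = true
    self.sessionOutput?.capturePhoto(with: settings, delegate: self)
}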
And keep a reference to your queue:
let dataOutputQueue = DispatchQueue(label: "data queue", qos: .userInitiated, attributes: [], autoreleaseFrequency: .workItem)
You'll also probably want to synchronize the video and depth data:
var outputSynchronizer: AVCaptureDataOutputSynchronizer?
Then you can synchronize the two outputs in your viewDidLoad() method like this:
if sessionOutput?.isDepthDataDeliverySupported == true {
    sessionOutput?.isDepthDataDeliveryEnabled = true
    depthDataOutput?.connection(with: .depthData)!.isEnabled = true
    depthDataOutput?.isFilteringEnabled = true

    outputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [sessionOutput!, depthDataOutput!])
    outputSynchronizer!.setDelegate(self, queue: self.dataOutputQueue)
}
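Once the outputs are synchronized, the depth data arrives in the synchronizer's delegate callback instead of depthDataOutput(_:didOutput:timestamp:connection:). A rough sketch of that callback, assuming your class also adopts AVCaptureDataOutputSynchronizerDelegate and reusing the depthDataOutput name from the snippet above (your depthOutput property) plus your image_view; the CIImage conversion is just one quick way to see the map, and the raw values usually need normalization to display nicely:

func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    // Pull the depth data that was time-matched to this delivery.
    guard let syncedDepth = synchronizedDataCollection.synchronizedData(for: depthDataOutput!) as? AVCaptureSynchronizedDepthData,
          !syncedDepth.depthDataWasDropped else { return }

    let depthData = syncedDepth.depthData      // AVDepthData
    let depthMap = depthData.depthDataMap      // CVPixelBuffer of depth/disparity values

    // Wrap the pixel buffer in a CIImage to visualize it (values may need scaling).
    let ciImage = CIImage(cvPixelBuffer: depthMap)
    DispatchQueue.main.async {
        self.image_view.image = UIImage(ciImage: ciImage)
    }
}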
I would recommend watching WWDC session 507 - they also provide a full sample app that does exactly what you want.