The first step is to declare access to any user-private data types, which is a new requirement in iOS 10. You do this by adding a usage key to your app’s Info.plist
together with a purpose string.
If you use one of the following frameworks and fail to declare the usage, your app will crash the first time it attempts the access:
Contacts, Calendar, Reminders, Photos, Bluetooth Sharing, Microphone, Camera, Location, Health, HomeKit, Media Library, Motion, CallKit, Speech Recognition, SiriKit, TV Provider.
To avoid the crash, add the suggested key to Info.plist:
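Since the scanner below uses the camera, the key you need is NSCameraUsageDescription. A minimal Info.plist entry could look like this (the purpose string is just a sample, so use your own wording):

<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) uses the camera to scan barcodes</string>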
The system then shows this purpose string when it asks the user to allow access.
For more information, you can read this article:
I made a few small modifications to your BarcodeViewController
to make it work properly, as you can see below:
BarcodeViewController
import UIKit
import AVFoundation

protocol BarcodeDelegate {
    func barcodeReaded(barcode: String)
}

class BarcodeViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

    var delegate: BarcodeDelegate?
    var videoCaptureDevice: AVCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    var previewLayer: AVCaptureVideoPreviewLayer?
    var captureSession = AVCaptureSession()
    var code: String?

    override func viewDidLoad() {
        super.viewDidLoad()

        self.view.backgroundColor = UIColor.clear
        self.setupCamera()
    }

    private func setupCamera() {
        // Creating the input can fail (e.g. on the Simulator), so unwrap it safely
        guard let input = try? AVCaptureDeviceInput(device: videoCaptureDevice) else {
            print("Could not create video input")
            return
        }

        if self.captureSession.canAddInput(input) {
            self.captureSession.addInput(input)
        }

        self.previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

        if let videoPreviewLayer = self.previewLayer {
            videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
            videoPreviewLayer.frame = self.view.bounds
            view.layer.addSublayer(videoPreviewLayer)
        }

        let metadataOutput = AVCaptureMetadataOutput()
        if self.captureSession.canAddOutput(metadataOutput) {
            self.captureSession.addOutput(metadataOutput)

            metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
            metadataOutput.metadataObjectTypes = [AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code]
        } else {
            print("Could not add metadata output")
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        if (captureSession.isRunning == false) {
            captureSession.startRunning()
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        if (captureSession.isRunning == true) {
            captureSession.stopRunning()
        }
    }

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!) {
        // This is the delegate method that is called when a code is read
        for metadata in metadataObjects {
            if let readableObject = metadata as? AVMetadataMachineReadableCodeObject,
               let code = readableObject.stringValue {
                self.dismiss(animated: true, completion: nil)
                self.delegate?.barcodeReaded(barcode: code)
                print(code)
            }
        }
    }
}
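For completeness, here is a minimal sketch of how a presenting controller could conform to BarcodeDelegate and show the scanner. ScannerHostViewController and the programmatic presentation are my own illustrative assumptions, not part of your original code:

// Hypothetical host controller (not from your code): it presents
// BarcodeViewController and receives the scanned value via the delegate.
import UIKit

class ScannerHostViewController: UIViewController, BarcodeDelegate {

    func scanBarcode() {
        let barcodeViewController = BarcodeViewController()
        barcodeViewController.delegate = self
        self.present(barcodeViewController, animated: true, completion: nil)
    }

    // BarcodeDelegate callback, invoked after the scanner dismisses itself
    func barcodeReaded(barcode: String) {
        print("Scanned barcode: \(barcode)")
    }
}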
One of the important points was to declare the capture objects as properties and to start and stop the captureSession
inside the viewWillAppear(_:)
and viewWillDisappear(_:)
methods. In your previous code I think the session was never running, so the delegate method that processes the barcode was never entered.
I hope this helps you.