
iOS - Objective-C: a simple way to take a photo without a camera interface - just get a picture from the camera and save it to a file

I can't find a simple way of taking a photo without a camera interface. I just need to get a picture from the camera and save it to a file.


1 Answer


I used this code to take a photo with the front camera. Not all of the code is mine, but I couldn't find a link to the original source. The code also produces a shutter sound. Image quality is not very good (it's quite dark), so it needs a tweak or two - see the edit below.

-(void) takePhoto 
{
    // Requires AVFoundation (#import <AVFoundation/AVFoundation.h>)
    AVCaptureDevice *frontalCamera;

    // Look for the front-facing camera among all video capture devices
    NSArray *allCameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for ( AVCaptureDevice *camera in allCameras )
    {
        if ( camera.position == AVCaptureDevicePositionFront )
        {
            frontalCamera = camera;
        }
    }

    if ( frontalCamera != nil )
    {
        photoSession = [[AVCaptureSession alloc] init];

        NSError *error;
        AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:frontalCamera error:&error];

        if ( !error && [photoSession canAddInput:input] )
        {
            [photoSession addInput:input];

            // Still-image output configured to deliver JPEG data
            AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];

            [output setOutputSettings:
             [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey,nil]];

            if ( [photoSession canAddOutput:output] )
            {
                [photoSession addOutput:output];

                // Find the connection carrying video from the camera to the output
                AVCaptureConnection *videoConnection = nil;

                for (AVCaptureConnection *connection in output.connections)
                {
                    for (AVCaptureInputPort *port in [connection inputPorts])
                    {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo] )
                        {
                            videoConnection = connection;
                            break;
                        }
                    }
                    if (videoConnection) { break; }
                }

                if ( videoConnection )
                {
                    [photoSession startRunning];

                    // Grab a single frame; the completion handler runs asynchronously
                    [output captureStillImageAsynchronouslyFromConnection:videoConnection
                                                        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

                        if (imageDataSampleBuffer != NULL)
                        {
                            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                            UIImage *photo = [[UIImage alloc] initWithData:imageData];
                            [self processImage:photo]; // custom method, see sketch below
                        }
                    }];
                }
            }
        }
    }
}

photoSession is an AVCaptureSession * ivar of the class holding the takePhoto method.
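
Since the question asks to save the picture to a file, the custom processImage: method could simply write the JPEG data to disk. Here is a minimal sketch, assuming you want the photo in the app's Documents directory; the file name and compression quality are assumptions:

// Hypothetical sketch of processImage: - writes the captured photo to
// Documents/photo.jpg (file name and quality setting are assumptions)
-(void) processImage:(UIImage *)photo
{
    NSData *jpegData = UIImageJPEGRepresentation(photo, 0.9); // 0.9 = compression quality

    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                   NSUserDomainMask, YES) firstObject];
    NSString *filePath = [documentsPath stringByAppendingPathComponent:@"photo.jpg"];

    NSError *writeError;
    if ( ![jpegData writeToFile:filePath options:NSDataWritingAtomic error:&writeError] )
    {
        NSLog(@"Could not save photo: %@", writeError);
    }
}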

EDIT (tweak): If you change the if ( videoConnection ) block to the code below, you will add a one-second delay and get a good image.

if ( videoConnection )
{
    [photoSession startRunning];

    // Wait one second so the camera can adjust exposure before capturing
    dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC);
    dispatch_after(popTime, dispatch_get_main_queue(), ^(void){

        [output captureStillImageAsynchronouslyFromConnection:videoConnection
                                            completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

            if (imageDataSampleBuffer != NULL)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *photo = [[UIImage alloc] initWithData:imageData];
                [self processImage:photo];
            }
        }];
    });
}

If the lag is not acceptable for your application, you could split the code into two parts: start the photoSession in viewDidAppear (or somewhere similar) and simply take an immediate snapshot whenever needed - usually after some user interaction. A sketch of that split follows the note below.

dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 0.25 * NSEC_PER_SEC);

also produces a good result, so there is no need for a whole second of lag.
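
As for splitting the code in two, here is a rough sketch under the same assumptions; setUpPhotoSession and the photoOutput ivar are hypothetical names standing in for the session setup already shown in takePhoto:

// Sketch: start the session early so the camera is already running and
// adjusted by the time a snapshot is requested. Assumes photoSession and
// photoOutput (the AVCaptureStillImageOutput) are ivars, and that
// setUpPhotoSession contains the setup code from takePhoto minus the capture.
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    [self setUpPhotoSession];
    [photoSession startRunning];
}

// Call this whenever a photo is needed, e.g. from a button handler
- (void)snapPhoto
{
    AVCaptureConnection *videoConnection =
        [photoOutput connectionWithMediaType:AVMediaTypeVideo];

    if ( !videoConnection ) { return; }

    [photoOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                             completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        if (imageDataSampleBuffer != NULL)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            [self processImage:[[UIImage alloc] initWithData:imageData]];
        }
    }];
}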

Note that this code is written to take a photo with the front camera - I'm sure you will know how to adapt it if you need to use the back camera.
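
For the back camera, the only change needed is the position check in the device loop:

// Select the back camera instead of the front one
if ( camera.position == AVCaptureDevicePositionBack )
{
    frontalCamera = camera; // the variable is then better named backCamera
}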

