Now, I can capture an image using AVFoundation, as shown below. But how can I capture images continuously (e.g. 20 or 30 images)?
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments)
    {
        // Do something with the attachments.
        NSLog(@"attachments: %@", exifAttachments);
    }
    else
    {
        NSLog(@"no attachments");
    }

    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    // Use the image.
}];
Whoops, I forgot to follow up on your comment:
The scenario you describe would most easily be solved by using some sort of timer.
Depending on what level of accuracy you need, there are different candidates to look for:
- A repeating NSTimer is pretty easy and straightforward to use. As this class works in conjunction with run loops, there are some pitfalls to be aware of, one being that the accuracy is limited (but for what you seemingly want to accomplish, that should not be a problem at all).
- If you need a little more accuracy in the long run, you can still use NSTimer: use initWithFireDate:… with repeats:NO and create a new timer this way (using a date relative to the intended fire date of the old one) each time the old one fires.
- If you really need a high degree of accuracy, you should have a look at dispatch timers. They are part of GCD and thus a fairly low-level technology. If you decide to follow this path for accuracy, you should probably pass your own dispatch queue in the call to dispatch_source_create.
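As an illustration, the repeating-NSTimer approach might look like the following sketch. The property names, the `-startBurstOfImages:interval:` method, and the `-captureStillImage` helper (a wrapper around the `captureStillImageAsynchronouslyFromConnection:completionHandler:` call from the question) are assumptions for the example, not part of the original code:

```objc
// Assumed properties on the capturing view controller / object:
// @property (nonatomic, strong) NSTimer *captureTimer;
// @property (nonatomic, assign) NSUInteger shotsRemaining;

- (void)startBurstOfImages:(NSUInteger)count interval:(NSTimeInterval)interval
{
    self.shotsRemaining = count;
    self.captureTimer = [NSTimer scheduledTimerWithTimeInterval:interval
                                                         target:self
                                                       selector:@selector(captureTimerFired:)
                                                       userInfo:nil
                                                        repeats:YES];
}

- (void)captureTimerFired:(NSTimer *)timer
{
    // Hypothetical wrapper around the asynchronous still-image capture above.
    [self captureStillImage];

    if (--self.shotsRemaining == 0) {
        [timer invalidate];
        self.captureTimer = nil;
    }
}
```

For example, `[self startBurstOfImages:20 interval:0.5]` would snap roughly one picture every half second until 20 have been taken. Note that the captures themselves complete asynchronously, so the images may arrive slightly after each timer tick.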
In any case, your code for snapping the picture goes into the respective handler.
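A dispatch-timer version of the same idea could be sketched as follows; `captureStillImage` is again a hypothetical wrapper around the capture call from the question, and the queue label is arbitrary:

```objc
// Create a private serial queue, as suggested above, and a timer source on it.
dispatch_queue_t queue = dispatch_queue_create("com.example.capturetimer", DISPATCH_QUEUE_SERIAL);
dispatch_source_t timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, queue);

// Fire every 0.5 s, starting now, with 10 ms of allowed leeway.
dispatch_source_set_timer(timer,
                          dispatch_time(DISPATCH_TIME_NOW, 0),
                          (uint64_t)(0.5 * NSEC_PER_SEC),
                          (uint64_t)(0.01 * NSEC_PER_SEC));

dispatch_source_set_event_handler(timer, ^{
    // The picture-snapping code goes into this handler.
    [self captureStillImage];
});

dispatch_resume(timer);

// Keep a strong reference to `timer` somewhere; call
// dispatch_source_cancel(timer) when the burst is done.
```

The main design difference from NSTimer is that the handler runs on the private queue rather than the run loop, so remember to hop back to the main queue for any UI work.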