Converting CVImageBufferRef to PNG

I want to write all frames from the camera input to disk using QTKit.framework, but all the images I get are half transparent and missing some colors; maybe a colorspace problem? ;( The frames look fine in the preview view, but when I write them to disk "something" happens, and I wonder what it is.

-(void)savePNGImage:(CGImageRef)imageRef path:(NSString *)path {
    NSURL *outURL = [[NSURL alloc] initFileURLWithPath:path];
    // Write the CGImage to disk as a PNG via an ImageIO destination.
    CGImageDestinationRef dr = CGImageDestinationCreateWithURL((CFURLRef)outURL, (CFStringRef)@"public.png", 1, NULL);
    CGImageDestinationAddImage(dr, imageRef, NULL);
    CGImageDestinationFinalize(dr);
    CFRelease(dr);
    [outURL release];
}
-(void)saveJPEGImage:(CGImageRef)imageRef path:(NSString *)path {
    // Options for the JPEG destination: only the lossy compression quality.
    CFMutableDictionaryRef mSaveMetaAndOpts = CFDictionaryCreateMutable(nil, 0,
                                                                        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(mSaveMetaAndOpts, kCGImageDestinationLossyCompressionQuality,
                         [NSNumber numberWithFloat:1.0]);   // set the compression quality here
    NSURL *outURL = [[NSURL alloc] initFileURLWithPath:path];
    CGImageDestinationRef dr = CGImageDestinationCreateWithURL((CFURLRef)outURL, (CFStringRef)@"public.jpeg", 1, NULL);
    CGImageDestinationAddImage(dr, imageRef, mSaveMetaAndOpts);
    CGImageDestinationFinalize(dr);
    CFRelease(dr);
    CFRelease(mSaveMetaAndOpts);
    [outURL release];
}


- (void)captureOutput:(QTCaptureOutput *)captureOutput 
  didOutputVideoFrame:(CVImageBufferRef)videoFrame 
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer 
       fromConnection:(QTCaptureConnection *)connection{

    CVPixelBufferLockBaseAddress(videoFrame,0); 
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(videoFrame); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(videoFrame); 
    size_t width = CVPixelBufferGetWidth(videoFrame); 
    size_t height = CVPixelBufferGetHeight(videoFrame); 
    CVPixelBufferUnlockBaseAddress(videoFrame,0);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, 
                                                        width, height, 8, 
                                                        bytesPerRow, 
                                                        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    CGImageRef frame = CGBitmapContextCreateImage(newContext);

    CGContextRelease(newContext); 
    CGColorSpaceRelease(colorSpace);

    [self savePNGImage:frame path:[NSString stringWithFormat:@"/Users/nacho4d/Desktop/framesCam/frame%003d.png", frameNum++]];
    [self saveJPEGImage:frame path:[NSString stringWithFormat:@"/Users/nacho4d/Desktop/framesCam/frame%003d.jpeg", frameNum++]];

    CGImageRelease(frame);
}

The capturer attributes are just the frame size and the pixel format, like so: [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB], (id)kCVPixelBufferPixelFormatTypeKey
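
For reference, here is a minimal sketch of how such attributes might be configured on a QTCaptureDecompressedVideoOutput; the output variable, the 640x480 frame size, and the delegate wiring are assumptions for illustration, not details from the question:

QTCaptureDecompressedVideoOutput *output = [[QTCaptureDecompressedVideoOutput alloc] init];
// Assumed attributes: 32-bit ARGB pixels at a hypothetical 640x480 frame size.
[output setPixelBufferAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB], (id)kCVPixelBufferPixelFormatTypeKey,
    [NSNumber numberWithInt:640], (id)kCVPixelBufferWidthKey,
    [NSNumber numberWithInt:480], (id)kCVPixelBufferHeightKey,
    nil]];
[output setDelegate:self];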

Update:

I have also tried JPEG and I get the same kind of image, as if some channels were missing, and all the written frames have a white background (since JPEG does not support transparency?).

The original and the written (JPEG) one:

[screenshot: the original frame as shown in the preview]

[screenshot: the written JPEG frame, with missing colors over a white background]

(I don't show the PNG since it is transparent and difficult to see in a browser.)

Thanks in advance.


I'm not sure what your problem is, but here's how I'm saving images to disk. Just pass in the CVImageBufferRef you get from your delegate method, along with your own URL, and it should work fine. At least it does for me. Good luck!

-(void)saveImage:(CVImageBufferRef)image toURL:(NSURL*)url{

    // Wrap the pixel buffer in a CIImage-backed image rep.
    NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:image]];

    NSImage *_image = [[NSImage alloc] initWithSize:[imageRep size]];
    [_image addRepresentation:imageRep];

    // Render to a bitmap representation so it can be re-encoded.
    NSData *bitmapData = [_image TIFFRepresentation];
    NSBitmapImageRep *bitmapRep = [NSBitmapImageRep imageRepWithData:bitmapData];
    NSData *imageData = [bitmapRep representationUsingType:NSJPEGFileType properties:nil];

    [_image release];
    _image = [[NSImage alloc] initWithData:imageData];

    // Re-encode as PNG and write it to the URL that was passed in.
    NSBitmapImageRep *imgRep = (NSBitmapImageRep *)[[_image representations] objectAtIndex:0];
    NSData *data = [imgRep representationUsingType:NSPNGFileType properties:nil];
    [data writeToURL:url atomically:NO];

    [_image release];
}
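
If it helps, a minimal usage sketch from the QTKit delegate method could look like the following; the frameNum counter and the /tmp output path are placeholders of mine, not part of the answer:

- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection {
    // Placeholder path and counter; the helper above writes PNG data to the given URL.
    NSURL *url = [NSURL fileURLWithPath:
        [NSString stringWithFormat:@"/tmp/framesCam/frame%03d.png", frameNum++]];
    [self saveImage:videoFrame toURL:url];
}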


You're unlocking the image buffer before you've actually processed it. Try moving this line:

CVPixelBufferUnlockBaseAddress(videoFrame,0);

to the end of the method, after the CGImage has been created from the bitmap context.
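
In other words, the buffer should stay locked until CGBitmapContextCreateImage has finished reading the pixel data. Here is a sketch of the reordered section of the question's delegate method, where only the position of the unlock call changes:

    CVPixelBufferLockBaseAddress(videoFrame, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(videoFrame);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(videoFrame);
    size_t width = CVPixelBufferGetWidth(videoFrame);
    size_t height = CVPixelBufferGetHeight(videoFrame);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                    bytesPerRow, colorSpace,
                                                    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef frame = CGBitmapContextCreateImage(newContext);

    // Unlock only after the pixel data has been consumed.
    CVPixelBufferUnlockBaseAddress(videoFrame, 0);

    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);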


I'm doing something somewhat similar and was seeing the same problem. I modified your code ever so slightly and changed kCGImageAlphaPremultipliedFirst to kCGImageAlphaNoneSkipFirst. Presumably this fixed the problem for me because the frame doesn't actually contain alpha, even though it is 32 bits per pixel.
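
Concretely, that change amounts to swapping the alpha flag in the question's CGBitmapContextCreate call, roughly like this:

    // Same call as in the question, but telling Core Graphics to skip the (unused) alpha byte.
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                    bytesPerRow, colorSpace,
                                                    kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);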

