
How to change pixel color on the fly in iPhone camera preview window?

I am using UIImagePickerController to take photos on the iPhone, and I'd like to adjust the photo on the fly. It appears that I could use UIImagePickerController to adjust the shape of the photo on the fly, but I am not able to find a way to change the color on the fly, for example, to convert all the colors to black and white.

Thanks.


The best way to do this is with an AVCaptureSession object. I'm doing exactly what you're talking about in my free app "Live Effects Cam".

There are several code examples online that will help you implement this too. Here is a sample chunk of code that might help:

- (void) activateCameraFeed
    {
    videoSettings = nil;

#if USE_32BGRA
    pixelFormatCode = [[NSNumber alloc] initWithUnsignedInt:(unsigned int)kCVPixelFormatType_32BGRA];
    pixelFormatKey = [[NSString alloc] initWithString:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    videoSettings = [[NSDictionary alloc] initWithObjectsAndKeys:pixelFormatCode, pixelFormatKey, nil]; 
#endif

    videoDataOutputQueue = dispatch_queue_create("com.jellyfilledstudios.ImageCaptureQueue", NULL);

    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureVideoOutput setAlwaysDiscardsLateVideoFrames:YES]; 
    [captureVideoOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    [captureVideoOutput setVideoSettings:videoSettings];
    [captureVideoOutput setMinFrameDuration:kCMTimeZero];

    dispatch_release(videoDataOutputQueue); // AVCaptureVideoDataOutput uses dispatch_retain() & dispatch_release() so we can dispatch_release() our reference now

    if ( useFrontCamera )
        {
        currentCameraDeviceIndex = frontCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationLeftMirrored;
        }
    else
        {
        currentCameraDeviceIndex = backCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationRight;
        }

    selectedCamera = [[AVCaptureDevice devices] objectAtIndex:(NSUInteger)currentCameraDeviceIndex];

    captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:selectedCamera error:nil];

    captureSession = [[AVCaptureSession alloc] init];

    [captureSession beginConfiguration];

    [self setCaptureConfiguration];

    [captureSession addInput:captureVideoInput];
    [captureSession addOutput:captureVideoOutput];
    [captureSession commitConfiguration];
    [captureSession startRunning];
    }


// AVCaptureVideoDataOutputSampleBufferDelegate
// AVCaptureAudioDataOutputSampleBufferDelegate
//
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
    {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    if ( captureOutput==captureVideoOutput )
        {
        [self performImageCaptureFrom:sampleBuffer fromConnection:connection];
        }

    [pool drain];
    } 



- (void) performImageCaptureFrom:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    {
    CVImageBufferRef imageBuffer;

    if ( CMSampleBufferGetNumSamples(sampleBuffer) != 1 )
        return;
    if ( !CMSampleBufferIsValid(sampleBuffer) )
        return;
    if ( !CMSampleBufferDataIsReady(sampleBuffer) )
        return;

    imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

    if ( CVPixelBufferGetPixelFormatType(imageBuffer) != kCVPixelFormatType_32BGRA )
        return;

    CVPixelBufferLockBaseAddress(imageBuffer,0); 

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t bufferSize = bytesPerRow * height;

    uint8_t *tempAddress = malloc( bufferSize );
    memcpy( tempAddress, baseAddress, bufferSize );

    baseAddress = tempAddress;

    //
    // Apply effects to the pixels stored in (uint32_t *)baseAddress
    //
    //
    // example: grayScale( (uint32_t *)baseAddress, width, height );
    // example: sepia( (uint32_t *)baseAddress, width, height );
    //

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef newContext = nil;

    if ( cameraDeviceSetting != CameraDeviceSetting640x480 )        // not an iPhone4 or iTouch 5th gen
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,  kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);
    else
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    CGImageRef newImage = CGBitmapContextCreateImage( newContext );
    CGColorSpaceRelease( colorSpace );
    CGContextRelease( newContext );

    free( tempAddress );

    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    if ( newImage == nil )
        {
        return;
        }

    // To be able to display the CGImageRef newImage in your UI you will need to do it like this
    // because you are running on a different thread here…
    //
    [self performSelectorOnMainThread:@selector(newCameraImageNotification:) withObject:(id)newImage waitUntilDone:YES];
    }
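
The grayScale() and newCameraImageNotification: calls above are only placeholders, so here is a rough sketch of what they might look like. grayScale() assumes the kCVPixelFormatType_32BGRA format configured earlier and a buffer with no row padding (if bytesPerRow is larger than width * 4, walk the buffer row by row instead):

void grayScale( uint32_t *pixels, size_t width, size_t height )
    {
    size_t count = width * height;   // assumes bytesPerRow == width * 4

    for ( size_t i = 0; i < count; i++ )
        {
        uint32_t pixel = pixels[i];

        // kCVPixelFormatType_32BGRA read as a little-endian uint32_t
        uint32_t blue  =   pixel         & 0xFF;
        uint32_t green = ( pixel >> 8  ) & 0xFF;
        uint32_t red   = ( pixel >> 16 ) & 0xFF;
        uint32_t alpha = ( pixel >> 24 ) & 0xFF;

        // Rec. 601 luminance weights
        uint32_t gray = (uint32_t)( 0.299 * red + 0.587 * green + 0.114 * blue );

        pixels[i] = ( alpha << 24 ) | ( gray << 16 ) | ( gray << 8 ) | gray;
        }
    }

And on the main thread, something along these lines can display the frame and balance the CGImageRef created above (cameraPreviewImageView is a hypothetical UIImageView, not part of the code above):

- (void) newCameraImageNotification:(id)image
    {
    CGImageRef newImage = (CGImageRef)image;

    cameraPreviewImageView.image = [UIImage imageWithCGImage:newImage
                                                       scale:1.0
                                                 orientation:cameraImageOrientation];

    // release the +1 reference from CGBitmapContextCreateImage()
    CGImageRelease( newImage );
    }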


You can overlay a view on the image and change the blending mode to get a black-and-white effect.

Check out Apple's QuartzDemo sample code, specifically the Blending Modes example in that demo.
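
For example, following the blending idea from that demo, a minimal sketch (the function name is just for illustration) draws the photo and then a solid gray fill with kCGBlendModeColor, which keeps the photo's luminosity but replaces its hue and saturation:

UIImage * desaturatedImage( UIImage *image )
    {
    CGRect bounds = CGRectMake( 0, 0, image.size.width, image.size.height );

    UIGraphicsBeginImageContextWithOptions( image.size, YES, image.scale );

    // draw the original photo first
    [image drawInRect:bounds];

    // fill with gray in the Color blend mode: the result keeps the photo's
    // luminosity but takes the (zero) saturation of the gray fill
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode( context, kCGBlendModeColor );
    CGContextSetFillColorWithColor( context, [UIColor grayColor].CGColor );
    CGContextFillRect( context, bounds );

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return result;
    }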


Another way to do this would be to convert each frame using AVFoundation. I don't have a ton of experience with this, but the "Session 409 - Using the Camera with AVFoundation" video from WWDC 2010 and its sample projects should go a long way toward helping you with your problem.

That is, of course, if you're okay using iOS 4 classes.
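
If you go that route, one simple way to turn a frame black and white (a rough sketch, not code from the WWDC sample; it assumes you have already created a CGImageRef for the frame, as in the answer above) is to redraw it into a device-gray bitmap context:

CGImageRef createGrayscaleCopy( CGImageRef sourceImage )    // caller must CGImageRelease() the result
    {
    size_t width  = CGImageGetWidth( sourceImage );
    size_t height = CGImageGetHeight( sourceImage );

    // let Quartz allocate the buffer and do the color conversion
    CGColorSpaceRef graySpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate( NULL, width, height, 8, 0, graySpace, kCGImageAlphaNone );
    CGColorSpaceRelease( graySpace );

    if ( context == NULL )
        return NULL;

    CGContextDrawImage( context, CGRectMake( 0, 0, width, height ), sourceImage );

    CGImageRef grayImage = CGBitmapContextCreateImage( context );
    CGContextRelease( context );

    return grayImage;
    }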
