I am trying to convert a Core Surface RGB frame buffer (iPhone) into an FFmpeg AVFrame so I can encode it into a movie file, but I am not getting the correct video output (the video shows dazzling, garbled colors instead of the actual picture).
I guess there is something wrong with how I convert the Core Surface frame buffer into the AVFrame.
Here is my code:
Surface *surface = [[Surface alloc] initWithCoreSurfaceBuffer:coreSurfaceBuffer];
[surface lock];

unsigned int height = surface.height;
unsigned int width = surface.width;
unsigned int alignmentedBytesPerRow = (width * 4);

if (!readblePixels) {
    readblePixels = CGBitmapAllocateData(alignmentedBytesPerRow * height);
    NSLog(@"alloced readablepixels");
}

unsigned int bytesPerRow = surface.bytesPerRow;
void *pixels = surface.baseAddress;

for (unsigned int j = 0; j < height; j++) {
    memcpy(readblePixels + alignmentedBytesPerRow * j, pixels + bytesPerRow * j, bytesPerRow);
}

pFrameRGB->data[0] = readblePixels; // I guess here is what I am doing wrong.
pFrameRGB->data[1] = NULL;
pFrameRGB->data[2] = NULL;
pFrameRGB->data[3] = NULL;

pFrameRGB->linesize[0] = pCodecCtx->width;
pFrameRGB->linesize[1] = 0;
pFrameRGB->linesize[2] = 0;
pFrameRGB->linesize[3] = 0;

sws_scale(img_convert_ctx, pFrameRGB->data, pFrameRGB->linesize,
          0, pCodecCtx->height,
          pFrameYUV->data, pFrameYUV->linesize);
Please help me out.
Thanks,
Raghu
This will solve the problem:
pFrameRGB->linesize[0] = pCodecCtx->width * 4; // linesize is the total number of bytes per row, and ARGB has 4 bytes per pixel
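For context, here is a minimal sketch of how the RGB frame and the scaler call can fit together once linesize is given in bytes. It reuses the variable names from the question (pFrameRGB, pFrameYUV, pCodecCtx, img_convert_ctx) and assumes the surface delivers tightly packed 32-bit BGRA pixels; the pixel format passed to sws_getContext is an assumption, so match it to what your surface actually contains.

// A minimal sketch, assuming packed BGRA input and the older FFmpeg API
// names used above (PIX_FMT_*). Not a drop-in replacement for your code.
struct SwsContext *img_convert_ctx =
    sws_getContext(pCodecCtx->width, pCodecCtx->height, PIX_FMT_BGRA,
                   pCodecCtx->width, pCodecCtx->height, PIX_FMT_YUV420P,
                   SWS_BICUBIC, NULL, NULL, NULL);

// Packed RGB data lives entirely in plane 0; linesize[0] is bytes per row.
pFrameRGB->data[0]     = readblePixels;
pFrameRGB->linesize[0] = pCodecCtx->width * 4;

// Convert the full frame into the YUV420P frame the encoder expects.
sws_scale(img_convert_ctx, pFrameRGB->data, pFrameRGB->linesize,
          0, pCodecCtx->height,
          pFrameYUV->data, pFrameYUV->linesize);

Note that sws_scale already handles arbitrary row strides, so if the surface rows are padded (bytesPerRow > width * 4) you may be able to pass surface.baseAddress directly with surface.bytesPerRow as linesize[0] and skip the copy loop entirely.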
Don't waste time on this, though: as St3fan suggested, you are not supposed to use the private Surface API, and the app will be rejected.