Webcam stream with FFMpeg on iPhone


I'm trying to send and show a webcam stream from a Linux server to an iPhone app. I don't know if it's the best solution, but I downloaded and installed FFMpeg on the Linux server (following, for those who want to know, this tutorial). FFMpeg is working fine. After a lot of wandering, I managed to send a stream to the client by launching

ffmpeg  -s 320x240 -f video4linux2 -i /dev/video0 -f mpegts -vcodec libx264 udp://192.168.1.34:1234

where 192.168.1.34 is the address of the client. For now the client is a Mac, but eventually it will be an iPhone. I know the stream is sent and received correctly (tested in different ways).

However, I haven't managed to watch the stream directly on the iPhone.

I've thought of several possible solutions:

  • First solution: accumulate the incoming data in an NSMutableData object. Then, when the stream ends, write it to disk and play it using an MPMoviePlayerController. Here's the code:

    [video writeToFile:@"videoStream.m4v" atomically:YES];
    NSURL *url = [NSURL fileURLWithPath:@"videoStream.m4v"];
    
    MPMoviePlayerController *videoController = [[MPMoviePlayerController alloc] initWithContentURL:url];
    
    [videoController.view setFrame:CGRectMake(100, 100, 150, 150)];
    
    [self.view addSubview:videoController.view];
    
    [videoController play];
    

    The problem with this solution is that nothing is played (I only see a black square), even though the video is saved correctly (I can play it directly from my disk using VLC). Besides, it's not such a great idea; it's just to get things working. (On the device there is also the question of where the file can be written; see the sketch after this list.)

  • Second solution: use CMSampleBufferRef to store the incoming video. This solution comes with many more problems: first of all, there's no CoreMedia.framework on my system. Besides, I don't quite understand what this type represents or what I should do to make it work: I mean, if I start (somehow) filling this "sample buffer" with the bytes I receive from the UDP connection, will it automatically call the CMSampleBufferMakeDataReadyCallback function I set during creation? If yes, when? When a single frame is completed, or when the whole stream is received?

  • Third solution: use the AVFoundation framework (this isn't available on my Mac either). I couldn't work out whether it's actually possible to start recording from a remote source, or even from an NSMutableData, a char *, or something like that. In the AVFoundation Programming Guide I didn't find any reference saying whether it's possible or not.
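
For the first solution above, note that on the device an app can only write inside its own sandbox, so a bare file name like @"videoStream.m4v" won't be writable there (it happens to work when testing on the Mac). A minimal sketch that writes into the Documents directory instead; the file name is illustrative:

    #import <MediaPlayer/MediaPlayer.h>

    // Build a writable path inside the app sandbox instead of
    // using a bare file name.
    NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(
        NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [documentsDir
        stringByAppendingPathComponent:@"videoStream.m4v"];

    [video writeToFile:path atomically:YES];
    NSURL *url = [NSURL fileURLWithPath:path];

    MPMoviePlayerController *videoController =
        [[MPMoviePlayerController alloc] initWithContentURL:url];
    [videoController.view setFrame:CGRectMake(100, 100, 150, 150)];
    [self.view addSubview:videoController.view];
    [videoController play];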

I don't know which of these solutions is best for my purpose. Any suggestion would be appreciated.

Besides, there's also another problem: I didn't use any segmenter program to send the video. Now, if I'm not mistaken, a segmenter splits the source video into smaller/shorter videos that are easier to send. If that's right, then maybe it's not strictly necessary to make things work (it may be added later). However, since the server is running under Linux, I cannot use Apple's mediastreamsegmenter. Can someone suggest an open-source segmenter to use together with FFMpeg?
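
For illustration: an HTTP Live Streaming segmenter turns the encoded stream into a series of short .ts files plus an .m3u8 index that the client keeps re-fetching. Newer ffmpeg builds can segment for HLS themselves via the hls muxer, which may make a separate segmenter unnecessary; a sketch (segment length and output path are arbitrary):

    # Capture, encode to H.264, and segment for HLS in one process;
    # this writes numbered .ts segments next to the index playlist.
    ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 \
           -vcodec libx264 \
           -f hls -hls_time 10 -hls_list_size 5 \
           /var/www/stream/prog_index.m3u8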


UPDATE: I edited my question, adding more information on what I've done so far and what my doubts are.


MPMoviePlayerController can handle streaming video; try just handing it the URL directly.
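
A minimal sketch of that, assuming the stream is reachable over HTTP; the address below is illustrative:

    #import <MediaPlayer/MediaPlayer.h>

    // Hand the stream URL straight to the player instead of saving
    // the bytes to disk first. The address below is illustrative.
    NSURL *streamURL =
        [NSURL URLWithString:@"http://192.168.1.10/stream/prog_index.m3u8"];
    MPMoviePlayerController *player =
        [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
    [player.view setFrame:CGRectMake(100, 100, 150, 150)];
    [self.view addSubview:player.view];
    [player play];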

As for the video not playing even when it is saved, are you sure the video is in a supported format? Quoth the documentation:

This class plays any movie or audio file supported in iOS. This includes both streamed content and fixed-length files. For movie files, this typically means files with the extensions .mov, .mp4, .mpv, and .3gp and using one of the following compression standards:

  • H.264 Baseline Profile Level 3.0 video, up to 640 x 480 at 30 fps. (The Baseline profile does not support B frames.)
  • MPEG-4 Part 2 video (Simple Profile)

Try using -vcodec libx264 -vpre baseline on your ffmpeg command line to use the baseline profile.
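
Applied to the command from the question, that would be something like this (a sketch; on newer ffmpeg/x264 builds the preset files are gone and the equivalent option is -profile:v baseline):

    # Same capture pipeline, constrained to H.264 Baseline so the
    # iPhone's decoder accepts it (Baseline has no B-frames).
    ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 \
           -vcodec libx264 -vpre baseline \
           -f mpegts udp://192.168.1.34:1234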


Instead of sending the stream over UDP, try sending it with RTSP; MPMoviePlayerController will play it.
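
ffmpeg has an RTSP muxer for this, but note that it announces the stream to an existing RTSP server (for example Darwin Streaming Server) rather than acting as the server itself; a sketch, with an illustrative server address:

    # Publish the webcam stream to an RTSP server; ffmpeg ANNOUNCEs
    # the stream to the server, it does not serve RTSP by itself.
    ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 \
           -vcodec libx264 \
           -f rtsp rtsp://streaming-server.example:8554/live.sdp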


Carson McDonald has implemented an excellent solution for HTTP Live Streaming, which he uses to stream from Linux to iOS. He's a user here, and his site is Ion Cannon.

See this question for more details.


http://wiki.videolan.org/Documentation:Streaming_HowTo/Streaming_for_the_iPhone
