Access to the iOS' video decoder?

devze.com · 2023-01-13 22:04 · Source: the web

The iPad/iOS has video streaming support for e.g. H.264 using MPMoviePlayerController etc., but I receive H.264 data through a custom, proprietary stream and need to decode it in a soft real-time scenario.

Can the iPad/iOS video decoder be accessed in any way to decode this data?

Update: Apparently the iOS 4.0 Core Media framework supports decoding frames and knows of H.264, but there is no sample code, nor can I see what I am actually supposed to call for the actual decoding.


Update (ten years later!)

For anyone googling here, you do this in iOS these days with "VideoToolbox".
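Whichever decoder you end up using, the first step with a proprietary stream is splitting the raw bytes into NAL units. A minimal sketch of that step, assuming the custom stream carries Annex-B H.264 (start-code-delimited; function names are illustrative):

```c
#include <stddef.h>
#include <stdio.h>

/* Find the next Annex-B start code (00 00 01 or 00 00 00 01) at or
 * after offset `from`; returns the offset of the first byte of the
 * start code, or `len` if none is found. */
static size_t next_start_code(const unsigned char *buf, size_t len, size_t from) {
    for (size_t i = from; i + 3 <= len; i++) {
        if (buf[i] == 0 && buf[i + 1] == 0 &&
            (buf[i + 2] == 1 ||
             (i + 4 <= len && buf[i + 2] == 0 && buf[i + 3] == 1)))
            return i;
    }
    return len;
}

/* Walk the buffer and report each NAL unit's offset, length and type
 * (the low 5 bits of the first payload byte for H.264; 7 = SPS,
 * 8 = PPS, 5 = IDR slice). Returns the number of NAL units found. */
static int split_nal_units(const unsigned char *buf, size_t len) {
    int count = 0;
    size_t pos = next_start_code(buf, len, 0);
    while (pos < len) {
        size_t sc_len = (buf[pos + 2] == 1) ? 3 : 4;
        size_t payload = pos + sc_len;
        size_t end = next_start_code(buf, len, payload);
        if (payload < end) {
            printf("NAL at %zu, len %zu, type %u\n",
                   payload, end - payload, (unsigned)(buf[payload] & 0x1F));
            count++;
        }
        pos = end;
    }
    return count;
}
```

The SPS and PPS units this finds are what you feed to the decoder's setup call (e.g. `CMVideoFormatDescriptionCreateFromH264ParameterSets` on the VideoToolbox path).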



After raising the issue with Apple DTS it turns out that there currently is no way to decode video data from custom stream sources.

I will file an enhancement request for this.


If you continue to have problems with it, I suggest you take a look at libavcodec (part of the FFmpeg project) for decoding the data.

There are great ffmpeg tutorials at dranger that show how to properly decode (through libavcodec) and display video data (using libsdl), among other things.


2019

There are two solutions

  1. Do it "by hand" which means using AVFoundation and in particular VideoToolbox.

To get going with that you basically start with https://developer.apple.com/videos/play/wwdc2014/513/ Enjoy!

I have to say, that is really the "correct and better" solution.

  2. If you can get the ffmpeg API working inside your iOS app, you can use FFmpeg; it will do hardware decoding after some fiddling.

There are a number of ways to get started with that. (One absolutely amazing new thing is the Swift FFmpeg wrapper made by sunlubo: https://github.com/sunlubo/SwiftFFmpeg )

Be aware that the FFmpeg approach comes with, in short, a number of legal/licensing issues around FFmpeg on iOS. You can search and read about those problems.

However, on the technical side, these days it is indeed possible to compile FFmpeg right into an iOS app and use it raw in your iOS code (calling it as a C library may be easiest).

We just did an enormous project doing just this, as well as other approaches. (I never want to see FFmpeg again!)

You can in fact achieve actual hardware decoding, in iOS, using FFmpeg.

We found it to be incredibly fiddly. And a couple of bugs need to be patched in FFmpeg. (I hope I never see videotoolbox.c again :/ )

So once again your two options for hardware decoding in iOS are

  1. Do it "by hand" AVFoundation/VideoToolbox.

  2. Use FFmpeg.

Item 2 is incredibly fiddly and uses a lot of time. Item 1 uses a huge amount of time. Tough choice :/


With iOS 8, you can use VideoToolbox (https://developer.apple.com/reference/videotoolbox) to decode H.264 to raw frames. The VT APIs are hardware accelerated and will give you much better performance than libavcodec. If you want to play the frames or generate a preview, you can use an EAGL-based renderer. I have written a sample app that encodes frames from raw to H.264 (https://github.com/manishganvir/iOS-h264Hw-Toolbox); H.264 to raw shouldn't be that difficult!
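One gotcha on this path: `VTDecompressionSessionDecodeFrame` expects AVCC-style samples (4-byte big-endian length prefixes), not the Annex-B start codes a raw stream usually carries, so each NAL unit has to be repacked first. A minimal in-place sketch, assuming a buffer holding exactly one NAL unit with a 4-byte start code:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Convert a buffer holding one Annex-B NAL unit (with a 4-byte
 * 00 00 00 01 start code) into AVCC form by overwriting the start
 * code with the NAL unit's length as a 4-byte big-endian integer.
 * Returns 0 on success, -1 if the buffer is not in the expected form. */
static int annexb_to_avcc(uint8_t *buf, size_t len) {
    static const uint8_t start_code[4] = {0, 0, 0, 1};
    if (len < 5 || memcmp(buf, start_code, 4) != 0)
        return -1;
    uint32_t nal_len = (uint32_t)(len - 4);
    buf[0] = (uint8_t)(nal_len >> 24);
    buf[1] = (uint8_t)(nal_len >> 16);
    buf[2] = (uint8_t)(nal_len >> 8);
    buf[3] = (uint8_t)(nal_len);
    return 0;
}
```

The repacked buffer can then be wrapped in a `CMBlockBuffer`/`CMSampleBuffer` and handed to the decompression session.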


Have you tried writing the H.264 stream that you receive from your protocol to a temporary file which you continually append to, and then, once you have written enough bytes to avoid buffering stalls, passing the URL of your temp file to MPMoviePlayerController?
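The file-writing half of that suggestion is trivial to sketch; the path and chunking here are illustrative, and the MPMoviePlayerController handoff is platform code not shown:

```c
#include <stddef.h>
#include <stdio.h>

/* Append one chunk of received H.264 data to a growing temp file.
 * Opening in "ab" mode for every chunk keeps the writer stateless;
 * once enough bytes have accumulated, the file's URL can be handed
 * to the player. Returns the number of bytes actually written. */
static size_t append_chunk(const char *path, const void *data, size_t len) {
    FILE *f = fopen(path, "ab");
    if (!f)
        return 0;
    size_t written = fwrite(data, 1, len, f);
    fclose(f);
    return written;
}
```

Note the caveat with this approach: a bare H.264 elementary stream in a file has no container timestamps, so the player may not seek or pace it correctly.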
