I've got an Android project I'm working on that, ultimately, will require me to create a movie file out of a series of still images taken with a phone's camera. That is to say, I want to take raw image frames and string them together, one by one, into a movie. Audio is not a concern at this stage.
Looking over the Android API, there are calls in it to create movie files, but they seem entirely geared toward making a live recording from the camera on an immediate basis. While nice, I can't use that for my purposes, as I need to put annotations and other post-production things on the images as they come in before they get fed into a movie (plus, the images come in far too slowly for a live recording). Worse, from the Android source it appears to be a non-trivial task to rewire that machinery to do what I want (at least without touching the NDK).
Is there any way I can use the API to do something like this? Or alternatively, what would be the best way to go about this, if it's even feasible on cell phone hardware (which seems to keep getting more and more powerful, strangely...)?
Is there any way I can use the API to do something like this?
No.
Or alternatively, what would be the best way to go about this, if it's even feasible on cell phone hardware (which seems to keep getting more and more powerful, strangely...)?
You might find a Java library that lets you assemble movies out of stills and annotations, but I would be rather surprised if one met your needs, ran on Android, and ran acceptably on mobile phone hardware.
IMHO, the best route is to use a Web service. Use the device for data collection, use the server to do all the heavy lifting of assembling the movie out of the parts.
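A minimal sketch of the data-collection half of that split, assuming a hypothetical endpoint (https://example.com/frames) that accepts raw JPEG bytes. It uses only stock HttpURLConnection, so nothing here is Android-specific:

    import java.io.FileInputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class FrameUploader {
        // Hypothetical endpoint; substitute your own service.
        private static final String ENDPOINT = "https://example.com/frames";

        /** POSTs one still image to the server, which queues it for assembly. */
        public static void uploadFrame(String jpegPath) throws Exception {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(ENDPOINT).openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "image/jpeg");

            FileInputStream in = new FileInputStream(jpegPath);
            OutputStream out = conn.getOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            out.close();
            in.close();

            if (conn.getResponseCode() != 200) {
                throw new RuntimeException(
                        "Upload failed: " + conn.getResponseCode());
            }
            conn.disconnect();
        }
    }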
If you have to do it on-device, the NDK seems like the only practical route.
Do you just want to create movie files or do you want to display them on the phone?
If you just want to display the post-processed annotated images as a movie, then it's possible. What is the format of your images? Currently, I'm able to display MJPEG video on a Nexus One (running 2.1) without any noticeable lag and without using the NDK. In my case the images are coming from the network.
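If "movie" can mean "a timed slideshow of decoded frames" rather than a real container file, a rough sketch like the following is enough. It assumes the annotated JPEGs are already on disk and simply decodes and displays them at a fixed rate; BitmapFactory and Handler are stock Android APIs, everything else is made up for the example:

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import android.os.Handler;
    import android.widget.ImageView;

    public class FramePlayer {
        private static final long FRAME_DELAY_MS = 100; // ~10 fps; tune as needed

        private final Handler handler = new Handler();
        private final ImageView view;
        private final String[] framePaths; // paths to the annotated JPEGs
        private int index = 0;

        public FramePlayer(ImageView view, String[] framePaths) {
            this.view = view;
            this.framePaths = framePaths;
        }

        public void play() {
            handler.post(showNextFrame);
        }

        private final Runnable showNextFrame = new Runnable() {
            public void run() {
                if (index >= framePaths.length) return; // finished
                Bitmap frame = BitmapFactory.decodeFile(framePaths[index++]);
                view.setImageBitmap(frame);
                handler.postDelayed(this, FRAME_DELAY_MS);
            }
        };
    }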
On the other hand, if you want to create actual movie files and store them on the phone or somewhere else, then CommonsWare's idea of delegating this to a server makes more sense, since you will have more processing power and storage on the server. This does require network access, and you have to be willing to send all the images from the phone to the server and then download the finished movie file back to the phone.
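On the server side the assembly itself is a solved problem. For example, if ffmpeg is installed, a servlet or script can shell out to it once all the frames have arrived. A hedged sketch, where the frame naming pattern and output path are assumptions:

    import java.io.File;

    public class MovieAssembler {
        /**
         * Assembles frame0001.jpg, frame0002.jpg, ... in framesDir into a
         * movie file by shelling out to ffmpeg. Assumes ffmpeg is on the PATH.
         */
        public static void assemble(File framesDir, File output, int fps)
                throws Exception {
            ProcessBuilder pb = new ProcessBuilder(
                    "ffmpeg",
                    "-framerate", String.valueOf(fps),
                    "-i", new File(framesDir, "frame%04d.jpg").getPath(),
                    "-pix_fmt", "yuv420p", // broad player compatibility
                    output.getPath());
            pb.inheritIO();
            int exit = pb.start().waitFor();
            if (exit != 0) {
                throw new RuntimeException("ffmpeg exited with " + exit);
            }
        }
    }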