I am trying to build a small photo app with a burst mode for the camera. The idea is to take a picture every 0.3 seconds and store the pictures in an array until the last picture is taken. The number of pictures to take should be configurable.
At the moment the shortest interval I can achieve is about 2 seconds per picture, and when the pictures are then written to storage, the application crashes.
Does anyone know how to implement this? Here is my code:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.Window;
import android.view.WindowManager;
import android.view.View.OnClickListener;
public class CamaeaView extends Activity implements SurfaceHolder.Callback, OnClickListener {

    static final int FOTO_MODE = 1;
    private static final String TAG = "Test";
    Camera mCamera;
    boolean mPreviewRunning = false;
    private Context mContext = this;
    ArrayList<Object> listImage = null;
    public static ArrayList<Object> List1[];

    public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        Log.e(TAG, "onCreate");

        @SuppressWarnings("unused")
        Bundle extras = getIntent().getExtras();

        getWindow().setFormat(PixelFormat.TRANSLUCENT);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);

        setContentView(R.layout.main);
        mSurfaceView = (SurfaceView) findViewById(R.id.surface_camera);
        mSurfaceView.setOnClickListener(this);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    protected void onRestoreInstanceState(Bundle savedInstanceState) {
        super.onRestoreInstanceState(savedInstanceState);
    }

    Camera.PictureCallback mPictureCallback = new Camera.PictureCallback() {
        public void onPictureTaken(byte[] imageData, Camera c) {
            Log.d("Start: ", "Imagelist");
            Intent mIntent = new Intent();
            listImage.add(StoreByteImage(imageData));
            SaveImage(listImage);
            mCamera.startPreview();
            setResult(FOTO_MODE, mIntent);
            finish();
        }
    };

    protected void onResume() {
        Log.d(TAG, "onResume");
        super.onResume();
    }

    protected void onSaveInstanceState(Bundle outState) {
        super.onSaveInstanceState(outState);
    }

    protected void onStop() {
        Log.e(TAG, "onStop");
        super.onStop();
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "surfaceCreated");
        mCamera = Camera.open();
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        Log.d(TAG, "surfaceChanged");

        // XXX stopPreview() will crash if preview is not running
        if (mPreviewRunning) {
            mCamera.stopPreview();
        }

        Camera.Parameters p = mCamera.getParameters();
        p.setPreviewSize(w, h);
        p.setPictureSize(1024, 768);
        p.getSupportedPictureFormats();
        p.setPictureFormat(PixelFormat.JPEG);
        mCamera.setParameters(p);
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        mCamera.startPreview();
        mPreviewRunning = true;
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.e(TAG, "surfaceDestroyed");
        mCamera.stopPreview();
        mPreviewRunning = false;
        mCamera.release();
    }

    private SurfaceView mSurfaceView;
    private SurfaceHolder mSurfaceHolder;

    public void onClick(View arg0) {
        int i = 0;
        while (i < 4) {
            mCamera.takePicture(null, mPictureCallback, mPictureCallback);
            i = i + 1;
            try {
                Thread.sleep(3000);
            } catch (InterruptedException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
    }

    public static Object StoreByteImage(byte[] imageData) {
        Object obj = null;
        try {
            ByteArrayInputStream bis = new ByteArrayInputStream(imageData);
            ObjectInputStream objImageData = new ObjectInputStream(bis);
            obj = objImageData.readObject();
        } catch (IOException ex) {
            // TODO: Handle the exception
        } catch (ClassNotFoundException ex) {
            // TODO: Handle the exception
        }
        return obj;
    }

    public static Object createImagesArray(byte[] imageData) {
        Object obj = null;
        try {
            ByteArrayInputStream bis = new ByteArrayInputStream(imageData);
            ObjectInputStream objImageData = new ObjectInputStream(bis);
            obj = objImageData.readObject();
        } catch (IOException ex) {
            // TODO: Handle the exception
        } catch (ClassNotFoundException ex) {
            // TODO: Handle the exception
        }
        return obj;
    }

    public static boolean SaveImage(List<Object> listImage) {
        FileOutputStream outStream = null;
        Log.d("ListSize: ", Integer.toString(listImage.size()));
        Iterator<Object> itList = listImage.iterator();
        byte[] imageBytes = null;
        while (itList.hasNext()) {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try {
                ObjectOutputStream oos = new ObjectOutputStream(bos);
                oos.writeObject(listImage);
                oos.flush();
                oos.close();
                bos.close();
                imageBytes = bos.toByteArray();
            } catch (IOException ex) {
                // TODO: Handle the exception
            }
            try {
                // Write to SD Card
                outStream = new FileOutputStream(String.format(
                        "/sdcard/DCIM/%d.jpg", System.currentTimeMillis()));
                outStream.write(imageBytes);
                outStream.close();
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
            }
        }
        Log.d(TAG, "onPictureTaken - jpeg");
        return true;
    }
}
Error from the comments:
ERROR/MemoryHeapBase(1382): error opening /dev/pmem_camera: No such file or directory
ERROR/QualcommCameraHardware(1382): failed to construct master heap for pmem pool /dev/pmem_camera
ERROR/QualcommCameraHardware(1382): initRaw X failed with pmem_camera, trying with pmem_adsp
AFAIK, you cannot take another picture until the first one is complete. Eliminate your loop and the Thread.sleep() call, and take the next picture from inside mPictureCallback instead.
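For instance, a minimal sketch of that approach (the shotsRemaining counter and saveJpeg() helper are hypothetical names, not part of your code) could look roughly like this:

    // Sketch only: chain captures from the JPEG callback instead of sleeping in a loop.
    private int shotsRemaining = 4;   // hypothetical counter for the burst length

    Camera.PictureCallback mPictureCallback = new Camera.PictureCallback() {
        public void onPictureTaken(byte[] imageData, Camera camera) {
            saveJpeg(imageData);       // hypothetical helper that persists the JPEG bytes
            shotsRemaining--;
            camera.startPreview();     // preview must be running again before the next capture
            if (shotsRemaining > 0) {
                camera.takePicture(null, null, mPictureCallback);
            }
        }
    };

This way each capture starts only after the previous one has been delivered, which is as fast as the device can actually go.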
The idea is to take a picture every 0.3 seconds and store the pictures in an array until the last picture is taken.
The "0.3 seconds" objective is faster than most devices can process an image, and you may not have enough heap space for your array of images. I suspect that you will have to write each image out to disk as it comes in, via an AsyncTask, so you can free up the heap space while also not tying up the main application thread.
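A rough sketch of that idea (the class name and output path are placeholders, not from the original post):

    // Sketch only: write each captured JPEG to disk off the main thread.
    // Uses android.os.AsyncTask, android.os.Environment and java.io.File in addition
    // to the imports already shown above.
    private static class SaveJpegTask extends AsyncTask<byte[], Void, Void> {
        @Override
        protected Void doInBackground(byte[]... jpegs) {
            File file = new File(Environment.getExternalStorageDirectory(),
                    "DCIM/" + System.currentTimeMillis() + ".jpg");
            try {
                FileOutputStream out = new FileOutputStream(file);
                out.write(jpegs[0]);   // the JPEG bytes delivered by onPictureTaken()
                out.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
            return null;
        }
    }

    // In onPictureTaken(): new SaveJpegTask().execute(imageData);

Also note that the JPEG byte[] from the camera is not a serialized Java object, so passing it through ObjectInputStream/ObjectOutputStream as in StoreByteImage() and SaveImage() will not work; write the bytes out directly.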
You can try capturing subsequent preview frames into preallocated byte arrays. See the following methods:
http://developer.android.com/reference/android/hardware/Camera.html#addCallbackBuffer(byte[])
http://developer.android.com/reference/android/hardware/Camera.html#setPreviewCallbackWithBuffer(android.hardware.Camera.PreviewCallback)
...or http://developer.android.com/reference/android/hardware/Camera.html#setOneShotPreviewCallback(android.hardware.Camera.PreviewCallback) for better control over timing.
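A minimal sketch of the buffered preview-callback approach (the buffer count and the frame handling are assumptions, not from the original post):

    // Sketch only: hand the camera preallocated buffers and receive preview frames in them.
    // Uses android.graphics.ImageFormat in addition to the imports shown above.
    Camera.Parameters params = mCamera.getParameters();
    Camera.Size previewSize = params.getPreviewSize();
    int bufferSize = previewSize.width * previewSize.height
            * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;

    for (int i = 0; i < 3; i++) {              // a few buffers for the camera to cycle through
        mCamera.addCallbackBuffer(new byte[bufferSize]);
    }

    mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
        public void onPreviewFrame(byte[] data, Camera camera) {
            // data holds a raw preview frame (typically NV21); copy or process it here,
            // then return the buffer so the camera can reuse it for a later frame.
            camera.addCallbackBuffer(data);
        }
    });

Preview frames are lower quality than full captures, but they can be delivered far more often than one frame every 2 seconds, which is closer to your burst-mode goal.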