API 21 deprecated the original Camera API in favor of the newly added camera2 API. This was a big change: the new API uses a different architecture, which makes it harder for developers to pick up.
Let's start with a schematic of the camera2 package architecture:
<img src="http://wiki.jikexueyuan.com/project/android-actual-combat-skills/images/33-1.png" alt="fig.1" />
camera2 introduces the concept of a pipeline connecting the Android device and the camera: the system sends capture requests to the camera, and the camera returns CameraMetadata. All of this takes place inside a session called a CameraCaptureSession.
These are the main classes in the camera2 package:
<img src="http://wiki.jikexueyuan.com/project/android-actual-combat-skills/images/33-2.png" alt="fig.2" />
CameraManager is the top-level manager of all camera devices (CameraDevice), while each CameraDevice is responsible for creating its own CameraCaptureSession and CaptureRequest objects.
CameraCharacteristics is the class describing a CameraDevice's properties; if a comparison helps, it plays a role similar to the old Camera.CameraInfo.
The class diagram contains three important callbacks. They make the code harder to read, but you will have to get used to them, because this is the style of the new package. Of these, CameraCaptureSession.CaptureCallback does the work of handling preview frames and captured pictures, so it deserves special attention.
How do these classes work together? Here is a simple flow diagram.
<img src="http://wiki.jikexueyuan.com/project/android-actual-combat-skills/images/33-3.png" alt="fig.3" />
I use a SurfaceView as the display target (a TextureView works too; see the projects in the references).
The core code is as follows:
```java
mCameraManager = (CameraManager) this.getSystemService(Context.CAMERA_SERVICE);
mSurfaceView = (SurfaceView) findViewById(R.id.surfaceview);
mSurfaceHolder = mSurfaceView.getHolder();
mSurfaceHolder.addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        initCameraAndPreview();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    }
});
```
```java
private void initCameraAndPreview() {
    Log.d("linc", "init camera and preview");
    HandlerThread handlerThread = new HandlerThread("Camera2");
    handlerThread.start();
    mHandler = new Handler(handlerThread.getLooper());
    try {
        // NOTE: this only works by coincidence on some devices/emulators --
        // camera IDs should really come from mCameraManager.getCameraIdList()
        mCameraId = "" + CameraCharacteristics.LENS_FACING_FRONT;
        mImageReader = ImageReader.newInstance(mSurfaceView.getWidth(), mSurfaceView.getHeight(),
                ImageFormat.JPEG, /*maxImages*/ 7);
        mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mHandler);
        mCameraManager.openCamera(mCameraId, DeviceStateCallback, mHandler);
    } catch (CameraAccessException e) {
        Log.e("linc", "open camera failed." + e.getMessage());
    }
}
```
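Hard-coding the camera ID as above is fragile. A more robust lookup (a sketch, assuming the same `mCameraManager` field; `findFrontCameraId` is a name of my choosing) iterates `getCameraIdList()` and inspects each device's characteristics:

```java
// Sketch: pick the first front-facing camera by inspecting each device's
// CameraCharacteristics instead of hard-coding an ID.
private String findFrontCameraId() throws CameraAccessException {
    for (String id : mCameraManager.getCameraIdList()) {
        CameraCharacteristics chars = mCameraManager.getCameraCharacteristics(id);
        Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
        if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
            return id;
        }
    }
    return null; // no front-facing camera available
}
```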
```java
private CameraDevice.StateCallback DeviceStateCallback = new CameraDevice.StateCallback() {
    @Override
    public void onOpened(CameraDevice camera) {
        Log.d("linc", "DeviceStateCallback: camera was opened.");
        mCameraOpenCloseLock.release();
        mCameraDevice = camera;
        try {
            createCameraCaptureSession();
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    // onDisconnected and onError are abstract and must be implemented
    @Override
    public void onDisconnected(CameraDevice camera) {
        mCameraOpenCloseLock.release();
        camera.close();
        mCameraDevice = null;
    }

    @Override
    public void onError(CameraDevice camera, int error) {
        mCameraOpenCloseLock.release();
        camera.close();
        mCameraDevice = null;
    }
};
```
```java
private void createCameraCaptureSession() throws CameraAccessException {
    Log.d("linc", "createCameraCaptureSession");
    mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    mPreviewBuilder.addTarget(mSurfaceHolder.getSurface());
    mState = STATE_PREVIEW;
    mCameraDevice.createCaptureSession(
            Arrays.asList(mSurfaceHolder.getSurface(), mImageReader.getSurface()),
            mSessionPreviewStateCallback, mHandler);
}
```
```java
private CameraCaptureSession.StateCallback mSessionPreviewStateCallback =
        new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(CameraCaptureSession session) {
        Log.d("linc", "mSessionPreviewStateCallback onConfigured");
        mSession = session;
        try {
            mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                    CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            mPreviewBuilder.set(CaptureRequest.CONTROL_AE_MODE,
                    CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
            session.setRepeatingRequest(mPreviewBuilder.build(), mSessionCaptureCallback, mHandler);
        } catch (CameraAccessException e) {
            Log.e("linc", "set preview builder failed." + e.getMessage());
        }
    }

    // onConfigureFailed is abstract and must be implemented
    @Override
    public void onConfigureFailed(CameraCaptureSession session) {
        Log.e("linc", "configure capture session failed.");
    }
};
```
```java
private CameraCaptureSession.CaptureCallback mSessionCaptureCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
                                   TotalCaptureResult result) {
        mSession = session;
        checkState(result);
    }

    @Override
    public void onCaptureProgressed(CameraCaptureSession session, CaptureRequest request,
                                    CaptureResult partialResult) {
        Log.d("linc", "mSessionCaptureCallback, onCaptureProgressed");
        mSession = session;
        checkState(partialResult);
    }

    private void checkState(CaptureResult result) {
        switch (mState) {
            case STATE_PREVIEW:
                // nothing to do while previewing
                break;
            case STATE_WAITING_CAPTURE:
                // a partial result may not contain AF_STATE yet, so guard against null
                Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
                if (afState != null &&
                        (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
                         CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState ||
                         CaptureResult.CONTROL_AF_STATE_PASSIVE_FOCUSED == afState ||
                         CaptureResult.CONTROL_AF_STATE_PASSIVE_UNFOCUSED == afState)) {
                    // focus has settled: do something like save the picture
                }
                break;
        }
    }
};
```
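The code above registers `mOnImageAvailableListener` on the ImageReader without showing it. A minimal sketch of what it might look like (`saveJpeg` is a hypothetical helper, not part of the original code):

```java
// Sketch: drain the ImageReader and hand the JPEG bytes off for saving.
// Each acquired Image must be closed, or the reader runs out of buffers.
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener =
        new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage();
        if (image == null) {
            return;
        }
        try {
            // JPEG images carry all their data in a single plane
            ByteBuffer buffer = image.getPlanes()[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            saveJpeg(bytes); // hypothetical helper that writes the bytes to a file
        } finally {
            image.close();
        }
    }
};
```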
When the capture button is pressed:
```java
public void onCapture(View view) {
    try {
        Log.i("linc", "take picture");
        mState = STATE_WAITING_CAPTURE;
        mSession.setRepeatingRequest(mPreviewBuilder.build(), mSessionCaptureCallback, mHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
```
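Note that `onCapture` above only flips `mState` and keeps reusing the preview request. A dedicated still capture would typically build a `TEMPLATE_STILL_CAPTURE` request targeting the ImageReader, so the JPEG arrives in the image-available listener. A sketch under those assumptions (`captureStillPicture` is a name of my choosing):

```java
// Sketch: issue a one-shot still-capture request aimed at the ImageReader.
private void captureStillPicture() {
    try {
        CaptureRequest.Builder builder =
                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        builder.addTarget(mImageReader.getSurface());
        builder.set(CaptureRequest.CONTROL_AF_MODE,
                CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
        // capture() fires once, unlike setRepeatingRequest()
        mSession.capture(builder.build(), mSessionCaptureCallback, mHandler);
    } catch (CameraAccessException e) {
        Log.e("linc", "still capture failed." + e.getMessage());
    }
}
```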
For testing I used a Genymotion emulator, which uses the laptop's webcam directly.
The configuration looks like this:
<img src="http://wiki.jikexueyuan.com/project/android-actual-combat-skills/images/33-4.png" alt="fig.4" />
The demo UI looks like this:
<img src="http://wiki.jikexueyuan.com/project/android-actual-combat-skills/images/33-5.png" alt="fig.5" />
Source code:
Please refer to the two demo projects on GitHub: