Managing Audio Playback
Call setVolumeControlStream() so that, while the Activity/Fragment is on screen, the system automatically routes the device's volume keys to the specified stream type. Separately, when a media button is pressed the system sends an android.intent.action.MEDIA_BUTTON broadcast; the following BroadcastReceiver handles it (it must be declared in AndroidManifest.xml):
public class RemoteControlReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_MEDIA_BUTTON.equals(intent.getAction())) {
            KeyEvent event = (KeyEvent) intent.getParcelableExtra(Intent.EXTRA_KEY_EVENT);
            if (KeyEvent.KEYCODE_MEDIA_PLAY == event.getKeyCode()) {
                // Handle key press.
            }
        }
    }
}
Managing Audio Focus
Requesting and abandoning audio focus
AudioManager am = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE);
...
// Request audio focus for playback
int result = am.requestAudioFocus(afChangeListener,
        // Use the music stream.
        AudioManager.STREAM_MUSIC,
        // Request permanent focus.
        AudioManager.AUDIOFOCUS_GAIN);
if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    am.registerMediaButtonEventReceiver(
            new ComponentName(mContext, RemoteControlReceiver.class));
    // Start playback.
}
...
// Abandon audio focus when playback complete
am.abandonAudioFocus(afChangeListener);
The last parameter of requestAudioFocus can instead be AudioManager.AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK, which requests transient audio focus and tells other apps that, on losing focus, they may keep playing at reduced volume ("ducking"). A listener for permanent focus:
OnAudioFocusChangeListener afChangeListener = new OnAudioFocusChangeListener() {
    public void onAudioFocusChange(int focusChange) {
        if (focusChange == AudioManager.AUDIOFOCUS_LOSS_TRANSIENT) {
            // Pause playback
        } else if (focusChange == AudioManager.AUDIOFOCUS_GAIN) {
            // Resume playback
        } else if (focusChange == AudioManager.AUDIOFOCUS_LOSS) {
            am.unregisterMediaButtonEventReceiver(
                    new ComponentName(mContext, RemoteControlReceiver.class));
            am.abandonAudioFocus(afChangeListener);
            // Stop playback
        }
    }
};
A listener for the ducking case:
OnAudioFocusChangeListener afChangeListener = new OnAudioFocusChangeListener() {
    public void onAudioFocusChange(int focusChange) {
        if (focusChange == AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK) {
            // Lower the volume
        } else if (focusChange == AudioManager.AUDIOFOCUS_GAIN) {
            // Raise it back to normal
        }
    }
};
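Both listeners implement the same pattern: map the focusChange constant to a playback action. That mapping can be isolated as plain Java for testing. A minimal sketch — the class, enum, and constant values here are illustrative mirrors of AudioManager's documented constants; in real code use the AudioManager.* constants directly:

```java
class FocusPolicy {
    // Mirrors of AudioManager constants (assumed values; verify against your SDK).
    public static final int AUDIOFOCUS_GAIN = 1;
    public static final int AUDIOFOCUS_LOSS = -1;
    public static final int AUDIOFOCUS_LOSS_TRANSIENT = -2;
    public static final int AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK = -3;

    public enum Action { PLAY, PAUSE, DUCK, STOP, IGNORE }

    // Map a focusChange value to the playback action taken in the listeners above.
    public static Action actionFor(int focusChange) {
        switch (focusChange) {
            case AUDIOFOCUS_GAIN:                    return Action.PLAY;
            case AUDIOFOCUS_LOSS_TRANSIENT:          return Action.PAUSE;
            case AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK: return Action.DUCK;
            case AUDIOFOCUS_LOSS:                    return Action.STOP;
            default:                                 return Action.IGNORE;
        }
    }
}
```

Keeping the mapping in one place means a single listener can serve both the permanent-focus and ducking cases.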
Checking the current audio output route:
if (audioManager.isBluetoothA2dpOn()) {
    // A2DP audio routing to the Bluetooth headset
} else if (audioManager.isSpeakerphoneOn()) {
    // Adjust output for speakerphone
} else if (audioManager.isWiredHeadsetOn()) {
    // Adjust output for headsets
} else if (audioManager.isBluetoothScoOn()) {
    // SCO is used for communications
} else {
    // If audio plays and no one can hear it, is it still playing?
}
When a wired or Bluetooth headset disconnects, the system keeps playing through the default device (the speaker) and sends an AudioManager.ACTION_AUDIO_BECOMING_NOISY broadcast, which can be handled as follows:
private BroadcastReceiver myNoisyAudioStreamReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (AudioManager.ACTION_AUDIO_BECOMING_NOISY.equals(intent.getAction())) {
            // Pause the playback
        }
    }
};

private void startPlayback() {
    registerReceiver(myNoisyAudioStreamReceiver,
            new IntentFilter(AudioManager.ACTION_AUDIO_BECOMING_NOISY));
}

private void stopPlayback() {
    unregisterReceiver(myNoisyAudioStreamReceiver);
}
Capturing Photos
Taking a photo with an existing camera app
<manifest ... >
    <uses-feature android:name="android.hardware.camera" android:required="true" />
    ...
</manifest>
If the camera is optional, declare android:required="false" instead and check availability at runtime with packageManager.hasSystemFeature(PackageManager.FEATURE_CAMERA).
Send an Intent to launch an existing camera app. Before sending it, check that some app can respond to the Intent; calling startActivity without a handler throws an exception.
static final int REQUEST_IMAGE_CAPTURE = 1;

private void dispatchTakePictureIntent() {
    Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
        startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE);
    }
}
Before sending takePictureIntent, you can specify where the photo should be saved with takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT, imageFileUri), and access that URI directly after a successful capture. When EXTRA_OUTPUT is not set, the result Intent instead carries a small thumbnail in the "data" extra:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
        Bundle extras = data.getExtras();
        Bitmap imageBitmap = (Bitmap) extras.get("data");
        mImageView.setImageBitmap(imageBitmap);
    }
}
Saving the full-size photo
To save photos in the public area of external storage, the app must declare a permission; WRITE_EXTERNAL_STORAGE implicitly includes READ_EXTERNAL_STORAGE. The directory path is obtained from Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES).
<manifest ...>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    ...
</manifest>
To save in the app's private external directory instead, obtain the path from context.getExternalFilesDir(Environment.DIRECTORY_PICTURES). Starting with API 19 an app needs no permission to access its own directory there, so the declaration can be limited with maxSdkVersion="18"; accessing another app's corresponding directory still requires READ_EXTERNAL_STORAGE/WRITE_EXTERNAL_STORAGE. Note that external storage is not always mounted, and the directory offers no security against other apps.
<manifest ...>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"
        android:maxSdkVersion="18" />
    ...
</manifest>
Creating the file in which to save the photo
String mCurrentPhotoPath;

private File createImageFile() throws IOException {
    // Create an image file name
    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
    String imageFileName = "JPEG_" + timeStamp + "_";
    File storageDir = Environment.getExternalStoragePublicDirectory(
            Environment.DIRECTORY_PICTURES);
    File image = File.createTempFile(
            imageFileName,  /* prefix */
            ".jpg",         /* suffix */
            storageDir      /* directory */
    );

    // Save a file: path for use with ACTION_VIEW intents
    mCurrentPhotoPath = "file:" + image.getAbsolutePath();
    return image;
}
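The timestamped prefix built inside createImageFile() can be isolated into a plain-Java helper, which makes the naming scheme testable without the Android framework. A sketch — the class and method names are illustrative, and the explicit Locale is an addition (it keeps the digits ASCII regardless of the device locale):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

class ImageNames {
    // Build the "JPEG_yyyyMMdd_HHmmss_" prefix used by createImageFile().
    public static String imageFilePrefix(Date when) {
        String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.US).format(when);
        return "JPEG_" + timeStamp + "_";
    }
}
```

File.createTempFile appends a unique suffix to this prefix, so two photos taken in the same second still get distinct file names.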
static final int REQUEST_TAKE_PHOTO = 1;

private void dispatchTakePictureIntent() {
    Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    // Ensure that there's a camera activity to handle the intent
    if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
        // Create the File where the photo should go
        File photoFile = null;
        try {
            photoFile = createImageFile();
        } catch (IOException ex) {
            // Error occurred while creating the File
            ...
        }
        // Continue only if the File was successfully created
        if (photoFile != null) {
            takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT,
                    Uri.fromFile(photoFile));
            startActivityForResult(takePictureIntent, REQUEST_TAKE_PHOTO);
        }
    }
}
Adding the photo to the Gallery: photos saved under context.getExternalFilesDir(type) are private to the app and cannot be reached by the media scanner, so they will not appear in the Gallery. For a photo in public storage, trigger a scan:
private void galleryAddPic() {
    Intent mediaScanIntent = new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE);
    File f = new File(mCurrentPhotoPath);
    Uri contentUri = Uri.fromFile(f);
    mediaScanIntent.setData(contentUri);
    this.sendBroadcast(mediaScanIntent);
}
Decoding a scaled-down bitmap for display in an ImageView
private void setPic() {
    // Get the dimensions of the View
    int targetW = mImageView.getWidth();
    int targetH = mImageView.getHeight();

    // Get the dimensions of the bitmap
    BitmapFactory.Options bmOptions = new BitmapFactory.Options();
    bmOptions.inJustDecodeBounds = true;
    BitmapFactory.decodeFile(mCurrentPhotoPath, bmOptions);
    int photoW = bmOptions.outWidth;
    int photoH = bmOptions.outHeight;

    // Determine how much to scale down the image
    int scaleFactor = Math.min(photoW / targetW, photoH / targetH);

    // Decode the image file into a Bitmap sized to fill the View
    bmOptions.inJustDecodeBounds = false;
    bmOptions.inSampleSize = scaleFactor;
    bmOptions.inPurgeable = true; // deprecated and ignored from API 21 on

    Bitmap bitmap = BitmapFactory.decodeFile(mCurrentPhotoPath, bmOptions);
    mImageView.setImageBitmap(bitmap);
}
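The scaleFactor above uses straight integer division, but BitmapFactory documents that inSampleSize values are treated as powers of two (other values round down), so the decode can end up smaller than intended. The power-of-two calculation can be sketched in plain Java (the class and method names are illustrative):

```java
class SampleSize {
    // Largest power-of-two inSampleSize that keeps both decoded dimensions
    // at least as large as the requested target dimensions.
    public static int calculateInSampleSize(int photoW, int photoH,
                                            int targetW, int targetH) {
        int inSampleSize = 1;
        while ((photoW / (inSampleSize * 2)) >= targetW
                && (photoH / (inSampleSize * 2)) >= targetH) {
            inSampleSize *= 2;
        }
        return inSampleSize;
    }
}
```

For example, a 1024x768 photo shown in a 100x100 view yields inSampleSize 4, decoding a 256x192 bitmap rather than one smaller than the view.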
Recording video works the same way: send
Intent takeVideoIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
In onActivityResult, the returned Intent's data is the Uri of the recorded video, available via intent.getData().
Using the Camera Directly
Camera API
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-feature android:name="android.hardware.camera" android:required="true" />
Inflate the SurfaceView in onResume and add it to the layout dynamically; in onPause, stop the preview and remove the SurfaceView. This works around the preview failing to restart after an onPause/onResume cycle. Open the camera and start the preview in the SurfaceView's surfaceCreated callback:
@Override
public void surfaceCreated(SurfaceHolder holder) {
    mSurfaceHolder = holder;
    initPreview(holder);
}

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    releaseResources();
}

@Override
protected void onResume() {
    super.onResume();
    mSurfaceView = (SurfaceView) LayoutInflater.from(getActivity())
            .inflate(R.layout.ui_surface_view, null);
    flContainer.addView(mSurfaceView, 0);
    isPreview = true;
    mSurfaceHolder = mSurfaceView.getHolder();
    mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); // deprecated, ignored since API 11
    mSurfaceHolder.setKeepScreenOn(true);
    mSurfaceHolder.addCallback(this);
}
Opening the camera and starting the preview:
private boolean initPreview(SurfaceHolder holder) {
    int[] mSupportVideoFormat = {ImageFormat.NV21, ImageFormat.YV12};
    try {
        cameraid = getCameraId(isFrontCamera);
        mCamera = Camera.open(cameraid);
    } catch (Exception e) {
        e.printStackTrace();
        return false;
    }
    if (mCamera == null) {
        ToastUtils.toastResId(R.string.error_init_player_fail);
        return false;
    }
    // change to portrait record
    setCameraDisplayOrientation(cameraid, mCamera);
    try {
        mCamera.setPreviewDisplay(holder);
    } catch (IOException e) {
        ToastUtils.toastResId(R.string.error_IO_error);
        e.printStackTrace();
        return false;
    }
    Camera.Parameters parameters = mCamera.getParameters();
    parameters.setPreviewSize(width, height);
    parameters.setPictureSize(width, height);
    if (isLightOn) {
        parameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
    }
    // Pick the first preferred format that the camera supports
    int actualFormat = 0;
    List<Integer> list = parameters.getSupportedPreviewFormats();
    for (int format : mSupportVideoFormat) {
        for (Integer i : list) {
            Timber.i("startVideoCapture supported format: " + i);
            if (format == i.intValue()) {
                actualFormat = format;
                break;
            }
        }
        if (actualFormat != 0) {
            break;
        }
    }
    if (actualFormat == 0) {
        Timber.e("startVideoCapture: no supported format found");
        return false;
    }
    parameters.setPreviewFormat(actualFormat); // e.g. ImageFormat.YV12
    try {
        mCamera.setParameters(parameters);
        mCamera.startPreview();
    } catch (Exception e) {
        e.printStackTrace();
        return false;
    }
    return true;
}
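The nested loop in initPreview() that picks a preview format reduces to "return the first preferred format the camera also supports". A plain-Java sketch of that selection (the class and method names are illustrative; the helper works with any int format codes, not just ImageFormat constants):

```java
import java.util.List;

class FormatPicker {
    // Return the first entry of `preferred` that also appears in `supported`,
    // or 0 if none matches -- the same logic as the loop in initPreview().
    public static int pickPreviewFormat(int[] preferred, List<Integer> supported) {
        for (int format : preferred) {
            if (supported.contains(format)) {
                return format;
            }
        }
        return 0;
    }
}
```

Ordering `preferred` by priority (here NV21 before YV12) means the first match is also the best available choice.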
Google Camera2 sample, simplified
<uses-sdk
    android:minSdkVersion="21"
    android:targetSdkVersion="21" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-feature android:name="android.hardware.camera2.full" />
The layout:
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingBottom="@dimen/activity_vertical_margin"
    android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    tools:context="com.example.camera2te.MainActivity" >

    <TextureView
        android:id="@+id/texture"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_alignParentStart="true"
        android:layout_alignParentTop="true" />

</RelativeLayout>
Set the TextureView callback, and open the camera in the onSurfaceTextureAvailable callback:
private TextureView.SurfaceTextureListener mSurfaceTextureListener = new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        Log.e(TAG, "onSurfaceTextureAvailable, width=" + width + ",height=" + height);
        openCamera();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
    }
};

@Override
protected void onResume() {
    ...
    mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
    ...
}
Open the camera; start the preview in the CameraDevice callback:
private void openCamera() {
    CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
    try {
        String cameraId = manager.getCameraIdList()[0];
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
        StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        mPreviewSize = map.getOutputSizes(SurfaceTexture.class)[0];
        manager.openCamera(cameraId, mStateCallback, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
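The code above simply takes getOutputSizes(SurfaceTexture.class)[0]; Google's Camera2Basic sample instead compares candidate sizes by area. That comparison can be sketched in plain Java using {width, height} pairs in place of android.util.Size (the class and method names are illustrative):

```java
class SizeChooser {
    // Pick the {width, height} pair with the largest area, analogous to
    // Camera2Basic's CompareSizesByArea. Long math avoids int overflow
    // for very large sensor resolutions.
    public static int[] largestByArea(int[][] sizes) {
        int[] best = sizes[0];
        for (int[] s : sizes) {
            if ((long) s[0] * s[1] > (long) best[0] * best[1]) {
                best = s;
            }
        }
        return best;
    }
}
```

In a real app you would also filter the candidates by aspect ratio and by the TextureView's dimensions before choosing.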
private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
    @Override
    public void onOpened(CameraDevice camera) {
        mCameraDevice = camera;
        startPreview();
    }

    @Override
    public void onDisconnected(CameraDevice camera) {
    }

    @Override
    public void onError(CameraDevice camera, int error) {
    }
};
Starting the preview:
protected void startPreview() {
    if (null == mCameraDevice || !mTextureView.isAvailable() || null == mPreviewSize) {
        Log.e(TAG, "startPreview fail, return");
        return;
    }
    SurfaceTexture texture = mTextureView.getSurfaceTexture();
    if (null == texture) {
        Log.e(TAG, "texture is null, return");
        return;
    }
    texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
    Surface surface = new Surface(texture);
    try {
        mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
    mPreviewBuilder.addTarget(surface);
    try {
        mCameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                mPreviewSession = session;
                updatePreview();
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
                Toast.makeText(MainActivity.this, "onConfigureFailed", Toast.LENGTH_LONG).show();
            }
        }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
protected void updatePreview() {
    if (null == mCameraDevice) {
        Log.e(TAG, "updatePreview error, return");
        return;
    }
    mPreviewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
    HandlerThread thread = new HandlerThread("CameraPreview");
    thread.start();
    Handler backgroundHandler = new Handler(thread.getLooper());
    try {
        mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), null, backgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
Printing Content (API 19+)
Printing a photo with the support library's PrintHelper:
PrintHelper photoPrinter = new PrintHelper(getActivity());
photoPrinter.setScaleMode(PrintHelper.SCALE_MODE_FIT);
Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.droids);
photoPrinter.printBitmap("droids.jpg - test print", bitmap);
Printing an HTML document
private WebView mWebView;

private void doWebViewPrint() {
    // Create a WebView object specifically for printing
    WebView webView = new WebView(getActivity());
    webView.setWebViewClient(new WebViewClient() {
        public boolean shouldOverrideUrlLoading(WebView view, String url) {
            return false;
        }

        @Override
        public void onPageFinished(WebView view, String url) {
            Log.i(TAG, "page finished loading " + url);
            createWebPrintJob(view);
            mWebView = null;
        }
    });

    // Generate an HTML document on the fly:
    String htmlDocument = "<html><body><h1>Test Content</h1><p>Testing, " +
            "testing, testing...</p></body></html>";
    webView.loadDataWithBaseURL(null, htmlDocument, "text/html", "UTF-8", null);

    // Keep a reference to the WebView until the PrintDocumentAdapter has been
    // passed to the PrintManager, or it may be garbage collected before printing
    mWebView = webView;
}
private void createWebPrintJob(WebView webView) {
    // Get a PrintManager instance
    PrintManager printManager = (PrintManager) getActivity()
            .getSystemService(Context.PRINT_SERVICE);

    // Get a print adapter instance
    PrintDocumentAdapter printAdapter = webView.createPrintDocumentAdapter();

    // Create a print job with name and adapter instance
    String jobName = getString(R.string.app_name) + " Document";
    PrintJob printJob = printManager.print(jobName, printAdapter,
            new PrintAttributes.Builder().build());

    // Save the job object for later status checking
    mPrintJobs.add(printJob);
}