I am trying to build a Cardboard Android application that shows two camera views side by side, just like the camera view works in the VRCinema Android app.
So I studied the Cardboard code from GitHub, made some modifications, and so far I am able to use an ImageView to replicate the same image side by side.
The code so far looks like this.
AndroidManifest.xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.google.vrtoolkit.cardboard.samples.treasurehunt" >
<uses-permission android:name="android.permission.NFC" />
<uses-permission android:name="android.permission.VIBRATE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-feature android:name="android.hardware.camera" android:required="false" />
<uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />
<uses-feature android:name="android.hardware.camera.front" android:required="false" />
<uses-sdk android:minSdkVersion="14"/>
<uses-feature android:glEsVersion="0x00020000" android:required="true" />
<application
android:allowBackup="true"
android:icon="@drawable/ic_launcher"
android:label="@string/app_name" >
<activity
android:screenOrientation="landscape"
android:name=".MainActivity"
android:label="@string/app_name" >
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
</application>
</manifest>
common_ui.xml
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/ui_layout"
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent" >

    <com.google.vrtoolkit.cardboard.CardboardView
        android:id="@+id/cardboard_view"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_alignParentTop="true"
        android:layout_alignParentLeft="true" />

    <com.google.vrtoolkit.cardboard.samples.treasurehunt.CardboardOverlayView
        android:id="@+id/overlay"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_alignParentLeft="true"
        android:layout_alignParentTop="true"
        android:layout_centerInParent="true" />
</RelativeLayout>
CardboardOverlayView.java
package com.google.vrtoolkit.cardboard.samples.treasurehunt;
import android.content.Context;
import android.graphics.Color;
import android.graphics.Typeface;
import android.util.AttributeSet;
import android.util.TypedValue;
import android.view.Gravity;
import android.view.View;
import android.view.ViewGroup;
import android.view.animation.AlphaAnimation;
import android.view.animation.Animation;
import android.widget.ImageView;
import android.widget.LinearLayout;
import android.widget.TextView;
/**
* Contains two sub-views to provide a simple stereo HUD.
*/
public class CardboardOverlayView extends LinearLayout {
    private static final String TAG = CardboardOverlayView.class.getSimpleName();
    private final CardboardOverlayEyeView mLeftView;
    private final CardboardOverlayEyeView mRightView;
    private AlphaAnimation mTextFadeAnimation;

    public CardboardOverlayView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setOrientation(HORIZONTAL);
        LayoutParams params = new LayoutParams(
                LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT, 1.0f);
        params.setMargins(0, 0, 0, 0);

        mLeftView = new CardboardOverlayEyeView(context, attrs);
        mLeftView.setLayoutParams(params);
        addView(mLeftView);

        mRightView = new CardboardOverlayEyeView(context, attrs);
        mRightView.setLayoutParams(params);
        addView(mRightView);

        // Set some reasonable defaults.
        setDepthOffset(0.016f);
        setColor(Color.rgb(150, 255, 180));
        setVisibility(View.VISIBLE);

        mTextFadeAnimation = new AlphaAnimation(1.0f, 0.0f);
        mTextFadeAnimation.setDuration(5000);
    }

    public void show3DToast(String message) {
        setText(message);
        setTextAlpha(1f);
        mTextFadeAnimation.setAnimationListener(new EndAnimationListener() {
            @Override
            public void onAnimationEnd(Animation animation) {
                setTextAlpha(0f);
            }
        });
        startAnimation(mTextFadeAnimation);
    }

    public void show3DImage() {
        setImg();
    }

    private abstract class EndAnimationListener implements Animation.AnimationListener {
        @Override public void onAnimationRepeat(Animation animation) {}
        @Override public void onAnimationStart(Animation animation) {}
    }

    private void setDepthOffset(float offset) {
        mLeftView.setOffset(offset);
        mRightView.setOffset(-offset);
    }

    //---------------------------------------------------------------------------------------------
    private void setImg() {
        mLeftView.imageView.setImageResource(R.drawable.mona_lisa);
        mRightView.imageView.setImageResource(R.drawable.mona_lisa);
    }
    //---------------------------------------------------------------------------------------------

    private void setText(String text) {
        mLeftView.setText(text);
        mRightView.setText(text);
    }

    private void setTextAlpha(float alpha) {
        mLeftView.setTextViewAlpha(alpha);
        mRightView.setTextViewAlpha(alpha);
    }

    private void setColor(int color) {
        mLeftView.setColor(color);
        mRightView.setColor(color);
    }

    /**
     * A simple view group containing some horizontally centered text underneath a horizontally
     * centered image.
     *
     * This is a helper class for CardboardOverlayView.
     */
    private class CardboardOverlayEyeView extends ViewGroup {
        private final ImageView imageView;
        private final TextView textView;
        private float offset;

        public CardboardOverlayEyeView(Context context, AttributeSet attrs) {
            super(context, attrs);
            imageView = new ImageView(context, attrs);
            imageView.setScaleType(ImageView.ScaleType.FIT_CENTER);
            imageView.setAdjustViewBounds(true); // Preserve aspect ratio.
            addView(imageView);

            textView = new TextView(context, attrs);
            textView.setTextSize(TypedValue.COMPLEX_UNIT_DIP, 14.0f);
            textView.setTypeface(textView.getTypeface(), Typeface.BOLD);
            textView.setGravity(Gravity.CENTER);
            textView.setShadowLayer(3.0f, 0.0f, 0.0f, Color.DKGRAY);
            addView(textView);
        }

        public void setColor(int color) {
            //imageView.setColorFilter(color);
            textView.setTextColor(color);
        }

        public void setText(String text) {
            textView.setText(text);
        }

        public void setTextViewAlpha(float alpha) {
            textView.setAlpha(alpha);
        }

        public void setOffset(float offset) {
            this.offset = offset;
        }

        @Override
        protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
            // Width and height of this ViewGroup.
            final int width = right - left;
            final int height = bottom - top;
            // The size of the image, given as a fraction of the dimension of this ViewGroup. We
            // multiply both width and height by this number to compute the image's bounding box.
            // Inside the box, the image is horizontally and vertically centered.
            final float imageSize = 1.0f;

            // The fraction of this ViewGroup's height by which we shift the image off the
            // ViewGroup's center. Positive values shift downwards, negative values shift upwards.
            final float verticalImageOffset = -0.07f;

            // Vertical position of the text, specified in fractions of this ViewGroup's height.
            final float verticalTextPos = 0.52f;

            // Layout ImageView
            float imageMargin = (1.0f - imageSize) / 2.0f;
            float leftMargin = (int) (width * (imageMargin + offset));
            float topMargin = (int) (height * (imageMargin + verticalImageOffset));
            imageView.layout(
                    (int) leftMargin, (int) topMargin,
                    (int) (leftMargin + width * imageSize), (int) (topMargin + height * imageSize));

            // Layout TextView
            leftMargin = offset * width;
            topMargin = height * verticalTextPos;
            textView.layout(
                    (int) leftMargin, (int) topMargin,
                    (int) (leftMargin + width), (int) (topMargin + height * (1.0f - verticalTextPos)));
        }
    }
}
MainActivity.java
package com.google.vrtoolkit.cardboard.samples.treasurehunt;
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.hardware.Camera;
import android.net.Uri;
import android.os.Bundle;
import android.os.Vibrator;
import android.util.Log;
import android.widget.ImageView;
import com.google.vrtoolkit.cardboard.*;
import javax.microedition.khronos.egl.EGLConfig;
import java.nio.FloatBuffer;
/**
* A Cardboard sample application.
*/
public class MainActivity extends CardboardActivity implements CardboardView.StereoRenderer {
    private static final int CAMERA_REQUEST = 1888;
    private static final String TAG = "MainActivity";
    private static final int CAPTURE_IMAGE_ACTIVITY_REQ = 0;
    Uri fileUri = null;
    ImageView photoImage = null;

    private static final float CAMERA_Z = 0.01f;
    private static final float TIME_DELTA = 0.3f;
    private static final float YAW_LIMIT = 0.12f;
    private static final float PITCH_LIMIT = 0.12f;
    // We keep the light always positioned just above the user.
    private final float[] mLightPosInWorldSpace = new float[]{0.0f, 2.0f, 0.0f, 1.0f};
    private final float[] mLightPosInEyeSpace = new float[4];

    private static final int COORDS_PER_VERTEX = 3;
    private final WorldLayoutData DATA = new WorldLayoutData();

    private FloatBuffer mFloorVertices;
    private FloatBuffer mFloorColors;
    private FloatBuffer mFloorNormals;
    private FloatBuffer mCubeVertices;
    private FloatBuffer mCubeColors;
    private FloatBuffer mCubeFoundColors;
    private FloatBuffer mCubeNormals;

    private int mGlProgram;
    private int mPositionParam;
    private int mNormalParam;
    private int mColorParam;
    private int mModelViewProjectionParam;
    private int mLightPosParam;
    private int mModelViewParam;
    private int mModelParam;
    private int mIsFloorParam;

    private float[] mModelCube;
    private float[] mCamera;
    private float[] mView;
    private float[] mHeadView;
    private float[] mModelViewProjection;
    private float[] mModelView;
    private float[] mModelFloor;

    private int mScore = 0;
    private float mObjectDistance = 12f;
    private float mFloorDepth = 20f;

    private Vibrator mVibrator;
    private CardboardOverlayView mOverlayView;

    public MainActivity() {
    }

    /**
     * Sets the view to our CardboardView and initializes the transformation matrices we will use
     * to render our scene.
     * //@param savedInstanceState
     */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.common_ui);

        CardboardView cardboardView = (CardboardView) findViewById(R.id.cardboard_view);
        cardboardView.setRenderer(this);
        setCardboardView(cardboardView);

        mModelCube = new float[16];
        mCamera = new float[16];
        mView = new float[16];
        mModelViewProjection = new float[16];
        mModelView = new float[16];
        mModelFloor = new float[16];
        mHeadView = new float[16];

        mVibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);

        mOverlayView = (CardboardOverlayView) findViewById(R.id.overlay);
        mOverlayView.show3DToast("Pull the magnet when you find an object.");
        mOverlayView.show3DImage();
    }

    @Override
    public void onRendererShutdown() {
        Log.i(TAG, "onRendererShutdown");
    }

    @Override
    public void onSurfaceChanged(int width, int height) {
        Log.i(TAG, "onSurfaceChanged");
    }

    /**
     * Creates the buffers we use to store information about the 3D world. OpenGL doesn't use Java
     * arrays, but rather needs data in a format it can understand. Hence we use ByteBuffers.
     *
     * @param config The EGL configuration used when creating the surface.
     */
    @Override
    public void onSurfaceCreated(EGLConfig config) {
        Log.i(TAG, "onSurfaceCreated");
    }

    /**
     * Prepares OpenGL ES before we draw a frame.
     *
     * @param headTransform The head transformation in the new frame.
     */
    @Override
    public void onNewFrame(HeadTransform headTransform) {
    }

    /**
     * Draws a frame for an eye. The transformation for that eye (from the camera) is passed in as
     * a parameter.
     *
     * @param transform The transformations to apply to render this eye.
     */
    @Override
    public void onDrawEye(EyeTransform transform) {
    }

    @Override
    public void onFinishFrame(Viewport viewport) {
    }
}
A few things I have noticed:
I have also checked all the relevant links on Stack Overflow over the last 3 days. The closest to my question was Android multiple camera preview, but it doesn't quite solve my problem. I have also gone through the Android Camera documentation and learned how they use CameraPreview and SurfaceView.
So my question is: what do I need to do now to be able to see the camera preview (or the SurfaceView that holds it) instead of the ImageView, so that I can provide a live camera feed as a split screen in landscape mode?
I hope the question is detailed enough. If you need any further info, just ask.
Basically you need to draw the camera into an OpenGL texture and show the texture.
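In outline, the "draw the camera into an OpenGL texture" half means: create a GL_TEXTURE_EXTERNAL_OES texture, wrap it in a SurfaceTexture, and point the camera preview at it. Below is a minimal sketch of that wiring against the question's MainActivity, using the deprecated android.hardware.Camera API it already imports; the fields mCameraTextureId, mSurfaceTexture and mCamera and the helper createCameraTexture() are illustrative names, not Cardboard SDK API.

// Sketch only. Extra imports needed: android.graphics.SurfaceTexture,
// android.opengl.GLES11Ext, android.opengl.GLES20, java.io.IOException.
private int mCameraTextureId;
private SurfaceTexture mSurfaceTexture;
private Camera mCamera;

private int createCameraTexture() {
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    // Camera frames can only be sampled through the "external" texture target.
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    return tex[0];
}

@Override
public void onSurfaceCreated(EGLConfig config) {
    // A GL context is current here, so the texture can be created now.
    mCameraTextureId = createCameraTexture();
    mSurfaceTexture = new SurfaceTexture(mCameraTextureId);
    try {
        mCamera = Camera.open();                     // rear-facing camera
        mCamera.setPreviewTexture(mSurfaceTexture);  // preview frames now feed the GL texture
        mCamera.startPreview();
    } catch (IOException e) {
        Log.e(TAG, "Failed to start camera preview", e);
    }
}

In a real app the camera should also be stopped and released in onPause(); error handling here is deliberately minimal.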
I wrote two versions of this here; feel free to modify the code and add pull requests: https://github.com/Sveder/CardboardPassthrough
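The "show the texture" half is then a matter of calling updateTexImage() once per frame and drawing a full-screen quad that samples the external texture; since onDrawEye() is called once per eye, the same preview ends up side by side. Another rough sketch, with mPassthroughProgram and drawFullScreenQuad() as placeholders for the usual shader and quad plumbing (the repository above demonstrates a complete version):

// The fragment shader must sample the external (OES) texture.
private static final String PASSTHROUGH_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "varying vec2 vTextureCoord;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "void main() {\n" +
        "    gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
        "}\n";

@Override
public void onNewFrame(HeadTransform headTransform) {
    // Runs on the GL thread: latch the newest camera frame into the texture
    // before either eye is drawn.
    mSurfaceTexture.updateTexImage();
}

@Override
public void onDrawEye(EyeTransform transform) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    GLES20.glUseProgram(mPassthroughProgram);  // program compiled from the shader above
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mCameraTextureId);
    drawFullScreenQuad();  // bind the quad's vertex/UV buffers and call glDrawArrays
}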