How do I manipulate the camera preview?


There are several tutorials out there that explain how to get a simple camera preview up and running on an Android device. But I couldn't find any example that explains how to manipulate the image before it is rendered. What I want to do is implement custom color filters to simulate, for example, red and/or green deficiency.

Solution

I did some research on this and put together a working example. Here is what I found. Getting the raw data from the camera is easy enough: it comes back as a YUV byte array. To modify the image before it is rendered, you have to draw it onto a surface yourself, which means using a SurfaceView on which you can run your own draw calls. There are a couple of flags you can set to make that possible.

To issue the draw call manually you need to turn the byte array into some kind of bitmap. At this point Bitmap and the bitmap decoders don't seem to handle YUV byte arrays very well. A bug has been filed for this, but I don't know what its status is, so people have been decoding the byte array into an RGB format themselves. Doing the decode by hand is fairly slow and people have had mixed success with it; something like this should probably be done in native code at the NDK level.

Still, it can be made to work. Also, my little demo is just a couple of hours of hacking things together (this caught my imagination a bit too much ;), so with some tweaking you can probably improve considerably on what I got working.

The snippet below also contains a couple of other gems I found along the way. If all you want is to draw on top of the surface, you can override the surface's onDraw function, analyze the returned camera image, and draw an overlay on it; that is much faster than trying to process every frame. Also, I set SurfaceHolder.SURFACE_TYPE_NORMAL rather than what you would need for the real camera preview to show up. So there are two changes compared with a normal preview setup. The commented-out code:
//try { mCamera.setPreviewDisplay(holder); } catch (IOException e)
//  { Log.e("Camera", "mCamera.setPreviewDisplay(holder);"); }
and:
SurfaceHolder.SURFACE_TYPE_NORMAL //SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS - for preview to work
should allow you to superimpose frames, based on the camera preview, on top of the real preview. Anyway, here is a working piece of code; it should give you something to start from. Just put a line like this into one of your layouts:
<pathtocustomview.MySurfaceView android:id=\"@+id/surface_camera\"
    android:layout_width=\"fill_parent\" android:layout_height=\"10dip\"
    android:layout_weight=\"1\">
</pathtocustomview.MySurfaceView>
and include this class somewhere in your source code:
package pathtocustomview;

import java.io.IOException;
import java.nio.Buffer;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Rect;
import android.hardware.Camera;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceHolder.Callback;
import android.view.SurfaceView;

public class MySurfaceView extends SurfaceView implements Callback,Camera.PreviewCallback {

    private SurfaceHolder mHolder;

    private Camera mCamera;
    private boolean isPreviewRunning = false;
    private byte [] rgbbuffer = new byte[256 * 256];
    private int [] rgbints = new int[256 * 256];

    protected final Paint rectanglePaint = new Paint();

    public MySurfaceView(Context context,AttributeSet attrs) {
    super(context,attrs);
        rectanglePaint.setARGB(100,200,0);
        rectanglePaint.setStyle(Paint.Style.FILL);
        rectanglePaint.setStrokeWidth(2);

        mHolder = getHolder();
        mHolder.addCallback(this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        canvas.drawRect(new Rect((int) Math.random() * 100,
                (int) Math.random() * 100, 200, 200), rectanglePaint);
        Log.w(this.getClass().getName(), "On Draw Called");
    }

    public void surfaceChanged(SurfaceHolder holder,int format,int width,int height) {
    }

    public void surfaceCreated(SurfaceHolder holder) {
        synchronized (this) {
            this.setWillNotDraw(false); // This allows us to make our own draw
                                    // calls to this canvas

            mCamera = Camera.open();

            Camera.Parameters p = mCamera.getParameters();
            p.setPreviewSize(240,160);
            mCamera.setParameters(p);


            //try { mCamera.setPreviewDisplay(holder); } catch (IOException e)
            //  { Log.e("Camera", "mCamera.setPreviewDisplay(holder);"); }

            mCamera.startPreview();
            mCamera.setPreviewCallback(this);

        }
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        synchronized (this) {
            try {
                if (mCamera != null) {
                    mCamera.stopPreview();
                    isPreviewRunning = false;
                    mCamera.release();
                }
            } catch (Exception e) {
                Log.e(\"Camera\",e.getMessage());
            }
        }
    }

    public void onPreviewFrame(byte[] data,Camera camera) {
        Log.d(\"Camera\",\"Got a camera frame\");

        Canvas c = null;

        if(mHolder == null){
            return;
        }

        try {
            synchronized (mHolder) {
                c = mHolder.lockCanvas(null);

                // Do your drawing here
                // So this data value you're getting back is formatted in YUV format and you can't do much
                // with it until you convert it to rgb
                int bwCounter=0;
                int yuvsCounter=0;
                for (int y=0;y<160;y++) {
                    System.arraycopy(data,yuvsCounter,rgbbuffer,bwCounter,240);
                    yuvsCounter=yuvsCounter+240;
                    bwCounter=bwCounter+256;
                }

                for(int i = 0; i < rgbints.length; i++){
                    rgbints[i] = (int)rgbbuffer[i];
                }

                //decodeYUV(rgbbuffer, data, 100, 100);
                c.drawBitmap(rgbints, 0, 256, 0, 0, 256, 256, false, new Paint());

                Log.d("SOMETHING", "Got Bitmap");

            }
        } finally {
            // do this in a finally so that if an exception is thrown
            // during the above, we don't leave the Surface in an
            // inconsistent state
            if (c != null) {
                mHolder.unlockCanvasAndPost(c);
            }
        }
    }
}
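Since the original question is about simulating a red/green deficiency, and the class above only draws the raw frame, here is a minimal sketch of what the filtering step could look like. It is not part of the answer's code: the ColorDeficiencyFilter class name is made up for the example, and the matrix coefficients are purely illustrative (they just mix the red and green channels), not a clinically accurate model:

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.ColorMatrix;
import android.graphics.ColorMatrixColorFilter;
import android.graphics.Paint;

public final class ColorDeficiencyFilter {

    // 4x5 color matrix that blends the red and green channels together.
    // Coefficients are illustrative only.
    private static final ColorMatrix MIX_RED_GREEN = new ColorMatrix(new float[] {
            0.567f, 0.433f, 0f,     0f, 0f,   // R' = 0.567R + 0.433G
            0.558f, 0.442f, 0f,     0f, 0f,   // G' = 0.558R + 0.442G
            0f,     0.242f, 0.758f, 0f, 0f,   // B' = 0.242G + 0.758B
            0f,     0f,     0f,     1f, 0f    // alpha unchanged
    });

    private static final Paint FILTER_PAINT = new Paint();
    static {
        FILTER_PAINT.setColorFilter(new ColorMatrixColorFilter(MIX_RED_GREEN));
    }

    /** Draws the decoded preview pixels onto the canvas with the color filter applied. */
    public static void drawFiltered(Canvas canvas, int[] pixels, int width, int height) {
        // Fine for a demo; for production you would reuse one Bitmap and call setPixels()
        // instead of allocating a new Bitmap on every frame.
        Bitmap frame = Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
        canvas.drawBitmap(frame, 0f, 0f, FILTER_PAINT);
    }
}

In the onPreviewFrame above you would then call something like ColorDeficiencyFilter.drawFiltered(c, rgbints, 256, 256) in place of the plain drawBitmap call.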
I used walta's solution, but ran into a few problems with the YUV conversion, the camera's frame output size, and a crash when the camera was released. In the end the following code worked for me:
import java.util.List;

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.ImageFormat;
import android.graphics.PixelFormat;
import android.graphics.PorterDuff.Mode;
import android.hardware.Camera;
import android.hardware.Camera.Parameters;
import android.hardware.Camera.Size;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceHolder.Callback;
import android.view.SurfaceView;

public class MySurfaceView extends SurfaceView implements Callback, Camera.PreviewCallback {

private static final String TAG = \"MySurfaceView\";

private int width;
private int height;

private SurfaceHolder mHolder;

private Camera mCamera;
private int[] rgbints;

private boolean isPreviewRunning = false; 

private int mMultiplyColor;

public MySurfaceView(Context context, AttributeSet attrs) {
    super(context, attrs);

    mHolder = getHolder();
    mHolder.addCallback(this);
    mMultiplyColor = getResources().getColor(R.color.multiply_color);
}

// @Override
// protected void onDraw(Canvas canvas) {
// Log.w(this.getClass().getName(),\"On Draw Called\");
// }

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {

}

@Override
public void surfaceCreated(SurfaceHolder holder) {
    synchronized (this) {
        if (isPreviewRunning)
            return;

        this.setWillNotDraw(false); // This allows us to make our own draw calls to this canvas


        mCamera = Camera.open();
        isPreviewRunning = true;
        Camera.Parameters p = mCamera.getParameters();
        Size size = p.getPreviewSize();
        width = size.width;
        height = size.height;
        p.setPreviewFormat(ImageFormat.NV21);
        showSupportedCameraFormats(p);
        mCamera.setParameters(p);

        rgbints = new int[width * height];

        // try { mCamera.setPreviewDisplay(holder); } catch (IOException e)
        // { Log.e("Camera", "mCamera.setPreviewDisplay(holder);"); }

        mCamera.startPreview();
        mCamera.setPreviewCallback(this);

    }
}


@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    synchronized (this) {
        try {
            if (mCamera != null) {
                //mHolder.removeCallback(this);
                mCamera.setPreviewCallback(null);
                mCamera.stopPreview();
                isPreviewRunning  = false;
                mCamera.release();
            }
        } catch (Exception e) {
            Log.e(\"Camera\",e.getMessage());
        }
    }
}

@Override
public void onPreviewFrame(byte[] data,Camera camera) {
    // Log.d(\"Camera\",\"Got a camera frame\");
    if (!isPreviewRunning)
        return;

    Canvas canvas = null;

    if (mHolder == null) {
        return;
    }

    try {
        synchronized (mHolder) {
            canvas = mHolder.lockCanvas(null);
            int canvasWidth = canvas.getWidth();
            int canvasHeight = canvas.getHeight();

            decodeYUV(rgbints, data, width, height);

            // draw the decoded image, centered on the canvas
            canvas.drawBitmap(rgbints, 0, width, canvasWidth - ((width + canvasWidth) >> 1),
                    canvasHeight - ((height + canvasHeight) >> 1), width, height, false, null);

            // use some color filter
            canvas.drawColor(mMultiplyColor,Mode.MULTIPLY);

        }
    }  catch (Exception e){
        e.printStackTrace();
    } finally {
        // do this in a finally so that if an exception is thrown
        // during the above, we don't leave the Surface in an
        // inconsistent state
        if (canvas != null) {
            mHolder.unlockCanvasAndPost(canvas);
        }
    }
}



/**
 * Decodes a YUV frame to a buffer which can be used to create a bitmap.
 * Use this for OS versions below FROYO, which lack a native YUV decoder.
 * Decodes the Y, U and V values of the YUV 420 buffer described as
 * YCbCr_420_SP (NV21) by Android.
 * 
 * @param out
 *            the outgoing array of RGB pixels
 * @param fg
 *            the incoming frame bytes
 * @param width
 *            of source frame
 * @param height
 *            of source frame
 * @throws NullPointerException
 * @throws IllegalArgumentException
 */
public void decodeYUV(int[] out, byte[] fg, int width, int height) throws NullPointerException, IllegalArgumentException {
    int sz = width * height;
    if (out == null)
        throw new NullPointerException("buffer out is null");
    if (out.length < sz)
        throw new IllegalArgumentException("buffer out size " + out.length + " < minimum " + sz);
    if (fg == null)
        throw new NullPointerException("buffer 'fg' is null");
    if (fg.length < sz * 3 / 2)
        throw new IllegalArgumentException("buffer fg size " + fg.length + " < minimum " + sz * 3 / 2);
    int i,j;
    int Y,Cr = 0,Cb = 0;
    for (j = 0; j < height; j++) {
        int pixPtr = j * width;
        final int jDiv2 = j >> 1;
    for (i = 0; i < width; i++) {
        Y = fg[pixPtr];
        if (Y < 0)
            Y += 255;
        if ((i & 0x1) != 1) {
            final int cOff = sz + jDiv2 * width + (i >> 1) * 2;
            Cb = fg[cOff];
            if (Cb < 0)
                Cb += 127;
            else
                Cb -= 128;
            Cr = fg[cOff + 1];
            if (Cr < 0)
                Cr += 127;
            else
                Cr -= 128;
        }
        int R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
        if (R < 0)
            R = 0;
        else if (R > 255)
            R = 255;
        int G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1) + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
        if (G < 0)
            G = 0;
        else if (G > 255)
            G = 255;
        int B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
        if (B < 0)
            B = 0;
        else if (B > 255)
            B = 255;
        out[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
    }
    }

}

private void showSupportedCameraFormats(Parameters p) {
    List<Integer> supportedPreviewFormats = p.getSupportedPreviewFormats();
    Log.d(TAG, "preview format: " + cameraFormatIntToString(p.getPreviewFormat()));
    for (Integer x : supportedPreviewFormats) {
        Log.d(TAG, "supported format: " + cameraFormatIntToString(x.intValue()));
    }

}

private String cameraFormatIntToString(int format) {
    switch (format) {
    case PixelFormat.JPEG:
        return "JPEG";
    case PixelFormat.YCbCr_420_SP:
        return "NV21";
    case PixelFormat.YCbCr_422_I:
        return "YUY2";
    case PixelFormat.YCbCr_422_SP:
        return "NV16";
    case PixelFormat.RGB_565:
        return "RGB_565";
    default:
        return "Unknown:" + format;

        }
    }
}
To use it, run something like this in your Activity's onCreate:
SurfaceView surfaceView = new MySurfaceView(this, null);
RelativeLayout.LayoutParams layoutParams = new RelativeLayout.LayoutParams(
        RelativeLayout.LayoutParams.MATCH_PARENT, RelativeLayout.LayoutParams.MATCH_PARENT);
surfaceView.setLayoutParams(layoutParams);
mRelativeLayout.addView(surfaceView);
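One more note on the decodeYUV step: the javadoc above points out that FROYO and later ship a native YUV decoder. On API 8+ you can indeed let android.graphics.YuvImage do the NV21 conversion for you instead of decoding by hand. A minimal sketch (the Nv21Decoder class name is mine), assuming data, width and height are the same values used in onPreviewFrame:

import java.io.ByteArrayOutputStream;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;

public final class Nv21Decoder {

    // Converts one NV21 preview frame to a Bitmap via the platform decoder (API 8+).
    public static Bitmap toBitmap(byte[] data, int width, int height) {
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21, width, height, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // Compress the whole frame to JPEG, then decode the JPEG back into a Bitmap.
        yuv.compressToJpeg(new Rect(0, 0, width, height), 90, out);
        byte[] jpeg = out.toByteArray();
        return BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
    }
}

The JPEG round trip allocates on every frame, so this is convenient rather than fast; for per-frame filtering the manual decodeYUV above, or native code as the first answer suggests, is still the better fit.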
Have you looked at GPUImage? It started out as an OSX/iOS library by Brad Larson, implemented as an Objective-C wrapper around OpenGL/ES: https://github.com/BradLarson/GPUImage The folks at CyberAgent have made an Android port (it does not have complete feature parity), which is a set of Java wrappers on top of OpenGL ES. It is fairly high level, easy to implement, and offers many of the same features mentioned above: https://github.com/CyberAgent/android-gpuimage
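For reference, here is roughly what the Android port looks like in use. This is a sketch based on the android-gpuimage README: GPUImage, setGLSurfaceView, setFilter and GPUImageGrayscaleFilter are the library's documented entry points, but package paths and filter names have changed between versions, and the layout and view ids below are made up for the example, so verify everything against the version you actually pull in:

import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;

// Package path as in the 1.x releases; newer releases moved the filters to a "filter" subpackage.
import jp.co.cyberagent.android.gpuimage.GPUImage;
import jp.co.cyberagent.android.gpuimage.GPUImageGrayscaleFilter;

public class GpuImageDemoActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Hypothetical layout containing a GLSurfaceView with id "gpu_surface".
        setContentView(R.layout.activity_gpuimage_demo);

        GPUImage gpuImage = new GPUImage(this);
        gpuImage.setGLSurfaceView((GLSurfaceView) findViewById(R.id.gpu_surface));
        // Apply one of the built-in filters; the library ships many others,
        // including ones driven by custom fragment shaders.
        gpuImage.setFilter(new GPUImageGrayscaleFilter());

        // For a still image you would hand it a Bitmap, Uri or File:
        // gpuImage.setImage(someBitmap);
    }
}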
