How to play YUV data in a GLSurfaceView-based player

> From Jerikc's blog
The Android SDK ships a Camera preview demo. Roughly, it initializes a Camera and a SurfaceView; once the SurfaceView has been created you can obtain its SurfaceHolder and hand it to the Camera, and the Camera then renders the captured video onto the SurfaceView automatically — that is the standard preview. Often, though, we want the Camera's raw per-frame data so we can pre-process and render it ourselves. The Camera exposes a callback for exactly that:
mCamera.setPreviewCallback(new Camera.PreviewCallback() {
    public void onPreviewFrame(byte[] data, Camera camera) { /* raw frame data */ }
});
Inside this callback we get the current frame and can pre-process it — compression, encryption, effects, and so on. The byte[] buffer, however, holds YUV data, usually YUV420SP (NV21), while the views Android provides — SurfaceView, GLSurfaceView, TextureView — only render RGB, so we need a conversion step.
Let's start with a YUV-to-RGB conversion. The usual formulas are linear:
R = Y + 1.4075 * (V - 128)
G = Y - 0.3455 * (U - 128) - 0.7169 * (V - 128)
B = Y + 1.779 * (U - 128)
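As a quick illustration (this is not the code used below), the floating-point formulas map onto a per-pixel helper like the following; the clamp to [0, 255] is needed because the linear transform can overshoot:

// Illustrative only: convert one pixel with the floating-point formulas above.
static int yuvToArgb(int y, int u, int v) {
    int r = (int) (y + 1.4075f * (v - 128));
    int g = (int) (y - 0.3455f * (u - 128) - 0.7169f * (v - 128));
    int b = (int) (y + 1.779f  * (u - 128));
    r = Math.max(0, Math.min(255, r));   // clamp: the transform can leave [0, 255]
    g = Math.max(0, Math.min(255, g));
    b = Math.max(0, Math.min(255, b));
    return 0xff000000 | (r << 16) | (g << 8) | b;   // pack as ARGB_8888
}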
Below is JNI code that converts YUV420SP into ARGB_8888. Similar code is easy to find online, and with minor changes it can be used directly from plain C.
jintArray Java_com_spore_jni_ImageUtilEngine_decodeYUV420SP(JNIEnv * env,
        jobject thiz, jbyteArray buf, jint width, jint height)
{
    jbyte * yuv420sp = (*env)->GetByteArrayElements(env, buf, 0);
    int frameSize = width * height;
    jint rgb[frameSize]; // output pixels
    int i = 0, j = 0, yp = 0;
    int uvp = 0, u = 0, v = 0;
    for (j = 0, yp = 0; j < height; j++) {
        uvp = frameSize + (j >> 1) * width;
        u = 0;
        v = 0;
        for (i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
    jintArray result = (*env)->NewIntArray(env, frameSize);
    (*env)->SetIntArrayRegion(env, result, 0, frameSize, rgb);
    (*env)->ReleaseByteArrayElements(env, buf, yuv420sp, 0);
    return result;
}
The Java declaration that this JNI function implements is:
public native int[] decodeYUV420SP(byte[] buf, int width, int height);

The interface is easy to read: buf is the YUV frame obtained from the Camera's onPreviewFrame callback, and width and height are the dimensions of the target Bitmap. The return value is an array of ARGB_8888 pixels, and turning that array into a Bitmap is straightforward:
mBitmap = Bitmap.createBitmap(data, width, height, Config.ARGB_8888);
That is basically all it takes to get YUV-to-RGB working, but there is a catch: it is a software conversion, so performance is mediocre. For a typical video-call resolution around 320x240 it can keep up in real time, but 720p is hopeless. Still, there are some optimizations worth applying to the implementation above.
The code above already avoids floating point and uses shifts for most operations, so the remaining candidates for optimization are the multiplications, which we can replace with table lookups. Y, U and V each take only 256 possible values, and r, g, b depend on them linearly: r depends on Y and V, g on Y, U and V, and b on Y and U. We can therefore precompute every possible product — for example all values of 1634 * v in a 256-entry array — and replace each multiplication with a lookup.
Going further, because of this structure we can precompute and cache every possible R and B value in 256 x 256 int tables, which is 256 KB each. Why not build a table for G as well? G depends on all three of Y, U and V, so its table would need 256 x 256 x 256 entries — about 64 MB — which is not practical on a phone.
Here is the table-lookup version:
int g_v_table[256], g_u_table[256], y_table[256];
int r_yv_table[256][256], b_yu_table[256][256];
int inited = 0;

void initTable()
{
    if (inited == 0) {
        inited = 1;
        int m = 0, n = 0;
        for (; m < 256; m++) {
            g_v_table[m] = 833 * (m - 128);
            g_u_table[m] = 400 * (m - 128);
            y_table[m] = 1192 * (m - 16);
        }
        int temp = 0;
        for (m = 0; m < 256; m++) {
            for (n = 0; n < 256; n++) {
                temp = 1192 * (m - 16) + 1634 * (n - 128);
                if (temp < 0) temp = 0; else if (temp > 262143) temp = 262143;
                r_yv_table[m][n] = temp;
                temp = 1192 * (m - 16) + 2066 * (n - 128);
                if (temp < 0) temp = 0; else if (temp > 262143) temp = 262143;
                b_yu_table[m][n] = temp;
            }
        }
    }
}
jintArray Java_com_spore_jni_ImageUtilEngine_decodeYUV420SP(JNIEnv * env,
        jobject thiz, jbyteArray buf, jint width, jint height)
{
    jbyte * yuv420sp = (*env)->GetByteArrayElements(env, buf, 0);
    int frameSize = width * height;
    jint rgb[frameSize]; // output pixels
    initTable();
    int i = 0, j = 0, yp = 0;
    int uvp = 0, u = 0, v = 0;
    for (j = 0, yp = 0; j < height; j++) {
        uvp = frameSize + (j >> 1) * width;
        u = 0;
        v = 0;
        for (i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp]));
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]);
                u = (0xff & yuv420sp[uvp++]);
            }
            int y1192 = y_table[y];
            int r = r_yv_table[y][v];
            int g = (y1192 - g_v_table[v] - g_u_table[u]);
            int b = b_yu_table[y][u];
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
    jintArray result = (*env)->NewIntArray(env, frameSize);
    (*env)->SetIntArrayRegion(env, result, 0, frameSize, rgb);
    (*env)->ReleaseByteArrayElements(env, buf, yuv420sp, 0);
    return result;
}
There are a few more details worth optimizing. For instance, the output array can be allocated once on the Java side and passed down to JNI, which avoids copying a freshly allocated array from native code back to Java on every frame — for 720p frames that copy is large enough to be worth eliminating.
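A minimal sketch of that idea, assuming a variant of the native method that writes into a caller-supplied array (the signature below is illustrative, not the one from the code above):

// Hypothetical variant that fills a caller-owned buffer instead of returning a new array.
public native void decodeYUV420SPInto(byte[] yuv, int[] argbOut, int width, int height);

// Allocate once, outside the preview callback:
int[] argbOut = new int[width * height];
Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);

// Per frame, inside onPreviewFrame (data is the YUV buffer from the callback):
decodeYUV420SPInto(data, argbOut, width, height);
bmp.setPixels(argbOut, 0, width, 0, 0, width, height);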
Here is a screenshot of the result: the left half is a SurfaceView showing the Camera preview and the right half is a GLSurfaceView rendering the converted Bitmap (the preview area appears black because of the screen-capture tool).
How fast is the conversion? On my Nexus One, a 720p image (1280 x 720) converts and renders at roughly 8 frames per second.
I also ran across another table-based conversion routine that looks like it should be even faster (the link is lost here). Its parameters are not documented, though, and after a lot of fiddling the converted Bitmap still came out wrong; if anyone gets it working, please let me know.
Question about getting image data

Hi all, I'd like to get the YUV or RGB data after H264 decoding so I can run some analysis and processing on it. Is there an API for this in the SDK? Any pointers would be appreciated.
Re: Question about getting image data

After some effort I've managed to decode the H264 stream into YUV. The rough approach is below; hopefully it helps others with the same problem.
1. The data delivered in DJIReceivedVideoDataCallBack's onResult method is not a complete frame, so you have to reassemble complete frames before feeding them to the decoder.
2. Per the H264 spec there is no notion of a frame in the byte stream, only a sequence of NALUs. Extracting a frame by fully parsing NALUs is a hassle, but H264 allows an access-unit delimiter 00 00 00 01 09 to be inserted after each frame (it is optional and up to the encoder). Looking at the DJI stream, we are lucky: this encoder does emit it, so we can detect the end of a frame, collect one complete frame, and hand it to the decoder (a minimal frame-splitting sketch follows right after this list).
3. On Android the decoder can be FFmpeg or the platform's MediaCodec. The latter is hardware decoding — faster and more power-efficient — but it requires a recent enough Android version and behaviour can differ between devices, since hardware decoding depends on the chipset. FFmpeg has to be cross-compiled and driven from JNI. I chose MediaCodec, which was convenient to use, and got YUV data out of it. On iOS, FFmpeg should also work; I'm still learning iOS and haven't touched video decoding there, so I can't say much about the other options.
4. Once I had YUV, since I wanted RGB for some image-processing algorithms, I converted it. There are many ways to do this; I used OpenCV's cvtColor, which worked well (an OpenCV snippet appears a bit further below).
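A minimal sketch of the frame-splitting idea from step 2, assuming the stream really does carry the 00 00 00 01 09 access-unit delimiter (the class and method names here are made up for illustration):

import java.io.ByteArrayOutputStream;

// Illustrative frame assembler: append raw callback chunks and emit one frame
// each time the access-unit delimiter 00 00 00 01 09 is seen.
class FrameAssembler {
    private final ByteArrayOutputStream pending = new ByteArrayOutputStream();
    private long window = 0;   // rolling window of the last 5 bytes

    interface FrameSink { void onFrame(byte[] frame); }

    void append(byte[] chunk, int size, FrameSink sink) {
        for (int i = 0; i < size; i++) {
            int b = chunk[i] & 0xff;
            pending.write(b);
            window = ((window << 8) | b) & 0xFFFFFFFFFFL;        // keep only the last 5 bytes
            if (pending.size() >= 5 && window == 0x0000000109L) { // AUD: 00 00 00 01 09
                byte[] buf = pending.toByteArray();
                int frameLen = buf.length - 5;                    // drop the delimiter itself
                if (frameLen > 0) {
                    byte[] frame = new byte[frameLen];
                    System.arraycopy(buf, 0, frame, 0, frameLen);
                    sink.onFrame(frame);                          // one complete frame for the decoder
                }
                pending.reset();
            }
        }
    }
}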
(The original post attached a screenshot here of the converted image, saved with OpenCV on the phone.)
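For reference, a conversion along these lines with the OpenCV Android bindings might look like the following; it assumes the decoder outputs planar I420 — if it outputs NV12/NV21 the conversion constant changes:

import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

class YuvToRgb {
    // yuvBytes holds one decoded I420 frame: the Y plane followed by the U and V planes.
    static Mat toRgb(byte[] yuvBytes, int width, int height) {
        Mat yuv = new Mat(height + height / 2, width, CvType.CV_8UC1);
        yuv.put(0, 0, yuvBytes);
        Mat rgb = new Mat();
        // For semi-planar input use COLOR_YUV2RGB_NV12 / COLOR_YUV2RGB_NV21 instead.
        Imgproc.cvtColor(yuv, rgb, Imgproc.COLOR_YUV2RGB_I420);
        return rgb;
    }
}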
Re: Question about getting image data

Hi, we only provide the raw H264 stream; at present we do not provide API documentation for decoding it.
Re: Question about getting image data

Fair enough — I'll just decode it myself then.
Re: Question about getting image data

Hi, why is it that when I put the map and the video on the same screen the video becomes extremely laggy, with very high latency? I'm using the AMap (Gaode) 3D map — what could be causing this?
Re: Question about getting image data

Hi, thanks for your interest in the SDK. Both the map rendering and the video decoding are implemented on your side; we'd suggest putting the map and the video into two separate views. The processing power of the device may also be a factor.
Re: Question about getting image data

Thanks for sharing — bookmarked. I might well use this in my own project later on. Thanks again!
Re: Question about getting image data

Hi, there are two things I'm not clear on.
1. You said 0x00 0x00 0x00 0x01 0x09 marks the end of a frame, but I've read elsewhere that 0x00 0x00 0x00 0x01 is a start code, the payload follows, and 0x09 marks the end of a frame. Which is it? Also, does a frame boundary always coincide with a buffer boundary — could a single buffer contain data from two frames?
2. How do I display the decoded video-stream buffer on DjiGLSurfaceView? I couldn't find any API other than setDataToDecoder — or do we need to create a view of our own at that point, e.g. a GLSurfaceView?
Re: Question about getting image data

1. The stream we receive is a sequence of NALUs; there is no frame concept in it. Every NALU starts with 0x00 0x00 0x00 0x01 followed by coded data. Only when a frame ends is a standalone delimiter NALU inserted — 0x00 0x00 0x00 0x01 0x09, with no payload — and whatever follows the 0x09 is the start of the next frame. The buffer handed to the SDK callback is usually a fixed 1024 bytes, not a complete frame; it may contain the tail of one frame and the head of the next, so you need your own accumulation buffer and, after every callback, a check for whether it now holds a complete frame.
2. Once you call setDataToDecoder, DjiGLSurfaceView renders the data onto its surface. DjiGLSurfaceView is itself a SurfaceView, so there is no need to create another one. We can't see how DjiGLSurfaceView is implemented internally; presumably it decodes with FFmpeg in JNI and then renders to the surface.
Re: Question about getting image data

Thanks — I'm still not clear on the second point. After I decode with MediaCodec, how do I get the YUV video buffer onto a SurfaceView? The new buffer isn't H.264 any more, so can it still go through setDataToDecoder?
Re: Question about getting image data

You can pass your own Surface to MediaCodec when you call configure; after decoding, releaseOutputBuffer then displays the frame. At least that's what the documentation says — I haven't tried it myself.
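A rough sketch of that approach with MediaCodec (the API calls are real, but the flow is simplified and untested here):

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

class SurfaceDecoder {
    private MediaCodec codec;

    void start(Surface surface, int width, int height) throws java.io.IOException {
        codec = MediaCodec.createDecoderByType("video/avc");
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        codec.configure(format, surface, null, 0);  // hand MediaCodec the Surface to render into
        codec.start();
    }

    // Feed one complete access unit; decoded output goes straight to the Surface.
    void decode(byte[] frame, long ptsUs) {
        int inIndex = codec.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            ByteBuffer in = codec.getInputBuffers()[inIndex];
            in.clear();
            in.put(frame);
            codec.queueInputBuffer(inIndex, 0, frame.length, ptsUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = codec.dequeueOutputBuffer(info, 10000);
        if (outIndex >= 0) {
            codec.releaseOutputBuffer(outIndex, true);  // true = render; no need to copy YUV out
        }
    }
}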
Re: Question about getting image data

Could you post the code where you decode the H264 stream into YUV? I've been working on the same thing recently.
Re: Question about getting image data

In the end I used FFmpeg software decoding; the approach is probably the same as the decoding inside the official SDK. Here are the key fragments.

FFmpeg initialization in JNI:
JNIEXPORT void Java_org_opencv_samples_tutorial2_H264Decoder_nativeInit(JNIEnv* env, jobject thiz, jint color_format) {
  DecoderContext *ctx = calloc(1, sizeof(DecoderContext));
  D("Creating native H264 decoder context");
  switch (color_format) {
  case COLOR_FORMAT_YUV420:
    ctx->color_format = PIX_FMT_YUV420P;
    break;
  case COLOR_FORMAT_RGB565LE:
    ctx->color_format = PIX_FMT_RGB565LE;
    break;
  case COLOR_FORMAT_BGR32:
    ctx->color_format = PIX_FMT_BGR32;
    break;
  }
  int res = mkfifo(FIFO_NAME, 0777);
  if (res != 0)
      E("Could not create fifo %s, error code: %d\n", FIFO_NAME, errno);
  ctx->videobuff = (unsigned char *)av_mallocz(32768);
  ctx->pb = avio_alloc_context(ctx->videobuff, 32768, 0, NULL, read_video_data, NULL, NULL);
  ctx->ic = avformat_alloc_context();
  ctx->ic->pb = ctx->pb;
  res = avformat_open_input(&ctx->ic, "noting", NULL, NULL);
  if (res < 0) {
      E("avformat open failed, err code: %d.\n", res);
  } else {
      E("avformat_open_input success!\n");
  }
  ctx->codec = avcodec_find_decoder(CODEC_ID_H264);
  ctx->codec_ctx = avcodec_alloc_context3(ctx->codec);
  ctx->codec_ctx->pix_fmt = PIX_FMT_YUV420P;
  ctx->codec_ctx->flags2 |= CODEC_FLAG2_CHUNKS;
  ctx->src_frame = avcodec_alloc_frame();
  ctx->dst_frame = avcodec_alloc_frame();
  avcodec_open2(ctx->codec_ctx, ctx->codec, NULL);
  set_ctx(env, thiz, ctx);
  g_frame_ready = &ctx->frame_ready;
  E("H264 decoder init over!");
}
JNIEXPORT jint Java_org_opencv_samples_tutorial2_H264Decoder_decodeVideo(JNIEnv* env, jobject thiz) {
  DecoderContext *ctx = get_ctx(env, thiz);
  int res = -1;
  AVPacket packet;
  av_init_packet(&packet);
  if (av_read_frame(ctx->ic, &packet) >= 0) {
      // D("av_read_frame success!");
      int frameFinished = 0;
      res = avcodec_decode_video2(ctx->codec_ctx, ctx->src_frame, &frameFinished, &packet);
      if (frameFinished)
      {
          ctx->frame_ready = 1;
      }
      av_free_packet(&packet);
  }
  return res;
}
Getting one decoded frame:
JNIEXPORT jlong Java_org_opencv_samples_tutorial2_H264Decoder_getFrame(JNIEnv* env, jobject thiz, jbyteArray out_buffer, jint size) {
  DecoderContext *ctx = get_ctx(env, thiz);
  if (!ctx->frame_ready)
    return -1;
  jbyte *out_buf = (*env)->GetByteArrayElements(env, out_buffer, 0);
  long out_buf_len = size;
  int pic_buf_size = avpicture_get_size(ctx->color_format, ctx->codec_ctx->width, ctx->codec_ctx->height);
  if (out_buf_len < pic_buf_size) {
    D("Input buffer too small, pic_buf_size = %d", pic_buf_size);
    return -1;
  }
  if (ctx->color_format == COLOR_FORMAT_YUV420) {
    memcpy(ctx->src_frame->data, out_buffer, pic_buf_size);
  } else {
    if (ctx->convert_ctx == NULL) {
      ctx->convert_ctx = sws_getContext(ctx->codec_ctx->width, ctx->codec_ctx->height, ctx->codec_ctx->pix_fmt,
          ctx->codec_ctx->width, ctx->codec_ctx->height, ctx->color_format, SWS_FAST_BILINEAR, NULL, NULL, NULL);
    }
    avpicture_fill((AVPicture*)ctx->dst_frame, (uint8_t*)out_buf, ctx->color_format, ctx->codec_ctx->width,
        ctx->codec_ctx->height);
    sws_scale(ctx->convert_ctx, (const uint8_t**)ctx->src_frame->data, ctx->src_frame->linesize, 0, ctx->codec_ctx->height,
        ctx->dst_frame->data, ctx->dst_frame->linesize);
  }
  // E("send vsync");
  memcpy(s_pixels, out_buf, pic_buf_size);
  send_vsync();
  ctx->frame_ready = 0;
  // if (ctx->src_frame->pkt_pts == AV_NOPTS_VALUE) {
  //   D("No PTS was passed from avcodec_decode!");
  // }
  (*env)->ReleaseByteArrayElements(env, out_buffer, out_buf, 0);
  return ctx->src_frame->pkt_pts;
}
This is the method that gets called from onResult:
JNIEXPORT void JNICALL Java_org_opencv_samples_tutorial2_H264Decoder_setDataToDecoder
  (JNIEnv * env, jobject thiz, jbyteArray array, jint size)
{
  jbyte *m_temp = (*env)->GetByteArrayElements(env, array, 0);
  if (g_write_fd < 0)
  {
      g_write_fd = open(FIFO_NAME, O_RDWR);
      if (g_write_fd < 0)
          return;
  }
  write(g_write_fd, (char *)m_temp, size);
  (*env)->ReleaseByteArrayElements(env, array, m_temp, 0);
}
And this is the custom read callback FFmpeg uses to pull data:
int read_video_data(void *opaque, uint8_t *buf, int buf_size) {
    // E("buf_size = %d", buf_size);
    if (g_read_fd < 0)
    {
        g_read_fd = open(FIFO_NAME, O_RDWR);
        if (g_read_fd < 0)
            return 0;
    }
    int ret = read(g_read_fd, buf, buf_size);
    if (ret < 0)
        return 0;
    else
        return ret;
}
I used a named pipe (FIFO) for the handoff between the SDK callback and FFmpeg; other mechanisms would work too.
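For context, the Java side these JNI functions plug into would look roughly like the class below. It only declares the natives implemented above; the library name and the way the loop is driven are my own assumptions, sketched for illustration.

package org.opencv.samples.tutorial2;

// Java counterpart of the JNI functions above; the usage shown is only a sketch.
public class H264Decoder {
    static { System.loadLibrary("h264decoder"); }   // library name is an assumption

    public static final int COLOR_FORMAT_YUV420 = 0;

    public native void nativeInit(int colorFormat);
    public native void setDataToDecoder(byte[] data, int size);  // feed raw stream chunks (written to the FIFO)
    public native int decodeVideo();                             // pull one packet through avcodec
    public native long getFrame(byte[] outBuffer, int size);     // copy out one decoded frame, returns its PTS

    // Feed chunks from the video-data callback:
    public void onVideoData(byte[] chunk, int size) {
        setDataToDecoder(chunk, size);
    }

    // Poll for decoded frames on a separate thread:
    public void decodeLoop(byte[] frameBuffer) {
        while (!Thread.interrupted()) {
            if (decodeVideo() >= 0) {
                long pts = getFrame(frameBuffer, frameBuffer.length);
                // frameBuffer now holds YUV420 data; convert or render as needed.
            }
        }
    }
}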
Re: Question about getting image data

Hi, I finally got FFmpeg compiled and have started trying your code. Could you post the DecoderContext struct behind DecoderContext *ctx? Thanks! Also, about this line in the init code:
ctx->pb = avio_alloc_context(ctx->videobuff, 32768, 0, NULL, read_video_data, NULL, NULL);
can read_video_data really be used like that, without passing any arguments? My C is not great, sorry for the basic question.
Re: Question about getting image data

I haven't been here for a while and only just saw your reply — hopefully this is still useful.

typedef struct DecoderContext {
  int color_format;
  struct AVCodec *codec;
  struct AVCodecContext *codec_ctx;
  struct AVFrame *src_frame;
  struct AVFrame *dst_frame;
  struct SwsContext *convert_ctx;
  AVIOContext *pb;
  AVFormatContext *ic;
  unsigned char *videobuff;
  int fifo_read_fd;   /* the two fifo_* names were truncated in the original post; these are plausible reconstructions */
  int fifo_write_fd;
  int frame_ready;
} DecoderContext;

As for read_video_data: that argument is a function pointer. FFmpeg stores it and invokes it later as a callback whenever it needs more input data.
Re: Question about getting image data

A couple of questions: this decoding setup targets Android — if I wanted to decode, store and display the H.264 stream on a PC, would the approach be similar? From what I've read, DirectShow can do something like this, or open-source players such as VLC. Also, a more basic question: to process and display the video, do you always have to software-decode it into raw frames with something like FFmpeg before an open-source player such as VLC can store or play it?
Re: Question about getting image data

Hi, so your decoding code goes straight into a .cpp file and you just call it from the main program — is that right?
For a project I needed to decode and display a video stream on Android. After weighing the options I settled on FFmpeg for decoding and OpenGL ES for rendering the video.
With the approach fixed I started on a demo, and quickly found that the material online was not very reliable: the JNI-based OpenGL examples don't render YUV data directly — they all convert YUV to RGB first and display that — and the implementations that do exist are written in the Java layer. I don't come from a Java background, so that didn't appeal to me, and I decided to implement it myself through JNI. I had previously built a product on WebRTC's C++ interfaces (WebRTC is now geared towards browser development, and its interfaces are more mature and simpler; I still find it more interesting to dig out the C++ code and build your own interface layer — that is how that project was done). Enough preamble — here are the implementation steps.
Note: OpenGL ES 2.0 support like this requires at least Android 2.3.3.
When writing the JNI code you need to add the OpenGL library to the link line in Android.mk. Here is my Android.mk for reference:
LOCAL_PATH := $(call my-dir)
MY_LIBS_PATH := /Users/chenjianjun/Documents/work/ffmpeg-android/build/lib
MY_INCLUDE_PATH := /Users/chenjianjun/Documents/work/ffmpeg-android/build/include
include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES :=
$(MY_LIBS_PATH)/libavcodec.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libavfilter
LOCAL_SRC_FILES :=
$(MY_LIBS_PATH)/libavfilter.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libavformat
LOCAL_SRC_FILES :=
$(MY_LIBS_PATH)/libavformat.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libavresample
LOCAL_SRC_FILES :=
$(MY_LIBS_PATH)/libavresample.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libavutil
LOCAL_SRC_FILES :=
$(MY_LIBS_PATH)/libavutil.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libpostproc
LOCAL_SRC_FILES :=
$(MY_LIBS_PATH)/libpostproc.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libswresample
LOCAL_SRC_FILES :=
$(MY_LIBS_PATH)/libswresample.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := libswscale
LOCAL_SRC_FILES :=
$(MY_LIBS_PATH)/libswscale.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE_TAGS := MICloudPub
LOCAL_MODULE := libMICloudPub
LOCAL_SRC_FILES := H264Decoder.cpp \
# my H264 decoding wrapper on top of FFmpeg
render_opengles20.cpp \
# the OpenGL rendering code
# test interface code
LOCAL_CFLAGS :=
LOCAL_C_INCLUDES := $(MY_INCLUDE_PATH)
LOCAL_CPP_INCLUDES := $(MY_INCLUDE_PATH)
LOCAL_LDLIBS := \
-lGLESv2
LOCAL_WHOLE_STATIC_LIBRARIES := \
libavcodec \
libavfilter \
libavformat \
libavresample \
libavutil \
libpostproc \
libswresample \
libswscale
include $(BUILD_SHARED_LIBRARY)
The -lGLESv2 line above (highlighted in red in the original post) is the OpenGL ES 2.0 library. I compiled this on a Mac; I'd expect the library name to be the same on other systems.
Next, write the Java code — mainly so that the JNI code can call back into Java; the reason for this will become clear later.
I took the code below out of WebRTC and modified it slightly rather than reinventing the wheel.

ViEAndroidGLES20.java
package hzcw.opengl;

import java.util.concurrent.locks.ReentrantLock;

import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.egl.EGLContext;
import javax.microedition.khronos.egl.EGLDisplay;
import javax.microedition.khronos.opengles.GL10;

import android.app.ActivityManager;
import android.content.Context;
import android.content.pm.ConfigurationInfo;
import android.graphics.PixelFormat;
import android.opengl.GLSurfaceView;
import android.util.Log;
public class ViEAndroidGLES20 extends GLSurfaceView implements GLSurfaceView.Renderer {
private static String TAG = "MICloudPub";
private static final boolean DEBUG = false;
// True if onSurfaceCreated has been called.
private boolean surfaceCreated = false;
private boolean openGLCreated = false;
// True if NativeFunctionsRegistered has been called.
private boolean nativeFunctionsRegisted = false;
private ReentrantLock nativeFunctionLock = new ReentrantLock();
// Address of Native object that will do the drawing.
private long nativeObject = 0;
private int viewWidth = 0;
private int viewHeight = 0;
public static boolean UseOpenGL2(Object renderWindow) {
return ViEAndroidGLES20.class.isInstance(renderWindow);
public ViEAndroidGLES20(Context context) {
super(context);
init(false, 0, 0);
public ViEAndroidGLES20(Context context, boolean translucent,
int depth, int stencil) {
super(context);
init(translucent, depth, stencil);
private void init(boolean translucent, int depth, int stencil) {
// By default, GLSurfaceView() creates a RGB_565 opaque surface.
// If we want a translucent one, we should change the surface&#39;s
// format here, using PixelFormat.TRANSLUCENT for GL Surfaces
// is interpreted as any 32-bit surface with alpha by SurfaceFlinger.
if (translucent) {
this.getHolder().setFormat(PixelFormat.TRANSLUCENT);
// Setup the context factory for 2.0 rendering.
// See ContextFactory class definition below
setEGLContextFactory(new ContextFactory());
// We need to choose an EGLConfig that matches the format of
// our surface exactly. This is going to be done in our
// custom config chooser. See ConfigChooser class definition
setEGLConfigChooser( translucent ?
new ConfigChooser(8, 8, 8, 8, depth, stencil) :
new ConfigChooser(5, 6, 5, 0, depth, stencil) );
// Set the renderer responsible for frame rendering
this.setRenderer(this);
this.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
private static class ContextFactory implements GLSurfaceView.EGLContextFactory {
private static int EGL_CONTEXT_CLIENT_VERSION = 0x3098;
public EGLContext createContext(EGL10 egl, EGLDisplay display, EGLConfig eglConfig) {
Log.w(TAG, &creating OpenGL ES 2.0 context&);
checkEglError(&Before eglCreateContext&, egl);
int[] attrib_list = {EGL_CONTEXT_CLIENT_VERSION, 2, EGL10.EGL_NONE };
EGLContext context = egl.eglCreateContext(display, eglConfig,
EGL10.EGL_NO_CONTEXT, attrib_list);
checkEglError(&After eglCreateContext&, egl);
public void destroyContext(EGL10 egl, EGLDisplay display, EGLContext context) {
egl.eglDestroyContext(display, context);
private static void checkEglError(String prompt, EGL10 egl) {
while ((error = egl.eglGetError()) != EGL10.EGL_SUCCESS) {
Log.e(TAG, String.format(&%s: EGL error: 0x%x&, prompt, error));
private static class ConfigChooser implements GLSurfaceView.EGLConfigChooser {
public ConfigChooser(int r, int g, int b, int a, int depth, int stencil) {
mRedSize = r;
mGreenSize = g;
mBlueSize = b;
mAlphaSize = a;
mDepthSize = depth;
mStencilSize = stencil;
}
// This EGL config specification is used to specify 2.0 rendering.
// We use a minimum size of 4 bits for red/green/blue, but will
// perform actual matching in chooseConfig() below.
private static int EGL_OPENGL_ES2_BIT = 4;
private static int[] s_configAttribs2 =
EGL10.EGL_RED_SIZE, 4,
EGL10.EGL_GREEN_SIZE, 4,
EGL10.EGL_BLUE_SIZE, 4,
EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
EGL10.EGL_NONE
public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display) {
// Get the number of minimally matching EGL configurations
int[] num_config = new int[1];
egl.eglChooseConfig(display, s_configAttribs2, null, 0, num_config);
int numConfigs = num_config[0];
if (numConfigs &= 0) {
throw new IllegalArgumentException(&No configs match configSpec&);
// Allocate then read the array of minimally matching EGL configs
EGLConfig[] configs = new EGLConfig[numConfigs];
egl.eglChooseConfig(display, s_configAttribs2, configs, numConfigs, num_config);
if (DEBUG) {
printConfigs(egl, display, configs);
// Now return the &best& one
return chooseConfig(egl, display, configs);
public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display,
EGLConfig[] configs) {
for(EGLConfig config : configs) {
int d = findConfigAttrib(egl, display, config,
EGL10.EGL_DEPTH_SIZE, 0);
int s = findConfigAttrib(egl, display, config,
EGL10.EGL_STENCIL_SIZE, 0);
// We need at least mDepthSize and mStencilSize bits
if (d & mDepthSize || s & mStencilSize)
// We want an *exact* match for red/green/blue/alpha
int r = findConfigAttrib(egl, display, config,
EGL10.EGL_RED_SIZE, 0);
int g = findConfigAttrib(egl, display, config,
EGL10.EGL_GREEN_SIZE, 0);
int b = findConfigAttrib(egl, display, config,
EGL10.EGL_BLUE_SIZE, 0);
int a = findConfigAttrib(egl, display, config,
EGL10.EGL_ALPHA_SIZE, 0);
if (r == mRedSize && g == mGreenSize && b == mBlueSize && a == mAlphaSize)
private int findConfigAttrib(EGL10 egl, EGLDisplay display,
EGLConfig config, int attribute, int defaultValue) {
if (egl.eglGetConfigAttrib(display, config, attribute, mValue)) {
return mValue[0];
return defaultV
private void printConfigs(EGL10 egl, EGLDisplay display,
EGLConfig[] configs) {
int numConfigs = configs.
Log.w(TAG, String.format(&%d configurations&, numConfigs));
for (int i = 0; i & numC i++) {
Log.w(TAG, String.format(&Configuration %d:\n&, i));
printConfig(egl, display, configs[i]);
private void printConfig(EGL10 egl, EGLDisplay display,
EGLConfig config) {
int[] attributes = {
EGL10.EGL_BUFFER_SIZE,
EGL10.EGL_ALPHA_SIZE,
EGL10.EGL_BLUE_SIZE,
EGL10.EGL_GREEN_SIZE,
EGL10.EGL_RED_SIZE,
EGL10.EGL_DEPTH_SIZE,
EGL10.EGL_STENCIL_SIZE,
EGL10.EGL_CONFIG_CAVEAT,
EGL10.EGL_CONFIG_ID,
EGL10.EGL_LEVEL,
EGL10.EGL_MAX_PBUFFER_HEIGHT,
EGL10.EGL_MAX_PBUFFER_PIXELS,
EGL10.EGL_MAX_PBUFFER_WIDTH,
EGL10.EGL_NATIVE_RENDERABLE,
EGL10.EGL_NATIVE_VISUAL_ID,
EGL10.EGL_NATIVE_VISUAL_TYPE,
0x3030, // EGL10.EGL_PRESERVED_RESOURCES,
EGL10.EGL_SAMPLES,
EGL10.EGL_SAMPLE_BUFFERS,
EGL10.EGL_SURFACE_TYPE,
EGL10.EGL_TRANSPARENT_TYPE,
EGL10.EGL_TRANSPARENT_RED_VALUE,
EGL10.EGL_TRANSPARENT_GREEN_VALUE,
EGL10.EGL_TRANSPARENT_BLUE_VALUE,
0x3039, // EGL10.EGL_BIND_TO_TEXTURE_RGB,
0x303A, // EGL10.EGL_BIND_TO_TEXTURE_RGBA,
0x303B, // EGL10.EGL_MIN_SWAP_INTERVAL,
0x303C, // EGL10.EGL_MAX_SWAP_INTERVAL,
EGL10.EGL_LUMINANCE_SIZE,
EGL10.EGL_ALPHA_MASK_SIZE,
EGL10.EGL_COLOR_BUFFER_TYPE,
EGL10.EGL_RENDERABLE_TYPE,
0x3042 // EGL10.EGL_CONFORMANT
String[] names = {
&EGL_BUFFER_SIZE&,
&EGL_ALPHA_SIZE&,
&EGL_BLUE_SIZE&,
&EGL_GREEN_SIZE&,
&EGL_RED_SIZE&,
&EGL_DEPTH_SIZE&,
&EGL_STENCIL_SIZE&,
&EGL_CONFIG_CAVEAT&,
&EGL_CONFIG_ID&,
&EGL_LEVEL&,
&EGL_MAX_PBUFFER_HEIGHT&,
&EGL_MAX_PBUFFER_PIXELS&,
&EGL_MAX_PBUFFER_WIDTH&,
&EGL_NATIVE_RENDERABLE&,
&EGL_NATIVE_VISUAL_ID&,
&EGL_NATIVE_VISUAL_TYPE&,
&EGL_PRESERVED_RESOURCES&,
&EGL_SAMPLES&,
&EGL_SAMPLE_BUFFERS&,
&EGL_SURFACE_TYPE&,
&EGL_TRANSPARENT_TYPE&,
&EGL_TRANSPARENT_RED_VALUE&,
&EGL_TRANSPARENT_GREEN_VALUE&,
&EGL_TRANSPARENT_BLUE_VALUE&,
&EGL_BIND_TO_TEXTURE_RGB&,
&EGL_BIND_TO_TEXTURE_RGBA&,
&EGL_MIN_SWAP_INTERVAL&,
&EGL_MAX_SWAP_INTERVAL&,
&EGL_LUMINANCE_SIZE&,
&EGL_ALPHA_MASK_SIZE&,
&EGL_COLOR_BUFFER_TYPE&,
&EGL_RENDERABLE_TYPE&,
&EGL_CONFORMANT&
int[] value = new int[1];
for (int i = 0; i & attributes. i++) {
int attribute = attributes[i];
String name = names[i];
if (egl.eglGetConfigAttrib(display, config, attribute, value)) {
Log.w(TAG, String.format(&
%s: %d\n&, name, value[0]));
// Log.w(TAG, String.format(&
%s: failed\n&, name));
while (egl.eglGetError() != EGL10.EGL_SUCCESS);
// Subclasses can adjust these values:
protected int mRedS
protected int mGreenS
protected int mBlueS
protected int mAlphaS
protected int mDepthS
protected int mStencilS
private int[] mValue = new int[1];
// IsSupported
// Return true if this device support Open GL ES 2.0 rendering.
public static boolean IsSupported(Context context) {
ActivityManager am =
(ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
ConfigurationInfo info = am.getDeviceConfigurationInfo();
if (info.reqGlEsVersion >= 0x20000) {
// Open GL ES 2.0 is supported.
return true;
}
return false;
}
public void onDrawFrame(GL10 gl) {
nativeFunctionLock.lock();
if(!nativeFunctionsRegisted || !surfaceCreated) {
nativeFunctionLock.unlock();
if(!openGLCreated) {
if(0 != CreateOpenGLNative(nativeObject, viewWidth, viewHeight)) {
// Failed to create OpenGL
openGLCreated = // Created OpenGL successfully
DrawNative(nativeObject); // Draw the new frame
nativeFunctionLock.unlock();
public void onSurfaceChanged(GL10 gl, int width, int height) {
surfaceCreated =
viewWidth =
viewHeight =
nativeFunctionLock.lock();
if(nativeFunctionsRegisted) {
if(CreateOpenGLNative(nativeObject,width,height) == 0)
openGLCreated =
nativeFunctionLock.unlock();
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
public void RegisterNativeObject(long nativeObject) {
nativeFunctionLock.lock();
this.nativeObject = nativeO
nativeFunctionsRegisted =
nativeFunctionLock.unlock();
public void DeRegisterNativeObject() {
nativeFunctionLock.lock();
nativeFunctionsRegisted =
openGLCreated =
this.nativeObject = 0;
nativeFunctionLock.unlock();
public void ReDraw() { // called from JNI once a frame has been decoded; the system then calls onDrawFrame to display it
if(surfaceCreated) {
// Request the renderer to redraw using the render thread context.
this.requestRender();
private native int CreateOpenGLNative(long nativeObject, int width, int height);
private native void DrawNative(long nativeObject);
ViERenderer.java

package hzcw.opengl;
import android.content.C
import android.view.SurfaceV
public class ViERenderer
{
public static SurfaceView CreateRenderer(Context context) {
return CreateRenderer(context, false);
}
public static SurfaceView CreateRenderer(Context context,
boolean useOpenGLES2) {
if (useOpenGLES2 == true && ViEAndroidGLES20.IsSupported(context))
return new ViEAndroidGLES20(context);
return null;
}
}
GL2JNILib.java (the native interface class)
package com.example.filltriangle;

public class GL2JNILib {
static {
System.loadLibrary("MICloudPub");
}
public static native void init(Object glSurface);
public static native void step(String filepath);
}
Step two: write the JNI code.
com_example_filltriangle_GL2JNILib.h (generated by javah):

/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
/* Header for class com_example_filltriangle_GL2JNILib */

#ifndef _Included_com_example_filltriangle_GL2JNILib
#define _Included_com_example_filltriangle_GL2JNILib
#ifdef __cplusplus
extern "C" {
#endif
/*
 * Class:     com_example_filltriangle_GL2JNILib
 * Method:    init
 */
JNIEXPORT void JNICALL Java_com_example_filltriangle_GL2JNILib_init
  (JNIEnv *, jclass, jobject);
/*
 * Class:     com_example_filltriangle_GL2JNILib
 * Method:    step
 */
JNIEXPORT void JNICALL Java_com_example_filltriangle_GL2JNILib_step
  (JNIEnv *, jclass, jstring);
#ifdef __cplusplus
}
#endif
#endif

The JNI implementation file starts with these includes:

#include <jni.h>
#include <stdlib.h>
#include <stdio.h>
#include "render_opengles20.h"
#include "com_example_filltriangle_GL2JNILib.h"
#include "H264Decoder.h"
class AndroidNativeOpenGl2Channel
AndroidNativeOpenGl2Channel(JavaVM* jvm,
void* window)
_ptrWindow =
_buffer = (uint8_t*)malloc(1024000);
~AndroidNativeOpenGl2Channel()
bool isAttached =
JNIEnv* env = NULL;
if (_jvm-&GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK) {
// try to attach the thread and get the env
// Attach this thread to JVM
jint res = _jvm-&AttachCurrentThread(&env, NULL);
// Get the JNI env for this thread
if ((res & 0) || !env) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: Could not attach thread to JVM (%d, %p)&,
__FUNCTION__, res, env);
env = NULL;
isAttached =
if (env && _deRegisterNativeCID) {
env-&CallVoidMethod(_javaRenderObj, _deRegisterNativeCID);
env-&DeleteGlobalRef(_javaRenderObj);
env-&DeleteGlobalRef(_javaRenderClass);
if (isAttached) {
if (_jvm-&DetachCurrentThread() & 0) {
WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
&%s: Could not detach thread from JVM&,
__FUNCTION__);
free(_buffer);
int32_t Init()
if (!_ptrWindow)
WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
&(%s): No window have been provided.&, __FUNCTION__);
return -1;
if (!_jvm)
WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
&(%s): No JavaVM have been provided.&, __FUNCTION__);
return -1;
// get the JNI env for this thread
bool isAttached =
JNIEnv* env = NULL;
if (_jvm-&GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK) {
// try to attach the thread and get the env
// Attach this thread to JVM
jint res = _jvm-&AttachCurrentThread(&env, NULL);
// Get the JNI env for this thread
if ((res & 0) || !env) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: Could not attach thread to JVM (%d, %p)&,
__FUNCTION__, res, env);
return -1;
isAttached =
// get the ViEAndroidGLES20 class
jclass javaRenderClassLocal = reinterpret_cast&jclass& (env-&FindClass(&hzcw/opengl/ViEAndroidGLES20&));
if (!javaRenderClassLocal) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: could not find ViEAndroidGLES20&, __FUNCTION__);
return -1;
_javaRenderClass = reinterpret_cast&jclass& (env-&NewGlobalRef(javaRenderClassLocal));
if (!_javaRenderClass) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: could not create Java SurfaceHolder class reference&,
__FUNCTION__);
return -1;
// Delete local class ref, we only use the global ref
env-&DeleteLocalRef(javaRenderClassLocal);
jmethodID cidUseOpenGL = env-&GetStaticMethodID(_javaRenderClass,
&UseOpenGL2&,
&(Ljava/lang/O)Z&);
if (cidUseOpenGL == NULL) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, -1,
&%s: could not get UseOpenGL ID&, __FUNCTION__);
jboolean res = env-&CallStaticBooleanMethod(_javaRenderClass,
cidUseOpenGL, (jobject) _ptrWindow);
// create a reference to the object (to tell JNI that we are referencing it
// after this function has returned)
_javaRenderObj = reinterpret_cast&jobject& (env-&NewGlobalRef((jobject)_ptrWindow));
if (!_javaRenderObj)
WEBRTC_TRACE(
kTraceError,
kTraceVideoRenderer,
&%s: could not create Java SurfaceRender object reference&,
__FUNCTION__);
return -1;
// get the method ID for the ReDraw function
_redrawCid = env-&GetMethodID(_javaRenderClass, &ReDraw&, &()V&);
if (_redrawCid == NULL) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: could not get ReDraw ID&, __FUNCTION__);
return -1;
_registerNativeCID = env-&GetMethodID(_javaRenderClass,
&RegisterNativeObject&, &(J)V&);
if (_registerNativeCID == NULL) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: could not get RegisterNativeObject ID&, __FUNCTION__);
return -1;
_deRegisterNativeCID = env-&GetMethodID(_javaRenderClass,
&DeRegisterNativeObject&, &()V&);
if (_deRegisterNativeCID == NULL) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: could not get DeRegisterNativeObject ID&,
__FUNCTION__);
return -1;
JNINativeMethod nativeFunctions[2] = {
{ &DrawNative&,
(void*) &AndroidNativeOpenGl2Channel::DrawNativeStatic, },
{ &CreateOpenGLNative&,
(void*) &AndroidNativeOpenGl2Channel::CreateOpenGLNativeStatic },
if (env-&RegisterNatives(_javaRenderClass, nativeFunctions, 2) == 0) {
WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, -1,
&%s: Registered native functions&, __FUNCTION__);
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, -1,
&%s: Failed to register native functions&, __FUNCTION__);
return -1;
env-&CallVoidMethod(_javaRenderObj, _registerNativeCID, (jlong) this);
if (isAttached) {
if (_jvm-&DetachCurrentThread() & 0) {
WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
&%s: Could not detach thread from JVM&, __FUNCTION__);
WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id, &%s done&,
__FUNCTION__);
if (_openGLRenderer.SetCoordinates(zOrder, left, top, right, bottom) != 0) {
return -1;
void DeliverFrame(int32_t widht, int32_t height)
bool isAttached =
JNIEnv* env = NULL;
if (_jvm-&GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK) {
// try to attach the thread and get the env
// Attach this thread to JVM
jint res = _jvm-&AttachCurrentThread(&env, NULL);
// Get the JNI env for this thread
if ((res & 0) || !env) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: Could not attach thread to JVM (%d, %p)&,
__FUNCTION__, res, env);
env = NULL;
isAttached =
if (env && _redrawCid)
env-&CallVoidMethod(_javaRenderObj, _redrawCid);
if (isAttached) {
if (_jvm-&DetachCurrentThread() & 0) {
WEBRTC_TRACE(kTraceWarning, kTraceVideoRenderer, _id,
&%s: Could not detach thread from JVM&,
__FUNCTION__);
void GetDataBuf(uint8_t*& pbuf, int32_t& isize)
isize = 1024000;
static jint CreateOpenGLNativeStatic(JNIEnv * env,
jlong context,
jint width,
jint height)
AndroidNativeOpenGl2Channel* renderChannel =
reinterpret_cast&AndroidNativeOpenGl2Channel*& (context);
WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, &%s:&, __FUNCTION__);
return renderChannel-&CreateOpenGLNative(width, height);
static void DrawNativeStatic(JNIEnv * env,jobject, jlong context)
AndroidNativeOpenGl2Channel* renderChannel =
reinterpret_cast&AndroidNativeOpenGl2Channel*&(context);
renderChannel-&DrawNative();
jint CreateOpenGLNative(int width, int height)
return _openGLRenderer.Setup(width, height);
void DrawNative()
_openGLRenderer.Render(_buffer, _widht, _height);
void* _ptrW
jobject _javaRenderO
jclass _javaRenderC
JNIEnv* _javaRenderJniE
_registerNativeCID;
_deRegisterNativeCID;
RenderOpenGles20 _openGLR
uint8_t* _
static JavaVM* g_jvm = NULL;
static AndroidNativeOpenGl2Channel* p_opengl_channel = NULL;
extern &C&
JNIEXPORT jint JNI_OnLoad(JavaVM* vm, void *reserved)
JNIEnv* env = NULL;
jint result = -1;
if (vm-&GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK)
return -1;
return JNI_VERSION_1_4;
extern &C&
int mTrans = 0x0F0F0F0F;
// Split the incoming byte stream into NAL units.
int MergeBuffer(uint8_t *NalBuf, int NalBufUsed, uint8_t *SockBuf, int SockBufUsed, int SockRemain)
{
    uint8_t Temp;
    int i = 0;
    for (i = 0; i < SockRemain; i++) {
        Temp = SockBuf[i + SockBufUsed];
        NalBuf[i + NalBufUsed] = Temp;
        mTrans <<= 8;
        mTrans |= Temp;
        if (mTrans == 1) { // found a start code
            i++;
            break;
        }
    }
    return i;
}
JNIEXPORT void JNICALL Java_com_example_filltriangle_GL2JNILib_init
(JNIEnv *env, jclass oclass, jobject glSurface)
{
    if (p_opengl_channel)
        WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "Initialization failed [%d].", __LINE__);
    p_opengl_channel = new AndroidNativeOpenGl2Channel(g_jvm, glSurface);
    if (p_opengl_channel->Init() != 0)
        WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "Initialization failed [%d].", __LINE__);
}
JNIEXPORT void JNICALL Java_com_example_filltriangle_GL2JNILib_step(JNIEnv* env, jclass tis, jstring filepath)
const char *filename = env-&GetStringUTFChars(filepath, NULL);
WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, &step[%d].&, __LINE__);
FILE *_imgFileHandle =
fopen(filename, &rb&);
if (_imgFileHandle == NULL)
WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, &File No Exist[%s][%d].&, filename, __LINE__);
H264Decoder* pMyH264 = new H264Decoder();
X264_DECODER_H handle = pMyH264-&X264Decoder_Init();
if (handle &= 0)
WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, &X264Decoder_Init Error[%d].&, __LINE__);
int iTemp = 0;
int bytesRead = 0;
int NalBufUsed = 0;
int SockBufUsed = 0;
bool bFirst =
bool bFindPPS =
uint8_t *SockBuf = (uint8_t *)malloc(204800);
uint8_t *NalBuf = (uint8_t *)malloc(4098000);
int nWidth, nH
memset(SockBuf, 0, 204800);
uint8_t *buffOut = NULL;
int outSize = 0;
p_opengl_channel-&GetDataBuf(buffOut, outSize);
uint8_t *IIBuf = (uint8_t *)malloc(204800);
int IILen = 0;
bytesRead = fread(SockBuf, 1, 204800, _imgFileHandle);
WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, &bytesRead
= %d&, bytesRead);
if (bytesRead &= 0) {
SockBufUsed = 0;
while (bytesRead - SockBufUsed & 0) {
nalLen = MergeBuffer(NalBuf, NalBufUsed, SockBuf, SockBufUsed,
bytesRead - SockBufUsed);
NalBufUsed += nalL
SockBufUsed += nalL
while (mTrans == 1) {
mTrans = 0xFFFFFFFF;
if (bFirst == true) // the first start flag
else // a complete NAL data, include 0x trail.
if (bFindPPS == true) // true
if ((NalBuf[4] & 0x1F) == 7 || (NalBuf[4] & 0x1F) == 8)
bFindPPS =
NalBuf[0] = 0;
NalBuf[1] = 0;
NalBuf[2] = 0;
NalBuf[3] = 1;
NalBufUsed = 4;
if (NalBufUsed == 16 || NalBufUsed == 10 || NalBufUsed == 54 || NalBufUsed == 12 || NalBufUsed == 20) {
memcpy(IIBuf + IILen, NalBuf, NalBufUsed);
IILen += NalBufU
memcpy(IIBuf + IILen, NalBuf, NalBufUsed);
IILen += NalBufU
// decode nal
iTemp = pMyH264-&X264Decoder_Decode(handle, (uint8_t *)IIBuf,
IILen, (uint8_t *)buffOut,
outSize, &nWidth, &nHeight);
if (iTemp == 0) {
    WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "Decode OK, width: %d, height: %d, decoded bytes: %d.", nWidth, nHeight, iTemp);
    // iOS equivalent (not used in this JNI build):
    // [self.glView setVideoSize:nWidth height:nHeight];
    // [self.glView displayYUV420pData:buffOut width:nWidth height:nHeight];
    p_opengl_channel->DeliverFrame(nWidth, nHeight);
} else {
    WEBRTC_TRACE(kTraceInfo, kTraceVideoRenderer, -1, "Decode failed.");
}
IILen = 0;
NalBuf[0]=0;
NalBuf[1]=0;
NalBuf[2]=0;
NalBuf[3]=1;
NalBufUsed=4;
}while (bytesRead&0);
fclose(_imgFileHandle);
pMyH264-&X264Decoder_UnInit(handle);
free(SockBuf);
free(NalBuf);
delete pMyH264;
env-&ReleaseStringUTFChars(filepath, filename);
render_opengles20.cpp
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <stdio.h>
#include <stdlib.h>
#include "render_opengles20.h"
const char RenderOpenGles20::g_indices[] = { 0, 3, 2, 0, 2, 1 };
const char RenderOpenGles20::g_vertextShader[] = {
  "attribute vec4 aPosition;\n"
  "attribute vec2 aTextureCoord;\n"
  "varying vec2 vTextureCoord;\n"
  "void main() {\n"
  "  gl_Position = aPosition;\n"
  "  vTextureCoord = aTextureCoord;\n"
  "}\n" };

// The fragment shader.
// Do YUV to RGB565 conversion.
const char RenderOpenGles20::g_fragmentShader[] = {
  "precision mediump float;\n"
  "uniform sampler2D Ytex;\n"
  "uniform sampler2D Utex,Vtex;\n"
  "varying vec2 vTextureCoord;\n"
  "void main(void) {\n"
  "  float nx,ny,r,g,b,y,u,v;\n"
  "  mediump vec4 txl,ux,vx;\n"
  "  nx=vTextureCoord[0];\n"
  "  ny=vTextureCoord[1];\n"
  "  y=texture2D(Ytex,vec2(nx,ny)).r;\n"
  "  u=texture2D(Utex,vec2(nx,ny)).r;\n"
  "  v=texture2D(Vtex,vec2(nx,ny)).r;\n"
  "  y=1.1643*(y-0.0625);\n"
  "  u=u-0.5;\n"
  "  v=v-0.5;\n"
  "  r=y+1.5958*v;\n"
  "  g=y-0.39173*u-0.81290*v;\n"
  "  b=y+2.017*u;\n"
  "  gl_FragColor=vec4(r,g,b,1.0);\n"
  "}\n" };
RenderOpenGles20::RenderOpenGles20() :
_textureWidth(-1),
_textureHeight(-1)
WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id, &%s: id %d&,
__FUNCTION__, (int) _id);
const GLfloat vertices[20] = {
// X, Y, Z, U, V
-1, -1, 0, 1, 0, // Bottom Left
1, -1, 0, 0, 0, //Bottom Right
1, 1, 0, 0, 1, //Top Right
-1, 1, 0, 1, 1 }; //Top Left
memcpy(_vertices, vertices, sizeof(_vertices));
RenderOpenGles20::~RenderOpenGles20() {
glDeleteTextures(3, _textureIds);
int32_t RenderOpenGles20::Setup(int32_t width, int32_t height) {
WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id,
&%s: width %d, height %d&, __FUNCTION__, (int) width,
(int) height);
printGLString(&Version&, GL_VERSION);
printGLString(&Vendor&, GL_VENDOR);
printGLString(&Renderer&, GL_RENDERER);
printGLString(&Extensions&, GL_EXTENSIONS);
int maxTextureImageUnits[2];
int maxTextureSize[2];
glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, maxTextureImageUnits);
glGetIntegerv(GL_MAX_TEXTURE_SIZE, maxTextureSize);
WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id,
&%s: number of textures %d, size %d&, __FUNCTION__,
(int) maxTextureImageUnits[0], (int) maxTextureSize[0]);
_program = createProgram(g_vertextShader, g_fragmentShader);
if (!_program) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: Could not create program&, __FUNCTION__);
return -1;
int positionHandle = glGetAttribLocation(_program, &aPosition&);
checkGlError(&glGetAttribLocation aPosition&);
if (positionHandle == -1) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: Could not get aPosition handle&, __FUNCTION__);
return -1;
int textureHandle = glGetAttribLocation(_program, &aTextureCoord&);
checkGlError(&glGetAttribLocation aTextureCoord&);
if (textureHandle == -1) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: Could not get aTextureCoord handle&, __FUNCTION__);
return -1;
// set the vertices array in the shader
// _vertices contains 4 vertices with 5 coordinates.
// 3 for (xyz) for the vertices and 2 for the texture
glVertexAttribPointer(positionHandle, 3, GL_FLOAT, false,
5 * sizeof(GLfloat), _vertices);
checkGlError(&glVertexAttribPointer aPosition&);
glEnableVertexAttribArray(positionHandle);
checkGlError(&glEnableVertexAttribArray positionHandle&);
// set the texture coordinate array in the shader
// _vertices contains 4 vertices with 5 coordinates.
// 3 for (xyz) for the vertices and 2 for the texture
glVertexAttribPointer(textureHandle, 2, GL_FLOAT, false, 5
* sizeof(GLfloat), &_vertices[3]);
checkGlError(&glVertexAttribPointer maTextureHandle&);
glEnableVertexAttribArray(textureHandle);
checkGlError(&glEnableVertexAttribArray textureHandle&);
glUseProgram(_program);
int i = glGetUniformLocation(_program, &Ytex&);
checkGlError(&glGetUniformLocation&);
glUniform1i(i, 0); /* Bind Ytex to texture unit 0 */
checkGlError(&glUniform1i Ytex&);
i = glGetUniformLocation(_program, &Utex&);
checkGlError(&glGetUniformLocation Utex&);
glUniform1i(i, 1); /* Bind Utex to texture unit 1 */
checkGlError(&glUniform1i Utex&);
i = glGetUniformLocation(_program, &Vtex&);
checkGlError(&glGetUniformLocation&);
glUniform1i(i, 2); /* Bind Vtex to texture unit 2 */
checkGlError(&glUniform1i&);
glViewport(0, 0, width, height);
checkGlError(&glViewport&);
// SetCoordinates
// Sets the coordinates where the stream shall be rendered.
// Values must be between 0 and 1.
int32_t RenderOpenGles20::SetCoordinates(int32_t zOrder,
const float left,
const float top,
const float right,
const float bottom) {
if ((top & 1 || top & 0) || (right & 1 || right & 0) ||
(bottom & 1 || bottom & 0) || (left & 1 || left & 0)) {
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: Wrong coordinates&, __FUNCTION__);
return -1;
X, Y, Z, U, V
// -1, -1, 0, 0, 1, // Bottom Left
1, -1, 0, 1, 1, //Bottom Right
1, 0, 1, 0, //Top Right
1, 0, 0, 0
//Top Left
// Bottom Left
_vertices[0] = (left * 2) - 1;
_vertices[1] = -1 * (2 * bottom) + 1;
_vertices[2] = zO
//Bottom Right
_vertices[5] = (right * 2) - 1;
_vertices[6] = -1 * (2 * bottom) + 1;
_vertices[7] = zO
//Top Right
_vertices[10] = (right * 2) - 1;
_vertices[11] = -1 * (2 * top) + 1;
_vertices[12] = zO
//Top Left
_vertices[15] = (left * 2) - 1;
_vertices[16] = -1 * (2 * top) + 1;
_vertices[17] = zO
GLuint RenderOpenGles20::loadShader(GLenum shaderType, const char* pSource)
GLuint shader = glCreateShader(shaderType);
if (shader) {
glShaderSource(shader, 1, &pSource, NULL);
glCompileShader(shader);
GLint compiled = 0;
glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
if (!compiled) {
GLint infoLen = 0;
glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &infoLen);
if (infoLen) {
char* buf = (char*) malloc(infoLen);
if (buf) {
glGetShaderInfoLog(shader, infoLen, NULL, buf);
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: Could not compile shader %d: %s&,
__FUNCTION__, shaderType, buf);
free(buf);
glDeleteShader(shader);
shader = 0;
GLuint RenderOpenGles20::createProgram(const char* pVertexSource,
const char* pFragmentSource) {
GLuint vertexShader = loadShader(GL_VERTEX_SHADER, pVertexSource);
if (!vertexShader) {
GLuint pixelShader = loadShader(GL_FRAGMENT_SHADER, pFragmentSource);
if (!pixelShader) {
GLuint program = glCreateProgram();
if (program) {
glAttachShader(program, vertexShader);
checkGlError(&glAttachShader&);
glAttachShader(program, pixelShader);
checkGlError(&glAttachShader&);
glLinkProgram(program);
GLint linkStatus = GL_FALSE;
glGetProgramiv(program, GL_LINK_STATUS, &linkStatus);
if (linkStatus != GL_TRUE) {
GLint bufLength = 0;
glGetProgramiv(program, GL_INFO_LOG_LENGTH, &bufLength);
if (bufLength) {
char* buf = (char*) malloc(bufLength);
if (buf) {
glGetProgramInfoLog(program, bufLength, NULL, buf);
WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
&%s: Could not link program: %s&,
__FUNCTION__, buf);
free(buf);
glDeleteProgram(program);
program = 0;
void RenderOpenGles20::printGLString(const char *name, GLenum s) {
const char *v = (const char *) glGetString(s);
WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id, &GL %s = %s\n&,
void RenderOpenGles20::checkGlError(const char* op) {
#ifdef ANDROID_LOG
  for (GLint error = glGetError(); error; error = glGetError()) {
    WEBRTC_TRACE(kTraceError, kTraceVideoRenderer, _id,
                 "after %s() glError (0x%x)\n", op, error);
  }
#else
  return;
#endif
}
static void InitializeTexture(int name, int id, int width, int height) {
glActiveTexture(name);
glBindTexture(GL_TEXTURE_2D, id);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);
// Uploads a plane of pixel data, accounting for stride != width*bpp.
static void GlTexSubImage2D(GLsizei width, GLsizei height, int stride,
                            const uint8_t* plane) {
  if (stride == width) {
    // We can upload the entire plane in a single GL call.
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_LUMINANCE,
                    GL_UNSIGNED_BYTE,
                    static_cast<const GLvoid*>(plane));
  } else {
    // Since GLES2 doesn't have GL_UNPACK_ROW_LENGTH and Android doesn't
    // have GL_EXT_unpack_subimage we have to upload a row at a time.
    for (int row = 0; row < height; ++row) {
      glTexSubImage2D(GL_TEXTURE_2D, 0, 0, row, width, 1, GL_LUMINANCE,
                      GL_UNSIGNED_BYTE,
                      static_cast<const GLvoid*>(plane + (row * stride)));
    }
  }
}
int32_t RenderOpenGles20::Render(void * data, int32_t widht, int32_t height)
WEBRTC_TRACE(kTraceDebug, kTraceVideoRenderer, _id, &%s: id %d&,
__FUNCTION__, (int) _id);
glUseProgram(_program);
checkGlError(&glUseProgram&);
if (_textureWidth != (GLsizei) widht || _textureHeight != (GLsizei) height) {
SetupTextures(widht, height);
UpdateTextures(data, widht, height);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, g_indices);
checkGlError(&glDrawArrays&);
void RenderOpenGles20::SetupTextures(int32_t width, int32_t height)
glDeleteTextures(3, _textureIds);
glGenTextures(3, _textureIds); // Generate the Y, U and V textures
InitializeTexture(GL_TEXTURE0, _textureIds[0], width, height);
InitializeTexture(GL_TEXTURE1, _textureIds[1], width / 2, height / 2);
InitializeTexture(GL_TEXTURE2, _textureIds[2], width / 2, height / 2);
checkGlError(&SetupTextures&);
_textureWidth =
_textureHeight =
void RenderOpenGles20::UpdateTextures(void* data, int32_t widht, int32_t height)
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, _textureIds[0]);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, widht, height, GL_LUMINANCE, GL_UNSIGNED_BYTE, data);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, _textureIds[1]);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, widht / 2, height / 2, GL_LUMINANCE,
GL_UNSIGNED_BYTE, (char *)data + widht * height);
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, _textureIds[2]);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, widht / 2, height / 2, GL_LUMINANCE,
GL_UNSIGNED_BYTE, (char *)data + widht * height * 5 / 4);
checkGlError(&UpdateTextures&);
H264Decoder.cpp (the decoding code; it was posted in an earlier entry on this blog, so it is not repeated here)
Step three: build the JNI code into a .so library.
Step four: copy the generated .so into the Android project. Here is my Activity:
package com.example.filltriangle;

import java.io.IOException;
import java.io.InputStream;

import hzcw.opengl.ViERenderer;

import android.app.Activity;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.SurfaceView;

public class FillTriangle extends Activity {
    private SurfaceView mView = null;

    static {
        System.loadLibrary("MICloudPub");
    }

    @Override protected void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        mView = ViERenderer.CreateRenderer(this, true);
        if (mView == null) {
            Log.i("test", "mView is null");
        }
        setContentView(mView);

        GL2JNILib.init(mView);
        new MyThread().start();
    }

    public class MyThread extends Thread {
        public void run() {
            GL2JNILib.step("/sdcard/test.264");
        }
    }
}
This demo simply reads a video file, decodes it, and displays it on screen. The original post closed with a screenshot of the running result.