This article shows how to capture camera data on Android, encode it into H.264 video with the FFmpeg library through JNI, and send the resulting stream to a Windows PC over a simple UDP protocol, without RTSP and without regard to efficiency. The details are illustrated below.
1. Acquire camera data on Android. (The width and height of the YUV frame are not obtained from the camera dynamically; the code below assumes a fixed 1024×768 frame.)
@Override
public void onPreviewFrame(byte[] arg0, Camera arg1) {
    if (arg0 != null) {
        int len = arg0.length;
        Log.i("xinw", "len:" + String.valueOf(len));
        encodeprocess(arg0);
    }
}

public boolean getIsPreviewCallback() {
    return ispreviewcallback;
}

public void setIsPreviewCallback(boolean flag) {
    ispreviewcallback = flag;
}
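The length logged above can be sanity-checked: with Android's default NV21 preview format, one frame occupies width*height bytes of Y plus width*height/2 bytes of interleaved chroma, i.e. 12 bits per pixel. A minimal sketch (the class and method names are hypothetical; the 1024×768 size is taken from the hard-coded indices later in the article):

```java
public class FrameSize {
    // Expected byte length of one NV21 (YUV420SP) preview frame:
    // a full-resolution Y plane plus a half-size interleaved VU plane.
    static int nv21Length(int width, int height) {
        return width * height + width * height / 2;  // 12 bits per pixel
    }
}
```

For a 1024×768 frame this gives 1179648 bytes, which is what the Log.i call above should print.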
Here is a screenshot of the Android side.
2. Define the JNI interface to call the FFmpeg library. The Java side:

static {
    System.loadLibrary("postproc-53");
    System.loadLibrary("avutil-54");
    System.loadLibrary("avcodec-56");
    System.loadLibrary("swresample-1");
    System.loadLibrary("avformat-56");
    System.loadLibrary("swscale-3");
    System.loadLibrary("avfilter-5");
    System.loadLibrary("avdevice-56");
    System.loadLibrary("x264");
    System.loadLibrary("ffmpeg_codec");
}

public native int encodeinit(String inputStr);
public native int encodeprocess(byte[] yuv420spArr);

The JNI side in C:

/* c, frame, pkt, x, y, i, ret and got_output are assumed to be
 * module-level state set up in encodeinit() */
JNIEXPORT jint JNICALL Java_com_tchip_remotepushvideo_CameraPreview_encodeprocess(JNIEnv *env, jobject obj, jbyteArray yuvspArr)
{
    jbyte *tmpHandle = (*env)->GetByteArrayElements(env, yuvspArr, NULL);

    av_init_packet(&pkt);
    pkt.data = NULL;    // packet data will be allocated by the encoder
    pkt.size = 0;

    LOGD("ffmpeg cxx");

    /* Convert YUV420SP into YUV420P */
    /* Y plane: copied as-is */
    for (y = 0; y < c->height; y++) {
        for (x = 0; x < c->width; x++) {
            frame->data[0][y * frame->linesize[0] + x] = tmpHandle[x + y * 1024];
        }
    }

    /* Cb and Cr: de-interleave the VU plane (V comes first in NV21) */
    for (y = 0; y < c->height / 2; y++) {
        for (x = 0; x < c->width / 2; x++) {
            frame->data[1][y * frame->linesize[1] + x] = tmpHandle[1024 * 768 + 512 * y + 2 * x + 1];
            frame->data[2][y * frame->linesize[2] + x] = tmpHandle[1024 * 768 + 512 * y + 2 * x];
        }
    }

    frame->pts = i++;

    /* encode the image */
    ret = avcodec_encode_video2(c, &pkt, frame, &got_output);
    if (ret < 0) {
        LOGD("Error encoding frame");
    }

    if (got_output) {
        /* send one encoded frame per UDP datagram */
        sendto(g_socket, pkt.data, pkt.size, 0, (struct sockaddr *)&g_addr, sizeof(g_addr));
        av_free_packet(&pkt);
    }

    (*env)->ReleaseByteArrayElements(env, yuvspArr, tmpHandle, JNI_ABORT);
    return 0;
}

3. Decode the received H.264 data on the PC. The effect is shown below.
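The two chroma loops in the C code above de-interleave the NV21 VU plane into separate Cb and Cr planes. The same transform in plain Java, for reference (a sketch with hypothetical names; width and height are assumed even, as required by 4:2:0 subsampling):

```java
public class Nv21ToI420 {
    // Split an NV21 buffer (full Y plane followed by interleaved V,U samples)
    // into the three contiguous planes of I420: Y, then U (Cb), then V (Cr).
    static void convert(byte[] nv21, int width, int height,
                        byte[] y, byte[] u, byte[] v) {
        int pixels = width * height;
        System.arraycopy(nv21, 0, y, 0, pixels);  // Y plane is unchanged
        for (int i = 0; i < pixels / 4; i++) {
            v[i] = nv21[pixels + 2 * i];          // V comes first in NV21
            u[i] = nv21[pixels + 2 * i + 1];      // U follows
        }
    }
}
```

Doing the split with a flat index like this avoids the per-row bookkeeping of the C version, but is equivalent when the destination planes are contiguous (linesize equals width).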
The code is mainly adapted from the decoding_encoding.c example that ships with FFmpeg. Note that the IP addresses and buffer sizes are hard-coded to make debugging easier; change these parameters before normal use.
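Since each sendto call on the Android side ships exactly one encoded frame, the PC side can treat every received datagram as a complete unit before handing it to the FFmpeg decoder. A minimal Java receiver sketch (class and method names are hypothetical; 65535 bytes is the largest possible UDP payload):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.util.Arrays;

public class H264UdpReceiver {
    // Block until one UDP datagram arrives and return its payload,
    // which in this simple protocol is one encoded H.264 frame.
    static byte[] receiveFrame(DatagramSocket sock) throws Exception {
        byte[] buf = new byte[65535];
        DatagramPacket pkt = new DatagramPacket(buf, buf.length);
        sock.receive(pkt);
        return Arrays.copyOf(buf, pkt.getLength());  // trim to actual frame size
    }
}
```

Each returned frame would then be passed to the decoder loop adapted from decoding_encoding.c. Note that plain UDP gives no delivery or ordering guarantees, which is acceptable here only because efficiency and robustness are explicitly out of scope.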