The low fps when streaming video to a texture is caused by glTexImage2D, the function most programmers use for this. It is relatively slow, since it copies the data through intermediate buffers before it actually reaches the GPU, where it is processed and then displayed.
When I was working on an Augmented Reality demo, it ran at about 15 fps with 320x240 images; if I wanted to display a higher resolution for a better-looking application, it dropped to 7 fps. Pretty bad.
While recently researching how to improve the frame rate of my application, I found that we can write our data (the image) directly to the GPU buffer and display it without using glTexImage2D at all.
The application used for this test (the video can be found at the end of this post) simply grabs an image from the webcam and uses it as a texture on a plane. The webcam captures a live YouTube video stream being displayed on my desktop monitor and sends the data for processing at 30 fps (the maximum speed at 800x640). The application has two threads: one for video capture and another for rendering. The render thread now reaches 80 fps for 800x640 images!
Freescale's OpenGL ES API provides some extra functions that allow you to write directly to the GPU buffer.
Below is the piece of code that does all the magic:
void LoadGLTextures (EGLDisplay egldisplay, IplImage *texture)
{
    /* Set up the eglImage once; afterwards only the pixel data is copied. */
    char *imageBuffer = NULL;
    static int start = 0;
    EGLint attribs[] = { EGL_WIDTH,            TEXTURE_W,
                         EGL_HEIGHT,           TEXTURE_H,
                         EGL_IMAGE_FORMAT_FSL, EGL_FORMAT_BGRA_8888_FSL,
                         EGL_NONE };

    if (!start)
    {
        /* Create the eglImage and attach it to the currently bound texture. */
        g_imgHandle = eglCreateImageKHR(egldisplay, EGL_NO_CONTEXT,
                                        EGL_NEW_IMAGE_FSL, NULL, attribs);
        glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, g_imgHandle);
        start = 1;
    }

    /* Query the pointer to the GPU buffer and copy the new frame into it. */
    eglQueryImageFSL(egldisplay, g_imgHandle, EGL_CLIENTBUFFER_TYPE_FSL,
                     (EGLint *)&imageBuffer);
    memcpy(imageBuffer, texture->imageData, texture->imageSize);
}
As you can see, it is pretty simple: we create an image, store its handle in g_imgHandle, and initialize the texture for that image handle. Note that this initialization happens only once.
Once the image and texture are initialized, we call eglQueryImageFSL, which gives us a pointer to the GPU buffer, and then write the frame data into that buffer with memcpy.
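To make the usage concrete, here is a hedged sketch of how the function above might be driven from the render loop. The names g_texId, GetLatestWebcamFrame() and DrawPlane() are illustrative assumptions, not part of the original application:

```c
#include <GLES2/gl2.h>
#include <EGL/egl.h>
#include <opencv/cxcore.h>   /* IplImage */

/* Per-frame update: bind the texture that backs the eglImage, then let
 * LoadGLTextures() memcpy the new frame straight into the GPU buffer.
 * g_texId, GetLatestWebcamFrame() and DrawPlane() are hypothetical names. */
void RenderLoop(EGLDisplay egldisplay, EGLSurface eglsurface)
{
    while (1)
    {
        IplImage *frame = GetLatestWebcamFrame();   /* from the capture thread */

        glBindTexture(GL_TEXTURE_2D, g_texId);      /* must be bound before the
                                                       first call, since that call
                                                       attaches the eglImage */
        LoadGLTextures(egldisplay, frame);          /* no glTexImage2D involved */

        DrawPlane();                                /* draw the textured plane */
        eglSwapBuffers(egldisplay, eglsurface);
    }
}
```

The key point is that after the first frame, each iteration is just one memcpy into memory the GPU already samples from, which is where the speedup over glTexImage2D comes from.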
And the result is:
Note how fast the video is displayed as a texture on the plane.
EOF !
Andre,
What Linux distribution did you use as the basis for your work? The supplied Ubuntu image on SD card, your own build from LTIB, or something else?
Do you have a simple helloworld for OpenGL/ES for the i.MX53 to get me started with it?
Leon.
Hi Leon,
You can find sample code in the GPU SDK (gpu-sdk1008.tgz)
from the i.MX53 QSB Freescale webpage.
Kind regards
/Torbörn
I love you, dude!!!! I searched two days for this solution! (sorry for my bad English)
You are welcome, dude =)
Hi, the performance improvement is great and exactly what I need, but I'm still stuck on the proper includes and packages. Which packages do I need (I think the EGL image extensions), and which includes and compiler flags are necessary for glEGLImageTargetTexture2DOES() and eglCreateImageKHR()? Documentation is rare. Where on earth can someone get an idea of Freescale's API, for example eglQueryImageFSL()???
Hi,
you can get the GPU SDK package from Freescale's website. The eglImage is defined in one of the headers in this package.
Thanks, I've used the fslvideorendercard for the missing part. At what point in the rendering do I have to call LoadGLTextures(), and how is this texture bound as current to the render process, and with what handle or ID? The example doesn't use glEnable(GL_TEXTURE_2D), and it creates a new texture every frame, not with the mentioned glTexImage2D(), but by overwriting the same textureID every time. I don't believe this will work for long.
Kind Regards
It is not created every time: it is created once and then the GPU buffer is updated constantly (by copying the data from gstreamer). You can see the "start" flag in the code; of course the code could look nicer, but this flag is set once when the application starts, and that is what ensures the texture is created only once.
regards,
Andre
Hi,
I'm trying the 1008 SDK with X11, but I get an error:
root@freescale /$ export DISPLAY=:0
root@freescale /$ ./es2_lesson03_x11
XOpenDisplay
DefaultScreen [0]
RootWindow
eglGetDisplay (81944)
Disp = 81945
eglInitialize
eglBindAPI
eglChooseConfig
XCreateSimpleWindow
*eglCreateWindowSurface
es2_lesson03_x11: lesson03_imx.c:208: int init(): Assertion `eglGetError() == 0x3000' failed.
Aborted
The return code of eglGetError() is EGL_BAD_ALLOC.
I used LTIB with the fslgnome profile.
Thank you very much
TOm
Hi Tom, this code is not X11-based; in your case you have to initialize EGL differently.
It will be something like this:
http://pastebin.com/5yh5egfS
regards,
Andre
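For readers who cannot reach the pastebin link, a minimal X11 EGL initialization might look roughly like the sketch below. This is my own guess under the usual EGL 1.4 conventions, not necessarily what Andre posted; window size and attribute choices are illustrative:

```c
#include <X11/Xlib.h>
#include <EGL/egl.h>
#include <assert.h>
#include <stddef.h>

/* Hedged sketch of EGL setup on X11. A common cause of EGL_BAD_ALLOC at
 * eglCreateWindowSurface is a mismatch between the EGL display/config and
 * the native X window, e.g. creating the display from the framebuffer
 * instead of from the X11 Display. */
EGLDisplay InitEGLOnX11(EGLSurface *out_surface)
{
    Display *xdpy = XOpenDisplay(NULL);
    Window   root = DefaultRootWindow(xdpy);
    Window   win  = XCreateSimpleWindow(xdpy, root, 0, 0, 800, 480, 0, 0, 0);
    XMapWindow(xdpy, win);

    /* Create the EGL display from the X11 Display, not EGL_DEFAULT_DISPLAY. */
    EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType)xdpy);
    eglInitialize(dpy, NULL, NULL);
    eglBindAPI(EGL_OPENGL_ES_API);

    static const EGLint cfg_attribs[] = {
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint ncfg = 0;
    eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &ncfg);
    assert(ncfg > 0);   /* no matching config also leads to surface failures */

    /* Pass the native X window handle straight through. */
    *out_surface = eglCreateWindowSurface(dpy, cfg, (EGLNativeWindowType)win, NULL);
    assert(eglGetError() == EGL_SUCCESS);
    return dpy;
}
```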
Does this code work on i.MX6 processors?
Hello,
ReplyDeleteI have cross compiled OpenCV-2.2 for i.MX53. I was able to stream the video to display from Camera. However, I am having trouble in recording video with VideoWriter. I am getting an error saying that the VideoWriter can't be opened.
Has anyone implemented video recording using OpenCV i.mx53??
Regards,
Gopi
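A hedged suggestion for Gopi's question: with the OpenCV 2.2 C API, the writer often fails to open when the requested FOURCC has no matching codec in the underlying FFmpeg build, so trying MJPG is a common first step. The file name, fps and frame count below are illustrative, and whether this works depends on how OpenCV was cross-compiled:

```c
#include <opencv/highgui.h>

int main(void)
{
    CvCapture *cap   = cvCaptureFromCAM(0);
    IplImage  *frame = cvQueryFrame(cap);        /* grab one frame for the size */

    CvVideoWriter *writer = cvCreateVideoWriter(
        "out.avi",
        CV_FOURCC('M', 'J', 'P', 'G'),           /* try MJPG if others fail */
        30,                                      /* fps */
        cvGetSize(frame),
        1);                                      /* color */
    if (!writer)
        return 1;   /* codec not available in this OpenCV/FFmpeg build */

    int i;
    for (i = 0; i < 100 && (frame = cvQueryFrame(cap)) != NULL; ++i)
        cvWriteFrame(writer, frame);

    cvReleaseVideoWriter(&writer);
    cvReleaseCapture(&cap);
    return 0;
}
```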