[gst-devel] Video processing
Bruno
botte.pub at gmail.com
Wed Aug 20 11:54:17 CEST 2008
Hi Aurélien,
Thank you for your answer. I got it to work, so here is the solution in
case other people were wondering about the same problem:
/* Initialize the GStreamer pipeline. Below is a diagram
 * of the pipeline that will be created:
 *
 * |Camera|   |CSP   |   |Screen|   |Screen|   |Image     |
 * |src   |-->|filter|-->|queue |-->|sink  |-->|processing|-> Display
 */
static gboolean initialize_pipeline(AppData *appdata,
int *argc, char ***argv)
{
GstElement *pipeline, *camera_src, *screen_sink;
GstElement *screen_queue;
GstElement *csp_filter;
GstCaps *caps;
GstBus *bus;
GstPad *sinkpad;
/* Initialize Gstreamer */
gst_init(argc, argv);
/* Create pipeline and attach a callback to its
 * message bus */
pipeline = gst_pipeline_new("test-camera");
bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
gst_bus_add_watch(bus, (GstBusFunc)bus_callback, appdata);
gst_object_unref(GST_OBJECT(bus));
/* Save pipeline to the AppData structure */
appdata->pipeline = pipeline;
/* Create elements */
/* Camera video stream comes from a Video4Linux driver */
camera_src = gst_element_factory_make(VIDEO_SRC, "camera_src");
/* Colorspace filter is needed to make sure that the sink understands
 * the stream coming from the camera */
csp_filter = gst_element_factory_make("ffmpegcolorspace", "csp_filter");
/* Queue creates new thread for the stream */
screen_queue = gst_element_factory_make("queue", "screen_queue");
/* Sink that shows the image on screen. Xephyr doesn't support the
 * XVideo extension, so ximagesink must be used there, while the
 * device itself uses xvimagesink */
screen_sink = gst_element_factory_make(VIDEO_SINK, "screen_sink");
/* Check that elements are correctly initialized */
if(!(pipeline && camera_src && screen_sink && csp_filter &&
     screen_queue))
{
    g_critical("Couldn't create pipeline elements");
    return FALSE;
}

/* Attach a buffer probe to the sink pad of the screen sink:
 * process_frame() will be called for every buffer that reaches it */
sinkpad = gst_element_get_static_pad(screen_sink, "sink");
gst_pad_add_buffer_probe(sinkpad, G_CALLBACK(process_frame), appdata);
/* get_static_pad took a reference we no longer need */
gst_object_unref(sinkpad);
/* Add elements to the pipeline. This has to be done prior to
* linking them */
gst_bin_add_many(GST_BIN(pipeline), camera_src, csp_filter,
screen_queue, screen_sink, NULL);
/* Specify what kind of video is wanted from the camera */
caps = gst_caps_new_simple("video/x-raw-yuv",
"width", G_TYPE_INT, IMAGE_WIDTH,
"height", G_TYPE_INT, IMAGE_HEIGHT,
"framerate", GST_TYPE_FRACTION, FRAMERATE, 1,
NULL);
/* Link the camera source and colorspace filter using capabilities
* specified */
if(!gst_element_link_filtered(camera_src, csp_filter, caps))
{
    gst_caps_unref(caps);
    return FALSE;
}
gst_caps_unref(caps);
/* Connect Colorspace Filter -> Screen Queue -> Screen Sink
* This finalizes the initialization of the screen-part of the pipeline
*/
if(!gst_element_link_many(csp_filter, screen_queue, screen_sink, NULL))
{
return FALSE;
}
/* As soon as the screen is exposed, the window ID will be passed to
 * the sink */
g_signal_connect(appdata->screen, "expose-event", G_CALLBACK(expose_cb),
screen_sink);
gst_element_set_state(pipeline, GST_STATE_PLAYING);
return TRUE;
}
So now the process_frame() function is called each time data flows
through the pad I created on the screen_sink.
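For anyone finding this in the archives, here is a minimal sketch of what
such a probe callback can look like (GStreamer 0.10 API; do_processing()
is a placeholder for my detection code, which I've left out):

/* Buffer probe attached to the sink pad of screen_sink. Sketch only:
 * do_processing() is a placeholder, not a GStreamer function. */
static gboolean process_frame(GstPad *pad, GstBuffer *buffer,
                              gpointer user_data)
{
    AppData *appdata = (AppData *)user_data;

    /* Raw frame data, in whatever format was negotiated on the caps
     * (here video/x-raw-yuv at IMAGE_WIDTH x IMAGE_HEIGHT) */
    do_processing(GST_BUFFER_DATA(buffer), GST_BUFFER_SIZE(buffer),
                  appdata);

    /* TRUE lets the buffer continue to the sink so the frame is
     * displayed; returning FALSE would drop it instead */
    return TRUE;
}

Note that the probe runs in the streaming thread, so while
do_processing() is busy, new frames simply pile up in the queue upstream
instead of being processed.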
Thanks again!
Bruno
2008/8/18, Aurelien Grimaud <gstelzz at yahoo.fr>:
>
> Hi,
> You can add buffer probes on pads
>
> http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstPad.html#gst-pad-add-buffer-probe
> It will be called each time a buffer goes through the pad, and it can
> choose whether to let the buffer through (return TRUE) or drop it
> (return FALSE).
> I do not know if the sink will handle gaps in the buffer stream
> properly, though...
>
> What about using a videorate element to get fewer frames from the
> source, and thereby make sure the Nokia's processor can handle the
> stream?
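> For instance, something along these lines (untested sketch, reusing the
> element names from your initialize_pipeline; the 5 fps value is just an
> example), instead of linking csp_filter directly to screen_queue:
>
> /* Untested: insert a videorate element after the colorspace filter
>  * and force a lower framerate on its output caps */
> GstElement *rate = gst_element_factory_make("videorate", "rate");
> GstCaps *rate_caps = gst_caps_new_simple("video/x-raw-rgb",
>         "framerate", GST_TYPE_FRACTION, 5, 1,
>         NULL);
>
> gst_bin_add(GST_BIN(pipeline), rate);
> if(!gst_element_link(csp_filter, rate) ||
>    !gst_element_link_filtered(rate, screen_queue, rate_caps))
>         g_warning("Failed to link videorate");
> gst_caps_unref(rate_caps);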
>
> Aurelien
>
> ----- Original message -----
> From: Bruno <botte.pub at gmail.com>
> To: gstreamer-devel at lists.sourceforge.net
> Sent: Monday, 18 August 2008, 17:22:09
> Subject: [gst-devel] Video processing
>
> Hello everyone!
>
> I'm trying to develop an image-processing application for the Nokia N810
> using GStreamer. I built the structure of this application from an
> example which displays the image from the camera on the screen.
>
> I tweaked this code a bit for my needs, and it is working: I can
> display the picture from the camera and start/stop the media
> pipeline... But I can't find where to put the image-processing code
> (which works on its own on the Nokia).
>
> What I want is that when the program receives a frame from the camera,
> it does some calculations on the current frame buffer, and during these
> calculations I'd like it to ignore the other frames coming from the
> camera (I don't think the processor will be able to handle real-time
> processing).
> When the calculation is done, it should start again with the next
> incoming frame.
>
> I managed to do it, but I had to press a button each time I wanted to
> process a frame. I'd like the program to do it continuously.
>
> Here is my code:
>
> #define VIDEO_SRC "v4l2src"
> #define VIDEO_SINK "xvimagesink"
>
> typedef struct
> {
> HildonProgram *program;
> HildonWindow *window;
>
> GstElement *pipeline;
> GtkWidget *screen;
>
> guint buffer_cb_id;
> } AppData;
>
> /* Callback that gets called when user clicks the "START/STOP" button */
> static void button1_pressed(GtkWidget *widget, AppData *appdata)
> {
> if (GTK_TOGGLE_BUTTON(widget)->active)
> {
> /* Display a note to the user */
> hildon_banner_show_information(GTK_WIDGET(appdata->window),
> NULL, "Running ...");
> gst_element_set_state(appdata->pipeline, GST_STATE_PLAYING);
> }
>
> else
> {
> /* Display a note to the user */
> hildon_banner_show_information(GTK_WIDGET(appdata->window),
> NULL, "Stopped ...");
> gst_element_set_state(appdata->pipeline, GST_STATE_PAUSED);
>
> }
> }
>
> /* Callback that gets called when user clicks the "Expression ON/OFF"
> button */
> static void button2_pressed(GtkWidget *widget, AppData *appdata)
> {
> if (GTK_TOGGLE_BUTTON(widget)->active)
> {
> /* Display a note to the user */
> hildon_banner_show_information(GTK_WIDGET(appdata->window),
> NULL, "Expressions ON");
> }
>
> else
> {
> /* Display a note to the user */
> hildon_banner_show_information(GTK_WIDGET(appdata->window),
> NULL, "Expressions OFF");
> }
> }
>
> /* Callback that gets called whenever pipeline's message bus has
> * a message */
> static void bus_callback(GstBus *bus, GstMessage *message, AppData
> *appdata)
> {
> gchar *message_str;
> const gchar *message_name;
> GError *error;
>
> /* Report errors to the console */
> if(GST_MESSAGE_TYPE(message) == GST_MESSAGE_ERROR)
> {
> gst_message_parse_error(message, &error, &message_str);
> g_error("GST error: %s\n", message_str);
> g_error_free(error);
> g_free(message_str);
> }
>
> /* Report warnings to the console */
> if(GST_MESSAGE_TYPE(message) == GST_MESSAGE_WARNING)
> {
> gst_message_parse_warning(message, &error, &message_str);
> g_warning("GST warning: %s\n", message_str);
> g_error_free(error);
> g_free(message_str);
> }
>
> /* See if the message type is GST_MESSAGE_APPLICATION, which means
>  * that the message was sent by the client code (this program) and
>  * not by GStreamer. */
> if(GST_MESSAGE_TYPE(message) == GST_MESSAGE_APPLICATION)
> {
> /* Get name of the message's structure */
> message_name =
> gst_structure_get_name(gst_message_get_structure(message));
>
> /* The hildon banner must be shown here, because the bus callback
>  * is called in the main thread; calling GUI functions from GStreamer
>  * threads usually leads to problems with the X server */
>
> {
>     /* Map the message name to the banner text to show */
>     static const struct {
>         const gchar *name;
>         const gchar *text;
>     } banners[] = {
>         { "anger",    "Anger"     },
>         { "disgust",  "Disgust"   },
>         { "fear",     "Fear"      },
>         { "happy",    "Happy"     },
>         { "neutral",  "Neutral"   },
>         { "sad",      "Sad"       },
>         { "surprise", "Surprise"  },
>         { "unknown",  "Unknown !" },
>     };
>     guint i;
>
>     for(i = 0; i < G_N_ELEMENTS(banners); i++)
>     {
>         if(!strcmp(message_name, banners[i].name))
>         {
>             hildon_banner_show_information(
>                 GTK_WIDGET(appdata->window),
>                 NULL, banners[i].text);
>             break;
>         }
>     }
> }
> }
>
> }
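>
> (For reference: the image-processing code, which I removed from this
> mail for clarity, posts these application messages to the bus roughly
> like this; a sketch, not the exact code:)
>
> /* Post a detection result from the processing code to the main
>  * thread via the pipeline's bus; "happy" is one of the structure
>  * names the bus callback above checks for */
> GstStructure *s = gst_structure_new("happy", NULL);
> gst_element_post_message(appdata->pipeline,
>         gst_message_new_application(GST_OBJECT(appdata->pipeline), s));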
>
> /* Callback to be called when the screen-widget is exposed */
> static gboolean expose_cb(GtkWidget * widget, GdkEventExpose * event,
> gpointer data)
> {
> /* Tell the xvimagesink/ximagesink the x-window-id of the screen
> * widget in which the video is shown. After this the video
> * is shown in the correct widget */
> gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(data),
> GDK_WINDOW_XWINDOW(widget->window));
> return FALSE;
> }
>
> /* Initialize the GStreamer pipeline. Below is a diagram
>  * of the pipeline that will be created:
>  *
>  * |Camera|   |CSP   |   |Screen|   |Screen|   |Image     |
>  * |src   |-->|filter|-->|queue |-->|sink  |-->|processing|-> Display
>  */
> static gboolean initialize_pipeline(AppData *appdata,
> int *argc, char ***argv)
> {
> GstElement *pipeline, *camera_src, *screen_sink;
> GstElement *screen_queue;
> GstElement *csp_filter;
> GstCaps *caps;
> GstBus *bus;
>
>
> /* Initialize Gstreamer */
> gst_init(argc, argv);
>
> /* Create pipeline and attach a callback to its
>  * message bus */
> pipeline = gst_pipeline_new("test-camera");
>
> bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
> gst_bus_add_watch(bus, (GstBusFunc)bus_callback, appdata);
> gst_object_unref(GST_OBJECT(bus));
>
> /* Save pipeline to the AppData structure */
> appdata->pipeline = pipeline;
>
> /* Create elements */
> /* Camera video stream comes from a Video4Linux driver */
> camera_src = gst_element_factory_make(VIDEO_SRC, "camera_src");
> /* Colorspace filter is needed to make sure that the sink understands
>  * the stream coming from the camera */
> csp_filter = gst_element_factory_make("ffmpegcolorspace",
> "csp_filter");
> /* Queue creates new thread for the stream */
> screen_queue = gst_element_factory_make("queue", "screen_queue");
> /* Sink that shows the image on screen. Xephyr doesn't support the
>  * XVideo extension, so ximagesink must be used there, while the
>  * device itself uses xvimagesink */
> screen_sink = gst_element_factory_make(VIDEO_SINK, "screen_sink");
>
>
> /* Check that elements are correctly initialized */
> if(!(pipeline && camera_src && screen_sink && csp_filter &&
> screen_queue))
> {
> g_critical("Couldn't create pipeline elements");
> return FALSE;
> }
>
>
> /* Add elements to the pipeline. This has to be done prior to
> * linking them */
> gst_bin_add_many(GST_BIN(pipeline), camera_src, csp_filter,
> screen_queue, screen_sink, NULL);
>
> /* Specify what kind of video is wanted from the camera */
> caps = gst_caps_new_simple("video/x-raw-rgb",
> "width", G_TYPE_INT, 640,
> "height", G_TYPE_INT, 480,
> "framerate", GST_TYPE_FRACTION, 25, 1,
> NULL);
>
>
> /* Link the camera source and colorspace filter using capabilities
> * specified */
> if(!gst_element_link_filtered(camera_src, csp_filter, caps))
> {
>     gst_caps_unref(caps);
>     return FALSE;
> }
> gst_caps_unref(caps);
>
> /* Connect Colorspace Filter -> Screen Queue -> Screen Sink
> * This finalizes the initialization of the screen-part of the pipeline
> */
> if(!gst_element_link_many(csp_filter, screen_queue, screen_sink, NULL))
> {
> return FALSE;
> }
>
> /* As soon as the screen is exposed, the window ID will be passed to
>  * the sink */
> g_signal_connect(appdata->screen, "expose-event",
> G_CALLBACK(expose_cb),
> screen_sink);
>
> gst_element_set_state(pipeline, GST_STATE_PAUSED);
>
> return TRUE;
> }
>
> /* Destroy the pipeline on exit */
> static void destroy_pipeline(GtkWidget *widget, AppData *appdata)
> {
> /* Free the pipeline. This automatically also unrefs all elements
> * added to the pipeline */
> gst_element_set_state(appdata->pipeline, GST_STATE_NULL);
> gst_object_unref(GST_OBJECT(appdata->pipeline));
> }
>
> int main(int argc, char **argv)
> {
> // variables for face detection
> // main structure for vjdetect
> pdata = (mainstruct*) calloc(1, sizeof(mainstruct));
> // Allocate memory for the array of face detections returned by the
> // face detector (VjDetect).
> pdata->pFaceDetections = (FLY_Rect *)
>     calloc(MAX_NUMBER_OF_FACE_DETECTIONS, sizeof(FLY_Rect));
> init(pdata);
>
> AppData appdata;
> GtkWidget *hbox, *vbox_button, *vbox, *button1, *button2;
>
> /* Initialize and create the GUI */
>
> example_gui_initialize(
> &appdata.program, &appdata.window,
> &argc, &argv, "Expression Detector");
>
> vbox = gtk_vbox_new(FALSE, 0);
> hbox = gtk_hbox_new(FALSE, 0);
> vbox_button = gtk_vbox_new(FALSE, 0);
>
> gtk_box_pack_start(GTK_BOX(hbox), vbox, FALSE, FALSE, 0);
> gtk_box_pack_start(GTK_BOX(hbox), vbox_button, FALSE, FALSE, 0);
>
> appdata.screen = gtk_drawing_area_new();
> gtk_widget_set_size_request(appdata.screen, 500, 380);
> gtk_box_pack_start(GTK_BOX(vbox), appdata.screen, FALSE, FALSE, 0);
>
> button1 = gtk_toggle_button_new_with_label("Run/Stop");
> gtk_widget_set_size_request(button1, 170, 75);
> gtk_box_pack_start(GTK_BOX(vbox_button), button1, FALSE, FALSE, 0);
>
> button2 = gtk_toggle_button_new_with_label("Expressions ON/OFF");
> gtk_widget_set_size_request(button2, 170, 75);
> gtk_box_pack_start(GTK_BOX(vbox_button), button2, FALSE, FALSE, 0);
>
> g_signal_connect(G_OBJECT(button1), "clicked",
> G_CALLBACK(button1_pressed), &appdata);
>
> g_signal_connect(G_OBJECT(button2), "clicked",
> G_CALLBACK(button2_pressed), &appdata);
>
>
> gtk_container_add(GTK_CONTAINER(appdata.window), hbox);
>
> /* Initialize the GStreamer pipeline */
> if(!initialize_pipeline(&appdata, &argc, &argv))
> {
> hildon_banner_show_information(
> GTK_WIDGET(appdata.window),
> "gtk-dialog-error",
> "Failed to initialize pipeline");
> }
>
>
> g_signal_connect(G_OBJECT(appdata.window), "destroy",
> G_CALLBACK(destroy_pipeline), &appdata);
>
> /* Begin the main application */
> example_gui_run(appdata.program, appdata.window);
>
> /* The GStreamer resources have already been freed by the
>  * destroy_pipeline() callback at this point */
>
> return 0;
> }
>
> I removed the image-processing functions for clarity.
> I tried to put a printf in the expose_cb function, in main and in the
> pipeline, hoping that it would print the text each time a new frame is
> displayed on the screen. That didn't work; it appears only once. I read
> tutorials about GStreamer but couldn't find how to do something
> continuously. Maybe I should write an image_processing callback and put
> a g_signal_connect call in my main that starts this callback each time
> a new frame is displayed? But what would be the correct signal to use?
>
> Any ideas welcome.
>
> Thanks in advance!
> Bruno
>