Meego camera input
Stefan Kost
ensonic at hora-obscura.de
Fri Sep 9 04:19:13 PDT 2011
On 09/05/11 10:45, Peter Staab wrote:
> Hello,
>
> I am trying to record video with gstreamer on a N950 Meego device (from a Qt application). For initial tests I adapted this sample code: http://maemo.org/maemo_release_documentation/maemo4.1.x/node9.html
You don't want to send RGB to xvimagesink; it expects YUV. I'd also
suggest using autovideosink instead of hard-coding xvimagesink.
Stefan
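As a quick sanity check outside of Qt, a pipeline along these lines should preview the camera (untested sketch in 0.10 syntax; whether autovideosrc picks the right camera element on the N950 is an assumption):

```shell
# Hypothetical quick test (GStreamer 0.10 syntax, as on the N950):
# request raw YUV from the source and let autovideosink pick the sink.
gst-launch-0.10 autovideosrc \
  ! 'video/x-raw-yuv,width=640,height=480' \
  ! ffmpegcolorspace \
  ! autovideosink
```

If that works, the corresponding change in the C code below would be the caps string ("video/x-raw-yuv" instead of "video/x-raw-rgb") and the sink factory name.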
> The adapted code looks like this:
> #####
> /* Initialize Gstreamer */
> gst_init(NULL, NULL);
>
> /* Create pipeline and attach a callback to its
> * message bus */
> pipeline = gst_pipeline_new("test-camera");
>
> bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
> gst_bus_add_watch(bus, (GstBusFunc)bus_callback, this);
> gst_object_unref(GST_OBJECT(bus));
>
> /* Create elements */
> /* Camera video stream comes from a Video4Linux driver */
> camera_src = gst_element_factory_make("autovideosrc", "camera_src"); //also tested: subdevsrc, v4l2camsrc, v4l2src
> /* A colorspace filter is needed to make sure that the sinks understand
> * the stream coming from the camera */
> csp_filter = gst_element_factory_make("ffmpegcolorspace", "capsfilter");
> /* Tee that copies the stream to multiple outputs */
> tee = gst_element_factory_make("tee", "tee");
> /* Queue creates new thread for the stream */
> screen_queue = gst_element_factory_make("queue", "screen_queue");
> /* Sink that shows the image on screen. Xephyr doesn't support XVideo
> * extension, so it needs to use ximagesink, but the device uses
> * xvimagesink */
> screen_sink = gst_element_factory_make("xvimagesink", "screen_sink");
> /* Creates separate thread for the stream from which the image
> * is captured */
> image_queue = gst_element_factory_make("queue", "image_queue");
> /* Filter to convert stream to use format that the gdkpixbuf library
> * can use */
> image_filter = gst_element_factory_make("ffmpegcolorspace", "image_filter");
> /* A dummy sink for the image stream. Goes to bitheaven */
> image_sink = gst_element_factory_make("fakesink", "image_sink");
>
> /* Check that elements are correctly initialized */
> if(!(pipeline && camera_src && csp_filter && tee && screen_queue
> && screen_sink && image_queue && image_filter && image_sink))
> {
> qDebug() << "Couldn't create pipeline elements";
> QApplication::exit(0);
> }
>
> /* Set the image sink to emit a handoff signal before throwing away
> * its buffer */
> g_object_set(G_OBJECT(image_sink),
> "signal-handoffs", TRUE, NULL);
>
>
>
> /* Add elements to the pipeline. This has to be done prior to
> * linking them */
> gst_bin_add_many(GST_BIN(pipeline), camera_src, csp_filter,
> tee, screen_queue, screen_sink, image_queue,
> image_filter, image_sink, NULL);
>
> /* Specify what kind of video is wanted from the camera */
> caps = gst_caps_new_simple("video/x-raw-rgb", //x-raw-rgb yuv
> "width", G_TYPE_INT, 640,
> "height", G_TYPE_INT, 480,
> NULL);
>
> if (!caps)
> {
> qDebug() << "caps NULL";
> }
>
>
> /* Link the camera source and colorspace filter using capabilities
> * specified */
> if(!gst_element_link_filtered(camera_src, csp_filter, caps))
> {
> qDebug() << "gst_element_link_filtered caps error";
> // QApplication::exit(0);
> }
> gst_caps_unref(caps);
>
> /* Connect Colorspace Filter -> Tee -> Screen Queue -> Screen Sink
> * This finalizes the initialization of the screen-part of the pipeline */
> if(!gst_element_link_many(csp_filter, tee, screen_queue, screen_sink, NULL))
> {
> qDebug() << "gst_element_link_many tee error";
> QApplication::exit(0);
> }
>
> /* gdkpixbuf requires 8 bits per sample which is 24 bits per
> * pixel */
> caps = gst_caps_new_simple("video/x-raw-rgb",
> "width", G_TYPE_INT, 640,
> "height", G_TYPE_INT, 480,
> "bpp", G_TYPE_INT, 24,
> "depth", G_TYPE_INT, 24,
> "framerate", GST_TYPE_FRACTION, 15, 1,
> NULL);
>
> /* Link the image-branch of the pipeline. The pipeline is
> * ready after this */
> if(!gst_element_link_many(tee, image_queue, image_filter, NULL))
> {
> qDebug() << "gst_element_link_many image_queue error";
> QApplication::exit(0);
> }
>
> if(!gst_element_link_filtered(image_filter, image_sink, caps))
> {
> qDebug() << "gst_element_link_filtered image_sink error";
> QApplication::exit(0);
> }
>
> gst_caps_unref(caps);
>
> /* As soon as screen is exposed, window ID will be advised to the sink */
> //g_signal_connect(this, "expose-event", G_CALLBACK(expose_cb),
> // screen_sink);
>
> GstStateChangeReturn sret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
> if(sret == GST_STATE_CHANGE_FAILURE)
> {
> qDebug() << "Error GST_STATE_PLAYING: " << sret;
> }
> else
> {
> qDebug() << "Ok GST_STATE_PLAYING: " << sret;
> }
> #####
>
> Now the problem is that I just get "GST_STATE_CHANGE_FAILURE" as the result, and no preview (and no error/debug messages).
> I have already tested different parameters and looked at the Qt Mobility source code for reference (it uses GStreamer too), all without success.
>
> Is there something wrong with the code, or can someone provide a simple working example of how to make the camera preview visible (in Qt/Meego)?
>
> Regards,
> Peter
More information about the gstreamer-devel mailing list