[gst-devel] simple video player in C

Erwan Masson masson.erwan1 at gmail.com
Wed May 16 17:07:47 CEST 2007


Hello,
 Have you installed all the plugins?
 Try something like gst-inspect flutsdemux; do the same for flump3dec and mpeg2dec.
 You need these elements for the pipeline to work.
 Then try something like: gst-launch-0.10 -v playbin uri=
file:///home/myprofile/myfile.ts
If that works, then comment out this line:
gst_element_set_state (pipeAudio, GST_STATE_PAUSED);

Otherwise, each time an audio pad is added you will block your audio
pipeline.
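
If the playbin test works, a minimal playbin-based player in C looks roughly
like this (an untested sketch against GStreamer 0.10; the file name
playbin_test.cpp and the element name "play" are just placeholders, and the
program only stops with Ctrl-C since it does not watch the bus):

/* g++ `pkg-config --libs --cflags gstreamer-0.10` -Wall playbin_test.cpp -o playbin_test */
#include <gst/gst.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);
  if (argc != 2) {
    g_print ("Usage: %s file:///path/to/file\n", argv[0]);
    return -1;
  }

  GMainLoop *loop = g_main_loop_new (NULL, FALSE);

  /* playbin builds the whole decode/display pipeline internally */
  GstElement *play = gst_element_factory_make ("playbin", "play");
  g_object_set (G_OBJECT (play), "uri", argv[1], NULL);

  gst_element_set_state (play, GST_STATE_PLAYING);
  g_main_loop_run (loop);   /* stop with Ctrl-C */

  gst_element_set_state (play, GST_STATE_NULL);
  gst_object_unref (play);
  return 0;
}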

Erwan Masson



2007/5/16, fabien <fabien.castan at free.fr>:
>
> Hello
> I've tested your code on two different PCs with two different versions
> of GStreamer (08.10 and 10.12), but it didn't work on either of them. After
> several tests, I finally commented out the line that sets PLAYING on
> pipeAudio. That's when I managed to get an image for the first time.
> But then I tested it with avi and mpg files and it doesn't work; it only
> opens .ogg. Furthermore, it only works once out of every two tries...
> Fabien
>
>
> > Erwan Masson wrote:
> > Hi,
> > I have run some tests; it seems that if you don't put pipeVideo to
> > PAUSED and you put pipeAudio to PLAYING, it works.
> > I don't know why. Moreover, if I put your main pipeline to PLAYING it
> > crashes; in my code it doesn't crash.
> >
> > For your notify::caps callback, I have never been able to retrieve the
> > info that way; I use a typefind "have-type" callback instead.
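> >
> > A rough sketch of what I mean (untested; it assumes a typefind element
> > in the video branch, like the commented-out vfind in the code below):
> >
> > /* "have-type" fires once typefind has identified the data */
> > static void cb_have_type (GstElement *typefind, guint probability,
> >                           GstCaps *caps, gpointer user_data)
> > {
> >       GstStructure *str = gst_caps_get_structure (caps, 0);
> >       gint width = 0, height = 0;
> >       if (gst_structure_get_int (str, "width", &width) &&
> >           gst_structure_get_int (str, "height", &height))
> >               g_print ("have-type: %dx%d\n", width, height);
> > }
> > /* g_signal_connect (vfind, "have-type", G_CALLBACK (cb_have_type), NULL); */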
> >
> > For your have_data function, I would use a handoff callback instead.
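> >
> > For example, the cb_handoff_video that is referenced (commented out)
> > further down in the code could look like this (untested sketch):
> >
> > /* identity "handoff" is emitted for every buffer passing through */
> > static void cb_handoff_video (GstElement *identity, GstBuffer *buffer,
> >                               gpointer user_data)
> > {
> >       static int frames = 0;
> >       g_print ("frame %d, %u bytes\n", frames++, GST_BUFFER_SIZE (buffer));
> > }
> > /* g_signal_connect (vconv2, "handoff", G_CALLBACK (cb_handoff_video), NULL); */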
> >
> > This code works fine for me.
> > Try it,
> > Erwan
> >
> > 2007/5/14, fabien <fabien.castan at free.fr>:
> >
> >     Thank you for your fast answer.
> >     I did remove the capsfilter, but it still doesn't work... :(
> >     From time to time the sound begins to play and stops after 2
> >     seconds or so, and I get no video.
> >     Fabien
> >
> >     Erwan Masson wrote:
> >     > Hello,
> >     >  Remove the capsfilter and it will work :).
> >     > I used a capsfilter linked to a fakesink to grab each frame in a
> >     > specific format (with a handoff signal).
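> >     >
> >     > Roughly like this (untested sketch; fsink stands for the fakesink
> >     > instance; note that fakesink's handoff callback gets an extra
> >     > GstPad argument and needs signal-handoffs=TRUE):
> >     >
> >     > static void cb_grab (GstElement *sink, GstBuffer *buf,
> >     >                      GstPad *pad, gpointer data)
> >     > {
> >     >       g_print ("got frame, %u bytes\n", GST_BUFFER_SIZE (buf));
> >     > }
> >     >
> >     > /* ... ! ffmpegcolorspace ! capsfilter ! fakesink */
> >     > /* g_object_set (fsink, "signal-handoffs", TRUE, NULL); */
> >     > /* g_signal_connect (fsink, "handoff", G_CALLBACK (cb_grab), NULL); */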
> >     >  Erwan
> >     >
> >     >
> >     > 2007/5/14, Michael Smith <msmith at fluendo.com>:
> >     >
> >     >     On Mon, 2007-05-14 at 14:14 +0200, fabien wrote:
> >     >     > Hello,
> >     >     > I tested your code, but it still doesn't work. If I comment
> >     >     > out the video part, the audio plays (from an avi file but not
> >     >     > from an mpeg...). So it must be an error in the video
> >     >     > pipeline. I made a sketch of the pipeline... If anyone could
> >     >     > find the error, it would be of great help to me.
> >     >     > Thank you
> >     >     >
> >     >
> >     >     You're using a capsfilter to force a particular pixel format,
> >     >     size, and
> >     >     framerate.
> >     >
> >     >     You use ffmpegcolorspace (which can convert to the required
> >     >     pixel format), but you don't have anything to convert to the
> >     >     size and framerate you're asking for.
> >     >
> >     >     One specific problem you're likely to run into is that many
> >     >     files have non-square pixels, but ximagesink requires square
> >     >     pixels. You can use the 'videoscale' element to resize
> >     >     appropriately.
> >     >
> >     >     You can also try using videorate to change video framerate.
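> >     >
> >     >     For example, something along these lines should negotiate
> >     >     (untested; the caps values are just an illustration):
> >     >
> >     >     gst-launch-0.10 filesrc location=file.avi ! decodebin ! \
> >     >         ffmpegcolorspace ! videoscale ! videorate ! \
> >     >         video/x-raw-rgb,width=360,height=288,framerate=25/1 ! ximagesink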
> >     >
> >     >     There may be other problems with your code, I didn't read it
> >     >     closely or
> >     >     try it myself.
> >     >
> >     >     Mike
> >     >
> >     >
> >
> >
> >
> > ------------------------------------------------------------------------
> >
> >
> > /* g++ `pkg-config --libs --cflags gstreamer-0.10` -Wall exampleNew.cpp -o exampleNew */
> > /* usage: ./exampleNew videoFileName */
> >
> > #include <gst/gst.h>
> > #include <unistd.h>
> >
> >
> > #define VIDEO_WIDTH 360
> > #define VIDEO_HEIGHT 288
> >
> >
> > GstElement* pipeline;
> > GstElement* pipeVideo;
> > GstElement* pipeAudio;
> >
> >
> > static int ind = 0;
> > static gboolean
> > cb_bus_call (GstBus     *bus,
> >         GstMessage *msg,
> >         gpointer    data)
> > {
> >   GMainLoop *loop = (GMainLoop *) data;
> >   GstState oldstate, newstate, pending;
> >       //g_print ("Got %s message\n", GST_MESSAGE_TYPE_NAME (msg));
> >   switch (GST_MESSAGE_TYPE (msg)) {
> >       case GST_MESSAGE_STATE_CHANGED:
> >
> >        gst_message_parse_state_changed (msg, &oldstate, &newstate, &pending);
> >               //return TRUE;
> >               g_print("State changed, ");
> >
> >               switch(oldstate){
> >                       case GST_STATE_VOID_PENDING:
> >                               g_print("GST_STATE_VOID_PENDING");
> >                       break;
> >                       case GST_STATE_NULL:
> >                               g_print("GST_STATE_NULL");
> >                       break;
> >                       case GST_STATE_READY:
> >                               g_print("GST_STATE_READY");
> >                       break;
> >                       case GST_STATE_PAUSED  :
> >                               g_print("GST_STATE_PAUSED");
> >                       break;
> >                       case GST_STATE_PLAYING:
> >                               g_print("GST_STATE_PLAYING");
> >                       break;
> >                       default :
> >                       break;
> >               }
> >               g_print(" -> ");
> >               switch(newstate){
> >                       case GST_STATE_VOID_PENDING:
> >                               g_print("GST_STATE_VOID_PENDING");
> >                       break;
> >                       case GST_STATE_NULL:
> >                               g_print("GST_STATE_NULL");
> >                       break;
> >                       case GST_STATE_READY:
> >                               g_print("GST_STATE_READY");
> >                       break;
> >                       case GST_STATE_PAUSED  :
> >                               g_print("GST_STATE_PAUSED");
> >                       break;
> >                       case GST_STATE_PLAYING:
> >                               g_print("GST_STATE_PLAYING");
> >                       break;
> >                       default :
> >                       break;
> >               }
> >
> >                       g_print("\n");
> >
> >               //g_print("old state %s, new state %s, pending %s", *oldstate, *newstate, *pending);
> >               break;
> >     case GST_MESSAGE_EOS:
> >       g_print ("End-of-stream\n");
> >       g_main_loop_quit (loop);
> >       break;
> >     case GST_MESSAGE_ERROR: {
> >       gchar *debug;
> >       GError *err;
> >
> >       gst_message_parse_error (msg, &err, &debug);
> >       g_free (debug);
> >
> >       g_print ("Error: %s\n", err->message);
> >
> >       g_error_free (err);
> >
> >       g_main_loop_quit (loop);
> >
> >       break;
> >     }
> >     default:
> >       break;
> >   }
> >
> >   return TRUE;
> > }
> >
> > /* buffer probe callback: counts the buffers read from the source */
> > static gboolean cb_have_data (GstPad *pad, GstBuffer *buffer, gpointer u_data)
> > {
> >       printf("nb buffer %d\n", ind++);
> >       return TRUE;
> > }
> >
> >
> >
> > /* This callback will be called when GStreamer finds some info about the
> >  * video. In this case we want the width & height. */
> > static void cb_notify_caps(GObject *obj,
> >               GParamSpec *pspec, gpointer data)
> > {
> >       GstPad *pad = GST_PAD(obj);
> >       GstCaps *caps;
> >       GstStructure *str;
> >       gint width, height;
> >
> >       if (!(caps = gst_pad_get_caps (pad))) return;
> >       if (!(str = gst_caps_get_structure (caps, 0))) return;
> >       if (!gst_structure_get_int (str, "width", &width) ||
> >               !gst_structure_get_int (str, "height", &height))
> >               return;
> >       g_print("cb_notify_caps    width:%d height:%d\n", width, height);
> >       //video_width = width;
> >       //video_height = height;
> > }
> >
> >
> >
> >
> >
> > /* This callback will be called when GStreamer finds some stream (audio
> or
> >  * video) in the open file. This function links the appropriate
> elements. */
> >
> > // A new pad callback function:
> > static void cb_new_pad (GstElement *element, GstPad *pad, gpointer data)
> > {
> >       GstPad *sinkpad;
> >       GstCaps *caps;
> >       GstStructure *str;
> >       gint i, max;
> >
> >       caps = gst_pad_get_caps (pad);
> >
> >       str = gst_caps_get_structure (caps, 0);
> >       g_print("________________________cb New
> Pad______________________________________________\n");
> >       g_print("GstStructure: %s\n",gst_structure_get_name (str));
> >       /* We can now link this pad with the audio or video  decoder */
> >       g_print ("Dynamic pad created, linking parser/decoder \n");
> >       g_print("-------\n");
> >
> >       // VIDEO
> >       if (g_strrstr (gst_structure_get_name (str), "video"))
> >       {
> >               g_print("OOOOooook Video link\n");
> >
> >               max = gst_structure_n_fields (str); // get the number of fields in the structure
> >
> >               g_print("nb field = %d\n", max);
> >
> >               for(i=0;i<max; i++){
> >                       g_print("\n field name: %s \n", gst_structure_nth_field_name (str, i)); // get the name of each field
> >               }
> >               g_print("GstCaps: %s\n", gst_caps_to_string(caps));
> >               //Now add the video bin to the main pipeline
> >               gst_bin_add(GST_BIN(pipeline), pipeVideo);
> >               //Put the video bin in the READY state (it can misbehave if it is not set to READY first)
> >               gst_element_set_state (pipeVideo, GST_STATE_READY);
> >               //Retrieve the sink pad of pipeVideo (the ghost pad)
> >               sinkpad = gst_element_get_pad (pipeVideo, "sink");
> >
> >               //If the pad is already linked, stop here
> >               if (GST_PAD_IS_LINKED (sinkpad)) return;
> >               else g_print("the pipeline is not yet linked to the sink, linking it now...\n");
> >               //You can add a notify::caps handler here:
> >               //g_signal_connect(sinkpad, "notify::caps", G_CALLBACK(cb_notify_caps), NULL);
> >
> >               //Link the main pipeline pad with the pipeline video pad
> >               if (gst_pad_link(pad, sinkpad)!= GST_PAD_LINK_OK)
> >               {
> >                       g_error("Cannot link video\n");
> >                       return;
> >               }
> >               //Setting the bins to PAUSED here can crash if they are not initialised yet
> >       //      gst_element_set_state (pipeVideo, GST_STATE_PAUSED);
> >       //      gst_element_set_state (pipeAudio, GST_STATE_PAUSED);
> >       //      gst_element_set_state (pipeline, GST_STATE_PAUSED);
> >
> >               g_print("Video pad linked\n");
> >               //gst_element_set_state (pipeVideo, GST_STATE_PLAYING);
> >       }
> >
> >       // AUDIO
> >       if ( g_strrstr (gst_structure_get_name (str), "audio"))
> >       {
> >               g_print("Audio link\n");
> >
> >               gst_bin_add(GST_BIN(pipeline), pipeAudio);
> >               gst_element_set_state (pipeAudio, GST_STATE_READY);
> >               sinkpad = gst_element_get_pad (pipeAudio, "sink");
> >               if (GST_PAD_IS_LINKED (sinkpad)){
> >                       g_print("Audio pad already linked\n");
> >                       return;
> >               }
> >
> >               if (gst_pad_link(pad, sinkpad) != GST_PAD_LINK_OK)
> >               {
> >                       g_error("Cannot link audio\n");
> >                       return;
> >               }
> >               g_print("Audio paused\n");
> >               gst_element_set_state (pipeAudio, GST_STATE_PAUSED);
> >
> >               g_print("Audio playing\n");
> >               //gst_element_set_state (pipeVideo, GST_STATE_PAUSED);
> >               gst_element_set_state (pipeAudio, GST_STATE_PLAYING);
> >       //      gst_element_set_state (pipeline, GST_STATE_PAUSED);
> >               //      g_print("pipe playing2\n");
> >
> >               /*
> >               if(!haveTypeAudio){
> >                       g_print("Audio type not yet found %d\n", video_frames);
> >                       return;
> >               };
> >               */
> >
> >               //gst_object_unref (sinkpad);
> >       }
> >       //gst_element_set_state (pipeline, GST_STATE_PLAYING);
> > }
> >
> >
> >
> >
> >
> >
> >
> >
> > gint main (gint argc, gchar *argv[])
> > {
> >       /* make sure we have input */
> >       if (argc != 2) {
> >               g_print ("Usage: %s <filename>\n", argv[0]);
> >               return -1;
> >       }
> >       GstBus* bus ;
> >       /* initialize GStreamer */
> >       gst_init (&argc, &argv);
> >       GMainLoop *loop = g_main_loop_new (NULL, FALSE);
> >
> >       /* Main pipeline */
> >       pipeline = gst_pipeline_new ("Main pipeline");
> >
> >       GstElement* source = gst_element_factory_make ("filesrc", "file-source");
> >       /* the parser exposes dynamic output pads; you have to link them to your audio and video bins */
> >       GstElement* parser = gst_element_factory_make ("decodebin", "decodebin-parser");
> >
> >       /* Audio Pipeline */
> >       pipeAudio = gst_pipeline_new ("audio-player ");
> >       /* A queue is needed to synchronise with Video thread */
> >       GstElement* aqueue = gst_element_factory_make("queue", "aqueue");
> > //    GstElement* adecoder = gst_element_factory_make ("identity", "identity-decoder-audio");
> >       GstElement* aconv = gst_element_factory_make ("audioconvert", "converteraudio");
> >       /* Identity is useful for adding a handoff signal (to grab a sample) */
> > //    GstElement* aconv2 = gst_element_factory_make ("identity", "identity conv2");
> >       /* With typefind you can retrieve some info about the stream */
> > //    GstElement* afind = gst_element_factory_make ("typefind", "typefindaudio");
> >       GstElement* asink = gst_element_factory_make ("alsasink", "alsa-output");
> >
> >
> >       /* Video Pipeline */
> >       // GstElement*
> >       pipeVideo = gst_pipeline_new ("video-player");
> >       /* queue useful to synchronize with the audio branch */
> >       GstElement* vqueue = gst_element_factory_make("queue", "vqueue");
> >       //GstElement* vdecoder = gst_element_factory_make ("identity", "identity-decoder");
> >       GstElement* vconv = gst_element_factory_make ("ffmpegcolorspace", "convertervideo");
> >       /* Use a capsfilter if you want to convert to RGB (the default output is YUV) */
> > //    GstElement* vcapsfilter = gst_element_factory_make ("capsfilter", "restreint le caps");
> >       /*g_object_set (G_OBJECT (vcapsfilter), "caps",
> >               gst_caps_new_simple ("video/x-raw-rgb",
> >                                    "width",      G_TYPE_INT, VIDEO_WIDTH,
> >                                    "height",     G_TYPE_INT, VIDEO_HEIGHT,
> >                                    "framerate",  GST_TYPE_FRACTION, 25, 1,
> >                                    "bpp",        G_TYPE_INT, 3*8,
> >                                    "depth",      G_TYPE_INT, 3*8,
> >                                    "red_mask",   G_TYPE_INT, 0xff0000,
> >                                    "green_mask", G_TYPE_INT, 0x00ff00,
> >                                    "blue_mask",  G_TYPE_INT, 0x0000ff,
> >                                    NULL),
> >               NULL);*/
> >
> >       /* Put a handoff signal on identity and you can grab each video frame */
> > //    GstElement* vconv2 = gst_element_factory_make ("identity", "identity-vconv2");
> > //    g_signal_connect (vconv2, "handoff", G_CALLBACK (cb_handoff_video), NULL);
> >       /* use typefind if you want to grab some info on the video, like width, height... */
> > //    GstElement* vfind = gst_element_factory_make ("typefind", "typefindVideo2");
> >       //GstElement* vsink = gst_element_factory_make ("fakesink", "video-fake-output");
> >       GstElement* vsink = gst_element_factory_make ("ximagesink", "video-output");
> >
> >       /* Test all elements to check that they were created */
> >       if (!pipeline || !source || !parser) {
> >               g_print ("One basic element could not be created.\n");
> >               if (!pipeline) g_print("pipeline\n");
> >               if (!source)   g_print("source\n");
> >               if (!parser)   g_print("parser\n");
> >               return -1;
> >       }
> >       if (!pipeAudio || !aqueue || /*!adecoder ||*/ !aconv || !asink) {
> >               g_print ("One audio element could not be created.\n");
> >               if (!pipeAudio) g_print("pipeline\n");
> >               if (!aqueue)    g_print("queue\n");
> >               //if (!adecoder)  g_print("decoder\n");
> >               if (!aconv)     g_print("conv\n");
> >               if (!asink)     g_print("sink\n");
> >               return -1;
> >       }
> >       if (!pipeVideo || !vqueue ||/* !vdecoder ||*/ !vconv || !vsink) {
> >               g_print ("One video element could not be created.\n");
> >               if (!pipeVideo) g_print("pipeline\n");
> >               if (!vqueue)    g_print("queue\n");
> >               //if (!vdecoder)  g_print("decoder\n");
> >               if (!vconv)     g_print("conv\n");
> >               if (!vsink)     g_print("sink\n");
> >               return -1;
> >       }
> >
> >       g_object_set (G_OBJECT (source), "location",argv[1], NULL);
> >
> >       /* Add a  bus to catch Information */
> >       /*
> >       GstBus* bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
> > //    gst_bus_add_watch (bus, cb_bus_call, loop);
> >       gst_object_unref (bus);
> >       bus = gst_pipeline_get_bus (GST_PIPELINE (pipeVideo));
> > //    gst_bus_add_watch (bus, cb_bus_call, loop);
> >       gst_object_unref (bus);
> >       */
> >       bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
> >   gst_bus_add_watch (bus, cb_bus_call, loop);
> >   gst_object_unref (bus);
> >  // bus = gst_pipeline_get_bus (GST_PIPELINE (pipeVideo));
> >  // gst_bus_add_watch (bus, cb_bus_call, loop);
> >  // gst_object_unref (bus);
> >
> >   bus = gst_pipeline_get_bus (GST_PIPELINE (pipeAudio));
> >   gst_bus_add_watch (bus, cb_bus_call, loop);
> >   gst_object_unref (bus);
> >
> >       /* Video pipeline */
> >       /* Add element in pipeline */
> >       gst_bin_add_many (GST_BIN (pipeVideo), vqueue, /*vdecoder,*/ vconv, /*vcapsfilter, vconv2, vfind,*/ vsink, NULL);
> >       /* Link elements in the pipeline */
> >       gst_element_link_many (vqueue, /*vdecoder,*/ vconv, /*vcapsfilter, vconv2, vfind,*/ vsink, NULL);
> >
> >       /* Set the ghost pad for the video pipeline (input pad) */
> >       GstPad* pad = gst_element_get_pad (vqueue, "sink");
> >       gst_element_add_pad (pipeVideo, gst_ghost_pad_new ("sink", pad));
> >       gst_object_unref (GST_OBJECT (pad));
> >
> >       /* Audio pipeline */
> >       gst_bin_add_many (GST_BIN (pipeAudio), aqueue, /*adecoder,*/ aconv, /*aconv2, afind,*/ asink, NULL);
> >       gst_element_link_many (aqueue, /*adecoder,*/ aconv, /*aconv2, afind,*/ asink, NULL);
> >       pad = gst_element_get_pad (aqueue, "sink");
> >       gst_element_add_pad (pipeAudio, gst_ghost_pad_new ("sink", pad));
> >       gst_object_unref (GST_OBJECT (pad));
> >
> >       /* Main pipeline */
> >       gst_bin_add_many (GST_BIN (pipeline), source, parser, NULL);
> >       gst_element_link (source, parser);
> >
> >       /* link together - note that we cannot link the parser and
> >       * decoder yet, because the parser uses dynamic pads. For that,
> >       * we set a pad-added signal handler. */
> >       g_signal_connect (parser, "pad-added", G_CALLBACK (cb_new_pad), NULL);
> >
> >
> >       /* Now set to playing and iterate. */
> >       g_print ("Setting to PLAYING\n");
> >       gst_element_set_state (pipeline, GST_STATE_READY);
> >       g_print ("Setting to PLAYING\n");
> >       gst_element_set_state (pipeline, GST_STATE_PLAYING);
> > //    gst_element_set_state (pipeVideo, GST_STATE_PLAYING);
> >
> >       /* wait until it's up and running or failed */
> >       if (gst_element_get_state (pipeline, NULL, NULL, -1) == GST_STATE_CHANGE_FAILURE) {
> >               g_error ("Failed to go into PLAYING state");
> >       }
> >
> >
> >       // buffer probe: cb_have_data is called for each buffer read from the file
> >       GstPad *padObserver = gst_element_get_pad (source, "src");
> >       gst_pad_add_buffer_probe (padObserver, G_CALLBACK (cb_have_data), NULL);
> >       gst_object_unref (padObserver);
> >
> >
> >
> >
> >       g_print ("Running\n");
> >       g_main_loop_run (loop);
> >
> >       /* exit */
> >       printf("exit");
> >       gst_element_set_state (pipeline, GST_STATE_NULL);
> >       gst_object_unref (pipeline);
> >
> >
> > }
> >
> >
> >
> >
> > ------------------------------------------------------------------------
> >
> >
> -------------------------------------------------------------------------
> >
>
>
>
>

