gstreamer video playback, cannot play

Chuck Crisler ccrisler at mutualink.net
Tue Mar 12 07:17:10 PDT 2013


Another answer is buried in the GStreamer docs. Elements are connected by
pads. Pads have caps (capabilities). Those caps describe the stream; all of
the relevant information is there. That is how decodebin works: it
examines the caps and then 'does the right thing'. If you write code to
parse the caps and then switch to the right elements yourself, you will be
duplicating decodebin, though it is a good learning exercise.
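
To make that concrete, here is a minimal sketch, assuming the GStreamer
1.0 API and caps names (0.10 spells some of these differently):
decodebin emits 'pad-added' for each stream it identifies, and the
handler can inspect the caps on the new pad before linking it.

#include <gst/gst.h>

/* Called by decodebin for every stream it has identified. The caps
 * on the new pad carry everything decodebin worked out. */
static void
on_pad_added (GstElement *dec, GstPad *pad, gpointer user_data)
{
  GstElement *sink = GST_ELEMENT (user_data);
  GstCaps *caps = gst_pad_get_current_caps (pad);
  gchar *desc;

  if (caps == NULL)
    caps = gst_pad_query_caps (pad, NULL);
  desc = gst_caps_to_string (caps);

  g_print ("decodebin pad caps: %s\n", desc);

  /* Link raw audio to our sink; a real application would also
   * branch on "video/x-raw" here. */
  if (g_str_has_prefix (desc, "audio/x-raw"))
    gst_element_link (dec, sink);

  g_free (desc);
  gst_caps_unref (caps);
}

/* ...after creating the elements:
 *   g_signal_connect (dec, "pad-added", G_CALLBACK (on_pad_added), sink);
 */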

You also need to understand caps to build encode pipelines. Many elements
are very restrictive in what they accept. If you don't supply what they
need, the pipeline won't work, and it may not be obvious why until you
turn on detailed logging.
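
For example, a sketch of that encode-side problem (1.0 caps names, and
the element names assume the usual GStreamer plugin sets are installed):
an encoder that wants a particular raw audio format can be satisfied by
converting first and pinning the format with a filtered link.

/* Assumes convert, resample and encoder have been added to the
 * same bin. */
GstElement *convert  = gst_element_factory_make ("audioconvert", NULL);
GstElement *resample = gst_element_factory_make ("audioresample", NULL);
GstElement *encoder  = gst_element_factory_make ("vorbisenc", NULL);
GstCaps *caps = gst_caps_new_simple ("audio/x-raw",
    "rate", G_TYPE_INT, 44100,
    "channels", G_TYPE_INT, 2,
    NULL);

/* The filtered link pins the format agreed between the resampler
 * and the encoder. */
gst_element_link (convert, resample);
gst_element_link_filtered (resample, encoder, caps);
gst_caps_unref (caps);

Setting GST_DEBUG=3 (or higher) in the environment turns on the
detailed logging that shows which caps were refused and by whom.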

The short answer is: caps.

On Tue, Mar 12, 2013 at 5:53 AM, Ian Davidson
<id012c3076 at blueyonder.co.uk> wrote:

> There are sounds that we hear and things that we see. In order to
> record these things, they need to be converted to some digital format –
> and there are many formats to choose from. For example, a sound could be
> stored as a WAV file, a WMA file, or an OGG file (among many others). The
> sound could have been recorded at 44100 samples per second (quite
> typical) or perhaps 1100 samples per second (very poor quality).
>
> Many of the elements in a GStreamer pipeline are used to convert the
> data from one format to another. A typical element will have a 'sink'
> pad (where you can 'pour in' some data) and a 'src' pad (which will
> supply data to the next element in the sequence).
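>
> For instance, a minimal sketch in C (GStreamer 1.0 API assumed; the two
> elements are arbitrary choices and 'pipeline' is a GstPipeline created
> earlier): linking two elements connects the 'src' pad of the first to
> the 'sink' pad of the second.
>
> GstElement *vol  = gst_element_factory_make ("volume", "vol");
> GstElement *conv = gst_element_factory_make ("audioconvert", "conv");
>
> gst_bin_add_many (GST_BIN (pipeline), vol, conv, NULL);
>
> /* Data will flow out of vol's src pad into conv's sink pad. */
> if (!gst_element_link (vol, conv))
>   g_printerr ("volume and audioconvert could not agree on caps\n");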
>
> Obviously, the data will need to come from somewhere – so if you are
> reading a file, you will have a 'filesrc', which will pick up the data
> from a file and pass it on to the next element; such an element has a
> 'location' property to tell it where to find the file.
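>
> In code that might look like this (a sketch; the path is just a
> placeholder):
>
> GstElement *src = gst_element_factory_make ("filesrc", "src");
>
> /* 'location' tells filesrc which file to read. */
> g_object_set (src, "location", "/home/me/example.wav", NULL);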
>
> At the other end of the pipeline you will want to do something with the
> data you have been manipulating, so you will choose a 'sink' such as
> 'alsasink' to play the sound.
>
> In between, you will need to transform the data that you read from disk
> so that it can be played – so if your input is a WAV file, you will need
> 'wavparse' to parse the container and extract the audio.
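>
> Putting those pieces together, a complete playback program might look
> like this sketch (GStreamer 1.0 assumed – on old 0.10 releases wavparse
> only exposed its src pad after reading the header, so a 'pad-added'
> callback was needed there; the file path is a placeholder):
>
> #include <gst/gst.h>
>
> int
> main (int argc, char *argv[])
> {
>   gst_init (&argc, &argv);
>
>   GstElement *pipeline = gst_pipeline_new ("wav-play");
>   GstElement *src   = gst_element_factory_make ("filesrc", "src");
>   GstElement *parse = gst_element_factory_make ("wavparse", "parse");
>   GstElement *conv  = gst_element_factory_make ("audioconvert", "conv");
>   GstElement *sink  = gst_element_factory_make ("alsasink", "sink");
>
>   g_object_set (src, "location", "/home/me/example.wav", NULL);
>
>   gst_bin_add_many (GST_BIN (pipeline), src, parse, conv, sink, NULL);
>   gst_element_link_many (src, parse, conv, sink, NULL);
>
>   gst_element_set_state (pipeline, GST_STATE_PLAYING);
>
>   /* Block until an error or end-of-stream message arrives. */
>   GstBus *bus = gst_element_get_bus (pipeline);
>   GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
>       GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
>
>   if (msg != NULL)
>     gst_message_unref (msg);
>   gst_object_unref (bus);
>   gst_element_set_state (pipeline, GST_STATE_NULL);
>   gst_object_unref (pipeline);
>   return 0;
> }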
>
> It may be that 'wavparse' is not always ready to accept data when your
> source wants to deliver it, so it can be a good idea to include a 'queue'
> between elements; a queue buffers data and decouples the elements on
> either side of it.
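>
> A queue is just another element, so it drops into the chain the same
> way (a fragment reusing the names from the sketch above, with the queue
> inserted in place of the direct parse-to-conv link):
>
> /* queue buffers data and runs the elements on either side of it
>  * in separate streaming threads. */
> GstElement *q = gst_element_factory_make ("queue", "q");
>
> gst_bin_add (GST_BIN (pipeline), q);
> gst_element_link_many (parse, q, conv, NULL);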
>
> I don't know exactly what you want to do. Go to
> http://gstreamer.freedesktop.org/documentation/ and follow the link to
> “Overview of all Plug-ins” to see what plug-ins are available. Many of
> the descriptions give an example of how you might use them.
>
> Start by building a test pipeline using gst-launch. Once that works,
> write a program that replicates the pipeline you tested.
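>
> For example (the path is a placeholder):
>
>   gst-launch filesrc location=/home/me/example.wav ! wavparse ! audioconvert ! alsasink
>
> (with GStreamer 1.0 the command is gst-launch-1.0). A program can then
> reuse the same pipeline description verbatim – a sketch:
>
> GError *err = NULL;
> GstElement *pipeline = gst_parse_launch (
>     "filesrc location=/home/me/example.wav ! wavparse "
>     "! audioconvert ! alsasink", &err);
>
> if (pipeline == NULL) {
>   g_printerr ("parse error: %s\n", err->message);
>   g_clear_error (&err);
>   return -1;
> }
> gst_element_set_state (pipeline, GST_STATE_PLAYING);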
>
> On 12/03/2013 02:06, aero wrote:
>
>> In brief, my question is: what elements are required to play audio and
>> video, and how do I connect them? If the choice of elements depends on
>> the format of the file (ogg, mp3, avi, etc.), how do I know what to use
>> for which type?
>>
>> I just want to get a basic idea of the properties of elements before I
>> do any programming.