[gst-devel] Re: Comparison: MAS, GStreamer, NMM
mlohse at cs.uni-sb.de
Wed Aug 25 09:02:20 CEST 2004
Thomas Vander Stichele wrote:
>>(1) Allow the playback of an encoded audio file (e.g. MP3). This will
>>result in similar setups: a component for reading data from a
>>file connected to a component for decoding connected to a component
>>for audio output. (Together, this is called "pipeline" or "flow
>>(2) Set the filename of the file to be read.
>>(3) Manually request/setup this functionality, i.e. no automatic setup
>>of flow graphs.
>>(4) Include some error handling.
>I'm curious about (3) - why should it not be done automatically ? So
>you're saying you just want an application that can only play mp3's ?
>Personally I think a much better test would be to have a helloworld that
>can take a media file and just play it, whatever type it is. That's
>what users care about, anyway.
Sure, there are a lot of users that simply want to play back files. But
as stated in my first email, this comparison is intended to compare the
programming model and the APIs to be used by programmers.
Automatic setup of flow graphs is definitely a nice feature. However, a
programmer will have to use the API to develop such a feature. Providing
some "feeling" for that was the idea of this first example. Furthermore,
such a feature is definitely not something that belongs to the core API
of any multimedia framework; it's an extension built on top of the core.
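The manually wired setup described above (helloworld I) can be sketched as a toy model. All names here (FileSource, Decoder, AudioSink, PipelineError) are invented for illustration and do not correspond to the API of GStreamer, NMM, MAS, or any other real framework; the point is only the shape of the programming model: explicit components, explicit links, explicit error handling.

```python
# Toy model of helloworld I: three hand-wired components.
# Every class and method name here is invented for illustration;
# this is not the API of any real multimedia framework.

class PipelineError(Exception):
    """Raised when a component is misconfigured."""

class FileSource:
    def __init__(self):
        self.filename = None          # (2) the filename is set explicitly

    def pull(self):
        if self.filename is None:
            raise PipelineError("no filename set")   # (4) error handling
        return f"encoded({self.filename})"

class Decoder:
    def __init__(self, upstream):
        self.upstream = upstream      # (3) links are made manually

    def pull(self):
        # Pretend to decode the compressed data to raw PCM.
        return self.upstream.pull().replace("encoded", "pcm")

class AudioSink:
    def __init__(self, upstream):
        self.upstream = upstream

    def play(self):
        # A real sink would hand the buffers to the audio device.
        return self.upstream.pull()

def build_pipeline(filename):
    # (1) reader -> decoder -> audio output, wired by hand.
    src = FileSource()
    src.filename = filename
    return AudioSink(Decoder(src))
```

A real framework adds buffering, threading, and negotiation behind such links, but the comparison targets exactly this surface: how much code each API needs to express the same three-component graph.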
>>In a second step, we would like to extend the helloworld program with
>>the following feature (helloworld II):
>>(1) Add a listener that gets notified if the currently playing file
>>has ended, i.e. this listener is to be triggered after the last byte
>>was played by the audio device.
>What sort of thing is your listener ? An in-program function callback ?
>Another process ? Something else ?
Just take a look at our source code:
So, basically, it is an in-program function callback.
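A minimal sketch of such an in-program end-of-stream callback follows. The class and method names are invented for illustration and match no real framework API; the idea is just that a plain function is registered and fired after the last buffer has been handed to the device.

```python
# Toy sketch of an in-program end-of-stream listener.
# AudioPlayback, add_eos_listener, and play are invented names,
# not the API of any real multimedia framework.

class AudioPlayback:
    def __init__(self):
        self._eos_listeners = []

    def add_eos_listener(self, callback):
        # Register a plain in-program function callback.
        self._eos_listeners.append(callback)

    def play(self, buffers):
        for _ in buffers:
            pass  # a real sink would hand each buffer to the device here
        # (1) trigger the listeners after the last buffer was played
        for callback in self._eos_listeners:
            callback()
```

In a real framework the notification typically arrives from the sink's streaming thread or an event loop, which is exactly the kind of API difference the comparison wants to expose.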
>>In a final step, we would like to extend the helloworld program
>>(helloworld I) to allow for distributed playback (helloworld III):
>>(1) The component for reading data from a file should be located on the
>>local host. The components for decoding and playing the audio data should
>>be located on a remote host.
>>Notice that this third example should also demonstrate how easy (or
>>painful) it is to develop networked multimedia applications using the
>>particular framework. We hope that this will finally show that
>>developing distributed multimedia applications means more than "well,
>>simply write a component for streaming data and put that into your
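To make concrete why "simply write a streaming component" understates the problem, here is a self-contained toy of the transport underneath helloworld III: a local reader pushing encoded data over TCP to a "remote" sink (run here on the loopback interface). Everything a real framework must add on top of this (connection setup across hosts, format negotiation, buffering, clock synchronization, error recovery) is exactly what the example is meant to surface. The function names are invented for illustration.

```python
# Toy transport for distributed playback: a local source streams bytes
# over TCP to a remote sink (simulated on 127.0.0.1). Invented names;
# not the API of any real multimedia framework.
import socket
import threading

def remote_sink(listen_sock, received):
    # The "remote host": accept one connection and consume the stream.
    conn, _ = listen_sock.accept()
    with conn:
        while chunk := conn.recv(4096):
            received.append(chunk)  # a real sink would decode and play

def run_demo(payload: bytes) -> bytes:
    # Local side: open a listening socket, start the sink, stream the data.
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
    srv.listen(1)
    received = []
    sink = threading.Thread(target=remote_sink, args=(srv, received))
    sink.start()
    with socket.create_connection(srv.getsockname()) as src:
        src.sendall(payload)        # the file-reading component's output
    sink.join()
    srv.close()
    return b"".join(received)
```

Even this stripped-down version already needs explicit lifecycle management (accept, close, join); distributing actual decoding and playback multiplies that.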
>Not sure why the third is important. While it's important for
>multimedia frameworks to be able to do things like this, I don't see the
>value of this for a desktop environment. Can you provide a use case
>where this makes sense ?
There are a lot of examples; just take a look at the slides of our talk.
>Also, I don't think it's smart to do this for audio only. We all know
>audio is the easiest to get right anyway, and audio presents a lot less
>challenge to frameworks.
Yes, there might be other examples. However, the idea was to compare the
basic programming model and API of the frameworks.
>I'm sure I could come up with other things that are important to be
>tested, I'll think about it some more.
Yes, sure. However, before everyone comes up with more things to
compare, I think it would be great if we could first finish this comparison.
Again: the idea was to compare the basic programming model and API of
the frameworks, and we think the provided examples are very suitable
for that.
Have fun, Marco.