shared wasabi implementation
Magnus Bergman
magnus.bergman at observer.net
Mon Feb 26 06:18:36 PST 2007
On Sat, 24 Feb 2007 17:43:48 +0100
"Mikkel Kamstrup Erlandsen" <mikkel.kamstrup at gmail.com> wrote:
> 2007/2/23, Magnus Bergman <magnus.bergman at observer.net>:
> >
> > On Fri, 23 Feb 2007 21:26:26 +0800
> > "Fabrice Colin" <fabrice.colin at gmail.com> wrote:
> >
> > > On 2/23/07, Magnus Bergman <magnus.bergman at observer.net> wrote:
> > > > What gstreamer does is to set up a pipeline for handling media
> > > > streams in a very generic way. The pipeline consists of a source
> > > > element, any number of filter elements and a sink element (which
> > > > could be an indexer or a document viewer). There are currently
> > > > no filter elements for handling text (except for subtitles and
> > > > such). But filters for converting, let's say word documents to
> > > > plain text, could very well be implemented as gstreamer
> > > > elements. The benefit from this is that the documents are
> > > > streamed instead of first downloaded (if necessary) and then
> > > > converted (all in-process), which is of course much faster. And
> > > > gstreamer can automatically figure out which elements are
> > > > needed and set up an appropriate pipeline for doing the job.
> > > >
> > > OK, I see. I agree that streaming documents is a better approach.
> > >
> > > Most, if not all, third-party libraries to access and manipulate
> > > the document types I am interested in don't allow for document
> > > streaming. Basically, if I adopted a stream-based approach, I
> > > wouldn't be able to make use of the wealth of code that's been
> > > written, tested, debugged and optimized by others, and writing
> > > filters would be a much more difficult task.
> >
> > That's true, but not a problem as I see it. Both approaches could be
> > used in parallel. A gstreamer based pipeline could be used as a
> > stand alone filter. And an element for gstreamer wrapping stand
> > alone filters could also easily be written (there might even
> > already be one packaged with gstreamer).
> >
> > > Jos tried to get me to adopt the document streaming code he wrote
> > > for Strigi. I declined for the same reason. The time I would spend
> > > writing low-level document decoders is better spent elsewhere.
> >
> > My point was not so much to prefer streaming over stand alone
> > filters. But rather to prefer gstreamer over some other streaming
> > framework. Since we already have gstreamer which is relatively
> > robust and well tested it would be a waste to add another streaming
> > framework as well.
>
> Forgive me for being naive - but I've always just associated gstreamer
> with audio and video streams, never with document streaming on a more
> abstract level.
GStreamer itself is in no way limited to any particular type of stream.
Any mimetype can be handled if there are elements to handle it. So I
guess you could say GStreamer works on a more abstract level than other
audio/video frameworks. All it does is create a pipeline by linking up
some elements and stream data through it. Even the outputs (like ALSA
sound and X video) are handled by elements, which can be replaced.
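
To make that concrete, here is a minimal sketch of building such a
pipeline with the GStreamer C API. It just pipes a file through the
stock "identity" element into another file; "identity" stands in for
whatever filter element you actually want, and the file names are
placeholders:

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *src, *filter, *sink;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* The pipeline is just a bin of linked elements; any of them can be
   * swapped for another element with compatible pads. */
  pipeline = gst_pipeline_new ("example");
  src    = gst_element_factory_make ("filesrc",  "source");
  filter = gst_element_factory_make ("identity", "filter");
  sink   = gst_element_factory_make ("filesink", "sink");

  g_object_set (src,  "location", "input.dat",  NULL);
  g_object_set (sink, "location", "output.dat", NULL);

  gst_bin_add_many (GST_BIN (pipeline), src, filter, sink, NULL);
  gst_element_link_many (src, filter, sink, NULL);

  /* Stream the data through and wait for end-of-stream (or an error). */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_poll (bus, GST_MESSAGE_EOS | GST_MESSAGE_ERROR, -1);
  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}

The point is simply that the source, the filters and the sink are all
ordinary elements, so any of them can be swapped out (an HTTP source,
an indexer sink, ...) without the rest of the pipeline caring.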
> Can you give some examples of how you've used gstreamer in non
> audio/video cases?
A simple example is an element which translates XML to plain text. The
element tells GStreamer which mimetype it produces (text/plain) and
which mimetype(s) it reads (text/xml). Just install that element and
GStreamer will be able to translate XML to plain text in the same way
it converts mp3 to ogg, for example. I would give you the source if I
could, but I'm not allowed to. But just wrapping a SAX parser in a
GStreamer element is quite simple, and anyone slightly familiar with
GStreamer can probably do it to provide an example.
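
For what it's worth, the core of such an element could look roughly
like this (GStreamer 0.10 API; the "xmltotext" name is made up, the
GObject/plugin registration boilerplate is left out, and it assumes
libxml2's push parser for the SAX part):

#include <string.h>
#include <gst/gst.h>
#include <libxml/parser.h>

/* Pad templates advertise the mimetypes, which is what lets GStreamer
 * plug the element in between an XML producer and a text consumer. */
static GstStaticPadTemplate sink_template =
    GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS,
        GST_STATIC_CAPS ("text/xml"));
static GstStaticPadTemplate src_template =
    GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS,
        GST_STATIC_CAPS ("text/plain"));

typedef struct {
  GstPad *srcpad;
  xmlParserCtxtPtr ctxt;   /* created in the element's init function with
                            * xmlCreatePushParserCtxt (&sax, self, NULL, 0,
                            * NULL), where sax only sets .characters */
} XmlToText;

/* SAX callback: element content ends up here and is pushed downstream
 * as a plain text buffer. */
static void
on_characters (void *user_data, const xmlChar *ch, int len)
{
  XmlToText *self = user_data;
  GstBuffer *out = gst_buffer_new_and_alloc (len);

  memcpy (GST_BUFFER_DATA (out), ch, len);
  gst_pad_push (self->srcpad, out);
}

/* Chain function: called for every buffer arriving on the sink pad;
 * just feed the XML chunk to the push parser. */
static GstFlowReturn
xmltotext_chain (GstPad *pad, GstBuffer *buf)
{
  XmlToText *self = gst_pad_get_element_private (pad);

  xmlParseChunk (self->ctxt, (const char *) GST_BUFFER_DATA (buf),
      GST_BUFFER_SIZE (buf), 0 /* not the last chunk */);
  gst_buffer_unref (buf);
  return GST_FLOW_OK;
}

The rest is ordinary element boilerplate: registering the pad
templates, creating the pads and installing the chain function.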