GStreamer Android/Linux portability

Andoni Morales ylatuya at gmail.com
Tue Feb 11 03:36:03 PST 2014


2014-02-11 11:55 GMT+01:00 Lee Matthews <lma at spaceapplications.com>:

> Hi,
>
> I'm new to using GStreamer and I'm in need of some general advice
> regarding its use.
>
> I'm developing some software that needs to get audio from a microphone and
> stream it over the network. Other devices need to connect to my audio
> software component to receive audio streams over the network.
>
> The audio component that I am developing needs to be built for Linux and
> Android platforms.
>
> My current thinking is to solve the problem in the following fashion:
>
> Linux: Get raw audio via PulseAudio, feed this as a raw buffer into
> gstreamer using appsrc, encode using gstreamer, send stream to network
> socket
>
> Android: Get raw audio via OpenSL ES, feed this as a raw buffer into
> gstreamer using appsrc, encode using gstreamer, send stream to network
> socket
>
> I would then build some kind of abstraction layer that lets me choose
> between sourcing the data from PulseAudio or OpenSL ES depending on what
> the final build target is.
>
> Some questions:
>
> 1) I'm not sure that this is the best way of doing this; do you think
> that this sounds like a good approach?
>
> 2) Instead, could I simply use exactly the same implementation on both
> Linux and Android platforms, just using GStreamer? So, for example,
> instead of using appsrc to push the raw buffer data into the GStreamer
> pipeline, could I use the autoaudiosrc element to grab the audio stream
> from the microphone independently of the platform?
>
> 3) If I do implementation (2), would the GStreamer code need to change
> between Android and Linux?
>

Hi Lee,

GStreamer already provides source elements for PulseAudio on Linux and for
OpenSL ES on Android, named pulsesrc and openslessrc, so you don't have to
write anything on your side to capture data and push it into an appsrc; you
can use those source elements directly to feed the encoder. There is also
an autoaudiosrc element that will pick the best audio source element on
your platform, completely abstracting the backend being used.
The resulting pipeline would work transparently on both platforms and would
look like:
autoaudiosrc ! encoder ! networksink
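
As a minimal sketch (my own illustration, not a complete implementation),
the same pipeline can be built with gst_parse_launch() on both platforms.
The Opus encoder, RTP payloader, udpsink and the host/port values below are
assumptions; substitute whatever encoder and network protocol you actually
need:

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;
  GError *error = NULL;

  gst_init (&argc, &argv);

  /* autoaudiosrc should pick pulsesrc on Linux and openslessrc on Android.
   * opusenc/rtpopuspay/udpsink and the host/port are illustrative choices. */
  pipeline = gst_parse_launch (
      "autoaudiosrc ! audioconvert ! audioresample ! "
      "opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=5000",
      &error);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Block until an error or end-of-stream is posted on the bus. */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}

On a Linux development machine you can prototype the same pipeline string
with gst-launch-1.0 before wiring it into the application; the
pipeline-building code stays the same on Android once GStreamer has been
initialised there.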

Cheers,
Andoni

>
> Thanks in advance
> Lee



-- 
Andoni Morales Alastruey

LongoMatch: The Digital Coach
http://www.longomatch.ylatuya.es