2014-02-11 11:55 GMT+01:00 Lee Matthews <lma@spaceapplications.com>:
> Hi,
>
> I'm new to using GStreamer and I'm in need of some general advice regarding its use.
>
> I'm developing some software that needs to get audio from a microphone and stream it over the network. Other devices need to connect to my audio software component to receive audio streams over the network.
>
> The audio component that I am developing needs to be built for the Linux and Android platforms.
>
> My current thinking is to solve the problem in the following fashion:
>
> Linux: get raw audio via PulseAudio, feed this as a raw buffer into GStreamer using appsrc, encode using GStreamer, send the stream to a network socket.
>
> Android: get raw audio via OpenSL ES, feed this as a raw buffer into GStreamer using appsrc, encode using GStreamer, send the stream to a network socket.
>
> I would then build some kind of abstraction layer that lets me choose between sourcing the data from PulseAudio or OpenSL ES, depending on the final build target.
>
> Some questions:
>
> 1) I'm not sure that this is the best way of doing this; do you think it sounds like a good approach?
>
> 2) Could I instead use exactly the same implementation on both Linux and Android, using only GStreamer? For example, instead of using appsrc to push the raw buffer data into the GStreamer pipeline, could I use the autoaudiosrc element to grab the audio stream from the microphone regardless of platform?
>
> 3) If I do implementation (2), would the GStreamer code need to change between Android and Linux?

Hi Lee,

GStreamer already provides source elements for PulseAudio on Linux and for OpenSL ES on Android, named pulsesrc and openslessrc, so you don't have to write anything on your side to capture data and push it into an appsrc: you can use those source elements directly to feed the encoder. There is also an autoaudiosrc element that will pick the best audio source on your platform, completely abstracting away the backend being used.
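
For instance, to sanity-check the capture on Linux you could run something like the following (Opus over RTP/UDP is only an illustrative choice of encoder and network sink, and the host/port values are placeholders):

gst-launch-1.0 pulsesrc ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=5000

On Android the only thing that would change is the source element: openslessrc instead of pulsesrc.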

The resulting pipeline would work transparently on both platforms and would look like:

autoaudiosrc ! encoder ! networksink
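
Filled in with the same illustrative elements as above, that could be:

autoaudiosrc ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=5000

You can pass that exact description string to gst_parse_launch() in your application, so the application code itself stays identical on Linux and Android (which also answers your question 3).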

Cheers,
Andoni

> Thanks in advance
> Lee

--
Andoni Morales Alastruey

LongoMatch: The Digital Coach
http://www.longomatch.ylatuya.es