Getting started with HW-accelerated H.264 streaming on Android

nessup nessup at gmail.com
Sat Sep 21 22:07:28 PDT 2013


Hi there,

I'm working on a somewhat ambitious project: I have an OMAP4430-based
Android device for which I would like to develop a camera streaming app.
I've read that GStreamer supports OpenMAX for hardware-accelerated encoding,
which I'd love to use.

I've been studying the GStreamer docs to figure out how to start this
project, but I would love it if someone could give me a rough outline of the
plugins I should check out and the general problems I should be aware of.

So far my plan of action is this:
1. Create an H.264 -> RTSP/RTP server pipeline (a rough sketch of what I
mean follows this list)
2. Modify one of the tutorial sample code projects to load that pipeline
3. Test by intercepting the stream via gst-launch or VLC on a local endpoint
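To make step 1 more concrete, here is the rough shape of the sender side I'm
imagining. This is completely untested: I'm writing against the 1.0-style C
API, I'm guessing that the OpenMAX encoder element is called omxh264enc (the
name may differ between gst-omx and gst-openmax builds), the host/port are
just my test machine, and I'm starting from videotestsrc so I don't have to
deal with the camera yet. It sends plain RTP over UDP rather than running a
full RTSP server for now:

/* sender.c - minimal, untested sketch.
 * Assumptions: GStreamer 1.0 API, an OpenMAX encoder element named
 * "omxh264enc" (could also be "omx_h264enc" depending on the plugin build),
 * and 192.168.1.100:5000 as the machine I test from.
 */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GMainLoop *loop;
  GError *error = NULL;

  gst_init (&argc, &argv);

  pipeline = gst_parse_launch (
      "videotestsrc is-live=true "
      "! omxh264enc ! h264parse "
      "! rtph264pay config-interval=1 pt=96 "
      "! udpsink host=192.168.1.100 port=5000",
      &error);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Run until interrupted; a real app would watch the bus for errors/EOS. */
  loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  g_main_loop_unref (loop);
  return 0;
}

For step 3 I'd then try to receive this on my desktop with a gst-launch
pipeline along the lines of udpsrc -> rtph264depay -> H.264 decoder ->
autovideosink (or an SDP file for VLC), and only once that works look into
gst-rtsp-server for proper RTSP.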

My questions are as follows:
1. When I base my project on the tutorial code, will it come with the
necessary plugins to a) create an RTSP/RTP server and b) take advantage of
OpenMAX-accelerated H.264 encoding? If not, how do I build them for
Android?

2. Is GStreamer on Android capable of capturing video from the camera and
audio from the microphone itself? Or do I need to pass the raw frames into
GStreamer myself? If it's the latter, the appsrc sketch below is roughly
what I'm imagining.
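In case I do have to push frames in myself, here is the kind of thing I'm
picturing: grab NV21 preview buffers from the Android Camera callback on the
Java side, hand them over JNI, and push them into an appsrc at the head of
the encode pipeline. Again this is an untested sketch against the 1.0 API;
the element name "camsrc", the caps, and the fixed 30 fps duration are all
assumptions on my part:

/* Untested sketch of feeding raw camera frames to GStreamer via appsrc.
 * Assumes the pipeline contains something like:
 *   appsrc name=camsrc ! videoconvert ! omxh264enc ! rtph264pay ! udpsink ...
 * and that caps such as "video/x-raw,format=NV21,width=640,height=480,
 * framerate=30/1" have already been set on the appsrc
 * (e.g. with gst_app_src_set_caps()).
 */
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <string.h>

/* Called (e.g. from a JNI wrapper) with one raw preview frame. */
static void
push_frame (GstElement *appsrc, const guint8 *data, gsize size,
    GstClockTime pts)
{
  GstBuffer *buf;
  GstMapInfo map;

  /* Copy the raw frame into a new GstBuffer. */
  buf = gst_buffer_new_allocate (NULL, size, NULL);
  gst_buffer_map (buf, &map, GST_MAP_WRITE);
  memcpy (map.data, data, size);
  gst_buffer_unmap (buf, &map);

  /* Timestamp the buffer; assuming a fixed 30 fps here. */
  GST_BUFFER_PTS (buf) = pts;
  GST_BUFFER_DURATION (buf) = GST_SECOND / 30;

  /* appsrc takes ownership of the buffer. */
  gst_app_src_push_buffer (GST_APP_SRC (appsrc), buf);
}

If GStreamer can drive the camera directly on Android I'd obviously prefer
that, since it would avoid the extra copy and the JNI round trip.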

3. Are there any open-source projects out there (teleconferencing
solutions, etc.) for Android based on GStreamer that would be worthwhile for
me to look at?

I apologize in advance, as I'm still a GStreamer noob and am trying to wrap
my head around everything. Thanks!




