Android Port: How to access camera using Gstreamer

David Röthlisberger david at rothlis.net
Thu Sep 12 04:06:53 PDT 2013


On 11 Sep 2013, at 14:46, yoyosuper8 wrote:
> 
>>> my thought
>>> was to somehow use the Android camera API and transfer the video buffer to
>>> JNI, so that I can provide that as my input source for the gst_parse_launch()
>>> method. But I'm still not sure how I can achieve this.
>> 
>> If you want to go this way, instead of implementing a plugin, you can
>> use an appsrc to push your data to GStreamer.
> 
> I hate being a noob at something. I'm just completely lost. How do I use
> appsrc? I understand it is a plugin, but how do I import it into my Android
> project? How do I use it in the JNI code from tutorial 3 of the Android
> GStreamer tutorials? How do I write the Java portion of the app that
> transfers the camera video feed to appsrc?

Type "gst-inspect appsrc" to see its "element actions" (procedure calls)
like "push-buffer".

You'll use gst_parse_launch to create a pipeline like "appsrc ! ...".
Then you'll create a GStreamer buffer from the image you have retrieved
from the camera, add the appropriate metadata to the GStreamer buffer
(timestamp, caps), and then push the buffer to the appsrc element.
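Something along these lines (an untested sketch against the GStreamer 1.0
C API; the pipeline string, the NV21 640x480 caps and the 30fps framerate
are only placeholder assumptions -- use whatever your camera preview
callback actually delivers):

  #include <gst/gst.h>
  #include <gst/app/gstappsrc.h>

  static GstElement *pipeline, *appsrc;
  static guint64 frame_count = 0;

  static void
  setup_pipeline (void)
  {
    pipeline = gst_parse_launch (
        "appsrc name=camsrc ! videoconvert ! autovideosink", NULL);
    appsrc = gst_bin_get_by_name (GST_BIN (pipeline), "camsrc");

    /* Tell appsrc we will provide timestamped buffers. */
    g_object_set (appsrc, "format", GST_FORMAT_TIME, NULL);

    /* Caps describing the raw frames; adjust to your camera's output. */
    GstCaps *caps = gst_caps_new_simple ("video/x-raw",
        "format", G_TYPE_STRING, "NV21",
        "width", G_TYPE_INT, 640,
        "height", G_TYPE_INT, 480,
        "framerate", GST_TYPE_FRACTION, 30, 1, NULL);
    gst_app_src_set_caps (GST_APP_SRC (appsrc), caps);
    gst_caps_unref (caps);

    gst_element_set_state (pipeline, GST_STATE_PLAYING);
  }

  /* Call this (via JNI) for every frame the Android camera hands you. */
  static void
  push_frame (const guint8 *data, gsize size)
  {
    GstBuffer *buf = gst_buffer_new_allocate (NULL, size, NULL);
    gst_buffer_fill (buf, 0, data, size);

    /* Timestamp the buffer, assuming a fixed 30fps rate here. */
    GST_BUFFER_PTS (buf) = gst_util_uint64_scale (frame_count, GST_SECOND, 30);
    GST_BUFFER_DURATION (buf) = gst_util_uint64_scale (1, GST_SECOND, 30);
    frame_count++;

    /* The "push-buffer" action signal that gst-inspect lists; it does
     * not take ownership of the buffer, so unref it afterwards. */
    GstFlowReturn ret;
    g_signal_emit_by_name (appsrc, "push-buffer", buf, &ret);
    gst_buffer_unref (buf);
  }

On the Java side you'd call the native method that wraps push_frame()
from your Camera.PreviewCallback's onPreviewFrame() -- but again, that's
just the obvious wiring, not something I've tried on Android myself.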

I do something similar, but in Python (and nothing to do with Android):
creating a GStreamer buffer from an image held in memory in an array,
and pushing it to an appsrc:
https://github.com/drothlis/stb-tester/blob/0.15/stbt.py#L865

Of course it would be cleaner to have a GStreamer source element that
wraps the Android camera API, as suggested earlier in the thread. If
your project needs this badly enough you might consider paying somebody
to write one; it might be cheaper & faster than doing it yourself. :-)


