Hi, <br>I know the Android Multimedia Framework has used Stagefright and OpenCore. But now I am trying to use the audio/video
streaming functionality through <b>gstreamer</b> commands (as well as C applications that use them) in the adb shell.<br>
<br>For example, when I plug a web cam into my Ubuntu machine and issue this command --> <b>gst-launch-0.10 autovideosrc ! video/x-raw-yuv,framerate=\(fraction\)30/1,width=640,height=480 ! ffmpegcolorspace ! autovideosink</b><br>
the web cam turns on and I can see its video on the Linux machine.<br><b>Now I want to do the same on an Android
phone: issue the same command (with some modifications, after
cross compiling gstreamer for the Android phone or tablet).<br></b>
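<br>For reference, this is roughly the C-application form of that pipeline I have in mind (a minimal, untested sketch assuming the gstreamer-0.10 development headers are installed; it just wraps the same pipeline string with gst_parse_launch):<br>
<pre>
/* Minimal sketch: same pipeline as the gst-launch-0.10 command above,
 * built with gstreamer-0.10 (untested). */
#include &lt;gst/gst.h&gt;

int main(int argc, char *argv[])
{
    GstElement *pipeline;
    GMainLoop *loop;
    GError *error = NULL;

    gst_init(&amp;argc, &amp;argv);
    loop = g_main_loop_new(NULL, FALSE);

    /* Same pipeline description that gst-launch-0.10 was given. */
    pipeline = gst_parse_launch(
        "autovideosrc ! video/x-raw-yuv,framerate=30/1,width=640,height=480 "
        "! ffmpegcolorspace ! autovideosink", &amp;error);
    if (pipeline == NULL) {
        g_printerr("Failed to build pipeline: %s\n", error->message);
        g_error_free(error);
        return -1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(loop);                 /* run until interrupted */

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    g_main_loop_unref(loop);
    return 0;
}
</pre>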
<br>I have found this <a href="http://gstreamer.freedesktop.org/wiki/GstreamerAndroid_InstallInstructions" target="_blank">link</a> for compiling gstreamer for the Nexus S. Before getting started I want to ask:<br>(1) Has anyone tried cross compiling <b>gstreamer</b> for any Android phone and using it (from an application or via commands)?<br>
(2) Is it possible to compile it using the NDK toolchain and put the <b>gstreamer</b>
code inside our Android application by creating a shared library out of
it, then loading and using its functionality in the app code (see the sketch below)?<br>
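<br>For question (2), this is the kind of JNI wrapper I imagine building into a shared library with the NDK and loading from the app via System.loadLibrary (a rough sketch only; the package/class/library names are made-up examples, and it just starts a test pipeline to check that the cross-compiled gstreamer runs on the device at all):<br>
<pre>
/* Rough sketch: native side of a hypothetical libgstnative.so built with the NDK
 * and linked against the cross-compiled gstreamer-0.10 libraries.
 * Java side (hypothetical): System.loadLibrary("gstnative"); then call nativeRunPipeline(). */
#include &lt;jni.h&gt;
#include &lt;gst/gst.h&gt;

JNIEXPORT void JNICALL
Java_com_example_gsttest_GstNative_nativeRunPipeline(JNIEnv *env, jobject thiz)
{
    GstElement *pipeline;
    GError *error = NULL;

    gst_init(NULL, NULL);

    /* videotestsrc instead of a camera source, just to verify that
     * gstreamer initializes and runs on the device. */
    pipeline = gst_parse_launch("videotestsrc ! fakesink", &amp;error);
    if (pipeline == NULL) {
        g_printerr("Pipeline error: %s\n", error ? error->message : "unknown");
        if (error) g_error_free(error);
        return;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    /* ... a real app would run a main loop / watch the bus here ... */
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
}
</pre>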
<br>I would also like to request:<br><br><b>(1) Any tutorial/blog link if you have done this before</b><br><br>Thanks in advance. I will keep adding my work so far to this post.<br>Please assist and send your inputs.<br>Rgds,<br>Softy<br clear="all">
"..pain is temporary.....quitting lasts forever......"<br><br>