<div dir="ltr">Thank you, Sebastian, I'll dig into the documentation.<br></div><br><div class="gmail_quote">Fri, 3 Apr 2015 at 22:38, Sebastian Dröge <<a href="mailto:sebastian@centricular.com">sebastian@centricular.com</a>>:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On Fri, 2015-04-03 at 19:30 +0000, Dmitri Afanasjev wrote:<br>
> Hello!<br>
><br>
> I have a UDP stream which is generated by:<br>
><br>
> raspivid -t 999999 -w 1296 -h 730 -fps 30 -b 20000000 -o - | gst-launch-1.0<br>
> -e -vvvv fdsrc ! h264parse ! rtph264pay pt=96 name=pay0 config-interval=5 !<br>
> udpsink host=<destination ip> port=5001<br>
><br>
> Receiver part on Windows looks like:<br>
><br>
> C:\Users\Dmitri>gst-launch-1.0 -e -v udpsrc port=5001 ! application/x-rtp,<br>
> media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264,<br>
> sprop-parameter-sets=\"J2QA<br>
> KKwrQCiC78kA8SJq\\,KO4fLA\\\=\\\=\", payload=(int)96 ! rtph264depay !<br>
> avdec_h264 ! autovideosink<br>
><br>
> (the caps are taken from the transmitter side) and everything works well.<br>
><br>
> Now I want to create the same pipeline on Android. I'm using Tutorial 5<br>
> from the GStreamer SDK, modified to work with RTMP streams, as a starting point.<br>
><br>
> When I create the pipeline with:<br>
><br>
> data->pipeline = gst_parse_launch("udpsrc port=5001<br>
> caps=\"application/x-rtp, media=video, clock-rate=90000,<br>
> encoding-name=H264, payload=96\" ! rtph264depay ! h264parse ! avdec_h264 !<br>
> autovideosink sync=false", &error);<br>
><br>
> nothing starts playing and I see this error in logcat:<br>
><br>
> W/GLib+GLib-GObject﹕ invalid cast from 'GstPipeline' to 'GstVideoOverlay'<br>
> E/GLib﹕ gst_video_overlay_set_window_handle: assertion<br>
> 'GST_IS_VIDEO_OVERLAY (overlay)' failed<br>
><br>
> As I understand it, I need to give a rendering target to my pipeline, but<br>
> the error above occurs.<br>
><br>
> How is it possible to receive the UDP stream as I've done in the Windows<br>
> environment, and what should the pipeline look like in Android JNI?<br>
<br>
The pipeline is correct, but you have to understand the actual code<br>
there. Tutorial 5 uses playbin, which implements the GstVideoOverlay<br>
interface. And that is used by Tutorial 5 to tell playbin where to<br>
render the video.<br>
<br>
A generic pipeline like yours does not implement GstVideoOverlay, so<br>
Tutorial 5 just fails. You have to change the code so that it uses the<br>
GstVideoOverlay interface of the video sink directly; see the<br>
documentation for how to do that.<br>
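[Editor's note] For readers hitting the same assertion, here is a minimal sketch of the change Sebastian describes, assuming the GStreamer 1.x C API; the `set_overlay` helper name is illustrative and not from the tutorial:<br>

```c
#include <gst/gst.h>
#include <gst/video/videooverlay.h>

/* Instead of casting the whole pipeline to GstVideoOverlay (which fails,
 * since GstPipeline does not implement that interface), search the bin for
 * the element that actually does — here, autovideosink's real sink child —
 * and hand it the native window handle. */
static void
set_overlay (GstElement *pipeline, guintptr native_window)
{
  GstElement *vsink =
      gst_bin_get_by_interface (GST_BIN (pipeline), GST_TYPE_VIDEO_OVERLAY);

  if (vsink) {
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (vsink),
        native_window);
    gst_object_unref (vsink);  /* get_by_interface returns a new reference */
  }
}
```

On Android the `guintptr` would typically come from `ANativeWindow_fromSurface()` in the surface-changed callback, as in the tutorial's existing JNI code.<br>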
<br>
--<br>
Sebastian Dröge, Centricular Ltd · <a href="http://www.centricular.com" target="_blank">http://www.centricular.com</a><br>
_______________________________________________<br>
gstreamer-devel mailing list<br>
<a href="mailto:gstreamer-devel@lists.freedesktop.org" target="_blank">gstreamer-devel@lists.freedesktop.org</a><br>
<a href="http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel" target="_blank">http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel</a><br>
</blockquote></div>