<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1">
</head>
<body text="#000000" bgcolor="#ffffff">
Hello all!<br>
<br>
I'm struggling to display the video acquired from the web-cam and
stream it simultaneously, but it isn't working. Here is my pipeline
(inspired by the examples given):<br>
<br>
<font face="monospace">#!/bin/sh<br>
<br>
# Destination of the stream<br>
DEST=127.0.0.1<br>
<br>
# Tuning parameters to make the sender send the streams out of
sync. Can be used<br>
# to test the client RTCP synchronisation.<br>
VOFFSET=0<br>
AOFFSET=0<br>
<br>
# Video setup<br>
VELEM="v4l2src "<br>
VCAPS="video/x-raw-yuv,width=352,height=288,framerate=20/1"<br>
VSOURCE="$VELEM ! $VCAPS ! queue ! ffmpegcolorspace"<br>
VENC="x264enc tune=zerolatency byte-stream=true bitrate=550
threads=0 speed-preset=3"<br>
<br>
# Video transmission setup<br>
VRTPSINK="udpsink port=5000 host=$DEST ts-offset=$VOFFSET
name=vrtpsink"<br>
VRTCPSINK="udpsink port=5001 host=$DEST sync=false async=false
name=vrtcpsink"<br>
VRTCPSRC="udpsrc port=5005 name=vrtpsrc"<br>
<br>
# Audio setup<br>
AELEM="pulsesrc"<br>
ASOURCE="$AELEM ! queue ! audioconvert"<br>
AENC="faac ! rtpmp4apay"<br>
<br>
# Audio transmission setup<br>
ARTPSINK="udpsink port=5002 host=$DEST ts-offset=$AOFFSET
name=artpsink"<br>
ARTCPSINK="udpsink port=5003 host=$DEST sync=false async=false
name=artcpsink"<br>
ARTCPSRC="udpsrc port=5007 name=artpsrc"<br>
<br>
# Pipeline construction<br>
gst-launch -v gstrtpbin name=rtpbin \<br>
$VSOURCE ! tee name=v ! autovideosink v. ! $VENC ! rtph264pay
! rtpbin.send_rtp_sink_0 \<br>
rtpbin.send_rtp_src_0 ! $VRTPSINK \<br>
rtpbin.send_rtcp_src_0 ! $VRTCPSINK \<br>
$VRTCPSRC ! rtpbin.recv_rtcp_sink_0 \<br>
$ASOURCE ! $AENC ! rtpbin.send_rtp_sink_1 \<br>
rtpbin.send_rtp_src_1 ! $ARTPSINK \<br>
rtpbin.send_rtcp_src_1 ! $ARTCPSINK \<br>
$ARTCPSRC ! rtpbin.recv_rtcp_sink_1</font> <br>
<br>
As you can all see, I put a tee right after the acquisition: one
branch goes to autovideosink for display, and the other passes the
video stream on for encoding and transmission.<br>
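From what I've read, each branch downstream of a tee should start with its own queue so the branches don't block each other; whether that is what's missing here is only my assumption. A sketch of just the video side rewritten that way, with fakesink standing in for the RTP elements so it can be tried in isolation, would be:<br>
<br>

```shell
#!/bin/sh
# Sketch (my assumption, not the working script): a queue at the
# head of EACH tee branch -- one for display, one for encoding --
# so the two branches are decoupled. fakesink replaces the RTP side.
VSOURCE="v4l2src ! video/x-raw-yuv,width=352,height=288,framerate=20/1 ! ffmpegcolorspace"
VENC="x264enc tune=zerolatency byte-stream=true bitrate=550 threads=0 speed-preset=3"

CMD="gst-launch -v $VSOURCE ! tee name=v \
    v. ! queue ! autovideosink \
    v. ! queue ! $VENC ! rtph264pay ! fakesink"

echo "$CMD"     # print the assembled command for inspection
# eval "$CMD"   # uncomment to actually run it against the camera
```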
<br>
The output of execution is:<br>
<br>
<font face="monospace">$ sh v4l2server.sh<br>
Setting pipeline to PAUSED ...<br>
/GstPipeline:pipeline0/GstPulseSrc:pulsesrc0: actual-buffer-time =
47551927<br>
/GstPipeline:pipeline0/GstPulseSrc:pulsesrc0: actual-latency-time
= 9977<br>
/GstPipeline:pipeline0/GstPulseSrc:pulsesrc0.GstPad:src: caps =
audio/x-raw-int, endianness=(int)1234, signed=(boolean)true,
width=(int)16, depth=(int)16, rate=(int)44100, channels=(int)1,
channel-positions=(GstAudioChannelPosition)<
GST_AUDIO_CHANNEL_POSITION_FRONT_MONO ><br>
ERROR: Pipeline doesn't want to pause.<br>
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Could not negotiate format<br>
Additional debug info:<br>
gstbasesrc.c(2811): gst_base_src_start ():
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0:<br>
Check your filtered caps, if any<br>
Setting pipeline to NULL ...<br>
/GstPipeline:pipeline0/GstPulseSrc:pulsesrc0.GstPad:src: caps =
NULL<br>
Freeing pipeline ...</font><br>
<br>
The last message suggests looking at the filtered caps, but if I
remove the tee element and the autovideosink, the pipeline works.<br>
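To narrow it down, I can also test the source and the caps filter on their own (fakesink is just a stand-in so nothing else interferes with negotiation):<br>
<br>

```shell
#!/bin/sh
# Minimal caps-negotiation check: only the source, the requested
# caps, a colorspace conversion, and a fakesink.
VCAPS="video/x-raw-yuv,width=352,height=288,framerate=20/1"
CMD="gst-launch -v v4l2src ! $VCAPS ! ffmpegcolorspace ! fakesink"

echo "$CMD"     # inspect the command first
# eval "$CMD"   # uncomment to run against the real web-cam
```

If this minimal pipeline already fails, the camera itself is rejecting the requested caps, rather than the tee causing the trouble.<br>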
<br>
Any ideas/suggestions would be great!!<br>
<br>
Thanks to all for your help!!<br>
Regards,<br>
Paulo Paiva<br>
</body>
</html>