Hi,

Thank you for the fast answer, but I think there is a misunderstanding.
The command that fails is the server one, i.e.:

  gst-launch filesrc location=./testfile.ogg ! tcpserversink port=1234

So I do not even get the chance to launch the client side. I believe the ffmpegcolorspace ! videoscale you proposed belongs on the client side (the one that actually displays the video).
So if you have any other idea...
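For what it's worth, here is the variant I intend to try next, based on the gdppay/gdpdepay idea from your mail below. I am not sure it addresses the negotiation error, and this is only an untested sketch on my side:

  # server: send the Ogg file GDP-framed, so caps and timestamps survive the TCP hop
  gst-launch filesrc location=./testfile.ogg ! gdppay ! tcpserversink port=1234

  # client: strip the GDP framing again before demuxing/decoding
  gst-launch tcpclientsrc host=127.0.0.1 port=1234 ! gdpdepay ! oggdemux ! theoradec ! autovideosink

If I understood your mail correctly, the GDP framing should at least keep the caps intact across the socket.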
I am not in favor of sending raw data either; I only tried it as an experiment.

About the bigger picture: I have an 8-channel audio capture card and a video capture card.
What I want to achieve (live!) is to resample the 8 audio channels so that they stay in sync with the video (I am planning to write a GStreamer element for that), then encode all the streams: video and audio in MPEG-2 for a multichannel DVD, in H.264/AAC for live MP4 streaming, and all the audio channels also in MP3 for MP3 CDs.
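For the video side of the encoding stage I picture something like the fan-out below, one capture source split with tee towards the different encoders. This is only a rough, untested sketch; the element names ffenc_mpeg2video and x264enc are assumptions on my part (from gst-ffmpeg and gst-plugins-ugly), and the muxing for DVD/MP4 is left out:

  gst-launch v4l2src ! ffmpegcolorspace ! tee name=t \
      t. ! queue ! ffenc_mpeg2video ! filesink location=video-dvd.m2v \
      t. ! queue ! x264enc ! filesink location=video-live.h264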
I need to do it live because everything has to be available right after the performance.
As you can imagine, doing all of that on a single machine would require a monster PC, which is why my first step is to find a protocol to pass the streams between the different stages of the process.
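Concretely, I was thinking of something like this to move audio from the capture box to an encoding box (again an untested sketch: the ALSA device and the host name are placeholders, and I am assuming lamemp3enc is available):

  # capture box: capture audio, resample, and serve it GDP-framed over TCP
  gst-launch alsasrc device=hw:1 ! audioconvert ! audioresample ! gdppay ! tcpserversink port=5001

  # encoding box: pull the stream and encode it to MP3
  gst-launch tcpclientsrc host=capture-box port=5001 ! gdpdepay ! lamemp3enc ! filesink location=channel1.mp3

Unlike raw video, raw audio should be cheap enough to push over the LAN, if I am not mistaken.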
Timothé

PS: I used a Theora file for my test even though it is not the codec I will use in the end; as I said, my first step is to work out how the pipelines should communicate.

On Sat, Mar 26, 2011 at 12:05 AM, Tim-Philipp Müller <t.i.m@zen.co.uk> wrote:
> Hi,
>
> > gst-launch filesrc location=./testfile.ogg ! tcpserversink port=1234
> >
> > gst-launch tcpclientsrc protocol=none host=127.0.0.1 port=1234 !
> > oggdemux ! theoradec ! autovideosink
> >
> > However here is what I get when I launch the server:
> > ERROR: from element /pipeline0/tcpserversink0: Internal GStreamer
> > error: negotiation problem.
>
> Try adding an ffmpegcolorspace ! videoscale before the videosink. If
> that doesn't help, also try a typefind element before oggdemux.
>
> > If I try to do the demuxing and decoding on the server side I have
> > the following commands:
> > gst-launch filesrc location=./testfile.ogg ! oggdemux ! theoradec !
> > tcpserversink port=1234
> >
> > gst-launch tcpclientsrc protocol=none host=127.0.0.1 port=1234 !
> > autovideosink
> > Here it does not crash, but I do not get any video displayed.
>
> Sending raw video/pixels over the network is usually a bad idea. In this
> case, you will not only lose the framing, but also the timing
> information and the caps. You can use a videoparse to work around that,
> but it's not really a great solution. You could also use gdppay before
> sending the data and then gdpdepay on the receiving side; that will
> maintain most of the info (but the timestamps may need fixing up then).
<div class="im">><br>
> Does anybody has an idea, or any better way to pass streams between<br>
> pipelines ?<br>
<br>
</div>Maybe you could describe the bigger picture of what you're trying to do?<br>
<br>
Cheers<br>
-Tim<br>