communication between multiple gstreamer pipeline/apps

timothe jahan timothejahan at gmail.com
Sat Mar 26 02:43:47 PDT 2011


Hi,
Thank you for the fast answer; however, I have the impression that there is a
misunderstanding.
The command that fails is the server one, namely:
*gst-launch filesrc location=./testfile.ogg ! tcpserversink port=1234*
So I do not even get as far as launching the client side. I believe that the
ffmpegcolorspace ! videoscale you proposed is for the client side (the one
that should actually display the video).
So if you have any other idea ...
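One thing I will also try, based on your gdppay suggestion further down, is framing the stream with gdppay on the server and gdpdepay on the client, so the caps travel with the data. This is an untested sketch (gst-launch 0.10 syntax, same file and port as above):

```shell
# Server: frame the Ogg stream with GDP so the receiver gets the
# caps and buffer boundaries along with the raw bytes.
gst-launch filesrc location=./testfile.ogg ! gdppay ! \
    tcpserversink port=1234

# Client: strip the GDP framing, then demux and decode as before.
gst-launch tcpclientsrc host=127.0.0.1 port=1234 ! gdpdepay ! \
    oggdemux ! theoradec ! ffmpegcolorspace ! autovideosink
```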

I am also not at all in favor of sending raw data; I only tried it as an
experiment, just in case.

About the global picture: I have an 8-channel audio capture card and a video
capture card.
What I aim to achieve (live!) is to resample the 8 audio channels so they are
in sync with the video (I am planning to write a GStreamer module for that),
then encode all the streams (video and audio) to MPEG-2 for a multichannel
DVD, H.264/AAC for live MP4 streaming, and all the audio channels to MP3 for
MP3 CDs.
I need to do it live because everything must be available right after the
performance.
As you can imagine, I would need a monster PC to do all of that on a single
machine, which is why my first step is to find a protocol to send the streams
between the different stages of the process.
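To make the split concrete, here is the kind of stage-to-stage transport I have in mind, as an untested sketch: element names are from gst-launch 0.10, and the host name, port, and encoder choice (ffenc_mpeg2video) are placeholders for illustration only:

```shell
# Capture machine: grab video, encode it, and serve the compressed
# stream framed with gdppay so caps and timestamps survive the TCP hop.
gst-launch v4l2src ! ffenc_mpeg2video ! gdppay ! \
    tcpserversink port=5000

# Worker machine: receive, unframe, mux, and write to disk.
gst-launch tcpclientsrc host=capture-host port=5000 ! gdpdepay ! \
    mpegtsmux ! filesink location=out.ts
```

The same pattern would repeat for the audio channels and for each encoding target.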

Timothé
PS: I took a Theora file for my test, though this is not the codec I'll use;
as I said, my first step is to define how to communicate.


On Sat, Mar 26, 2011 at 12:05 AM, Tim-Philipp Müller <t.i.m at zen.co.uk>wrote:

> Hi,
>
> > gst-launch filesrc location=./testfile.ogg ! tcpserversink port=1234
> >
> > gst-launch tcpclientsrc protocol=none host=127.0.0.1 port=1234 !
> > oggdemux ! theoradec  ! autovideosink
> >
> > However here is what I get when I launch the server:
> > ERROR: from element /pipeline0/tcpserversink0: Internal GStreamer
> > error: negotiation problem.
>
> Try adding an ffmpegcolorspace ! videoscale before the videosink. If
> that doesn't help, also try a typefind element before oggdemux.
>
>
> > If I try to do the demuxing and decoding on the server side I have
> > following commands:
> > gst-launch filesrc location=./testfile.ogg ! oggdemux ! theoradec !
> > tcpserversink port=1234
> >
> > gst-launch tcpclientsrc protocol=none host=127.0.0.1 port=1234 !
> > autovideosink
> > Here it does not crash but I do not get any video displayed.
>
> Sending raw video/pixels over the network is usually a bad idea. In this
> case, you will not only lose the framing, but also the timing
> information and the caps. You can use a videoparse element to work around
> that, but it's not really a great solution. You could also use gdppay
> before sending the data and then gdpdepay on the receiving side; that
> will maintain most of the info (but the timestamps may need fixing up then).
> >
> > Does anybody have an idea, or a better way to pass streams between
> > pipelines?
>
> Maybe you could describe the bigger picture of what you're trying to do?
>
> Cheers
>  -Tim
>
>
>