[gst-devel] Trouble using x264enc with a tee

Marco Ballesio gibrovacco at gmail.com
Wed Dec 8 14:13:26 CET 2010


Hi,

On Tue, Dec 7, 2010 at 9:12 PM, JonathanHenson
<jonathan.henson at innovisit.com> wrote:
>
> "What if you store the file in one pass (not using a socket) and AFTER
> it you try to play? I know it's not exactly as in your requirements,
> but it may help understanding the following.. "
>
> If I use filesink instead of multifdsink, I can play the file back fine. I
> am using the multifdsink so that I can stream to an indefinite number of
> clients over a TCP/IP socket. However, I just realized that to receive the
> stream on the C#.NET side, I will need to use RTP (which I know very
> little about--I did read the spec though).

Ok, so this confirms my suspicion - and you understood the issue
properly, as I read below ;). Please note that GStreamer DOES
support multicast streams.
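
For instance, something along these lines - just a sketch off the top
of my head with the 0.10 API, where the encoder, multicast address and
port are arbitrary examples and not taken from your pipeline - payloads
a test stream with RTP and sends it to a multicast group:

/* Sketch: RTP-payload an H.264 test stream and send it to a multicast
 * group (GStreamer 0.10). 224.1.1.1:5000 is an arbitrary example. */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GError *error = NULL;

  gst_init (&argc, &argv);

  pipeline = gst_parse_launch (
      "videotestsrc ! x264enc ! rtph264pay ! "
      "udpsink host=224.1.1.1 port=5000 auto-multicast=true",
      &error);
  if (error != NULL) {
    g_printerr ("parse error: %s\n", error->message);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}

Every client joining 224.1.1.1:5000 receives the same packets, so you
don't need one file descriptor per client as you do with multifdsink.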

>
> "Are there any reasons why you cannot use a streaming protocol for this? "
>
> Yes, I don't know shit about RTP. I need some help on the GStreamer side. It
> seems that I am not supposed to multiplex the stream before sending it over
> RTP, but send two separate streams.

Yep, you don't need to mux anything when streaming over RTP: each
stream is payloaded and sent separately. Here are some pretty good
examples which you may have already seen:

http://library.gnome.org/devel/gst-plugins-libs/unstable/gst-plugins-good-plugins-gstrtpbin.html

See the "example pipelines" section.

> I currently have:
>
> command = g_strdup_printf (
>     "v4l2src ! video/x-raw-yuv, format=(fourcc)I420, width=%d, height=%d, framerate=(fraction)%d/1 ! "
>     "videobalance name=VideoBalance ! textoverlay name=chanNameFilter ! "
>     "textoverlay name=osdMessageFilter ! textoverlay name=sessionTimerOverlay ! "
>     "tee name=t ! queue ! appsink name=videoSink "
>     "t. ! queue ! ffenc_wmv1 name=videoEncoder me-method=5 ! amux. "
>     "alsasrc ! audio/x-raw-int, depth=%d, width=%d, channels=2, endianness=1234, rate=%d, signed=true ! "
>     "volume name=volumeFilter ! "
>     "tee name=souTee ! queue ! appsink name=soundSink "
>     "souTee. ! queue ! ffenc_wmav1 ! amux. "
>     "asfmux name=amux ! rtpasfpay ! multifdsink name=multifdsink",
>     width, height, fps, bitWidth, bitWidth, audioSampleRate);
>
> I think I would need to use gstrtpbin instead of the multifdsink and get rid
> of the muxing?

Yes, you should plug in a gstrtpbin (if you're going to stream over a
network), as described in the "Encode and payload H263 video captured
from a v4l2src. Encode and payload AMR audio generated from
audiotestsrc" example in the gstrtpbin documentation.

>
> "see you're using appsrc and appsink. Usually, the first good
> question in such a case is: "do I really need such elements in my
> pipeline?". The answer depends on your use case and requirements :).
>
> Note that if your app is not producing and consuming buffers in the
> proper way you may run into trouble. The behaviour will e.g. differ
> depending on whether you're in pull or push mode. See the elements'
> documentation for more details. "
>
> I am using the appsinks because this app is using OPAL in other threads to
> answer H323 and SIP requests and it needs the raw data buffer. This thread
> is used for a client control computer and SDK which will monitor sessions,
> and make recordings of the sessions (i.e. a Windows Server 2008 Web Server
> using ASP.NET/ C#.net with an SDK I have written to talk to this device.) Do
> you have a better approach in mind?

Well, you may want to use Telepathy and Farsight/stream-engine for this.

the "framework": http://telepathy.freedesktop.org/wiki/
the "SIP backend" (well, you also need libsofiasip):
http://git.collabora.co.uk/?p=telepathy-sofiasip.git
the "streaming backend" (using GStreamer) http://farsight.freedesktop.org/wiki/

After installing the proper plugins, SIP support is great - I can
vouch for that. I've read here and there about H323 support, but I've
never tested it personally and don't know where the sources can be
found (I've heard about a mod_opal but have never seen it).

The only drawback may be the portability of the stack across
platforms (as it appears you're using MS stuff), so I suggest you
check on the #farsight channel on freenode about this issue. If the
porting effort is too high, maybe your current solution will be easier
to deploy.

Btw, I've no religious arguments against appsink/appsrc, and it
appears your case may well justify their usage.
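
In case it helps, this is roughly how the OPAL thread could pull raw
buffers out of the "videoSink" appsink in your pipeline (0.10 appsink
API, error handling omitted - just a sketch):

/* Sketch: pull decoded buffers from the "videoSink" appsink (0.10)
 * from another thread, e.g. the one feeding OPAL. */
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

static void
consume_video (GstElement *pipeline)
{
  GstAppSink *sink =
      GST_APP_SINK (gst_bin_get_by_name (GST_BIN (pipeline), "videoSink"));
  GstBuffer *buf;

  /* gst_app_sink_pull_buffer() blocks until a buffer is available and
   * returns NULL on EOS or shutdown */
  while ((buf = gst_app_sink_pull_buffer (sink)) != NULL) {
    /* hand GST_BUFFER_DATA (buf) / GST_BUFFER_SIZE (buf) to OPAL here */
    gst_buffer_unref (buf);
  }

  gst_object_unref (sink);
}

IIRC appsink also has max-buffers/drop properties you can set so its
internal queue doesn't grow without bound when the consumer is slower
than the producer.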

Regards

>
> The OPAL thread grabs this buffer when it needs it.
>
> Thank you once again,
>
> Jonathan
>
>



