[gst-devel] Trouble using x264enc with a tee
jonathan.henson at innovisit.com
Tue Dec 7 20:12:50 CET 2010
"What if you store the file in one pass (not using a socket) and AFTER
it you try to play? I know it's not exactly as in your requirements,
but it may help in understanding the following..."
If I use filesink instead of multifdsink, I can play the file back fine. I
am using multifdsink so that I can stream to an indefinite number of
clients over a TCP/IP socket. However, I just realized that to receive the
stream on the C#.NET side, I will need to use RTP (which I know very
little about, though I did read the spec).
"Are there any reasons why you cannot use a streaming protocol for this? "
Yes, I don't know shit about RTP. I need some help on the gstreamer side. It
seems that I am not supposed to multiplex the stream before sending it over
RTP, but to send two separate streams instead. I currently have:
command = g_strdup_printf ("v4l2src ! video/x-raw-yuv, format=(fourcc)I420, width=%d, height=%d, framerate=(fraction)%d/1 ! "
    "videobalance name=VideoBalance ! textoverlay name=chanNameFilter ! textoverlay name=osdMessageFilter ! textoverlay name=sessionTimerOverlay ! "
    "tee name=t ! queue ! appsink name=videoSink t. ! queue ! ffenc_wmv1 name=videoEncoder me-method=5 ! amux. alsasrc ! "
    "audio/x-raw-int, depth=%d, width=%d, channels=2, endianness=1234, rate=%d, signed=true ! volume name=volumeFilter ! "
    "tee name=souTee ! queue ! appsink name=soundSink souTee. ! queue ! ffenc_wmav1 ! amux. asfmux name=amux ! rtpasfpay ! multifdsink",
    width, height, fps, bitWidth, bitWidth, audioSampleRate);
I think I would need to use gstrtpbin instead of the multifdsink and get rid
of the muxing?
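Something like the following is what I have in mind (completely untested; the element names are from GStreamer 0.10, the payloaders are just ones I know exist, and the host/port values are made up). Video and audio each get their own payloader and their own gstrtpbin session, with no container mux in between:

gst-launch-0.10 gstrtpbin name=rtpbin \
    v4l2src ! x264enc ! rtph264pay ! rtpbin.send_rtp_sink_0 \
    rtpbin.send_rtp_src_0 ! udpsink host=127.0.0.1 port=5000 \
    alsasrc ! audioconvert ! rtpL16pay ! rtpbin.send_rtp_sink_1 \
    rtpbin.send_rtp_src_1 ! udpsink host=127.0.0.1 port=5002

Is that roughly the shape of it, or am I off track?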
"I see you're using appsrc and appsink. Usually, the first good
question in such a case is: "do I really need such elements in my
pipeline?". The answer depends on your use case and requirements :).
Note that if your app is not producing and consuming buffers in the
proper way you may run into trouble. The behaviour will e.g. differ
depending on whether you're in pull or push mode. See the elements'
documentation for more details."
I am using the appsinks because this app is using OPAL in other threads to
answer H323 and SIP requests and it needs the raw data buffer. This thread
is used for a client control computer and SDK which will monitor sessions,
and make recordings of the sessions (i.e. a Windows Server 2008 web server
using ASP.NET/C#.NET with an SDK I have written to talk to this device). Do
you have a better approach in mind?
The OPAL thread grabs this buffer when it needs it.
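For context, the OPAL side does roughly this (a simplified sketch; the function and variable names here are mine, not the actual code, and error handling is omitted):

/* Pull one raw frame out of the named appsink (GStreamer 0.10 API)
 * and hand its data to OPAL. gst_app_sink_pull_buffer() blocks until
 * a buffer is available. */
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

static void
grab_frame (GstElement *pipeline)
{
  GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "videoSink");
  GstBuffer *buf = gst_app_sink_pull_buffer (GST_APP_SINK (sink));

  if (buf != NULL) {
    /* OPAL consumes GST_BUFFER_DATA (buf) / GST_BUFFER_SIZE (buf) here */
    gst_buffer_unref (buf);
  }
  gst_object_unref (sink);
}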
Thank you once again,
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Trouble-using-x264enc-with-a-tee-tp3067583p3077032.html