How to make encoding lossless to measure the bandwidth consumed by RTMP?

nmrony pro.nmrony at gmail.com
Thu Nov 27 21:29:54 PST 2014


We are trying to measure the maximum bandwidth consumed by an RTMP feed from
an iOS mobile device. We are using GStreamer to push the RTMP stream to the
server with the following pipeline:


gst-launch-1.0 -e videotestsrc pattern=snow ! video/x-raw, framerate=30/1,
width=1920, height=1080 ! x264enc pass=qual quantizer=0 ! h264parse !
flvmux ! rtmpsink location='rtmp://localhost:1935/app/streamname'
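
For what it is worth, to sanity-check the encoder output rate independently of
the network, we could probably encode a fixed number of frames to a local file
and divide the file size by the duration; something like the sketch below
(num-buffers and the filesink path are just example values):

gst-launch-1.0 -e videotestsrc pattern=snow num-buffers=300 ! video/x-raw,
framerate=30/1, width=1920, height=1080 ! x264enc pass=qual quantizer=0 !
h264parse ! flvmux ! filesink location=/tmp/test.flv

With 300 buffers at 30 fps that is roughly 10 seconds of video, so the
resulting file size divided by 10 should approximate the bytes per second the
RTMP sink would have to push.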


The rtmpsink also reports an error:


Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: WriteN, RTMP send error 104 (129 bytes)


Why is this happening? (As far as we can tell, error 104 corresponds to
ECONNRESET, i.e. the connection being reset by the server.) We want the
pipeline to use the highest available bandwidth to push the stream to the
RTMP server. What is wrong with the above pipeline?





