appsrc + h264/rtp streaming

Martin Vachovski Martin.Vachovski at skytek.com
Tue Aug 8 15:44:20 UTC 2017


Hello everyone!


We have an issue streaming H264-encoded video over UDP when the video comes from an appsrc element.


We need to provide interoperability between a UWP (Windows 10) app and GStreamer.

NOTE: we cannot use the ksvideosrc element, since we are trying to capture frames from the HoloLens mixed reality camera.

For that purpose we are trying to feed the frames, as delivered by the Windows webcam API, into an appsrc element, encode them as H264, and transmit them over UDP.

We are using GStreamer 1.12.2 for Windows.


Here is what I am trying to do:


Transmitter:

AppSrcPipeline.pipeline = gst_parse_launch(
    "appsrc name=appsrc_element block=true ! video/x-raw,format=RGB,width=320,height=240,framerate=30/1 ! identity check-imperfect-timestamp=true ! "
    "videoconvert ! x264enc ! video/x-h264,profile=\"high-4:4:4\" ! rtph264pay ! udpsink host=192.168.168.98",
    &err);


Receiver:

gst-launch-1.0 udpsrc ! application/x-rtp,framerate=30/1 ! rtpjitterbuffer ! rtph264depay ! decodebin ! videoconvert ! ximagesink


I feed data into the appsrc when the "need-data" signal arrives. The function that feeds the frames looks like this:


void data_feed(GstElement *pipeline, guint size, void *app) {
    size_t sz = 3 * 320 * 240 * sizeof(guchar);  /* one RGB 320x240 frame */
    GstFlowReturn ret;
    static int c = 70;
    GstBuffer *buffer = gst_buffer_new_allocate(NULL, sz, NULL);
    GstMapInfo info;
    static GstClockTime timestamp = 0, duration = 33333333, offset = 0;  /* 1/30 s in ns */

    gst_buffer_map(buffer, &info, GST_MAP_WRITE);
    /*GST_BUFFER_PTS(buffer) = timestamp;                            // TIMESTAMP
    GST_BUFFER_DURATION(buffer) = duration;
    GST_BUFFER_OFFSET(buffer) = offset++;
    GST_BUFFER_OFFSET_END(buffer) = offset;*/
    timestamp += duration;
    memset(info.data, c, sz);  /* for now just feed a grayscale screen with changing intensity */
    gst_buffer_unmap(buffer, &info);

    g_signal_emit_by_name(AppSrcPipeline.src, "push-buffer", buffer, &ret);
    gst_buffer_unref(buffer);
    c++;
}

The problem is that only the first frame is ever decoded and displayed on the receiving side.
The transmitter pipeline is in the PLAYING state and there is network traffic, but only one frame is ever drawn.
I have tried different modifications to the pipeline in order to understand which combinations of elements work and which don't:

1) Replace appsrc with videotestsrc. Then everything works and I can see the "live" videotest screen on the receiving side:
AppSrcPipeline.pipeline = gst_parse_launch(
    "videotestsrc name=appsrc_element ! video/x-raw,format=RGB,width=320,height=240,framerate=30/1 ! identity check-imperfect-timestamp=true name=\"identity_element\" ! "
    "videoconvert ! x264enc ! video/x-h264,profile=\"high-4:4:4\" ! rtph264pay ! udpsink host=192.168.168.98",
    &err);

2) Encode and decode, but don't payload the H264 content from the appsrc. The following pipeline also works:
AppSrcPipeline.pipeline = gst_parse_launch(
    "appsrc name=appsrc_element block=true ! video/x-raw,format=RGB,width=320,height=240,framerate=30/1 ! identity check-imperfect-timestamp=true ! "
    "videoconvert ! x264enc ! video/x-h264,profile=\"high-4:4:4\" ! decodebin ! videoconvert ! autovideosink",
    &err);


So we can summarize the situation like this:
1) appsrc + autovideosink OK
2) videotestsrc + h264 encoding/decoding + RTP streaming OK
3) appsrc + h264 encoding/decoding OK
4) appsrc + h264 encoding + RTP streaming PROBLEM

My current understanding is that I don't provide correct timestamps on the buffers I produce,
and that this is a problem for the rtph264pay/rtph264depay elements. I have tried to provide timestamps: if I uncomment
the "TIMESTAMP" code block, I get an error in the logs and the pipeline doesn't seem to start at all:
GStreamer-CRITICAL **: gst_segment_to_running_time: assertion 'segment->format == format' failed
This error message seems quite generic, so googling it didn't turn up any relevant discussions.
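One thing I still intend to try: according to the appsrc documentation, its "format" property defaults to bytes, while PTS-stamped buffers presumably need a time-format segment, which would explain the segment-format assertion. A variant of the pipeline string forcing time format (untested on my side so far) would look like:

```
appsrc name=appsrc_element block=true format=time is-live=true ! video/x-raw,format=RGB,width=320,height=240,framerate=30/1 ! ...
```

Alternatively, setting do-timestamp=true on the appsrc should make the base class stamp buffers with the running time instead of my doing it by hand.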

I have also noticed that with the combination
appsrc + autovideosink
the data_feed function is called about 20-30 times per second (which corresponds to the desired framerate),
but with
appsrc + h264 encoding + RTP streaming
the function is called a couple of hundred times per second and utilizes 100% CPU.

It would be much appreciated if anybody could point out possible causes of the problem.

Best Regards
Martin





