Issue with rtsp server source h264 encoded
LEVY David
David.Levy at ino.ca
Fri Sep 22 12:34:42 UTC 2023
Same behaviour with both main and constrained-baseline
But since the RTSP server does not emit the "need-data" signal more than once, I would suppose the issue is coming from there rather than from the encoding.
(But I don't know how any of this works yet, I'm new to it, so I may be completely wrong.)
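(One way I could verify this -- just a sketch, adding prints to my existing need_data callback and also connecting appsrc's "enough-data" signal, which my current code does not do -- would be:

static void enough_data( GstElement *appsrc, gpointer aUnused )
{
    g_print( "enough-data\n" ); // appsrc's internal queue is full, stop feeding
}

// in media_configure, next to the existing connection:
g_signal_connect( appsrc, "need-data", (GCallback)need_data, NULL );
g_signal_connect( appsrc, "enough-data", (GCallback)enough_data, NULL );

plus a g_print( "need-data\n" ); at the top of need_data, to see how often each one fires.)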
From: Nicolas Dufresne <nicolas at ndufresne.ca>
Sent: Thursday, September 21, 2023 8:33 PM
To: Discussion of the development of and with GStreamer <gstreamer-devel at lists.freedesktop.org>
Cc: LEVY David <David.Levy at ino.ca>; Michael Gruner <michael.gruner at ridgerun.com>
Subject: Re: Issue with rtsp server source h264 encoded
videotestsrc ! x264enc tune=zerolatency tends to negotiate yuv444 + high profile. That can cause interop issues. I'd suggest a caps filter after the encoder with caps video/x-h264,profile=main (or constrained-baseline, though that will not compress as much).
Hopefully this will improve your situation.
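In your InitPipeline that would be something like this, replacing the plain gst_element_link( lEncoder, lSink ) (untested sketch, reusing your lEncoder / lSink names):

GstCaps *lH264Caps = gst_caps_new_simple( "video/x-h264", "profile", G_TYPE_STRING, "main", NULL );
if( !gst_element_link_filtered( lEncoder, lSink, lH264Caps ) )
{
    gst_caps_unref( lH264Caps );
    throw;
}
gst_caps_unref( lH264Caps );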
On Thu, Sep 21, 2023 at 4:30 PM, LEVY David via gstreamer-devel <gstreamer-devel at lists.freedesktop.org> wrote:
Same behaviour.
I just noted that "need-data" is only called once with ffplay, and twice with gst-launch (probably once for UDP and once for TCP).
From: Michael Gruner <michael.gruner at ridgerun.com>
Sent: Thursday, September 21, 2023 2:45 PM
To: Discussion of the development of and with GStreamer <gstreamer-devel at lists.freedesktop.org>
Cc: LEVY David <David.Levy at ino.ca>
Subject: Re: Issue with rtsp server source h264 encoded
Hi
Try sending periodic I frames, "x264enc key-int-max=10" or, programmatically:
g_object_set( lEncoder, "key-int-max", 10, NULL ); // 4 = zerolatency
10 is just a made up number to send a reference frame every second (given that your framerate is 10/1). You may space them according to your needs.
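Combined with the tune setting you already have (and given the 10/1 framerate caps in your InitPipeline), that would be something like this (sketch):

g_object_set( lEncoder, "tune", 4, "key-int-max", 10, NULL ); // 10 frames @ 10 fps = 1 keyframe per second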
Michael.
On 21 Sep 2023, at 09:59, LEVY David via gstreamer-devel <gstreamer-devel at lists.freedesktop.org> wrote:
Hello,
I'm new to GStreamer / RTSP, so sorry if it's a basic mistake.
I'm trying to set up a pipeline that ends with an appsink that pushes data into an RTSP server starting with an appsrc.
It would be:
videotestsrc -> x264enc -> appsink (push data to src as soon as received)
and
appsrc -> rtph264pay
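In launch syntax I picture the two halves roughly like this (just a sketch to illustrate the intent; the element names are arbitrary):

videotestsrc ! video/x-raw,width=384,height=288,framerate=10/1 ! x264enc tune=zerolatency ! appsink name=mysink
( appsrc name=mysrc ! rtph264pay name=pay0 pt=96 )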
I can't find any similar example, but I found this one:
https://github.com/GStreamer/gst-rtsp-server/blob/master/examples/test-appsrc.c
Which does:
appsrc-> videoconvert -> x264enc -> rtph264pay (and uses "need-data")
I have successfully tweaked it to
videotestsrc -> appsink
appsrc-> videoconvert -> x264enc -> rtph264pay (and uses "need-data")
My next step would be
videotestsrc -> x264enc -> appsink
appsrc -> rtph264pay (and uses "need-data")
but when I try to read the stream from either
gst-launch-1.0 --gst-debug=rtspsrc:2 rtspsrc location=rtsp://127.0.0.1:8554/test ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! autovideosink
or ffplay rtsp://127.0.0.1:8554/test
I don't have any playback:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayWayland\)\ gldisplaywayland0";
Progress: (open) Opening Stream
Pipeline is PREROLLED ...
Prerolled, waiting for progress to finish...
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Redistribute latency...
Progress: (request) Sending PLAY request
Redistribute latency...
Progress: (request) Sent PLAY request
Redistribute latency...
Redistribute latency...
0:00:05.945770874 15996 0x55c83ec97f60 WARN rtspsrc gstrtspsrc.c:5964:gst_rtspsrc_reconnect:<rtspsrc0> warning: Could not receive any UDP packets for 5.0000 seconds, maybe your firewall is blocking it. Retrying using a tcp connection.
WARNING: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not read from resource.
Additional debug info:
../gst/rtsp/gstrtspsrc.c(5964): gst_rtspsrc_reconnect (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Could not receive any UDP packets for 5.0000 seconds, maybe your firewall is blocking it. Retrying using a tcp connection.
Redistribute latency...
And it is stuck here
I'm using the code below:
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/app/gstappsrc.h>
#include <gst/rtsp-server/rtsp-server.h>

GstElement *gSink = nullptr;

// Feeding pipeline: videotestsrc -> x264enc -> appsink
static void InitPipeline()
{
    GstElement *lPipeline = gst_pipeline_new( "x264-pipeline" );
    GstElement *lSource = gst_element_factory_make( "videotestsrc", "source" );
    GstElement *lEncoder = gst_element_factory_make( "x264enc", "x264-encoder" );
    GstElement *lSink = gst_element_factory_make( "appsink", "appsink" );

    gst_bin_add_many( GST_BIN( lPipeline ), lSource, lEncoder, lSink, NULL );
    // gst_bin_add_many( GST_BIN( lPipeline ), lSource, lSink, NULL );

    g_object_set( lEncoder, "tune", 4, NULL ); // 4 = zerolatency

    GstCaps *lCaps =
        gst_caps_new_simple( "video/x-raw", "width", G_TYPE_INT, 384, "height", G_TYPE_INT, 288, "framerate", GST_TYPE_FRACTION, 10, 1, NULL );
    if( !gst_element_link_filtered( lSource, lEncoder, lCaps ) )
    // if( !gst_element_link_filtered( lSource, lSink, lCaps ) )
    {
        gst_caps_unref( lCaps );
        throw;
    }
    gst_caps_unref( lCaps );
    gst_element_link( lEncoder, lSink );

    GstStateChangeReturn ret = gst_element_set_state( lPipeline, GST_STATE_PLAYING );
    if( ret == GST_STATE_CHANGE_FAILURE )
    {
        throw;
    }
    gSink = lSink;
}

// Pull one encoded sample from the appsink and push it into the RTSP appsrc
static void need_data( GstElement *appsrc, guint unused, void *aUnused )
{
    GstSample *lSample = gst_app_sink_pull_sample( GST_APP_SINK_CAST( gSink ) );
    GstFlowReturn lRet = gst_app_src_push_sample( GST_APP_SRC( appsrc ), lSample );
    gst_sample_unref( lSample );
    return;
}

// Called when a client connects: configure the appsrc of the media pipeline
static void media_configure( GstRTSPMediaFactory *factory, GstRTSPMedia *media, gpointer user_data )
{
    GstElement *element, *appsrc;
    element = gst_rtsp_media_get_element( media );
    appsrc = gst_bin_get_by_name_recurse_up( GST_BIN( element ), "mysrc" );
    gst_util_set_object_arg( G_OBJECT( appsrc ), "format", "time" );

    // Copy the negotiated caps of the feeding appsink onto the appsrc
    if( GstPad *pad = gst_element_get_static_pad( gSink, "sink" ) )
    {
        g_print( "Got pad\n" );
        GstCaps *caps = gst_pad_get_current_caps( pad );
        if( caps )
        {
            auto *caps_str = gst_caps_to_string( caps );
            g_print( "Got caps %s\n", caps_str );
            g_object_set( G_OBJECT( appsrc ), "caps", caps, NULL );
            gst_caps_unref( caps );
        }
    }
    g_signal_connect( appsrc, "need-data", (GCallback)need_data, NULL );
    gst_object_unref( appsrc );
    gst_object_unref( element );
}

int main( int argc, char *argv[] )
{
    GMainLoop *loop;
    GstRTSPServer *server;
    GstRTSPMountPoints *mounts;
    GstRTSPMediaFactory *factory;

    gst_init( &argc, &argv );
    loop = g_main_loop_new( NULL, FALSE );

    InitPipeline();

    // RTSP server side: appsrc -> rtph264pay
    server = gst_rtsp_server_new();
    mounts = gst_rtsp_server_get_mount_points( server );
    factory = gst_rtsp_media_factory_new();
    // gst_rtsp_media_factory_set_launch( factory, "( appsrc name=mysrc ! videoconvert ! x264enc ! rtph264pay name=pay0 pt=96 )" );
    gst_rtsp_media_factory_set_launch( factory, "( appsrc name=mysrc ! rtph264pay name=pay0 pt=96 )" );
    g_signal_connect( factory, "media-configure", (GCallback)media_configure, NULL );
    gst_rtsp_mount_points_add_factory( mounts, "/test", factory );
    g_object_unref( mounts );
    gst_rtsp_server_attach( server, NULL );

    g_print( "stream ready at rtsp://127.0.0.1:8554/test\n" );
    g_main_loop_run( loop );
    return 0;
}
Is it a good idea to add a videoconvert before the appsink?
Thanks for your time