adding datetime stamp to location
Dwight Kulkarni
dwight at realtime-7.com
Wed Sep 8 20:40:19 UTC 2021
Hi Nicolas,
I am examining the implications of this change. The main reason to have the
single pipeline was to avoid the extra processor utilization of encoding
H264 twice.

However, when I checked online, some other people who tried this said that
when they did the H264 encoding on the main pipeline before the appsink to
the RTSP server, they were getting sync issues and their memory consumption
was increasing substantially.

In terms of overall performance, do you think it is better to have a second
independent pipeline, or to integrate with an appsink feeding the RTSP
server?
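
For concreteness, here is a rough sketch of the single-encode layout I have
in mind: a tee after h264parse, so the already-encoded stream feeds both the
recording branch and an appsink for the RTSP server (the tee and appsink
names are just placeholders, and the appsink branch would still need to be
wired into the server):

v4l2src device=/dev/video1 ! video/x-raw,width=1920,height=1080 ! queue !
    vpuenc_h264 ! h264parse ! tee name=t
  t. ! queue ! splitmuxsink name=filesink location=video%02d.mkv
       max-size-time=10000000000 muxer-factory=matroskamux
       muxer-properties="properties,streamable=true"
  t. ! queue ! appsink name=rtsp_sink
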
On Wed, Sep 8, 2021 at 3:11 PM Nicolas Dufresne <nicolas at ndufresne.ca>
wrote:
> On Wednesday, September 8, 2021 at 15:03 -0400, Dwight Kulkarni wrote:
>
> Hi Nicolas and everyone,
>
> I was able to get the issue resolved with the method that Nicolas
> mentioned: handle the media-configure callback, access the element by
> name, and then connect the "format-location" callback on the splitmuxsink,
> returning the formatted name from a function.
>
> This is working great !! Thank you Nicolas.
>
> However, now I have a problem: I start the RTSP server, but the pipeline
> only seems to save when I log in with a VLC client to view the stream. As
> soon as I log off, the file stops saving.
>
> Is there a way to have the RTSP pipeline run even when no client is
> connected? Or can I simulate a fake client in some way?
>
>
> This is how the RTSP server handles its internal pipeline. To get full
> control of the recording, I would suggest using an external pipeline with
> a branch that ends in an appsink, which then feeds an appsrc in the
> rtspserver pipeline.
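>
> A minimal sketch of that hand-off, assuming the external pipeline's appsink
> has emit-signals=TRUE and the appsrc was looked up in media-configure (all
> names here are placeholders):
>
> #include <gst/app/gstappsink.h>
> #include <gst/app/gstappsrc.h>
>
> /* Pull each encoded sample from the recording pipeline's appsink and
>  * push it into the appsrc inside the RTSP media pipeline. */
> static GstFlowReturn
> on_new_sample (GstElement * appsink, gpointer user_data)
> {
>   GstElement *appsrc = GST_ELEMENT (user_data);
>   GstSample *sample = gst_app_sink_pull_sample (GST_APP_SINK (appsink));
>   GstFlowReturn ret = gst_app_src_push_sample (GST_APP_SRC (appsrc), sample);
>
>   gst_sample_unref (sample);
>   return ret;
> }
>
> /* wiring:
>  * g_signal_connect (appsink, "new-sample", G_CALLBACK (on_new_sample), appsrc);
>  */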
>
>
>
>
> On Wed, Sep 8, 2021 at 1:36 PM Dwight Kulkarni <dwight at realtime-7.com>
> wrote:
>
> Hi Nicolas,
>
> Thank you for this info !
>
> In the example shown we get the callback to media_configure and access the
> pipeline element. Do I then have to create another callback for the file
> sink, as shown below?
>
> static void
> media_configure (GstRTSPMediaFactory * factory, GstRTSPMedia * media, gpointer user_data)
> {
>   GstElement *element = gst_rtsp_media_get_element (media);
>   GstElement *filesink = gst_bin_get_by_name_recurse_up (GST_BIN (element), "filesink");
>
>   /* add a callback signal */
>   g_signal_connect (G_OBJECT (filesink), "format-location", G_CALLBACK (format_location_callback), NULL);
>
>   gst_object_unref (filesink);
>   gst_object_unref (element);
> }
>
>
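> For the datetime naming, I assume the connected handler would be something
> like this GLib-based sketch (the "video-" prefix and the strftime format
> string are just placeholders):
>
> static gchararray
> format_location_callback (GstElement * splitmux, guint fragment_id,
>     gpointer udata)
> {
>   GDateTime *now = g_date_time_new_now_local ();
>   gchar *stamp = g_date_time_format (now, "%Y%m%d-%H%M%S");
>   gchararray location = g_strdup_printf ("video-%s-%02u.mkv", stamp, fragment_id);
>
>   g_free (stamp);
>   g_date_time_unref (now);
>   return location;   /* splitmuxsink takes ownership of the returned string */
> }
>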
> PIPELINE =
> ( v4l2src device=/dev/video1 ! video/x-raw,width=1920,height=1080 !
>   queue ! vpuenc_h264 ! queue ! h264parse ! splitmuxsink name=filesink
>   location=video%02d.mkv max-size-time=10000000000
>   muxer-factory=matroskamux muxer-properties="properties,streamable=true" )
>
>
>
>
>
> On Wed, Sep 8, 2021 at 12:16 PM Nicolas Dufresne <nicolas at ndufresne.ca>
> wrote:
>
> On Wednesday, September 8, 2021 at 10:35 -0400, Dwight Kulkarni via
> gstreamer-devel wrote:
>
> Hi all,
>
> I am trying to get my pipeline to add a datetime stamp to the location, as
> opposed to the numeric value.
>
> Previously I was using: splitmuxsink location=video%02d.mkv
>
> This creates multiple files named video00.mkv, video01.mkv, and so on.
>
> Now I tried using: splitmuxsink location=video$(date +%s).mkv
>
> This creates only a single file with a timestamp in its name, and it
> appears to overwrite the previous recordings.
>
> 1) I see some examples where a callback is used. Should I use a callback,
> or is there another way?
>
> 2) If I use a callback: I have a GstRTSPMediaFactory object where I assign
> the pipeline with gst_rtsp_media_factory_set_launch.
>
>
> You have to connect to the "media-configure" signal of your factory:
>
>
> https://gitlab.freedesktop.org/gstreamer/gst-rtsp-server/-/blob/master/examples/test-appsrc.c#L123
>
> And then use gst_rtsp_media_get_element() to retrieve the pipeline:
>
>
> https://gitlab.freedesktop.org/gstreamer/gst-rtsp-server/-/blob/master/examples/test-appsrc.c#L66
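>
> Roughly, the wiring on the factory side looks like this sketch (mount-point
> setup omitted, and the launch string is just an example):
>
> GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();
> gst_rtsp_media_factory_set_launch (factory,
>     "( v4l2src ! ... ! splitmuxsink name=filesink )");
>
> /* fires every time the factory constructs a media pipeline */
> g_signal_connect (factory, "media-configure",
>     G_CALLBACK (media_configure), NULL);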
>
> regards,
> Nicolas
>
>
> I can't seem to find how to access the pipeline element from the
> GstRTSPMediaFactory object.
>
> I want to do something like this code:
>
> static gchararray
> format_location_callback (GstElement * splitmux, guint fragment_id,
>     gpointer udata)
> {
>   static int i = 0;
>   gchararray myarray = g_strdup_printf ("myvid%d.mp4", i);
>   i += 1;
>
>   return myarray;
> }
>
>
> g_signal_connect (G_OBJECT (bin->sink), "format-location",
> G_CALLBACK (format_location_callback), bin);
>
>
> Any help is appreciated. Thanks !
>
>
>
>
> --
> Sincerely,
>
> Dwight Kulkarni
>
>
>
>
>
--
Sincerely,
Dwight Kulkarni