Why gst_base_sink_is_too_late!

Anton Olegovich ivanushkin.anton.olegovich at gmail.com
Wed Sep 3 00:54:05 PDT 2014


On 02.09.2014 19:40, Anton Olegovich wrote:
>
> On 02.09.2014 16:29, Anton Olegovich wrote:
>>
>> On 02.09.2014 15:58, Tim Müller wrote:
>>> On Tue, 2014-09-02 at 15:40 +0400, Anton Olegovich wrote:
>>>
>>> Hi,
>>>
>>>> What could be causing this Logcat output:
>>>>
>>>> W/GStreamer+basesink(2096): 0:20:11.970461533 0x551bcd50
>>>> gstbasesink.c:2789:gst_base_sink_is_too_late:<autovideosink-actual-sink-glimage> 
>>>>
>>>> warning: A lot of buffers are being dropped.
>>>> W/GStreamer+basesink(2096): 0:20:11.970658764 0x551bcd50
>>>> gstbasesink.c:2789:gst_base_sink_is_too_late:<autovideosink-actual-sink-glimage> 
>>>>
>>>> warning: There may be a timestamping problem, or this computer is too slow.
>>>>
>>>> And what does it mean?
>>> Could you tell us more about when this happens? What's the pipeline?
>>>
>>> Often it either means the system is too slow (e.g. decoding a 1080p
>>> video in software on an embedded system), or there's a problem with
>>> latency reporting in a live streaming pipeline.
>>>
>>> Cheers
>>>   -Tim
>>>
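One way to sanity-check the latency reporting Tim mentions is to run a latency query against the pipeline once it is playing and print what it answers. Below is a minimal sketch, assuming a GstElement *pipeline such as the ones built in the code further down; the function name is made up, only the query calls are GStreamer API.

#include <gst/gst.h>

/* Sketch: query the configured latency of a running pipeline.
 * "pipeline" is assumed to be the element returned by gst_pipeline_new(). */
static void print_pipeline_latency (GstElement *pipeline)
{
  GstQuery *query = gst_query_new_latency ();

  if (gst_element_query (pipeline, query)) {
    gboolean live;
    GstClockTime min_latency, max_latency;

    gst_query_parse_latency (query, &live, &min_latency, &max_latency);
    g_print ("live: %d, min latency: %" GST_TIME_FORMAT
        ", max latency: %" GST_TIME_FORMAT "\n",
        live, GST_TIME_ARGS (min_latency), GST_TIME_ARGS (max_latency));
  }

  gst_query_unref (query);
}

A live receiving pipeline that reports essentially zero minimum latency is one way the latency-reporting problem mentioned above can show up, as opposed to the CPU simply being too slow.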
>> Here are the pipelines
>>
>> on transmitter:
>>
>> v4l2src device=/dev/video6 num-buffers=-1 always-copy=false queue-size=1 !
>>   TIPrepEncBuf numOutputBufs=1 contiguousInputFrame=false !
>>   TIVidenc1 engineName=codecServer codecName=h264enc contiguousInputFrame=true rateControlPreset=1 !
>>   rtph264pay pt=96 ! udpsink (with properties)
>>
>> void create_pipe_universal_elements(void)
>> {
>>   GstBus *bus;
>>
>>   pipe.element.pipeline = gst_pipeline_new("camera_pipeline");
>>   bus = gst_pipeline_get_bus(GST_PIPELINE(pipe.element.pipeline));
>>   pipe.bus_watch_id = gst_bus_add_watch(bus, bus_call, pipe.loop);  // Register the message handler
>>   gst_object_unref(bus);
>>   pipe.element.video_src = gst_element_factory_make("v4l2src", "videosrc");
>>   g_object_set(GST_OBJECT(pipe.element.video_src), "device", get_isp_out_dev_name(), NULL);
>>   pipe.element.sink = gst_element_factory_make("udpsink", "videoudpsink");
>> }
>>
>> int create_pipe_elements_video(void)
>> {
>>   pipe.element.video_prep_buffer = gst_element_factory_make("TIPrepEncBuf", "TIPrepEncBuf");
>>   pipe.element.video_encoder = gst_element_factory_make("TIVidenc1", "TIVidenc1");
>>   pipe.element.rtph264pay = gst_element_factory_make("rtph264pay", "rtph264pay");
>>   if(!pipe.element.video_src || !pipe.element.video_prep_buffer || !pipe.element.video_encoder ||
>>       !pipe.element.rtph264pay || !pipe.element.sink)
>>     {
>>       warnx(ERROR_GST_ELEMENT_CREATION);
>>       return FALSE;
>>     }
>>   gst_bin_add_many(GST_BIN(pipe.element.pipeline), pipe.element.video_src,
>>       pipe.element.video_prep_buffer, pipe.element.video_encoder,
>>       pipe.element.rtph264pay, pipe.element.sink, NULL);
>>   if(TRUE != gst_element_link_many(pipe.element.video_src,
>>       pipe.element.video_prep_buffer, pipe.element.video_encoder,
>>       pipe.element.rtph264pay, pipe.element.sink, NULL))
>>     {
>>       warnx("Can not link video branch.");
>>       return FALSE;
>>     }
>>   g_object_set(GST_OBJECT(pipe.element.video_prep_buffer), "numOutputBufs", 1, "contiguousInputFrame", FALSE, NULL);
>>   g_object_set(GST_OBJECT(pipe.element.video_encoder), "engineName", "codecServer", "codecName", "h264enc",
>>       "contiguousInputFrame", TRUE, "rateControlPreset", IVIDEO_LOW_DELAY, NULL);
>>   g_object_set(GST_OBJECT(pipe.element.rtph264pay), "pt", 96, NULL);
>>   return TRUE;
>> }
>>
>>
>> on receiver:
>>
>> "udpsrc,rtph264depay,avdec_h264,/*videorate,*/clockoverlay,tee,/*queue,*/autovideosink" 
>>
>>
>>
>> data->video_pipeline = gst_pipeline_new ("videopipeline");
>> gst_bin_add_many(GST_BIN(data->video_pipeline), udpsrc, rtph264depay, avdec_h264, /*videorate,*/ clockoverlay, tee, /*queue,*/ autovideosink, NULL);
>>
>>   if (!gst_element_link_filtered (udpsrc, rtph264depay, udpsrc_caps)){
>>       GST_ERROR ("Can't link udpsrc and rtph264depay with caps");
>>   }
>>
>>   if (!gst_element_link_many (rtph264depay, avdec_h264, /*videorate,*/ clockoverlay, tee, /*queue,*/ autovideosink, NULL)){
>>       GST_ERROR ("Can't link many to tee");
>>   }
>>  /* if (!gst_element_link_many (queue, autovideosink, NULL)){
>>        GST_ERROR ("Can't link queue and videosink");
>>    }*/
>>
>>     gst_object_unref (G_OBJECT(videorate_src_pad));  // A memory error may occur here
>>     gst_caps_unref(videorate_caps);  // Free the caps
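The udpsrc_caps variable is not shown in the snippet above. For an H.264 RTP stream payloaded with pt=96 on the sender, a typical definition would look roughly like this; the exact caps used in the application may differ.

/* Assumed caps for the udpsrc -> rtph264depay link; the real ones are not
 * shown in the quoted code. */
GstCaps *udpsrc_caps = gst_caps_from_string (
    "application/x-rtp, media=(string)video, clock-rate=(int)90000, "
    "encoding-name=(string)H264, payload=(int)96");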
>>
>> These two pipelines are implemented in applications for the DM3730 and Android.
>>
>> How can I implement latency reporting?
>>
>
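In GStreamer, latency is reported automatically through latency queries when the pipeline goes to PLAYING; what the application is expected to do is react to GST_MESSAGE_LATENCY on the bus and redistribute the latency. A minimal sketch of how the existing bus_call watch could handle that, assuming the pipe.element.pipeline global from the transmitter code above (the same idea applies to data->video_pipeline on the receiver):

#include <gst/gst.h>

/* Sketch: handle latency messages inside the bus watch so that a new
 * latency reported by any element is redistributed over the pipeline. */
static gboolean bus_call (GstBus *bus, GstMessage *msg, gpointer data)
{
  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_LATENCY:
      gst_bin_recalculate_latency (GST_BIN (pipe.element.pipeline));
      break;
    default:
      break;
  }

  return TRUE;  /* keep the watch installed */
}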


