What a pity

Jeff Shanab jshanab at jfs-tech.com
Tue Jan 4 00:26:23 UTC 2022


I need to confess, I work on IP cameras all day long. I work for a company
that writes Security Camera software.

In the beginning, there was analog (bear with me):
  Very low latency, no buffering, but very slow searching on that old VCR.

Then IP cameras came on the scene with the same resolution, but using
frame-by-frame compression.
  Variable framerate, adjustable on demand to control bandwidth.
  Storage was predictable and pruning was linear.
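
A quick back-of-envelope sketch of why that storage was predictable (the
average frame size here is an assumption for illustration, not a real figure):

  # frame-by-frame compression: every frame is roughly the same size, so disk
  # usage grows linearly with time regardless of scene motion
  fps=30
  avg_frame_kb=60      # assumed average compressed frame size
  echo "per day: $(( fps * avg_frame_kb * 3600 * 24 / 1024 / 1024 )) GB"   # ~148 GB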

Then came MP4, H.264, H.265, B-frames, the VPx codecs:
  GOP-based video - a periodic keyframe and then difference frames around
1/10 its size, allowing megapixel bandwidth and storage.
  But now the encoding delay becomes important. On 30 fps video it is not
uncommon for encoding the keyframe to take longer than the sample time.
  This requires operating in the past, ever so slightly.
    Let's say encoding the keyframe takes 25 ms and each diff frame takes
8 ms. No problem: 25 + 29 * 8 = 257 ms, much less than the second it spans.
    But you need to buffer and operate with a latency of about 18 ms
downstream to stay smooth AND, IMPORTANT FOR FFMPEG, not complain about
frames arriving too late to display and drop them right after the keyframe.
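
A back-of-envelope sketch of that timing budget (same illustrative numbers as
above, not measurements from any particular camera):

  # a 30 fps, 1-second GOP = 1 keyframe + 29 difference frames
  keyframe_ms=25
  diff_ms=8
  echo "encode time per GOP:            $(( keyframe_ms + 29 * diff_ms )) ms"   # 257, fits in 1000 ms
  echo "extra delay after the keyframe: $(( keyframe_ms - diff_ms )) ms"        # ~17 ms to absorb downstream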

Now there is HLS.
   This introduces segments: a segment must start with a keyframe, and
usually each segment is made of n complete GOPs, which makes searching
easier.
   Now you must buffer about 1.5 segments, and the viewer operates with
multiple seconds of delay.
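
The segment duration is the knob here. A rough sketch (not the poster's
pipeline; device and paths are just borrowed from the example quoted further
down) using hlssink2's target-duration property, in seconds:

  gst-launch-1.0 -e v4l2src device=/dev/video2 ! \
        video/x-h264,width=1920,height=1080,framerate=30/1 ! \
        h264parse ! queue ! \
        hlssink2 target-duration=2 max-files=5 \
          playlist-location=/dev/shm/channel1.m3u8 \
          location=/dev/shm/segment_%05d.ts

The viewer-side delay then scales with that duration rather than with the
default segment length.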

In the beginning, Apple specced HLS with 10-second segments; you watched
video that was almost always 12 seconds behind.
I ran it with segments as short as 2 seconds before DASH came around with...
you guessed it, shorter segments.
Both work by having a manifest, essentially a virtual filesystem (kept in
memory for live), but the delay is IMHO unacceptable.
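
For the record, such a live manifest is just a small rolling text file; a
hedged illustration of roughly what one looks like (segment names follow the
pipeline quoted further down, the numbers are made up):

  #EXTM3U
  #EXT-X-VERSION:3
  #EXT-X-TARGETDURATION:2
  #EXT-X-MEDIA-SEQUENCE:117
  #EXTINF:2.000,
  segment_00117.ts
  #EXTINF:2.000,
  segment_00118.ts
  #EXTINF:2.000,
  segment_00119.ts

A live player keeps re-fetching this file and starts playback a few segments
back from the live edge, which is where the built-in delay comes from.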

Newer protocols on newer browsers focus on reducing the latency, but good
luck getting all browsers to cooperate on a standard.

BTW, in just the last 5 years IP cameras have gone from hundreds of ms of
encoding delay on their side to less than 10 ms.




On Mon, Jan 3, 2022 at 6:58 PM Peter Maersk-Moller via gstreamer-devel <
gstreamer-devel at lists.freedesktop.org> wrote:

> Some webcams, like the old-timer Logitech C920, can deliver H.264 encoded
> video over V4L2.
>
> That said, while cameras like the C920 record frames at a steady rate,
> evenly spaced apart in time, these arrive at the driver/kernel level
> unevenly spaced apart and may need a mechanism to timestamp them with
> evenly spaced timestamps. However, the uneven spacing may only occur once
> per second or perhaps once per I-frame. Usually the I-frame will be late
> and the following P and B frames may arrive in a bulk or, in some cases,
> the P and B frames following an I-frame may have the same or nearly the
> same timestamp. If this is the case and the hlssink2 muxer does not correct
> the timestamps to be evenly spread out (it most likely does not), the muxer
> may have a hard time muxing audio with perhaps correct timestamps and video
> with basically faulty timestamps. You can check that by adding the
> "identity silent=false" element to your video pipeline before the hlssink2
> video input. You may have to add "-v" to gst-launch-1.0.
>
> Regards
> Peter MM
>
> On Mon, Jan 3, 2022 at 11:31 PM Jeff Shanab via gstreamer-devel <
> gstreamer-devel at lists.freedesktop.org> wrote:
>
>> Webcams are raw YUV video, aren't they? gstreamer in this case is using
>> ffmpeg's x-h264 to encode it
>>
>>
>> On Mon, Jan 3, 2022 at 4:51 PM Nicolas Dufresne via gstreamer-devel <
>> gstreamer-devel at lists.freedesktop.org> wrote:
>>
>>> Hi James,
>>>
>>> Le lundi 03 janvier 2022 à 07:16 +0800, James via gstreamer-devel a
>>> écrit :
>>> > gstreamer seems very nice in concept. The fact that I've been trying
>>> for 3 months and can get no help is a big deterrent.
>>> >
>>> > I've got a 4 core i7 NUC clocked to 4.8G and I get a stream of QoS
>>> messages telling me the computer is too slow.
>>> > (GST_DEBUG=2,pulsesrc:6)
>>>
>>> Sorry if previous messages were missed. Perhaps your issue is specific to
>>> your webcam? I don't own a webcam that encodes to H264 myself, so I've
>>> used vaapih264enc, and performance was decent.
>>>
>>> Asking questions on public channels is an art: make sure to narrow down
>>> your issue as much as possible, and try to think about how others will be
>>> able to reproduce it. If you can't, make sure to share extra information
>>> that would allow controlling the variables when simulating with
>>> audiotestsrc and similar.
>>>
>>> regards,
>>> Nicolas
>>>
>>> >
>>> > The machine is idle running a single pipeline.
>>> > The stream stutters. ffmpeg shows dup and often 100 dropped frames
>>> on each segment.
>>> >
>>> > Using audiotestsrc renders perfectly.
>>> >
>>> > #! /bin/bash
>>> >
>>> > IP=`hostname -I`
>>> >
>>> > gst-launch-1.0 -e -v v4l2src device=/dev/video2 ! \
>>> >       video/x-h264,width=1920,height=1080,framerate=30/1 ! \
>>> >       h264parse ! \
>>> >       tee name=vt \
>>> >       vt. ! queue ! hlssink2 max-files=5 name=hl \
>>> >       playlist-root=http://$IP
>>> playlist-location=/dev/shm/channel1.m3u8 location=/dev/shm/segment_%05d.ts \
>>> >       pulsesrc device=0 ! audioconvert ! avenc_aac ! \
>>> >         tee name=at \
>>> >       at. ! queue ! aacparse ! hl.audio
>>> >
>>> > The redundant tee's are for use later.
>>> > Unless I see a eureka moment I'll have to try something else.
>>> > James
>>> >
>>>
>>>