Playing a raw h.264 stream from a USB source (timestamp issues)

mikeGD miguel.exposito at generaldrones.es
Wed Mar 17 20:09:33 UTC 2021


I'm dealing with a live video source (a drone wireless video receiver) that
outputs a raw h.264 video stream over USB. My goal is to integrate it into
QGroundControl on Android, which has a GStreamer pipeline like this:
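
(QGroundControl builds the pipeline in code; the rough gst-launch
equivalent would be something like the line below. The port, decoder and
sink are just illustrative here, not QGC's exact elements.)

    gst-launch-1.0 udpsrc port=5600 \
        caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" \
        ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink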



I have dumped a slice of the received USB data to a file, which is perfectly
playable with vlc using the following command:
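
Something along these lines, forcing VLC's raw h.264 demuxer and a
framerate (the filename and the 30 fps figure are placeholders):

    vlc --demux h264 --h264-fps 30 dump.h264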



However, if I play it back using this GStreamer pipeline, the playback speed
is far too high (roughly 10x):
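
A plain playback pipeline of this shape shows the problem (the filename is
a placeholder):

    gst-launch-1.0 filesrc location=dump.h264 ! h264parse ! avdec_h264 \
        ! videoconvert ! autovideosink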



I'm using appsrc to push the USB data into the QGroundControl pipeline. The
video plays, but lots of frames are dropped and GStreamer complains that
frames are being dropped because they arrive too late.



After closer inspection of my dump, I realized that the stream lacks PTS and
DTS information (which seems to be usual for baseline h.264 streams).
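
One way to see this is to drop an identity element into the pipeline; with
-v it prints each buffer's pts, dts and duration (shown here on the dump
file, just as an illustration of the check):

    gst-launch-1.0 -v filesrc location=dump.h264 ! h264parse \
        ! identity silent=false ! fakesink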



But apparently, the duration information is there.

The USB endpoint reads 512-byte chunks (due to the USB Hi-Speed max. payload
size for a bulk endpoint), and some transfers are smaller (400+ bytes long).
I have no way to detect the beginning/end of NALs since it's an opaque
continuous byte stream. (video/x-h264, stream-format=(string)byte-stream,
alignment=none)
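
For context, this is roughly how those caps end up on the appsrc (a sketch;
the format/is-live settings are what I would expect, not necessarily what
QGroundControl does):

    /* appsrc: the GstElement * obtained when building the pipeline */
    GstCaps *caps = gst_caps_from_string(
        "video/x-h264, stream-format=(string)byte-stream, "
        "alignment=(string)none");
    g_object_set(appsrc,
                 "caps", caps,
                 "format", GST_FORMAT_TIME,  /* timestamps in nanoseconds */
                 "is-live", TRUE,
                 NULL);
    gst_caps_unref(caps);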

So I built an appsrc to push the video stream to the pipeline and tried to
blindly timestamp the buffers like this:
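
Roughly like this (a sketch of the idea; the 30 fps figure and the
one-chunk-per-frame assumption are exactly the "blind" part):

    #include <gst/gst.h>
    #include <gst/app/gstappsrc.h>

    static GstClockTime next_pts = 0;

    /* Called for every USB transfer (512 bytes or less). */
    static void push_chunk(GstAppSrc *appsrc, const guint8 *data, gsize size)
    {
        GstBuffer *buf = gst_buffer_new_allocate(NULL, size, NULL);
        gst_buffer_fill(buf, 0, data, size);

        /* Pretend every chunk is one frame at 30 fps; there are no
         * B-frames, so DTS == PTS. */
        GST_BUFFER_PTS(buf) = next_pts;
        GST_BUFFER_DTS(buf) = next_pts;
        GST_BUFFER_DURATION(buf) = gst_util_uint64_scale_int(GST_SECOND, 1, 30);
        next_pts += GST_BUFFER_DURATION(buf);

        gst_app_src_push_buffer(appsrc, buf);  /* takes ownership of buf */
    }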



... but still no luck ...

I have had limited success with the following pipeline, which payloads the
h.264 stream into RTP, then depayloads and decodes it with a caps filter
specifying the target framerate:
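
Something along these lines, with the framerate forced through a caps
filter (the exact placement of the caps filter may differ; 30/1 is the
hard-coded assumption I'd like to avoid):

    gst-launch-1.0 filesrc location=dump.h264 \
        ! "video/x-h264, stream-format=(string)byte-stream, alignment=(string)none, framerate=(fraction)30/1" \
        ! h264parse ! rtph264pay ! rtph264depay ! h264parse \
        ! avdec_h264 ! videoconvert ! autovideosink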



I could build that into QGroundControl in C++, but I don't think it's the
right approach, and I shouldn't make any assumptions about the target
framerate: in this case it's 30 fps, but it may change dynamically.

So, my questions are:

- What would be the right approach to get the video playing at the right
speed without any frame drops?
- Is it reasonable or possible to ask GStreamer to generate the PTS/DTS
(there are no B-frames, so PTS should equal DTS) from the duration
information of the packets, using the standard pipeline?

Any help would be greatly appreciated.
Thanks!

Mike




