How to decode a still-image bitstream in MS bitmap format?

Antonio Ospite ao2 at ao2.it
Mon Jan 7 16:36:49 UTC 2019


On Sat, 5 Jan 2019 05:00:07 +0000
Zhou Weidong <weidong.zhou at hotmail.com> wrote:

> Hello, All
> 
> I am a software engineer from NXElec working on a project involving a
> bitstream in MS bitmap format.
>
> Our target board receives a bitmap bitstream from a USB port and we
> want to use gstreamer/decodebin3 to decode it.  The problem we are
> currently facing is that decodebin3 hangs when the bitstream is sent
> from the USB host; the log messages:
[...]

I tried to simulate a test without involving USB:

$ gst-launch-1.0 videotestsrc pattern=ball num-buffers=100 !  avenc_bmp ! filesink location=multi_bmp.bin

And tried to decode multi_bmp.bin:

$ gst-launch-1.0 -e filesrc location=multi_bmp.bin ! typefind ! avdec_bmp ! videoconvert ! autovideosink

This fails (even for a single frame, though), because avdec_bmp does not
get the full frame:

ERROR	libav :0:: not enough data (4096 < 307254), trying to decode anyway
ERROR   libav :0:: not enough data (4042 < 307200)

Using multifilesrc would work for a single BMP file, but not for
multiple BMPs in one stream.

I used avdec_bmp because using decodebin3 (which picks up gdkpixbufdec)
did not work well for me with autovideosink.

> To compare with a similar situation, we save a single still image to
> a bitmap file and use gstreamer filesrc to decode it.  This time,
> decoding succeeds. The log messages are below:
[...]

> I noticed that the difference between the two scenarios is that
> decodebin3 detects an EOS signal when it reads from a bitmap file. In
> USB device mode, no EOS signal is detected, and this makes decodebin3
> hang...
>

When loading a single frame you can pass it all at once to the decoder
and it will know how to decode it; I think the EOS marks the end of the
frame buffer in your second test.

When sending multiple frames without a container you need something to
tell where one frame ends or, I guess depending on the decoder, to pass
single full frames to the decoder one by one.

This is the job of a "parser" element in GStreamer.
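
Note that each BMP image already declares its own total size in its
file header, so a parser can find frame boundaries without any extra
tags or delimiters in the stream. A minimal sketch of reading that
field in Python (the function name is just illustrative):

import struct

def bmp_frame_size(header):
    """Total size of one BMP frame, taken from its 14-byte file header."""
    # A BMP file starts with the 'BM' magic, followed by the total file
    # size as a little-endian 32-bit integer at offset 2.
    if header[:2] != b'BM':
        raise ValueError('not a BMP header')
    return struct.unpack_from('<I', header, 2)[0]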

> So the question is: how can we use decodebin3 to decode a bitmap
> bitstream?  A bitmap bitstream here means a bitstream that contains
> consecutive still images in MS Bitmap format. Currently we send this
> bitmap stream from a Windows host; I guess adding some tags/delimiters
> between images in the bitstream would let decodebin3 recognize and
> decode it properly, but I do not know how.
>

A possible solution is to write a bmpparse element to split the
bitstream into single frames:

$ gst-launch-1.0 -e filesrc location=multi_bmp.bin ! bmpparse ! avdec_bmp ! videoconvert ! autovideosink

I hacked up a proof-of-concept with a hardcoded buffer size in Python
and it seems to do the job.
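
The proper fix is a real parser element, but as a quick experiment the
stream can also be split offline with a few lines of Python, reading
the per-frame size from the BMP header instead of hardcoding it. A
rough sketch (script and file names are just examples):

#!/usr/bin/env python3
# Split a concatenated BMP stream into numbered single-image files.
import struct
import sys

def split_bmp_stream(path, prefix='frame_'):
    with open(path, 'rb') as f:
        data = f.read()

    offset = 0
    count = 0
    while offset + 14 <= len(data):
        if data[offset:offset + 2] != b'BM':
            raise ValueError('no BMP header at offset %d' % offset)
        # bfSize: total size of this image, little-endian uint32 at offset 2
        size = struct.unpack_from('<I', data, offset + 2)[0]
        with open('%s%04d.bmp' % (prefix, count), 'wb') as out:
            out.write(data[offset:offset + size])
        offset += size
        count += 1
    return count

if __name__ == '__main__':
    print('wrote %d frames' % split_bmp_stream(sys.argv[1]))

Each resulting file then contains exactly one frame, so it should be
possible to feed them to the decoder one at a time, e.g. with
multifilesrc and a location pattern.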

> Thanks in advance for any suggestions/clues. The current
> gstreamer/decodebin3 version we are using is 1.10.4.
> 

Try to use the latest stable version if possible.

Ciao,
   Antonio

-- 
Antonio Ospite
https://ao2.it
https://twitter.com/ao2it

A: Because it messes up the order in which people normally read text.
   See http://en.wikipedia.org/wiki/Posting_style
Q: Why is top-posting such a bad thing?

