Encode YUV420 buffer with appsrc

Antonio Ospite ao2 at ao2.it
Sun Oct 15 18:40:24 UTC 2017


On Sun, 15 Oct 2017 00:59:48 -0700 (MST)
pchaurasia <pchaurasia at gameloreinc.com> wrote:

[...]
> When I move away from the experimental code (i.e. the code where you
> suggested changes) and use my actual production code, I have a problem.
> We applied your changes to our production code. Previously (i.e. without
> your recent changes) we were getting an mp4 bitstream which just had a
> header and would not play. Now, with your changes, we are getting a
> legal bitstream which is played by mplayer, but the picture is visually
> garbage.
> 
> I am suspecting that this could be due to our buffer format. In our
> production code we have three matrices, one each for Y, U and V. The
> data for each of these matrices is not contiguous in memory.

So does your data look something like this?

YYYYYYYYPP
YYYYYYYYPP
...
UUUUP
UUUUP
...
VVVVP
VVVVP
...

> My question is how to specify the start of each of these buffers in the
> gst_buffer_add_video_meta_full() call. The offset array that
> gst_buffer_add_video_meta_full() takes as input would be the way to
> specify the start of each of the Y, U and V buffers. Can it take
> negative offsets? What is the reference point of this offset? I.e. I am
> thinking
> 
> offset[0] = 0; // this specifies start of Y buffer
> offset[1] = YStart - UStart; // this specifies start of U from Y
> offset[2] = YStart - VStart; // this specifies start of V from Y
> 
> Also, the strides are different for the Y and UV buffers in our case.
> I specify the strides like:
> 
> stride[0] = 2048;           // Y Buffer stride
> stride[1] = 1024;           // U buffer stride
> stride[2] = 1024;           // V buffer stride
> 
> Our YUV buffer is 1920x1080 resolution. However, due to the memory
> organization, the strides are different. Please let me know your
> thoughts. The aforesaid settings are still not working in our production
> code, but with your changes we do get a legal bitstream, which did not
> happen before your changes.
> 

The first quick experiment I'd do is to produce a 2048x1080 video
assuming contiguous data, and see what it looks like.
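
For instance, the appsrc caps for that experiment could be set up
something like this (untested, and the 30/1 framerate is just a
placeholder, use whatever you have):

#include <gst/gst.h>

/* Untested sketch: configure appsrc for the "padded frame as-is"
 * experiment, i.e. treat the data as a contiguous 2048x1080 I420
 * frame, so the padding shows up as extra columns on the right. */
static void
set_padded_caps (GstElement *appsrc)
{
  GstCaps *caps = gst_caps_new_simple ("video/x-raw",
      "format", G_TYPE_STRING, "I420",
      "width", G_TYPE_INT, 2048,
      "height", G_TYPE_INT, 1080,
      "framerate", GST_TYPE_FRACTION, 30, 1,
      NULL);

  g_object_set (appsrc, "caps", caps, NULL);
  gst_caps_unref (caps);
}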

Could you also dump the original data to a file you can share?
Just one frame is enough.
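
A minimal sketch of the kind of dump I mean, using the strides you
mentioned (the 1080 and 540 plane heights are my assumption, adjust
them if the planes are also padded vertically):

#include <stdio.h>
#include <glib.h>

/* Untested sketch: dump the three planes exactly as they are in
 * memory, padding included, so the layout can be inspected offline. */
static void
dump_one_frame (const char *path,
                const guint8 *y, const guint8 *u, const guint8 *v)
{
  FILE *f = fopen (path, "wb");

  if (f == NULL)
    return;

  fwrite (y, 1, 2048 * 1080, f);  /* Y plane, stride 2048 */
  fwrite (u, 1, 1024 * 540, f);   /* U plane, stride 1024 */
  fwrite (v, 1, 1024 * 540, f);   /* V plane, stride 1024 */

  fclose (f);
}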

Then you could make your self-contained test program read the data back
from a file for the appsrc to push down the pipeline; even if it's
always the same image, this can still be used to validate the final
visual result you are after.
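
For the read-back part something like this untested sketch could be
enough, calling it for each frame you want to push:

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Untested sketch: read the dumped frame back and push it to appsrc. */
static GstFlowReturn
push_frame_from_file (GstAppSrc *appsrc, const char *path)
{
  gchar *data = NULL;
  gsize len = 0;

  if (!g_file_get_contents (path, &data, &len, NULL))
    return GST_FLOW_ERROR;

  /* gst_buffer_new_wrapped() takes ownership of the g_malloc'd data,
   * and gst_app_src_push_buffer() takes ownership of the buffer. */
  return gst_app_src_push_buffer (appsrc, gst_buffer_new_wrapped (data, len));
}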

With this setup it would be easier for us to replicate your issue.

I have never used gst_buffer_add_video_meta_full(), so I might be
interested in playing with it a little bit.
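
From a quick look at the documentation I would expect something along
these lines: one GstMemory per plane appended to the buffer, and offsets
relative to the start of the whole buffer (they are gsize values, so
they cannot be negative). Take it as a completely untested sketch, the
plane pointers and sizes are placeholders for whatever your code
provides:

#include <gst/gst.h>
#include <gst/video/video.h>

/* Untested sketch: wrap three non-contiguous I420 planes in one
 * GstBuffer and describe the layout with a GstVideoMeta. */
static GstBuffer *
wrap_planes (gpointer y_data, gsize y_size,
             gpointer u_data, gsize u_size,
             gpointer v_data, gsize v_size)
{
  GstBuffer *buf = gst_buffer_new ();
  gsize offset[GST_VIDEO_MAX_PLANES] = { 0, };
  gint stride[GST_VIDEO_MAX_PLANES] = { 0, };

  /* One GstMemory per plane; the buffer is the concatenation of the
   * three. The NULL notify means the caller keeps ownership of the
   * plane data and must keep it alive while the buffer is in use. */
  gst_buffer_append_memory (buf,
      gst_memory_new_wrapped (GST_MEMORY_FLAG_READONLY,
          y_data, y_size, 0, y_size, NULL, NULL));
  gst_buffer_append_memory (buf,
      gst_memory_new_wrapped (GST_MEMORY_FLAG_READONLY,
          u_data, u_size, 0, u_size, NULL, NULL));
  gst_buffer_append_memory (buf,
      gst_memory_new_wrapped (GST_MEMORY_FLAG_READONLY,
          v_data, v_size, 0, v_size, NULL, NULL));

  /* Offsets are relative to the start of the whole buffer, i.e. the
   * three memories concatenated, never negative. */
  offset[0] = 0;
  offset[1] = y_size;
  offset[2] = y_size + u_size;

  /* Strides include the padding at the end of each row. */
  stride[0] = 2048;
  stride[1] = 1024;
  stride[2] = 1024;

  gst_buffer_add_video_meta_full (buf, GST_VIDEO_FRAME_FLAG_NONE,
      GST_VIDEO_FORMAT_I420, 1920, 1080, 3, offset, stride);

  return buf;
}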

Ciao,
   Antonio

-- 
Antonio Ospite
https://ao2.it
https://twitter.com/ao2it

A: Because it messes up the order in which people normally read text.
   See http://en.wikipedia.org/wiki/Posting_style
Q: Why is top-posting such a bad thing?

