AVPacket to GStreamer pipeline

frederic frenky.picasso at seznam.cz
Wed Jan 6 10:02:07 UTC 2021


Hi, 
I still have an issue sending an AVPacket to a GStreamer pipeline. I went
through the gst-libav <https://github.com/GStreamer/gst-libav/tree/master/ext/libav>
code and found some hints there to improve my code (thank you, gst-libav
team). However, it still does not work and I have no idea what is wrong.

I have two threads. The first thread reads the video file, demuxes it, and
creates a GstBuffer from each AVPacket: 
// Create GstBuffer
buffer = gst_buffer_new_and_alloc(packet.size);
gst_buffer_fill(buffer, 0, packet.data, packet.size);
// Send the GstBuffer to the app_source element in the pipeline
gst_app_src_push_buffer(GST_APP_SRC(app_source), buffer);


The second thread runs the GStreamer pipeline. I tested it with two different
video files, MP4 and Ogg, with what I hope are suitable video caps:

For ogg file:
container: ogg
codec: Theora
dim: 1920x1080
framerate: 30fps

the video caps are:
const gchar *video_caps = "video/x-theora, width=(int)1920,
height=(int)1080, framerate=(fraction)30/1";


For mp4 file:
container: QuickTime
codec: H.264
dim: 1920x1080
framerate: 30fps

I tried four video caps:
const gchar *video_caps = "video/x-h264, width=(int)1920, height=(int)1080,
framerate=(fraction)30/1, alignment=(string)au, stream-format=(string)avc";
const gchar *video_caps = "video/x-h264, width=(int)1920, height=(int)1080,
framerate=(fraction)30/1, alignment=(string)au,
stream-format=(string)byte-stream";
const gchar *video_caps = "video/x-h264, width=(int)1920, height=(int)1080,
framerate=(fraction)30/1, alignment=(string)nal, stream-format=(string)avc";
const gchar *video_caps = "video/x-h264, width=(int)1920, height=(int)1080,
framerate=(fraction)30/1, alignment=(string)nal,
stream-format=(string)byte-stream";
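
One thing I am not sure about: from what I have read, caps with
stream-format=avc usually also carry a codec_data field built from the
stream's extradata (the avcC box), and my caps do not have it. If that is
required, the caps would presumably need to look something like this (the
actual codec_data bytes elided here; they would come from
AVCodecParameters->extradata):

```
video/x-h264, width=(int)1920, height=(int)1080,
framerate=(fraction)30/1, alignment=(string)au,
stream-format=(string)avc, codec_data=(buffer)...
```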


and finally the pipeline is:
string = g_strdup_printf("appsrc name=app_source caps=\"%s\" ! decodebin !
videoconvert ! autovideosink", video_caps);


This code produces the following errors:

For the Ogg file:
"ERROR from element theoradec0: Could not decode stream. Debugging info:
gsttheoradec.c(812): theora_handle_data_packet ():
/GstPipeline:pipeline0/GstTheoraDec:theoradec0: no header sent yet"


For the MP4 file:
- for stream-format=avc in the video caps:
"ERROR from element vaapidecode0: No valid frames decoded before end of
stream
Debugging info: gstvideodecoder.c(1161):
gst_video_decoder_sink_event_default ():
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstVaapiDecodeBin:vaapidecodebin0/GstVaapiDecode:vaapidecode0:
no valid frames found"

- for stream-format=byte-stream in the video caps:
"ERROR from element h264parse0: No valid frames found before end of stream
Debugging info: gstbaseparse.c(1329): gst_base_parse_sink_event_default ():
/GstPipeline:pipeline0/GstH264Parse:h264parse0"

Does anyone know what is wrong here?

The only thing I can think of is the setting of the timestamp and duration on
the GstBuffer. In the gst_ffmpegdemux_loop function in gstavdemux.c
<https://github.com/GStreamer/gst-libav/blob/master/ext/libav/gstavdemux.c>
in gst-libav, they do the following:

timestamp = gst_ffmpeg_time_ff_to_gst(packet.pts, av_stream->time_base);
duration = gst_ffmpeg_time_ff_to_gst(packet.duration, av_stream->time_base);
GST_BUFFER_TIMESTAMP(buffer) = timestamp;
GST_BUFFER_DURATION(buffer) = duration;

where the timestamp and duration are computed from the AVPacket and the
AVStream's time_base. I added this to my code too, but maybe I got it wrong.
Could wrong timestamps and durations cause these errors? Thank you.

My complete code with this example is here:
https://pastebin.com/wrSDPuD0

--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/


More information about the gstreamer-devel mailing list