Encode YUV420 buffer with appsrc
Nicolas Dufresne
nicolas at ndufresne.ca
Fri Oct 6 19:42:10 UTC 2017
On Friday, 6 October 2017 at 11:41 -0700, pchaurasia wrote:
> I would like to encode the output of my opencv algorithm. The buffer to
> encode is not available exactly every 33 ms, as the encoder would need
> for 30 fps. It seems there are two ways I can send the
> buffer-to-be-encoded to appsrc:
>
> a) VideoWriter::write method.
>
> GstCaps *caps;
>
> string aaDebugPipeline = "appsrc name=myappsrc ! autovideoconvert ! "
>     "omxh265enc ! matroskamux ! filesink location=test.mkv";
> m_ppipeline = gst_parse_launch(aaDebugPipeline.c_str(), NULL);
> g_assert (m_ppipeline);
>
> m_pappsrc = gst_bin_get_by_name(GST_BIN(m_ppipeline), "myappsrc");
> g_assert (m_pappsrc);
>
> caps = gst_caps_new_simple ("video/x-raw",
> "format",G_TYPE_STRING,"I420",
> "bpp",G_TYPE_INT,12,
> "depth",G_TYPE_INT,8,
> "width", G_TYPE_INT, 1920,
> "height", G_TYPE_INT, 1080,
> "pitch", G_TYPE_INT, 2048,
This "pitch" field is not a valid video/x-raw caps field and will simply be
ignored. You need to use both plain caps and a GstVideoMeta on each buffer
(see the sketch a few lines below).
> "framerate", GST_TYPE_FRACTION, 30, 1,
> NULL);
>
> gst_app_src_set_caps(GST_APP_SRC(m_pappsrc), caps);
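For reference, a minimal sketch of what that could look like, assuming the
three I420 planes sit contiguously in the memory you wrap. The variable names
are the ones from your snippet, and "buf" is a placeholder for whichever
GstBuffer you are about to push (needs gst/app/gstappsrc.h and
gst/video/video.h):

    /* Caps carry only the standard video/x-raw fields */
    GstCaps *caps = gst_caps_new_simple ("video/x-raw",
        "format", G_TYPE_STRING, "I420",
        "width", G_TYPE_INT, 1920,
        "height", G_TYPE_INT, 1080,
        "framerate", GST_TYPE_FRACTION, 30, 1,
        NULL);
    gst_app_src_set_caps (GST_APP_SRC (m_pappsrc), caps);
    gst_caps_unref (caps);

    /* The 2048-byte pitch goes into a GstVideoMeta on every buffer;
     * offsets are the cumulative byte offsets of each plane */
    gsize offset[3] = { 0, 2048 * 1080, 2048 * 1080 + 1024 * 540 };
    gint stride[3] = { 2048, 1024, 1024 };
    gst_buffer_add_video_meta_full (buf, GST_VIDEO_FRAME_FLAG_NONE,
        GST_VIDEO_FORMAT_I420, 1920, 1080, 3, offset, stride);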
> GstStateChangeReturn state_ret =
> gst_element_set_state((GstElement*)m_ppipeline, GST_STATE_PLAYING);
> g_warning("set state returned %d\n", state_ret);
>
> m_videoWriter.open(aaDebugPipeline, 0, (double)30, cv::Size(1920, 1080), false);
> if (!m_videoWriter.isOpened()) {
> REPORT_ERROR("can't create writer\n");
> exit(1);
> }
>
>
> // call VideoWriter::write
>
> m_videoWriter.write(imgY);   // imgY is a cv::Mat holding the frame
>
> However I am not able to specify the format of the buffer when calling
> m_videoWriter.write(). Is there a way to tell m_videoWriter.write() which
> video frame format it is being given?
>
> b) I can attach a GstVideoMeta to a GstBuffer and pass that buffer to
> gst_app_src_push_buffer().
>
> void initAppSrc()
> {
> GstStateChangeReturn state_ret;
>
> m_offset[0] = m_offset[1] = m_offset[2] = 0;
> m_stride[0] = 2048; // TBD/FIXME: magic number
> m_stride[1] = 1024;
> m_stride[2] = 1024;
>
> m_ppipeline = (GstPipeline*)gst_pipeline_new("mypipeline");
> m_pappsrc = (GstAppSrc*)gst_element_factory_make("appsrc", "aa-appsrc");
> m_pvideoConvert = gst_element_factory_make("autovideoconvert", "aa-videoconvert");
> m_pencoder = gst_element_factory_make("omxh265enc", "aa-videoencoder");
> m_pmux = gst_element_factory_make("matroskamux", "aa-mux");
> m_pfsink = gst_element_factory_make("filesink", "aa-filesink");
>
>
> g_assert(m_ppipeline);
> g_assert(m_pappsrc);
> g_assert(m_pvideoConvert);
> g_assert(m_pencoder);
> g_assert(m_pmux);
> g_assert(m_pfsink);
>
>
> g_signal_connect(m_pappsrc, "need-data", G_CALLBACK(start_feed),
> this);
> g_signal_connect(m_pappsrc, "enough-data", G_CALLBACK(stop_feed),
> this);
>
> g_object_set( G_OBJECT( m_pfsink ), "location", "test1.mkv", NULL
> );
>
> gst_bin_add_many(GST_BIN(m_ppipeline), (GstElement*)m_pappsrc,
> m_pvideoConvert, m_pencoder, m_pmux, m_pfsink, NULL);
>
> if (!gst_element_link_many((GstElement*)m_pappsrc, m_pvideoConvert,
>         m_pencoder, m_pmux, m_pfsink, NULL)) {
>     g_warning("failed to link appsrc, autovideoconvert, encoder, muxer, and filesink");
> }
>
> state_ret = gst_element_set_state((GstElement*)m_ppipeline,
> GST_STATE_PLAYING);
> g_warning("set state returned %d\n", state_ret);
> }
>
> static gboolean read_data(gst_app_t *app)
> {
>
> // I would need to wait for opencv output here ;
> // Can this thread block ??
It's usually a bad idea to block in an idle callback; if you had a UI
running on the GMainLoop thread, you'd get a big disaster. Instead, block
directly in the need-data callback: loop and push, and stop once the
enough-data callback has fired (just set a flag when it is called; it will
likely be called re-entrantly while you are pushing). A rough sketch follows
below the quoted function.
>
> }
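Something along these lines, where wait_for_opencv_frame() is only a
placeholder for however you hand frames over from the OpenCV side (not a
real API), and which you would connect instead of start_feed/stop_feed:

#include <gst/app/gstappsrc.h>

/* placeholder: blocks until your OpenCV code has produced a GstBuffer */
extern GstBuffer *wait_for_opencv_frame (void);

static gboolean s_enough = FALSE;

/* enough-data: the appsrc queue is full, just remember it */
static void
enough_data_cb (GstElement *appsrc, gpointer user_data)
{
  s_enough = TRUE;
}

/* need-data: loop, blocking on the OpenCV side, and push until told to
 * stop; gst_app_src_push_buffer() takes ownership, so no unref here */
static void
need_data_cb (GstElement *appsrc, guint unused_size, gpointer user_data)
{
  s_enough = FALSE;
  while (!s_enough) {
    GstBuffer *buf = wait_for_opencv_frame ();

    if (gst_app_src_push_buffer (GST_APP_SRC (appsrc), buf) != GST_FLOW_OK)
      break;
  }
}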
>
> static void start_feed (GstElement * pipeline, guint size, gst_app_t *app)
> {
> if (app->sourceid == 0) {
> GST_DEBUG ("start feeding");
> app->sourceid = g_idle_add ((GSourceFunc) read_data, app);
> }
> }
>
> static void stop_feed (GstElement * pipeline, gst_app_t *app)
> {
> if (app->sourceid != 0) {
> GST_DEBUG ("stop feeding");
> g_source_remove (app->sourceid);
> app->sourceid = 0;
> }
> }
>
> void pushFrame()
> {
> int size = 1920*1080*1.5;
> m_pgstBuffer = gst_buffer_new_wrapped_full((GstMemoryFlags)0,
>     (gpointer)(img.data), size, 0, size, NULL, NULL);
> m_pgstVideoMeta = gst_buffer_add_video_meta_full(m_pgstBuffer,
>     GST_VIDEO_FRAME_FLAG_NONE, GST_VIDEO_FORMAT_I420,
>     1920, 1080, 3, m_offset, m_stride);
>
> //ref buffer to give copy to appsrc
> gst_buffer_ref(m_pgstBuffer);
>
>
> GstFlowReturn ret;
> ret = gst_app_src_push_buffer((GstAppSrc*)m_pappsrc, m_pgstBuffer);
> if(ret != GST_FLOW_OK)
> {
> g_printerr("could not push buffer\n");
> g_printerr("ret enum: %i\n", ret);
> }
>
> //dec. ref count so that we can edit data on next run
> gst_buffer_unref(m_pgstBuffer);
>
> }
>
>
> Here I am able to specify the input GstBuffer format. But I do not quite
> know how to supply the output of my opencv code to a function like
> pushFrame(). Or, alternatively, if I put a queue between my opencv output
> and the appsrc read function, read_data(), can that function block and
> wait for the opencv output to arrive?
appsrc already provides a queue; just configure its size, and enough-data
will tell you when it is full. See the sketch below.
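A minimal sketch (the max-bytes value here is an arbitrary choice, roughly
three I420 frames): with the "block" property enabled,
gst_app_src_push_buffer() simply waits while the queue is full, so the
thread running your OpenCV code can push each frame as soon as it is ready.

    g_object_set (m_pappsrc,
        "max-bytes", (guint64) (3 * 1920 * 1080 * 3 / 2),  /* ~3 I420 frames */
        "block", TRUE,          /* push-buffer blocks while the queue is full */
        NULL);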
>
> Thanks