GStreamer playback of MP4 files with and without audio

Gareth Alldread gda190672 at gmail.com
Thu Oct 3 08:24:33 UTC 2024


I haven't got experience with filesrc, but I have attached my example doing
something similar with uridecodebin (which will also work with a video file,
by the way).  I have stripped this out of my app, so it is encapsulated inside
a class, but I'm sure you get the idea.
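
If you would rather keep your filesrc ! qtdemux pipeline instead of switching
to uridecodebin, the same pattern applies: qtdemux emits "pad-added" once per
stream it finds, so you only build the audio branch when an audio pad actually
shows up.  Below is a rough, untested sketch along those lines (plain C-style
GStreamer API, no class wrapper).  The element chain mirrors your gst-launch
line, "temp.mp4" is a placeholder path, and I used decodebin plus autoaudiosink
for the audio branch; swap in pulsesink if you prefer.

/* Rough, untested sketch: the same dynamic-pad idea applied to a
   filesrc ! qtdemux pipeline.  Build with something like:
       g++ demo.cpp $(pkg-config --cflags --libs gstreamer-1.0)         */
#include <gst/gst.h>

typedef struct {
	GstElement *pipeline;
	GstElement *videoQueue;	/* head of the always-present video branch */
} DemoCtx;

/* decodebin's source pad is also dynamic; link it to audioconvert when it appears. */
static void on_decode_pad_added(GstElement *dec, GstPad *pad, gpointer convp)
{
	GstPad *sink = gst_element_get_static_pad(GST_ELEMENT(convp), "sink");
	if (!gst_pad_is_linked(sink))
		gst_pad_link(pad, sink);
	gst_object_unref(sink);
}

/* qtdemux emits "pad-added" once per stream found in the file. */
static void on_demux_pad_added(GstElement *demux, GstPad *pad, gpointer userData)
{
	DemoCtx *ctx = (DemoCtx *)userData;
	GstCaps *caps = gst_pad_get_current_caps(pad);
	if (!caps)
		caps = gst_pad_query_caps(pad, NULL);
	const gchar *type = gst_structure_get_name(gst_caps_get_structure(caps, 0));

	if (g_str_has_prefix(type, "video/")) {
		GstPad *sink = gst_element_get_static_pad(ctx->videoQueue, "sink");
		if (!gst_pad_is_linked(sink))
			gst_pad_link(pad, sink);
		gst_object_unref(sink);
	} else if (g_str_has_prefix(type, "audio/")) {
		/* Only now build the audio branch:
		   queue ! decodebin ! audioconvert ! autoaudiosink
		   (swap autoaudiosink for pulsesink if you prefer). */
		GstElement *q    = gst_element_factory_make("queue", NULL);
		GstElement *dec  = gst_element_factory_make("decodebin", NULL);
		GstElement *conv = gst_element_factory_make("audioconvert", NULL);
		GstElement *snk  = gst_element_factory_make("autoaudiosink", NULL);
		gst_bin_add_many(GST_BIN(ctx->pipeline), q, dec, conv, snk, NULL);
		gst_element_link(q, dec);
		gst_element_link(conv, snk);
		g_signal_connect(dec, "pad-added", G_CALLBACK(on_decode_pad_added), conv);

		/* Elements added to a running pipeline start in the NULL state; without
		   this they never process data and the pipeline appears to freeze. */
		gst_element_sync_state_with_parent(q);
		gst_element_sync_state_with_parent(dec);
		gst_element_sync_state_with_parent(conv);
		gst_element_sync_state_with_parent(snk);

		GstPad *sink = gst_element_get_static_pad(q, "sink");
		gst_pad_link(pad, sink);
		gst_object_unref(sink);
	}
	gst_caps_unref(caps);
}

int main(int argc, char *argv[])
{
	gst_init(&argc, &argv);

	DemoCtx ctx;
	ctx.pipeline = gst_pipeline_new("play");
	GstElement *src   = gst_element_factory_make("filesrc",     NULL);
	GstElement *demux = gst_element_factory_make("qtdemux",     NULL);
	ctx.videoQueue    = gst_element_factory_make("queue",       NULL);
	GstElement *parse = gst_element_factory_make("h264parse",   NULL);
	GstElement *vdec  = gst_element_factory_make("openh264dec", NULL);
	GstElement *vsink = gst_element_factory_make("glimagesink", NULL);

	g_object_set(src, "location", "temp.mp4", NULL);	/* placeholder path */
	gst_bin_add_many(GST_BIN(ctx.pipeline), src, demux, ctx.videoQueue, parse, vdec, vsink, NULL);
	gst_element_link(src, demux);	/* qtdemux source pads appear later */
	gst_element_link_many(ctx.videoQueue, parse, vdec, vsink, NULL);
	g_signal_connect(demux, "pad-added", G_CALLBACK(on_demux_pad_added), &ctx);

	gst_element_set_state(ctx.pipeline, GST_STATE_PLAYING);

	/* Block until error or end-of-stream. */
	GstBus *bus = gst_element_get_bus(ctx.pipeline);
	GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
		(GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
	if (msg)
		gst_message_unref(msg);
	gst_object_unref(bus);
	gst_element_set_state(ctx.pipeline, GST_STATE_NULL);
	gst_object_unref(ctx.pipeline);
	return 0;
}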



On Wed, 2 Oct 2024 at 17:46, Terry Barnaby via gstreamer-devel <
gstreamer-devel at lists.freedesktop.org> wrote:

> I have tried using g_signal_connect(m_sourceElement, "pad-added", ...) and,
> within that callback, adding a sub-pipeline to the main pipeline, but I am
> getting something wrong when I add the sub-pipeline.
>
> My simple program (attached) sees the signal callback, but when I add the
> sub-pipeline and link it to the new pad, the displayed video just freezes.
> I've looked at "Basic tutorial 3: Dynamic pipelines" and it looks like I am
> doing the right thing, and no errors are seen.
>
> Any ideas on what I am doing wrong?
>
>
> On 02/10/2024 08:26, Gareth Alldread via gstreamer-devel wrote:
>
> I would add a handler for the "pad-added" signal on the element that creates
> its stream pads dynamically (qtdemux in your pipeline, rtspsrc/uridecodebin in
> mine) and only create/add the audio part of the pipeline if an audio pad
> actually appears.  That is what I have done to handle rtspsrc streams that may
> or may not have audio.  Reply if you want more details.
>
> g_signal_connect(m_sourceElement, "pad-added",
> G_CALLBACK(&SynxRTSPPipeline::onPadAdded), this);
>
> On Wed, 2 Oct 2024 at 07:30, Terry Barnaby via gstreamer-devel <
> gstreamer-devel at lists.freedesktop.org> wrote:
>
>> I am developing a C++ video inspection program that creates MP4/H264/MP3
>> files and needs to play them back, with the video stream processed by
>> various GStreamer elements and the audio handled separately.
>>
>> In general this has been working fine when just video was being
>> recorded/played back, but I am just adding audio to the mix. Some MP4 files
>> will contain only a video stream and some will contain both video and audio
>> streams.
>>
>> I need some way to handle the playback of these MP4 files that may or may
>> not have MP3 audio streams.
>>
>> As a simple idea, if I use something like the following (the real C++ code
>> constructs the GStreamer pipeline and tees the video stream to various
>> GStreamer sub-pipelines):
>>
>> gst-launch-1.0 -v filesrc location=temp.mp4 ! qtdemux name=demux
>>
>>  demux.video_0 ! queue ! h264parse ! openh264dec ! glimagesink
>>
>>  demux.audio_0 ! queue ! decodebin ! audioconvert ! pulsesink
>>
>> This plays back the video and audio streams fine (though I am not sure how
>> well they are synchronised) from an MP4 with both video and audio streams,
>> but it hangs if the MP4 only has a video stream.
>>
>> So I think I need to do one of the following:
>>
>> 1. Find some GStreamer element or property that can somehow ignore the audio
>> stream if it is not present?
>>
>> 2. Check whether the MP4 file has an audio stream before creating the
>> GStreamer pipeline in C++.
>>
>> 3. Create the basic pipeline in C++, interrogate the pads somehow, and add
>> the "demux.audio_0 ! queue ! decodebin ! audioconvert ! pulsesink"
>> sub-pipeline only if an audio stream is present.
>>
>> Any ideas on the simplest/most CPU-efficient way of doing this with
>> GStreamer?
>>
>>
>>
>
-------------- next part --------------
	URIPipeline::URIPipeline(guintptr renderWindowHandle)
	{
		m_videoSinkElement = gst_element_factory_make("d3d11videosink", "livevsink");

		m_sourceElement = gst_element_factory_make("uridecodebin", "livesource");

		// NOTE - only allocate these if we are going to use them
		/////////////////////////////////////////////////////////
		m_audioConvertElement = nullptr;
		m_audioResampleElement = nullptr;
		m_audioSinkElement = nullptr;


		m_videoConvertElement = gst_element_factory_make("d3d11convert", "videoconvert");
		m_overlayElement = gst_element_factory_make("synoverlay", "ovl");

		m_pipeline = gst_pipeline_new("live-pipeline");

		gst_bin_add_many(GST_BIN(m_pipeline), m_sourceElement, m_videoConvertElement,  m_overlayElement, m_videoSinkElement, NULL);

		if (!gst_element_link_many(m_videoConvertElement, m_overlayElement, m_videoSinkElement, NULL))
		{
			g_printerr("Failed to link the video branch\n");
		}
		gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(m_videoSinkElement), renderWindowHandle);

		g_signal_connect(m_sourceElement, "pad-added", G_CALLBACK(&URIPipeline::onPadAdded), this);
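		// "pad-added" fires once for each decoded stream uridecodebin finds, so
		// onPadAdded() only builds the audio branch when an audio pad actually appears.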

		g_object_set(m_sourceElement, "uri", "file:///mp4.mp4", NULL);
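		// NOTE: "file:///mp4.mp4" is a placeholder URI; the pipeline state change
		// to PLAYING happens elsewhere in the app (not shown in this snippet).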

	}


	void URIPipeline::onPadAdded(GstElement* src, GstPad* new_pad, URIPipeline* obj)
	{
		GstPad* asink_pad = nullptr;
		GstPad* vsink_pad = gst_element_get_static_pad(obj->m_videoConvertElement, "sink");
		GstPadLinkReturn ret;
		GstCaps* new_pad_caps = NULL;
		GstStructure* new_pad_struct = NULL;
		const gchar* new_pad_type = NULL;

		/* Check the new pad's type */
		new_pad_caps = gst_pad_get_current_caps(new_pad);
		new_pad_struct = gst_caps_get_structure(new_pad_caps, 0);
		new_pad_type = gst_structure_get_name(new_pad_struct);

		if (g_str_has_prefix(new_pad_type, "audio/x-raw"))
		{
			// If our converter is already created and linked, we have nothing to do here
			if (obj->m_audioConvertElement != nullptr)
			{
				g_print("We are already linked. Ignoring.\n");
				goto exit;
			}

			obj->m_audioConvertElement = gst_element_factory_make("audioconvert", "audioconvert");
			obj->m_audioResampleElement = gst_element_factory_make("audioresample", "audioresample");
			obj->m_audioSinkElement = gst_element_factory_make("autoaudiosink", "audiosink");
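			// autoaudiosink picks an appropriate platform audio sink at runtime
			// (e.g. pulsesink on a typical Linux desktop)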

			// put them in the pipeline
			gst_bin_add_many(GST_BIN(obj->m_pipeline), obj->m_audioConvertElement, obj->m_audioResampleElement, obj->m_audioSinkElement, NULL);

			// link them
			if (!gst_element_link_many(obj->m_audioConvertElement, obj->m_audioResampleElement, obj->m_audioSinkElement, NULL))
			{
				g_printerr("Failed to link the audio branch\n");
			}

			// Sync state: elements added to a running pipeline start in the NULL
			// state; without this they never process data and playback stalls.
			gst_element_sync_state_with_parent(obj->m_audioConvertElement);
			gst_element_sync_state_with_parent(obj->m_audioResampleElement);
			gst_element_sync_state_with_parent(obj->m_audioSinkElement);

			// link the new pad
			asink_pad = gst_element_get_static_pad(obj->m_audioConvertElement, "sink");

			///* Attempt the link */
			ret = gst_pad_link(new_pad, asink_pad);
			if (GST_PAD_LINK_FAILED(ret)) {
				g_print("Type is '%s' but link failed.\n", new_pad_type);
			}
			else {
				g_print("Link succeeded (type '%s').\n", new_pad_type);
			}

		}
		else if (g_str_has_prefix(new_pad_type, "video/x-raw"))
		{

			/* If our converter is already linked, we have nothing to do here */
			if (gst_pad_is_linked(vsink_pad)) {
				g_print("We are already linked. Ignoring.\n");
				goto exit;
			}

			/* Attempt the link */
			ret = gst_pad_link(new_pad, vsink_pad);
			if (GST_PAD_LINK_FAILED(ret)) {
				g_print("Type is '%s' but link failed.\n", new_pad_type);
			}
			else {
				g_print("Link succeeded (type '%s').\n", new_pad_type);
			}

		}
		else
		{
			g_print("It has type '%s' which is not raw audio. Ignoring.\n",
				new_pad_type);
			goto exit;

		}

	exit:
		/* Unreference the new pad's caps, if we got them */
		if (new_pad_caps != NULL)
			gst_caps_unref(new_pad_caps);

		/* Unreference the sink pad */
		if (vsink_pad)
		{
			gst_object_unref(vsink_pad);
		}
		if (asink_pad)
		{
			gst_object_unref(asink_pad);
		}

	}
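
For completeness, a minimal sketch of how the class might be driven.  This is
not part of the attachment: the play() method is hypothetical (it would simply
call gst_element_set_state(m_pipeline, GST_STATE_PLAYING)), and the window
handle would come from whatever UI toolkit hosts the d3d11videosink.

	int main(int argc, char* argv[])
	{
		gst_init(&argc, &argv);

		guintptr windowHandle = 0;	// supplied by the host UI in a real app
		URIPipeline pipeline(windowHandle);
		pipeline.play();		// hypothetical helper, see note above

		GMainLoop* loop = g_main_loop_new(NULL, FALSE);
		g_main_loop_run(loop);		// run until interrupted
		g_main_loop_unref(loop);
		return 0;
	}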

