[gst-devel] audio problem with gstreamer and live555

Alessio alessio at cedeo.net
Mon Jul 26 18:00:35 CEST 2010


A little update:

I've modified the plugins to clean everything up properly when the 
stream ends, using g_free on my structures, gst_*_unref on GStreamer 
elements, and so on.
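
Roughly, the cleanup looks like this (just a sketch: the GstLiveDemuxer 
struct and field names below are placeholders for my own code, which is 
a proper GObject subclass):

#include <gst/gst.h>

/* Placeholder instance struct -- in the real plugin this is a registered
 * GObject subclass of GstElement; the field names are illustrative. */
typedef struct {
  GstElement element;
  GstCaps *caps;
  GstPad *srcpad;
  gpointer priv_data;
} GstLiveDemuxer;

/* Called when the stream ends, to release everything the element holds. */
static void
gst_livedemuxer_cleanup (GstLiveDemuxer * demux)
{
  if (demux->caps) {
    gst_caps_unref (demux->caps);
    demux->caps = NULL;
  }

  if (demux->srcpad) {
    /* pads added with gst_element_add_pad() are owned by the element,
     * so remove them from the element instead of unreffing directly */
    gst_element_remove_pad (GST_ELEMENT_CAST (demux), demux->srcpad);
    demux->srcpad = NULL;
  }

  g_free (demux->priv_data);
  demux->priv_data = NULL;
}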

Now, on Linux, the first time I play a stream the video is displayed 
but no audio is played; on subsequent plays everything works, both 
audio and video.

It looks like the player is not able to retrieve some information from 
the stream on the first run.
The second time the stream is played, the player seems to take the audio 
information not from the stream but from "somewhere" in memory, and as 
a result the audio can be heard as well.
We can't figure out where the problem is.
On Windows, on subsequent plays of a stream there is only a small chunk 
of audio at the very beginning, lasting less than a second.
This may be because Windows has a different policy for cleaning up memory.

On Linux, where the memory is not wiped as it is on Windows, the second 
time a stream is played everything works.

Apparently there are TWO problems:

- something is initialized too late when a stream is played;

- the memory is not cleaned up properly, so when a stream is restarted, 
some stale data is still in memory and is used to play the audio.

I'm not sure which kinds of GStreamer objects need to be unreffed after 
use (GstBuffer, GstPad, GstCaps, etc.) and when. Where can I find some 
information about this kind of memory cleanup?

Thanks in advance.

Alessio.


On 23/07/2010 15:15, Alessio wrote:
> Hi all
>
> sorry for my bad English.
>
>     Since the GStreamer rtspsrc module on Windows doesn't support 
> RTSP TCP interleaved mode, I implemented an RTSP plugin based on live555.
>
> I based my work on how the live555-based modules of both VLC and 
> MPlayer are written.
>
> I have a src plugin that manages the network connection and reads the 
> frames one by one, and a demuxer plugin that receives those frames and 
> creates a src pad for each subsession, following the standard behaviour 
> of a GStreamer demuxer.
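
For reference, the per-subsession pads are created following the usual 
"sometimes pads" pattern from the GStreamer Plugin Writer's Guide. A 
rough sketch (the template and function names here are placeholders, 
not my actual code):

static GstStaticPadTemplate audio_src_template =
GST_STATIC_PAD_TEMPLATE ("audio_%02d",
    GST_PAD_SRC,
    GST_PAD_SOMETIMES,
    GST_STATIC_CAPS_ANY);

static GstPad *
add_subsession_pad (GstElement * demux, GstCaps * caps, guint index)
{
  gchar *name = g_strdup_printf ("audio_%02d", index);
  GstPad *pad = gst_pad_new_from_static_template (&audio_src_template, name);

  g_free (name);
  gst_pad_set_caps (pad, caps);      /* 0.10 style: caps set on the pad */
  gst_pad_set_active (pad, TRUE);
  gst_element_add_pad (demux, pad);  /* the element now owns the pad */

  return pad;
}

Typically gst_element_no_more_pads() is then called once every 
subsession pad has been added, so downstream knows the set of streams 
is complete.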
>
> In the application I set up a basic pipeline like this:
>
> gst-launch livertsp uri=rtsp://127.0.0.1:8554/bighic320x240.avi ! 
> livedemuxer name=demux  demux.audio_00 ! queue ! decodebin  ! 
> audioconvert ! audioresample ! autoaudiosink   demux.video_00 ! queue 
> ! decodebin ! ffmpegcolorspace ! videoscale ! autovideosink
>
> where livertsp and livedemuxer are the two plugins I wrote.
>
> But when I try to play a stream I see the video but I don't hear the 
> audio.
>
> The behaviour is the same both in a C application and through 
> gst-launch, on both Windows and Linux.
>
> Sometimes (randomly) I hear a short, distorted fragment of audio, 
> lasting less than a second.
>
> Using a probe plugin I wrote, I checked what is received by the audio 
> sink, and it receives all the frames correctly: it is exactly the 
> same data that is received when I play the same file locally, with the 
> same pipeline using filesrc and avidemux in place of my plugins, and 
> both the data and the timestamps and durations of the frames are correct.
>
> I also tried to save the audio output using filesink in place of the 
> audio sink, and the result is the same for the RTSP and the file case, 
> so the audio is received by the sink but isn't played.
>
> The attached file contains the output of the pipeline launched with 
> gst-launch, where everything seems all right to me.
>
> Can anybody help me?
>
> Thanks in advance, Alessio
>
>    



