Hi,<br><br><div class="gmail_quote">On Fri, May 20, 2011 at 10:33 PM, Eric Shoquist <span dir="ltr"><<a href="mailto:eshoquist@trellisware.com">eshoquist@trellisware.com</a>></span> wrote:<br><div>..snip..<br> </div>
<blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;"><div link="blue" vlink="purple" lang="EN-US"><div><p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial">Audio however seems to be a different beast. I’ve
found even when piping AAC to a filesink, in order for quicktime to play the
audio, I have to use qtmux or mp4mux after the encoding phase.</span></font></p></div></div></blockquote><div><br>it's understandable as long as it's about playing local files; streaming is a different beast, and usually the muxer's role is played by the payloader (see below).<br>
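For local playback that muxing step makes sense. A minimal sketch of a file-writing variant (untested; TIAudenc1/aacheenc and the caps are taken from your quoted pipeline, out.m4a is a hypothetical output path):

```shell
# Local file: keep the muxer so QuickTime gets a proper MP4 container.
# TIAudenc1 engineName/codecName are the TI encoder settings from the
# quoted pipeline; out.m4a is a hypothetical filename.
gst-launch -v alsasrc device=hw:0,0 num-buffers=1000 ! \
    audio/x-raw-int,endianness=1234 ! queue ! \
    TIAudenc1 engineName=codecServer codecName=aacheenc ! \
    qtmux ! filesink location=out.m4a
```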
<br> </div><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;"><div link="blue" vlink="purple" lang="EN-US"><div><p class="MsoNormal"><font face="Arial" size="2"><span style="font-size: 10pt; font-family: Arial;"> When I
attempt to mimic my video pipeline for audio like the following:</span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial"> </span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial">gst-launch -v alsasrc device=hw:0,0 num-buffers=$((30*60*5))
! \</span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial"> audio/x-raw-int,
endianness=1234 ! \</span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial"> queue ! \</span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial"> TIAudenc1
engineName=codecServer codecName=aacheenc ! \</span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial"> qtmux ! \</span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial"> rtpmp4vpay pt=96
! \</span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial"> udpsink
host=225.1.1.2 port=9010 auto-multicast=true</span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial"> </span></font></p>
<p class="MsoNormal"><font face="Arial" size="2"><span style="font-size:10.0pt;font-family:Arial">The pipeline doesn’t play – I get errors linking
mp4mux to rtpmp4apay (caps don’t match). Using gst-inspect I see that
qtmux has only 1 src - video/quicktime, and when looking at the rtpmp4 plugins
none of them take that as a sink. So I’m unclear how I can stream the
audio with rtp. Is there a different mux I can use that doesn’t put out
video/quicktime, but still works with quicktime player? Or is there some other
plugins to use between qtmux and one of the rtpmp4[a,v,g]pay plugins that will
convert the caps to video/mpeg?</span></font></p></div></div></blockquote><div><br>you should use NO muxers if you're already payloading, that is, try connecting the encoder directly to the payloader. It should just work (r).<br>
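Concretely, that means dropping qtmux and using the *audio* payloader (rtpmp4apay, not rtpmp4vpay, which is for MPEG-4 video). An untested sketch based on your quoted pipeline:

```shell
# Streaming: no muxer -- the AAC encoder feeds the RTP audio payloader
# directly. rtpmp4apay payloads AAC audio (MP4A-LATM, RFC 3016);
# rtpmp4vpay is the MPEG-4 *video* payloader and won't accept audio caps.
gst-launch -v alsasrc device=hw:0,0 num-buffers=$((30*60*5)) ! \
    audio/x-raw-int,endianness=1234 ! queue ! \
    TIAudenc1 engineName=codecServer codecName=aacheenc ! \
    rtpmp4apay pt=96 ! \
    udpsink host=225.1.1.2 port=9010 auto-multicast=true
```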
<br>Besides, you should make sure that in the RTSP/SDP negotiation (I didn't understand how it's done prior to using these pipelines, but I'll assume some magic happens under the hood) the audio format is specified as something like MP4A-LATM (as per RFC 3016). Indeed, RealMedia has its own payloading format, and it might happen that the negotiation assumes that one is to be used.<br>
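If the receiver is driven by an SDP, its media section would have to declare that payload format. A hedged sketch (port/payload type match your pipeline above; the clock rate, channels, and fmtp values are placeholders that depend on your encoder's actual output):

```
m=audio 9010 RTP/AVP 96
a=rtpmap:96 MP4A-LATM/44100/2
a=fmtp:96 profile-level-id=15; object=2; cpresent=0; config=<hex StreamMuxConfig from the encoder>
```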
<br>Regards <br></div></div>