[gst-devel] RTP+MJPEG - Sniffing packets with wireshark, How clients join the rtp server

Marco Ballesio gibrovacco at gmail.com
Wed Dec 22 13:27:18 CET 2010


Hi,

On Wed, Dec 22, 2010 at 1:20 PM, Mauro Brenna <malloblenne at gmail.com> wrote:
> Hi,
>
> thank you for the replies.
>
> I looked for some more information about wireshark and it appears that
> wireshark can detect the RFC 2435 (RTP for MJPEG).
>
> http://www.wireshark.org/docs/dfref/j/jpeg.html

"detect" is the wrong term, at least in the context of the link you
sent. Wireshark can "display" the content of these packets or, that is
the same, it has a dissector for the protocol.

In my (limited) experience with the tool, the automagic detection
occurs only if you have a signalling phase Wireshark understands.

You want to just "see" the contents and you don't care about
automagic, right-click a packet containing RFC2435 -> "Decode as.." ->
"RTP" in the scroll down list. For more questions about Wireshark,
please refer to the appropriate contacts:

http://www.wireshark.org/lists/
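Before reaching for "Decode As...", it can help to understand what the
dissector looks for. A minimal sketch (my own helper name, not part of
Wireshark or GStreamer) of the usual heuristic for spotting RTP inside
a plain UDP payload:

```python
def looks_like_rtp(payload):
    """Rough heuristic: could this UDP payload be an RTP packet?

    Checks only the fixed 12-byte RTP header: the version field
    (top two bits of the first byte) must be 2, and the payload
    type should not fall in the RTCP range (72-76), which would
    indicate an RTCP packet sharing the port.
    """
    if len(payload) < 12:            # fixed RTP header is 12 bytes
        return False
    version = payload[0] >> 6        # RTP version, must be 2
    payload_type = payload[1] & 0x7F
    if 72 <= payload_type <= 76:     # RTCP SR/RR/SDES/BYE/APP range
        return False
    return version == 2
```

Without a signalling phase this is all a tool can go on, which is why
plain RTP-over-UDP shows up as generic UDP until you force the decode.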

>
> So, I expected, like it is for VLC using RFC 2250, that every UDP
> packet containing an RTP packet as payload will be detected. During my
> tests, I see only standard UDP packets.

Please, check if you have a signalling phase in the VLC capture. It's
very easy, you should see some "RTSP" packets going back and forth
before the actual stream begins.

>
> I read something more about RTP and JPEG, I would like to know if I
> can use the RFC 2435 for images and another RFC for audio

git clone git://anongit.freedesktop.org/gstreamer/gst-plugins-good
grep -r RFC gst-plugins-good/gst/rtp | grep audio

gives me:

gst-plugins-good/gst/rtp/gstrtpL16pay.c:      "Payload-encode Raw audio into RTP packets (RFC 3551)",
gst-plugins-good/gst/rtp/gstrtpac3depay.c:      "Extracts AC3 audio from RTP packets (RFC 4184)",
gst-plugins-good/gst/rtp/gstrtpamrdepay.c:      "Extracts AMR or AMR-WB audio from RTP packets (RFC 3267)",
gst-plugins-good/gst/rtp/gstrtpamrpay.c:      "Payload-encode AMR or AMR-WB audio into RTP packets (RFC 3267)",
gst-plugins-good/gst/rtp/gstrtpbvdepay.c:      "Extracts BroadcomVoice audio from RTP packets (RFC 4298)",
gst-plugins-good/gst/rtp/gstrtpbvpay.c:      "Packetize BroadcomVoice audio streams into RTP packets (RFC 4298)",
gst-plugins-good/gst/rtp/gstrtpg722pay.c:      "Payload-encode Raw audio into RTP packets (RFC 3551)",
gst-plugins-good/gst/rtp/gstrtpg723depay.c:      "Extracts G.723 audio from RTP packets (RFC 3551)",
gst-plugins-good/gst/rtp/gstrtpg723pay.c:    GST_STATIC_CAPS ("audio/G723, "     /* according to RFC 3551 */
gst-plugins-good/gst/rtp/gstrtpg729depay.c:      "Extracts G.729 audio from RTP packets (RFC 3551)",
gst-plugins-good/gst/rtp/gstrtpg729pay.c:    GST_STATIC_CAPS ("audio/G729, "     /* according to RFC 3555 */
gst-plugins-good/gst/rtp/gstrtpilbcdepay.c:      "Extracts iLBC audio from RTP packets (RFC 3952)",
gst-plugins-good/gst/rtp/gstrtpmp4adepay.c:      "Extracts MPEG4 audio from RTP packets (RFC 3016)",
gst-plugins-good/gst/rtp/gstrtpmp4apay.c:      "Payload MPEG4 audio as RTP packets (RFC 3016)",
gst-plugins-good/gst/rtp/gstrtpmpadepay.c:      "Extracts MPEG audio from RTP packets (RFC 2038)",
gst-plugins-good/gst/rtp/gstrtpmpapay.c:      "Payload MPEG audio as RTP packets (RFC 2038)",
gst-plugins-good/gst/rtp/gstrtpmparobustdepay.c:      "Extracts MPEG audio from RTP packets (RFC 5219)",
gst-plugins-good/gst/rtp/gstrtpqcelpdepay.c:      "Extracts QCELP (PureVoice) audio from RTP packets (RFC 2658)",
gst-plugins-good/gst/rtp/gstrtpqdmdepay.c:      "Extracts QDM2 audio from RTP packets (no RFC)",
gst-plugins-good/gst/rtp/gstrtpvorbispay.c:      "Payload-encode Vorbis audio into RTP packets (RFC 5215)",

(Disclaimer: that is not a comprehensive list)

> and
> synchronize audio and video using RTCP.

RTP is actually designed to support synchronisation. Unless you have
some peculiar requirements, all you need is a proper GStreamer
pipeline with gstrtpbin.
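The mechanism behind this is worth a sketch. Each RTCP Sender Report
pairs a wall-clock (NTP) timestamp with the media's RTP timestamp, so
a receiver can map any RTP timestamp in any stream onto a common
clock. A minimal illustration of that mapping (the function name is
mine; gstrtpbin does this internally):

```python
def rtp_to_wallclock(rtp_ts, sr_ntp_seconds, sr_rtp_ts, clock_rate):
    """Map an RTP timestamp to wall-clock time using the last RTCP SR.

    sr_ntp_seconds / sr_rtp_ts come from the Sender Report; clock_rate
    is the payload's RTP clock (e.g. 90000 Hz for video). The 32-bit
    mask handles RTP timestamp wrap-around for forward timestamps.
    """
    delta = (rtp_ts - sr_rtp_ts) & 0xFFFFFFFF
    return sr_ntp_seconds + delta / clock_rate
```

Audio and video streams each get their own SRs, but both map onto the
same NTP timeline, which is what makes lip-sync possible.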

> In this case, I would like to
> know if I can do the same for N video sources in RFC 2435.

In theory, yes. In practice, I have never tried to synchronise
multiple video streams over RTP (and you may run into issues such as
synchronising the clocks of the various source devices).

> In the
> negative case I believe I will be constraint to use RFC 2250 for
> streaming both video and audio.

RFC 2250 is titled "RTP Payload Format for MPEG1/MPEG2 Video".
Setting aside the fact that it is only for video and quite obsolete,
are you limited to this technology because of customer requirements?
You could achieve better quality (and lower bandwidth) using, for
instance, h264 and RFC 3984.

> Moreover, can I stream JPEG frames directly using RFC 2250? IMHO,
> since it isn't written in the RFC I believe it isn't possible.

In the same folder as before,

grep -r RFC gst-plugins-good/gst/rtp | grep JPEG

gives some answers ;).
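For reference, the JPEG-over-RTP payload defined by RFC 2435 prefixes
each packet with an 8-byte main header. A minimal parsing sketch of
that header, with the field layout taken from RFC 2435 section 3.1
(the function name is my own):

```python
def parse_jpeg_payload_header(data):
    """Parse the 8-byte main JPEG payload header from RFC 2435.

    Layout: type-specific (8 bits), fragment offset (24 bits),
    type (8), Q (8), width/8 (8), height/8 (8).
    """
    frag_offset = int.from_bytes(data[1:4], "big")
    jpeg_type, q, w, h = data[4], data[5], data[6], data[7]
    return {
        "fragment_offset": frag_offset,  # byte offset of this fragment
        "type": jpeg_type,               # e.g. 0/1 = baseline JPEG
        "q": q,                          # quantization table selector
        "width": w * 8,                  # dimensions are in 8-pixel units
        "height": h * 8,
    }
```

Note the width/height fields are in units of 8 pixels, so frame
dimensions above 2040 pixels cannot be signalled this way.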

>
> I know that these questions are a bit OT because they are more general
> than gstreamer use, but I am trying to understand the real
> potentiality of the protocol for developing my realtime application.

Surely it depends on how "standard" your requirements are. I've seen
nothing too complex in your descriptions so far.

Regards

>
>
> Thanks,
>
> Mauro
>
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
>



