IP Camera live stream to HLS

pzymen simon.nilsson at videocent.com
Mon Jan 11 12:56:41 PST 2016


Hello, I'm working on a project where I need to connect to an IP Camera via a
P2P API to receive video and audio data and then output this to a HTTP Live
Stream.

The data received from the camera has the following format:

Video: H.264, 1280x720, variable frame rate (~10 fps)
Audio: PCM, S16LE (16-bit signed little-endian), 8 kHz, mono

Timestamps in milliseconds are sent along with the data from the camera.
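
In GStreamer terms I translate this into caps for the two appsrc elements
roughly as follows (a sketch; the stream-format and alignment fields in
particular are assumptions about what the camera actually delivers):

/* Caps describing the camera data; framerate=0/1 signals variable frame rate. */
GstCaps *vcaps = gst_caps_from_string(
    "video/x-h264, stream-format=byte-stream, alignment=au, "
    "width=1280, height=720, framerate=0/1");
GstCaps *acaps = gst_caps_from_string(
    "audio/x-raw, format=S16LE, rate=8000, channels=1, layout=interleaved");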

I have built the following pipeline in an attempt to accomplish this:

[video data] > appsrc > h264parse --------------------- mpegtsmux > hlssink
[audio data] > appsrc > audioconvert > avenc_aac ---/
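
In code the pipeline is set up roughly like this (a simplified sketch; the
names vsrc/asrc/mux/sink are just placeholders, error handling is omitted,
and the snippets assume gst/gst.h and gst/app/gstappsrc.h are included and
gst_init() has been called):

/* Build the two-branch pipeline from a launch string. */
GError *err = NULL;
GstElement *pipeline = gst_parse_launch(
    "appsrc name=vsrc ! h264parse ! mpegtsmux name=mux ! hlssink name=sink "
    "appsrc name=asrc ! audioconvert ! avenc_aac ! mux.",
    &err);

GstElement *vsrc = gst_bin_get_by_name(GST_BIN(pipeline), "vsrc");
GstElement *asrc = gst_bin_get_by_name(GST_BIN(pipeline), "asrc");

/* The sources push live, timestamped buffers, so run them in TIME format
 * with the caps from above. */
g_object_set(vsrc, "caps", vcaps, "is-live", TRUE,
    "format", GST_FORMAT_TIME, NULL);
g_object_set(asrc, "caps", acaps, "is-live", TRUE,
    "format", GST_FORMAT_TIME, NULL);
gst_caps_unref(vcaps);
gst_caps_unref(acaps);

gst_element_set_state(pipeline, GST_STATE_PLAYING);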

I have run into a couple of problems.

1. Using avdec_h264 and fpsdisplaysink I was able to output the incoming
video stream, but it dropped over 90% of the frames, resulting in an fps
below 1. I suspect this has something to do with the timestamps; h264parse
detects the stream as having a variable frame rate (0/1). I have tried
setting the timestamp when pushing data to the appsrc buffer using:

GST_BUFFER_PTS(buffer) = gst_util_uint64_scale(timestamp, GST_MSECOND, 1);

However, this does not help.
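
For reference, the full push for a video frame looks roughly like this (a
sketch; data, size and timestamp_ms are placeholders for what the P2P API
callback hands me):

/* Copy the frame into a new buffer and timestamp it. */
GstBuffer *buffer = gst_buffer_new_allocate(NULL, size, NULL);
gst_buffer_fill(buffer, 0, data, size);

/* Camera timestamps are in milliseconds, GStreamer expects nanoseconds. */
GST_BUFFER_PTS(buffer) = gst_util_uint64_scale(timestamp_ms, GST_MSECOND, 1);

/* appsrc takes ownership of the buffer. */
GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(vsrc), buffer);

As far as I understand, the appsrc also has to run with format=time (as in
the sketch above) for these timestamps to be interpreted as running time.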


2. When using the pipeline above, the application crashes with a memory
access violation when trying to push data to the audio appsrc using
gst_app_src_push_buffer(). The code is the same as that used for the video
appsrc, which works fine.
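
The audio push follows the same sketch as the video one (again, data, size
and timestamp_ms stand in for what the camera callback provides):

/* Samples are copied, so the buffer never references memory owned
 * (and possibly freed) by the camera API. */
GstBuffer *abuf = gst_buffer_new_allocate(NULL, size, NULL);
gst_buffer_fill(abuf, 0, data, size);
GST_BUFFER_PTS(abuf) = gst_util_uint64_scale(timestamp_ms, GST_MSECOND, 1);

GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(asrc), abuf);
if (ret != GST_FLOW_OK)
    g_warning("audio push failed: %s", gst_flow_get_name(ret));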


3. Using the pipeline above, no playlist or segment files are created.
hlssink is used with default settings.
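
For completeness, this is roughly how I would configure hlssink explicitly
instead of relying on the defaults (a sketch; property names as listed by
gst-inspect-1.0 hlssink, and the paths are just examples):

/* Explicit hlssink configuration instead of the defaults. */
GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
g_object_set(sink,
    "location",          "/tmp/hls/segment%05d.ts",
    "playlist-location", "/tmp/hls/playlist.m3u8",
    "target-duration",   5,   /* seconds per segment */
    "max-files",         10,
    NULL);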


I know this is a lot of issues, but I would be very grateful for any help I
can get. If you need more information, just ask; I didn't want to put too
much in here.






