# Using GStreamer for Live Streaming
### NVIDIA JPEG encoder/decoder
$ gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw,framerate=6/1 ! nvvidconv ! nvjpegenc ! rtpjpegpay ! udpsink host=192.168.68.38 port=5000 sync=false async=false
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG,payload=96 ! rtpjpegdepay ! decodebin ! videoconvert ! pngenc ! multifilesink location=/home/dieptran/Pictures/Test/file-%03d.png
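If you only want to preview the JPEG stream instead of saving PNG frames, a minimal display-only receiver should work (a sketch; rtpjpegpay defaults to payload type 26, so adjust the payload value to match what the sender actually emits):
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=26 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink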
# -----------------------------
$ gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw, framerate=6/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.68.17 port=5200
# NVENC (OMX H.264 encoder)
$ gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw,framerate=6/1 ! nvvidconv ! omxh264enc control-rate=2 bitrate=4000000 ! video/x-h264, stream-format=byte-stream ! rtph264pay mtu=1400 ! udpsink host=192.168.68.38 port=5000 sync=false async=false
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! pngenc ! multifilesink location=/home/dieptran/Pictures/Test/file-%03d.png
# NVENC (V4L2 H.265 encoder)
$ gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw,framerate=6/1 ! nvvidconv ! nvv4l2h265enc bitrate=8000000 insert-sps-pps=true ! rtph265pay mtu=1400 ! udpsink host=192.168.68.66 port=5000 sync=false async=false
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H265,payload=96 ! rtph265depay ! h265parse ! avdec_h265 ! videoconvert ! pngenc ! multifilesink location=/home/dieptran/Pictures/Test/file-%03d.png
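To preview the H.265 stream live instead of writing PNG files, a display-only receiver sketch (same port and payload assumed):
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, encoding-name=H265, payload=96 ! rtph265depay ! h265parse ! avdec_h265 ! videoconvert ! autovideosink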
### H.264
$ gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=15/1 ! omxh264enc control-rate=2 bitrate=4000000 ! video/x-h264, stream-format=byte-stream ! rtph264pay mtu=1400 ! udpsink host=192.168.68.38 port=5000 sync=false async=false
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=RGB ! jpegenc ! multifilesink location=/home/nguyenanhquang/Pictures/images/file-%03d.jpg
$ gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw,width=640,height=480,framerate=15/1 ! nvvidconv ! nvv4l2h264enc bitrate=8000000 insert-sps-pps=true ! rtph264pay mtu=1400 ! udpsink host=192.168.1.3 port=5000 sync=false async=false
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! videoconvert ! video/x-raw,format=RGB ! jpegenc ! multifilesink location=/home/nguyenanhquang/Pictures/images/file-%03d.jpg
### H.265
$ gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480, framerate=15/1 ! omxh265enc control-rate=2 bitrate=4000000 ! video/x-h265, stream-format=byte-stream ! rtph265pay mtu=1400 ! udpsink host=192.168.68.38 port=5000 sync=false async=false
$ gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw,width=640,height=480,framerate=15/1 ! nvvidconv ! nvv4l2h265enc bitrate=8000000 insert-sps-pps=true ! rtph265pay mtu=1400 ! udpsink host=192.168.1.3 port=5000 sync=false async=false
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H265,payload=96 ! rtph265depay ! h265parse ! avdec_h265 ! videoconvert ! video/x-raw,format=RGB ! jpegenc ! multifilesink location=/home/nguyenanhquang/Pictures/images/file-%03d.jpg
### VP8
$ gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480, framerate=15/1 ! omxvp8enc ! rtpvp8pay mtu=1400 ! udpsink host=192.168.68.38 port=5000 sync=false async=false
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=VP8,payload=96 ! rtpvp8depay ! avdec_vp8 ! videoconvert ! video/x-raw,format=RGB ! jpegenc ! multifilesink location=/home/nguyenanhquang/Pictures/images/file-%03d.jpg
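A display-only VP8 receiver, as a sketch (same port and default payload type 96 assumed):
$ gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, encoding-name=VP8, payload=96 ! rtpvp8depay ! avdec_vp8 ! videoconvert ! autovideosink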
| Command | Description |
|---|---|
| gst-launch-1.0 | build and run GStreamer pipelines from the command line |
| gst-inspect-1.0 | find out what GStreamer elements you have available and their capabilities |
| gst-discoverer-1.0 | discover the internal structure of media files |
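For example, to check whether a particular element is installed and which properties it exposes, or to inspect a media file (the element and file below are just examples from this page):
$ gst-inspect-1.0 nvjpegenc
$ gst-discoverer-1.0 ./Elecard_about_Tomsk_part2_HEVC_UHD.mp4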
You can see cameras attached to your Jetson Nano with the command:
Note: in these cases autovideosink behaves roughly the same as xvimagesink.
$ gst-device-monitor-1.0 # Camera Logitech C310
$ gst-device-monitor-1.0 # Depth Camera D435
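If the v4l-utils package is installed, the attached cameras and their /dev/video* nodes can also be listed with:
$ v4l2-ctl --list-devices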
Show video - you can download the sample clip from the link.
$ gst-launch-1.0 filesrc location=./Elecard_about_Tomsk_part2_HEVC_UHD.mp4 ! qtdemux ! queue ! h265parse ! avdec_h265 ! videoconvert ! videoscale ! video/x-raw, width=1980, height=1280 ! xvimagesink
$ gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! videoscale ! video/x-raw, width=1280, height=720 ! timeoverlay halignment=left valignment=bottom text="Time:" shaded-background=true font-desc="Sans, 12" ! clockoverlay halignment=right valignment=bottom shaded-background=true time-format="CLOCK: %D %H:%M:%S" font-desc="Sans, 12" ! autovideosink
$ gst-launch-1.0 v4l2src num-buffers=1 ! jpegenc ! filesink location=capture1.jpeg
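A PNG variant of the same single-frame capture (pngenc needs raw RGB-like input, hence the videoconvert):
$ gst-launch-1.0 v4l2src num-buffers=1 ! videoconvert ! pngenc ! filesink location=capture1.png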
| Element | Description |
|---|---|
| videoconvert | does raw video format conversion, making sure other elements can understand each other |
| decodebin | automatically constructs a decoding pipeline using available decoders and demuxers via auto-plugging until raw media is obtained; it is used internally by uridecodebin, which is often more convenient because it also creates a suitable source element, and it replaces the old decodebin element; it acts like a demuxer, so it offers as many source pads as streams are found in the media |
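As a quick illustration of the auto-plugging described above, a uridecodebin sketch that plays an arbitrary file (the path is only a placeholder):
$ gst-launch-1.0 uridecodebin uri=file:///home/user/video.mp4 ! videoconvert ! autovideosink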
* The frames captured from the webcam arrive as video/x-raw or image/jpeg.
* To save PNG images you therefore need extra elements (decodebin and videoconvert) in front of pngenc, as in the pipelines below.
* JPEG format
gst-launch-1.0 -v v4l2src device=/dev/video0 ! videorate ! video/x-raw,framerate=1/10 ! jpegenc ! multifilesink location=file-%02d.jpg
* PNG format
gst-launch-1.0 -v v4l2src device=/dev/video0 ! videorate ! video/x-raw,framerate=1/10 ! decodebin ! videoconvert ! pngenc ! multifilesink location=file-%02d.png
# get the IP address of receiver side
$ hostname -I
# start the sender (Jetson Nano)
$ gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.68.38 port=5200
# start the receiver (Laptop/Desktop)
$ gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink
# get the IP address of sender side
$ hostname -I
# start the sender (Jetson Nano)
$ gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! tcpserversink host=192.168.68.66 port=5000
# start the receiver (Laptop/Desktop)
$ gst-launch-1.0 tcpclientsrc host=192.168.68.66 port=5000 ! jpegdec ! videoconvert ! autovideosink
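If the receiver has trouble finding the JPEG frame boundaries in the raw TCP byte stream, one common workaround (a sketch, not part of the original setup) is to wrap the frames in a multipart container:
$ gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! multipartmux ! tcpserversink host=192.168.68.66 port=5000
$ gst-launch-1.0 tcpclientsrc host=192.168.68.66 port=5000 ! multipartdemux ! jpegdec ! videoconvert ! autovideosink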
# get the IP address of sender side
$ hostname -I
$ cd ~/gst-rtsp-server-1.14.5/examples
# start the sender (Jetson Nano)
$ ./test-launch "v4l2src device=/dev/video0 ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse config-interval=1 ! rtph264pay name=pay0 pt=96"
# start the receiver (Laptop/Desktop)
$ gst-launch-1.0 rtspsrc location=rtsp://192.168.68.66:8554/test/ latency=10 ! decodebin ! autovideosink
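If decodebin picks an unexpected decoder on the client, an explicit H.264 receive pipeline can be used instead (same mount point and address assumed):
$ gst-launch-1.0 rtspsrc location=rtsp://192.168.68.66:8554/test latency=10 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink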
# start the sender (Jetson Nano)
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, framerate=5/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.68.17 port=5200
# start the receiver (Laptop/Desktop)
gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, payload=96, encoding-name=JPEG ! rtpjpegdepay ! videorate ! image/jpeg, framerate=1/5 ! decodebin ! videoconvert ! pngenc ! multifilesink location=/media/data/team_ROS/tien/files/automatic_reconstruction/images/file-%03d.png
Pipeline graph on the sender side (figure)
Pipeline graph on the receiver side (figure)
# start the sender (Jetson Nano)
gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw, framerate=6/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.68.17 port=5200
# start the receiver (Laptop/Desktop)
gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, payload=96, encoding-name=JPEG ! rtpjpegdepay ! videorate ! image/jpeg, framerate=1/3 ! decodebin ! videoconvert ! pngenc ! multifilesink location=/media/data/team_ROS/tien/files/automatic_reconstruction/images/file-%03d.png
| Element | Description |
|---|---|
| v4l2src | can be used to capture video from v4l2 devices, like webcams and TV cards |
| videorate | takes an incoming stream of timestamped video frames and produces a perfect stream that matches the source pad's framerate; the correction is performed by dropping and duplicating frames, with no interpolation; by default it simply negotiates the same framerate on its source and sink pad |
| capsfilter | does not modify data as such, but can enforce limitations on the data format |
| videoconvert | converts video frames between a great variety of video formats |
| jpegenc | encodes JPEG images |
| rtpjpegpay | payload-encodes JPEG pictures into RTP packets according to RFC 2435 (see http://www.rfc-editor.org/rfc/rfc2435.txt); it takes a JPEG picture, scans the header for quantization tables (if needed) and constructs the RTP packet header followed by the actual JPEG entropy scan; it assumes that the correct width and height are found in the caps |
| udpsink | a network sink that sends UDP packets to the network; it can be combined with RTP payloaders to implement RTP streaming |
| udpsrc | a network source that reads UDP packets from the network; it can be combined with RTP depayloaders to implement RTP streaming; setting the port property to 0 enables automatic port allocation, and the allocated port can be read back after the element is set to PAUSED; it can read from multicast groups via the address property, or use a caller-supplied socket via the socket property; the caps property gives a type to the UDP packets so they can be autoplugged in GStreamer pipelines, which is useful for RTP where the content type is signalled out-of-band (e.g. via SDP); the buffer-size property changes the default kernel receive buffer size (the system places an absolute limit on this), and skip-first-bytes strips an arbitrary number of bytes, e.g. a proprietary header, from the start of each raw UDP packet |
| rtpjpegdepay | extracts JPEG video from RTP packets (RFC 2435) |
| multifilesink | writes incoming data to a series of sequentially-named files; usually used where each buffer is an independent unit of data (e.g. raw video buffers or encoded JPEG or PNG images) or with streamable container formats such as MPEG-TS or MPEG-PS; it cannot create independently playable mp4 files (use splitmuxsink for that); the location (filename) property should contain a %d placeholder that is substituted with the index of each file; if post-messages is TRUE, an application message named GstMultiFileSink is posted after writing each buffer |
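A sketch that exercises the udpsrc properties mentioned above (multicast address, kernel buffer size and caps); the multicast address is only an example value:
$ gst-launch-1.0 udpsrc address=224.1.1.1 port=5200 buffer-size=200000 caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=26" ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink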
# Depth camera D435
# JPG image
# start the sender (Jetson Nano)
gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw, framerate=6/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.68.17 port=5200
# start the receiver (Laptop/Desktop)
gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, payload=26, encoding-name=JPEG ! rtpjpegdepay ! videorate ! image/jpeg, framerate=1/5 ! multifilesink location=/media/data/team_ROS/tien/files/automatic_reconstruction/images/file-%03d.jpg
# PNG image
# start the sender (Jetson Nano)
gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw, framerate=6/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.68.17 port=5200
# start the receiver (Laptop/Desktop)
gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, payload=26, encoding-name=JPEG ! rtpjpegdepay ! videorate ! image/jpeg, framerate=1/5 ! decodebin ! videoconvert ! pngenc ! multifilesink location=/media/data/team_ROS/tien/files/automatic_reconstruction/images/file-%03d.png