_MediaMTX_ (formerly _rtsp-simple-server_) is a ready-to-use and zero-dependency real-time media server and media proxy that allows publishing, reading, proxying, recording and playing back video and audio streams. It has been conceived as a "media router" that routes media streams from one end to the other.
Live streams can be published to the server with:
|protocol|variants|video codecs|audio codecs|
|--------|--------|------------|------------|
|[SRT clients](#srt-clients)||H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video|Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3|
|[SRT cameras and servers](#srt-cameras-and-servers)||H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video|Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3|
|[WebRTC clients](#webrtc-clients)|Browser-based, WHIP|AV1, VP9, VP8, H265, H264|Opus, G722, G711 (PCMA, PCMU)|
|[WebRTC servers](#webrtc-servers)|WHEP|AV1, VP9, VP8, H265, H264|Opus, G722, G711 (PCMA, PCMU)|
|[RTSP clients](#rtsp-clients)|UDP, TCP, RTSPS|AV1, VP9, VP8, H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video, M-JPEG and any RTP-compatible codec|Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G726, G722, G711 (PCMA, PCMU), LPCM and any RTP-compatible codec|
|[RTSP cameras and servers](#rtsp-cameras-and-servers)|UDP, UDP-Multicast, TCP, RTSPS|AV1, VP9, VP8, H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video, M-JPEG and any RTP-compatible codec|Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G726, G722, G711 (PCMA, PCMU), LPCM and any RTP-compatible codec|
|[RTMP clients](#rtmp-clients)|RTMP, RTMPS, Enhanced RTMP|AV1, VP9, H265, H264|MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), G711 (PCMA, PCMU), LPCM|
|[RTMP cameras and servers](#rtmp-cameras-and-servers)|RTMP, RTMPS, Enhanced RTMP|H264|MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3)|
|[HLS cameras and servers](#hls-cameras-and-servers)|Low-Latency HLS, MP4-based HLS, legacy HLS|AV1, VP9, H265, H264|Opus, MPEG-4 Audio (AAC)|
|[UDP/MPEG-TS](#udpmpeg-ts)|Unicast, broadcast, multicast|H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video|Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3|
|[Raspberry Pi Cameras](#raspberry-pi-cameras)||H264||
And can be read from the server with:
|protocol|variants|video codecs|audio codecs|
|--------|--------|------------|------------|
|[SRT](#srt)||H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video|Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3|
|[WebRTC](#webrtc)|Browser-based, WHEP|AV1, VP9, VP8, H264|Opus, G722, G711 (PCMA, PCMU)|
|[RTSP](#rtsp)|UDP, UDP-Multicast, TCP, RTSPS|AV1, VP9, VP8, H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video, M-JPEG and any RTP-compatible codec|Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G726, G722, G711 (PCMA, PCMU), LPCM and any RTP-compatible codec|
|[RTMP](#rtmp)|RTMP, RTMPS, Enhanced RTMP|H264|MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3)|
|[HLS](#hls)|Low-Latency HLS, MP4-based HLS, legacy HLS|AV1, VP9, H265, H264|Opus, MPEG-4 Audio (AAC)|
And can be recorded and played back with:
|format|video codecs|audio codecs|
|------|------------|------------|
|[fMP4](#record-streams-to-disk)|AV1, VP9, H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video, M-JPEG|Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3, G711 (PCMA, PCMU), LPCM|
|[MPEG-TS](#record-streams-to-disk)|H265, H264, MPEG-4 Video (H263, Xvid), MPEG-1/2 Video|Opus, MPEG-4 Audio (AAC), MPEG-1/2 Audio (MP3), AC-3|
**Features**
* Publish live streams to the server
* Read live streams from the server
* Streams are automatically converted from one protocol to another
* Serve multiple streams at once in separate paths
* Record streams to disk
* Playback recorded streams
* Authenticate users
* Redirect readers to other RTSP servers (load balancing)
* Control the server through the Control API
* Reload the configuration without disconnecting existing clients (hot reloading)
* Read Prometheus-compatible metrics
* Run hooks (external commands) when clients connect, disconnect, read or publish streams
* Compatible with Linux, Windows and macOS; it doesn't require any dependencies or interpreters, it's a single executable
**Note about rtsp-simple-server**
_rtsp-simple-server_ has been rebranded as _MediaMTX_. The reason is pretty obvious: this project started as an RTSP server but has evolved into a much more versatile product that is no longer tied to the RTSP protocol. Nothing will change regarding license, features and backward compatibility.
## Table of contents
* [Installation](#installation)
  * [Standalone binary](#standalone-binary)
  * [Docker image](#docker-image)
  * [Arch Linux package](#arch-linux-package)
  * [OpenWrt binary](#openwrt-binary)
* [Basic usage](#basic-usage)
* [Publish to the server](#publish-to-the-server)
  * [By software](#by-software)
    * [FFmpeg](#ffmpeg)
    * [GStreamer](#gstreamer)
    * [OBS Studio](#obs-studio)
    * [OpenCV](#opencv)
    * [Web browsers](#web-browsers)
  * [By device](#by-device)
    * [Generic webcam](#generic-webcam)
    * [Raspberry Pi Cameras](#raspberry-pi-cameras)
  * [By protocol](#by-protocol)
    * [SRT clients](#srt-clients)
    * [SRT cameras and servers](#srt-cameras-and-servers)
    * [WebRTC clients](#webrtc-clients)
    * [WebRTC servers](#webrtc-servers)
    * [RTSP clients](#rtsp-clients)
    * [RTSP cameras and servers](#rtsp-cameras-and-servers)
    * [RTMP clients](#rtmp-clients)
    * [RTMP cameras and servers](#rtmp-cameras-and-servers)
    * [HLS cameras and servers](#hls-cameras-and-servers)
    * [UDP/MPEG-TS](#udpmpeg-ts)
* [Read from the server](#read-from-the-server)
  * [By software](#by-software-1)
    * [FFmpeg](#ffmpeg-1)
    * [GStreamer](#gstreamer-1)
    * [VLC](#vlc)
    * [Web browsers](#web-browsers-1)
  * [By protocol](#by-protocol-1)
    * [SRT](#srt)
    * [WebRTC](#webrtc)
    * [RTSP](#rtsp)
    * [RTMP](#rtmp)
    * [HLS](#hls)
* [Other features](#other-features)
  * [Configuration](#configuration)
  * [Authentication](#authentication)
    * [Internal](#internal)
    * [HTTP-based](#http-based)
    * [JWT-based](#jwt-based)
  * [Encrypt the configuration](#encrypt-the-configuration)
  * [Remuxing, re-encoding, compression](#remuxing-re-encoding-compression)
  * [Record streams to disk](#record-streams-to-disk)
  * [Playback recorded streams](#playback-recorded-streams)
  * [Forward streams to other servers](#forward-streams-to-other-servers)
  * [Proxy requests to other servers](#proxy-requests-to-other-servers)
  * [On-demand publishing](#on-demand-publishing)
  * [Start on boot](#start-on-boot)
    * [Linux](#linux)
    * [OpenWrt](#openwrt)
    * [Windows](#windows)
  * [Hooks](#hooks)
  * [Control API](#control-api)
  * [Metrics](#metrics)
  * [pprof](#pprof)
  * [SRT-specific features](#srt-specific-features)
    * [Standard stream ID syntax](#standard-stream-id-syntax)
  * [WebRTC-specific features](#webrtc-specific-features)
    * [Authenticating with WHIP/WHEP](#authenticating-with-whipwhep)
    * [Solving WebRTC connectivity issues](#solving-webrtc-connectivity-issues)
  * [RTSP-specific features](#rtsp-specific-features)
    * [Transport protocols](#transport-protocols)
    * [Encryption](#encryption)
    * [Corrupted frames](#corrupted-frames)
  * [RTMP-specific features](#rtmp-specific-features)
    * [Encryption](#encryption-1)
* [Compile from source](#compile-from-source)
  * [Standard](#standard)
  * [OpenWrt](#openwrt-1)
  * [Cross compile](#cross-compile)
  * [Compile for all supported platforms](#compile-for-all-supported-platforms)
* [License](#license)
* [Specifications](#specifications)
* [Related projects](#related-projects)
## Installation
There are several installation methods available: standalone binary, Docker image, Arch Linux package and OpenWrt binary.
### Standalone binary
1. Download and extract a standalone binary from the [release page](https://github.com/bluenviron/mediamtx/releases) that corresponds to your operating system and architecture.
2. Start the server:
```sh
./mediamtx
```
### Docker image
Download and launch the image:
```
docker run --rm -it --network=host bluenviron/mediamtx:latest
```
Available images:
|name|FFmpeg included|RPI Camera support|
|----|---------------|------------------|
|bluenviron/mediamtx:latest|:x:|:x:|
|bluenviron/mediamtx:latest-ffmpeg|:heavy_check_mark:|:x:|
|bluenviron/mediamtx:latest-rpi|:x:|:heavy_check_mark:|
|bluenviron/mediamtx:latest-ffmpeg-rpi|:heavy_check_mark:|:heavy_check_mark:|
The `--network=host` flag is mandatory, since Docker can change the source port of UDP packets for routing reasons, which prevents the RTSP server from identifying the senders of the packets. This issue can be avoided by disabling the UDP transport protocol:
```
docker run --rm -it \
-e MTX_PROTOCOLS=tcp \
-e MTX_WEBRTCADDITIONALHOSTS=192.168.x.x \
-p 8554:8554 \
-p 1935:1935 \
-p 8888:8888 \
-p 8889:8889 \
-p 8890:8890/udp \
-p 8189:8189/udp \
bluenviron/mediamtx
```
Set `MTX_WEBRTCADDITIONALHOSTS` to your local IP address.
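For reference, here is the same setup expressed as a minimal Docker Compose sketch; the IP address `192.168.1.10` is a placeholder for your local IP:
```yml
services:
  mediamtx:
    image: bluenviron/mediamtx:latest
    environment:
      # disable the UDP transport protocol
      - MTX_PROTOCOLS=tcp
      # replace with your local IP address
      - MTX_WEBRTCADDITIONALHOSTS=192.168.1.10
    ports:
      - "8554:8554" # RTSP
      - "1935:1935" # RTMP
      - "8888:8888" # HLS
      - "8889:8889" # WebRTC (HTTP)
      - "8890:8890/udp" # SRT
      - "8189:8189/udp" # WebRTC (ICE/UDP)
```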
### Arch Linux package
If you are running the Arch Linux distribution, run:
```sh
git clone https://aur.archlinux.org/mediamtx.git
cd mediamtx
makepkg -si
```
### OpenWrt binary
If the architecture of the OpenWrt device is amd64, armv6, armv7 or arm64, use the [standalone binary method](#standalone-binary) and download a Linux binary that corresponds to your architecture.
Otherwise, [compile the server from source](#openwrt-1).
## Basic usage
1. Publish a stream. For instance, you can publish a video/audio file with _FFmpeg_:
```sh
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f rtsp rtsp://localhost:8554/mystream
```
or _GStreamer_:
```sh
gst-launch-1.0 rtspclientsink name=s location=rtsp://localhost:8554/mystream filesrc location=file.mp4 \
! qtdemux name=d d.video_0 ! queue ! s.sink_0 d.audio_0 ! queue ! s.sink_1
```
2. Open the stream. For instance, you can open the stream with _VLC_:
```sh
vlc --network-caching=50 rtsp://localhost:8554/mystream
```
or _GStreamer_:
```sh
gst-play-1.0 rtsp://localhost:8554/mystream
```
or _FFmpeg_:
```sh
ffmpeg -i rtsp://localhost:8554/mystream -c copy output.mp4
```
## Publish to the server
### By software
#### FFmpeg
FFmpeg can publish a stream to the server in multiple ways (SRT client, SRT server, RTSP client, RTMP client, UDP/MPEG-TS, WebRTC with WHIP). The recommended way is to publish as an [RTSP client](#rtsp-clients):
```sh
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f rtsp rtsp://localhost:8554/mystream
```
The RTSP protocol supports multiple underlying transport protocols, each with its own characteristics (see [RTSP-specific features](#rtsp-specific-features)). You can set the transport protocol by using the `rtsp_transport` flag, for instance, in order to use TCP:
```sh
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f rtsp -rtsp_transport tcp rtsp://localhost:8554/mystream
```
The resulting stream will be available in path `/mystream`.
#### GStreamer
GStreamer can publish a stream to the server in multiple ways (SRT client, SRT server, RTSP client, RTMP client, UDP/MPEG-TS, WebRTC with WHIP). The recommended way is to publish as an [RTSP client](#rtsp-clients):
```sh
gst-launch-1.0 rtspclientsink name=s location=rtsp://localhost:8554/mystream \
filesrc location=file.mp4 ! qtdemux name=d \
d.video_0 ! queue ! s.sink_0 \
d.audio_0 ! queue ! s.sink_1
```
If the stream is video only:
```sh
gst-launch-1.0 filesrc location=file.mp4 ! qtdemux name=d \
d.video_0 ! rtspclientsink location=rtsp://localhost:8554/mystream
```
The RTSP protocol supports multiple underlying transport protocols, each with its own characteristics (see [RTSP-specific features](#rtsp-specific-features)). You can set the transport protocol by using the `protocols` flag:
```sh
gst-launch-1.0 filesrc location=file.mp4 ! qtdemux name=d \
d.video_0 ! rtspclientsink protocols=tcp name=s location=rtsp://localhost:8554/mystream
```
The resulting stream will be available in path `/mystream`.
GStreamer can also publish a stream by using the [WebRTC / WHIP protocol](#webrtc). Make sure that the GStreamer version is at least 1.22 and that, if the codec is H264, the profile is baseline. Use the `whipclientsink` element:
```sh
gst-launch-1.0 videotestsrc \
! video/x-raw,width=1920,height=1080,format=I420 \
! x264enc speed-preset=ultrafast bitrate=2000 \
! video/x-h264,profile=baseline \
! whipclientsink signaller::whip-endpoint=http://localhost:8889/mystream/whip
```
#### OBS Studio
OBS Studio can publish to the server in multiple ways (SRT client, RTMP client, WebRTC client). The recommended way is to publish as an [RTMP client](#rtmp-clients). In `Settings -> Stream` (or in the Auto-configuration Wizard), use the following parameters:
* Service: `Custom...`
* Server: `rtmp://localhost`
* Stream key: `mystream`
If credentials are in use, use the following parameters:
* Service: `Custom...`
* Server: `rtmp://localhost`
* Stream key: `mystream?user=myuser&pass=mypass`
Save the configuration and click `Start streaming`.
If you want to generate a stream that can be read with WebRTC, open `Settings -> Output -> Recording` and use the following parameters:
* FFmpeg output type: `Output to URL`
* File path or URL: `rtsp://localhost:8554/mystream`
* Container format: `rtsp`
* Check `show all codecs (even if potentially incompatible)`
* Video encoder: `h264_nvenc (libx264)`
* Video encoder settings (if any): `bf=0`
* Audio track: `1`
* Audio encoder: `libopus`
Then use the button `Start Recording` (instead of `Start Streaming`) to start streaming.
Latest versions of OBS Studio can publish to the server with the [WebRTC / WHIP protocol](#webrtc). Use the following parameters:
* Service: `WHIP`
* Server: `http://localhost:8889/mystream/whip`
* Bearer Token: `myuser:mypass` (if internal authentication is enabled) or JWT (if JWT-based authentication is enabled)
Save the configuration and click `Start streaming`.
The resulting stream will be available in path `/mystream`.
#### OpenCV
OpenCV can publish to the server through its GStreamer plugin, as an [RTSP client](#rtsp-clients). It must be compiled with GStreamer support, by following this procedure:
```sh
sudo apt install -y libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev gstreamer1.0-plugins-ugly gstreamer1.0-rtsp python3-dev python3-numpy
git clone --depth=1 -b 4.5.4 https://github.com/opencv/opencv
cd opencv
mkdir build && cd build
cmake -D CMAKE_INSTALL_PREFIX=/usr -D WITH_GSTREAMER=ON ..
make -j$(nproc)
sudo make install
```
You can check that OpenCV has been installed correctly by running:
```sh
python3 -c 'import cv2; print(cv2.getBuildInformation())'
```
Check that the output contains `GStreamer: YES`.
Videos can be published with `VideoWriter`:
```python
from datetime import datetime
from time import sleep, time

import cv2
import numpy as np

fps = 15
width = 800
height = 600
colors = [
    (0, 0, 255),
    (255, 0, 0),
    (0, 255, 0),
]

# open a GStreamer pipeline that encodes frames with x264
# and publishes them to the server with RTSP
out = cv2.VideoWriter('appsrc ! videoconvert' + \
    ' ! video/x-raw,format=I420' + \
    ' ! x264enc speed-preset=ultrafast bitrate=600 key-int-max=' + str(fps * 2) + \
    ' ! video/x-h264,profile=baseline' + \
    ' ! rtspclientsink location=rtsp://localhost:8554/mystream',
    cv2.CAP_GSTREAMER, 0, fps, (width, height), True)
if not out.isOpened():
    raise Exception("can't open video writer")

curcolor = 0
start = time()

while True:
    frame = np.zeros((height, width, 3), np.uint8)

    # create a rectangle
    color = colors[curcolor]
    curcolor += 1
    curcolor %= len(colors)
    for y in range(0, int(frame.shape[0] / 2)):
        for x in range(0, int(frame.shape[1] / 2)):
            frame[y][x] = color

    out.write(frame)
    print("%s frame written to the server" % datetime.now())

    # sleep for the remainder of the frame interval
    now = time()
    diff = (1 / fps) - (now - start)
    if diff > 0:
        sleep(diff)
    start = now
```
The resulting stream will be available in path `/mystream`.
#### Web browsers
Web browsers can publish a stream to the server by using the [WebRTC protocol](#webrtc). Start the server and open the web page:
```
http://localhost:8889/mystream/publish
```
The resulting stream will be available in path `/mystream`.
This web page can be embedded into another web page by using an iframe:
```html
<iframe src="http://localhost:8889/mystream/publish" scrolling="no"></iframe>
```
For more advanced setups, you can create and serve a custom web page by starting from the [source code of the WebRTC publish page](internal/servers/webrtc/publish_index.html).
### By device
#### Generic webcam
If the OS is Linux-based, edit `mediamtx.yml` and replace everything inside section `paths` with the following content:
```yml
paths:
  cam:
    runOnInit: ffmpeg -f v4l2 -i /dev/video0 -pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k -f rtsp rtsp://localhost:$RTSP_PORT/$MTX_PATH
    runOnInitRestart: yes
```
If the OS is Windows:
```yml
paths:
  cam:
    runOnInit: ffmpeg -f dshow -i video="USB2.0 HD UVC WebCam" -pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k -f rtsp rtsp://localhost:$RTSP_PORT/$MTX_PATH
    runOnInitRestart: yes
```
Where `USB2.0 HD UVC WebCam` is the name of the webcam, which can be obtained with:
```sh
ffmpeg -list_devices true -f dshow -i dummy
```
The resulting stream will be available in path `/cam`.
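To quickly verify the stream, you can read it back, for instance with `ffplay` (shipped with FFmpeg):
```sh
ffplay rtsp://localhost:8554/cam
```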
#### Raspberry Pi Cameras
_MediaMTX_ natively supports the Raspberry Pi Camera, enabling high-quality and low-latency video streaming from the camera to any user, for any purpose. There are a couple of requirements:
1. The server must run on a Raspberry Pi, with one of the following operating systems:
* Raspberry Pi OS Bookworm
* Raspberry Pi OS Bullseye
Both 32 bit and 64 bit architectures are supported.
2. If you are using Raspberry Pi OS Bullseye, make sure that the legacy camera stack is disabled. Type `sudo raspi-config`, then go to `Interfacing options`, `enable/disable legacy camera support`, choose `no`. Reboot the system.
If you want to run the standard (non-Docker) version of the server:
1. Download the server executable. If you're using the 64-bit version of the operating system, make sure to pick the `arm64` variant.
2. Edit `mediamtx.yml` and replace everything inside section `paths` with the following content:
```yml
paths:
  cam:
    source: rpiCamera
```
The resulting stream will be available in path `/cam`.
If you want to run the server inside Docker, you need to use the `latest-rpi` image and launch the container with some additional flags:
```sh
docker run --rm -it \
--network=host \
--privileged \
--tmpfs /dev/shm:exec \
-v /run/udev:/run/udev:ro \
-e MTX_PATHS_CAM_SOURCE=rpiCamera \
bluenviron/mediamtx:latest-rpi
```
Be aware that the server is not compatible with cameras that require a custom `libcamera` (like some ArduCam products), since it comes with a bundled `libcamera`. If you want to use a custom one, you can [compile from source](#compile-from-source).
Camera settings can be changed by using the `rpiCamera*` parameters:
```yml
paths:
  cam:
    source: rpiCamera
    rpiCameraWidth: 1920
    rpiCameraHeight: 1080
```
All available parameters are listed in the [sample configuration file](/mediamtx.yml).
In order to add audio from a USB microphone, install GStreamer and alsa-utils:
```sh
sudo apt install -y gstreamer1.0-tools gstreamer1.0-rtsp gstreamer1.0-alsa alsa-utils
```
List the available audio cards with:
```sh
arecord -L
```
Sample output:
```
surround51:CARD=ICH5,DEV=0
Intel ICH5, Intel ICH5
5.1 Surround output to Front, Center, Rear and Subwoofer speakers
default:CARD=U0x46d0x809
USB Device 0x46d:0x809, USB Audio
Default Audio Device
```
Find the audio card of the microphone and take note of its name, for instance `default:CARD=U0x46d0x809`. Then create a new path that takes the video stream from the camera and the audio from the microphone:
```yml
paths:
  cam:
    source: rpiCamera
  cam_with_audio:
    runOnInit: >
      gst-launch-1.0
      rtspclientsink name=s location=rtsp://localhost:$RTSP_PORT/cam_with_audio
      rtspsrc location=rtsp://127.0.0.1:$RTSP_PORT/cam latency=0 ! rtph264depay ! s.
      alsasrc device=default:CARD=U0x46d0x809 ! opusenc bitrate=16000 ! s.
    runOnInitRestart: yes
```
The resulting stream will be available in path `/cam_with_audio`.
### By protocol
#### SRT clients
SRT is a protocol that allows publishing and reading live data streams, while providing encryption, integrity and a retransmission mechanism. It is usually used to transfer media streams encoded with MPEG-TS. In order to publish a stream to the server with the SRT protocol, use this URL:
```
srt://localhost:8890?streamid=publish:mystream&pkt_size=1316
```
Replace `mystream` with any name you want. The resulting stream will be available in path `/mystream`.
If credentials are enabled, append username and password to `streamid`:
```
srt://localhost:8890?streamid=publish:mystream:user:pass&pkt_size=1316
```
If you need to use the standard stream ID syntax instead of the custom one in use by this server, see [Standard stream ID syntax](#standard-stream-id-syntax).
If you want to publish a stream by using a client in listening mode (i.e. with `mode=listener` appended to the URL), read the next section.
Known clients that can publish with SRT are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [OBS Studio](#obs-studio).
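For instance, assuming an FFmpeg build with SRT support and a local file `file.ts`, a stream can be published with:
```sh
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f mpegts \
  'srt://localhost:8890?streamid=publish:mystream&pkt_size=1316'
```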
#### SRT cameras and servers
In order to ingest into the server an SRT stream from an existing server, camera or client in listening mode (i.e. with `mode=listener` appended to the URL), add the corresponding URL into the `source` parameter of a path:
```yml
paths:
  proxied:
    # url of the source stream, in the format srt://host:port?streamid=streamid&other_parameters
    source: srt://original-url
```
#### WebRTC clients
WebRTC is an API that makes use of a set of protocols and methods to connect two clients together and allow them to exchange real-time media or data streams. You can publish a stream with WebRTC and a web browser by visiting:
```
http://localhost:8889/mystream/publish
```
The resulting stream will be available in path `/mystream`.
WHIP is a WebRTC extension that allows publishing streams by using a URL, without passing through a web page. This makes it possible to use WebRTC as a general-purpose streaming protocol. If you are using software that supports WHIP (for instance, recent versions of OBS Studio), you can publish a stream to the server by using this URL:
```
http://localhost:8889/mystream/whip
```
Regarding authentication, read [Authenticating with WHIP/WHEP](#authenticating-with-whipwhep).
Depending on the network, it may be difficult to establish a connection between the server and clients; read [Solving WebRTC connectivity issues](#solving-webrtc-connectivity-issues).
Known clients that can publish with WebRTC and WHIP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [OBS Studio](#obs-studio).
#### WebRTC servers
In order to ingest into the server a WebRTC stream from an existing server, add the corresponding WHEP URL into the `source` parameter of a path:
```yml
paths:
  proxied:
    # url of the source stream, in the format whep://host:port/path (HTTP) or wheps:// (HTTPS)
    source: wheps://host:port/path
```
#### RTSP clients
RTSP is a protocol that allows publishing and reading streams. It supports several underlying transport protocols and allows encrypting streams in transit (see [RTSP-specific features](#rtsp-specific-features)). In order to publish a stream to the server with the RTSP protocol, use this URL:
```
rtsp://localhost:8554/mystream
```
The resulting stream will be available in path `/mystream`.
Known clients that can publish with RTSP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [OBS Studio](#obs-studio).
#### RTSP cameras and servers
Most IP cameras expose their video stream through an RTSP server that is embedded into the camera itself. In particular, cameras that are compliant with ONVIF profile S or T meet this requirement. You can use _MediaMTX_ to connect to one or multiple existing RTSP servers and read their video streams:
```yml
paths:
  proxied:
    # url of the source stream, in the format rtsp://user:pass@host:port/path
    source: rtsp://original-url
```
The resulting stream will be available in path `/proxied`.
The server supports any number of source streams (the count is limited only by available hardware resources); it's enough to add additional entries to the `paths` section:
```yml
paths:
  proxied1:
    source: rtsp://url1
  proxied2:
    source: rtsp://url2
```
#### RTMP clients
RTMP is a protocol that allows reading and publishing streams, but it is less versatile and less efficient than RTSP and WebRTC (it doesn't support UDP, doesn't support most RTSP codecs, and doesn't provide a feedback mechanism). Streams can be published to the server by using the URL:
```
rtmp://localhost/mystream
```
The resulting stream will be available in path `/mystream`.
In case authentication is enabled, credentials can be passed to the server by using the `user` and `pass` query parameters:
```
rtmp://localhost/mystream?user=myuser&pass=mypass
```
Known clients that can publish with RTMP are [FFmpeg](#ffmpeg), [GStreamer](#gstreamer), [OBS Studio](#obs-studio).
#### RTMP cameras and servers
You can use _MediaMTX_ to connect to one or multiple existing RTMP servers and read their video streams:
```yml
paths:
  proxied:
    # url of the source stream, in the format rtmp://user:pass@host:port/path
    source: rtmp://original-url
```
The resulting stream will be available in path `/proxied`.
#### HLS cameras and servers
HLS is a streaming protocol that works by splitting streams into segments, and by serving these segments and a playlist with the HTTP protocol. You can use _MediaMTX_ to connect to one or multiple existing HLS servers and read their video streams:
```yml
paths:
  proxied:
    # url of the playlist of the stream, in the format http://user:pass@host:port/path
    source: http://original-url/stream/index.m3u8
```
The resulting stream will be available in path `/proxied`.
#### UDP/MPEG-TS
The server supports ingesting UDP/MPEG-TS packets (i.e. MPEG-TS packets sent over UDP). Packets can be unicast, broadcast or multicast. For instance, you can generate a multicast UDP/MPEG-TS stream with GStreamer:
```
gst-launch-1.0 -v mpegtsmux name=mux alignment=1 ! udpsink host=238.0.0.1 port=1234 \
videotestsrc ! video/x-raw,width=1280,height=720,format=I420 ! x264enc speed-preset=ultrafast bitrate=3000 key-int-max=60 ! video/x-h264,profile=high ! mux. \
audiotestsrc ! audioconvert ! avenc_aac ! mux.
```
or FFmpeg:
```
ffmpeg -re -f lavfi -i testsrc=size=1280x720:rate=30 \
-pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k \
-f mpegts udp://238.0.0.1:1234?pkt_size=1316
```
Edit `mediamtx.yml` and replace everything inside section `paths` with the following content:
```yml
paths:
  mypath:
    source: udp://238.0.0.1:1234
```
The resulting stream will be available in path `/mypath`.
Known clients that can publish with UDP/MPEG-TS are [FFmpeg](#ffmpeg) and [GStreamer](#gstreamer).
## Read from the server
### By software
#### FFmpeg
FFmpeg can read a stream from the server in multiple ways (RTSP, RTMP, HLS, WebRTC with WHEP, SRT). The recommended way is to read with [RTSP](#rtsp):
```sh
ffmpeg -i rtsp://localhost:8554/mystream -c copy output.mp4
```
The RTSP protocol supports multiple underlying transport protocols, each with its own characteristics (see [RTSP-specific features](#rtsp-specific-features)). You can set the transport protocol by using the `rtsp_transport` flag:
```sh
ffmpeg -rtsp_transport tcp -i rtsp://localhost:8554/mystream -c copy output.mp4
```
#### GStreamer
GStreamer can read a stream from the server in multiple ways (RTSP, RTMP, HLS, WebRTC with WHEP, SRT). The recommended way is to read with [RTSP](#rtsp):
```sh
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/mystream latency=0 ! decodebin ! autovideosink
```
The RTSP protocol supports multiple underlying transport protocols, each with its own characteristics (see [RTSP-specific features](#rtsp-specific-features)). You can change the transport protocol by using the `protocols` flag:
```sh
gst-launch-1.0 rtspsrc protocols=tcp location=rtsp://127.0.0.1:8554/mystream latency=0 ! decodebin ! autovideosink
```
If encryption is enabled, set `tls-validation-flags` to `0`:
```sh
gst-launch-1.0 rtspsrc tls-validation-flags=0 location=rtsps://ip:8322/...
```
#### VLC
VLC can read a stream from the server in multiple ways (RTSP, RTMP, HLS, SRT). The recommended way is to read with [RTSP](#rtsp):
```sh
vlc --network-caching=50 rtsp://localhost:8554/mystream
```
The RTSP protocol supports multiple underlying transport protocols, each with its own characteristics (see [RTSP-specific features](#rtsp-specific-features)).
In order to use the TCP transport protocol, use the `--rtsp-tcp` flag:
```sh
vlc --network-caching=50 --rtsp-tcp rtsp://localhost:8554/mystream
```
In order to use the UDP-multicast transport protocol, append `?vlcmulticast` to the URL:
```sh
vlc --network-caching=50 rtsp://localhost:8554/mystream?vlcmulticast
```
##### Ubuntu bug
The VLC shipped with Ubuntu 21.10 doesn't support playing RTSP due to a license issue (see [here](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=982299) and [here](https://stackoverflow.com/questions/69766748/cvlc-cannot-play-rtsp-omxplayer-instead-can)). To fix the issue, remove the default VLC instance and install the snap version:
```
sudo apt purge -y vlc
snap install vlc
```
##### Encrypted streams
At the moment VLC doesn't support reading encrypted RTSP streams. However, you can use a proxy like [stunnel](https://www.stunnel.org) or [nginx](https://nginx.org/) or a local _MediaMTX_ instance to decrypt streams before reading them.
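As a sketch of the last option (hostnames and paths are placeholders), a local _MediaMTX_ instance can pull the encrypted stream and expose it unencrypted:
```yml
paths:
  decrypted:
    # pull the encrypted stream from the original server
    source: rtsps://original-server:8322/mystream
```
VLC can then read `rtsp://localhost:8554/decrypted` without encryption.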
#### Web browsers
Web browsers can read a stream from the server in multiple ways (WebRTC or HLS).
You can read a stream by using the [WebRTC protocol](#webrtc), by visiting the web page:
```
http://localhost:8889/mystream
```
This web page can be embedded into another web page by using an iframe:
```html
<iframe src="http://localhost:8889/mystream" scrolling="no"></iframe>
```
For more advanced setups, you can create and serve a custom web page by starting from the [source code of the WebRTC read page](internal/servers/webrtc/read_index.html).
Web browsers can also read a stream with the [HLS protocol](#hls). Latency is higher, but there are fewer problems related to connectivity between server and clients; furthermore, the server load can be balanced by using a common HTTP CDN (like CloudFront or Cloudflare), which makes it possible to handle readers in the order of millions. Visit the web page:
```
http://localhost:8888/mystream
```
This web page can be embedded into another web page by using an iframe:
```html
<iframe src="http://localhost:8888/mystream" scrolling="no"></iframe>
```
For more advanced setups, you can create and serve a custom web page by starting from the [source code of the HLS read page](internal/servers/hls/index.html).
### By protocol
#### SRT
SRT is a protocol that allows publishing and reading live data streams, while providing encryption, integrity and a retransmission mechanism. It is usually used to transfer media streams encoded with MPEG-TS. In order to read a stream from the server with the SRT protocol, use this URL:
```
srt://localhost:8890?streamid=read:mystream
```
Replace `mystream` with the path name.
If credentials are enabled, append username and password to `streamid`:
```
srt://localhost:8890?streamid=read:mystream:user:pass
```
If you need to use the standard stream ID syntax instead of the custom one in use by this server, see [Standard stream ID syntax](#standard-stream-id-syntax).
Known clients that can read with SRT are [FFmpeg](#ffmpeg-1), [GStreamer](#gstreamer-1) and [VLC](#vlc).
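For instance, assuming an FFmpeg build with SRT support, the stream can be played with `ffplay`:
```sh
ffplay 'srt://localhost:8890?streamid=read:mystream'
```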
#### WebRTC
WebRTC is an API that makes use of a set of protocols and methods to connect two clients together and allow them to exchange real-time media or data streams. You can read a stream with WebRTC and a web browser by visiting:
```
http://localhost:8889/mystream
```
WHEP is a WebRTC extension that allows reading streams by using a URL, without passing through a web page. This makes it possible to use WebRTC as a general-purpose streaming protocol. If you are using software that supports WHEP, you can read a stream from the server by using this URL:
```
http://localhost:8889/mystream/whep
```
Regarding authentication, read [Authenticating with WHIP/WHEP](#authenticating-with-whipwhep).
Depending on the network it may be difficult to establish a connection between server and clients, read [Solving WebRTC connectivity issues](#solving-webrtc-connectivity-issues).
Known clients that can read with WebRTC and WHEP are [FFmpeg](#ffmpeg-1), [GStreamer](#gstreamer-1) and [web browsers](#web-browsers-1).
#### RTSP
RTSP is a protocol that allows publishing and reading streams. It supports several underlying transport protocols and allows encrypting streams in transit (see [RTSP-specific features](#rtsp-specific-features)). In order to read a stream with the RTSP protocol, use this URL:
```
rtsp://localhost:8554/mystream
```
Known clients that can read with RTSP are [FFmpeg](#ffmpeg-1), [GStreamer](#gstreamer-1) and [VLC](#vlc).
##### Latency
The RTSP protocol doesn't introduce any latency by itself. Latency is usually introduced by clients, which buffer frames to compensate for network fluctuations. The best way to decrease latency is to tune the client; for instance, in VLC, latency can be decreased by lowering the _Network caching_ parameter, which is available in the _Open network stream_ dialog or can be set from the command line:
```
vlc --network-caching=50 rtsp://...
```
#### RTMP
RTMP is a protocol that allows reading and publishing streams, but it is less versatile and less efficient than RTSP and WebRTC (it doesn't support UDP, doesn't support most RTSP codecs, and doesn't provide a feedback mechanism). Streams can be read from the server by using the URL:
```
rtmp://localhost/mystream
```
In case authentication is enabled, credentials can be passed to the server by using the `user` and `pass` query parameters:
```
rtmp://localhost/mystream?user=myuser&pass=mypass
```
Known clients that can read with RTMP are [FFmpeg](#ffmpeg-1), [GStreamer](#gstreamer-1) and [VLC](#vlc).
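For instance, as a quick check with `ffplay` (shipped with FFmpeg), assuming no authentication is enabled:
```sh
ffplay rtmp://localhost/mystream
```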
#### HLS
HLS is a protocol that works by splitting streams into segments, and by serving these segments and a playlist with the HTTP protocol. You can use _MediaMTX_ to generate an HLS stream, which is accessible through a web page:
```
http://localhost:8888/mystream
```
and can also be accessed without using a browser, by software that supports the HLS protocol (for instance VLC or _MediaMTX_ itself), by using this URL:
```
http://localhost:8888/mystream/index.m3u8
```
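For instance, the playlist can be read and remuxed to a file with FFmpeg:
```sh
ffmpeg -i http://localhost:8888/mystream/index.m3u8 -c copy output.mp4
```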
Although the server can produce HLS with a variety of video and audio codecs (that are listed at the beginning of the README), not all browsers can read all codecs.
You can check what codecs your browser can read by [using this tool](https://jsfiddle.net/g1qyf4ea).
If you want to support most browsers, you can re-encode the stream by using the H264 and AAC codecs, for instance with FFmpeg:
```sh
ffmpeg -i rtsp://original-source \
-pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k \
-c:a aac -b:a 160k \
-f rtsp rtsp://localhost:8554/mystream
```
Known clients that can read with HLS are [FFmpeg](#ffmpeg-1), [GStreamer](#gstreamer-1), [VLC](#vlc) and [web browsers](#web-browsers-1).
##### LL-HLS
Low-Latency HLS is a recently standardized variant of the protocol that greatly reduces playback latency. It works by splitting segments into parts, which are served before the segment is complete. LL-HLS is enabled by default. If the stream is not shown correctly, try tuning the `hlsPartDuration` parameter, for instance:
```yml
hlsPartDuration: 500ms
```
##### Compatibility with Apple devices
In order to correctly display Low-Latency HLS streams in Safari running on Apple devices (iOS or macOS), a TLS certificate is needed and can be generated with OpenSSL:
```sh
openssl genrsa -out server.key 2048
openssl req -new -x509 -sha256 -key server.key -out server.crt -days 3650
```
Set the `hlsEncryption`, `hlsServerKey` and `hlsServerCert` parameters in the configuration file:
```yml
hlsEncryption: yes
hlsServerKey: server.key
hlsServerCert: server.crt
```
Keep in mind that not all H264 video streams can be played on Apple devices, due to some intrinsic properties (distance between I-frames, profile). If the video can't be played correctly, you can either:
* re-encode it by following instructions in this README
* disable the Low-latency variant of HLS and go back to the legacy variant:
```yml
hlsVariant: mpegts
```
##### Latency
In HLS, latency is introduced because a client must wait for the server to generate segments before downloading them. This latency amounts to 500ms-3s when the Low-Latency HLS variant is enabled (and it is by default); otherwise, it amounts to 1-15 seconds.
To decrease the latency, you can:
* try decreasing the `hlsPartDuration` parameter
* try decreasing the `hlsSegmentDuration` parameter
* the segment duration is influenced by the interval between the IDR frames of the video track. An IDR frame is a frame that can be decoded independently of the others. The server changes the segment duration in order to include at least one IDR frame in each segment; therefore, you need to decrease the interval between IDR frames. This can be done in two ways:
* if the stream is being hardware-generated (i.e. by a camera), there's usually a setting called Key-Frame Interval in the camera configuration page
* otherwise, the stream must be re-encoded. It's possible to tune the IDR frame interval by using FFmpeg's `-g` option:
```sh
ffmpeg -i rtsp://original-stream -pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k -max_muxing_queue_size 1024 -g 30 -f rtsp rtsp://localhost:$RTSP_PORT/compressed
```
## Other features
### Configuration
All the configuration parameters are listed and commented in the [configuration file](mediamtx.yml).
There are three ways to change the configuration:
1. By editing the `mediamtx.yml` file, which is
* included into the release bundle
* available in the root folder of the Docker image (`/mediamtx.yml`); it can be overridden in this way:
```
docker run --rm -it --network=host -v $PWD/mediamtx.yml:/mediamtx.yml bluenviron/mediamtx
```
The configuration can be changed dynamically while the server is running (hot reloading) by writing to the configuration file. Changes are detected and applied without disconnecting existing clients, whenever possible.
2. By overriding configuration parameters with environment variables, in the format `MTX_PARAMNAME`, where `PARAMNAME` is the uppercase name of a parameter. For instance, the `rtspAddress` parameter can be overridden in the following way:
```
MTX_RTSPADDRESS="127.0.0.1:8554" ./mediamtx
```
Parameters that have an array as value can be overridden by setting a comma-separated list. For example:
```
MTX_PROTOCOLS="tcp,udp"
```
Parameters in maps can be overridden by using underscores, in the following way:
```
MTX_PATHS_TEST_SOURCE=rtsp://myurl ./mediamtx
```
This method is particularly useful when using Docker; any configuration parameter can be changed by passing environment variables with the `-e` flag:
```
docker run --rm -it --network=host -e MTX_PATHS_TEST_SOURCE=rtsp://myurl bluenviron/mediamtx
```
3. By using the [Control API](#control-api).
### Authentication
#### Internal
The server provides three ways to authenticate users:
* Internal: users are stored in the configuration file
* HTTP-based: an external HTTP URL is contacted to perform authentication
* JWT: an external identity server provides authentication through JWTs
The internal authentication method is the default one. Users are stored inside the configuration file, in this format:
```yml
authInternalUsers:
  # Username. 'any' means any user, including anonymous ones.
  - user: any
    # Password. Not used in case of 'any' user.
    pass:
    # IPs or networks allowed to use this user. An empty list means any IP.
    ips: []
    # List of permissions.
    permissions:
      # Available actions are: publish, read, playback, api, metrics, pprof.
      - action: publish
        # Paths can be set to further restrict access to a specific path.
        # An empty path means any path.
        # Regular expressions can be used by using a tilde as prefix.
        path:
      - action: read
        path:
      - action: playback
        path:
```
Only clients that provide a valid username and password will be able to perform the corresponding action:
```
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f rtsp rtsp://myuser:mypass@localhost:8554/mystream
```
If storing plain credentials in the configuration file is a security problem, usernames and passwords can be stored as hashed strings. The Argon2 and SHA256 hashing algorithms are supported. To use Argon2, the string must be hashed using Argon2id (recommended) or Argon2i:
```
echo -n "mypass" | argon2 saltItWithSalt -id -l 32 -e
```
Then stored with the `argon2:` prefix:
```yml
authInternalUsers:
  - user: argon2:$argon2id$v=19$m=4096,t=3,p=1$MTIzNDU2Nzg$OGGO0eCMN0ievb4YGSzvS/H+Vajx1pcbUmtLp2tRqRU
    pass: argon2:$argon2i$v=19$m=4096,t=3,p=1$MTIzNDU2Nzg$oct3kOiFywTdDdt19kT07hdvmsPTvt9zxAUho2DLqZw
    permissions:
      - action: publish
```
To use SHA256, the string must be hashed with SHA256 and encoded with base64:
```
echo -n "mypass" | openssl dgst -binary -sha256 | openssl base64
```
Then stored with the `sha256:` prefix:
```yml
authInternalUsers:
  - user: sha256:j1tsRqDEw9xvq/D7/9tMx6Jh/jMhk3UfjwIB2f1zgMo=
    pass: sha256:BdSWkrdV+ZxFBLUQQY7+7uv9RmiSVA8nrPmjGjJtZQQ=
    permissions:
      - action: publish
```
**WARNING**: enable encryption or use a VPN to ensure that no one is intercepting the credentials in transit.
#### HTTP-based
Authentication can be delegated to an external HTTP server:
```yml
authMethod: http
authHTTPAddress: http://myauthserver/auth
```
Each time a user needs to be authenticated, the specified URL will be requested with the POST method and this payload:
```json
{
  "user": "user",
  "password": "password",
  "ip": "ip",
  "action": "publish|read|playback|api|metrics|pprof",
  "path": "path",
  "protocol": "rtsp|rtmp|hls|webrtc|srt",
  "id": "id",
  "query": "query"
}
```
If the URL returns a status code that begins with `20` (e.g. `200`), authentication is successful; otherwise, it fails. Be aware that it's perfectly normal for the authentication server to receive requests with empty users and passwords, i.e.:
```json
{
  "user": "",
  "password": ""
}
```
This happens because RTSP clients don't provide credentials until they are asked to. In order to receive the credentials, the authentication server must reply with status code `401`; the client will then send credentials.
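For illustration, here is a minimal sketch of such an authentication server, using only the Python standard library; the hard-coded credentials, port and failure code are placeholders, not recommendations:
```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class AuthHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # decode the JSON payload sent by MediaMTX (the URL path is ignored here)
        length = int(self.headers.get('Content-Length', 0))
        req = json.loads(self.rfile.read(length))

        if req.get('user') == '' and req.get('password') == '':
            # reply with 401 so that RTSP clients send their credentials
            self.send_response(401)
        elif req.get('user') == 'myuser' and req.get('password') == 'mypass':
            # placeholder check; replace with a real user database
            self.send_response(200)
        else:
            self.send_response(403)
        self.end_headers()

HTTPServer(('', 8000), AuthHandler).serve_forever()
```
With this running, `authHTTPAddress` would point to `http://localhost:8000/auth`.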
Some actions can be excluded from the process:
```yml
# Actions to exclude from HTTP-based authentication.
# Format is the same as the one of user permissions.
authHTTPExclude:
  - action: api
  - action: metrics
  - action: pprof
```
#### JWT-based
Authentication can be delegated to an external identity server that is capable of generating JWTs and provides a JWKS endpoint. Compared to the HTTP-based method, this has the advantage that the external server is contacted just once, and not for every request, greatly improving performance. In order to use the JWT-based authentication method, set `authMethod` and `authJWTJWKS`:
```yml
authMethod: jwt
authJWTJWKS: http://my_identity_server/jwks_endpoint
authJWTClaimKey: mediamtx_permissions
```
The JWT is expected to contain a claim, with a list of permissions in the same format as the one of user permissions:
```json
{
  "mediamtx_permissions": [
    {
      "action": "publish",
      "path": ""
    }
  ]
}
```
Clients are expected to pass the JWT in the `Authorization` header (in the case of HLS and WebRTC) or in query parameters (in the case of all other protocols), for instance:
```
ffmpeg -re -stream_loop -1 -i file.ts -c copy -f rtsp rtsp://localhost:8554/mystream?jwt=MY_JWT
```
For instance (HLS):
```
GET /mypath/index.m3u8 HTTP/1.1
Host: example.com
Authorization: Bearer MY_JWT
```
Here's a tutorial on how to set up the [Keycloak identity server](https://www.keycloak.org/) in order to provide such JWTs:
1. Start Keycloak:
```
docker run --name=keycloak -p 8080:8080 -e KEYCLOAK_ADMIN=admin -e KEYCLOAK_ADMIN_PASSWORD=admin quay.io/keycloak/keycloak:23.0.7 start-dev
```
2. Open the Keycloak administration console on http://localhost:8080, click on _master_ in the top left corner, _create realm_, set realm name to `mediamtx`, Save
3. Open page _Client scopes_, _create client scope_, set name to `mediamtx`, Save
4. Open tab _Mappers_, _Configure a new Mapper_, _User Attribute_
* Name: `mediamtx_permissions`
* User Attribute: `mediamtx_permissions`
* Token Claim Name: `mediamtx_permissions`
* Claim JSON Type: `JSON`
* Multivalued: `On`
Save
5. Open page _Clients_, _Create client_, set Client ID to `mediamtx`, Next, Client authentication `On`, Next, Save
6. Open tab _Credentials_, copy client secret somewhere
7. Open tab _Client scopes_, _Add client scope_, Select `mediamtx`, Add, Default
8. Open page _Users_, _Create user_, Username `testuser`, Tab credentials, _Set password_, pick a password, Save
9. Open tab _Attributes_, _Add an attribute_
* Key: `mediamtx_permissions`
* Value: `{"action":"publish", "paths": "all"}`
You can add as many attributes with key `mediamtx_permissions` as you want, each with a single permission in it
10. In MediaMTX, use the following URL:
```yml
authJWTJWKS: http://localhost:8080/realms/mediamtx/protocol/openid-connect/certs
```
11. Perform authentication on Keycloak:
```
curl \
-d "client_id=mediamtx" \
-d "client_secret=$CLIENT_SECRET" \
-d "username=$USER" \
-d "password=$PASS" \
-d "grant_type=password" \
http://localhost:8080/realms/mediamtx/protocol/openid-connect/token
```
The JWT is inside the `access_token` key of the response:
```json
{"access_token":"eyJhbGciOiJSUzI1NiIsInR5cCIgOiAiSldUIiwia2lkIiA6ICIyNzVjX3ptOVlOdHQ0TkhwWVk4Und6ZndUclVGSzRBRmQwY3lsM2wtY3pzIn0.eyJleHAiOjE3MDk1NTUwOTIsImlhdCI6MTcwOTU1NDc5MiwianRpIjoiMzE3ZTQ1NGUtNzczMi00OTM1LWExNzAtOTNhYzQ2ODhhYWIxIiwiaXNzIjoiaHR0cDovL2xvY2FsaG9zdDo4MDgwL3JlYWxtcy9tZWRpYW10eCIsImF1ZCI6ImFjY291bnQiLCJzdWIiOiI2NTBhZDA5Zi03MDgxLTQyNGItODI4Ni0xM2I3YTA3ZDI0MWEiLCJ0eXAiOiJCZWFyZXIiLCJhenAiOiJtZWRpYW10eCIsInNlc3Npb25fc3RhdGUiOiJjYzJkNDhjYy1kMmU5LTQ0YjAtODkzZS0wYTdhNjJiZDI1YmQiLCJhY3IiOiIxIiwiYWxsb3dlZC1vcmlnaW5zIjpbIi8qIl0sInJlYWxtX2FjY2VzcyI6eyJyb2xlcyI6WyJvZmZsaW5lX2FjY2VzcyIsInVtYV9hdXRob3JpemF0aW9uIiwiZGVmYXVsdC1yb2xlcy1tZWRpYW10eCJdfSwicmVzb3VyY2VfYWNjZXNzIjp7ImFjY291bnQiOnsicm9sZXMiOlsibWFuYWdlLWFjY291bnQiLCJtYW5hZ2UtYWNjb3VudC1saW5rcyIsInZpZXctcHJvZmlsZSJdfX0sInNjb3BlIjoibWVkaWFtdHggcHJvZmlsZSBlbWFpbCIsInNpZCI6ImNjMmQ0OGNjLWQyZTktNDRiMC04OTNlLTBhN2E2MmJkMjViZCIsImVtYWlsX3ZlcmlmaWVkIjpmYWxzZSwibWVkaWFtdHhfcGVybWlzc2lvbnMiOlt7ImFjdGlvbiI6InB1Ymxpc2giLCJwYXRocyI6ImFsbCJ9XSwicHJlZmVycmVkX3VzZXJuYW1lIjoidGVzdHVzZXIifQ.Gevz7rf1qHqFg7cqtSfSP31v_NS0VH7MYfwAdra1t6Yt5rTr9vJzqUeGfjYLQWR3fr4XC58DrPOhNnILCpo7jWRdimCnbPmuuCJ0AYM-Aoi3PAsWZNxgmtopq24_JokbFArY9Y1wSGFvF8puU64lt1jyOOyxf2M4cBHCs_EarCKOwuQmEZxSf8Z-QV9nlfkoTUszDCQTiKyeIkLRHL2Iy7Fw7_T3UI7sxJjVIt0c6HCNJhBBazGsYzmcSQ_GrmhbUteMTg00o6FicqkMBe99uZFnx9wIBm_QbO9hbAkkzF923I-DTAQrFLxT08ESMepDwmzFrmnwWYBLE3u8zuUlCA","expires_in":300,"refresh_expires_in":1800,"refresh_token":"eyJhbGciOiJIUzI1NiIsInR5cCIgOiAiSldUIiwia2lkIiA6ICI3OTI3Zjg4Zi05YWM4LTRlNmEtYWE1OC1kZmY0MDQzZDRhNGUifQ.eyJleHAiOjE3MDk1NTY1OTIsImlhdCI6MTcwOTU1NDc5MiwianRpIjoiMGVhZWFhMWItYzNhMC00M2YxLWJkZjAtZjI2NTRiODlkOTE3IiwiaXNzIjoiaHR0cDovL2xvY2FsaG9zdDo4MDgwL3JlYWxtcy9tZWRpYW10eCIsImF1ZCI6Imh0dHA6Ly9sb2NhbGhvc3Q6ODA4MC9yZWFsbXMvbWVkaWFtdHgiLCJzdWIiOiI2NTBhZDA5Zi03MDgxLTQyNGItODI4Ni0xM2I3YTA3ZDI0MWEiLCJ0eXAiOiJSZWZyZXNoIiwiYXpwIjoibWVkaWFtdHgiLCJzZXNzaW9uX3N0YXRlIjoiY2MyZDQ4Y2MtZDJlOS00NGIwLTg5M2UtMGE3YTYyYmQyNWJkIiwic2NvcGUiOiJtZWRpYW10eCBwcm9maWxlIGVtYWlsIiwic2lkIjoiY2MyZDQ4Y2MtZDJlOS00NGIwLTg5M2UtMGE3YTYyYmQyNWJkIn0.yuXV8_JU0TQLuosNdp5xlYMjn7eO5Xq-PusdHzE7bsQ","token_type":"Bearer","not-before-policy":0,"session_state":"cc2d48cc-d2e9-44b0-893e-0a7a62bd25bd","scope":"mediamtx profile email"}
```
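If `jq` is installed, the token can be extracted and passed to a client in a single step:
```sh
MY_JWT=$(curl -s \
  -d "client_id=mediamtx" \
  -d "client_secret=$CLIENT_SECRET" \
  -d "username=$USER" \
  -d "password=$PASS" \
  -d "grant_type=password" \
  http://localhost:8080/realms/mediamtx/protocol/openid-connect/token \
  | jq -r .access_token)

ffmpeg -re -stream_loop -1 -i file.ts -c copy -f rtsp "rtsp://localhost:8554/mystream?jwt=$MY_JWT"
```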
### Encrypt the configuration
The configuration file can be entirely encrypted for security purposes by using the `crypto_secretbox` function of the NaCl library. An online tool for performing this operation is [available here](https://play.golang.org/p/rX29jwObNe4).
After performing the encryption, put the base64-encoded result into the configuration file, and launch the server with the `MTX_CONFKEY` variable:
```
MTX_CONFKEY=mykey ./mediamtx
```
### Remuxing, re-encoding, compression
To change the format, codec or compression of a stream, use _FFmpeg_ or _GStreamer_ together with _MediaMTX_. For instance, to re-encode an existing stream, that is available in the `/original` path, and publish the resulting stream in the `/compressed` path, edit `mediamtx.yml` and replace everything inside section `paths` with the following content:
```yml
paths:
  compressed:
  original:
    runOnReady: >
      ffmpeg -i rtsp://localhost:$RTSP_PORT/$MTX_PATH
      -pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k
      -max_muxing_queue_size 1024 -f rtsp rtsp://localhost:$RTSP_PORT/compressed
    runOnReadyRestart: yes
```
### Record streams to disk
To save available streams to disk, set the `record` and `recordPath` parameters in the configuration file:
```yml
pathDefaults:
  # Record streams to disk.
  record: yes
  # Path of recording segments.
  # Extension is added automatically.
  # Available variables are %path (path name), %Y %m %d %H %M %S %f %s (time in strftime format)
  recordPath: ./recordings/%path/%Y-%m-%d_%H-%M-%S-%f
```
All available recording parameters are listed in the [sample configuration file](/mediamtx.yml).
Be aware that not all codecs can be saved with all formats, as described in the compatibility matrix at the beginning of the README.
To upload recordings to a remote location, you can use _MediaMTX_ together with [rclone](https://github.com/rclone/rclone), a command line tool that provides file synchronization capabilities with a huge variety of services (including S3, FTP, SMB, Google Drive):
1. Download and install [rclone](https://github.com/rclone/rclone).
2. Configure _rclone_:
```
rclone config
```
3. Place `rclone` into the `runOnInit` and `runOnRecordSegmentComplete` hooks:
```yml
pathDefaults:
  # this is needed to sync segments after a crash.
  # replace myconfig with the name of the rclone config.
  runOnInit: rclone sync -v ./recordings myconfig:/my-path/recordings
  # this is called when a segment has been finalized.
  # replace myconfig with the name of the rclone config.
  runOnRecordSegmentComplete: rclone sync -v --min-age=1ms ./recordings myconfig:/my-path/recordings
```
If you want to delete local segments after they are uploaded, replace `rclone sync` with `rclone move`.
### Playback recorded streams
Existing recordings can be served to users through a dedicated HTTP server, which can be enabled inside the configuration:
```yml
playback: yes
playbackAddress: :9996
```
The server provides an endpoint to list recorded timespans:
```
http://localhost:9996/list?path=[mypath]
```
Where [mypath] is the name of a path. The server will return a list of timespans in JSON format:
```json
[
  {
    "start": "2006-01-02T15:04:05Z07:00",
    "duration": "60.0",
    "url": "http://localhost:9996/get?path=[mypath]&start=2006-01-02T15%3A04%3A05Z07%3A00&duration=60.0"
  },
  {
    "start": "2006-01-02T15:07:05Z07:00",
    "duration": "32.33",
    "url": "http://localhost:9996/get?path=[mypath]&start=2006-01-02T15%3A07%3A05Z07%3A00&duration=32.33"
  }
]
```
The server provides an endpoint to download recordings:
```
http://localhost:9996/get?path=[mypath]&start=[start_date]&duration=[duration]&format=[format]
```
Where:
* [mypath] is the path name
* [start_date] is the start date in [RFC3339 format](https://www.utctime.net/)
* [duration] is the maximum duration of the recording in seconds
* [format] (optional) is the output format of the stream. Available values are "fmp4" (default) and "mp4"
All parameters must be [url-encoded](https://www.urlencoder.org/). For instance:
```
http://localhost:9996/get?path=mypath&start=2024-01-14T16%3A33%3A17%2B00%3A00&duration=200.5
```
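For instance, assuming `curl` and `jq` are available, the first recorded timespan of a hypothetical `mypath` can be downloaded as MP4 with:
```sh
# fetch the list of timespans and extract the URL of the first one
url=$(curl -s "http://localhost:9996/list?path=mypath" | jq -r '.[0].url')

# download it, requesting the standard MP4 format
curl -o recording.mp4 "${url}&format=mp4"
```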
The resulting stream uses the fMP4 format, which is natively compatible with any browser; therefore its URL can be directly inserted into a \<video> tag.