Docs improvements (#8641)

* go all in on ruff

* upgrade docusaurus

* add netlify toml

* broken link

* fix netlify toml

* start filling out guide

* add debian setup detail

* simplify bash command
This commit is contained in:
Blake Blackshear
2023-11-18 14:04:43 +00:00
committed by GitHub
parent c6208b266b
commit 4879de263b
22 changed files with 7152 additions and 13125 deletions

View File

@@ -3,6 +3,8 @@ id: configuring_go2rtc
title: Configuring go2rtc
---
# Configuring go2rtc
Use of the bundled go2rtc is optional. You can still configure FFmpeg to connect directly to your cameras. However, adding go2rtc to your configuration is required for the following features:
- WebRTC or MSE for live viewing with higher resolutions and frame rates than the jsmpeg stream, which is limited to the detect stream
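For reference, cameras are restreamed through the bundled go2rtc by adding a `go2rtc` section with stream definitions to the Frigate config; a minimal sketch (the stream name and RTSP URL below are placeholders):

```yaml
go2rtc:
  streams:
    back_yard: # placeholder stream name
      - rtsp://user:password@192.168.1.10:554/stream # placeholder camera URL
```

The restreamed feed can then be referenced in the camera's inputs as `rtsp://127.0.0.1:8554/back_yard`.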

View File

@@ -3,7 +3,141 @@ id: getting_started
title: Getting started
---
This guide walks through the steps to build a configuration file for Frigate. It assumes that you already have an environment setup as described in [Installation](../frigate/installation.md). You should also configure your cameras according to the [camera setup guide](/frigate/camera_setup). Pay particular attention to the section on choosing a detect resolution.
# Getting Started
## Setting up hardware
This section guides you through setting up a server with Debian Bookworm and Docker. If you already have an environment with Linux and Docker installed, you can continue to [Installing Frigate](#installing-frigate) below.
### Install Debian 12 (Bookworm)
There are many guides on how to install Debian Server, so this will be an abbreviated guide. Connect a temporary monitor and keyboard to your device so you can install a minimal server without a desktop environment.
#### Prepare installation media
1. Download the small installation image from the [Debian website](https://www.debian.org/distrib/netinst)
1. Flash the ISO to a USB device (see the example after this list)
1. Boot your device from USB
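On a Linux machine, the flashing step can be done with `dd`; a sketch assuming the downloaded image is named `debian-12-netinst.iso` and the USB stick shows up as `/dev/sdX` (check with `lsblk` first, since the target device is overwritten):

```bash
# Both the ISO filename and /dev/sdX are placeholders for your actual image and USB device
sudo dd if=debian-12-netinst.iso of=/dev/sdX bs=4M status=progress conv=fsync
```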
#### Install and set up Debian for remote access
1. You will be prompted to set the root user password and create a user with a password
1. Install the minimum software. Fewer dependencies result in less maintenance.
   1. Uncheck "Debian desktop environment" and "GNOME"
   1. Check "SSH server"
   1. Keep "standard system utilities" checked
1. After reboot, log in as root at the command prompt to add your user to sudoers
1. Install sudo
   ```bash
   apt update && apt install -y sudo
   ```
1. Add the user you created to the sudo group (change `blake` to your own user)
   ```bash
   usermod -aG sudo blake
   ```
1. Shut down by running `poweroff`
At this point, you can install the device in a permanent location. The remaining steps can be performed via SSH.
#### Finish setup via SSH
1. Connect via SSH and login with your non-root user
1. Set up passwordless sudo so you don't have to type your password for each sudo command (change `blake` to your own user)
   ```bash
   echo 'blake ALL=(ALL) NOPASSWD:ALL' | sudo tee /etc/sudoers.d/user
   ```
1. Log out and log in again to activate passwordless sudo
1. Set up automatic security updates for the OS (optional)
   1. Ensure everything is up to date by running
      ```bash
      sudo apt update && sudo apt upgrade -y
      ```
   1. Install unattended upgrades
      ```bash
      sudo apt install -y unattended-upgrades
      echo unattended-upgrades unattended-upgrades/enable_auto_updates boolean true | sudo debconf-set-selections
      sudo dpkg-reconfigure -f noninteractive unattended-upgrades
      ```
Now you have a minimal Debian server that requires very little maintenance.
### Install Docker
1. Install Docker Engine (not Docker Desktop) using the [official docs](https://docs.docker.com/engine/install/debian/)
   1. Specifically, follow the steps in the [Install using the apt repository](https://docs.docker.com/engine/install/debian/#install-using-the-repository) section
2. Add your user to the docker group as described in the [Linux postinstall steps](https://docs.docker.com/engine/install/linux-postinstall/)
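The postinstall step boils down to one command plus a re-login; a sketch mirroring the linked Docker docs:

```bash
sudo usermod -aG docker $USER   # add your user to the docker group
# log out and back in, then verify Docker works without sudo
docker run hello-world
```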
## Installing Frigate
This section shows how to create a minimal directory structure for a Docker installation on Debian. If you have installed Frigate as a Home Assistant add-on or in another way, you can continue to [Configuring Frigate](#configuring-frigate).
### Setup directories
Frigate requires a valid config file to start. The following directory structure is the bare minimum to get started. Once Frigate is running, you can use the built-in config editor which supports config validation.
```
.
├── docker-compose.yml
├── config/
│   └── config.yml
└── storage/
```
The following command will create the above structure:
```bash
mkdir storage config && touch docker-compose.yml config/config.yml
```
:::note
This `docker-compose.yml` file is just a starter for amd64 devices. You will need to customize it for your setup as detailed in the [Installation docs](/frigate/installation#docker).
:::
`docker-compose.yml`
```yaml
version: "3.9"
services:
  frigate:
    container_name: frigate
    restart: unless-stopped
    image: ghcr.io/blakeblackshear/frigate:stable
    volumes:
      - ./config:/config
      - ./storage:/media/frigate
      - type: tmpfs # Optional: 1GB of memory, reduces SSD/SD Card wear
        target: /tmp/cache
        tmpfs:
          size: 1000000000
    ports:
      - "5000:5000"
      - "8554:8554" # RTSP feeds
```
`config.yml`
```yaml
mqtt:
  enabled: False

cameras:
  dummy_camera: # <--- this will be changed to your actual camera later
    enabled: False
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:554/rtsp
          roles:
            - detect
```
Now you should be able to start Frigate by running `docker compose up -d` from within the folder containing `docker-compose.yml`. Frigate should now be accessible at `server_ip:5000` and you can finish the configuration using the built-in configuration editor.
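For example, from the folder containing `docker-compose.yml`:

```bash
docker compose up -d            # start Frigate in the background
docker compose logs -f frigate  # follow the startup logs (Ctrl+C stops following, not the container)
```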
## Configuring Frigate
This section assumes that you already have an environment setup as described in [Installation](../frigate/installation.md). You should also configure your cameras according to the [camera setup guide](/frigate/camera_setup). Pay particular attention to the section on choosing a detect resolution.
### Step 1: Add a detect stream
@@ -15,6 +149,7 @@ mqtt:
cameras:
  name_of_your_camera: # <------ Name the camera
    enabled: True
    ffmpeg:
      inputs:
        - path: rtsp://10.0.10.10:554/rtsp # <----- The stream you want to use for detection
@@ -36,7 +171,21 @@ FFmpeg arguments for other types of cameras can be found [here](../configuration
Now that you have a working camera configuration, you want to set up hardware acceleration to minimize the CPU required to decode your video streams. See the [hardware acceleration](../configuration/hardware_acceleration.md) config reference for examples applicable to your hardware.
Here is an example configuration with hardware acceleration configured to work with most Intel processors with an integrated GPU using the [preset](../configuration/ffmpeg_presets.md):
`docker-compose.yml` (after modifying, you will need to run `docker compose up -d` to apply changes)
```yaml
version: "3.9"
services:
  frigate:
    ...
    devices:
      - /dev/dri/renderD128 # for intel hwaccel, needs to be updated for your hardware
    ...
```
`config.yml`
```yaml
mqtt: ...
```
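For reference, the matching `config.yml` change adds a hardware acceleration preset under the camera's `ffmpeg` section; a sketch assuming the VAAPI preset from the preset docs:

```yaml
cameras:
  name_of_your_camera:
    ffmpeg:
      inputs:
        - path: rtsp://10.0.10.10:554/rtsp
          roles:
            - detect
      hwaccel_args: preset-vaapi # Intel VAAPI preset; see the ffmpeg presets doc for alternatives
```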
@@ -53,6 +202,19 @@ cameras:
By default, Frigate will use a single CPU detector. If you have a USB Coral, you will need to add a detectors section to your config.
`docker-compose.yml` (after modifying, you will need to run `docker compose up -d` to apply changes)
```yaml
version: "3.9"
services:
  frigate:
    ...
    devices:
      - /dev/bus/usb:/dev/bus/usb # passes the USB Coral, needs to be modified for other versions
      - /dev/apex_0:/dev/apex_0 # passes a PCIe Coral, follow driver instructions here https://coral.ai/docs/m2/get-started/#2a-on-linux
    ...
```
`config.yml`
```yaml
mqtt: ...
```
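For reference, the detectors section for a USB Coral is only a few lines; a sketch using the edgetpu detector type from the detector docs (`device` differs for PCIe/M.2 versions):

```yaml
detectors:
  coral:
    type: edgetpu
    device: usb # use pci for a PCIe/M.2 Coral
```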

View File

@@ -1,65 +0,0 @@
---
id: video_pipeline
title: The video pipeline
---
Frigate uses a sophisticated video pipeline that starts with the camera feed and progressively applies transformations to it (e.g. decoding, motion detection, etc.).
This guide provides an overview to help users understand some of the key Frigate concepts.
## Overview
At a high level, there are five processing steps that could be applied to a camera feed:
```mermaid
%%{init: {"themeVariables": {"edgeLabelBackground": "transparent"}}}%%
flowchart LR
Feed(Feed\nacquisition) --> Decode(Video\ndecoding)
Decode --> Motion(Motion\ndetection)
Motion --> Object(Object\ndetection)
Feed --> Recording(Recording\nand\nvisualization)
Motion --> Recording
Object --> Recording
```
As the diagram shows, all feeds first need to be acquired. Depending on the data source, this may be as simple as using FFmpeg to connect to an RTSP source via TCP, or something more involved, such as connecting to an Apple HomeKit camera using go2rtc. A single camera can produce a main (i.e. high resolution) and a sub (i.e. lower resolution) video feed.
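In the Frigate config, this main/sub split corresponds to multiple inputs with different roles on a single camera; a sketch with placeholder URLs:

```yaml
cameras:
  front_door:
    ffmpeg:
      inputs:
        - path: rtsp://192.168.1.10:554/main # high-resolution feed
          roles:
            - record
        - path: rtsp://192.168.1.10:554/sub # lower-resolution feed
          roles:
            - detect
```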
Typically, the sub-feed will be decoded to produce full-frame images. As part of this process, the resolution may be downscaled and an image sampling frequency may be imposed (e.g. keep 5 frames per second).
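Both of those knobs live under each camera's `detect` section; a sketch with placeholder values:

```yaml
cameras:
  front_door:
    detect:
      width: 1280  # downscaled detection resolution
      height: 720
      fps: 5       # frames per second fed to motion/object detection
```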
These frames will then be compared over time to detect movement areas (a.k.a. motion boxes). These motion boxes are combined into motion regions and are analyzed by a machine learning model to detect known objects. Finally, the snapshot and recording retention config will decide what video clips and events should be saved.
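As a rough illustration of that last step, retention is controlled by the `record` and `snapshots` sections of the config; a sketch with placeholder values (see the record and snapshot references for the full set of options):

```yaml
record:
  enabled: True
  retain:
    days: 3       # keep continuous recordings for 3 days
    mode: motion  # only keep segments that contain motion
  events:
    retain:
      default: 14 # keep event clips for 14 days
snapshots:
  enabled: True
  retain:
    default: 14   # keep snapshots for 14 days
```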
## Detailed view of the video pipeline
The following diagram adds a lot more detail than the simple view explained before. The goal is to show the detailed data paths between the processing steps.
```mermaid
%%{init: {"themeVariables": {"edgeLabelBackground": "transparent"}}}%%
flowchart TD
RecStore[(Recording\nstore)]
SnapStore[(Snapshot\nstore)]
subgraph Acquisition
Cam["Camera"] -->|FFmpeg supported| Stream
Cam -->|"Other streaming\nprotocols"| go2rtc
go2rtc("go2rtc") --> Stream
Stream[Capture main and\nsub streams] --> |detect stream|Decode(Decode and\ndownscale)
end
subgraph Motion
Decode --> MotionM(Apply\nmotion masks)
MotionM --> MotionD(Motion\ndetection)
end
subgraph Detection
MotionD --> |motion regions| ObjectD(Object detection)
Decode --> ObjectD
ObjectD --> ObjectFilter(Apply object filters & zones)
ObjectFilter --> ObjectZ(Track objects)
end
Decode --> |decoded frames|Birdseye
MotionD --> |motion event|Birdseye
ObjectZ --> |object event|Birdseye
MotionD --> |"video segments\n(retain motion)"|RecStore
ObjectZ --> |detection clip|RecStore
Stream -->|"video segments\n(retain all)"| RecStore
ObjectZ --> |detection snapshot|SnapStore
```