Docs updates, fix recording warnings, default log level for ws4py (#5294)

* set default log level for ws4py

* proactively cleanup cache in all retain modes

* docs updates

* typo

* fix link

* updates
Blake Blackshear 2023-01-30 17:42:53 -06:00 committed by GitHub
parent 7edeaa3407
commit f06e8b47be
10 changed files with 154 additions and 124 deletions

View File

@@ -271,11 +271,6 @@ record:
   # Optional: Enable recording (default: shown below)
   # WARNING: If recording is disabled in the config, turning it on via
   # the UI or MQTT later will have no effect.
-  # WARNING: Frigate does not currently support limiting recordings based
-  # on available disk space automatically. If using recordings,
-  # you must specify retention settings for a number of days that
-  # will fit within the available disk space of your drive or Frigate
-  # will crash.
   enabled: False
   # Optional: Number of minutes to wait between cleanup runs (default: shown below)
   # This can be used to reduce the frequency of deleting recording segments from disk if you want to minimize i/o

View File

@@ -3,17 +3,17 @@ id: live
 title: Live View
 ---
-Frigate has different live view options, some of which require [restream](restream.md) to be enabled.
+Frigate has different live view options, some of which require the bundled `go2rtc` to be configured as shown in the [step by step guide](/guides/configuring_go2rtc).
 ## Live View Options
 Live view options can be selected while viewing the live stream. The options are:
-| Source | Latency | Frame Rate | Resolution | Audio | Requires Restream | Other Limitations |
-| ------ | ------- | ------------------------------------- | -------------- | ---------------------------- | ----------------- | -------------------------------------------- |
+| Source | Latency | Frame Rate | Resolution | Audio | Requires go2rtc | Other Limitations |
+| ------ | ------- | ------------------------------------- | -------------- | ---------------------------- | --------------- | -------------------------------------------- |
 | jsmpeg | low | same as `detect -> fps`, capped at 10 | same as detect | no | no | none |
 | mse | low | native | native | yes (depends on audio codec) | yes | not supported on iOS, Firefox is h.264 only |
 | webrtc | lowest | native | native | yes (depends on audio codec) | yes | requires extra config, doesn't support h.265 |
 ### Audio Support
@@ -38,10 +38,10 @@ There may be some cameras that you would prefer to use the sub stream for live v
 go2rtc:
   streams:
     rtsp_cam:
-      - rtsp://192.168.1.5:554/live0 # <- stream which supports video & aac audio. This is only supported for rtsp streams, http must use ffmpeg
+      - rtsp://192.168.1.5:554/live0 # <- stream which supports video & aac audio.
       - "ffmpeg:rtsp_cam#audio=opus" # <- copy of the stream which transcodes audio to opus
     rtsp_cam_sub:
-      - rtsp://192.168.1.5:554/substream # <- stream which supports video & aac audio. This is only supported for rtsp streams, http must use ffmpeg
+      - rtsp://192.168.1.5:554/substream # <- stream which supports video & aac audio.
       - "ffmpeg:rtsp_cam_sub#audio=opus" # <- copy of the stream which transcodes audio to opus
 cameras:
@@ -69,15 +69,15 @@ WebRTC works by creating a TCP or UDP connection on port `8555`. However, it req
 - For external access, over the internet, setup your router to forward port `8555` to port `8555` on the Frigate device, for both TCP and UDP.
 - For internal/local access, unless you are running through the add-on, you will also need to set the WebRTC candidates list in the go2rtc config. For example, if `192.168.1.10` is the local IP of the device running Frigate:
 ```yaml title="/config/frigate.yaml"
 go2rtc:
   streams:
     test_cam: ...
   webrtc:
     candidates:
       - 192.168.1.10:8555
       - stun:8555
 ```
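For the external-access bullet above, note that forwarding the port only works if the container itself publishes port 8555. A minimal sketch of the corresponding Docker Compose ports entry, assuming a service named `frigate` (the Home Assistant add-on variants handle this for you):

```yaml
services:
  frigate:
    # ...image, volumes, and other settings omitted...
    ports:
      - "8555:8555/tcp" # WebRTC over TCP
      - "8555:8555/udp" # WebRTC over UDP
```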
:::tip

View File

@@ -0,0 +1,59 @@
---
id: configuring_go2rtc
title: Configuring go2rtc
---
Use of the bundled go2rtc is optional. You can still configure FFmpeg to connect directly to your cameras. However, adding go2rtc to your configuration is required for the following features:
- WebRTC or MSE for live viewing with higher resolutions and frame rates than the jsmpeg stream which is limited to the detect stream
- RTSP (instead of RTMP) relay for use with Home Assistant or other consumers to reduce the number of connections to your camera streams
# Setup a go2rtc stream
First, you will want to configure go2rtc to connect to your camera stream by adding the stream you want to use for live view in your Frigate config file. If you set the stream name under go2rtc to match the name of your camera, it will automatically be mapped and you will get additional live view options for the camera. Avoid changing any other parts of your config at this step. Note that go2rtc supports [many different stream types](https://github.com/AlexxIT/go2rtc#module-streams), not just rtsp.
```yaml
go2rtc:
streams:
back:
- rtsp://user:password@10.0.10.10:554/cam/realmonitor?channel=1&subtype=2
```
The easiest live view to get working is MSE. After adding this to the config, restart Frigate and try to watch the live stream by selecting MSE in the dropdown after clicking on the camera.
### What if my video doesn't play?
If you are unable to see your video feed, first check the go2rtc logs in the Frigate UI under Logs in the sidebar. If go2rtc is having difficulty connecting to your camera, you should see some error messages in the log. If you do not see any errors, then the video codec of the stream may not be supported in your browser. If your camera stream is set to H265, try switching to H264. You can see more information about [video codec compatibility](https://github.com/AlexxIT/go2rtc#codecs-madness) in the go2rtc documentation. If you are not able to switch your camera settings from H265 to H264 or your stream is a different format such as MJPEG, you can use go2rtc to re-encode the video using the [FFmpeg parameters](https://github.com/AlexxIT/go2rtc#source-ffmpeg). It supports rotating and resizing video feeds and hardware acceleration. Keep in mind that transcoding video from one format to another is a resource-intensive task and you may be better off using the built-in jsmpeg view. Here is an example of a config that will re-encode the stream to H264 without hardware acceleration:
```yaml
go2rtc:
streams:
back:
- rtsp://user:password@10.0.10.10:554/cam/realmonitor?channel=1&subtype=2
- "ffmpeg:back#video=h264"
```
If you can see the video but do not have audio, this is most likely because your camera's audio stream is not AAC. If possible, update your camera's audio settings to AAC. If your cameras do not support AAC audio, you will need to tell go2rtc to re-encode the audio to AAC on demand if you want audio. This will use additional CPU and add some latency. To add AAC audio on demand, you can update your go2rtc config as follows:
```yaml
go2rtc:
streams:
back:
- rtsp://user:password@10.0.10.10:554/cam/realmonitor?channel=1&subtype=2
- "ffmpeg:back#audio=aac"
```
If you need to convert **both** the audio and video streams, you can use the following:
```yaml
go2rtc:
streams:
back:
- rtsp://user:password@10.0.10.10:554/cam/realmonitor?channel=1&subtype=2
- "ffmpeg:back#video=h264#audio=aac"
```
## Next steps
1. If the stream you added to go2rtc is also used by Frigate for the `record` or `detect` role, you can migrate your config to pull from the RTSP restream to reduce the number of connections to your camera as shown [here](/configuration/restream#reduce-connections-to-camera).
1. You may also prefer to [setup WebRTC](/configuration/live#webrtc-extra-configuration) for slightly lower latency than MSE. Note that WebRTC only supports h264 and specific audio formats.
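Expanding on the first follow-up item: once go2rtc carries a camera's feed, Frigate can consume the local restream instead of opening another connection to the camera. A rough sketch of the idea, assuming the `back` stream from the examples above, the default restream port `8554`, and the `preset-rtsp-restream` input preset referenced in the linked restream docs (verify both against your version):

```yaml
go2rtc:
  streams:
    back:
      - rtsp://user:password@10.0.10.10:554/cam/realmonitor?channel=1&subtype=2

cameras:
  back:
    ffmpeg:
      inputs:
        # pull from the local go2rtc restream rather than the camera directly
        - path: rtsp://127.0.0.1:8554/back
          input_args: preset-rtsp-restream
          roles:
            - detect
            - record
```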

View File

@@ -1,10 +0,0 @@
---
id: events_setup
title: Setting Up Events
---
[Snapshots](../configuration/snapshots.md) and/or [Recordings](../configuration/record.md) must be enabled for events to be created for detected objects.
## Limiting Events to Areas of Interest
The best way to limit events to areas of interest is to use [zones](../configuration/zones.md) along with `required_zones` for events and snapshots to only have events created in areas of interest.

View File

@@ -1,79 +1,32 @@
 ---
 id: getting_started
-title: Creating a config file
+title: Getting started
 ---
-This guide walks through the steps to build a configuration file for Frigate. It assumes that you already have an environment setup as described in [Installation](../frigate/installation.md). You should also configure your cameras according to the [camera setup guide](/guides/camera_setup)
+This guide walks through the steps to build a configuration file for Frigate. It assumes that you already have an environment setup as described in [Installation](../frigate/installation.md). You should also configure your cameras according to the [camera setup guide](/frigate/camera_setup). Pay particular attention to the section on choosing a detect resolution.
-### Step 1: Configure the MQTT server (Optional)
+### Step 1: Add a detect stream
-Use of a functioning MQTT server is optional for Frigate, but required for the home assistant integration. Start by adding the mqtt section at the top level in your config:
+First we will add the detect stream for the camera:
-If using mqtt:
-```yaml
-mqtt:
-  host: <ip of your mqtt server>
-```
-If not using mqtt:
 ```yaml
 mqtt:
   enabled: False
-```
-If using the Mosquitto Addon in Home Assistant, a username and password is required. For example:
-```yaml
-mqtt:
-  host: <ip of your mqtt server>
-  user: <username>
-  password: <password>
-```
-Frigate supports many configuration options for mqtt. See the [configuration reference](../configuration/index.md#full-configuration-reference) for more info.
-### Step 2: Configure detectors
-By default, Frigate will use a single CPU detector. If you have a USB Coral, you will need to add a detectors section to your config.
-```yaml
-mqtt:
-  host: <ip of your mqtt server>
-detectors:
-  coral:
-    type: edgetpu
-    device: usb
-```
-More details on available detectors can be found [here](../configuration/detectors.md).
-### Step 3: Add a minimal camera configuration
-Now let's add the first camera:
-```yaml
-mqtt:
-  host: <ip of your mqtt server>
-detectors:
-  coral:
-    type: edgetpu
-    device: usb
 cameras:
   camera_1: # <------ Name the camera
     ffmpeg:
       inputs:
-        - path: rtsp://10.0.10.10:554/rtsp # <----- Update for your camera
+        - path: rtsp://10.0.10.10:554/rtsp # <----- The stream you want to use for detection
           roles:
             - detect
     detect:
+      enabled: False # <---- disable detection until you have a working camera feed
       width: 1280 # <---- update for your camera's resolution
       height: 720 # <---- update for your camera's resolution
 ```
-### Step 4: Start Frigate
+### Step 2: Start Frigate
 At this point you should be able to start Frigate and see the video feed in the UI.
@@ -81,41 +34,48 @@ If you get an error image from the camera, this means ffmpeg was not able to get
 FFmpeg arguments for other types of cameras can be found [here](../configuration/camera_specific.md).
-### Step 5: Configure hardware acceleration (optional)
+### Step 3: Configure hardware acceleration (recommended)
 Now that you have a working camera configuration, you want to setup hardware acceleration to minimize the CPU required to decode your video streams. See the [hardware acceleration](../configuration/hardware_acceleration.md) config reference for examples applicable to your hardware.
-In order to best evaluate the performance impact of hardware acceleration, it is recommended to temporarily disable detection.
+Here is an example configuration with hardware acceleration configured for Intel processors with an integrated GPU using the [preset](../configuration/ffmpeg_presets.md):
 ```yaml
 mqtt: ...
-detectors: ...
-cameras:
-  camera_1:
-    ffmpeg: ...
-    detect:
-      enabled: False
-    ...
-```
-Here is an example configuration with hardware acceleration configured:
-```yaml
-mqtt: ...
-detectors: ...
 cameras:
   camera_1:
     ffmpeg:
       inputs: ...
-      hwaccel_args: -c:v h264_v4l2m2m
+      hwaccel_args: preset-vaapi
     detect: ...
 ```
-### Step 6: Setup motion masks
+### Step 4: Configure detectors
+By default, Frigate will use a single CPU detector. If you have a USB Coral, you will need to add a detectors section to your config.
+```yaml
+mqtt: ...
+detectors: # <---- add detectors
+  coral:
+    type: edgetpu
+    device: usb
+cameras:
+  camera_1:
+    ffmpeg: ...
+    detect:
+      enabled: True # <---- turn on detection
+    ...
+```
+More details on available detectors can be found [here](../configuration/detectors.md).
+Restart Frigate and you should start seeing detections for `person`. If you want to track other objects, they will need to be added according to the [configuration file reference](../configuration/index.md#full-configuration-reference).
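For example, tracking cars and dogs alongside the default `person` is just a matter of listing them under `objects`. A minimal sketch (the labels must exist in the detection model you are using):

```yaml
objects:
  track:
    - person
    - car
    - dog
```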
+### Step 5: Setup motion masks
 Now that you have optimized your configuration for decoding the video stream, you will want to check to see where to implement motion masks. To do this, navigate to the camera in the UI, select "Debug" at the top, and enable "Motion boxes" in the options below the video feed. Watch for areas that continuously trigger unwanted motion to be detected. Common areas to mask include camera timestamps and trees that frequently blow in the wind. The goal is to avoid wasting object detection cycles looking at these areas.
@@ -131,7 +91,7 @@ Your configuration should look similar to this now.
 ```yaml
 mqtt:
-  host: mqtt.local
+  enabled: False
 detectors:
   coral:
@@ -153,9 +113,13 @@ cameras:
       - 0,461,3,0,1919,0,1919,843,1699,492,1344,458,1346,336,973,317,869,375,866,432
 ```
-### Step 7: Enable recording (optional)
+### Step 6: Enable recording and/or snapshots
-To enable recording video, add the `record` role to a stream and enable it in the config.
+In order to see Events in the Frigate UI, either snapshots or record will need to be enabled.
+#### Record
+To enable recording video, add the `record` role to a stream and enable it in the config. If record is disabled in the config, turning it on via the UI will not have any effect.
 ```yaml
 mqtt: ...
@@ -169,7 +133,7 @@ cameras:
         - path: rtsp://10.0.10.10:554/rtsp
           roles:
             - detect
-        - path: rtsp://10.0.10.10:554/high_res_stream # <----- Add high res stream
+        - path: rtsp://10.0.10.10:554/high_res_stream # <----- Add stream you want to record from
          roles:
            - record
     detect: ...
@@ -182,9 +146,9 @@ If you don't have separate streams for detect and record, you would just add the
 By default, Frigate will retain video of all events for 10 days. The full set of options for recording can be found [here](../configuration/index.md#full-configuration-reference).
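To make that retention explicit rather than relying on defaults, the relevant knobs look roughly like this (the day counts here are illustrative, not the documented defaults for every option):

```yaml
record:
  enabled: True
  retain:
    days: 3 # continuous recordings kept for 3 days
    mode: motion
  events:
    retain:
      default: 10 # event recordings kept for 10 days
```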
-### Step 8: Enable snapshots (optional)
+#### Snapshots
-To enable snapshots of your events, just enable it in the config.
+To enable snapshots of your events, just enable it in the config. Snapshots are taken from the detect stream because it is the only stream decoded.
 ```yaml
 mqtt: ...
@@ -201,3 +165,10 @@ cameras:
 ```
 By default, Frigate will retain snapshots of all events for 10 days. The full set of options for snapshots can be found [here](../configuration/index.md#full-configuration-reference).
+### Step 7: Follow up guides
+Now that you have a working install, you can use the following guides for additional features:
+1. [Configuring go2rtc](configuring_go2rtc) - Additional live view options and RTSP relay
+2. [Home Assistant Integration](../integrations/home-assistant.md) - Integrate with Home Assistant
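Circling back to the snapshot step: the camera-level config elided by the hunk above should amount to little more than switching snapshots on, roughly like this sketch (camera name and retention value are illustrative, not the exact lines from the file):

```yaml
cameras:
  camera_1:
    ...
    snapshots:
      enabled: True
      retain:
        default: 10 # days to keep snapshots
```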

View File

@@ -1,6 +1,6 @@
 ---
 id: reverse_proxy
-title: Setting up a Reverse Proxy
+title: Setting up a reverse proxy
 ---
 This guide outlines the basic configuration steps needed to expose your Frigate UI to the internet.
@@ -8,6 +8,7 @@ A common way of accomplishing this is to use a reverse proxy webserver between y
 A reverse proxy accepts HTTP requests from the public internet and redirects them transparently to internal webserver(s) on your network.
 The suggested steps are:
 - **Configure** a 'proxy' HTTP webserver (such as [Apache2](https://httpd.apache.org/docs/current/) or [NPM](https://github.com/NginxProxyManager/nginx-proxy-manager)) and only expose ports 80/443 from this webserver to the internet
 - **Encrypt** content from the proxy webserver by installing SSL (such as with [Let's Encrypt](https://letsencrypt.org/)). Note that SSL is then not required on your Frigate webserver as the proxy encrypts all requests for you
+- **Restrict** access to your Frigate instance at the proxy using, for example, password authentication
@@ -31,6 +32,7 @@ On Debian Apache2 the configuration file will be named along the lines of `/etc/
 Make life easier for yourself by presenting your Frigate interface as a DNS sub-domain rather than as a sub-folder of your main domain.
 Here we access Frigate via https://cctv.mydomain.co.uk
 ```xml
 <VirtualHost *:443>
     ServerName cctv.mydomain.co.uk

View File

@@ -4,11 +4,11 @@ module.exports = {
       "frigate/index",
       "frigate/hardware",
       "frigate/installation",
+      "frigate/camera_setup",
     ],
     Guides: [
-      "guides/camera_setup",
       "guides/getting_started",
-      "guides/events_setup",
+      "guides/configuring_go2rtc",
       "guides/false_positives",
       "guides/ha_notifications",
       "guides/stationary_objects",

View File

@@ -117,6 +117,9 @@ class FrigateApp:
         if not "werkzeug" in self.config.logger.logs:
             logging.getLogger("werkzeug").setLevel("ERROR")
+        if not "ws4py" in self.config.logger.logs:
+            logging.getLogger("ws4py").setLevel("ERROR")
     def init_queues(self) -> None:
         # Queues for clip processing
         self.event_queue: Queue = mp.Queue()
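For context, `self.config.logger.logs` in the lines above is populated from the `logger` section of the user config, so anyone who actually wants ws4py output can opt back in there rather than patching the default. A minimal sketch (level values illustrative):

```yaml
logger:
  default: info
  logs:
    ws4py: debug
```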

View File

@@ -100,12 +100,9 @@ class RecordingMaintainer(threading.Thread):
         for camera in grouped_recordings.keys():
             segment_count = len(grouped_recordings[camera])
             if segment_count > keep_count:
-                retain_mode = self.config.cameras[camera].record.retain.mode
-                # this is only true when retain_mode is all. with other modes, segments are expected to age out.
-                if retain_mode == RetainModeEnum.all:
-                    logger.warning(
-                        f"Unable to keep up with recording segments in cache for {camera}. Keeping the {keep_count} most recent segments out of {segment_count} and discarding the rest..."
-                    )
+                logger.warning(
+                    f"Unable to keep up with recording segments in cache for {camera}. Keeping the {keep_count} most recent segments out of {segment_count} and discarding the rest..."
+                )
                 to_remove = grouped_recordings[camera][:-keep_count]
                 for f in to_remove:
                     cache_path = f["cache_path"]
@@ -223,6 +220,19 @@
                             cache_path,
                             record_mode,
                         )
+                    # if it doesn't overlap with an event, go ahead and drop the segment
+                    # if it ends more than the configured pre_capture for the camera
+                    else:
+                        pre_capture = self.config.cameras[
+                            camera
+                        ].record.events.pre_capture
+                        most_recently_processed_frame_time = self.recordings_info[
+                            camera
+                        ][-1][0]
+                        retain_cutoff = most_recently_processed_frame_time - pre_capture
+                        if end_time.timestamp() < retain_cutoff:
+                            Path(cache_path).unlink(missing_ok=True)
+                            self.end_time_cache.pop(cache_path, None)
                 # else retain days includes this segment
                 else:
                     record_mode = self.config.cameras[camera].record.retain.mode