Add ability to restream birdseye (#4761)

* Try using RTSP for restream

* Add ability to get snapshot of birdseye when birdseye restream is enabled

* Write to pipe instead of encoding mpeg1

* Write to cache instead

* Use const for location

* Formatting

* Add hardware encoding for birdseye based on ffmpeg preset

* Provide framerate

* Adjust args

* Fix order

* Delete pipe file if it exists

* Cleanup spacing

* Fix spacing
Author: Nicolas Mowen, 2022-12-31 07:54:10 -07:00 (committed by GitHub)
Parent: da1b7c2e28
Commit: ff56262c6e
11 changed files with 303 additions and 17 deletions

View File

@@ -356,6 +356,9 @@ restream:
   enabled: True
   # Optional: Force audio compatibility with browsers (default: shown below)
   force_audio: True
+  # Optional: Restream birdseye via RTSP (default: shown below)
+  # NOTE: Enabling this will set birdseye to run 24/7 which may increase CPU usage somewhat.
+  birdseye: False
   # Optional: jsmpeg stream configuration for WebUI
   jsmpeg:
     # Optional: Set the height of the jsmpeg stream. (default: 720)

View File

@@ -12,7 +12,7 @@ Live view options can be selected while viewing the live stream. The options are
 | Source | Latency | Frame Rate                            | Resolution     | Audio                        | Requires Restream | Other Limitations     |
 | ------ | ------- | ------------------------------------- | -------------- | ---------------------------- | ----------------- | --------------------- |
 | jsmpeg | low     | same as `detect -> fps`, capped at 10 | same as detect | no                           | no                | none                  |
-| mse    | low     | native                                | native         | yes (depends on audio codec) | yes               | none                  |
+| mse    | low     | native                                | native         | yes (depends on audio codec) | yes               | not supported on iOS  |
 | webrtc | lowest  | native                                | native         | yes (depends on audio codec) | yes               | requires extra config |
 
 ### WebRTC extra configuration:

View File

@@ -7,6 +7,14 @@ title: Restream
 
 Frigate can restream your video feed as an RTSP feed for other applications such as Home Assistant to utilize it at `rtsp://<frigate_host>:8554/<camera_name>`. Port 8554 must be open. [This allows you to use a video feed for detection in frigate and Home Assistant live view at the same time without having to make two separate connections to the camera](#reduce-connections-to-camera). The video feed is copied from the original video feed directly to avoid re-encoding. This feed does not include any annotation by Frigate.
 
+#### Force Audio
+
+Different live view technologies (ex: MSE, WebRTC) support different audio codecs. The `restream -> force_audio` flag tells the restream to make multiple streams available so that all live view technologies are supported. Some camera streams don't work well with this, in which case `restream -> force_audio` should be disabled.
+
+#### Birdseye Restream
+
+Birdseye RTSP restream can be enabled at `restream -> birdseye` and accessed at `rtsp://<frigate_host>:8554/birdseye`. Enabling the restream will cause birdseye to run 24/7 which may increase CPU usage somewhat.
+
 ### RTMP (Deprecated)
 
 In previous Frigate versions RTMP was used for re-streaming. RTMP has disadvantages however including being incompatible with H.265, high bitrates, and certain audio codecs. RTMP is deprecated and it is recommended to move to the new restream role.
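Any RTSP client can consume the restreamed birdseye feed. A minimal OpenCV sketch, assuming the default port 8554 and a hypothetical host name for the Frigate instance:

```python
import cv2

# Connect to the birdseye restream; adjust the host to your Frigate instance.
cap = cv2.VideoCapture("rtsp://frigate.local:8554/birdseye", cv2.CAP_FFMPEG)

while True:
    ok, frame = cap.read()  # frame is a BGR numpy array
    if not ok:
        break
    cv2.imshow("birdseye", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```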

View File

@@ -519,6 +519,7 @@ class RestreamConfig(FrigateBaseModel):
     force_audio: bool = Field(
         default=True, title="Force audio compatibility with the browser."
     )
+    birdseye: bool = Field(default=False, title="Restream the birdseye feed via RTSP.")
     jsmpeg: JsmpegStreamConfig = Field(
         default_factory=JsmpegStreamConfig, title="Jsmpeg Stream Configuration."
     )
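A quick sketch of how the new field behaves on the config model, assuming the remaining `RestreamConfig` fields keep the defaults shown in the docs so the model can be constructed bare:

```python
from frigate.config import RestreamConfig

# Birdseye restream is opt-in; it stays off unless explicitly enabled.
assert RestreamConfig().birdseye is False
assert RestreamConfig(birdseye=True).birdseye is True
```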

View File

@@ -1,6 +1,7 @@
 BASE_DIR = "/media/frigate"
 CLIPS_DIR = f"{BASE_DIR}/clips"
 RECORD_DIR = f"{BASE_DIR}/recordings"
+BIRDSEYE_PIPE = "/tmp/cache/birdseye"
 CACHE_DIR = "/tmp/cache"
 YAML_EXT = (".yaml", ".yml")
 PLUS_ENV_VAR = "PLUS_API_KEY"

View File

@@ -129,6 +129,107 @@ PRESETS_HW_ACCEL_SCALE = {
     ],
 }
 
+PRESETS_HW_ACCEL_ENCODE = {
+    "preset-intel-vaapi": [
+        "-c:v",
+        "h264_vaapi",
+        "-g",
+        "50",
+        "-bf",
+        "0",
+        "-profile:v",
+        "high",
+        "-level:v",
+        "4.1",
+        "-sei:v",
+        "0",
+    ],
+    "preset-intel-qsv-h264": [
+        "-c:v",
+        "h264_qsv",
+        "-g",
+        "50",
+        "-bf",
+        "0",
+        "-profile:v",
+        "high",
+        "-level:v",
+        "4.1",
+        "-async_depth:v",
+        "1",
+    ],
+    "preset-intel-qsv-h265": [
+        "-c:v",
+        "h264_qsv",
+        "-g",
+        "50",
+        "-bf",
+        "0",
+        "-profile:v",
+        "high",
+        "-level:v",
+        "4.1",
+        "-async_depth:v",
+        "1",
+    ],
+    "preset-amd-vaapi": [
+        "-c:v",
+        "h264_vaapi",
+        "-g",
+        "50",
+        "-bf",
+        "0",
+        "-profile:v",
+        "high",
+        "-level:v",
+        "4.1",
+        "-sei:v",
+        "0",
+    ],
+    "preset-nvidia-h264": [
+        "-c:v",
+        "h264_nvenc",
+        "-g",
+        "50",
+        "-profile:v",
+        "high",
+        "-level:v",
+        "auto",
+        "-preset:v",
+        "p2",
+        "-tune:v",
+        "ll",
+    ],
+    "preset-nvidia-h265": [
+        "-c:v",
+        "h264_nvenc",
+        "-g",
+        "50",
+        "-profile:v",
+        "high",
+        "-level:v",
+        "auto",
+        "-preset:v",
+        "p2",
+        "-tune:v",
+        "ll",
+    ],
+    "default": [
+        "-c:v",
+        "libx264",
+        "-g",
+        "50",
+        "-profile:v",
+        "high",
+        "-level:v",
+        "4.1",
+        "-preset:v",
+        "superfast",
+        "-tune:v",
+        "zerolatency",
+    ],
+}
+
 
 def parse_preset_hardware_acceleration_decode(arg: Any) -> list[str]:
     """Return the correct preset if in preset format otherwise return None."""
@@ -158,6 +259,14 @@ def parse_preset_hardware_acceleration_scale(
     return scale
 
 
+def parse_preset_hardware_acceleration_encode(arg: Any) -> list[str]:
+    """Return the correct scaling preset or default preset if none is set."""
+    if not isinstance(arg, str):
+        return PRESETS_HW_ACCEL_ENCODE["default"]
+
+    return PRESETS_HW_ACCEL_ENCODE.get(arg, PRESETS_HW_ACCEL_ENCODE["default"])
+
+
 PRESETS_INPUT = {
     "preset-http-jpeg-generic": _user_agent_args
     + [
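A short sketch of how the new helper resolves presets; anything that is not a recognized preset string falls back to the CPU `libx264` entry:

```python
from frigate.ffmpeg_presets import parse_preset_hardware_acceleration_encode

# No hwaccel configured: software encoding with libx264.
assert parse_preset_hardware_acceleration_encode(None)[:2] == ["-c:v", "libx264"]

# A recognized preset maps to the matching hardware encoder.
assert parse_preset_hardware_acceleration_encode("preset-nvidia-h264")[:2] == ["-c:v", "h264_nvenc"]

# Raw custom ffmpeg args are not presets, so they also get the default.
assert parse_preset_hardware_acceleration_encode("-hwaccel vaapi")[:2] == ["-c:v", "libx264"]
```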

View File

@@ -825,6 +825,24 @@ def latest_frame(camera_name):
 
         frame = cv2.resize(frame, dsize=(width, height), interpolation=cv2.INTER_AREA)
 
+        ret, jpg = cv2.imencode(
+            ".jpg", frame, [int(cv2.IMWRITE_JPEG_QUALITY), resize_quality]
+        )
+        response = make_response(jpg.tobytes())
+        response.headers["Content-Type"] = "image/jpeg"
+        response.headers["Cache-Control"] = "no-store"
+        return response
+    elif camera_name == "birdseye" and current_app.frigate_config.restream.birdseye:
+        frame = cv2.cvtColor(
+            current_app.detected_frames_processor.get_current_frame(camera_name),
+            cv2.COLOR_YUV2BGR_I420,
+        )
+
+        height = int(request.args.get("h", str(frame.shape[0])))
+        width = int(height * frame.shape[1] / frame.shape[0])
+
+        frame = cv2.resize(frame, dsize=(width, height), interpolation=cv2.INTER_AREA)
+
         ret, jpg = cv2.imencode(
             ".jpg", frame, [int(cv2.IMWRITE_JPEG_QUALITY), resize_quality]
         )
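With `restream -> birdseye` enabled, this handler lets the birdseye view be fetched like any camera's latest frame via Frigate's `/api/<camera_name>/latest.jpg` endpoint. A small example; the host and port are assumptions for a typical install:

```python
import requests

# "birdseye" is accepted as a camera name when the restream is on; the optional
# "h" query parameter scales the returned JPEG, as in the handler above.
resp = requests.get("http://frigate.local:5000/api/birdseye/latest.jpg", params={"h": 480})
resp.raise_for_status()

with open("birdseye.jpg", "wb") as f:
    f.write(resp.content)
```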

View File

@@ -880,6 +880,12 @@ class TrackedObjectProcessor(threading.Thread):
             return {}
 
     def get_current_frame(self, camera, draw_options={}):
+        if camera == "birdseye":
+            return self.frame_manager.get(
+                "birdseye",
+                (self.config.birdseye.height * 3 // 2, self.config.birdseye.width),
+            )
+
         return self.camera_states[camera].get_current_frame(draw_options)
 
     def get_current_frame_time(self, camera) -> int:
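The `height * 3 // 2` rows come from the I420 (YUV420) layout used for shared frames: a full-resolution Y plane plus two quarter-resolution chroma planes packed as extra rows. A quick check of the arithmetic, with dimensions chosen purely for illustration:

```python
import numpy as np

height, width = 720, 1280  # assumed birdseye dimensions
y_plane = height * width
uv_planes = 2 * (height // 2) * (width // 2)

# The three planes fit exactly into a (height * 3 // 2, width) uint8 array.
assert y_plane + uv_planes == (height * 3 // 2) * width
frame = np.zeros((height * 3 // 2, width), dtype=np.uint8)
assert frame.size == y_plane + uv_planes
```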

View File

@@ -3,6 +3,7 @@ import glob
 import logging
 import math
 import multiprocessing as mp
+import os
 import queue
 import signal
 import subprocess as sp
@@ -21,17 +22,56 @@ from ws4py.server.wsgiutils import WebSocketWSGIApplication
 from ws4py.websocket import WebSocket
 
 from frigate.config import BirdseyeModeEnum, FrigateConfig
-from frigate.const import BASE_DIR
+from frigate.const import BASE_DIR, BIRDSEYE_PIPE
 from frigate.util import SharedMemoryFrameManager, copy_yuv_to_position, get_yuv_crop
 
 logger = logging.getLogger(__name__)
 
 
 class FFMpegConverter:
-    def __init__(self, in_width, in_height, out_width, out_height, quality):
-        ffmpeg_cmd = f"ffmpeg -f rawvideo -pix_fmt yuv420p -video_size {in_width}x{in_height} -i pipe: -f mpegts -s {out_width}x{out_height} -codec:v mpeg1video -q {quality} -bf 0 pipe:".split(
-            " "
-        )
+    def __init__(
+        self,
+        in_width: int,
+        in_height: int,
+        out_width: int,
+        out_height: int,
+        quality: int,
+        birdseye_rtsp: bool = False,
+    ):
+        if birdseye_rtsp:
+            if os.path.exists(BIRDSEYE_PIPE):
+                os.remove(BIRDSEYE_PIPE)
+
+            os.mkfifo(BIRDSEYE_PIPE, mode=0o777)
+            stdin = os.open(BIRDSEYE_PIPE, os.O_RDONLY | os.O_NONBLOCK)
+            self.bd_pipe = os.open(BIRDSEYE_PIPE, os.O_WRONLY)
+            os.close(stdin)
+        else:
+            self.bd_pipe = None
+
+        ffmpeg_cmd = [
+            "ffmpeg",
+            "-f",
+            "rawvideo",
+            "-pix_fmt",
+            "yuv420p",
+            "-video_size",
+            f"{in_width}x{in_height}",
+            "-i",
+            "pipe:",
+            "-f",
+            "mpegts",
+            "-s",
+            f"{out_width}x{out_height}",
+            "-codec:v",
+            "mpeg1video",
+            "-q",
+            f"{quality}",
+            "-bf",
+            "0",
+            "pipe:",
+        ]
+
         self.process = sp.Popen(
             ffmpeg_cmd,
             stdout=sp.PIPE,
@@ -40,9 +80,16 @@ class FFMpegConverter:
             start_new_session=True,
         )
 
-    def write(self, b):
+    def write(self, b) -> None:
         self.process.stdin.write(b)
 
+        if self.bd_pipe:
+            try:
+                os.write(self.bd_pipe, b)
+            except BrokenPipeError:
+                # catch error when no one is listening
+                return
+
     def read(self, length):
         try:
             return self.process.stdout.read1(length)
@@ -50,6 +97,9 @@ class FFMpegConverter:
             return False
 
     def exit(self):
+        if self.bd_pipe:
+            os.close(self.bd_pipe)
+
         self.process.terminate()
         try:
             self.process.communicate(timeout=30)
@@ -88,7 +138,7 @@ class BroadcastThread(threading.Thread):
 
 
 class BirdsEyeFrameManager:
-    def __init__(self, config, frame_manager: SharedMemoryFrameManager):
+    def __init__(self, config: FrigateConfig, frame_manager: SharedMemoryFrameManager):
         self.config = config
         self.mode = config.birdseye.mode
         self.frame_manager = frame_manager
@@ -386,6 +436,7 @@ def output_frames(config: FrigateConfig, video_output_queue):
             config.birdseye.width,
             config.birdseye.height,
             config.birdseye.quality,
+            config.restream.birdseye,
         )
         broadcasters["birdseye"] = BroadcastThread(
             "birdseye", converters["birdseye"], websocket_server
@@ -398,6 +449,12 @@ def output_frames(config: FrigateConfig, video_output_queue):
 
     birdseye_manager = BirdsEyeFrameManager(config, frame_manager)
 
+    if config.restream.birdseye:
+        birdseye_buffer = frame_manager.create(
+            "birdseye",
+            birdseye_manager.yuv_shape[0] * birdseye_manager.yuv_shape[1],
+        )
+
     while not stop_event.is_set():
        try:
             (
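The buffer created above is the producer half of the shared-memory handoff: the output process copies each rendered birdseye frame into it, and `TrackedObjectProcessor.get_current_frame` (and therefore the snapshot endpoint) maps the same region back. A rough sketch of the pairing, with an assumed frame size:

```python
from frigate.util import SharedMemoryFrameManager

width, height = 1280, 720  # assumed birdseye dimensions
frame_manager = SharedMemoryFrameManager()

# Producer: allocate one flat I420 frame's worth of shared memory...
birdseye_buffer = frame_manager.create("birdseye", height * 3 // 2 * width)
# ...and copy frames into it inside the output loop: birdseye_buffer[:] = frame_bytes

# Consumer: view the same region with the shape used by get_current_frame().
view = frame_manager.get("birdseye", (height * 3 // 2, width))
```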
@@ -421,10 +478,12 @@ def output_frames(config: FrigateConfig, video_output_queue):
             # write to the converter for the camera if clients are listening to the specific camera
             converters[camera].write(frame.tobytes())
 
-        # update birdseye if websockets are connected
-        if config.birdseye.enabled and any(
-            ws.environ["PATH_INFO"].endswith("birdseye")
-            for ws in websocket_server.manager
+        if config.birdseye.enabled and (
+            config.restream.birdseye
+            or any(
+                ws.environ["PATH_INFO"].endswith("birdseye")
+                for ws in websocket_server.manager
+            )
         ):
             if birdseye_manager.update(
                 camera,
@@ -433,7 +492,12 @@ def output_frames(config: FrigateConfig, video_output_queue):
                 frame_time,
                 frame,
             ):
-                converters["birdseye"].write(birdseye_manager.frame.tobytes())
+                frame_bytes = birdseye_manager.frame.tobytes()
+
+                if config.restream.birdseye:
+                    birdseye_buffer[:] = frame_bytes
+
+                converters["birdseye"].write(frame_bytes)
+
         if camera in previous_frames:
             frame_manager.delete(f"{camera}{previous_frames[camera]}")
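The FIFO handling in the converter's constructor relies on a standard trick: open the pipe read-only and non-blocking first so the blocking write-side open can succeed before go2rtc's ffmpeg reader attaches. A standalone sketch of the same pattern, with the path mirroring `BIRDSEYE_PIPE`:

```python
import os

PIPE = "/tmp/cache/birdseye"

# Recreate the FIFO so a stale file from a previous run doesn't get in the way.
if os.path.exists(PIPE):
    os.remove(PIPE)
os.mkfifo(PIPE, mode=0o777)

# Opening write-only would normally block until a reader exists, so open a
# throwaway non-blocking reader first, grab the writer, then drop the reader.
tmp_reader = os.open(PIPE, os.O_RDONLY | os.O_NONBLOCK)
writer = os.open(PIPE, os.O_WRONLY)
os.close(tmp_reader)

# Later writes raise BrokenPipeError if no consumer is attached, which the
# converter's write() swallows so the jsmpeg path keeps working regardless.
```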

View File

@@ -6,6 +6,8 @@ import requests
 
 from frigate.util import escape_special_characters
 from frigate.config import FrigateConfig
+from frigate.const import BIRDSEYE_PIPE
+from frigate.ffmpeg_presets import parse_preset_hardware_acceleration_encode
 
 logger = logging.getLogger(__name__)
@@ -42,6 +44,11 @@ class RestreamApi:
                         escape_special_characters(input.path)
                     )
 
+        if self.config.restream.birdseye:
+            self.relays[
+                "birdseye"
+            ] = f"exec:ffmpeg -hide_banner -f rawvideo -pix_fmt yuv420p -video_size {self.config.birdseye.width}x{self.config.birdseye.height} -r 10 -i {BIRDSEYE_PIPE} {' '.join(parse_preset_hardware_acceleration_encode(self.config.ffmpeg.hwaccel_args))} -rtsp_transport tcp -f rtsp {{output}}"
+
         for name, path in self.relays.items():
             params = {"src": path, "name": name}
             requests.put("http://127.0.0.1:1984/api/streams", params=params)
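For illustration, with a 1280x720 birdseye (an assumed size) and no hardware acceleration configured, the registered relay expands to roughly `exec:ffmpeg -hide_banner -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -r 10 -i /tmp/cache/birdseye -c:v libx264 -g 50 -profile:v high -level:v 4.1 -preset:v superfast -tune:v zerolatency -rtsp_transport tcp -f rtsp {output}`; the go2rtc instance on port 1984 runs the command and substitutes `{output}` with its own RTSP publish address.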

View File

@@ -1,14 +1,83 @@
-import { h } from 'preact';
+import { h, Fragment } from 'preact';
+import { usePersistence } from '../context';
+import ActivityIndicator from '../components/ActivityIndicator';
 import JSMpegPlayer from '../components/JSMpegPlayer';
 import Heading from '../components/Heading';
+import WebRtcPlayer from '../components/WebRtcPlayer';
+import MsePlayer from '../components/MsePlayer';
+import useSWR from 'swr';
+import videojs from 'video.js';
 
 export default function Birdseye() {
+  const { data: config } = useSWR('config');
+
+  const [viewSource, setViewSource, sourceIsLoaded] = usePersistence('birdseye-source', 'mse');
+  const sourceValues = ['mse', 'webrtc', 'jsmpeg'];
+
+  if (!config || !sourceIsLoaded) {
+    return <ActivityIndicator />;
+  }
+
+  let player;
+
+  if (viewSource == 'mse' && config.restream.birdseye) {
+    if (videojs.browser.IS_IOS) {
+      player = (
+        <Fragment>
+          <div className="w-5xl text-center text-sm">
+            MSE is not supported on iOS devices. You'll need to use jsmpeg or webRTC. See the docs for more info.
+          </div>
+        </Fragment>
+      );
+    } else {
+      player = (
+        <Fragment>
+          <div className="max-w-5xl">
+            <MsePlayer camera="birdseye" />
+          </div>
+        </Fragment>
+      );
+    }
+  } else if (viewSource == 'webrtc' && config.restream.birdseye) {
+    player = (
+      <Fragment>
+        <div className="max-w-5xl">
+          <WebRtcPlayer camera="birdseye" />
+        </div>
+      </Fragment>
+    );
+  } else {
+    player = (
+      <Fragment>
+        <div className="max-w-7xl">
+          <JSMpegPlayer camera="birdseye" />
+        </div>
+      </Fragment>
+    );
+  }
+
   return (
     <div className="space-y-4 p-2 px-4">
-      <Heading size="2xl">Birdseye</Heading>
-      <div className="max-w-7xl">
-        <JSMpegPlayer camera="birdseye" />
+      <div className="flex justify-between">
+        <Heading className="p-2" size="2xl">
+          Birdseye
+        </Heading>
+        {config.restream.birdseye && (
+          <select
+            className="basis-1/8 cursor-pointer rounded dark:bg-slate-800"
+            value={viewSource}
+            onChange={(e) => setViewSource(e.target.value)}
+          >
+            {sourceValues.map((item) => (
+              <option key={item} value={item}>
+                {item}
+              </option>
+            ))}
+          </select>
+        )}
       </div>
+
+      {player}
     </div>
   );
 }