Improve live streaming (#16447)

* config file changes

* config migrator

* stream selection on single camera live view

* camera streaming settings dialog

* manage persistent group streaming settings

* apply streaming settings in camera groups

* add ability to clear all streaming settings from settings

* docs

* update reference config

* fixes

* clarify docs

* use first stream as default in dialog

* ensure still image is visible after switching stream type to none

* docs

* clarify docs

* add ability to continue playing stream in background

* fix props

* put stream selection inside dropdown on desktop

* add capabilities to live mode hook

* live context menu component

* resize observer: only return new dimensions if they've actually changed

* pass volume prop to players

* fix slider bug, https://github.com/shadcn-ui/ui/issues/1448

* update react-grid-layout

* prevent animated transitions on draggable grid layout

* add context menu to dashboards

* use provider

* streaming dialog from context menu

* docs

* add jsmpeg warning to context menu

* audio and two way talk indicators in single camera view

* add link to debug view

* don't use hook

* create manual events from live camera view

* maintain grow classes on grid items

* fix initial volume state on default dashboard

* fix pointer events causing context menu to end up underneath image on iOS

* mobile drawer tweaks

* stream stats

* show settings menu for non-restreamed cameras

* consistent settings icon

* tweaks

* optional stats to fix birdseye player

* add toaster to live camera view

* fix crash on initial save in streaming dialog

* don't require restreaming for context menu streaming settings

* add debug view to context menu

* stats fixes

* update docs

* always show stream info when restreamed

* update camera streaming dialog

* make note of no h265 support for webrtc

* docs clarity

* ensure docs show streams as a dict

* docs clarity

* fix css file

* tweaks
Josh Hawkins, 2025-02-10 10:42:35 -06:00, committed by GitHub
commit dd7820e4ee (parent 2a28964e63)
31 changed files with 2681 additions and 219 deletions


@@ -3,9 +3,9 @@ id: live

title: Live View
---

Frigate intelligently displays your camera streams on the Live view dashboard. By default, Frigate employs "smart streaming" where camera images update once per minute when no detectable activity is occurring to conserve bandwidth and resources. As soon as any motion or active objects are detected, cameras seamlessly switch to a live stream.

### Live View technologies

Frigate intelligently uses three different streaming technologies to display your camera streams on the dashboard and the single camera view, switching between available modes based on network bandwidth, player errors, or required features like two-way talk. The highest quality and fluency of the Live view requires the bundled `go2rtc` to be configured as shown in the [step by step guide](/guides/configuring_go2rtc).
@@ -51,19 +51,32 @@ go2rtc:

```yaml
      - ffmpeg:rtsp://192.168.1.5:554/live0#video=copy
```
### Setting Streams For Live UI

You can configure Frigate to allow manual selection of the stream you want to view in the Live UI. For example, you may want to view your camera's substream on mobile devices, but the full resolution stream on desktop devices. Setting the `live -> streams` list will populate a dropdown in the UI's Live view that allows you to choose between the streams. This stream setting is _per device_ and is saved in your browser's local storage.

Additionally, when creating and editing camera groups in the UI, you can choose the stream you want to use for your camera group's Live dashboard.

:::note

Frigate's default dashboard ("All Cameras") will always use the first entry you've defined in `streams:` when playing live streams from your cameras.

:::

Configure the `streams` option with a "friendly name" for your stream followed by the go2rtc stream name.

Using Frigate's internal version of go2rtc is required to use this feature. You cannot specify paths in the `streams` configuration, only go2rtc stream names.
```yaml
go2rtc:
  streams:
    test_cam:
      - rtsp://192.168.1.5:554/live_main # <- stream which supports video & aac audio.
      - "ffmpeg:test_cam#audio=opus" # <- copy of the stream which transcodes audio to opus for webrtc
    test_cam_sub:
      - rtsp://192.168.1.5:554/live_sub # <- stream which supports video & aac audio.
    test_cam_another_sub:
      - rtsp://192.168.1.5:554/live_alt # <- stream which supports video & aac audio.

cameras:
  test_cam:
```

@@ -80,7 +93,10 @@ cameras:

```yaml
          roles:
            - detect
    live:
      streams: # <--- Multiple streams for Frigate 0.16 and later
        Main Stream: test_cam # <--- Specify a "friendly name" followed by the go2rtc stream name
        Sub Stream: test_cam_sub
        Special Stream: test_cam_another_sub
```
### WebRTC extra configuration:

@@ -101,6 +117,7 @@ WebRTC works by creating a TCP or UDP connection on port `8555`. However, it req

- For access through Tailscale, the Frigate system's Tailscale IP must be added as a WebRTC candidate. Tailscale IPs all start with `100.`, and are reserved within the `100.64.0.0/10` CIDR block.
- Note that WebRTC does not support H.265.

:::tip
@@ -148,3 +165,50 @@ For devices that support two way talk, Frigate can be configured to use the feat

- For the Home Assistant Frigate card, [follow the docs](https://github.com/dermotduffy/frigate-hass-card?tab=readme-ov-file#using-2-way-audio) for the correct source.

To use the Reolink Doorbell with two way talk, you should use the [recommended Reolink configuration](/configuration/camera_specific#reolink-doorbell).
### Streaming options on camera group dashboards
Frigate provides a dialog in the Camera Group Edit pane with several options for streaming on a camera group's dashboard. These settings are _per device_ and are saved in your device's local storage.
- Stream selection using the `live -> streams` configuration option (see _Setting Streams For Live UI_ above)
- Streaming type:
- _No streaming_: Camera images will only update once per minute and no live streaming will occur.
- _Smart Streaming_ (default, recommended setting): Smart streaming will update your camera image once per minute when no detectable activity is occurring to conserve bandwidth and resources, since a static picture is the same as a streaming image with no motion or objects. When motion or objects are detected, the image seamlessly switches to a live stream.
- _Continuous Streaming_: Camera image will always be a live stream when visible on the dashboard, even if no activity is being detected. Continuous streaming may cause high bandwidth usage and performance issues. **Use with caution.**
- _Compatibility mode_: Enable this option only if your camera's live stream is displaying color artifacts and has a diagonal line on the right side of the image. Before enabling this, try setting your camera's `detect` width and height to a standard aspect ratio (for example: 640x352 becomes 640x360, 800x443 becomes 800x450, and 2688x1520 becomes 2688x1512). Depending on your browser and device, more than a few cameras in compatibility mode may not be supported, so only use this option if changing your config fails to resolve the color artifacts and diagonal line.
:::note
The default dashboard ("All Cameras") will always use Smart Streaming and the first entry set in your `streams` configuration, if defined. Use a camera group if you want to change any of these settings from the defaults.
:::
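For the aspect-ratio adjustment suggested above, the closest standard resolution can be found by keeping the width fixed and recomputing the height for each common ratio. This helper is purely illustrative and not part of Frigate:

```python
# Illustrative helper (not part of Frigate): snap a detect resolution
# to the nearest standard aspect ratio, keeping the width fixed.
STANDARD_RATIOS = [(4, 3), (16, 9), (32, 9)]

def nearest_standard_resolution(width: int, height: int) -> tuple[int, int]:
    best = None
    for ratio_w, ratio_h in STANDARD_RATIOS:
        # height the camera would need for this ratio at the same width
        target_height = round(width * ratio_h / ratio_w)
        diff = abs(target_height - height)
        if best is None or diff < best[0]:
            best = (diff, target_height)
    return width, best[1]

print(nearest_standard_resolution(640, 352))    # -> (640, 360)
print(nearest_standard_resolution(800, 443))    # -> (800, 450)
print(nearest_standard_resolution(2688, 1520))  # -> (2688, 1512)
```

These match the examples given in the docs above.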
## Live view FAQ
1. Why don't I have audio in my Live view?
You must use go2rtc to hear audio in your live streams. If you have go2rtc already configured, you need to ensure your camera is sending PCMA/PCMU or AAC audio. If you can't change your camera's audio codec, you need to [transcode the audio](https://github.com/AlexxIT/go2rtc?tab=readme-ov-file#source-ffmpeg) using go2rtc.
Note that the low bandwidth mode player is a video-only stream. You should not expect to hear audio when in low bandwidth mode, even if you've set up go2rtc.
2. Frigate shows that my live stream is in "low bandwidth mode". What does this mean?
Frigate intelligently selects the live streaming technology based on a number of factors (user-selected modes like two-way talk, camera settings, browser capabilities, available bandwidth) and prioritizes showing an actual up-to-date live view of your camera's stream as quickly as possible.
When you have go2rtc configured, Live view initially attempts to load and play back your stream with a clearer, fluent stream technology (MSE). An initial timeout, a low bandwidth condition that would cause buffering of the stream, or decoding errors in the stream will cause Frigate to switch to the stream defined by the `detect` role, using the jsmpeg format. This is what the UI labels as "low bandwidth mode". On Live dashboards, the mode will automatically reset when smart streaming is configured and activity stops. You can also try using the _Reset_ button to force a reload of your stream.
If you are still experiencing Frigate falling back to low bandwidth mode, you may need to adjust your camera's settings per the recommendations above or ensure you have enough bandwidth available.
3. It doesn't seem like my cameras are streaming on the Live dashboard. Why?
On the default Live dashboard ("All Cameras"), your camera images will update once per minute when no detectable activity is occurring to conserve bandwidth and resources. As soon as any activity is detected, cameras seamlessly switch to a full-resolution live stream. If you want to customize this behavior, use a camera group.
4. I see a strange diagonal line on my live view, but my recordings look fine. How can I fix it?
This is caused by incorrect dimensions set in your detect width or height (or incorrectly auto-detected), causing the jsmpeg player's rendering engine to display a slightly distorted image. You should adjust the width and height of your `detect` resolution to a standard aspect ratio (for example: 640x352 becomes 640x360, 800x443 becomes 800x450, and 2688x1520 becomes 2688x1512). If changing the resolution to match a standard aspect ratio (4:3, 16:9, 32:9, etc.) does not solve the issue, you can enable "compatibility mode" in your camera group dashboard's stream settings. Depending on your browser and device, more than a few cameras in compatibility mode may not be supported, so only use this option if changing your `detect` width and height fails to resolve the color artifacts and diagonal line.
5. How does "smart streaming" work?
Because a static image of a scene looks exactly the same as a live stream with no motion or activity, smart streaming updates your camera images once per minute when no detectable activity is occurring to conserve bandwidth and resources. As soon as any activity (motion or object/audio detection) occurs, cameras seamlessly switch to a live stream.
This static image is pulled from the stream defined in your config with the `detect` role. When activity is detected, images from the `detect` stream immediately begin updating at ~5 frames per second so you can see the activity until the live player is loaded and begins playing. This usually only takes a second or two. If the live player times out, buffers, or has streaming errors, the jsmpeg player is loaded and plays a video-only stream from the `detect` role. When activity ends, the players are destroyed and a static image is displayed until activity is detected again, and the process repeats.
This is Frigate's default and recommended setting because it results in a significant bandwidth savings, especially for high resolution cameras.
6. I have unmuted some cameras on my dashboard, but I do not hear sound. Why?
If your camera is streaming (as indicated by a red dot in the upper right, or if it has been set to continuous streaming mode), your browser may be blocking audio until you interact with the page. This is an intentional browser limitation. See [this article](https://developer.mozilla.org/en-US/docs/Web/Media/Autoplay_guide#autoplay_availability). Many browsers have a whitelist feature to change this behavior.
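The fallback behavior described in questions 2 and 5 can be summarized as a small state machine. This is an illustrative sketch of the documented behavior only, not Frigate's actual player code:

```python
from enum import Enum

class PlayerMode(Enum):
    STATIC = "static"          # snapshot updated once per minute
    LIVE = "live"              # MSE/WebRTC live player
    LOW_BANDWIDTH = "jsmpeg"   # video-only fallback from the detect stream

def next_mode(mode: PlayerMode, activity: bool, player_error: bool = False) -> PlayerMode:
    """Return the next player mode given current activity and player health."""
    if not activity:
        # players are destroyed and a static image is shown until activity resumes
        return PlayerMode.STATIC
    if mode is PlayerMode.STATIC:
        # activity starts: attempt the full live player first
        return PlayerMode.LIVE
    if mode is PlayerMode.LIVE and player_error:
        # timeout, buffering, or decode errors fall back to jsmpeg
        return PlayerMode.LOW_BANDWIDTH
    return mode
```

Note how low bandwidth mode only resets once activity stops (or via the _Reset_ button), matching the behavior described above.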


@@ -572,10 +572,12 @@ go2rtc:

```yaml
# Optional: Live stream configuration for WebUI.
# NOTE: Can be overridden at the camera level
live:
  # Optional: Set the streams configured in go2rtc
  # that should be used for live view in frigate WebUI. (default: name of camera)
  # NOTE: In most cases this should be set at the camera level only.
  streams:
    main_stream: main_stream_name
    sub_stream: sub_stream_name
  # Optional: Set the height of the jsmpeg stream. (default: 720)
  # This must be less than or equal to the height of the detect stream. Lower resolutions
  # reduce bandwidth required for viewing the jsmpeg stream. Width is computed to match known aspect ratio.
```


@@ -1,3 +1,5 @@

```python
from typing import Dict

from pydantic import Field

from ..base import FrigateBaseModel
```

@@ -6,6 +8,9 @@ __all__ = ["CameraLiveConfig"]

```python
class CameraLiveConfig(FrigateBaseModel):
    streams: Dict[str, str] = Field(
        default_factory=dict,
        title="Friendly names and restream names to use for live view.",
    )
    height: int = Field(default=720, title="Live camera view height")
    quality: int = Field(default=8, ge=1, le=31, title="Live camera view quality")
```


@@ -199,17 +199,18 @@ def verify_config_roles(camera_config: CameraConfig) -> None:

```python
    )


def verify_valid_live_stream_names(
    frigate_config: FrigateConfig, camera_config: CameraConfig
) -> ValueError | None:
    """Verify that a restream exists to use for live view."""
    for _, stream_name in camera_config.live.streams.items():
        if (
            stream_name
            not in frigate_config.go2rtc.model_dump().get("streams", {}).keys()
        ):
            return ValueError(
                f"No restream with name {stream_name} exists for camera {camera_config.name}."
            )


def verify_recording_retention(camera_config: CameraConfig) -> None:
```

@@ -586,15 +587,15 @@ class FrigateConfig(FrigateBaseModel):

```python
            zone.generate_contour(camera_config.frame_shape)

        # Set live view stream if none is set
        if not camera_config.live.streams:
            camera_config.live.streams = {name: name}

        # generate the ffmpeg commands
        camera_config.create_ffmpeg_cmds()

        self.cameras[name] = camera_config

        verify_config_roles(camera_config)
        verify_valid_live_stream_names(self, camera_config)
        verify_recording_retention(camera_config)
        verify_recording_segments_setup_with_reasonable_time(camera_config)
        verify_zone_objects_are_tracked(camera_config)
```
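The stream-name check above can be exercised outside Frigate with plain dicts standing in for the pydantic models. This is a simplified, standalone sketch; the function name mirrors the one above, but nothing here imports Frigate:

```python
def verify_valid_live_stream_names(go2rtc_streams: dict, camera_name: str, live_streams: dict):
    """Return an error message if any live stream references a missing go2rtc restream."""
    # live_streams maps friendly names to go2rtc stream names, as in live -> streams
    for stream_name in live_streams.values():
        if stream_name not in go2rtc_streams:
            return f"No restream with name {stream_name} exists for camera {camera_name}."
    return None

go2rtc = {"test_cam": [], "test_cam_sub": []}
assert verify_valid_live_stream_names(go2rtc, "test_cam", {"Main Stream": "test_cam"}) is None
print(verify_valid_live_stream_names(go2rtc, "test_cam", {"Alt": "missing_stream"}))
# -> No restream with name missing_stream exists for camera test_cam.
```

Every value in the `streams` dict must name a go2rtc stream, which is why the docs note that paths are not allowed there.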


@@ -13,7 +13,7 @@ from frigate.util.services import get_video_properties

```python
logger = logging.getLogger(__name__)

CURRENT_CONFIG_VERSION = "0.16-0"

DEFAULT_CONFIG_FILE = "/config/config.yml"
```

@@ -84,6 +84,13 @@ def migrate_frigate_config(config_file: str):

```python
            yaml.dump(new_config, f)
        previous_version = "0.15-1"

    if previous_version < "0.16-0":
        logger.info(f"Migrating frigate config from {previous_version} to 0.16-0...")
        new_config = migrate_016_0(config)
        with open(config_file, "w") as f:
            yaml.dump(new_config, f)
        previous_version = "0.16-0"

    logger.info("Finished frigate config migration...")
```

@@ -289,6 +296,29 @@ def migrate_015_1(config: dict[str, dict[str, any]]) -> dict[str, dict[str, any]

```python
    return new_config


def migrate_016_0(config: dict[str, dict[str, any]]) -> dict[str, dict[str, any]]:
    """Handle migrating frigate config to 0.16-0"""
    new_config = config.copy()

    for name, camera in config.get("cameras", {}).items():
        camera_config: dict[str, dict[str, any]] = camera.copy()
        live_config = camera_config.get("live", {})

        if "stream_name" in live_config:
            # Migrate from live -> stream_name to live -> streams dict
            stream_name = live_config["stream_name"]
            live_config["streams"] = {stream_name: stream_name}
            del live_config["stream_name"]
            camera_config["live"] = live_config

        new_config["cameras"][name] = camera_config

    new_config["version"] = "0.16-0"

    return new_config


def get_relative_coordinates(
    mask: Optional[Union[str, list]], frame_shape: tuple[int, int]
) -> Union[str, list]:
```
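To see the `stream_name` to `streams` transformation in action, here is a standalone copy of the migration logic applied to a toy config (no Frigate imports or YAML I/O involved):

```python
def migrate_016_0(config: dict) -> dict:
    """Move each camera's live.stream_name into the 0.16-style live.streams dict."""
    new_config = config.copy()
    for name, camera in config.get("cameras", {}).items():
        camera_config = camera.copy()
        live_config = camera_config.get("live", {})
        if "stream_name" in live_config:
            # old single stream becomes a one-entry dict: friendly name == stream name
            stream_name = live_config["stream_name"]
            live_config["streams"] = {stream_name: stream_name}
            del live_config["stream_name"]
            camera_config["live"] = live_config
        new_config["cameras"][name] = camera_config
    new_config["version"] = "0.16-0"
    return new_config

old = {"version": "0.15-1", "cameras": {"front": {"live": {"stream_name": "front_sub"}}}}
migrated = migrate_016_0(old)
print(migrated["cameras"]["front"]["live"])  # -> {'streams': {'front_sub': 'front_sub'}}
```

The migrated key doubles as the friendly name, which is why upgraded configs show the raw stream name in the UI dropdown until you rename it.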

web/package-lock.json (generated)

@@ -54,7 +54,7 @@
    "react-day-picker": "^8.10.1",
    "react-device-detect": "^2.2.3",
    "react-dom": "^18.3.1",
    "react-grid-layout": "^1.5.0",
    "react-hook-form": "^7.52.1",
    "react-icons": "^5.2.1",
    "react-konva": "^18.2.10",
@@ -5120,7 +5120,8 @@
    "node_modules/fast-equals": {
      "version": "4.0.3",
      "resolved": "https://registry.npmjs.org/fast-equals/-/fast-equals-4.0.3.tgz",
      "integrity": "sha512-G3BSX9cfKttjr+2o1O22tYMLq0DPluZnYtq1rXumE1SpL/F/SLIfHx08WYQoWSIpeMYf8sRbJ8++71+v6Pnxfg==",
      "license": "MIT"
    },
    "node_modules/fast-glob": {
      "version": "3.3.2",
@@ -7275,9 +7276,10 @@
      }
    },
    "node_modules/react-grid-layout": {
      "version": "1.5.0",
      "resolved": "https://registry.npmjs.org/react-grid-layout/-/react-grid-layout-1.5.0.tgz",
      "integrity": "sha512-WBKX7w/LsTfI99WskSu6nX2nbJAUD7GD6nIXcwYLyPpnslojtmql2oD3I2g5C3AK8hrxIarYT8awhuDIp7iQ5w==",
      "license": "MIT",
      "dependencies": {
        "clsx": "^2.0.0",
        "fast-equals": "^4.0.3",
@@ -7624,7 +7626,8 @@
    "node_modules/resize-observer-polyfill": {
      "version": "1.5.1",
      "resolved": "https://registry.npmjs.org/resize-observer-polyfill/-/resize-observer-polyfill-1.5.1.tgz",
      "integrity": "sha512-LwZrotdHOo12nQuZlHEmtuXdqGoOD0OhaxopaNFxWzInpEgaLWoVuAMbTzixuosCx2nEG58ngzW3vxdWoxIgdg==",
      "license": "MIT"
    },
    "node_modules/resolve": {
      "version": "1.22.8",


@@ -60,7 +60,7 @@
    "react-day-picker": "^8.10.1",
    "react-device-detect": "^2.2.3",
    "react-dom": "^18.3.1",
    "react-grid-layout": "^1.5.0",
    "react-hook-form": "^7.52.1",
    "react-icons": "^5.2.1",
    "react-konva": "^18.2.10",


@@ -40,9 +40,9 @@ export default function CameraFeatureToggle({

```tsx
    <div
      onClick={onClick}
      className={cn(
        "flex flex-col items-center justify-center",
        variants[variant][isActive ? "active" : "inactive"],
        className,
      )}
    >
      <Icon
```


@@ -1,4 +1,9 @@

```tsx
import {
  AllGroupsStreamingSettings,
  CameraGroupConfig,
  FrigateConfig,
  GroupStreamingSettings,
} from "@/types/frigateConfig";
import { isDesktop, isMobile } from "react-device-detect";
import useSWR from "swr";
import { MdHome } from "react-icons/md";
```
@@ -43,7 +48,6 @@ import {

```tsx
  AlertDialogTitle,
} from "../ui/alert-dialog";
import axios from "axios";
import { HiOutlineDotsVertical, HiTrash } from "react-icons/hi";
import IconWrapper from "../ui/icon-wrapper";
import { zodResolver } from "@hookform/resolvers/zod";
```
@@ -66,6 +70,11 @@ import {

```tsx
  MobilePageHeader,
  MobilePageTitle,
} from "../mobile/MobilePage";
import { Label } from "../ui/label";
import { Switch } from "../ui/switch";
import { CameraStreamingDialog } from "../settings/CameraStreamingDialog";
import { DialogTrigger } from "@radix-ui/react-dialog";
import { useStreamingSettings } from "@/context/streaming-settings-provider";

type CameraGroupSelectorProps = {
  className?: string;
```
@@ -607,6 +616,16 @@ export function CameraGroupEdit({

```tsx
  const { data: config, mutate: updateConfig } =
    useSWR<FrigateConfig>("config");

  const { allGroupsStreamingSettings, setAllGroupsStreamingSettings } =
    useStreamingSettings();

  const [groupStreamingSettings, setGroupStreamingSettings] =
    useState<GroupStreamingSettings>(
      allGroupsStreamingSettings[editingGroup?.[0] ?? ""],
    );

  const [openCamera, setOpenCamera] = useState<string | null>();

  const birdseyeConfig = useMemo(() => config?.birdseye, [config]);

  const formSchema = z.object({
```
@@ -656,6 +675,16 @@

```tsx
      setIsLoading(true);

      // update streaming settings
      const updatedSettings: AllGroupsStreamingSettings = {
        ...Object.fromEntries(
          Object.entries(allGroupsStreamingSettings || {}).filter(
            ([key]) => key !== editingGroup?.[0],
          ),
        ),
        [values.name]: groupStreamingSettings,
      };

      let renamingQuery = "";
      if (editingGroup && editingGroup[0] !== values.name) {
        renamingQuery = `camera_groups.${editingGroup[0]}&`;
```
@@ -679,7 +708,7 @@

```tsx
            requires_restart: 0,
          },
        )
        .then(async (res) => {
          if (res.status === 200) {
            toast.success(`Camera group (${values.name}) has been saved.`, {
              position: "top-center",
```
@@ -688,6 +717,7 @@

```tsx
            if (onSave) {
              onSave();
            }
            setAllGroupsStreamingSettings(updatedSettings);
          } else {
            toast.error(`Failed to save config changes: ${res.statusText}`, {
              position: "top-center",
```
@@ -704,7 +734,16 @@

```tsx
          setIsLoading(false);
        });
    },
    [
      currentGroups,
      setIsLoading,
      onSave,
      updateConfig,
      editingGroup,
      groupStreamingSettings,
      allGroupsStreamingSettings,
      setAllGroupsStreamingSettings,
    ],
  );

  const form = useForm<z.infer<typeof formSchema>>({
```
@@ -762,16 +801,66 @@

```tsx
                  ),
                ].map((camera) => (
                  <FormControl key={camera}>
                    <div className="flex items-center justify-between gap-1">
                      <Label
                        className="mx-2 w-full cursor-pointer capitalize text-primary"
                        htmlFor={camera.replaceAll("_", " ")}
                      >
                        {camera.replaceAll("_", " ")}
                      </Label>
                      <div className="flex items-center gap-x-2">
                        {camera !== "birdseye" && (
                          <Dialog
                            open={openCamera === camera}
                            onOpenChange={(isOpen) =>
                              setOpenCamera(isOpen ? camera : null)
                            }
                          >
                            <DialogTrigger asChild>
                              <Button
                                className="flex h-auto items-center gap-1"
                                aria-label="Camera streaming settings"
                                size="icon"
                                variant="ghost"
                                disabled={
                                  !(field.value && field.value.includes(camera))
                                }
                              >
                                <LuIcons.LuSettings
                                  className={cn(
                                    field.value && field.value.includes(camera)
                                      ? "text-primary"
                                      : "text-muted-foreground",
                                    "size-5",
                                  )}
                                />
                              </Button>
                            </DialogTrigger>
                            <CameraStreamingDialog
                              camera={camera}
                              groupStreamingSettings={groupStreamingSettings}
                              setGroupStreamingSettings={
                                setGroupStreamingSettings
                              }
                              setIsDialogOpen={(isOpen) =>
                                setOpenCamera(isOpen ? camera : null)
                              }
                            />
                          </Dialog>
                        )}
                        <Switch
                          id={camera.replaceAll("_", " ")}
                          checked={field.value && field.value.includes(camera)}
                          onCheckedChange={(checked) => {
                            const updatedCameras = checked
                              ? [...(field.value || []), camera]
                              : (field.value || []).filter((c) => c !== camera);
                            form.setValue("cameras", updatedCameras);
                          }}
                        />
                      </div>
                    </div>
                  </FormControl>
                ))}
              </FormItem>
```


@@ -0,0 +1,302 @@

```tsx
import {
  ReactNode,
  useCallback,
  useEffect,
  useMemo,
  useRef,
  useState,
} from "react";
import {
  ContextMenu,
  ContextMenuContent,
  ContextMenuItem,
  ContextMenuSeparator,
  ContextMenuTrigger,
} from "@/components/ui/context-menu";
import {
  MdVolumeDown,
  MdVolumeMute,
  MdVolumeOff,
  MdVolumeUp,
} from "react-icons/md";
import { Dialog } from "@/components/ui/dialog";
import { VolumeSlider } from "@/components/ui/slider";
import { CameraStreamingDialog } from "../settings/CameraStreamingDialog";
import {
  AllGroupsStreamingSettings,
  GroupStreamingSettings,
} from "@/types/frigateConfig";
import { useStreamingSettings } from "@/context/streaming-settings-provider";
import { IoIosWarning } from "react-icons/io";
import { cn } from "@/lib/utils";
import { useNavigate } from "react-router-dom";

type LiveContextMenuProps = {
  className?: string;
  camera: string;
  streamName: string;
  cameraGroup?: string;
  preferredLiveMode: string;
  isRestreamed: boolean;
  supportsAudio: boolean;
  audioState: boolean;
  toggleAudio: () => void;
  volumeState?: number;
  setVolumeState: (volumeState: number) => void;
  muteAll: () => void;
  unmuteAll: () => void;
  statsState: boolean;
  toggleStats: () => void;
  resetPreferredLiveMode: () => void;
  children?: ReactNode;
};

export default function LiveContextMenu({
  className,
  camera,
  streamName,
  cameraGroup,
  preferredLiveMode,
  isRestreamed,
  supportsAudio,
  audioState,
  toggleAudio,
  volumeState,
  setVolumeState,
  muteAll,
  unmuteAll,
  statsState,
  toggleStats,
  resetPreferredLiveMode,
  children,
}: LiveContextMenuProps) {
  const [showSettings, setShowSettings] = useState(false);

  // streaming settings
  const { allGroupsStreamingSettings, setAllGroupsStreamingSettings } =
    useStreamingSettings();

  const [groupStreamingSettings, setGroupStreamingSettings] =
    useState<GroupStreamingSettings>(
      allGroupsStreamingSettings[cameraGroup ?? ""],
    );

  useEffect(() => {
    if (cameraGroup) {
      setGroupStreamingSettings(allGroupsStreamingSettings[cameraGroup]);
    }
    // set individual group when all groups changes
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [allGroupsStreamingSettings]);

  const onSave = useCallback(
    (settings: GroupStreamingSettings) => {
      if (!cameraGroup || !allGroupsStreamingSettings) {
        return;
      }
      const updatedSettings: AllGroupsStreamingSettings = {
        ...Object.fromEntries(
          Object.entries(allGroupsStreamingSettings || {}).filter(
            ([key]) => key !== cameraGroup,
          ),
        ),
        [cameraGroup]: {
          ...Object.fromEntries(
            Object.entries(settings).map(([cameraName, cameraSettings]) => [
              cameraName,
              cameraName === camera
                ? {
                    ...cameraSettings,
                    playAudio: audioState ?? cameraSettings.playAudio ?? false,
                    volume: volumeState ?? cameraSettings.volume ?? 1,
                  }
                : cameraSettings,
            ]),
          ),
          // Add the current camera if it doesn't exist
          ...(!settings[camera]
            ? {
                [camera]: {
                  streamName: streamName,
                  streamType: "smart",
                  compatibilityMode: false,
                  playAudio: audioState,
                  volume: volumeState ?? 1,
                },
              }
            : {}),
        },
      };
      setAllGroupsStreamingSettings?.(updatedSettings);
    },
    [
      camera,
      streamName,
      cameraGroup,
      allGroupsStreamingSettings,
      setAllGroupsStreamingSettings,
      audioState,
      volumeState,
    ],
  );

  // ui
  const audioControlsUsed = useRef(false);

  const VolumeIcon = useMemo(() => {
    if (!volumeState || volumeState == 0.0 || !audioState) {
      return MdVolumeOff;
    } else if (volumeState <= 0.33) {
      return MdVolumeMute;
    } else if (volumeState <= 0.67) {
      return MdVolumeDown;
    } else {
      return MdVolumeUp;
    }
    // only update when specific fields change
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [volumeState, audioState]);

  const handleVolumeIconClick = (e: React.MouseEvent) => {
    e.stopPropagation();
    audioControlsUsed.current = true;
    toggleAudio();
  };

  const handleVolumeChange = (value: number[]) => {
    audioControlsUsed.current = true;
    setVolumeState(value[0]);
  };

  const handleOpenChange = (open: boolean) => {
    if (!open && audioControlsUsed.current) {
      onSave(groupStreamingSettings);
      audioControlsUsed.current = false;
    }
  };

  // navigate for debug view
  const navigate = useNavigate();

  return (
    <div className={cn("w-full", className)}>
      <ContextMenu key={camera} onOpenChange={handleOpenChange}>
        <ContextMenuTrigger>{children}</ContextMenuTrigger>
        <ContextMenuContent>
          <div className="flex flex-col items-start gap-1 py-1 pl-2">
            <div className="text-md capitalize text-primary-variant">
              {camera.replaceAll("_", " ")}
            </div>
            {preferredLiveMode == "jsmpeg" && isRestreamed && (
              <div className="flex flex-row items-center gap-1">
                <IoIosWarning className="mr-1 size-4 text-danger" />
                <p className="mr-2 text-xs">Low-bandwidth mode</p>
```
</div>
)}
</div>
{preferredLiveMode != "jsmpeg" && isRestreamed && supportsAudio && (
<>
<ContextMenuSeparator className="mb-1" />
<div className="p-2 text-sm">
<div className="flex w-full flex-col gap-1">
<p>Audio</p>
<div className="flex flex-row items-center gap-1">
<VolumeIcon
className="size-5"
onClick={handleVolumeIconClick}
/>
<VolumeSlider
disabled={!audioState}
className="my-3 ml-0.5 rounded-lg bg-background/60"
value={[volumeState ?? 0]}
min={0}
max={1}
step={0.02}
onValueChange={handleVolumeChange}
/>
</div>
</div>
</div>
</>
)}
<ContextMenuSeparator />
<ContextMenuItem>
<div
className="flex w-full cursor-pointer items-center justify-start gap-2"
onClick={muteAll}
>
<div className="text-primary">Mute All Cameras</div>
</div>
</ContextMenuItem>
<ContextMenuItem>
<div
className="flex w-full cursor-pointer items-center justify-start gap-2"
onClick={unmuteAll}
>
<div className="text-primary">Unmute All Cameras</div>
</div>
</ContextMenuItem>
<ContextMenuSeparator />
<ContextMenuItem>
<div
className="flex w-full cursor-pointer items-center justify-start gap-2"
onClick={toggleStats}
>
<div className="text-primary">
{statsState ? "Hide" : "Show"} Stream Stats
</div>
</div>
</ContextMenuItem>
<ContextMenuItem>
<div
className="flex w-full cursor-pointer items-center justify-start gap-2"
onClick={() => navigate(`/settings?page=debug&camera=${camera}`)}
>
<div className="text-primary">Debug View</div>
</div>
</ContextMenuItem>
{cameraGroup && cameraGroup !== "default" && (
<>
<ContextMenuSeparator />
<ContextMenuItem>
<div
className="flex w-full cursor-pointer items-center justify-start gap-2"
onClick={() => setShowSettings(true)}
>
<div className="text-primary">Streaming Settings</div>
</div>
</ContextMenuItem>
</>
)}
{preferredLiveMode == "jsmpeg" && isRestreamed && (
<>
<ContextMenuSeparator />
<ContextMenuItem>
<div
className="flex w-full cursor-pointer items-center justify-start gap-2"
onClick={resetPreferredLiveMode}
>
<div className="text-primary">Reset</div>
</div>
</ContextMenuItem>
</>
)}
</ContextMenuContent>
</ContextMenu>
<Dialog open={showSettings} onOpenChange={setShowSettings}>
<CameraStreamingDialog
camera={camera}
groupStreamingSettings={groupStreamingSettings}
setGroupStreamingSettings={setGroupStreamingSettings}
setIsDialogOpen={setShowSettings}
onSave={onSave}
/>
</Dialog>
</div>
);
}
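The `onSave` callback above rebuilds the all-groups map by filtering out the current group's entry and re-adding it with per-camera updates. A minimal standalone sketch of that replace-one-key pattern (`replaceKey` is a hypothetical helper name, not part of this codebase):

```typescript
// Replace one key in a record while keeping every other entry untouched,
// mirroring the Object.entries/filter/fromEntries spread used in onSave.
function replaceKey<T>(
  obj: Record<string, T>,
  key: string,
  value: T,
): Record<string, T> {
  return {
    ...Object.fromEntries(Object.entries(obj).filter(([k]) => k !== key)),
    [key]: value,
  };
}
```

Spreading the filtered entries first guarantees the updated key always wins, even when the key was not present in the original record.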

View File

@@ -673,7 +673,8 @@ export function ObjectSnapshotTab({
           </TransformComponent>
           {search.data.type == "object" &&
             search.plus_id !== "not_enabled" &&
-            search.end_time && (
+            search.end_time &&
+            search.label != "on_demand" && (
               <Card className="p-1 text-sm md:p-2">
                 <CardContent className="flex flex-col items-center justify-between gap-3 p-2 md:flex-row">
                   <div className={cn("flex flex-col space-y-3")}>

View File

@@ -58,6 +58,7 @@ export default function BirdseyeLivePlayer({
         height={birdseyeConfig.height}
         containerRef={containerRef}
         playbackEnabled={true}
+        useWebGL={true}
       />
     );
   } else {

View File

@@ -1,6 +1,7 @@
 import { baseUrl } from "@/api/baseUrl";
 import { useResizeObserver } from "@/hooks/resize-observer";
 import { cn } from "@/lib/utils";
+import { PlayerStatsType } from "@/types/live";
 // @ts-expect-error we know this doesn't have types
 import JSMpeg from "@cycjimmy/jsmpeg-player";
 import React, { useEffect, useMemo, useRef, useState } from "react";
@@ -12,6 +13,8 @@ type JSMpegPlayerProps = {
   height: number;
   containerRef: React.MutableRefObject<HTMLDivElement | null>;
   playbackEnabled: boolean;
+  useWebGL: boolean;
+  setStats?: (stats: PlayerStatsType) => void;
   onPlaying?: () => void;
 };
@@ -22,6 +25,8 @@ export default function JSMpegPlayer({
   className,
   containerRef,
   playbackEnabled,
+  useWebGL = false,
+  setStats,
   onPlaying,
 }: JSMpegPlayerProps) {
   const url = `${baseUrl.replace(/^http/, "ws")}live/jsmpeg/${camera}`;
@@ -33,6 +38,9 @@ export default function JSMpegPlayer({
   const [hasData, setHasData] = useState(false);
   const hasDataRef = useRef(hasData);
   const [dimensionsReady, setDimensionsReady] = useState(false);
+  const bytesReceivedRef = useRef(0);
+  const lastTimestampRef = useRef(Date.now());
+  const statsIntervalRef = useRef<NodeJS.Timeout | null>(null);

   const selectedContainerRef = useMemo(
     () => (containerRef.current ? containerRef : internalContainerRef),
@@ -111,6 +119,8 @@ export default function JSMpegPlayer({
     const canvas = canvasRef.current;
     let videoElement: JSMpeg.VideoElement | null = null;
+    let frameCount = 0;

     setHasData(false);

     if (videoWrapper && playbackEnabled) {
@@ -123,21 +133,68 @@ export default function JSMpegPlayer({
         {
           protocols: [],
           audio: false,
-          disableGl: camera != "birdseye",
-          disableWebAssembly: camera != "birdseye",
+          disableGl: !useWebGL,
+          disableWebAssembly: !useWebGL,
           videoBufferSize: 1024 * 1024 * 4,
           onVideoDecode: () => {
             if (!hasDataRef.current) {
               setHasData(true);
               onPlayingRef.current?.();
             }
+            frameCount++;
           },
         },
       );
+
+      // Set up WebSocket message handler
+      if (
+        videoElement.player &&
+        videoElement.player.source &&
+        videoElement.player.source.socket
+      ) {
+        const socket = videoElement.player.source.socket;
+        socket.addEventListener("message", (event: MessageEvent) => {
+          if (event.data instanceof ArrayBuffer) {
+            bytesReceivedRef.current += event.data.byteLength;
+          }
+        });
+      }
+
+      // Update stats every second
+      statsIntervalRef.current = setInterval(() => {
+        const currentTimestamp = Date.now();
+        const timeDiff = (currentTimestamp - lastTimestampRef.current) / 1000; // in seconds
+        const bitrate = (bytesReceivedRef.current * 8) / timeDiff / 1000; // in kbps
+
+        setStats?.({
+          streamType: "jsmpeg",
+          bandwidth: Math.round(bitrate),
+          totalFrames: frameCount,
+          latency: undefined,
+          droppedFrames: undefined,
+          decodedFrames: undefined,
+          droppedFrameRate: undefined,
+        });
+
+        bytesReceivedRef.current = 0;
+        lastTimestampRef.current = currentTimestamp;
+      }, 1000);
+
+      return () => {
+        if (statsIntervalRef.current) {
+          clearInterval(statsIntervalRef.current);
+          frameCount = 0;
+          statsIntervalRef.current = null;
+        }
+      };
     }, 0);

     return () => {
       clearTimeout(initPlayer);
+      if (statsIntervalRef.current) {
+        clearInterval(statsIntervalRef.current);
+        statsIntervalRef.current = null;
+      }
       if (videoElement) {
         try {
           // this causes issues in react strict mode
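The stats interval above converts the bytes accumulated from WebSocket messages into kilobits per second. The arithmetic, isolated as a sketch (the function name is illustrative only):

```typescript
// Convert a byte count accumulated over an elapsed window into kbps:
// bytes -> bits (* 8), per second (/ seconds), then kilo (/ 1000).
function bitrateKbps(bytesSinceLastTick: number, elapsedMs: number): number {
  const seconds = elapsedMs / 1000;
  if (seconds <= 0) return 0;
  return Math.round((bytesSinceLastTick * 8) / seconds / 1000);
}
```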

View File

@@ -11,6 +11,7 @@ import { useCameraActivity } from "@/hooks/use-camera-activity";
 import {
   LivePlayerError,
   LivePlayerMode,
+  PlayerStatsType,
   VideoResolutionType,
 } from "@/types/live";
 import { getIconForLabel } from "@/utils/iconUtil";
@@ -20,20 +21,26 @@ import { cn } from "@/lib/utils";
 import { TbExclamationCircle } from "react-icons/tb";
 import { TooltipPortal } from "@radix-ui/react-tooltip";
 import { baseUrl } from "@/api/baseUrl";
+import { PlayerStats } from "./PlayerStats";

 type LivePlayerProps = {
   cameraRef?: (ref: HTMLDivElement | null) => void;
   containerRef?: React.MutableRefObject<HTMLDivElement | null>;
   className?: string;
   cameraConfig: CameraConfig;
+  streamName: string;
   preferredLiveMode: LivePlayerMode;
   showStillWithoutActivity?: boolean;
+  useWebGL: boolean;
   windowVisible?: boolean;
   playAudio?: boolean;
+  volume?: number;
+  playInBackground: boolean;
   micEnabled?: boolean; // only webrtc supports mic
   iOSCompatFullScreen?: boolean;
   pip?: boolean;
   autoLive?: boolean;
+  showStats?: boolean;
   onClick?: () => void;
   setFullResolution?: React.Dispatch<React.SetStateAction<VideoResolutionType>>;
   onError?: (error: LivePlayerError) => void;
@@ -45,14 +52,19 @@ export default function LivePlayer({
   containerRef,
   className,
   cameraConfig,
+  streamName,
   preferredLiveMode,
   showStillWithoutActivity = true,
+  useWebGL = false,
   windowVisible = true,
   playAudio = false,
+  volume,
+  playInBackground = false,
   micEnabled = false,
   iOSCompatFullScreen = false,
   pip,
   autoLive = true,
+  showStats = false,
   onClick,
   setFullResolution,
   onError,
@@ -60,6 +72,18 @@ export default function LivePlayer({
 }: LivePlayerProps) {
   const internalContainerRef = useRef<HTMLDivElement | null>(null);

+  // stats
+  const [stats, setStats] = useState<PlayerStatsType>({
+    streamType: "-",
+    bandwidth: 0, // in kbps
+    latency: undefined, // in seconds
+    totalFrames: 0,
+    droppedFrames: undefined,
+    decodedFrames: 0,
+    droppedFrameRate: 0, // percentage
+  });
+
   // camera activity
   const { activeMotion, activeTracking, objects, offline } =
@@ -144,6 +168,25 @@ export default function LivePlayer({
     setLiveReady(false);
   }, [preferredLiveMode]);

+  const [key, setKey] = useState(0);
+
+  const resetPlayer = () => {
+    setLiveReady(false);
+    setKey((prevKey) => prevKey + 1);
+  };
+
+  useEffect(() => {
+    if (streamName) {
+      resetPlayer();
+    }
+  }, [streamName]);
+
+  useEffect(() => {
+    if (showStillWithoutActivity && !autoLive) {
+      setLiveReady(false);
+    }
+  }, [showStillWithoutActivity, autoLive]);
+
   const playerIsPlaying = useCallback(() => {
     setLiveReady(true);
   }, []);
@@ -153,15 +196,19 @@ export default function LivePlayer({
   }

   let player;
-  if (!autoLive) {
+  if (!autoLive || !streamName) {
     player = null;
   } else if (preferredLiveMode == "webrtc") {
     player = (
       <WebRtcPlayer
+        key={"webrtc_" + key}
         className={`size-full rounded-lg md:rounded-2xl ${liveReady ? "" : "hidden"}`}
-        camera={cameraConfig.live.stream_name}
+        camera={streamName}
         playbackEnabled={cameraActive || liveReady}
+        getStats={showStats}
+        setStats={setStats}
         audioEnabled={playAudio}
+        volume={volume}
         microphoneEnabled={micEnabled}
         iOSCompatFullScreen={iOSCompatFullScreen}
         onPlaying={playerIsPlaying}
@@ -173,10 +220,15 @@ export default function LivePlayer({
     if ("MediaSource" in window || "ManagedMediaSource" in window) {
       player = (
         <MSEPlayer
+          key={"mse_" + key}
           className={`size-full rounded-lg md:rounded-2xl ${liveReady ? "" : "hidden"}`}
-          camera={cameraConfig.live.stream_name}
+          camera={streamName}
           playbackEnabled={cameraActive || liveReady}
           audioEnabled={playAudio}
+          volume={volume}
+          playInBackground={playInBackground}
+          getStats={showStats}
+          setStats={setStats}
           onPlaying={playerIsPlaying}
           pip={pip}
           setFullResolution={setFullResolution}
@@ -194,6 +246,7 @@ export default function LivePlayer({
       if (cameraActive || !showStillWithoutActivity || liveReady) {
         player = (
           <JSMpegPlayer
+            key={"jsmpeg_" + key}
            className="flex justify-center overflow-hidden rounded-lg md:rounded-2xl"
             camera={cameraConfig.name}
             width={cameraConfig.detect.width}
@@ -201,6 +254,8 @@ export default function LivePlayer({
             playbackEnabled={
               cameraActive || !showStillWithoutActivity || liveReady
             }
+            useWebGL={useWebGL}
+            setStats={setStats}
             containerRef={containerRef ?? internalContainerRef}
             onPlaying={playerIsPlaying}
           />
@@ -293,7 +348,7 @@ export default function LivePlayer({
         )}
       >
         <AutoUpdatingCameraImage
-          className="size-full"
+          className="pointer-events-none size-full"
           cameraClasses="relative size-full flex justify-center"
           camera={cameraConfig.name}
           showFps={false}
@@ -331,6 +386,9 @@ export default function LivePlayer({
           </Chip>
         )}
       </div>
+      {showStats && (
+        <PlayerStats stats={stats} minimal={cameraRef !== undefined} />
+      )}
     </div>
   );
 }

View File

@@ -1,5 +1,9 @@
 import { baseUrl } from "@/api/baseUrl";
-import { LivePlayerError, VideoResolutionType } from "@/types/live";
+import {
+  LivePlayerError,
+  PlayerStatsType,
+  VideoResolutionType,
+} from "@/types/live";
 import {
   SetStateAction,
   useCallback,
@@ -15,7 +19,11 @@ type MSEPlayerProps = {
   className?: string;
   playbackEnabled?: boolean;
   audioEnabled?: boolean;
+  volume?: number;
+  playInBackground?: boolean;
   pip?: boolean;
+  getStats?: boolean;
+  setStats?: (stats: PlayerStatsType) => void;
   onPlaying?: () => void;
   setFullResolution?: React.Dispatch<SetStateAction<VideoResolutionType>>;
   onError?: (error: LivePlayerError) => void;
@@ -26,7 +34,11 @@ function MSEPlayer({
   className,
   playbackEnabled = true,
   audioEnabled = false,
+  volume,
+  playInBackground = false,
   pip = false,
+  getStats = false,
+  setStats,
   onPlaying,
   setFullResolution,
   onError,
@@ -57,6 +69,7 @@ function MSEPlayer({
   const [connectTS, setConnectTS] = useState<number>(0);
   const [bufferTimeout, setBufferTimeout] = useState<NodeJS.Timeout>();
   const [errorCount, setErrorCount] = useState<number>(0);
+  const totalBytesLoaded = useRef(0);

   const videoRef = useRef<HTMLVideoElement>(null);
   const wsRef = useRef<WebSocket | null>(null);
@@ -316,6 +329,8 @@ function MSEPlayer({
     let bufLen = 0;

     ondataRef.current = (data) => {
+      totalBytesLoaded.current += data.byteLength;
+
       if (sb?.updating || bufLen > 0) {
         const b = new Uint8Array(data);
         buf.set(b, bufLen);
@@ -508,12 +523,22 @@ function MSEPlayer({
       }
     };

-    document.addEventListener("visibilitychange", listener);
+    if (!playInBackground) {
+      document.addEventListener("visibilitychange", listener);
+    }

     return () => {
-      document.removeEventListener("visibilitychange", listener);
+      if (!playInBackground) {
+        document.removeEventListener("visibilitychange", listener);
+      }
     };
-  }, [playbackEnabled, visibilityCheck, onConnect, onDisconnect]);
+  }, [
+    playbackEnabled,
+    visibilityCheck,
+    playInBackground,
+    onConnect,
+    onDisconnect,
+  ]);

   // control pip
@@ -525,6 +550,16 @@ function MSEPlayer({
     videoRef.current.requestPictureInPicture();
   }, [pip, videoRef]);

+  // control volume
+  useEffect(() => {
+    if (!videoRef.current || volume == undefined) {
+      return;
+    }
+
+    videoRef.current.volume = volume;
+  }, [volume, videoRef]);
+
   // ensure we disconnect for slower connections
   useEffect(() => {
@@ -542,6 +577,68 @@ function MSEPlayer({
     // eslint-disable-next-line react-hooks/exhaustive-deps
   }, [playbackEnabled]);

+  // stats
+  useEffect(() => {
+    const video = videoRef.current;
+    let lastLoadedBytes = totalBytesLoaded.current;
+    let lastTimestamp = Date.now();
+
+    if (!getStats) return;
+
+    const updateStats = () => {
+      if (video) {
+        const now = Date.now();
+        const bytesLoaded = totalBytesLoaded.current;
+        const timeElapsed = (now - lastTimestamp) / 1000; // seconds
+        const bandwidth =
+          ((bytesLoaded - lastLoadedBytes) * 8) / timeElapsed / 1000; // kbps
+
+        lastLoadedBytes = bytesLoaded;
+        lastTimestamp = now;
+
+        const latency =
+          video.seekable.length > 0
+            ? Math.max(
+                0,
+                video.seekable.end(video.seekable.length - 1) -
+                  video.currentTime,
+              )
+            : 0;
+
+        const videoQuality = video.getVideoPlaybackQuality();
+        const { totalVideoFrames, droppedVideoFrames } = videoQuality;
+        const droppedFrameRate = totalVideoFrames
+          ? (droppedVideoFrames / totalVideoFrames) * 100
+          : 0;
+
+        setStats?.({
+          streamType: "MSE",
+          bandwidth,
+          latency,
+          totalFrames: totalVideoFrames,
+          droppedFrames: droppedVideoFrames || undefined,
+          decodedFrames: totalVideoFrames - droppedVideoFrames,
+          droppedFrameRate,
+        });
+      }
+    };
+
+    const interval = setInterval(updateStats, 1000); // Update every second
+
+    return () => {
+      clearInterval(interval);
+      setStats?.({
+        streamType: "-",
+        bandwidth: 0,
+        latency: undefined,
+        totalFrames: 0,
+        droppedFrames: undefined,
+        decodedFrames: 0,
+        droppedFrameRate: 0,
+      });
+    };
+  }, [setStats, getStats]);
+
   return (
     <video
       ref={videoRef}
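The MSE stats effect estimates live latency as the gap between the newest seekable position and the current playback position, floored at zero. The same computation as an isolated sketch (hypothetical helper name):

```typescript
// Live latency: how far playback trails the newest buffered media, never negative.
function liveLatencySeconds(seekableEnd: number, currentTime: number): number {
  return Math.max(0, seekableEnd - currentTime);
}
```

Flooring at zero matters because `currentTime` can briefly overshoot the reported seekable end during seeks, which would otherwise produce a nonsensical negative latency.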

View File

@@ -0,0 +1,100 @@
import { cn } from "@/lib/utils";
import { PlayerStatsType } from "@/types/live";
type PlayerStatsProps = {
stats: PlayerStatsType;
minimal: boolean;
};
export function PlayerStats({ stats, minimal }: PlayerStatsProps) {
const fullStatsContent = (
<>
<p>
<span className="text-white/70">Stream Type:</span>{" "}
<span className="text-white">{stats.streamType}</span>
</p>
<p>
<span className="text-white/70">Bandwidth:</span>{" "}
<span className="text-white">{stats.bandwidth.toFixed(2)} kbps</span>
</p>
{stats.latency != undefined && (
<p>
<span className="text-white/70">Latency:</span>{" "}
<span
className={`text-white ${stats.latency > 2 ? "text-danger" : ""}`}
>
{stats.latency.toFixed(2)} seconds
</span>
</p>
)}
<p>
<span className="text-white/70">Total Frames:</span>{" "}
<span className="text-white">{stats.totalFrames}</span>
</p>
{stats.droppedFrames != undefined && (
<p>
<span className="text-white/70">Dropped Frames:</span>{" "}
<span className="text-white">{stats.droppedFrames}</span>
</p>
)}
{stats.decodedFrames != undefined && (
<p>
<span className="text-white/70">Decoded Frames:</span>{" "}
<span className="text-white">{stats.decodedFrames}</span>
</p>
)}
{stats.droppedFrameRate != undefined && (
<p>
<span className="text-white/70">Dropped Frame Rate:</span>{" "}
<span className="text-white">
{stats.droppedFrameRate.toFixed(2)}%
</span>
</p>
)}
</>
);
const minimalStatsContent = (
<div className="flex flex-row items-center justify-center gap-4">
<div className="flex flex-col items-center justify-start gap-1">
<span className="text-white/70">Type</span>
<span className="text-white">{stats.streamType}</span>
</div>
<div className="flex flex-col items-center gap-1">
<span className="text-white/70">Bandwidth</span>{" "}
<span className="text-white">{stats.bandwidth.toFixed(2)} kbps</span>
</div>
{stats.latency != undefined && (
<div className="hidden flex-col items-center gap-1 md:flex">
<span className="text-white/70">Latency</span>
<span
className={`text-white ${stats.latency >= 2 ? "text-danger" : ""}`}
>
{stats.latency.toFixed(2)} sec
</span>
</div>
)}
{stats.droppedFrames != undefined && (
<div className="flex flex-col items-center justify-end gap-1">
<span className="text-white/70">Dropped</span>
<span className="text-white">{stats.droppedFrames} frames</span>
</div>
)}
</div>
);
return (
<>
<div
className={cn(
minimal
? "absolute bottom-0 left-0 max-h-[50%] w-full overflow-y-auto rounded-b-lg p-1 md:rounded-b-xl md:p-3"
: "absolute bottom-2 right-2 min-w-52 rounded-2xl p-4",
"z-50 flex flex-col gap-1 bg-black/70 text-[9px] duration-300 animate-in fade-in md:text-xs",
)}
>
{minimal ? minimalStatsContent : fullStatsContent}
</div>
</>
);
}

View File

@@ -1,5 +1,5 @@
 import { baseUrl } from "@/api/baseUrl";
-import { LivePlayerError } from "@/types/live";
+import { LivePlayerError, PlayerStatsType } from "@/types/live";
 import { useCallback, useEffect, useMemo, useRef, useState } from "react";

 type WebRtcPlayerProps = {
@@ -7,9 +7,12 @@ type WebRtcPlayerProps = {
   camera: string;
   playbackEnabled?: boolean;
   audioEnabled?: boolean;
+  volume?: number;
   microphoneEnabled?: boolean;
   iOSCompatFullScreen?: boolean; // ios doesn't support fullscreen divs so we must support the video element
   pip?: boolean;
+  getStats?: boolean;
+  setStats?: (stats: PlayerStatsType) => void;
   onPlaying?: () => void;
   onError?: (error: LivePlayerError) => void;
 };
@@ -19,9 +22,12 @@ export default function WebRtcPlayer({
   camera,
   playbackEnabled = true,
   audioEnabled = false,
+  volume,
   microphoneEnabled = false,
   iOSCompatFullScreen = false,
   pip = false,
+  getStats = false,
+  setStats,
   onPlaying,
   onError,
 }: WebRtcPlayerProps) {
@@ -194,6 +200,16 @@ export default function WebRtcPlayer({
     videoRef.current.requestPictureInPicture();
   }, [pip, videoRef]);

+  // control volume
+  useEffect(() => {
+    if (!videoRef.current || volume == undefined) {
+      return;
+    }
+
+    videoRef.current.volume = volume;
+  }, [volume, videoRef]);
+
   useEffect(() => {
     videoLoadTimeoutRef.current = setTimeout(() => {
       onError?.("stalled");
@@ -215,6 +231,75 @@ export default function WebRtcPlayer({
     onPlaying?.();
   };

+  // stats
+  useEffect(() => {
+    if (!pcRef.current || !getStats) return;
+
+    let lastBytesReceived = 0;
+    let lastTimestamp = 0;
+
+    const interval = setInterval(async () => {
+      if (pcRef.current && videoRef.current && !videoRef.current.paused) {
+        const report = await pcRef.current.getStats();
+        let bytesReceived = 0;
+        let timestamp = 0;
+        let roundTripTime = 0;
+        let framesReceived = 0;
+        let framesDropped = 0;
+        let framesDecoded = 0;
+
+        report.forEach((stat) => {
+          if (stat.type === "inbound-rtp" && stat.kind === "video") {
+            bytesReceived = stat.bytesReceived;
+            timestamp = stat.timestamp;
+            framesReceived = stat.framesReceived;
+            framesDropped = stat.framesDropped;
+            framesDecoded = stat.framesDecoded;
+          }
+          if (stat.type === "candidate-pair" && stat.state === "succeeded") {
+            roundTripTime = stat.currentRoundTripTime;
+          }
+        });
+
+        const timeDiff = (timestamp - lastTimestamp) / 1000; // in seconds
+        const bitrate =
+          timeDiff > 0
+            ? ((bytesReceived - lastBytesReceived) * 8) / timeDiff / 1000
+            : 0; // in kbps
+
+        setStats?.({
+          streamType: "WebRTC",
+          bandwidth: Math.round(bitrate),
+          latency: roundTripTime,
+          totalFrames: framesReceived,
+          droppedFrames: framesDropped,
+          decodedFrames: framesDecoded,
+          droppedFrameRate:
+            framesReceived > 0 ? (framesDropped / framesReceived) * 100 : 0,
+        });
+
+        lastBytesReceived = bytesReceived;
+        lastTimestamp = timestamp;
+      }
+    }, 1000);
+
+    return () => {
+      clearInterval(interval);
+      setStats?.({
+        streamType: "-",
+        bandwidth: 0,
+        latency: undefined,
+        totalFrames: 0,
+        droppedFrames: undefined,
+        decodedFrames: 0,
+        droppedFrameRate: 0,
+      });
+    };
+    // we need to listen on the value of the ref
+    // eslint-disable-next-line react-hooks/exhaustive-deps
+  }, [pcRef, pcRef.current, getStats]);
+
   return (
     <video
       ref={videoRef}
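The WebRTC stats effect derives a dropped-frame percentage from the `inbound-rtp` counters, guarding against division by zero before any frames arrive. The same guard, isolated as a sketch (hypothetical helper name):

```typescript
// Percentage of received frames that were dropped; 0 when nothing has arrived yet.
function droppedFramePercent(
  framesReceived: number,
  framesDropped: number,
): number {
  return framesReceived > 0 ? (framesDropped / framesReceived) * 100 : 0;
}
```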

View File

@@ -0,0 +1,371 @@
import { useState, useCallback, useEffect, useMemo } from "react";
import { IoIosWarning } from "react-icons/io";
import { Button } from "@/components/ui/button";
import {
DialogContent,
DialogHeader,
DialogTitle,
DialogDescription,
} from "@/components/ui/dialog";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import {
Popover,
PopoverContent,
PopoverTrigger,
} from "@/components/ui/popover";
import { Checkbox } from "@/components/ui/checkbox";
import { Label } from "@/components/ui/label";
import {
FrigateConfig,
GroupStreamingSettings,
StreamType,
} from "@/types/frigateConfig";
import ActivityIndicator from "../indicators/activity-indicator";
import useSWR from "swr";
import { LuCheck, LuExternalLink, LuInfo, LuX } from "react-icons/lu";
import { Link } from "react-router-dom";
import { LiveStreamMetadata } from "@/types/live";
type CameraStreamingDialogProps = {
camera: string;
groupStreamingSettings: GroupStreamingSettings;
setGroupStreamingSettings: React.Dispatch<
React.SetStateAction<GroupStreamingSettings>
>;
setIsDialogOpen: React.Dispatch<React.SetStateAction<boolean>>;
onSave?: (settings: GroupStreamingSettings) => void;
};
export function CameraStreamingDialog({
camera,
groupStreamingSettings,
setGroupStreamingSettings,
setIsDialogOpen,
onSave,
}: CameraStreamingDialogProps) {
const { data: config } = useSWR<FrigateConfig>("config");
const [isLoading, setIsLoading] = useState(false);
const [streamName, setStreamName] = useState(
Object.entries(config?.cameras[camera]?.live?.streams || {})[0]?.[1] || "",
);
const [streamType, setStreamType] = useState<StreamType>("smart");
const [compatibilityMode, setCompatibilityMode] = useState(false);
// metadata
const isRestreamed = useMemo(
() =>
config &&
Object.keys(config.go2rtc.streams || {}).includes(streamName ?? ""),
[config, streamName],
);
const { data: cameraMetadata } = useSWR<LiveStreamMetadata>(
isRestreamed ? `go2rtc/streams/${streamName}` : null,
{
revalidateOnFocus: false,
},
);
const supportsAudioOutput = useMemo(() => {
if (!cameraMetadata) {
return false;
}
return (
cameraMetadata.producers.find(
(prod) =>
prod.medias &&
prod.medias.find((media) => media.includes("audio, recvonly")) !=
undefined,
) != undefined
);
}, [cameraMetadata]);
// handlers
useEffect(() => {
if (!config) {
return;
}
if (groupStreamingSettings && groupStreamingSettings[camera]) {
const cameraSettings = groupStreamingSettings[camera];
setStreamName(cameraSettings.streamName || "");
setStreamType(cameraSettings.streamType || "smart");
setCompatibilityMode(cameraSettings.compatibilityMode || false);
} else {
setStreamName(
Object.entries(config?.cameras[camera]?.live?.streams || {})[0]?.[1] ||
"",
);
setStreamType("smart");
setCompatibilityMode(false);
}
}, [groupStreamingSettings, camera, config]);
const handleSave = useCallback(() => {
setIsLoading(true);
const updatedSettings = {
...groupStreamingSettings,
[camera]: {
streamName,
streamType,
compatibilityMode,
playAudio: groupStreamingSettings?.[camera]?.playAudio ?? false,
volume: groupStreamingSettings?.[camera]?.volume ?? 1,
},
};
setGroupStreamingSettings(updatedSettings);
setIsDialogOpen(false);
setIsLoading(false);
onSave?.(updatedSettings);
}, [
groupStreamingSettings,
setGroupStreamingSettings,
camera,
streamName,
streamType,
compatibilityMode,
setIsDialogOpen,
onSave,
]);
const handleCancel = useCallback(() => {
if (!config) {
return;
}
if (groupStreamingSettings && groupStreamingSettings[camera]) {
const cameraSettings = groupStreamingSettings[camera];
setStreamName(cameraSettings.streamName || "");
setStreamType(cameraSettings.streamType || "smart");
setCompatibilityMode(cameraSettings.compatibilityMode || false);
} else {
setStreamName(
Object.entries(config?.cameras[camera]?.live?.streams || {})[0]?.[1] ||
"",
);
setStreamType("smart");
setCompatibilityMode(false);
}
setIsDialogOpen(false);
}, [groupStreamingSettings, camera, config, setIsDialogOpen]);
if (!config) {
return null;
}
return (
<DialogContent className="sm:max-w-[425px]">
<DialogHeader className="mb-4">
<DialogTitle className="capitalize">
{camera.replaceAll("_", " ")} Streaming Settings
</DialogTitle>
<DialogDescription>
Change the live streaming options for this camera group's dashboard.{" "}
<em>These settings are device/browser-specific.</em>
</DialogDescription>
</DialogHeader>
<div className="flex flex-col space-y-8">
{!isRestreamed && (
<div className="flex flex-col gap-2">
<Label>Stream</Label>
<div className="flex flex-row items-center gap-1 text-sm text-muted-foreground">
<LuX className="size-4 text-danger" />
<div>Restreaming is not enabled for this camera.</div>
<Popover>
<PopoverTrigger asChild>
<div className="cursor-pointer p-0">
<LuInfo className="size-4" />
<span className="sr-only">Info</span>
</div>
</PopoverTrigger>
<PopoverContent className="w-80 text-xs">
Set up go2rtc for additional live view options and audio for
this camera.
<div className="mt-2 flex items-center text-primary">
<Link
to="https://docs.frigate.video/configuration/live"
target="_blank"
rel="noopener noreferrer"
className="inline"
>
Read the documentation{" "}
<LuExternalLink className="ml-2 inline-flex size-3" />
</Link>
</div>
</PopoverContent>
</Popover>
</div>
</div>
)}
{isRestreamed &&
Object.entries(config?.cameras[camera].live.streams).length > 0 && (
<div className="flex flex-col items-start gap-2">
<Label htmlFor="stream" className="text-right">
Stream
</Label>
<Select value={streamName} onValueChange={setStreamName}>
<SelectTrigger className="">
<SelectValue placeholder="Choose a stream" />
</SelectTrigger>
<SelectContent>
{camera !== "birdseye" &&
Object.entries(config?.cameras[camera].live.streams).map(
([name, stream]) => (
<SelectItem key={stream} value={stream}>
{name}
</SelectItem>
),
)}
</SelectContent>
<div className="flex flex-row items-center gap-1 text-sm text-muted-foreground">
{supportsAudioOutput ? (
<>
<LuCheck className="size-4 text-success" />
<div>Audio is available for this stream</div>
</>
) : (
<>
<LuX className="size-4 text-danger" />
<div>Audio is unavailable for this stream</div>
<Popover>
<PopoverTrigger asChild>
<div className="cursor-pointer p-0">
<LuInfo className="size-4" />
<span className="sr-only">Info</span>
</div>
</PopoverTrigger>
<PopoverContent className="w-80 text-xs">
Audio must be output from your camera and configured
in go2rtc for this stream.
<div className="mt-2 flex items-center text-primary">
<Link
to="https://docs.frigate.video/configuration/live"
target="_blank"
rel="noopener noreferrer"
className="inline"
>
Read the documentation{" "}
<LuExternalLink className="ml-2 inline-flex size-3" />
</Link>
</div>
</PopoverContent>
</Popover>
</>
)}
</div>
</Select>
</div>
)}
<div className="flex flex-col items-start gap-2">
<Label htmlFor="streaming-method" className="text-right">
Streaming Method
</Label>
<Select
value={streamType}
onValueChange={(value) => setStreamType(value as StreamType)}
>
<SelectTrigger className="">
<SelectValue placeholder="Choose a streaming option" />
</SelectTrigger>
<SelectContent>
<SelectItem value="no-streaming">No Streaming</SelectItem>
<SelectItem value="smart">
Smart Streaming (recommended)
</SelectItem>
<SelectItem value="continuous">Continuous Streaming</SelectItem>
</SelectContent>
</Select>
{streamType === "no-streaming" && (
<p className="text-sm text-muted-foreground">
Camera images will only update once per minute and no live
streaming will occur.
</p>
)}
{streamType === "smart" && (
<p className="text-sm text-muted-foreground">
Smart streaming will update your camera image once per minute when
no detectable activity is occurring to conserve bandwidth and
resources. When activity is detected, the image seamlessly
switches to a live stream.
</p>
)}
{streamType === "continuous" && (
<>
<p className="text-sm text-muted-foreground">
Camera image will always be a live stream when visible on the
dashboard, even if no activity is being detected.
</p>
<div className="flex items-center gap-2">
<IoIosWarning className="mr-2 size-5 text-danger" />
<div className="max-w-[85%] text-sm">
Continuous streaming may cause high bandwidth usage and
performance issues. Use with caution.
</div>
</div>
</>
)}
</div>
<div className="flex flex-col items-start gap-2">
<div className="flex items-center gap-2">
<Checkbox
id="compatibility"
className="size-5 text-white accent-white data-[state=checked]:bg-selected data-[state=checked]:text-white"
checked={compatibilityMode}
onCheckedChange={() => setCompatibilityMode(!compatibilityMode)}
/>
<Label
htmlFor="compatibility"
className="text-sm font-medium leading-none peer-disabled:cursor-not-allowed peer-disabled:opacity-70"
>
Compatibility mode
</Label>
</div>
<div className="flex flex-col gap-2 leading-none">
<p className="text-sm text-muted-foreground">
Enable this option only if your camera's live stream is displaying
color artifacts and has a diagonal line on the right side of the
image.
</p>
</div>
</div>
</div>
<div className="flex flex-1 flex-col justify-end">
<div className="flex flex-row gap-2 pt-5">
<Button
className="flex flex-1"
aria-label="Cancel"
onClick={handleCancel}
>
Cancel
</Button>
<Button
variant="select"
aria-label="Save"
disabled={isLoading}
className="flex flex-1"
onClick={handleSave}
>
{isLoading ? (
<div className="flex flex-row items-center gap-2">
<ActivityIndicator />
<span>Saving...</span>
</div>
) : (
"Save"
)}
</Button>
</div>
</div>
</DialogContent>
);
}


@@ -18,7 +18,7 @@ const Slider = React.forwardRef<
     <SliderPrimitive.Track className="relative h-2 w-full grow overflow-hidden rounded-full bg-secondary">
       <SliderPrimitive.Range className="absolute h-full bg-primary" />
     </SliderPrimitive.Track>
-    <SliderPrimitive.Thumb className="block h-5 w-5 rounded-full cursor-pointer border-2 border-primary bg-primary ring-offset-background transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:pointer-events-none disabled:opacity-50" />
+    <SliderPrimitive.Thumb className="block h-5 w-5 cursor-pointer rounded-full border-2 border-primary bg-primary ring-offset-background transition-colors data-[disabled]:pointer-events-none data-[disabled]:opacity-50 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2" />
   </SliderPrimitive.Root>
 ));
 Slider.displayName = SliderPrimitive.Root.displayName;
@@ -36,9 +36,9 @@ const VolumeSlider = React.forwardRef<
     {...props}
   >
     <SliderPrimitive.Track className="relative h-1 w-full grow overflow-hidden rounded-full bg-muted">
-      <SliderPrimitive.Range className="absolute h-full bg-white" />
+      <SliderPrimitive.Range className="absolute h-full bg-primary data-[disabled]:opacity-20" />
     </SliderPrimitive.Track>
-    <SliderPrimitive.Thumb className="block h-3 w-3 rounded-full bg-white ring-white focus:ring-white disabled:pointer-events-none disabled:opacity-50" />
+    <SliderPrimitive.Thumb className="block h-3 w-3 rounded-full bg-primary ring-primary data-[disabled]:pointer-events-none data-[disabled]:bg-muted focus:ring-primary" />
   </SliderPrimitive.Root>
 ));
 VolumeSlider.displayName = SliderPrimitive.Root.displayName;
@@ -58,7 +58,7 @@ const NoThumbSlider = React.forwardRef<
     <SliderPrimitive.Track className="relative h-full w-full grow overflow-hidden rounded-full">
       <SliderPrimitive.Range className="absolute h-full bg-selected" />
     </SliderPrimitive.Track>
-    <SliderPrimitive.Thumb className="block h-4 w-16 rounded-full bg-transparent -translate-y-[50%] ring-offset-transparent focus-visible:outline-none focus-visible:ring-transparent disabled:pointer-events-none disabled:opacity-50 cursor-col-resize" />
+    <SliderPrimitive.Thumb className="block h-4 w-16 -translate-y-[50%] cursor-col-resize rounded-full bg-transparent ring-offset-transparent data-[disabled]:pointer-events-none data-[disabled]:opacity-50 focus-visible:outline-none focus-visible:ring-transparent" />
   </SliderPrimitive.Root>
 ));
 NoThumbSlider.displayName = SliderPrimitive.Root.displayName;
@@ -78,8 +78,8 @@ const DualThumbSlider = React.forwardRef<
     <SliderPrimitive.Track className="relative h-1 w-full grow overflow-hidden rounded-full bg-selected/60">
       <SliderPrimitive.Range className="absolute h-full bg-selected" />
     </SliderPrimitive.Track>
-    <SliderPrimitive.Thumb className="block size-3 rounded-full bg-selected transition-colors cursor-col-resize disabled:pointer-events-none disabled:opacity-50" />
-    <SliderPrimitive.Thumb className="block size-3 rounded-full bg-selected transition-colors cursor-col-resize disabled:pointer-events-none disabled:opacity-50" />
+    <SliderPrimitive.Thumb className="block size-3 cursor-col-resize rounded-full bg-selected transition-colors data-[disabled]:pointer-events-none data-[disabled]:opacity-50" />
+    <SliderPrimitive.Thumb className="block size-3 cursor-col-resize rounded-full bg-selected transition-colors data-[disabled]:pointer-events-none data-[disabled]:opacity-50" />
   </SliderPrimitive.Root>
 ));
 DualThumbSlider.displayName = SliderPrimitive.Root.displayName;


@@ -5,6 +5,7 @@ import { ApiProvider } from "@/api";
 import { IconContext } from "react-icons";
 import { TooltipProvider } from "@/components/ui/tooltip";
 import { StatusBarMessagesProvider } from "@/context/statusbar-provider";
+import { StreamingSettingsProvider } from "./streaming-settings-provider";

 type TProvidersProps = {
   children: ReactNode;
@@ -17,7 +18,11 @@ function providers({ children }: TProvidersProps) {
       <ThemeProvider defaultTheme="system" storageKey="frigate-ui-theme">
         <TooltipProvider>
           <IconContext.Provider value={{ size: "20" }}>
-            <StatusBarMessagesProvider>{children}</StatusBarMessagesProvider>
+            <StatusBarMessagesProvider>
+              <StreamingSettingsProvider>
+                {children}
+              </StreamingSettingsProvider>
+            </StatusBarMessagesProvider>
           </IconContext.Provider>
         </TooltipProvider>
       </ThemeProvider>


@@ -0,0 +1,68 @@
import {
  createContext,
  useState,
  useEffect,
  ReactNode,
  useContext,
} from "react";
import { AllGroupsStreamingSettings } from "@/types/frigateConfig";
import { usePersistence } from "@/hooks/use-persistence";

type StreamingSettingsContextType = {
  allGroupsStreamingSettings: AllGroupsStreamingSettings;
  setAllGroupsStreamingSettings: (settings: AllGroupsStreamingSettings) => void;
  isPersistedStreamingSettingsLoaded: boolean;
};

const StreamingSettingsContext =
  createContext<StreamingSettingsContextType | null>(null);

export function StreamingSettingsProvider({
  children,
}: {
  children: ReactNode;
}) {
  const [allGroupsStreamingSettings, setAllGroupsStreamingSettings] =
    useState<AllGroupsStreamingSettings>({});

  const [
    persistedGroupStreamingSettings,
    setPersistedGroupStreamingSettings,
    isPersistedStreamingSettingsLoaded,
  ] = usePersistence<AllGroupsStreamingSettings>("streaming-settings");

  useEffect(() => {
    if (isPersistedStreamingSettingsLoaded) {
      setAllGroupsStreamingSettings(persistedGroupStreamingSettings ?? {});
    }
  }, [isPersistedStreamingSettingsLoaded, persistedGroupStreamingSettings]);

  useEffect(() => {
    if (Object.keys(allGroupsStreamingSettings).length) {
      setPersistedGroupStreamingSettings(allGroupsStreamingSettings);
    }
  }, [allGroupsStreamingSettings, setPersistedGroupStreamingSettings]);

  return (
    <StreamingSettingsContext.Provider
      value={{
        allGroupsStreamingSettings,
        setAllGroupsStreamingSettings,
        isPersistedStreamingSettingsLoaded,
      }}
    >
      {children}
    </StreamingSettingsContext.Provider>
  );
}

// eslint-disable-next-line react-refresh/only-export-components
export function useStreamingSettings() {
  const context = useContext(StreamingSettingsContext);
  if (!context) {
    throw new Error(
      "useStreamingSettings must be used within a StreamingSettingsProvider",
    );
  }
  return context;
}


@@ -17,7 +17,15 @@ export function useResizeObserver(...refs: RefType[]) {
     () =>
       new ResizeObserver((entries) => {
         window.requestAnimationFrame(() => {
-          setDimensions(entries.map((entry) => entry.contentRect));
+          setDimensions((prevDimensions) => {
+            const newDimensions = entries.map((entry) => entry.contentRect);
+            if (
+              JSON.stringify(prevDimensions) !== JSON.stringify(newDimensions)
+            ) {
+              return newDimensions;
+            }
+            return prevDimensions;
+          });
         });
       }),
     [],


@@ -1,16 +1,29 @@
 import { CameraConfig, FrigateConfig } from "@/types/frigateConfig";
 import { useCallback, useEffect, useState } from "react";
 import useSWR from "swr";
-import { LivePlayerMode } from "@/types/live";
+import { LivePlayerMode, LiveStreamMetadata } from "@/types/live";

 export default function useCameraLiveMode(
   cameras: CameraConfig[],
   windowVisible: boolean,
 ) {
   const { data: config } = useSWR<FrigateConfig>("config");
+  const { data: allStreamMetadata } = useSWR<{
+    [key: string]: LiveStreamMetadata;
+  }>(config ? "go2rtc/streams" : null, { revalidateOnFocus: false });
+
   const [preferredLiveModes, setPreferredLiveModes] = useState<{
     [key: string]: LivePlayerMode;
   }>({});
+  const [isRestreamedStates, setIsRestreamedStates] = useState<{
+    [key: string]: boolean;
+  }>({});
+  const [supportsAudioOutputStates, setSupportsAudioOutputStates] = useState<{
+    [key: string]: {
+      supportsAudio: boolean;
+      cameraName: string;
+    };
+  }>({});

   useEffect(() => {
     if (!cameras) return;
@@ -18,26 +31,56 @@ export default function useCameraLiveMode(
     const mseSupported =
       "MediaSource" in window || "ManagedMediaSource" in window;

-    const newPreferredLiveModes = cameras.reduce(
-      (acc, camera) => {
-        const isRestreamed =
-          config &&
-          Object.keys(config.go2rtc.streams || {}).includes(
-            camera.live.stream_name,
-          );
-
-        if (!mseSupported) {
-          acc[camera.name] = isRestreamed ? "webrtc" : "jsmpeg";
-        } else {
-          acc[camera.name] = isRestreamed ? "mse" : "jsmpeg";
-        }
-        return acc;
-      },
-      {} as { [key: string]: LivePlayerMode },
-    );
+    const newPreferredLiveModes: { [key: string]: LivePlayerMode } = {};
+    const newIsRestreamedStates: { [key: string]: boolean } = {};
+    const newSupportsAudioOutputStates: {
+      [key: string]: { supportsAudio: boolean; cameraName: string };
+    } = {};
+
+    cameras.forEach((camera) => {
+      const isRestreamed =
+        config &&
+        Object.keys(config.go2rtc.streams || {}).includes(
+          Object.values(camera.live.streams)[0],
+        );
+
+      newIsRestreamedStates[camera.name] = isRestreamed ?? false;
+
+      if (!mseSupported) {
+        newPreferredLiveModes[camera.name] = isRestreamed ? "webrtc" : "jsmpeg";
+      } else {
+        newPreferredLiveModes[camera.name] = isRestreamed ? "mse" : "jsmpeg";
+      }
+
+      // check each stream for audio support
+      if (isRestreamed) {
+        Object.values(camera.live.streams).forEach((streamName) => {
+          const metadata = allStreamMetadata?.[streamName];
+          newSupportsAudioOutputStates[streamName] = {
+            supportsAudio: metadata
+              ? metadata.producers.find(
+                  (prod) =>
+                    prod.medias &&
+                    prod.medias.find((media) =>
+                      media.includes("audio, recvonly"),
+                    ) !== undefined,
+                ) !== undefined
+              : false,
+            cameraName: camera.name,
+          };
+        });
+      } else {
+        newSupportsAudioOutputStates[camera.name] = {
+          supportsAudio: false,
+          cameraName: camera.name,
+        };
+      }
+    });

     setPreferredLiveModes(newPreferredLiveModes);
-  }, [cameras, config, windowVisible]);
+    setIsRestreamedStates(newIsRestreamedStates);
+    setSupportsAudioOutputStates(newSupportsAudioOutputStates);
+  }, [cameras, config, windowVisible, allStreamMetadata]);

   const resetPreferredLiveMode = useCallback(
     (cameraName: string) => {
@@ -61,5 +104,11 @@ export default function useCameraLiveMode(
     [config],
   );

-  return { preferredLiveModes, setPreferredLiveModes, resetPreferredLiveMode };
+  return {
+    preferredLiveModes,
+    setPreferredLiveModes,
+    resetPreferredLiveMode,
+    isRestreamedStates,
+    supportsAudioOutputStates,
+  };
 }


@@ -180,6 +180,11 @@ html {
   opacity: 0.5 !important;
 }

+.react-grid-layout,
+.react-grid-layout .react-grid-item {
+  transition: none !important;
+}
+
 .react-lazylog,
 .react-lazylog-searchbar {
   background-color: transparent !important;


@@ -37,6 +37,7 @@ import AuthenticationView from "@/views/settings/AuthenticationView";
 import NotificationView from "@/views/settings/NotificationsSettingsView";
 import SearchSettingsView from "@/views/settings/SearchSettingsView";
 import UiSettingsView from "@/views/settings/UiSettingsView";
+import { useSearchEffect } from "@/hooks/use-overlay-state";

 const allSettingsViews = [
   "UI settings",
@@ -119,6 +120,21 @@ export default function Settings() {
     }
   }, [tabsRef, pageToggle]);

+  useSearchEffect("page", (page: string) => {
+    if (allSettingsViews.includes(page as SettingsType)) {
+      setPage(page as SettingsType);
+    }
+    return true;
+  });
+
+  useSearchEffect("camera", (camera: string) => {
+    const cameraNames = cameras.map((c) => c.name);
+    if (cameraNames.includes(camera)) {
+      setSelectedCamera(camera);
+    }
+    return true;
+  });
+
   useEffect(() => {
     document.title = "Settings - Frigate";
   }, []);


@@ -87,7 +87,7 @@ export interface CameraConfig {
   live: {
     height: number;
     quality: number;
-    stream_name: string;
+    streams: { [key: string]: string };
   };
   motion: {
     contour_area: number;
@@ -175,10 +175,18 @@ export interface CameraConfig {
     alerts: {
       required_zones: string[];
       labels: string[];
+      retain: {
+        days: number;
+        mode: string;
+      };
     };
     detections: {
       required_zones: string[];
       labels: string[];
+      retain: {
+        days: number;
+        mode: string;
+      };
     };
   };
   rtmp: {
@@ -230,6 +238,24 @@ export type CameraGroupConfig = {
   order: number;
 };

+export type StreamType = "no-streaming" | "smart" | "continuous";
+
+export type CameraStreamingSettings = {
+  streamName: string;
+  streamType: StreamType;
+  compatibilityMode: boolean;
+  playAudio: boolean;
+  volume: number;
+};
+
+export type GroupStreamingSettings = {
+  [cameraName: string]: CameraStreamingSettings;
+};
+
+export type AllGroupsStreamingSettings = {
+  [groupName: string]: GroupStreamingSettings;
+};
+
 export interface FrigateConfig {
   audio: {
     enabled: boolean;
@@ -326,12 +352,6 @@ export interface FrigateConfig {
   camera_groups: { [groupName: string]: CameraGroupConfig };

-  live: {
-    height: number;
-    quality: number;
-    stream_name: string;
-  };
-
   logger: {
     default: string;
     logs: Record<string, string>;


@@ -32,3 +32,17 @@ export type LiveStreamMetadata = {
 };

 export type LivePlayerError = "stalled" | "startup" | "mse-decode";
+
+export type AudioState = Record<string, boolean>;
+export type StatsState = Record<string, boolean>;
+export type VolumeState = Record<string, number>;
+
+export type PlayerStatsType = {
+  streamType: string;
+  bandwidth: number;
+  latency: number | undefined;
+  totalFrames: number;
+  droppedFrames: number | undefined;
+  decodedFrames: number | undefined;
+  droppedFrameRate: number | undefined;
+};


@@ -1,5 +1,6 @@
 import { usePersistence } from "@/hooks/use-persistence";
 import {
+  AllGroupsStreamingSettings,
   BirdseyeConfig,
   CameraConfig,
   FrigateConfig,
@@ -20,7 +21,12 @@ import {
 } from "react-grid-layout";
 import "react-grid-layout/css/styles.css";
 import "react-resizable/css/styles.css";
-import { LivePlayerError, LivePlayerMode } from "@/types/live";
+import {
+  AudioState,
+  LivePlayerMode,
+  StatsState,
+  VolumeState,
+} from "@/types/live";
 import { ASPECT_VERTICAL_LAYOUT, ASPECT_WIDE_LAYOUT } from "@/types/record";
 import { Skeleton } from "@/components/ui/skeleton";
 import { useResizeObserver } from "@/hooks/resize-observer";
@@ -42,6 +48,8 @@ import {
 } from "@/components/ui/tooltip";
 import { Toaster } from "@/components/ui/sonner";
 import useCameraLiveMode from "@/hooks/use-camera-live-mode";
+import LiveContextMenu from "@/components/menu/LiveContextMenu";
+import { useStreamingSettings } from "@/context/streaming-settings-provider";

 type DraggableGridLayoutProps = {
   cameras: CameraConfig[];
@@ -76,8 +84,26 @@ export default function DraggableGridLayout({
   // preferred live modes per camera
-  const { preferredLiveModes, setPreferredLiveModes, resetPreferredLiveMode } =
-    useCameraLiveMode(cameras, windowVisible);
+  const {
+    preferredLiveModes,
+    setPreferredLiveModes,
+    resetPreferredLiveMode,
+    isRestreamedStates,
+    supportsAudioOutputStates,
+  } = useCameraLiveMode(cameras, windowVisible);
+
+  const [globalAutoLive] = usePersistence("autoLiveView", true);
+
+  const { allGroupsStreamingSettings, setAllGroupsStreamingSettings } =
+    useStreamingSettings();
+
+  const currentGroupStreamingSettings = useMemo(() => {
+    if (cameraGroup && cameraGroup != "default" && allGroupsStreamingSettings) {
+      return allGroupsStreamingSettings[cameraGroup];
+    }
+  }, [allGroupsStreamingSettings, cameraGroup]);
+
+  // grid layout
   const ResponsiveGridLayout = useMemo(() => WidthProvider(Responsive), []);
@@ -342,6 +368,105 @@ export default function DraggableGridLayout({
     placeholder.h = layoutItem.h;
   };

+  // audio and stats states
+  const [audioStates, setAudioStates] = useState<AudioState>({});
+  const [volumeStates, setVolumeStates] = useState<VolumeState>({});
+  const [statsStates, setStatsStates] = useState<StatsState>(() => {
+    const initialStates: StatsState = {};
+    cameras.forEach((camera) => {
+      initialStates[camera.name] = false;
+    });
+    return initialStates;
+  });
+
+  const toggleStats = (cameraName: string): void => {
+    setStatsStates((prev) => ({
+      ...prev,
+      [cameraName]: !prev[cameraName],
+    }));
+  };
+
+  useEffect(() => {
+    if (!allGroupsStreamingSettings) {
+      return;
+    }
+
+    const initialAudioStates: AudioState = {};
+    const initialVolumeStates: VolumeState = {};
+
+    Object.entries(allGroupsStreamingSettings).forEach(([_, groupSettings]) => {
+      Object.entries(groupSettings).forEach(([camera, cameraSettings]) => {
+        initialAudioStates[camera] = cameraSettings.playAudio ?? false;
+        initialVolumeStates[camera] = cameraSettings.volume ?? 1;
+      });
+    });
+
+    setAudioStates(initialAudioStates);
+    setVolumeStates(initialVolumeStates);
+  }, [allGroupsStreamingSettings]);
+
+  const toggleAudio = (cameraName: string) => {
+    setAudioStates((prev) => ({
+      ...prev,
+      [cameraName]: !prev[cameraName],
+    }));
+  };
+
+  const onSaveMuting = useCallback(
+    (playAudio: boolean) => {
+      if (!cameraGroup || !allGroupsStreamingSettings) {
+        return;
+      }
+
+      const existingGroupSettings =
+        allGroupsStreamingSettings[cameraGroup] || {};
+
+      const updatedSettings: AllGroupsStreamingSettings = {
+        ...Object.fromEntries(
+          Object.entries(allGroupsStreamingSettings || {}).filter(
+            ([key]) => key !== cameraGroup,
+          ),
+        ),
+        [cameraGroup]: {
+          ...existingGroupSettings,
+          ...Object.fromEntries(
+            Object.entries(existingGroupSettings).map(
+              ([cameraName, settings]) => [
+                cameraName,
+                {
+                  ...settings,
+                  playAudio: playAudio,
+                },
+              ],
+            ),
+          ),
+        },
+      };
+
+      setAllGroupsStreamingSettings?.(updatedSettings);
+    },
+    [cameraGroup, allGroupsStreamingSettings, setAllGroupsStreamingSettings],
+  );
+
+  const muteAll = () => {
+    const updatedStates: AudioState = {};
+    cameras.forEach((camera) => {
+      updatedStates[camera.name] = false;
+    });
+    setAudioStates(updatedStates);
+    onSaveMuting(false);
+  };
+
+  const unmuteAll = () => {
+    const updatedStates: AudioState = {};
+    cameras.forEach((camera) => {
+      updatedStates[camera.name] = true;
+    });
+    setAudioStates(updatedStates);
+    onSaveMuting(true);
+  };
+
   return (
     <>
       <Toaster position="top-center" closeButton={true} />
@@ -364,7 +489,7 @@ export default function DraggableGridLayout({
         </div>
       ) : (
         <div
-          className="no-scrollbar my-2 overflow-x-hidden px-2 pb-8"
+          className="no-scrollbar my-2 select-none overflow-x-hidden px-2 pb-8"
           ref={gridContainerRef}
         >
           <EditGroupDialog
@@ -420,40 +545,87 @@ export default function DraggableGridLayout({
               } else {
                 grow = "aspect-video";
               }
+
+              const streamName =
+                currentGroupStreamingSettings?.[camera.name]?.streamName ||
+                Object.values(camera.live.streams)[0];
+              const autoLive =
+                currentGroupStreamingSettings?.[camera.name]?.streamType !==
+                "no-streaming";
+              const showStillWithoutActivity =
+                currentGroupStreamingSettings?.[camera.name]?.streamType !==
+                "continuous";
+              const useWebGL =
+                currentGroupStreamingSettings?.[camera.name]
+                  ?.compatibilityMode || false;
+
               return (
-                <LivePlayerGridItem
+                <GridLiveContextMenu
+                  className={grow}
                   key={camera.name}
-                  cameraRef={cameraRef}
-                  className={cn(
-                    "rounded-lg bg-black md:rounded-2xl",
-                    grow,
-                    isEditMode &&
-                      showCircles &&
-                      "outline-2 outline-muted-foreground hover:cursor-grab hover:outline-4 active:cursor-grabbing",
-                  )}
-                  windowVisible={
-                    windowVisible && visibleCameras.includes(camera.name)
-                  }
-                  cameraConfig={camera}
+                  camera={camera.name}
+                  streamName={streamName}
+                  cameraGroup={cameraGroup}
                   preferredLiveMode={preferredLiveModes[camera.name] ?? "mse"}
-                  onClick={() => {
-                    !isEditMode && onSelectCamera(camera.name);
-                  }}
-                  onError={(e) => {
-                    setPreferredLiveModes((prevModes) => {
-                      const newModes = { ...prevModes };
-                      if (e === "mse-decode") {
-                        newModes[camera.name] = "webrtc";
-                      } else {
-                        newModes[camera.name] = "jsmpeg";
-                      }
-                      return newModes;
-                    });
-                  }}
-                  onResetLiveMode={() => resetPreferredLiveMode(camera.name)}
+                  isRestreamed={isRestreamedStates[camera.name]}
+                  supportsAudio={
+                    supportsAudioOutputStates[streamName].supportsAudio
+                  }
+                  audioState={audioStates[camera.name]}
+                  toggleAudio={() => toggleAudio(camera.name)}
+                  statsState={statsStates[camera.name]}
+                  toggleStats={() => toggleStats(camera.name)}
+                  volumeState={volumeStates[camera.name]}
+                  setVolumeState={(value) =>
+                    setVolumeStates({
+                      [camera.name]: value,
+                    })
+                  }
+                  muteAll={muteAll}
+                  unmuteAll={unmuteAll}
+                  resetPreferredLiveMode={() =>
+                    resetPreferredLiveMode(camera.name)
+                  }
                 >
+                  <LivePlayer
+                    key={camera.name}
+                    streamName={streamName}
+                    autoLive={autoLive ?? globalAutoLive}
+                    showStillWithoutActivity={showStillWithoutActivity ?? true}
+                    useWebGL={useWebGL}
+                    cameraRef={cameraRef}
+                    className={cn(
+                      "rounded-lg bg-black md:rounded-2xl",
+                      grow,
+                      isEditMode &&
+                        showCircles &&
+                        "outline-2 outline-muted-foreground hover:cursor-grab hover:outline-4 active:cursor-grabbing",
+                    )}
+                    windowVisible={
+                      windowVisible && visibleCameras.includes(camera.name)
+                    }
+                    cameraConfig={camera}
+                    preferredLiveMode={preferredLiveModes[camera.name] ?? "mse"}
+                    playInBackground={false}
+                    showStats={statsStates[camera.name]}
+                    onClick={() => {
+                      !isEditMode && onSelectCamera(camera.name);
+                    }}
+                    onError={(e) => {
+                      setPreferredLiveModes((prevModes) => {
+                        const newModes = { ...prevModes };
+                        if (e === "mse-decode") {
+                          newModes[camera.name] = "webrtc";
+                        } else {
+                          newModes[camera.name] = "jsmpeg";
+                        }
+                        return newModes;
+                      });
+                    }}
+                    onResetLiveMode={() => resetPreferredLiveMode(camera.name)}
+                    playAudio={audioStates[camera.name]}
+                    volume={volumeStates[camera.name]}
+                  />
                   {isEditMode && showCircles && <CornerCircles />}
-                </LivePlayerGridItem>
+                </GridLiveContextMenu>
               );
             })}
           </ResponsiveGridLayout>
@ -596,41 +768,57 @@ const BirdseyeLivePlayerGridItem = React.forwardRef<
}, },
); );
type LivePlayerGridItemProps = { type GridLiveContextMenuProps = {
className?: string;
style?: React.CSSProperties; style?: React.CSSProperties;
className: string;
onMouseDown?: React.MouseEventHandler<HTMLDivElement>; onMouseDown?: React.MouseEventHandler<HTMLDivElement>;
onMouseUp?: React.MouseEventHandler<HTMLDivElement>; onMouseUp?: React.MouseEventHandler<HTMLDivElement>;
onTouchEnd?: React.TouchEventHandler<HTMLDivElement>; onTouchEnd?: React.TouchEventHandler<HTMLDivElement>;
children?: React.ReactNode; children?: React.ReactNode;
cameraRef: (node: HTMLElement | null) => void; camera: string;
windowVisible: boolean; streamName: string;
cameraConfig: CameraConfig; cameraGroup: string;
preferredLiveMode: LivePlayerMode; preferredLiveMode: string;
onClick: () => void; isRestreamed: boolean;
onError: (e: LivePlayerError) => void; supportsAudio: boolean;
onResetLiveMode: () => void; audioState: boolean;
toggleAudio: () => void;
statsState: boolean;
toggleStats: () => void;
volumeState?: number;
setVolumeState: (volumeState: number) => void;
muteAll: () => void;
unmuteAll: () => void;
resetPreferredLiveMode: () => void;
}; };
const GridLiveContextMenu = React.forwardRef<
HTMLDivElement,
GridLiveContextMenuProps
>(
(
{
className,
style,
onMouseDown,
onMouseUp,
onTouchEnd,
children,
camera,
streamName,
cameraGroup,
preferredLiveMode,
isRestreamed,
supportsAudio,
audioState,
toggleAudio,
statsState,
toggleStats,
volumeState,
setVolumeState,
muteAll,
unmuteAll,
resetPreferredLiveMode,
...props
},
ref,
@@ -644,18 +832,26 @@ const LivePlayerGridItem = React.forwardRef<
onTouchEnd={onTouchEnd}
{...props}
>
<LiveContextMenu
className={className}
camera={camera}
streamName={streamName}
cameraGroup={cameraGroup}
preferredLiveMode={preferredLiveMode}
isRestreamed={isRestreamed}
supportsAudio={supportsAudio}
audioState={audioState}
toggleAudio={toggleAudio}
statsState={statsState}
toggleStats={toggleStats}
volumeState={volumeState}
setVolumeState={setVolumeState}
muteAll={muteAll}
unmuteAll={unmuteAll}
resetPreferredLiveMode={resetPreferredLiveMode}
>
{children}
</LiveContextMenu>
</div>
);
},


@@ -17,6 +17,11 @@ import {
DropdownMenuItem,
DropdownMenuTrigger,
} from "@/components/ui/dropdown-menu";
import {
Popover,
PopoverContent,
PopoverTrigger,
} from "@/components/ui/popover";
import {
Tooltip,
TooltipContent,
@@ -62,29 +67,52 @@ import {
FaMicrophoneSlash,
} from "react-icons/fa";
import { GiSpeaker, GiSpeakerOff } from "react-icons/gi";
import {
TbRecordMail,
TbRecordMailOff,
TbViewfinder,
TbViewfinderOff,
} from "react-icons/tb";
import { IoIosWarning, IoMdArrowRoundBack } from "react-icons/io";
import {
LuCheck,
LuEar,
LuEarOff,
LuExternalLink,
LuHistory,
LuInfo,
LuPictureInPicture,
LuVideo,
LuVideoOff,
LuX,
} from "react-icons/lu";
import {
MdNoPhotography,
MdOutlineRestartAlt,
MdPersonOff,
MdPersonSearch,
MdPhotoCamera,
MdZoomIn,
MdZoomOut,
} from "react-icons/md";
import { Link, useNavigate } from "react-router-dom";
import { TransformWrapper, TransformComponent } from "react-zoom-pan-pinch";
import useSWR from "swr";
import { cn } from "@/lib/utils";
import { useSessionPersistence } from "@/hooks/use-session-persistence";
import {
Select,
SelectContent,
SelectGroup,
SelectItem,
SelectTrigger,
} from "@/components/ui/select";
import { usePersistence } from "@/hooks/use-persistence";
import { Label } from "@/components/ui/label";
import { Switch } from "@/components/ui/switch";
import axios from "axios";
import { toast } from "sonner";
import { Toaster } from "@/components/ui/sonner";
type LiveCameraViewProps = {
config?: FrigateConfig;
@@ -109,17 +137,20 @@ export default function LiveCameraView({
// supported features
const [streamName, setStreamName] = usePersistence<string>(
`${camera.name}-stream`,
Object.values(camera.live.streams)[0],
);
const isRestreamed = useMemo(
() =>
config &&
Object.keys(config.go2rtc.streams || {}).includes(streamName ?? ""),
[config, streamName],
);
const { data: cameraMetadata } = useSWR<LiveStreamMetadata>(
isRestreamed ? `go2rtc/streams/${streamName}` : null,
{
revalidateOnFocus: false,
},
@@ -209,6 +240,13 @@ export default function LiveCameraView({
const [pip, setPip] = useState(false);
const [lowBandwidth, setLowBandwidth] = useState(false);
const [playInBackground, setPlayInBackground] = usePersistence<boolean>(
`${camera.name}-background-play`,
false,
);
const [showStats, setShowStats] = useState(false);
const [fullResolution, setFullResolution] = useState<VideoResolutionType>({
width: 0,
height: 0,
@@ -337,6 +375,7 @@ export default function LiveCameraView({
return (
<TransformWrapper minScale={1.0} wheel={{ smoothStep: 0.005 }}>
<Toaster position="top-center" closeButton={true} />
<div
ref={mainRef}
className={
@@ -460,13 +499,24 @@ export default function LiveCameraView({
/>
)}
<FrigateCameraFeatures
camera={camera}
recordingEnabled={camera.record.enabled_in_config}
audioDetectEnabled={camera.audio.enabled_in_config}
autotrackingEnabled={
camera.onvif.autotracking.enabled_in_config
}
fullscreen={fullscreen}
streamName={streamName ?? ""}
setStreamName={setStreamName}
preferredLiveMode={preferredLiveMode}
playInBackground={playInBackground ?? false}
setPlayInBackground={setPlayInBackground}
showStats={showStats}
setShowStats={setShowStats}
isRestreamed={isRestreamed ?? false}
setLowBandwidth={setLowBandwidth}
supportsAudioOutput={supportsAudioOutput}
supports2WayTalk={supports2WayTalk}
/>
</div>
</TooltipProvider>
@@ -499,9 +549,13 @@ export default function LiveCameraView({
showStillWithoutActivity={false}
cameraConfig={camera}
playAudio={audio}
playInBackground={playInBackground ?? false}
showStats={showStats}
micEnabled={mic}
iOSCompatFullScreen={isIOS}
preferredLiveMode={preferredLiveMode}
useWebGL={true}
streamName={streamName ?? ""}
pip={pip}
containerRef={containerRef}
setFullResolution={setFullResolution}
@@ -816,12 +870,49 @@ function PtzControlPanel({
);
}
function OnDemandRetentionMessage({ camera }: { camera: CameraConfig }) {
const rankMap = { all: 0, motion: 1, active_objects: 2 };
const getValidMode = (retain?: { mode?: string }): keyof typeof rankMap => {
const mode = retain?.mode;
return mode && mode in rankMap ? (mode as keyof typeof rankMap) : "all";
};
const recordRetainMode = getValidMode(camera.record.retain);
const alertsRetainMode = getValidMode(camera.review.alerts.retain);
const effectiveRetainMode =
rankMap[alertsRetainMode] < rankMap[recordRetainMode]
? recordRetainMode
: alertsRetainMode;
const source = effectiveRetainMode === recordRetainMode ? "camera" : "alerts";
return effectiveRetainMode !== "all" ? (
<div>
Your {source} recording retention configuration is set to{" "}
<code>mode: {effectiveRetainMode}</code>, so this on-demand recording will
only keep segments with {effectiveRetainMode.replaceAll("_", " ")}.
</div>
) : null;
}
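The ranking in `OnDemandRetentionMessage` picks the more restrictive of the camera-level and alerts-level retention modes. As a rough sketch of that logic in isolation — `pickEffectiveRetainMode` is a hypothetical name introduced here, not part of the PR — it behaves like:

```typescript
// Retention modes ranked from least to most restrictive; unknown or
// missing modes fall back to "all", matching getValidMode above.
const rankMap = { all: 0, motion: 1, active_objects: 2 } as const;
type RetainMode = keyof typeof rankMap;

function getValidMode(retain?: { mode?: string }): RetainMode {
  const mode = retain?.mode;
  return mode && mode in rankMap ? (mode as RetainMode) : "all";
}

// Returns the more restrictive of the two modes, plus which config it
// came from ("camera" for record.retain, "alerts" for alerts.retain).
function pickEffectiveRetainMode(
  recordRetain?: { mode?: string },
  alertsRetain?: { mode?: string },
): { mode: RetainMode; source: "camera" | "alerts" } {
  const recordMode = getValidMode(recordRetain);
  const alertsMode = getValidMode(alertsRetain);
  const mode =
    rankMap[alertsMode] < rankMap[recordMode] ? recordMode : alertsMode;
  const source = mode === recordMode ? "camera" : "alerts";
  return { mode, source };
}
```

So a camera with `record.retain.mode: motion` and `alerts.retain.mode: all` warns about the camera config, while `active_objects` on alerts wins over `all` on the camera.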
type FrigateCameraFeaturesProps = {
camera: CameraConfig;
recordingEnabled: boolean;
audioDetectEnabled: boolean;
autotrackingEnabled: boolean;
fullscreen: boolean;
streamName: string;
setStreamName?: (value: string | undefined) => void;
preferredLiveMode: string;
playInBackground: boolean;
setPlayInBackground: (value: boolean | undefined) => void;
showStats: boolean;
setShowStats: (value: boolean) => void;
isRestreamed: boolean;
setLowBandwidth: React.Dispatch<React.SetStateAction<boolean>>;
supportsAudioOutput: boolean;
supports2WayTalk: boolean;
};
function FrigateCameraFeatures({
camera,
@@ -829,14 +920,124 @@ function FrigateCameraFeatures({
audioDetectEnabled,
autotrackingEnabled,
fullscreen,
streamName,
setStreamName,
preferredLiveMode,
playInBackground,
setPlayInBackground,
showStats,
setShowStats,
isRestreamed,
setLowBandwidth,
supportsAudioOutput,
supports2WayTalk,
}: FrigateCameraFeaturesProps) {
const { payload: detectState, send: sendDetect } = useDetectState(
camera.name,
);
const { payload: recordState, send: sendRecord } = useRecordingsState(
camera.name,
);
const { payload: snapshotState, send: sendSnapshot } = useSnapshotsState(
camera.name,
);
const { payload: audioState, send: sendAudio } = useAudioState(camera.name);
const { payload: autotrackingState, send: sendAutotracking } =
useAutotrackingState(camera.name);
// manual event
const recordingEventIdRef = useRef<string | null>(null);
const [isRecording, setIsRecording] = useState(false);
const [activeToastId, setActiveToastId] = useState<string | number | null>(
null,
);
const createEvent = useCallback(async () => {
try {
const response = await axios.post(
`events/${camera.name}/on_demand/create`,
{
include_recording: true,
duration: null,
},
);
if (response.data.success) {
recordingEventIdRef.current = response.data.event_id;
setIsRecording(true);
const toastId = toast.success(
<div className="flex flex-col space-y-3">
<div className="font-semibold">
Started manual on-demand recording.
</div>
{!camera.record.enabled || camera.record.retain.days == 0 ? (
<div>
Since recording is disabled or restricted in the config for this
camera, only a snapshot will be saved.
</div>
) : (
<OnDemandRetentionMessage camera={camera} />
)}
</div>,
{
position: "top-center",
duration: 10000,
},
);
setActiveToastId(toastId);
}
} catch (error) {
toast.error("Failed to start manual on-demand recording.", {
position: "top-center",
});
}
}, [camera]);
const endEvent = useCallback(() => {
if (activeToastId) {
toast.dismiss(activeToastId);
}
try {
if (recordingEventIdRef.current) {
axios.put(`events/${recordingEventIdRef.current}/end`, {
end_time: Math.ceil(Date.now() / 1000),
});
recordingEventIdRef.current = null;
setIsRecording(false);
toast.success("Ended manual on-demand recording.", {
position: "top-center",
});
}
} catch (error) {
toast.error("Failed to end manual on-demand recording.", {
position: "top-center",
});
}
}, [activeToastId]);
const handleEventButtonClick = useCallback(() => {
if (isRecording) {
endEvent();
} else {
createEvent();
}
}, [createEvent, endEvent, isRecording]);
useEffect(() => {
// ensure manual event is stopped when component unmounts
return () => {
if (recordingEventIdRef.current) {
endEvent();
}
};
// mount/unmount only
// eslint-disable-next-line react-hooks/exhaustive-deps
}, []);
// navigate for debug view
const navigate = useNavigate();
// desktop shows icons part of row
if (isDesktop || isTablet) {
@@ -888,6 +1089,264 @@ function FrigateCameraFeatures({
}
/>
)}
<CameraFeatureToggle
className={cn(
"p-2 md:p-0",
isRecording && "animate-pulse bg-red-500 hover:bg-red-600",
)}
variant={fullscreen ? "overlay" : "primary"}
Icon={isRecording ? TbRecordMail : TbRecordMailOff}
isActive={isRecording}
title={`${isRecording ? "Stop" : "Start"} on-demand recording`}
onClick={handleEventButtonClick}
/>
<DropdownMenu modal={false}>
<DropdownMenuTrigger>
<div
className={cn(
"flex flex-col items-center justify-center rounded-lg bg-secondary p-2 text-secondary-foreground md:p-0",
)}
>
<FaCog
className="size-5 text-secondary-foreground md:m-[6px]"
/>
</div>
</DropdownMenuTrigger>
<DropdownMenuContent className="max-w-96">
<div className="flex flex-col gap-5 p-4">
{!isRestreamed && (
<div className="flex flex-col gap-2">
<Label>Stream</Label>
<div className="flex flex-row items-center gap-1 text-sm text-muted-foreground">
<LuX className="size-4 text-danger" />
<div>Restreaming is not enabled for this camera.</div>
<Popover>
<PopoverTrigger asChild>
<div className="cursor-pointer p-0">
<LuInfo className="size-4" />
<span className="sr-only">Info</span>
</div>
</PopoverTrigger>
<PopoverContent className="w-80 text-xs">
Set up go2rtc for additional live view options and audio
for this camera.
<div className="mt-2 flex items-center text-primary">
<Link
to="https://docs.frigate.video/configuration/live"
target="_blank"
rel="noopener noreferrer"
className="inline"
>
Read the documentation{" "}
<LuExternalLink className="ml-2 inline-flex size-3" />
</Link>
</div>
</PopoverContent>
</Popover>
</div>
</div>
)}
{isRestreamed &&
Object.values(camera.live.streams).length > 0 && (
<div className="flex flex-col gap-1">
<Label htmlFor="streaming-method">Stream</Label>
<Select
value={streamName}
onValueChange={(value) => {
setStreamName?.(value);
}}
>
<SelectTrigger className="w-full">
{Object.keys(camera.live.streams).find(
(key) => camera.live.streams[key] === streamName,
)}
</SelectTrigger>
<SelectContent>
<SelectGroup>
{Object.entries(camera.live.streams).map(
([stream, name]) => (
<SelectItem
key={stream}
className="cursor-pointer"
value={name}
>
{stream}
</SelectItem>
),
)}
</SelectGroup>
</SelectContent>
</Select>
{preferredLiveMode != "jsmpeg" && isRestreamed && (
<div className="flex flex-row items-center gap-1 text-sm text-muted-foreground">
{supportsAudioOutput ? (
<>
<LuCheck className="size-4 text-success" />
<div>Audio is available for this stream</div>
</>
) : (
<>
<LuX className="size-4 text-danger" />
<div>Audio is unavailable for this stream</div>
<Popover>
<PopoverTrigger asChild>
<div className="cursor-pointer p-0">
<LuInfo className="size-4" />
<span className="sr-only">Info</span>
</div>
</PopoverTrigger>
<PopoverContent className="w-80 text-xs">
Audio must be output from your camera and
configured in go2rtc for this stream.
<div className="mt-2 flex items-center text-primary">
<Link
to="https://docs.frigate.video/configuration/live"
target="_blank"
rel="noopener noreferrer"
className="inline"
>
Read the documentation{" "}
<LuExternalLink className="ml-2 inline-flex size-3" />
</Link>
</div>
</PopoverContent>
</Popover>
</>
)}
</div>
)}
{preferredLiveMode != "jsmpeg" &&
isRestreamed &&
supportsAudioOutput && (
<div className="flex flex-row items-center gap-1 text-sm text-muted-foreground">
{supports2WayTalk ? (
<>
<LuCheck className="size-4 text-success" />
<div>
Two-way talk is available for this stream
</div>
</>
) : (
<>
<LuX className="size-4 text-danger" />
<div>
Two-way talk is unavailable for this stream
</div>
<Popover>
<PopoverTrigger asChild>
<div className="cursor-pointer p-0">
<LuInfo className="size-4" />
<span className="sr-only">Info</span>
</div>
</PopoverTrigger>
<PopoverContent className="w-80 text-xs">
Your device must support the feature and
WebRTC must be configured for two-way talk.
<div className="mt-2 flex items-center text-primary">
<Link
to="https://docs.frigate.video/configuration/live/#webrtc-extra-configuration"
target="_blank"
rel="noopener noreferrer"
className="inline"
>
Read the documentation{" "}
<LuExternalLink className="ml-2 inline-flex size-3" />
</Link>
</div>
</PopoverContent>
</Popover>
</>
)}
</div>
)}
{preferredLiveMode == "jsmpeg" && isRestreamed && (
<div className="flex flex-col items-center gap-3">
<div className="flex flex-row items-center gap-2">
<IoIosWarning className="mr-1 size-8 text-danger" />
<p className="text-sm">
Live view is in low-bandwidth mode due to buffering
or stream errors.
</p>
</div>
<Button
className={`flex items-center gap-2.5 rounded-lg`}
aria-label="Reset the stream"
variant="outline"
size="sm"
onClick={() => setLowBandwidth(false)}
>
<MdOutlineRestartAlt className="size-5 text-primary-variant" />
<div className="text-primary-variant">
Reset stream
</div>
</Button>
</div>
)}
</div>
)}
{isRestreamed && (
<div className="flex flex-col gap-1">
<div className="flex items-center justify-between">
<Label
className="mx-0 cursor-pointer text-primary"
htmlFor="backgroundplay"
>
Play in background
</Label>
<Switch
className="ml-1"
id="backgroundplay"
checked={playInBackground}
onCheckedChange={(checked) =>
setPlayInBackground(checked)
}
/>
</div>
<p className="text-sm text-muted-foreground">
Enable this option to continue streaming when the player is
hidden.
</p>
</div>
)}
<div className="flex flex-col gap-1">
<div className="flex items-center justify-between">
<Label
className="mx-0 cursor-pointer text-primary"
htmlFor="showstats"
>
Show stream stats
</Label>
<Switch
className="ml-1"
id="showstats"
checked={showStats}
onCheckedChange={(checked) => setShowStats(checked)}
/>
</div>
<p className="text-sm text-muted-foreground">
Enable this option to show stream statistics as an overlay on
the camera feed.
</p>
</div>
<div className="flex flex-col gap-1">
<div className="flex items-center justify-between text-sm font-medium leading-none">
Debug View
<LuExternalLink
onClick={() =>
navigate(`/settings?page=debug&camera=${camera.name}`)
}
className="ml-2 inline-flex size-5 cursor-pointer"
/>
</div>
</div>
</div>
</DropdownMenuContent>
</DropdownMenu>
</>
);
}
@@ -908,44 +1367,276 @@ function FrigateCameraFeatures({
title={`${camera} Settings`}
/>
</DrawerTrigger>
<DrawerContent className="rounded-2xl px-2 py-4">
<div className="mt-2 flex flex-col gap-2">
<FilterSwitch
label="Object Detection"
isChecked={detectState == "ON"}
onCheckedChange={() =>
sendDetect(detectState == "ON" ? "OFF" : "ON")
}
/>
{recordingEnabled && (
<FilterSwitch
label="Recording"
isChecked={recordState == "ON"}
onCheckedChange={() =>
sendRecord(recordState == "ON" ? "OFF" : "ON")
}
/>
)}
<FilterSwitch
label="Snapshots"
isChecked={snapshotState == "ON"}
onCheckedChange={() =>
sendSnapshot(snapshotState == "ON" ? "OFF" : "ON")
}
/>
{audioDetectEnabled && (
<FilterSwitch
label="Audio Detection"
isChecked={audioState == "ON"}
onCheckedChange={() =>
sendAudio(audioState == "ON" ? "OFF" : "ON")
}
/>
)}
{autotrackingEnabled && (
<FilterSwitch
label="Autotracking"
isChecked={autotrackingState == "ON"}
onCheckedChange={() =>
sendAutotracking(autotrackingState == "ON" ? "OFF" : "ON")
}
/>
)}
</div>
<div className="mt-3 flex flex-col gap-5">
{!isRestreamed && (
<div className="flex flex-col gap-2 p-2">
<Label>Stream</Label>
<div className="flex flex-row items-center gap-1 text-sm text-muted-foreground">
<LuX className="size-4 text-danger" />
<div>Restreaming is not enabled for this camera.</div>
<Popover>
<PopoverTrigger asChild>
<div className="cursor-pointer p-0">
<LuInfo className="size-4" />
<span className="sr-only">Info</span>
</div>
</PopoverTrigger>
<PopoverContent className="w-80 text-xs">
Set up go2rtc for additional live view options and audio for
this camera.
<div className="mt-2 flex items-center text-primary">
<Link
to="https://docs.frigate.video/configuration/live"
target="_blank"
rel="noopener noreferrer"
className="inline"
>
Read the documentation{" "}
<LuExternalLink className="ml-2 inline-flex size-3" />
</Link>
</div>
</PopoverContent>
</Popover>
</div>
</div>
)}
{isRestreamed && Object.values(camera.live.streams).length > 0 && (
<div className="mt-1 p-2">
<div className="mb-1 text-sm">Stream</div>
<Select
value={streamName}
onValueChange={(value) => {
setStreamName?.(value);
}}
>
<SelectTrigger className="w-full">
{Object.keys(camera.live.streams).find(
(key) => camera.live.streams[key] === streamName,
)}
</SelectTrigger>
<SelectContent>
<SelectGroup>
{Object.entries(camera.live.streams).map(
([stream, name]) => (
<SelectItem
key={stream}
className="cursor-pointer"
value={name}
>
{stream}
</SelectItem>
),
)}
</SelectGroup>
</SelectContent>
</Select>
{preferredLiveMode != "jsmpeg" && isRestreamed && (
<div className="mt-1 flex flex-row items-center gap-1 text-sm text-muted-foreground">
{supportsAudioOutput ? (
<>
<LuCheck className="size-4 text-success" />
<div>Audio is available for this stream</div>
</>
) : (
<>
<LuX className="size-4 text-danger" />
<div>Audio is unavailable for this stream</div>
<Popover>
<PopoverTrigger asChild>
<div className="cursor-pointer p-0">
<LuInfo className="size-4" />
<span className="sr-only">Info</span>
</div>
</PopoverTrigger>
<PopoverContent className="w-52 text-xs">
Audio must be output from your camera and configured
in go2rtc for this stream.
<div className="mt-2 flex items-center text-primary">
<Link
to="https://docs.frigate.video/configuration/live"
target="_blank"
rel="noopener noreferrer"
className="inline"
>
Read the documentation{" "}
<LuExternalLink className="ml-2 inline-flex size-3" />
</Link>
</div>
</PopoverContent>
</Popover>
</>
)}
</div>
)}
{preferredLiveMode != "jsmpeg" &&
isRestreamed &&
supportsAudioOutput && (
<div className="flex flex-row items-center gap-1 text-sm text-muted-foreground">
{supports2WayTalk ? (
<>
<LuCheck className="size-4 text-success" />
<div>Two-way talk is available for this stream</div>
</>
) : (
<>
<LuX className="size-4 text-danger" />
<div>Two-way talk is unavailable for this stream</div>
<Popover>
<PopoverTrigger asChild>
<div className="cursor-pointer p-0">
<LuInfo className="size-4" />
<span className="sr-only">Info</span>
</div>
</PopoverTrigger>
<PopoverContent className="w-52 text-xs">
Your device must support the feature and WebRTC
must be configured for two-way talk.
<div className="mt-2 flex items-center text-primary">
<Link
to="https://docs.frigate.video/configuration/live/#webrtc-extra-configuration"
target="_blank"
rel="noopener noreferrer"
className="inline"
>
Read the documentation{" "}
<LuExternalLink className="ml-2 inline-flex size-3" />
</Link>
</div>
</PopoverContent>
</Popover>
</>
)}
</div>
)}
{preferredLiveMode == "jsmpeg" && isRestreamed && (
<div className="mt-2 flex flex-col items-center gap-3">
<div className="flex flex-row items-center gap-2">
<IoIosWarning className="mr-1 size-8 text-danger" />
<p className="text-sm">
Live view is in low-bandwidth mode due to buffering or
stream errors.
</p>
</div>
<Button
className={`flex items-center gap-2.5 rounded-lg`}
aria-label="Reset the stream"
variant="outline"
size="sm"
onClick={() => setLowBandwidth(false)}
>
<MdOutlineRestartAlt className="size-5 text-primary-variant" />
<div className="text-primary-variant">Reset stream</div>
</Button>
</div>
)}
</div>
)}
<div className="flex flex-col gap-1 px-2">
<div className="mb-1 text-sm font-medium leading-none">
On-Demand Recording
</div>
<Button
onClick={handleEventButtonClick}
className={cn(
"w-full",
isRecording && "animate-pulse bg-red-500 hover:bg-red-600",
)}
>
{isRecording ? "End" : "Start"} on-demand recording
</Button>
<p className="text-sm text-muted-foreground">
Start a manual event based on this camera's recording retention
settings.
</p>
</div>
{isRestreamed && (
<>
<div className="flex flex-col gap-2">
<FilterSwitch
label="Play in Background"
isChecked={playInBackground}
onCheckedChange={(checked) => {
setPlayInBackground(checked);
}}
/>
<p className="mx-2 -mt-2 text-sm text-muted-foreground">
Enable this option to continue streaming when the player is
hidden.
</p>
</div>
<div className="flex flex-col gap-2">
<FilterSwitch
label="Show Stats"
isChecked={showStats}
onCheckedChange={(checked) => {
setShowStats(checked);
}}
/>
<p className="mx-2 -mt-2 text-sm text-muted-foreground">
Enable this option to show stream statistics as an overlay on
the camera feed.
</p>
</div>
</>
)}
<div className="mb-3 flex flex-col gap-1 px-2">
<div className="flex items-center justify-between text-sm font-medium leading-none">
Debug View
<LuExternalLink
onClick={() =>
navigate(`/settings?page=debug&camera=${camera.name}`)
}
className="ml-2 inline-flex size-5 cursor-pointer"
/>
</div>
</div>
</div>
</DrawerContent>
</Drawer>
);


@@ -28,10 +28,16 @@ import DraggableGridLayout from "./DraggableGridLayout";
import { IoClose } from "react-icons/io5";
import { LuLayoutDashboard } from "react-icons/lu";
import { cn } from "@/lib/utils";
import {
AudioState,
LivePlayerError,
StatsState,
VolumeState,
} from "@/types/live";
import { FaCompress, FaExpand } from "react-icons/fa";
import useCameraLiveMode from "@/hooks/use-camera-live-mode";
import { useResizeObserver } from "@/hooks/resize-observer";
import LiveContextMenu from "@/components/menu/LiveContextMenu";
type LiveDashboardViewProps = {
cameras: CameraConfig[];
@@ -184,8 +190,13 @@ export default function LiveDashboardView({
};
}, []);
const {
preferredLiveModes,
setPreferredLiveModes,
resetPreferredLiveMode,
isRestreamedStates,
supportsAudioOutputStates,
} = useCameraLiveMode(cameras, windowVisible);
const cameraRef = useCallback(
(node: HTMLElement | null) => {
@@ -221,9 +232,45 @@ export default function LiveDashboardView({
[setPreferredLiveModes],
);
// audio states
const [audioStates, setAudioStates] = useState<AudioState>({});
const [volumeStates, setVolumeStates] = useState<VolumeState>({});
const [statsStates, setStatsStates] = useState<StatsState>({});
const toggleStats = (cameraName: string): void => {
setStatsStates((prev) => ({
...prev,
[cameraName]: !prev[cameraName],
}));
};
const toggleAudio = (cameraName: string): void => {
setAudioStates((prev) => ({
...prev,
[cameraName]: !prev[cameraName],
}));
};
const muteAll = (): void => {
const updatedStates: Record<string, boolean> = {};
visibleCameras.forEach((cameraName) => {
updatedStates[cameraName] = false;
});
setAudioStates(updatedStates);
};
const unmuteAll = (): void => {
const updatedStates: Record<string, boolean> = {};
visibleCameras.forEach((cameraName) => {
updatedStates[cameraName] = true;
});
setAudioStates(updatedStates);
};
return (
<div
className="scrollbar-container size-full select-none overflow-y-auto px-1 pt-2 md:p-2"
ref={containerRef}
>
{isMobile && (
@@ -346,20 +393,56 @@ export default function LiveDashboardView({
grow = "aspect-video";
}
return (
<LiveContextMenu
className={grow}
key={camera.name}
camera={camera.name}
cameraGroup={cameraGroup}
streamName={Object.values(camera.live.streams)?.[0]}
preferredLiveMode={preferredLiveModes[camera.name] ?? "mse"}
isRestreamed={isRestreamedStates[camera.name]}
supportsAudio={
supportsAudioOutputStates[
Object.values(camera.live.streams)?.[0]
]?.supportsAudio ?? false
}
audioState={audioStates[camera.name]}
toggleAudio={() => toggleAudio(camera.name)}
statsState={statsStates[camera.name]}
toggleStats={() => toggleStats(camera.name)}
volumeState={volumeStates[camera.name] ?? 1}
setVolumeState={(value) =>
setVolumeStates((prev) => ({
...prev,
[camera.name]: value,
}))
}
muteAll={muteAll}
unmuteAll={unmuteAll}
resetPreferredLiveMode={() =>
resetPreferredLiveMode(camera.name)
}
>
<LivePlayer
cameraRef={cameraRef}
key={camera.name}
className={`${grow} rounded-lg bg-black md:rounded-2xl`}
windowVisible={
windowVisible && visibleCameras.includes(camera.name)
}
cameraConfig={camera}
preferredLiveMode={preferredLiveModes[camera.name] ?? "mse"}
autoLive={autoLiveView}
useWebGL={false}
playInBackground={false}
showStats={statsStates[camera.name]}
streamName={Object.values(camera.live.streams)[0]}
onClick={() => onSelectCamera(camera.name)}
onError={(e) => handleError(camera.name, e)}
onResetLiveMode={() => resetPreferredLiveMode(camera.name)}
playAudio={audioStates[camera.name] ?? false}
volume={volumeStates[camera.name]}
/>
</LiveContextMenu>
);
})}
</div>


@@ -46,6 +46,25 @@ export default function UiSettingsView() {
});
}, [config]);
const clearStreamingSettings = useCallback(async () => {
if (!config) {
return [];
}
await delData(`streaming-settings`)
.then(() => {
toast.success(`Cleared streaming settings for all camera groups.`, {
position: "top-center",
});
})
.catch((error) => {
toast.error(
`Failed to clear camera groups streaming settings: ${error.response.data.message}`,
{ position: "top-center" },
);
});
}, [config]);
useEffect(() => {
document.title = "General Settings - Frigate";
}, []);
@@ -84,11 +103,15 @@ export default function UiSettingsView() {
Automatic Live View
</Label>
</div>
<div className="my-2 max-w-5xl text-sm text-muted-foreground">
<p>
Automatically switch to a camera's live view when activity is
detected. Disabling this option causes static camera images on
your dashboards to only update once per minute.{" "}
<em>
This is a global setting but can be overridden on each
camera <strong>in camera groups only</strong>.
</em>
</p>
</div>
</div>
@@ -103,7 +126,7 @@ export default function UiSettingsView() {
Play Alert Videos
</Label>
</div>
<div className="my-2 max-w-5xl text-sm text-muted-foreground">
<p>
By default, recent alerts on the Live dashboard play as small
looping videos. Disable this option to only show a static
@@ -114,10 +137,10 @@ export default function UiSettingsView() {
</div>
<div className="my-3 flex w-full flex-col space-y-6">
<div className="mt-2 space-y-3">
<div className="space-y-0.5">
<div className="text-md">Stored Layouts</div>
<div className="my-2 max-w-5xl text-sm text-muted-foreground">
<p>
The layout of cameras in a camera group can be
dragged/resized. The positions are stored in your browser's
@@ -133,6 +156,24 @@ export default function UiSettingsView() {
</Button>
</div>
<div className="mt-2 space-y-3">
<div className="space-y-0.5">
<div className="text-md">Camera Group Streaming Settings</div>
<div className="my-2 max-w-5xl text-sm text-muted-foreground">
<p>
Streaming settings for each camera group are stored in your
browser's local storage.
</p>
</div>
</div>
<Button
aria-label="Clear all group streaming settings"
onClick={clearStreamingSettings}
>
Clear All Streaming Settings
</Button>
</div>
<Separator className="my-2 flex bg-secondary" />
<Heading as="h4" className="my-2">