GenAI: allow configuring additional send trigger after_significant_updates as well as event_end (#16919)

This commit is contained in:
leccelecce 2025-03-04 16:23:51 +00:00 committed by GitHub
parent 76c35307b2
commit c23653338f
4 changed files with 70 additions and 2 deletions


@@ -5,7 +5,7 @@ title: Generative AI
Generative AI can be used to automatically generate descriptive text based on the thumbnails of your tracked objects. This helps with [Semantic Search](/configuration/semantic_search) in Frigate to provide more context about your tracked objects. Descriptions are accessed via the _Explore_ view in the Frigate UI by clicking on a tracked object's thumbnail.
Requests for a description are sent off automatically to your AI provider at the end of the tracked object's lifecycle. Descriptions can also be regenerated manually via the Frigate UI.
Requests for a description are sent off automatically to your AI provider at the end of the tracked object's lifecycle, or can optionally be sent earlier, after a number of significantly changed frames, for example for use in more real-time notifications. Descriptions can also be regenerated manually via the Frigate UI. Note that if you manually enter a description for a tracked object before it has ended, it will be overwritten by the generated response.
## Configuration
@@ -148,6 +148,15 @@ While generating simple descriptions of detected objects is useful, understandin
Frigate provides an [MQTT topic](/integrations/mqtt), `frigate/tracked_object_update`, that is updated with a JSON payload containing `event_id` and `description` when your AI provider returns a description for a tracked object. This description could be used directly in notifications, such as sending alerts to your phone or making audio announcements. If additional details from the tracked object are needed, you can query the [HTTP API](/integrations/api/event-events-event-id-get) using the `event_id`, eg: `http://frigate_ip:5000/api/events/<event_id>`.
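As a minimal sketch of how this could be consumed, the snippet below subscribes to that topic and looks up the full event over the HTTP API. The broker address and Frigate URL are placeholders, and it assumes the `paho-mqtt` (2.x) and `requests` libraries are installed:
```python
import json

import paho.mqtt.client as mqtt
import requests

FRIGATE_API = "http://frigate_ip:5000/api"  # placeholder, point at your Frigate instance


def on_message(client, userdata, msg):
    payload = json.loads(msg.payload)
    event_id = payload["event_id"]
    description = payload["description"]
    # Query the HTTP API if more details about the tracked object are needed
    event = requests.get(f"{FRIGATE_API}/events/{event_id}", timeout=10).json()
    print(f"{event.get('camera')} / {event.get('label')}: {description}")


client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_message
client.connect("mqtt_broker_ip", 1883)  # placeholder broker address
client.subscribe("frigate/tracked_object_update")
client.loop_forever()
```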
If you are looking to get notifications earlier than when an object ceases to be tracked, an additional send trigger, `after_significant_updates`, can be configured.
```yaml
genai:
  send_triggers:
    tracked_object_end: true # default
    after_significant_updates: 3 # how many updates to a tracked object before we should send an image
```
## Custom Prompts
Frigate sends multiple frames from the tracked object along with a prompt to your Generative AI provider asking it to generate a description. The default prompt is as follows:


@@ -813,6 +813,12 @@ cameras:
        - cat
      # Optional: Restrict generation to objects that entered any of the listed zones (default: none, all zones qualify)
      required_zones: []
      # Optional: What triggers to use to send frames for a tracked object to generative AI (default: shown below)
      send_triggers:
        # Once the object is no longer tracked
        tracked_object_end: True
        # Optional: After X many significant updates are received (default: shown below)
        after_significant_updates: None
      # Optional: Save thumbnails sent to generative AI for review/debugging purposes (default: shown below)
      debug_save_thumbnails: False


@@ -16,6 +16,17 @@ class GenAIProviderEnum(str, Enum):
    ollama = "ollama"


class GenAISendTriggersConfig(BaseModel):
    tracked_object_end: bool = Field(
        default=True, title="Send once the object is no longer tracked."
    )
    after_significant_updates: Optional[int] = Field(
        default=None,
        title="Send an early request to generative AI when X frames accumulated.",
        ge=1,
    )
# uses BaseModel because some global attributes are not available at the camera level
class GenAICameraConfig(BaseModel):
    enabled: bool = Field(default=False, title="Enable GenAI for camera.")
@@ -42,6 +53,10 @@ class GenAICameraConfig(BaseModel):
        default=False,
        title="Save thumbnails sent to generative AI for debugging purposes.",
    )
    send_triggers: GenAISendTriggersConfig = Field(
        default_factory=GenAISendTriggersConfig,
        title="What triggers to use to send frames to generative AI for a tracked object.",
    )

    @field_validator("required_zones", mode="before")
    @classmethod

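For reference, a minimal sketch of how the new send-trigger model behaves under pydantic validation. The import path is an assumption (it depends on how Frigate's config package is laid out) and the values are illustrative:
```python
from pydantic import ValidationError

from frigate.config import GenAISendTriggersConfig  # assumed import path

# Defaults: send only once the object is no longer tracked, no early request
cfg = GenAISendTriggersConfig()
assert cfg.tracked_object_end is True
assert cfg.after_significant_updates is None

# Enable an early request after 3 significant updates
cfg = GenAISendTriggersConfig(after_significant_updates=3)

# Values below 1 are rejected by the ge=1 constraint
try:
    GenAISendTriggersConfig(after_significant_updates=0)
except ValidationError as exc:
    print(exc)
```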

@@ -132,6 +132,7 @@ class EmbeddingMaintainer(threading.Thread):
        self.stop_event = stop_event
        self.tracked_events: dict[str, list[any]] = {}
        self.early_request_sent: dict[str, bool] = {}
        self.genai_client = get_genai_client(config)

        # recordings data
@@ -240,6 +241,43 @@ class EmbeddingMaintainer(threading.Thread):
            self.tracked_events[data["id"]].append(data)

        # check if we're configured to send an early request after a minimum number of updates received
        if (
            self.genai_client is not None
            and camera_config.genai.send_triggers.after_significant_updates
        ):
            if (
                len(self.tracked_events.get(data["id"], []))
                >= camera_config.genai.send_triggers.after_significant_updates
                and data["id"] not in self.early_request_sent
            ):
                if data["has_clip"] and data["has_snapshot"]:
                    event: Event = Event.get(Event.id == data["id"])

                    if (
                        not camera_config.genai.objects
                        or event.label in camera_config.genai.objects
                    ) and (
                        not camera_config.genai.required_zones
                        or set(data["entered_zones"])
                        & set(camera_config.genai.required_zones)
                    ):
                        logger.debug(f"{camera} sending early request to GenAI")

                        self.early_request_sent[data["id"]] = True
                        threading.Thread(
                            target=self._genai_embed_description,
                            name=f"_genai_embed_description_{event.id}",
                            daemon=True,
                            args=(
                                event,
                                [
                                    data["thumbnail"]
                                    for data in self.tracked_events[data["id"]]
                                ],
                            ),
                        ).start()

        self.frame_manager.close(frame_name)

    def _process_finalized(self) -> None:
@@ -300,8 +338,8 @@ class EmbeddingMaintainer(threading.Thread):
            # Run GenAI
            if (
                camera_config.genai.enabled
                and camera_config.genai.send_triggers.tracked_object_end
                and self.genai_client is not None
                and event.data.get("description") is None
                and (
                    not camera_config.genai.objects
                    or event.label in camera_config.genai.objects