blakeblackshear.frigate/frigate/embeddings/functions/minilm_l6_v2.py
Jason Hunter 36cbffcc5e Semantic Search for Detections (#11899)
* Initial re-implementation of semantic search

* put docker-compose back and make reindex match docs

* remove debug code and fix import

* fix docs

* manually build pysqlite3 as binaries are only available for x86-64

* update comment in build_pysqlite3.sh

* only embed objects

* better error handling when genai fails

* ask ollama to pull requested model at startup

* update ollama docs

* address some PR review comments

* fix lint

* use IPC to write description, update docs for reindex

* remove gemini-pro-vision from docs as it will be unavailable soon

* fix OpenAI doc available models

* fix api error in gemini and metadata for embeddings
2024-08-29 20:19:50 -06:00


"""Embedding function for ONNX MiniLM-L6 model used in Chroma."""
from chromadb.utils.embedding_functions import ONNXMiniLM_L6_V2
from frigate.const import MODEL_CACHE_DIR
class MiniLMEmbedding(ONNXMiniLM_L6_V2):
"""Override DOWNLOAD_PATH to download to cache directory."""
DOWNLOAD_PATH = f"{MODEL_CACHE_DIR}/all-MiniLM-L6-v2"