blakeblackshear.frigate/docker/rockchip/conv2rknn.py

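"""Convert onnx models placed in /config/model_cache/rknn_cache/onnx to the
rknn format used by Rockchip NPUs.

Settings are read from /config/conv2rknn.yaml. A sketch of that file
(values are illustrative; the keys are the ones consumed by this script):

    soc: ["rk3588"]        # target SoCs; autodetected when omitted
    quantization: false    # true converts to i8 instead of fp16
    output_name: "{input_basename}-{soc}-{quant}-{tk_version}"
    config:                # keyword arguments forwarded to RKNN.config()
      mean_values: [[0, 0, 0]]
      std_values: [[255, 255, 255]]
"""
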
import os

import rknn
import yaml
from rknn.api import RKNN

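# Record the installed rknn-toolkit version; it can be interpolated into
# output filenames via {tk_version}.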
try:
    with open(rknn.__path__[0] + "/VERSION") as file:
        tk_version = file.read().strip()
except FileNotFoundError:
    # Without the VERSION file, formatting the output name template below
    # would raise a NameError, so fall back to a placeholder.
    tk_version = "unknown"

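# Load the user's conversion settings.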
try:
    with open("/config/conv2rknn.yaml", "r") as config_file:
        configuration = yaml.safe_load(config_file)
except FileNotFoundError:
    raise Exception("Please place a config file at /config/conv2rknn.yaml")

# Optional `config` section: keyword arguments forwarded to RKNN.config().
if configuration.get("config") is not None:
    rknn_config = configuration["config"]
else:
    rknn_config = {}

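# Input models are expected under the rknn_cache/onnx directory.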
if not os.path.isdir("/config/model_cache/rknn_cache/onnx"):
    raise Exception(
        "Place the onnx models you want to convert to rknn format in /config/model_cache/rknn_cache/onnx"
    )

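# Autodetect the SoC from the device tree when none is configured; the file
# is only visible when the container runs in privileged mode.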
if "soc" not in configuration:
try:
with open("/proc/device-tree/compatible") as file:
soc = file.read().split(",")[-1].strip("\x00")
except FileNotFoundError:
raise Exception("Make sure to run docker in privileged mode.")
configuration["soc"] = [
soc,
]
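# Defaults: fp16 (no quantization) and output named after the input model.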
if "quantization" not in configuration:
configuration["quantization"] = False
if "output_name" not in configuration:
configuration["output_name"] = "{{input_basename}}"
for input_filename in os.listdir("/config/model_cache/rknn_cache/onnx"):
    for soc in configuration["soc"]:
        quant = "i8" if configuration["quantization"] else "fp16"

        input_path = "/config/model_cache/rknn_cache/onnx/" + input_filename
        input_basename = input_filename[: input_filename.rfind(".")]

        # Build the output filename from the configured template.
        output_filename = (
            configuration["output_name"].format(
                quant=quant,
                input_basename=input_basename,
                soc=soc,
                tk_version=tk_version,
            )
            + ".rknn"
        )
        output_path = "/config/model_cache/rknn_cache/" + output_filename

        rknn_config["target_platform"] = soc

        # Note: this rebinds the name `rknn`, shadowing the imported module,
        # which is no longer needed past this point.
        rknn = RKNN(verbose=True)
        rknn.config(**rknn_config)

        if rknn.load_onnx(model=input_path) != 0:
            raise Exception("Error loading model.")

        if (
            rknn.build(
                do_quantization=configuration["quantization"],
                dataset="/COCO/coco_subset_20.txt",
            )
            != 0
        ):
            raise Exception("Error building model.")

        if rknn.export_rknn(output_path) != 0:
            raise Exception("Error exporting rknn model.")