Include libraries and .rknn models for other Rockchip SoCs (#8649)

* support for other yolov models and config checks

* apply code formatting

* Information about core mask and inference speed

* update rknn postprocess and remove params

* update model selection

* Apply suggestions from code review

Co-authored-by: Nicolas Mowen <nickmowen213@gmail.com>

* support rknn on all socs

* apply changes from review and fix post process bug

* apply code formatting

* update tip in object_detectors docs

---------

Co-authored-by: Nicolas Mowen <nickmowen213@gmail.com>
Marc Altmann 2023-11-18 14:53:49 +01:00 committed by GitHub
parent 2da99c2308
commit c6208b266b
5 changed files with 72 additions and 29 deletions

@@ -9,11 +9,6 @@ COPY docker/rockchip/requirements-wheels-rk.txt /requirements-wheels-rk.txt
RUN sed -i "/https:\/\//d" /requirements-wheels.txt
RUN pip3 wheel --wheel-dir=/rk-wheels -c /requirements-wheels.txt -r /requirements-wheels-rk.txt
FROM wget as rk-downloads
RUN wget -qO librknnrt.so https://github.com/MarcA711/rknpu2/raw/master/runtime/RK3588/Linux/librknn_api/aarch64/librknnrt.so
RUN wget -qO ffmpeg https://github.com/MarcA711/Rockchip-FFmpeg-Builds/releases/download/latest/ffmpeg
RUN wget -qO ffprobe https://github.com/MarcA711/Rockchip-FFmpeg-Builds/releases/download/latest/ffprobe
RUN wget -qO yolov8n-320x320.rknn https://github.com/MarcA711/rknn-models/releases/download/latest/yolov8n-320x320.rknn
FROM deps AS rk-deps
ARG TARGETARCH
@@ -22,12 +17,16 @@ RUN --mount=type=bind,from=rk-wheels,source=/rk-wheels,target=/deps/rk-wheels \
WORKDIR /opt/frigate/
COPY --from=rootfs / /
COPY --from=rk-downloads /rootfs/librknnrt.so /usr/lib/
COPY --from=rk-downloads /rootfs/yolov8n-320x320.rknn /models/
ADD https://github.com/MarcA711/rknpu2/releases/download/v1.5.2/librknnrt_rk356x.so /usr/lib/
ADD https://github.com/MarcA711/rknpu2/releases/download/v1.5.2/librknnrt_rk3588.so /usr/lib/
ADD https://github.com/MarcA711/rknn-models/releases/download/v1.5.2-rk3562/yolov8n-320x320-rk3562.rknn /models/rknn/
ADD https://github.com/MarcA711/rknn-models/releases/download/v1.5.2-rk3566/yolov8n-320x320-rk3566.rknn /models/rknn/
ADD https://github.com/MarcA711/rknn-models/releases/download/v1.5.2-rk3568/yolov8n-320x320-rk3568.rknn /models/rknn/
ADD https://github.com/MarcA711/rknn-models/releases/download/v1.5.2-rk3588/yolov8n-320x320-rk3588.rknn /models/rknn/
RUN rm -rf /usr/lib/btbn-ffmpeg/bin/ffmpeg
RUN rm -rf /usr/lib/btbn-ffmpeg/bin/ffprobe
COPY --from=rk-downloads /rootfs/ffmpeg /usr/lib/btbn-ffmpeg/bin/
COPY --from=rk-downloads /rootfs/ffprobe /usr/lib/btbn-ffmpeg/bin/
RUN chmod +x /usr/lib/btbn-ffmpeg/bin/ffmpeg
RUN chmod +x /usr/lib/btbn-ffmpeg/bin/ffprobe
ADD --chmod=111 https://github.com/MarcA711/Rockchip-FFmpeg-Builds/releases/download/latest/ffmpeg /usr/lib/btbn-ffmpeg/bin/
ADD --chmod=111 https://github.com/MarcA711/Rockchip-FFmpeg-Builds/releases/download/latest/ffprobe /usr/lib/btbn-ffmpeg/bin/

@@ -1,2 +1,2 @@
hide-warnings == 0.17
rknn-toolkit-lite2 @ https://github.com/MarcA711/rknn-toolkit2/raw/master/rknn_toolkit_lite2/packages/rknn_toolkit_lite2-1.5.2-cp39-cp39-linux_aarch64.whl
rknn-toolkit-lite2 @ https://github.com/MarcA711/rknn-toolkit2/releases/download/v1.5.2/rknn_toolkit_lite2-1.5.2-cp39-cp39-linux_aarch64.whl

@@ -295,16 +295,16 @@ To verify that the integration is working correctly, start Frigate and observe t
## Rockchip RKNN-Toolkit-Lite2
This detector is only available if one of the following Rockchip SoCs is used:
- RK3566/RK3568
- RK3588/RK3588S
- RV1103/RV1106
- RK3568
- RK3566
- RK3562
These SoCs come with an NPU that significantly speeds up detection.
### Setup
RKNN support is provided using the `-rk` suffix for the docker image. Moreover, privileged mode must be enabled by adding the `--privileged` flag to your docker run command or `privileged: true` to your `docker-compose.yml` file.
Use a Frigate docker image with the `-rk` suffix and enable privileged mode by adding the `--privileged` flag to your docker run command or `privileged: true` to your `docker-compose.yml` file.
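For orientation, a minimal `docker-compose.yml` sketch might look like the following; the image tag and volume mapping are illustrative assumptions, so substitute the `-rk` variant of the Frigate release you actually run:
```yaml
services:
  frigate:
    # assumption: use the -rk build of whichever Frigate release you run
    image: ghcr.io/blakeblackshear/frigate:stable-rk
    # required so the detector can read /proc/device-tree/compatible
    # to identify the SoC
    privileged: true
    restart: unless-stopped
    volumes:
      - ./config:/config   # illustrative path
```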
### Configuration
@@ -376,3 +376,16 @@ $ cat /sys/kernel/debug/rknpu/load
model:
path: /config/model_cache/rknn/my-rknn-model.rknn
```
:::tip
When you have a multicore NPU, you can enable all of its cores to reduce inference time. Consider activating all cores if you use a larger model such as yolov8l. If your NPU has 3 cores (as on RK3588/S SoCs), you can enable all 3 cores using:
```yaml
detectors:
rknn:
type: rknn
core_mask: 0b111
```
:::

@@ -103,7 +103,7 @@ Frigate supports SBCs with the following Rockchip SoCs:
- RV1103/RV1106
- RK3562
Using the yolov8n model and an Orange Pi 5 Plus with RK3588 SoC inference speeds vary between 25-40 ms.
Using the yolov8n model and an Orange Pi 5 Plus with RK3588 SoC, inference speeds vary between 20-25 ms.
## What does Frigate use the CPU for and what does it use a detector for? (ELI5 Version)

@@ -22,7 +22,9 @@ logger = logging.getLogger(__name__)
DETECTOR_KEY = "rknn"
yolov8_rknn_models = {
supported_socs = ["rk3562", "rk3566", "rk3568", "rk3588"]
yolov8_suffix = {
"default-yolov8n": "n",
"default-yolov8s": "s",
"default-yolov8m": "m",
@@ -40,28 +42,57 @@ class Rknn(DetectionApi):
type_key = DETECTOR_KEY
def __init__(self, config: RknnDetectorConfig):
# find out SoC
try:
with open("/proc/device-tree/compatible") as file:
soc = file.read().split(",")[-1].strip("\x00")
except FileNotFoundError:
logger.error("Make sure to run docker in privileged mode.")
raise Exception("Make sure to run docker in privileged mode.")
if soc not in supported_socs:
logger.error(
"Your SoC is not supported. Your SoC is: {}. Currently these SoCs are supported: {}.".format(
soc, supported_socs
)
)
raise Exception(
"Your SoC is not supported. Your SoC is: {}. Currently these SoCs are supported: {}.".format(
soc, supported_socs
)
)
if "rk356" in soc:
os.rename("/usr/lib/librknnrt_rk356x.so", "/usr/lib/librknnrt.so")
elif "rk3588" in soc:
os.rename("/usr/lib/librknnrt_rk3588.so", "/usr/lib/librknnrt.so")
self.model_path = config.model.path or "default-yolov8n"
self.core_mask = config.core_mask
self.height = config.model.height
self.width = config.model.width
if self.model_path in yolov8_rknn_models:
if self.model_path in yolov8_suffix:
if self.model_path == "default-yolov8n":
self.model_path = "/models/yolov8n-320x320.rknn"
self.model_path = "/models/rknn/yolov8n-320x320-{soc}.rknn".format(
soc=soc
)
else:
model_suffix = yolov8_rknn_models[self.model_path]
model_suffix = yolov8_suffix[self.model_path]
self.model_path = (
"/config/model_cache/rknn/yolov8{}-320x320.rknn".format(
model_suffix
"/config/model_cache/rknn/yolov8{suffix}-320x320-{soc}.rknn".format(
suffix=model_suffix, soc=soc
)
)
os.makedirs("/config/model_cache/rknn", exist_ok=True)
if not os.path.isfile(self.model_path):
logger.info("Downloading yolov8{} model.".format(model_suffix))
logger.info(
"Downloading yolov8{suffix} model.".format(suffix=model_suffix)
)
urllib.request.urlretrieve(
"https://github.com/MarcA711/rknn-models/releases/download/latest/yolov8{}-320x320.rknn".format(
model_suffix
"https://github.com/MarcA711/rknn-models/releases/download/v1.5.2-{soc}/yolov8{suffix}-320x320-{soc}.rknn".format(
soc=soc, suffix=model_suffix
),
self.model_path,
)
@@ -140,10 +171,10 @@ class Rknn(DetectionApi):
boxes = np.transpose(
np.vstack(
(
results[:, 1] - 0.5 * results[:, 3],
results[:, 0] - 0.5 * results[:, 2],
results[:, 3] + 0.5 * results[:, 3],
results[:, 2] + 0.5 * results[:, 2],
(results[:, 1] - 0.5 * results[:, 3]) / self.height,
(results[:, 0] - 0.5 * results[:, 2]) / self.width,
(results[:, 1] + 0.5 * results[:, 3]) / self.height,
(results[:, 0] + 0.5 * results[:, 2]) / self.width,
)
)
)
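As a reference for the corrected math, here is a standalone sketch of the same center/size-to-normalized-corner conversion; the column layout `[x_center, y_center, w, h]` is inferred from the indexing above, and the helper name is hypothetical:
```python
import numpy as np

def xywh_to_normalized_corners(results: np.ndarray, width: int, height: int) -> np.ndarray:
    """Convert [x_center, y_center, w, h] rows in pixels to normalized
    [y_min, x_min, y_max, x_max] rows, mirroring the fixed post-process."""
    x_c, y_c, w, h = results[:, 0], results[:, 1], results[:, 2], results[:, 3]
    return np.transpose(
        np.vstack(
            (
                (y_c - 0.5 * h) / height,
                (x_c - 0.5 * w) / width,
                (y_c + 0.5 * h) / height,
                (x_c + 0.5 * w) / width,
            )
        )
    )

# e.g. a single 320x320 detection centered at (160, 160) with size 64x64
# maps to [0.4, 0.4, 0.6, 0.6]
```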