Update go2rtc to 1.5.0 (#5814)

* Update go2rtc to 1.3.0

* Increment to 1.3.1

* Increment to 1.3.2

* Update webrtc player to match latest

* Update version to 1.4.0

* Update mse player

* Update birdseye mse player

* remove logs

* Update docs to link to new version

* Final web lint fixes

* Update versions
Nicolas Mowen 2023-05-21 06:53:25 -06:00 committed by GitHub
parent deccc4fd46
commit 53d63e0f75
10 changed files with 735 additions and 143 deletions


@@ -27,7 +27,7 @@ RUN --mount=type=tmpfs,target=/tmp --mount=type=tmpfs,target=/var/cache/apt \
FROM wget AS go2rtc
ARG TARGETARCH
WORKDIR /rootfs/usr/local/go2rtc/bin
-RUN wget -qO go2rtc "https://github.com/AlexxIT/go2rtc/releases/download/v1.2.0/go2rtc_linux_${TARGETARCH}" \
+RUN wget -qO go2rtc "https://github.com/AlexxIT/go2rtc/releases/download/v1.5.0/go2rtc_linux_${TARGETARCH}" \
&& chmod +x go2rtc


@@ -377,7 +377,7 @@ rtmp:
enabled: False
# Optional: Restream configuration
-# Uses https://github.com/AlexxIT/go2rtc (v1.2.0)
+# Uses https://github.com/AlexxIT/go2rtc (v1.5.0)
go2rtc:
# Optional: jsmpeg stream configuration for WebUI


@@ -115,4 +115,4 @@ services:
:::
-See [go2rtc WebRTC docs](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#module-webrtc) for more information about this.
+See [go2rtc WebRTC docs](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#module-webrtc) for more information about this.


@@ -7,7 +7,7 @@ title: Restream
Frigate can restream your video feed as an RTSP feed for other applications, such as Home Assistant, to utilize at `rtsp://<frigate_host>:8554/<camera_name>`. Port 8554 must be open. [This allows you to use a video feed for detection in Frigate and Home Assistant live view at the same time without having to make two separate connections to the camera](#reduce-connections-to-camera). The video is copied directly from the original stream to avoid re-encoding. This feed does not include any annotations added by Frigate.
-Frigate uses [go2rtc](https://github.com/AlexxIT/go2rtc/tree/v1.2.0) to provide its restream and MSE/WebRTC capabilities. The go2rtc config is hosted under the `go2rtc` key in the Frigate config; see the [go2rtc docs](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#configuration) for more advanced configurations and features.
+Frigate uses [go2rtc](https://github.com/AlexxIT/go2rtc/tree/v1.5.0) to provide its restream and MSE/WebRTC capabilities. The go2rtc config is hosted under the `go2rtc` key in the Frigate config; see the [go2rtc docs](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#configuration) for more advanced configurations and features.
:::note
@@ -86,7 +86,7 @@ Two connections are made to the camera. One for the sub stream, one for the restream
```yaml
go2rtc:
streams:
rtsp_cam:
- rtsp://192.168.1.5:554/live0 # <- stream which supports video & aac audio. This is only supported for rtsp streams, http must use ffmpeg
- "ffmpeg:rtsp_cam#audio=opus" # <- copy of the stream which transcodes audio to opus
rtsp_cam_sub:
@@ -130,7 +130,7 @@ cameras:
## Advanced Restream Configurations
-The [exec](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#source-exec) source in go2rtc can be used for custom ffmpeg commands. An example is below:
+The [exec](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#source-exec) source in go2rtc can be used for custom ffmpeg commands. An example is below:
NOTE: The output needs to be passed as `{{output}}` (with two curly braces), as in the sketch below.
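A minimal sketch of such an exec source, for illustration only; the stream name, capture device, and ffmpeg flags here are assumptions, not part of this commit:

```yaml
go2rtc:
  streams:
    usb_cam:
      # go2rtc substitutes {{output}} with the destination it expects the
      # command to publish to (hypothetical device and encoder flags shown)
      - "exec:ffmpeg -hide_banner -f v4l2 -i /dev/video0 -c:v libx264 -preset veryfast -rtsp_transport tcp -f rtsp {{output}}"
```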


@@ -10,7 +10,7 @@ Use of the bundled go2rtc is optional. You can still configure FFmpeg to connect
# Setup a go2rtc stream
-First, you will want to configure go2rtc to connect to your camera stream by adding the stream you want to use for live view in your Frigate config file. If you set the stream name under go2rtc to match the name of your camera, it will automatically be mapped and you will get additional live view options for the camera. Avoid changing any other parts of your config at this step. Note that go2rtc supports [many different stream types](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#module-streams), not just rtsp.
+First, you will want to configure go2rtc to connect to your camera stream by adding the stream you want to use for live view in your Frigate config file. If you set the stream name under go2rtc to match the name of your camera, it will automatically be mapped and you will get additional live view options for the camera. Avoid changing any other parts of your config at this step. Note that go2rtc supports [many different stream types](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#module-streams), not just rtsp.
```yaml
go2rtc:
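  # a minimal sketch of how this example continues, assuming a camera named
  # "back" with a hypothetical RTSP URL:
  streams:
    back:
      - rtsp://user:password@192.168.1.10:554/stream1
```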
@@ -23,7 +23,7 @@ The easiest live view to get working is MSE. After adding this to the config, restart
### What if my video doesn't play?
-If you are unable to see your video feed, first check the go2rtc logs in the Frigate UI under Logs in the sidebar. If go2rtc is having difficulty connecting to your camera, you should see some error messages in the log. If you do not see any errors, then the video codec of the stream may not be supported in your browser. If your camera stream is set to H265, try switching to H264. You can see more information about [video codec compatibility](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#codecs-madness) in the go2rtc documentation. If you are not able to switch your camera settings from H265 to H264 or your stream is a different format such as MJPEG, you can use go2rtc to re-encode the video using the [FFmpeg parameters](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#source-ffmpeg). It supports rotating and resizing video feeds and hardware acceleration. Keep in mind that transcoding video from one format to another is a resource-intensive task and you may be better off using the built-in jsmpeg view. Here is an example of a config that will re-encode the stream to H264 without hardware acceleration:
+If you are unable to see your video feed, first check the go2rtc logs in the Frigate UI under Logs in the sidebar. If go2rtc is having difficulty connecting to your camera, you should see some error messages in the log. If you do not see any errors, then the video codec of the stream may not be supported in your browser. If your camera stream is set to H265, try switching to H264. You can see more information about [video codec compatibility](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#codecs-madness) in the go2rtc documentation. If you are not able to switch your camera settings from H265 to H264 or your stream is a different format such as MJPEG, you can use go2rtc to re-encode the video using the [FFmpeg parameters](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#source-ffmpeg). It supports rotating and resizing video feeds and hardware acceleration. Keep in mind that transcoding video from one format to another is a resource-intensive task and you may be better off using the built-in jsmpeg view. Here is an example of a config that will re-encode the stream to H264 without hardware acceleration:
```yaml
go2rtc:
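  # a sketch completing the re-encode example described above; the URL is a
  # hypothetical placeholder, and "#video=h264" asks go2rtc's ffmpeg source
  # to transcode the video track to H264 in software
  streams:
    rtsp_cam:
      - rtsp://192.168.1.10:554/stream1
      - "ffmpeg:rtsp_cam#video=h264"
```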


@@ -0,0 +1,640 @@
class VideoRTC extends HTMLElement {
constructor() {
super();
this.DISCONNECT_TIMEOUT = 5000;
this.RECONNECT_TIMEOUT = 30000;
this.CODECS = [
'avc1.640029', // H.264 high 4.1 (Chromecast 1st and 2nd Gen)
'avc1.64002A', // H.264 high 4.2 (Chromecast 3rd Gen)
'avc1.640033', // H.264 high 5.1 (Chromecast with Google TV)
'hvc1.1.6.L153.B0', // H.265 main 5.1 (Chromecast Ultra)
'mp4a.40.2', // AAC LC
'mp4a.40.5', // AAC HE
'flac', // FLAC (PCM compatible)
'opus', // OPUS Chrome, Firefox
];
/**
* [config] Supported modes (webrtc, mse, mp4, mjpeg).
* @type {string}
*/
this.mode = 'webrtc,mse,mp4,mjpeg';
/**
* [config] Run stream when not displayed on the screen. Default `false`.
* @type {boolean}
*/
this.background = false;
/**
* [config] Run stream only when player in the viewport. Stop when user scroll out player.
* Value is percentage of visibility from `0` (not visible) to `1` (full visible).
* Default `0` - disable;
* @type {number}
*/
this.visibilityThreshold = 0;
/**
* [config] Run stream only when browser page on the screen. Stop when user change browser
* tab or minimise browser windows.
* @type {boolean}
*/
this.visibilityCheck = true;
/**
* [config] WebRTC configuration
* @type {RTCConfiguration}
*/
this.pcConfig = {
iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
sdpSemantics: 'unified-plan', // important for Chromecast 1
};
/**
* [info] WebSocket connection state. Values: CONNECTING, OPEN, CLOSED
* @type {number}
*/
this.wsState = WebSocket.CLOSED;
/**
* [info] WebRTC connection state.
* @type {number}
*/
this.pcState = WebSocket.CLOSED;
/**
* @type {HTMLVideoElement}
*/
this.video = null;
/**
* @type {WebSocket}
*/
this.ws = null;
/**
* @type {string|URL}
*/
this.wsURL = '';
/**
* @type {RTCPeerConnection}
*/
this.pc = null;
/**
* @type {number}
*/
this.connectTS = 0;
/**
* @type {string}
*/
this.mseCodecs = '';
/**
* [internal] Disconnect TimeoutID.
* @type {number}
*/
this.disconnectTID = 0;
/**
* [internal] Reconnect TimeoutID.
* @type {number}
*/
this.reconnectTID = 0;
/**
* [internal] Handler for receiving Binary from WebSocket.
* @type {Function}
*/
this.ondata = null;
/**
* [internal] Handlers list for receiving JSON from WebSocket
* @type {Object.<string,Function>}
*/
this.onmessage = null;
}
/**
* Set video source (WebSocket URL). Support relative path.
* @param {string|URL} value
*/
set src(value) {
if (typeof value !== 'string') value = value.toString();
if (value.startsWith('http')) {
value = `ws${value.substring(4)}`;
} else if (value.startsWith('/')) {
value = `ws${location.origin.substring(4)}${value}`;
}
this.wsURL = value;
this.onconnect();
}
/**
* Play video. Support automute when autoplay blocked.
* https://developer.chrome.com/blog/autoplay/
*/
play() {
this.video.play().catch((er) => {
if (er.name === 'NotAllowedError' && !this.video.muted) {
this.video.muted = true;
this.video.play().catch(() => { });
}
});
}
/**
* Send message to server via WebSocket
* @param {Object} value
*/
send(value) {
if (this.ws) this.ws.send(JSON.stringify(value));
}
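/**
 * Filter `CODECS` down to the ones this browser can play for the given mode:
 * 'mse' checks MediaSource.isTypeSupported, anything else falls back to
 * HTMLVideoElement.canPlayType. Returns a comma-separated codec list.
 * @param {string} type
 */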
codecs(type) {
const test =
type === 'mse'
? (codec) => MediaSource.isTypeSupported(`video/mp4; codecs="${codec}"`)
: (codec) => this.video.canPlayType(`video/mp4; codecs="${codec}"`);
return this.CODECS.filter(test).join();
}
/**
* `CustomElement`. Invoked each time the custom element is appended into a
* document-connected element.
*/
connectedCallback() {
if (this.disconnectTID) {
clearTimeout(this.disconnectTID);
this.disconnectTID = 0;
}
// the video element auto-pauses when it is detached from the DOM
if (this.video) {
const seek = this.video.seekable;
if (seek.length > 0) {
this.video.currentTime = seek.end(seek.length - 1);
}
this.play();
} else {
this.oninit();
}
this.onconnect();
}
/**
* `CustomElement`. Invoked each time the custom element is disconnected from the
* document's DOM.
*/
disconnectedCallback() {
if (this.background || this.disconnectTID) return;
if (this.wsState === WebSocket.CLOSED && this.pcState === WebSocket.CLOSED) return;
this.disconnectTID = setTimeout(() => {
if (this.reconnectTID) {
clearTimeout(this.reconnectTID);
this.reconnectTID = 0;
}
this.disconnectTID = 0;
this.ondisconnect();
}, this.DISCONNECT_TIMEOUT);
}
/**
* Creates child DOM elements. Called automatically once on `connectedCallback`.
*/
oninit() {
this.video = document.createElement('video');
this.video.controls = true;
this.video.playsInline = true;
this.video.preload = 'auto';
this.video.style.display = 'block'; // fix bottom margin 4px
this.video.style.width = '100%';
this.video.style.height = '100%';
this.appendChild(this.video);
if (this.background) return;
if ('hidden' in document && this.visibilityCheck) {
document.addEventListener('visibilitychange', () => {
if (document.hidden) {
this.disconnectedCallback();
} else if (this.isConnected) {
this.connectedCallback();
}
});
}
if ('IntersectionObserver' in window && this.visibilityThreshold) {
const observer = new IntersectionObserver(
(entries) => {
entries.forEach((entry) => {
if (!entry.isIntersecting) {
this.disconnectedCallback();
} else if (this.isConnected) {
this.connectedCallback();
}
});
},
{ threshold: this.visibilityThreshold }
);
observer.observe(this);
}
}
/**
* Connect to WebSocket. Called automatically on `connectedCallback`.
* @return {boolean} true if the connection has started.
*/
onconnect() {
if (!this.isConnected || !this.wsURL || this.ws || this.pc) return false;
// CLOSED or CONNECTING => CONNECTING
this.wsState = WebSocket.CONNECTING;
this.connectTS = Date.now();
this.ws = new WebSocket(this.wsURL);
this.ws.binaryType = 'arraybuffer';
this.ws.addEventListener('open', (ev) => this.onopen(ev));
this.ws.addEventListener('close', (ev) => this.onclose(ev));
return true;
}
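/**
 * Close the WebSocket and the RTCPeerConnection (if open) and mark both
 * connection states as CLOSED.
 */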
ondisconnect() {
this.wsState = WebSocket.CLOSED;
if (this.ws) {
this.ws.close();
this.ws = null;
}
this.pcState = WebSocket.CLOSED;
if (this.pc) {
this.pc.close();
this.pc = null;
}
}
/**
* @returns {Array.<string>} of modes (mse, webrtc, etc.)
*/
onopen() {
// CONNECTING => OPEN
this.wsState = WebSocket.OPEN;
this.ws.addEventListener('message', (ev) => {
if (typeof ev.data === 'string') {
const msg = JSON.parse(ev.data);
for (const mode in this.onmessage) {
this.onmessage[mode](msg);
}
} else {
this.ondata(ev.data);
}
});
this.ondata = null;
this.onmessage = {};
const modes = [];
if (this.mode.indexOf('mse') >= 0 && 'MediaSource' in window) {
// iPhone
modes.push('mse');
this.onmse();
} else if (this.mode.indexOf('mp4') >= 0) {
modes.push('mp4');
this.onmp4();
}
if (this.mode.indexOf('webrtc') >= 0 && 'RTCPeerConnection' in window) {
// macOS Desktop app
modes.push('webrtc');
this.onwebrtc();
}
if (this.mode.indexOf('mjpeg') >= 0) {
if (modes.length) {
this.onmessage['mjpeg'] = (msg) => {
if (msg.type !== 'error' || msg.value.indexOf(modes[0]) !== 0) return;
this.onmjpeg();
};
} else {
modes.push('mjpeg');
this.onmjpeg();
}
}
return modes;
}
/**
* @return {boolean} true if reconnection has started.
*/
onclose() {
if (this.wsState === WebSocket.CLOSED) return false;
// CONNECTING, OPEN => CONNECTING
this.wsState = WebSocket.CONNECTING;
this.ws = null;
// reconnect no more than once every X seconds
const delay = Math.max(this.RECONNECT_TIMEOUT - (Date.now() - this.connectTS), 0);
this.reconnectTID = setTimeout(() => {
this.reconnectTID = 0;
this.onconnect();
}, delay);
return true;
}
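/**
 * Start MSE playback: attach a MediaSource to the video element, ask the
 * server for a stream matching the supported codecs, then append incoming
 * fMP4 data to a SourceBuffer, queueing bytes locally while the buffer is
 * busy and trimming everything older than ~15 seconds behind the live edge.
 */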
onmse() {
const ms = new MediaSource();
ms.addEventListener(
'sourceopen',
() => {
URL.revokeObjectURL(this.video.src);
this.send({ type: 'mse', value: this.codecs('mse') });
},
{ once: true }
);
this.video.src = URL.createObjectURL(ms);
this.video.srcObject = null;
this.play();
this.mseCodecs = '';
this.onmessage['mse'] = (msg) => {
if (msg.type !== 'mse') return;
this.mseCodecs = msg.value;
const sb = ms.addSourceBuffer(msg.value);
sb.mode = 'segments'; // segments or sequence
sb.addEventListener('updateend', () => {
if (sb.updating) return;
try {
if (bufLen > 0) {
const data = buf.slice(0, bufLen);
bufLen = 0;
sb.appendBuffer(data);
} else if (sb.buffered && sb.buffered.length) {
const end = sb.buffered.end(sb.buffered.length - 1) - 15;
const start = sb.buffered.start(0);
if (end > start) {
sb.remove(start, end);
ms.setLiveSeekableRange(end, end + 15);
}
// console.debug("VideoRTC.buffered", start, end);
}
} catch (e) {
// console.debug(e);
}
});
const buf = new Uint8Array(2 * 1024 * 1024);
let bufLen = 0;
this.ondata = (data) => {
if (sb.updating || bufLen > 0) {
const b = new Uint8Array(data);
buf.set(b, bufLen);
bufLen += b.byteLength;
// console.debug("VideoRTC.buffer", b.byteLength, bufLen);
} else {
try {
sb.appendBuffer(data);
} catch (e) {
// console.debug(e);
}
}
};
};
}
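/**
 * Start WebRTC playback: negotiate offer/answer and ICE candidates over the
 * already-open WebSocket, then hand the received MediaStream to `onpcvideo`
 * once the first frames are decodable.
 */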
onwebrtc() {
const pc = new RTCPeerConnection(this.pcConfig);
/** @type {HTMLVideoElement} */
const video2 = document.createElement('video');
video2.addEventListener('loadeddata', (ev) => this.onpcvideo(ev), { once: true });
pc.addEventListener('icecandidate', (ev) => {
const candidate = ev.candidate ? ev.candidate.toJSON().candidate : '';
this.send({ type: 'webrtc/candidate', value: candidate });
});
pc.addEventListener('track', (ev) => {
// when stream already init
if (video2.srcObject !== null) return;
// when audio track not exist in Chrome
if (ev.streams.length === 0) return;
// when audio track not exist in Firefox
if (ev.streams[0].id[0] === '{') return;
video2.srcObject = ev.streams[0];
});
pc.addEventListener('connectionstatechange', () => {
if (pc.connectionState === 'failed' || pc.connectionState === 'disconnected') {
pc.close(); // stop next events
this.pcState = WebSocket.CLOSED;
this.pc = null;
this.onconnect();
}
});
this.onmessage['webrtc'] = (msg) => {
switch (msg.type) {
case 'webrtc/candidate':
pc.addIceCandidate({
candidate: msg.value,
sdpMid: '0',
}).catch(() => { });
break;
case 'webrtc/answer':
pc.setRemoteDescription({
type: 'answer',
sdp: msg.value,
}).catch(() => { });
break;
case 'error':
if (msg.value.indexOf('webrtc/offer') < 0) return;
pc.close();
}
};
// Safari doesn't support "offerToReceiveVideo"
pc.addTransceiver('video', { direction: 'recvonly' });
pc.addTransceiver('audio', { direction: 'recvonly' });
pc.createOffer().then((offer) => {
pc.setLocalDescription(offer).then(() => {
this.send({ type: 'webrtc/offer', value: offer.sdp });
});
});
this.pcState = WebSocket.CONNECTING;
this.pc = pc;
}
/**
* @param ev {Event}
*/
onpcvideo(ev) {
if (!this.pc) return;
/** @type {HTMLVideoElement} */
const video2 = ev.target;
const state = this.pc.connectionState;
// Firefox doesn't support pc.connectionState
if (state === 'connected' || state === 'connecting' || !state) {
// Video+Audio > Video, H265 > H264, Video > Audio, WebRTC > MSE
let rtcPriority = 0,
msePriority = 0;
/** @type {MediaStream} */
const ms = video2.srcObject;
if (ms.getVideoTracks().length > 0) rtcPriority += 0x220;
if (ms.getAudioTracks().length > 0) rtcPriority += 0x102;
if (this.mseCodecs.indexOf('hvc1.') >= 0) msePriority += 0x230;
if (this.mseCodecs.indexOf('avc1.') >= 0) msePriority += 0x210;
if (this.mseCodecs.indexOf('mp4a.') >= 0) msePriority += 0x101;
if (rtcPriority >= msePriority) {
this.video.srcObject = ms;
this.play();
this.pcState = WebSocket.OPEN;
this.wsState = WebSocket.CLOSED;
this.ws.close();
this.ws = null;
} else {
this.pcState = WebSocket.CLOSED;
this.pc.close();
this.pc = null;
}
}
video2.srcObject = null;
}
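/**
 * MJPEG fallback: hide the native controls and render each received JPEG
 * frame through the video element's poster attribute.
 */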
onmjpeg() {
this.ondata = (data) => {
this.video.controls = false;
this.video.poster = `data:image/jpeg;base64,${VideoRTC.btoa(data)}`;
};
this.send({ type: 'mjpeg' });
}
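/**
 * MP4 snapshot fallback: decode each short MP4 segment in a detached video
 * element, draw its frame to a canvas, and show the canvas JPEG as poster.
 */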
onmp4() {
/** @type {HTMLCanvasElement} */
const canvas = document.createElement('canvas');
/** @type {CanvasRenderingContext2D} */
let context;
/** @type {HTMLVideoElement} */
const video2 = document.createElement('video');
video2.autoplay = true;
video2.playsInline = true;
video2.muted = true;
video2.addEventListener('loadeddata', (_) => {
if (!context) {
canvas.width = video2.videoWidth;
canvas.height = video2.videoHeight;
context = canvas.getContext('2d');
}
context.drawImage(video2, 0, 0, canvas.width, canvas.height);
this.video.controls = false;
this.video.poster = canvas.toDataURL('image/jpeg');
});
this.ondata = (data) => {
video2.src = `data:video/mp4;base64,${VideoRTC.btoa(data)}`;
};
this.send({ type: 'mp4', value: this.codecs('mp4') });
}
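/**
 * Base64-encode a binary ArrayBuffer (window.btoa alone only accepts strings).
 */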
static btoa(buffer) {
const bytes = new Uint8Array(buffer);
const len = bytes.byteLength;
let binary = '';
for (let i = 0; i < len; i++) {
binary += String.fromCharCode(bytes[i]);
}
return window.btoa(binary);
}
}
class VideoStream extends VideoRTC {
/**
* Custom GUI
*/
oninit() {
super.oninit();
const info = this.querySelector('.info');
this.insertBefore(this.video, info);
}
onconnect() {
const result = super.onconnect();
if (result) this.divMode = 'loading';
return result;
}
ondisconnect() {
super.ondisconnect();
}
onopen() {
const result = super.onopen();
this.onmessage['stream'] = (_) => {
};
return result;
}
onclose() {
return super.onclose();
}
onpcvideo(ev) {
super.onpcvideo(ev);
if (this.pcState !== WebSocket.CLOSED) {
this.divMode = 'RTC';
}
}
}
customElements.define('video-stream', VideoStream);


@@ -1,79 +0,0 @@
import { h } from 'preact';
import { baseUrl } from '../api/baseUrl';
import { useEffect } from 'preact/hooks';
export default function MsePlayer({ camera, width, height }) {
const url = `${baseUrl.replace(/^http/, 'ws')}live/mse/api/ws?src=${camera}`;
useEffect(() => {
const video = document.querySelector('#video');
// support api_path
const ws = new WebSocket(url);
ws.binaryType = 'arraybuffer';
let mediaSource,
sourceBuffer,
queueBuffer = [];
ws.onopen = () => {
mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);
mediaSource.onsourceopen = () => {
mediaSource.onsourceopen = null;
URL.revokeObjectURL(video.src);
ws.send(JSON.stringify({ type: 'mse' }));
};
};
ws.onmessage = (ev) => {
if (typeof ev.data === 'string') {
const data = JSON.parse(ev.data);
if (data.type === 'mse') {
sourceBuffer = mediaSource.addSourceBuffer(data.value);
sourceBuffer.mode = 'segments'; // segments or sequence
sourceBuffer.onupdateend = () => {
if (!sourceBuffer.updating && queueBuffer.length > 0) {
try {
sourceBuffer.appendBuffer(queueBuffer.shift());
} catch (e) {
// console.warn(e);
}
}
};
}
} else if (sourceBuffer.updating || queueBuffer.length > 0) {
queueBuffer.push(ev.data);
} else {
try {
sourceBuffer.appendBuffer(ev.data);
} catch (e) {
// console.warn(e);
}
}
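// stay near the live edge: speed playback up in proportion to how far
// currentTime has fallen behind the end of the seekable range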
if (video.seekable.length > 0) {
const delay = video.seekable.end(video.seekable.length - 1) - video.currentTime;
if (delay < 1) {
video.playbackRate = 1;
} else if (delay > 10) {
video.playbackRate = 10;
} else if (delay > 2) {
video.playbackRate = Math.floor(delay);
}
}
};
return () => {
const video = document.getElementById('video');
video.srcObject = null;
ws.close();
};
}, [url]);
return (
<div>
<video id="video" autoplay playsinline controls muted width={width} height={height} />
</div>
);
}


@@ -1,68 +1,95 @@
import { h } from 'preact';
import { baseUrl } from '../api/baseUrl';
-import { useEffect } from 'preact/hooks';
+import { useCallback, useEffect } from 'preact/hooks';
export default function WebRtcPlayer({ camera, width, height }) {
const url = `${baseUrl.replace(/^http/, 'ws')}live/webrtc/api/ws?src=${camera}`;
-useEffect(() => {
+const PeerConnection = useCallback(async (media) => {
+const pc = new RTCPeerConnection({
+iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
+});
+const localTracks = [];
+if (/camera|microphone/.test(media)) {
+const tracks = await getMediaTracks('user', {
+video: media.indexOf('camera') >= 0,
+audio: media.indexOf('microphone') >= 0,
+});
+tracks.forEach((track) => {
+pc.addTransceiver(track, { direction: 'sendonly' });
+if (track.kind === 'video') localTracks.push(track);
+});
+}
+if (media.indexOf('display') >= 0) {
+const tracks = await getMediaTracks('display', {
+video: true,
+audio: media.indexOf('speaker') >= 0,
+});
+tracks.forEach((track) => {
+pc.addTransceiver(track, { direction: 'sendonly' });
+if (track.kind === 'video') localTracks.push(track);
+});
+}
+if (/video|audio/.test(media)) {
+const tracks = ['video', 'audio']
+.filter((kind) => media.indexOf(kind) >= 0)
+.map((kind) => pc.addTransceiver(kind, { direction: 'recvonly' }).receiver.track);
+localTracks.push(...tracks);
+}
+document.getElementById('video').srcObject = new MediaStream(localTracks);
+return pc;
+}, []);
+async function getMediaTracks(media, constraints) {
+try {
+const stream =
+media === 'user'
+? await navigator.mediaDevices.getUserMedia(constraints)
+: await navigator.mediaDevices.getDisplayMedia(constraints);
+return stream.getTracks();
+} catch (e) {
+return [];
+}
+}
+const connect = useCallback(async () => {
+const pc = await PeerConnection('video+audio');
const ws = new WebSocket(url);
-ws.onopen = () => {
-pc.createOffer().then((offer) => {
-pc.setLocalDescription(offer).then(() => {
+ws.addEventListener('open', () => {
+pc.addEventListener('icecandidate', (ev) => {
+if (!ev.candidate) return;
+const msg = { type: 'webrtc/candidate', value: ev.candidate.candidate };
+ws.send(JSON.stringify(msg));
+});
+pc.createOffer()
+.then((offer) => pc.setLocalDescription(offer))
+.then(() => {
const msg = { type: 'webrtc/offer', value: pc.localDescription.sdp };
ws.send(JSON.stringify(msg));
});
});
-};
-ws.onmessage = (ev) => {
-const msg = JSON.parse(ev.data);
+});
+ws.addEventListener('message', (ev) => {
+const msg = JSON.parse(ev.data);
if (msg.type === 'webrtc/candidate') {
pc.addIceCandidate({ candidate: msg.value, sdpMid: '0' });
} else if (msg.type === 'webrtc/answer') {
pc.setRemoteDescription({ type: 'answer', sdp: msg.value });
}
-};
-const pc = new RTCPeerConnection({
-iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
-});
-pc.onicecandidate = (ev) => {
-if (ev.candidate !== null) {
-ws.send(
-JSON.stringify({
-type: 'webrtc/candidate',
-value: ev.candidate.toJSON().candidate,
-})
-);
-}
-};
-pc.ontrack = (ev) => {
-const video = document.getElementById('video');
+});
+}, [PeerConnection, url]);
-// when audio track not exist in Chrome
-if (ev.streams.length === 0) return;
-// when audio track not exist in Firefox
-if (ev.streams[0].id[0] === '{') return;
-// when stream already init
-if (video.srcObject !== null) return;
-video.srcObject = ev.streams[0];
-};
-// Safari doesn't support "offerToReceiveVideo",
-// so the transceivers need to be created manually
-pc.addTransceiver('video', { direction: 'recvonly' });
-pc.addTransceiver('audio', { direction: 'recvonly' });
-return () => {
-const video = document.getElementById('video');
-video.srcObject = null;
-pc.close();
-ws.close();
-};
-}, [url]);
+useEffect(() => {
+connect();
+}, [connect]);
return (
<div>


@@ -4,18 +4,16 @@ import ActivityIndicator from '../components/ActivityIndicator';
import JSMpegPlayer from '../components/JSMpegPlayer';
import Heading from '../components/Heading';
import WebRtcPlayer from '../components/WebRtcPlayer';
-import MsePlayer from '../components/MsePlayer';
+import '../components/MsePlayer';
import useSWR from 'swr';
import { useMemo } from 'preact/hooks';
import CameraControlPanel from '../components/CameraControlPanel';
import { baseUrl } from '../api/baseUrl';
export default function Birdseye() {
const { data: config } = useSWR('config');
-const [viewSource, setViewSource, sourceIsLoaded] = usePersistence(
-'birdseye-source',
-getDefaultLiveMode(config)
-);
+const [viewSource, setViewSource, sourceIsLoaded] = usePersistence('birdseye-source', getDefaultLiveMode(config));
const sourceValues = ['mse', 'webrtc', 'jsmpeg'];
const ptzCameras = useMemo(() => {
@@ -38,7 +36,10 @@ export default function Birdseye() {
player = (
<Fragment>
<div className={ptzCameras.length ? 'max-w-5xl xl:w-1/2' : 'max-w-5xl'}>
<MsePlayer camera="birdseye" />
<video-stream
mode="mse"
src={new URL(`${baseUrl.replace(/^http/, 'ws')}live/webrtc/api/ws?src=birdseye`)}
/>
</div>
</Fragment>
);
@@ -110,7 +111,6 @@
);
}
function getDefaultLiveMode(config) {
if (config) {
if (config.birdseye.restream) {


@@ -14,8 +14,9 @@ import { useCallback, useMemo, useState } from 'preact/hooks';
import { useApiHost } from '../api';
import useSWR from 'swr';
import WebRtcPlayer from '../components/WebRtcPlayer';
-import MsePlayer from '../components/MsePlayer';
+import '../components/MsePlayer';
import CameraControlPanel from '../components/CameraControlPanel';
import { baseUrl } from '../api/baseUrl';
const emptyObject = Object.freeze({});
@@ -118,7 +119,10 @@ export default function Camera({ camera }) {
player = (
<Fragment>
<div className="max-w-5xl">
-<MsePlayer camera={cameraConfig.live.stream_name} />
+<video-stream
+mode="mse"
+src={new URL(`${baseUrl.replace(/^http/, 'ws')}live/webrtc/api/ws?src=${camera}`)}
+/>
</div>
</Fragment>
);