forked from Github/frigate

Compare commits: dependabot ... v0.15.0-be (26 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 6b12a45a95 | |
| | 0b9c4c18dd | |
| | d0cc8cb64b | |
| | bb86e71e65 | |
| | 8aa6297308 | |
| | d3b631a952 | |
| | 47d495fc01 | |
| | 32322b23b2 | |
| | c0ba98e26f | |
| | a5a7cd3107 | |
| | a729408599 | |
| | 4dddc53735 | |
| | 5f42caad03 | |
| | 5475672a9d | |
| | 833cdcb6d2 | |
| | c95bc9fe44 | |
| | a1fa9decad | |
| | 4a5fe4138e | |
| | 002fdeae67 | |
| | 5802a66469 | |
| | 71e8f75a01 | |
| | ee816b2251 | |
| | f094c59cd0 | |
| | d25ffdb292 | |
| | 2207a91f7b | |
| | 33957e5360 | |
@@ -10,10 +10,8 @@ elif [[ "${TARGETARCH}" == "arm64" ]]; then
     arch="aarch64"
 fi
 
-mkdir -p /rootfs
-
 wget -qO- "https://github.com/frigate-nvr/hailort/releases/download/v${hailo_version}/hailort-${TARGETARCH}.tar.gz" |
-    tar -C /rootfs/ -xzf -
+    tar -C / -xzf -
 
 mkdir -p /hailo-wheels
 
@@ -19,7 +19,7 @@ pandas == 2.2.*
 peewee == 3.17.*
 peewee_migrate == 1.13.*
 psutil == 6.1.*
-pydantic == 2.10.*
+pydantic == 2.8.*
 git+https://github.com/fbcotter/py3nvml#egg=py3nvml
 pytz == 2024.*
 pyzmq == 26.2.*
@@ -231,28 +231,11 @@ docker run -d \
 
 ### Setup Decoder
 
-The decoder you need to pass in the `hwaccel_args` will depend on the input video.
-
-A list of supported codecs (you can use `ffmpeg -decoders | grep cuvid` in the container to get the ones your card supports)
-
-```
-V..... h263_cuvid Nvidia CUVID H263 decoder (codec h263)
-V..... h264_cuvid Nvidia CUVID H264 decoder (codec h264)
-V..... hevc_cuvid Nvidia CUVID HEVC decoder (codec hevc)
-V..... mjpeg_cuvid Nvidia CUVID MJPEG decoder (codec mjpeg)
-V..... mpeg1_cuvid Nvidia CUVID MPEG1VIDEO decoder (codec mpeg1video)
-V..... mpeg2_cuvid Nvidia CUVID MPEG2VIDEO decoder (codec mpeg2video)
-V..... mpeg4_cuvid Nvidia CUVID MPEG4 decoder (codec mpeg4)
-V..... vc1_cuvid Nvidia CUVID VC1 decoder (codec vc1)
-V..... vp8_cuvid Nvidia CUVID VP8 decoder (codec vp8)
-V..... vp9_cuvid Nvidia CUVID VP9 decoder (codec vp9)
-```
-
-For example, for H264 video, you'll select `preset-nvidia-h264`.
+Using `preset-nvidia` ffmpeg will automatically select the necessary profile for the incoming video, and will log an error if the profile is not supported by your GPU.
 
 ```yaml
 ffmpeg:
-  hwaccel_args: preset-nvidia-h264
+  hwaccel_args: preset-nvidia
 ```
 
 If everything is working correctly, you should see a significant improvement in performance.
@@ -132,6 +132,28 @@ cameras:
       - detect
 ```
 
+## Handling Complex Passwords
+
+go2rtc expects URL-encoded passwords in the config, [urlencoder.org](https://urlencoder.org) can be used for this purpose.
+
+For example:
+
+```yaml
+go2rtc:
+  streams:
+    my_camera: rtsp://username:$@foo%@192.168.1.100
+```
+
+becomes
+
+```yaml
+go2rtc:
+  streams:
+    my_camera: rtsp://username:$%40foo%25@192.168.1.100
+```
+
+See [this comment](https://github.com/AlexxIT/go2rtc/issues/1217#issuecomment-2242296489) for more information.
+
 ## Advanced Restream Configurations
 
 The [exec](https://github.com/AlexxIT/go2rtc/tree/v1.9.2#source-exec) source in go2rtc can be used for custom ffmpeg commands. An example is below:
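The same URL-encoding shown above can be reproduced programmatically; below is a small illustrative sketch using Python's standard library (the password is a placeholder, and `quote` also encodes `$`, which is harmless for the URL):

```python
# Illustrative only: URL-encode a camera password for a go2rtc stream URL.
from urllib.parse import quote

password = "$@foo%"                 # placeholder password, not from the change
encoded = quote(password, safe="")  # -> "%24%40foo%25"
print(f"rtsp://username:{encoded}@192.168.1.100")
```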
@@ -193,6 +193,7 @@ services:
     container_name: frigate
     privileged: true # this may not be necessary for all setups
     restart: unless-stopped
+    stop_grace_period: 30s # allow enough time to shut down the various services
     image: ghcr.io/blakeblackshear/frigate:stable
     shm_size: "512mb" # update for your cameras based on calculation above
     devices:

@@ -224,6 +225,7 @@ If you can't use docker compose, you can run the container with something simila
 docker run -d \
   --name frigate \
   --restart=unless-stopped \
+  --stop-timeout 30 \
   --mount type=tmpfs,target=/tmp/cache,tmpfs-size=1000000000 \
   --device /dev/bus/usb:/dev/bus/usb \
   --device /dev/dri/renderD128 \

@@ -115,6 +115,7 @@ services:
   frigate:
     container_name: frigate
     restart: unless-stopped
+    stop_grace_period: 30s
     image: ghcr.io/blakeblackshear/frigate:stable
     volumes:
       - ./config:/config
docs/static/frigate-api.yaml (vendored, 1203 changed lines): file diff suppressed because it is too large.
@@ -17,8 +17,8 @@ from fastapi.responses import JSONResponse, PlainTextResponse
 from markupsafe import escape
 from peewee import operator
 
-from frigate.api.defs.app_body import AppConfigSetBody
-from frigate.api.defs.app_query_parameters import AppTimelineHourlyQueryParameters
+from frigate.api.defs.query.app_query_parameters import AppTimelineHourlyQueryParameters
+from frigate.api.defs.request.app_body import AppConfigSetBody
 from frigate.api.defs.tags import Tags
 from frigate.config import FrigateConfig
 from frigate.const import CONFIG_DIR
@@ -18,7 +18,7 @@ from joserfc import jwt
 from peewee import DoesNotExist
 from slowapi import Limiter
 
-from frigate.api.defs.app_body import (
+from frigate.api.defs.request.app_body import (
     AppPostLoginBody,
     AppPostUsersBody,
     AppPutPasswordBody,

@@ -85,7 +85,12 @@ def get_remote_addr(request: Request):
             return str(ip)
 
     # if there wasn't anything in the route, just return the default
-    return request.remote_addr or "127.0.0.1"
+    remote_addr = None
+
+    if hasattr(request, "remote_addr"):
+        remote_addr = request.remote_addr
+
+    return remote_addr or "127.0.0.1"
 
 
 def get_jwt_secret() -> str:
@@ -324,7 +329,7 @@ def login(request: Request, body: AppPostLoginBody):
     try:
         db_user: User = User.get_by_id(user)
     except DoesNotExist:
-        return JSONResponse(content={"message": "Login failed"}, status_code=400)
+        return JSONResponse(content={"message": "Login failed"}, status_code=401)
 
     password_hash = db_user.password_hash
     if verify_password(password, password_hash):

@@ -335,7 +340,7 @@ def login(request: Request, body: AppPostLoginBody):
             response, JWT_COOKIE_NAME, encoded_jwt, expiration, JWT_COOKIE_SECURE
         )
         return response
-    return JSONResponse(content={"message": "Login failed"}, status_code=400)
+    return JSONResponse(content={"message": "Login failed"}, status_code=401)
 
 
 @router.get("/users")
@@ -3,7 +3,7 @@ from typing import Union
 from pydantic import BaseModel
 from pydantic.json_schema import SkipJsonSchema
 
-from frigate.review.maintainer import SeverityEnum
+from frigate.review.types import SeverityEnum
 
 
 class ReviewQueryParams(BaseModel):
@@ -1,4 +1,4 @@
-from typing import Optional, Union
+from typing import List, Optional, Union
 
 from pydantic import BaseModel, Field
 

@@ -17,14 +17,18 @@ class EventsDescriptionBody(BaseModel):
 class EventsCreateBody(BaseModel):
     source_type: Optional[str] = "api"
     sub_label: Optional[str] = None
-    score: Optional[int] = 0
+    score: Optional[float] = 0
     duration: Optional[int] = 30
     include_recording: Optional[bool] = True
     draw: Optional[dict] = {}
 
 
 class EventsEndBody(BaseModel):
-    end_time: Optional[int] = None
+    end_time: Optional[float] = None
 
 
+class EventsDeleteBody(BaseModel):
+    event_ids: List[str] = Field(title="The event IDs to delete")
+
+
 class SubmitPlusBody(BaseModel):
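The new `EventsDeleteBody` drives the bulk-delete endpoint added later in this diff. A rough usage sketch (not part of the change; the event ID is made up, and pydantic 2.x semantics are assumed per the requirements pin above):

```python
# Hypothetical sketch of the new request body in isolation.
from typing import List
from pydantic import BaseModel, Field

class EventsDeleteBody(BaseModel):
    event_ids: List[str] = Field(title="The event IDs to delete")

body = EventsDeleteBody(event_ids=["1723456789.123456-abc123"])  # placeholder ID
print(body.model_dump())  # {'event_ids': ['1723456789.123456-abc123']}
```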
frigate/api/defs/response/event_response.py (new file, 42 lines)
@@ -0,0 +1,42 @@
+from typing import Any, Optional
+
+from pydantic import BaseModel, ConfigDict
+
+
+class EventResponse(BaseModel):
+    id: str
+    label: str
+    sub_label: Optional[str]
+    camera: str
+    start_time: float
+    end_time: Optional[float]
+    false_positive: Optional[bool]
+    zones: list[str]
+    thumbnail: str
+    has_clip: bool
+    has_snapshot: bool
+    retain_indefinitely: bool
+    plus_id: Optional[str]
+    model_hash: Optional[str]
+    detector_type: Optional[str]
+    model_type: Optional[str]
+    data: dict[str, Any]
+
+    model_config = ConfigDict(protected_namespaces=())
+
+
+class EventCreateResponse(BaseModel):
+    success: bool
+    message: str
+    event_id: str
+
+
+class EventMultiDeleteResponse(BaseModel):
+    success: bool
+    deleted_events: list[str]
+    not_found_events: list[str]
+
+
+class EventUploadPlusResponse(BaseModel):
+    success: bool
+    plus_id: str
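The `model_config = ConfigDict(protected_namespaces=())` line matters here because pydantic v2 reserves the `model_` field-name prefix, which `model_hash` and `model_type` would otherwise collide with. A minimal stand-alone sketch of the same pattern (illustrative only, not from the diff):

```python
# Without protected_namespaces=(), pydantic v2 warns that fields such as
# "model_hash" conflict with the protected "model_" namespace.
from typing import Optional
from pydantic import BaseModel, ConfigDict

class Demo(BaseModel):
    model_hash: Optional[str] = None
    model_type: Optional[str] = None

    model_config = ConfigDict(protected_namespaces=())
```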
@@ -3,7 +3,7 @@ from typing import Dict
 
 from pydantic import BaseModel, Json
 
-from frigate.review.maintainer import SeverityEnum
+from frigate.review.types import SeverityEnum
 
 
 class ReviewSegmentResponse(BaseModel):
@@ -14,29 +14,36 @@ from fastapi.responses import JSONResponse
 from peewee import JOIN, DoesNotExist, fn, operator
 from playhouse.shortcuts import model_to_dict
 
-from frigate.api.defs.events_body import (
-    EventsCreateBody,
-    EventsDescriptionBody,
-    EventsEndBody,
-    EventsSubLabelBody,
-    SubmitPlusBody,
-)
-from frigate.api.defs.events_query_parameters import (
+from frigate.api.defs.query.events_query_parameters import (
     DEFAULT_TIME_RANGE,
     EventsQueryParams,
     EventsSearchQueryParams,
     EventsSummaryQueryParams,
 )
-from frigate.api.defs.regenerate_query_parameters import (
+from frigate.api.defs.query.regenerate_query_parameters import (
     RegenerateQueryParameters,
 )
-from frigate.api.defs.tags import Tags
-from frigate.const import (
-    CLIPS_DIR,
+from frigate.api.defs.request.events_body import (
+    EventsCreateBody,
+    EventsDeleteBody,
+    EventsDescriptionBody,
+    EventsEndBody,
+    EventsSubLabelBody,
+    SubmitPlusBody,
 )
+from frigate.api.defs.response.event_response import (
+    EventCreateResponse,
+    EventMultiDeleteResponse,
+    EventResponse,
+    EventUploadPlusResponse,
+)
+from frigate.api.defs.response.generic_response import GenericResponse
+from frigate.api.defs.tags import Tags
+from frigate.const import CLIPS_DIR
 from frigate.embeddings import EmbeddingsContext
+from frigate.events.external import ExternalEventProcessor
 from frigate.models import Event, ReviewSegment, Timeline
-from frigate.object_processing import TrackedObject
+from frigate.object_processing import TrackedObject, TrackedObjectProcessor
 from frigate.util.builtin import get_tz_modifiers
 
 logger = logging.getLogger(__name__)
@@ -44,7 +51,7 @@ logger = logging.getLogger(__name__)
 router = APIRouter(tags=[Tags.events])
 
 
-@router.get("/events")
+@router.get("/events", response_model=list[EventResponse])
 def events(params: EventsQueryParams = Depends()):
     camera = params.camera
     cameras = params.cameras

@@ -246,6 +253,8 @@ def events(params: EventsQueryParams = Depends()):
             order_by = Event.start_time.asc()
         elif sort == "date_desc":
             order_by = Event.start_time.desc()
+        else:
+            order_by = Event.start_time.desc()
     else:
         order_by = Event.start_time.desc()
 
@@ -261,7 +270,7 @@ def events(params: EventsQueryParams = Depends()):
     return JSONResponse(content=list(events))
 
 
-@router.get("/events/explore")
+@router.get("/events/explore", response_model=list[EventResponse])
 def events_explore(limit: int = 10):
     # get distinct labels for all events
     distinct_labels = Event.select(Event.label).distinct().order_by(Event.label)

@@ -306,7 +315,8 @@ def events_explore(limit: int = 10):
             "data": {
                 k: v
                 for k, v in event.data.items()
-                if k in ["type", "score", "top_score", "description"]
+                if k
+                in ["type", "score", "top_score", "description", "sub_label_score"]
             },
             "event_count": label_counts[event.label],
         }

@@ -322,7 +332,7 @@ def events_explore(limit: int = 10):
     return JSONResponse(content=processed_events)
 
 
-@router.get("/event_ids")
+@router.get("/event_ids", response_model=list[EventResponse])
 def event_ids(ids: str):
     ids = ids.split(",")
 
@@ -580,19 +590,17 @@ def events_search(request: Request, params: EventsSearchQueryParams = Depends())
 
         processed_events.append(processed_event)
 
-    # Sort by search distance if search_results are available, otherwise by start_time as default
-    if search_results:
+    if (sort is None or sort == "relevance") and search_results:
         processed_events.sort(key=lambda x: x.get("search_distance", float("inf")))
+    elif min_score is not None and max_score is not None and sort == "score_asc":
+        processed_events.sort(key=lambda x: x["score"])
+    elif min_score is not None and max_score is not None and sort == "score_desc":
+        processed_events.sort(key=lambda x: x["score"], reverse=True)
+    elif sort == "date_asc":
+        processed_events.sort(key=lambda x: x["start_time"])
     else:
-        if sort == "score_asc":
-            processed_events.sort(key=lambda x: x["score"])
-        elif sort == "score_desc":
-            processed_events.sort(key=lambda x: x["score"], reverse=True)
-        elif sort == "date_asc":
-            processed_events.sort(key=lambda x: x["start_time"])
-        else:
-            # "date_desc" default
-            processed_events.sort(key=lambda x: x["start_time"], reverse=True)
+        # "date_desc" default
+        processed_events.sort(key=lambda x: x["start_time"], reverse=True)
 
     # Limit the number of events returned
     processed_events = processed_events[:limit]
@@ -645,7 +653,7 @@ def events_summary(params: EventsSummaryQueryParams = Depends()):
     return JSONResponse(content=[e for e in groups.dicts()])
 
 
-@router.get("/events/{event_id}")
+@router.get("/events/{event_id}", response_model=EventResponse)
 def event(event_id: str):
     try:
         return model_to_dict(Event.get(Event.id == event_id))

@@ -653,7 +661,7 @@ def event(event_id: str):
         return JSONResponse(content="Event not found", status_code=404)
 
 
-@router.post("/events/{event_id}/retain")
+@router.post("/events/{event_id}/retain", response_model=GenericResponse)
 def set_retain(event_id: str):
     try:
         event = Event.get(Event.id == event_id)

@@ -672,7 +680,7 @@ def set_retain(event_id: str):
     )
 
 
-@router.post("/events/{event_id}/plus")
+@router.post("/events/{event_id}/plus", response_model=EventUploadPlusResponse)
 def send_to_plus(request: Request, event_id: str, body: SubmitPlusBody = None):
     if not request.app.frigate_config.plus_api.is_active():
         message = "PLUS_API_KEY environment variable is not set"

@@ -784,7 +792,7 @@ def send_to_plus(request: Request, event_id: str, body: SubmitPlusBody = None):
     )
 
 
-@router.put("/events/{event_id}/false_positive")
+@router.put("/events/{event_id}/false_positive", response_model=EventUploadPlusResponse)
 def false_positive(request: Request, event_id: str):
     if not request.app.frigate_config.plus_api.is_active():
         message = "PLUS_API_KEY environment variable is not set"

@@ -873,7 +881,7 @@ def false_positive(request: Request, event_id: str):
     )
 
 
-@router.delete("/events/{event_id}/retain")
+@router.delete("/events/{event_id}/retain", response_model=GenericResponse)
 def delete_retain(event_id: str):
     try:
         event = Event.get(Event.id == event_id)

@@ -892,7 +900,7 @@ def delete_retain(event_id: str):
     )
 
 
-@router.post("/events/{event_id}/sub_label")
+@router.post("/events/{event_id}/sub_label", response_model=GenericResponse)
 def set_sub_label(
     request: Request,
     event_id: str,

@@ -944,7 +952,7 @@ def set_sub_label(
     )
 
 
-@router.post("/events/{event_id}/description")
+@router.post("/events/{event_id}/description", response_model=GenericResponse)
 def set_description(
     request: Request,
     event_id: str,

@@ -991,7 +999,7 @@ def set_description(
     )
 
 
-@router.put("/events/{event_id}/description/regenerate")
+@router.put("/events/{event_id}/description/regenerate", response_model=GenericResponse)
 def regenerate_description(
     request: Request, event_id: str, params: RegenerateQueryParameters = Depends()
 ):
@@ -1035,37 +1043,67 @@ def regenerate_description(
     )
 
 
-@router.delete("/events/{event_id}")
-def delete_event(request: Request, event_id: str):
+def delete_single_event(event_id: str, request: Request) -> dict:
     try:
         event = Event.get(Event.id == event_id)
     except DoesNotExist:
-        return JSONResponse(
-            content=({"success": False, "message": "Event " + event_id + " not found"}),
-            status_code=404,
-        )
+        return {"success": False, "message": f"Event {event_id} not found"}
 
     media_name = f"{event.camera}-{event.id}"
     if event.has_snapshot:
-        media = Path(f"{os.path.join(CLIPS_DIR, media_name)}.jpg")
-        media.unlink(missing_ok=True)
-        media = Path(f"{os.path.join(CLIPS_DIR, media_name)}-clean.png")
-        media.unlink(missing_ok=True)
+        snapshot_paths = [
+            Path(f"{os.path.join(CLIPS_DIR, media_name)}.jpg"),
+            Path(f"{os.path.join(CLIPS_DIR, media_name)}-clean.png"),
+        ]
+        for media in snapshot_paths:
+            media.unlink(missing_ok=True)
 
     event.delete_instance()
     Timeline.delete().where(Timeline.source_id == event_id).execute()
 
     # If semantic search is enabled, update the index
     if request.app.frigate_config.semantic_search.enabled:
         context: EmbeddingsContext = request.app.embeddings
         context.db.delete_embeddings_thumbnail(event_ids=[event_id])
         context.db.delete_embeddings_description(event_ids=[event_id])
-    return JSONResponse(
-        content=({"success": True, "message": "Event " + event_id + " deleted"}),
-        status_code=200,
-    )
+
+    return {"success": True, "message": f"Event {event_id} deleted"}
 
 
-@router.post("/events/{camera_name}/{label}/create")
+@router.delete("/events/{event_id}", response_model=GenericResponse)
+def delete_event(request: Request, event_id: str):
+    result = delete_single_event(event_id, request)
+    status_code = 200 if result["success"] else 404
+    return JSONResponse(content=result, status_code=status_code)
+
+
+@router.delete("/events/", response_model=EventMultiDeleteResponse)
+def delete_events(request: Request, body: EventsDeleteBody):
+    if not body.event_ids:
+        return JSONResponse(
+            content=({"success": False, "message": "No event IDs provided."}),
+            status_code=404,
+        )
+
+    deleted_events = []
+    not_found_events = []
+
+    for event_id in body.event_ids:
+        result = delete_single_event(event_id, request)
+        if result["success"]:
+            deleted_events.append(event_id)
+        else:
+            not_found_events.append(event_id)
+
+    response = {
+        "success": True,
+        "deleted_events": deleted_events,
+        "not_found_events": not_found_events,
+    }
+    return JSONResponse(content=response, status_code=200)
+
+
+@router.post("/events/{camera_name}/{label}/create", response_model=EventCreateResponse)
 def create_event(
     request: Request,
     camera_name: str,
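Taken together, the hunk above splits deletion into a reusable `delete_single_event()` helper plus single and bulk endpoints. A hypothetical client-side sketch of the new bulk route follows; the host, base path, and event IDs are placeholders, and `requests` is used purely for illustration (the exact external path depends on how the Frigate API is exposed):

```python
# Hypothetical call against the new bulk-delete endpoint (DELETE on the events collection).
import requests

resp = requests.delete(
    "http://127.0.0.1:5000/api/events/",                 # placeholder host/port/path
    json={"event_ids": ["event-id-1", "event-id-2"]},    # placeholder IDs
)
# Expected response shape per EventMultiDeleteResponse:
# {"success": true, "deleted_events": [...], "not_found_events": [...]}
print(resp.json())
```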
@@ -1087,9 +1125,11 @@ def create_event(
     )
 
     try:
-        frame = request.app.detected_frames_processor.get_current_frame(camera_name)
+        frame_processor: TrackedObjectProcessor = request.app.detected_frames_processor
+        external_processor: ExternalEventProcessor = request.app.external_processor
 
-        event_id = request.app.external_processor.create_manual_event(
+        frame = frame_processor.get_current_frame(camera_name)
+        event_id = external_processor.create_manual_event(
             camera_name,
             label,
             body.source_type,

@@ -1119,7 +1159,7 @@
     )
 
 
-@router.put("/events/{event_id}/end")
+@router.put("/events/{event_id}/end", response_model=GenericResponse)
 def end_event(request: Request, event_id: str, body: EventsEndBody):
     try:
         end_time = body.end_time or datetime.datetime.now().timestamp()
@@ -9,6 +9,7 @@ import psutil
 from fastapi import APIRouter, Request
 from fastapi.responses import JSONResponse
 from peewee import DoesNotExist
+from playhouse.shortcuts import model_to_dict
 
 from frigate.api.defs.request.export_recordings_body import ExportRecordingsBody
 from frigate.api.defs.tags import Tags

@@ -207,3 +208,14 @@ def export_delete(event_id: str):
         ),
         status_code=200,
     )
+
+
+@router.get("/exports/{export_id}")
+def get_export(export_id: str):
+    try:
+        return JSONResponse(content=model_to_dict(Export.get(Export.id == export_id)))
+    except DoesNotExist:
+        return JSONResponse(
+            content={"success": False, "message": "Export not found"},
+            status_code=404,
+        )
@@ -87,7 +87,11 @@ def create_fastapi_app(
     logger.info("FastAPI started")
 
     # Rate limiter (used for login endpoint)
-    auth.rateLimiter.set_limit(frigate_config.auth.failed_login_rate_limit or "")
+    if frigate_config.auth.failed_login_rate_limit is None:
+        limiter.enabled = False
+    else:
+        auth.rateLimiter.set_limit(frigate_config.auth.failed_login_rate_limit)
+
     app.state.limiter = limiter
     app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)
     app.add_middleware(SlowAPIMiddleware)
@@ -20,7 +20,7 @@ from pathvalidate import sanitize_filename
 from peewee import DoesNotExist, fn
 from tzlocal import get_localzone_name
 
-from frigate.api.defs.media_query_parameters import (
+from frigate.api.defs.query.media_query_parameters import (
     Extension,
     MediaEventsSnapshotQueryParams,
     MediaLatestFrameQueryParams,

@@ -36,6 +36,7 @@ from frigate.const import (
     RECORD_DIR,
 )
 from frigate.models import Event, Previews, Recordings, Regions, ReviewSegment
+from frigate.object_processing import TrackedObjectProcessor
 from frigate.util.builtin import get_tz_modifiers
 from frigate.util.image import get_image_from_recording
 
@@ -79,7 +80,11 @@ def mjpeg_feed(
 
 
 def imagestream(
-    detected_frames_processor, camera_name: str, fps: int, height: int, draw_options
+    detected_frames_processor: TrackedObjectProcessor,
+    camera_name: str,
+    fps: int,
+    height: int,
+    draw_options: dict[str, any],
 ):
     while True:
         # max out at specified FPS

@@ -118,6 +123,7 @@ def latest_frame(
     extension: Extension,
     params: MediaLatestFrameQueryParams = Depends(),
 ):
+    frame_processor: TrackedObjectProcessor = request.app.detected_frames_processor
     draw_options = {
         "bounding_boxes": params.bbox,
         "timestamp": params.timestamp,
@@ -129,17 +135,14 @@ def latest_frame(
     quality = params.quality
 
     if camera_name in request.app.frigate_config.cameras:
-        frame = request.app.detected_frames_processor.get_current_frame(
-            camera_name, draw_options
-        )
+        frame = frame_processor.get_current_frame(camera_name, draw_options)
         retry_interval = float(
             request.app.frigate_config.cameras.get(camera_name).ffmpeg.retry_interval
             or 10
         )
 
         if frame is None or datetime.now().timestamp() > (
-            request.app.detected_frames_processor.get_current_frame_time(camera_name)
-            + retry_interval
+            frame_processor.get_current_frame_time(camera_name) + retry_interval
         ):
             if request.app.camera_error_image is None:
                 error_image = glob.glob("/opt/frigate/frigate/images/camera-error.jpg")

@@ -180,7 +183,7 @@ def latest_frame(
         )
     elif camera_name == "birdseye" and request.app.frigate_config.birdseye.restream:
         frame = cv2.cvtColor(
-            request.app.detected_frames_processor.get_current_frame(camera_name),
+            frame_processor.get_current_frame(camera_name),
             cv2.COLOR_YUV2BGR_I420,
         )
 
@@ -813,15 +816,15 @@ def grid_snapshot(
 ):
     if camera_name in request.app.frigate_config.cameras:
         detect = request.app.frigate_config.cameras[camera_name].detect
-        frame = request.app.detected_frames_processor.get_current_frame(camera_name, {})
+        frame_processor: TrackedObjectProcessor = request.app.detected_frames_processor
+        frame = frame_processor.get_current_frame(camera_name, {})
         retry_interval = float(
             request.app.frigate_config.cameras.get(camera_name).ffmpeg.retry_interval
             or 10
         )
 
         if frame is None or datetime.now().timestamp() > (
-            request.app.detected_frames_processor.get_current_frame_time(camera_name)
-            + retry_interval
+            frame_processor.get_current_frame_time(camera_name) + retry_interval
         ):
             return JSONResponse(
                 content={"success": False, "message": "Unable to get valid frame"},
@@ -12,20 +12,21 @@ from fastapi.responses import JSONResponse
 from peewee import Case, DoesNotExist, fn, operator
 from playhouse.shortcuts import model_to_dict
 
-from frigate.api.defs.generic_response import GenericResponse
-from frigate.api.defs.review_body import ReviewModifyMultipleBody
-from frigate.api.defs.review_query_parameters import (
+from frigate.api.defs.query.review_query_parameters import (
     ReviewActivityMotionQueryParams,
     ReviewQueryParams,
     ReviewSummaryQueryParams,
 )
-from frigate.api.defs.review_responses import (
+from frigate.api.defs.request.review_body import ReviewModifyMultipleBody
+from frigate.api.defs.response.generic_response import GenericResponse
+from frigate.api.defs.response.review_response import (
     ReviewActivityMotionResponse,
     ReviewSegmentResponse,
     ReviewSummaryResponse,
 )
 from frigate.api.defs.tags import Tags
 from frigate.models import Recordings, ReviewSegment
+from frigate.review.types import SeverityEnum
 from frigate.util.builtin import get_tz_modifiers
 
 logger = logging.getLogger(__name__)
@@ -161,7 +162,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
                 None,
                 [
                     (
-                        (ReviewSegment.severity == "alert"),
+                        (ReviewSegment.severity == SeverityEnum.alert),
                         ReviewSegment.has_been_reviewed,
                     )
                 ],

@@ -173,7 +174,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
                 None,
                 [
                     (
-                        (ReviewSegment.severity == "detection"),
+                        (ReviewSegment.severity == SeverityEnum.detection),
                         ReviewSegment.has_been_reviewed,
                     )
                 ],

@@ -185,7 +186,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
                 None,
                 [
                     (
-                        (ReviewSegment.severity == "alert"),
+                        (ReviewSegment.severity == SeverityEnum.alert),
                         1,
                     )
                 ],

@@ -197,7 +198,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
                 None,
                 [
                     (
-                        (ReviewSegment.severity == "detection"),
+                        (ReviewSegment.severity == SeverityEnum.detection),
                         1,
                     )
                 ],

@@ -230,6 +231,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
         label_clause = reduce(operator.or_, label_clauses)
         clauses.append((label_clause))
 
+    day_in_seconds = 60 * 60 * 24
     last_month = (
         ReviewSegment.select(
             fn.strftime(

@@ -246,7 +248,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
                 None,
                 [
                     (
-                        (ReviewSegment.severity == "alert"),
+                        (ReviewSegment.severity == SeverityEnum.alert),
                         ReviewSegment.has_been_reviewed,
                     )
                 ],

@@ -258,7 +260,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
                 None,
                 [
                     (
-                        (ReviewSegment.severity == "detection"),
+                        (ReviewSegment.severity == SeverityEnum.detection),
                         ReviewSegment.has_been_reviewed,
                     )
                 ],

@@ -270,7 +272,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
                 None,
                 [
                     (
-                        (ReviewSegment.severity == "alert"),
+                        (ReviewSegment.severity == SeverityEnum.alert),
                         1,
                     )
                 ],

@@ -282,7 +284,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
                 None,
                 [
                     (
-                        (ReviewSegment.severity == "detection"),
+                        (ReviewSegment.severity == SeverityEnum.detection),
                         1,
                     )
                 ],

@@ -292,7 +294,7 @@ def review_summary(params: ReviewSummaryQueryParams = Depends()):
         )
         .where(reduce(operator.and_, clauses))
         .group_by(
-            (ReviewSegment.start_time + seconds_offset).cast("int") / (3600 * 24),
+            (ReviewSegment.start_time + seconds_offset).cast("int") / day_in_seconds,
         )
         .order_by(ReviewSegment.start_time.desc())
     )
@@ -362,7 +364,7 @@ def delete_reviews(body: ReviewModifyMultipleBody):
     ReviewSegment.delete().where(ReviewSegment.id << list_of_ids).execute()
 
     return JSONResponse(
-        content=({"success": True, "message": "Delete reviews"}), status_code=200
+        content=({"success": True, "message": "Deleted review items."}), status_code=200
     )
 
 
@@ -36,6 +36,7 @@ from frigate.const import (
     EXPORT_DIR,
     MODEL_CACHE_DIR,
     RECORD_DIR,
+    SHM_FRAMES_VAR,
 )
 from frigate.db.sqlitevecq import SqliteVecQueueDatabase
 from frigate.embeddings import EmbeddingsContext, manage_embeddings

@@ -436,7 +437,7 @@ class FrigateApp:
             # pre-create shms
             for i in range(shm_frame_count):
                 frame_size = config.frame_shape_yuv[0] * config.frame_shape_yuv[1]
-                self.frame_manager.create(f"{config.name}{i}", frame_size)
+                self.frame_manager.create(f"{config.name}_{i}", frame_size)
 
             capture_process = util.Process(
                 target=capture_camera,
@@ -523,7 +524,10 @@ class FrigateApp:
         if cam_total_frame_size == 0.0:
             return 0
 
-        shm_frame_count = min(200, int(available_shm / (cam_total_frame_size)))
+        shm_frame_count = min(
+            int(os.environ.get(SHM_FRAMES_VAR, "50")),
+            int(available_shm / (cam_total_frame_size)),
+        )
 
         logger.debug(
             f"Calculated total camera size {available_shm} / {cam_total_frame_size} :: {shm_frame_count} frames for each camera in SHM"
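The change above replaces the hard-coded 200-frame cap with an `SHM_MAX_FRAMES` environment override (default 50). An illustrative back-of-the-envelope sketch of the same arithmetic, using made-up numbers rather than the real config objects:

```python
# Illustrative sketch of the shm frame-count calculation (numbers are made up).
import os

frame_shape_yuv = (1080 * 3 // 2, 1920)   # YUV420 frame for a 1920x1080 camera
cam_total_frame_size = frame_shape_yuv[0] * frame_shape_yuv[1] / 1_000_000  # ~3.1 MB per frame
available_shm = 256                        # MB assumed free in /dev/shm

shm_frame_count = min(
    int(os.environ.get("SHM_MAX_FRAMES", "50")),   # new env-var ceiling (default 50)
    int(available_shm / cam_total_frame_size),     # what actually fits in shm
)
print(shm_frame_count)  # 50 with these example numbers
```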
@@ -230,12 +230,16 @@ def verify_recording_segments_setup_with_reasonable_time(
     try:
         seg_arg_index = record_args.index("-segment_time")
     except ValueError:
-        raise ValueError(f"Camera {camera_config.name} has no segment_time in \
-            recording output args, segment args are required for record.")
+        raise ValueError(
+            f"Camera {camera_config.name} has no segment_time in \
+            recording output args, segment args are required for record."
+        )
 
     if int(record_args[seg_arg_index + 1]) > 60:
-        raise ValueError(f"Camera {camera_config.name} has invalid segment_time output arg, \
-            segment_time must be 60 or less.")
+        raise ValueError(
+            f"Camera {camera_config.name} has invalid segment_time output arg, \
+            segment_time must be 60 or less."
+        )
 
 
 def verify_zone_objects_are_tracked(camera_config: CameraConfig) -> None:
@@ -13,6 +13,8 @@ FRIGATE_LOCALHOST = "http://127.0.0.1:5000"
 PLUS_ENV_VAR = "PLUS_API_KEY"
 PLUS_API_HOST = "https://api.frigate.video"
 
+SHM_FRAMES_VAR = "SHM_MAX_FRAMES"
+
 # Attribute & Object constants
 
 DEFAULT_ATTRIBUTE_LABEL_MAP = {
@@ -216,6 +216,10 @@ class AudioEventMaintainer(threading.Thread):
                     "label": label,
                     "last_detection": datetime.datetime.now().timestamp(),
                 }
+            else:
+                self.logger.warning(
+                    f"Failed to create audio event with status code {resp.status_code}"
+                )
 
     def expire_detections(self) -> None:
         now = datetime.datetime.now().timestamp()
@@ -4,7 +4,6 @@ import datetime
 import logging
 import os
 import threading
-from enum import Enum
 from multiprocessing.synchronize import Event as MpEvent
 from pathlib import Path
 

@@ -16,11 +15,6 @@ from frigate.models import Event, Timeline
 logger = logging.getLogger(__name__)
 
 
-class EventCleanupType(str, Enum):
-    clips = "clips"
-    snapshots = "snapshots"
-
-
 CHUNK_SIZE = 50
 
 
@@ -67,19 +61,11 @@ class EventCleanup(threading.Thread):
 
         return self.camera_labels[camera]["labels"]
 
-    def expire(self, media_type: EventCleanupType) -> list[str]:
+    def expire_snapshots(self) -> list[str]:
         ## Expire events from unlisted cameras based on the global config
-        if media_type == EventCleanupType.clips:
-            expire_days = max(
-                self.config.record.alerts.retain.days,
-                self.config.record.detections.retain.days,
-            )
-            file_extension = None  # mp4 clips are no longer stored in /clips
-            update_params = {"has_clip": False}
-        else:
-            retain_config = self.config.snapshots.retain
-            file_extension = "jpg"
-            update_params = {"has_snapshot": False}
+        retain_config = self.config.snapshots.retain
+        file_extension = "jpg"
+        update_params = {"has_snapshot": False}
 
         distinct_labels = self.get_removed_camera_labels()
 

@@ -87,10 +73,7 @@ class EventCleanup(threading.Thread):
         # loop over object types in db
         for event in distinct_labels:
             # get expiration time for this label
-            if media_type == EventCleanupType.snapshots:
-                expire_days = retain_config.objects.get(
-                    event.label, retain_config.default
-                )
+            expire_days = retain_config.objects.get(event.label, retain_config.default)
 
             expire_after = (
                 datetime.datetime.now() - datetime.timedelta(days=expire_days)

@@ -110,7 +93,7 @@ class EventCleanup(threading.Thread):
             .namedtuples()
             .iterator()
         )
-        logger.debug(f"{len(expired_events)} events can be expired")
+        logger.debug(f"{len(list(expired_events))} events can be expired")
         # delete the media from disk
         for expired in expired_events:
             media_name = f"{expired.camera}-{expired.id}"

@@ -162,13 +145,7 @@ class EventCleanup(threading.Thread):
 
         ## Expire events from cameras based on the camera config
         for name, camera in self.config.cameras.items():
-            if media_type == EventCleanupType.clips:
-                expire_days = max(
-                    camera.record.alerts.retain.days,
-                    camera.record.detections.retain.days,
-                )
-            else:
-                retain_config = camera.snapshots.retain
+            retain_config = camera.snapshots.retain
 
             # get distinct objects in database for this camera
             distinct_labels = self.get_camera_labels(name)

@@ -176,10 +153,9 @@ class EventCleanup(threading.Thread):
             # loop over object types in db
             for event in distinct_labels:
                 # get expiration time for this label
-                if media_type == EventCleanupType.snapshots:
-                    expire_days = retain_config.objects.get(
-                        event.label, retain_config.default
-                    )
+                expire_days = retain_config.objects.get(
+                    event.label, retain_config.default
+                )
 
                 expire_after = (
                     datetime.datetime.now() - datetime.timedelta(days=expire_days)
@@ -206,19 +182,143 @@ class EventCleanup(threading.Thread):
             for event in expired_events:
                 events_to_update.append(event.id)
 
-                if media_type == EventCleanupType.snapshots:
-                    try:
-                        media_name = f"{event.camera}-{event.id}"
-                        media_path = Path(
-                            f"{os.path.join(CLIPS_DIR, media_name)}.{file_extension}"
-                        )
-                        media_path.unlink(missing_ok=True)
-                        media_path = Path(
-                            f"{os.path.join(CLIPS_DIR, media_name)}-clean.png"
-                        )
-                        media_path.unlink(missing_ok=True)
-                    except OSError as e:
-                        logger.warning(f"Unable to delete event images: {e}")
+                try:
+                    media_name = f"{event.camera}-{event.id}"
+                    media_path = Path(
+                        f"{os.path.join(CLIPS_DIR, media_name)}.{file_extension}"
+                    )
+                    media_path.unlink(missing_ok=True)
+                    media_path = Path(
+                        f"{os.path.join(CLIPS_DIR, media_name)}-clean.png"
+                    )
+                    media_path.unlink(missing_ok=True)
+                except OSError as e:
+                    logger.warning(f"Unable to delete event images: {e}")
+
+        # update the clips attribute for the db entry
+        for i in range(0, len(events_to_update), CHUNK_SIZE):
+            batch = events_to_update[i : i + CHUNK_SIZE]
+            logger.debug(f"Updating {update_params} for {len(batch)} events")
+            Event.update(update_params).where(Event.id << batch).execute()
+
+        return events_to_update
+
+    def expire_clips(self) -> list[str]:
+        ## Expire events from unlisted cameras based on the global config
+        expire_days = max(
+            self.config.record.alerts.retain.days,
+            self.config.record.detections.retain.days,
+        )
+        file_extension = None  # mp4 clips are no longer stored in /clips
+        update_params = {"has_clip": False}
+
+        # get expiration time for this label
+
+        expire_after = (
+            datetime.datetime.now() - datetime.timedelta(days=expire_days)
+        ).timestamp()
+        # grab all events after specific time
+        expired_events: list[Event] = (
+            Event.select(
+                Event.id,
+                Event.camera,
+            )
+            .where(
+                Event.camera.not_in(self.camera_keys),
+                Event.start_time < expire_after,
+                Event.retain_indefinitely == False,
+            )
+            .namedtuples()
+            .iterator()
+        )
+        logger.debug(f"{len(list(expired_events))} events can be expired")
+        # delete the media from disk
+        for expired in expired_events:
+            media_name = f"{expired.camera}-{expired.id}"
+            media_path = Path(f"{os.path.join(CLIPS_DIR, media_name)}.{file_extension}")
+
+            try:
+                media_path.unlink(missing_ok=True)
+                if file_extension == "jpg":
+                    media_path = Path(
+                        f"{os.path.join(CLIPS_DIR, media_name)}-clean.png"
+                    )
+                    media_path.unlink(missing_ok=True)
+            except OSError as e:
+                logger.warning(f"Unable to delete event images: {e}")
+
+        # update the clips attribute for the db entry
+        query = Event.select(Event.id).where(
+            Event.camera.not_in(self.camera_keys),
+            Event.start_time < expire_after,
+            Event.retain_indefinitely == False,
+        )
+
+        events_to_update = []
+
+        for batch in query.iterator():
+            events_to_update.extend([event.id for event in batch])
+            if len(events_to_update) >= CHUNK_SIZE:
+                logger.debug(
+                    f"Updating {update_params} for {len(events_to_update)} events"
+                )
+                Event.update(update_params).where(
+                    Event.id << events_to_update
+                ).execute()
+                events_to_update = []
+
+        # Update any remaining events
+        if events_to_update:
+            logger.debug(
+                f"Updating clips/snapshots attribute for {len(events_to_update)} events"
+            )
+            Event.update(update_params).where(Event.id << events_to_update).execute()
+
+        events_to_update = []
+        now = datetime.datetime.now()
+
+        ## Expire events from cameras based on the camera config
+        for name, camera in self.config.cameras.items():
+            expire_days = max(
+                camera.record.alerts.retain.days,
+                camera.record.detections.retain.days,
+            )
+            alert_expire_date = (
+                now - datetime.timedelta(days=camera.record.alerts.retain.days)
+            ).timestamp()
+            detection_expire_date = (
+                now - datetime.timedelta(days=camera.record.detections.retain.days)
+            ).timestamp()
+            # grab all events after specific time
+            expired_events = (
+                Event.select(
+                    Event.id,
+                    Event.camera,
+                )
+                .where(
+                    Event.camera == name,
+                    Event.retain_indefinitely == False,
+                    (
+                        (
+                            (Event.data["max_severity"] != "detection")
+                            | (Event.data["max_severity"].is_null())
+                        )
+                        & (Event.end_time < alert_expire_date)
+                    )
+                    | (
+                        (Event.data["max_severity"] == "detection")
+                        & (Event.end_time < detection_expire_date)
+                    ),
+                )
+                .namedtuples()
+                .iterator()
+            )
+
+            # delete the grabbed clips from disk
+            # only snapshots are stored in /clips
+            # so no need to delete mp4 files
+            for event in expired_events:
+                events_to_update.append(event.id)
 
         # update the clips attribute for the db entry
         for i in range(0, len(events_to_update), CHUNK_SIZE):
@@ -230,8 +330,9 @@ class EventCleanup(threading.Thread):
 
     def run(self) -> None:
         # only expire events every 5 minutes
-        while not self.stop_event.wait(300):
-            events_with_expired_clips = self.expire(EventCleanupType.clips)
+        while not self.stop_event.wait(1):
+            events_with_expired_clips = self.expire_clips()
+            return
 
             # delete timeline entries for events that have expired recordings
             # delete up to 100,000 at a time

@@ -242,7 +343,7 @@ class EventCleanup(threading.Thread):
                     Timeline.source_id << deleted_events_list[i : i + max_deletes]
                 ).execute()
 
-            self.expire(EventCleanupType.snapshots)
+            self.expire_snapshots()
 
             # drop events from db where has_clip and has_snapshot are false
             events = (
@@ -10,6 +1,7 @@ from enum import Enum
 from typing import Optional

 import cv2
+from numpy import ndarray

 from frigate.comms.detections_updater import DetectionPublisher, DetectionTypeEnum
 from frigate.comms.events_updater import EventUpdatePublisher
@@ -45,7 +46,7 @@ class ExternalEventProcessor:
         duration: Optional[int],
         include_recording: bool,
         draw: dict[str, any],
-        snapshot_frame: any,
+        snapshot_frame: Optional[ndarray],
     ) -> str:
         now = datetime.datetime.now().timestamp()
         camera_config = self.config.cameras.get(camera)
@@ -107,6 +108,7 @@ class ExternalEventProcessor:
                 EventTypeEnum.api,
                 EventStateEnum.end,
                 None,
+                "",
                 {"id": event_id, "end_time": end_time},
             )
         )
@@ -131,8 +133,11 @@ class ExternalEventProcessor:
         label: str,
         event_id: str,
         draw: dict[str, any],
-        img_frame: any,
-    ) -> str:
+        img_frame: Optional[ndarray],
+    ) -> Optional[str]:
+        if img_frame is None:
+            return None
+
         # write clean snapshot if enabled
         if camera_config.snapshots.clean_copy:
             ret, png = cv2.imencode(".png", img_frame)
@@ -210,6 +210,7 @@ class EventProcessor(threading.Thread):
                 "top_score": event_data["top_score"],
                 "attributes": attributes,
                 "type": "object",
+                "max_severity": event_data.get("max_severity"),
             },
         }
@@ -6,7 +6,7 @@ import queue
 import threading
 from collections import Counter, defaultdict
 from multiprocessing.synchronize import Event as MpEvent
-from typing import Callable
+from typing import Callable, Optional

 import cv2
 import numpy as np
@@ -702,30 +702,7 @@ class TrackedObjectProcessor(threading.Thread):
             return False

         # If the object is not considered an alert or detection
-        review_config = self.config.cameras[camera].review
-        if not (
-            (
-                obj.obj_data["label"] in review_config.alerts.labels
-                and (
-                    not review_config.alerts.required_zones
-                    or set(obj.entered_zones) & set(review_config.alerts.required_zones)
-                )
-            )
-            or (
-                (
-                    not review_config.detections.labels
-                    or obj.obj_data["label"] in review_config.detections.labels
-                )
-                and (
-                    not review_config.detections.required_zones
-                    or set(obj.entered_zones)
-                    & set(review_config.detections.required_zones)
-                )
-            )
-        ):
-            logger.debug(
-                f"Not creating clip for {obj.obj_data['id']} because it did not qualify as an alert or detection"
-            )
+        if obj.max_severity is None:
             return False

         return True
@@ -784,13 +761,18 @@ class TrackedObjectProcessor(threading.Thread):
         else:
             return {}

-    def get_current_frame(self, camera, draw_options={}):
+    def get_current_frame(
+        self, camera: str, draw_options: dict[str, any] = {}
+    ) -> Optional[np.ndarray]:
         if camera == "birdseye":
             return self.frame_manager.get(
                 "birdseye",
                 (self.config.birdseye.height * 3 // 2, self.config.birdseye.width),
             )

+        if camera not in self.camera_states:
+            return None
+
         return self.camera_states[camera].get_current_frame(draw_options)

     def get_current_frame_time(self, camera) -> int:
@@ -7,7 +7,6 @@ import random
 import string
 import sys
 import threading
-from enum import Enum
 from multiprocessing.synchronize import Event as MpEvent
 from pathlib import Path
 from typing import Optional
@@ -27,6 +26,7 @@ from frigate.const import (
 from frigate.events.external import ManualEventState
 from frigate.models import ReviewSegment
 from frigate.object_processing import TrackedObject
+from frigate.review.types import SeverityEnum
 from frigate.util.image import SharedMemoryFrameManager, calculate_16_9_crop

 logger = logging.getLogger(__name__)
@@ -39,11 +39,6 @@ THRESHOLD_ALERT_ACTIVITY = 120
 THRESHOLD_DETECTION_ACTIVITY = 30


-class SeverityEnum(str, Enum):
-    alert = "alert"
-    detection = "detection"
-
-
 class PendingReviewSegment:
     def __init__(
         self,
@@ -480,7 +475,9 @@ class ReviewSegmentMaintainer(threading.Thread):

             if not self.config.cameras[camera].record.enabled:
                 if current_segment:
-                    self.update_existing_segment(current_segment, frame_time, [])
+                    self.update_existing_segment(
+                        current_segment, frame_name, frame_time, []
+                    )

                 continue
6 frigate/review/types.py Normal file
@@ -0,0 +1,6 @@
from enum import Enum


class SeverityEnum(str, Enum):
    alert = "alert"
    detection = "detection"
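A minimal, self-contained sketch of how the new shared enum behaves; the class body mirrors frigate/review/types.py above, while the retention helper and its defaults are purely illustrative:

```python
from enum import Enum


class SeverityEnum(str, Enum):
    # mirrored locally so the snippet runs outside the repo
    alert = "alert"
    detection = "detection"


def retention_days(severity: str, alert_days: int = 14, detection_days: int = 7) -> int:
    # str-based enums compare equal to plain strings, which is what lets the
    # review/event code store the value directly in JSON columns
    return alert_days if severity == SeverityEnum.alert else detection_days


print(retention_days(SeverityEnum.alert))  # 14
print(retention_days("detection"))         # 7
```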
@@ -9,8 +9,8 @@ from playhouse.sqliteq import SqliteQueueDatabase

 from frigate.api.fastapi_app import create_fastapi_app
 from frigate.config import FrigateConfig
-from frigate.models import Event, ReviewSegment
-from frigate.review.maintainer import SeverityEnum
+from frigate.models import Event, Recordings, ReviewSegment
+from frigate.review.types import SeverityEnum
 from frigate.test.const import TEST_DB, TEST_DB_CLEANUPS


@@ -146,17 +146,35 @@ class BaseTestHttp(unittest.TestCase):
     def insert_mock_review_segment(
         self,
         id: str,
-        start_time: datetime.datetime = datetime.datetime.now().timestamp(),
-        end_time: datetime.datetime = datetime.datetime.now().timestamp() + 20,
+        start_time: float = datetime.datetime.now().timestamp(),
+        end_time: float = datetime.datetime.now().timestamp() + 20,
+        severity: SeverityEnum = SeverityEnum.alert,
+        has_been_reviewed: bool = False,
     ) -> Event:
-        """Inserts a basic event model with a given id."""
+        """Inserts a review segment model with a given id."""
         return ReviewSegment.insert(
             id=id,
             camera="front_door",
             start_time=start_time,
             end_time=end_time,
-            has_been_reviewed=False,
-            severity=SeverityEnum.alert,
+            has_been_reviewed=has_been_reviewed,
+            severity=severity,
             thumb_path=False,
             data={},
         ).execute()

+    def insert_mock_recording(
+        self,
+        id: str,
+        start_time: float = datetime.datetime.now().timestamp(),
+        end_time: float = datetime.datetime.now().timestamp() + 20,
+    ) -> Event:
+        """Inserts a recording model with a given id."""
+        return Recordings.insert(
+            id=id,
+            path=id,
+            camera="front_door",
+            start_time=start_time,
+            end_time=end_time,
+            duration=end_time - start_time,
+        ).execute()
@@ -1,76 +1,89 @@
-import datetime
+from datetime import datetime, timedelta

 from fastapi.testclient import TestClient

-from frigate.models import Event, ReviewSegment
+from frigate.models import Event, Recordings, ReviewSegment
+from frigate.review.types import SeverityEnum
 from frigate.test.http_api.base_http_test import BaseTestHttp


 class TestHttpReview(BaseTestHttp):
     def setUp(self):
-        super().setUp([Event, ReviewSegment])
+        super().setUp([Event, Recordings, ReviewSegment])
+        self.app = super().create_app()
+
+    def _get_reviews(self, ids: list[str]):
+        return list(
+            ReviewSegment.select(ReviewSegment.id)
+            .where(ReviewSegment.id.in_(ids))
+            .execute()
+        )
+
+    def _get_recordings(self, ids: list[str]):
+        return list(
+            Recordings.select(Recordings.id).where(Recordings.id.in_(ids)).execute()
+        )
+
+    ####################################################################################################################
+    ################################### GET /review Endpoint ##########################################################
+    ####################################################################################################################

     # Does not return any data point since the end time (before parameter) is not passed and the review segment end_time is 2 seconds from now
     def test_get_review_no_filters_no_matches(self):
-        app = super().create_app()
-        now = datetime.datetime.now().timestamp()
+        now = datetime.now().timestamp()

-        with TestClient(app) as client:
+        with TestClient(self.app) as client:
             super().insert_mock_review_segment("123456.random", now, now + 2)
-            reviews_response = client.get("/review")
-            assert reviews_response.status_code == 200
-            reviews_in_response = reviews_response.json()
-            assert len(reviews_in_response) == 0
+            response = client.get("/review")
+            assert response.status_code == 200
+            response_json = response.json()
+            assert len(response_json) == 0

     def test_get_review_no_filters(self):
-        app = super().create_app()
-        now = datetime.datetime.now().timestamp()
+        now = datetime.now().timestamp()

-        with TestClient(app) as client:
+        with TestClient(self.app) as client:
             super().insert_mock_review_segment("123456.random", now - 2, now - 1)
-            reviews_response = client.get("/review")
-            assert reviews_response.status_code == 200
-            reviews_in_response = reviews_response.json()
-            assert len(reviews_in_response) == 1
+            response = client.get("/review")
+            assert response.status_code == 200
+            response_json = response.json()
+            assert len(response_json) == 1

     def test_get_review_with_time_filter_no_matches(self):
-        app = super().create_app()
-        now = datetime.datetime.now().timestamp()
+        now = datetime.now().timestamp()

-        with TestClient(app) as client:
+        with TestClient(self.app) as client:
             id = "123456.random"
             super().insert_mock_review_segment(id, now, now + 2)
             params = {
                 "after": now,
                 "before": now + 3,
             }
-            reviews_response = client.get("/review", params=params)
-            assert reviews_response.status_code == 200
-            reviews_in_response = reviews_response.json()
-            assert len(reviews_in_response) == 0
+            response = client.get("/review", params=params)
+            assert response.status_code == 200
+            response_json = response.json()
+            assert len(response_json) == 0

     def test_get_review_with_time_filter(self):
-        app = super().create_app()
-        now = datetime.datetime.now().timestamp()
+        now = datetime.now().timestamp()

-        with TestClient(app) as client:
+        with TestClient(self.app) as client:
             id = "123456.random"
             super().insert_mock_review_segment(id, now, now + 2)
             params = {
                 "after": now - 1,
                 "before": now + 3,
             }
-            reviews_response = client.get("/review", params=params)
-            assert reviews_response.status_code == 200
-            reviews_in_response = reviews_response.json()
-            assert len(reviews_in_response) == 1
-            assert reviews_in_response[0]["id"] == id
+            response = client.get("/review", params=params)
+            assert response.status_code == 200
+            response_json = response.json()
+            assert len(response_json) == 1
+            assert response_json[0]["id"] == id

     def test_get_review_with_limit_filter(self):
-        app = super().create_app()
-        now = datetime.datetime.now().timestamp()
+        now = datetime.now().timestamp()

-        with TestClient(app) as client:
+        with TestClient(self.app) as client:
             id = "123456.random"
             id2 = "654321.random"
             super().insert_mock_review_segment(id, now, now + 2)
@@ -80,17 +93,49 @@ class TestHttpReview(BaseTestHttp):
                 "after": now,
                 "before": now + 3,
             }
-            reviews_response = client.get("/review", params=params)
-            assert reviews_response.status_code == 200
-            reviews_in_response = reviews_response.json()
-            assert len(reviews_in_response) == 1
-            assert reviews_in_response[0]["id"] == id2
+            response = client.get("/review", params=params)
+            assert response.status_code == 200
+            response_json = response.json()
+            assert len(response_json) == 1
+            assert response_json[0]["id"] == id2
+
+    def test_get_review_with_severity_filters_no_matches(self):
+        now = datetime.now().timestamp()
+
+        with TestClient(self.app) as client:
+            id = "123456.random"
+            super().insert_mock_review_segment(id, now, now + 2, SeverityEnum.detection)
+            params = {
+                "severity": "detection",
+                "after": now - 1,
+                "before": now + 3,
+            }
+            response = client.get("/review", params=params)
+            assert response.status_code == 200
+            response_json = response.json()
+            assert len(response_json) == 1
+            assert response_json[0]["id"] == id
+
+    def test_get_review_with_severity_filters(self):
+        now = datetime.now().timestamp()
+
+        with TestClient(self.app) as client:
+            id = "123456.random"
+            super().insert_mock_review_segment(id, now, now + 2, SeverityEnum.detection)
+            params = {
+                "severity": "alert",
+                "after": now - 1,
+                "before": now + 3,
+            }
+            response = client.get("/review", params=params)
+            assert response.status_code == 200
+            response_json = response.json()
+            assert len(response_json) == 0

     def test_get_review_with_all_filters(self):
-        app = super().create_app()
-        now = datetime.datetime.now().timestamp()
+        now = datetime.now().timestamp()

-        with TestClient(app) as client:
+        with TestClient(self.app) as client:
             id = "123456.random"
             super().insert_mock_review_segment(id, now, now + 2)
             params = {
@@ -103,8 +148,424 @@ class TestHttpReview(BaseTestHttp):
                 "after": now - 1,
                 "before": now + 3,
             }
-            reviews_response = client.get("/review", params=params)
-            assert reviews_response.status_code == 200
-            reviews_in_response = reviews_response.json()
-            assert len(reviews_in_response) == 1
-            assert reviews_in_response[0]["id"] == id
+            response = client.get("/review", params=params)
+            assert response.status_code == 200
+            response_json = response.json()
+            assert len(response_json) == 1
+            assert response_json[0]["id"] == id
+
+    ####################################################################################################################
+    ################################### GET /review/summary Endpoint ##################################################
+    ####################################################################################################################
+    def test_get_review_summary_all_filters(self):
+        with TestClient(self.app) as client:
+            super().insert_mock_review_segment("123456.random")
+            params = {
+                "cameras": "front_door",
+                "labels": "all",
+                "zones": "all",
+                "timezone": "utc",
+            }
+            response = client.get("/review/summary", params=params)
+            assert response.status_code == 200
+            response_json = response.json()
+            # e.g. '2024-11-24'
+            today_formatted = datetime.today().strftime("%Y-%m-%d")
+            expected_response = {
+                "last24Hours": {
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+                today_formatted: {
+                    "day": today_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+            }
+            self.assertEqual(response_json, expected_response)
+
+    def test_get_review_summary_no_filters(self):
+        with TestClient(self.app) as client:
+            super().insert_mock_review_segment("123456.random")
+            response = client.get("/review/summary")
+            assert response.status_code == 200
+            response_json = response.json()
+            # e.g. '2024-11-24'
+            today_formatted = datetime.today().strftime("%Y-%m-%d")
+            expected_response = {
+                "last24Hours": {
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+                today_formatted: {
+                    "day": today_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+            }
+            self.assertEqual(response_json, expected_response)
+
+    def test_get_review_summary_multiple_days(self):
+        now = datetime.now()
+        five_days_ago = datetime.today() - timedelta(days=5)
+
+        with TestClient(self.app) as client:
+            super().insert_mock_review_segment(
+                "123456.random", now.timestamp() - 2, now.timestamp() - 1
+            )
+            super().insert_mock_review_segment(
+                "654321.random",
+                five_days_ago.timestamp(),
+                five_days_ago.timestamp() + 1,
+            )
+            response = client.get("/review/summary")
+            assert response.status_code == 200
+            response_json = response.json()
+            # e.g. '2024-11-24'
+            today_formatted = now.strftime("%Y-%m-%d")
+            # e.g. '2024-11-19'
+            five_days_ago_formatted = five_days_ago.strftime("%Y-%m-%d")
+            expected_response = {
+                "last24Hours": {
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+                today_formatted: {
+                    "day": today_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+                five_days_ago_formatted: {
+                    "day": five_days_ago_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+            }
+            self.assertEqual(response_json, expected_response)
+
+    def test_get_review_summary_multiple_days_edge_cases(self):
+        now = datetime.now()
+        five_days_ago = datetime.today() - timedelta(days=5)
+        twenty_days_ago = datetime.today() - timedelta(days=20)
+        one_month_ago = datetime.today() - timedelta(days=30)
+        one_month_ago_ts = one_month_ago.timestamp()
+
+        with TestClient(self.app) as client:
+            super().insert_mock_review_segment("123456.random", now.timestamp())
+            super().insert_mock_review_segment(
+                "123457.random", five_days_ago.timestamp()
+            )
+            super().insert_mock_review_segment(
+                "123458.random",
+                twenty_days_ago.timestamp(),
+                None,
+                SeverityEnum.detection,
+            )
+            # One month ago plus 5 seconds fits within the condition (review.start_time > month_ago). Assuming that the endpoint does not take more than 5 seconds to be invoked
+            super().insert_mock_review_segment(
+                "123459.random",
+                one_month_ago_ts + 5,
+                None,
+                SeverityEnum.detection,
+            )
+            # This won't appear in the output since it's not within last month start_time clause (review.start_time > month_ago)
+            super().insert_mock_review_segment("123450.random", one_month_ago_ts)
+            response = client.get("/review/summary")
+            assert response.status_code == 200
+            response_json = response.json()
+            # e.g. '2024-11-24'
+            today_formatted = now.strftime("%Y-%m-%d")
+            # e.g. '2024-11-19'
+            five_days_ago_formatted = five_days_ago.strftime("%Y-%m-%d")
+            # e.g. '2024-11-04'
+            twenty_days_ago_formatted = twenty_days_ago.strftime("%Y-%m-%d")
+            # e.g. '2024-10-24'
+            one_month_ago_formatted = one_month_ago.strftime("%Y-%m-%d")
+            expected_response = {
+                "last24Hours": {
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+                today_formatted: {
+                    "day": today_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+                five_days_ago_formatted: {
+                    "day": five_days_ago_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+                twenty_days_ago_formatted: {
+                    "day": twenty_days_ago_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 0,
+                    "total_detection": 1,
+                },
+                one_month_ago_formatted: {
+                    "day": one_month_ago_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 0,
+                    "total_detection": 1,
+                },
+            }
+            self.assertEqual(response_json, expected_response)
+
+    def test_get_review_summary_multiple_in_same_day(self):
+        now = datetime.now()
+        five_days_ago = datetime.today() - timedelta(days=5)
+
+        with TestClient(self.app) as client:
+            super().insert_mock_review_segment("123456.random", now.timestamp())
+            five_days_ago_ts = five_days_ago.timestamp()
+            for i in range(20):
+                super().insert_mock_review_segment(
+                    f"123456_{i}.random_alert",
+                    five_days_ago_ts,
+                    five_days_ago_ts,
+                    SeverityEnum.alert,
+                )
+            for i in range(15):
+                super().insert_mock_review_segment(
+                    f"123456_{i}.random_detection",
+                    five_days_ago_ts,
+                    five_days_ago_ts,
+                    SeverityEnum.detection,
+                )
+            response = client.get("/review/summary")
+            assert response.status_code == 200
+            response_json = response.json()
+            # e.g. '2024-11-24'
+            today_formatted = now.strftime("%Y-%m-%d")
+            # e.g. '2024-11-19'
+            five_days_ago_formatted = five_days_ago.strftime("%Y-%m-%d")
+            expected_response = {
+                "last24Hours": {
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+                today_formatted: {
+                    "day": today_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 1,
+                    "total_detection": 0,
+                },
+                five_days_ago_formatted: {
+                    "day": five_days_ago_formatted,
+                    "reviewed_alert": 0,
+                    "reviewed_detection": 0,
+                    "total_alert": 20,
+                    "total_detection": 15,
+                },
+            }
+            self.assertEqual(response_json, expected_response)
+
+    def test_get_review_summary_multiple_in_same_day_with_reviewed(self):
+        five_days_ago = datetime.today() - timedelta(days=5)
+
+        with TestClient(self.app) as client:
+            five_days_ago_ts = five_days_ago.timestamp()
+            for i in range(10):
+                super().insert_mock_review_segment(
+                    f"123456_{i}.random_alert_not_reviewed",
+                    five_days_ago_ts,
+                    five_days_ago_ts,
+                    SeverityEnum.alert,
+                    False,
+                )
+            for i in range(10):
+                super().insert_mock_review_segment(
+                    f"123456_{i}.random_alert_reviewed",
+                    five_days_ago_ts,
+                    five_days_ago_ts,
+                    SeverityEnum.alert,
+                    True,
+                )
+            for i in range(10):
+                super().insert_mock_review_segment(
+                    f"123456_{i}.random_detection_not_reviewed",
+                    five_days_ago_ts,
+                    five_days_ago_ts,
+                    SeverityEnum.detection,
+                    False,
+                )
+            for i in range(5):
+                super().insert_mock_review_segment(
+                    f"123456_{i}.random_detection_reviewed",
+                    five_days_ago_ts,
+                    five_days_ago_ts,
+                    SeverityEnum.detection,
+                    True,
+                )
+            response = client.get("/review/summary")
+            assert response.status_code == 200
+            response_json = response.json()
+            # e.g. '2024-11-19'
+            five_days_ago_formatted = five_days_ago.strftime("%Y-%m-%d")
+            expected_response = {
+                "last24Hours": {
+                    "reviewed_alert": None,
+                    "reviewed_detection": None,
+                    "total_alert": None,
+                    "total_detection": None,
+                },
+                five_days_ago_formatted: {
+                    "day": five_days_ago_formatted,
+                    "reviewed_alert": 10,
+                    "reviewed_detection": 5,
+                    "total_alert": 20,
+                    "total_detection": 15,
+                },
+            }
+            self.assertEqual(response_json, expected_response)
+
+    ####################################################################################################################
+    ################################### POST reviews/viewed Endpoint ##################################################
+    ####################################################################################################################
+    def test_post_reviews_viewed_no_body(self):
+        with TestClient(self.app) as client:
+            super().insert_mock_review_segment("123456.random")
+            response = client.post("/reviews/viewed")
+            # Missing ids
+            assert response.status_code == 422
+
+    def test_post_reviews_viewed_no_body_ids(self):
+        with TestClient(self.app) as client:
+            super().insert_mock_review_segment("123456.random")
+            body = {"ids": [""]}
+            response = client.post("/reviews/viewed", json=body)
+            # Missing ids
+            assert response.status_code == 422
+
+    def test_post_reviews_viewed_non_existent_id(self):
+        with TestClient(self.app) as client:
+            id = "123456.random"
+            super().insert_mock_review_segment(id)
+            body = {"ids": ["1"]}
+            response = client.post("/reviews/viewed", json=body)
+            assert response.status_code == 200
+            response = response.json()
+            assert response["success"] == True
+            assert response["message"] == "Reviewed multiple items"
+            # Verify that in DB the review segment was not changed
+            review_segment_in_db = (
+                ReviewSegment.select(ReviewSegment.has_been_reviewed)
+                .where(ReviewSegment.id == id)
+                .get()
+            )
+            assert review_segment_in_db.has_been_reviewed == False
+
+    def test_post_reviews_viewed(self):
+        with TestClient(self.app) as client:
+            id = "123456.random"
+            super().insert_mock_review_segment(id)
+            body = {"ids": [id]}
+            response = client.post("/reviews/viewed", json=body)
+            assert response.status_code == 200
+            response = response.json()
+            assert response["success"] == True
+            assert response["message"] == "Reviewed multiple items"
+            # Verify that in DB the review segment was changed
+            review_segment_in_db = (
+                ReviewSegment.select(ReviewSegment.has_been_reviewed)
+                .where(ReviewSegment.id == id)
+                .get()
+            )
+            assert review_segment_in_db.has_been_reviewed == True
+
+    ####################################################################################################################
+    ################################### POST reviews/delete Endpoint ##################################################
+    ####################################################################################################################
+    def test_post_reviews_delete_no_body(self):
+        with TestClient(self.app) as client:
+            super().insert_mock_review_segment("123456.random")
+            response = client.post("/reviews/delete")
+            # Missing ids
+            assert response.status_code == 422
+
+    def test_post_reviews_delete_no_body_ids(self):
+        with TestClient(self.app) as client:
+            super().insert_mock_review_segment("123456.random")
+            body = {"ids": [""]}
+            response = client.post("/reviews/delete", json=body)
+            # Missing ids
+            assert response.status_code == 422
+
+    def test_post_reviews_delete_non_existent_id(self):
+        with TestClient(self.app) as client:
+            id = "123456.random"
+            super().insert_mock_review_segment(id)
+            body = {"ids": ["1"]}
+            response = client.post("/reviews/delete", json=body)
+            assert response.status_code == 200
+            response_json = response.json()
+            assert response_json["success"] == True
+            assert response_json["message"] == "Deleted review items."
+            # Verify that in DB the review segment was not deleted
+            review_ids_in_db_after = self._get_reviews([id])
+            assert len(review_ids_in_db_after) == 1
+            assert review_ids_in_db_after[0].id == id
+
+    def test_post_reviews_delete(self):
+        with TestClient(self.app) as client:
+            id = "123456.random"
+            super().insert_mock_review_segment(id)
+            body = {"ids": [id]}
+            response = client.post("/reviews/delete", json=body)
+            assert response.status_code == 200
+            response_json = response.json()
+            assert response_json["success"] == True
+            assert response_json["message"] == "Deleted review items."
+            # Verify that in DB the review segment was deleted
+            review_ids_in_db_after = self._get_reviews([id])
+            assert len(review_ids_in_db_after) == 0
+
+    def test_post_reviews_delete_many(self):
+        with TestClient(self.app) as client:
+            ids = ["123456.random", "654321.random"]
+            for id in ids:
+                super().insert_mock_review_segment(id)
+                super().insert_mock_recording(id)
+
+            review_ids_in_db_before = self._get_reviews(ids)
+            recordings_ids_in_db_before = self._get_recordings(ids)
+            assert len(review_ids_in_db_before) == 2
+            assert len(recordings_ids_in_db_before) == 2
+
+            body = {"ids": ids}
+            response = client.post("/reviews/delete", json=body)
+            assert response.status_code == 200
+            response_json = response.json()
+            assert response_json["success"] == True
+            assert response_json["message"] == "Deleted review items."
+
+            # Verify that in DB all review segments and recordings that were passed were deleted
+            review_ids_in_db_after = self._get_reviews(ids)
+            recording_ids_in_db_after = self._get_recordings(ids)
+            assert len(review_ids_in_db_after) == 0
+            assert len(recording_ids_in_db_after) == 0
@@ -168,7 +168,7 @@ class TestHttp(unittest.TestCase):

         assert event
         assert event["id"] == id
-        assert event == model_to_dict(Event.get(Event.id == id))
+        assert event["id"] == model_to_dict(Event.get(Event.id == id))["id"]

     def test_get_bad_event(self):
         app = create_fastapi_app(
@@ -13,6 +13,7 @@ from frigate.config import (
     CameraConfig,
     ModelConfig,
 )
+from frigate.review.types import SeverityEnum
 from frigate.util.image import (
     area,
     calculate_region,
@@ -59,6 +60,27 @@ class TrackedObject:
         self.pending_loitering = False
         self.previous = self.to_dict()

+    @property
+    def max_severity(self) -> Optional[str]:
+        review_config = self.camera_config.review
+
+        if self.obj_data["label"] in review_config.alerts.labels and (
+            not review_config.alerts.required_zones
+            or set(self.entered_zones) & set(review_config.alerts.required_zones)
+        ):
+            return SeverityEnum.alert
+
+        if (
+            not review_config.detections.labels
+            or self.obj_data["label"] in review_config.detections.labels
+        ) and (
+            not review_config.detections.required_zones
+            or set(self.entered_zones) & set(review_config.detections.required_zones)
+        ):
+            return SeverityEnum.detection
+
+        return None
+
     def _is_false_positive(self):
         # once a true positive, always a true positive
         if not self.false_positive:
@@ -232,6 +254,7 @@ class TrackedObject:
             "attributes": self.attributes,
             "current_attributes": self.obj_data["attributes"],
             "pending_loitering": self.pending_loitering,
+            "max_severity": self.max_severity,
         }

         if include_thumbnail:
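A self-contained sketch of the alert-over-detection precedence that the max_severity property above encodes; the flattened arguments are hypothetical simplifications of the real review config objects:

```python
from typing import Optional


def max_severity(
    label: str,
    entered_zones: set[str],
    alert_labels: set[str],
    alert_zones: set[str],
    detection_labels: set[str],
    detection_zones: set[str],
) -> Optional[str]:
    # alerts win when the label is listed and any required zone was entered
    if label in alert_labels and (not alert_zones or entered_zones & alert_zones):
        return "alert"
    # detections accept any label when no label list is configured
    if (not detection_labels or label in detection_labels) and (
        not detection_zones or entered_zones & detection_zones
    ):
        return "detection"
    return None


print(max_severity("person", {"porch"}, {"person"}, {"porch"}, set(), set()))  # alert
print(max_severity("dog", {"yard"}, {"person"}, set(), set(), set()))          # detection
```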
@@ -13,12 +13,12 @@ import urllib.parse
 from collections.abc import Mapping
 from pathlib import Path
 from typing import Any, Optional, Tuple, Union
+from zoneinfo import ZoneInfoNotFoundError

 import numpy as np
 import pytz
 from ruamel.yaml import YAML
 from tzlocal import get_localzone
-from zoneinfo import ZoneInfoNotFoundError

 from frigate.const import REGEX_HTTP_CAMERA_USER_PASS, REGEX_RTSP_CAMERA_USER_PASS
@@ -219,19 +219,35 @@ def draw_box_with_label(
     text_width = size[0][0]
     text_height = size[0][1]
     line_height = text_height + size[1]
+    # get frame height
+    frame_height = frame.shape[0]
     # set the text start position
     if position == "ul":
         text_offset_x = x_min
-        text_offset_y = 0 if y_min < line_height else y_min - (line_height + 8)
+        text_offset_y = max(0, y_min - (line_height + 8))
     elif position == "ur":
-        text_offset_x = x_max - (text_width + 8)
-        text_offset_y = 0 if y_min < line_height else y_min - (line_height + 8)
+        text_offset_x = max(0, x_max - (text_width + 8))
+        text_offset_y = max(0, y_min - (line_height + 8))
     elif position == "bl":
         text_offset_x = x_min
-        text_offset_y = y_max
+        text_offset_y = min(frame_height - line_height, y_max)
     elif position == "br":
-        text_offset_x = x_max - (text_width + 8)
-        text_offset_y = y_max
+        text_offset_x = max(0, x_max - (text_width + 8))
+        text_offset_y = min(frame_height - line_height, y_max)
+
+    # Adjust position if it overlaps with the box or goes out of frame
+    if position in {"ul", "ur"}:
+        if text_offset_y < y_min + thickness:  # Label overlaps with the box
+            if y_min - (line_height + 8) < 0 and y_max + line_height <= frame_height:
+                # Not enough space above, and there is space below
+                text_offset_y = y_max
+            elif y_min - (line_height + 8) >= 0:
+                # Enough space above, keep the label at the top
+                text_offset_y = max(0, y_min - (line_height + 8))
+    elif position in {"bl", "br"}:
+        if text_offset_y + line_height > frame_height:
+            # If there's not enough space below, try above the box
+            text_offset_y = max(0, y_min - (line_height + 8))

     # make the coords of the box with a small padding of two pixels
     textbox_coords = (
         (text_offset_x, text_offset_y),
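The clamping added to draw_box_with_label above can be exercised in isolation; this sketch keeps only the vertical-placement math (the argument names and the 8-pixel padding follow the hunk, while the overlap check and the rest of the drawing code are omitted):

```python
def label_offset_y(position: str, y_min: int, y_max: int, line_height: int, frame_height: int) -> int:
    if position in {"ul", "ur"}:
        offset = max(0, y_min - (line_height + 8))
        if y_min - (line_height + 8) < 0 and y_max + line_height <= frame_height:
            # not enough room above the box, draw the label below it instead
            offset = y_max
        return offset
    # "bl" / "br": draw below the box, but never past the bottom edge
    return min(frame_height - line_height, y_max)


print(label_offset_y("ul", y_min=5, y_max=120, line_height=20, frame_height=480))    # 120: pushed below the box
print(label_offset_y("bl", y_min=400, y_max=470, line_height=20, frame_height=480))  # 460: clamped to the frame
```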
@@ -113,7 +113,7 @@ def capture_frames(
         fps.value = frame_rate.eps()
         skipped_fps.value = skipped_eps.eps()
         current_frame.value = datetime.datetime.now().timestamp()
-        frame_name = f"{config.name}{frame_index}"
+        frame_name = f"{config.name}_{frame_index}"
         frame_buffer = frame_manager.write(frame_name)
         try:
             frame_buffer[:] = ffmpeg_process.stdout.read(frame_size)
10 web/package-lock.json generated
@@ -72,6 +72,7 @@
         "tailwind-merge": "^2.4.0",
         "tailwind-scrollbar": "^3.1.0",
         "tailwindcss-animate": "^1.0.7",
+        "use-long-press": "^3.2.0",
         "vaul": "^0.9.1",
         "vite-plugin-monaco-editor": "^1.1.0",
         "zod": "^3.23.8"
@@ -8709,6 +8710,15 @@
         "scheduler": ">=0.19.0"
       }
     },
+    "node_modules/use-long-press": {
+      "version": "3.2.0",
+      "resolved": "https://registry.npmjs.org/use-long-press/-/use-long-press-3.2.0.tgz",
+      "integrity": "sha512-uq5o2qFR1VRjHn8Of7Fl344/AGvgk7C5Mcb4aSb1ZRVp6PkgdXJJLdRrlSTJQVkkQcDuqFbFc3mDX4COg7mRTA==",
+      "license": "MIT",
+      "peerDependencies": {
+        "react": ">=16.8.0"
+      }
+    },
     "node_modules/use-sidecar": {
       "version": "1.1.2",
       "resolved": "https://registry.npmjs.org/use-sidecar/-/use-sidecar-1.1.2.tgz",
@@ -78,6 +78,7 @@
     "tailwind-merge": "^2.4.0",
     "tailwind-scrollbar": "^3.1.0",
     "tailwindcss-animate": "^1.0.7",
+    "use-long-press": "^3.2.0",
    "vaul": "^0.9.1",
    "vite-plugin-monaco-editor": "^1.1.0",
    "zod": "^3.23.8"
@@ -29,8 +29,11 @@ export function ApiProvider({ children, options }: ApiProviderType) {
         error.response &&
         [401, 302, 307].includes(error.response.status)
       ) {
-        window.location.href =
-          error.response.headers.get("location") ?? "login";
+        // redirect to the login page if not already there
+        const loginPage = error.response.headers.get("location") ?? "login";
+        if (window.location.href !== loginPage) {
+          window.location.href = loginPage;
+        }
       }
     },
     ...options,
@@ -63,7 +63,7 @@ export function UserAuthForm({ className, ...props }: UserAuthFormProps) {
         toast.error("Exceeded rate limit. Try again later.", {
           position: "top-center",
         });
-      } else if (err.response?.status === 400) {
+      } else if (err.response?.status === 401) {
         toast.error("Login failed", {
           position: "top-center",
         });
@@ -1,4 +1,4 @@
-import { useCallback, useMemo } from "react";
+import { useMemo } from "react";
 import { useApiHost } from "@/api";
 import { getIconForLabel } from "@/utils/iconUtil";
 import useSWR from "swr";
@@ -12,10 +12,11 @@ import { capitalizeFirstLetter } from "@/utils/stringUtil";
 import { SearchResult } from "@/types/search";
 import { cn } from "@/lib/utils";
 import { TooltipPortal } from "@radix-ui/react-tooltip";
+import useContextMenu from "@/hooks/use-contextmenu";

 type SearchThumbnailProps = {
   searchResult: SearchResult;
-  onClick: (searchResult: SearchResult) => void;
+  onClick: (searchResult: SearchResult, ctrl: boolean, detail: boolean) => void;
 };

 export default function SearchThumbnail({
@@ -28,9 +29,9 @@ export default function SearchThumbnail({

   // interactions

-  const handleOnClick = useCallback(() => {
-    onClick(searchResult);
-  }, [searchResult, onClick]);
+  useContextMenu(imgRef, () => {
+    onClick(searchResult, true, false);
+  });

   const objectLabel = useMemo(() => {
     if (
@@ -45,7 +46,10 @@ export default function SearchThumbnail({
   }, [config, searchResult]);

   return (
-    <div className="relative size-full cursor-pointer" onClick={handleOnClick}>
+    <div
+      className="relative size-full cursor-pointer"
+      onClick={() => onClick(searchResult, false, true)}
+    >
       <ImageLoadingIndicator
         className="absolute inset-0"
         imgLoaded={imgLoaded}
@@ -79,7 +83,7 @@ export default function SearchThumbnail({
         <div className="mx-3 pb-1 text-sm text-white">
           <Chip
             className={`z-0 flex items-center justify-between gap-1 space-x-1 bg-gray-500 bg-gradient-to-br from-gray-400 to-gray-500 text-xs`}
-            onClick={() => onClick(searchResult)}
+            onClick={() => onClick(searchResult, false, true)}
           >
             {getIconForLabel(objectLabel, "size-3 text-white")}
             {Math.round(
132 web/src/components/filter/SearchActionGroup.tsx Normal file
@@ -0,0 +1,132 @@
import { useCallback, useState } from "react";
import axios from "axios";
import { Button, buttonVariants } from "../ui/button";
import { isDesktop } from "react-device-detect";
import { HiTrash } from "react-icons/hi";
import {
  AlertDialog,
  AlertDialogAction,
  AlertDialogCancel,
  AlertDialogContent,
  AlertDialogDescription,
  AlertDialogFooter,
  AlertDialogHeader,
  AlertDialogTitle,
} from "../ui/alert-dialog";
import useKeyboardListener from "@/hooks/use-keyboard-listener";
import { toast } from "sonner";

type SearchActionGroupProps = {
  selectedObjects: string[];
  setSelectedObjects: (ids: string[]) => void;
  pullLatestData: () => void;
};
export default function SearchActionGroup({
  selectedObjects,
  setSelectedObjects,
  pullLatestData,
}: SearchActionGroupProps) {
  const onClearSelected = useCallback(() => {
    setSelectedObjects([]);
  }, [setSelectedObjects]);

  const onDelete = useCallback(async () => {
    await axios
      .delete(`events/`, {
        data: { event_ids: selectedObjects },
      })
      .then((resp) => {
        if (resp.status == 200) {
          toast.success("Tracked objects deleted successfully.", {
            position: "top-center",
          });
          setSelectedObjects([]);
          pullLatestData();
        }
      })
      .catch(() => {
        toast.error("Failed to delete tracked objects.", {
          position: "top-center",
        });
      });
  }, [selectedObjects, setSelectedObjects, pullLatestData]);

  const [deleteDialogOpen, setDeleteDialogOpen] = useState(false);
  const [bypassDialog, setBypassDialog] = useState(false);

  useKeyboardListener(["Shift"], (_, modifiers) => {
    setBypassDialog(modifiers.shift);
  });

  const handleDelete = useCallback(() => {
    if (bypassDialog) {
      onDelete();
    } else {
      setDeleteDialogOpen(true);
    }
  }, [bypassDialog, onDelete]);

  return (
    <>
      <AlertDialog
        open={deleteDialogOpen}
        onOpenChange={() => setDeleteDialogOpen(!deleteDialogOpen)}
      >
        <AlertDialogContent>
          <AlertDialogHeader>
            <AlertDialogTitle>Confirm Delete</AlertDialogTitle>
          </AlertDialogHeader>
          <AlertDialogDescription>
            Deleting these {selectedObjects.length} tracked objects removes the
            snapshot, any saved embeddings, and any associated object lifecycle
            entries. Recorded footage of these tracked objects in History view
            will <em>NOT</em> be deleted.
            <br />
            <br />
            Are you sure you want to proceed?
            <br />
            <br />
            Hold the <em>Shift</em> key to bypass this dialog in the future.
          </AlertDialogDescription>
          <AlertDialogFooter>
            <AlertDialogCancel>Cancel</AlertDialogCancel>
            <AlertDialogAction
              className={buttonVariants({ variant: "destructive" })}
              onClick={onDelete}
            >
              Delete
            </AlertDialogAction>
          </AlertDialogFooter>
        </AlertDialogContent>
      </AlertDialog>

      <div className="absolute inset-x-2 inset-y-0 flex items-center justify-between gap-2 bg-background py-2 md:left-auto">
        <div className="mx-1 flex items-center justify-center text-sm text-muted-foreground">
          <div className="p-1">{`${selectedObjects.length} selected`}</div>
          <div className="p-1">{"|"}</div>
          <div
            className="cursor-pointer p-2 text-primary hover:rounded-lg hover:bg-secondary"
            onClick={onClearSelected}
          >
            Unselect
          </div>
        </div>
        <div className="flex items-center gap-1 md:gap-2">
          <Button
            className="flex items-center gap-2 p-2"
            aria-label="Delete"
            size="sm"
            onClick={handleDelete}
          >
            <HiTrash className="text-secondary-foreground" />
            {isDesktop && (
              <div className="text-primary">
                {bypassDialog ? "Delete Now" : "Delete"}
              </div>
            )}
          </Button>
        </div>
      </div>
    </>
  );
}
@@ -15,13 +15,15 @@ import {
|
|||||||
SearchFilter,
|
SearchFilter,
|
||||||
SearchFilters,
|
SearchFilters,
|
||||||
SearchSource,
|
SearchSource,
|
||||||
|
SearchSortType,
|
||||||
} from "@/types/search";
|
} from "@/types/search";
|
||||||
import { DateRange } from "react-day-picker";
|
import { DateRange } from "react-day-picker";
|
||||||
import { cn } from "@/lib/utils";
|
import { cn } from "@/lib/utils";
|
||||||
import { MdLabel } from "react-icons/md";
|
import { MdLabel, MdSort } from "react-icons/md";
|
||||||
import PlatformAwareDialog from "../overlay/dialog/PlatformAwareDialog";
|
import PlatformAwareDialog from "../overlay/dialog/PlatformAwareDialog";
|
||||||
import SearchFilterDialog from "../overlay/dialog/SearchFilterDialog";
|
import SearchFilterDialog from "../overlay/dialog/SearchFilterDialog";
|
||||||
import { CalendarRangeFilterButton } from "./CalendarFilterButton";
|
import { CalendarRangeFilterButton } from "./CalendarFilterButton";
|
||||||
|
import { RadioGroup, RadioGroupItem } from "@/components/ui/radio-group";
|
||||||
|
|
||||||
type SearchFilterGroupProps = {
|
type SearchFilterGroupProps = {
|
||||||
className: string;
|
className: string;
|
||||||
@@ -107,6 +109,25 @@ export default function SearchFilterGroup({
[config, allLabels, allZones],
);

+const availableSortTypes = useMemo(() => {
+const sortTypes = ["date_asc", "date_desc"];
+if (filter?.min_score || filter?.max_score) {
+sortTypes.push("score_desc", "score_asc");
+}
+if (filter?.event_id || filter?.query) {
+sortTypes.push("relevance");
+}
+return sortTypes as SearchSortType[];
+}, [filter]);
+
+const defaultSortType = useMemo<SearchSortType>(() => {
+if (filter?.query || filter?.event_id) {
+return "relevance";
+} else {
+return "date_desc";
+}
+}, [filter]);
+
const groups = useMemo(() => {
if (!config) {
return [];
@@ -179,6 +200,16 @@ export default function SearchFilterGroup({
filterValues={filterValues}
onUpdateFilter={onUpdateFilter}
/>
+{filters.includes("sort") && Object.keys(filter ?? {}).length > 0 && (
+<SortTypeButton
+availableSortTypes={availableSortTypes ?? []}
+defaultSortType={defaultSortType}
+selectedSortType={filter?.sort}
+updateSortType={(newSort) => {
+onUpdateFilter({ ...filter, sort: newSort });
+}}
+/>
+)}
</div>
);
}
@@ -362,3 +393,176 @@ export function GeneralFilterContent({
</>
);
}
+
+type SortTypeButtonProps = {
+availableSortTypes: SearchSortType[];
+defaultSortType: SearchSortType;
+selectedSortType: SearchSortType | undefined;
+updateSortType: (sortType: SearchSortType | undefined) => void;
+};
+function SortTypeButton({
+availableSortTypes,
+defaultSortType,
+selectedSortType,
+updateSortType,
+}: SortTypeButtonProps) {
+const [open, setOpen] = useState(false);
+const [currentSortType, setCurrentSortType] = useState<
+SearchSortType | undefined
+>(selectedSortType as SearchSortType);
+
+// ui
+
+useEffect(() => {
+setCurrentSortType(selectedSortType);
+// only refresh when state changes
+// eslint-disable-next-line react-hooks/exhaustive-deps
+}, [selectedSortType]);
+
+const trigger = (
+<Button
+size="sm"
+variant={
+selectedSortType != defaultSortType && selectedSortType != undefined
+? "select"
+: "default"
+}
+className="flex items-center gap-2 capitalize"
+aria-label="Labels"
+>
+<MdSort
+className={`${selectedSortType != defaultSortType && selectedSortType != undefined ? "text-selected-foreground" : "text-secondary-foreground"}`}
+/>
+<div
+className={`${selectedSortType != defaultSortType && selectedSortType != undefined ? "text-selected-foreground" : "text-primary"}`}
+>
+Sort
+</div>
+</Button>
+);
+const content = (
+<SortTypeContent
+availableSortTypes={availableSortTypes ?? []}
+defaultSortType={defaultSortType}
+selectedSortType={selectedSortType}
+currentSortType={currentSortType}
+setCurrentSortType={setCurrentSortType}
+updateSortType={updateSortType}
+onClose={() => setOpen(false)}
+/>
+);
+
+return (
+<PlatformAwareDialog
+trigger={trigger}
+content={content}
+contentClassName={
+isDesktop
+? "scrollbar-container h-auto max-h-[80dvh] overflow-y-auto"
+: "max-h-[75dvh] overflow-hidden p-4"
+}
+open={open}
+onOpenChange={(open) => {
+if (!open) {
+setCurrentSortType(selectedSortType);
+}
+
+setOpen(open);
+}}
+/>
+);
+}
+
+type SortTypeContentProps = {
+availableSortTypes: SearchSortType[];
+defaultSortType: SearchSortType;
+selectedSortType: SearchSortType | undefined;
+currentSortType: SearchSortType | undefined;
+updateSortType: (sort_type: SearchSortType | undefined) => void;
+setCurrentSortType: (sort_type: SearchSortType | undefined) => void;
+onClose: () => void;
+};
+export function SortTypeContent({
+availableSortTypes,
+defaultSortType,
+selectedSortType,
+currentSortType,
+updateSortType,
+setCurrentSortType,
+onClose,
+}: SortTypeContentProps) {
+const sortLabels = {
+date_asc: "Date (Ascending)",
+date_desc: "Date (Descending)",
+score_asc: "Object Score (Ascending)",
+score_desc: "Object Score (Descending)",
+relevance: "Relevance",
+};
+
+return (
+<>
+<div className="overflow-x-hidden">
+<div className="my-2.5 flex flex-col gap-2.5">
+<RadioGroup
+value={
+Array.isArray(currentSortType)
+? currentSortType?.[0]
+: (currentSortType ?? defaultSortType)
+}
+defaultValue={defaultSortType}
+onValueChange={(value) =>
+setCurrentSortType(value as SearchSortType)
+}
+className="w-full space-y-1"
+>
+{availableSortTypes.map((value) => (
+<div className="flex flex-row gap-2">
+<RadioGroupItem
+key={value}
+value={value}
+id={`sort-${value}`}
+className={
+value == (currentSortType ?? defaultSortType)
+? "bg-selected from-selected/50 to-selected/90 text-selected"
+: "bg-secondary from-secondary/50 to-secondary/90 text-secondary"
+}
+/>
+<Label
+htmlFor={`sort-${value}`}
+className="flex cursor-pointer items-center space-x-2"
+>
+<span>{sortLabels[value]}</span>
+</Label>
+</div>
+))}
+</RadioGroup>
+</div>
+</div>
+<DropdownMenuSeparator />
+<div className="flex items-center justify-evenly p-2">
+<Button
+aria-label="Apply"
+variant="select"
+onClick={() => {
+if (selectedSortType != currentSortType) {
+updateSortType(currentSortType);
+}
+
+onClose();
+}}
+>
+Apply
+</Button>
+<Button
+aria-label="Reset"
+onClick={() => {
+setCurrentSortType(undefined);
+updateSortType(undefined);
+}}
+>
+Reset
+</Button>
+</div>
+</>
+);
+}
@@ -18,6 +18,7 @@ import {
FilterType,
SavedSearchQuery,
SearchFilter,
+SearchSortType,
SearchSource,
} from "@/types/search";
import useSuggestions from "@/hooks/use-suggestions";
@@ -323,6 +324,9 @@ export default function InputWithTags({
case "event_id":
newFilters.event_id = value;
break;
+case "sort":
+newFilters.sort = value as SearchSortType;
+break;
default:
// Handle array types (cameras, labels, subLabels, zones)
if (!newFilters[type]) newFilters[type] = [];
@@ -108,13 +108,15 @@ export default function SearchResultActions({
</a>
</MenuItem>
)}
-<MenuItem
-aria-label="Show the object lifecycle"
-onClick={showObjectLifecycle}
->
-<FaArrowsRotate className="mr-2 size-4" />
-<span>View object lifecycle</span>
-</MenuItem>
+{searchResult.data.type == "object" && (
+<MenuItem
+aria-label="Show the object lifecycle"
+onClick={showObjectLifecycle}
+>
+<FaArrowsRotate className="mr-2 size-4" />
+<span>View object lifecycle</span>
+</MenuItem>
+)}
{config?.semantic_search?.enabled && isContextMenu && (
<MenuItem
aria-label="Find similar tracked objects"
@@ -128,6 +130,7 @@ export default function SearchResultActions({
config?.plus?.enabled &&
searchResult.has_snapshot &&
searchResult.end_time &&
+searchResult.data.type == "object" &&
!searchResult.plus_id && (
<MenuItem aria-label="Submit to Frigate Plus" onClick={showSnapshot}>
<FrigatePlusIcon className="mr-2 size-4 cursor-pointer text-primary" />
@@ -181,22 +184,24 @@ export default function SearchResultActions({
</ContextMenu>
) : (
<>
-{config?.semantic_search?.enabled && (
-<Tooltip>
-<TooltipTrigger>
-<MdImageSearch
-className="size-5 cursor-pointer text-primary-variant hover:text-primary"
-onClick={findSimilar}
-/>
-</TooltipTrigger>
-<TooltipContent>Find similar</TooltipContent>
-</Tooltip>
-)}
+{config?.semantic_search?.enabled &&
+searchResult.data.type == "object" && (
+<Tooltip>
+<TooltipTrigger>
+<MdImageSearch
+className="size-5 cursor-pointer text-primary-variant hover:text-primary"
+onClick={findSimilar}
+/>
+</TooltipTrigger>
+<TooltipContent>Find similar</TooltipContent>
+</Tooltip>
+)}

{!isMobileOnly &&
config?.plus?.enabled &&
searchResult.has_snapshot &&
searchResult.end_time &&
+searchResult.data.type == "object" &&
!searchResult.plus_id && (
<Tooltip>
<TooltipTrigger>
@@ -379,6 +379,7 @@ function EventItem({

{event.has_snapshot &&
event.plus_id == undefined &&
+event.data.type == "object" &&
config?.plus.enabled && (
<Tooltip>
<TooltipTrigger>
@@ -452,7 +452,7 @@ function ObjectDetailsTab({
draggable={false}
src={`${apiHost}api/events/${search.id}/thumbnail.jpg`}
/>
-{config?.semantic_search.enabled && (
+{config?.semantic_search.enabled && search.data.type == "object" && (
<Button
aria-label="Find similar tracked objects"
onClick={() => {
@@ -626,65 +626,67 @@ export function ObjectSnapshotTab({
</div>
)}
</TransformComponent>
-{search.plus_id !== "not_enabled" && search.end_time && (
+{search.data.type == "object" &&
+search.plus_id !== "not_enabled" &&
+search.end_time && (
<Card className="p-1 text-sm md:p-2">
<CardContent className="flex flex-col items-center justify-between gap-3 p-2 md:flex-row">
<div className={cn("flex flex-col space-y-3")}>
<div
className={
"text-lg font-semibold leading-none tracking-tight"
}
>
Submit To Frigate+
</div>
<div className="text-sm text-muted-foreground">
-Objects in locations you want to avoid are not false
-positives. Submitting them as false positives will confuse
-the model.
+Objects in locations you want to avoid are not false
+positives. Submitting them as false positives will
+confuse the model.
</div>
</div>

<div className="flex flex-row justify-center gap-2 md:justify-end">
{state == "reviewing" && (
<>
<Button
className="bg-success"
aria-label="Confirm this label for Frigate Plus"
onClick={() => {
setState("uploading");
onSubmitToPlus(false);
}}
>
This is{" "}
{/^[aeiou]/i.test(search?.label || "") ? "an" : "a"}{" "}
{search?.label}
</Button>
<Button
className="text-white"
aria-label="Do not confirm this label for Frigate Plus"
variant="destructive"
onClick={() => {
setState("uploading");
onSubmitToPlus(true);
}}
>
This is not{" "}
{/^[aeiou]/i.test(search?.label || "") ? "an" : "a"}{" "}
{search?.label}
</Button>
</>
)}
{state == "uploading" && <ActivityIndicator />}
{state == "submitted" && (
<div className="flex flex-row items-center justify-center gap-2">
<FaCheckCircle className="text-success" />
Submitted
</div>
)}
</div>
</CardContent>
</Card>
)}
</div>
</TransformWrapper>
</div>
@@ -175,7 +175,7 @@ export default function SearchFilterDialog({
time_range: undefined,
zones: undefined,
sub_labels: undefined,
-search_type: ["thumbnail", "description"],
+search_type: undefined,
min_score: undefined,
max_score: undefined,
has_snapshot: undefined,
@@ -15,7 +15,10 @@ export function useOverlayState<S>(
(value: S, replace: boolean = false) => {
const newLocationState = { ...currentLocationState };
newLocationState[key] = value;
-navigate(location.pathname, { state: newLocationState, replace });
+navigate(location.pathname + location.search, {
+state: newLocationState,
+replace,
+});
},
// we know that these deps are correct
// eslint-disable-next-line react-hooks/exhaustive-deps
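Annotation (not part of the diff): the `useOverlayState` hunk above switches the navigation target from `location.pathname` to `location.pathname + location.search`, so query parameters already encoded in the URL survive when overlay state is pushed onto the history stack. A minimal sketch of the same pattern, assuming react-router-dom v6; the hook name, the `showDetail` key, and the `?query=person` example are hypothetical:

```tsx
import { useLocation, useNavigate } from "react-router-dom";

// Illustrative only; not Frigate code.
function useShowDetailOverlay() {
  const location = useLocation();
  const navigate = useNavigate();

  return (value: boolean, replace = false) => {
    const state = { ...(location.state ?? {}), showDetail: value };
    // Appending location.search keeps URL filters such as "?query=person"
    // intact when the new history entry is created.
    navigate(location.pathname + location.search, { state, replace });
  };
}
```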
54  web/src/hooks/use-press.ts  Normal file
@@ -0,0 +1,54 @@
+// https://gist.github.com/cpojer/641bf305e6185006ea453e7631b80f95
+
+import { useCallback, useState } from "react";
+import {
+LongPressCallbackMeta,
+LongPressReactEvents,
+useLongPress,
+} from "use-long-press";
+
+export default function usePress(
+options: Omit<Parameters<typeof useLongPress>[1], "onCancel" | "onStart"> & {
+onLongPress: NonNullable<Parameters<typeof useLongPress>[0]>;
+onPress: (event: LongPressReactEvents<Element>) => void;
+},
+) {
+const { onLongPress, onPress, ...actualOptions } = options;
+const [hasLongPress, setHasLongPress] = useState(false);
+
+const onCancel = useCallback(() => {
+if (hasLongPress) {
+setHasLongPress(false);
+}
+}, [hasLongPress]);
+
+const bind = useLongPress(
+useCallback(
+(
+event: LongPressReactEvents<Element>,
+meta: LongPressCallbackMeta<unknown>,
+) => {
+setHasLongPress(true);
+onLongPress(event, meta);
+},
+[onLongPress],
+),
+{
+...actualOptions,
+onCancel,
+onStart: onCancel,
+},
+);
+
+return useCallback(
+() => ({
+...bind(),
+onClick: (event: LongPressReactEvents<HTMLDivElement>) => {
+if (!hasLongPress) {
+onPress(event);
+}
+},
+}),
+[bind, hasLongPress, onPress],
+);
+}
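Annotation (not part of the new file): `usePress` wraps `use-long-press` so a long press and a plain click can coexist on the same element, with the click that trails a long press suppressed. A hedged usage sketch; the component, its props, and the 400 ms threshold are assumptions for illustration:

```tsx
import usePress from "@/hooks/use-press";

// Illustrative only: long press toggles selection, a short press opens details.
function PressableThumbnail(props: { onOpen: () => void; onSelect: () => void }) {
  const bindPress = usePress({
    onLongPress: () => props.onSelect(),
    onPress: () => props.onOpen(),
    threshold: 400, // use-long-press option (milliseconds), assumed here
  });

  // usePress returns a function that produces the props (including onClick)
  // to spread onto the pressable element.
  return <div {...bindPress()}>thumbnail</div>;
}
```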
@@ -116,6 +116,7 @@ export default function Explore() {
is_submitted: searchSearchParams["is_submitted"],
has_clip: searchSearchParams["has_clip"],
event_id: searchSearchParams["event_id"],
+sort: searchSearchParams["sort"],
limit:
Object.keys(searchSearchParams).length == 0 ? API_LIMIT : undefined,
timezone,
@@ -148,6 +149,7 @@ export default function Explore() {
is_submitted: searchSearchParams["is_submitted"],
has_clip: searchSearchParams["has_clip"],
event_id: searchSearchParams["event_id"],
+sort: searchSearchParams["sort"],
timezone,
include_thumbnails: 0,
},
@@ -165,12 +167,17 @@ export default function Explore() {

const [url, params] = searchQuery;

-// If it's not the first page, use the last item's start_time as the 'before' parameter
+const isAscending = params.sort?.includes("date_asc");
+
if (pageIndex > 0 && previousPageData) {
const lastDate = previousPageData[previousPageData.length - 1].start_time;
return [
url,
-{ ...params, before: lastDate.toString(), limit: API_LIMIT },
+{
+...params,
+[isAscending ? "after" : "before"]: lastDate.toString(),
+limit: API_LIMIT,
+},
];
}

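Annotation (not part of the diff): with `sort` now carried in the search query params, the Explore infinite-scroll key builder above flips its pagination cursor, paging forward with `after` for ascending date order and keeping the original backward `before` cursor otherwise. A small sketch of that decision with hypothetical names:

```typescript
// Illustrative only; mirrors the cursor choice made in the hunk above.
type PageParams = {
  sort?: string;
  before?: string;
  after?: string;
  limit?: number;
};

function nextPageParams(
  params: PageParams,
  lastStartTime: number,
  apiLimit: number,
): PageParams {
  const isAscending = params.sort?.includes("date_asc");
  return {
    ...params,
    // Ascending results grow forward in time, so request items *after* the
    // last one shown; other sorts keep the original *before* cursor.
    [isAscending ? "after" : "before"]: lastStartTime.toString(),
    limit: apiLimit,
  };
}

// nextPageParams({ sort: "date_asc" }, 1714000000, 25)
// -> { sort: "date_asc", after: "1714000000", limit: 25 }
```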
@@ -62,6 +62,7 @@ function Live() {
if (selectedCameraName) {
const capitalized = selectedCameraName
.split("_")
+.filter((text) => text)
.map((text) => text[0].toUpperCase() + text.substring(1));
document.title = `${capitalized.join(" ")} - Live - Frigate`;
} else if (cameraGroup && cameraGroup != "default") {
@@ -6,6 +6,7 @@ const SEARCH_FILTERS = [
"zone",
"sub",
"source",
+"sort",
] as const;
export type SearchFilters = (typeof SEARCH_FILTERS)[number];
export const DEFAULT_SEARCH_FILTERS: SearchFilters[] = [
@@ -16,10 +17,18 @@ export const DEFAULT_SEARCH_FILTERS: SearchFilters[] = [
"zone",
"sub",
"source",
+"sort",
];

export type SearchSource = "similarity" | "thumbnail" | "description";

+export type SearchSortType =
+| "date_asc"
+| "date_desc"
+| "score_asc"
+| "score_desc"
+| "relevance";
+
export type SearchResult = {
id: string;
camera: string;
@@ -65,6 +74,7 @@ export type SearchFilter = {
time_range?: string;
search_type?: SearchSource[];
event_id?: string;
+sort?: SearchSortType;
};

export const DEFAULT_TIME_RANGE_AFTER = "00:00";
@@ -86,6 +96,7 @@ export type SearchQueryParams = {
query?: string;
page?: number;
time_range?: string;
+sort?: SearchSortType;
};

export type SearchQuery = [string, SearchQueryParams] | null;
@@ -26,7 +26,7 @@ type ExploreViewProps = {
searchDetail: SearchResult | undefined;
setSearchDetail: (search: SearchResult | undefined) => void;
setSimilaritySearch: (search: SearchResult) => void;
-onSelectSearch: (item: SearchResult, index: number, page?: SearchTab) => void;
+onSelectSearch: (item: SearchResult, ctrl: boolean, page?: SearchTab) => void;
};

export default function ExploreView({
@@ -125,7 +125,7 @@ type ThumbnailRowType = {
setSearchDetail: (search: SearchResult | undefined) => void;
mutate: () => void;
setSimilaritySearch: (search: SearchResult) => void;
-onSelectSearch: (item: SearchResult, index: number, page?: SearchTab) => void;
+onSelectSearch: (item: SearchResult, ctrl: boolean, page?: SearchTab) => void;
};

function ThumbnailRow({
@@ -205,7 +205,7 @@ type ExploreThumbnailImageProps = {
setSearchDetail: (search: SearchResult | undefined) => void;
mutate: () => void;
setSimilaritySearch: (search: SearchResult) => void;
-onSelectSearch: (item: SearchResult, index: number, page?: SearchTab) => void;
+onSelectSearch: (item: SearchResult, ctrl: boolean, page?: SearchTab) => void;
};
function ExploreThumbnailImage({
event,
@@ -225,11 +225,11 @@ function ExploreThumbnailImage({
};

const handleShowObjectLifecycle = () => {
-onSelectSearch(event, 0, "object lifecycle");
+onSelectSearch(event, false, "object lifecycle");
};

const handleShowSnapshot = () => {
-onSelectSearch(event, 0, "snapshot");
+onSelectSearch(event, false, "snapshot");
};

return (
@@ -30,6 +30,7 @@ import {
} from "@/components/ui/tooltip";
import Chip from "@/components/indicators/Chip";
import { TooltipPortal } from "@radix-ui/react-tooltip";
+import SearchActionGroup from "@/components/filter/SearchActionGroup";

type SearchViewProps = {
search: string;
@@ -181,20 +182,53 @@ export default function SearchView({

// search interaction

-const [selectedIndex, setSelectedIndex] = useState<number | null>(null);
+const [selectedObjects, setSelectedObjects] = useState<string[]>([]);
const itemRefs = useRef<(HTMLDivElement | null)[]>([]);

const onSelectSearch = useCallback(
-(item: SearchResult, index: number, page: SearchTab = "details") => {
-setPage(page);
-setSearchDetail(item);
-setSelectedIndex(index);
+(item: SearchResult, ctrl: boolean, page: SearchTab = "details") => {
+if (selectedObjects.length > 1 || ctrl) {
+const index = selectedObjects.indexOf(item.id);
+
+if (index != -1) {
+if (selectedObjects.length == 1) {
+setSelectedObjects([]);
+} else {
+const copy = [
+...selectedObjects.slice(0, index),
+...selectedObjects.slice(index + 1),
+];
+setSelectedObjects(copy);
+}
+} else {
+const copy = [...selectedObjects];
+copy.push(item.id);
+setSelectedObjects(copy);
+}
+} else {
+setPage(page);
+setSearchDetail(item);
+}
},
-[],
+[selectedObjects],
);

+const onSelectAllObjects = useCallback(() => {
+if (!uniqueResults || uniqueResults.length == 0) {
+return;
+}
+
+if (selectedObjects.length < uniqueResults.length) {
+setSelectedObjects(uniqueResults.map((value) => value.id));
+} else {
+setSelectedObjects([]);
+}
+}, [uniqueResults, selectedObjects]);
+
useEffect(() => {
-setSelectedIndex(0);
+setSelectedObjects([]);
+// unselect items when search term or filter changes
+// eslint-disable-next-line react-hooks/exhaustive-deps
}, [searchTerm, searchFilter]);

// confidence score
@@ -243,23 +277,44 @@ export default function SearchView({
}

switch (key) {
-case "ArrowLeft":
-setSelectedIndex((prevIndex) => {
-const newIndex =
-prevIndex === null
-? uniqueResults.length - 1
-: (prevIndex - 1 + uniqueResults.length) % uniqueResults.length;
-setSearchDetail(uniqueResults[newIndex]);
-return newIndex;
-});
-break;
-case "ArrowRight":
-setSelectedIndex((prevIndex) => {
-const newIndex =
-prevIndex === null ? 0 : (prevIndex + 1) % uniqueResults.length;
-setSearchDetail(uniqueResults[newIndex]);
-return newIndex;
-});
-break;
+case "a":
+if (modifiers.ctrl) {
+onSelectAllObjects();
+}
+break;
+case "ArrowLeft":
+if (uniqueResults.length > 0) {
+const currentIndex = searchDetail
+? uniqueResults.findIndex(
+(result) => result.id === searchDetail.id,
+)
+: -1;
+
+const newIndex =
+currentIndex === -1
+? uniqueResults.length - 1
+: (currentIndex - 1 + uniqueResults.length) %
+uniqueResults.length;
+
+setSearchDetail(uniqueResults[newIndex]);
+}
+break;
+
+case "ArrowRight":
+if (uniqueResults.length > 0) {
+const currentIndex = searchDetail
+? uniqueResults.findIndex(
+(result) => result.id === searchDetail.id,
+)
+: -1;
+
+const newIndex =
+currentIndex === -1
+? 0
+: (currentIndex + 1) % uniqueResults.length;
+
+setSearchDetail(uniqueResults[newIndex]);
+}
+break;
case "PageDown":
contentRef.current?.scrollBy({
@@ -275,32 +330,80 @@ export default function SearchView({
break;
}
},
-[uniqueResults, inputFocused],
+[uniqueResults, inputFocused, onSelectAllObjects, searchDetail],
);

useKeyboardListener(
-["ArrowLeft", "ArrowRight", "PageDown", "PageUp"],
+["a", "ArrowLeft", "ArrowRight", "PageDown", "PageUp"],
onKeyboardShortcut,
!inputFocused,
);

// scroll into view

+const [prevSearchDetail, setPrevSearchDetail] = useState<
+SearchResult | undefined
+>();
+
+// keep track of previous ref to outline thumbnail when dialog closes
+const prevSearchDetailRef = useRef<SearchResult | undefined>();
+
useEffect(() => {
-if (
-selectedIndex !== null &&
-uniqueResults &&
-itemRefs.current?.[selectedIndex]
-) {
-scrollIntoView(itemRefs.current[selectedIndex], {
-block: "center",
-behavior: "smooth",
-scrollMode: "if-needed",
-});
+if (searchDetail === undefined && prevSearchDetailRef.current) {
+setPrevSearchDetail(prevSearchDetailRef.current);
}
-// we only want to scroll when the index changes
+prevSearchDetailRef.current = searchDetail;
+}, [searchDetail]);
+
+useEffect(() => {
+if (uniqueResults && itemRefs.current && prevSearchDetail) {
+const selectedIndex = uniqueResults.findIndex(
+(result) => result.id === prevSearchDetail.id,
+);
+
+const parent = itemRefs.current[selectedIndex];
+
+if (selectedIndex !== -1 && parent) {
+const target = parent.querySelector(".review-item-ring");
+if (target) {
+scrollIntoView(target, {
+block: "center",
+behavior: "smooth",
+scrollMode: "if-needed",
+});
+target.classList.add(`outline-selected`);
+target.classList.remove("outline-transparent");
+
+setTimeout(() => {
+target.classList.remove(`outline-selected`);
+target.classList.add("outline-transparent");
+}, 3000);
+}
+}
+}
+// we only want to scroll when the dialog closes
// eslint-disable-next-line react-hooks/exhaustive-deps
-}, [selectedIndex]);
+}, [prevSearchDetail]);
+
+useEffect(() => {
+if (uniqueResults && itemRefs.current && searchDetail) {
+const selectedIndex = uniqueResults.findIndex(
+(result) => result.id === searchDetail.id,
+);
+
+const parent = itemRefs.current[selectedIndex];
+
+if (selectedIndex !== -1 && parent) {
+scrollIntoView(parent, {
+block: "center",
+behavior: "smooth",
+scrollMode: "if-needed",
+});
+}
+}
+// we only want to scroll when changing the detail pane
+// eslint-disable-next-line react-hooks/exhaustive-deps
+}, [searchDetail]);

// observer for loading more

@@ -369,22 +472,39 @@ export default function SearchView({
{hasExistingSearch && (
<ScrollArea className="w-full whitespace-nowrap lg:ml-[35%]">
<div className="flex flex-row gap-2">
-<SearchFilterGroup
-className={cn(
-"w-full justify-between md:justify-start lg:justify-end",
-)}
-filter={searchFilter}
-onUpdateFilter={onUpdateFilter}
-/>
-<SearchSettings
-columns={columns}
-setColumns={setColumns}
-defaultView={defaultView}
-setDefaultView={setDefaultView}
-filter={searchFilter}
-onUpdateFilter={onUpdateFilter}
-/>
-<ScrollBar orientation="horizontal" className="h-0" />
+{selectedObjects.length == 0 ? (
+<>
+<SearchFilterGroup
+className={cn(
+"w-full justify-between md:justify-start lg:justify-end",
+)}
+filter={searchFilter}
+onUpdateFilter={onUpdateFilter}
+/>
+<SearchSettings
+columns={columns}
+setColumns={setColumns}
+defaultView={defaultView}
+setDefaultView={setDefaultView}
+filter={searchFilter}
+onUpdateFilter={onUpdateFilter}
+/>
+<ScrollBar orientation="horizontal" className="h-0" />
+</>
+) : (
+<div
+className={cn(
+"scrollbar-container flex justify-center gap-2 overflow-x-auto",
+"h-10 w-full justify-between md:justify-start lg:justify-end",
+)}
+>
+<SearchActionGroup
+selectedObjects={selectedObjects}
+setSelectedObjects={setSelectedObjects}
+pullLatestData={refresh}
+/>
+</div>
+)}
</div>
</ScrollArea>
)}
@@ -412,14 +532,14 @@ export default function SearchView({
<div className={gridClassName}>
{uniqueResults &&
uniqueResults.map((value, index) => {
-const selected = selectedIndex === index;
+const selected = selectedObjects.includes(value.id);

return (
<div
key={value.id}
ref={(item) => (itemRefs.current[index] = item)}
data-start={value.start_time}
-className="review-item relative flex flex-col rounded-lg"
+className="relative flex flex-col rounded-lg"
>
<div
className={cn(
@@ -428,7 +548,20 @@ export default function SearchView({
>
<SearchThumbnail
searchResult={value}
-onClick={() => onSelectSearch(value, index)}
+onClick={(
+value: SearchResult,
+ctrl: boolean,
+detail: boolean,
+) => {
+if (detail && selectedObjects.length == 0) {
+setSearchDetail(value);
+} else {
+onSelectSearch(
+value,
+ctrl || selectedObjects.length > 0,
+);
+}
+}}
/>
{(searchTerm ||
searchFilter?.search_type?.includes("similarity")) && (
@@ -469,10 +602,10 @@ export default function SearchView({
}}
refreshResults={refresh}
showObjectLifecycle={() =>
-onSelectSearch(value, index, "object lifecycle")
+onSelectSearch(value, false, "object lifecycle")
}
showSnapshot={() =>
-onSelectSearch(value, index, "snapshot")
+onSelectSearch(value, false, "snapshot")
}
/>
</div>