Mirror of https://github.com/ruvnet/RuView.git, synced 2026-04-26 13:10:40 +00:00
feat: Real-time dense point cloud from camera + WiFi CSI (#405)
* Add wifi-densepose-pointcloud: real-time dense point cloud from camera + WiFi CSI
New crate with 5 modules:
- depth: monocular depth estimation + 3D backprojection (ONNX-ready, synthetic fallback)
- pointcloud: Point3D/ColorPoint types, PLY export, Gaussian splat conversion
- fusion: WiFi occupancy volume → point cloud + multi-modal voxel fusion
- stream: HTTP + Three.js viewer server (Axum, port 9880)
- main: CLI with serve/capture/demo subcommands
Demo output: 271 WiFi points + 19,200 depth points → 4,886 fused → 1,718 Gaussian splats.
Serves interactive 3D viewer at http://localhost:9880 with Three.js orbit controls.
ADR-SYS-0021 documents the architecture for camera + WiFi CSI dense point cloud pipeline.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Optimize pointcloud: larger splat voxels, smaller responses, faster fusion
- Gaussian splat voxel size: 0.10 → 0.15 (42% fewer splats: 1718 → 994)
- Splat response: 399 KB → 225 KB (44% smaller)
- Pipeline: 22.2ms mean (100 runs, σ=0.3ms)
- Cloud API: 1.11ms avg, 905 req/s
- Splats API: 1.39ms avg, 719 req/s
- Binary: 1.0 MB arm64 (Mac Mini), tested
Co-Authored-By: claude-flow <ruv@ruv.net>
* Complete implementation: camera capture, WiFi CSI receiver, training pipeline
Three new modules added to wifi-densepose-pointcloud:
1. camera.rs — Cross-platform camera capture
- macOS: AVFoundation via Swift, ffmpeg avfoundation
- Linux: V4L2, ffmpeg v4l2
- Camera detection, listing, frame capture to RGB
- Graceful fallback to synthetic data when no camera
2. csi.rs — WiFi CSI receiver for ESP32 nodes
- UDP listener for CSI JSON frames from ESP32
- Per-link attenuation tracking with EMA smoothing
- Simplified RF tomography (backprojection to occupancy grid)
- Test frame sender for development without hardware
- Ready for real ESP32 CSI data from ruvzen
3. training.rs — Calibration and training pipeline
- Depth calibration: grid search over scale/offset/gamma
- Occupancy training: threshold optimization for presence detection
- Ground truth reference points for depth RMSE measurement
- Preference pair export (JSONL) for DPO training on ruOS brain
- Brain integration: submit observations as memories
- Persistent calibration files (JSON)
New CLI commands:
ruview-pointcloud cameras # list available cameras
ruview-pointcloud train # run calibration + training
ruview-pointcloud csi-test # send test CSI frames
ruview-pointcloud serve --csi # serve with live CSI input
All tested: demo, training (10 samples, 4 reference points, 3 pairs),
CSI receiver (50 test frames), server API.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Fix viewer: replace WebSocket with fetch polling
Co-Authored-By: claude-flow <ruv@ruv.net>
* Wire live camera into server — real-time updating point cloud
- Server captures from /dev/video0 at 2fps via ffmpeg
- Background tokio task refreshes cloud + splats every 500ms
- Viewer polls /api/splats every 500ms, only updates on new frame
- Shows 🟢 LIVE / 🔴 DEMO indicator
- Camera position set for first-person view (looking forward into scene)
- Downsample 4x for performance (19,200 points per frame)
- Graceful fallback to demo data if camera capture fails
Co-Authored-By: claude-flow <ruv@ruv.net>
* Add MiDaS GPU depth, serial CSI reader, full sensor fusion
- MiDaS depth server: PyTorch on CUDA, real monocular depth estimation
- Rust server calls MiDaS via HTTP for neural depth (falls back to luminance)
- Serial CSI reader for ESP32 with motion detection + presence estimation
- CSI disabled by default (RUVIEW_CSI=1 to enable) — serial reader needs baud config
- Edge-enhanced depth for better object boundaries
- All sensors wired: camera, ESP32 CSI, mmWave (CSI gated until serial fixed)
Co-Authored-By: claude-flow <ruv@ruv.net>
* Complete 7-component sensor fusion pipeline (all working)
1. ADR-018 binary parser — decodes ESP32 CSI UDP frames, extracts I/Q subcarriers
2. WiFlow pose — 17 COCO keypoints from CSI (186K param model loaded)
3. Camera depth — MiDaS on CUDA + luminance fallback
4. Sensor fusion — camera depth + CSI occupancy grid + skeleton overlay
5. RF tomography — ISTA-inspired backprojection from per-node RSSI
6. Vital signs — breathing rate from CSI phase analysis
7. Motion-adaptive — skip expensive depth when CSI shows no motion
Live results: 510 CSI frames/session, 17 keypoints, 26% motion, 40 BPM breathing.
Both ESP32 nodes provisioned to send CSI to 192.168.1.123:3333.
Magic number fix: supports both 0xC5110001 (v1) and 0xC5110006 (v6) frames.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Add brain bridge — sparse spatial observation sync every 60s
Stores room scan summaries, motion events, and vital signs
in the ruOS brain as memories. Only syncs every 120 frames
(~60 seconds) to keep the brain sparse and optimized.
Categories: spatial-observation, spatial-motion, spatial-vitals.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Update README + user guide with dense point cloud features
Added pointcloud section to README (quick start, CLI, performance).
Added comprehensive user guide section: setup, sensors, commands,
pipeline components, API endpoints, training, output formats,
deep room scan, ESP32 provisioning.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Add ruview-geo: geospatial satellite integration (11 modules, 8/8 tests)
New crate with free satellite imagery, terrain, OSM, weather, and brain integration.
Modules: types, coord, locate, cache, tiles, terrain, osm, register, fuse, brain, temporal
Tests: 8 passed (haversine, ENU roundtrip, tiles, HGT parse, registration)
Validation against real data: 43.49°N, 79.71°W; 4 Sentinel-2 tiles fetched; 2°C current weather; context stored in the brain
Data sources (all free, no API keys):
- EOX Sentinel-2 cloudless (10m satellite tiles)
- SRTM GL1 (30m elevation)
- Overpass API (OSM buildings/roads)
- ip-api.com (geolocation)
- Open Meteo (weather)
ADR-044 documents architecture decisions.
README.md in crate subdirectory.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Update ADR-044: add Common Crawl WET, NASA FIRMS, OpenAQ, Overture Maps sources
Extended geospatial data sources leveraging ruvector's existing web_ingest
and Common Crawl support for hyperlocal context.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Fix OSM/SRTM queries, add change detection + night mode
- OSM: use inclusive building filter with relation query and 25s timeout
- SRTM: switch to NASA public mirror with viewfinderpanoramas fallback
- Add detect_tile_changes() for pixel-diff satellite change detection
- Add is_night() solar-declination model for CSI-only night mode
- 6 new unit tests (night mode + tile change detection)
Co-Authored-By: claude-flow <ruv@ruv.net>
* Enhance viewer: skeleton overlay, weather, buildings, better camera
Add COCO skeleton rendering with yellow keypoint spheres and white bone
lines, info panel sections for weather/buildings/CSI rate/confidence,
overhead camera at (0,2,-4), and denser point size with sizeAttenuation.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Add CSI fingerprint DB + night mode detection
Co-Authored-By: claude-flow <ruv@ruv.net>
* Fix ADR-044 numbering conflict, update geo README
Renumbered provisioning tool ADR from 044 to 050 to avoid conflict
with geospatial satellite integration ADR-044.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Clean up warnings: suppress dead_code for conditional pipeline modules
Removes unused imports/variables via cargo fix and adds #[allow(dead_code)]
for modules used conditionally at runtime (CSI, depth, fusion, serial).
Pointcloud: 28 → 0 warnings. Geo: 2 → 0 warnings. 8/8 tests pass.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Fix PR #405 blockers: async runtime panic, crate rename, path traversal, brain URL config
- brain_bridge.rs: replace `Handle::current().block_on(...)` inside async fn
with `.await` (was a guaranteed "runtime within runtime" panic). Brain URL
now read from RUVIEW_BRAIN_URL env var (default http://127.0.0.1:9876),
logged once via OnceLock.
- wifi-densepose-geo: rename Cargo package from `ruview-geo` to
`wifi-densepose-geo` to match directory and workspace conventions. Update
all use sites (tests/examples/README). Same env-var pattern for brain URL
in brain.rs + temporal.rs.
- training.rs: add sanitize_data_path() rejecting `..` components and
safe_join() that canonicalises + enforces base-dir containment on every
write (calibration.json, samples.json, preference_pairs.jsonl,
occupancy_calibration.json). Defence-in-depth check also in main.rs
before TrainingSession::new.
- osm.rs: clamp Overpass radius to MAX_RADIUS_M=5000m; return Err beyond
that. Add parse_overpass_json() that rejects malformed payloads
(missing top-level `elements` array).
Co-Authored-By: claude-flow <ruv@ruv.net>
* csi_pipeline: rename WiFlow stub to heuristic_pose_from_amplitude, decouple UDP
Blocker 3 (PR #405 review): The "WiFlow inference" path was a stub that
built a model from empty weight vectors and synthesised keypoints from
amplitude energy. Presenting this as "WiFlow inference" was misleading.
- Rename WiFlowModel to PoseModelMetadata (empty tag struct; we only care
if the on-disk file exists)
- Rename load_wiflow_model() -> detect_pose_model_metadata() and log
"amplitude-energy heuristic enabled/disabled" (no "WiFlow" claim)
- Rename estimate_pose() -> heuristic_pose_from_amplitude() with
prominent `STUB:` doc comment saying this is NOT a trained model
Blocker 4 (PR #405 review): The UDP receiver held the shared Arc<Mutex>
across a synchronous process_frame() call, starving HTTP handlers.
- Introduce a std::sync::mpsc channel between the UDP thread (which only
parses + pushes) and a dedicated processor thread (which locks only
briefly around a single process_frame). HTTP snapshots via
get_pipeline_output no longer contend with the socket read loop.
Also:
- Move ADR-018 parser to parser.rs (see next commit); csi_pipeline re-exports
- send_test_frames now uses parser::build_test_frame for synthetic frames
- Log a one-line node stats summary every 500 frames (reads every public
CsiFrame field on the runtime path)
Co-Authored-By: claude-flow <ruv@ruv.net>
* Extract ADR-018 parser into parser.rs + wire Fingerprint CLI
File-split (strong concern #9 in PR #405 review): csi_pipeline.rs was 602
LOC; extract the pure-function ADR-018 parser + synthetic frame builder
into src/parser.rs. Inline unit tests in parser.rs cover:
- 0xC5110001 (raw CSI, v1) roundtrip
- 0xC5110006 (feature state, v6) roundtrip
- wrong magic is rejected
- truncated header is rejected
- truncated payload is rejected
main.rs: expose `fingerprint NAME [--seconds N]` subcommand wiring
record_fingerprint() (this was the only caller needed to make the public
API non-dead on the runtime path). Also:
- Replace `--host/--port` + external `--csi` with a single `--bind`
defaulting to loopback (`127.0.0.1:9880`) — addresses strong concern
#7 about exposing camera/CSI/vitals by default.
- Update synthetic `csi-test` to target UDP 3333 (matching the ADR-018
listener) and use the shared parser::build_test_frame.
- Defence-in-depth: call training::sanitize_data_path on the expanded
--data-dir before TrainingSession::new does the same.
Co-Authored-By: claude-flow <ruv@ruv.net>
* stream: extract viewer HTML to viewer.html, default bind to loopback
Strong concern #7 (PR #405): default HTTP bind leaked camera/CSI/vitals
to the LAN. The `serve` fn now takes a single `bind` arg and prints a
loud WARNING when bound outside loopback.
Strong concern #10 (PR #405): embedded HTML+JS was ~220 LOC of the 418
LOC stream.rs. Moved the markup verbatim into viewer.html and inlined
via `include_str!("viewer.html")`. Also:
- Drop the #![allow(dead_code)] crate-level silencing (reviewer point
#11). Remove the now-unused AppState.csi_pipeline field.
- capture_camera_cloud_with_luminance returns the mean luminance of the
captured frame; the background loop feeds that to
CsiPipelineState::set_light_level so the night-mode flag actually
toggles at runtime (previously it could only be set from tests).
Net effect on file size: stream.rs 418 → 232 LOC.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Dead-code cleanup + tests for fusion/depth/OSM/training/fingerprinting
Reviewer point #11 (PR #405): remove the `#![allow(dead_code)]`
silencing added in 8eb808d and fix the underlying issues.
- Delete csi.rs: duplicate of csi_pipeline.rs with incompatible wire
format (JSON vs ADR-018 binary). csi_pipeline is the real path.
- Delete serial_csi.rs: never referenced by any module.
- Drop Frame.timestamp_ms (unread), AppState.csi_pipeline (unread),
brain_bridge::brain_available (caller-less), fusion::fetch_wifi_occupancy
(caller-less) — these had no runtime users.
- Drop crate-level #![allow(dead_code)] from camera.rs, depth.rs,
fusion.rs, pointcloud.rs.
Tests (target: 8-12, actual: 15 unit + 9 geo unit + 8 geo integration
= 32 total, all pass):
- parser.rs: 5 tests (v1/v6 magic roundtrip, wrong magic, truncated
header, truncated payload).
- fusion.rs: 2 tests (non-overlapping merge, voxel dedup).
- depth.rs: 2 tests (2x2 backproject → 4 points at z=1, NaN rejected).
- training.rs: 4 tests (rejects `..`, accepts relative child, refuses
TrainingSession::new("../etc/passwd"), accepts a clean tmpdir).
- csi_pipeline.rs: 2 tests (set_light_level toggles is_dark,
record_fingerprint stores and self-identifies).
- osm.rs: 3 tests (parse_overpass_json minimal fixture, rejects
malformed payload, fetch_buildings rejects > MAX_RADIUS_M).
Co-Authored-By: claude-flow <ruv@ruv.net>
* Update README + user-guide for PR #405 review-fix additions
- serve now uses --bind 127.0.0.1:9880 (loopback default) instead of --port
- Add fingerprint subcommand to CLI tables
- Document RUVIEW_BRAIN_URL env var + --brain flag
- Flag pose path as amplitude-energy heuristic stub (not trained WiFlow)
- Security note on exposing server outside loopback
- Add wifi-densepose-pointcloud + wifi-densepose-geo rows to crate table
Co-Authored-By: claude-flow <ruv@ruv.net>
This commit is contained in: parent ae40e2b33e, commit 0943a32248.
35 changed files with 4649 additions and 8 deletions.
2  .gitignore (vendored)
@@ -250,3 +250,5 @@ v1/src/sensing/mac_wifi
# Local build scripts
firmware/esp32-csi-node/build_firmware.bat
data/
models/
demo_pointcloud.ply
demo_splats.json
43  README.md
@@ -96,6 +96,47 @@ node scripts/mincut-person-counter.js --port 5006  # Correct person counting
>

---

### Real-Time Dense Point Cloud (NEW)

RuView now generates **real-time 3D point clouds** by fusing camera depth + WiFi CSI + mmWave radar. All sensors stream simultaneously into a unified spatial model.

| Sensor | Data | Integration |
|--------|------|-------------|
| **Camera** | MiDaS monocular depth (GPU) | 640×480 → 19,200+ depth points per frame |
| **ESP32 CSI** | ADR-018 binary frames (UDP) | RF tomography → 8×8×4 occupancy grid |
| **WiFlow Pose** | 17 COCO keypoints from CSI | Skeleton overlay on point cloud |
| **Vital Signs** | Breathing rate from CSI phase | Stored in ruOS brain every 60s |
| **Motion** | CSI amplitude variance | Adaptive capture rate (skip depth when still) |

**Quick start:**
```bash
cd rust-port/wifi-densepose-rs
cargo build --release -p wifi-densepose-pointcloud
./target/release/ruview-pointcloud serve --bind 127.0.0.1:9880
# Open http://localhost:9880 for live 3D viewer
```

**CLI commands:**
```bash
ruview-pointcloud demo                            # synthetic demo
ruview-pointcloud serve --bind 127.0.0.1:9880     # live server + Three.js viewer
ruview-pointcloud capture --output room.ply       # capture to PLY
ruview-pointcloud train                           # depth calibration + DPO pairs
ruview-pointcloud cameras                         # list available cameras
ruview-pointcloud csi-test --count 100            # send test CSI frames
ruview-pointcloud fingerprint office --seconds 5  # record named CSI room fingerprint
```

The HTTP/viewer server defaults to **loopback (`127.0.0.1`)** — exposing live camera/CSI/vitals on `0.0.0.0` is an explicit opt-in. Brain URL defaults to `http://127.0.0.1:9876` and is overridable via `RUVIEW_BRAIN_URL` env var or the `--brain` flag on `serve`/`train`.

The pose overlay currently uses an **amplitude-energy heuristic** (`heuristic_pose_from_amplitude`) rather than trained WiFlow inference — real ONNX/Candle inference is tracked as a follow-up.

**Performance:** 22ms pipeline, 905 req/s API, 40K voxel room model from 20 frames.

**Brain integration:** Spatial observations (motion, vitals, skeleton, occupancy) sync to the ruOS brain every 60 seconds for agent reasoning.

See [PR #405](https://github.com/ruvnet/RuView/pull/405) for full details.

### What's New in v0.7.0

<details>

@@ -904,6 +945,8 @@ cargo add wifi-densepose-ruvector # RuVector v2.0.4 integration layer (ADR-017

| [`wifi-densepose-api`](https://crates.io/crates/wifi-densepose-api) | REST + WebSocket API layer | -- | [](https://crates.io/crates/wifi-densepose-api) |
| [`wifi-densepose-config`](https://crates.io/crates/wifi-densepose-config) | Configuration management | -- | [](https://crates.io/crates/wifi-densepose-config) |
| [`wifi-densepose-db`](https://crates.io/crates/wifi-densepose-db) | Database persistence (PostgreSQL, SQLite, Redis) | -- | [](https://crates.io/crates/wifi-densepose-db) |
| `wifi-densepose-pointcloud` | Real-time dense point cloud from camera + WiFi CSI fusion (Three.js viewer, brain bridge). Workspace-only for now. | -- | — |
| `wifi-densepose-geo` | Geospatial context (Sentinel-2 tiles, SRTM elevation, OSM, weather, night-mode). Workspace-only for now. | -- | — |

All crates integrate with [RuVector v2.0.4](https://github.com/ruvnet/ruvector) — see [AI Backbone](#ai-backbone-ruvector) below.
65  docs/adr/ADR-044-geospatial-satellite-integration.md  (new file)
@@ -0,0 +1,65 @@
# ADR-044: Geospatial Satellite Integration

## Status
Accepted

## Context
RuView generates real-time 3D point clouds from camera + WiFi CSI, but these exist in a local coordinate frame with no geographic reference. Integrating free satellite imagery, terrain elevation, and map data provides environmental context that enables the ruOS brain to reason about the physical world beyond the room.

## Decision

### Data Sources (all free, no API keys)
| Source | Data | Resolution | Update | Format |
|--------|------|-----------|--------|--------|
| EOX Sentinel-2 Cloudless | Satellite tiles | 10m | Static mosaic | XYZ/JPEG |
| SRTM GL1 (NASA) | Elevation/DEM | 30m (1-arcsec) | Static | Binary HGT |
| Overpass API (OSM) | Buildings, roads | Vector | Real-time | JSON |
| ip-api.com | IP geolocation | ~1km | Per-request | JSON |
| Sentinel-2 STAC | Temporal satellite | 10m | Every 5 days | COG/STAC |
| Open Meteo | Weather | Point | Hourly | JSON |

### Architecture
Pure Rust implementation in `wifi-densepose-geo` crate. No GDAL/PROJ/GEOS — coordinate transforms implemented directly (~250 LOC). Tile caching on disk at `~/.local/share/ruview/geo-cache/`.

### Coordinate System
- WGS84 for geographic coordinates
- ENU (East-North-Up) as the bridge between local sensor frame and world
- Local sensor frame: camera origin, +Z forward, +Y up

### Temporal Awareness
Nightly scheduled fetch of Sentinel-2 latest imagery + OSM diffs + weather. Changes detected via image comparison and stored as brain memories for contrastive learning.

### Brain Integration
Geospatial context stored as brain memories:
- `spatial-geo`: location, elevation, nearby landmarks
- `spatial-change`: detected changes in satellite/OSM data
- `spatial-weather`: current conditions + forecast
- `spatial-season`: vegetation index, snow cover, seasonal patterns
- `spatial-local`: hyperlocal web context from Common Crawl WET

### Extended Data Sources (via ruvector WET/Common Crawl)
| Source | Data | Use |
|--------|------|-----|
| Common Crawl WET | Web text near location | Local business info, reviews, events |
| Wikidata | Structured knowledge | Building names, POI descriptions |
| NASA FIRMS | Active fire (3-hour) | Safety alerts |
| USGS Earthquakes | Seismic events | Safety context |
| OpenAQ | Air quality (PM2.5) | Environmental health |
| Overture Maps | Building footprints (Meta/MS) | Higher quality than OSM |

The ruvector brain server has existing `web_ingest` + Common Crawl support. WET files filtered by geographic URL patterns provide hyperlocal context.

## Consequences
### Positive
- Agent gains environmental awareness beyond the room
- Temporal data enables seasonal calibration of CSI sensing
- Change detection finds construction, vegetation, weather effects
- All data sources are genuinely free with no API keys

### Negative
- Initial data fetch requires internet (~2MB tiles + ~25MB DEM)
- Cached data becomes stale (mitigated by nightly refresh)
- IP geolocation has ~1km accuracy (mitigated by manual override)
@@ -1,4 +1,4 @@
-# ADR-044: Provisioning Tool Enhancements
+# ADR-050: Provisioning Tool Enhancements

**Status**: Proposed
**Date**: 2026-03-03
@@ -536,6 +536,110 @@ Both UIs update in real-time via WebSocket and auto-detect the sensing server on

---

## Dense Point Cloud (Camera + WiFi CSI Fusion)

RuView can generate real-time 3D point clouds by fusing camera depth estimation with WiFi CSI spatial sensing. This creates a spatial model of the environment that updates in real-time.

### Setup

```bash
# Build the pointcloud binary
cd rust-port/wifi-densepose-rs
cargo build --release -p wifi-densepose-pointcloud

# Start the server (auto-detects camera + CSI). Loopback-only by default.
./target/release/ruview-pointcloud serve --bind 127.0.0.1:9880
```

Open `http://localhost:9880` for the interactive Three.js 3D viewer.

> **Security note.** The server exposes live camera, skeleton, vitals, and occupancy over HTTP. The `--bind` flag defaults to `127.0.0.1:9880` (loopback-only). Exposing on `0.0.0.0` or a LAN IP is opt-in — the server logs a warning when it does, but there is no auth/TLS layer. Put a reverse proxy in front if you need remote access.

> **Brain URL.** Observations are POSTed to `http://127.0.0.1:9876` by default. Override via the `RUVIEW_BRAIN_URL` environment variable or the `--brain <url>` flag on `serve` / `train`.

### Sensors

| Sensor | Auto-detected | Data |
|--------|--------------|------|
| Camera (`/dev/video0`) | Yes (Linux UVC) | RGB frames → MiDaS depth → 3D points |
| ESP32 CSI (UDP:3333) | Yes (if provisioned) | ADR-018 binary → occupancy + pose + vitals |
| MiDaS depth server (port 9885) | Optional | GPU-accelerated neural depth estimation |

### Commands

| Command | Description |
|---------|-------------|
| `ruview-pointcloud serve --bind 127.0.0.1:9880` | Start HTTP server + Three.js viewer (loopback-only by default) |
| `ruview-pointcloud demo` | Generate synthetic point cloud (no hardware needed) |
| `ruview-pointcloud capture --output room.ply` | Capture single frame to PLY file |
| `ruview-pointcloud cameras` | List available cameras |
| `ruview-pointcloud train --data-dir ./data [--brain URL]` | Depth calibration + occupancy training (writes under canonicalized `data-dir`; refuses `..` traversal) |
| `ruview-pointcloud csi-test --count 100` | Send test CSI frames (no ESP32 needed) |
| `ruview-pointcloud fingerprint <name> [--seconds 5]` | Record a named CSI room fingerprint for later matching |

### Pipeline Components

1. **ADR-018 Parser** — Decodes ESP32 CSI binary frames from UDP (magic `0xC5110001` raw CSI and `0xC5110006` feature state), extracts I/Q subcarrier amplitudes and phases. Lives in `parser.rs`; unit-tested against hand-rolled test vectors.
2. **Pose (stub)** — 17 COCO keypoint *layout* generated by `heuristic_pose_from_amplitude` from CSI amplitude energy. This is **not** the trained WiFlow model — it is a placeholder so the viewer has a skeleton to render. Wiring to real Candle/ONNX inference from the `wifi-densepose-nn` crate is a planned follow-up.
3. **Vital Signs** — Breathing rate from CSI phase analysis (peak counting on stable subcarrier)
4. **Motion Detection** — CSI amplitude variance over 20 frames, triggers adaptive capture
5. **RF Tomography** — Backprojection from per-node RSSI to 8×8×4 occupancy grid
6. **Camera Depth** — MiDaS monocular depth (GPU) with luminance+edge fallback
7. **Sensor Fusion** — Voxel-grid merging of camera depth + CSI occupancy
8. **Brain Bridge** — Stores spatial observations in the ruOS brain every 60 seconds

### API Endpoints

| Endpoint | Method | Returns |
|----------|--------|---------|
| `/health` | GET | `{"status": "ok"}` |
| `/api/status` | GET | Camera, CSI, pipeline state, vitals, motion |
| `/api/cloud` | GET | Point cloud (up to 1000 points) + pipeline data |
| `/api/splats` | GET | Gaussian splats for Three.js rendering |
| `/` | GET | Interactive Three.js 3D viewer |

### Training

The training pipeline calibrates depth estimation and occupancy detection:

```bash
ruview-pointcloud train --data-dir ~/.local/share/ruview/training --brain http://127.0.0.1:9876
```

This captures frames, runs depth calibration (grid search over scale/offset/gamma), trains occupancy thresholds, exports DPO preference pairs, and submits results to the ruOS brain.

### Output Formats

- **PLY** — Standard 3D point cloud (ASCII, with RGB color)
- **Gaussian Splats** — JSON format for Three.js rendering
- **Brain Memories** — Spatial observations stored as `spatial-observation`, `spatial-motion`, `spatial-vitals`

### Deep Room Scan

Capture a high-quality 3D model of the room:

```bash
# Stop the live server first (frees the camera)
# Then capture 20 frames and process with MiDaS
ruview-pointcloud capture --frames 20 --output room_model.ply
```

Result: 40,000+ voxels at 5cm resolution, 12,000+ Gaussian splats.

### ESP32 Provisioning for CSI

To send CSI data to the pointcloud server:

```bash
python3 firmware/esp32-csi-node/provision.py \
  --port /dev/ttyACM0 \
  --ssid "YourWiFi" --password "YourPassword" \
  --target-ip 192.168.1.123 --target-port 3333 \
  --node-id 1
```

---

## Vital Sign Detection

The system extracts breathing rate and heart rate from CSI signal fluctuations using FFT peak detection.
73  rust-port/wifi-densepose-rs/Cargo.lock  (generated)
@@ -1210,13 +1210,34 @@ dependencies = [
 "subtle",
]

[[package]]
name = "dirs"
version = "5.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "44c45a9d03d6676652bcb5e724c7e988de1acad23a711b5217ab9cbecbec2225"
dependencies = [
 "dirs-sys 0.4.1",
]

[[package]]
name = "dirs"
version = "6.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c3e8aa94d75141228480295a7d0e7feb620b1a5ad9f12bc40be62411e38cce4e"
dependencies = [
- "dirs-sys",
+ "dirs-sys 0.5.0",
]

[[package]]
name = "dirs-sys"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "520f05a5cbd335fae5a99ff7a6ab8627577660ee5cfd6a94a6a929b52ff0321c"
dependencies = [
 "libc",
 "option-ext",
 "redox_users 0.4.6",
 "windows-sys 0.48.0",
]

[[package]]

@@ -1227,7 +1248,7 @@ checksum = "e01a3366d27ee9890022452ee61b2b63a67e6f13f58900b651ff5665f0bb1fab"
dependencies = [
 "libc",
 "option-ext",
- "redox_users",
+ "redox_users 0.5.2",
 "windows-sys 0.61.2",
]

@@ -3789,7 +3810,7 @@ version = "0.10.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fccd2c4f5271ab871f2069cb6f1a13ef2c0db50e1145ce03428ee541f4c63c4f"
dependencies = [
- "dirs",
+ "dirs 6.0.0",
 "openblas-build",
 "pkg-config",
 "vcpkg",

@@ -4834,6 +4855,17 @@ dependencies = [
 "bitflags 2.11.0",
]

[[package]]
name = "redox_users"
version = "0.4.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ba009ff324d1fc1b900bd1fdb31564febe58a8ccc8a6fdbb93b543d33b13ca43"
dependencies = [
 "getrandom 0.2.17",
 "libredox",
 "thiserror 1.0.69",
]

[[package]]
name = "redox_users"
version = "0.5.2"

@@ -6264,7 +6296,7 @@ dependencies = [
 "anyhow",
 "bytes",
 "cookie",
- "dirs",
+ "dirs 6.0.0",
 "dunce",
 "embed_plist",
 "getrandom 0.3.4",

@@ -6314,7 +6346,7 @@ checksum = "4bbc990d1dbf57a8e1c7fa2327f2a614d8b757805603c1b9ba5c81bade09fd4d"
dependencies = [
 "anyhow",
 "cargo_toml",
- "dirs",
+ "dirs 6.0.0",
 "glob",
 "heck 0.5.0",
 "json-patch",

@@ -7088,7 +7120,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a5e85aa143ceb072062fc4d6356c1b520a51d636e7bc8e77ec94be3608e5e80c"
dependencies = [
 "crossbeam-channel",
- "dirs",
+ "dirs 6.0.0",
 "libappindicator",
 "muda",
 "objc2",

@@ -7820,6 +7852,18 @@ dependencies = [
 "uuid",
]

[[package]]
name = "wifi-densepose-geo"
version = "0.1.0"
dependencies = [
 "anyhow",
 "chrono",
 "reqwest 0.12.28",
 "serde",
 "serde_json",
 "tokio",
]

[[package]]
name = "wifi-densepose-hardware"
version = "0.3.0"

@@ -7894,6 +7938,21 @@ dependencies = [
 "tracing",
]

[[package]]
name = "wifi-densepose-pointcloud"
version = "0.1.0"
dependencies = [
 "anyhow",
 "axum",
 "chrono",
 "clap",
 "dirs 5.0.1",
 "reqwest 0.12.28",
 "serde",
 "serde_json",
 "tokio",
]

[[package]]
name = "wifi-densepose-ruvector"
version = "0.3.0"

@@ -8718,7 +8777,7 @@ dependencies = [
 "block2",
 "cookie",
 "crossbeam-channel",
- "dirs",
+ "dirs 6.0.0",
 "dpi",
 "dunce",
 "gdkx11",
|
|
@@ -17,6 +17,8 @@ members = [
     "crates/wifi-densepose-vitals",
     "crates/wifi-densepose-ruvector",
     "crates/wifi-densepose-desktop",
+    "crates/wifi-densepose-pointcloud",
+    "crates/wifi-densepose-geo",
 ]
 # ADR-040: WASM edge crate targets wasm32-unknown-unknown (no_std),
 # excluded from workspace to avoid breaking `cargo test --workspace`.
@@ -0,0 +1,13 @@
[package]
name = "wifi-densepose-geo"
version = "0.1.0"
edition = "2021"
description = "Geospatial satellite integration — free satellite tiles, DEM, OSM, temporal tracking"

[dependencies]
serde = { workspace = true }
serde_json = { workspace = true }
tokio = { workspace = true }
anyhow = { workspace = true }
reqwest = { version = "0.12", features = ["json", "native-tls"], default-features = false }
chrono = "0.4"
rust-port/wifi-densepose-rs/crates/wifi-densepose-geo/README.md (new file, 105 lines)
@@ -0,0 +1,105 @@
# wifi-densepose-geo — Geospatial Satellite Integration

Free satellite imagery, terrain elevation, and map data for RuView spatial sensing. No API keys required.

## What It Does

Integrates your local sensor data (camera + WiFi CSI point cloud) with geographic context:

- **Satellite tiles** — 10m Sentinel-2 cloudless imagery for your location
- **Elevation** — SRTM 30m DEM for terrain modeling
- **Buildings + roads** — OpenStreetMap data via the Overpass API
- **Weather** — Open-Meteo current conditions + forecast
- **Geo-registration** — maps local sensor coordinates to WGS84
- **Temporal tracking** — detects changes over time (construction, vegetation, weather)
- **Brain integration** — stores geospatial context as ruOS brain memories

## Data Sources (all free, no API keys)

| Source | Data | Resolution | License |
|--------|------|------------|---------|
| [EOX S2 Cloudless](https://s2maps.eu/) | Satellite tiles | 10m | CC-BY-4.0 |
| [SRTM GL1](https://portal.opentopography.org/) | Elevation/DEM | 30m | Public domain |
| [Overpass API](https://overpass-api.de/) | OSM buildings/roads | Vector | ODbL |
| [ip-api.com](http://ip-api.com/) | IP geolocation | ~1km | Free |
| [Open-Meteo](https://open-meteo.com/) | Weather | Point | CC-BY-4.0 |

## Modules

| Module | LOC | Purpose |
|--------|-----|---------|
| `types.rs` | 140 | GeoPoint, GeoBBox, TileCoord, ElevationGrid, OsmFeature |
| `coord.rs` | 80 | WGS84/ENU transforms, tile math, haversine distance |
| `locate.rs` | 45 | IP geolocation with caching |
| `cache.rs` | 55 | Disk cache (`~/.local/share/ruview/geo-cache/`) |
| `tiles.rs` | 80 | Sentinel-2/ESRI/OSM tile fetcher |
| `terrain.rs` | 100 | SRTM HGT parser, elevation lookup |
| `osm.rs` | 150 | Overpass API client, building/road extraction |
| `register.rs` | 50 | Local-to-WGS84 coordinate registration |
| `fuse.rs` | 70 | Multi-source scene builder + summary |
| `brain.rs` | 30 | Store geo context in ruOS brain |
| `temporal.rs` | 100 | Weather, OSM change detection |
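Reviewer note: the tile math in `coord.rs` is the standard Slippy Map (XYZ) scheme. A self-contained sketch of the lat/lon → tile conversion, with the formulas mirrored from `coord.rs` (the Toronto coordinates are illustrative):

```rust
// Slippy Map (XYZ) tile addressing: at zoom z the world is a 2^z × 2^z grid.
// x grows eastward from lon = -180; y grows southward from the north pole.
fn wgs84_to_tile(lat: f64, lon: f64, zoom: u8) -> (u32, u32) {
    let n = 2f64.powi(zoom as i32);
    let x = ((lon + 180.0) / 360.0 * n).floor() as u32;
    let lat_rad = lat.to_radians();
    // Web Mercator y: asinh(tan(lat)) maps latitude onto the unit Mercator strip.
    let y = ((1.0 - lat_rad.tan().asinh() / std::f64::consts::PI) / 2.0 * n).floor() as u32;
    (x, y)
}

fn main() {
    // Downtown Toronto at zoom 16 (the zoom level the scene summary uses).
    let (x, y) = wgs84_to_tile(43.6532, -79.3832, 16);
    println!("tile: 16/{}/{}", x, y);
}
```

Tile URLs for any XYZ provider are then just `{base}/{z}/{x}/{y}.jpg`, which is how `tiles.rs` can swap between Sentinel-2, ESRI, and OSM endpoints.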
## Usage

```rust
use wifi_densepose_geo::{fuse, brain, temporal};

// Build a geo scene for the current location (500 m radius)
let scene = fuse::build_scene(500.0).await?;
println!("{}", fuse::summarize(&scene));
// "Location: 43.6532N, 79.3832W, elevation 76m ASL.
//  23 buildings within view. 8 roads nearby (King St, Queen St).
//  12 satellite tiles at zoom 16."

// Store in brain
brain::store_geo_context(&scene).await?;

// Fetch weather
let weather = temporal::fetch_weather(&scene.location).await?;
// temperature: 12°C, partly cloudy, humidity 65%
```

## Brain Integration

Geospatial context is stored as brain memories:

| Category | Content | Frequency |
|----------|---------|-----------|
| `spatial-geo` | Location, elevation, buildings, roads | On startup + daily |
| `spatial-weather` | Temperature, conditions, humidity, wind | Nightly |
| `spatial-change` | New/removed buildings, road changes | Nightly diff |

The ruOS agent can ask "what buildings are near me?" or "what's the weather?" and retrieve geospatial context from the brain.
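Reviewer note: the memory write itself is a plain JSON POST (`brain.rs` sends a `{"category", "content"}` body to `{brain_url}/memories`). A dependency-free sketch of just the body shape; the escaping closure here is illustrative — the real code builds the body with `serde_json`:

```rust
// Assemble the wire format brain.rs POSTs to {RUVIEW_BRAIN_URL}/memories.
// Minimal escaping for backslashes and quotes only; serde_json handles the
// full JSON grammar in the actual crate.
fn memory_body(category: &str, content: &str) -> String {
    let esc = |s: &str| s.replace('\\', "\\\\").replace('"', "\\\"");
    format!("{{\"category\":\"{}\",\"content\":\"{}\"}}", esc(category), esc(content))
}

fn main() {
    let body = memory_body("spatial-geo", "Location: 43.6532N, 79.3832W, elevation 76m ASL.");
    println!("{body}");
}
```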
## Security

- No API keys stored or transmitted
- IP geolocation uses HTTP (not HTTPS) — the free ip-api.com tier is HTTP-only, and the returned location is approximate (~1km)
- All other network fetches (tiles, DEM, OSM, weather) use HTTPS
- Path traversal protection in cache key sanitization
- No user data sent to external services
- All data cached locally after first fetch
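Reviewer note: the path-traversal guard mentioned above is the one-liner from `cache.rs`. A quick demonstration that traversal attempts collapse into flat, harmless file names under the cache root:

```rust
// Cache-key sanitization as implemented in cache.rs: ".." and "/" are
// rewritten so every key resolves to a single file directly under base_dir.
fn sanitize_key(key: &str) -> String {
    key.replace("..", "_").replace('/', "_")
}

fn main() {
    // A normal tile key becomes a flat file name:
    println!("{}", sanitize_key("tiles/16/18316/23921.jpg")); // tiles_16_18316_23921.jpg
    // A traversal attempt loses both its separators and its dot-dots:
    println!("{}", sanitize_key("../../etc/passwd"));
}
```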
## Architecture

```
IP Geolocation ──→ (lat, lon)
                       │
         ┌─────────────┼─────────────┐
         ▼             ▼             ▼
    Sentinel-2      SRTM DEM     Overpass API
     (tiles)      (elevation)  (buildings/roads)
         │             │             │
         └─────────────┼─────────────┘
                       ▼
               GeoScene (fused)
                       │
               ┌───────┴───────┐
               ▼               ▼
         Brain Memory   Three.js Viewer
```

## License

MIT (same as RuView)
@@ -0,0 +1,47 @@
use wifi_densepose_geo::*;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    println!("╔══════════════════════════════════════════════╗");
    println!("║   ruview-geo — Real Data Validation          ║");
    println!("╚══════════════════════════════════════════════╝\n");

    let t0 = std::time::Instant::now();
    let cache = cache::TileCache::new("/tmp/ruview-geo-validate");

    let loc = locate::get_location(&format!("{}/location.json", cache.base_dir.display())).await?;
    println!("  Location: {:.4}N, {:.4}W", loc.lat, loc.lon);

    let bbox = GeoBBox::from_center(&loc, 300.0);
    let tiles_list = tiles::fetch_area(&tiles::TileProvider::Sentinel2Cloudless, &bbox, 16, &cache).await?;
    println!("  Tiles: {} ({:.0}KB)", tiles_list.len(),
        tiles_list.iter().map(|t| t.data.len()).sum::<usize>() as f64 / 1024.0);

    let dem = terrain::fetch_elevation(&loc, &cache).await?;
    println!("  Elevation: {:.0}m (grid {}x{})", terrain::elevation_at(&dem, &loc), dem.cols, dem.rows);

    let buildings = osm::fetch_buildings(&loc, 300.0).await.unwrap_or_default();
    let roads = osm::fetch_roads(&loc, 300.0).await.unwrap_or_default();
    println!("  OSM: {} buildings, {} roads", buildings.len(), roads.len());

    let weather = temporal::fetch_weather(&loc).await?;
    println!("  Weather: {:.0}°C humidity={:.0}% wind={:.1}m/s",
        weather.temperature_c, weather.humidity_pct, weather.wind_speed_ms);

    let scene = GeoScene {
        location: loc.clone(), bbox, elevation_m: terrain::elevation_at(&dem, &loc),
        buildings, roads, tile_count: tiles_list.len(),
        registration: register::auto_register(&loc),
        last_updated: chrono::Utc::now().to_rfc3339(),
    };
    println!("\n  {}", fuse::summarize(&scene));

    match brain::store_geo_context(&scene).await {
        Ok(n) => println!("  Brain: {} memories stored", n),
        Err(e) => println!("  Brain: {e}"),
    }

    println!("\n  Total: {}ms | Cache: {:.0}KB",
        t0.elapsed().as_millis(), cache.size_bytes() as f64 / 1024.0);
    Ok(())
}
@@ -0,0 +1,42 @@
//! Brain integration — store geospatial context in ruOS brain.
//!
//! Brain URL is read from `RUVIEW_BRAIN_URL` env var (default
//! `http://127.0.0.1:9876`). The resolved URL is logged once on first use.

use crate::fuse;
use crate::types::GeoScene;
use anyhow::Result;
use std::sync::OnceLock;

const DEFAULT_BRAIN_URL: &str = "http://127.0.0.1:9876";

pub(crate) fn brain_url() -> &'static str {
    static BRAIN_URL: OnceLock<String> = OnceLock::new();
    BRAIN_URL.get_or_init(|| {
        let url = std::env::var("RUVIEW_BRAIN_URL")
            .unwrap_or_else(|_| DEFAULT_BRAIN_URL.to_string());
        eprintln!("  wifi-densepose-geo: using brain URL {url}");
        url
    })
}

/// Store geospatial context in the brain.
pub async fn store_geo_context(scene: &GeoScene) -> Result<u32> {
    let client = reqwest::Client::builder()
        .timeout(std::time::Duration::from_secs(5))
        .build()?;

    let mut stored = 0u32;

    // Store location summary
    let summary = fuse::summarize(scene);
    let body = serde_json::json!({
        "category": "spatial-geo",
        "content": summary,
    });
    if client.post(format!("{}/memories", brain_url())).json(&body).send().await.is_ok() {
        stored += 1;
    }

    Ok(stored)
}
@@ -0,0 +1,61 @@
//! Disk cache for tiles, DEM, and OSM data.

use anyhow::Result;
use std::path::{Path, PathBuf};

pub struct TileCache {
    pub base_dir: PathBuf,
}

impl TileCache {
    pub fn new(base_dir: &str) -> Self {
        let expanded = base_dir.replace('~', &std::env::var("HOME").unwrap_or_default());
        let path = PathBuf::from(expanded);
        let _ = std::fs::create_dir_all(&path);
        Self { base_dir: path }
    }

    pub fn default_cache() -> Self {
        Self::new("~/.local/share/ruview/geo-cache")
    }

    pub fn get(&self, key: &str) -> Option<Vec<u8>> {
        let path = self.key_path(key);
        std::fs::read(&path).ok()
    }

    pub fn put(&self, key: &str, data: &[u8]) -> Result<()> {
        let path = self.key_path(key);
        if let Some(parent) = path.parent() {
            std::fs::create_dir_all(parent)?;
        }
        std::fs::write(&path, data)?;
        Ok(())
    }

    pub fn has(&self, key: &str) -> bool {
        self.key_path(key).exists()
    }

    pub fn size_bytes(&self) -> u64 {
        walkdir(self.base_dir.as_path())
    }

    fn key_path(&self, key: &str) -> PathBuf {
        // Sanitize key to prevent path traversal
        let safe_key = key.replace("..", "_").replace('/', "_");
        self.base_dir.join(safe_key)
    }
}

fn walkdir(path: &Path) -> u64 {
    std::fs::read_dir(path)
        .into_iter()
        .flatten()
        .filter_map(|e| e.ok())
        .map(|e| {
            if e.path().is_dir() { walkdir(&e.path()) }
            else { e.metadata().map(|m| m.len()).unwrap_or(0) }
        })
        .sum()
}
@@ -0,0 +1,74 @@
//! Coordinate transforms — WGS84, UTM, ENU, tile math.

use crate::types::{GeoPoint, GeoBBox, TileCoord};

const WGS84_A: f64 = 6_378_137.0;
#[allow(dead_code)]
const WGS84_F: f64 = 1.0 / 298.257_223_563;
#[allow(dead_code)]
const WGS84_E2: f64 = 2.0 * WGS84_F - WGS84_F * WGS84_F;

/// Haversine distance in meters.
pub fn haversine(a: &GeoPoint, b: &GeoPoint) -> f64 {
    let dlat = (b.lat - a.lat).to_radians();
    let dlon = (b.lon - a.lon).to_radians();
    let lat1 = a.lat.to_radians();
    let lat2 = b.lat.to_radians();
    let h = (dlat / 2.0).sin().powi(2) + lat1.cos() * lat2.cos() * (dlon / 2.0).sin().powi(2);
    2.0 * WGS84_A * h.sqrt().asin()
}

/// WGS84 to local ENU (East-North-Up) relative to origin, in meters.
pub fn wgs84_to_enu(point: &GeoPoint, origin: &GeoPoint) -> [f64; 3] {
    let dlat = (point.lat - origin.lat).to_radians();
    let dlon = (point.lon - origin.lon).to_radians();
    let lat = origin.lat.to_radians();
    let east = dlon * WGS84_A * lat.cos();
    let north = dlat * WGS84_A;
    let up = point.alt - origin.alt;
    [east, north, up]
}

/// Local ENU to WGS84.
pub fn enu_to_wgs84(enu: &[f64; 3], origin: &GeoPoint) -> GeoPoint {
    let lat = origin.lat.to_radians();
    let dlat = enu[1] / WGS84_A;
    let dlon = enu[0] / (WGS84_A * lat.cos());
    GeoPoint {
        lat: origin.lat + dlat.to_degrees(),
        lon: origin.lon + dlon.to_degrees(),
        alt: origin.alt + enu[2],
    }
}

/// WGS84 to XYZ tile coordinates (Slippy Map).
pub fn wgs84_to_tile(lat: f64, lon: f64, zoom: u8) -> TileCoord {
    let n = 2f64.powi(zoom as i32);
    let x = ((lon + 180.0) / 360.0 * n).floor() as u32;
    let lat_rad = lat.to_radians();
    let y = ((1.0 - lat_rad.tan().asinh() / std::f64::consts::PI) / 2.0 * n).floor() as u32;
    TileCoord { z: zoom, x, y }
}

/// Tile bounds in WGS84.
pub fn tile_bounds(coord: &TileCoord) -> GeoBBox {
    let n = 2f64.powi(coord.z as i32);
    let west = coord.x as f64 / n * 360.0 - 180.0;
    let east = (coord.x + 1) as f64 / n * 360.0 - 180.0;
    let north = (std::f64::consts::PI * (1.0 - 2.0 * coord.y as f64 / n)).sinh().atan().to_degrees();
    let south = (std::f64::consts::PI * (1.0 - 2.0 * (coord.y + 1) as f64 / n)).sinh().atan().to_degrees();
    GeoBBox { south, west, north, east }
}

/// Get all tile coordinates covering a bounding box at a zoom level.
pub fn tiles_for_bbox(bbox: &GeoBBox, zoom: u8) -> Vec<TileCoord> {
    let tl = wgs84_to_tile(bbox.north, bbox.west, zoom);
    let br = wgs84_to_tile(bbox.south, bbox.east, zoom);
    let mut tiles = Vec::new();
    for y in tl.y..=br.y {
        for x in tl.x..=br.x {
            tiles.push(TileCoord { z: zoom, x, y });
        }
    }
    tiles
}
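Reviewer note: the ENU approximation in `coord.rs` is exactly invertible up to float rounding, which makes a round-trip a cheap sanity check. A self-contained sketch, with the types and formulas re-declared inline for illustration (not the crate's actual definitions):

```rust
// Round-trip check for the equirectangular ENU approximation used in coord.rs:
// wgs84 -> ENU -> wgs84 should reproduce the input up to float rounding.
const WGS84_A: f64 = 6_378_137.0;

#[derive(Clone, Copy, Debug)]
struct GeoPoint { lat: f64, lon: f64, alt: f64 }

fn wgs84_to_enu(p: &GeoPoint, o: &GeoPoint) -> [f64; 3] {
    let dlat = (p.lat - o.lat).to_radians();
    let dlon = (p.lon - o.lon).to_radians();
    let lat = o.lat.to_radians();
    // East scales longitude by cos(lat); north is plain arc length.
    [dlon * WGS84_A * lat.cos(), dlat * WGS84_A, p.alt - o.alt]
}

fn enu_to_wgs84(enu: &[f64; 3], o: &GeoPoint) -> GeoPoint {
    let lat = o.lat.to_radians();
    GeoPoint {
        lat: o.lat + (enu[1] / WGS84_A).to_degrees(),
        lon: o.lon + (enu[0] / (WGS84_A * lat.cos())).to_degrees(),
        alt: o.alt + enu[2],
    }
}

fn main() {
    let origin = GeoPoint { lat: 43.6532, lon: -79.3832, alt: 76.0 };
    let p = GeoPoint { lat: 43.6540, lon: -79.3820, alt: 80.0 };
    let enu = wgs84_to_enu(&p, &origin);
    let back = enu_to_wgs84(&enu, &origin);
    println!("enu = {:?}, back = {:?}", enu, back);
}
```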
@@ -0,0 +1,72 @@
//! Multi-source fusion — satellite + terrain + OSM + local sensor data.

use crate::cache::TileCache;
use crate::types::*;
use crate::{locate, osm, terrain, tiles};
use anyhow::Result;

/// Build a complete geo scene for a location.
pub async fn build_scene(radius_m: f64) -> Result<GeoScene> {
    let cache = TileCache::default_cache();

    // 1. Locate
    let cache_path = cache.base_dir.join("location.json");
    let location = locate::get_location(cache_path.to_str().unwrap_or("")).await?;
    eprintln!("  Geo: located at {:.4}N, {:.4}W", location.lat, location.lon);

    // 2. Fetch satellite tiles
    let bbox = GeoBBox::from_center(&location, radius_m);
    let tile_list = tiles::fetch_area(&tiles::TileProvider::Sentinel2Cloudless, &bbox, 16, &cache).await?;
    eprintln!("  Geo: fetched {} satellite tiles", tile_list.len());

    // 3. Fetch elevation
    let dem = terrain::fetch_elevation(&location, &cache).await?;
    let elevation = terrain::elevation_at(&dem, &location);
    eprintln!("  Geo: elevation {:.0}m ASL", elevation);

    // 4. Fetch OSM buildings + roads
    let buildings = osm::fetch_buildings(&location, radius_m).await.unwrap_or_default();
    let roads = osm::fetch_roads(&location, radius_m).await.unwrap_or_default();
    eprintln!("  Geo: {} buildings, {} roads", buildings.len(), roads.len());

    // 5. Build registration
    let mut reg_origin = location.clone();
    reg_origin.alt = elevation as f64;
    let registration = crate::register::auto_register(&reg_origin);

    Ok(GeoScene {
        location: reg_origin,
        bbox,
        elevation_m: elevation,
        buildings,
        roads,
        tile_count: tile_list.len(),
        registration,
        last_updated: chrono::Utc::now().to_rfc3339(),
    })
}

/// Generate a text summary of the geo scene.
pub fn summarize(scene: &GeoScene) -> String {
    let building_count = scene.buildings.len();
    let road_count = scene.roads.len();
    let road_names: Vec<&str> = scene.roads.iter()
        .filter_map(|r| match r {
            OsmFeature::Road { name, .. } => name.as_deref(),
            _ => None,
        })
        .take(3)
        .collect();

    format!(
        "Location: {:.4}N, {:.4}W, elevation {:.0}m ASL. \
         {} buildings within view. {} roads nearby{}. \
         {} satellite tiles at zoom 16. Updated: {}.",
        scene.location.lat, scene.location.lon, scene.elevation_m,
        building_count, road_count,
        if road_names.is_empty() { String::new() }
        else { format!(" ({})", road_names.join(", ")) },
        scene.tile_count,
        &scene.last_updated[..10],
    )
}
@@ -0,0 +1,19 @@
//! wifi-densepose-geo — geospatial satellite integration for RuView.
//!
//! Provides: IP geolocation, satellite tile fetching (Sentinel-2),
//! SRTM elevation, OSM buildings/roads, coordinate transforms,
//! temporal change tracking, and brain memory integration.

pub mod types;
pub mod coord;
pub mod locate;
pub mod cache;
pub mod tiles;
pub mod terrain;
pub mod osm;
pub mod register;
pub mod fuse;
pub mod brain;
pub mod temporal;

pub use types::*;
@@ -0,0 +1,40 @@
//! IP geolocation — determine location from public IP.

use crate::types::GeoPoint;
use anyhow::Result;

/// Locate by IP address (free, no API key).
pub async fn locate_by_ip() -> Result<GeoPoint> {
    let client = reqwest::Client::builder()
        .timeout(std::time::Duration::from_secs(5))
        .build()?;

    // Primary: ip-api.com (free, 45 req/min)
    let resp: serde_json::Value = client
        .get("http://ip-api.com/json/?fields=lat,lon,city,regionName,country")
        .send().await?
        .json().await?;

    let lat = resp.get("lat").and_then(|v| v.as_f64()).unwrap_or(0.0);
    let lon = resp.get("lon").and_then(|v| v.as_f64()).unwrap_or(0.0);

    if lat == 0.0 && lon == 0.0 {
        anyhow::bail!("IP geolocation returned (0,0)");
    }

    Ok(GeoPoint { lat, lon, alt: 0.0 })
}

/// Get location with caching.
pub async fn get_location(cache_path: &str) -> Result<GeoPoint> {
    // Check cache
    if let Ok(data) = std::fs::read_to_string(cache_path) {
        if let Ok(point) = serde_json::from_str::<GeoPoint>(&data) {
            return Ok(point);
        }
    }

    let point = locate_by_ip().await?;
    let _ = std::fs::write(cache_path, serde_json::to_string(&point)?);
    Ok(point)
}
rust-port/wifi-densepose-rs/crates/wifi-densepose-geo/src/osm.rs (new file, 216 lines)
@@ -0,0 +1,216 @@
//! OpenStreetMap data via Overpass API — buildings, roads, land use.

use crate::types::{GeoBBox, GeoPoint, OsmFeature};
use anyhow::{anyhow, Result};

const OVERPASS_URL: &str = "https://overpass-api.de/api/interpreter";

/// Maximum radius (in metres) accepted by the OSM fetchers. Requests larger
/// than this would produce Overpass queries covering hundreds of square
/// kilometres — which hammers the public endpoint and returns unworkably
/// large response payloads. Callers wanting wider areas must tile the queries.
pub const MAX_RADIUS_M: f64 = 5000.0;

fn check_radius(radius_m: f64) -> Result<()> {
    if !radius_m.is_finite() || radius_m <= 0.0 {
        return Err(anyhow!("radius_m must be positive and finite (got {radius_m})"));
    }
    if radius_m > MAX_RADIUS_M {
        return Err(anyhow!(
            "radius_m {radius_m} exceeds MAX_RADIUS_M ({MAX_RADIUS_M}); \
             tile the query into smaller chunks"
        ));
    }
    Ok(())
}

/// Fetch buildings within radius of a point.
///
/// Uses an inclusive `["building"]` filter that matches all building values
/// (residential, commercial, yes, etc.) and also queries relations for
/// multipolygon buildings. Default recommended radius: 500 m. Max 5000 m.
pub async fn fetch_buildings(center: &GeoPoint, radius_m: f64) -> Result<Vec<OsmFeature>> {
    check_radius(radius_m)?;
    let bbox = GeoBBox::from_center(center, radius_m);
    let query = format!(
        r#"[out:json][timeout:25];(way["building"]({},{},{},{});relation["building"]({},{},{},{}););out body;>;out skel qt;"#,
        bbox.south, bbox.west, bbox.north, bbox.east,
        bbox.south, bbox.west, bbox.north, bbox.east,
    );
    let resp = overpass_query(&query).await?;
    parse_buildings(&resp)
}

/// Fetch roads within radius. Max 5000 m; returns an error otherwise.
pub async fn fetch_roads(center: &GeoPoint, radius_m: f64) -> Result<Vec<OsmFeature>> {
    check_radius(radius_m)?;
    let bbox = GeoBBox::from_center(center, radius_m);
    let query = format!(
        r#"[out:json][timeout:10];way["highway"]({},{},{},{});out body;>;out skel qt;"#,
        bbox.south, bbox.west, bbox.north, bbox.east
    );
    let resp = overpass_query(&query).await?;
    parse_roads(&resp)
}

async fn overpass_query(query: &str) -> Result<serde_json::Value> {
    let client = reqwest::Client::builder()
        .timeout(std::time::Duration::from_secs(15))
        .user_agent("RuView/0.1")
        .build()?;

    let resp = client.post(OVERPASS_URL)
        .form(&[("data", query)])
        .send().await?;

    if !resp.status().is_success() {
        anyhow::bail!("Overpass API error: {}", resp.status());
    }
    Ok(resp.json().await?)
}

/// Parse an Overpass JSON response into building features.
///
/// Returns an error if the response is not a JSON object or is missing the
/// top-level `elements` array (indicative of a malformed/non-Overpass payload).
pub fn parse_overpass_json(data: &serde_json::Value) -> Result<Vec<OsmFeature>> {
    if !data.is_object() || data.get("elements").and_then(|e| e.as_array()).is_none() {
        return Err(anyhow!("malformed Overpass response: missing `elements` array"));
    }
    parse_buildings(data)
}

pub(crate) fn parse_buildings(data: &serde_json::Value) -> Result<Vec<OsmFeature>> {
    let mut buildings = Vec::new();
    let mut nodes: std::collections::HashMap<u64, [f64; 2]> = std::collections::HashMap::new();

    let elements = data.get("elements").and_then(|e| e.as_array()).cloned().unwrap_or_default();

    // First pass: collect nodes
    for el in &elements {
        if el.get("type").and_then(|t| t.as_str()) == Some("node") {
            if let (Some(id), Some(lat), Some(lon)) = (
                el.get("id").and_then(|v| v.as_u64()),
                el.get("lat").and_then(|v| v.as_f64()),
                el.get("lon").and_then(|v| v.as_f64()),
            ) {
                nodes.insert(id, [lat, lon]);
            }
        }
    }

    // Second pass: build ways
    for el in &elements {
        if el.get("type").and_then(|t| t.as_str()) != Some("way") { continue; }
        let tags = el.get("tags").cloned().unwrap_or(serde_json::json!({}));
        if tags.get("building").is_none() { continue; }

        let node_ids = el.get("nodes").and_then(|n| n.as_array()).cloned().unwrap_or_default();
        let outline: Vec<[f64; 2]> = node_ids.iter()
            .filter_map(|id| id.as_u64().and_then(|id| nodes.get(&id).copied()))
            .collect();

        if outline.len() < 3 { continue; }

        let height = tags.get("height").and_then(|h| h.as_str())
            .and_then(|s| s.trim_end_matches('m').trim().parse::<f32>().ok())
            .or(Some(8.0)); // default building height

        let name = tags.get("name").and_then(|n| n.as_str()).map(|s| s.to_string());

        buildings.push(OsmFeature::Building { outline, height, name });
    }

    Ok(buildings)
}

fn parse_roads(data: &serde_json::Value) -> Result<Vec<OsmFeature>> {
    let mut roads = Vec::new();
    let mut nodes: std::collections::HashMap<u64, [f64; 2]> = std::collections::HashMap::new();

    let elements = data.get("elements").and_then(|e| e.as_array()).cloned().unwrap_or_default();

    for el in &elements {
        if el.get("type").and_then(|t| t.as_str()) == Some("node") {
            if let (Some(id), Some(lat), Some(lon)) = (
                el.get("id").and_then(|v| v.as_u64()),
                el.get("lat").and_then(|v| v.as_f64()),
                el.get("lon").and_then(|v| v.as_f64()),
            ) {
                nodes.insert(id, [lat, lon]);
            }
        }
    }

    for el in &elements {
        if el.get("type").and_then(|t| t.as_str()) != Some("way") { continue; }
        let tags = el.get("tags").cloned().unwrap_or(serde_json::json!({}));
        let highway = tags.get("highway").and_then(|h| h.as_str());
        if highway.is_none() { continue; }

        let node_ids = el.get("nodes").and_then(|n| n.as_array()).cloned().unwrap_or_default();
        let path: Vec<[f64; 2]> = node_ids.iter()
            .filter_map(|id| id.as_u64().and_then(|id| nodes.get(&id).copied()))
            .collect();

        if path.len() < 2 { continue; }

        let name = tags.get("name").and_then(|n| n.as_str()).map(|s| s.to_string());

        roads.push(OsmFeature::Road {
            path,
            road_type: highway.unwrap_or("unknown").to_string(),
            name,
        });
    }

    Ok(roads)
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn parse_overpass_json_accepts_minimal_fixture() {
        // Minimal fixture: three nodes forming a triangular building.
        let j = serde_json::json!({
            "elements": [
                { "type": "node", "id": 1, "lat": 43.0, "lon": -79.0 },
                { "type": "node", "id": 2, "lat": 43.0001, "lon": -79.0 },
                { "type": "node", "id": 3, "lat": 43.0, "lon": -79.0001 },
                {
                    "type": "way", "id": 100,
                    "nodes": [1, 2, 3, 1],
                    "tags": { "building": "yes", "name": "Test Hall" }
                }
            ]
        });
        let features = parse_overpass_json(&j).expect("minimal payload should parse");
        assert_eq!(features.len(), 1);
        match &features[0] {
            OsmFeature::Building { outline, name, .. } => {
                assert_eq!(outline.len(), 4);
                assert_eq!(name.as_deref(), Some("Test Hall"));
            }
            _ => panic!("expected a Building"),
        }
    }

    #[test]
    fn parse_overpass_json_rejects_malformed() {
        // Missing the `elements` array entirely.
        let j = serde_json::json!({ "version": 0.6 });
        assert!(parse_overpass_json(&j).is_err());
        // Not even an object.
        let arr = serde_json::json!([1, 2, 3]);
        assert!(parse_overpass_json(&arr).is_err());
    }

    #[tokio::test]
    async fn fetch_buildings_rejects_oversized_radius() {
        let center = GeoPoint { lat: 43.0, lon: -79.0, alt: 0.0 };
        let err = fetch_buildings(&center, MAX_RADIUS_M + 1.0).await.err();
        assert!(err.is_some(), "should reject radius > MAX_RADIUS_M");
    }
}
@@ -0,0 +1,41 @@
//! Geo-registration — maps local sensor coordinates to WGS84.

use crate::coord;
use crate::types::{GeoPoint, GeoRegistration};

/// Auto-register using IP location (sensor at IP location, facing north).
pub fn auto_register(ip_location: &GeoPoint) -> GeoRegistration {
    GeoRegistration {
        origin: ip_location.clone(),
        heading_deg: 0.0,
        scale: 1.0,
    }
}

/// Transform local point [x, y, z] to WGS84.
pub fn local_to_wgs84(reg: &GeoRegistration, local: &[f32; 3]) -> GeoPoint {
    let heading_rad = reg.heading_deg.to_radians();
    let cos_h = heading_rad.cos();
    let sin_h = heading_rad.sin();

    // Rotate local by heading (local X → East when heading=0)
    let east = (local[0] as f64 * cos_h - local[2] as f64 * sin_h) * reg.scale;
    let north = (local[0] as f64 * sin_h + local[2] as f64 * cos_h) * reg.scale;
    let up = local[1] as f64 * reg.scale;

    coord::enu_to_wgs84(&[east, north, up], &reg.origin)
}

/// Transform WGS84 to local point.
pub fn wgs84_to_local(reg: &GeoRegistration, geo: &GeoPoint) -> [f32; 3] {
    let enu = coord::wgs84_to_enu(geo, &reg.origin);
    let heading_rad = (-reg.heading_deg).to_radians();
    let cos_h = heading_rad.cos();
    let sin_h = heading_rad.sin();

    let x = ((enu[0] * cos_h - enu[1] * sin_h) / reg.scale) as f32;
    let z = ((enu[0] * sin_h + enu[1] * cos_h) / reg.scale) as f32;
    let y = (enu[2] / reg.scale) as f32;

    [x, y, z]
}
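Reviewer note: the heading rotation in `register.rs` and its inverse should round-trip exactly. A minimal sketch of just the 2D rotation, simplified to `f64` and stripped of the scale factor (these helper names and signatures are illustrative, not the crate's):

```rust
// Forward: local [x, y, z] -> ENU [east, north, up] via a rotation by
// heading_deg about the vertical axis (local X maps to East at heading 0).
fn local_to_enu(local: &[f64; 3], heading_deg: f64) -> [f64; 3] {
    let (s, c) = heading_deg.to_radians().sin_cos();
    [local[0] * c - local[2] * s, local[0] * s + local[2] * c, local[1]]
}

// Inverse: rotate by -heading_deg, mapping ENU back to local [x, y, z].
fn enu_to_local(enu: &[f64; 3], heading_deg: f64) -> [f64; 3] {
    let (s, c) = (-heading_deg).to_radians().sin_cos();
    [enu[0] * c - enu[1] * s, enu[2], enu[0] * s + enu[1] * c]
}

fn main() {
    let local = [3.0, 1.5, -2.0];
    let enu = local_to_enu(&local, 37.0);
    let back = enu_to_local(&enu, 37.0);
    println!("{:?} -> {:?} -> {:?}", local, enu, back);
}
```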
@@ -0,0 +1,312 @@
//! Temporal change tracking — detect changes in satellite/OSM/weather over time.

use crate::cache::TileCache;
use crate::types::GeoPoint;
#[allow(unused_imports)]
use crate::types::GeoScene;
use anyhow::Result;

/// Fetch current weather (Open-Meteo, free, no key).
pub async fn fetch_weather(point: &GeoPoint) -> Result<WeatherData> {
    let url = format!(
        "https://api.open-meteo.com/v1/forecast?latitude={:.4}&longitude={:.4}&current=temperature_2m,relative_humidity_2m,wind_speed_10m,weather_code",
        point.lat, point.lon
    );

    let client = reqwest::Client::builder()
        .timeout(std::time::Duration::from_secs(10))
        .build()?;

    let resp: serde_json::Value = client.get(&url).send().await?.json().await?;
    let current = resp.get("current").cloned().unwrap_or(serde_json::json!({}));

    Ok(WeatherData {
        temperature_c: current.get("temperature_2m").and_then(|v| v.as_f64()).unwrap_or(0.0) as f32,
        humidity_pct: current.get("relative_humidity_2m").and_then(|v| v.as_f64()).unwrap_or(0.0) as f32,
        wind_speed_ms: current.get("wind_speed_10m").and_then(|v| v.as_f64()).unwrap_or(0.0) as f32,
        weather_code: current.get("weather_code").and_then(|v| v.as_u64()).unwrap_or(0) as u16,
    })
}

/// Check for OSM changes since last fetch.
pub async fn check_osm_changes(scene: &GeoScene, cache: &TileCache) -> Result<Vec<String>> {
    let mut changes = Vec::new();

    let cache_key = "osm_building_count";
    let prev_count: usize = cache.get(cache_key)
        .and_then(|d| String::from_utf8(d).ok())
        .and_then(|s| s.trim().parse().ok())
        .unwrap_or(0);

    let current_count = scene.buildings.len();
    if prev_count > 0 && current_count != prev_count {
        let diff = current_count as i64 - prev_count as i64;
        changes.push(format!("Building count changed: {} → {} ({:+})", prev_count, current_count, diff));
    }

    cache.put(cache_key, current_count.to_string().as_bytes())?;
    Ok(changes)
}

/// Generate temporal summary for brain storage.
pub fn temporal_summary(weather: &WeatherData, changes: &[String]) -> String {
    let weather_desc = match weather.weather_code {
        0 => "clear sky",
        1..=3 => "partly cloudy",
        45 | 48 => "foggy",
        51..=57 => "drizzle",
        61..=67 => "rain",
        71..=77 => "snow",
        80..=82 => "showers",
        95..=99 => "thunderstorm",
        _ => "unknown",
    };

    let mut summary = format!(
        "Weather: {:.0}°C, {weather_desc}, humidity {:.0}%, wind {:.1}m/s.",
        weather.temperature_c, weather.humidity_pct, weather.wind_speed_ms,
    );

    for change in changes {
        summary.push_str(&format!(" Change: {change}."));
    }

    summary
}

#[derive(Clone, Debug, serde::Serialize, serde::Deserialize)]
pub struct WeatherData {
    pub temperature_c: f32,
    pub humidity_pct: f32,
    pub wind_speed_ms: f32,
    pub weather_code: u16,
}

// ---------------------------------------------------------------------------
// Satellite tile change detection
// ---------------------------------------------------------------------------

/// Result of comparing two tile snapshots.
#[derive(Clone, Debug, serde::Serialize, serde::Deserialize)]
pub struct TileChangeResult {
    /// 0.0 = identical, 1.0 = completely different.
    pub diff_score: f64,
    /// Number of pixels that changed.
    pub changed_pixels: usize,
    /// Total pixels compared.
    pub total_pixels: usize,
}

/// Compare a newly-fetched tile against its previously-cached version.
///
/// Returns a `TileChangeResult` with a diff score between 0.0 (identical) and
/// 1.0 (completely different). When the diff exceeds 0.1 the function stores
/// a change event as a brain memory via the local ruOS brain endpoint.
pub async fn detect_tile_changes(
    cache_key: &str,
    new_data: &[u8],
    cache: &TileCache,
) -> Result<TileChangeResult> {
    let previous = cache.get(cache_key);

    let result = match previous {
        Some(ref old_data) => {
            let total = old_data.len().max(new_data.len()).max(1);
            let comparable = old_data.len().min(new_data.len());
            let mut changed: usize = 0;
            for i in 0..comparable {
                if old_data[i] != new_data[i] {
                    changed += 1;
                }
            }
            // Any extra bytes in the longer slice count as changed.
|
||||
changed += total - comparable;
|
||||
|
||||
TileChangeResult {
|
||||
diff_score: changed as f64 / total as f64,
|
||||
changed_pixels: changed,
|
||||
total_pixels: total,
|
||||
}
|
||||
}
|
||||
None => {
|
||||
// No previous data — treat as fully new (score 1.0).
|
||||
TileChangeResult {
|
||||
diff_score: 1.0,
|
||||
changed_pixels: new_data.len(),
|
||||
total_pixels: new_data.len().max(1),
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
// Persist new snapshot into cache for future comparisons.
|
||||
cache.put(cache_key, new_data)?;
|
||||
|
||||
// When significant change is detected, store a brain memory.
|
||||
if result.diff_score > 0.1 {
|
||||
let _ = store_change_event(cache_key, &result).await;
|
||||
}
|
||||
|
||||
Ok(result)
|
||||
}
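As a standalone illustration of the scoring rule above, the byte-level diff can be factored into a pure function. This is a sketch, not part of the crate; `diff_score` is a hypothetical helper mirroring `detect_tile_changes`.

```rust
// Illustrative sketch (not in the crate): the diff score computed by
// detect_tile_changes, as a pure function over two byte slices.
fn diff_score(old: &[u8], new: &[u8]) -> f64 {
    let total = old.len().max(new.len()).max(1);
    let comparable = old.len().min(new.len());
    let changed = (0..comparable).filter(|&i| old[i] != new[i]).count()
        // Extra bytes in the longer slice count as changed.
        + (total - comparable);
    changed as f64 / total as f64
}

fn main() {
    assert_eq!(diff_score(&[1, 2, 3], &[1, 2, 3]), 0.0); // identical
    assert_eq!(diff_score(&[0, 0], &[0, 1]), 0.5);       // one of two bytes
    assert_eq!(diff_score(&[1], &[1, 9]), 0.5);          // length mismatch
}
```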

/// Post a change event to the local ruOS brain.
///
/// The brain URL honours `RUVIEW_BRAIN_URL` via [`crate::brain::brain_url`].
async fn store_change_event(cache_key: &str, result: &TileChangeResult) -> Result<()> {
    let client = reqwest::Client::builder()
        .timeout(std::time::Duration::from_secs(5))
        .build()?;

    let body = serde_json::json!({
        "category": "spatial-change",
        "content": format!(
            "Tile change detected for {cache_key}: diff={:.3}, changed={}/{}",
            result.diff_score, result.changed_pixels, result.total_pixels,
        ),
    });

    client
        .post(format!("{}/memories", crate::brain::brain_url()))
        .json(&body)
        .send()
        .await?;

    Ok(())
}

// ---------------------------------------------------------------------------
// Night mode detection
// ---------------------------------------------------------------------------

/// Approximate check whether the current time is "night" at a given latitude.
///
/// Uses a simplified sunrise/sunset model based on the solar declination and
/// hour angle. When it is night the system should rely on CSI data only
/// (satellite imagery is not useful in darkness).
pub fn is_night(lat_deg: f64) -> bool {
    let now = chrono::Utc::now();
    is_night_at(lat_deg, now)
}

/// Testable version of [`is_night`] that accepts an explicit timestamp.
pub fn is_night_at(lat_deg: f64, utc: chrono::DateTime<chrono::Utc>) -> bool {
    use chrono::Datelike;
    use std::f64::consts::PI;

    let day_of_year = utc.ordinal() as f64;
    let secs_into_day = utc.timestamp().rem_euclid(86400);
    let solar_hour = (secs_into_day as f64) / 3600.0; // 0..24

    // Solar declination (Spencer, 1971 — simplified)
    let gamma = 2.0 * PI * (day_of_year - 1.0) / 365.0;
    let decl = 0.006918
        - 0.399912 * gamma.cos()
        + 0.070257 * gamma.sin()
        - 0.006758 * (2.0 * gamma).cos()
        + 0.000907 * (2.0 * gamma).sin();

    let lat_rad = lat_deg.to_radians();

    // Cosine of the hour angle at sunrise/sunset (geometric, no refraction)
    let cos_ha = -(lat_rad.tan() * decl.tan());

    // Polar day / polar night
    if cos_ha < -1.0 {
        return false; // midnight sun — never night
    }
    if cos_ha > 1.0 {
        return true; // polar night — always night
    }

    let ha_sunrise = cos_ha.acos(); // radians, symmetric about solar noon
    let daylight_hours = 2.0 * ha_sunrise * 12.0 / PI;
    let solar_noon = 12.0; // approximation (ignores longitude offset)
    let sunrise = solar_noon - daylight_hours / 2.0;
    let sunset = solar_noon + daylight_hours / 2.0;

    solar_hour < sunrise || solar_hour > sunset
}
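The day-length geometry behind `is_night_at` can be shown in isolation. This is an assumed sketch mirroring the code above (`daylight_hours` is a hypothetical helper, not a crate function); the clamp stands in for the polar day/night branches.

```rust
// Sketch: daylight = 2 * acos(-tan(lat) * tan(decl)) * 12 / PI hours,
// with acos's argument clamped so polar day/night fall out naturally.
use std::f64::consts::PI;

fn daylight_hours(lat_deg: f64, decl_rad: f64) -> f64 {
    let cos_ha = -(lat_deg.to_radians().tan() * decl_rad.tan());
    2.0 * cos_ha.clamp(-1.0, 1.0).acos() * 12.0 / PI
}

fn main() {
    // At the equator every day is ~12 hours, in any season.
    assert!((daylight_hours(0.0, 0.409) - 12.0).abs() < 1e-9);
    // Above the Arctic Circle at summer-solstice declination (~0.409 rad):
    // 24 hours of daylight, i.e. midnight sun.
    assert!((daylight_hours(70.0, 0.409) - 24.0).abs() < 1e-9);
}
```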

// ---------------------------------------------------------------------------
// Tests
// ---------------------------------------------------------------------------

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_is_night_at_equator_noon() {
        // Noon UTC at the equator on March 20 — should be daytime.
        let dt = chrono::NaiveDate::from_ymd_opt(2025, 3, 20)
            .unwrap()
            .and_hms_opt(12, 0, 0)
            .unwrap()
            .and_utc();
        assert!(!is_night_at(0.0, dt));
    }

    #[test]
    fn test_is_night_at_equator_midnight() {
        // 02:00 UTC at the equator, well before sunrise: should be night.
        let dt = chrono::NaiveDate::from_ymd_opt(2025, 3, 20)
            .unwrap()
            .and_hms_opt(2, 0, 0)
            .unwrap()
            .and_utc();
        assert!(is_night_at(0.0, dt));
    }

    #[test]
    fn test_midnight_sun_arctic() {
        // Late June at 70°N — midnight sun, never night.
        let dt = chrono::NaiveDate::from_ymd_opt(2025, 6, 21)
            .unwrap()
            .and_hms_opt(0, 0, 0)
            .unwrap()
            .and_utc();
        assert!(!is_night_at(70.0, dt));
    }

    #[test]
    fn test_polar_night_arctic() {
        // Late December at 80°N — polar night, always night.
        let dt = chrono::NaiveDate::from_ymd_opt(2025, 12, 21)
            .unwrap()
            .and_hms_opt(12, 0, 0)
            .unwrap()
            .and_utc();
        assert!(is_night_at(80.0, dt));
    }

    #[test]
    fn test_detect_tile_changes_identical() {
        let cache = TileCache::new("/tmp/ruview-test-tile-changes");
        let data = vec![1u8, 2, 3, 4, 5];
        // Prime the cache.
        cache.put("test_tile_ident", &data).unwrap();

        let rt = tokio::runtime::Builder::new_current_thread()
            .enable_all()
            .build()
            .unwrap();
        let result = rt.block_on(detect_tile_changes("test_tile_ident", &data, &cache)).unwrap();
        assert!((result.diff_score - 0.0).abs() < 1e-9);
        assert_eq!(result.changed_pixels, 0);
    }

    #[test]
    fn test_detect_tile_changes_fully_different() {
        let cache = TileCache::new("/tmp/ruview-test-tile-changes");
        let old = vec![0u8; 100];
        let new = vec![255u8; 100];
        cache.put("test_tile_diff", &old).unwrap();

        let rt = tokio::runtime::Builder::new_current_thread()
            .enable_all()
            .build()
            .unwrap();
        let result = rt.block_on(detect_tile_changes("test_tile_diff", &new, &cache)).unwrap();
        assert!((result.diff_score - 1.0).abs() < 1e-9);
    }
}

@@ -0,0 +1,110 @@
//! SRTM DEM parser — elevation data from NASA 1-arcsecond HGT files.

use crate::cache::TileCache;
use crate::types::{ElevationGrid, GeoPoint};
use anyhow::Result;

/// Download and parse the SRTM HGT tile covering a location.
pub async fn fetch_elevation(point: &GeoPoint, cache: &TileCache) -> Result<ElevationGrid> {
    let lat_int = point.lat.floor() as i32;
    let lon_int = point.lon.floor() as i32;
    let ns = if lat_int >= 0 { 'N' } else { 'S' };
    let ew = if lon_int >= 0 { 'E' } else { 'W' };
    let filename = format!("{}{:02}{}{:03}.hgt", ns, lat_int.unsigned_abs(), ew, lon_int.unsigned_abs());
    let cache_key = format!("srtm_{filename}");

    if let Some(data) = cache.get(&cache_key) {
        return parse_hgt(&data, lat_int as f64, lon_int as f64);
    }

    let client = reqwest::Client::builder()
        .timeout(std::time::Duration::from_secs(30))
        .build()?;

    // Primary: NASA SRTM public mirror (no auth required for .hgt)
    let nasa_url = format!(
        "https://e4ftl01.cr.usgs.gov/MEASURES/SRTMGL1.003/2000.02.11/{filename}"
    );

    if let Ok(resp) = client.get(&nasa_url).send().await {
        if resp.status().is_success() {
            let data = resp.bytes().await?.to_vec();
            cache.put(&cache_key, &data)?;
            return parse_hgt(&data, lat_int as f64, lon_int as f64);
        }
    }

    // Fallback: viewfinderpanoramas.org
    // Files are grouped by continent zip, but individual .hgt files can be
    // fetched directly when the server exposes them.
    let vfp_url = format!("http://viewfinderpanoramas.org/dem1/{filename}");

    if let Ok(resp) = client.get(&vfp_url).send().await {
        if resp.status().is_success() {
            let data = resp.bytes().await?.to_vec();
            cache.put(&cache_key, &data)?;
            return parse_hgt(&data, lat_int as f64, lon_int as f64);
        }
    }

    // Final fallback: flat terrain when all downloads fail
    Ok(ElevationGrid {
        origin_lat: lat_int as f64,
        origin_lon: lon_int as f64,
        cell_size_deg: 1.0 / 3600.0,
        cols: 100,
        rows: 100,
        heights: vec![0.0; 10_000],
    })
}

/// Parse SRTM HGT binary: a square grid of big-endian i16 samples
/// (3601x3601 for 1-arcsecond tiles).
pub fn parse_hgt(data: &[u8], origin_lat: f64, origin_lon: f64) -> Result<ElevationGrid> {
    let n_samples = data.len() / 2;
    let side = (n_samples as f64).sqrt() as usize;
    if side < 2 || side * side != n_samples {
        anyhow::bail!("invalid HGT payload: {} bytes is not a square i16 grid", data.len());
    }

    let heights: Vec<f32> = data.chunks_exact(2)
        .map(|c| {
            let v = i16::from_be_bytes([c[0], c[1]]);
            if v == -32768 { 0.0 } else { v as f32 } // -32768 = void
        })
        .collect();

    Ok(ElevationGrid {
        origin_lat,
        origin_lon,
        cell_size_deg: 1.0 / (side - 1) as f64,
        cols: side,
        rows: side,
        heights,
    })
}
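A standalone sketch of the sample decoding in `parse_hgt` (illustrative only, not crate code): each height is a big-endian `i16` in meters, with `-32768` marking a void cell.

```rust
// How two big-endian bytes become one SRTM height sample.
fn main() {
    // 0x012C big-endian = 300 m above sea level.
    assert_eq!(i16::from_be_bytes([0x01, 0x2C]), 300);

    // 0x8000 is the SRTM void value; parse_hgt maps it to 0.0.
    assert_eq!(i16::from_be_bytes([0x80, 0x00]), -32768);

    // A 1-arcsecond tile is 3601 x 3601 samples, i.e. ~25.9 MB of i16 data.
    assert_eq!(3601usize * 3601 * 2, 25_934_402);
}
```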

/// Get the elevation at a specific point from a grid.
pub fn elevation_at(grid: &ElevationGrid, point: &GeoPoint) -> f32 {
    grid.get(point.lat, point.lon).unwrap_or(0.0)
}

/// Extract a small subgrid around a point.
pub fn extract_subgrid(grid: &ElevationGrid, center: &GeoPoint, radius_m: f64) -> ElevationGrid {
    let radius_deg = radius_m / 111_320.0;
    let north_edge = grid.origin_lat + grid.rows as f64 * grid.cell_size_deg;
    let min_row = ((north_edge - center.lat - radius_deg) / grid.cell_size_deg).max(0.0) as usize;
    let max_row = ((north_edge - center.lat + radius_deg) / grid.cell_size_deg).min(grid.rows as f64) as usize;
    let min_col = ((center.lon - radius_deg - grid.origin_lon) / grid.cell_size_deg).max(0.0) as usize;
    let max_col = ((center.lon + radius_deg - grid.origin_lon) / grid.cell_size_deg).min(grid.cols as f64) as usize;

    let rows = max_row.saturating_sub(min_row);
    let cols = max_col.saturating_sub(min_col);
    let mut heights = Vec::with_capacity(rows * cols);
    for r in min_row..max_row {
        for c in min_col..max_col {
            heights.push(grid.heights.get(r * grid.cols + c).copied().unwrap_or(0.0));
        }
    }

    ElevationGrid {
        origin_lat: grid.origin_lat + (grid.rows - max_row) as f64 * grid.cell_size_deg,
        origin_lon: grid.origin_lon + min_col as f64 * grid.cell_size_deg,
        cell_size_deg: grid.cell_size_deg,
        cols,
        rows,
        heights,
    }
}

@@ -0,0 +1,80 @@
//! Satellite tile fetcher — XYZ/TMS tile download with caching.

use crate::cache::TileCache;
use crate::coord;
use crate::types::{GeoBBox, RasterTile, TileCoord};
use anyhow::Result;

/// Tile provider (all free, no API keys).
pub enum TileProvider {
    /// Sentinel-2 cloudless mosaic (EOX, 10m, CC-BY-4.0)
    Sentinel2Cloudless,
    /// ESRI World Imagery (sub-meter, free tier)
    EsriWorldImagery,
    /// OpenStreetMap (map tiles, not satellite)
    Osm,
}

impl TileProvider {
    pub fn url(&self, coord: &TileCoord) -> String {
        match self {
            // WMTS paths are z/row/col, i.e. z/y/x.
            Self::Sentinel2Cloudless => format!(
                "https://tiles.maps.eox.at/wmts/1.0.0/s2cloudless-2021_3857/default/g/{}/{}/{}.jpg",
                coord.z, coord.y, coord.x
            ),
            Self::EsriWorldImagery => format!(
                "https://server.arcgisonline.com/ArcGIS/rest/services/World_Imagery/MapServer/tile/{}/{}/{}",
                coord.z, coord.y, coord.x
            ),
            Self::Osm => format!(
                "https://tile.openstreetmap.org/{}/{}/{}.png",
                coord.z, coord.x, coord.y
            ),
        }
    }

    pub fn name(&self) -> &str {
        match self {
            Self::Sentinel2Cloudless => "sentinel2",
            Self::EsriWorldImagery => "esri",
            Self::Osm => "osm",
        }
    }
}

/// Fetch a single tile with caching.
pub async fn fetch_tile(provider: &TileProvider, coord: &TileCoord, cache: &TileCache) -> Result<RasterTile> {
    // Include the provider in the key so tiles from different sources don't collide.
    let cache_key = format!("tiles_{}_{}_{}_{}.dat", provider.name(), coord.z, coord.x, coord.y);

    if let Some(data) = cache.get(&cache_key) {
        return Ok(RasterTile { coord: coord.clone(), data, bounds: coord::tile_bounds(coord) });
    }

    let url = provider.url(coord);
    let client = reqwest::Client::builder()
        .timeout(std::time::Duration::from_secs(10))
        .user_agent("RuView/0.1 (https://github.com/ruvnet/RuView)")
        .build()?;

    let resp = client.get(&url).send().await?;
    if !resp.status().is_success() {
        anyhow::bail!("Tile fetch failed: {} → {}", url, resp.status());
    }
    let data = resp.bytes().await?.to_vec();
    cache.put(&cache_key, &data)?;

    Ok(RasterTile { coord: coord.clone(), data, bounds: coord::tile_bounds(coord) })
}

/// Fetch all tiles covering a bounding box.
pub async fn fetch_area(provider: &TileProvider, bbox: &GeoBBox, zoom: u8, cache: &TileCache) -> Result<Vec<RasterTile>> {
    let coords = coord::tiles_for_bbox(bbox, zoom);
    let mut tiles = Vec::with_capacity(coords.len());
    for c in &coords {
        match fetch_tile(provider, c, cache).await {
            Ok(t) => tiles.push(t),
            Err(e) => eprintln!("  Tile {}/{}/{} failed: {}", c.z, c.x, c.y, e),
        }
    }
    Ok(tiles)
}
@@ -0,0 +1,118 @@
//! Core geospatial types.

use serde::{Deserialize, Serialize};

/// WGS84 geographic coordinate.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct GeoPoint {
    pub lat: f64,
    pub lon: f64,
    pub alt: f64,
}

/// Axis-aligned bounding box in WGS84.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct GeoBBox {
    pub south: f64,
    pub west: f64,
    pub north: f64,
    pub east: f64,
}

impl GeoBBox {
    pub fn from_center(center: &GeoPoint, radius_m: f64) -> Self {
        let dlat = radius_m / 111_320.0;
        let dlon = radius_m / (111_320.0 * center.lat.to_radians().cos());
        Self {
            south: center.lat - dlat,
            west: center.lon - dlon,
            north: center.lat + dlat,
            east: center.lon + dlon,
        }
    }
}
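The meters-to-degrees conversion in `GeoBBox::from_center` can be checked in isolation. An illustrative sketch, not crate code: one degree of latitude is ~111,320 m everywhere, while a degree of longitude shrinks by cos(latitude).

```rust
// At 60°N, cos(lat) = 0.5, so a longitude degree is half as long and the
// bbox must be twice as wide (in degrees) as it is tall.
fn main() {
    let lat: f64 = 60.0;
    let radius_m = 1000.0;
    let dlat = radius_m / 111_320.0;
    let dlon = radius_m / (111_320.0 * lat.to_radians().cos());
    assert!((dlon / dlat - 2.0).abs() < 0.01);
}
```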

/// XYZ tile address.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct TileCoord {
    pub z: u8,
    pub x: u32,
    pub y: u32,
}

/// Satellite raster tile.
#[derive(Clone, Debug)]
pub struct RasterTile {
    pub coord: TileCoord,
    pub data: Vec<u8>,
    pub bounds: GeoBBox,
}

/// Elevation grid from SRTM DEM.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct ElevationGrid {
    pub origin_lat: f64,
    pub origin_lon: f64,
    pub cell_size_deg: f64,
    pub cols: usize,
    pub rows: usize,
    pub heights: Vec<f32>,
}

impl ElevationGrid {
    pub fn get(&self, lat: f64, lon: f64) -> Option<f32> {
        let row = ((self.origin_lat + (self.rows as f64 * self.cell_size_deg) - lat) / self.cell_size_deg) as usize;
        let col = ((lon - self.origin_lon) / self.cell_size_deg) as usize;
        if row < self.rows && col < self.cols {
            Some(self.heights[row * self.cols + col])
        } else {
            None
        }
    }
}
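A standalone sketch of `ElevationGrid`'s row indexing (illustrative values, not crate code): row 0 is the north edge, so rows are measured down from the top latitude `origin_lat + rows * cell_size_deg`.

```rust
fn main() {
    let rows = 3usize;
    let origin_lat = 43.0;
    let cell = 1.0; // 1-degree cells for illustration
    let top = origin_lat + rows as f64 * cell; // 46.0 = north edge

    // A point just south of the north edge lands in row 0.
    assert_eq!(((top - 45.9) / cell) as usize, 0);
    // A point just north of origin_lat lands in the last row.
    assert_eq!(((top - 43.1) / cell) as usize, 2);
}
```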

/// OpenStreetMap feature.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub enum OsmFeature {
    Building {
        outline: Vec<[f64; 2]>,
        height: Option<f32>,
        name: Option<String>,
    },
    Road {
        path: Vec<[f64; 2]>,
        road_type: String,
        name: Option<String>,
    },
}

/// Geo-registration transform.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct GeoRegistration {
    pub origin: GeoPoint,
    pub heading_deg: f64,
    pub scale: f64,
}

impl Default for GeoRegistration {
    fn default() -> Self {
        Self {
            origin: GeoPoint { lat: 0.0, lon: 0.0, alt: 0.0 },
            heading_deg: 0.0,
            scale: 1.0,
        }
    }
}

/// Complete geo scene.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct GeoScene {
    pub location: GeoPoint,
    pub bbox: GeoBBox,
    pub elevation_m: f32,
    pub buildings: Vec<OsmFeature>,
    pub roads: Vec<OsmFeature>,
    pub tile_count: usize,
    pub registration: GeoRegistration,
    pub last_updated: String,
}
@@ -0,0 +1,84 @@
use wifi_densepose_geo::*;
use wifi_densepose_geo::coord;

#[test]
fn test_haversine() {
    let toronto = GeoPoint { lat: 43.6532, lon: -79.3832, alt: 0.0 };
    let ottawa = GeoPoint { lat: 45.4215, lon: -75.6972, alt: 0.0 };
    let dist = coord::haversine(&toronto, &ottawa);
    assert!((dist - 353_000.0).abs() < 5_000.0, "Toronto-Ottawa ~353km, got {:.0}m", dist);
}

#[test]
fn test_wgs84_to_enu() {
    let origin = GeoPoint { lat: 43.0, lon: -79.0, alt: 100.0 };
    let point = GeoPoint { lat: 43.001, lon: -79.0, alt: 100.0 };
    let enu = coord::wgs84_to_enu(&point, &origin);
    assert!((enu[1] - 111.0).abs() < 5.0, "0.001 deg lat ~111m north, got {:.1}m", enu[1]);
    assert!(enu[0].abs() < 1.0, "same longitude should have ~0 east, got {:.1}m", enu[0]);
}

#[test]
fn test_enu_roundtrip() {
    let origin = GeoPoint { lat: 43.6532, lon: -79.3832, alt: 76.0 };
    let local = [100.0, 200.0, 5.0]; // 100m east, 200m north, 5m up
    let geo = coord::enu_to_wgs84(&local, &origin);
    let back = coord::wgs84_to_enu(&geo, &origin);
    assert!((back[0] - local[0]).abs() < 0.01);
    assert!((back[1] - local[1]).abs() < 0.01);
    assert!((back[2] - local[2]).abs() < 0.01);
}

#[test]
fn test_tile_coords() {
    let tile = coord::wgs84_to_tile(43.6532, -79.3832, 16);
    assert!(tile.x > 0 && tile.y > 0);
    assert_eq!(tile.z, 16);
    let bounds = coord::tile_bounds(&tile);
    assert!(bounds.south < 43.66 && bounds.north > 43.64);
}

#[test]
fn test_tiles_for_bbox() {
    let bbox = GeoBBox::from_center(
        &GeoPoint { lat: 43.6532, lon: -79.3832, alt: 0.0 },
        500.0,
    );
    let tiles = coord::tiles_for_bbox(&bbox, 16);
    assert!(tiles.len() >= 4 && tiles.len() <= 25, "500m radius should need 4-25 tiles, got {}", tiles.len());
}

#[test]
fn test_geo_bbox_from_center() {
    let center = GeoPoint { lat: 43.0, lon: -79.0, alt: 0.0 };
    let bbox = GeoBBox::from_center(&center, 1000.0);
    assert!(bbox.south < 43.0 && bbox.north > 43.0);
    assert!(bbox.west < -79.0 && bbox.east > -79.0);
}

#[test]
fn test_hgt_parse() {
    // Create minimal 3x3 HGT data (big-endian i16)
    let mut data = Vec::new();
    for h in [100i16, 110, 120, 105, 115, 125, 110, 120, 130] {
        data.extend_from_slice(&h.to_be_bytes());
    }
    let grid = wifi_densepose_geo::terrain::parse_hgt(&data, 43.0, -79.0).unwrap();
    assert_eq!(grid.heights[0], 100.0);
    assert_eq!(grid.heights[4], 115.0);
}

#[test]
fn test_registration() {
    let origin = GeoPoint { lat: 43.6532, lon: -79.3832, alt: 76.0 };
    let reg = wifi_densepose_geo::register::auto_register(&origin);

    let local = [10.0f32, 0.0, 20.0]; // 10m east, 20m forward
    let geo = wifi_densepose_geo::register::local_to_wgs84(&reg, &local);
    assert!((geo.lat - origin.lat).abs() < 0.001);
    assert!((geo.lon - origin.lon).abs() < 0.001);

    let back = wifi_densepose_geo::register::wgs84_to_local(&reg, &geo);
    assert!((back[0] - local[0]).abs() < 0.1);
    assert!((back[2] - local[2]).abs() < 0.1);
}

@@ -0,0 +1,20 @@
[package]
name = "wifi-densepose-pointcloud"
version = "0.1.0"
edition = "2021"
description = "Real-time dense point cloud from camera depth + WiFi CSI tomography"

[[bin]]
name = "ruview-pointcloud"
path = "src/main.rs"

[dependencies]
serde = { workspace = true }
serde_json = { workspace = true }
tokio = { workspace = true }
anyhow = { workspace = true }
axum = { workspace = true }
clap = { version = "4", features = ["derive"] }
chrono = "0.4"
dirs = "5"
reqwest = { version = "0.12", features = ["json"], default-features = false }

@@ -0,0 +1,92 @@
//! Brain bridge — sends spatial observations to the ruOS brain.
//!
//! Periodically summarizes the sensor pipeline state and stores it
//! as brain memories for the agent to reason about.
//!
//! The brain URL is read from the `RUVIEW_BRAIN_URL` env var on first use,
//! defaulting to `http://127.0.0.1:9876`.

use crate::csi_pipeline::PipelineOutput;
use anyhow::Result;
use std::sync::OnceLock;

/// Default brain URL if `RUVIEW_BRAIN_URL` is not set.
const DEFAULT_BRAIN_URL: &str = "http://127.0.0.1:9876";

fn brain_url() -> &'static str {
    static BRAIN_URL: OnceLock<String> = OnceLock::new();
    BRAIN_URL.get_or_init(|| {
        let url = std::env::var("RUVIEW_BRAIN_URL")
            .unwrap_or_else(|_| DEFAULT_BRAIN_URL.to_string());
        eprintln!(" brain_bridge: using brain URL {url}");
        url
    })
}

/// Store a spatial observation in the brain.
async fn store_memory(category: &str, content: &str) -> Result<()> {
    let client = reqwest::Client::builder()
        .timeout(std::time::Duration::from_secs(5))
        .build()?;

    let body = serde_json::json!({
        "category": category,
        "content": content,
    });

    client.post(format!("{}/memories", brain_url()))
        .json(&body)
        .send()
        .await?;
    Ok(())
}

/// Summarize the pipeline state and store it in the brain (called every 60 seconds).
pub async fn sync_to_brain(pipeline: &PipelineOutput, camera_frames: u64) {
    // Only store if there's meaningful data
    if pipeline.total_frames < 10 && camera_frames < 5 {
        return;
    }

    // Store a spatial summary
    let motion_str = if pipeline.motion_detected { "detected" } else { "absent" };
    let skeleton_str = if let Some(ref sk) = pipeline.skeleton {
        format!("{} keypoints ({:.0}% conf)", sk.keypoints.len(), sk.confidence * 100.0)
    } else {
        "inactive".to_string()
    };

    let summary = format!(
        "Room scan: {} camera frames, {} CSI frames from {} nodes. \
         Motion {} ({:.0}%). Breathing {:.0} BPM. Skeleton: {}. \
         Occupancy grid {}x{}x{} with {} occupied voxels.",
        camera_frames,
        pipeline.total_frames,
        pipeline.num_nodes,
        motion_str,
        pipeline.vitals.motion_score * 100.0,
        pipeline.vitals.breathing_rate,
        skeleton_str,
        pipeline.occupancy_dims.0,
        pipeline.occupancy_dims.1,
        pipeline.occupancy_dims.2,
        pipeline.occupancy.iter().filter(|&&d| d > 0.3).count(),
    );

    let _ = store_memory("spatial-observation", &summary).await;

    // Store motion events
    if pipeline.motion_detected && pipeline.vitals.motion_score > 0.3 {
        let _ = store_memory(
            "spatial-motion",
            &format!(
                "Strong motion detected: {:.0}% score, {} CSI frames",
                pipeline.vitals.motion_score * 100.0, pipeline.total_frames
            ),
        ).await;
    }

    // Store vital signs if they are in a plausible range
    if pipeline.vitals.breathing_rate > 5.0 && pipeline.vitals.breathing_rate < 35.0 {
        let _ = store_memory(
            "spatial-vitals",
            &format!(
                "Vital signs: breathing {:.0} BPM, motion {:.0}%",
                pipeline.vitals.breathing_rate, pipeline.vitals.motion_score * 100.0
            ),
        ).await;
    }
}
@@ -0,0 +1,212 @@
//! Camera capture — cross-platform frame grabber.
//!
//! macOS: AVFoundation (via a spawned `swift` snippet) or `ffmpeg -f avfoundation`
//! Linux: `v4l2-ctl` or `ffmpeg -f v4l2`
//! Both: capture a frame, decode to RGB, return raw pixel data

use anyhow::{bail, Result};
use std::path::PathBuf;
use std::process::Command;

/// Captured frame with raw RGB data.
pub struct Frame {
    pub width: u32,
    pub height: u32,
    pub rgb: Vec<u8>, // row-major [height * width * 3]
}

/// Camera source configuration.
pub struct CameraConfig {
    pub device_index: u32,
    pub width: u32,
    pub height: u32,
    pub fps: u32,
}

impl Default for CameraConfig {
    fn default() -> Self {
        Self { device_index: 0, width: 640, height: 480, fps: 15 }
    }
}

/// Capture a single frame from the camera.
///
/// Tries multiple backends in order: ffmpeg first, then v4l2-ctl (Linux)
/// or AVFoundation via swift (macOS).
pub fn capture_frame(config: &CameraConfig) -> Result<Frame> {
    let tmp = tmp_path();

    // Try ffmpeg first (cross-platform)
    if let Ok(frame) = capture_ffmpeg(config, &tmp) {
        return Ok(frame);
    }

    // Linux: try v4l2
    #[cfg(target_os = "linux")]
    if let Ok(frame) = capture_v4l2(config, &tmp) {
        return Ok(frame);
    }

    // macOS: try AVFoundation via a spawned swift snippet
    #[cfg(target_os = "macos")]
    if let Ok(frame) = capture_macos(config, &tmp) {
        return Ok(frame);
    }

    bail!("No camera backend available. Install ffmpeg or run on a machine with a camera.")
}

/// Capture via ffmpeg (works on Linux + macOS).
fn capture_ffmpeg(config: &CameraConfig, tmp: &PathBuf) -> Result<Frame> {
    let input = if cfg!(target_os = "macos") {
        format!("{}:none", config.device_index) // avfoundation: video:audio
    } else {
        format!("/dev/video{}", config.device_index) // v4l2
    };

    let format = if cfg!(target_os = "macos") { "avfoundation" } else { "v4l2" };

    let output = Command::new("ffmpeg")
        .args([
            "-y", "-f", format,
            "-video_size", &format!("{}x{}", config.width, config.height),
            "-framerate", &config.fps.to_string(),
            "-i", &input,
            "-frames:v", "1",
            "-f", "rawvideo",
            "-pix_fmt", "rgb24",
            tmp.to_str().unwrap_or("/tmp/ruview-frame.raw"),
        ])
        .output()?;

    if !output.status.success() {
        bail!("ffmpeg capture failed: {}", String::from_utf8_lossy(&output.stderr));
    }

    let rgb = std::fs::read(tmp)?;
    let expected = (config.width * config.height * 3) as usize;
    if rgb.len() < expected {
        bail!("frame too small: {} bytes, expected {}", rgb.len(), expected);
    }

    let _ = std::fs::remove_file(tmp);

    Ok(Frame {
        width: config.width,
        height: config.height,
        rgb: rgb[..expected].to_vec(),
    })
}
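The size check in `capture_ffmpeg` follows from the rgb24 layout; a standalone sanity sketch (illustrative, not crate code):

```rust
// A raw rgb24 frame is exactly width * height * 3 bytes, laid out row-major.
fn main() {
    let (w, h) = (640u32, 480u32);
    assert_eq!((w * h * 3) as usize, 921_600);

    // Byte offset of the red channel of pixel (x, y):
    let (x, y) = (10u32, 2u32);
    assert_eq!(((y * w + x) * 3) as usize, 3_870);
}
```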
|
||||
|
||||
/// Linux: capture via v4l2-ctl.
#[cfg(target_os = "linux")]
fn capture_v4l2(config: &CameraConfig, tmp: &PathBuf) -> Result<Frame> {
    let device = format!("/dev/video{}", config.device_index);
    if !std::path::Path::new(&device).exists() {
        bail!("no camera at {device}");
    }

    // Use v4l2-ctl to grab a single MJPEG frame
    let status = Command::new("v4l2-ctl")
        .args([
            "--device", &device,
            "--set-fmt-video", &format!("width={},height={},pixelformat=MJPG", config.width, config.height),
            "--stream-mmap", "--stream-count=1",
            "--stream-to", tmp.to_str().unwrap_or("/tmp/frame.mjpg"),
        ])
        .output()?;

    if !status.status.success() {
        bail!("v4l2-ctl failed");
    }

    // Decode MJPEG to RGB
    decode_jpeg_to_rgb(tmp, config.width, config.height)
}

/// macOS: capture via a small AVFoundation Swift snippet.
#[cfg(target_os = "macos")]
fn capture_macos(config: &CameraConfig, tmp: &PathBuf) -> Result<Frame> {
    let jpg_path = tmp.with_extension("jpg");

    // Try Swift-based capture (requires camera permission)
    let swift = format!(
        r#"import AVFoundation; import AppKit
let sem = DispatchSemaphore(value: 0)
let s = AVCaptureSession(); s.sessionPreset = .medium
guard let d = AVCaptureDevice.default(for: .video) else {{ exit(1) }}
let i = try! AVCaptureDeviceInput(device: d); s.addInput(i)
let o = AVCapturePhotoOutput(); s.addOutput(o)
class D: NSObject, AVCapturePhotoCaptureDelegate {{
    func photoOutput(_ o: AVCapturePhotoOutput, didFinishProcessingPhoto p: AVCapturePhoto, error: Error?) {{
        if let d = p.fileDataRepresentation() {{ try! d.write(to: URL(fileURLWithPath: "{path}")) }}
        exit(0)
    }}
}}
let dl = D(); s.startRunning(); Thread.sleep(forTimeInterval: 1)
o.capturePhoto(with: AVCapturePhotoSettings(), delegate: dl)
Thread.sleep(forTimeInterval: 3)"#,
        path = jpg_path.display()
    );

    let _ = Command::new("swift").args(["-e", &swift]).output();

    if jpg_path.exists() {
        return decode_jpeg_to_rgb(&jpg_path, config.width, config.height);
    }

    bail!("macOS camera capture requires a GUI session with camera permission")
}
fn decode_jpeg_to_rgb(path: &PathBuf, _width: u32, _height: u32) -> Result<Frame> {
    let data = std::fs::read(path)?;
    let _ = std::fs::remove_file(path);

    // STUB: no JPEG decoding happens here yet. The raw JPEG bytes are
    // returned in the `rgb` field and the caller must handle the format.
    // A real implementation would decode via the `image` crate.
    Ok(Frame {
        width: _width,
        height: _height,
        rgb: data,
    })
}
fn tmp_path() -> PathBuf {
    std::env::temp_dir().join(format!("ruview-frame-{}.raw", std::process::id()))
}

/// Check if a camera is available on this system.
pub fn camera_available() -> bool {
    if cfg!(target_os = "macos") {
        Command::new("system_profiler")
            .args(["SPCameraDataType"])
            .output()
            .map(|o| String::from_utf8_lossy(&o.stdout).contains("Camera"))
            .unwrap_or(false)
    } else {
        std::path::Path::new("/dev/video0").exists()
    }
}

/// List available cameras.
pub fn list_cameras() -> Vec<String> {
    let mut cameras = Vec::new();

    if cfg!(target_os = "macos") {
        if let Ok(output) = Command::new("system_profiler").args(["SPCameraDataType"]).output() {
            let text = String::from_utf8_lossy(&output.stdout);
            for line in text.lines() {
                let trimmed = line.trim();
                if trimmed.ends_with(':') && !trimmed.starts_with("Camera") && trimmed.len() > 2 {
                    cameras.push(trimmed.trim_end_matches(':').to_string());
                }
            }
        }
    } else {
        for i in 0..10 {
            if std::path::Path::new(&format!("/dev/video{i}")).exists() {
                cameras.push(format!("/dev/video{i}"));
            }
        }
    }
    cameras
}
@@ -0,0 +1,663 @@
//! Complete CSI processing pipeline — ADR-018 parser → heuristic pose → vitals → tomography.
//!
//! Receives raw UDP frames from ESP32 nodes, extracts I/Q subcarrier data,
//! detects motion, estimates vitals, and produces 3D occupancy + skeleton
//! for fusion with camera depth.
//!
//! **Note on pose**: the pose estimator here is an amplitude-energy
//! heuristic — NOT a trained WiFlow model. See
//! [`CsiPipelineState::heuristic_pose_from_amplitude`] for the exact shape.
//! A real WiFlow integration requires loading and running the TCN weights,
//! which this crate does not currently do.

use std::collections::VecDeque;
use std::net::UdpSocket;
use std::sync::{Arc, Mutex};

// ADR-018 parser moved to src/parser.rs. Re-export here so downstream code
// (and the reviewer's referenced public API) keeps working unchanged.
pub use crate::parser::{parse_adr018, CsiFrame};

// ─── CSI Fingerprint Database ──────────────────────────────────────────────

#[derive(Clone, Debug, serde::Serialize)]
pub struct CsiFingerprint {
    pub name: String,
    pub mean_amplitudes: Vec<f32>,
    pub rssi_mean: f32,
    pub rssi_std: f32,
    pub samples: u32,
}

// ─── CSI State — accumulates frames for heuristic pose + vitals ───────────

#[derive(Clone, Debug)]
pub struct Skeleton {
    /// 17 COCO keypoints: [(x, y), ...] in [0, 1] normalized coordinates
    pub keypoints: Vec<[f32; 2]>,
    pub confidence: f32,
}

#[derive(Clone, Debug)]
pub struct VitalSigns {
    pub breathing_rate: f32, // breaths per minute
    pub heart_rate: f32,     // beats per minute
    pub motion_score: f32,   // 0.0 = still, 1.0 = strong motion
}

pub struct CsiPipelineState {
    /// Per-node frame history (node_id → last N frames)
    pub node_frames: std::collections::HashMap<u8, VecDeque<CsiFrame>>,
    /// Latest skeleton from the amplitude-energy heuristic (NOT ML-derived)
    pub skeleton: Option<Skeleton>,
    /// Latest vital signs
    pub vitals: VitalSigns,
    /// Occupancy grid from RF tomography
    pub occupancy: Vec<f64>,
    pub occupancy_dims: (usize, usize, usize), // nx, ny, nz
    /// Total frames received
    pub total_frames: u64,
    /// Motion detection
    pub motion_detected: bool,
    /// CSI fingerprint database for room/location identification
    pub fingerprints: Vec<CsiFingerprint>,
    /// Current identified location (name, confidence) — updated every 100 frames
    pub current_location: Option<(String, f32)>,
    /// Night mode — true when camera luminance is below threshold
    pub is_dark: bool,
    /// Metadata from the on-disk WiFlow JSON, if one is present. NOTE: the
    /// weights themselves are NOT loaded or executed in this crate — this
    /// flag merely enables the amplitude-energy heuristic pose code path.
    pose_model_present: Option<PoseModelMetadata>,
}

/// Placeholder tag indicating the `wiflow-v1.json` file is present on disk.
/// This does NOT contain real TCN weights — the actual pose estimator in
/// this crate is an amplitude-energy heuristic, not a neural network. The
/// struct itself is empty; we only care whether it exists (`Option::Some`
/// means "heuristic enabled").
struct PoseModelMetadata;

impl Default for CsiPipelineState {
    fn default() -> Self {
        Self {
            node_frames: std::collections::HashMap::new(),
            skeleton: None,
            vitals: VitalSigns { breathing_rate: 0.0, heart_rate: 0.0, motion_score: 0.0 },
            occupancy: vec![0.0; 8 * 8 * 4],
            occupancy_dims: (8, 8, 4),
            total_frames: 0,
            motion_detected: false,
            fingerprints: Vec::new(),
            current_location: None,
            is_dark: false,
            pose_model_present: detect_pose_model_metadata(),
        }
    }
}
// ─── Pose Model Metadata Probe ──────────────────────────────────────────────
//
// NOTE: This only reads the shape metadata from `wiflow-v1.json` on disk.
// The weights are NOT loaded or evaluated. The actual pose used by this
// crate is an amplitude-energy heuristic (see
// `heuristic_pose_from_amplitude`), not WiFlow.

fn detect_pose_model_metadata() -> Option<PoseModelMetadata> {
    let paths = [
        "/tmp/ruview-firmware/wiflow-v1.json",
        "~/.local/share/ruview/wiflow-v1.json",
    ];
    for p in &paths {
        let expanded = p.replace('~', &std::env::var("HOME").unwrap_or_default());
        if let Ok(data) = std::fs::read_to_string(&expanded) {
            if let Ok(model) = serde_json::from_str::<serde_json::Value>(&data) {
                if model.get("weightsBase64").and_then(|v| v.as_str()).is_some() {
                    eprintln!(
                        "  pose: amplitude-energy heuristic enabled (metadata from {expanded}, {} params — weights NOT loaded)",
                        model.get("totalParams").and_then(|v| v.as_u64()).unwrap_or(0)
                    );
                    return Some(PoseModelMetadata);
                }
            }
        }
    }
    eprintln!("  pose: amplitude-energy heuristic disabled (no metadata file found)");
    None
}
// ─── Pipeline Processing ────────────────────────────────────────────────────

impl CsiPipelineState {
    /// Process a new CSI frame — updates motion, vitals, skeleton, occupancy.
    pub fn process_frame(&mut self, frame: CsiFrame) {
        let node_id = frame.node_id;
        self.total_frames += 1;

        // Once every 500 frames log a one-line node stats summary. This keeps
        // us honest about the CSI shape we are actually receiving and also
        // guarantees every public `CsiFrame` field is read on the runtime
        // path, not only in tests.
        if self.total_frames % 500 == 0 {
            eprintln!(
                "  CSI node={} ch={} ant={} sub={} rssi={} nf={} ts_us={} iq_bytes={}",
                frame.node_id,
                frame.channel,
                frame.n_antennas,
                frame.n_subcarriers,
                frame.rssi,
                frame.noise_floor,
                frame.timestamp_us,
                frame.iq_data.len(),
            );
        }

        // Store frame in per-node history (last 100 frames)
        {
            let history = self.node_frames.entry(node_id).or_insert_with(|| VecDeque::with_capacity(100));
            history.push_back(frame.clone());
            if history.len() > 100 { history.pop_front(); }
        }

        // 1. Motion detection (amplitude variance over last 20 frames)
        self.detect_motion(node_id);

        // 2. Vital signs (phase analysis over last 100 frames)
        let has_enough = self.node_frames.get(&node_id).map(|h| h.len() >= 30).unwrap_or(false);
        if has_enough {
            self.estimate_vitals(node_id);
        }

        // 3. Heuristic pose estimation (every 20 frames = 1 second at ~20 fps)
        if self.total_frames % 20 == 0 {
            self.heuristic_pose_from_amplitude();
        }

        // 4. RF tomography (update occupancy grid)
        self.update_tomography();

        // 5. Location fingerprint identification (every 100 frames)
        if self.total_frames % 100 == 0 {
            self.current_location = self.identify_location();
        }
    }
    fn detect_motion(&mut self, node_id: u8) {
        if let Some(history) = self.node_frames.get(&node_id) {
            let recent: Vec<&CsiFrame> = history.iter().rev().take(20).collect();
            if recent.len() < 5 { return; }

            // Compute mean amplitude across subcarriers for each frame
            let mean_amps: Vec<f32> = recent.iter()
                .map(|f| f.amplitudes.iter().sum::<f32>() / f.amplitudes.len().max(1) as f32)
                .collect();

            let mean = mean_amps.iter().sum::<f32>() / mean_amps.len() as f32;
            let variance = mean_amps.iter().map(|a| (a - mean).powi(2)).sum::<f32>() / mean_amps.len() as f32;

            // High variance = motion
            self.vitals.motion_score = (variance / 100.0).min(1.0);
            self.motion_detected = self.vitals.motion_score > 0.15;
        }
    }
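The variance to motion-score mapping used by `detect_motion` is small enough to check in isolation. A self-contained sketch follows; the helper name is made up for this example and is not part of the crate:

```rust
/// Sketch of the variance → motion-score mapping used by `detect_motion`
/// (illustrative helper, not part of the pipeline).
fn motion_score_sketch(mean_amps: &[f32]) -> f32 {
    let mean = mean_amps.iter().sum::<f32>() / mean_amps.len() as f32;
    let variance = mean_amps.iter().map(|a| (a - mean).powi(2)).sum::<f32>()
        / mean_amps.len() as f32;
    // Divide by 100 to normalize, cap at 1.0; the caller thresholds at 0.15
    (variance / 100.0).min(1.0)
}
```

A flat amplitude series scores 0.0, while a series swinging between 0 and 40 saturates at 1.0, comfortably above the 0.15 motion threshold.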
    fn estimate_vitals(&mut self, node_id: u8) {
        if let Some(history) = self.node_frames.get(&node_id) {
            let frames: Vec<&CsiFrame> = history.iter().rev().take(100).collect();
            if frames.len() < 30 { return; }

            // Extract phase from a single subcarrier
            let n_sub = frames[0].phases.len().min(35);
            if n_sub == 0 { return; }

            // Use the mid-band subcarrier, which is typically stable
            let sub_idx = n_sub / 2;
            let phase_series: Vec<f32> = frames.iter().rev()
                .map(|f| f.phases.get(sub_idx).copied().unwrap_or(0.0))
                .collect();

            // Simple peak counting for breathing rate (0.15-0.5 Hz = 9-30 BPM)
            let mut peaks = 0;
            for i in 1..phase_series.len() - 1 {
                if phase_series[i] > phase_series[i - 1] && phase_series[i] > phase_series[i + 1] {
                    peaks += 1;
                }
            }

            // Assuming ~20 fps capture, 100 frames = 5 seconds
            let capture_secs = frames.len() as f32 / 20.0;
            let breathing_bpm = (peaks as f32 / capture_secs) * 60.0;
            self.vitals.breathing_rate = breathing_bpm.clamp(5.0, 40.0);

            // Heart rate estimation (0.8-2.5 Hz) would need a higher sampling
            // rate and spectral analysis; left at 0 until an FFT is wired in.
            self.vitals.heart_rate = 0.0;
        }
    }
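The peaks-per-second to BPM arithmetic above can be sketched standalone. This is an illustrative helper under the same assumptions (a series sampled at a known frame rate); the name is hypothetical:

```rust
/// Peak-counting breathing-rate sketch matching `estimate_vitals`'s math
/// (illustrative; assumes the series is sampled at `fps` frames/second).
fn breathing_bpm_sketch(phase: &[f32], fps: f32) -> f32 {
    let mut peaks = 0;
    for i in 1..phase.len() - 1 {
        if phase[i] > phase[i - 1] && phase[i] > phase[i + 1] {
            peaks += 1;
        }
    }
    let capture_secs = phase.len() as f32 / fps;
    // peaks per second → breaths per minute, clamped to a plausible range
    ((peaks as f32 / capture_secs) * 60.0).clamp(5.0, 40.0)
}
```

For example, a 0.25 Hz sinusoid (one breath every 4 s) sampled at 20 fps for 100 frames has one interior peak in the 5 s window, giving 12 BPM.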
    /// STUB: not real WiFlow inference; returns an amplitude-energy heuristic
    /// "pose" built by bucketing CSI subcarrier energy into 17 fake keypoints.
    ///
    /// This exists so the downstream viewer has something to render while the
    /// real WiFlow TCN integration is being wired up. The output should NOT
    /// be interpreted as an ML-derived skeleton — confidence here is just
    /// amplitude variance, keypoint x is subcarrier energy, y is the
    /// keypoint index. Callers that need real pose must use the (yet to be
    /// wired) WiFlow model directly.
    fn heuristic_pose_from_amplitude(&mut self) {
        if self.pose_model_present.is_none() { return; }

        // Collect 20 frames from the primary node
        let primary_node = self.node_frames.keys().next().copied();
        if let Some(node_id) = primary_node {
            if let Some(history) = self.node_frames.get(&node_id) {
                let frames: Vec<&CsiFrame> = history.iter().rev().take(20).collect();
                if frames.len() < 20 { return; }

                // Build input: 35 subcarriers × 20 time steps. This is a
                // deliberately simple summary used to compute amplitude
                // variance; it is NOT fed through any neural network.
                let n_sub = frames[0].amplitudes.len().min(35);
                let mut input = vec![0.0f32; 35 * 20];
                for (t, frame) in frames.iter().rev().enumerate().take(20) {
                    for s in 0..n_sub {
                        input[t * 35 + s] = frame.amplitudes.get(s).copied().unwrap_or(0.0) / 128.0;
                    }
                }

                let mean_amp = input.iter().sum::<f32>() / input.len() as f32;
                let amp_var = input.iter().map(|a| (a - mean_amp).powi(2)).sum::<f32>() / input.len() as f32;

                // If motion detected, emit a placeholder skeleton derived from
                // signal characteristics. NOT a real pose.
                if self.motion_detected {
                    let mut keypoints = vec![[0.5f32; 2]; 17];
                    for (i, kp) in keypoints.iter_mut().enumerate() {
                        let sub_range = (i * n_sub / 17)..((i + 1) * n_sub / 17).min(n_sub);
                        let energy: f32 = sub_range.clone()
                            .filter_map(|s| frames.last().and_then(|f| f.amplitudes.get(s)))
                            .sum();
                        let norm_energy = energy / (sub_range.len().max(1) as f32 * 128.0);
                        kp[0] = 0.3 + norm_energy * 0.4;       // x: subcarrier energy
                        kp[1] = (i as f32 / 17.0) * 0.8 + 0.1; // y: keypoint index
                    }
                    self.skeleton = Some(Skeleton {
                        keypoints,
                        confidence: amp_var.min(1.0),
                    });
                } else {
                    self.skeleton = None;
                }
            }
        }
    }
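The 17-way bucketing of `n_sub` subcarriers used above can be checked in isolation. A sketch with a hypothetical helper name:

```rust
/// Sketch of the keypoint → subcarrier bucket mapping used by the heuristic
/// pose stub (illustrative helper, not part of the crate's API).
fn keypoint_bucket_sketch(i: usize, n_sub: usize) -> std::ops::Range<usize> {
    // Bucket i of 17 covers an equal slice of the subcarrier axis
    (i * n_sub / 17)..((i + 1) * n_sub / 17).min(n_sub)
}
```

With 34 subcarriers each bucket holds exactly two: bucket 0 is 0..2 and bucket 16 is 32..34, so the 17 buckets tile the whole range without overlap.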
    /// Record a CSI fingerprint for the current location/room.
    /// Computes mean amplitude and RSSI statistics from the last 50 frames
    /// across all nodes and saves them as a named fingerprint.
    pub fn record_fingerprint(&mut self, name: &str) {
        // Collect the last 50 frames from each node
        let mut all_amplitudes: Vec<Vec<f32>> = Vec::new();
        let mut rssi_values: Vec<f32> = Vec::new();

        for history in self.node_frames.values() {
            for frame in history.iter().rev().take(50) {
                all_amplitudes.push(frame.amplitudes.clone());
                rssi_values.push(frame.rssi as f32);
            }
        }

        if all_amplitudes.is_empty() {
            return;
        }

        // Compute mean amplitude per subcarrier across all collected frames
        let n_sub = all_amplitudes.iter().map(|a| a.len()).max().unwrap_or(0);
        if n_sub == 0 {
            return;
        }
        let mut mean_amplitudes = vec![0.0f32; n_sub];
        let mut counts = vec![0u32; n_sub];
        for amps in &all_amplitudes {
            for (i, &a) in amps.iter().enumerate() {
                if i < n_sub {
                    mean_amplitudes[i] += a;
                    counts[i] += 1;
                }
            }
        }
        for i in 0..n_sub {
            if counts[i] > 0 {
                mean_amplitudes[i] /= counts[i] as f32;
            }
        }

        // RSSI statistics
        let rssi_mean = rssi_values.iter().sum::<f32>() / rssi_values.len() as f32;
        let rssi_var = rssi_values.iter()
            .map(|r| (r - rssi_mean).powi(2))
            .sum::<f32>() / rssi_values.len() as f32;
        let rssi_std = rssi_var.sqrt();

        let fingerprint = CsiFingerprint {
            name: name.to_string(),
            mean_amplitudes,
            rssi_mean,
            rssi_std,
            samples: all_amplitudes.len() as u32,
        };

        // Replace an existing fingerprint with the same name, or append
        if let Some(existing) = self.fingerprints.iter_mut().find(|f| f.name == name) {
            *existing = fingerprint;
        } else {
            self.fingerprints.push(fingerprint);
        }
    }
    /// Compare current CSI signals against saved fingerprints using cosine
    /// similarity. Returns (name, confidence) if the best match exceeds 0.7.
    pub fn identify_location(&self) -> Option<(String, f32)> {
        if self.fingerprints.is_empty() {
            return None;
        }

        // Build the current mean amplitude vector from the last 50 frames
        let mut all_amplitudes: Vec<Vec<f32>> = Vec::new();
        for history in self.node_frames.values() {
            for frame in history.iter().rev().take(50) {
                all_amplitudes.push(frame.amplitudes.clone());
            }
        }
        if all_amplitudes.is_empty() {
            return None;
        }

        let n_sub = all_amplitudes.iter().map(|a| a.len()).max().unwrap_or(0);
        if n_sub == 0 {
            return None;
        }
        let mut current = vec![0.0f32; n_sub];
        let mut counts = vec![0u32; n_sub];
        for amps in &all_amplitudes {
            for (i, &a) in amps.iter().enumerate() {
                if i < n_sub {
                    current[i] += a;
                    counts[i] += 1;
                }
            }
        }
        for i in 0..n_sub {
            if counts[i] > 0 {
                current[i] /= counts[i] as f32;
            }
        }

        // Find the best matching fingerprint by cosine similarity
        let mut best: Option<(String, f32)> = None;
        for fp in &self.fingerprints {
            let sim = cosine_similarity(&current, &fp.mean_amplitudes);
            if sim > 0.7 {
                if best.as_ref().map_or(true, |(_, s)| sim > *s) {
                    best = Some((fp.name.clone(), sim));
                }
            }
        }
        best
    }
    /// Set the ambient light level from the camera frame's average luminance.
    /// When luminance < 30 (out of 255), enables night/dark mode, which
    /// increases CSI processing frequency and skips camera depth.
    pub fn set_light_level(&mut self, avg_luminance: f32) {
        self.is_dark = avg_luminance < 30.0;
    }
    fn update_tomography(&mut self) {
        let (nx, ny, nz) = self.occupancy_dims;
        let total = nx * ny * nz;

        // Simple backprojection from per-node RSSI
        let mut new_occ = vec![0.0f64; total];
        for (node_id, history) in &self.node_frames {
            if let Some(latest) = history.back() {
                // RSSI-based attenuation → voxel density
                let atten = -(latest.rssi as f64);
                let contribution = atten / 100.0; // normalize

                // Distribute based on node ID position (simplified ray model)
                let cx = match node_id {
                    1 => nx / 4,
                    2 => nx * 3 / 4,
                    _ => nx / 2,
                };
                let cy = ny / 2;

                for iz in 0..nz {
                    for iy in 0..ny {
                        for ix in 0..nx {
                            let dx = (ix as f64 - cx as f64) / nx as f64;
                            let dy = (iy as f64 - cy as f64) / ny as f64;
                            let dist = (dx * dx + dy * dy).sqrt();
                            let idx = iz * ny * nx + iy * nx + ix;
                            // Gaussian-weighted contribution
                            new_occ[idx] += contribution * (-dist * dist * 8.0).exp();
                        }
                    }
                }
            }
        }

        // Normalize to [0, 1]
        let max = new_occ.iter().cloned().fold(0.0f64, f64::max);
        if max > 0.0 {
            for d in &mut new_occ { *d /= max; }
        }

        // Exponential moving average with the previous occupancy
        for i in 0..total {
            self.occupancy[i] = self.occupancy[i] * 0.7 + new_occ[i] * 0.3;
        }
    }
}
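The per-voxel falloff used by the tomography update is a Gaussian kernel, exp(-8·d²), over normalized in-plane distance from the node column. A standalone sketch (hypothetical helper name):

```rust
/// Sketch of the tomography voxel weight used above: a Gaussian falloff
/// exp(-8·d²) over normalized in-plane distance from the node column
/// (illustrative helper, not part of the pipeline).
fn tomography_weight_sketch(dx: f64, dy: f64) -> f64 {
    let dist = (dx * dx + dy * dy).sqrt();
    (-dist * dist * 8.0).exp()
}
```

At the node column the weight is 1.0; half a grid away (d = 0.5) it has decayed to exp(-2), roughly 0.135, so each node mostly fills the region around its own column.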
/// Cosine similarity between two vectors. Returns 0.0 if either has zero magnitude.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let len = a.len().min(b.len());
    if len == 0 {
        return 0.0;
    }
    let mut dot = 0.0f32;
    let mut mag_a = 0.0f32;
    let mut mag_b = 0.0f32;
    for i in 0..len {
        dot += a[i] * b[i];
        mag_a += a[i] * a[i];
        mag_b += b[i] * b[i];
    }
    let denom = mag_a.sqrt() * mag_b.sqrt();
    if denom < 1e-9 {
        0.0
    } else {
        dot / denom
    }
}
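A quick worked example of the matcher's behavior: parallel amplitude profiles score 1.0 and orthogonal ones score 0.0, so the 0.7 threshold in `identify_location` accepts only closely aligned profiles. The snippet below is a self-contained copy of the formula, kept only for demonstration:

```rust
/// Self-contained copy of the cosine-similarity formula, kept only to
/// demonstrate the 0.7 matching threshold with concrete numbers.
fn cosine_similarity_demo(a: &[f32], b: &[f32]) -> f32 {
    let (mut dot, mut mag_a, mut mag_b) = (0.0f32, 0.0f32, 0.0f32);
    for i in 0..a.len().min(b.len()) {
        dot += a[i] * b[i];
        mag_a += a[i] * a[i];
        mag_b += b[i] * b[i];
    }
    let denom = mag_a.sqrt() * mag_b.sqrt();
    if denom < 1e-9 { 0.0 } else { dot / denom }
}
```

For instance, [1, 2, 3] against its scaled copy [2, 4, 6] scores 1.0, while [1, 0] against [0, 1] scores 0.0.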
// ─── UDP Receiver ───────────────────────────────────────────────────────────

/// Start the complete CSI pipeline — UDP receiver + processing.
///
/// Architecture (two threads, one std mpsc channel):
///
/// ```text
///   UDP thread                         Processor thread
///  ┌──────────────┐   mpsc::Sender   ┌────────────────────┐
///  │ recv_from()  │  ─────────────►  │ recv() CsiFrame    │
///  │ parse_adr018 │   (unbounded     │ lock, process_frame│
///  └──────────────┘    channel)      │ unlock             │
///                                    └────────────────────┘
/// ```
///
/// This decouples the socket from the shared state: the UDP thread only
/// touches the channel, never the mutex. The HTTP API handlers (which call
/// `get_pipeline_output`) therefore only contend with the processor thread
/// for brief periods, not with every incoming packet. Heavy work (pose,
/// tomography, fingerprinting) runs outside the lock.
pub fn start_pipeline(bind_addr: &str) -> Arc<Mutex<CsiPipelineState>> {
    let state = Arc::new(Mutex::new(CsiPipelineState::default()));
    let processor_state = state.clone();

    let (tx, rx) = std::sync::mpsc::channel::<CsiFrame>();

    // --- UDP thread: read + parse, push to channel (no lock held) ---
    let addr = bind_addr.to_string();
    std::thread::spawn(move || {
        let socket = match UdpSocket::bind(&addr) {
            Ok(s) => s,
            Err(e) => {
                eprintln!("  CSI pipeline: bind failed on {addr}: {e}");
                return;
            }
        };
        socket.set_read_timeout(Some(std::time::Duration::from_secs(1))).unwrap();
        eprintln!("  CSI pipeline: listening on {addr}");

        let mut buf = [0u8; 2048];
        loop {
            match socket.recv_from(&mut buf) {
                Ok((n, _)) => {
                    if let Some(frame) = parse_adr018(&buf[..n]) {
                        // Non-blocking w.r.t. the shared state lock. If the
                        // processor thread has died, send() fails and we
                        // exit the receiver.
                        if tx.send(frame).is_err() {
                            eprintln!("  CSI pipeline: processor gone, exiting receiver");
                            return;
                        }
                    }
                }
                Err(e) if e.kind() == std::io::ErrorKind::WouldBlock => continue,
                Err(_) => continue,
            }
        }
    });

    // --- Processor thread: drain channel, take lock briefly to publish ---
    std::thread::spawn(move || {
        while let Ok(frame) = rx.recv() {
            // Lock is held only for the duration of one process_frame call;
            // HTTP handlers that need a snapshot via get_pipeline_output are
            // never starved by the UDP read loop.
            if let Ok(mut st) = processor_state.lock() {
                st.process_frame(frame);
            }
        }
    });

    state
}
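The same decoupling pattern can be shown in miniature with integers instead of CSI frames. This is an illustrative sketch only (the helper name is made up): the producer touches only the channel, and the consumer alone takes the lock.

```rust
/// Miniature of the start_pipeline threading pattern: one producer thread
/// feeding an unbounded mpsc channel, one consumer thread owning the
/// Mutex-guarded state (illustrative sketch, not used by the pipeline).
fn mpsc_pattern_sketch(n_msgs: u64) -> u64 {
    let state = std::sync::Arc::new(std::sync::Mutex::new(0u64));
    let consumer_state = state.clone();
    let (tx, rx) = std::sync::mpsc::channel::<u64>();

    // Consumer: drain the channel, hold the lock only per message
    let consumer = std::thread::spawn(move || {
        while let Ok(v) = rx.recv() {
            *consumer_state.lock().unwrap() += v;
        }
    });

    // Producer: never touches the mutex, only the channel
    let producer = std::thread::spawn(move || {
        for _ in 0..n_msgs {
            tx.send(1).unwrap();
        }
        // Dropping tx closes the channel so the consumer's recv() loop ends
    });

    producer.join().unwrap();
    consumer.join().unwrap();
    let total = *state.lock().unwrap();
    total
}
```

When all senders are dropped, `recv()` returns `Err`, which is what lets both loops here (and in `start_pipeline`) terminate cleanly.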
/// Send synthetic ADR-018 binary CSI frames for local testing without real
/// ESP32 hardware. Each frame carries `n_subcarriers` subcarriers of fake
/// I/Q data. Targets `target` (e.g. `127.0.0.1:3333`).
pub fn send_test_frames(target: &str, count: usize) -> anyhow::Result<()> {
    use crate::parser::{build_test_frame, MAGIC_V1};
    let socket = UdpSocket::bind("0.0.0.0:0")?;
    for i in 0..count {
        let buf = build_test_frame(MAGIC_V1, (i % 4) as u8, 56, i);
        socket.send_to(&buf, target)?;
        std::thread::sleep(std::time::Duration::from_millis(10));
    }
    Ok(())
}
/// Get the current pipeline output for fusion.
pub fn get_pipeline_output(state: &Arc<Mutex<CsiPipelineState>>) -> PipelineOutput {
    let st = state.lock().unwrap();
    PipelineOutput {
        skeleton: st.skeleton.clone(),
        vitals: st.vitals.clone(),
        occupancy: st.occupancy.clone(),
        occupancy_dims: st.occupancy_dims,
        motion_detected: st.motion_detected,
        total_frames: st.total_frames,
        num_nodes: st.node_frames.len(),
        current_location: st.current_location.clone(),
        is_dark: st.is_dark,
    }
}

#[derive(Clone, Debug, serde::Serialize)]
pub struct PipelineOutput {
    pub skeleton: Option<Skeleton>,
    pub vitals: VitalSigns,
    pub occupancy: Vec<f64>,
    pub occupancy_dims: (usize, usize, usize),
    pub motion_detected: bool,
    pub total_frames: u64,
    pub num_nodes: usize,
    pub current_location: Option<(String, f32)>,
    pub is_dark: bool,
}
// Manual Serialize implementations for the viewer-facing types
impl serde::Serialize for Skeleton {
    fn serialize<S: serde::Serializer>(&self, s: S) -> Result<S::Ok, S::Error> {
        use serde::ser::SerializeStruct;
        let mut st = s.serialize_struct("Skeleton", 2)?;
        st.serialize_field("keypoints", &self.keypoints)?;
        st.serialize_field("confidence", &self.confidence)?;
        st.end()
    }
}

impl serde::Serialize for VitalSigns {
    fn serialize<S: serde::Serializer>(&self, s: S) -> Result<S::Ok, S::Error> {
        use serde::ser::SerializeStruct;
        let mut st = s.serialize_struct("VitalSigns", 3)?;
        st.serialize_field("breathing_rate", &self.breathing_rate)?;
        st.serialize_field("heart_rate", &self.heart_rate)?;
        st.serialize_field("motion_score", &self.motion_score)?;
        st.end()
    }
}
#[cfg(test)]
mod tests {
    use super::*;
    use crate::parser::{build_test_frame, parse_adr018, MAGIC_V1};

    fn seed_state_with_frames(state: &mut CsiPipelineState, n: usize) {
        for i in 0..n {
            let bytes = build_test_frame(MAGIC_V1, 1, 32, i);
            let frame = parse_adr018(&bytes).expect("synthetic frame must parse");
            state.process_frame(frame);
        }
    }

    #[test]
    fn set_light_level_toggles_night_mode() {
        let mut s = CsiPipelineState::default();
        assert!(!s.is_dark, "default should be daylight");
        s.set_light_level(10.0);
        assert!(s.is_dark, "luminance below 30 → dark");
        s.set_light_level(200.0);
        assert!(!s.is_dark, "high luminance → not dark");
    }

    #[test]
    fn record_fingerprint_stores_and_matches() {
        let mut s = CsiPipelineState::default();
        seed_state_with_frames(&mut s, 30);
        s.record_fingerprint("lab");
        assert_eq!(s.fingerprints.len(), 1);
        assert_eq!(s.fingerprints[0].name, "lab");
        // Identifying against its own fingerprint should succeed.
        let found = s.identify_location();
        assert!(found.is_some(), "should identify the just-recorded location");
        if let Some((name, conf)) = found {
            assert_eq!(name, "lab");
            assert!(conf > 0.7, "self-similarity should exceed match threshold");
        }
    }
}
@@ -0,0 +1,263 @@
//! Monocular depth estimation via MiDaS ONNX + backprojection to 3D points.
#![allow(dead_code)]

use crate::pointcloud::{PointCloud, ColorPoint};
use anyhow::Result;

/// Camera intrinsics (defaults approximate an HD webcam).
pub struct CameraIntrinsics {
    pub fx: f32, // focal length x (pixels)
    pub fy: f32, // focal length y (pixels)
    pub cx: f32, // principal point x
    pub cy: f32, // principal point y
    pub width: u32,
    pub height: u32,
}

impl Default for CameraIntrinsics {
    fn default() -> Self {
        Self {
            fx: 525.0, fy: 525.0, // typical webcam focal length
            cx: 320.0, cy: 240.0, // center of 640x480
            width: 640, height: 480,
        }
    }
}
/// Backproject a depth map to 3D points using camera intrinsics.
///
/// `depth_map`: row-major [height x width] in meters
/// `rgb`: optional row-major [height x width x 3] color
pub fn backproject_depth(
    depth_map: &[f32],
    intrinsics: &CameraIntrinsics,
    rgb: Option<&[u8]>,
    downsample: u32,
) -> PointCloud {
    let mut cloud = PointCloud::new("camera_depth");
    let w = intrinsics.width;
    let h = intrinsics.height;
    let step = downsample.max(1);

    for y in (0..h).step_by(step as usize) {
        for x in (0..w).step_by(step as usize) {
            let idx = (y * w + x) as usize;
            let z = depth_map[idx];

            // Skip invalid depths
            if z <= 0.01 || z > 10.0 || z.is_nan() { continue; }

            // Backproject: (u, v, z) → (X, Y, Z)
            let px = (x as f32 - intrinsics.cx) * z / intrinsics.fx;
            let py = (y as f32 - intrinsics.cy) * z / intrinsics.fy;

            let (r, g, b) = if let Some(rgb_data) = rgb {
                let ri = idx * 3;
                if ri + 2 < rgb_data.len() {
                    (rgb_data[ri], rgb_data[ri + 1], rgb_data[ri + 2])
                } else {
                    (128, 128, 128)
                }
            } else {
                // Color by depth (blue = near, red = far)
                let t = ((z - 0.5) / 4.0).clamp(0.0, 1.0);
                ((t * 255.0) as u8, ((1.0 - t) * 128.0) as u8, ((1.0 - t) * 255.0) as u8)
            };

            cloud.points.push(ColorPoint { x: px, y: py, z, r, g, b, intensity: 1.0 });
        }
    }
    cloud
}
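The pinhole step above can be checked with one worked pixel. The sketch below hardcodes the default intrinsics (fx = fy = 525, cx = 320, cy = 240) and is illustrative only:

```rust
/// Worked single-pixel backprojection through the same pinhole model:
/// (u, v, z) → ((u - cx)·z/fx, (v - cy)·z/fy, z). Illustrative only.
fn backproject_pixel_sketch(u: f32, v: f32, z: f32) -> (f32, f32, f32) {
    let (fx, fy, cx, cy) = (525.0f32, 525.0f32, 320.0f32, 240.0f32);
    ((u - cx) * z / fx, (v - cy) * z / fy, z)
}
```

Pixel (480, 360) at 2 m depth lands at roughly (0.610, 0.457, 2.0) in camera coordinates; the principal point (320, 240) maps to (0, 0, z) for any depth, which is the optical axis.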
/// Run depth estimation on an image.
///
/// Tries the MiDaS GPU server (127.0.0.1:9885) first, then falls back to a
/// luminance + edge pseudo-depth heuristic.
pub fn estimate_depth(
    image_data: &[u8],
    width: u32,
    height: u32,
) -> Result<Vec<f32>> {
    // Try the MiDaS GPU server
    if let Ok(depth) = estimate_depth_midas_server(image_data, width, height) {
        return Ok(depth);
    }

    // Fallback: luminance + edge-based pseudo-depth
    let w = width as usize;
    let h = height as usize;
    let mut lum = vec![0.0f32; w * h];
    for i in 0..w * h {
        let ri = i * 3;
        if ri + 2 < image_data.len() {
            lum[i] = (0.299 * image_data[ri] as f32
                + 0.587 * image_data[ri + 1] as f32
                + 0.114 * image_data[ri + 2] as f32) / 255.0;
        }
    }
    // Sobel gradient magnitude as an edge map
    let mut edges = vec![0.0f32; w * h];
    for y in 1..h - 1 {
        for x in 1..w - 1 {
            let gx = -lum[(y-1)*w+x-1] + lum[(y-1)*w+x+1]
                - 2.0*lum[y*w+x-1] + 2.0*lum[y*w+x+1]
                - lum[(y+1)*w+x-1] + lum[(y+1)*w+x+1];
            let gy = -lum[(y-1)*w+x-1] - 2.0*lum[(y-1)*w+x] - lum[(y-1)*w+x+1]
                + lum[(y+1)*w+x-1] + 2.0*lum[(y+1)*w+x] + lum[(y+1)*w+x+1];
            edges[y * w + x] = (gx * gx + gy * gy).sqrt().min(1.0);
        }
    }
    // Darker pixels read as farther away; edges pull depth closer
    let mut depth_map = vec![3.0f32; w * h];
    for i in 0..w * h {
        let base = 1.0 + (1.0 - lum[i]) * 3.5;
        let edge_boost = edges[i] * 1.5;
        depth_map[i] = (base - edge_boost).max(0.3);
    }
    Ok(depth_map)
}
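The fallback's luminance term is the BT.601 luma weighting. Isolated as an illustrative helper (not part of the module's API):

```rust
/// BT.601 luma weighting used by the pseudo-depth fallback, normalized to
/// [0, 1] (illustrative helper, not part of the module's API).
fn luminance_sketch(r: u8, g: u8, b: u8) -> f32 {
    (0.299 * r as f32 + 0.587 * g as f32 + 0.114 * b as f32) / 255.0
}
```

Black maps to 0.0 and white to about 1.0, so via `base = 1.0 + (1.0 - lum) * 3.5` a white pixel sits near 1 m and a black pixel near 4.5 m before the edge correction.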
/// Call the MiDaS depth server running on the GPU (127.0.0.1:9885).
fn estimate_depth_midas_server(rgb: &[u8], width: u32, height: u32) -> Result<Vec<f32>> {
    let expected = (width * height * 3) as usize;
    if rgb.len() < expected { anyhow::bail!("rgb too small"); }

    // Send the RGB buffer as a JSON array to the depth server
    let rgb_list: Vec<u8> = rgb[..expected].to_vec();
    let body = serde_json::json!({
        "width": width,
        "height": height,
        "rgb": rgb_list,
    });
    let body_bytes = serde_json::to_vec(&body)?;

    let client = std::net::TcpStream::connect_timeout(
        &"127.0.0.1:9885".parse()?, std::time::Duration::from_millis(500)
    )?;
    client.set_read_timeout(Some(std::time::Duration::from_secs(5)))?;
    client.set_write_timeout(Some(std::time::Duration::from_secs(2)))?;

    use std::io::{Read, Write};
    let mut stream = client;
    let req = format!(
        "POST /depth HTTP/1.1\r\nHost: 127.0.0.1\r\nContent-Type: application/json\r\nContent-Length: {}\r\n\r\n",
        body_bytes.len()
    );
    stream.write_all(req.as_bytes())?;
    stream.write_all(&body_bytes)?;

    // Read the full response
    let mut resp = Vec::new();
    stream.read_to_end(&mut resp)?;

    // Skip HTTP headers
    let body_start = resp.windows(4).position(|w| w == b"\r\n\r\n")
        .map(|p| p + 4).unwrap_or(0);
    let depth_bytes = &resp[body_start..];

    // Body is width*height little-endian f32 depth values
    let n = (width * height) as usize;
    if depth_bytes.len() < n * 4 { anyhow::bail!("depth response too small"); }

    let depth: Vec<f32> = depth_bytes[..n * 4].chunks_exact(4)
        .map(|c| f32::from_le_bytes([c[0], c[1], c[2], c[3]]))
        .collect();

    Ok(depth)
}
/// Capture depth cloud from camera (placeholder — real impl uses nokhwa or v4l2).
|
||||
pub async fn capture_depth_cloud(_frames: usize) -> Result<PointCloud> {
|
||||
eprintln!("Camera capture not available (no camera on this machine).");
|
||||
eprintln!("Use --demo for synthetic data, or run on a machine with a camera.");
|
||||
Ok(demo_depth_cloud())
|
||||
}
|
||||
|
||||
/// Generate a demo depth point cloud (synthetic room scene).
|
||||
pub fn demo_depth_cloud() -> PointCloud {
|
||||
let _cloud = PointCloud::new("demo_camera_depth");
|
||||
let intrinsics = CameraIntrinsics::default();
|
||||
|
||||
// Simulate a depth map: room with walls at 3m, floor, and a person at 2m
|
||||
let w = 160; // downsampled
|
||||
let h = 120;
|
||||
let mut depth = vec![3.0f32; w * h];
|
||||
|
||||
// Floor plane (bottom third)
|
||||
for y in (h * 2 / 3)..h {
|
||||
for x in 0..w {
|
||||
depth[y * w + x] = 1.0 + (y - h * 2 / 3) as f32 * 0.05;
|
||||
}
|
||||
}
|
||||
|
||||
// Person silhouette (center, depth=2m)
|
||||
for y in (h / 4)..(h * 3 / 4) {
|
||||
for x in (w * 2 / 5)..(w * 3 / 5) {
|
||||
let dy = (y as f32 - h as f32 / 2.0).abs() / (h as f32 / 4.0);
|
||||
let dx = (x as f32 - w as f32 / 2.0).abs() / (w as f32 / 5.0);
|
||||
if dx * dx + dy * dy < 1.0 {
|
||||
depth[y * w + x] = 2.0 + (dx * dx + dy * dy) * 0.3;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
let scaled_intrinsics = CameraIntrinsics {
|
||||
fx: intrinsics.fx * w as f32 / intrinsics.width as f32,
|
||||
fy: intrinsics.fy * h as f32 / intrinsics.height as f32,
|
||||
cx: w as f32 / 2.0,
|
||||
cy: h as f32 / 2.0,
|
||||
width: w as u32,
|
||||
height: h as u32,
|
||||
};
|
||||
|
||||
backproject_depth(&depth, &scaled_intrinsics, None, 1)
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn backproject_2x2_depth_yields_four_points() {
|
||||
// 2x2 image, depth=1m everywhere; trivial intrinsics.
|
||||
let intr = CameraIntrinsics {
|
||||
fx: 1.0, fy: 1.0, cx: 0.5, cy: 0.5,
|
||||
width: 2, height: 2,
|
||||
};
|
||||
let depth = vec![1.0f32; 4];
|
||||
let cloud = backproject_depth(&depth, &intr, None, 1);
|
||||
assert_eq!(cloud.points.len(), 4, "2x2 depth → 4 backprojected points");
|
||||
// Every point should be at z=1.0.
|
||||
for p in &cloud.points {
|
||||
assert!((p.z - 1.0).abs() < 1e-6, "z should be 1.0, got {}", p.z);
|
||||
}
|
||||
// With cx=0.5, cy=0.5 the four pixel centers backproject symmetrically
|
||||
// about the optical axis: x in {-0.5, 0.5}, y in {-0.5, 0.5}.
|
||||
let mut xs: Vec<f32> = cloud.points.iter().map(|p| p.x).collect();
|
||||
xs.sort_by(|a, b| a.partial_cmp(b).unwrap());
|
||||
assert!((xs[0] + 0.5).abs() < 1e-6);
|
||||
assert!((xs.last().unwrap() - 0.5).abs() < 1e-6);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn backproject_rejects_invalid_depth() {
|
||||
let intr = CameraIntrinsics {
|
||||
fx: 1.0, fy: 1.0, cx: 0.5, cy: 0.5,
|
||||
width: 2, height: 2,
|
||||
};
|
||||
// All pixels NaN → no points.
|
||||
let depth = vec![f32::NAN; 4];
|
||||
let cloud = backproject_depth(&depth, &intr, None, 1);
|
||||
assert_eq!(cloud.points.len(), 0);
|
||||
}
|
||||
}
|
||||
|
||||
#[allow(dead_code)]
|
||||
fn find_midas_model() -> Result<String> {
|
||||
let paths = [
|
||||
dirs::home_dir().unwrap_or_default().join(".local/share/ruview/midas_v21_small_256.onnx"),
|
||||
dirs::home_dir().unwrap_or_default().join(".cache/ruview/midas_v21_small_256.onnx"),
|
||||
std::path::PathBuf::from("/usr/local/share/ruview/midas_v21_small_256.onnx"),
|
||||
];
|
||||
for p in &paths {
|
||||
if p.exists() { return Ok(p.to_string_lossy().to_string()); }
|
||||
}
|
||||
anyhow::bail!("MiDaS ONNX model not found. Download:\n wget https://github.com/isl-org/MiDaS/releases/download/v3_1/midas_v21_small_256.onnx -O ~/.local/share/ruview/midas_v21_small_256.onnx")
|
||||
}
|
||||
|
|
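The `backproject_depth` calls in this module apply the standard pinhole back-projection: `X = (u - cx) * z / fx`, `Y = (v - cy) * z / fy`, `Z = z`. A minimal standalone sketch of the per-pixel math (the `backproject` helper name is illustrative, not part of the crate API):

```rust
// Illustrative sketch of pinhole back-projection, matching the intrinsics
// convention used by CameraIntrinsics above (fx/fy focal lengths in pixels,
// cx/cy principal point). Not the crate's actual implementation.
fn backproject(u: f32, v: f32, z: f32, fx: f32, fy: f32, cx: f32, cy: f32) -> (f32, f32, f32) {
    ((u - cx) * z / fx, (v - cy) * z / fy, z)
}

fn main() {
    // Matches the 2x2 unit test: fx=fy=1, cx=cy=0.5, depth 1 m → x,y in {-0.5, 0.5}.
    let (x, y, z) = backproject(0.0, 0.0, 1.0, 1.0, 1.0, 0.5, 0.5);
    assert!((x + 0.5).abs() < 1e-6);
    assert!((y + 0.5).abs() < 1e-6);
    assert!((z - 1.0).abs() < 1e-6);
}
```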
@ -0,0 +1,163 @@
//! Multi-modal fusion: camera depth + WiFi RF tomography → unified point cloud.

use crate::pointcloud::{PointCloud, ColorPoint};
use std::collections::HashMap;

/// Occupancy volume from WiFi RF tomography (mirrors RuView's OccupancyVolume).
#[derive(Clone, Debug, serde::Serialize, serde::Deserialize)]
pub struct OccupancyVolume {
    pub densities: Vec<f64>, // [nz][ny][nx] voxel densities
    pub nx: usize,
    pub ny: usize,
    pub nz: usize,
    pub bounds: [f64; 6], // [x_min, y_min, z_min, x_max, y_max, z_max]
    pub occupied_count: usize,
}

/// Convert a WiFi occupancy volume to a sparse point cloud.
///
/// Each occupied voxel (density > threshold) becomes a point at the voxel center.
pub fn occupancy_to_pointcloud(vol: &OccupancyVolume) -> PointCloud {
    let mut cloud = PointCloud::new("wifi_occupancy");
    let threshold = 0.3;

    let dx = (vol.bounds[3] - vol.bounds[0]) / vol.nx as f64;
    let dy = (vol.bounds[4] - vol.bounds[1]) / vol.ny as f64;
    let dz = (vol.bounds[5] - vol.bounds[2]) / vol.nz as f64;

    for iz in 0..vol.nz {
        for iy in 0..vol.ny {
            for ix in 0..vol.nx {
                let idx = iz * vol.ny * vol.nx + iy * vol.nx + ix;
                let density = vol.densities[idx];
                if density > threshold {
                    let x = vol.bounds[0] + (ix as f64 + 0.5) * dx;
                    let y = vol.bounds[1] + (iy as f64 + 0.5) * dy;
                    let z = vol.bounds[2] + (iz as f64 + 0.5) * dz;

                    // Color by density (green = low, red = high)
                    let t = ((density - threshold) / (1.0 - threshold)).min(1.0);
                    let r = (t * 255.0) as u8;
                    let g = ((1.0 - t) * 200.0) as u8;

                    cloud.points.push(ColorPoint {
                        x: x as f32,
                        y: y as f32,
                        z: z as f32,
                        r, g, b: 50,
                        intensity: density as f32,
                    });
                }
            }
        }
    }
    cloud
}

/// Fuse multiple point clouds with voxel-grid downsampling.
///
/// Points from all clouds are binned into voxels of the given size.
/// Each voxel produces one averaged point (position, color, max intensity).
pub fn fuse_clouds(clouds: &[&PointCloud], voxel_size: f32) -> PointCloud {
    let mut cells: HashMap<(i32, i32, i32), (f32, f32, f32, f32, f32, f32, f32, u32)> = HashMap::new();
    // (sum_x, sum_y, sum_z, sum_r, sum_g, sum_b, max_intensity, count)

    for cloud in clouds {
        for p in &cloud.points {
            let key = (
                (p.x / voxel_size).floor() as i32,
                (p.y / voxel_size).floor() as i32,
                (p.z / voxel_size).floor() as i32,
            );
            let entry = cells.entry(key).or_insert((0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0));
            entry.0 += p.x;
            entry.1 += p.y;
            entry.2 += p.z;
            entry.3 += p.r as f32;
            entry.4 += p.g as f32;
            entry.5 += p.b as f32;
            entry.6 = entry.6.max(p.intensity);
            entry.7 += 1;
        }
    }

    let mut fused = PointCloud::new("fused");
    for (_, (sx, sy, sz, sr, sg, sb, mi, n)) in &cells {
        let n = *n as f32;
        fused.points.push(ColorPoint {
            x: sx / n, y: sy / n, z: sz / n,
            r: (sr / n) as u8, g: (sg / n) as u8, b: (sb / n) as u8,
            intensity: *mi,
        });
    }
    fused
}

/// Generate a demo occupancy volume (room with a person).
pub fn demo_occupancy() -> OccupancyVolume {
    let nx = 10;
    let ny = 10;
    let nz = 5;
    let mut densities = vec![0.0f64; nx * ny * nz];

    // Walls (high density at edges)
    for iz in 0..nz {
        for iy in 0..ny {
            for ix in 0..nx {
                let idx = iz * ny * nx + iy * nx + ix;
                // Edges = walls
                if ix == 0 || ix == nx - 1 || iy == 0 || iy == ny - 1 {
                    densities[idx] = 0.8;
                }
                // Floor
                if iz == 0 {
                    densities[idx] = 0.6;
                }
                // Person at center (iz = 1..=3, ix = 4..=6, iy = 4..=6)
                if (4..=6).contains(&ix) && (4..=6).contains(&iy) && (1..=3).contains(&iz) {
                    densities[idx] = 0.9;
                }
            }
        }
    }

    let occupied_count = densities.iter().filter(|&&d| d > 0.3).count();
    OccupancyVolume {
        densities, nx, ny, nz,
        bounds: [0.0, 0.0, 0.0, 5.0, 5.0, 3.0],
        occupied_count,
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    fn cloud_with(name: &str, pts: &[(f32, f32, f32)]) -> PointCloud {
        let mut c = PointCloud::new(name);
        for &(x, y, z) in pts {
            c.points.push(ColorPoint { x, y, z, r: 10, g: 20, b: 30, intensity: 0.5 });
        }
        c
    }

    #[test]
    fn fuse_clouds_merges_non_overlapping() {
        let a = cloud_with("a", &[(0.0, 0.0, 0.0)]);
        let b = cloud_with("b", &[(5.0, 5.0, 5.0)]);
        let fused = fuse_clouds(&[&a, &b], 0.1);
        assert_eq!(fused.points.len(), 2, "two far-apart points should yield two voxels");
    }

    #[test]
    fn fuse_clouds_voxel_dedup() {
        // Points all within one voxel must collapse to a single averaged point.
        let a = cloud_with("a", &[
            (0.01, 0.02, 0.03),
            (0.04, 0.01, 0.02),
            (0.03, 0.03, 0.01),
        ]);
        let fused = fuse_clouds(&[&a], 0.5);
        assert_eq!(fused.points.len(), 1, "three close points → one voxel");
    }
}
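The voxel binning that `fuse_clouds` relies on keys each point by `floor(coord / voxel_size)` per axis, so nearby points land in the same cell. A standalone sketch (the `voxel_key` helper name is illustrative, not part of the crate):

```rust
// Illustrative sketch of the voxel-grid keying used by fuse_clouds: points
// within the same voxel share a (i32, i32, i32) key and are later averaged.
use std::collections::HashMap;

fn voxel_key(p: (f32, f32, f32), voxel: f32) -> (i32, i32, i32) {
    (
        (p.0 / voxel).floor() as i32,
        (p.1 / voxel).floor() as i32,
        (p.2 / voxel).floor() as i32,
    )
}

fn main() {
    let pts = [(0.01, 0.02, 0.03), (0.04, 0.01, 0.02), (5.0, 5.0, 5.0)];
    let mut cells: HashMap<(i32, i32, i32), u32> = HashMap::new();
    for &p in &pts {
        *cells.entry(voxel_key(p, 0.5)).or_insert(0) += 1;
    }
    // The two close points collapse into one cell; the far point gets its own.
    assert_eq!(cells.len(), 2);
}
```

Note that `floor` (rather than truncation via `as i32`) keeps negative coordinates from being binned into the same cell as small positive ones.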
@ -0,0 +1,272 @@
//! ruview-pointcloud — real-time dense point cloud from camera + WiFi CSI
//!
//! Pipeline: Camera → Depth → Backproject → Fuse with WiFi occupancy → Stream
//!
//! Usage:
//!   ruview-pointcloud serve               # HTTP + Three.js viewer
//!   ruview-pointcloud capture --frames 1  # capture to PLY
//!   ruview-pointcloud demo                # synthetic demo
//!   ruview-pointcloud train               # calibration training
//!   ruview-pointcloud csi-test            # send test CSI frames (ADR-018 binary)

mod brain_bridge;
mod camera;
mod csi_pipeline;
mod depth;
mod fusion;
mod parser;
mod pointcloud;
mod stream;
mod training;

use anyhow::Result;
use clap::{Parser, Subcommand};

const VERSION: &str = env!("CARGO_PKG_VERSION");

#[derive(Parser)]
#[command(name = "ruview-pointcloud", version = VERSION)]
struct Cli {
    #[command(subcommand)]
    command: Commands,
}

#[derive(Subcommand)]
enum Commands {
    /// Start the real-time point cloud server.
    ///
    /// By default the HTTP server binds to `127.0.0.1:9880` — exposing it on
    /// `0.0.0.0` leaks live camera/CSI/vitals data to the network and must
    /// be an explicit opt-in via `--bind 0.0.0.0:9880`.
    Serve {
        /// Bind address for the HTTP/viewer server. Default
        /// `127.0.0.1:9880` (loopback only — safe by default).
        #[arg(long, default_value = "127.0.0.1:9880")]
        bind: String,
        /// Brain URL for storing observations
        #[arg(long)]
        brain: Option<String>,
    },
    /// Capture frames to a PLY file
    Capture {
        #[arg(long, default_value = "1")]
        frames: usize,
        #[arg(long, default_value = "output.ply")]
        output: String,
    },
    /// Generate a demo point cloud
    Demo,
    /// List available cameras
    Cameras,
    /// Training and calibration
    Train {
        #[arg(long, default_value = "~/.local/share/ruview/training")]
        data_dir: String,
        /// Brain URL for submitting results
        #[arg(long)]
        brain: Option<String>,
    },
    /// Send synthetic ADR-018 binary CSI frames (for local testing without an ESP32).
    CsiTest {
        #[arg(long, default_value = "127.0.0.1:3333")]
        target: String,
        #[arg(long, default_value = "100")]
        count: usize,
    },
    /// Record a CSI fingerprint for the current location.
    ///
    /// Listens on UDP 3333 for `--seconds` seconds, accumulates CSI frames,
    /// and stores a named fingerprint that future sessions can match
    /// against to identify the room.
    Fingerprint {
        /// Human-readable name for the fingerprint (e.g. "office", "lab").
        name: String,
        /// How long to listen before recording (default 5 s).
        #[arg(long, default_value = "5")]
        seconds: u64,
    },
}

#[tokio::main]
async fn main() -> Result<()> {
    let cli = Cli::parse();

    match cli.command {
        Commands::Serve { bind, brain } => {
            stream::serve(&bind, brain.as_deref()).await?;
        }
        Commands::Capture { frames: _, output } => {
            if camera::camera_available() {
                let config = camera::CameraConfig::default();
                let frame = camera::capture_frame(&config)?;
                let depth = depth::estimate_depth(&frame.rgb, frame.width, frame.height)?;
                let intrinsics = depth::CameraIntrinsics::default();
                let cloud = depth::backproject_depth(&depth, &intrinsics, Some(&frame.rgb), 2);
                pointcloud::write_ply(&cloud, &output)?;
                println!("Captured {} points to {output}", cloud.points.len());
            } else {
                let cloud = depth::demo_depth_cloud();
                pointcloud::write_ply(&cloud, &output)?;
                println!("No camera — wrote {} demo points to {output}", cloud.points.len());
            }
        }
        Commands::Demo => {
            demo().await?;
        }
        Commands::Cameras => {
            let cams = camera::list_cameras();
            if cams.is_empty() {
                println!("No cameras found");
            } else {
                println!("Available cameras:");
                for (i, c) in cams.iter().enumerate() {
                    println!("  [{i}] {c}");
                }
            }
        }
        Commands::Train { data_dir, brain } => {
            train(&data_dir, brain.as_deref()).await?;
        }
        Commands::CsiTest { target, count } => {
            println!("Sending {count} synthetic ADR-018 CSI frames to {target}...");
            csi_pipeline::send_test_frames(&target, count)?;
            println!("Done");
        }
        Commands::Fingerprint { name, seconds } => {
            println!("Recording CSI fingerprint '{name}' for {seconds} s on UDP 3333...");
            let state = csi_pipeline::start_pipeline("0.0.0.0:3333");
            std::thread::sleep(std::time::Duration::from_secs(seconds));
            // record_fingerprint takes a brief lock on the shared state to
            // read the last N frames from every node's history.
            {
                let mut st = state.lock().expect("pipeline state lock poisoned");
                st.record_fingerprint(&name);
                println!(
                    "  Stored: {} fingerprint(s) total, {} total CSI frames received",
                    st.fingerprints.len(),
                    st.total_frames
                );
            }
        }
    }

    Ok(())
}

async fn demo() -> Result<()> {
    println!("╔══════════════════════════════════════════════╗");
    println!("║     RuView Dense Point Cloud — Demo          ║");
    println!("╚══════════════════════════════════════════════╝");
    println!();

    let occupancy = fusion::demo_occupancy();
    let wifi_cloud = fusion::occupancy_to_pointcloud(&occupancy);
    println!("WiFi occupancy: {}x{}x{} voxels → {} points",
        occupancy.nx, occupancy.ny, occupancy.nz, wifi_cloud.points.len());

    let depth_cloud = depth::demo_depth_cloud();
    println!("Camera depth: {} points", depth_cloud.points.len());

    let fused = fusion::fuse_clouds(&[&wifi_cloud, &depth_cloud], 0.05);
    println!("Fused: {} points (voxel size=0.05m)", fused.points.len());

    pointcloud::write_ply(&fused, "demo_pointcloud.ply")?;
    println!("\nWrote: demo_pointcloud.ply");

    let splats = pointcloud::to_gaussian_splats(&fused);
    let json = serde_json::to_string_pretty(&splats)?;
    std::fs::write("demo_splats.json", &json)?;
    println!("Wrote: demo_splats.json ({} splats)", splats.len());

    Ok(())
}

async fn train(data_dir: &str, brain_url: Option<&str>) -> Result<()> {
    println!("╔══════════════════════════════════════════════╗");
    println!("║     RuView Point Cloud — Training            ║");
    println!("╚══════════════════════════════════════════════╝");
    println!();

    let expanded = data_dir.replace('~', &dirs::home_dir().unwrap_or_default().to_string_lossy());
    // Defence-in-depth: reject path traversal in the CLI argument before we
    // hand it to TrainingSession (which also checks). This catches malicious
    // CLI input early, before any I/O.
    let _sanitised = training::sanitize_data_path(&expanded)?;
    let mut session = training::TrainingSession::new(&expanded)?;
    session.load_samples()?;

    // Capture training samples
    println!("==> Capturing training samples...");

    // Camera samples
    if camera::camera_available() {
        println!("  Camera detected — capturing depth frames...");
        let config = camera::CameraConfig::default();
        for i in 0..5 {
            if let Ok(frame) = camera::capture_frame(&config) {
                let depth = depth::estimate_depth(&frame.rgb, frame.width, frame.height)?;
                // Score based on depth variance (good frames have varied depth)
                let mean: f32 = depth.iter().sum::<f32>() / depth.len() as f32;
                let variance: f32 = depth.iter().map(|d| (d - mean).powi(2)).sum::<f32>() / depth.len() as f32;
                let quality = (variance / 2.0).min(1.0);

                session.add_sample(
                    Some(depth), frame.width, frame.height,
                    None, None, quality,
                );
                println!("  Frame {}: quality={:.2}", i, quality);
            }
            std::thread::sleep(std::time::Duration::from_millis(500));
        }
    } else {
        println!("  No camera — using synthetic samples for calibration demo");
        for i in 0..10 {
            let w = 160u32;
            let h = 120u32;
            let depth: Vec<f32> = (0..w * h).map(|j| 1.0 + (j as f32 / (w * h) as f32) * 4.0 + (i as f32 * 0.1)).collect();
            let quality = if i < 7 { 0.8 } else { 0.2 };
            let gt = if i % 3 == 0 {
                Some(training::GroundTruth {
                    reference_distances: vec![
                        training::ReferencePoint { name: "wall".into(), x_pixel: 80, y_pixel: 60, true_distance_m: 3.0 },
                    ],
                    occupancy_label: Some(if i < 5 { "occupied" } else { "empty" }.into()),
                })
            } else { None };
            session.add_sample(Some(depth), w, h, None, gt, quality);
        }
    }

    session.save_samples()?;

    // Calibrate depth
    println!("\n==> Calibrating depth estimation...");
    let cal = session.calibrate_depth()?;
    println!("  Result: scale={:.2} offset={:.2} gamma={:.2} RMSE={:.4}m",
        cal.scale, cal.offset, cal.gamma, cal.rmse);

    // Train occupancy
    println!("\n==> Training occupancy model...");
    let occ_cal = session.train_occupancy()?;
    println!("  Result: threshold={:.2} accuracy={:.1}%",
        occ_cal.density_threshold, occ_cal.accuracy * 100.0);

    // Export preference pairs
    println!("\n==> Exporting preference pairs...");
    let pairs = session.export_preference_pairs()?;
    println!("  Exported: {} pairs", pairs.len());

    // Submit to the brain if available
    if let Some(url) = brain_url {
        println!("\n==> Submitting to brain at {url}...");
        let stored = session.submit_to_brain(url).await?;
        println!("  Stored: {} observations", stored);
    }

    println!("\n==> Training complete!");
    println!("  Data dir: {expanded}");
    println!("  Samples: {}", session.samples.len());
    println!("  Calibration: {expanded}/calibration.json");

    Ok(())
}
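The frame-quality heuristic in `train` scores a depth map by its variance, clamped so variance ≥ 2 m² maps to quality 1.0. A standalone sketch (the `frame_quality` helper name is illustrative):

```rust
// Illustrative sketch of the depth-variance quality score used in `train`:
// flat depth maps (nothing in frame) score 0, highly varied maps saturate at 1.
fn frame_quality(depth: &[f32]) -> f32 {
    let n = depth.len() as f32;
    let mean = depth.iter().sum::<f32>() / n;
    let variance = depth.iter().map(|d| (d - mean).powi(2)).sum::<f32>() / n;
    (variance / 2.0).min(1.0)
}

fn main() {
    // A flat depth map has zero variance → quality 0.
    assert_eq!(frame_quality(&[3.0; 100]), 0.0);
    // A strongly varied map (deviations of ±2 m → variance 4) saturates at 1.0.
    let varied: Vec<f32> = (0..100).map(|i| if i % 2 == 0 { 0.5 } else { 4.5 }).collect();
    assert_eq!(frame_quality(&varied), 1.0);
}
```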
@ -0,0 +1,163 @@
//! ADR-018 binary CSI frame parser.
//!
//! Two header magics are accepted: `0xC5110001` (raw CSI, v1) and
//! `0xC5110006` (feature state, v6). The header is 20 bytes; everything
//! after is interleaved I/Q bytes per subcarrier per antenna.
//!
//! Returns `None` when the buffer is truncated or the magic is wrong —
//! this is a hot path (one call per UDP packet) so we prefer Option over
//! a full `anyhow::Error` that would allocate.

const CSI_MAGIC_V6: u32 = 0xC511_0006;
const CSI_MAGIC_V1: u32 = 0xC511_0001;
pub(crate) const CSI_HEADER_SIZE: usize = 20;

/// Accept both header magics — `0xC5110001` (raw CSI) and
/// `0xC5110006` (feature state). Exposed for tests.
#[allow(dead_code)]
pub(crate) const MAGIC_V1: u32 = CSI_MAGIC_V1;
#[allow(dead_code)]
pub(crate) const MAGIC_V6: u32 = CSI_MAGIC_V6;

#[derive(Clone, Debug)]
pub struct CsiFrame {
    pub node_id: u8,
    pub n_antennas: u8,
    pub n_subcarriers: u16,
    pub channel: u8,
    pub rssi: i8,
    pub noise_floor: i8,
    pub timestamp_us: u32,
    /// Raw I/Q data: [I0, Q0, I1, Q1, ...] for each subcarrier
    pub iq_data: Vec<i8>,
    /// Computed amplitude per subcarrier: sqrt(I^2 + Q^2)
    pub amplitudes: Vec<f32>,
    /// Computed phase per subcarrier: atan2(Q, I)
    pub phases: Vec<f32>,
}

/// Parse an ADR-018 binary CSI frame from a UDP packet.
///
/// Returns `None` if:
/// - the buffer is shorter than the 20-byte header
/// - the magic does not match either accepted value
/// - the declared I/Q payload is truncated
pub fn parse_adr018(data: &[u8]) -> Option<CsiFrame> {
    if data.len() < CSI_HEADER_SIZE { return None; }

    let magic = u32::from_le_bytes([data[0], data[1], data[2], data[3]]);
    if magic != CSI_MAGIC_V6 && magic != CSI_MAGIC_V1 { return None; }

    let node_id = data[4];
    let n_antennas = data[5].max(1);
    let n_subcarriers = u16::from_le_bytes([data[6], data[7]]);
    let channel = data[8];
    let rssi = data[9] as i8;
    let noise_floor = data[10] as i8;
    let timestamp_us = u32::from_le_bytes([data[16], data[17], data[18], data[19]]);

    let iq_len = (n_subcarriers as usize) * 2 * (n_antennas as usize);
    if data.len() < CSI_HEADER_SIZE + iq_len { return None; }

    let iq_data: Vec<i8> = data[CSI_HEADER_SIZE..CSI_HEADER_SIZE + iq_len]
        .iter().map(|&b| b as i8).collect();

    // Compute amplitude and phase per subcarrier (first antenna).
    let mut amplitudes = Vec::with_capacity(n_subcarriers as usize);
    let mut phases = Vec::with_capacity(n_subcarriers as usize);
    for i in 0..n_subcarriers as usize {
        let idx = i * 2;
        if idx + 1 < iq_data.len() {
            let ii = iq_data[idx] as f32;
            let qq = iq_data[idx + 1] as f32;
            amplitudes.push((ii * ii + qq * qq).sqrt());
            phases.push(qq.atan2(ii));
        }
    }

    Some(CsiFrame {
        node_id, n_antennas, n_subcarriers, channel, rssi, noise_floor,
        timestamp_us, iq_data, amplitudes, phases,
    })
}

/// Build a synthetic ADR-018 binary frame. Used by the `csi-test` CLI
/// subcommand and by the unit tests in this module.
pub fn build_test_frame(magic: u32, node_id: u8, n_subcarriers: u16, i: usize) -> Vec<u8> {
    let mut buf = Vec::with_capacity(CSI_HEADER_SIZE + (n_subcarriers as usize) * 2);
    buf.extend_from_slice(&magic.to_le_bytes());          // magic (0..4)
    buf.push(node_id);                                    // node_id (4)
    buf.push(1u8);                                        // n_antennas (5)
    buf.extend_from_slice(&n_subcarriers.to_le_bytes());  // n_subcarriers (6..8)
    buf.push(6u8);                                        // channel (8)
    buf.push((-40i8 - (i % 30) as i8) as u8);             // rssi (9)
    buf.push((-90i8) as u8);                              // noise_floor (10)
    buf.extend_from_slice(&[0u8; 5]);                     // reserved (11..16)
    buf.extend_from_slice(&(i as u32).to_le_bytes());     // timestamp_us (16..20)
    for j in 0..(n_subcarriers as usize) {
        buf.push(((i + j) as i8).wrapping_mul(3) as u8);
        buf.push(((i + j) as i8).wrapping_mul(5) as u8);
    }
    buf
}

// ─── Tests ──────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn parse_magic_v1_roundtrips() {
        let frame_bytes = build_test_frame(MAGIC_V1, 0x42, 56, 7);
        let frame = parse_adr018(&frame_bytes).expect("v1 frame should parse");
        assert_eq!(frame.node_id, 0x42);
        assert_eq!(frame.n_antennas, 1);
        assert_eq!(frame.n_subcarriers, 56);
        assert_eq!(frame.channel, 6);
        assert_eq!(frame.timestamp_us, 7);
        assert_eq!(frame.iq_data.len(), 56 * 2);
        assert_eq!(frame.amplitudes.len(), 56);
        assert_eq!(frame.phases.len(), 56);
    }

    #[test]
    fn parse_magic_v6_roundtrips() {
        let frame_bytes = build_test_frame(MAGIC_V6, 0x09, 114, 0);
        let frame = parse_adr018(&frame_bytes).expect("v6 frame should parse");
        assert_eq!(frame.node_id, 0x09);
        assert_eq!(frame.n_antennas, 1);
        assert_eq!(frame.n_subcarriers, 114);
        assert_eq!(frame.channel, 6);
        // With i=0, noise_floor=-90 per build_test_frame.
        assert_eq!(frame.noise_floor, -90);
        // With i=0, timestamp_us=0.
        assert_eq!(frame.timestamp_us, 0);
        assert_eq!(frame.iq_data.len(), 114 * 2);
    }

    #[test]
    fn parse_rejects_wrong_magic() {
        let mut bad = build_test_frame(MAGIC_V1, 0, 8, 0);
        // Flip the magic to something unrelated.
        bad[0] = 0xFF;
        bad[1] = 0xFF;
        bad[2] = 0xFF;
        bad[3] = 0xFF;
        assert!(parse_adr018(&bad).is_none(), "bad magic should not parse");
    }

    #[test]
    fn parse_rejects_truncated_header() {
        let short = vec![0u8; CSI_HEADER_SIZE - 1];
        assert!(parse_adr018(&short).is_none(), "truncated header must not parse");
    }

    #[test]
    fn parse_rejects_truncated_payload() {
        let mut frame = build_test_frame(MAGIC_V1, 0, 32, 0);
        // Drop half the declared payload.
        frame.truncate(CSI_HEADER_SIZE + 20);
        assert!(parse_adr018(&frame).is_none(), "truncated payload must not parse");
    }
}
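The fixed byte offsets that `parse_adr018` reads from the 20-byte header can be sketched as a standalone helper (the `header_fields` name is illustrative, not part of the crate API; only a subset of fields is shown):

```rust
// Illustrative sketch of the little-endian header layout decoded by
// parse_adr018: magic at 0..4, node_id at 4, n_subcarriers at 6..8,
// timestamp_us at 16..20 (bytes 11..16 are reserved).
fn header_fields(h: &[u8; 20]) -> (u32, u8, u16, u32) {
    let magic = u32::from_le_bytes([h[0], h[1], h[2], h[3]]);
    let node_id = h[4];
    let n_subcarriers = u16::from_le_bytes([h[6], h[7]]);
    let timestamp_us = u32::from_le_bytes([h[16], h[17], h[18], h[19]]);
    (magic, node_id, n_subcarriers, timestamp_us)
}

fn main() {
    let mut h = [0u8; 20];
    h[0..4].copy_from_slice(&0xC511_0001u32.to_le_bytes());
    h[4] = 7;
    h[6..8].copy_from_slice(&56u16.to_le_bytes());
    h[16..20].copy_from_slice(&1234u32.to_le_bytes());
    assert_eq!(header_fields(&h), (0xC511_0001, 7, 56, 1234));
}
```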
@ -0,0 +1,126 @@
|
|||
//! Point cloud types + PLY export + Gaussian splat conversion.
|
||||
#![allow(dead_code)]
|
||||
|
||||
use serde::{Deserialize, Serialize};
|
||||
use std::io::Write;
|
||||
|
||||
#[derive(Clone, Debug, Serialize, Deserialize)]
|
||||
pub struct Point3D {
|
||||
pub x: f32,
|
||||
pub y: f32,
|
||||
pub z: f32,
|
||||
}
|
||||
|
||||
#[derive(Clone, Debug, Serialize, Deserialize)]
|
||||
pub struct ColorPoint {
|
||||
pub x: f32,
|
||||
pub y: f32,
|
||||
pub z: f32,
|
||||
pub r: u8,
|
||||
pub g: u8,
|
||||
pub b: u8,
|
||||
pub intensity: f32,
|
||||
}
|
||||
|
||||
#[derive(Clone, Debug, Serialize, Deserialize)]
|
||||
pub struct PointCloud {
|
||||
pub points: Vec<ColorPoint>,
|
||||
pub timestamp_ms: i64,
|
||||
pub source: String,
|
||||
}
|
||||
|
||||
impl PointCloud {
|
||||
pub fn new(source: &str) -> Self {
|
||||
Self {
|
||||
points: Vec::new(),
|
||||
timestamp_ms: chrono::Utc::now().timestamp_millis(),
|
||||
source: source.to_string(),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn add(&mut self, x: f32, y: f32, z: f32, r: u8, g: u8, b: u8, intensity: f32) {
|
||||
self.points.push(ColorPoint { x, y, z, r, g, b, intensity });
|
||||
}
|
||||
|
||||
pub fn bounds(&self) -> ([f32; 3], [f32; 3]) {
|
||||
if self.points.is_empty() {
|
||||
return ([0.0; 3], [0.0; 3]);
|
||||
}
|
||||
let mut min = [f32::MAX; 3];
|
||||
let mut max = [f32::MIN; 3];
|
||||
for p in &self.points {
|
||||
min[0] = min[0].min(p.x); min[1] = min[1].min(p.y); min[2] = min[2].min(p.z);
|
||||
max[0] = max[0].max(p.x); max[1] = max[1].max(p.y); max[2] = max[2].max(p.z);
|
||||
}
|
||||
(min, max)
|
||||
}
|
||||
}
|
||||
|
||||
/// Write point cloud to PLY format (ASCII).
|
||||
pub fn write_ply(cloud: &PointCloud, path: &str) -> anyhow::Result<()> {
|
||||
let mut f = std::fs::File::create(path)?;
|
||||
writeln!(f, "ply")?;
|
||||
writeln!(f, "format ascii 1.0")?;
|
||||
writeln!(f, "comment Generated by RuView Dense Point Cloud")?;
|
||||
writeln!(f, "comment Source: {}", cloud.source)?;
|
||||
writeln!(f, "comment Timestamp: {}", cloud.timestamp_ms)?;
|
||||
writeln!(f, "element vertex {}", cloud.points.len())?;
|
||||
writeln!(f, "property float x")?;
|
||||
writeln!(f, "property float y")?;
|
||||
writeln!(f, "property float z")?;
|
||||
writeln!(f, "property uchar red")?;
|
||||
writeln!(f, "property uchar green")?;
|
||||
writeln!(f, "property uchar blue")?;
|
||||
writeln!(f, "property float intensity")?;
|
||||
writeln!(f, "end_header")?;
|
||||
for p in &cloud.points {
|
||||
writeln!(f, "{:.4} {:.4} {:.4} {} {} {} {:.4}", p.x, p.y, p.z, p.r, p.g, p.b, p.intensity)?;
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||

/// Convert point cloud to Gaussian splats for 3D rendering.
#[derive(Serialize, Deserialize)]
pub struct GaussianSplat {
    pub center: [f32; 3],
    pub color: [f32; 3],
    pub opacity: f32,
    pub scale: [f32; 3],
}

pub fn to_gaussian_splats(cloud: &PointCloud) -> Vec<GaussianSplat> {
    // Cluster points into voxels and create one Gaussian per cluster
    let voxel_size = 0.08; // smaller voxels = more detail = visible movement
    let mut cells: std::collections::HashMap<(i32, i32, i32), Vec<&ColorPoint>> =
        std::collections::HashMap::new();

    for p in &cloud.points {
        let key = (
            (p.x / voxel_size).floor() as i32,
            (p.y / voxel_size).floor() as i32,
            (p.z / voxel_size).floor() as i32,
        );
        cells.entry(key).or_default().push(p);
    }

    cells.values().map(|pts| {
        let n = pts.len() as f32;
        let cx = pts.iter().map(|p| p.x).sum::<f32>() / n;
        let cy = pts.iter().map(|p| p.y).sum::<f32>() / n;
        let cz = pts.iter().map(|p| p.z).sum::<f32>() / n;
        let cr = pts.iter().map(|p| p.r as f32).sum::<f32>() / n / 255.0;
        let cg = pts.iter().map(|p| p.g as f32).sum::<f32>() / n / 255.0;
        let cb = pts.iter().map(|p| p.b as f32).sum::<f32>() / n / 255.0;

        // Scale based on point spread
        let sx = pts.iter().map(|p| (p.x - cx).abs()).sum::<f32>() / n + 0.01;
        let sy = pts.iter().map(|p| (p.y - cy).abs()).sum::<f32>() / n + 0.01;
        let sz = pts.iter().map(|p| (p.z - cz).abs()).sum::<f32>() / n + 0.01;

        GaussianSplat {
            center: [cx, cy, cz],
            color: [cr, cg, cb],
            opacity: (n / 10.0).min(1.0),
            scale: [sx, sy, sz],
        }
    }).collect()
}
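The voxel keying and opacity rules above can be exercised in isolation. A minimal sketch with hypothetical helper names (`voxel_key` and `splat_opacity` are not part of the crate); only the arithmetic mirrors `to_gaussian_splats`:

```rust
// Hypothetical standalone helpers mirroring the math in `to_gaussian_splats`.
// Points are keyed by their voxel cell via floor division...
fn voxel_key(x: f32, y: f32, z: f32, voxel_size: f32) -> (i32, i32, i32) {
    (
        (x / voxel_size).floor() as i32,
        (y / voxel_size).floor() as i32,
        (z / voxel_size).floor() as i32,
    )
}

// ...and each cell's splat opacity saturates at 1.0 once it holds 10+ points.
fn splat_opacity(points_in_cell: usize) -> f32 {
    (points_in_cell as f32 / 10.0).min(1.0)
}
```

`floor()` (rather than truncation) keeps negative coordinates in the correct cell, which matters because the scene is centered on the camera.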

@@ -0,0 +1,232 @@
//! HTTP server — live camera + ESP32 CSI + fusion → real-time point cloud.

use crate::brain_bridge;
use crate::camera;
use crate::csi_pipeline;
use crate::depth;
use crate::fusion;
use crate::pointcloud;
use axum::{
    extract::State,
    response::Html,
    routing::get,
    Json, Router,
};
use std::sync::{Arc, Mutex};

struct AppState {
    latest_cloud: Mutex<pointcloud::PointCloud>,
    latest_splats: Mutex<Vec<pointcloud::GaussianSplat>>,
    latest_pipeline: Mutex<Option<csi_pipeline::PipelineOutput>>,
    frame_count: Mutex<u64>,
    use_camera: bool,
}

/// Start the HTTP/viewer server bound to `bind` (e.g.
/// `"127.0.0.1:9880"` — the safe default — or `"0.0.0.0:9880"` to expose
/// the viewer to the LAN).
///
/// **Security**: the viewer streams live camera/CSI/vitals data. Bind to
/// `127.0.0.1` unless you intentionally want remote viewers.
pub async fn serve(bind: &str, _brain: Option<&str>) -> anyhow::Result<()> {
    let has_camera = camera::camera_available();

    // Start CSI pipeline — listens for UDP CSI data from ESP32 nodes.
    // Kept on 0.0.0.0 because ESP32 nodes are remote devices on the LAN.
    let csi_pipeline_state = csi_pipeline::start_pipeline("0.0.0.0:3333");
    eprintln!(" CSI pipeline: UDP port 3333 (ADR-018 binary frames)");

    let initial_cloud = if has_camera {
        capture_camera_cloud()
    } else {
        demo_cloud()
    };
    let initial_splats = pointcloud::to_gaussian_splats(&initial_cloud);

    let state = Arc::new(AppState {
        latest_cloud: Mutex::new(initial_cloud),
        latest_splats: Mutex::new(initial_splats),
        latest_pipeline: Mutex::new(None),
        frame_count: Mutex::new(0),
        use_camera: has_camera,
    });

    // Background: capture + fuse every 500ms (motion-adaptive)
    let bg = state.clone();
    let bg_csi = csi_pipeline_state.clone();
    let bg_cam = has_camera;
    tokio::spawn(async move {
        let mut skip_depth = false;
        loop {
            // Motion-adaptive: check CSI motion score
            let pipeline_out = Some(csi_pipeline::get_pipeline_output(&bg_csi));
            if let Some(ref out) = pipeline_out {
                // Only run expensive depth when motion detected or every 5th frame
                let frame_num = *bg.frame_count.lock().unwrap();
                skip_depth = !out.motion_detected && frame_num % 5 != 0;
            }
            let pipeline_clone = pipeline_out.clone();
            *bg.latest_pipeline.lock().unwrap() = pipeline_out;
            let pipeline_out = pipeline_clone;

            let interval = if skip_depth { 1000 } else { 500 }; // slower when no motion
            tokio::time::sleep(std::time::Duration::from_millis(interval)).await;

            let (cloud, luminance) = if bg_cam && !skip_depth {
                tokio::task::spawn_blocking(capture_camera_cloud_with_luminance)
                    .await.unwrap_or_else(|_| (demo_cloud(), None))
            } else {
                // Reuse previous cloud when no motion
                (bg.latest_cloud.lock().unwrap().clone(), None)
            };
            // Feed luminance into the CSI pipeline so is_dark toggles for the
            // viewer. The lock is held briefly here — the UDP thread never
            // touches it (messages go through the mpsc channel).
            if let Some(lum) = luminance {
                if let Ok(mut st) = bg_csi.lock() {
                    st.set_light_level(lum);
                }
            }
            let splats = pointcloud::to_gaussian_splats(&cloud);
            *bg.latest_cloud.lock().unwrap() = cloud;
            *bg.latest_splats.lock().unwrap() = splats;
            let frame_num = {
                let mut fc = bg.frame_count.lock().unwrap();
                *fc += 1;
                *fc
            };

            // Brain sync — sparse, every 120 frames (~60 seconds)
            if frame_num % 120 == 0 {
                if let Some(ref out) = pipeline_out {
                    brain_bridge::sync_to_brain(out, frame_num).await;
                }
            }
        }
    });

    if has_camera { eprintln!(" Camera: LIVE (/dev/video0)"); }
    else { eprintln!(" Camera: DEMO"); }

    let app = Router::new()
        .route("/", get(index))
        .route("/api/cloud", get(api_cloud))
        .route("/api/splats", get(api_splats))
        .route("/api/status", get(api_status))
        .route("/health", get(api_health))
        .with_state(state);

    println!("╔══════════════════════════════════════════════╗");
    println!("║  RuView Dense Point Cloud — ALL SENSORS      ║");
    println!("╚══════════════════════════════════════════════╝");
    println!(" Viewer: http://{bind}/");
    if bind.starts_with("0.0.0.0") || bind.starts_with("::") {
        eprintln!(
            " WARNING: bound to {bind} — camera/CSI/vitals are exposed \
             to the network. Use --bind 127.0.0.1:9880 to restrict to loopback."
        );
    }

    let listener = tokio::net::TcpListener::bind(bind).await?;
    axum::serve(listener, app).await?;
    Ok(())
}
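The motion-adaptive gating in the background task reduces to a small predicate: depth runs when CSI motion is detected or on every 5th frame. A sketch under that reading (hypothetical free function; in `serve` the check is inline):

```rust
// Sketch of the gating in `serve`'s background task: skip the expensive
// depth pass only when no CSI motion is seen AND this is not a 5th frame.
fn should_skip_depth(motion_detected: bool, frame_num: u64) -> bool {
    !motion_detected && frame_num % 5 != 0
}
```

The every-5th-frame escape hatch keeps the cloud from going permanently stale in a still room.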

fn capture_camera_cloud() -> pointcloud::PointCloud {
    capture_camera_cloud_with_luminance().0
}

/// Grab one camera frame, backproject it to a point cloud, and return the
/// mean luminance alongside (used to drive `set_light_level` for night mode).
fn capture_camera_cloud_with_luminance() -> (pointcloud::PointCloud, Option<f32>) {
    let config = camera::CameraConfig::default();
    match camera::capture_frame(&config) {
        Ok(frame) => {
            // Mean luminance across the RGB frame (BT.601 coefficients).
            let pixels = (frame.width as usize) * (frame.height as usize);
            let mut sum = 0.0f64;
            let mut n = 0usize;
            for chunk in frame.rgb.chunks_exact(3).take(pixels) {
                sum += 0.299 * chunk[0] as f64
                    + 0.587 * chunk[1] as f64
                    + 0.114 * chunk[2] as f64;
                n += 1;
            }
            let lum = if n > 0 { Some((sum / n as f64) as f32) } else { None };

            let cloud = match depth::estimate_depth(&frame.rgb, frame.width, frame.height) {
                Ok(dm) => {
                    let intr = depth::CameraIntrinsics::default();
                    depth::backproject_depth(&dm, &intr, Some(&frame.rgb), 2)
                }
                Err(_) => depth::demo_depth_cloud(),
            };
            (cloud, lum)
        }
        Err(_) => (depth::demo_depth_cloud(), None),
    }
}
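The luminance pass above is the standard BT.601 luma weighting. A standalone sketch of the same arithmetic (the real code additionally clamps iteration to `width * height` pixels; `mean_luminance` is a hypothetical name, not a crate function):

```rust
// Standalone BT.601 mean-luminance sketch — the same per-pixel weighting
// used by `capture_camera_cloud_with_luminance`. Returns None for an
// empty buffer, mirroring the n > 0 guard above.
fn mean_luminance(rgb: &[u8]) -> Option<f32> {
    let mut sum = 0.0f64;
    let mut n = 0usize;
    for px in rgb.chunks_exact(3) {
        sum += 0.299 * px[0] as f64 + 0.587 * px[1] as f64 + 0.114 * px[2] as f64;
        n += 1;
    }
    if n > 0 { Some((sum / n as f64) as f32) } else { None }
}
```

Because 0.299 + 0.587 + 0.114 = 1.0, the result stays in the 0–255 range of the input bytes.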

fn demo_cloud() -> pointcloud::PointCloud {
    let occ = fusion::demo_occupancy();
    let wc = fusion::occupancy_to_pointcloud(&occ);
    let dc = depth::demo_depth_cloud();
    fusion::fuse_clouds(&[&wc, &dc], 0.05)
}

async fn api_cloud(State(state): State<Arc<AppState>>) -> Json<serde_json::Value> {
    let cloud = state.latest_cloud.lock().unwrap();
    let (min, max) = cloud.bounds();
    let frames = *state.frame_count.lock().unwrap();
    let pipeline = state.latest_pipeline.lock().unwrap();
    Json(serde_json::json!({
        "points": cloud.points.len(),
        "bounds_min": min, "bounds_max": max,
        "live": state.use_camera,
        "frame": frames,
        "pipeline": &*pipeline,
        "cloud": cloud.points.iter().take(1000).collect::<Vec<_>>(),
    }))
}

async fn api_splats(State(state): State<Arc<AppState>>) -> Json<serde_json::Value> {
    let splats = state.latest_splats.lock().unwrap();
    let frames = *state.frame_count.lock().unwrap();
    let pipeline = state.latest_pipeline.lock().unwrap();
    Json(serde_json::json!({
        "splats": &*splats,
        "count": splats.len(),
        "live": state.use_camera,
        "frame": frames,
        "pipeline": &*pipeline,
        "timestamp": chrono::Utc::now().timestamp_millis(),
    }))
}

async fn api_status(State(state): State<Arc<AppState>>) -> Json<serde_json::Value> {
    let frames = *state.frame_count.lock().unwrap();
    let pipeline = state.latest_pipeline.lock().unwrap();
    Json(serde_json::json!({
        "status": "ok",
        "version": env!("CARGO_PKG_VERSION"),
        "live": state.use_camera,
        "camera": if state.use_camera { "/dev/video0" } else { "demo" },
        "csi_pipeline": "active (UDP:3333)",
        "pipeline": &*pipeline,
        "frames_captured": frames,
    }))
}

async fn api_health() -> Json<serde_json::Value> {
    Json(serde_json::json!({"status": "ok"}))
}

/// Viewer HTML/JS, compiled into the binary at build time. Keep the
/// markup in `viewer.html` to keep this file under the 500-LOC limit and
/// to make it trivially editable (no Rust rebuild when tweaking JS).
static VIEWER_HTML: &str = include_str!("viewer.html");

async fn index() -> Html<&'static str> {
    Html(VIEWER_HTML)
}

@@ -0,0 +1,497 @@
//! Training pipeline — collect spatial observations and train depth/occupancy models.
//!
//! Three training modes:
//! 1. **Depth calibration**: capture camera frames + known distances → calibrate
//!    the luminance-to-depth mapping parameters
//! 2. **CSI occupancy training**: capture CSI with known occupancy ground truth →
//!    train the tomography weights for this room geometry
//! 3. **Brain integration**: store spatial observations as brain memories for
//!    DPO training — "this depth estimate was correct" vs "this was wrong"

use crate::fusion::OccupancyVolume;
use anyhow::{anyhow, Result};
use serde::{Deserialize, Serialize};
use std::path::{Path, PathBuf};

/// Reject a user-supplied path that contains `..` components (path traversal
/// attempt) and return the accepted path as a [`PathBuf`]. We only reject
/// `..`; other components (including relative prefixes and `~`) are accepted
/// verbatim — the caller is responsible for tilde expansion if needed.
pub fn sanitize_data_path(raw: &str) -> Result<PathBuf> {
    let p = PathBuf::from(raw);
    for comp in p.components() {
        if matches!(comp, std::path::Component::ParentDir) {
            return Err(anyhow!(
                "refusing to use data dir with `..` traversal component: {raw}"
            ));
        }
    }
    Ok(p)
}

/// Ensure `child` (after joining to `base`) stays inside the canonicalised
/// `base` directory. Returns the canonical child path on success. Used by
/// every filesystem write site in this module to prevent path-traversal
/// through user-supplied names.
fn safe_join(base: &Path, child: &str) -> Result<PathBuf> {
    // Reject absolute children and any `..` components up front.
    let child_path = Path::new(child);
    if child_path.is_absolute() {
        return Err(anyhow!("child path must be relative: {child}"));
    }
    for comp in child_path.components() {
        if matches!(comp, std::path::Component::ParentDir) {
            return Err(anyhow!("child path may not contain `..`: {child}"));
        }
    }

    let joined = base.join(child_path);
    // Canonicalise base (must exist) and verify joined starts with it. If the
    // joined file doesn't exist yet we canonicalise the parent.
    let canonical_base = base.canonicalize()
        .map_err(|e| anyhow!("data_dir not accessible {}: {e}", base.display()))?;
    let canonical_parent = joined
        .parent()
        .ok_or_else(|| anyhow!("no parent for {}", joined.display()))?;
    let canonical_parent = canonical_parent
        .canonicalize()
        .map_err(|e| anyhow!("parent not accessible {}: {e}", canonical_parent.display()))?;
    if !canonical_parent.starts_with(&canonical_base) {
        return Err(anyhow!(
            "refusing to write outside data_dir: {}",
            joined.display()
        ));
    }
    Ok(canonical_parent.join(
        joined.file_name().ok_or_else(|| anyhow!("no filename for {}", joined.display()))?,
    ))
}
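The `..`-rejection that both `sanitize_data_path` and `safe_join` perform before any canonicalisation boils down to one component scan. A minimal sketch (`has_parent_component` is a hypothetical name used only here):

```rust
use std::path::{Component, Path};

// Minimal sketch of the shared traversal check: a path is refused when any
// component parses as ParentDir (`..`). Note that lookalikes such as ".. "
// (trailing space) parse as Normal components and pass this check, which is
// why the canonicalise-and-starts_with verification above is still needed.
fn has_parent_component(p: &str) -> bool {
    Path::new(p).components().any(|c| matches!(c, Component::ParentDir))
}
```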

/// Training data sample — a snapshot of the scene.
#[derive(Serialize, Deserialize)]
pub struct TrainingSample {
    pub timestamp_ms: i64,
    pub source: String,
    /// Camera depth map (downsampled, in meters)
    pub depth_map: Option<Vec<f32>>,
    pub depth_width: u32,
    pub depth_height: u32,
    /// WiFi occupancy grid
    pub occupancy: Option<OccupancyData>,
    /// Ground truth (if available)
    pub ground_truth: Option<GroundTruth>,
    /// Quality score (0.0-1.0, rated by user or self-eval)
    pub quality: f32,
}

#[derive(Serialize, Deserialize)]
pub struct OccupancyData {
    pub densities: Vec<f64>,
    pub nx: usize,
    pub ny: usize,
    pub nz: usize,
}

impl From<&OccupancyVolume> for OccupancyData {
    fn from(vol: &OccupancyVolume) -> Self {
        Self {
            densities: vol.densities.clone(),
            nx: vol.nx, ny: vol.ny, nz: vol.nz,
        }
    }
}

#[derive(Serialize, Deserialize)]
pub struct GroundTruth {
    /// Known distances to reference points (e.g., wall at 3.0m)
    pub reference_distances: Vec<ReferencePoint>,
    /// Known occupancy state (person present/absent + location)
    pub occupancy_label: Option<String>,
}

#[derive(Serialize, Deserialize)]
pub struct ReferencePoint {
    pub name: String,
    pub x_pixel: u32,
    pub y_pixel: u32,
    pub true_distance_m: f32,
}

/// Training session — accumulates samples and learns calibration.
pub struct TrainingSession {
    pub samples: Vec<TrainingSample>,
    pub calibration: DepthCalibration,
    pub data_dir: PathBuf,
}

/// Depth calibration parameters — maps luminance to real depth.
#[derive(Clone, Serialize, Deserialize)]
pub struct DepthCalibration {
    pub scale: f32,     // multiplier for depth values
    pub offset: f32,    // additive offset
    pub near_clip: f32, // minimum valid depth
    pub far_clip: f32,  // maximum valid depth
    pub gamma: f32,     // nonlinear correction (luminance^gamma → depth)
    pub samples_used: u32,
    pub rmse: f32,      // root mean square error against ground truth
}

impl Default for DepthCalibration {
    fn default() -> Self {
        Self {
            scale: 4.0,
            offset: 1.0,
            near_clip: 0.3,
            far_clip: 8.0,
            gamma: 1.0,
            samples_used: 0,
            rmse: f32::MAX,
        }
    }
}

impl TrainingSession {
    /// Create a new training session rooted at `data_dir`.
    ///
    /// `data_dir` must not contain `..` components — we reject path traversal
    /// attempts from CLI/API input. The directory is created if missing and
    /// then canonicalised so every subsequent write stays inside it.
    pub fn new(data_dir: &str) -> Result<Self> {
        let path = sanitize_data_path(data_dir)?;
        std::fs::create_dir_all(&path)
            .map_err(|e| anyhow!("failed to create data_dir {}: {e}", path.display()))?;
        // Canonicalise so path-traversal checks in safe_join have a fixed root.
        let path = path
            .canonicalize()
            .map_err(|e| anyhow!("cannot canonicalise data_dir {}: {e}", path.display()))?;

        // Load existing calibration if available
        let cal_path = safe_join(&path, "calibration.json")
            // safe_join needs the parent to exist; for initial load that's always data_dir
            .or_else(|_| Ok::<_, anyhow::Error>(path.join("calibration.json")))?;
        let calibration = if cal_path.exists() {
            let data = std::fs::read_to_string(&cal_path)?;
            serde_json::from_str(&data).unwrap_or_default()
        } else {
            DepthCalibration::default()
        };

        Ok(Self {
            samples: Vec::new(),
            calibration,
            data_dir: path,
        })
    }

    /// Add a training sample with optional ground truth.
    pub fn add_sample(
        &mut self,
        depth_map: Option<Vec<f32>>,
        width: u32,
        height: u32,
        occupancy: Option<&OccupancyVolume>,
        ground_truth: Option<GroundTruth>,
        quality: f32,
    ) {
        let sample = TrainingSample {
            timestamp_ms: chrono::Utc::now().timestamp_millis(),
            source: "capture".to_string(),
            depth_map,
            depth_width: width,
            depth_height: height,
            occupancy: occupancy.map(OccupancyData::from),
            ground_truth,
            quality,
        };
        self.samples.push(sample);
    }

    /// Calibrate depth estimation using ground truth reference points.
    ///
    /// Finds optimal scale, offset, and gamma to minimize RMSE
    /// between estimated and true depths at reference points.
    pub fn calibrate_depth(&mut self) -> Result<DepthCalibration> {
        let mut best = self.calibration.clone();
        let mut best_rmse = f32::MAX;

        // Collect all reference points across samples
        let refs: Vec<(f32, f32)> = self.samples.iter()
            .filter_map(|s| {
                let gt = s.ground_truth.as_ref()?;
                let dm = s.depth_map.as_ref()?;
                Some(gt.reference_distances.iter().filter_map(|rp| {
                    let idx = (rp.y_pixel * s.depth_width + rp.x_pixel) as usize;
                    dm.get(idx).map(|&est| (est, rp.true_distance_m))
                }).collect::<Vec<_>>())
            })
            .flatten()
            .collect();

        if refs.is_empty() {
            eprintln!(" No reference points — using default calibration");
            return Ok(best);
        }

        eprintln!(" Calibrating with {} reference points...", refs.len());

        // Grid search over scale, offset, gamma
        for scale_i in 0..20 {
            let scale = 1.0 + scale_i as f32 * 0.5;
            for offset_i in 0..10 {
                let offset = offset_i as f32 * 0.5;
                for gamma_i in 5..15 {
                    let gamma = gamma_i as f32 * 0.2;

                    let rmse = refs.iter()
                        .map(|&(est, truth)| {
                            let calibrated = offset + est.powf(gamma) * scale;
                            (calibrated - truth).powi(2)
                        })
                        .sum::<f32>() / refs.len() as f32;
                    let rmse = rmse.sqrt();

                    if rmse < best_rmse {
                        best_rmse = rmse;
                        best = DepthCalibration {
                            scale, offset, gamma,
                            near_clip: 0.3, far_clip: 8.0,
                            samples_used: refs.len() as u32,
                            rmse,
                        };
                    }
                }
            }
        }

        eprintln!(" Best calibration: scale={:.2} offset={:.2} gamma={:.2} RMSE={:.4}m",
            best.scale, best.offset, best.gamma, best.rmse);

        self.calibration = best.clone();
        self.save_calibration()?;
        Ok(best)
    }

    /// Train CSI occupancy model — adjust tomography weights.
    ///
    /// Uses samples with known occupancy labels to optimize the
    /// attenuation-to-density mapping.
    pub fn train_occupancy(&self) -> Result<OccupancyCalibration> {
        let labeled: Vec<&TrainingSample> = self.samples.iter()
            .filter(|s| s.ground_truth.as_ref().and_then(|g| g.occupancy_label.as_ref()).is_some())
            .collect();

        if labeled.is_empty() {
            eprintln!(" No labeled occupancy samples — using defaults");
            return Ok(OccupancyCalibration::default());
        }

        eprintln!(" Training occupancy model with {} samples...", labeled.len());

        // Simple threshold optimization — find the density threshold
        // that best separates occupied vs unoccupied
        let mut best_threshold = 0.3f64;
        let mut best_accuracy = 0.0f64;

        for thresh_i in 1..20 {
            let threshold = thresh_i as f64 * 0.05;
            let mut correct = 0;
            let mut total = 0;

            for sample in &labeled {
                if let Some(ref occ) = sample.occupancy {
                    let label = sample.ground_truth.as_ref().unwrap()
                        .occupancy_label.as_ref().unwrap();
                    let is_occupied = label == "occupied" || label == "present";
                    let detected = occ.densities.iter().any(|&d| d > threshold);
                    if detected == is_occupied { correct += 1; }
                    total += 1;
                }
            }

            let accuracy = correct as f64 / total.max(1) as f64;
            if accuracy > best_accuracy {
                best_accuracy = accuracy;
                best_threshold = threshold;
            }
        }

        let cal = OccupancyCalibration {
            density_threshold: best_threshold,
            accuracy: best_accuracy,
            samples_used: labeled.len() as u32,
        };

        eprintln!(" Occupancy threshold={:.2} accuracy={:.1}%", cal.density_threshold, cal.accuracy * 100.0);

        // Save (path-traversal safe: constant filename under canonical data_dir)
        let path = safe_join(&self.data_dir, "occupancy_calibration.json")?;
        std::fs::write(&path, serde_json::to_string_pretty(&cal)?)?;

        Ok(cal)
    }

    /// Export training data as preference pairs for DPO training on the brain.
    ///
    /// Good samples (quality > 0.7) → chosen
    /// Bad samples (quality < 0.3) → rejected
    pub fn export_preference_pairs(&self) -> Result<Vec<PreferencePair>> {
        let mut pairs = Vec::new();

        let good: Vec<&TrainingSample> = self.samples.iter()
            .filter(|s| s.quality > 0.7)
            .collect();
        let bad: Vec<&TrainingSample> = self.samples.iter()
            .filter(|s| s.quality < 0.3)
            .collect();

        for (g, b) in good.iter().zip(bad.iter()) {
            pairs.push(PreferencePair {
                chosen: format!(
                    "Depth estimation at {}ms: {} points, quality {:.2}",
                    g.timestamp_ms,
                    g.depth_map.as_ref().map(|d| d.len()).unwrap_or(0),
                    g.quality
                ),
                rejected: format!(
                    "Depth estimation at {}ms: {} points, quality {:.2}",
                    b.timestamp_ms,
                    b.depth_map.as_ref().map(|d| d.len()).unwrap_or(0),
                    b.quality
                ),
            });
        }

        // Save pairs (path-traversal safe: constant filename under canonical data_dir)
        let path = safe_join(&self.data_dir, "preference_pairs.jsonl")?;
        let mut f = std::fs::File::create(&path)?;
        for pair in &pairs {
            use std::io::Write;
            writeln!(f, "{}", serde_json::to_string(pair)?)?;
        }

        eprintln!(" Exported {} preference pairs to {}", pairs.len(), path.display());
        Ok(pairs)
    }

    /// Send training results to the ruOS brain for storage.
    pub async fn submit_to_brain(&self, brain_url: &str) -> Result<u32> {
        let client = reqwest::Client::builder()
            .timeout(std::time::Duration::from_secs(10))
            .build()?;

        let mut stored = 0u32;

        // Store calibration as brain memory
        let _cal_json = serde_json::to_string(&self.calibration)?;
        let body = serde_json::json!({
            "category": "spatial-calibration",
            "content": format!("Depth calibration: scale={:.2} offset={:.2} gamma={:.2} RMSE={:.4}m ({} samples)",
                self.calibration.scale, self.calibration.offset, self.calibration.gamma,
                self.calibration.rmse, self.calibration.samples_used),
        });
        if client.post(format!("{brain_url}/memories"))
            .json(&body).send().await.is_ok() {
            stored += 1;
        }

        // Store good observations
        for sample in self.samples.iter().filter(|s| s.quality > 0.5) {
            let body = serde_json::json!({
                "category": "spatial-observation",
                "content": format!("Point cloud capture: {} depth points, quality {:.2}, occupancy {}",
                    sample.depth_map.as_ref().map(|d| d.len()).unwrap_or(0),
                    sample.quality,
                    sample.occupancy.as_ref().map(|o| format!("{}x{}x{}", o.nx, o.ny, o.nz)).unwrap_or("none".into())),
            });
            if client.post(format!("{brain_url}/memories"))
                .json(&body).send().await.is_ok() {
                stored += 1;
            }
        }

        eprintln!(" Submitted {} observations to brain", stored);
        Ok(stored)
    }

    /// Save current calibration to disk (path-traversal safe).
    fn save_calibration(&self) -> Result<()> {
        let path = safe_join(&self.data_dir, "calibration.json")?;
        std::fs::write(&path, serde_json::to_string_pretty(&self.calibration)?)?;
        Ok(())
    }

    /// Save all samples to disk (path-traversal safe).
    pub fn save_samples(&self) -> Result<()> {
        let path = safe_join(&self.data_dir, "samples.json")?;
        std::fs::write(&path, serde_json::to_string_pretty(&self.samples)?)?;
        eprintln!(" Saved {} samples to {}", self.samples.len(), path.display());
        Ok(())
    }

    /// Load samples from disk (path-traversal safe).
    pub fn load_samples(&mut self) -> Result<()> {
        let path = safe_join(&self.data_dir, "samples.json")?;
        if path.exists() {
            let data = std::fs::read_to_string(&path)?;
            self.samples = serde_json::from_str(&data)?;
            eprintln!(" Loaded {} samples", self.samples.len());
        }
        Ok(())
    }
}
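The grid search in `calibrate_depth` minimises the RMSE of the mapping `offset + est^gamma * scale` against ground-truth pairs. A self-contained sketch of that objective, with a hypothetical free-function name (the real computation is inline in the method):

```rust
// RMSE of the calibrated depth mapping over (estimated, true) pairs —
// the objective `calibrate_depth`'s grid search minimises. Assumes a
// non-empty slice, matching the refs.is_empty() early return above.
fn calibration_rmse(refs: &[(f32, f32)], scale: f32, offset: f32, gamma: f32) -> f32 {
    let mse = refs.iter()
        .map(|&(est, truth)| {
            let calibrated = offset + est.powf(gamma) * scale;
            (calibrated - truth).powi(2)
        })
        .sum::<f32>() / refs.len() as f32;
    mse.sqrt()
}
```

The search sweeps scale ∈ [1.0, 10.5), offset ∈ [0.0, 5.0), gamma ∈ [1.0, 3.0) in fixed steps, so the best RMSE is exact within that 20×10×10 grid rather than a continuous optimum.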

#[derive(Serialize, Deserialize)]
pub struct OccupancyCalibration {
    pub density_threshold: f64,
    pub accuracy: f64,
    pub samples_used: u32,
}

impl Default for OccupancyCalibration {
    fn default() -> Self {
        Self { density_threshold: 0.3, accuracy: 0.0, samples_used: 0 }
    }
}

#[derive(Serialize, Deserialize)]
pub struct PreferencePair {
    pub chosen: String,
    pub rejected: String,
}
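The threshold sweep in `train_occupancy` scores each candidate by how often "any voxel density above the threshold" agrees with the occupancy label. A sketch of that scoring rule, with `(Vec<f64>, bool)` standing in for a labeled sample (hypothetical function name, not part of the crate):

```rust
// Threshold scoring as in `train_occupancy`: a sample is "detected" when
// any voxel density exceeds the threshold; accuracy is the fraction of
// samples whose detection matches the is_occupied label.
fn threshold_accuracy(samples: &[(Vec<f64>, bool)], threshold: f64) -> f64 {
    let correct = samples.iter()
        .filter(|(densities, occupied)| {
            densities.iter().any(|&d| d > threshold) == *occupied
        })
        .count();
    correct as f64 / samples.len().max(1) as f64
}
```

`train_occupancy` evaluates this at thresholds 0.05, 0.10, …, 0.95 and keeps the best-scoring one.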

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn sanitize_rejects_parent_dir_traversal() {
        assert!(sanitize_data_path("../etc/passwd").is_err());
        assert!(sanitize_data_path("foo/../bar").is_err());
        assert!(sanitize_data_path("/tmp/.. /evil").is_ok(), "`.. ` is not ParentDir");
    }

    #[test]
    fn sanitize_accepts_relative_child() {
        assert!(sanitize_data_path("data/ruview").is_ok());
        assert!(sanitize_data_path("./foo").is_ok());
    }

    #[test]
    fn training_session_new_rejects_traversal() {
        // Even if the filesystem has such a path, TrainingSession should refuse.
        let err = TrainingSession::new("../etc/passwd").err();
        assert!(err.is_some(), "traversal path must be rejected");
    }

    #[test]
    fn training_session_new_accepts_child_path() {
        // Use a unique tmpdir to avoid cross-test interference.
        let tmp = std::env::temp_dir().join(format!("ruview-train-test-{}", std::process::id()));
        let _ = std::fs::remove_dir_all(&tmp);
        let sess = TrainingSession::new(tmp.to_str().unwrap())
            .expect("TrainingSession should accept a clean tmpdir");
        // data_dir should have been canonicalised to an absolute path.
        assert!(sess.data_dir.is_absolute());
        let _ = std::fs::remove_dir_all(&tmp);
    }
}

@@ -0,0 +1,229 @@
<!DOCTYPE html>
<html>
<head>
  <title>RuView — Camera + WiFi CSI Point Cloud</title>
  <style>
    body { margin: 0; background: #0a0a0a; color: #e8a634; font-family: monospace; }
    canvas { display: block; }
    #info { position: absolute; top: 10px; left: 10px; padding: 12px; background: rgba(0,0,0,0.85); border: 1px solid #e8a634; border-radius: 6px; min-width: 240px; font-size: 13px; line-height: 1.5; }
    .live { color: #4f4; } .demo { color: #f44; }
    .section { margin-top: 6px; padding-top: 6px; border-top: 1px solid #333; }
    .label { color: #888; }
  </style>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/r128/three.min.js"></script>
  <script src="https://cdn.jsdelivr.net/npm/three@0.128.0/examples/js/controls/OrbitControls.js"></script>
</head>
<body>
  <div id="info">
    <h3 style="margin:0 0 8px 0">RuView Point Cloud</h3>
    <div id="stats">Loading...</div>
  </div>
  <script>
var scene = new THREE.Scene();
scene.background = new THREE.Color(0x0a0a0a);
var camera = new THREE.PerspectiveCamera(75, window.innerWidth/window.innerHeight, 0.1, 100);
camera.position.set(0, 2, -4);
camera.lookAt(0, 0, 2);

var renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

var controls = new THREE.OrbitControls(camera, renderer.domElement);
controls.enableDamping = true;
controls.target.set(0, 0, 2);

var pointsMesh = null;
var lastFrame = -1;
var skeletonGroup = null;
var prevTimestamp = 0;
var frameRateVal = 0;

// COCO skeleton connections: pairs of keypoint indices
// 0=nose 1=leftEye 2=rightEye 3=leftEar 4=rightEar
// 5=leftShoulder 6=rightShoulder 7=leftElbow 8=rightElbow
// 9=leftWrist 10=rightWrist 11=leftHip 12=rightHip
// 13=leftKnee 14=rightKnee 15=leftAnkle 16=rightAnkle
var COCO_BONES = [
  [0,1],[0,2],[1,3],[2,4],
  [5,6],[5,7],[7,9],[6,8],[8,10],
  [5,11],[6,12],[11,12],
  [11,13],[13,15],[12,14],[14,16]
];

function clearSkeleton() {
  if (skeletonGroup) {
    scene.remove(skeletonGroup);
    skeletonGroup.traverse(function(obj) {
      if (obj.geometry) obj.geometry.dispose();
      if (obj.material) obj.material.dispose();
    });
    skeletonGroup = null;
  }
}

function drawSkeleton(keypoints) {
  clearSkeleton();
  if (!keypoints || keypoints.length < 17) return;
  skeletonGroup = new THREE.Group();

  // Map keypoints from [0,1] to scene coords
  // x: [-2, 2], y: [2, -2] (flip y), z: fixed at 2
  var sphereGeo = new THREE.SphereGeometry(0.04, 8, 8);
  var sphereMat = new THREE.MeshBasicMaterial({ color: 0xffff00 });
  var positions3D = [];
  var i, kp, sx, sy;
  for (i = 0; i < 17; i++) {
    kp = keypoints[i];
    if (!kp) { positions3D.push(null); continue; }
    sx = (kp[0] - 0.5) * 4;
    sy = (0.5 - kp[1]) * 4;
    positions3D.push([sx, sy, 2]);
    var sphere = new THREE.Mesh(sphereGeo, sphereMat);
    sphere.position.set(sx, sy, 2);
    skeletonGroup.add(sphere);
  }

  // Draw bones as white lines
  var lineMat = new THREE.LineBasicMaterial({ color: 0xffffff, linewidth: 2 });
  var b, a, bIdx;
  for (b = 0; b < COCO_BONES.length; b++) {
    a = COCO_BONES[b][0];
    bIdx = COCO_BONES[b][1];
    if (!positions3D[a] || !positions3D[bIdx]) continue;
    var lineGeo = new THREE.BufferGeometry();
    var verts = new Float32Array([
      positions3D[a][0], positions3D[a][1], positions3D[a][2],
      positions3D[bIdx][0], positions3D[bIdx][1], positions3D[bIdx][2]
    ]);
    lineGeo.setAttribute("position", new THREE.BufferAttribute(verts, 3));
    var line = new THREE.Line(lineGeo, lineMat);
    skeletonGroup.add(line);
  }

  scene.add(skeletonGroup);
}

async function fetchCloud() {
  try {
    var resp = await fetch("/api/splats");
    var data = await resp.json();
    if (data.splats && data.frame !== lastFrame) {
      // Compute CSI frame rate
      var now = Date.now();
      if (prevTimestamp > 0) {
        var dt = (now - prevTimestamp) / 1000.0;
        if (dt > 0) frameRateVal = (1.0 / dt).toFixed(1);
      }
      prevTimestamp = now;
      lastFrame = data.frame;
      updateSplats(data.splats);

      // Draw skeleton if available
      var pipe = data.pipeline;
      if (pipe && pipe.skeleton && pipe.skeleton.keypoints) {
        drawSkeleton(pipe.skeleton.keypoints);
      } else {
        clearSkeleton();
      }

      // Build info panel
      var mode = data.live
        ? '<span class="live">● LIVE</span>'
        : '<span class="demo">● DEMO</span>';
      var html = mode + " Camera + CSI<br>"
        + "Splats: " + data.count + "<br>"
        + "Frame: " + data.frame;

      // CSI frame rate
      html += '<div class="section">'
        + '<span class="label">CSI Rate:</span> '
        + frameRateVal + " fps</div>";

      // Skeleton confidence
      if (pipe && pipe.skeleton && pipe.skeleton.confidence !== undefined) {
        var conf = (pipe.skeleton.confidence * 100).toFixed(0);
        html += '<div class="section">'
          + '<span class="label">Skeleton:</span> '
          + conf + "% confidence</div>";
      }

      // Weather data
      if (pipe && pipe.weather) {
        var w = pipe.weather;
        html += '<div class="section">'
          + '<span class="label">Weather:</span> ';
        if (w.temperature !== undefined) {
          html += w.temperature + "°C";
        }
        if (w.conditions) {
          html += " " + w.conditions;
        }
        html += "</div>";
      }

      // Building count from geo
      if (pipe && pipe.geo && pipe.geo.building_count !== undefined) {
        html += '<div class="section">'
          + '<span class="label">Buildings:</span> '
          + pipe.geo.building_count + "</div>";
      }

      // Vitals
      if (pipe && pipe.vitals) {
        var v = pipe.vitals;
        html += '<div class="section">'
          + '<span class="label">Vitals:</span> ';
        if (v.breathing_rate !== undefined) {
          html += "BR " + v.breathing_rate + "/min";
        }
        if (v.motion_score !== undefined) {
html += " Motion " + (v.motion_score * 100).toFixed(0) + "%";
|
||||
}
|
||||
html += "</div>";
|
||||
}
|
||||
|
||||
document.getElementById("stats").innerHTML = html;
|
||||
}
|
||||
} catch(e) {}
|
||||
}
|
||||
fetchCloud();
|
||||
setInterval(fetchCloud, 500);
|
||||
|
||||
function updateSplats(splats) {
|
||||
if (pointsMesh) scene.remove(pointsMesh);
|
||||
var geometry = new THREE.BufferGeometry();
|
||||
var positions = new Float32Array(splats.length * 3);
|
||||
var colors = new Float32Array(splats.length * 3);
|
||||
var i, s;
|
||||
for (i = 0; i < splats.length; i++) {
|
||||
s = splats[i];
|
||||
positions[i*3] = s.center[0];
|
||||
positions[i*3+1] = -s.center[1];
|
||||
positions[i*3+2] = s.center[2];
|
||||
colors[i*3] = s.color[0];
|
||||
colors[i*3+1] = s.color[1];
|
||||
colors[i*3+2] = s.color[2];
|
||||
}
|
||||
geometry.setAttribute("position", new THREE.BufferAttribute(positions, 3));
|
||||
geometry.setAttribute("color", new THREE.BufferAttribute(colors, 3));
|
||||
pointsMesh = new THREE.Points(geometry, new THREE.PointsMaterial({
|
||||
size: 0.02, vertexColors: true, sizeAttenuation: true
|
||||
}));
|
||||
scene.add(pointsMesh);
|
||||
}
|
||||
|
||||
function animate() {
|
||||
requestAnimationFrame(animate);
|
||||
controls.update();
|
||||
renderer.render(scene, camera);
|
||||
}
|
||||
animate();
|
||||
window.addEventListener("resize", function() {
|
||||
camera.aspect = window.innerWidth / window.innerHeight;
|
||||
camera.updateProjectionMatrix();
|
||||
renderer.setSize(window.innerWidth, window.innerHeight);
|
||||
});
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
||||