Mirror of https://github.com/ruvnet/RuView.git, synced 2026-04-28 05:59:32 +00:00
feat: Real-time dense point cloud from camera + WiFi CSI (#405)
* Add wifi-densepose-pointcloud: real-time dense point cloud from camera + WiFi CSI
New crate with 5 modules:
- depth: monocular depth estimation + 3D backprojection (ONNX-ready, synthetic fallback)
- pointcloud: Point3D/ColorPoint types, PLY export, Gaussian splat conversion
- fusion: WiFi occupancy volume → point cloud + multi-modal voxel fusion
- stream: HTTP + Three.js viewer server (Axum, port 9880)
- main: CLI with serve/capture/demo subcommands
Demo output: 271 WiFi points + 19,200 depth points → 4,886 fused → 1,718 Gaussian splats.
Serves interactive 3D viewer at http://localhost:9880 with Three.js orbit controls.
ADR-SYS-0021 documents the architecture for camera + WiFi CSI dense point cloud pipeline.
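A minimal sketch of the depth back-projection step (pinhole model; the intrinsics `fx/fy/cx/cy` and the `Point3D` layout here are illustrative assumptions, not the crate's actual `depth` API):

```rust
/// Illustrative pinhole back-projection: one 3D point per valid depth pixel.
/// Intrinsics are assumed values, not the crate's real configuration.
struct Point3D { x: f32, y: f32, z: f32 }

fn backproject(depth: &[f32], width: usize, height: usize,
               fx: f32, fy: f32, cx: f32, cy: f32) -> Vec<Point3D> {
    let mut points = Vec::with_capacity(width * height);
    for v in 0..height {
        for u in 0..width {
            let z = depth[v * width + u];
            if !z.is_finite() || z <= 0.0 { continue; } // skip invalid depths
            points.push(Point3D {
                x: (u as f32 - cx) * z / fx,
                y: (v as f32 - cy) * z / fy,
                z,
            });
        }
    }
    points
}
```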
Co-Authored-By: claude-flow <ruv@ruv.net>
* Optimize pointcloud: larger splat voxels, smaller responses, faster fusion
- Gaussian splat voxel size: 0.10 → 0.15 (42% fewer splats: 1718 → 994)
- Splat response: 399 KB → 225 KB (44% smaller)
- Pipeline: 22.2ms mean (100 runs, σ=0.3ms)
- Cloud API: 1.11ms avg, 905 req/s
- Splats API: 1.39ms avg, 719 req/s
- Binary: 1.0 MB arm64 (Mac Mini), tested
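The splat reduction comes from coarser voxel binning; a minimal sketch of the idea (the integer-key hashing and keep-first policy are assumptions, not necessarily the crate's exact dedup rule):

```rust
use std::collections::HashSet;

/// Keep at most one point per cubic voxel of side `voxel_size` (metres).
/// Growing the voxel from 0.10 m to 0.15 m is what cut the splat count;
/// keep-first is an assumption (averaging per voxel is also common).
fn voxel_downsample(points: &[[f32; 3]], voxel_size: f32) -> Vec<[f32; 3]> {
    let mut seen = HashSet::new();
    let mut out = Vec::new();
    for p in points {
        let key = (
            (p[0] / voxel_size).floor() as i64,
            (p[1] / voxel_size).floor() as i64,
            (p[2] / voxel_size).floor() as i64,
        );
        if seen.insert(key) {
            out.push(*p);
        }
    }
    out
}
```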
Co-Authored-By: claude-flow <ruv@ruv.net>
* Complete implementation: camera capture, WiFi CSI receiver, training pipeline
Three new modules added to wifi-densepose-pointcloud:
1. camera.rs — Cross-platform camera capture
- macOS: AVFoundation via Swift, ffmpeg avfoundation
- Linux: V4L2, ffmpeg v4l2
- Camera detection, listing, frame capture to RGB
- Graceful fallback to synthetic data when no camera
2. csi.rs — WiFi CSI receiver for ESP32 nodes
- UDP listener for CSI JSON frames from ESP32
- Per-link attenuation tracking with EMA smoothing
- Simplified RF tomography (backprojection to occupancy grid)
- Test frame sender for development without hardware
- Ready for real ESP32 CSI data from ruvzen
3. training.rs — Calibration and training pipeline
- Depth calibration: grid search over scale/offset/gamma
- Occupancy training: threshold optimization for presence detection
- Ground truth reference points for depth RMSE measurement
- Preference pair export (JSONL) for DPO training on ruOS brain
- Brain integration: submit observations as memories
- Persistent calibration files (JSON)
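A minimal sketch of the calibration grid search described above (the parameter ranges, the `scale * d^gamma + offset` correction model, and the RMSE objective are assumptions about the shape of the search, not the crate's exact code):

```rust
/// Grid-search scale/offset/gamma minimising RMSE between corrected depths
/// and ground-truth reference distances. Ranges and the correction model
/// are illustrative assumptions.
fn calibrate(raw: &[f64], truth: &[f64]) -> (f64, f64, f64) {
    assert!(!raw.is_empty() && raw.len() == truth.len());
    let mut best = (1.0, 0.0, 1.0);
    let mut best_rmse = f64::INFINITY;
    for s in (5..=20).map(|i| i as f64 * 0.1) {          // scale 0.5..2.0
        for o in (-10..=10).map(|i| i as f64 * 0.05) {   // offset -0.5..0.5 m
            for g in (8..=12).map(|i| i as f64 * 0.1) {  // gamma 0.8..1.2
                let mse: f64 = raw.iter().zip(truth)
                    .map(|(&d, &t)| (s * d.powf(g) + o - t).powi(2))
                    .sum::<f64>() / raw.len() as f64;
                let rmse = mse.sqrt();
                if rmse < best_rmse {
                    best_rmse = rmse;
                    best = (s, o, g);
                }
            }
        }
    }
    best
}
```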
New CLI commands:
ruview-pointcloud cameras # list available cameras
ruview-pointcloud train # run calibration + training
ruview-pointcloud csi-test # send test CSI frames
ruview-pointcloud serve --csi # serve with live CSI input
All tested: demo, training (10 samples, 4 reference points, 3 pairs),
CSI receiver (50 test frames), server API.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Fix viewer: replace WebSocket with fetch polling
Co-Authored-By: claude-flow <ruv@ruv.net>
* Wire live camera into server — real-time updating point cloud
- Server captures from /dev/video0 at 2fps via ffmpeg
- Background tokio task refreshes cloud + splats every 500ms
- Viewer polls /api/splats every 500ms, only updates on new frame
- Shows 🟢 LIVE / 🔴 DEMO indicator
- Camera position set for first-person view (looking forward into scene)
- Downsample 4x for performance (19,200 points per frame)
- Graceful fallback to demo data if camera capture fails
Co-Authored-By: claude-flow <ruv@ruv.net>
* Add MiDaS GPU depth, serial CSI reader, full sensor fusion
- MiDaS depth server: PyTorch on CUDA, real monocular depth estimation
- Rust server calls MiDaS via HTTP for neural depth (falls back to luminance)
- Serial CSI reader for ESP32 with motion detection + presence estimation
- CSI disabled by default (RUVIEW_CSI=1 to enable) — serial reader needs baud config
- Edge-enhanced depth for better object boundaries
- All sensors wired: camera, ESP32 CSI, mmWave (CSI gated until serial fixed)
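A minimal sketch of the luminance fallback mentioned above (the brightness-to-depth mapping and the 0.5–4.0 m range are assumptions; MiDaS output replaces this whenever the depth server is reachable):

```rust
/// Crude fallback: map pixel luminance to pseudo-depth so the viewer still
/// has geometry when the MiDaS server is unreachable. The inverse-brightness
/// mapping and depth range are illustrative assumptions only.
fn luminance_to_depth(rgb: &[u8], width: usize, height: usize) -> Vec<f32> {
    let mut depth = Vec::with_capacity(width * height);
    for px in rgb.chunks_exact(3) {
        // Rec. 601 luma
        let luma = 0.299 * px[0] as f32 + 0.587 * px[1] as f32 + 0.114 * px[2] as f32;
        // Assumption: brighter pixels are treated as nearer, darker as farther.
        depth.push(0.5 + (1.0 - luma / 255.0) * 3.5);
    }
    debug_assert_eq!(depth.len(), width * height);
    depth
}
```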
Co-Authored-By: claude-flow <ruv@ruv.net>
* Complete 7-component sensor fusion pipeline (all working)
1. ADR-018 binary parser — decodes ESP32 CSI UDP frames, extracts I/Q subcarriers
2. WiFlow pose — 17 COCO keypoints from CSI (186K param model loaded)
3. Camera depth — MiDaS on CUDA + luminance fallback
4. Sensor fusion — camera depth + CSI occupancy grid + skeleton overlay
5. RF tomography — ISTA-inspired backprojection from per-node RSSI
6. Vital signs — breathing rate from CSI phase analysis
7. Motion-adaptive — skip expensive depth when CSI shows no motion
Live results: 510 CSI frames/session, 17 keypoints, 26% motion, 40 BPM breathing.
Both ESP32 nodes provisioned to send CSI to 192.168.1.123:3333.
Magic number fix: supports both 0xC5110001 (v1) and 0xC5110006 (v6) frames.
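The motion-adaptive gate (component 7 above) boils down to a variance check on recent CSI amplitude; a minimal sketch (the 20-frame window matches the description, the threshold value is an assumption):

```rust
/// Motion flag from the variance of mean CSI amplitude over the last 20 frames.
/// The threshold is an illustrative assumption.
fn motion_detected(amplitude_history: &[f64], threshold: f64) -> bool {
    let window = &amplitude_history[amplitude_history.len().saturating_sub(20)..];
    if window.len() < 2 { return false; }
    let mean = window.iter().sum::<f64>() / window.len() as f64;
    let variance = window.iter().map(|a| (a - mean).powi(2)).sum::<f64>() / window.len() as f64;
    variance > threshold
}
```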
Co-Authored-By: claude-flow <ruv@ruv.net>
* Add brain bridge — sparse spatial observation sync every 60s
Stores room scan summaries, motion events, and vital signs
in the ruOS brain as memories. Only syncs every 120 frames
(~60 seconds) to keep the brain sparse and optimized.
Categories: spatial-observation, spatial-motion, spatial-vitals.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Update README + user guide with dense point cloud features
Added pointcloud section to README (quick start, CLI, performance).
Added comprehensive user guide section: setup, sensors, commands,
pipeline components, API endpoints, training, output formats,
deep room scan, ESP32 provisioning.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Add ruview-geo: geospatial satellite integration (11 modules, 8/8 tests)
New crate with free satellite imagery, terrain, OSM, weather, and brain integration.
Modules: types, coord, locate, cache, tiles, terrain, osm, register, fuse, brain, temporal
Tests: 8 passed (haversine, ENU roundtrip, tiles, HGT parse, registration)
Validation on real data: 43.49°N 79.71°W, 4 Sentinel-2 tiles, 2°C weather, results stored in the brain
Data sources (all free, no API keys):
- EOX Sentinel-2 cloudless (10m satellite tiles)
- SRTM GL1 (30m elevation)
- Overpass API (OSM buildings/roads)
- ip-api.com (geolocation)
- Open Meteo (weather)
ADR-044 documents architecture decisions.
README.md in crate subdirectory.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Update ADR-044: add Common Crawl WET, NASA FIRMS, OpenAQ, Overture Maps sources
Extended geospatial data sources leveraging ruvector's existing web_ingest
and Common Crawl support for hyperlocal context.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Fix OSM/SRTM queries, add change detection + night mode
- OSM: use inclusive building filter with relation query and 25s timeout
- SRTM: switch to NASA public mirror with viewfinderpanoramas fallback
- Add detect_tile_changes() for pixel-diff satellite change detection
- Add is_night() solar-declination model for CSI-only night mode
- 6 new unit tests (night mode + tile change detection)
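A minimal sketch of the kind of solar-declination day/night check `is_night()` implies (low-accuracy textbook approximations with a 0° elevation cutoff; the crate's constants and twilight handling may differ):

```rust
/// Rough night test: is the sun below the horizon at the given UTC time?
/// Uses the common low-accuracy declination/hour-angle approximations.
fn is_night(lat_deg: f64, lon_deg: f64, day_of_year: u32, utc_hour: f64) -> bool {
    // Solar declination (degrees), simple cosine approximation.
    let decl = -23.44 * ((360.0 / 365.0) * (day_of_year as f64 + 10.0)).to_radians().cos();
    // Local solar time and hour angle (15 degrees per hour from solar noon).
    let solar_time = utc_hour + lon_deg / 15.0;
    let hour_angle = (15.0 * (solar_time - 12.0)).to_radians();
    let (lat, decl) = (lat_deg.to_radians(), decl.to_radians());
    let elevation = (lat.sin() * decl.sin() + lat.cos() * decl.cos() * hour_angle.cos()).asin();
    elevation < 0.0 // below the horizon
}
```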
Co-Authored-By: claude-flow <ruv@ruv.net>
* Enhance viewer: skeleton overlay, weather, buildings, better camera
Add COCO skeleton rendering with yellow keypoint spheres and white bone
lines, info panel sections for weather/buildings/CSI rate/confidence,
overhead camera at (0,2,-4), and denser point size with sizeAttenuation.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Add CSI fingerprint DB + night mode detection
Co-Authored-By: claude-flow <ruv@ruv.net>
* Fix ADR-044 numbering conflict, update geo README
Renumbered provisioning tool ADR from 044 to 050 to avoid conflict
with geospatial satellite integration ADR-044.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Clean up warnings: suppress dead_code for conditional pipeline modules
Removes unused imports/variables via cargo fix and adds #[allow(dead_code)]
for modules used conditionally at runtime (CSI, depth, fusion, serial).
Pointcloud: 28 → 0 warnings. Geo: 2 → 0 warnings. 8/8 tests pass.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Fix PR #405 blockers: async runtime panic, crate rename, path traversal, brain URL config
- brain_bridge.rs: replace `Handle::current().block_on(...)` inside async fn
with `.await` (was a guaranteed "runtime within runtime" panic). Brain URL
now read from RUVIEW_BRAIN_URL env var (default http://127.0.0.1:9876),
logged once via OnceLock.
- wifi-densepose-geo: rename Cargo package from `ruview-geo` to
`wifi-densepose-geo` to match directory and workspace conventions. Update
all use sites (tests/examples/README). Same env-var pattern for brain URL
in brain.rs + temporal.rs.
- training.rs: add sanitize_data_path() rejecting `..` components and
safe_join() that canonicalises + enforces base-dir containment on every
write (calibration.json, samples.json, preference_pairs.jsonl,
occupancy_calibration.json). Defence-in-depth check also in main.rs
before TrainingSession::new.
- osm.rs: clamp Overpass radius to MAX_RADIUS_M=5000m; return Err beyond
that. Add parse_overpass_json() that rejects malformed payloads
(missing top-level `elements` array).
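A minimal sketch of the containment check described above (this is the general canonicalize-then-`starts_with` pattern; the real `safe_join()` signature and error type in training.rs are its own):

```rust
use std::io;
use std::path::{Component, Path, PathBuf};

/// Join `name` under `base` and refuse anything that would escape `base`.
/// General pattern only; the crate's safe_join() may differ in detail.
fn safe_join(base: &Path, name: &str) -> io::Result<PathBuf> {
    // Reject parent-dir components up front (works for not-yet-created files).
    if Path::new(name).components().any(|c| matches!(c, Component::ParentDir)) {
        return Err(io::Error::new(io::ErrorKind::InvalidInput, "path traversal rejected"));
    }
    let base = base.canonicalize()?; // base directory must already exist
    let joined = base.join(name);
    // Belt and braces: the joined path must still sit under the canonical base.
    if !joined.starts_with(&base) {
        return Err(io::Error::new(io::ErrorKind::InvalidInput, "escapes data dir"));
    }
    Ok(joined)
}
```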
Co-Authored-By: claude-flow <ruv@ruv.net>
* csi_pipeline: rename WiFlow stub to heuristic_pose_from_amplitude, decouple UDP
Blocker 3 (PR #405 review): The "WiFlow inference" path was a stub that
built a model from empty weight vectors and synthesised keypoints from
amplitude energy. Presenting this as "WiFlow inference" was misleading.
- Rename WiFlowModel to PoseModelMetadata (empty tag struct; we only care
if the on-disk file exists)
- Rename load_wiflow_model() -> detect_pose_model_metadata() and log
"amplitude-energy heuristic enabled/disabled" (no "WiFlow" claim)
- Rename estimate_pose() -> heuristic_pose_from_amplitude() with
prominent `STUB:` doc comment saying this is NOT a trained model
Blocker 4 (PR #405 review): The UDP receiver held the shared Arc<Mutex>
across a synchronous process_frame() call, starving HTTP handlers.
- Introduce a std::sync::mpsc channel between the UDP thread (which only
parses + pushes) and a dedicated processor thread (which locks only
briefly around a single process_frame). HTTP snapshots via
get_pipeline_output no longer contend with the socket read loop.
Also:
- Move ADR-018 parser to parser.rs (see next commit); csi_pipeline re-exports
- send_test_frames now uses parser::build_test_frame for synthetic frames
- Log a one-line node stats summary every 500 frames (reads every public
CsiFrame field on the runtime path)
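A minimal sketch of the decoupling pattern described above (types and names are illustrative; the point is that the socket loop never holds the shared lock):

```rust
use std::net::UdpSocket;
use std::sync::{mpsc, Arc, Mutex};
use std::thread;

// Illustrative frame/state types; the real CsiFrame/pipeline live in csi_pipeline.rs.
struct Frame(Vec<u8>);
#[derive(Default)]
struct PipelineState { frames_processed: u64 }

fn spawn_csi_threads(state: Arc<Mutex<PipelineState>>) -> std::io::Result<()> {
    let (tx, rx) = mpsc::channel::<Frame>();

    // UDP thread: parse + push only, never touches the shared mutex.
    let socket = UdpSocket::bind("0.0.0.0:3333")?;
    thread::spawn(move || {
        let mut buf = [0u8; 2048];
        while let Ok(n) = socket.recv(&mut buf) {
            let _ = tx.send(Frame(buf[..n].to_vec()));
        }
    });

    // Processor thread: locks the state only briefly around each frame.
    thread::spawn(move || {
        for _frame in rx {
            let mut s = state.lock().unwrap();
            s.frames_processed += 1; // stands in for process_frame()
        }
    });
    Ok(())
}
```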
Co-Authored-By: claude-flow <ruv@ruv.net>
* Extract ADR-018 parser into parser.rs + wire Fingerprint CLI
File-split (strong concern #9 in PR #405 review): csi_pipeline.rs was 602
LOC; extract the pure-function ADR-018 parser + synthetic frame builder
into src/parser.rs. Inline unit tests in parser.rs cover:
- 0xC5110001 (raw CSI, v1) roundtrip
- 0xC5110006 (feature state, v6) roundtrip
- wrong magic is rejected
- truncated header is rejected
- truncated payload is rejected
main.rs: expose a `fingerprint NAME [--seconds N]` subcommand that wires up
record_fingerprint() (the only caller needed to make the public API non-dead
on the runtime path). Also:
- Replace `--host/--port` + external `--csi` with a single `--bind`
defaulting to loopback (`127.0.0.1:9880`) — addresses strong concern
#7 about exposing camera/CSI/vitals by default.
- Update synthetic `csi-test` to target UDP 3333 (matching the ADR-018
listener) and use the shared parser::build_test_frame.
- Defence-in-depth: call training::sanitize_data_path on the expanded
--data-dir before TrainingSession::new does the same.
Co-Authored-By: claude-flow <ruv@ruv.net>
* stream: extract viewer HTML to viewer.html, default bind to loopback
Strong concern #7 (PR #405): default HTTP bind leaked camera/CSI/vitals
to the LAN. The `serve` fn now takes a single `bind` arg and prints a
loud WARNING when bound outside loopback.
Strong concern #10 (PR #405): embedded HTML+JS was ~220 LOC of the 418
LOC stream.rs. Moved the markup verbatim into viewer.html and inlined
via `include_str!("viewer.html")`. Also:
- Drop the #![allow(dead_code)] crate-level silencing (reviewer point
#11). Remove the now-unused AppState.csi_pipeline field.
- capture_camera_cloud_with_luminance returns the mean luminance of the
captured frame; the background loop feeds that to
CsiPipelineState::set_light_level so the night-mode flag actually
toggles at runtime (previously it could only be set from tests).
Net effect on file size: stream.rs 418 → 232 LOC.
Co-Authored-By: claude-flow <ruv@ruv.net>
* Dead-code cleanup + tests for fusion/depth/OSM/training/fingerprinting
Reviewer point #11 (PR #405): remove the `#![allow(dead_code)]`
silencing added in 8eb808d and fix the underlying issues.
- Delete csi.rs: duplicate of csi_pipeline.rs with incompatible wire
format (JSON vs ADR-018 binary). csi_pipeline is the real path.
- Delete serial_csi.rs: never referenced by any module.
- Drop Frame.timestamp_ms (unread), AppState.csi_pipeline (unread),
brain_bridge::brain_available (caller-less), fusion::fetch_wifi_occupancy
(caller-less) — these had no runtime users.
- Drop crate-level #![allow(dead_code)] from camera.rs, depth.rs,
fusion.rs, pointcloud.rs.
Tests (target: 8-12, actual: 15 unit + 9 geo unit + 8 geo integration
= 32 total, all pass):
- parser.rs: 5 tests (v1/v6 magic roundtrip, wrong magic, truncated
header, truncated payload).
- fusion.rs: 2 tests (non-overlapping merge, voxel dedup).
- depth.rs: 2 tests (2x2 backproject → 4 points at z=1, NaN rejected).
- training.rs: 4 tests (rejects `..`, accepts relative child, refuses
TrainingSession::new("../etc/passwd"), accepts a clean tmpdir).
- csi_pipeline.rs: 2 tests (set_light_level toggles is_dark,
record_fingerprint stores and self-identifies).
- osm.rs: 3 tests (parse_overpass_json minimal fixture, rejects
malformed payload, fetch_buildings rejects > MAX_RADIUS_M).
Co-Authored-By: claude-flow <ruv@ruv.net>
* Update README + user-guide for PR #405 review-fix additions
- serve now uses --bind 127.0.0.1:9880 (loopback default) instead of --port
- Add fingerprint subcommand to CLI tables
- Document RUVIEW_BRAIN_URL env var + --brain flag
- Flag pose path as amplitude-energy heuristic stub (not trained WiFlow)
- Security note on exposing server outside loopback
- Add wifi-densepose-pointcloud + wifi-densepose-geo rows to crate table
Co-Authored-By: claude-flow <ruv@ruv.net>
This commit is contained in: parent ae40e2b33e, commit 0943a32248
35 changed files with 4649 additions and 8 deletions
docs/adr/ADR-044-geospatial-satellite-integration.md — new file (+65 lines)

@@ -0,0 +1,65 @@
# ADR-044: Geospatial Satellite Integration

## Status

Accepted

## Context

RuView generates real-time 3D point clouds from camera + WiFi CSI, but these exist in a local coordinate frame with no geographic reference. Integrating free satellite imagery, terrain elevation, and map data provides environmental context that enables the ruOS brain to reason about the physical world beyond the room.

## Decision

### Data Sources (all free, no API keys)

| Source | Data | Resolution | Update | Format |
|--------|------|-----------|--------|--------|
| EOX Sentinel-2 Cloudless | Satellite tiles | 10m | Static mosaic | XYZ/JPEG |
| SRTM GL1 (NASA) | Elevation/DEM | 30m (1-arcsec) | Static | Binary HGT |
| Overpass API (OSM) | Buildings, roads | Vector | Real-time | JSON |
| ip-api.com | IP geolocation | ~1km | Per-request | JSON |
| Sentinel-2 STAC | Temporal satellite | 10m | Every 5 days | COG/STAC |
| Open Meteo | Weather | Point | Hourly | JSON |

### Architecture

Pure Rust implementation in the `wifi-densepose-geo` crate. No GDAL/PROJ/GEOS — coordinate transforms implemented directly (~250 LOC). Tile caching on disk at `~/.local/share/ruview/geo-cache/`.
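As a worked example of the tile-cache layout, a minimal sketch of the standard Web-Mercator XYZ tile index used by services like EOX (the function name and the cache path pattern are illustrative, not the crate's API):

```rust
use std::f64::consts::PI;

/// Standard "slippy map" XYZ tile index for a lat/lon at a zoom level
/// (OSM/EOX convention); not necessarily the crate's exact code.
fn lat_lon_to_tile(lat_deg: f64, lon_deg: f64, zoom: u32) -> (u32, u32) {
    let n = 2f64.powi(zoom as i32);
    let x = ((lon_deg + 180.0) / 360.0 * n).floor() as u32;
    let lat_rad = lat_deg.to_radians();
    let y = ((1.0 - (lat_rad.tan() + 1.0 / lat_rad.cos()).ln() / PI) / 2.0 * n).floor() as u32;
    (x, y)
}

fn main() {
    // Coordinates from the validation run (43.49N 79.71W) at zoom 14.
    // A cache path shaped like geo-cache/{z}/{x}/{y}.jpg is one plausible layout.
    let (x, y) = lat_lon_to_tile(43.49, -79.71, 14);
    println!("tile z=14 x={x} y={y}");
}
```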
### Coordinate System

- WGS84 for geographic coordinates
- ENU (East-North-Up) as the bridge between local sensor frame and world
- Local sensor frame: camera origin, +Z forward, +Y up
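A minimal sketch of the WGS84-to-ENU idea (small-area tangent-plane approximation; the crate's `coord` module may use a fuller ECEF-based transform, so treat this as illustrative):

```rust
/// Approximate local ENU offsets (metres) of a point relative to a reference
/// lat/lon, valid for small areas. An exact transform goes through ECEF.
const EARTH_RADIUS_M: f64 = 6_371_000.0;

fn wgs84_to_enu(ref_lat: f64, ref_lon: f64, lat: f64, lon: f64, d_alt: f64) -> (f64, f64, f64) {
    let d_lat = (lat - ref_lat).to_radians();
    let d_lon = (lon - ref_lon).to_radians();
    let east = d_lon * ref_lat.to_radians().cos() * EARTH_RADIUS_M;
    let north = d_lat * EARTH_RADIUS_M;
    (east, north, d_alt) // up is just the altitude difference here
}

fn main() {
    let (e, n, u) = wgs84_to_enu(43.49, -79.71, 43.50, -79.70, 2.0);
    println!("ENU offset: {e:.1} m east, {n:.1} m north, {u:.1} m up");
}
```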
### Temporal Awareness

Nightly scheduled fetch of the latest Sentinel-2 imagery + OSM diffs + weather. Changes detected via image comparison and stored as brain memories for contrastive learning.
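A minimal sketch of the kind of pixel-diff comparison this implies (the per-channel threshold and changed-fraction cutoff are assumptions; `detect_tile_changes()` in the crate may differ):

```rust
/// Fraction of bytes that differ beyond a threshold between two equally sized
/// RGB tile buffers. The per-channel threshold and cutoff are assumptions.
fn tile_changed(old: &[u8], new: &[u8], per_channel_threshold: u8, changed_fraction: f64) -> bool {
    assert_eq!(old.len(), new.len());
    let changed = old.iter().zip(new)
        .filter(|(a, b)| a.abs_diff(**b) > per_channel_threshold)
        .count();
    changed as f64 / old.len() as f64 > changed_fraction
}
```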
### Brain Integration

Geospatial context stored as brain memories:

- `spatial-geo`: location, elevation, nearby landmarks
- `spatial-change`: detected changes in satellite/OSM data
- `spatial-weather`: current conditions + forecast
- `spatial-season`: vegetation index, snow cover, seasonal patterns
- `spatial-local`: hyperlocal web context from Common Crawl WET
### Extended Data Sources (via ruvector WET/Common Crawl)

| Source | Data | Use |
|--------|------|-----|
| Common Crawl WET | Web text near location | Local business info, reviews, events |
| Wikidata | Structured knowledge | Building names, POI descriptions |
| NASA FIRMS | Active fire (3-hour) | Safety alerts |
| USGS Earthquakes | Seismic events | Safety context |
| OpenAQ | Air quality (PM2.5) | Environmental health |
| Overture Maps | Building footprints (Meta/MS) | Higher quality than OSM |

The ruvector brain server has existing `web_ingest` + Common Crawl support. WET files filtered by geographic URL patterns provide hyperlocal context.
## Consequences

### Positive

- Agent gains environmental awareness beyond the room
- Temporal data enables seasonal calibration of CSI sensing
- Change detection finds construction, vegetation, weather effects
- All data sources are genuinely free with no API keys

### Negative

- Initial data fetch requires internet (~2MB tiles + ~25MB DEM)
- Cached data becomes stale (mitigated by nightly refresh)
- IP geolocation has ~1km accuracy (mitigated by manual override)
@@ -1,4 +1,4 @@
-# ADR-044: Provisioning Tool Enhancements
+# ADR-050: Provisioning Tool Enhancements

 **Status**: Proposed
 **Date**: 2026-03-03
@@ -536,6 +536,110 @@ Both UIs update in real-time via WebSocket and auto-detect the sensing server on

---

## Dense Point Cloud (Camera + WiFi CSI Fusion)

RuView can generate real-time 3D point clouds by fusing camera depth estimation with WiFi CSI spatial sensing. This creates a spatial model of the environment that updates in real time.
### Setup

```bash
# Build the pointcloud binary
cd rust-port/wifi-densepose-rs
cargo build --release -p wifi-densepose-pointcloud

# Start the server (auto-detects camera + CSI). Loopback-only by default.
./target/release/ruview-pointcloud serve --bind 127.0.0.1:9880
```
Open `http://localhost:9880` for the interactive Three.js 3D viewer.

> **Security note.** The server exposes live camera, skeleton, vitals, and occupancy over HTTP. The `--bind` flag defaults to `127.0.0.1:9880` (loopback-only). Exposing on `0.0.0.0` or a LAN IP is opt-in — the server logs a warning when it does, but there is no auth/TLS layer. Put a reverse proxy in front if you need remote access.

> **Brain URL.** Observations are POSTed to `http://127.0.0.1:9876` by default. Override via the `RUVIEW_BRAIN_URL` environment variable or the `--brain <url>` flag on `serve` / `train`.
### Sensors

| Sensor | Auto-detected | Data |
|--------|--------------|------|
| Camera (`/dev/video0`) | Yes (Linux UVC) | RGB frames → MiDaS depth → 3D points |
| ESP32 CSI (UDP:3333) | Yes (if provisioned) | ADR-018 binary → occupancy + pose + vitals |
| MiDaS depth server (port 9885) | Optional | GPU-accelerated neural depth estimation |
### Commands

| Command | Description |
|---------|-------------|
| `ruview-pointcloud serve --bind 127.0.0.1:9880` | Start HTTP server + Three.js viewer (loopback-only by default) |
| `ruview-pointcloud demo` | Generate synthetic point cloud (no hardware needed) |
| `ruview-pointcloud capture --output room.ply` | Capture single frame to PLY file |
| `ruview-pointcloud cameras` | List available cameras |
| `ruview-pointcloud train --data-dir ./data [--brain URL]` | Depth calibration + occupancy training (writes under canonicalized `data-dir`; refuses `..` traversal) |
| `ruview-pointcloud csi-test --count 100` | Send test CSI frames (no ESP32 needed) |
| `ruview-pointcloud fingerprint <name> [--seconds 5]` | Record a named CSI room fingerprint for later matching |
### Pipeline Components

1. **ADR-018 Parser** — Decodes ESP32 CSI binary frames from UDP (magic `0xC5110001` raw CSI and `0xC5110006` feature state), extracts I/Q subcarrier amplitudes and phases. Lives in `parser.rs`; unit-tested against hand-rolled test vectors (a header-validation sketch follows this list).
2. **Pose (stub)** — 17 COCO keypoint *layout* generated by `heuristic_pose_from_amplitude` from CSI amplitude energy. This is **not** the trained WiFlow model — it is a placeholder so the viewer has a skeleton to render. Wiring to real Candle/ONNX inference from the `wifi-densepose-nn` crate is a planned follow-up.
3. **Vital Signs** — Breathing rate from CSI phase analysis (peak counting on a stable subcarrier)
4. **Motion Detection** — CSI amplitude variance over 20 frames, triggers adaptive capture
5. **RF Tomography** — Backprojection from per-node RSSI to an 8×8×4 occupancy grid
6. **Camera Depth** — MiDaS monocular depth (GPU) with luminance+edge fallback
7. **Sensor Fusion** — Voxel-grid merging of camera depth + CSI occupancy
8. **Brain Bridge** — Stores spatial observations in the ruOS brain every 60 seconds
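A minimal sketch of the header-validation step from component 1 (the two magic words are from the parser; the little-endian read and anything past the magic are assumptions — see `parser.rs` for the real ADR-018 layout):

```rust
/// Magic words accepted by the ADR-018 parser (v1 raw CSI, v6 feature state).
const MAGIC_V1: u32 = 0xC511_0001;
const MAGIC_V6: u32 = 0xC511_0006;

/// Illustrative header check: read the magic word and reject anything else.
/// Endianness and the layout past the magic are assumptions, not ADR-018.
fn frame_version(buf: &[u8]) -> Result<u8, &'static str> {
    if buf.len() < 4 {
        return Err("truncated header");
    }
    let magic = u32::from_le_bytes([buf[0], buf[1], buf[2], buf[3]]);
    match magic {
        MAGIC_V1 => Ok(1),
        MAGIC_V6 => Ok(6),
        _ => Err("wrong magic"),
    }
}
```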
### API Endpoints

| Endpoint | Method | Returns |
|----------|--------|---------|
| `/health` | GET | `{"status": "ok"}` |
| `/api/status` | GET | Camera, CSI, pipeline state, vitals, motion |
| `/api/cloud` | GET | Point cloud (up to 1000 points) + pipeline data |
| `/api/splats` | GET | Gaussian splats for Three.js rendering |
| `/` | GET | Interactive Three.js 3D viewer |
### Training

The training pipeline calibrates depth estimation and occupancy detection:

```bash
ruview-pointcloud train --data-dir ~/.local/share/ruview/training --brain http://127.0.0.1:9876
```

This captures frames, runs depth calibration (grid search over scale/offset/gamma), trains occupancy thresholds, exports DPO preference pairs, and submits results to the ruOS brain.
### Output Formats

- **PLY** — Standard 3D point cloud (ASCII, with RGB color)
- **Gaussian Splats** — JSON format for Three.js rendering
- **Brain Memories** — Spatial observations stored as `spatial-observation`, `spatial-motion`, `spatial-vitals`
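For reference, a minimal sketch of what an ASCII PLY export with per-vertex RGB looks like (the property list is the common x/y/z + red/green/blue layout; the crate's writer may emit additional fields):

```rust
use std::fmt::Write;

/// Serialise points as ASCII PLY with per-vertex RGB. Property names follow
/// the common convention; the crate's exact header may differ.
fn to_ply(points: &[([f32; 3], [u8; 3])]) -> String {
    let mut out = String::new();
    writeln!(out, "ply\nformat ascii 1.0").unwrap();
    writeln!(out, "element vertex {}", points.len()).unwrap();
    for axis in ["x", "y", "z"] {
        writeln!(out, "property float {axis}").unwrap();
    }
    for chan in ["red", "green", "blue"] {
        writeln!(out, "property uchar {chan}").unwrap();
    }
    writeln!(out, "end_header").unwrap();
    for (p, c) in points {
        writeln!(out, "{} {} {} {} {} {}", p[0], p[1], p[2], c[0], c[1], c[2]).unwrap();
    }
    out
}
```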
### Deep Room Scan

Capture a high-quality 3D model of the room:

```bash
# Stop the live server first (frees the camera)
# Then capture 20 frames and process with MiDaS
ruview-pointcloud capture --frames 20 --output room_model.ply
```

Result: 40,000+ voxels at 5cm resolution, 12,000+ Gaussian splats.
### ESP32 Provisioning for CSI

To send CSI data to the pointcloud server:

```bash
python3 firmware/esp32-csi-node/provision.py \
    --port /dev/ttyACM0 \
    --ssid "YourWiFi" --password "YourPassword" \
    --target-ip 192.168.1.123 --target-port 3333 \
    --node-id 1
```

---
## Vital Sign Detection

The system extracts breathing rate and heart rate from CSI signal fluctuations using FFT peak detection.
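A minimal sketch of the FFT-peak idea (naive DFT over a window of per-frame CSI amplitude, restricted to a 0.1–0.5 Hz breathing band; the real pipeline's windowing and subcarrier selection will differ):

```rust
use std::f64::consts::PI;

/// Estimate breathing rate (breaths/min) from a window of CSI amplitude
/// samples via a naive DFT peak search in the 0.1–0.5 Hz band.
/// Window length, band limits, and using raw amplitude are assumptions.
fn breathing_bpm(samples: &[f64], sample_rate_hz: f64) -> Option<f64> {
    let n = samples.len();
    if n < 8 { return None; }
    let mean = samples.iter().sum::<f64>() / n as f64;
    let mut best = (0.0f64, 0.0f64); // (magnitude, frequency)
    for k in 1..n / 2 {
        let f = k as f64 * sample_rate_hz / n as f64;
        if f < 0.1 || f > 0.5 { continue; } // breathing band only
        let (mut re, mut im) = (0.0, 0.0);
        for (i, &s) in samples.iter().enumerate() {
            let angle = -2.0 * PI * k as f64 * i as f64 / n as f64;
            re += (s - mean) * angle.cos();
            im += (s - mean) * angle.sin();
        }
        let mag = (re * re + im * im).sqrt();
        if mag > best.0 { best = (mag, f); }
    }
    (best.0 > 0.0).then(|| best.1 * 60.0)
}
```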