initial commit
@@ -0,0 +1,60 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
/build/
develop-eggs/
/dist/
downloads/
eggs/
.eggs/
/lib/
/lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# Virtual Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Node / JS
node_modules/
/dist/
/build/
.svelte-kit/
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Database / SQLite
*.sqlite
*.sqlite3
*.db

# macOS
.DS_Store

# Editor / IDE
.vscode/
.idea/
*.swp
*.swo

# Project Specific
/staging/
/source_data/
/config/
@@ -0,0 +1,15 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.6.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files

  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.4.4
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format
@@ -0,0 +1,40 @@
# TapeHoard - Developer & AI Assistant Guide

This document (`GEMINI.md`) contains critical, contextual information about the TapeHoard project. **It takes absolute precedence over generic workflows.** Always refer to the architecture constraints in `PLAN.md` before implementing new features.

## 1. Tooling & Ecosystem

### Backend (Python)

* **Package Manager:** `uv`. Never use `pip` directly. Use `uv add <pkg>` and `uv sync` to manage dependencies.
* **Framework:** FastAPI.
* **Database:** SQLite via the SQLAlchemy ORM. Migrations are strictly managed by `alembic`.
    * *To generate migrations:* `cd backend && uv run alembic revision --autogenerate -m "message"`
    * *To apply migrations:* `cd backend && uv run alembic upgrade head`
* **Logging:** `loguru`. Do not use standard `logging` or print statements.
* **Type Safety:** `ty`. All Python code must be fully type-hinted and pass `uv run ty` without errors.
* **Configuration:** `pydantic-settings`. Define environment variables and constants in a settings schema.

### Frontend (Svelte 5 / SvelteKit)

* **Framework:** Svelte 5 Runes (using `$props()`, `$state()`, etc.).
* **Styling:** Tailwind CSS. All new components must use Tailwind utility classes.
* **Component Library:** Custom library based on **shadcn-svelte** and **bits-ui**. Use existing components in `src/lib/components/ui/` or add new ones following the shadcn pattern.
* **Package Manager:** `npm`.
* **API Client Generation:** `@hey-api/openapi-ts`. Never manually fetch or type API responses. Ensure the backend is running, then run `just generate-client` to auto-generate the strictly typed TypeScript client from the FastAPI OpenAPI spec.
* **Icons:** `lucide-svelte`.
* **Notifications:** `svelte-sonner`.

### Global Task Runner

* **`just`:** Use the `justfile` in the root directory for executing common tasks.
    * `just dev`: Starts both backend and frontend servers.
    * `just lint`: Runs Ruff, ty, and Svelte Check.
    * `just format`: Auto-formats code with Ruff.

## 2. Code Quality & Pre-commit

* All code must pass `pre-commit` hooks. If you generate a new file or modify an existing one, ensure it complies with `ruff` and `ruff-format`.
* You can manually format your changes by running `just format`.

## 3. Core Architectural Rules

* **Abstract Storage Providers:** The core archiver must never directly call `mt` or write `tar` streams. It must use the Abstract Storage Provider interfaces defined in `backend/app/providers/base.py` to seamlessly support Tapes, HDDs, and Cloud buckets.
* **File Hashing & Deduplication:** Never re-hash a file if its `mtime` and `size` remain unchanged in the `filesystem_state` table.
* **Local Staging (`/staging`):** All massive file manipulation, chunking, and restore compilations must occur in the temporary `/staging` directory before hitting the final media or user.
* **Docker PUID/PGID:** Always account for the fact that the container will be running under a dynamic User ID. Avoid writing files to directories where the `appuser` might lack permissions.
@@ -0,0 +1,305 @@
# Tape Backup Manager Implementation Plan

## 1. Overview & Objectives

This document outlines the architecture and implementation strategy for a Tape Backup Manager. The system is designed to provide robust, index-driven backups to LTO tape media, catering specifically to single-tape-drive users while maintaining the scalability to support tape libraries in the future.

---

## 2. LTO Generations & Capacity Planning

The system targets the **native (raw) storage capacity** of the tapes to ensure reliable capacity planning. A safety margin of roughly 5% of the native capacity will be reserved.

| Generation | Native Capacity | Target Fill Capacity (approx.) |
| :--- | :--- | :--- |
| **LTO-5** | 1.5 TB | ~ 1.42 TB |
| **LTO-6** | 2.5 TB | ~ 2.37 TB |
| **LTO-7** | 6.0 TB | ~ 5.70 TB |
| **LTO-8** | 12.0 TB | ~ 11.4 TB |
| **LTO-9** | 18.0 TB | ~ 17.1 TB |
| **LTO-10** | 30.0 TB | ~ 28.5 TB |
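The target fill capacities in the table can be derived from the native capacity with a 5% reserve; a minimal helper, assuming the margin fraction is configurable:

```python
NATIVE_CAPACITY_TB = {  # native (raw) capacities per LTO generation
    "LTO-5": 1.5, "LTO-6": 2.5, "LTO-7": 6.0,
    "LTO-8": 12.0, "LTO-9": 18.0, "LTO-10": 30.0,
}

def target_fill_tb(generation: str, margin: float = 0.05) -> float:
    """Planned fill capacity: native capacity minus the configured safety margin."""
    return NATIVE_CAPACITY_TB[generation] * (1.0 - margin)
```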
---

## 3. Itemized Features & Implementation Descriptions

### 3.1 SQLite-based Indexing Database

* **Description:** A centralized database to act as the single source of truth for the filesystem state, backup jobs, and physical tape inventory.
* **Implementation:** Use `sqlite3`.
    * `filesystem_state` table: Tracks `id`, `file_path`, `size`, `mtime`, `sha256_hash`, `last_seen_timestamp`.
    * `storage_media` table (formerly `tapes`): Tracks `id`, `media_type` (e.g., tape, hdd, cloud), `identifier` (barcode, UUID, or bucket name), `generation/tier`, `capacity`, `bytes_used`, `location`, `status`.
    * `backups` table: Tracks `id`, `job_name`, `type` (initial/incremental), timestamps, and status.
    * `file_versions` table: Maps a `filesystem_state_id` to a `media_id`, storing the `file_number` (e.g., tape position, or object path) and an optional `offset_in_tar`.
    * `job_logs` table: Tracks `id`, `backup_id`, `timestamp`, `log_level`, `message` (e.g., file written, permission denied error, manual eject).

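A minimal sketch of the core tables using the stdlib `sqlite3` module. Column types are assumptions, and in the real project the schema would live in Alembic migrations per the tooling guide:

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS filesystem_state (
    id INTEGER PRIMARY KEY,
    file_path TEXT NOT NULL UNIQUE,
    size INTEGER NOT NULL,
    mtime REAL NOT NULL,
    sha256_hash TEXT,
    last_seen_timestamp REAL
);
CREATE TABLE IF NOT EXISTS storage_media (
    id INTEGER PRIMARY KEY,
    media_type TEXT NOT NULL,          -- tape, hdd, cloud
    identifier TEXT NOT NULL UNIQUE,   -- barcode, UUID, or bucket name
    generation TEXT,
    capacity INTEGER,
    bytes_used INTEGER DEFAULT 0,
    location TEXT,
    status TEXT
);
CREATE TABLE IF NOT EXISTS file_versions (
    id INTEGER PRIMARY KEY,
    filesystem_state_id INTEGER REFERENCES filesystem_state(id),
    media_id INTEGER REFERENCES storage_media(id),
    file_number INTEGER,               -- tape position (EOF-mark index) or object path
    offset_in_tar INTEGER
);
"""

def open_index(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the index database and ensure the core tables exist."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```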
### 3.2 File Hashing & Deduplication

* **Description:** Utilizing cryptographic hashes to identify unique files, ensuring that duplicate files across the filesystem (or over time) are recognized. This prevents writing the exact same file content to tape multiple times if it has already been stored.
* **Implementation:**
    * During the filesystem scan, the system checks a file's `mtime` (modification time) and `size` against the existing `filesystem_state` index. A cryptographic hash (e.g., SHA-256 or BLAKE3) is **only computed** if the file is completely new or its `mtime`/`size` indicates it has been modified since the last scan. This heavily optimizes sequential scans of large filesystems.
    * Store this hash in the `filesystem_state` table.
    * Before writing a file to tape during a backup, query the database by the computed hash. If the hash exists and is mapped to a valid tape location in `file_versions`, link the new file path to the existing tape location instead of writing the payload again.

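The mtime/size short-circuit can be sketched as follows, where a hypothetical `IndexEntry` record stands in for a row from `filesystem_state`:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IndexEntry:
    size: int
    mtime: float
    sha256_hash: str

def needs_rehash(entry: Optional[IndexEntry], size: int, mtime: float) -> bool:
    """Hash only files that are new, or whose size/mtime changed since the last scan."""
    if entry is None:  # never seen before
        return True
    return entry.size != size or entry.mtime != mtime
```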
### 3.3 Web-Based Explorer Interface & Search

* **Description:** A user-friendly, web-based graphical interface that allows users to navigate the backed-up contents similarly to Windows Explorer or macOS Finder. It includes robust search functionality to locate files across all tapes.
* **Implementation:**
    * **Backend:** Serve a local web application using a lightweight framework (e.g., FastAPI or Flask). The backend will expose REST API endpoints that query the SQLite database.
    * **Frontend:** Build a modern, responsive web UI utilizing Svelte 5 as the primary framework, styled with Tailwind CSS and a custom component library based on shadcn-svelte and bits-ui.
    * **Features:**
        * **Virtual Filesystem (Frontend Explorer):** Dynamically parse the stored `file_path`s in SQLite to generate a unified, browsable directory structure. Users can navigate this virtual filesystem in the browser exactly as if it were a single, massive mounted drive spanning all their backups. Clicking on a specific file opens a detailed metadata panel displaying its exact storage locations (e.g., "Tape BUP-001", "HDD-002"), backup dates, sizes, and file hashes, all without needing the physical media inserted.
        * **Global Search (Full-Text Search):** Implement an advanced search bar utilizing SQLite's FTS5 (Full-Text Search) extension. File paths and names will be indexed in an FTS5 virtual table to provide instantaneous, highly optimized keyword searching across millions of files, replacing slower `LIKE` queries.
        * **Restore Cart:** Allow users to select multiple files or folders from the web interface and "Add to Restore Queue", which then triggers the CLI/Backend to prompt for the necessary tapes.
        * **Multi-Drive & Library Management:** The web interface will allow users to view connected tape drives and robotic libraries. The backend will spawn concurrent background processes, allowing multiple read/write operations to happen in parallel across different tape drives. The interface will also provide manual controls for sending tape library commands (e.g., load, unload, inventory).

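The FTS5 approach above reduces to a virtual table plus a `MATCH` query; a minimal sketch with the stdlib `sqlite3` module (assumes the interpreter's SQLite build ships with FTS5, which is standard on most platforms):

```python
import sqlite3

def build_search_index(conn: sqlite3.Connection) -> None:
    # FTS5 virtual table indexing file paths for fast keyword search
    conn.execute("CREATE VIRTUAL TABLE IF NOT EXISTS path_fts USING fts5(file_path)")

def search_paths(conn: sqlite3.Connection, query: str) -> list[str]:
    """Keyword search over indexed paths; the default tokenizer splits on '/' and '.'."""
    rows = conn.execute(
        "SELECT file_path FROM path_fts WHERE path_fts MATCH ?", (query,)
    )
    return [r[0] for r in rows]
```

In production the FTS table would be populated alongside `filesystem_state` inserts (e.g., via triggers), rather than managed by hand.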
### 3.4 Backup Strategies (Versioning & Compaction)

* **Description:** Managing file modifications, deletions, and optimizing tape usage over time on an append-only medium.
* **Implementation:**
    * *Initial (Full):* Walk directories, compute hashes, populate `filesystem_state`. Group files up to the target tape capacity. Stream to tape using `tar` and update `file_versions`.
    * *Incremental (Append-Only Versioning):* Walk directories and detect modified files via `size`/`mtime`. Because tape cannot be overwritten in place, modified files are packaged into a new `tar` stream and appended to the newest active tape. The old version remains on its original tape, providing automatic point-in-time file history. Deleted files are simply marked with an `unseen_timestamp` in the database and hidden from the current Web UI view.
    * *Tape Compaction (Grooming):* Over time, old tapes accumulate "stale" data (deleted files or old versions). The system will identify fragmented tapes. The user (or autoloader) loads the fragmented tape, the system reads *only* the current active files into the Local Staging Area, and then appends them to the newest active tape. Once migrated, the old fragmented tape is marked as "Recyclable" in the DB, allowing the physical cartridge to be formatted and reused.

### 3.5 File Splitting & Bin-Packing

* **Description:** Intelligent packing of files to maximize tape usage without overflowing, and splitting of files that exceed a single tape's capacity.
* **Implementation:**
    * Implement a greedy bin-packing algorithm to group files into "Tape Sets" before writing.
    * If a single file exceeds the tape capacity, utilize the Unix `split` command (or equivalent chunking logic in the primary language) to break the file into parts before packaging it into the `tar` stream. The database must record these as multi-part files for reassembly upon restore.

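A minimal first-fit-decreasing sketch of the greedy packing step. It assumes oversized files have already been chunked (per the splitting rule above) so every item fits in one bin:

```python
def pack_tape_sets(files: dict[str, int], capacity: int) -> list[list[str]]:
    """Greedy first-fit-decreasing: group files into tape sets without overflowing."""
    bins: list[tuple[int, list[str]]] = []  # (bytes_used, file paths)
    for path, size in sorted(files.items(), key=lambda kv: kv[1], reverse=True):
        for i, (used, paths) in enumerate(bins):
            if used + size <= capacity:       # first bin with room wins
                bins[i] = (used + size, paths + [path])
                break
        else:                                 # no bin fits: open a new tape set
            bins.append((size, [path]))
    return [paths for _, paths in bins]
```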
### 3.6 Optimized Storage Format (`tar` + EOF Marks)

* **Description:** Avoiding LTFS in favor of a simpler, more performant approach for many files by using raw `tar` streams and tape file marks.
* **Implementation:**
    * Stream files into 100GB - 500GB `tar` archives.
    * Write the `tar` archive to the non-rewinding tape device (e.g., `/dev/nst0`).
    * The tape drive automatically writes an EOF mark after each archive.
    * Record the tape file number (the index of the EOF mark) in the SQLite database for each file contained in that `tar` stream.

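The write path can be sketched with the stdlib `tarfile` module in streaming (`w|`) mode. Here the "tape device" is any writable binary stream, and a plain dict stands in for the `file_versions` bookkeeping:

```python
import io
import tarfile

def write_archive(device: io.BufferedIOBase, files: dict[str, bytes],
                  tape_file_number: int, index: dict[str, int]) -> None:
    """Stream one tar archive to the (non-rewinding) device and record, for every
    file it contains, the tape file number where that archive starts."""
    with tarfile.open(fileobj=device, mode="w|") as tar:  # streaming write, no seeking
        for path, payload in files.items():
            info = tarfile.TarInfo(name=path)
            info.size = len(payload)
            tar.addfile(info, io.BytesIO(payload))
            index[path] = tape_file_number
```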
### 3.7 Near-Random Access Retrieval

* **Description:** The ability to quickly retrieve a specific file without reading the entire tape.
* **Implementation:**
    * Query the SQLite database for the target file to find its `tape_id` and `tape_file_number`.
    * Prompt the user to insert the correct tape (by barcode).
    * Execute `mt -f /dev/nst0 fsf <tape_file_number>` to fast-forward the tape head directly to the start of the correct `tar` stream.
    * Extract the specific file from the `tar` stream.

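The seek-and-extract step translates into two shell commands; a sketch that only builds the argument lists (the actual `subprocess` invocation and error handling are omitted):

```python
def build_retrieval_commands(device: str, tape_file_number: int,
                             member: str) -> list[list[str]]:
    """Commands to position the head at the right tar stream, then extract one file."""
    return [
        ["mt", "-f", device, "fsf", str(tape_file_number)],  # fast-forward to the stream
        ["tar", "-xf", device, member],                      # extract the single member
    ]
```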
### 3.8 Tape Handling, Barcode Tracking, & Autoloader Library Support

* **Description:** Workflows tailored for both users manually swapping single tapes and advanced users operating robotic tape libraries (autoloaders).
* **Implementation:**
    * **Manual Users:** CLI and Web prompts to insert specific tapes. On new tape insertion, prompt the user to input a physical barcode label, or auto-generate one (e.g., `BUP-00001`).
    * **Automated Libraries:** For setups with tape changers, use `mtx` (Media Changer Tools) to automatically scan the inventory of storage slots, read physical hardware barcodes, and orchestrate the loading/unloading of cartridges between slots and drives without user intervention.
    * **Verification:** Regardless of method, write a tiny header file (tape label) at tape file number 0 (the very beginning of the tape) containing the barcode. Validate this label upon subsequent insertions to ensure physical tape contents match the database expectations.

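Label verification can be as simple as comparing the barcode written at tape file 0 against the database record; a sketch using a hypothetical one-line label format:

```python
LABEL_PREFIX = "TAPEHOARD-LABEL:"  # hypothetical label format, not a standard

def make_label(barcode: str) -> bytes:
    """Tiny header payload written at tape file number 0."""
    return f"{LABEL_PREFIX}{barcode}\n".encode()

def verify_label(raw: bytes, expected_barcode: str) -> bool:
    """True if the label read from the inserted tape matches the DB record."""
    text = raw.decode(errors="replace").strip()
    return text.startswith(LABEL_PREFIX) and text[len(LABEL_PREFIX):] == expected_barcode
```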
### 3.9 Alerting for Tape Switching

* **Description:** Notifying the user when a tape is full and needs manual replacement.
* **Implementation:**
    * Monitor the `bytes_used` metric for the active tape during writing.
    * When the target fill capacity is reached, finalize the current `tar` stream and eject the tape.
    * Trigger alerts:
        * Standard console audio alert (terminal bell).
        * OS-native desktop notifications (e.g., `notify-send` or `osascript`).
        * Configurable webhooks (e.g., Slack, Discord, Email).
    * Pause the backup job and wait for user input confirming a new tape insertion.

### 3.10 Data Integrity & Tape Health Monitoring

* **Description:** Ensuring long-term survival of data by verifying tape integrity and monitoring hardware metrics.
* **Implementation:**
    * **Tape Scrubbing:** Schedule background tasks that read a tape from start to finish, verifying the computed hashes of the payload against the SQLite database to detect bit rot or magnetic degradation.
    * **Hardware Health (SMART/Logs):** Utilize SCSI logs (via `sg_logs` or `smartctl`) to track hardware-level metrics (e.g., soft/hard read/write error rates, load/unload counts, remaining tape life) and display warnings in the Web UI to predict failures.

### 3.11 Physical Vault & Location Tracking

* **Description:** Managing the physical whereabouts of tapes and enforcing retention policies for offsite backups.
* **Implementation:**
    * **Location Management:** Use the `location` field in the `storage_media` table (formerly `tapes`, see 3.1) to track where a tape physically resides (e.g., "Bank Safe Deposit Box", "Shelf 2", "In Drive").
    * **Rotation Policies:** Support standard backup rotations (like GFS - Grandfather-Father-Son). Allow marking a tape's data as "expired," which flags the physical tape cartridge as "Recyclable/Ready to Format" in the UI.

### 3.12 Security (Encryption) & Smart Compression

* **Description:** Protecting sensitive data at rest and optimizing tape capacity based on file type.
* **Implementation:**
    * **Encryption:** Provide options for either hardware-based Application-Managed Encryption (AME) using the LTO drive's native capabilities, or software-based encryption by piping the `tar` stream through symmetric encryption (e.g., `gpg` or `age`) before writing to tape.
    * **Smart Compression:** Allow users to toggle hardware compression. Implement logic to selectively use fast software compression (like `zstd`) for highly compressible datasets while disabling it for pre-compressed media formats (e.g., `.mkv`, `.mp4`).

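The "skip already-compressed media" heuristic can be sketched as a simple extension check; the extension list below is an illustrative assumption, not a fixed specification:

```python
# file extensions assumed to already be compressed, where zstd would add little
PRECOMPRESSED = {".mkv", ".mp4", ".jpg", ".png", ".zip", ".gz", ".zst", ".7z"}

def should_compress(path: str) -> bool:
    """Use software compression (e.g. zstd) only for compressible file types."""
    dot = path.rfind(".")
    return dot == -1 or path[dot:].lower() not in PRECOMPRESSED
```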
### 3.14 Local Staging Area (Cache Directory)

* **Description:** A dedicated, high-speed local storage directory (preferably NVMe/SSD) used for buffering data between the host filesystem and the tape drive.
* **Implementation:**
    * **Performance Buffering:** To prevent "shoe-shining" (tape drive stopping/starting due to slow data feeds), `tar` streams can be pre-built in the staging area before being flushed to tape at maximum sequential speed.
    * **Restore Carts:** When a user queues files from multiple tapes for restoration, the files are temporarily extracted to the staging area until all requested tapes have been read, allowing the user to download a single package.
    * **File Splitting:** Provides temporary space for chunking massive files that exceed a single tape's capacity before writing.
    * **Docker Volume:** This must be exposed as a dedicated bind mount (e.g., `/staging`) in the Docker deployment strategy.

### 3.15 Quality of Life & Ecosystem Integration

* **Description:** Features tailored for homelabs and data hoarders to seamlessly integrate the backup manager into their existing infrastructure.
* **Implementation:**
    * **Metrics Exporter:** Provide a `/metrics` Prometheus endpoint exposing data about the backup system (e.g., total bytes backed up, tape drive status, last backup duration, active tapes) for Grafana dashboards.
    * **Automated Scheduling:** Implement a built-in scheduler (e.g., using `APScheduler`) to trigger backups automatically at specific intervals, pausing and alerting if a tape swap is needed.
    * **Metadata & Proxy Caching:** During the initial scan of large media libraries, extract and store tiny metadata footprints (e.g., video resolution, EXIF data, or highly compressed thumbnails) in the SQLite DB. This allows users to browse and preview file contents in the Web UI without needing to load the physical tape.

### 3.16 Source Management & Tracking UX

* **Description:** A clear, interactive way for users to define exactly which folders and files should be backed up within the Web UI, handling complex inclusion and exclusion rules.
* **Implementation:**
    * **Visual File Tree:** Under a "Backup Jobs" or "Sources" tab, the frontend will render a live, interactive tree-view of the host's `/source_data` directory.
    * **Checkbox Tracking:** Users can click checkboxes next to folders or files to explicitly "Track" (include) or "Untrack" (exclude) them from a backup job.
    * **Exclusion Rules (Gitignore Style):** Provide an advanced text input where users can define global or job-specific exclusion patterns (e.g., `*.tmp`, `node_modules/`, `cache/`) mimicking `.gitignore` behavior. These rules are stored in the SQLite database and evaluated during the filesystem scan.

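Pattern evaluation can lean on the stdlib `fnmatch` module. This sketch handles only the two pattern shapes from the examples above (`*.tmp` filename globs and `dir/` directory rules), not full `.gitignore` semantics:

```python
import fnmatch

def is_excluded(path: str, patterns: list[str]) -> bool:
    """True if a posix-style relative path matches any exclusion rule."""
    parts = path.split("/")
    for pat in patterns:
        if pat.endswith("/"):  # directory rule: match any parent component
            if pat.rstrip("/") in parts[:-1]:
                return True
        elif fnmatch.fnmatch(parts[-1], pat):  # filename glob
            return True
    return False
```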
### 3.17 Abstract Storage Providers (Tape, HDD, Cloud)

* **Description:** Making the backup manager generic enough to handle not just sequential LTO tapes, but also random-access offline HDDs (e.g., in a USB dock) and remote Cloud Storage (e.g., AWS S3, Backblaze B2).
* **Implementation:** Introduce a "Storage Provider" backend interface.
    * **Tape Provider:** The default implementation utilizing `mt`, block devices (`/dev/nst0`), physical barcodes, and sequential EOF file marks.
    * **Offline HDD Provider:** Treats an external hard drive like a tape. It writes `tar` streams or raw files to a mounted path (e.g., `/mnt/usb`). It tracks the specific HDD by writing a hidden identifier file (e.g., `.tapehoard_id=HDD-001`) to the root of the drive. The UI still prompts the user: *"Please insert HDD-001 into the dock to continue."*
    * **Cloud Storage Provider:** Uses cloud APIs (e.g., `boto3` for S3). It chunks data into manageable, multipart-upload friendly blocks. Instead of "inserting media," the provider authenticates and streams encrypted chunks directly to a designated bucket, logging the bucket path as the "Media ID" in the database.

### 3.18 Global Inventory & Credential Management

* **Description:** A dedicated interface and backend subsystem to manage the entire fleet of available storage media, their capacities, hardware types, and authentication credentials for cloud providers.
* **Implementation:**
    * **Inventory Dashboard:** The Web UI will feature an "Inventory" tab displaying a comprehensive, filterable list of all registered media (e.g., "LTO-6 Tape BUP-001", "8TB WD Red HDD-002", "AWS S3 Bucket `my-backups`").
    * **Capacity Forecasting:** The dashboard aggregates the total `capacity` and `bytes_used` across all registered media, providing users with a global view of their available "Storage Pool" before they initiate massive backup jobs.
    * **Media Provisioning:** Users can manually register new media through the UI (e.g., pre-registering a box of 10 new LTO-8 tapes, or adding a new 16TB external drive), defining their expected capacities and physical locations.
    * **Credential Manager (Vault):** A secure subsystem within the backend (storing encrypted keys in SQLite or utilizing environment variables) to manage Cloud Provider credentials (e.g., AWS Access Keys, Backblaze B2 Application Keys). This allows the system to seamlessly connect to different cloud buckets without prompting the user for passwords during automated backup jobs.

### 3.19 Disaster Recovery (Index Bootstrap)

* **Description:** The ability to recover the SQLite database index in the event of complete host system failure.
* **Implementation:**
    * **Self-Hosting the DB:** Automatically write a compressed copy of the current SQLite database to the beginning (Tape File 1) or end of the tape every time a tape is finalized or ejected.
    * **Cloud Sync:** An option to automatically push the SQLite DB to an S3 bucket or email it to the user after every successful backup job.
    * **Recovery UI:** A "Recover Index from Media" button on first boot that reads the DB from an inserted tape or HDD, instantly restoring the system to a working state without manually rebuilding the index.

### 3.20 Parallel Hashing & Performance

* **Description:** Dramatically speeding up the filesystem scan by utilizing all available CPU cores.
* **Implementation:** The filesystem scanner must use Python's `multiprocessing` module (or thread pools). While one process walks directories reading `mtime`/`size` from the disk, a pool of worker processes aggressively computes cryptographic hashes (e.g., BLAKE3) in parallel, maximizing local read throughput.

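A minimal sketch using a thread pool (which the plan allows as an alternative to `multiprocessing`) and stdlib `hashlib` standing in for BLAKE3; `hashlib` releases the GIL on large buffers, so threads parallelize reasonably here:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def hash_bytes(payload: bytes) -> str:
    return hashlib.sha256(payload).hexdigest()

def hash_many(payloads: dict[str, bytes], workers: int = 8) -> dict[str, str]:
    """Hash many files' contents in parallel while the scanner keeps walking."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        digests = pool.map(hash_bytes, payloads.values())
        return dict(zip(payloads.keys(), digests))
```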
### 3.21 Checkpointing & Job Resumption

* **Description:** Ensuring massive, multi-hour backup jobs can be resumed after a power outage or hardware failure without starting over.
* **Implementation:** Implement aggressive checkpointing. As each `tar` block (e.g., 100GB chunks) finishes writing and the EOF mark is successfully placed on the media, the SQLite database immediately commits those `file_versions` as "Written." If the system crashes, the next run will detect the partially completed job, fast-forward the media to the last known good EOF mark, and resume exactly where it left off.

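Resume detection then becomes a single query against the committed `file_versions` rows; a sketch using an in-memory stand-in table (the real schema lives in 3.1, and the exact numbering convention around the label at file 0 is an assumption):

```python
import sqlite3

def resume_file_number(conn: sqlite3.Connection, media_id: int) -> int:
    """Tape file number to fast-forward to: one past the last committed archive."""
    row = conn.execute(
        "SELECT MAX(file_number) FROM file_versions WHERE media_id = ?", (media_id,)
    ).fetchone()
    return 0 if row[0] is None else row[0] + 1
```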
### 3.22 Symlinks & Hardlinks Configuration

* **Description:** Preventing infinite loops and storage bloat when backing up Unix/Linux filesystems containing symbolic links.
* **Implementation:** Add a configuration toggle in the UI for **"Follow Symlinks"**.
    * *Disabled (Default):* The backup simply records the link path and its target, taking up almost zero space.
    * *Enabled:* The system treats the symlink as a real directory/file and copies the underlying data.

### 3.23 Audit Logging & Job History

* **Description:** Providing a clear, filterable history of what succeeded, what failed, and why.
* **Implementation:** The newly added `job_logs` table tracks every file written, "Permission Denied" errors during scans, hardware read/write retries, and media ejects. The Web UI will feature a "Logs" tab where users can filter by date, log level (INFO/WARN/ERROR), or specific backup jobs.

### 3.24 Premium UX & Interactive Workflows

* **Description:** Elevating the software to an enterprise-grade experience by making long-running operations visible, safe, and intuitive.
* **Implementation:**
    * **Real-Time Progress (WebSockets/SSE):** The Svelte frontend and backend will communicate via WebSockets or Server-Sent Events. The UI provides a persistent dashboard showing the active job, live progress bars ("Writing to Tape BUP-001 (45% Full)"), transfer speeds, and ETAs. This persists across browser refreshes.
    * **"Dry Run" / Simulation Mode:** Before committing to a multi-day backup job, users can click "Simulate Job". The system walks the filesystem, runs the bin-packing algorithm, and generates a report detailing exactly how many new files will be backed up, how many were excluded, and how much media (e.g., number of tapes) will be required.
    * **Restore Wizard:** When executing a multi-tape restore, the UI launches a step-by-step wizard. It prompts the user for the first tape, extracts the necessary files to the Staging Area, and then prompts for the next tape, ultimately presenting the user with a single cohesive download or directory.
    * **Visualizing Media Fragmentation:** The Inventory dashboard displays a stacked bar chart for each media showing Active Data (Green), Stale/Deleted Data (Yellow), and Free Space (Gray). If a tape is highly fragmented, it displays a recommended "Groom Tape" (Compaction) button.
    * **Interactive Error Handling:** If the backend catches a hardware I/O error, it halts the job and triggers an alert. The UI presents actionable prompts instead of failing silently: *"Hardware Error. Please clean the tape drive head. [Retry Last Block] | [Mark Tape as Bad & Prompt for New] | [Abort Job]"*.

---
|
||||||
|
|
||||||
|
## 4. Software Dependency List
|
||||||
|
|
||||||
|
Assuming implementation in **Python** (a strong fit for CLI tools, SQLite integration, and web backends).
|
||||||
|
|
||||||
|
**System-Level Dependencies:**
|
||||||
|
* `tar`: For archiving and streaming files.
|
||||||
|
* `mt-st` (or standard `mt`): Magnetic Tape control utility for issuing commands like `fsf`, `rewind`, `eject`.
|
||||||
|
* `mtx`: Media Changer Tools for controlling robotic tape libraries.
|
||||||
|
* `smartmontools` & `sg3-utils`: For querying tape drive health and SCSI metrics (`smartctl` and `sg_logs`).
|
||||||
|
* `zstd` / `gpg` / `age`: (Optional) System-level binaries for fast compression and encryption.
|
||||||
|
* Tape Drive Drivers: Appropriate SAS/FC HBA drivers, the `st` (SCSI tape), and `sg` (SCSI generic) kernel modules (Linux/Unix).
|
||||||
|
|
||||||
|
**Language/Application Dependencies (Python):**
|
||||||
|
* `sqlite3`: Built-in Python library for database management.
|
||||||
|
* `click` or `argparse`: For building the robust Command Line Interface.
|
||||||
|
* `hashlib` / `blake3`: For generating fast cryptographic file hashes to handle deduplication.
|
||||||
|
* `subprocess`: Built-in for interacting with `tar`, `mt`, and `split`.
|
||||||
|
* `FastAPI` & `uvicorn` (or `Flask`): For serving the local web application and API endpoints for the web interface.
|
||||||
|
* `prometheus_client`: For exposing the `/metrics` endpoint to Grafana.
|
||||||
|
* `APScheduler`: For managing automated background backup scheduling and tape scrubbing tasks.
|
||||||
|
* `apprise` (Optional): A Python library for sending push notifications (desktop, Slack, Discord, email, etc.) for the alerting feature.
* `tqdm` (Optional): For displaying progress bars during file scanning and writing.
* `Pillow` / `ffmpeg-python` (Optional): For generating image thumbnails and extracting video metadata during scans.

**Frontend Dependencies:**

* Standard HTML/CSS/JS.
* **Svelte 5:** For rendering the Windows Explorer-style tree view, reactive components, and search interactions.

---

## 5. Docker Deployment Strategy

To ensure seamless deployment on NAS operating systems (such as TrueNAS SCALE and Unraid) and standard Linux servers, the application will be packaged as a robust Docker container.

### 5.1 Configurable User and Group IDs (PUID / PGID)

* **Description:** The container will run its internal processes as a specific user mapped to the host system. This prevents permission errors when accessing mounted source files, configuration directories, and the tape block device.
* **Implementation:**
    * Use an entrypoint script (e.g., via `s6-overlay` or a custom shell script) that reads the `PUID` and `PGID` environment variables.
    * On container startup, the script creates or modifies a local non-root user (e.g., `appuser`) to match the provided host IDs.
    * The backend processes and web server are launched under this dynamic user via `su-exec` or `gosu`.
* **Hardware Access:** The dynamically created user is automatically added to the container's `tape` or `disk` group to ensure it has read/write permissions for the passed-through tape drive device (`/dev/nst0`).

### 5.2 Container Architecture & Bind Mounts

* **Base Image:** A lightweight Linux base image (e.g., Debian slim or Alpine) that includes system-level dependencies (`mt-st`, `tar`, SCSI drivers) and Python.
* **Volumes:**
    * `/source_data`: Bind mount for the directories the user wants to back up (mounted read-only).
    * `/config`: Bind mount for persistent storage of the SQLite database and application settings.
    * `/staging`: Bind mount to a high-speed local disk (NVMe/SSD) for buffering `tar` streams, chunking large files, and caching restore operations.
* **Devices:**
    * `--device=/dev/nst0:/dev/nst0`: Passthrough of the non-rewinding SCSI tape device (extendable to `/dev/nst1`, etc., for multiple drives).
    * `--device=/dev/sgX:/dev/sgX`: (Optional) Passthrough of the SCSI Generic device for communicating with the robotic tape changer/library.
* **Ports:** Expose the web interface port (e.g., `8080`) to the host network for user access.
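The mounts, devices, and ports above map onto a compose file roughly like this (a sketch only — the image tag, host paths, and device numbers are placeholders, not the project's shipped `docker-compose.yml`):

```yaml
services:
  tapehoard:
    image: tapehoard:latest          # placeholder image tag
    environment:
      PUID: "1000"                   # host UID that owns the source data
      PGID: "1000"
    volumes:
      - /mnt/pool/media:/source_data:ro   # read-only backup sources
      - ./config:/config                  # SQLite DB + settings
      - /mnt/nvme/staging:/staging        # fast buffer disk
    devices:
      - /dev/nst0:/dev/nst0              # non-rewinding tape drive
      # - /dev/sg3:/dev/sg3              # optional: tape changer/library
    ports:
      - "8080:8080"                      # web UI
```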
---

## 6. Project Directory Structure

To maintain a clean separation of concerns, the repository will be structured as a monorepo containing both the Python backend and the Svelte frontend. This keeps development synchronized and simplifies the Docker build process.

```text
tapehoard/
├── backend/                      # Python/FastAPI Backend
│   ├── app/                      # Main application package
│   │   ├── api/                  # FastAPI route definitions (routers)
│   │   │   ├── backups.py
│   │   │   ├── inventory.py
│   │   │   └── system.py
│   │   ├── core/                 # Configuration, logging, dependencies
│   │   ├── db/                   # SQLite connection, migrations (Alembic), schema
│   │   │   ├── schema.sql
│   │   │   └── models.py         # SQLAlchemy or raw dataclasses
│   │   ├── providers/            # Abstract Storage Providers
│   │   │   ├── base.py           # Provider interface
│   │   │   ├── tape.py           # LTO `mt` / `mtx` implementation
│   │   │   ├── hdd.py            # Offline HDD implementation
│   │   │   └── cloud.py          # Cloud API implementation
│   │   ├── services/             # Core business logic
│   │   │   ├── scanner.py        # Multiprocessing filesystem walker
│   │   │   ├── archiver.py       # Tar streaming, chunking, and EOF marks
│   │   │   └── scheduler.py      # APScheduler background tasks
│   │   └── main.py               # FastAPI application entry point
│   ├── tests/                    # Backend unit and integration tests
│   ├── pyproject.toml            # Python dependencies (Poetry or similar)
│   └── requirements.txt
├── frontend/                     # Svelte 5 Frontend
│   ├── src/
│   │   ├── lib/                  # Shared components, stores, utilities
│   │   │   ├── components/       # UI components (Tree View, Modals, Progress)
│   │   │   ├── stores/           # Svelte stores for global state (WebSockets)
│   │   │   └── utils/            # API clients, formatting functions
│   │   ├── routes/               # SvelteKit routing (pages)
│   │   │   ├── +page.svelte      # Dashboard / Active Jobs
│   │   │   ├── inventory/        # Media management
│   │   │   ├── sources/          # Directory tree and tracking config
│   │   │   └── restores/         # Restore Wizard
│   │   ├── app.html              # HTML template
│   │   └── app.css               # Global styles
│   ├── static/                   # Static assets (images, favicons)
│   ├── package.json              # Node dependencies
│   ├── svelte.config.js
│   └── vite.config.js            # Vite build configuration
├── docker/                       # Docker build and deployment files
│   ├── Dockerfile                # Multi-stage build (builds frontend, then backend)
│   ├── entrypoint.sh             # s6-overlay/su-exec script for PUID/PGID handling
│   └── docker-compose.yml        # Example compose file for users
├── docs/                         # Project documentation
├── .gitignore
├── .dockerignore
└── README.md                     # Project overview and quickstart guide
```

### 6.1 Directory Highlights

* **`backend/app/providers/`:** This is where the core abstraction lives. The `scanner.py` and `archiver.py` services talk to `base.py`, allowing the system to swap between `tape.py`, `hdd.py`, or `cloud.py` seamlessly based on the job.
* **`backend/app/services/scanner.py`:** Contains the heavily optimized, multiprocessing directory traversal and hashing logic.
* **`frontend/src/lib/stores/`:** Handles the persistent WebSocket/SSE connections for live progress bars across the application.
* **`docker/`:** Contains the multi-stage `Dockerfile`. It first uses a Node image to build the Svelte frontend into static HTML/JS/CSS files, then copies those files into the final Python backend image so FastAPI can serve both the API and the web UI from a single container.
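The provider abstraction in `backend/app/providers/` could be sketched as follows (a sketch only — the method names and the `InMemoryProvider` class are assumptions for illustration, not the project's actual `base.py` interface):

```python
import io
from abc import ABC, abstractmethod
from typing import BinaryIO


class StorageProvider(ABC):
    """Hypothetical provider interface; tape.py, hdd.py, and cloud.py
    would each implement these methods."""

    @abstractmethod
    def write_archive(self, stream: BinaryIO, name: str) -> str:
        """Write a tar stream to the medium and return a locator
        (tape file number, HDD path, or cloud object key)."""

    @abstractmethod
    def read_archive(self, locator: str) -> BinaryIO:
        """Open the archive identified by `locator` for a restore."""


class InMemoryProvider(StorageProvider):
    """Toy stand-in for a real backend, useful in unit tests."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    def write_archive(self, stream: BinaryIO, name: str) -> str:
        # Drain the stream and remember it under its archive name.
        self._store[name] = stream.read()
        return name

    def read_archive(self, locator: str) -> BinaryIO:
        return io.BytesIO(self._store[locator])
```

Because `archiver.py` would only depend on the `StorageProvider` methods, swapping tape for HDD or cloud is a constructor-time choice rather than a code change.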
@@ -0,0 +1,16 @@
# TapeHoard

A robust, index-driven Tape Backup Manager designed for single-tape-drive users and scalable to tape libraries.

For full architectural details, see [PLAN.md](PLAN.md).

## Project Structure

* `backend/`: Python/FastAPI application handling the heavy lifting (hashing, streaming, DB indexing).
* `frontend/`: Svelte 5 application providing the web UI.
* `docker/`: Files required for building the multi-stage Docker container.
* `docs/`: Additional documentation.

## Quickstart

(Coming soon)
@@ -0,0 +1,149 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts.
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = %(here)s/alembic

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
# Or organize into date-based subdirectories (requires recursive_version_locations = true)
# file_template = %%(year)d/%%(month).2d/%%(day).2d_%%(hour).2d%%(minute).2d_%%(second).2d_%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory. for multiple paths, the path separator
# is defined by "path_separator" below.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the tzdata library which can be installed by adding
# `alembic[tz]` to the pip requirements.
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =

# max length of characters to apply to the "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to <script_location>/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "path_separator"
# below.
# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions

# path_separator; This indicates what character is used to split lists of file
# paths, including version_locations and prepend_sys_path within configparser
# files such as alembic.ini.
# The default rendered in new alembic.ini files is "os", which uses os.pathsep
# to provide os-dependent path splitting.
#
# Note that in order to support legacy alembic.ini files, this default does NOT
# take place if path_separator is not present in alembic.ini. If this
# option is omitted entirely, fallback logic is as follows:
#
# 1. Parsing of the version_locations option falls back to using the legacy
#    "version_path_separator" key, which if absent then falls back to the legacy
#    behavior of splitting on spaces and/or commas.
# 2. Parsing of the prepend_sys_path option falls back to the legacy
#    behavior of splitting on spaces, commas, or colons.
#
# Valid values for path_separator are:
#
# path_separator = :
# path_separator = ;
# path_separator = space
# path_separator = newline
#
# Use os.pathsep. Default configuration used for new projects.
path_separator = os

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
sqlalchemy.url = sqlite:///tapehoard.db


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
# hooks = ruff
# ruff.type = module
# ruff.module = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME

# Alternatively, use the exec runner to execute a binary found on your PATH
# hooks = ruff
# ruff.type = exec
# ruff.executable = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME

# Logging configuration. This is also consumed by the user-maintained
# env.py script only.
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARNING
handlers = console
qualname =

[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
@@ -0,0 +1 @@
Generic single-database configuration.
@@ -0,0 +1,81 @@
import sys
from os.path import dirname, abspath
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# Add the backend directory to sys.path so we can import the app module
sys.path.insert(0, dirname(dirname(abspath(__file__))))

from app.db.models import Base  # noqa: E402

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(connection=connection, target_metadata=target_metadata)

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
@@ -0,0 +1,28 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


def upgrade() -> None:
    """Upgrade schema."""
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    """Downgrade schema."""
    ${downgrades if downgrades else "pass"}
@@ -0,0 +1,121 @@
"""Initial schema

Revision ID: 9a6e70fabf7b
Revises:
Create Date: 2026-04-22 15:12:10.997643

"""

from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision: str = "9a6e70fabf7b"
down_revision: Union[str, Sequence[str], None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Upgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table(
        "backups",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("job_name", sa.String(), nullable=False),
        sa.Column("job_type", sa.String(), nullable=False),
        sa.Column("start_time", sa.DateTime(), nullable=False),
        sa.Column("end_time", sa.DateTime(), nullable=True),
        sa.Column("status", sa.String(), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_table(
        "filesystem_state",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("file_path", sa.String(), nullable=False),
        sa.Column("size", sa.Integer(), nullable=False),
        sa.Column("mtime", sa.Float(), nullable=False),
        sa.Column("sha256_hash", sa.String(), nullable=True),
        sa.Column("last_seen_timestamp", sa.DateTime(), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_filesystem_state_file_path"),
        "filesystem_state",
        ["file_path"],
        unique=True,
    )
    op.create_index(
        op.f("ix_filesystem_state_sha256_hash"),
        "filesystem_state",
        ["sha256_hash"],
        unique=False,
    )
    op.create_table(
        "storage_media",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("media_type", sa.String(), nullable=False),
        sa.Column("identifier", sa.String(), nullable=False),
        sa.Column("generation_tier", sa.String(), nullable=True),
        sa.Column("capacity", sa.Integer(), nullable=False),
        sa.Column("bytes_used", sa.Integer(), nullable=False),
        sa.Column("location", sa.String(), nullable=True),
        sa.Column("status", sa.String(), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(
        op.f("ix_storage_media_identifier"),
        "storage_media",
        ["identifier"],
        unique=True,
    )
    op.create_table(
        "file_versions",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("filesystem_state_id", sa.Integer(), nullable=False),
        sa.Column("media_id", sa.Integer(), nullable=False),
        sa.Column("file_number", sa.String(), nullable=False),
        sa.Column("offset_in_tar", sa.Integer(), nullable=True),
        sa.ForeignKeyConstraint(
            ["filesystem_state_id"],
            ["filesystem_state.id"],
        ),
        sa.ForeignKeyConstraint(
            ["media_id"],
            ["storage_media.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_table(
        "job_logs",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("backup_id", sa.Integer(), nullable=False),
        sa.Column("timestamp", sa.DateTime(), nullable=False),
        sa.Column("log_level", sa.String(), nullable=False),
        sa.Column("message", sa.String(), nullable=False),
        sa.ForeignKeyConstraint(
            ["backup_id"],
            ["backups.id"],
        ),
        sa.PrimaryKeyConstraint("id"),
    )
    # ### end Alembic commands ###


def downgrade() -> None:
    """Downgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_table("job_logs")
    op.drop_table("file_versions")
    op.drop_index(op.f("ix_storage_media_identifier"), table_name="storage_media")
    op.drop_index(
        op.f("ix_filesystem_state_sha256_hash"), table_name="filesystem_state"
    )
    op.drop_index(op.f("ix_filesystem_state_file_path"), table_name="filesystem_state")
    op.drop_table("filesystem_state")
    op.drop_table("backups")
    # ### end Alembic commands ###
@@ -0,0 +1,8 @@
from fastapi import APIRouter

router = APIRouter(prefix="/backups", tags=["Backups"])


@router.get("/")
def list_backups():
    return []
@@ -0,0 +1,79 @@
from fastapi import APIRouter, Depends
from typing import List, Optional
from pydantic import BaseModel
from sqlalchemy.orm import Session
from app.db.database import get_db
from app.db import models

router = APIRouter(prefix="/inventory", tags=["Inventory"])


class FileItemSchema(BaseModel):
    name: str
    path: str
    type: str
    size: Optional[int] = None
    mtime: Optional[float] = None
    media: List[str] = []  # Media identifiers this file is stored on


@router.get("/browse", response_model=List[FileItemSchema])
def browse_index(path: str = "/", db: Session = Depends(get_db)):
    # The index stores full file paths, so browsing a virtual directory means
    # finding the unique first-level children of `path`: fetch every indexed
    # path with the given prefix, then split off the next path component.
    if not path.endswith("/"):
        path += "/"

    all_files = (
        db.query(models.FilesystemState)
        .filter(models.FilesystemState.file_path.like(f"{path}%"))
        .all()
    )

    results_map = {}

    for f in all_files:
        relative = f.file_path[len(path):]
        if not relative:
            continue

        parts = relative.split("/")
        name = parts[0]
        full_item_path = path + name

        if len(parts) > 1:
            # More components follow, so `name` is a directory: aggregate
            # its size and keep the newest mtime of its contents.
            if name not in results_map:
                results_map[name] = FileItemSchema(
                    name=name, path=full_item_path, type="directory", size=0, mtime=0
                )
            results_map[name].size += f.size
            if f.mtime > results_map[name].mtime:
                results_map[name].mtime = f.mtime
        else:
            # Leaf entry: a file directly inside `path`.
            media_list = [v.media.identifier for v in f.versions]
            results_map[name] = FileItemSchema(
                name=name,
                path=f.file_path,
                type="file",
                size=f.size,
                mtime=f.mtime,
                media=media_list,
            )

    results = list(results_map.values())
    # Directories first, then case-insensitive by name
    results.sort(key=lambda x: (x.type != "directory", x.name.lower()))

    return results
@@ -0,0 +1,169 @@
from fastapi import APIRouter, HTTPException, Depends
import os
from typing import List, Optional
from pydantic import BaseModel
from sqlalchemy.orm import Session
from app.db.database import get_db
from app.db import models

router = APIRouter(prefix="/system", tags=["System"])


class FileItemSchema(BaseModel):
    name: str
    path: str
    type: str  # file, directory, link
    size: Optional[int] = None
    mtime: Optional[float] = None
    tracked: bool = False


class TrackToggleRequest(BaseModel):
    path: str
    is_directory: bool = True


@router.get("/browse", response_model=List[FileItemSchema])
def browse_path(path: str = "/source_data", db: Session = Depends(get_db)):
    # If the absolute path doesn't exist, try relative to the project root
    if not os.path.exists(path):
        local_source = os.path.abspath(os.path.join(os.getcwd(), "..", "source_data"))
        if path == "/source_data" and os.path.exists(local_source):
            path = local_source
        else:
            raise HTTPException(status_code=404, detail=f"Path not found: {path}")

    if not os.path.isdir(path):
        raise HTTPException(status_code=400, detail="Path is not a directory")

    # Get all tracked paths to mark items as tracked
    tracked_paths = {t.path for t in db.query(models.TrackedSource).all()}

    results = []
    try:
        with os.scandir(path) as it:
            for entry in it:
                try:
                    stats = entry.stat(follow_symlinks=False)
                    item_type = "file"
                    if entry.is_dir():
                        item_type = "directory"
                    elif entry.is_symlink():
                        item_type = "link"

                    results.append(
                        FileItemSchema(
                            name=entry.name,
                            path=entry.path,
                            type=item_type,
                            size=stats.st_size,
                            mtime=stats.st_mtime,
                            tracked=entry.path in tracked_paths,
                        )
                    )
                except Exception:
                    continue
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

    # Sort: directories first, then name
    results.sort(key=lambda x: (x.type != "directory", x.name.lower()))

    return results


@router.post("/track")
def track_path(req: TrackToggleRequest, db: Session = Depends(get_db)):
    existing = (
        db.query(models.TrackedSource)
        .filter(models.TrackedSource.path == req.path)
        .first()
    )
    if existing:
        return {"message": "Already tracked"}

    new_track = models.TrackedSource(path=req.path, is_directory=req.is_directory)
    db.add(new_track)
    db.commit()
    return {"message": "Path tracked"}


class BatchTrackRequest(BaseModel):
    tracks: List[str] = []  # Paths to track
    untracks: List[str] = []  # Paths to untrack


@router.post("/track/batch")
def track_batch(req: BatchTrackRequest, db: Session = Depends(get_db)):
    # Handle untracks
    if req.untracks:
        db.query(models.TrackedSource).filter(
            models.TrackedSource.path.in_(req.untracks)
        ).delete(synchronize_session=False)

    # Handle tracks
    if req.tracks:
        # Get existing to avoid duplicates
        existing = {
            t.path
            for t in db.query(models.TrackedSource)
            .filter(models.TrackedSource.path.in_(req.tracks))
            .all()
        }
        new_paths = [path for path in req.tracks if path not in existing]

        for path in new_paths:
            # Note: In a real app we'd verify if it's a directory
            new_track = models.TrackedSource(path=path, is_directory=True)
            db.add(new_track)

    db.commit()
    return {
        "message": f"Processed {len(req.tracks)} tracks and {len(req.untracks)} untracks"
    }


class TreeNodeSchema(BaseModel):
    name: str
    path: str
    has_children: bool = False


@router.get("/tree", response_model=List[TreeNodeSchema])
def get_tree(path: str = "/source_data"):
    if not os.path.exists(path):
        local_source = os.path.abspath(os.path.join(os.getcwd(), "..", "source_data"))
        if path == "/source_data" and os.path.exists(local_source):
            path = local_source
        else:
            raise HTTPException(status_code=404, detail="Path not found")

    if not os.path.isdir(path):
        return []

    results = []
    try:
        with os.scandir(path) as it:
            for entry in it:
                if entry.is_dir():
                    # Check if it has subdirectories for the expander icon
                    has_subdirs = False
                    try:
                        with os.scandir(entry.path) as sub_it:
                            for sub_entry in sub_it:
                                if sub_entry.is_dir():
                                    has_subdirs = True
                                    break
                    except Exception:
                        pass

                    results.append(
                        TreeNodeSchema(
                            name=entry.name, path=entry.path, has_children=has_subdirs
                        )
                    )
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

    results.sort(key=lambda x: x.name.lower())
    return results
@@ -0,0 +1,21 @@
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Dependency mapping for FastAPI
# Using standard relative path, but easily overridden with env vars later
SQLALCHEMY_DATABASE_URL = "sqlite:///tapehoard.db"

# connect_args={"check_same_thread": False} is required for SQLite in FastAPI
engine = create_engine(
    SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
@@ -0,0 +1,90 @@
from datetime import datetime
from typing import Optional, List

from sqlalchemy import Integer, String, Float, ForeignKey, DateTime, Boolean
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship


class Base(DeclarativeBase):
    pass


class FilesystemState(Base):
    __tablename__ = "filesystem_state"

    id: Mapped[int] = mapped_column(primary_key=True)
    file_path: Mapped[str] = mapped_column(String, index=True, unique=True)
    size: Mapped[int] = mapped_column(Integer)
    mtime: Mapped[float] = mapped_column(Float)
    sha256_hash: Mapped[Optional[str]] = mapped_column(String, index=True)
    last_seen_timestamp: Mapped[datetime] = mapped_column(DateTime)

    versions: Mapped[List["FileVersion"]] = relationship(back_populates="file_state")


class StorageMedia(Base):
    __tablename__ = "storage_media"

    id: Mapped[int] = mapped_column(primary_key=True)
    media_type: Mapped[str] = mapped_column(String)  # tape, hdd, cloud
    identifier: Mapped[str] = mapped_column(
        String, unique=True, index=True
    )  # barcode, UUID, bucket
    generation_tier: Mapped[Optional[str]] = mapped_column(
        String
    )  # e.g., LTO-6, S3 Standard
    capacity: Mapped[int] = mapped_column(Integer)  # Native capacity in bytes
    bytes_used: Mapped[int] = mapped_column(Integer, default=0)
    location: Mapped[Optional[str]] = mapped_column(String)
    status: Mapped[str] = mapped_column(
        String, default="active"
    )  # active, full, retired, offline

    versions: Mapped[List["FileVersion"]] = relationship(back_populates="media")


class BackupJob(Base):
    __tablename__ = "backups"

    id: Mapped[int] = mapped_column(primary_key=True)
    job_name: Mapped[str] = mapped_column(String)
    job_type: Mapped[str] = mapped_column(String)  # initial, incremental
    start_time: Mapped[datetime] = mapped_column(DateTime)
    end_time: Mapped[Optional[datetime]] = mapped_column(DateTime)
    status: Mapped[str] = mapped_column(String)  # running, success, failed, aborted

    logs: Mapped[List["JobLog"]] = relationship(back_populates="backup")


class FileVersion(Base):
    __tablename__ = "file_versions"

    id: Mapped[int] = mapped_column(primary_key=True)
    filesystem_state_id: Mapped[int] = mapped_column(ForeignKey("filesystem_state.id"))
    media_id: Mapped[int] = mapped_column(ForeignKey("storage_media.id"))
    file_number: Mapped[str] = mapped_column(String)  # Tape position or object path
    offset_in_tar: Mapped[Optional[int]] = mapped_column(Integer)

    file_state: Mapped["FilesystemState"] = relationship(back_populates="versions")
    media: Mapped["StorageMedia"] = relationship(back_populates="versions")


class TrackedSource(Base):
    __tablename__ = "tracked_sources"

    id: Mapped[int] = mapped_column(primary_key=True)
    path: Mapped[str] = mapped_column(String, unique=True, index=True)
    is_directory: Mapped[bool] = mapped_column(Boolean, default=True)
    created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)


class JobLog(Base):
    __tablename__ = "job_logs"

    id: Mapped[int] = mapped_column(primary_key=True)
    backup_id: Mapped[int] = mapped_column(ForeignKey("backups.id"))
    timestamp: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
    log_level: Mapped[str] = mapped_column(String)  # INFO, WARN, ERROR
    message: Mapped[str] = mapped_column(String)

    backup: Mapped["BackupJob"] = relationship(back_populates="logs")
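`StorageMedia` tracks `capacity` and `bytes_used` in raw bytes, with `status` expected to move from `active` to `full` once the media fills. A stdlib-only sketch of that bookkeeping, assuming writes are accounted for one at a time (the `Media` dataclass and `record_write` helper are hypothetical, not part of the models above):

```python
from dataclasses import dataclass


@dataclass
class Media:
    # Hypothetical stand-in for the StorageMedia ORM model above.
    capacity: int            # native capacity in bytes
    bytes_used: int = 0
    status: str = "active"   # active, full, retired, offline


def record_write(media: Media, nbytes: int) -> None:
    """Account for nbytes written; flip status to 'full' at capacity."""
    if media.status != "active":
        raise ValueError(f"cannot write to {media.status} media")
    if media.bytes_used + nbytes > media.capacity:
        raise ValueError("write would exceed native capacity")
    media.bytes_used += nbytes
    if media.bytes_used == media.capacity:
        media.status = "full"


tape = Media(capacity=2_500_000_000_000)  # e.g. LTO-6 native, 2.5 TB
record_write(tape, 1_000_000_000)
assert tape.status == "active" and tape.bytes_used == 1_000_000_000
```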
@@ -0,0 +1,33 @@
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

from app.api import system, inventory, backups
from app.db.database import engine
from app.db import models

# Create tables
models.Base.metadata.create_all(bind=engine)

app = FastAPI(
    title="TapeHoard API",
    description="A robust, index-driven Tape Backup Manager",
    version="0.1.0",
)

# Configure CORS
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],  # In production, this should be restricted
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Include routers
app.include_router(system.router)
app.include_router(inventory.router)
app.include_router(backups.router)


@app.get("/")
def read_root():
    return {"message": "Welcome to TapeHoard API"}
@@ -0,0 +1,35 @@
[project]
name = "tapehoard"
version = "0.1.0"
description = "A robust, index-driven Tape Backup Manager"
readme = "../README.md"
requires-python = ">=3.10"
dependencies = [
    "alembic",
    "fastapi",
    "sqlalchemy",
    "uvicorn[standard]",
    "prometheus-client",
    "apscheduler",
    "apprise",
    "loguru>=0.7.3",
    "pydantic-settings>=2.14.0",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["app"]

[tool.uv]
managed = true

[dependency-groups]
dev = [
    "pre-commit>=4.6.0",
    "pytest>=9.0.3",
    "ruff>=0.15.11",
    "ty>=0.0.32",
]
Generated file (+1427 lines): diff suppressed because it is too large.
@@ -0,0 +1,85 @@
# TapeHoard Design System (THDS)

## 1. Vision & Philosophy
TapeHoard is a high-precision tool for managing massive datasets on physical and cloud media. The design language must reflect **reliability, industrial precision, and information density.** It should feel like a modern piece of mission-control software for a data center, scaled down for the sophisticated homelabber.

---

## 2. Visual Identity

### 2.1 Color Palette (Dark Mode First)
* **Primary Background:** `#0B0E14` (Deep Space Black)
* **Secondary Background (Cards/Panels):** `#161B22` (Industrial Slate)
* **Border / Divider:** `#30363D` (Steel Grey)
* **Primary Action:** `#3498DB` (Azure Blue)
* **Success:** `#2ECC71` (Emerald Green)
* **Warning:** `#F1C40F` (Amber Gold)
* **Error / Critical:** `#E74C3C` (Alizarin Crimson)
* **Text (Primary):** `#F0F6FC` (Off-White)
* **Text (Secondary/Muted):** `#8B949E` (Grey-Blue)

### 2.2 Typography
* **UI Text:** `Inter` or `Geist Sans` (clean, highly readable at small sizes).
* **Data / Code:** `JetBrains Mono` or `Fira Code` (used for file paths, barcodes, hashes, and logs).
* **Hierarchy:**
    * **H1:** 2rem, Bold, high-contrast.
    * **H2/H3:** 1.5rem / 1.25rem, Semi-bold.
    * **Body:** 0.95rem, Regular.
    * **Small/Caption:** 0.8rem, Muted.

---

## 3. UI Components & Patterns

### 3.1 "Glassmorphism 2.0" (Layering)
Use subtle background blurs and semi-transparent overlays for modals and sidebars to create a sense of depth and spatial awareness.
* *Example:* The Restore Wizard modal should feel "layered" over the dashboard.

### 3.2 Industrial Components
* **The Tape Progress Bar:** Unlike a standard loading bar, the tape progress bar should have "notched" intervals and a subtle "reel" animation to evoke the feeling of physical media movement.
* **Media Badges:** Tapes, HDDs, and Cloud buckets should have distinct, high-contrast badges with recognizable icons from Lucide.
* **FTS Search Bar:** A "Command Palette" style search bar (Cmd+K) that floats in the center of the screen for instant access.

### 3.3 Status Indicators
* **Pulsing Dots:** Use small pulsing green/amber/red dots for active jobs and hardware status.
* **Toast Notifications (Sonner):** Sleek, non-intrusive alerts in the bottom-right corner for background completions and hardware events.

### 3.4 Two-Pane Explorer Layout
The File Browser must use a classic two-pane architecture to provide a native-feeling management experience:
* **Navigation Sidebar (Directory Tree):** The left pane contains a persistent, hierarchical tree view of the directory structure. Users can expand/collapse folders and select a directory to populate the right pane.
* **Detail Pane (File List):** The right pane displays the members (files and sub-directories) of the folder selected in the sidebar. This pane supports sorting, searching, and batch selection.
* **Interactivity:** Navigation is synchronized; double-clicking a folder in the Detail Pane automatically expands and scrolls to that folder in the Sidebar Tree.

### 3.5 Core File Browser Utility Features
To maintain an enterprise-grade experience, the utility must support:
* **Multi-Select & Range Selection:** Standard desktop logic (Shift+Click, Ctrl/Cmd+Click).
* **Column Sorting:** Interactive headers for Name, Size, and Date Modified.
* **Metadata Detail Pane:** A collapsible right-hand panel for file previews and technical metadata (BLAKE3, storage location).
* **Keyboard Shortcuts:** Arrow keys for movement, Enter for navigation, Space for tracking toggle.
* **Visual Status Indicators:** Clear, distinct styling for "Tracked", "Excluded", and "Modified" files.

---

## 4. Interaction Language

### 4.1 Tactile Feedback
* **Buttons:** Subtle scale-down effect (98%) on click to provide a "physical" press sensation.
* **Transitions:** Use Svelte's `fly` and `fade` transitions for all page navigations and modal entries to make the app feel "alive."

### 4.2 Information Density
* **Data Grids:** Use compact row heights with clear hover highlighting.
* **The "Metadata Drawer":** Instead of a separate page, clicking a file in the Virtual Filesystem should slide out a detailed drawer from the right side of the screen, keeping the user in their browsing context.

---

## 5. Iconography (Lucide-Svelte)
* **Navigation:** `LayoutDashboard`, `Library`, `FolderTree`, `History`, `Settings`.
* **Actions:** `Database`, `HardDrive`, `CloudDownload`, `Scissors` (for splitting), `Trash2`, `Edit3`.
* **Status:** `CheckCircle2`, `AlertTriangle`, `XCircle`, `Info`.

---

## 6. Implementation Principles (Vanilla CSS)
* **Variables:** Define all colors and spacing as CSS variables in `app.css`.
* **Flexbox/Grid:** Strictly use modern CSS layouts (no legacy floats).
* **Consistency:** Every card, button, and input must strictly adhere to the defined spacing and border-radius (default: `8px`).
Generated file (+2802 lines): diff suppressed because it is too large.
@@ -0,0 +1,35 @@
{
  "name": "tapehoard-frontend",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "vite dev",
    "build": "vite build",
    "preview": "vite preview",
    "check": "svelte-check --tsconfig ./tsconfig.json",
    "check:watch": "svelte-check --tsconfig ./tsconfig.json --watch"
  },
  "devDependencies": {
    "@hey-api/openapi-ts": "^0.96.1",
    "@sveltejs/adapter-static": "^3.0.10",
    "@sveltejs/kit": "^2.57.1",
    "@sveltejs/vite-plugin-svelte": "^7.0.0",
    "@tailwindcss/postcss": "^4.2.4",
    "@tailwindcss/vite": "^4.2.4",
    "autoprefixer": "^10.5.0",
    "bits-ui": "^1.0.0-next.98",
    "clsx": "^2.1.1",
    "lucide-svelte": "^1.0.1",
    "postcss": "^8.5.10",
    "svelte": "^5.55.4",
    "svelte-check": "^4.4.6",
    "svelte-sonner": "^1.1.0",
    "tailwind-merge": "^3.5.0",
    "tailwind-variants": "^3.2.2",
    "tailwindcss": "^4.2.4",
    "tslib": "^2.8.1",
    "typescript": "^6.0.3",
    "vite": "^8.0.9"
  }
}
@@ -0,0 +1,293 @@
@import "tailwindcss";

@theme {
  /* THDS Colors */
  --color-bg-primary: #0b0e14;
  --color-bg-secondary: #161b22;
  --color-bg-tertiary: #21262d;
  --color-border-color: #30363d;
  --color-action-color: #3498db;
  --color-success-color: #2ecc71;
  --color-warning-color: #f1c40f;
  --color-error-color: #e74c3c;
  --color-text-primary: #f0f6fc;
  --color-text-secondary: #8b949e;

  /* Shadcn/UI Mappings */
  --color-background: var(--color-bg-primary);
  --color-foreground: var(--color-text-primary);

  --color-muted: var(--color-bg-tertiary);
  --color-muted-foreground: var(--color-text-secondary);

  --color-popover: var(--color-bg-primary);
  --color-popover-foreground: var(--color-text-primary);

  --color-card: var(--color-bg-secondary);
  --color-card-foreground: var(--color-text-primary);

  --color-border: var(--color-border-color);
  --color-input: var(--color-border-color);

  --color-primary: var(--color-action-color);
  --color-primary-foreground: #ffffff;

  --color-secondary: var(--color-bg-tertiary);
  --color-secondary-foreground: var(--color-text-primary);

  --color-accent: var(--color-bg-tertiary);
  --color-accent-foreground: var(--color-text-primary);

  --color-destructive: var(--color-error-color);
  --color-destructive-foreground: #ffffff;

  /* Spacing & Radii */
  --spacing-xs: 0.25rem;
  --spacing-sm: 0.5rem;
  --spacing-md: 1rem;
  --spacing-lg: 1.5rem;
  --spacing-xl: 2rem;

  --radius-lg: 0.5rem;
  --radius-md: calc(0.5rem - 2px);
  --radius-sm: calc(0.5rem - 4px);

  --font-sans: "Inter", ui-sans-serif, system-ui, sans-serif;
  --font-mono: "JetBrains Mono", "Fira Code", ui-monospace, SFMono-Regular, monospace;
}

@layer base {
  *, ::after, ::before {
    @apply border-border;
  }
  body {
    @apply bg-background text-foreground font-sans antialiased;
    margin: 0;
  }
}

/* --- Layout --- */

.app-container {
  display: flex;
  height: 100vh;
  width: 100vw;
  overflow: hidden;
  background-color: var(--color-bg-primary);
}

nav {
  width: 240px;
  background-color: var(--color-bg-secondary);
  border-right: 1px solid var(--color-border-color);
  display: flex;
  flex-direction: column;
  padding: var(--spacing-lg);
  flex-shrink: 0;
}

nav h2 {
  font-size: 1.5rem;
  margin-bottom: var(--spacing-xl);
  font-weight: 700;
  display: flex;
  align-items: center;
  gap: var(--spacing-sm);
  color: var(--color-text-primary);
}

nav ul {
  list-style: none;
  padding: 0;
  margin: 0;
}

nav ul li {
  margin-bottom: var(--spacing-sm);
}

nav ul li a {
  color: var(--color-text-secondary);
  text-decoration: none;
  font-weight: 500;
  display: flex;
  align-items: center;
  gap: var(--spacing-md);
  padding: var(--spacing-sm) var(--spacing-md);
  border-radius: var(--radius-md);
  transition: all 0.2s ease;
}

nav ul li a:hover, nav ul li a.active {
  background-color: var(--color-bg-tertiary);
  color: var(--color-text-primary);
}

main {
  flex: 1;
  padding: var(--spacing-xl);
  overflow-y: auto;
}

/* --- Global Utilities --- */

.mono {
  font-family: var(--font-mono);
}

.subtitle {
  color: var(--color-text-secondary);
  font-size: 0.95rem;
  margin-top: -0.25rem;
  margin-bottom: var(--spacing-lg);
}

/* Industrial Data Tables */
.data-table {
  width: 100%;
  border-collapse: collapse;
  text-align: left;
}

.data-table th {
  padding: var(--spacing-md);
  color: var(--color-text-secondary);
  font-size: 0.75rem;
  text-transform: uppercase;
  border-bottom: 1px solid var(--color-border-color);
  font-weight: 600;
  letter-spacing: 0.05em;
}

.data-table td {
  padding: var(--spacing-md);
  border-bottom: 1px solid var(--color-border-color);
  font-size: 0.9rem;
}

.data-table tr:hover td {
  background-color: rgba(255, 255, 255, 0.02);
}

/* Component Enhancements */
.card-thds {
  background-color: var(--color-bg-secondary);
  border: 1px solid var(--color-border-color);
  border-radius: var(--radius-lg);
  padding: 1.5rem;
}

/* --- Buttons (Machined Industrial Style) --- */

.btn {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  gap: var(--spacing-sm);
  padding: 0.5rem 1.25rem;
  border-radius: var(--radius-md);
  font-size: 0.8125rem;
  font-weight: 600;
  text-transform: uppercase;
  letter-spacing: 0.025em;
  cursor: pointer;
  transition: all 0.1s cubic-bezier(0.4, 0, 0.2, 1);
  border: 1px solid var(--color-border-color);
  user-select: none;
  white-space: nowrap;
  position: relative;
}

.btn::after {
  content: '';
  position: absolute;
  inset: 0;
  border-radius: inherit;
  background: linear-gradient(to bottom, rgba(255,255,255,0.05), transparent);
  pointer-events: none;
}

.btn:active {
  transform: translateY(1px);
  box-shadow: inset 0 2px 4px rgba(0,0,0,0.3);
}

.btn:disabled {
  opacity: 0.4;
  cursor: not-allowed;
  filter: grayscale(1);
}

.btn-primary {
  background: linear-gradient(to bottom, #4aa3df, var(--color-action-color));
  color: #ffffff;
  border-color: #2980b9;
  text-shadow: 0 1px 2px rgba(0,0,0,0.2);
  box-shadow:
    0 1px 0 rgba(255,255,255,0.2) inset,
    0 2px 4px rgba(0,0,0,0.2);
}

.btn-primary:hover {
  filter: brightness(1.1);
  box-shadow:
    0 0 12px rgba(52, 152, 219, 0.4),
    0 1px 0 rgba(255,255,255,0.2) inset;
}

.btn-secondary {
  background: linear-gradient(to bottom, #21262d, #161b22);
  color: var(--color-text-primary);
  border-color: var(--color-border-color);
  box-shadow:
    0 1px 0 rgba(255,255,255,0.05) inset,
    0 2px 4px rgba(0,0,0,0.1);
}

.btn-secondary:hover {
  border-color: var(--color-text-secondary);
  background: linear-gradient(to bottom, #2d333b, #21262d);
}

.btn-warning {
  background: linear-gradient(to bottom, #f39c12, #e67e22);
  color: #ffffff;
  border-color: #d35400;
  text-shadow: 0 1px 1px rgba(0,0,0,0.2);
  box-shadow: 0 1px 0 rgba(255,255,255,0.2) inset;
}

.btn-warning:hover {
  filter: brightness(1.1);
  box-shadow: 0 0 12px rgba(230, 126, 34, 0.3);
}

.btn-ghost {
  background: transparent;
  color: var(--color-text-secondary);
  border-color: transparent;
}

.btn-ghost:hover {
  background: rgba(255,255,255,0.03);
  color: var(--color-text-primary);
}

.btn-icon-only {
  padding: 0.5rem;
  aspect-ratio: 1;
  width: 2.25rem;
  height: 2.25rem;
}

/* Badges */
.badge {
  display: inline-flex;
  align-items: center;
  padding: 0.2rem 0.5rem;
  border-radius: var(--radius-sm);
  font-size: 0.7rem;
  font-weight: 700;
  text-transform: uppercase;
}
@@ -0,0 +1,12 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <link rel="icon" href="%sveltekit.assets%/favicon.png" />
    <meta name="viewport" content="width=device-width, initial-scale=1" />
    %sveltekit.head%
  </head>
  <body data-sveltekit-preload-data="hover">
    <div style="display: contents">%sveltekit.body%</div>
  </body>
</html>
@@ -0,0 +1,16 @@
// This file is auto-generated by @hey-api/openapi-ts

import { type ClientOptions, type Config, createClient, createConfig } from './client';
import type { ClientOptions as ClientOptions2 } from './types.gen';

/**
 * The `createClientConfig()` function will be called on client initialization
 * and the returned object will become the client's initial configuration.
 *
 * You may want to initialize your client this way instead of calling
 * `setConfig()`. This is useful for example if you're using Next.js
 * to ensure your client always has the correct values.
 */
export type CreateClientConfig<T extends ClientOptions = ClientOptions2> = (override?: Config<ClientOptions & T>) => Config<Required<ClientOptions> & T>;

export const client = createClient(createConfig<ClientOptions2>({ baseUrl: 'http://localhost:8000' }));
@@ -0,0 +1,298 @@
// This file is auto-generated by @hey-api/openapi-ts

import { createSseClient } from '../core/serverSentEvents.gen';
import type { HttpMethod } from '../core/types.gen';
import { getValidRequestBody } from '../core/utils.gen';
import type { Client, Config, RequestOptions, ResolvedRequestOptions } from './types.gen';
import {
  buildUrl,
  createConfig,
  createInterceptors,
  getParseAs,
  mergeConfigs,
  mergeHeaders,
  setAuthParams,
} from './utils.gen';

type ReqInit = Omit<RequestInit, 'body' | 'headers'> & {
  body?: any;
  headers: ReturnType<typeof mergeHeaders>;
};

export const createClient = (config: Config = {}): Client => {
  let _config = mergeConfigs(createConfig(), config);

  const getConfig = (): Config => ({ ..._config });

  const setConfig = (config: Config): Config => {
    _config = mergeConfigs(_config, config);
    return getConfig();
  };

  const interceptors = createInterceptors<Request, Response, unknown, ResolvedRequestOptions>();

  const beforeRequest = async <
    TData = unknown,
    TResponseStyle extends 'data' | 'fields' = 'fields',
    ThrowOnError extends boolean = boolean,
    Url extends string = string,
  >(
    options: RequestOptions<TData, TResponseStyle, ThrowOnError, Url>,
  ) => {
    const opts = {
      ..._config,
      ...options,
      fetch: options.fetch ?? _config.fetch ?? globalThis.fetch,
      headers: mergeHeaders(_config.headers, options.headers),
      serializedBody: undefined as string | undefined,
    };

    if (opts.security) {
      await setAuthParams({
        ...opts,
        security: opts.security,
      });
    }

    if (opts.requestValidator) {
      await opts.requestValidator(opts);
    }

    if (opts.body !== undefined && opts.bodySerializer) {
      opts.serializedBody = opts.bodySerializer(opts.body) as string | undefined;
    }

    // remove Content-Type header if body is empty to avoid sending invalid requests
    if (opts.body === undefined || opts.serializedBody === '') {
      opts.headers.delete('Content-Type');
    }

    const resolvedOpts = opts as typeof opts &
      ResolvedRequestOptions<TResponseStyle, ThrowOnError, Url>;
    const url = buildUrl(resolvedOpts);

    return { opts: resolvedOpts, url };
  };

  const request: Client['request'] = async (options) => {
    const { opts, url } = await beforeRequest(options);
    const requestInit: ReqInit = {
      redirect: 'follow',
      ...opts,
      body: getValidRequestBody(opts),
    };

    let request = new Request(url, requestInit);

    for (const fn of interceptors.request.fns) {
      if (fn) {
        request = await fn(request, opts);
      }
    }

    // fetch must be assigned here, otherwise it would throw the error:
    // TypeError: Failed to execute 'fetch' on 'Window': Illegal invocation
    const _fetch = opts.fetch!;
    let response: Response;

    try {
      response = await _fetch(request);
    } catch (error) {
      // Handle fetch exceptions (AbortError, network errors, etc.)
      let finalError = error;

      for (const fn of interceptors.error.fns) {
        if (fn) {
          finalError = (await fn(error, undefined as any, request, opts)) as unknown;
        }
      }

      finalError = finalError || ({} as unknown);

      if (opts.throwOnError) {
        throw finalError;
      }

      // Return error response
      return opts.responseStyle === 'data'
        ? undefined
        : {
            error: finalError,
            request,
            response: undefined as any,
          };
    }

    for (const fn of interceptors.response.fns) {
      if (fn) {
        response = await fn(response, request, opts);
      }
    }

    const result = {
      request,
      response,
    };

    if (response.ok) {
      const parseAs =
        (opts.parseAs === 'auto'
          ? getParseAs(response.headers.get('Content-Type'))
          : opts.parseAs) ?? 'json';

      if (response.status === 204 || response.headers.get('Content-Length') === '0') {
        let emptyData: any;
        switch (parseAs) {
          case 'arrayBuffer':
          case 'blob':
          case 'text':
            emptyData = await response[parseAs]();
            break;
          case 'formData':
            emptyData = new FormData();
            break;
          case 'stream':
            emptyData = response.body;
            break;
          case 'json':
          default:
            emptyData = {};
            break;
        }
        return opts.responseStyle === 'data'
          ? emptyData
          : {
              data: emptyData,
              ...result,
            };
      }

      let data: any;
      switch (parseAs) {
        case 'arrayBuffer':
        case 'blob':
        case 'formData':
        case 'text':
          data = await response[parseAs]();
          break;
        case 'json': {
          // Some servers return 200 with no Content-Length and empty body.
          // response.json() would throw; read as text and parse if non-empty.
          const text = await response.text();
          data = text ? JSON.parse(text) : {};
          break;
        }
        case 'stream':
          return opts.responseStyle === 'data'
            ? response.body
            : {
                data: response.body,
                ...result,
              };
      }

      if (parseAs === 'json') {
        if (opts.responseValidator) {
          await opts.responseValidator(data);
        }

        if (opts.responseTransformer) {
          data = await opts.responseTransformer(data);
        }
      }

      return opts.responseStyle === 'data'
        ? data
        : {
            data,
            ...result,
          };
    }

    const textError = await response.text();
    let jsonError: unknown;

    try {
      jsonError = JSON.parse(textError);
    } catch {
      // noop
    }

    const error = jsonError ?? textError;
    let finalError = error;

    for (const fn of interceptors.error.fns) {
      if (fn) {
        finalError = (await fn(error, response, request, opts)) as string;
      }
    }

    finalError = finalError || ({} as string);

    if (opts.throwOnError) {
      throw finalError;
    }

    // TODO: we probably want to return error and improve types
    return opts.responseStyle === 'data'
      ? undefined
      : {
          error: finalError,
          ...result,
        };
  };

  const makeMethodFn = (method: Uppercase<HttpMethod>) => (options: RequestOptions) =>
    request({ ...options, method });

  const makeSseFn = (method: Uppercase<HttpMethod>) => async (options: RequestOptions) => {
    const { opts, url } = await beforeRequest(options);
    return createSseClient({
      ...opts,
      body: opts.body as BodyInit | null | undefined,
      headers: opts.headers as unknown as Record<string, string>,
      method,
      onRequest: async (url, init) => {
        let request = new Request(url, init);
        for (const fn of interceptors.request.fns) {
          if (fn) {
            request = await fn(request, opts);
          }
        }
        return request;
      },
      serializedBody: getValidRequestBody(opts) as BodyInit | null | undefined,
      url,
    });
  };

  const _buildUrl: Client['buildUrl'] = (options) => buildUrl({ ..._config, ...options });

  return {
    buildUrl: _buildUrl,
    connect: makeMethodFn('CONNECT'),
    delete: makeMethodFn('DELETE'),
    get: makeMethodFn('GET'),
    getConfig,
    head: makeMethodFn('HEAD'),
    interceptors,
    options: makeMethodFn('OPTIONS'),
    patch: makeMethodFn('PATCH'),
    post: makeMethodFn('POST'),
    put: makeMethodFn('PUT'),
    request,
    setConfig,
    sse: {
      connect: makeSseFn('CONNECT'),
      delete: makeSseFn('DELETE'),
      get: makeSseFn('GET'),
      head: makeSseFn('HEAD'),
      options: makeSseFn('OPTIONS'),
      patch: makeSseFn('PATCH'),
      post: makeSseFn('POST'),
      put: makeSseFn('PUT'),
      trace: makeSseFn('TRACE'),
    },
    trace: makeMethodFn('TRACE'),
  } as Client;
};
@@ -0,0 +1,25 @@
// This file is auto-generated by @hey-api/openapi-ts

export type { Auth } from '../core/auth.gen';
export type { QuerySerializerOptions } from '../core/bodySerializer.gen';
export {
  formDataBodySerializer,
  jsonBodySerializer,
  urlSearchParamsBodySerializer,
} from '../core/bodySerializer.gen';
export { buildClientParams } from '../core/params.gen';
export { serializeQueryKeyValue } from '../core/queryKeySerializer.gen';
export { createClient } from './client.gen';
export type {
  Client,
  ClientOptions,
  Config,
  CreateClientConfig,
  Options,
  RequestOptions,
  RequestResult,
  ResolvedRequestOptions,
  ResponseStyle,
  TDataShape,
} from './types.gen';
export { createConfig, mergeHeaders } from './utils.gen';
@@ -0,0 +1,215 @@
// This file is auto-generated by @hey-api/openapi-ts

import type { Auth } from '../core/auth.gen';
import type {
  ServerSentEventsOptions,
  ServerSentEventsResult,
} from '../core/serverSentEvents.gen';
import type { Client as CoreClient, Config as CoreConfig } from '../core/types.gen';
import type { Middleware } from './utils.gen';

export type ResponseStyle = 'data' | 'fields';

export interface Config<T extends ClientOptions = ClientOptions>
  extends Omit<RequestInit, 'body' | 'headers' | 'method'>, CoreConfig {
  /**
   * Base URL for all requests made by this client.
   */
  baseUrl?: T['baseUrl'];
  /**
   * Fetch API implementation. You can use this option to provide a custom
   * fetch instance.
   *
   * @default globalThis.fetch
   */
  fetch?: typeof fetch;
  /**
   * Please don't use the Fetch client for Next.js applications. The `next`
   * options won't have any effect.
   *
   * Install {@link https://www.npmjs.com/package/@hey-api/client-next `@hey-api/client-next`} instead.
   */
  next?: never;
  /**
   * Return the response data parsed in a specified format. By default, `auto`
   * will infer the appropriate method from the `Content-Type` response header.
   * You can override this behavior with any of the {@link Body} methods.
   * Select `stream` if you don't want to parse response data at all.
   *
   * @default 'auto'
   */
  parseAs?: 'arrayBuffer' | 'auto' | 'blob' | 'formData' | 'json' | 'stream' | 'text';
  /**
   * Should we return only data or multiple fields (data, error, response, etc.)?
   *
   * @default 'fields'
   */
  responseStyle?: ResponseStyle;
  /**
   * Throw an error instead of returning it in the response?
   *
   * @default false
   */
  throwOnError?: T['throwOnError'];
}

export interface RequestOptions<
  TData = unknown,
  TResponseStyle extends ResponseStyle = 'fields',
  ThrowOnError extends boolean = boolean,
  Url extends string = string,
>
  extends
    Config<{
      responseStyle: TResponseStyle;
      throwOnError: ThrowOnError;
    }>,
    Pick<
      ServerSentEventsOptions<TData>,
      | 'onRequest'
      | 'onSseError'
      | 'onSseEvent'
      | 'sseDefaultRetryDelay'
      | 'sseMaxRetryAttempts'
      | 'sseMaxRetryDelay'
    > {
  /**
   * Any body that you want to add to your request.
   *
   * {@link https://developer.mozilla.org/docs/Web/API/fetch#body}
   */
  body?: unknown;
  path?: Record<string, unknown>;
  query?: Record<string, unknown>;
  /**
   * Security mechanism(s) to use for the request.
   */
  security?: ReadonlyArray<Auth>;
  url: Url;
}

export interface ResolvedRequestOptions<
  TResponseStyle extends ResponseStyle = 'fields',
  ThrowOnError extends boolean = boolean,
  Url extends string = string,
> extends RequestOptions<unknown, TResponseStyle, ThrowOnError, Url> {
  headers: Headers;
  serializedBody?: string;
}

export type RequestResult<
  TData = unknown,
  TError = unknown,
  ThrowOnError extends boolean = boolean,
  TResponseStyle extends ResponseStyle = 'fields',
> = ThrowOnError extends true
  ? Promise<
      TResponseStyle extends 'data'
        ? TData extends Record<string, unknown>
          ? TData[keyof TData]
          : TData
        : {
            data: TData extends Record<string, unknown> ? TData[keyof TData] : TData;
            request: Request;
            response: Response;
          }
    >
  : Promise<
      TResponseStyle extends 'data'
        ? (TData extends Record<string, unknown> ? TData[keyof TData] : TData) | undefined
        : (
            | {
                data: TData extends Record<string, unknown> ? TData[keyof TData] : TData;
                error: undefined;
              }
            | {
                data: undefined;
                error: TError extends Record<string, unknown> ? TError[keyof TError] : TError;
              }
          ) & {
            request: Request;
            response: Response;
          }
    >;

export interface ClientOptions {
  baseUrl?: string;
  responseStyle?: ResponseStyle;
  throwOnError?: boolean;
}

type MethodFn = <
  TData = unknown,
  TError = unknown,
  ThrowOnError extends boolean = false,
  TResponseStyle extends ResponseStyle = 'fields',
>(
  options: Omit<RequestOptions<TData, TResponseStyle, ThrowOnError>, 'method'>,
) => RequestResult<TData, TError, ThrowOnError, TResponseStyle>;

type SseFn = <
  TData = unknown,
  TError = unknown,
  ThrowOnError extends boolean = false,
  TResponseStyle extends ResponseStyle = 'fields',
>(
  options: Omit<RequestOptions<never, TResponseStyle, ThrowOnError>, 'method'>,
) => Promise<ServerSentEventsResult<TData, TError>>;

type RequestFn = <
  TData = unknown,
  TError = unknown,
  ThrowOnError extends boolean = false,
  TResponseStyle extends ResponseStyle = 'fields',
>(
  options: Omit<RequestOptions<TData, TResponseStyle, ThrowOnError>, 'method'> &
    Pick<Required<RequestOptions<TData, TResponseStyle, ThrowOnError>>, 'method'>,
) => RequestResult<TData, TError, ThrowOnError, TResponseStyle>;

type BuildUrlFn = <
  TData extends {
    body?: unknown;
    path?: Record<string, unknown>;
    query?: Record<string, unknown>;
    url: string;
  },
>(
  options: TData & Options<TData>,
) => string;

export type Client = CoreClient<RequestFn, Config, MethodFn, BuildUrlFn, SseFn> & {
  interceptors: Middleware<Request, Response, unknown, ResolvedRequestOptions>;
};

/**
 * The `createClientConfig()` function will be called on client initialization
 * and the returned object will become the client's initial configuration.
 *
 * You may want to initialize your client this way instead of calling
 * `setConfig()`. This is useful for example if you're using Next.js
 * to ensure your client always has the correct values.
 */
export type CreateClientConfig<T extends ClientOptions = ClientOptions> = (
  override?: Config<ClientOptions & T>,
) => Config<Required<ClientOptions> & T>;

export interface TDataShape {
  body?: unknown;
  headers?: unknown;
  path?: unknown;
  query?: unknown;
  url: string;
}

type OmitKeys<T, K> = Pick<T, Exclude<keyof T, K>>;

export type Options<
  TData extends TDataShape = TDataShape,
  ThrowOnError extends boolean = boolean,
  TResponse = unknown,
  TResponseStyle extends ResponseStyle = 'fields',
> = OmitKeys<
  RequestOptions<TResponse, TResponseStyle, ThrowOnError>,
  'body' | 'path' | 'query' | 'url'
> &
  ([TData] extends [never] ? unknown : Omit<TData, 'url'>);
@@ -0,0 +1,316 @@
// This file is auto-generated by @hey-api/openapi-ts

import { getAuthToken } from '../core/auth.gen';
import type { QuerySerializerOptions } from '../core/bodySerializer.gen';
import { jsonBodySerializer } from '../core/bodySerializer.gen';
import {
  serializeArrayParam,
  serializeObjectParam,
  serializePrimitiveParam,
} from '../core/pathSerializer.gen';
import { getUrl } from '../core/utils.gen';
import type { Client, ClientOptions, Config, RequestOptions } from './types.gen';

export const createQuerySerializer = <T = unknown>({
  parameters = {},
  ...args
}: QuerySerializerOptions = {}) => {
  const querySerializer = (queryParams: T) => {
    const search: string[] = [];
    if (queryParams && typeof queryParams === 'object') {
      for (const name in queryParams) {
        const value = queryParams[name];

        if (value === undefined || value === null) {
          continue;
        }

        const options = parameters[name] || args;

        if (Array.isArray(value)) {
          const serializedArray = serializeArrayParam({
            allowReserved: options.allowReserved,
            explode: true,
            name,
            style: 'form',
            value,
            ...options.array,
          });
          if (serializedArray) search.push(serializedArray);
        } else if (typeof value === 'object') {
          const serializedObject = serializeObjectParam({
            allowReserved: options.allowReserved,
            explode: true,
            name,
            style: 'deepObject',
            value: value as Record<string, unknown>,
            ...options.object,
          });
          if (serializedObject) search.push(serializedObject);
        } else {
          const serializedPrimitive = serializePrimitiveParam({
            allowReserved: options.allowReserved,
            name,
            value: value as string,
          });
          if (serializedPrimitive) search.push(serializedPrimitive);
        }
      }
    }
    return search.join('&');
  };
  return querySerializer;
};

/**
 * Infers parseAs value from provided Content-Type header.
 */
export const getParseAs = (contentType: string | null): Exclude<Config['parseAs'], 'auto'> => {
  if (!contentType) {
    // If no Content-Type header is provided, the best we can do is return the raw response body,
    // which is effectively the same as the 'stream' option.
    return 'stream';
  }

  const cleanContent = contentType.split(';')[0]?.trim();

  if (!cleanContent) {
    return;
  }

  if (cleanContent.startsWith('application/json') || cleanContent.endsWith('+json')) {
    return 'json';
  }

  if (cleanContent === 'multipart/form-data') {
    return 'formData';
  }

  if (
    ['application/', 'audio/', 'image/', 'video/'].some((type) => cleanContent.startsWith(type))
  ) {
    return 'blob';
  }

  if (cleanContent.startsWith('text/')) {
    return 'text';
  }

  return;
};

const checkForExistence = (
  options: Pick<RequestOptions, 'auth' | 'query'> & {
    headers: Headers;
  },
  name?: string,
): boolean => {
  if (!name) {
    return false;
  }
  if (
    options.headers.has(name) ||
    options.query?.[name] ||
    options.headers.get('Cookie')?.includes(`${name}=`)
  ) {
    return true;
  }
  return false;
};

export const setAuthParams = async ({
  security,
  ...options
}: Pick<Required<RequestOptions>, 'security'> &
  Pick<RequestOptions, 'auth' | 'query'> & {
    headers: Headers;
  }) => {
  for (const auth of security) {
    if (checkForExistence(options, auth.name)) {
      continue;
    }

    const token = await getAuthToken(auth, options.auth);

    if (!token) {
      continue;
    }

    const name = auth.name ?? 'Authorization';

    switch (auth.in) {
      case 'query':
        if (!options.query) {
          options.query = {};
        }
        options.query[name] = token;
        break;
      case 'cookie':
        options.headers.append('Cookie', `${name}=${token}`);
        break;
      case 'header':
      default:
        options.headers.set(name, token);
        break;
    }
  }
};

export const buildUrl: Client['buildUrl'] = (options) =>
  getUrl({
    baseUrl: options.baseUrl as string,
    path: options.path,
    query: options.query,
    querySerializer:
      typeof options.querySerializer === 'function'
        ? options.querySerializer
        : createQuerySerializer(options.querySerializer),
    url: options.url,
  });

export const mergeConfigs = (a: Config, b: Config): Config => {
  const config = { ...a, ...b };
  if (config.baseUrl?.endsWith('/')) {
    config.baseUrl = config.baseUrl.substring(0, config.baseUrl.length - 1);
  }
  config.headers = mergeHeaders(a.headers, b.headers);
  return config;
};

const headersEntries = (headers: Headers): Array<[string, string]> => {
  const entries: Array<[string, string]> = [];
  headers.forEach((value, key) => {
    entries.push([key, value]);
  });
  return entries;
};

export const mergeHeaders = (
  ...headers: Array<Required<Config>['headers'] | undefined>
): Headers => {
  const mergedHeaders = new Headers();
  for (const header of headers) {
    if (!header) {
      continue;
    }

    const iterator = header instanceof Headers ? headersEntries(header) : Object.entries(header);

    for (const [key, value] of iterator) {
      if (value === null) {
        mergedHeaders.delete(key);
      } else if (Array.isArray(value)) {
        for (const v of value) {
          mergedHeaders.append(key, v as string);
        }
      } else if (value !== undefined) {
        // assume object headers are meant to be JSON stringified, i.e., their
        // content value in OpenAPI specification is 'application/json'
        mergedHeaders.set(
          key,
          typeof value === 'object' ? JSON.stringify(value) : (value as string),
        );
      }
    }
  }
  return mergedHeaders;
};

type ErrInterceptor<Err, Res, Req, Options> = (
  error: Err,
  response: Res,
  request: Req,
  options: Options,
) => Err | Promise<Err>;

type ReqInterceptor<Req, Options> = (request: Req, options: Options) => Req | Promise<Req>;

type ResInterceptor<Res, Req, Options> = (
  response: Res,
  request: Req,
  options: Options,
) => Res | Promise<Res>;

class Interceptors<Interceptor> {
  fns: Array<Interceptor | null> = [];

  clear(): void {
    this.fns = [];
  }

  eject(id: number | Interceptor): void {
    const index = this.getInterceptorIndex(id);
    if (this.fns[index]) {
      this.fns[index] = null;
    }
  }

  exists(id: number | Interceptor): boolean {
    const index = this.getInterceptorIndex(id);
    return Boolean(this.fns[index]);
  }

  getInterceptorIndex(id: number | Interceptor): number {
    if (typeof id === 'number') {
      return this.fns[id] ? id : -1;
    }
    return this.fns.indexOf(id);
  }

  update(id: number | Interceptor, fn: Interceptor): number | Interceptor | false {
    const index = this.getInterceptorIndex(id);
    if (this.fns[index]) {
      this.fns[index] = fn;
      return id;
    }
    return false;
  }

  use(fn: Interceptor): number {
    this.fns.push(fn);
    return this.fns.length - 1;
  }
}

export interface Middleware<Req, Res, Err, Options> {
  error: Interceptors<ErrInterceptor<Err, Res, Req, Options>>;
  request: Interceptors<ReqInterceptor<Req, Options>>;
  response: Interceptors<ResInterceptor<Res, Req, Options>>;
}

export const createInterceptors = <Req, Res, Err, Options>(): Middleware<
  Req,
  Res,
  Err,
  Options
> => ({
  error: new Interceptors<ErrInterceptor<Err, Res, Req, Options>>(),
  request: new Interceptors<ReqInterceptor<Req, Options>>(),
  response: new Interceptors<ResInterceptor<Res, Req, Options>>(),
});

const defaultQuerySerializer = createQuerySerializer({
  allowReserved: false,
  array: {
    explode: true,
    style: 'form',
  },
  object: {
    explode: true,
    style: 'deepObject',
  },
});

const defaultHeaders = {
  'Content-Type': 'application/json',
};

export const createConfig = <T extends ClientOptions = ClientOptions>(
  override: Config<Omit<ClientOptions, keyof T> & T> = {},
): Config<Omit<ClientOptions, keyof T> & T> => ({
  ...jsonBodySerializer,
  headers: defaultHeaders,
  parseAs: 'auto',
  querySerializer: defaultQuerySerializer,
  ...override,
});
@@ -0,0 +1,41 @@
// This file is auto-generated by @hey-api/openapi-ts

export type AuthToken = string | undefined;

export interface Auth {
  /**
   * Which part of the request do we use to send the auth?
   *
   * @default 'header'
   */
  in?: 'header' | 'query' | 'cookie';
  /**
   * Header or query parameter name.
   *
   * @default 'Authorization'
   */
  name?: string;
  scheme?: 'basic' | 'bearer';
  type: 'apiKey' | 'http';
}

export const getAuthToken = async (
  auth: Auth,
  callback: ((auth: Auth) => Promise<AuthToken> | AuthToken) | AuthToken,
): Promise<string | undefined> => {
  const token = typeof callback === 'function' ? await callback(auth) : callback;

  if (!token) {
    return;
  }

  if (auth.scheme === 'bearer') {
    return `Bearer ${token}`;
  }

  if (auth.scheme === 'basic') {
    return `Basic ${btoa(token)}`;
  }

  return token;
};
@@ -0,0 +1,82 @@
// This file is auto-generated by @hey-api/openapi-ts

import type { ArrayStyle, ObjectStyle, SerializerOptions } from './pathSerializer.gen';

export type QuerySerializer = (query: Record<string, unknown>) => string;

export type BodySerializer = (body: unknown) => unknown;

type QuerySerializerOptionsObject = {
  allowReserved?: boolean;
  array?: Partial<SerializerOptions<ArrayStyle>>;
  object?: Partial<SerializerOptions<ObjectStyle>>;
};

export type QuerySerializerOptions = QuerySerializerOptionsObject & {
  /**
   * Per-parameter serialization overrides. When provided, these settings
   * override the global array/object settings for specific parameter names.
   */
  parameters?: Record<string, QuerySerializerOptionsObject>;
};

const serializeFormDataPair = (data: FormData, key: string, value: unknown): void => {
  if (typeof value === 'string' || value instanceof Blob) {
    data.append(key, value);
  } else if (value instanceof Date) {
    data.append(key, value.toISOString());
  } else {
    data.append(key, JSON.stringify(value));
  }
};

const serializeUrlSearchParamsPair = (data: URLSearchParams, key: string, value: unknown): void => {
  if (typeof value === 'string') {
    data.append(key, value);
  } else {
    data.append(key, JSON.stringify(value));
  }
};

export const formDataBodySerializer = {
  bodySerializer: (body: unknown): FormData => {
    const data = new FormData();

    Object.entries(body as Record<string, unknown>).forEach(([key, value]) => {
      if (value === undefined || value === null) {
        return;
      }
      if (Array.isArray(value)) {
        value.forEach((v) => serializeFormDataPair(data, key, v));
      } else {
        serializeFormDataPair(data, key, value);
      }
    });

    return data;
  },
};

export const jsonBodySerializer = {
  bodySerializer: (body: unknown): string =>
    JSON.stringify(body, (_key, value) => (typeof value === 'bigint' ? value.toString() : value)),
};

export const urlSearchParamsBodySerializer = {
  bodySerializer: (body: unknown): string => {
    const data = new URLSearchParams();

    Object.entries(body as Record<string, unknown>).forEach(([key, value]) => {
      if (value === undefined || value === null) {
        return;
      }
      if (Array.isArray(value)) {
        value.forEach((v) => serializeUrlSearchParamsPair(data, key, v));
      } else {
        serializeUrlSearchParamsPair(data, key, value);
      }
    });

    return data.toString();
  },
};
@@ -0,0 +1,169 @@
// This file is auto-generated by @hey-api/openapi-ts

type Slot = 'body' | 'headers' | 'path' | 'query';

export type Field =
  | {
      in: Exclude<Slot, 'body'>;
      /**
       * Field name. This is the name we want the user to see and use.
       */
      key: string;
      /**
       * Field mapped name. This is the name we want to use in the request.
       * If omitted, we use the same value as `key`.
       */
      map?: string;
    }
  | {
      in: Extract<Slot, 'body'>;
      /**
       * Key isn't required for bodies.
       */
      key?: string;
      map?: string;
    }
  | {
      /**
       * Field name. This is the name we want the user to see and use.
       */
      key: string;
      /**
       * Field mapped name. This is the name we want to use in the request.
       * If `in` is omitted, `map` aliases `key` to the transport layer.
       */
      map: Slot;
    };

export interface Fields {
  allowExtra?: Partial<Record<Slot, boolean>>;
  args?: ReadonlyArray<Field>;
}

export type FieldsConfig = ReadonlyArray<Field | Fields>;

const extraPrefixesMap: Record<string, Slot> = {
  $body_: 'body',
  $headers_: 'headers',
  $path_: 'path',
  $query_: 'query',
};
const extraPrefixes = Object.entries(extraPrefixesMap);

type KeyMap = Map<
  string,
  | {
      in: Slot;
      map?: string;
    }
  | {
      in?: never;
      map: Slot;
    }
>;

const buildKeyMap = (fields: FieldsConfig, map?: KeyMap): KeyMap => {
  if (!map) {
    map = new Map();
  }

  for (const config of fields) {
    if ('in' in config) {
      if (config.key) {
        map.set(config.key, {
          in: config.in,
          map: config.map,
        });
      }
    } else if ('key' in config) {
      map.set(config.key, {
        map: config.map,
      });
    } else if (config.args) {
      buildKeyMap(config.args, map);
    }
  }

  return map;
};

interface Params {
  body: unknown;
  headers: Record<string, unknown>;
  path: Record<string, unknown>;
  query: Record<string, unknown>;
}

const stripEmptySlots = (params: Params) => {
  for (const [slot, value] of Object.entries(params)) {
    if (value && typeof value === 'object' && !Array.isArray(value) && !Object.keys(value).length) {
      delete params[slot as Slot];
    }
  }
};

export const buildClientParams = (args: ReadonlyArray<unknown>, fields: FieldsConfig) => {
  const params: Params = {
    body: {},
    headers: {},
    path: {},
    query: {},
  };

  const map = buildKeyMap(fields);

  let config: FieldsConfig[number] | undefined;

  for (const [index, arg] of args.entries()) {
    if (fields[index]) {
      config = fields[index];
    }

    if (!config) {
      continue;
    }

    if ('in' in config) {
      if (config.key) {
        const field = map.get(config.key)!;
        const name = field.map || config.key;
        if (field.in) {
          (params[field.in] as Record<string, unknown>)[name] = arg;
        }
      } else {
        params.body = arg;
      }
    } else {
      for (const [key, value] of Object.entries(arg ?? {})) {
        const field = map.get(key);

        if (field) {
          if (field.in) {
            const name = field.map || key;
            (params[field.in] as Record<string, unknown>)[name] = value;
          } else {
            params[field.map] = value;
          }
        } else {
          const extra = extraPrefixes.find(([prefix]) => key.startsWith(prefix));

          if (extra) {
            const [prefix, slot] = extra;
            (params[slot] as Record<string, unknown>)[key.slice(prefix.length)] = value;
          } else if ('allowExtra' in config && config.allowExtra) {
            for (const [slot, allowed] of Object.entries(config.allowExtra)) {
              if (allowed) {
                (params[slot as Slot] as Record<string, unknown>)[key] = value;
                break;
              }
            }
          }
        }
      }
    }
  }

  stripEmptySlots(params);

  return params;
};
@@ -0,0 +1,171 @@
// This file is auto-generated by @hey-api/openapi-ts

interface SerializeOptions<T> extends SerializePrimitiveOptions, SerializerOptions<T> {}

interface SerializePrimitiveOptions {
  allowReserved?: boolean;
  name: string;
}

export interface SerializerOptions<T> {
  /**
   * @default true
   */
  explode: boolean;
  style: T;
}

export type ArrayStyle = 'form' | 'spaceDelimited' | 'pipeDelimited';
export type ArraySeparatorStyle = ArrayStyle | MatrixStyle;
type MatrixStyle = 'label' | 'matrix' | 'simple';
export type ObjectStyle = 'form' | 'deepObject';
type ObjectSeparatorStyle = ObjectStyle | MatrixStyle;

interface SerializePrimitiveParam extends SerializePrimitiveOptions {
  value: string;
}

export const separatorArrayExplode = (style: ArraySeparatorStyle) => {
  switch (style) {
    case 'label':
      return '.';
    case 'matrix':
      return ';';
    case 'simple':
      return ',';
    default:
      return '&';
  }
};

export const separatorArrayNoExplode = (style: ArraySeparatorStyle) => {
  switch (style) {
    case 'form':
      return ',';
    case 'pipeDelimited':
      return '|';
    case 'spaceDelimited':
      return '%20';
    default:
      return ',';
  }
};

export const separatorObjectExplode = (style: ObjectSeparatorStyle) => {
  switch (style) {
    case 'label':
      return '.';
    case 'matrix':
      return ';';
    case 'simple':
      return ',';
    default:
      return '&';
  }
};

export const serializeArrayParam = ({
  allowReserved,
  explode,
  name,
  style,
  value,
}: SerializeOptions<ArraySeparatorStyle> & {
  value: unknown[];
}) => {
  if (!explode) {
    const joinedValues = (
      allowReserved ? value : value.map((v) => encodeURIComponent(v as string))
    ).join(separatorArrayNoExplode(style));
    switch (style) {
      case 'label':
        return `.${joinedValues}`;
      case 'matrix':
        return `;${name}=${joinedValues}`;
      case 'simple':
        return joinedValues;
      default:
        return `${name}=${joinedValues}`;
    }
  }

  const separator = separatorArrayExplode(style);
  const joinedValues = value
    .map((v) => {
      if (style === 'label' || style === 'simple') {
        return allowReserved ? v : encodeURIComponent(v as string);
      }

      return serializePrimitiveParam({
        allowReserved,
        name,
        value: v as string,
      });
    })
    .join(separator);
  return style === 'label' || style === 'matrix' ? separator + joinedValues : joinedValues;
};

export const serializePrimitiveParam = ({
  allowReserved,
  name,
  value,
}: SerializePrimitiveParam) => {
  if (value === undefined || value === null) {
    return '';
  }

  if (typeof value === 'object') {
    throw new Error(
      'Deeply-nested arrays/objects aren’t supported. Provide your own `querySerializer()` to handle these.',
    );
  }

  return `${name}=${allowReserved ? value : encodeURIComponent(value)}`;
};

export const serializeObjectParam = ({
  allowReserved,
  explode,
  name,
  style,
  value,
  valueOnly,
}: SerializeOptions<ObjectSeparatorStyle> & {
  value: Record<string, unknown> | Date;
  valueOnly?: boolean;
}) => {
  if (value instanceof Date) {
    return valueOnly ? value.toISOString() : `${name}=${value.toISOString()}`;
  }

  if (style !== 'deepObject' && !explode) {
    let values: string[] = [];
    Object.entries(value).forEach(([key, v]) => {
      values = [...values, key, allowReserved ? (v as string) : encodeURIComponent(v as string)];
    });
    const joinedValues = values.join(',');
    switch (style) {
      case 'form':
        return `${name}=${joinedValues}`;
      case 'label':
        return `.${joinedValues}`;
      case 'matrix':
        return `;${name}=${joinedValues}`;
      default:
        return joinedValues;
    }
  }

  const separator = separatorObjectExplode(style);
  const joinedValues = Object.entries(value)
    .map(([key, v]) =>
      serializePrimitiveParam({
        allowReserved,
        name: style === 'deepObject' ? `${name}[${key}]` : key,
        value: v as string,
      }),
    )
    .join(separator);
  return style === 'label' || style === 'matrix' ? separator + joinedValues : joinedValues;
};
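As a rough illustration of the non-exploded array styles distinguished above, here is a standalone sketch (not part of the generated client) of how `serializeArrayParam`'s non-explode branch joins values per style:

```typescript
// Standalone sketch: mirrors the non-explode branch of serializeArrayParam
// for the three array styles that use a "name=joined" shape.
type NoExplodeStyle = 'form' | 'spaceDelimited' | 'pipeDelimited';

const separators: Record<NoExplodeStyle, string> = {
  form: ',',
  pipeDelimited: '|',
  spaceDelimited: '%20',
};

const joinArrayNoExplode = (name: string, values: string[], style: NoExplodeStyle): string =>
  // values are percent-encoded individually, then joined by the style's separator
  `${name}=${values.map((v) => encodeURIComponent(v)).join(separators[style])}`;
```

For example, `joinArrayNoExplode('id', ['3', '4'], 'pipeDelimited')` yields `id=3|4`.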
@@ -0,0 +1,117 @@
// This file is auto-generated by @hey-api/openapi-ts

/**
 * JSON-friendly union that mirrors what Pinia Colada can hash.
 */
export type JsonValue =
  | null
  | string
  | number
  | boolean
  | JsonValue[]
  | { [key: string]: JsonValue };

/**
 * Replacer that converts non-JSON values (bigint, Date, etc.) to safe substitutes.
 */
export const queryKeyJsonReplacer = (_key: string, value: unknown) => {
  if (value === undefined || typeof value === 'function' || typeof value === 'symbol') {
    return undefined;
  }
  if (typeof value === 'bigint') {
    return value.toString();
  }
  if (value instanceof Date) {
    return value.toISOString();
  }
  return value;
};

/**
 * Safely stringifies a value and parses it back into a JsonValue.
 */
export const stringifyToJsonValue = (input: unknown): JsonValue | undefined => {
  try {
    const json = JSON.stringify(input, queryKeyJsonReplacer);
    if (json === undefined) {
      return undefined;
    }
    return JSON.parse(json) as JsonValue;
  } catch {
    return undefined;
  }
};

/**
 * Detects plain objects (including objects with a null prototype).
 */
const isPlainObject = (value: unknown): value is Record<string, unknown> => {
  if (value === null || typeof value !== 'object') {
    return false;
  }
  const prototype = Object.getPrototypeOf(value as object);
  return prototype === Object.prototype || prototype === null;
};

/**
 * Turns URLSearchParams into a sorted JSON object for deterministic keys.
 */
const serializeSearchParams = (params: URLSearchParams): JsonValue => {
  const entries = Array.from(params.entries()).sort(([a], [b]) => a.localeCompare(b));
  const result: Record<string, JsonValue> = {};

  for (const [key, value] of entries) {
    const existing = result[key];
    if (existing === undefined) {
      result[key] = value;
      continue;
    }

    if (Array.isArray(existing)) {
      (existing as string[]).push(value);
    } else {
      result[key] = [existing, value];
    }
  }

  return result;
};

/**
 * Normalizes any accepted value into a JSON-friendly shape for query keys.
 */
export const serializeQueryKeyValue = (value: unknown): JsonValue | undefined => {
  if (value === null) {
    return null;
  }

  if (typeof value === 'string' || typeof value === 'number' || typeof value === 'boolean') {
    return value;
  }

  if (value === undefined || typeof value === 'function' || typeof value === 'symbol') {
    return undefined;
  }

  if (typeof value === 'bigint') {
    return value.toString();
  }

  if (value instanceof Date) {
    return value.toISOString();
  }

  if (Array.isArray(value)) {
    return stringifyToJsonValue(value);
  }

  if (typeof URLSearchParams !== 'undefined' && value instanceof URLSearchParams) {
    return serializeSearchParams(value);
  }

  if (isPlainObject(value)) {
    return stringifyToJsonValue(value);
  }

  return undefined;
};
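The core trick in `stringifyToJsonValue` is passing a replacer to `JSON.stringify` so values JSON cannot represent never throw during the round trip. A minimal standalone sketch of that pattern (not the generated code itself):

```typescript
// Standalone sketch: a JSON.stringify replacer converts bigint to a string,
// so stringify -> parse round-trips without a TypeError. Dates serialize via
// their own toJSON() before the replacer sees them.
const replacer = (_key: string, value: unknown) =>
  typeof value === 'bigint' ? value.toString() : value;

const toJsonValue = (input: unknown): unknown => {
  const json = JSON.stringify(input, replacer);
  return json === undefined ? undefined : JSON.parse(json);
};
```

Without the replacer, `JSON.stringify({ n: BigInt(10) })` would throw; with it, the value round-trips as the string `"10"`.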
@@ -0,0 +1,242 @@
// This file is auto-generated by @hey-api/openapi-ts

import type { Config } from './types.gen';

export type ServerSentEventsOptions<TData = unknown> = Omit<RequestInit, 'method'> &
  Pick<Config, 'method' | 'responseTransformer' | 'responseValidator'> & {
    /**
     * Fetch API implementation. You can use this option to provide a custom
     * fetch instance.
     *
     * @default globalThis.fetch
     */
    fetch?: typeof fetch;
    /**
     * Implementing clients can call request interceptors inside this hook.
     */
    onRequest?: (url: string, init: RequestInit) => Promise<Request>;
    /**
     * Callback invoked when a network or parsing error occurs during streaming.
     *
     * This option applies only if the endpoint returns a stream of events.
     *
     * @param error The error that occurred.
     */
    onSseError?: (error: unknown) => void;
    /**
     * Callback invoked when an event is streamed from the server.
     *
     * This option applies only if the endpoint returns a stream of events.
     *
     * @param event Event streamed from the server.
     * @returns Nothing (void).
     */
    onSseEvent?: (event: StreamEvent<TData>) => void;
    serializedBody?: RequestInit['body'];
    /**
     * Default retry delay in milliseconds.
     *
     * This option applies only if the endpoint returns a stream of events.
     *
     * @default 3000
     */
    sseDefaultRetryDelay?: number;
    /**
     * Maximum number of retry attempts before giving up.
     */
    sseMaxRetryAttempts?: number;
    /**
     * Maximum retry delay in milliseconds.
     *
     * Applies only when exponential backoff is used.
     *
     * This option applies only if the endpoint returns a stream of events.
     *
     * @default 30000
     */
    sseMaxRetryDelay?: number;
    /**
     * Optional sleep function for retry backoff.
     *
     * Defaults to using `setTimeout`.
     */
    sseSleepFn?: (ms: number) => Promise<void>;
    url: string;
  };

export interface StreamEvent<TData = unknown> {
  data: TData;
  event?: string;
  id?: string;
  retry?: number;
}

export type ServerSentEventsResult<TData = unknown, TReturn = void, TNext = unknown> = {
  stream: AsyncGenerator<
    TData extends Record<string, unknown> ? TData[keyof TData] : TData,
    TReturn,
    TNext
  >;
};

export function createSseClient<TData = unknown>({
  onRequest,
  onSseError,
  onSseEvent,
  responseTransformer,
  responseValidator,
  sseDefaultRetryDelay,
  sseMaxRetryAttempts,
  sseMaxRetryDelay,
  sseSleepFn,
  url,
  ...options
}: ServerSentEventsOptions): ServerSentEventsResult<TData> {
  let lastEventId: string | undefined;

  const sleep = sseSleepFn ?? ((ms: number) => new Promise((resolve) => setTimeout(resolve, ms)));

  const createStream = async function* () {
    let retryDelay: number = sseDefaultRetryDelay ?? 3000;
    let attempt = 0;
    const signal = options.signal ?? new AbortController().signal;

    while (true) {
      if (signal.aborted) break;

      attempt++;

      const headers =
        options.headers instanceof Headers
          ? options.headers
          : new Headers(options.headers as Record<string, string> | undefined);

      if (lastEventId !== undefined) {
        headers.set('Last-Event-ID', lastEventId);
      }

      try {
        const requestInit: RequestInit = {
          redirect: 'follow',
          ...options,
          body: options.serializedBody,
          headers,
          signal,
        };
        let request = new Request(url, requestInit);
        if (onRequest) {
          request = await onRequest(url, requestInit);
        }
        // fetch must be assigned here, otherwise it would throw the error:
        // TypeError: Failed to execute 'fetch' on 'Window': Illegal invocation
        const _fetch = options.fetch ?? globalThis.fetch;
        const response = await _fetch(request);

        if (!response.ok) throw new Error(`SSE failed: ${response.status} ${response.statusText}`);

        if (!response.body) throw new Error('No body in SSE response');

        const reader = response.body.pipeThrough(new TextDecoderStream()).getReader();

        let buffer = '';

        const abortHandler = () => {
          try {
            reader.cancel();
          } catch {
            // noop
          }
        };

        signal.addEventListener('abort', abortHandler);

        try {
          while (true) {
            const { done, value } = await reader.read();
            if (done) break;
            buffer += value;
            buffer = buffer.replace(/\r\n?/g, '\n'); // normalize line endings

            const chunks = buffer.split('\n\n');
            buffer = chunks.pop() ?? '';

            for (const chunk of chunks) {
              const lines = chunk.split('\n');
              const dataLines: Array<string> = [];
              let eventName: string | undefined;

              for (const line of lines) {
                if (line.startsWith('data:')) {
                  dataLines.push(line.replace(/^data:\s*/, ''));
                } else if (line.startsWith('event:')) {
                  eventName = line.replace(/^event:\s*/, '');
                } else if (line.startsWith('id:')) {
                  lastEventId = line.replace(/^id:\s*/, '');
                } else if (line.startsWith('retry:')) {
                  const parsed = Number.parseInt(line.replace(/^retry:\s*/, ''), 10);
                  if (!Number.isNaN(parsed)) {
                    retryDelay = parsed;
                  }
                }
              }

              let data: unknown;
              let parsedJson = false;

              if (dataLines.length) {
                const rawData = dataLines.join('\n');
                try {
                  data = JSON.parse(rawData);
                  parsedJson = true;
                } catch {
                  data = rawData;
                }
              }

              if (parsedJson) {
                if (responseValidator) {
                  await responseValidator(data);
                }

                if (responseTransformer) {
                  data = await responseTransformer(data);
                }
              }

              onSseEvent?.({
                data,
                event: eventName,
                id: lastEventId,
                retry: retryDelay,
              });

              if (dataLines.length) {
                yield data as any;
              }
            }
          }
        } finally {
          signal.removeEventListener('abort', abortHandler);
          reader.releaseLock();
        }

        break; // exit loop on normal completion
      } catch (error) {
        // connection failed or aborted; retry after delay
        onSseError?.(error);

        if (sseMaxRetryAttempts !== undefined && attempt >= sseMaxRetryAttempts) {
          break; // stop after firing error
        }

        // exponential backoff: double retry each attempt, cap at 30s
        const backoff = Math.min(retryDelay * 2 ** (attempt - 1), sseMaxRetryDelay ?? 30000);
        await sleep(backoff);
      }
    }
  };

  const stream = createStream();

  return { stream };
}
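The parsing loop above follows standard SSE framing: events are separated by a blank line, and each field line is `name: value`. A standalone sketch of just the per-chunk field parsing (not the generated code):

```typescript
// Standalone sketch of SSE chunk parsing: collect data lines, and pick up
// optional event/id fields, matching the field handling in the loop above.
interface ParsedEvent {
  data?: string;
  event?: string;
  id?: string;
}

const parseSseChunk = (chunk: string): ParsedEvent => {
  const out: ParsedEvent = {};
  const dataLines: string[] = [];
  for (const line of chunk.split('\n')) {
    if (line.startsWith('data:')) dataLines.push(line.replace(/^data:\s*/, ''));
    else if (line.startsWith('event:')) out.event = line.replace(/^event:\s*/, '');
    else if (line.startsWith('id:')) out.id = line.replace(/^id:\s*/, '');
  }
  // multiple data: lines in one event are joined with newlines, per the SSE spec
  if (dataLines.length) out.data = dataLines.join('\n');
  return out;
};
```

Buffering until a `\n\n` boundary (as the client does with `buffer.split('\n\n')`) matters because a network read can end mid-event.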
@@ -0,0 +1,104 @@
// This file is auto-generated by @hey-api/openapi-ts

import type { Auth, AuthToken } from './auth.gen';
import type { BodySerializer, QuerySerializer, QuerySerializerOptions } from './bodySerializer.gen';

export type HttpMethod =
  | 'connect'
  | 'delete'
  | 'get'
  | 'head'
  | 'options'
  | 'patch'
  | 'post'
  | 'put'
  | 'trace';

export type Client<
  RequestFn = never,
  Config = unknown,
  MethodFn = never,
  BuildUrlFn = never,
  SseFn = never,
> = {
  /**
   * Returns the final request URL.
   */
  buildUrl: BuildUrlFn;
  getConfig: () => Config;
  request: RequestFn;
  setConfig: (config: Config) => Config;
} & {
  [K in HttpMethod]: MethodFn;
} & ([SseFn] extends [never] ? { sse?: never } : { sse: { [K in HttpMethod]: SseFn } });

export interface Config {
  /**
   * Auth token or a function returning auth token. The resolved value will be
   * added to the request payload as defined by its `security` array.
   */
  auth?: ((auth: Auth) => Promise<AuthToken> | AuthToken) | AuthToken;
  /**
   * A function for serializing request body parameter. By default,
   * {@link JSON.stringify()} will be used.
   */
  bodySerializer?: BodySerializer | null;
  /**
   * An object containing any HTTP headers that you want to pre-populate your
   * `Headers` object with.
   *
   * {@link https://developer.mozilla.org/docs/Web/API/Headers/Headers#init See more}
   */
  headers?:
    | RequestInit['headers']
    | Record<
        string,
        string | number | boolean | (string | number | boolean)[] | null | undefined | unknown
      >;
  /**
   * The request method.
   *
   * {@link https://developer.mozilla.org/docs/Web/API/fetch#method See more}
   */
  method?: Uppercase<HttpMethod>;
  /**
   * A function for serializing request query parameters. By default, arrays
   * will be exploded in form style, objects will be exploded in deepObject
   * style, and reserved characters are percent-encoded.
   *
   * This method will have no effect if the native `paramsSerializer()` Axios
   * API function is used.
   *
   * {@link https://swagger.io/docs/specification/serialization/#query View examples}
   */
  querySerializer?: QuerySerializer | QuerySerializerOptions;
  /**
   * A function validating request data. This is useful if you want to ensure
   * the request conforms to the desired shape, so it can be safely sent to
   * the server.
   */
  requestValidator?: (data: unknown) => Promise<unknown>;
  /**
   * A function transforming response data before it's returned. This is useful
   * for post-processing data, e.g., converting ISO strings into Date objects.
   */
  responseTransformer?: (data: unknown) => Promise<unknown>;
  /**
   * A function validating response data. This is useful if you want to ensure
   * the response conforms to the desired shape, so it can be safely passed to
   * the transformers and returned to the user.
   */
  responseValidator?: (data: unknown) => Promise<unknown>;
}

type IsExactlyNeverOrNeverUndefined<T> = [T] extends [never]
  ? true
  : [T] extends [never | undefined]
    ? [undefined] extends [T]
      ? false
      : true
    : false;

export type OmitNever<T extends Record<string, unknown>> = {
  [K in keyof T as IsExactlyNeverOrNeverUndefined<T[K]> extends true ? never : K]: T[K];
};
@@ -0,0 +1,140 @@
// This file is auto-generated by @hey-api/openapi-ts

import type { BodySerializer, QuerySerializer } from './bodySerializer.gen';
import {
  type ArraySeparatorStyle,
  serializeArrayParam,
  serializeObjectParam,
  serializePrimitiveParam,
} from './pathSerializer.gen';

export interface PathSerializer {
  path: Record<string, unknown>;
  url: string;
}

export const PATH_PARAM_RE = /\{[^{}]+\}/g;

export const defaultPathSerializer = ({ path, url: _url }: PathSerializer) => {
  let url = _url;
  const matches = _url.match(PATH_PARAM_RE);
  if (matches) {
    for (const match of matches) {
      let explode = false;
      let name = match.substring(1, match.length - 1);
      let style: ArraySeparatorStyle = 'simple';

      if (name.endsWith('*')) {
        explode = true;
        name = name.substring(0, name.length - 1);
      }

      if (name.startsWith('.')) {
        name = name.substring(1);
        style = 'label';
      } else if (name.startsWith(';')) {
        name = name.substring(1);
        style = 'matrix';
      }

      const value = path[name];

      if (value === undefined || value === null) {
        continue;
      }

      if (Array.isArray(value)) {
        url = url.replace(match, serializeArrayParam({ explode, name, style, value }));
        continue;
      }

      if (typeof value === 'object') {
        url = url.replace(
          match,
          serializeObjectParam({
            explode,
            name,
            style,
            value: value as Record<string, unknown>,
            valueOnly: true,
          }),
        );
        continue;
      }

      if (style === 'matrix') {
        url = url.replace(
          match,
          `;${serializePrimitiveParam({
            name,
            value: value as string,
          })}`,
        );
        continue;
      }

      const replaceValue = encodeURIComponent(
        style === 'label' ? `.${value as string}` : (value as string),
      );
      url = url.replace(match, replaceValue);
    }
  }
  return url;
};

export const getUrl = ({
  baseUrl,
  path,
  query,
  querySerializer,
  url: _url,
}: {
  baseUrl?: string;
  path?: Record<string, unknown>;
  query?: Record<string, unknown>;
  querySerializer: QuerySerializer;
  url: string;
}) => {
  const pathUrl = _url.startsWith('/') ? _url : `/${_url}`;
  let url = (baseUrl ?? '') + pathUrl;
  if (path) {
    url = defaultPathSerializer({ path, url });
  }
  let search = query ? querySerializer(query) : '';
  if (search.startsWith('?')) {
    search = search.substring(1);
  }
  if (search) {
    url += `?${search}`;
  }
  return url;
};

export function getValidRequestBody(options: {
  body?: unknown;
  bodySerializer?: BodySerializer | null;
  serializedBody?: unknown;
}) {
  const hasBody = options.body !== undefined;
  const isSerializedBody = hasBody && options.bodySerializer;

  if (isSerializedBody) {
    if ('serializedBody' in options) {
      const hasSerializedBody =
        options.serializedBody !== undefined && options.serializedBody !== '';

      return hasSerializedBody ? options.serializedBody : null;
    }

    // not all clients implement a serializedBody property (i.e., client-axios)
    return options.body !== '' ? options.body : null;
  }

  // plain/text body
  if (hasBody) {
    return options.body;
  }

  // no body was provided
  return undefined;
}
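The simplest case `defaultPathSerializer` handles is replacing a `{name}` placeholder with a percent-encoded primitive value. A standalone sketch of just that case (not the generated code, which also handles label/matrix styles, arrays, and objects):

```typescript
// Standalone sketch: fill {name} placeholders in a URL template with
// percent-encoded values, leaving unknown placeholders untouched, like
// the "continue" on missing values in defaultPathSerializer.
const PATH_PARAM = /\{[^{}]+\}/g;

const fillPath = (template: string, params: Record<string, string>): string =>
  template.replace(PATH_PARAM, (match) => {
    const name = match.slice(1, -1); // strip the surrounding braces
    const value = params[name];
    return value === undefined ? match : encodeURIComponent(value);
  });
```

Percent-encoding here is what keeps a value like `a/b` from being read as an extra path segment.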
@@ -0,0 +1,4 @@
// This file is auto-generated by @hey-api/openapi-ts

export { browseIndexInventoryBrowseGet, browsePathSystemBrowseGet, getTreeSystemTreeGet, listBackupsBackupsGet, type Options, readRootGet, trackBatchSystemTrackBatchPost, trackPathSystemTrackPost } from './sdk.gen';
export type { AppApiInventoryFileItemSchema, AppApiSystemFileItemSchema, BatchTrackRequest, BrowseIndexInventoryBrowseGetData, BrowseIndexInventoryBrowseGetError, BrowseIndexInventoryBrowseGetErrors, BrowseIndexInventoryBrowseGetResponse, BrowseIndexInventoryBrowseGetResponses, BrowsePathSystemBrowseGetData, BrowsePathSystemBrowseGetError, BrowsePathSystemBrowseGetErrors, BrowsePathSystemBrowseGetResponse, BrowsePathSystemBrowseGetResponses, ClientOptions, GetTreeSystemTreeGetData, GetTreeSystemTreeGetError, GetTreeSystemTreeGetErrors, GetTreeSystemTreeGetResponse, GetTreeSystemTreeGetResponses, HttpValidationError, ListBackupsBackupsGetData, ListBackupsBackupsGetResponses, ReadRootGetData, ReadRootGetResponses, TrackBatchSystemTrackBatchPostData, TrackBatchSystemTrackBatchPostError, TrackBatchSystemTrackBatchPostErrors, TrackBatchSystemTrackBatchPostResponses, TrackPathSystemTrackPostData, TrackPathSystemTrackPostError, TrackPathSystemTrackPostErrors, TrackPathSystemTrackPostResponses, TrackToggleRequest, TreeNodeSchema, ValidationError } from './types.gen';
@@ -0,0 +1,68 @@
// This file is auto-generated by @hey-api/openapi-ts

import type { Client, Options as Options2, TDataShape } from './client';
import { client } from './client.gen';
import type { BrowseIndexInventoryBrowseGetData, BrowseIndexInventoryBrowseGetErrors, BrowseIndexInventoryBrowseGetResponses, BrowsePathSystemBrowseGetData, BrowsePathSystemBrowseGetErrors, BrowsePathSystemBrowseGetResponses, GetTreeSystemTreeGetData, GetTreeSystemTreeGetErrors, GetTreeSystemTreeGetResponses, ListBackupsBackupsGetData, ListBackupsBackupsGetResponses, ReadRootGetData, ReadRootGetResponses, TrackBatchSystemTrackBatchPostData, TrackBatchSystemTrackBatchPostErrors, TrackBatchSystemTrackBatchPostResponses, TrackPathSystemTrackPostData, TrackPathSystemTrackPostErrors, TrackPathSystemTrackPostResponses } from './types.gen';

export type Options<TData extends TDataShape = TDataShape, ThrowOnError extends boolean = boolean, TResponse = unknown> = Options2<TData, ThrowOnError, TResponse> & {
  /**
   * You can provide a client instance returned by `createClient()` instead of
   * individual options. This might be also useful if you want to implement a
   * custom client.
   */
  client?: Client;
  /**
   * You can pass arbitrary values through the `meta` object. This can be
   * used to access values that aren't defined as part of the SDK function.
   */
  meta?: Record<string, unknown>;
};

/**
 * Browse Path
 */
export const browsePathSystemBrowseGet = <ThrowOnError extends boolean = false>(options?: Options<BrowsePathSystemBrowseGetData, ThrowOnError>) => (options?.client ?? client).get<BrowsePathSystemBrowseGetResponses, BrowsePathSystemBrowseGetErrors, ThrowOnError>({ url: '/system/browse', ...options });

/**
 * Track Path
 */
export const trackPathSystemTrackPost = <ThrowOnError extends boolean = false>(options: Options<TrackPathSystemTrackPostData, ThrowOnError>) => (options.client ?? client).post<TrackPathSystemTrackPostResponses, TrackPathSystemTrackPostErrors, ThrowOnError>({
  url: '/system/track',
  ...options,
  headers: {
    'Content-Type': 'application/json',
    ...options.headers
  }
});

/**
 * Track Batch
 */
export const trackBatchSystemTrackBatchPost = <ThrowOnError extends boolean = false>(options: Options<TrackBatchSystemTrackBatchPostData, ThrowOnError>) => (options.client ?? client).post<TrackBatchSystemTrackBatchPostResponses, TrackBatchSystemTrackBatchPostErrors, ThrowOnError>({
  url: '/system/track/batch',
  ...options,
  headers: {
    'Content-Type': 'application/json',
    ...options.headers
  }
});

/**
 * Get Tree
 */
export const getTreeSystemTreeGet = <ThrowOnError extends boolean = false>(options?: Options<GetTreeSystemTreeGetData, ThrowOnError>) => (options?.client ?? client).get<GetTreeSystemTreeGetResponses, GetTreeSystemTreeGetErrors, ThrowOnError>({ url: '/system/tree', ...options });

/**
 * Browse Index
 */
export const browseIndexInventoryBrowseGet = <ThrowOnError extends boolean = false>(options?: Options<BrowseIndexInventoryBrowseGetData, ThrowOnError>) => (options?.client ?? client).get<BrowseIndexInventoryBrowseGetResponses, BrowseIndexInventoryBrowseGetErrors, ThrowOnError>({ url: '/inventory/browse', ...options });

/**
 * List Backups
 */
export const listBackupsBackupsGet = <ThrowOnError extends boolean = false>(options?: Options<ListBackupsBackupsGetData, ThrowOnError>) => (options?.client ?? client).get<ListBackupsBackupsGetResponses, unknown, ThrowOnError>({ url: '/backups/', ...options });

/**
 * Read Root
 */
export const readRootGet = <ThrowOnError extends boolean = false>(options?: Options<ReadRootGetData, ThrowOnError>) => (options?.client ?? client).get<ReadRootGetResponses, unknown, ThrowOnError>({ url: '/', ...options });
@@ -0,0 +1,319 @@
// This file is auto-generated by @hey-api/openapi-ts

export type ClientOptions = {
    baseUrl: 'http://localhost:8000' | (string & {});
};

/**
 * BatchTrackRequest
 */
export type BatchTrackRequest = {
    /**
     * Tracks
     */
    tracks?: Array<string>;
    /**
     * Untracks
     */
    untracks?: Array<string>;
};

/**
 * HTTPValidationError
 */
export type HttpValidationError = {
    /**
     * Detail
     */
    detail?: Array<ValidationError>;
};

/**
 * TrackToggleRequest
 */
export type TrackToggleRequest = {
    /**
     * Path
     */
    path: string;
    /**
     * Is Directory
     */
    is_directory?: boolean;
};

/**
 * TreeNodeSchema
 */
export type TreeNodeSchema = {
    /**
     * Name
     */
    name: string;
    /**
     * Path
     */
    path: string;
    /**
     * Has Children
     */
    has_children?: boolean;
};

/**
 * ValidationError
 */
export type ValidationError = {
    /**
     * Location
     */
    loc: Array<string | number>;
    /**
     * Message
     */
    msg: string;
    /**
     * Error Type
     */
    type: string;
    /**
     * Input
     */
    input?: unknown;
    /**
     * Context
     */
    ctx?: {
        [key: string]: unknown;
    };
};

/**
 * FileItemSchema
 */
export type AppApiInventoryFileItemSchema = {
    /**
     * Name
     */
    name: string;
    /**
     * Path
     */
    path: string;
    /**
     * Type
     */
    type: string;
    /**
     * Size
     */
    size?: number | null;
    /**
     * Mtime
     */
    mtime?: number | null;
    /**
     * Media
     */
    media?: Array<string>;
};

/**
 * FileItemSchema
 */
export type AppApiSystemFileItemSchema = {
    /**
     * Name
     */
    name: string;
    /**
     * Path
     */
    path: string;
    /**
     * Type
     */
    type: string;
    /**
     * Size
     */
    size?: number | null;
    /**
     * Mtime
     */
    mtime?: number | null;
    /**
     * Tracked
     */
    tracked?: boolean;
};

export type BrowsePathSystemBrowseGetData = {
    body?: never;
    path?: never;
    query?: {
        /**
         * Path
         */
        path?: string;
    };
    url: '/system/browse';
};

export type BrowsePathSystemBrowseGetErrors = {
    /**
     * Validation Error
     */
    422: HttpValidationError;
};

export type BrowsePathSystemBrowseGetError = BrowsePathSystemBrowseGetErrors[keyof BrowsePathSystemBrowseGetErrors];

export type BrowsePathSystemBrowseGetResponses = {
    /**
     * Response Browse Path System Browse Get
     *
     * Successful Response
     */
    200: Array<AppApiSystemFileItemSchema>;
};

export type BrowsePathSystemBrowseGetResponse = BrowsePathSystemBrowseGetResponses[keyof BrowsePathSystemBrowseGetResponses];

export type TrackPathSystemTrackPostData = {
    body: TrackToggleRequest;
    path?: never;
    query?: never;
    url: '/system/track';
};

export type TrackPathSystemTrackPostErrors = {
    /**
     * Validation Error
     */
    422: HttpValidationError;
};

export type TrackPathSystemTrackPostError = TrackPathSystemTrackPostErrors[keyof TrackPathSystemTrackPostErrors];

export type TrackPathSystemTrackPostResponses = {
    /**
     * Successful Response
     */
    200: unknown;
};

export type TrackBatchSystemTrackBatchPostData = {
    body: BatchTrackRequest;
    path?: never;
    query?: never;
    url: '/system/track/batch';
};

export type TrackBatchSystemTrackBatchPostErrors = {
    /**
     * Validation Error
     */
    422: HttpValidationError;
};

export type TrackBatchSystemTrackBatchPostError = TrackBatchSystemTrackBatchPostErrors[keyof TrackBatchSystemTrackBatchPostErrors];

export type TrackBatchSystemTrackBatchPostResponses = {
    /**
     * Successful Response
     */
    200: unknown;
};

export type GetTreeSystemTreeGetData = {
    body?: never;
    path?: never;
    query?: {
        /**
         * Path
         */
        path?: string;
    };
    url: '/system/tree';
};

export type GetTreeSystemTreeGetErrors = {
    /**
     * Validation Error
     */
    422: HttpValidationError;
};

export type GetTreeSystemTreeGetError = GetTreeSystemTreeGetErrors[keyof GetTreeSystemTreeGetErrors];

export type GetTreeSystemTreeGetResponses = {
    /**
     * Response Get Tree System Tree Get
     *
     * Successful Response
     */
    200: Array<TreeNodeSchema>;
};

export type GetTreeSystemTreeGetResponse = GetTreeSystemTreeGetResponses[keyof GetTreeSystemTreeGetResponses];

export type BrowseIndexInventoryBrowseGetData = {
    body?: never;
    path?: never;
    query?: {
        /**
         * Path
         */
        path?: string;
    };
    url: '/inventory/browse';
};

export type BrowseIndexInventoryBrowseGetErrors = {
    /**
     * Validation Error
     */
    422: HttpValidationError;
};

export type BrowseIndexInventoryBrowseGetError = BrowseIndexInventoryBrowseGetErrors[keyof BrowseIndexInventoryBrowseGetErrors];

export type BrowseIndexInventoryBrowseGetResponses = {
    /**
     * Response Browse Index Inventory Browse Get
     *
     * Successful Response
     */
    200: Array<AppApiInventoryFileItemSchema>;
};

export type BrowseIndexInventoryBrowseGetResponse = BrowseIndexInventoryBrowseGetResponses[keyof BrowseIndexInventoryBrowseGetResponses];

export type ListBackupsBackupsGetData = {
    body?: never;
    path?: never;
    query?: never;
    url: '/backups/';
};

export type ListBackupsBackupsGetResponses = {
    /**
     * Successful Response
     */
    200: unknown;
};

export type ReadRootGetData = {
    body?: never;
    path?: never;
    query?: never;
    url: '/';
};

export type ReadRootGetResponses = {
    /**
     * Successful Response
     */
    200: unknown;
};
@@ -0,0 +1,446 @@
<script lang="ts">
	import {
		ChevronLeft,
		ChevronRight,
		ChevronUp,
		RotateCw,
		Search,
		Folder,
		Home,
		ArrowUpDown,
		MoreHorizontal,
		Plus,
		Scissors,
		Copy,
		Clipboard,
		Trash2,
		Type,
		LayoutGrid,
		CheckSquare,
		HardDrive,
		ShieldCheck,
		ShieldAlert,
		Square
	} from "lucide-svelte";
	import { Button } from "$lib/components/ui/button";
	import { Checkbox } from "$lib/components/ui/checkbox";
	import { ScrollArea } from "$lib/components/ui/scroll-area";
	import { Input } from "$lib/components/ui/input";
	import FileBrowserTreeItem from "./FileBrowserTreeItem.svelte";
	import FileBrowserRowItem from "./FileBrowserRowItem.svelte";
	import type { FileItem, Breadcrumb, TreeNode } from "$lib/types";
	import { cn } from "$lib/utils";

	let {
		currentPath = $bindable("/source_data"),
		files = [],
		onNavigate = (path: string) => {},
		onToggleTrack = (item: FileItem) => {},
		mode = "host"
	} = $props<{
		currentPath: string;
		files: FileItem[];
		onNavigate?: (path: string) => void;
		onToggleTrack?: (item: FileItem) => void;
		mode?: "host" | "index";
	}>();

	let searchQuery = $state("");
	let selectedPaths = $state<Set<string>>(new Set());
	let lastSelectedPath = $state<string | null>(null);
	let sortColumn = $state<"name" | "size" | "mtime" | "type">("name");
	let sortDirection = $state<"asc" | "desc">("asc");

	// --- Column Resizing Logic ---
	let mtimeWidth = $state(200);
	let typeWidth = $state(150);
	let sizeWidth = $state(120);

	let resizingCol = $state<string | null>(null);
	let startX = 0;
	let startWidth = 0;

	function startResize(e: MouseEvent, col: string) {
		e.preventDefault();
		resizingCol = col;
		startX = e.clientX;
		if (col === 'mtime') startWidth = mtimeWidth;
		else if (col === 'type') startWidth = typeWidth;
		else if (col === 'size') startWidth = sizeWidth;

		window.addEventListener('mousemove', handleMouseMove);
		window.addEventListener('mouseup', stopResize);
		document.body.style.cursor = 'col-resize';
	}

	function handleMouseMove(e: MouseEvent) {
		if (!resizingCol) return;
		const delta = e.clientX - startX;
		if (resizingCol === 'mtime') mtimeWidth = Math.max(100, startWidth + delta);
		else if (resizingCol === 'type') typeWidth = Math.max(80, startWidth + delta);
		else if (resizingCol === 'size') sizeWidth = Math.max(60, startWidth + delta);
	}

	function stopResize() {
		resizingCol = null;
		window.removeEventListener('mousemove', handleMouseMove);
		window.removeEventListener('mouseup', stopResize);
		document.body.style.cursor = '';
	}

	const colWidths = $derived({
		mtime: mtimeWidth,
		type: typeWidth,
		size: sizeWidth
	});

	// --- Navigation Tree Definition ---

	const sourceDataRoot = $derived({
		name: "Source Data",
		path: "/source_data",
		expanded: true,
		children: [],
		hasChildren: true
	});

	const virtualIndexRoot = $derived({
		name: "Virtual Index",
		path: "/",
		expanded: true,
		children: [],
		hasChildren: true
	});

	const activeRoot = $derived(mode === "host" ? sourceDataRoot : virtualIndexRoot);

	// --- Logic ---

	const breadcrumbs = $derived.by(() => {
		const parts = currentPath.split("/").filter(Boolean);
		const crumbs: Breadcrumb[] = [];

		if (mode === "host") {
			crumbs.push({ name: "Source Data", path: "/source_data" });
			let current = "/source_data";
			const subParts = parts[0] === "source_data" ? parts.slice(1) : parts;
			for (const part of subParts) {
				current += `/${part}`;
				crumbs.push({ name: part, path: current });
			}
		} else {
			crumbs.push({ name: "Virtual Index", path: "/" });
			let current = "";
			for (const part of parts) {
				current += `/${part}`;
				crumbs.push({ name: part, path: current });
			}
		}
		return crumbs;
	});

	const filteredFiles = $derived.by(() => {
		const result = files.filter((f) => f.name.toLowerCase().includes(searchQuery.toLowerCase()));

		result.sort((a, b) => {
			// Use ?? so empty strings sort as strings and only missing values fall back to 0
			const valA = sortColumn === "type" ? a.type : a[sortColumn as keyof FileItem] ?? 0;
			const valB = sortColumn === "type" ? b.type : b[sortColumn as keyof FileItem] ?? 0;

			if (valA < valB) return sortDirection === "asc" ? -1 : 1;
			if (valA > valB) return sortDirection === "asc" ? 1 : -1;
			return 0;
		});

		return result;
	});

	function toggleSort(col: typeof sortColumn) {
		if (sortColumn === col) {
			sortDirection = sortDirection === "asc" ? "desc" : "asc";
		} else {
			sortColumn = col;
			sortDirection = "asc";
		}
	}

	function handleRowClick(e: MouseEvent, item: FileItem) {
		if (e.shiftKey && lastSelectedPath) {
			const lastIndex = filteredFiles.findIndex((f) => f.path === lastSelectedPath);
			const currentIndex = filteredFiles.findIndex((f) => f.path === item.path);
			// Guard against a stale anchor that is no longer in the filtered list
			if (lastIndex === -1 || currentIndex === -1) {
				selectedPaths = new Set([item.path]);
				lastSelectedPath = item.path;
				return;
			}
			const start = Math.min(lastIndex, currentIndex);
			const end = Math.max(lastIndex, currentIndex);

			const newSelection = new Set(selectedPaths);
			for (let i = start; i <= end; i++) {
				newSelection.add(filteredFiles[i].path);
			}
			selectedPaths = newSelection;
		} else if (e.metaKey || e.ctrlKey) {
			const newSelection = new Set(selectedPaths);
			if (newSelection.has(item.path)) {
				newSelection.delete(item.path);
			} else {
				newSelection.add(item.path);
			}
			selectedPaths = newSelection;
			lastSelectedPath = item.path;
		} else {
			selectedPaths = new Set([item.path]);
			lastSelectedPath = item.path;
		}
	}

	function handleRowDoubleClick(item: FileItem) {
		if (item.type === "directory") {
			onNavigate(item.path);
			selectedPaths = new Set();
			lastSelectedPath = null;
		}
	}

	function handleSelectAll(checked: boolean) {
		if (checked) {
			selectedPaths = new Set(filteredFiles.map((f) => f.path));
		} else {
			selectedPaths = new Set();
		}
	}

	function bulkToggle(track: boolean) {
		const selectedItems = files.filter((f) => selectedPaths.has(f.path) && f.tracked !== track);
		selectedItems.forEach((item) => onToggleTrack(item));
	}
</script>

<div
	class="file-browser flex h-full flex-col overflow-hidden rounded-lg border border-border-color bg-bg-secondary shadow-2xl"
>
	<!-- ZONE A: TOP BAR -->
	<div class="flex h-14 shrink-0 items-center justify-between border-b border-border-color bg-bg-tertiary/50 px-6 shadow-sm">
		<div class="flex items-center gap-4 flex-1">
			<!-- Navigation Buttons -->
			<div class="flex items-center gap-1">
				<Button variant="ghost" size="icon" class="h-8 w-8 text-text-secondary hover:text-text-primary hover:bg-white/5">
					<ChevronLeft size={18}></ChevronLeft>
				</Button>
				<Button variant="ghost" size="icon" class="h-8 w-8 text-text-secondary hover:text-text-primary hover:bg-white/5">
					<ChevronRight size={18}></ChevronRight>
				</Button>
				<Button
					variant="ghost"
					size="icon"
					class="h-8 w-8 text-text-secondary hover:text-text-primary hover:bg-white/5"
					onclick={() => {
						if (mode === "host" && currentPath === "/source_data") return;
						if (mode === "index" && currentPath === "/") return;
						const parent = currentPath.split("/").slice(0, -1).join("/") || "/";
						onNavigate(parent);
					}}
				>
					<ChevronUp size={18}></ChevronUp>
				</Button>
			</div>

			<!-- Address Bar -->
			<div class="flex-1 flex items-center bg-bg-primary border border-border-color/40 rounded-md px-3 h-9 shadow-inner overflow-hidden max-w-3xl group transition-all focus-within:border-action-color/50">
				<Folder size={16} class="text-yellow-500/80 mr-2 shrink-0"></Folder>
				<div class="flex-1 flex items-center overflow-x-auto scrollbar-hide">
					{#each breadcrumbs as crumb, i}
						{#if i > 0}
							<ChevronRight size={14} class="mx-1 text-text-secondary/30 shrink-0"></ChevronRight>
						{/if}
						<button
							class={cn(
								"px-2 py-0.5 rounded-md text-[13px] transition-colors hover:bg-white/5 whitespace-nowrap cursor-pointer",
								i === breadcrumbs.length - 1 ? "text-text-primary font-bold" : "text-text-secondary hover:text-text-primary"
							)}
							onclick={() => onNavigate(crumb.path)}
						>
							{crumb.name}
						</button>
					{/each}
				</div>
				<button class="ml-2 text-text-secondary hover:text-text-primary p-1 transition-colors cursor-pointer" onclick={() => onNavigate(currentPath)}>
					<RotateCw size={14}></RotateCw>
				</button>
			</div>
		</div>

		<!-- Search Input -->
		<div class="flex items-center shrink-0 ml-12">
			<div class="relative w-64 sm:w-80 group">
				<Search
					size={14}
					class="absolute left-3 top-3 text-text-secondary group-focus-within:text-action-color transition-colors"
				></Search>
				<Input
					type="text"
					placeholder="Search folder"
					bind:value={searchQuery}
					class="h-9 bg-bg-primary/50 pl-10 text-[13px] placeholder:text-text-secondary/50 border-border-color/40 focus-visible:ring-action-color/40 transition-all rounded-md"
				/>
			</div>
		</div>
	</div>

	<div class="flex flex-1 overflow-hidden">
		<!-- ZONE B: NAVIGATION PANE -->
		<aside class="flex w-72 shrink-0 flex-col border-r border-border-color bg-bg-secondary/50">
			<ScrollArea class="flex-1 p-2">
				<div class="px-3 py-1 text-[11px] font-bold uppercase tracking-widest text-text-secondary/60 mb-2">
					Navigation
				</div>
				<FileBrowserTreeItem node={activeRoot} selectedPath={currentPath} onSelect={onNavigate} isSpecial={true} {mode} />
			</ScrollArea>
		</aside>

		<!-- ZONE C: DETAILS PANE -->
		<div class="flex min-w-0 flex-1 flex-col bg-bg-primary shadow-inner">
			<!-- Column Headers -->
			<div class="flex h-10 items-center border-b border-border-color bg-bg-tertiary/30 shrink-0 select-none">
				<div class="flex w-12 shrink-0 justify-center">
					<Checkbox
						checked={selectedPaths.size === filteredFiles.length && filteredFiles.length > 0}
						onCheckedChange={handleSelectAll}
					/>
				</div>

				<div class="flex flex-1 items-center min-w-0 h-full relative group/col">
					<button
						class="flex w-full items-center justify-between text-[11px] font-semibold text-text-secondary hover:bg-white/5 px-4 h-full transition-colors"
						onclick={() => toggleSort("name")}
					>
						Name
						{#if sortColumn === "name"}
							<ArrowUpDown size={10} class={cn(sortDirection === "desc" && "rotate-180")}></ArrowUpDown>
						{/if}
					</button>
					<!-- Vertical Separator & Resizer -->
					<div class="absolute right-0 top-0 w-px h-full bg-border-color/30"></div>
					<div
						class="absolute -right-1 top-0 w-2 h-full cursor-col-resize z-10"
						role="none"
					></div>
				</div>

				<div class="flex items-center h-full relative group/col shrink-0" style="width: {mtimeWidth}px">
					<button
						class="flex w-full items-center justify-between text-[11px] font-semibold text-text-secondary hover:bg-white/5 px-4 h-full transition-colors"
						onclick={() => toggleSort("mtime")}
					>
						Date modified
						{#if sortColumn === "mtime"}
							<ArrowUpDown size={10} class={cn(sortDirection === "desc" && "rotate-180")}></ArrowUpDown>
						{/if}
					</button>
					<!-- Vertical Separator & Resizer -->
					<div class="absolute right-0 top-0 w-px h-full bg-border-color/30"></div>
					<div
						class="absolute -right-1 top-0 w-2 h-full cursor-col-resize z-10"
						onmousedown={(e) => startResize(e, 'mtime')}
						role="none"
					></div>
				</div>

				<div class="flex items-center h-full relative group/col shrink-0" style="width: {typeWidth}px">
					<button
						class="flex w-full items-center justify-between text-[11px] font-semibold text-text-secondary hover:bg-white/5 px-4 h-full transition-colors"
						onclick={() => toggleSort("type")}
					>
						Type
						{#if sortColumn === "type"}
							<ArrowUpDown size={10} class={cn(sortDirection === "desc" && "rotate-180")}></ArrowUpDown>
						{/if}
					</button>
					<!-- Vertical Separator & Resizer -->
					<div class="absolute right-0 top-0 w-px h-full bg-border-color/30"></div>
					<div
						class="absolute -right-1 top-0 w-2 h-full cursor-col-resize z-10"
						onmousedown={(e) => startResize(e, 'type')}
						role="none"
					></div>
				</div>

				<div class="flex items-center h-full relative group/col shrink-0" style="width: {sizeWidth}px">
					<button
						class="flex w-full items-center justify-between text-[11px] font-semibold text-text-secondary hover:bg-white/5 px-4 h-full transition-colors text-right"
						onclick={() => toggleSort("size")}
					>
						Size
						{#if sortColumn === "size"}
							<ArrowUpDown size={10} class={cn(sortDirection === "desc" && "rotate-180")}></ArrowUpDown>
						{/if}
					</button>
					<!-- Vertical Separator & Resizer -->
					<div class="absolute right-0 top-0 w-px h-full bg-border-color/30"></div>
					<div
						class="absolute -right-1 top-0 w-2 h-full cursor-col-resize z-10"
						onmousedown={(e) => startResize(e, 'size')}
						role="none"
					></div>
				</div>

				<div class="w-10 shrink-0"></div>
			</div>

			<!-- Scrollable File List -->
			<ScrollArea class="flex-1">
				{#if filteredFiles.length === 0}
					<div class="flex h-full flex-col items-center justify-center p-12 text-center opacity-30">
						<Search size={48} class="mb-4" strokeWidth={1}></Search>
						<p class="text-sm font-medium uppercase tracking-widest">Folder is empty</p>
					</div>
				{:else}
					{#each filteredFiles as item (item.path)}
						<FileBrowserRowItem
							{item}
							{mode}
							{colWidths}
							isSelected={selectedPaths.has(item.path)}
							onClick={(e) => handleRowClick(e, item)}
							onDoubleClick={() => handleRowDoubleClick(item)}
							onToggleTrack={() => onToggleTrack(item)}
						/>
					{/each}
				{/if}
			</ScrollArea>
		</div>
	</div>

	<!-- ZONE D: STATUS BAR -->
	<div
		class="flex h-8 shrink-0 items-center justify-between border-t border-border-color bg-bg-tertiary px-6 text-[10px] font-medium text-text-secondary"
	>
		<div class="flex items-center gap-4">
			<span>{filteredFiles.length} items</span>
			<div class="h-3 w-px bg-border-color/40"></div>
			{#if selectedPaths.size > 0}
				<span class="text-text-primary">
					{selectedPaths.size} items selected
				</span>
			{/if}
		</div>
		<div class="flex items-center gap-3">
			<div class="flex items-center gap-1.5 rounded-full bg-action-color/10 px-2 py-0.5 text-action-color border border-action-color/20 shadow-sm">
				<CheckSquare size={10}></CheckSquare>
				<span class="font-bold uppercase tracking-wider">
					{#if mode === 'host'}
						{files.filter((f) => f.tracked).length} Tracked
					{:else}
						{files.filter((f) => f.selected).length} Selected
					{/if}
				</span>
			</div>
		</div>
	</div>
</div>

<style>
	:global(.scrollbar-hide::-webkit-scrollbar) {
		display: none;
	}
	:global(.scrollbar-hide) {
		-ms-overflow-style: none;
		scrollbar-width: none;
	}
</style>
@@ -0,0 +1,229 @@
<script lang="ts">
	import {
		Folder,
		File,
		FileText,
		Film,
		Image,
		Archive,
		Link as LinkIcon,
		MoreVertical,
		ExternalLink,
		CassetteTape,
		ShieldCheck,
		ShieldAlert,
		Square
	} from "lucide-svelte";
	import { Checkbox } from "$lib/components/ui/checkbox";
	import { Button } from "$lib/components/ui/button";
	import type { FileItem } from "$lib/types";
	import { cn } from "$lib/utils";

	let {
		item,
		isSelected = false,
		onClick = (e: MouseEvent) => {},
		onDoubleClick = () => {},
		onToggleTrack = () => {},
		mode = "host",
		colWidths = { mtime: 200, type: 150, size: 120 }
	} = $props<{
		item: FileItem;
		isSelected?: boolean;
		onClick?: (e: MouseEvent) => void;
		onDoubleClick?: () => void;
		onToggleTrack?: () => void;
		mode?: "host" | "index";
		colWidths?: { mtime: number; type: number; size: number };
	}>();

	const FileIcon = $derived.by(() => {
		if (item.type === "directory") return Folder;
		if (item.type === "link") return LinkIcon;

		const ext = item.name.split(".").pop()?.toLowerCase();
		switch (ext) {
			case "txt":
			case "pdf":
			case "doc":
			case "docx":
				return FileText;
			case "mp4":
			case "mkv":
			case "mov":
			case "avi":
				return Film;
			case "jpg":
			case "jpeg":
			case "png":
			case "gif":
			case "webp":
				return Image;
			case "zip":
			case "rar":
			case "7z":
			case "tar":
			case "gz":
				return Archive;
			default:
				return File;
		}
	});

	function formatSize(bytes?: number | null) {
		// The API schema allows null sizes, so treat null and undefined alike
		if (bytes == null) return "--";
		if (bytes === 0) return "0 B";
		const units = ["B", "KB", "MB", "GB", "TB"];
		let unitIndex = 0;
		let size = bytes;
		while (size >= 1024 && unitIndex < units.length - 1) {
			size /= 1024;
			unitIndex++;
		}
		return `${size.toFixed(1)} ${units[unitIndex]}`;
	}

	function formatDate(mtime?: number | null) {
		if (!mtime) return "--";
		const date = new Date(mtime * 1000);
		return date.toLocaleDateString(undefined, {
			year: "numeric",
			month: "short",
			day: "numeric",
			hour: "2-digit",
			minute: "2-digit"
		});
	}

	function getItemTypeLabel(item: FileItem) {
		if (item.type === "directory") return "File folder";
		if (item.type === "link") return "System link";
		const ext = item.name.split(".").pop()?.toUpperCase();
		return ext ? `${ext} File` : "File";
	}
</script>

<div
	class={cn(
		"group flex h-10 items-center border-b border-border-color/10 transition-all cursor-pointer select-none",
		isSelected
			? "bg-blue-500/15 border-l-2 border-l-blue-500"
			: "hover:bg-white/5 border-l-2 border-l-transparent"
	)}
	role="button"
	tabindex="0"
	onclick={onClick}
	ondblclick={onDoubleClick}
	onkeydown={(e) => e.key === "Enter" && onDoubleClick()}
>
	<!-- TRACKING STATUS / SELECTION -->
	<div
		class="flex h-10 w-12 shrink-0 items-center justify-center border-r border-border-color/10"
		onclick={(e) => {
			e.stopPropagation();
			onToggleTrack();
		}}
		onkeydown={(e) => e.key === " " && e.stopPropagation()}
		role="none"
	>
		{#if mode === 'host'}
			{#if item.tracked}
				<div class="text-success-color bg-success-color/10 p-1 rounded-md">
					<ShieldCheck size={16} />
				</div>
			{:else}
				<div class="text-text-secondary/20 group-hover:text-text-secondary/40 p-1">
					<Square size={16} />
				</div>
			{/if}
		{:else}
			<Checkbox checked={item.selected} onCheckedChange={onToggleTrack} />
		{/if}
	</div>

	<!-- NAME & ICON -->
	<div class="flex flex-1 items-center gap-3 min-w-0 px-4 h-full border-r border-border-color/10">
		<div class="shrink-0 relative">
			<FileIcon
				size={18}
				class={cn(
					item.type === "directory" ? "text-yellow-500/80 fill-yellow-500/10" : "text-text-secondary"
				)}
			></FileIcon>
			{#if item.type === "link"}
				<div
					class="absolute -bottom-1 -right-1 bg-bg-secondary rounded-full p-0.5 border border-border-color"
				>
					<ExternalLink size={8} class="text-action-color" />
				</div>
			{/if}
		</div>

		<div class="flex flex-col min-w-0">
			<div class="flex items-center gap-2">
				<span
					class={cn(
						"truncate text-[13px] transition-colors",
						item.tracked
							? "text-success-color font-bold"
							: isSelected
								? "text-text-primary font-medium"
								: "text-text-secondary group-hover:text-text-primary"
					)}
				>
					{item.name}
				</span>
				{#if mode === "index" && item.media && item.media.length > 0}
					<div class="flex gap-1 overflow-hidden">
						{#each item.media as m}
							<span class="inline-flex items-center gap-1 bg-blue-500/10 text-blue-400 text-[9px] px-1.5 py-0.5 rounded border border-blue-500/20 font-bold uppercase tracking-wider">
								<CassetteTape size={10} />
								{m}
							</span>
						{/each}
					</div>
				{/if}
			</div>
			{#if item.type === "link" && item.target}
				<span class="text-[10px] text-text-secondary/50 truncate italic flex items-center gap-1">
					<LinkIcon size={10} /> {item.target}
				</span>
			{/if}
		</div>
	</div>

	<!-- DATE MODIFIED -->
	<div
		class="shrink-0 px-4 h-full flex items-center text-[12px] text-text-secondary tabular-nums font-medium border-r border-border-color/10"
		style="width: {colWidths.mtime}px"
	>
		{formatDate(item.mtime)}
	</div>

	<!-- TYPE -->
	<div
		class="shrink-0 px-4 h-full flex items-center text-[12px] text-text-secondary truncate font-medium border-r border-border-color/10"
		style="width: {colWidths.type}px"
	>
		{getItemTypeLabel(item)}
	</div>

	<!-- SIZE -->
	<div
		class="shrink-0 px-4 h-full flex items-center justify-end text-[12px] text-text-secondary mono text-right tabular-nums font-medium border-r border-border-color/10"
		style="width: {colWidths.size}px"
	>
		{item.type === "directory" ? "" : formatSize(item.size)}
	</div>

	<!-- QUICK ACTIONS -->
	<div class="w-10 shrink-0 flex justify-center opacity-0 group-hover:opacity-100 transition-opacity">
		<Button
			variant="ghost"
			size="icon"
			class="h-7 w-7 text-text-secondary hover:text-text-primary hover:bg-white/10"
		>
			<MoreVertical size={14} />
		</Button>
	</div>
</div>
@@ -0,0 +1,176 @@
<script lang="ts">
  import { ChevronRight, Folder, HardDrive } from "lucide-svelte";
  import type { TreeNode } from "$lib/types";
  import { cn } from "$lib/utils";
  import FileBrowserTreeItem from "./FileBrowserTreeItem.svelte";
  import { getTreeSystemTreeGet } from "$lib/api/sdk.gen";

  let {
    node,
    selectedPath,
    onSelect = (path: string) => {},
    level = 0,
    isSpecial = false,
    mode = "host"
  } = $props<{
    node: TreeNode;
    selectedPath: string | null;
    onSelect?: (path: string) => void;
    level?: number;
    isSpecial?: boolean;
    mode?: "host" | "index";
  }>();

  let expanded = $state(node.expanded || false);
  let children = $state<TreeNode[]>(node.children || []);
  let loading = $state(false);
  let loaded = $state(false);

  // Auto-load if started expanded
  $effect(() => {
    if (expanded && !loaded) {
      loadSubdirs();
    }
  });

  async function loadSubdirs() {
    if (loaded || mode === "index") return; // Index mode lazy loading not yet implemented

    loading = true;
    try {
      const response = await getTreeSystemTreeGet({
        query: { path: node.path }
      });
      if (response.data) {
        children = response.data.map(d => ({
          name: d.name,
          path: d.path,
          children: [],
          expanded: false,
          hasChildren: d.has_children
        }));
        loaded = true;
      }
    } catch (error) {
      console.error("Failed to load subdirectories:", error);
    } finally {
      loading = false;
    }
  }

  async function toggle() {
    expanded = !expanded;
    if (expanded && !loaded) {
      await loadSubdirs();
    }
  }

  function select() {
    onSelect(node.path);
  }

  const specialIcon = $derived.by(() => {
    if (!isSpecial) return null;
    switch (node.name.toLowerCase()) {
      case "source data":
      case "this pc":
      case "root":
      case "virtual index":
        return HardDrive;
      default:
        return HardDrive;
    }
  });

  const hasSubdirs = $derived((children && children.length > 0) || (node as any).hasChildren);

  function handleKeyDown(e: KeyboardEvent) {
    if (e.key === "Enter" || e.key === " ") {
      e.preventDefault();
      select();
    } else if (e.key === "ArrowRight") {
      if (!expanded) toggle();
    } else if (e.key === "ArrowLeft") {
      if (expanded) toggle();
    }
  }
</script>

<div class="tree-item-group">
  <div
    class={cn(
      "group flex items-center gap-2 py-1.5 px-3 cursor-pointer select-none transition-all rounded-sm",
      selectedPath === node.path
        ? "bg-blue-500/15 text-text-primary shadow-sm border-l-2 border-blue-500"
        : "text-text-secondary hover:bg-white/5 hover:text-text-primary border-l-2 border-transparent"
    )}
    style="padding-left: {level * 12 + (isSpecial ? 12 : 8)}px"
    onclick={select}
    onkeydown={handleKeyDown}
    role="treeitem"
    aria-selected={selectedPath === node.path}
    aria-expanded={hasSubdirs ? expanded : undefined}
    tabindex="0"
  >
    <!-- EXPANDER ARROW -->
    <button
      class={cn(
        "w-4 h-4 flex items-center justify-center transition-all",
        hasSubdirs ? "opacity-60 hover:opacity-100" : "opacity-0 pointer-events-none"
      )}
      onclick={(e) => {
        e.stopPropagation();
        toggle();
      }}
      tabindex="-1"
    >
      <ChevronRight
        size={12}
        strokeWidth={3}
        class={cn("transition-transform duration-200", expanded && "rotate-90", loading && "animate-pulse")}
      ></ChevronRight>
    </button>

    <!-- ICON -->
    {#if isSpecial && specialIcon}
      {@const Icon = specialIcon}
      <Icon
        size={16}
        class={cn(
          "shrink-0 transition-colors",
          selectedPath === node.path ? "text-blue-400" : "text-text-secondary/60 group-hover:text-text-secondary/90"
        )}
      ></Icon>
    {:else}
      <Folder
        size={16}
        class={cn(
          "shrink-0 transition-colors",
          selectedPath === node.path
            ? "text-yellow-500/80 fill-yellow-500/10"
            : "text-text-secondary/40 group-hover:text-text-secondary/80"
        )}
      ></Folder>
    {/if}

    <!-- LABEL -->
    <span class={cn("text-[13px] truncate", selectedPath === node.path ? "font-semibold" : "font-medium")}>
      {node.name}
    </span>
  </div>

  {#if expanded && children.length > 0}
    <div role="group">
      {#each children as child}
        <FileBrowserTreeItem node={child} {selectedPath} {onSelect} level={level + 1} {mode} />
      {/each}
    </div>
  {/if}
</div>

<style>
  .tree-item-group {
    display: flex;
    flex-direction: column;
  }
</style>
@@ -0,0 +1,17 @@
<script lang="ts">
  import type { HTMLAttributes } from "svelte/elements";
  import { MoreHorizontal } from "lucide-svelte";
  import { cn } from "$lib/utils";

  let { class: className, ...rest }: HTMLAttributes<HTMLSpanElement> = $props();
</script>

<span
  role="presentation"
  aria-hidden="true"
  class={cn("flex h-9 w-9 items-center justify-center", className)}
  {...rest}
>
  <MoreHorizontal class="h-4 w-4" />
  <span class="sr-only">More</span>
</span>
@@ -0,0 +1,10 @@
<script lang="ts">
  import type { HTMLAttributes } from "svelte/elements";
  import { cn } from "$lib/utils";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLLIElement> = $props();
</script>

<li class={cn("inline-flex items-center gap-1.5", className)} {...rest}>
  {@render children?.()}
</li>
@@ -0,0 +1,13 @@
<script lang="ts">
  import type { HTMLAttributes } from "svelte/elements";
  import { cn } from "$lib/utils";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLAnchorElement> = $props();
</script>

<a
  class={cn("transition-colors hover:text-foreground", className)}
  {...rest}
>
  {@render children?.()}
</a>
@@ -0,0 +1,16 @@
<script lang="ts">
  import type { HTMLAttributes } from "svelte/elements";
  import { cn } from "$lib/utils";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLOListElement> = $props();
</script>

<ol
  class={cn(
    "flex flex-wrap items-center gap-1.5 break-words text-sm text-muted-foreground sm:gap-2.5",
    className
  )}
  {...rest}
>
  {@render children?.()}
</ol>
@@ -0,0 +1,16 @@
<script lang="ts">
  import type { HTMLAttributes } from "svelte/elements";
  import { cn } from "$lib/utils";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLSpanElement> = $props();
</script>

<span
  role="link"
  aria-disabled="true"
  aria-current="page"
  class={cn("font-normal text-foreground", className)}
  {...rest}
>
  {@render children?.()}
</span>
@@ -0,0 +1,20 @@
<script lang="ts">
  import type { HTMLAttributes } from "svelte/elements";
  import { ChevronRight } from "lucide-svelte";
  import { cn } from "$lib/utils";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLLIElement> = $props();
</script>

<li
  role="presentation"
  aria-hidden="true"
  class={cn("[&>svg]:size-3.5", className)}
  {...rest}
>
  {#if children}
    {@render children()}
  {:else}
    <ChevronRight />
  {/if}
</li>
@@ -0,0 +1,10 @@
<script lang="ts">
  import type { HTMLAttributes } from "svelte/elements";
  import { cn } from "$lib/utils";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLElement> = $props();
</script>

<nav aria-label="breadcrumb" class={cn("", className)} {...rest}>
  {@render children?.()}
</nav>
@@ -0,0 +1,25 @@
import Root from "./breadcrumb.svelte";
import List from "./breadcrumb-list.svelte";
import Item from "./breadcrumb-item.svelte";
import Link from "./breadcrumb-link.svelte";
import Page from "./breadcrumb-page.svelte";
import Separator from "./breadcrumb-separator.svelte";
import Ellipsis from "./breadcrumb-ellipsis.svelte";

export {
  Root,
  List,
  Item,
  Link,
  Page,
  Separator,
  Ellipsis,
  //
  Root as Breadcrumb,
  List as BreadcrumbList,
  Item as BreadcrumbItem,
  Link as BreadcrumbLink,
  Page as BreadcrumbPage,
  Separator as BreadcrumbSeparator,
  Ellipsis as BreadcrumbEllipsis
};
@@ -0,0 +1,22 @@
<script lang="ts">
  import { Button as ButtonPrimitive } from "bits-ui";
  import { cn } from "$lib/utils";
  import { type Props, buttonVariants } from "./index.js";

  let {
    class: className,
    variant,
    size,
    children,
    builders,
    ...rest
  }: Props = $props();
</script>

<ButtonPrimitive.Root
  class={cn(buttonVariants({ variant, size, className }))}
  {builders}
  {...rest}
>
  {@render children?.()}
</ButtonPrimitive.Root>
@@ -0,0 +1,44 @@
import { type VariantProps, tv } from "tailwind-variants";
import type { Button as ButtonPrimitive } from "bits-ui";
import Root from "./button.svelte";

const buttonVariants = tv({
  base: "inline-flex items-center justify-center whitespace-nowrap rounded-md text-[13px] font-bold uppercase tracking-wider ring-offset-background transition-all focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:pointer-events-none disabled:opacity-40 active:translate-y-[1px] border select-none relative overflow-hidden",
  variants: {
    variant: {
      default: "bg-gradient-to-b from-[#4aa3df] to-[var(--color-action-color)] text-white border-[#2980b9] shadow-[inset_0_1px_0_rgba(255,255,255,0.2),0_2px_4px_rgba(0,0,0,0.2)] hover:brightness-110 hover:shadow-[0_0_15px_rgba(52,152,219,0.3)]",
      destructive: "bg-gradient-to-b from-[#ef5350] to-[var(--color-error-color)] text-white border-[#c62828] shadow-[inset_0_1px_0_rgba(255,255,255,0.2),0_2px_4px_rgba(0,0,0,0.2)] hover:brightness-110",
      outline: "border-border bg-transparent hover:bg-white/5 hover:text-white",
      secondary: "bg-gradient-to-b from-[#21262d] to-[#161b22] text-[var(--color-text-primary)] border-[var(--color-border-color)] shadow-[inset_0_1px_0_rgba(255,255,255,0.05),0_2px_4px_rgba(0,0,0,0.1)] hover:from-[#2d333b] hover:to-[#21262d] hover:border-[var(--color-text-secondary)]",
      ghost: "border-transparent text-[var(--color-text-secondary)] hover:bg-white/5 hover:text-white",
      link: "border-transparent text-primary underline-offset-4 hover:underline",
      warning: "bg-gradient-to-b from-[#f39c12] to-[#e67e22] text-white border-[#d35400] shadow-[inset_0_1px_0_rgba(255,255,255,0.2),0_2px_4px_rgba(0,0,0,0.2)] hover:brightness-110 hover:shadow-[0_0_15px_rgba(230,126,34,0.3)]"
    },
    size: {
      default: "h-9 px-5 py-2",
      sm: "h-8 rounded-md px-3 text-[11px]",
      lg: "h-11 rounded-md px-10 text-[14px]",
      icon: "h-9 w-9 p-0"
    }
  },
  defaultVariants: {
    variant: "default",
    size: "default"
  }
});

type Variant = VariantProps<typeof buttonVariants>["variant"];
type Size = VariantProps<typeof buttonVariants>["size"];

type Props = ButtonPrimitive.Props & {
  variant?: Variant;
  size?: Size;
};

export {
  Root,
  type Props,
  //
  Root as Button,
  buttonVariants
};
@@ -0,0 +1,10 @@
<script lang="ts">
  import { cn } from "$lib/utils";
  import type { HTMLAttributes } from "svelte/elements";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLDivElement> = $props();
</script>

<div class={cn("p-6 pt-0", className)} {...rest}>
  {@render children?.()}
</div>
@@ -0,0 +1,13 @@
<script lang="ts">
  import { cn } from "$lib/utils";
  import type { HTMLAttributes } from "svelte/elements";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLParagraphElement> = $props();
</script>

<p
  class={cn("text-sm text-muted-foreground", className)}
  {...rest}
>
  {@render children?.()}
</p>
@@ -0,0 +1,10 @@
<script lang="ts">
  import { cn } from "$lib/utils";
  import type { HTMLAttributes } from "svelte/elements";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLDivElement> = $props();
</script>

<div class={cn("flex items-center p-6 pt-0", className)} {...rest}>
  {@render children?.()}
</div>
@@ -0,0 +1,10 @@
<script lang="ts">
  import { cn } from "$lib/utils";
  import type { HTMLAttributes } from "svelte/elements";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLDivElement> = $props();
</script>

<div class={cn("flex flex-col space-y-1.5 p-6", className)} {...rest}>
  {@render children?.()}
</div>
@@ -0,0 +1,13 @@
<script lang="ts">
  import { cn } from "$lib/utils";
  import type { HTMLAttributes } from "svelte/elements";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLHeadingElement> = $props();
</script>

<h3
  class={cn("text-2xl font-semibold leading-none tracking-tight", className)}
  {...rest}
>
  {@render children?.()}
</h3>
@@ -0,0 +1,13 @@
<script lang="ts">
  import { cn } from "$lib/utils";
  import type { HTMLAttributes } from "svelte/elements";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLDivElement> = $props();
</script>

<div
  class={cn("rounded-lg border bg-card text-card-foreground shadow-sm", className)}
  {...rest}
>
  {@render children?.()}
</div>
@@ -0,0 +1,22 @@
import Root from "./card.svelte";
import Header from "./card-header.svelte";
import Footer from "./card-footer.svelte";
import Title from "./card-title.svelte";
import Description from "./card-description.svelte";
import Content from "./card-content.svelte";

export {
  Root,
  Header,
  Footer,
  Title,
  Description,
  Content,
  //
  Root as Card,
  Header as CardHeader,
  Footer as CardFooter,
  Title as CardTitle,
  Description as CardDescription,
  Content as CardContent
};
@@ -0,0 +1,26 @@
<script lang="ts">
  import { Checkbox as CheckboxPrimitive } from "bits-ui";
  import { Check } from "lucide-svelte";
  import { cn } from "$lib/utils";

  let {
    class: className,
    checked = $bindable(false),
    ...rest
  }: CheckboxPrimitive.Props = $props();
</script>

<CheckboxPrimitive.Root
  bind:checked
  class={cn(
    "peer h-4 w-4 shrink-0 rounded-sm border border-border-color bg-bg-primary ring-offset-background focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-action-color focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 data-[state=checked]:bg-action-color data-[state=checked]:border-action-color data-[state=checked]:text-white transition-all",
    className
  )}
  {...rest}
>
  {#if checked}
    <div class="flex items-center justify-center text-current h-full w-full">
      <Check class="h-3.5 w-3.5 stroke-[3]" />
    </div>
  {/if}
</CheckboxPrimitive.Root>
@@ -0,0 +1,7 @@
import Root from "./checkbox.svelte";

export {
  Root,
  //
  Root as Checkbox
};
@@ -0,0 +1,7 @@
import Root from "./input.svelte";

export {
  Root,
  //
  Root as Input
};
@@ -0,0 +1,19 @@
<script lang="ts">
  import type { HTMLInputAttributes } from "svelte/elements";
  import { cn } from "$lib/utils";

  let {
    class: className,
    value = $bindable(),
    ...rest
  }: HTMLInputAttributes = $props();
</script>

<input
  class={cn(
    "flex h-10 w-full rounded-md border border-border-color bg-bg-primary px-3 py-2 text-sm text-text-primary ring-offset-background file:border-0 file:bg-transparent file:text-sm file:font-medium placeholder:text-text-secondary/50 focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-action-color disabled:cursor-not-allowed disabled:opacity-50 transition-all",
    className
  )}
  bind:value
  {...rest}
/>
@@ -0,0 +1,7 @@
import Root from "./scroll-area.svelte";

export {
  Root,
  //
  Root as ScrollArea
};
@@ -0,0 +1,40 @@
<script lang="ts">
  import { cn } from "$lib/utils";
  import type { HTMLAttributes } from "svelte/elements";

  let { class: className, children, ...rest }: HTMLAttributes<HTMLDivElement> = $props();
</script>

<div
  class={cn("relative overflow-auto scrollbar-custom", className)}
  {...rest}
>
  {@render children?.()}
</div>

<style>
  :global(.scrollbar-custom::-webkit-scrollbar) {
    width: 8px;
    height: 8px;
  }

  :global(.scrollbar-custom::-webkit-scrollbar-track) {
    background: transparent;
  }

  :global(.scrollbar-custom::-webkit-scrollbar-thumb) {
    background: var(--color-border-color);
    border-radius: 10px;
    border: 2px solid var(--color-bg-secondary);
  }

  :global(.scrollbar-custom::-webkit-scrollbar-thumb:hover) {
    background: var(--color-text-secondary);
  }

  /* Firefox support */
  .scrollbar-custom {
    scrollbar-width: thin;
    scrollbar-color: var(--color-border-color) transparent;
  }
</style>
@@ -0,0 +1,23 @@
export interface FileItem {
  name: string;
  path: string;
  type: 'file' | 'directory' | 'link';
  target?: string; // For links
  size?: number;
  mtime?: number;
  tracked?: boolean;
  media?: string[]; // Media it's on (for index browsing)
  selected?: boolean; // For restore cart
}

export interface TreeNode {
  name: string;
  path: string;
  children?: TreeNode[];
  expanded?: boolean;
  hasChildren?: boolean; // Set by lazy loading when children exist but are not fetched yet
}

export interface Breadcrumb {
  name: string;
  path: string;
}
@@ -0,0 +1,6 @@
import { type ClassValue, clsx } from "clsx";
import { twMerge } from "tailwind-merge";

export function cn(...inputs: ClassValue[]) {
  return twMerge(clsx(inputs));
}
@@ -0,0 +1,51 @@
<script lang="ts">
  import '../app.css';
  import { page } from '$app/stores';
  import {
    LayoutDashboard,
    Library,
    FolderTree,
    History,
    Settings,
    Database
  } from 'lucide-svelte';

  let { children } = $props();

  const navItems = [
    { name: 'Dashboard', href: '/', icon: LayoutDashboard },
    { name: 'Inventory', href: '/inventory', icon: Library },
    { name: 'File Tracking', href: '/tracking', icon: FolderTree },
    { name: 'Restores', href: '/restores', icon: History }
  ];
</script>

<div class="app-container">
  <nav>
    <h2><Database size={24} color="#3498db" /> TapeHoard</h2>
    <ul>
      {#each navItems as item}
        <li>
          <a href={item.href} class:active={$page.url.pathname === item.href}>
            <item.icon size={20} />
            {item.name}
          </a>
        </li>
      {/each}
    </ul>
    <div style="margin-top: auto;">
      <ul>
        <li>
          <a href="/settings" class:active={$page.url.pathname === '/settings'}>
            <Settings size={20} />
            Settings
          </a>
        </li>
      </ul>
    </div>
  </nav>
  <main>
    {@render children()}
  </main>
</div>
@@ -0,0 +1,9 @@
<script lang="ts">
</script>

<svelte:head>
  <title>TapeHoard - Dashboard</title>
</svelte:head>

<h1>Dashboard</h1>
<p>Welcome to TapeHoard.</p>
@@ -0,0 +1,191 @@
<script lang="ts">
  import { Plus, CassetteTape, HardDrive, Cloud, MapPin, Edit3, Scissors } from 'lucide-svelte';

  // Mock Data
  const mediaList = [
    { id: 1, type: 'LTO-6', identifier: 'BUP-00001', capacity: 2500, used: 2450, status: 'Full', location: 'Bank Vault' },
    { id: 2, type: 'LTO-6', identifier: 'BUP-00002', capacity: 2500, used: 1200, status: 'Active', location: 'In Drive' },
    { id: 3, type: 'HDD', identifier: 'HDD-001', capacity: 8000, used: 4000, status: 'Active', location: 'Shelf 2' },
    { id: 4, type: 'Cloud', identifier: 's3://my-backups', capacity: Infinity, used: 1500, status: 'Active', location: 'AWS-East' }
  ];

  function getPercentage(used: number, capacity: number) {
    if (capacity === Infinity) return 0;
    return Math.min(100, Math.round((used / capacity) * 100));
  }
</script>

<svelte:head>
  <title>Inventory - TapeHoard</title>
</svelte:head>

<div class="flex justify-between items-center mb-8 bg-bg-secondary p-6 rounded-lg border border-border-color shadow-lg">
  <div>
    <h1 class="text-3xl font-bold tracking-tight text-text-primary">Global Inventory</h1>
    <p class="text-text-secondary mt-1">Manage physical and cloud storage media across all locations.</p>
  </div>
  <button class="btn btn-primary h-11 px-8"><Plus size={18} class="mr-2" /> Register New Media</button>
</div>

<div class="card stats-summary">
  <h3>Storage Pool Summary</h3>
  <div class="stats-grid">
    <div class="stat-box">
      <span class="label">Total Media</span>
      <span class="value">{mediaList.length}</span>
    </div>
    <div class="stat-box">
      <span class="label">Total Capacity</span>
      <span class="value">13.0 TB</span>
    </div>
    <div class="stat-box">
      <span class="label">Total Used</span>
      <span class="value">9.1 TB</span>
    </div>
    <div class="stat-box">
      <span class="label">Utilization</span>
      <span class="value">70%</span>
    </div>
  </div>
</div>

<div class="card no-padding">
  <table class="data-table">
    <thead>
      <tr>
        <th>Identifier</th>
        <th>Type</th>
        <th>Capacity Used</th>
        <th>Status</th>
        <th>Location</th>
        <th>Actions</th>
      </tr>
    </thead>
    <tbody>
      {#each mediaList as media}
        <tr>
          <td><span class="mono"><strong>{media.identifier}</strong></span></td>
          <td>
            {#if media.type.startsWith('LTO')}
              <span class="badge badge-tape"><CassetteTape size={12} style="margin-right: 4px;" /> {media.type}</span>
            {:else if media.type === 'HDD'}
              <span class="badge badge-hdd"><HardDrive size={12} style="margin-right: 4px;" /> {media.type}</span>
            {:else}
              <span class="badge badge-cloud"><Cloud size={12} style="margin-right: 4px;" /> {media.type}</span>
            {/if}
          </td>
          <td>
            <div class="progress-info">
              <div class="progress-container">
                <div class="progress-bar" style="width: {getPercentage(media.used, media.capacity)}%; background-color: {media.status === 'Full' ? 'var(--color-error-color)' : 'var(--color-action-color)'}"></div>
              </div>
              <span class="mono">{media.used} GB / {media.capacity === Infinity ? '∞' : media.capacity + ' GB'}</span>
            </div>
          </td>
          <td>
            <div class="status-cell">
              <span class="status-dot" class:active={media.status === 'Active'} class:full={media.status === 'Full'}></span>
              {media.status}
            </div>
          </td>
          <td>
            <div class="location-cell">
              <MapPin size={14} color="var(--color-text-secondary)" />
              {media.location}
            </div>
          </td>
          <td>
            <div class="actions">
              <button class="btn btn-secondary btn-icon-only" title="Edit"><Edit3 size={14} /></button>
              {#if media.status === 'Full' && media.type.startsWith('LTO')}
                <button class="btn btn-warning btn-icon-only" title="Groom Tape"><Scissors size={14} /></button>
              {/if}
            </div>
          </td>
        </tr>
      {/each}
    </tbody>
  </table>
</div>

<style>
  .no-padding {
    padding: 0;
  }

  .stats-grid {
    display: grid;
    grid-template-columns: repeat(4, 1fr);
    gap: var(--spacing-lg);
    margin-top: var(--spacing-md);
  }

  .stat-box .label {
    color: var(--color-text-secondary);
    font-size: 0.8rem;
    text-transform: uppercase;
    font-weight: 600;
    letter-spacing: 0.05em;
  }

  .stat-box .value {
    display: block;
    font-size: 1.75rem;
    font-weight: 700;
    color: var(--color-text-primary);
    margin-top: 0.25rem;
  }

  .progress-info {
    display: flex;
    flex-direction: column;
    gap: 0.25rem;
  }

  .progress-info span {
    font-size: 0.75rem;
    color: var(--color-text-secondary);
  }

  .badge-tape { background-color: rgba(52, 152, 219, 0.15); color: #3498db; }
  .badge-hdd { background-color: rgba(241, 196, 15, 0.15); color: #f1c40f; }
  .badge-cloud { background-color: rgba(46, 204, 113, 0.15); color: #2ecc71; }

  .status-cell {
    display: flex;
    align-items: center;
    gap: var(--spacing-sm);
  }

  .status-dot {
    width: 8px;
    height: 8px;
    border-radius: 50%;
    background-color: var(--color-text-secondary);
  }

  .status-dot.active { background-color: var(--color-success-color); box-shadow: 0 0 8px var(--color-success-color); }
  .status-dot.full { background-color: var(--color-error-color); }

  .location-cell {
    display: flex;
    align-items: center;
    gap: var(--spacing-xs);
    color: var(--color-text-secondary);
  }

  .actions {
    display: flex;
    gap: var(--spacing-xs);
  }

  .btn-icon-only {
    padding: 0.4rem;
  }

  .btn-warning {
    background-color: rgba(243, 156, 18, 0.2);
    color: #f39c12;
    border: 1px solid rgba(243, 156, 18, 0.3);
  }
</style>
@@ -0,0 +1,252 @@
<script lang="ts">
	import { onMount } from 'svelte';
	import {
		Play,
		Trash2,
		CassetteTape,
		ChevronRight,
		CheckCircle2,
		AlertCircle,
		Download,
		RotateCw,
		X,
		Search
	} from 'lucide-svelte';
	import { fade, fly } from 'svelte/transition';
	import { Button } from '$lib/components/ui/button';
	import FileBrowser from '$lib/components/file-browser/FileBrowser.svelte';
	import type { FileItem } from '$lib/types';
	import { browseIndexInventoryBrowseGet } from '$lib/api/sdk.gen';

	// Wizard State
	let currentStep = $state(1);

	// File Browser State
	let currentPath = $state('/');
	let indexedFiles = $state<FileItem[]>([]);
	let loading = $state(false);

	// Restore Cart (Selected Files)
	let restoreCart = $state<FileItem[]>([]);

	async function loadIndexedFiles(path: string) {
		loading = true;
		try {
			const response = await browseIndexInventoryBrowseGet({
				query: { path }
			});
			if (response.data) {
				// Map API response and preserve selection state if already in cart
				indexedFiles = response.data.map(f => ({
					...f,
					type: f.type as 'file' | 'directory' | 'link',
					selected: restoreCart.some(cartItem => cartItem.path === f.path)
				}));
			}
		} catch (error) {
			console.error("Failed to load indexed files:", error);
		} finally {
			loading = false;
		}
	}

	onMount(() => {
		loadIndexedFiles(currentPath);
	});

	$effect(() => {
		if (currentPath) {
			loadIndexedFiles(currentPath);
		}
	});

	function handleToggleSelect(item: FileItem) {
		const index = restoreCart.findIndex(i => i.path === item.path);
		if (index > -1) {
			restoreCart = restoreCart.filter((_, i) => i !== index);
			item.selected = false;
		} else {
			restoreCart = [...restoreCart, { ...item, selected: true }];
			item.selected = true;
		}
	}

	function removeFromCart(path: string) {
		restoreCart = restoreCart.filter(i => i.path !== path);
		// Update selection in current view if visible
		const visibleItem = indexedFiles.find(f => f.path === path);
		if (visibleItem) visibleItem.selected = false;
	}

	const totalSize = $derived(restoreCart.reduce((acc, item) => acc + (item.size || 0), 0));
	const requiredMedia = $derived([...new Set(restoreCart.flatMap(item => item.media || []))]);

	function formatSize(bytes: number) {
		if (bytes === 0) return "0 B";
		const units = ["B", "KB", "MB", "GB", "TB"];
		let unitIndex = 0;
		let size = bytes;
		while (size >= 1024 && unitIndex < units.length - 1) {
			size /= 1024;
			unitIndex++;
		}
		return `${size.toFixed(1)} ${units[unitIndex]}`;
	}

	function nextStep() {
		if (currentStep < 4) currentStep++;
	}

	function cancel() {
		currentStep = 1;
	}
</script>

<svelte:head>
	<title>Restore Wizard - TapeHoard</title>
</svelte:head>

<div class="flex flex-col gap-6 h-full">
	<header class="flex justify-between items-center bg-bg-secondary px-8 py-5 rounded-xl border border-border-color shadow-2xl relative overflow-hidden">
		<div class="absolute inset-0 bg-gradient-to-r from-blue-500/5 to-transparent pointer-events-none"></div>
		<div class="relative z-10">
			<h1 class="text-2xl font-black uppercase tracking-tighter text-text-primary flex items-center gap-3">
				<RotateCw class="text-blue-500" size={28} />
				Restore Wizard
			</h1>
			<p class="text-[12px] font-bold uppercase tracking-widest text-text-secondary mt-1 opacity-80">
				Step {currentStep} of 4: {['Browse & Select', 'Insert Media', 'Swap Media', 'Finalize'][currentStep-1]}
			</p>
		</div>

		{#if currentStep === 1}
			<div class="flex gap-4 relative z-10">
				<Button variant="default" size="lg" class="px-8 h-12" disabled={restoreCart.length === 0} onclick={nextStep}>
					<Play size={20} class="mr-2" />
					Execute Restore
				</Button>
			</div>
		{/if}
	</header>

	{#if currentStep === 1}
		<div class="grid grid-cols-1 lg:grid-cols-3 gap-6 flex-1 min-h-0">
			<!-- LEFT: VIRTUAL FILESYSTEM BROWSER -->
			<div class="lg:col-span-2 flex flex-col min-h-0 relative">
				<div class="mb-2 flex items-center justify-between">
					<h3 class="text-sm font-bold uppercase tracking-widest text-text-secondary">Virtual Filesystem</h3>
					<span class="text-[10px] bg-white/5 px-2 py-1 rounded text-text-secondary font-mono">Indexing: {currentPath}</span>
				</div>
				{#if loading}
					<div class="absolute inset-0 bg-bg-primary/50 z-50 flex items-center justify-center top-8">
						<span class="text-text-secondary animate-pulse">Querying Index...</span>
					</div>
				{/if}
				<FileBrowser
					bind:currentPath
					files={indexedFiles}
					mode="index"
					onNavigate={(path) => currentPath = path}
					onToggleTrack={handleToggleSelect}
				/>
			</div>

			<!-- RIGHT: RESTORE CART -->
			<div class="flex flex-col gap-4 min-h-0">
				<div class="bg-bg-secondary border border-border-color rounded-xl p-6 flex flex-col h-full shadow-lg">
					<div class="flex items-center justify-between mb-4">
						<h3 class="text-lg font-black uppercase tracking-tight text-text-primary flex items-center gap-2">
							<Download size={20} class="text-blue-400" />
							Restore Cart
						</h3>
						<span class="bg-blue-500 text-white text-[10px] font-bold px-2 py-0.5 rounded-full">
							{restoreCart.length}
						</span>
					</div>

					<div class="flex-1 overflow-y-auto mb-4 pr-2">
						{#if restoreCart.length === 0}
							<div class="h-full flex flex-col items-center justify-center text-center opacity-20 py-12">
								<Search size={48} class="mb-4" />
								<p class="text-xs font-bold uppercase tracking-widest leading-relaxed">
									Your cart is empty.<br>Select files from the index.
								</p>
							</div>
						{:else}
							<div class="flex flex-col gap-2">
								{#each restoreCart as item}
									<div class="bg-bg-primary/50 border border-border-color/50 rounded-lg p-3 group transition-all hover:border-blue-500/30">
										<div class="flex justify-between items-start gap-2">
											<div class="min-w-0">
												<p class="text-[12px] font-bold text-text-primary truncate">{item.name}</p>
												<p class="text-[10px] text-text-secondary truncate mono opacity-50">{item.path}</p>
											</div>
											<button
												class="text-text-secondary hover:text-error-color transition-colors p-1"
												onclick={() => removeFromCart(item.path)}
											>
												<X size={14} />
											</button>
										</div>
										<div class="flex items-center gap-3 mt-2">
											<span class="text-[10px] mono text-text-secondary font-bold">{formatSize(item.size || 0)}</span>
											<div class="flex gap-1">
												{#each (item.media || []) as m}
													<span class="flex items-center gap-1 text-[9px] bg-blue-500/10 text-blue-400 px-1 rounded font-bold">
														<CassetteTape size={10} /> {m}
													</span>
												{/each}
											</div>
										</div>
									</div>
								{/each}
							</div>
						{/if}
					</div>

					<div class="pt-4 border-t border-border-color mt-auto">
						<div class="flex justify-between mb-2">
							<span class="text-[10px] font-bold uppercase tracking-widest text-text-secondary">Total Payload</span>
							<span class="text-sm font-black text-text-primary mono">{formatSize(totalSize)}</span>
						</div>
						<div class="flex justify-between">
							<span class="text-[10px] font-bold uppercase tracking-widest text-text-secondary">Media Required</span>
							<span class="text-sm font-black text-blue-400 mono">{requiredMedia.length} Tapes</span>
						</div>
					</div>
				</div>
			</div>
		</div>

	{:else if currentStep === 2}
		<!-- (Step 2-4 keep the same design as before but with Tailwind classes) -->
		<div in:fly={{ y: 20, duration: 300 }} class="flex flex-col items-center justify-center flex-1">
			<div class="bg-bg-secondary border-2 border-border-color rounded-2xl p-12 text-center shadow-2xl max-w-lg w-full">
				<div class="w-20 h-20 bg-blue-500/10 text-blue-500 rounded-full flex items-center justify-center mx-auto mb-6 animate-pulse">
					<CassetteTape size={48} />
				</div>
				<h2 class="text-2xl font-black text-text-primary uppercase tracking-tight mb-2">Insert Tape</h2>
				<p class="text-text-secondary mb-8">
					Please insert the following tape into the drive to begin extraction.
				</p>

				<div class="bg-bg-primary border-2 border-dashed border-border-color rounded-xl p-8 mb-8">
					<span class="text-4xl font-black text-text-primary mono">{requiredMedia[0] || 'BUP-00001'}</span>
				</div>

				<div class="flex items-center justify-center gap-3 text-text-secondary text-sm font-bold uppercase tracking-widest mb-10">
					<RotateCw size={18} class="animate-spin text-blue-500" />
					<span>Waiting for drive status...</span>
				</div>

				<div class="flex gap-4 justify-center">
					<Button variant="secondary" onclick={cancel}>
						<X size={18} class="mr-2" /> Cancel
					</Button>
					<Button variant="default" onclick={nextStep}>
						Simulate Load <ChevronRight size={18} class="ml-2" />
					</Button>
				</div>
			</div>
		</div>
	{/if}
</div>
@@ -0,0 +1,54 @@
<script lang="ts">
	import { Search, Save, ShieldAlert } from 'lucide-svelte';
	import { Button } from '$lib/components/ui/button';
	import { Card } from '$lib/components/ui/card';

	let globalExclusions = $state("*.tmp\nnode_modules/\n.DS_Store\nThumbs.db\nCache/\n");
</script>

<svelte:head>
	<title>Settings - TapeHoard</title>
</svelte:head>

<div class="flex justify-between items-center mb-8 bg-bg-secondary p-6 rounded-lg border border-border-color shadow-lg">
	<div>
		<h1 class="text-3xl font-bold tracking-tight text-text-primary">System Settings</h1>
		<p class="text-text-secondary mt-1">Configure global backup behavior and exclusion engines.</p>
	</div>
	<Button variant="default" size="lg" class="px-8 h-12">
		<Save size={20} class="mr-2" />
		Apply Settings
	</Button>
</div>

<div class="max-w-4xl mx-auto space-y-6">
	<Card class="p-8 shadow-xl border-border-color/60">
		<div class="flex items-center justify-between mb-6">
			<div class="flex items-center gap-3">
				<div class="p-2 bg-action-color/10 rounded-lg">
					<Search size={24} class="text-action-color" />
				</div>
				<div>
					<h3 class="text-lg font-bold text-text-primary uppercase tracking-tight">Global Exclusion Engine</h3>
					<p class="text-[12px] text-text-secondary font-medium">Define patterns that will be ignored across all backup sources.</p>
				</div>
			</div>
			<span class="text-[10px] font-mono text-text-secondary bg-bg-primary px-3 py-1 rounded-full border border-border-color">.gitignore syntax</span>
		</div>

		<div class="relative group">
			<textarea
				bind:value={globalExclusions}
				class="w-full h-[400px] bg-bg-primary/50 border border-border-color rounded-lg p-6 text-[14px] mono text-text-primary focus:ring-1 focus:ring-action-color focus:border-action-color focus:outline-none resize-none leading-relaxed transition-all"
				placeholder="e.g. *.tmp"
			></textarea>
		</div>

		<div class="mt-6 p-4 bg-orange-500/5 border border-dashed border-orange-500/30 rounded-lg flex gap-4 items-start">
			<ShieldAlert size={20} class="text-orange-500 shrink-0 mt-0.5" />
			<p class="text-[12px] text-text-secondary leading-normal font-medium">
				<strong>WARNING:</strong> Broad exclusion patterns (like <code>*</code> or <code>/</code>) can lead to empty backups. Patterns are evaluated recursively. Use <code>!</code> to explicitly include a sub-pattern within an excluded directory.
			</p>
		</div>
	</Card>
</div>
@@ -0,0 +1,227 @@
<script lang="ts">
	import { onMount } from 'svelte';
	import { Save, PlayCircle, FolderTree, FileCheck, Database, HardDrive, LayoutGrid, RotateCw, Search } from 'lucide-svelte';
	import { Button } from '$lib/components/ui/button';
	import { Card } from '$lib/components/ui/card';
	import FileBrowser from '$lib/components/file-browser/FileBrowser.svelte';
	import type { FileItem } from '$lib/types';
	import { browsePathSystemBrowseGet, trackBatchSystemTrackBatchPost } from '$lib/api/sdk.gen';
	import { toast } from "svelte-sonner";
	import { cn } from "$lib/utils";

	// Current directory state
	let currentPath = $state('/source_data');
	let files = $state<FileItem[]>([]);
	let loading = $state(false);
	let committing = $state(false);

	// Staging area for tracking changes: path -> desired tracked state
	let pendingChanges = $state<Map<string, boolean>>(new Map());

	async function loadFiles(path: string) {
		loading = true;
		try {
			const response = await browsePathSystemBrowseGet({
				query: { path }
			});
			if (response.data) {
				files = response.data.map(f => ({
					...f,
					type: f.type as 'file' | 'directory' | 'link'
				}));
			}
		} catch (error) {
			console.error("Failed to load files:", error);
			toast.error("Failed to load file system");
		} finally {
			loading = false;
		}
	}

	onMount(() => {
		loadFiles(currentPath);
	});

	$effect(() => {
		if (currentPath) {
			loadFiles(currentPath);
		}
	});

	function handleNavigate(path: string) {
		currentPath = path;
	}

	// Toggle track locally (staging)
	function handleToggleTrack(item: FileItem) {
		const path = item.path;
		const currentlyTracked = item.tracked;
		const stagedState = pendingChanges.get(path);

		if (stagedState !== undefined) {
			// If already staged, revert to original state if toggled back
			if (stagedState === !currentlyTracked) {
				pendingChanges.delete(path);
				pendingChanges = new Map(pendingChanges); // Trigger reactivity
			} else {
				pendingChanges.set(path, !stagedState);
				pendingChanges = new Map(pendingChanges);
			}
		} else {
			// Stage the flip
			pendingChanges.set(path, !currentlyTracked);
			pendingChanges = new Map(pendingChanges);
		}
	}

	// Computed files that merge original state with pending changes
	const displayFiles = $derived(files.map(f => {
		const pending = pendingChanges.get(f.path);
		return {
			...f,
			tracked: pending !== undefined ? pending : f.tracked
		};
	}));

	const hasChanges = $derived(pendingChanges.size > 0);

	async function commitChanges() {
		if (!hasChanges) return;

		committing = true;
		const tracks: string[] = [];
		const untracks: string[] = [];

		for (const [path, track] of pendingChanges.entries()) {
			if (track) tracks.push(path);
			else untracks.push(path);
		}

		const promise = (async () => {
			await trackBatchSystemTrackBatchPost({
				body: { tracks, untracks }
			});
			pendingChanges = new Map();
			await loadFiles(currentPath);
		})();

		toast.promise(promise, {
			loading: 'Committing changes...',
			success: 'Tracking updated successfully',
			error: 'Failed to update tracking'
		});

		try {
			await promise;
		} catch (e) {
			console.error(e);
		} finally {
			committing = false;
		}
	}
</script>

<svelte:head>
	<title>File Tracking - TapeHoard</title>
</svelte:head>

<div class="flex flex-col gap-6 h-full">
	<!-- INTEGRATED HEADER -->
	<header class="flex justify-between items-center bg-bg-secondary px-8 py-5 rounded-xl border border-border-color shadow-2xl relative overflow-hidden">
		<div class="absolute inset-0 bg-gradient-to-r from-action-color/5 to-transparent pointer-events-none"></div>
		<div class="relative z-10">
			<h1 class="text-2xl font-black uppercase tracking-tighter text-text-primary flex items-center gap-3">
				<FolderTree class="text-action-color" size={28} />
				File Tracking
			</h1>
			<p class="text-[12px] font-bold uppercase tracking-widest text-text-secondary mt-1 opacity-80">
				Data Provisioning & Indexing Configuration
			</p>
		</div>

		<div class="flex gap-4 relative z-10">
			<Button variant="secondary" size="lg" class="px-6 h-12 border-border-color font-bold uppercase tracking-widest text-[11px]">
				<PlayCircle size={20} class="mr-2 text-action-color" />
				Simulate Scan
			</Button>
			<Button
				variant="default"
				size="lg"
				class={cn(
					"px-8 h-12 font-bold uppercase tracking-widest text-[11px] transition-all",
					hasChanges ? "bg-action-color text-white shadow-lg shadow-action-color/20" : "opacity-50"
				)}
				onclick={commitChanges}
				disabled={!hasChanges || committing}
			>
				{#if committing}
					<RotateCw size={20} class="mr-2 animate-spin" />
				{:else}
					<Save size={20} class="mr-2" />
				{/if}
				Commit Changes ({pendingChanges.size})
			</Button>
		</div>
	</header>

	<!-- BACKUP DELTA STATS (FULL WIDTH) -->
	<div class="grid grid-cols-1 md:grid-cols-4 gap-6">
		<Card class="bg-gradient-to-br from-bg-secondary to-bg-tertiary border-border-color shadow-lg p-4 flex items-center gap-4">
			<div class="p-3 bg-action-color/10 rounded-lg text-action-color border border-action-color/20">
				<LayoutGrid size={24} />
			</div>
			<div>
				<span class="text-[10px] font-black uppercase tracking-widest text-text-secondary block">Tracked Items</span>
				<span class="text-xl font-black text-text-primary mono">14,203</span>
			</div>
		</Card>

		<Card class="bg-gradient-to-br from-bg-secondary to-bg-tertiary border-border-color shadow-lg p-4 flex items-center gap-4">
			<div class="p-3 bg-action-color/10 rounded-lg text-action-color border border-action-color/20">
				<Database size={24} />
			</div>
			<div>
				<span class="text-[10px] font-black uppercase tracking-widest text-text-secondary block">Est. Payload</span>
				<span class="text-xl font-black text-action-color mono">4.2 TB</span>
			</div>
		</Card>

		<Card class="bg-gradient-to-br from-bg-secondary to-bg-tertiary border-border-color shadow-lg p-4 flex items-center gap-4">
			<div class="p-3 bg-success-color/10 rounded-lg text-success-color border border-success-color/20">
				<HardDrive size={24} />
			</div>
			<div>
				<span class="text-[10px] font-black uppercase tracking-widest text-text-secondary block">Media Load</span>
				<span class="text-xl font-black text-success-color mono">2 x LTO-6</span>
			</div>
		</Card>

		<Card class="bg-gradient-to-br from-bg-secondary to-bg-tertiary border-border-color shadow-lg p-4 flex items-center gap-4">
			<div class="p-3 bg-orange-500/10 rounded-lg text-orange-500 border border-orange-500/20">
				<FileCheck size={24} />
			</div>
			<div>
				<span class="text-[10px] font-black uppercase tracking-widest text-text-secondary block">Last Simulation</span>
				<span class="text-xl font-black text-text-primary mono">2h ago</span>
			</div>
		</Card>
	</div>

	<!-- FULL WIDTH FILE BROWSER -->
	<div class="flex-1 min-h-0 relative">
		{#if loading}
			<div class="absolute inset-0 bg-bg-primary/50 z-50 flex items-center justify-center rounded-lg">
				<div class="flex flex-col items-center gap-3">
					<RotateCw size={32} class="animate-spin text-action-color" />
					<span class="text-[11px] font-black uppercase tracking-widest text-text-secondary">Accessing File System...</span>
				</div>
			</div>
		{/if}
		<FileBrowser
			bind:currentPath
			files={displayFiles}
			onNavigate={handleNavigate}
			onToggleTrack={handleToggleTrack}
		/>
	</div>
</div>
@@ -0,0 +1,18 @@
import adapter from '@sveltejs/adapter-static';
import { vitePreprocess } from '@sveltejs/vite-plugin-svelte';

/** @type {import('@sveltejs/kit').Config} */
const config = {
	preprocess: vitePreprocess(),
	kit: {
		adapter: adapter({
			fallback: 'index.html', // Enable SPA mode so FastAPI can serve the fallback
			pages: 'build',
			assets: 'build',
			precompress: false,
			strict: true
		})
	}
};

export default config;
@@ -0,0 +1,19 @@
{
	"extends": "./.svelte-kit/tsconfig.json",
	"compilerOptions": {
		"allowJs": true,
		"checkJs": true,
		"esModuleInterop": true,
		"forceConsistentCasingInFileNames": true,
		"resolveJsonModule": true,
		"skipLibCheck": true,
		"sourceMap": true,
		"strict": true
	},
	"include": [
		"src/**/*.d.ts",
		"src/**/*.ts",
		"src/**/*.js",
		"src/**/*.svelte"
	]
}
@@ -0,0 +1,10 @@
import { sveltekit } from '@sveltejs/kit/vite';
import tailwindcss from '@tailwindcss/vite';
import { defineConfig } from 'vite';

export default defineConfig({
	plugins: [
		tailwindcss(),
		sveltekit()
	]
});
@@ -0,0 +1,60 @@
# TapeHoard Justfile
# Install `just` to run these commands easily (e.g. `brew install just` or `cargo install just`)

set shell := ["bash", "-c"]

default:
    @just --list

# --- Development ---

# Run both the FastAPI backend and Svelte frontend in development mode
dev:
    @echo "Starting Backend (FastAPI) and Frontend (SvelteKit)..."
    @trap 'kill %1' SIGINT; \
    (cd backend && uv run uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload) & \
    (cd frontend && npm run dev)

# Run just the backend
backend:
    cd backend && uv run uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload

# Run just the frontend
frontend:
    cd frontend && npm run dev

# --- Quality Control ---

# Run all linters and type checkers (Ruff, ty, Svelte Check)
lint:
    @echo "Linting Python (Ruff)..."
    cd backend && uv run ruff check .
    @echo "Type checking Python (ty)..."
    cd backend && uv run ty check
    @echo "Type checking Svelte..."
    cd frontend && npm run check

# Auto-format all code (Ruff Format)
format:
    @echo "Formatting Python (Ruff)..."
    cd backend && uv run ruff format .

# --- Database ---

# Apply all pending Alembic database migrations
db-upgrade:
    @echo "Upgrading Database..."
    cd backend && uv run alembic upgrade head

# Autogenerate a new migration (Usage: just db-migrate "message")
db-migrate message:
    @echo "Generating Migration..."
    cd backend && uv run alembic revision --autogenerate -m "{{message}}"

# --- Code Generation ---

# Generate the TypeScript API client from the FastAPI OpenAPI spec
generate-client:
    @echo "Generating TypeScript API client..."
    # Ensure backend is running first: `just backend`
    cd frontend && npx @hey-api/openapi-ts -i http://localhost:8000/openapi.json -o src/lib/api -c @hey-api/client-fetch