# Compare commits

`v2025.10.0...cli-improv` — 141 commits:

```
0702864a11 487e66faa4 f71a3ea8fe 39fc9e81cd a4f96fca11 1f588d0498 4573edc1e1
5a184c1b83 7b73401dcf 8571440d84 bc37a5d666 a80f2ccf9a 1eaf276b75 e9559dfdfa
4c2e7b8609 e638cecf07 076058da4f f1bc4aa146 773c4a24a5 6cc659e5c4 e1580210dc
0a4ffde319 cc4d598af3 f5d11cb6d3 65e91aec6b ae943a5fd2 9e1a11de0b 52732e12ec
1127d7e3fa 7d4d228236 565e053ee8 26aba6034f 9a1d613034 3e4de7d3c4 b64b5ec0f8
510d1c7d17 ed13a62269 935d613959 adeaaccc45 d253093333 f265b7a572 68b2ff016f
a1c6295810 76ee3fa61b 7fef35ce0a 654af09951 484dcfade0 fda18c5434 a8176d6e9e
957d8d9d46 5f18bf25e2 66942eaf2c 38796b1833 49ffa6fc45 1f56ba2eb6 f98a70ecb4
2984eb40c9 cc5d4742f0 5b8e4b98a0 8637c90a21 b88c5e71a0 1899d512ab 7c31718f5e
8f1463e5d0 0dc8807808 f24a159b8a 0b91d3aaff 431dc1c896 bc8277b56b 0afed185d9
55cee00601 b41a8e04cb eff4519d91 c4ce458f79 f02ae35634 c2f068970b eec2d6bc38
efa22e470e c00d2e981f 9c45254952 d031ff231a f056894ddb 1b0315165f bd7e840a57
8969748c3c 4e15ac10a6 47a3d44888 eb10910d20 6ba83d424d beb47a6b6a 1893b8f8dd
7a5bca7aae 9a75bc2ae7 65514e3882 9ddaafb79f de47ee19ec ea730d0184 fe706998d4
99209e088f 3eb29ff2fe b759003c83 6cba38ac89 ba8f85baaf 9970d5fa6f d550b42ca3
2e1f0cb53f eead422ada b5753da3b7 ae2f2459e9 306e6f358a 822d52a57e e665ce04df
e4828e1b17 42143249a2 72a7e6963d 494e9efb64 9fe077f598 a6eca1cf2e 31edd1013f
28e9657ea5 ff084a224a bbcae34575 2a5587c128 c41e173a63 2b43407ddf 4d75b8ef06
aa79fb05f9 fe01796536 6654d6c346 4c8f768624 47c5ef1464 2bf7cf5eeb f2be52bfec
ef80216ca1 3bcc0b8356 ebcdee9be0 873abe69a1 5fe64f8a22 33afafd890 ac7de993ba
1f8fa0f8c3
```
## .claude-context.md (new file, +72)

````markdown
# Claude Context: Detaching Tauri from Yaak

## Goal

Make Yaak runnable as a standalone CLI without Tauri as a dependency. The core Rust crates in `crates/` should be usable independently, while Tauri-specific code lives in `crates-tauri/`.

## Project Structure

```
crates/          # Core crates - should NOT depend on Tauri
crates-tauri/    # Tauri-specific crates (yaak-app, yaak-tauri-utils, etc.)
crates-cli/      # CLI crate (yaak-cli)
```

## Completed Work

### 1. Folder Restructure

- Moved Tauri-dependent app code to `crates-tauri/yaak-app/`
- Created `crates-tauri/yaak-tauri-utils/` for shared Tauri utilities (window traits, api_client, error handling)
- Created `crates-cli/yaak-cli/` for the standalone CLI

### 2. Decoupled Crates (no longer depend on Tauri)

- **yaak-models**: Uses the `init_standalone()` pattern for CLI database access
- **yaak-http**: Removed the Tauri plugin; HttpConnectionManager is initialized in yaak-app setup
- **yaak-common**: Only contains Tauri-free utilities (serde, platform)
- **yaak-crypto**: Removed the Tauri plugin; EncryptionManager is initialized in yaak-app setup; commands moved to yaak-app
- **yaak-grpc**: Replaced AppHandle with a GrpcConfig struct; uses tokio::process::Command instead of the Tauri sidecar

### 3. CLI Implementation

- Basic CLI at `crates-cli/yaak-cli/src/main.rs`
- Commands: workspaces, requests, send (by ID), get (ad-hoc URL), create
- Uses the same database as the Tauri app via `yaak_models::init_standalone()`

## Remaining Work

### Crates Still Depending on Tauri (in `crates/`)

1. **yaak-git** (3 files) - Moderate complexity
2. **yaak-plugins** (13 files) - **Hardest** - deeply integrated with Tauri for plugin-to-window communication
3. **yaak-sync** (4 files) - Moderate complexity
4. **yaak-ws** (5 files) - Moderate complexity

### Pattern for Decoupling

1. Remove the Tauri plugin `init()` function from the crate
2. Move commands to `yaak-app/src/commands.rs` or keep them inline in `lib.rs`
3. Move extension traits (e.g., `SomethingManagerExt`) to yaak-app or yaak-tauri-utils
4. Initialize managers in yaak-app's `.setup()` block
5. Remove `tauri` from the Cargo.toml dependencies
6. Update `crates-tauri/yaak-app/capabilities/default.json` to remove the plugin permission
7. Replace `tauri::async_runtime::block_on` with `tokio::runtime::Handle::current().block_on()`

## Key Files

- `crates-tauri/yaak-app/src/lib.rs` - Main Tauri app; the setup block initializes managers
- `crates-tauri/yaak-app/src/commands.rs` - Migrated Tauri commands
- `crates-tauri/yaak-app/src/models_ext.rs` - Database plugin and extension traits
- `crates-tauri/yaak-tauri-utils/src/window.rs` - WorkspaceWindowTrait for window state
- `crates/yaak-models/src/lib.rs` - Contains `init_standalone()` for CLI usage

## Git Branch

Working on the `detach-tauri` branch.

## Recent Commits

```
c40cff40 Remove Tauri dependencies from yaak-crypto and yaak-grpc
df495f1d Move Tauri utilities from yaak-common to yaak-tauri-utils
481e0273 Remove Tauri dependencies from yaak-http and yaak-common
10568ac3 Add HTTP request sending to yaak-cli
bcb7d600 Add yaak-cli stub with basic database access
e718a5f1 Refactor models_ext to use init_standalone from yaak-models
```

## Testing

- Run `cargo check -p <crate>` to verify a crate builds without Tauri
- Run `npm run app-dev` to verify the Tauri app still works
- Run `cargo run -p yaak-cli -- --help` to test the CLI
````
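Step 5 of the decoupling pattern above (dropping `tauri` from a crate's Cargo.toml) can be spot-checked with a quick grep. This is a rough sketch against a stand-in manifest; the `has_tauri_dep` helper is hypothetical, and the grep is only a heuristic (it would miss, for example, target-specific dependency tables).

```shell
# Hypothetical helper: does this Cargo.toml still declare a tauri dependency?
# Heuristic only: matches a `tauri = ...` line.
has_tauri_dep() {
  grep -Eq '^[[:space:]]*tauri[[:space:]]*=' "$1"
}

# Demo against a stand-in manifest for a decoupled crate
tmp=$(mktemp)
printf '[dependencies]\nserde = "1"\ntokio = "1"\n' > "$tmp"
if has_tauri_dep "$tmp"; then
  echo "still depends on tauri"
else
  echo "tauri-free"
fi
rm -f "$tmp"
```

Running this prints `tauri-free`; pointed at a manifest that still lists `tauri`, it takes the other branch.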
````diff
@@ -1,35 +1,46 @@
 ---
 description: Review a PR in a new worktree
-allowed-tools: Bash(git worktree:*), Bash(gh pr:*)
+allowed-tools: Bash(git worktree:*), Bash(gh pr:*), Bash(git branch:*)
 ---
 
-Review a GitHub pull request in a new git worktree.
+Check out a GitHub pull request for review.
 
 ## Usage
 
 ```
-/review-pr <PR_NUMBER>
+/check-out-pr <PR_NUMBER>
 ```
 
 ## What to do
 
-1. List all open pull requests and ask the user to select one
+1. If no PR number is provided, list all open pull requests and ask the user to select one
 2. Get PR information using `gh pr view <PR_NUMBER> --json number,headRefName`
-3. Extract the branch name from the PR
-4. Create a new worktree at `../yaak-worktrees/pr-<PR_NUMBER>` using `git worktree add` with a timeout of at least 300000ms (5 minutes) since the post-checkout hook runs a bootstrap script
-5. Checkout the PR branch in the new worktree using `gh pr checkout <PR_NUMBER>`
-6. The post-checkout hook will automatically:
+3. **Ask the user** whether they want to:
+   - **A) Check out in current directory** — simple `gh pr checkout <PR_NUMBER>`
+   - **B) Create a new worktree** — isolated copy at `../yaak-worktrees/pr-<PR_NUMBER>`
+4. Follow the appropriate path below
+
+## Option A: Check out in current directory
+
+1. Run `gh pr checkout <PR_NUMBER>`
+2. Inform the user which branch they're now on
+
+## Option B: Create a new worktree
+
+1. Create a new worktree at `../yaak-worktrees/pr-<PR_NUMBER>` using `git worktree add` with a timeout of at least 300000ms (5 minutes) since the post-checkout hook runs a bootstrap script
+2. Checkout the PR branch in the new worktree using `gh pr checkout <PR_NUMBER>`
+3. The post-checkout hook will automatically:
    - Create `.env.local` with unique ports
    - Copy editor config folders
    - Run `npm install && npm run bootstrap`
-7. Inform the user:
+4. Inform the user:
    - Where the worktree was created
    - What ports were assigned
    - How to access it (cd command)
    - How to run the dev server
    - How to remove the worktree when done
 
-## Example Output
+### Example worktree output
 
 ```
 Created worktree for PR #123 at ../yaak-worktrees/pr-123
````
````diff
@@ -37,3 +37,13 @@ The skill generates markdown-formatted release notes following this structure:
 
 **IMPORTANT**: Always add blank lines around the markdown code fence and output the markdown code block last
 **IMPORTANT**: PRs by `@gschier` should not mention the @username
+
+## After Generating Release Notes
+
+After outputting the release notes, ask the user if they would like to create a draft GitHub release with these notes. If they confirm, create the release using:
+
+```bash
+gh release create <tag> --draft --prerelease --title "Release <version>" --notes '<release notes>'
+```
+
+**IMPORTANT**: The release title format is "Release XXXX" where XXXX is the version WITHOUT the `v` prefix. For example, tag `v2026.2.1-beta.1` gets title "Release 2026.2.1-beta.1".
````
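The title rule added above boils down to stripping the tag's leading `v`. As a shell sketch (the variable names are illustrative):

```shell
# Derive the release title from a tag by stripping the leading "v"
tag="v2026.2.1-beta.1"
title="Release ${tag#v}"
echo "$title"   # prints: Release 2026.2.1-beta.1
```

`${tag#v}` removes the shortest leading match of `v`, so plain stable tags like `v2025.10.0` work the same way.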
````diff
@@ -1,22 +1,27 @@
 # Project Rules
 
 ## General Development
 
 - **NEVER** commit or push without explicit confirmation
 
 ## Build and Lint
 
 - **ALWAYS** run `npm run lint` after modifying TypeScript or JavaScript files
 - Run `npm run bootstrap` after changing plugin runtime or MCP server code
 
 ## Plugin System
 
 ### Backend Constraints
 
 - Always use `UpdateSource::Plugin` when calling database methods from plugin events
 - Never send timestamps (`createdAt`, `updatedAt`) from TypeScript - Rust backend controls these
 - Backend uses `NaiveDateTime` (no timezone) so avoid sending ISO timestamp strings
 
 ### MCP Server
 
 - MCP server has **no active window context** - cannot call `window.workspaceId()`
 - Get workspace ID from `workspaceCtx.yaak.workspace.list()` instead
 
 ## Rust Type Generation
-- Run `cd src-tauri && cargo test --package yaak-plugins` to regenerate TypeScript bindings after modifying Rust event types
+
+- Run `cargo test --package yaak-plugins` (and for other crates) to regenerate TypeScript bindings after modifying Rust event types
````
## .codex/skills/release-check-out-pr/SKILL.md (new file, +46)

````markdown
---
name: release-check-out-pr
description: Check out a GitHub pull request for review in this repo, either in the current directory or in a new isolated worktree at ../yaak-worktrees/pr-<PR_NUMBER>. Use when asked to run or replace the old Claude check-out-pr command.
---

# Check Out PR

Check out a PR by number and let the user choose between current-directory checkout and isolated worktree checkout.

## Workflow

1. Confirm the `gh` CLI is available.
2. If no PR number is provided, list open PRs (`gh pr list`) and ask the user to choose one.
3. Read PR metadata:
   - `gh pr view <PR_NUMBER> --json number,headRefName`
4. Ask the user to choose:
   - Option A: check out in the current directory
   - Option B: create a new worktree at `../yaak-worktrees/pr-<PR_NUMBER>`

## Option A: Current Directory

1. Run:
   - `gh pr checkout <PR_NUMBER>`
2. Report the checked-out branch.

## Option B: New Worktree

1. Use path:
   - `../yaak-worktrees/pr-<PR_NUMBER>`
2. Create the worktree with a timeout of at least 5 minutes because checkout hooks run bootstrap.
3. In the new worktree, run:
   - `gh pr checkout <PR_NUMBER>`
4. Report:
   - Worktree path
   - Assigned ports from `.env.local` if present
   - How to start work:
     - `cd ../yaak-worktrees/pr-<PR_NUMBER>`
     - `npm run app-dev`
   - How to remove when done:
     - `git worktree remove ../yaak-worktrees/pr-<PR_NUMBER>`

## Error Handling

- If the PR does not exist, show a clear error.
- If the worktree already exists, ask whether to reuse it or remove/recreate it.
- If `gh` is missing, instruct the user to install/authenticate it.
````
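Option B's lifecycle reduces to three commands for a given PR number. A sketch that only builds the command strings (the PR number 123 is an illustrative placeholder):

```shell
pr=123
wt="../yaak-worktrees/pr-${pr}"

# The three commands the skill would run, in order
echo "git worktree add $wt"
echo "gh pr checkout $pr"        # run inside the new worktree
echo "git worktree remove $wt"   # cleanup when done
```

Deriving the path from the PR number keeps the worktree layout consistent with the `../yaak-worktrees/<NAME>` convention used elsewhere in this diff.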
## .codex/skills/release-generate-release-notes/SKILL.md (new file, +48)

````markdown
---
name: release-generate-release-notes
description: Generate Yaak release notes from git history and PR metadata, including feedback links and full changelog compare links. Use when asked to run or replace the old Claude generate-release-notes command.
---

# Generate Release Notes

Generate formatted markdown release notes for a Yaak tag.

## Workflow

1. Determine the target tag.
2. Determine the previous comparable tag:
   - Beta tag: compare against the previous beta (if the root version is the same) or stable tag.
   - Stable tag: compare against the previous stable tag.
3. Collect commits in range:
   - `git log --oneline <prev_tag>..<target_tag>`
4. For linked PRs, fetch metadata:
   - `gh pr view <PR_NUMBER> --json number,title,body,author,url`
5. Extract useful details:
   - Feedback URLs (`feedback.yaak.app`)
   - Plugin install links or other notable context
6. Format notes using Yaak style:
   - Changelog badge at top
   - Bulleted items with PR links where available
   - Feedback links where available
   - Full changelog compare link at bottom

## Formatting Rules

- Wrap the final notes in a markdown code fence.
- Keep a blank line before and after the code fence.
- Output the markdown code block last.
- Do not append `by @gschier` for PRs authored by `@gschier`.

## Release Creation Prompt

After producing notes, ask whether to create a draft GitHub release.

If confirmed and the release does not yet exist, run:

`gh release create <tag> --draft --prerelease --title "Release <version_without_v>" --notes '<release notes>'`

If a draft release for the tag already exists, update it instead:

`gh release edit <tag> --title "Release <version_without_v>" --notes-file <path_to_notes>`

Use title format `Release <version_without_v>`, e.g. `v2026.2.1-beta.1` -> `Release 2026.2.1-beta.1`.
````
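Step 2 of the workflow hinges on classifying the target tag as beta or stable. A minimal sketch of that check (the `is_beta` helper is invented here, and real tags might need stricter parsing):

```shell
# Hypothetical classifier: does the tag look like a beta pre-release?
is_beta() {
  case "$1" in
    *-beta.*) return 0 ;;
    *) return 1 ;;
  esac
}

for tag in v2026.2.1-beta.1 v2026.2.1; do
  if is_beta "$tag"; then
    echo "$tag: compare against previous beta (same root version) or stable"
  else
    echo "$tag: compare against previous stable"
  fi
done
```

The same predicate could gate the `--prerelease` flag when creating the draft release.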
## .codex/skills/worktree-management/SKILL.md (new file, +37)

````markdown
---
name: worktree-management
description: Manage Yaak git worktrees using the standard ../yaak-worktrees/<NAME> layout, including creation, removal, and expected automatic setup behavior and port assignments.
---

# Worktree Management

Use the Yaak-standard worktree path layout and lifecycle commands.

## Path Convention

Always create worktrees under:

`../yaak-worktrees/<NAME>`

Examples:

- `git worktree add ../yaak-worktrees/feature-auth`
- `git worktree add ../yaak-worktrees/bugfix-login`
- `git worktree add ../yaak-worktrees/refactor-api`

## Automatic Setup After Checkout

Project git hooks automatically:

1. Create `.env.local` with unique `YAAK_DEV_PORT` and `YAAK_PLUGIN_MCP_SERVER_PORT`
2. Copy gitignored editor config folders
3. Run `npm install && npm run bootstrap`

## Remove Worktree

`git worktree remove ../yaak-worktrees/<NAME>`

## Port Pattern

- Main worktree: Vite `1420`, MCP `64343`
- First extra worktree: `1421`, `64344`
- Second extra worktree: `1422`, `64345`
- Continue incrementally for additional worktrees
````
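The port pattern above is just base+N for the Nth worktree (the main worktree is N=0). A sketch, with the function name invented here:

```shell
# Compute the dev and MCP ports for worktree number n (0 = main worktree)
worktree_ports() {
  n=$1
  echo "YAAK_DEV_PORT=$((1420 + n)) YAAK_PLUGIN_MCP_SERVER_PORT=$((64343 + n))"
}

worktree_ports 0   # main worktree
worktree_ports 2   # second extra worktree
```

For `n=2` this yields `1422` and `64345`, matching the table above.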
## .gitattributes

````diff
@@ -1,7 +1,7 @@
-src-tauri/vendored/**/* linguist-generated=true
-src-tauri/gen/schemas/**/* linguist-generated=true
+crates-tauri/yaak-app/vendored/**/* linguist-generated=true
+crates-tauri/yaak-app/gen/schemas/**/* linguist-generated=true
 **/bindings/* linguist-generated=true
-src-tauri/yaak-templates/pkg/* linguist-generated=true
+crates/yaak-templates/pkg/* linguist-generated=true
 
 # Ensure consistent line endings for test files that check exact content
-src-tauri/yaak-http/tests/test.txt text eol=lf
+crates/yaak-http/tests/test.txt text eol=lf
````
## .github/pull_request_template.md (new file, +18)

````markdown
## Summary

<!-- Describe the bug and the fix in 1-3 sentences. -->

## Submission

- [ ] This PR is a bug fix or small-scope improvement.
- [ ] If this PR is not a bug fix or small-scope improvement, I linked an approved feedback item below.
- [ ] I have read and followed [`CONTRIBUTING.md`](CONTRIBUTING.md).
- [ ] I tested this change locally.
- [ ] I added or updated tests when reasonable.

Approved feedback item (required if not a bug fix or small-scope improvement):
<!-- https://yaak.app/feedback/... -->

## Related

<!-- Link related issues, discussions, or feedback items. -->
````
## .github/workflows/ci.yml

````diff
@@ -18,14 +18,13 @@ jobs:
       - uses: dtolnay/rust-toolchain@stable
       - uses: Swatinem/rust-cache@v2
         with:
-          workspaces: 'src-tauri'
           shared-key: ci
           cache-on-failure: true
 
       - run: npm ci
+      - run: npm run bootstrap
       - run: npm run lint
       - name: Run JS Tests
         run: npm test
       - name: Run Rust Tests
         run: cargo test --all
-        working-directory: src-tauri
````
## .github/workflows/flathub.yml (new file, +52)

````yaml
name: Update Flathub
on:
  release:
    types: [published]

permissions:
  contents: read

jobs:
  update-flathub:
    name: Update Flathub manifest
    runs-on: ubuntu-latest
    # Only run for stable releases (skip betas/pre-releases)
    if: ${{ !github.event.release.prerelease }}
    steps:
      - name: Checkout app repo
        uses: actions/checkout@v4

      - name: Checkout Flathub repo
        uses: actions/checkout@v4
        with:
          repository: flathub/app.yaak.Yaak
          token: ${{ secrets.FLATHUB_TOKEN }}
          path: flathub-repo

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "22"

      - name: Install source generators
        run: |
          pip install flatpak-node-generator tomlkit aiohttp
          git clone --depth 1 https://github.com/flatpak/flatpak-builder-tools flatpak/flatpak-builder-tools

      - name: Run update-manifest.sh
        run: bash flatpak/update-manifest.sh "${{ github.event.release.tag_name }}" flathub-repo

      - name: Commit and push to Flathub
        working-directory: flathub-repo
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add -A
          git diff --cached --quiet && echo "No changes to commit" && exit 0
          git commit -m "Update to ${{ github.event.release.tag_name }}"
          git push
````
## .github/workflows/release.yml

````diff
@@ -1,7 +1,7 @@
 name: Generate Artifacts
 on:
   push:
-    tags: [ v* ]
+    tags: [v*]
 
 jobs:
   build-artifacts:
@@ -13,37 +13,37 @@ jobs:
       fail-fast: false
       matrix:
         include:
-          - platform: 'macos-latest' # for Arm-based Macs (M1 and above).
-            args: '--target aarch64-apple-darwin'
-            yaak_arch: 'arm64'
-            os: 'macos'
-            targets: 'aarch64-apple-darwin'
-          - platform: 'macos-latest' # for Intel-based Macs.
-            args: '--target x86_64-apple-darwin'
-            yaak_arch: 'x64'
-            os: 'macos'
-            targets: 'x86_64-apple-darwin'
-          - platform: 'ubuntu-22.04'
-            args: ''
-            yaak_arch: 'x64'
-            os: 'ubuntu'
-            targets: ''
-          - platform: 'ubuntu-22.04-arm'
-            args: ''
-            yaak_arch: 'arm64'
-            os: 'ubuntu'
-            targets: ''
-          - platform: 'windows-latest'
-            args: ''
-            yaak_arch: 'x64'
-            os: 'windows'
-            targets: ''
+          - platform: "macos-latest" # for Arm-based Macs (M1 and above).
+            args: "--target aarch64-apple-darwin"
+            yaak_arch: "arm64"
+            os: "macos"
+            targets: "aarch64-apple-darwin"
+          - platform: "macos-latest" # for Intel-based Macs.
+            args: "--target x86_64-apple-darwin"
+            yaak_arch: "x64"
+            os: "macos"
+            targets: "x86_64-apple-darwin"
+          - platform: "ubuntu-22.04"
+            args: ""
+            yaak_arch: "x64"
+            os: "ubuntu"
+            targets: ""
+          - platform: "ubuntu-22.04-arm"
+            args: ""
+            yaak_arch: "arm64"
+            os: "ubuntu"
+            targets: ""
+          - platform: "windows-latest"
+            args: ""
+            yaak_arch: "x64"
+            os: "windows"
+            targets: ""
           # Windows ARM64
-          - platform: 'windows-latest'
-            args: '--target aarch64-pc-windows-msvc'
-            yaak_arch: 'arm64'
-            os: 'windows'
-            targets: 'aarch64-pc-windows-msvc'
+          - platform: "windows-latest"
+            args: "--target aarch64-pc-windows-msvc"
+            yaak_arch: "arm64"
+            os: "windows"
+            targets: "aarch64-pc-windows-msvc"
     runs-on: ${{ matrix.platform }}
     timeout-minutes: 40
     steps:
@@ -60,7 +60,6 @@ jobs:
 
       - uses: Swatinem/rust-cache@v2
         with:
-          workspaces: 'src-tauri'
           shared-key: ci
           cache-on-failure: true
 
@@ -89,18 +88,43 @@ jobs:
           & $exe --version
 
       - run: npm ci
+      - run: npm run bootstrap
+        env:
+          YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
       - run: npm run lint
       - name: Run JS Tests
        run: npm test
       - name: Run Rust Tests
         run: cargo test --all
-        working-directory: src-tauri
 
       - name: Set version
         run: npm run replace-version
         env:
          YAAK_VERSION: ${{ github.ref_name }}
 
+      - name: Sign vendored binaries (macOS only)
+        if: matrix.os == 'macos'
+        env:
+          APPLE_CERTIFICATE: ${{ secrets.APPLE_CERTIFICATE }}
+          APPLE_CERTIFICATE_PASSWORD: ${{ secrets.APPLE_CERTIFICATE_PASSWORD }}
+          APPLE_SIGNING_IDENTITY: ${{ secrets.APPLE_SIGNING_IDENTITY }}
+          KEYCHAIN_PASSWORD: ${{ secrets.KEYCHAIN_PASSWORD }}
+        run: |
+          # Create keychain
+          KEYCHAIN_PATH=$RUNNER_TEMP/app-signing.keychain-db
+          security create-keychain -p "$KEYCHAIN_PASSWORD" $KEYCHAIN_PATH
+          security set-keychain-settings -lut 21600 $KEYCHAIN_PATH
+          security unlock-keychain -p "$KEYCHAIN_PASSWORD" $KEYCHAIN_PATH
+
+          # Import certificate
+          echo "$APPLE_CERTIFICATE" | base64 --decode > certificate.p12
+          security import certificate.p12 -P "$APPLE_CERTIFICATE_PASSWORD" -A -t cert -f pkcs12 -k $KEYCHAIN_PATH
+          security list-keychain -d user -s $KEYCHAIN_PATH
+
+          # Sign vendored binaries with hardened runtime and their specific entitlements
+          codesign --force --options runtime --entitlements crates-tauri/yaak-app/macos/entitlements.yaakprotoc.plist --sign "$APPLE_SIGNING_IDENTITY" crates-tauri/yaak-app/vendored/protoc/yaakprotoc || true
+          codesign --force --options runtime --entitlements crates-tauri/yaak-app/macos/entitlements.yaaknode.plist --sign "$APPLE_SIGNING_IDENTITY" crates-tauri/yaak-app/vendored/node/yaaknode || true
+
       - uses: tauri-apps/tauri-action@v0
         env:
           YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
@@ -123,9 +147,33 @@ jobs:
           AZURE_CLIENT_SECRET: ${{ matrix.os == 'windows' && secrets.AZURE_CLIENT_SECRET }}
           AZURE_TENANT_ID: ${{ matrix.os == 'windows' && secrets.AZURE_TENANT_ID }}
         with:
-          tagName: 'v__VERSION__'
-          releaseName: 'Release __VERSION__'
-          releaseBody: '[Changelog __VERSION__](https://yaak.app/blog/__VERSION__)'
+          tagName: "v__VERSION__"
+          releaseName: "Release __VERSION__"
+          releaseBody: "[Changelog __VERSION__](https://yaak.app/blog/__VERSION__)"
           releaseDraft: true
           prerelease: true
-          args: '${{ matrix.args }} --config ./src-tauri/tauri.release.conf.json'
+          args: "${{ matrix.args }} --config ./crates-tauri/yaak-app/tauri.release.conf.json"
+
+      # Build a per-machine NSIS installer for enterprise deployment (PDQ, SCCM, Intune)
+      - name: Build and upload machine-wide installer (Windows only)
+        if: matrix.os == 'windows'
+        shell: pwsh
+        env:
+          YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+          AZURE_CLIENT_ID: ${{ secrets.AZURE_CLIENT_ID }}
+          AZURE_CLIENT_SECRET: ${{ secrets.AZURE_CLIENT_SECRET }}
+          AZURE_TENANT_ID: ${{ secrets.AZURE_TENANT_ID }}
+          TAURI_SIGNING_PRIVATE_KEY: ${{ secrets.TAURI_PRIVATE_KEY }}
+          TAURI_SIGNING_PRIVATE_KEY_PASSWORD: ${{ secrets.TAURI_KEY_PASSWORD }}
+        run: |
+          Get-ChildItem -Recurse -Path target -File -Filter "*.exe.sig" | Remove-Item -Force
+          npx tauri bundle ${{ matrix.args }} --bundles nsis --config ./crates-tauri/yaak-app/tauri.release.conf.json --config '{"bundle":{"createUpdaterArtifacts":true,"windows":{"nsis":{"installMode":"perMachine"}}}}'
+          $setup = Get-ChildItem -Recurse -Path target -Filter "*setup*.exe" | Select-Object -First 1
+          $setupSig = "$($setup.FullName).sig"
+          $dest = $setup.FullName -replace '-setup\.exe$', '-setup-machine.exe'
+          $destSig = "$dest.sig"
+          Copy-Item $setup.FullName $dest
+          Copy-Item $setupSig $destSig
+          gh release upload "${{ github.ref_name }}" "$dest" --clobber
+          gh release upload "${{ github.ref_name }}" "$destSig" --clobber
````
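The PowerShell rename in the Windows step maps `*-setup.exe` to `*-setup-machine.exe`. The same suffix rule in POSIX shell (the filename here is a made-up example, not a real artifact name):

```shell
# Derive the machine-wide installer name from the per-user setup name
setup="Yaak_2025.10.0_x64-setup.exe"
machine="${setup%-setup.exe}-setup-machine.exe"
echo "$machine"   # prints: Yaak_2025.10.0_x64-setup-machine.exe
```

`${setup%-setup.exe}` strips the suffix before re-appending the machine-wide variant, mirroring the `-replace '-setup\.exe$', '-setup-machine.exe'` expression in the workflow.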
## .gitignore

````diff
@@ -37,3 +37,20 @@ tmp
 .zed
 codebook.toml
 target
+
+# Per-worktree Tauri config (generated by post-checkout hook)
+crates-tauri/yaak-app/tauri.worktree.conf.json
+
+# Tauri auto-generated permission files
+**/permissions/autogenerated
+**/permissions/schemas
+
+# Flatpak build artifacts
+flatpak-repo/
+.flatpak-builder/
+flatpak/flatpak-builder-tools/
+flatpak/cargo-sources.json
+flatpak/node-sources.json
+
+# Local Codex desktop env state
+.codex/environments/environment.toml
````
|||||||
`CONTRIBUTING.md` (new file, 16 lines)

```markdown
# Contributing to Yaak

Yaak accepts community pull requests for:

- Bug fixes
- Small-scope improvements directly tied to existing behavior

Pull requests that introduce broad new features, major redesigns, or large refactors are out of scope unless explicitly approved first.

## Approval for Non-Bugfix Changes

If your PR is not a bug fix or small-scope improvement, include a link to the approved [feedback item](https://yaak.app/feedback) where contribution approval was explicitly stated.

## Development Setup

For local setup and development workflows, see [`DEVELOPMENT.md`](DEVELOPMENT.md).
```

`src-tauri/Cargo.lock → Cargo.lock` (716 changes, generated)
`Cargo.toml` (new file, 74 lines)

```toml
[workspace]
resolver = "2"
members = [
    "crates/yaak",
    # Shared crates (no Tauri dependency)
    "crates/yaak-core",
    "crates/yaak-common",
    "crates/yaak-crypto",
    "crates/yaak-git",
    "crates/yaak-grpc",
    "crates/yaak-http",
    "crates/yaak-models",
    "crates/yaak-plugins",
    "crates/yaak-sse",
    "crates/yaak-sync",
    "crates/yaak-templates",
    "crates/yaak-tls",
    "crates/yaak-ws",
    "crates/yaak-api",
    # CLI crates
    "crates-cli/yaak-cli",
    # Tauri-specific crates
    "crates-tauri/yaak-app",
    "crates-tauri/yaak-fonts",
    "crates-tauri/yaak-license",
    "crates-tauri/yaak-mac-window",
    "crates-tauri/yaak-tauri-utils",
]

[workspace.dependencies]
chrono = "0.4.42"
hex = "0.4.3"
keyring = "3.6.3"
log = "0.4.29"
reqwest = "0.12.20"
rustls = { version = "0.23.34", default-features = false }
rustls-platform-verifier = "0.6.2"
schemars = { version = "0.8.22", features = ["chrono"] }
serde = "1.0.228"
serde_json = "1.0.145"
sha2 = "0.10.9"
tauri = "2.9.5"
tauri-plugin = "2.5.2"
tauri-plugin-dialog = "2.4.2"
tauri-plugin-shell = "2.3.3"
thiserror = "2.0.17"
tokio = "1.48.0"
ts-rs = "11.1.0"

# Internal crates - shared
yaak-core = { path = "crates/yaak-core" }
yaak = { path = "crates/yaak" }
yaak-common = { path = "crates/yaak-common" }
yaak-crypto = { path = "crates/yaak-crypto" }
yaak-git = { path = "crates/yaak-git" }
yaak-grpc = { path = "crates/yaak-grpc" }
yaak-http = { path = "crates/yaak-http" }
yaak-models = { path = "crates/yaak-models" }
yaak-plugins = { path = "crates/yaak-plugins" }
yaak-sse = { path = "crates/yaak-sse" }
yaak-sync = { path = "crates/yaak-sync" }
yaak-templates = { path = "crates/yaak-templates" }
yaak-tls = { path = "crates/yaak-tls" }
yaak-ws = { path = "crates/yaak-ws" }
yaak-api = { path = "crates/yaak-api" }

# Internal crates - Tauri-specific
yaak-fonts = { path = "crates-tauri/yaak-fonts" }
yaak-license = { path = "crates-tauri/yaak-license" }
yaak-mac-window = { path = "crates-tauri/yaak-mac-window" }
yaak-tauri-utils = { path = "crates-tauri/yaak-tauri-utils" }

[profile.release]
strip = false
```
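With `[workspace.dependencies]` in place, member crates inherit versions from the root manifest instead of repeating them. A hypothetical member manifest (illustrative crate name, not part of this workspace) might look like:

```toml
# Hypothetical member crate; versions come from the root [workspace.dependencies]
[package]
name = "yaak-example"
version = "0.1.0"
edition = "2024"

[dependencies]
serde = { workspace = true }
serde_json = { workspace = true }
yaak-models = { workspace = true }
```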
`README.md` (12 changes)

```diff
@@ -1,6 +1,6 @@
 <p align="center">
 <a href="https://github.com/JamesIves/github-sponsors-readme-action">
-<img width="200px" src="https://github.com/mountain-loop/yaak/raw/main/src-tauri/icons/icon.png">
+<img width="200px" src="https://github.com/mountain-loop/yaak/raw/main/crates-tauri/yaak-app/icons/icon.png">
 </a>
 </p>
 
@@ -22,7 +22,7 @@
 <!-- sponsors-premium --><a href="https://github.com/MVST-Solutions"><img src="https://github.com/MVST-Solutions.png" width="80px" alt="User avatar: MVST-Solutions" /></a> <a href="https://github.com/dharsanb"><img src="https://github.com/dharsanb.png" width="80px" alt="User avatar: dharsanb" /></a> <a href="https://github.com/railwayapp"><img src="https://github.com/railwayapp.png" width="80px" alt="User avatar: railwayapp" /></a> <a href="https://github.com/caseyamcl"><img src="https://github.com/caseyamcl.png" width="80px" alt="User avatar: caseyamcl" /></a> <a href="https://github.com/bytebase"><img src="https://github.com/bytebase.png" width="80px" alt="User avatar: bytebase" /></a> <a href="https://github.com/"><img src="https://raw.githubusercontent.com/JamesIves/github-sponsors-readme-action/dev/.github/assets/placeholder.png" width="80px" alt="User avatar: " /></a> <!-- sponsors-premium -->
 </p>
 <p align="center">
-<!-- sponsors-base --><a href="https://github.com/seanwash"><img src="https://github.com/seanwash.png" width="50px" alt="User avatar: seanwash" /></a> <a href="https://github.com/jerath"><img src="https://github.com/jerath.png" width="50px" alt="User avatar: jerath" /></a> <a href="https://github.com/itsa-sh"><img src="https://github.com/itsa-sh.png" width="50px" alt="User avatar: itsa-sh" /></a> <a href="https://github.com/dmmulroy"><img src="https://github.com/dmmulroy.png" width="50px" alt="User avatar: dmmulroy" /></a> <a href="https://github.com/timcole"><img src="https://github.com/timcole.png" width="50px" alt="User avatar: timcole" /></a> <a href="https://github.com/VLZH"><img src="https://github.com/VLZH.png" width="50px" alt="User avatar: VLZH" /></a> <a href="https://github.com/terasaka2k"><img src="https://github.com/terasaka2k.png" width="50px" alt="User avatar: terasaka2k" /></a> <a href="https://github.com/andriyor"><img src="https://github.com/andriyor.png" width="50px" alt="User avatar: andriyor" /></a> <a href="https://github.com/majudhu"><img src="https://github.com/majudhu.png" width="50px" alt="User avatar: majudhu" /></a> <a href="https://github.com/axelrindle"><img src="https://github.com/axelrindle.png" width="50px" alt="User avatar: axelrindle" /></a> <a href="https://github.com/jirizverina"><img src="https://github.com/jirizverina.png" width="50px" alt="User avatar: jirizverina" /></a> <a href="https://github.com/chip-well"><img src="https://github.com/chip-well.png" width="50px" alt="User avatar: chip-well" /></a> <a href="https://github.com/GRAYAH"><img src="https://github.com/GRAYAH.png" width="50px" alt="User avatar: GRAYAH" /></a> <!-- sponsors-base -->
+<!-- sponsors-base --><a href="https://github.com/seanwash"><img src="https://github.com/seanwash.png" width="50px" alt="User avatar: seanwash" /></a> <a href="https://github.com/jerath"><img src="https://github.com/jerath.png" width="50px" alt="User avatar: jerath" /></a> <a href="https://github.com/itsa-sh"><img src="https://github.com/itsa-sh.png" width="50px" alt="User avatar: itsa-sh" /></a> <a href="https://github.com/dmmulroy"><img src="https://github.com/dmmulroy.png" width="50px" alt="User avatar: dmmulroy" /></a> <a href="https://github.com/timcole"><img src="https://github.com/timcole.png" width="50px" alt="User avatar: timcole" /></a> <a href="https://github.com/VLZH"><img src="https://github.com/VLZH.png" width="50px" alt="User avatar: VLZH" /></a> <a href="https://github.com/terasaka2k"><img src="https://github.com/terasaka2k.png" width="50px" alt="User avatar: terasaka2k" /></a> <a href="https://github.com/andriyor"><img src="https://github.com/andriyor.png" width="50px" alt="User avatar: andriyor" /></a> <a href="https://github.com/majudhu"><img src="https://github.com/majudhu.png" width="50px" alt="User avatar: majudhu" /></a> <a href="https://github.com/axelrindle"><img src="https://github.com/axelrindle.png" width="50px" alt="User avatar: axelrindle" /></a> <a href="https://github.com/jirizverina"><img src="https://github.com/jirizverina.png" width="50px" alt="User avatar: jirizverina" /></a> <a href="https://github.com/chip-well"><img src="https://github.com/chip-well.png" width="50px" alt="User avatar: chip-well" /></a> <a href="https://github.com/GRAYAH"><img src="https://github.com/GRAYAH.png" width="50px" alt="User avatar: GRAYAH" /></a> <a href="https://github.com/flashblaze"><img src="https://github.com/flashblaze.png" width="50px" alt="User avatar: flashblaze" /></a> <!-- sponsors-base -->
 </p>
 
 
@@ -58,13 +58,15 @@ Built with [Tauri](https://tauri.app), Rust, and React, it’s fast, lightweight
 
 ## Contribution Policy
 
-Yaak is open source but only accepting contributions for bug fixes. To get started,
-visit [`DEVELOPMENT.md`](DEVELOPMENT.md) for tips on setting up your environment.
+> [!IMPORTANT]
+> Community PRs are currently limited to bug fixes and small-scope improvements.
+> If your PR is out of scope, link an approved feedback item from [yaak.app/feedback](https://yaak.app/feedback).
+> See [`CONTRIBUTING.md`](CONTRIBUTING.md) for policy details and [`DEVELOPMENT.md`](DEVELOPMENT.md) for local setup.
 
 ## Useful Resources
 
 - [Feedback and Bug Reports](https://feedback.yaak.app)
-- [Documentation](https://feedback.yaak.app/help)
+- [Documentation](https://yaak.app/docs)
 - [Yaak vs Postman](https://yaak.app/alternatives/postman)
 - [Yaak vs Bruno](https://yaak.app/alternatives/bruno)
 - [Yaak vs Insomnia](https://yaak.app/alternatives/insomnia)
```
```diff
@@ -38,14 +38,17 @@
 "!**/node_modules",
 "!**/dist",
 "!**/build",
+"!target",
 "!scripts",
-"!src-tauri",
+"!crates",
+"!crates-tauri",
 "!src-web/tailwind.config.cjs",
 "!src-web/postcss.config.cjs",
 "!src-web/vite.config.ts",
 "!src-web/routeTree.gen.ts",
 "!packages/plugin-runtime-types/lib",
-"!**/bindings"
+"!**/bindings",
+"!flatpak"
 ]
 }
 }
```
`crates-cli/yaak-cli/Cargo.toml` (new file, 31 lines)

```toml
[package]
name = "yaak-cli"
version = "0.1.0"
edition = "2024"
publish = false

[[bin]]
name = "yaakcli"
path = "src/main.rs"

[dependencies]
clap = { version = "4", features = ["derive"] }
dirs = "6"
env_logger = "0.11"
futures = "0.3"
log = { workspace = true }
schemars = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
tokio = { workspace = true, features = ["rt-multi-thread", "macros"] }
yaak = { workspace = true }
yaak-crypto = { workspace = true }
yaak-http = { workspace = true }
yaak-models = { workspace = true }
yaak-plugins = { workspace = true }
yaak-templates = { workspace = true }

[dev-dependencies]
assert_cmd = "2"
predicates = "3"
tempfile = "3"
```
`crates-cli/yaak-cli/README.md` (new file, 87 lines)

````markdown
# yaak-cli

Command-line interface for Yaak.

## Command Overview

Current top-level commands:

```text
yaakcli send <request_id>
yaakcli workspace list
yaakcli workspace show <workspace_id>
yaakcli workspace create --name <name>
yaakcli workspace create --json '{"name":"My Workspace"}'
yaakcli workspace create '{"name":"My Workspace"}'
yaakcli workspace update --json '{"id":"wk_abc","description":"Updated"}'
yaakcli workspace delete <workspace_id> [--yes]
yaakcli request list <workspace_id>
yaakcli request show <request_id>
yaakcli request send <request_id>
yaakcli request create <workspace_id> --name <name> --url <url> [--method GET]
yaakcli request create --json '{"workspaceId":"wk_abc","name":"Users","url":"https://api.example.com/users"}'
yaakcli request create '{"workspaceId":"wk_abc","name":"Users","url":"https://api.example.com/users"}'
yaakcli request update --json '{"id":"rq_abc","name":"Users v2"}'
yaakcli request delete <request_id> [--yes]
yaakcli folder list <workspace_id>
yaakcli folder show <folder_id>
yaakcli folder create <workspace_id> --name <name>
yaakcli folder create --json '{"workspaceId":"wk_abc","name":"Auth"}'
yaakcli folder create '{"workspaceId":"wk_abc","name":"Auth"}'
yaakcli folder update --json '{"id":"fl_abc","name":"Auth v2"}'
yaakcli folder delete <folder_id> [--yes]
yaakcli environment list <workspace_id>
yaakcli environment show <environment_id>
yaakcli environment create <workspace_id> --name <name>
yaakcli environment create --json '{"workspaceId":"wk_abc","name":"Production"}'
yaakcli environment create '{"workspaceId":"wk_abc","name":"Production"}'
yaakcli environment update --json '{"id":"ev_abc","color":"#00ff00"}'
yaakcli environment delete <environment_id> [--yes]
```

Global options:

- `--data-dir <path>`: use a custom data directory
- `-e, --environment <id>`: environment to use during request rendering/sending
- `-v, --verbose`: verbose logging and send output

Notes:

- `send` is currently a shortcut for sending an HTTP request ID.
- `delete` commands prompt for confirmation unless `--yes` is provided.
- In non-interactive mode, `delete` commands require `--yes`.
- `create` and `update` commands support `--json` and positional JSON shorthand.
- `update` uses JSON Merge Patch semantics (RFC 7386) for partial updates.

## Examples

```bash
yaakcli workspace list
yaakcli workspace create --name "My Workspace"
yaakcli workspace show wk_abc
yaakcli workspace update --json '{"id":"wk_abc","description":"Team workspace"}'
yaakcli request list wk_abc
yaakcli request show rq_abc
yaakcli request create wk_abc --name "Users" --url "https://api.example.com/users"
yaakcli request update --json '{"id":"rq_abc","name":"Users v2"}'
yaakcli request send rq_abc -e ev_abc
yaakcli request delete rq_abc --yes
yaakcli folder create wk_abc --name "Auth"
yaakcli folder update --json '{"id":"fl_abc","name":"Auth v2"}'
yaakcli environment create wk_abc --name "Production"
yaakcli environment update --json '{"id":"ev_abc","color":"#00ff00"}'
```

## Roadmap

Planned command expansion (request schema and polymorphic send) is tracked in `PLAN.md`.

When command behavior changes, update this README and verify with:

```bash
cargo run -q -p yaak-cli -- --help
cargo run -q -p yaak-cli -- request --help
cargo run -q -p yaak-cli -- workspace --help
cargo run -q -p yaak-cli -- folder --help
cargo run -q -p yaak-cli -- environment --help
```
````
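The JSON Merge Patch semantics that the CLI's `update` commands advertise (RFC 7386) can be illustrated with a minimal sketch over a toy JSON type. The real CLI's `apply_merge_patch` operates on its model types via serde; everything below is an illustration of the RFC, not the actual implementation. The rule: an object patch merges member by member, an explicit `null` deletes a member, and any non-object patch value replaces the target wholesale.

```rust
use std::collections::BTreeMap;

// Toy JSON value, just enough to demonstrate RFC 7386 semantics.
#[derive(Clone, Debug, PartialEq)]
enum Json {
    Null,
    Str(String),
    Obj(BTreeMap<String, Json>),
}

// RFC 7386: if the patch is an object, merge member-by-member
// (null deletes a member); any other patch value replaces the target.
fn merge_patch(target: &Json, patch: &Json) -> Json {
    match patch {
        Json::Obj(patch_map) => {
            let mut out = match target {
                Json::Obj(m) => m.clone(),
                _ => BTreeMap::new(),
            };
            for (key, value) in patch_map {
                if let Json::Null = value {
                    out.remove(key); // null in a patch deletes the member
                } else {
                    let base = out.get(key).cloned().unwrap_or(Json::Null);
                    out.insert(key.clone(), merge_patch(&base, value));
                }
            }
            Json::Obj(out)
        }
        other => other.clone(), // non-object patch replaces wholesale
    }
}
```

Under these semantics, `yaakcli environment update --json '{"id":"ev_abc","color":null}'` would clear `color` while leaving other fields untouched (assuming the CLI treats `null` per the RFC).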
`crates-cli/yaak-cli/src/cli.rs` (new file, 307 lines)

```rust
use clap::{Args, Parser, Subcommand, ValueEnum};
use std::path::PathBuf;

#[derive(Parser)]
#[command(name = "yaakcli")]
#[command(about = "Yaak CLI - API client from the command line")]
pub struct Cli {
    /// Use a custom data directory
    #[arg(long, global = true)]
    pub data_dir: Option<PathBuf>,

    /// Environment ID to use for variable substitution
    #[arg(long, short, global = true)]
    pub environment: Option<String>,

    /// Enable verbose logging
    #[arg(long, short, global = true)]
    pub verbose: bool,

    #[command(subcommand)]
    pub command: Commands,
}

#[derive(Subcommand)]
pub enum Commands {
    /// Send a request, folder, or workspace by ID
    Send(SendArgs),

    /// Workspace commands
    Workspace(WorkspaceArgs),

    /// Request commands
    Request(RequestArgs),

    /// Folder commands
    Folder(FolderArgs),

    /// Environment commands
    Environment(EnvironmentArgs),
}

#[derive(Args)]
pub struct SendArgs {
    /// Request, folder, or workspace ID
    pub id: String,

    /// Execute requests sequentially (default)
    #[arg(long, conflicts_with = "parallel")]
    pub sequential: bool,

    /// Execute requests in parallel
    #[arg(long, conflicts_with = "sequential")]
    pub parallel: bool,

    /// Stop on first request failure when sending folders/workspaces
    #[arg(long, conflicts_with = "parallel")]
    pub fail_fast: bool,
}

#[derive(Args)]
pub struct WorkspaceArgs {
    #[command(subcommand)]
    pub command: WorkspaceCommands,
}

#[derive(Subcommand)]
pub enum WorkspaceCommands {
    /// List all workspaces
    List,

    /// Show a workspace as JSON
    Show {
        /// Workspace ID
        workspace_id: String,
    },

    /// Create a workspace
    Create {
        /// Workspace name
        #[arg(short, long)]
        name: Option<String>,

        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Update a workspace
    Update {
        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Delete a workspace
    Delete {
        /// Workspace ID
        workspace_id: String,

        /// Skip confirmation prompt
        #[arg(short, long)]
        yes: bool,
    },
}

#[derive(Args)]
pub struct RequestArgs {
    #[command(subcommand)]
    pub command: RequestCommands,
}

#[derive(Subcommand)]
pub enum RequestCommands {
    /// List requests in a workspace
    List {
        /// Workspace ID
        workspace_id: String,
    },

    /// Show a request as JSON
    Show {
        /// Request ID
        request_id: String,
    },

    /// Send a request by ID
    Send {
        /// Request ID
        request_id: String,
    },

    /// Output JSON schema for request create/update payloads
    Schema {
        #[arg(value_enum)]
        request_type: RequestSchemaType,
    },

    /// Create a new HTTP request
    Create {
        /// Workspace ID (or positional JSON payload shorthand)
        workspace_id: Option<String>,

        /// Request name
        #[arg(short, long)]
        name: Option<String>,

        /// HTTP method
        #[arg(short, long)]
        method: Option<String>,

        /// URL
        #[arg(short, long)]
        url: Option<String>,

        /// JSON payload
        #[arg(long)]
        json: Option<String>,
    },

    /// Update an HTTP request
    Update {
        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Delete a request
    Delete {
        /// Request ID
        request_id: String,

        /// Skip confirmation prompt
        #[arg(short, long)]
        yes: bool,
    },
}

#[derive(Clone, Copy, Debug, ValueEnum)]
pub enum RequestSchemaType {
    Http,
    Grpc,
    Websocket,
}

#[derive(Args)]
pub struct FolderArgs {
    #[command(subcommand)]
    pub command: FolderCommands,
}

#[derive(Subcommand)]
pub enum FolderCommands {
    /// List folders in a workspace
    List {
        /// Workspace ID
        workspace_id: String,
    },

    /// Show a folder as JSON
    Show {
        /// Folder ID
        folder_id: String,
    },

    /// Create a folder
    Create {
        /// Workspace ID (or positional JSON payload shorthand)
        workspace_id: Option<String>,

        /// Folder name
        #[arg(short, long)]
        name: Option<String>,

        /// JSON payload
        #[arg(long)]
        json: Option<String>,
    },

    /// Update a folder
    Update {
        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Delete a folder
    Delete {
        /// Folder ID
        folder_id: String,

        /// Skip confirmation prompt
        #[arg(short, long)]
        yes: bool,
    },
}

#[derive(Args)]
pub struct EnvironmentArgs {
    #[command(subcommand)]
    pub command: EnvironmentCommands,
}

#[derive(Subcommand)]
pub enum EnvironmentCommands {
    /// List environments in a workspace
    List {
        /// Workspace ID
        workspace_id: String,
    },

    /// Show an environment as JSON
    Show {
        /// Environment ID
        environment_id: String,
    },

    /// Create an environment
    Create {
        /// Workspace ID (or positional JSON payload shorthand)
        workspace_id: Option<String>,

        /// Environment name
        #[arg(short, long)]
        name: Option<String>,

        /// JSON payload
        #[arg(long)]
        json: Option<String>,
    },

    /// Update an environment
    Update {
        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Delete an environment
    Delete {
        /// Environment ID
        environment_id: String,

        /// Skip confirmation prompt
        #[arg(short, long)]
        yes: bool,
    },
}
```
159
crates-cli/yaak-cli/src/commands/environment.rs
Normal file
@@ -0,0 +1,159 @@
|
|||||||
|
use crate::cli::{EnvironmentArgs, EnvironmentCommands};
|
||||||
|
use crate::context::CliContext;
|
||||||
|
use crate::utils::confirm::confirm_delete;
|
||||||
|
use crate::utils::json::{
|
||||||
|
apply_merge_patch, is_json_shorthand, parse_optional_json, parse_required_json, require_id,
|
||||||
|
validate_create_id,
|
||||||
|
};
|
||||||
|
use yaak_models::models::Environment;
|
||||||
|
use yaak_models::util::UpdateSource;
|
||||||
|
|
||||||
|
type CommandResult<T = ()> = std::result::Result<T, String>;
|
||||||
|
|
||||||
|
pub fn run(ctx: &CliContext, args: EnvironmentArgs) -> i32 {
|
||||||
|
let result = match args.command {
|
||||||
|
EnvironmentCommands::List { workspace_id } => list(ctx, &workspace_id),
|
||||||
|
EnvironmentCommands::Show { environment_id } => show(ctx, &environment_id),
|
||||||
|
EnvironmentCommands::Create { workspace_id, name, json } => {
|
||||||
|
create(ctx, workspace_id, name, json)
|
||||||
|
}
|
||||||
|
EnvironmentCommands::Update { json, json_input } => update(ctx, json, json_input),
|
||||||
|
EnvironmentCommands::Delete { environment_id, yes } => delete(ctx, &environment_id, yes),
|
||||||
|
};
|
||||||
|
|
||||||
|
match result {
|
||||||
|
Ok(()) => 0,
|
||||||
|
Err(error) => {
|
||||||
|
eprintln!("Error: {error}");
|
||||||
|
1
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
fn list(ctx: &CliContext, workspace_id: &str) -> CommandResult {
|
||||||
|
let environments = ctx
|
||||||
|
.db()
|
||||||
|
.list_environments_ensure_base(workspace_id)
|
||||||
|
.map_err(|e| format!("Failed to list environments: {e}"))?;
|
||||||
|
|
||||||
|
if environments.is_empty() {
|
||||||
|
println!("No environments found in workspace {}", workspace_id);
|
||||||
|
} else {
|
||||||
|
for environment in environments {
|
||||||
|
println!("{} - {} ({})", environment.id, environment.name, environment.parent_model);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
|
|
||||||
|
fn show(ctx: &CliContext, environment_id: &str) -> CommandResult {
|
||||||
|
let environment = ctx
|
||||||
|
.db()
|
||||||
|
.get_environment(environment_id)
|
||||||
|
.map_err(|e| format!("Failed to get environment: {e}"))?;
|
||||||
|
let output = serde_json::to_string_pretty(&environment)
|
||||||
|
.map_err(|e| format!("Failed to serialize environment: {e}"))?;
|
||||||
|
println!("{output}");
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
|
|
||||||
|
fn create(
|
||||||
|
ctx: &CliContext,
|
||||||
|
workspace_id: Option<String>,
|
||||||
|
name: Option<String>,
|
||||||
|
json: Option<String>,
|
||||||
|
) -> CommandResult {
|
||||||
|
if json.is_some() && workspace_id.as_deref().is_some_and(|v| !is_json_shorthand(v)) {
|
||||||
|
return Err(
|
||||||
|
"environment create cannot combine workspace_id with --json payload".to_string()
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
let payload = parse_optional_json(
|
||||||
|
json,
|
||||||
|
workspace_id.clone().filter(|v| is_json_shorthand(v)),
|
||||||
|
"environment create",
|
||||||
|
)?;
|
||||||
|
|
||||||
|
if let Some(payload) = payload {
|
||||||
|
if name.is_some() {
|
||||||
|
return Err("environment create cannot combine --name with JSON payload".to_string());
|
||||||
|
}
|
||||||
|
|
||||||
|
validate_create_id(&payload, "environment")?;
|
||||||
|
let mut environment: Environment = serde_json::from_value(payload)
|
||||||
|
            .map_err(|e| format!("Failed to parse environment create JSON: {e}"))?;

        if environment.workspace_id.is_empty() {
            return Err("environment create JSON requires non-empty \"workspaceId\"".to_string());
        }

        if environment.parent_model.is_empty() {
            environment.parent_model = "environment".to_string();
        }

        let created = ctx
            .db()
            .upsert_environment(&environment, &UpdateSource::Sync)
            .map_err(|e| format!("Failed to create environment: {e}"))?;

        println!("Created environment: {}", created.id);
        return Ok(());
    }

    let workspace_id = workspace_id.ok_or_else(|| {
        "environment create requires workspace_id unless JSON payload is provided".to_string()
    })?;
    let name = name.ok_or_else(|| {
        "environment create requires --name unless JSON payload is provided".to_string()
    })?;

    let environment = Environment {
        workspace_id,
        name,
        parent_model: "environment".to_string(),
        ..Default::default()
    };

    let created = ctx
        .db()
        .upsert_environment(&environment, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to create environment: {e}"))?;

    println!("Created environment: {}", created.id);
    Ok(())
}

fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) -> CommandResult {
    let patch = parse_required_json(json, json_input, "environment update")?;
    let id = require_id(&patch, "environment update")?;

    let existing = ctx
        .db()
        .get_environment(&id)
        .map_err(|e| format!("Failed to get environment for update: {e}"))?;
    let updated = apply_merge_patch(&existing, &patch, &id, "environment update")?;

    let saved = ctx
        .db()
        .upsert_environment(&updated, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to update environment: {e}"))?;

    println!("Updated environment: {}", saved.id);
    Ok(())
}

fn delete(ctx: &CliContext, environment_id: &str, yes: bool) -> CommandResult {
    if !yes && !confirm_delete("environment", environment_id) {
        println!("Aborted");
        return Ok(());
    }

    let deleted = ctx
        .db()
        .delete_environment_by_id(environment_id, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to delete environment: {e}"))?;

    println!("Deleted environment: {}", deleted.id);
    Ok(())
}
crates-cli/yaak-cli/src/commands/folder.rs (141 lines, new file) @@ -0,0 +1,141 @@
use crate::cli::{FolderArgs, FolderCommands};
use crate::context::CliContext;
use crate::utils::confirm::confirm_delete;
use crate::utils::json::{
    apply_merge_patch, is_json_shorthand, parse_optional_json, parse_required_json, require_id,
    validate_create_id,
};
use yaak_models::models::Folder;
use yaak_models::util::UpdateSource;

type CommandResult<T = ()> = std::result::Result<T, String>;

pub fn run(ctx: &CliContext, args: FolderArgs) -> i32 {
    let result = match args.command {
        FolderCommands::List { workspace_id } => list(ctx, &workspace_id),
        FolderCommands::Show { folder_id } => show(ctx, &folder_id),
        FolderCommands::Create { workspace_id, name, json } => {
            create(ctx, workspace_id, name, json)
        }
        FolderCommands::Update { json, json_input } => update(ctx, json, json_input),
        FolderCommands::Delete { folder_id, yes } => delete(ctx, &folder_id, yes),
    };

    match result {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

fn list(ctx: &CliContext, workspace_id: &str) -> CommandResult {
    let folders =
        ctx.db().list_folders(workspace_id).map_err(|e| format!("Failed to list folders: {e}"))?;
    if folders.is_empty() {
        println!("No folders found in workspace {}", workspace_id);
    } else {
        for folder in folders {
            println!("{} - {}", folder.id, folder.name);
        }
    }
    Ok(())
}

fn show(ctx: &CliContext, folder_id: &str) -> CommandResult {
    let folder =
        ctx.db().get_folder(folder_id).map_err(|e| format!("Failed to get folder: {e}"))?;
    let output = serde_json::to_string_pretty(&folder)
        .map_err(|e| format!("Failed to serialize folder: {e}"))?;
    println!("{output}");
    Ok(())
}

fn create(
    ctx: &CliContext,
    workspace_id: Option<String>,
    name: Option<String>,
    json: Option<String>,
) -> CommandResult {
    if json.is_some() && workspace_id.as_deref().is_some_and(|v| !is_json_shorthand(v)) {
        return Err("folder create cannot combine workspace_id with --json payload".to_string());
    }

    let payload = parse_optional_json(
        json,
        workspace_id.clone().filter(|v| is_json_shorthand(v)),
        "folder create",
    )?;

    if let Some(payload) = payload {
        if name.is_some() {
            return Err("folder create cannot combine --name with JSON payload".to_string());
        }

        validate_create_id(&payload, "folder")?;
        let folder: Folder = serde_json::from_value(payload)
            .map_err(|e| format!("Failed to parse folder create JSON: {e}"))?;

        if folder.workspace_id.is_empty() {
            return Err("folder create JSON requires non-empty \"workspaceId\"".to_string());
        }

        let created = ctx
            .db()
            .upsert_folder(&folder, &UpdateSource::Sync)
            .map_err(|e| format!("Failed to create folder: {e}"))?;

        println!("Created folder: {}", created.id);
        return Ok(());
    }

    let workspace_id = workspace_id.ok_or_else(|| {
        "folder create requires workspace_id unless JSON payload is provided".to_string()
    })?;
    let name = name.ok_or_else(|| {
        "folder create requires --name unless JSON payload is provided".to_string()
    })?;

    let folder = Folder { workspace_id, name, ..Default::default() };

    let created = ctx
        .db()
        .upsert_folder(&folder, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to create folder: {e}"))?;

    println!("Created folder: {}", created.id);
    Ok(())
}

fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) -> CommandResult {
    let patch = parse_required_json(json, json_input, "folder update")?;
    let id = require_id(&patch, "folder update")?;

    let existing =
        ctx.db().get_folder(&id).map_err(|e| format!("Failed to get folder for update: {e}"))?;
    let updated = apply_merge_patch(&existing, &patch, &id, "folder update")?;

    let saved = ctx
        .db()
        .upsert_folder(&updated, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to update folder: {e}"))?;

    println!("Updated folder: {}", saved.id);
    Ok(())
}

fn delete(ctx: &CliContext, folder_id: &str, yes: bool) -> CommandResult {
    if !yes && !confirm_delete("folder", folder_id) {
        println!("Aborted");
        return Ok(());
    }

    let deleted = ctx
        .db()
        .delete_folder_by_id(folder_id, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to delete folder: {e}"))?;

    println!("Deleted folder: {}", deleted.id);
    Ok(())
}
crates-cli/yaak-cli/src/commands/mod.rs (5 lines, new file) @@ -0,0 +1,5 @@
pub mod environment;
pub mod folder;
pub mod request;
pub mod send;
pub mod workspace;
crates-cli/yaak-cli/src/commands/request.rs (485 lines, new file) @@ -0,0 +1,485 @@
use crate::cli::{RequestArgs, RequestCommands, RequestSchemaType};
use crate::context::CliContext;
use crate::utils::confirm::confirm_delete;
use crate::utils::json::{
    apply_merge_patch, is_json_shorthand, parse_optional_json, parse_required_json, require_id,
    validate_create_id,
};
use schemars::schema_for;
use serde_json::{Map, Value, json};
use std::collections::HashMap;
use tokio::sync::mpsc;
use yaak::send::{SendHttpRequestByIdWithPluginsParams, send_http_request_by_id_with_plugins};
use yaak_models::models::{GrpcRequest, HttpRequest, WebsocketRequest};
use yaak_models::queries::any_request::AnyRequest;
use yaak_models::util::UpdateSource;
use yaak_plugins::events::{FormInput, FormInputBase, JsonPrimitive, PluginContext};

type CommandResult<T = ()> = std::result::Result<T, String>;

pub async fn run(
    ctx: &CliContext,
    args: RequestArgs,
    environment: Option<&str>,
    verbose: bool,
) -> i32 {
    let result = match args.command {
        RequestCommands::List { workspace_id } => list(ctx, &workspace_id),
        RequestCommands::Show { request_id } => show(ctx, &request_id),
        RequestCommands::Send { request_id } => {
            return match send_request_by_id(ctx, &request_id, environment, verbose).await {
                Ok(()) => 0,
                Err(error) => {
                    eprintln!("Error: {error}");
                    1
                }
            };
        }
        RequestCommands::Schema { request_type } => {
            return match schema(ctx, request_type).await {
                Ok(()) => 0,
                Err(error) => {
                    eprintln!("Error: {error}");
                    1
                }
            };
        }
        RequestCommands::Create { workspace_id, name, method, url, json } => {
            create(ctx, workspace_id, name, method, url, json)
        }
        RequestCommands::Update { json, json_input } => update(ctx, json, json_input),
        RequestCommands::Delete { request_id, yes } => delete(ctx, &request_id, yes),
    };

    match result {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

fn list(ctx: &CliContext, workspace_id: &str) -> CommandResult {
    let requests = ctx
        .db()
        .list_http_requests(workspace_id)
        .map_err(|e| format!("Failed to list requests: {e}"))?;
    if requests.is_empty() {
        println!("No requests found in workspace {}", workspace_id);
    } else {
        for request in requests {
            println!("{} - {} {}", request.id, request.method, request.name);
        }
    }
    Ok(())
}

async fn schema(ctx: &CliContext, request_type: RequestSchemaType) -> CommandResult {
    let mut schema = match request_type {
        RequestSchemaType::Http => serde_json::to_value(schema_for!(HttpRequest))
            .map_err(|e| format!("Failed to serialize HTTP request schema: {e}"))?,
        RequestSchemaType::Grpc => serde_json::to_value(schema_for!(GrpcRequest))
            .map_err(|e| format!("Failed to serialize gRPC request schema: {e}"))?,
        RequestSchemaType::Websocket => serde_json::to_value(schema_for!(WebsocketRequest))
            .map_err(|e| format!("Failed to serialize WebSocket request schema: {e}"))?,
    };

    if let Err(error) = merge_auth_schema_from_plugins(ctx, &mut schema).await {
        eprintln!("Warning: Failed to enrich authentication schema from plugins: {error}");
    }

    let output = serde_json::to_string_pretty(&schema)
        .map_err(|e| format!("Failed to format schema JSON: {e}"))?;
    println!("{output}");
    Ok(())
}

async fn merge_auth_schema_from_plugins(
    ctx: &CliContext,
    schema: &mut Value,
) -> Result<(), String> {
    let plugin_context = PluginContext::new_empty();
    let plugin_manager = ctx.plugin_manager();
    let summaries = plugin_manager
        .get_http_authentication_summaries(&plugin_context)
        .await
        .map_err(|e| e.to_string())?;

    let mut auth_variants = Vec::new();
    for (_, summary) in summaries {
        let config = match plugin_manager
            .get_http_authentication_config(
                &plugin_context,
                &summary.name,
                HashMap::<String, JsonPrimitive>::new(),
                "yaakcli_request_schema",
            )
            .await
        {
            Ok(config) => config,
            Err(error) => {
                eprintln!(
                    "Warning: Failed to load auth config for strategy '{}': {}",
                    summary.name, error
                );
                continue;
            }
        };

        auth_variants.push(auth_variant_schema(&summary.name, &summary.label, &config.args));
    }

    let Some(properties) = schema.get_mut("properties").and_then(Value::as_object_mut) else {
        return Ok(());
    };

    let Some(auth_schema) = properties.get_mut("authentication") else {
        return Ok(());
    };

    if !auth_variants.is_empty() {
        let mut one_of = vec![auth_schema.clone()];
        one_of.extend(auth_variants);
        *auth_schema = json!({ "oneOf": one_of });
    }

    Ok(())
}

fn auth_variant_schema(auth_name: &str, auth_label: &str, args: &[FormInput]) -> Value {
    let mut properties = Map::new();
    let mut required = Vec::new();
    for input in args {
        add_input_schema(input, &mut properties, &mut required);
    }

    let mut schema = json!({
        "title": auth_label,
        "description": format!("Authentication values for strategy '{}'", auth_name),
        "type": "object",
        "properties": properties,
        "additionalProperties": true
    });

    if !required.is_empty() {
        schema["required"] = json!(required);
    }

    schema
}

fn add_input_schema(
    input: &FormInput,
    properties: &mut Map<String, Value>,
    required: &mut Vec<String>,
) {
    match input {
        FormInput::Text(v) => add_base_schema(
            &v.base,
            json!({
                "type": "string",
                "writeOnly": v.password.unwrap_or(false),
            }),
            properties,
            required,
        ),
        FormInput::Editor(v) => add_base_schema(
            &v.base,
            json!({
                "type": "string",
                "x-editorLanguage": v.language.clone(),
            }),
            properties,
            required,
        ),
        FormInput::Select(v) => {
            let options: Vec<Value> =
                v.options.iter().map(|o| Value::String(o.value.clone())).collect();
            add_base_schema(
                &v.base,
                json!({
                    "type": "string",
                    "enum": options,
                }),
                properties,
                required,
            );
        }
        FormInput::Checkbox(v) => {
            add_base_schema(&v.base, json!({ "type": "boolean" }), properties, required);
        }
        FormInput::File(v) => {
            if v.multiple.unwrap_or(false) {
                add_base_schema(
                    &v.base,
                    json!({
                        "type": "array",
                        "items": { "type": "string" },
                    }),
                    properties,
                    required,
                );
            } else {
                add_base_schema(&v.base, json!({ "type": "string" }), properties, required);
            }
        }
        FormInput::HttpRequest(v) => {
            add_base_schema(&v.base, json!({ "type": "string" }), properties, required);
        }
        FormInput::KeyValue(v) => {
            add_base_schema(
                &v.base,
                json!({
                    "type": "object",
                    "additionalProperties": true,
                }),
                properties,
                required,
            );
        }
        FormInput::Accordion(v) => {
            if let Some(children) = &v.inputs {
                for child in children {
                    add_input_schema(child, properties, required);
                }
            }
        }
        FormInput::HStack(v) => {
            if let Some(children) = &v.inputs {
                for child in children {
                    add_input_schema(child, properties, required);
                }
            }
        }
        FormInput::Banner(v) => {
            if let Some(children) = &v.inputs {
                for child in children {
                    add_input_schema(child, properties, required);
                }
            }
        }
        FormInput::Markdown(_) => {}
    }
}

fn add_base_schema(
    base: &FormInputBase,
    mut schema: Value,
    properties: &mut Map<String, Value>,
    required: &mut Vec<String>,
) {
    if base.hidden.unwrap_or(false) || base.name.trim().is_empty() {
        return;
    }

    if let Some(description) = &base.description {
        schema["description"] = Value::String(description.clone());
    }
    if let Some(label) = &base.label {
        schema["title"] = Value::String(label.clone());
    }
    if let Some(default_value) = &base.default_value {
        schema["default"] = Value::String(default_value.clone());
    }

    let name = base.name.clone();
    properties.insert(name.clone(), schema);
    if !base.optional.unwrap_or(false) {
        required.push(name);
    }
}
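For illustration, the plugin enrichment above wraps the request's plain `authentication` schema in a `oneOf` whose first branch is the original schema and whose remaining branches come from `auth_variant_schema`. The resulting shape looks roughly like the following (strategy and field names here are hypothetical, not taken from a real plugin):

```json
{
  "authentication": {
    "oneOf": [
      { "type": "object" },
      {
        "title": "Bearer Token",
        "description": "Authentication values for strategy 'bearer'",
        "type": "object",
        "properties": {
          "token": { "type": "string", "writeOnly": true }
        },
        "required": ["token"],
        "additionalProperties": true
      }
    ]
  }
}
```

Keeping the original schema as the first `oneOf` branch means documents that validated before enrichment still validate after it.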

fn create(
    ctx: &CliContext,
    workspace_id: Option<String>,
    name: Option<String>,
    method: Option<String>,
    url: Option<String>,
    json: Option<String>,
) -> CommandResult {
    if json.is_some() && workspace_id.as_deref().is_some_and(|v| !is_json_shorthand(v)) {
        return Err("request create cannot combine workspace_id with --json payload".to_string());
    }

    let payload = parse_optional_json(
        json,
        workspace_id.clone().filter(|v| is_json_shorthand(v)),
        "request create",
    )?;

    if let Some(payload) = payload {
        if name.is_some() || method.is_some() || url.is_some() {
            return Err("request create cannot combine simple flags with JSON payload".to_string());
        }

        validate_create_id(&payload, "request")?;
        let request: HttpRequest = serde_json::from_value(payload)
            .map_err(|e| format!("Failed to parse request create JSON: {e}"))?;

        if request.workspace_id.is_empty() {
            return Err("request create JSON requires non-empty \"workspaceId\"".to_string());
        }

        let created = ctx
            .db()
            .upsert_http_request(&request, &UpdateSource::Sync)
            .map_err(|e| format!("Failed to create request: {e}"))?;

        println!("Created request: {}", created.id);
        return Ok(());
    }

    let workspace_id = workspace_id.ok_or_else(|| {
        "request create requires workspace_id unless JSON payload is provided".to_string()
    })?;
    let name = name.unwrap_or_default();
    let url = url.unwrap_or_default();
    let method = method.unwrap_or_else(|| "GET".to_string());

    let request = HttpRequest {
        workspace_id,
        name,
        method: method.to_uppercase(),
        url,
        ..Default::default()
    };

    let created = ctx
        .db()
        .upsert_http_request(&request, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to create request: {e}"))?;

    println!("Created request: {}", created.id);
    Ok(())
}

fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) -> CommandResult {
    let patch = parse_required_json(json, json_input, "request update")?;
    let id = require_id(&patch, "request update")?;

    let existing = ctx
        .db()
        .get_http_request(&id)
        .map_err(|e| format!("Failed to get request for update: {e}"))?;
    let updated = apply_merge_patch(&existing, &patch, &id, "request update")?;

    let saved = ctx
        .db()
        .upsert_http_request(&updated, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to update request: {e}"))?;

    println!("Updated request: {}", saved.id);
    Ok(())
}

fn show(ctx: &CliContext, request_id: &str) -> CommandResult {
    let request =
        ctx.db().get_http_request(request_id).map_err(|e| format!("Failed to get request: {e}"))?;
    let output = serde_json::to_string_pretty(&request)
        .map_err(|e| format!("Failed to serialize request: {e}"))?;
    println!("{output}");
    Ok(())
}

fn delete(ctx: &CliContext, request_id: &str, yes: bool) -> CommandResult {
    if !yes && !confirm_delete("request", request_id) {
        println!("Aborted");
        return Ok(());
    }

    let deleted = ctx
        .db()
        .delete_http_request_by_id(request_id, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to delete request: {e}"))?;
    println!("Deleted request: {}", deleted.id);
    Ok(())
}

/// Send a request by ID and print the response in the same format as the legacy `send` command.
pub async fn send_request_by_id(
    ctx: &CliContext,
    request_id: &str,
    environment: Option<&str>,
    verbose: bool,
) -> Result<(), String> {
    let request =
        ctx.db().get_any_request(request_id).map_err(|e| format!("Failed to get request: {e}"))?;
    match request {
        AnyRequest::HttpRequest(http_request) => {
            send_http_request_by_id(
                ctx,
                &http_request.id,
                &http_request.workspace_id,
                environment,
                verbose,
            )
            .await
        }
        AnyRequest::GrpcRequest(_) => {
            Err("gRPC request send is not implemented yet in yaak-cli".to_string())
        }
        AnyRequest::WebsocketRequest(_) => {
            Err("WebSocket request send is not implemented yet in yaak-cli".to_string())
        }
    }
}

async fn send_http_request_by_id(
    ctx: &CliContext,
    request_id: &str,
    workspace_id: &str,
    environment: Option<&str>,
    verbose: bool,
) -> Result<(), String> {
    let plugin_context = PluginContext::new(None, Some(workspace_id.to_string()));

    let (event_tx, mut event_rx) = mpsc::channel(100);
    let event_handle = tokio::spawn(async move {
        while let Some(event) = event_rx.recv().await {
            if verbose {
                println!("{}", event);
            }
        }
    });
    let response_dir = ctx.data_dir().join("responses");

    let result = send_http_request_by_id_with_plugins(SendHttpRequestByIdWithPluginsParams {
        query_manager: ctx.query_manager(),
        blob_manager: ctx.blob_manager(),
        request_id,
        environment_id: environment,
        update_source: UpdateSource::Sync,
        cookie_jar_id: None,
        response_dir: &response_dir,
        emit_events_to: Some(event_tx),
        plugin_manager: ctx.plugin_manager(),
        encryption_manager: ctx.encryption_manager.clone(),
        plugin_context: &plugin_context,
        cancelled_rx: None,
        connection_manager: None,
    })
    .await;

    let _ = event_handle.await;
    let result = result.map_err(|e| e.to_string())?;

    if verbose {
        println!();
    }
    println!(
        "HTTP {} {}",
        result.response.status,
        result.response.status_reason.as_deref().unwrap_or("")
    );
    if verbose {
        for header in &result.response.headers {
            println!("{}: {}", header.name, header.value);
        }
        println!();
    }
    let body = String::from_utf8(result.response_body)
        .map_err(|e| format!("Failed to read response body: {e}"))?;
    println!("{}", body);
    Ok(())
}
crates-cli/yaak-cli/src/commands/send.rs (184 lines, new file) @@ -0,0 +1,184 @@
use crate::cli::SendArgs;
use crate::commands::request;
use crate::context::CliContext;
use futures::future::join_all;

enum ExecutionMode {
    Sequential,
    Parallel,
}

pub async fn run(
    ctx: &CliContext,
    args: SendArgs,
    environment: Option<&str>,
    verbose: bool,
) -> i32 {
    match send_target(ctx, args, environment, verbose).await {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

async fn send_target(
    ctx: &CliContext,
    args: SendArgs,
    environment: Option<&str>,
    verbose: bool,
) -> Result<(), String> {
    let mode = if args.parallel { ExecutionMode::Parallel } else { ExecutionMode::Sequential };

    if ctx.db().get_any_request(&args.id).is_ok() {
        return request::send_request_by_id(ctx, &args.id, environment, verbose).await;
    }

    if ctx.db().get_folder(&args.id).is_ok() {
        let request_ids = collect_folder_request_ids(ctx, &args.id)?;
        if request_ids.is_empty() {
            println!("No requests found in folder {}", args.id);
            return Ok(());
        }
        return send_many(ctx, request_ids, mode, args.fail_fast, environment, verbose).await;
    }

    if ctx.db().get_workspace(&args.id).is_ok() {
        let request_ids = collect_workspace_request_ids(ctx, &args.id)?;
        if request_ids.is_empty() {
            println!("No requests found in workspace {}", args.id);
            return Ok(());
        }
        return send_many(ctx, request_ids, mode, args.fail_fast, environment, verbose).await;
    }

    Err(format!("Could not resolve ID '{}' as request, folder, or workspace", args.id))
}

fn collect_folder_request_ids(ctx: &CliContext, folder_id: &str) -> Result<Vec<String>, String> {
    let mut ids = Vec::new();

    let mut http_ids = ctx
        .db()
        .list_http_requests_for_folder_recursive(folder_id)
        .map_err(|e| format!("Failed to list HTTP requests in folder: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut http_ids);

    let mut grpc_ids = ctx
        .db()
        .list_grpc_requests_for_folder_recursive(folder_id)
        .map_err(|e| format!("Failed to list gRPC requests in folder: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut grpc_ids);

    let mut websocket_ids = ctx
        .db()
        .list_websocket_requests_for_folder_recursive(folder_id)
        .map_err(|e| format!("Failed to list WebSocket requests in folder: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut websocket_ids);

    Ok(ids)
}

fn collect_workspace_request_ids(
    ctx: &CliContext,
    workspace_id: &str,
) -> Result<Vec<String>, String> {
    let mut ids = Vec::new();

    let mut http_ids = ctx
        .db()
        .list_http_requests(workspace_id)
        .map_err(|e| format!("Failed to list HTTP requests in workspace: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut http_ids);

    let mut grpc_ids = ctx
        .db()
        .list_grpc_requests(workspace_id)
        .map_err(|e| format!("Failed to list gRPC requests in workspace: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut grpc_ids);

    let mut websocket_ids = ctx
        .db()
        .list_websocket_requests(workspace_id)
        .map_err(|e| format!("Failed to list WebSocket requests in workspace: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut websocket_ids);

    Ok(ids)
}

async fn send_many(
    ctx: &CliContext,
    request_ids: Vec<String>,
    mode: ExecutionMode,
    fail_fast: bool,
    environment: Option<&str>,
    verbose: bool,
) -> Result<(), String> {
    let mut success_count = 0usize;
    let mut failures: Vec<(String, String)> = Vec::new();

    match mode {
        ExecutionMode::Sequential => {
            for request_id in request_ids {
                match request::send_request_by_id(ctx, &request_id, environment, verbose).await {
                    Ok(()) => success_count += 1,
                    Err(error) => {
                        failures.push((request_id, error));
                        if fail_fast {
                            break;
                        }
                    }
                }
            }
        }
        ExecutionMode::Parallel => {
            let tasks = request_ids
                .iter()
                .map(|request_id| async move {
                    (
                        request_id.clone(),
                        request::send_request_by_id(ctx, request_id, environment, verbose).await,
                    )
                })
                .collect::<Vec<_>>();

            for (request_id, result) in join_all(tasks).await {
                match result {
                    Ok(()) => success_count += 1,
                    Err(error) => failures.push((request_id, error)),
                }
            }
        }
    }

    let failure_count = failures.len();
    println!("Send summary: {success_count} succeeded, {failure_count} failed");

    if failure_count == 0 {
        return Ok(());
    }

    for (request_id, error) in failures {
        eprintln!("  {}: {}", request_id, error);
    }
    Err("One or more requests failed".to_string())
}
crates-cli/yaak-cli/src/commands/workspace.rs (123 lines, new file) @@ -0,0 +1,123 @@
use crate::cli::{WorkspaceArgs, WorkspaceCommands};
use crate::context::CliContext;
use crate::utils::confirm::confirm_delete;
use crate::utils::json::{
    apply_merge_patch, parse_optional_json, parse_required_json, require_id, validate_create_id,
};
use yaak_models::models::Workspace;
use yaak_models::util::UpdateSource;

type CommandResult<T = ()> = std::result::Result<T, String>;

pub fn run(ctx: &CliContext, args: WorkspaceArgs) -> i32 {
    let result = match args.command {
        WorkspaceCommands::List => list(ctx),
        WorkspaceCommands::Show { workspace_id } => show(ctx, &workspace_id),
        WorkspaceCommands::Create { name, json, json_input } => create(ctx, name, json, json_input),
        WorkspaceCommands::Update { json, json_input } => update(ctx, json, json_input),
        WorkspaceCommands::Delete { workspace_id, yes } => delete(ctx, &workspace_id, yes),
    };

    match result {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

fn list(ctx: &CliContext) -> CommandResult {
    let workspaces =
        ctx.db().list_workspaces().map_err(|e| format!("Failed to list workspaces: {e}"))?;
    if workspaces.is_empty() {
        println!("No workspaces found");
    } else {
        for workspace in workspaces {
            println!("{} - {}", workspace.id, workspace.name);
        }
    }
    Ok(())
}

fn show(ctx: &CliContext, workspace_id: &str) -> CommandResult {
    let workspace = ctx
        .db()
        .get_workspace(workspace_id)
        .map_err(|e| format!("Failed to get workspace: {e}"))?;
    let output = serde_json::to_string_pretty(&workspace)
        .map_err(|e| format!("Failed to serialize workspace: {e}"))?;
    println!("{output}");
    Ok(())
}

fn create(
    ctx: &CliContext,
    name: Option<String>,
    json: Option<String>,
    json_input: Option<String>,
) -> CommandResult {
    let payload = parse_optional_json(json, json_input, "workspace create")?;

    if let Some(payload) = payload {
        if name.is_some() {
            return Err("workspace create cannot combine --name with JSON payload".to_string());
        }

        validate_create_id(&payload, "workspace")?;
        let workspace: Workspace = serde_json::from_value(payload)
            .map_err(|e| format!("Failed to parse workspace create JSON: {e}"))?;

        let created = ctx
            .db()
            .upsert_workspace(&workspace, &UpdateSource::Sync)
            .map_err(|e| format!("Failed to create workspace: {e}"))?;
        println!("Created workspace: {}", created.id);
        return Ok(());
    }

    let name = name.ok_or_else(|| {
        "workspace create requires --name unless JSON payload is provided".to_string()
    })?;

    let workspace = Workspace { name, ..Default::default() };
|
||||||
|
let created = ctx
|
||||||
|
.db()
|
||||||
|
.upsert_workspace(&workspace, &UpdateSource::Sync)
|
||||||
|
.map_err(|e| format!("Failed to create workspace: {e}"))?;
|
||||||
|
println!("Created workspace: {}", created.id);
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
|
|
||||||
|
fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) -> CommandResult {
|
||||||
|
let patch = parse_required_json(json, json_input, "workspace update")?;
|
||||||
|
let id = require_id(&patch, "workspace update")?;
|
||||||
|
|
||||||
|
let existing = ctx
|
||||||
|
.db()
|
||||||
|
.get_workspace(&id)
|
||||||
|
.map_err(|e| format!("Failed to get workspace for update: {e}"))?;
|
||||||
|
let updated = apply_merge_patch(&existing, &patch, &id, "workspace update")?;
|
||||||
|
|
||||||
|
let saved = ctx
|
||||||
|
.db()
|
||||||
|
.upsert_workspace(&updated, &UpdateSource::Sync)
|
||||||
|
.map_err(|e| format!("Failed to update workspace: {e}"))?;
|
||||||
|
|
||||||
|
println!("Updated workspace: {}", saved.id);
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
|
|
||||||
|
fn delete(ctx: &CliContext, workspace_id: &str, yes: bool) -> CommandResult {
|
||||||
|
if !yes && !confirm_delete("workspace", workspace_id) {
|
||||||
|
println!("Aborted");
|
||||||
|
return Ok(());
|
||||||
|
}
|
||||||
|
|
||||||
|
let deleted = ctx
|
||||||
|
.db()
|
||||||
|
.delete_workspace_by_id(workspace_id, &UpdateSource::Sync)
|
||||||
|
.map_err(|e| format!("Failed to delete workspace: {e}"))?;
|
||||||
|
println!("Deleted workspace: {}", deleted.id);
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
crates-cli/yaak-cli/src/context.rs (new file, 115 lines)
@@ -0,0 +1,115 @@
use crate::plugin_events::CliPluginEventBridge;
use std::path::{Path, PathBuf};
use std::sync::Arc;
use tokio::sync::Mutex;
use yaak_crypto::manager::EncryptionManager;
use yaak_models::blob_manager::BlobManager;
use yaak_models::db_context::DbContext;
use yaak_models::query_manager::QueryManager;
use yaak_plugins::events::PluginContext;
use yaak_plugins::manager::PluginManager;

pub struct CliContext {
    data_dir: PathBuf,
    query_manager: QueryManager,
    blob_manager: BlobManager,
    pub encryption_manager: Arc<EncryptionManager>,
    plugin_manager: Option<Arc<PluginManager>>,
    plugin_event_bridge: Mutex<Option<CliPluginEventBridge>>,
}

impl CliContext {
    pub async fn initialize(data_dir: PathBuf, app_id: &str, with_plugins: bool) -> Self {
        let db_path = data_dir.join("db.sqlite");
        let blob_path = data_dir.join("blobs.sqlite");

        let (query_manager, blob_manager, _rx) = yaak_models::init_standalone(&db_path, &blob_path)
            .expect("Failed to initialize database");

        let encryption_manager = Arc::new(EncryptionManager::new(query_manager.clone(), app_id));

        let plugin_manager = if with_plugins {
            let vendored_plugin_dir = data_dir.join("vendored-plugins");
            let installed_plugin_dir = data_dir.join("installed-plugins");
            let node_bin_path = PathBuf::from("node");

            let plugin_runtime_main =
                std::env::var("YAAK_PLUGIN_RUNTIME").map(PathBuf::from).unwrap_or_else(|_| {
                    PathBuf::from(env!("CARGO_MANIFEST_DIR"))
                        .join("../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs")
                });

            let plugin_manager = Arc::new(
                PluginManager::new(
                    vendored_plugin_dir,
                    installed_plugin_dir,
                    node_bin_path,
                    plugin_runtime_main,
                    false,
                )
                .await,
            );

            let plugins = query_manager.connect().list_plugins().unwrap_or_default();
            if !plugins.is_empty() {
                let errors = plugin_manager
                    .initialize_all_plugins(plugins, &PluginContext::new_empty())
                    .await;
                for (plugin_dir, error_msg) in errors {
                    eprintln!(
                        "Warning: Failed to initialize plugin '{}': {}",
                        plugin_dir, error_msg
                    );
                }
            }

            Some(plugin_manager)
        } else {
            None
        };

        let plugin_event_bridge = if let Some(plugin_manager) = &plugin_manager {
            Some(CliPluginEventBridge::start(plugin_manager.clone(), query_manager.clone()).await)
        } else {
            None
        };

        Self {
            data_dir,
            query_manager,
            blob_manager,
            encryption_manager,
            plugin_manager,
            plugin_event_bridge: Mutex::new(plugin_event_bridge),
        }
    }

    pub fn data_dir(&self) -> &Path {
        &self.data_dir
    }

    pub fn db(&self) -> DbContext<'_> {
        self.query_manager.connect()
    }

    pub fn query_manager(&self) -> &QueryManager {
        &self.query_manager
    }

    pub fn blob_manager(&self) -> &BlobManager {
        &self.blob_manager
    }

    pub fn plugin_manager(&self) -> Arc<PluginManager> {
        self.plugin_manager.clone().expect("Plugin manager was not initialized for this command")
    }

    pub async fn shutdown(&self) {
        if let Some(plugin_manager) = &self.plugin_manager {
            if let Some(plugin_event_bridge) = self.plugin_event_bridge.lock().await.take() {
                plugin_event_bridge.shutdown(plugin_manager).await;
            }
            plugin_manager.terminate().await;
        }
    }
}
crates-cli/yaak-cli/src/main.rs (new file, 52 lines)
@@ -0,0 +1,52 @@
mod cli;
mod commands;
mod context;
mod plugin_events;
mod utils;

use clap::Parser;
use cli::{Cli, Commands, RequestCommands};
use context::CliContext;

#[tokio::main]
async fn main() {
    let Cli { data_dir, environment, verbose, command } = Cli::parse();

    if verbose {
        env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info")).init();
    }

    let app_id = if cfg!(debug_assertions) { "app.yaak.desktop.dev" } else { "app.yaak.desktop" };

    let data_dir = data_dir.unwrap_or_else(|| {
        dirs::data_dir().expect("Could not determine data directory").join(app_id)
    });

    let needs_plugins = matches!(
        &command,
        Commands::Send(_)
            | Commands::Request(cli::RequestArgs {
                command: RequestCommands::Send { .. } | RequestCommands::Schema { .. },
            })
    );

    let context = CliContext::initialize(data_dir, app_id, needs_plugins).await;

    let exit_code = match command {
        Commands::Send(args) => {
            commands::send::run(&context, args, environment.as_deref(), verbose).await
        }
        Commands::Workspace(args) => commands::workspace::run(&context, args),
        Commands::Request(args) => {
            commands::request::run(&context, args, environment.as_deref(), verbose).await
        }
        Commands::Folder(args) => commands::folder::run(&context, args),
        Commands::Environment(args) => commands::environment::run(&context, args),
    };

    context.shutdown().await;

    if exit_code != 0 {
        std::process::exit(exit_code);
    }
}
crates-cli/yaak-cli/src/plugin_events.rs (new file, 212 lines)
@@ -0,0 +1,212 @@
use std::sync::Arc;
use tokio::task::JoinHandle;
use yaak::plugin_events::{
    GroupedPluginEvent, HostRequest, SharedPluginEventContext, handle_shared_plugin_event,
};
use yaak_models::query_manager::QueryManager;
use yaak_plugins::events::{
    EmptyPayload, ErrorResponse, InternalEvent, InternalEventPayload, ListOpenWorkspacesResponse,
    WorkspaceInfo,
};
use yaak_plugins::manager::PluginManager;

pub struct CliPluginEventBridge {
    rx_id: String,
    task: JoinHandle<()>,
}

impl CliPluginEventBridge {
    pub async fn start(plugin_manager: Arc<PluginManager>, query_manager: QueryManager) -> Self {
        let (rx_id, mut rx) = plugin_manager.subscribe("cli").await;
        let rx_id_for_task = rx_id.clone();
        let pm = plugin_manager.clone();

        let task = tokio::spawn(async move {
            while let Some(event) = rx.recv().await {
                // Events with reply IDs are replies to app-originated requests.
                if event.reply_id.is_some() {
                    continue;
                }

                let Some(plugin_handle) = pm.get_plugin_by_ref_id(&event.plugin_ref_id).await
                else {
                    eprintln!(
                        "Warning: Ignoring plugin event with unknown plugin ref '{}'",
                        event.plugin_ref_id
                    );
                    continue;
                };

                let plugin_name = plugin_handle.info().name;
                let Some(reply_payload) = build_plugin_reply(&query_manager, &event, &plugin_name)
                else {
                    continue;
                };

                if let Err(err) = pm.reply(&event, &reply_payload).await {
                    eprintln!("Warning: Failed replying to plugin event: {err}");
                }
            }

            pm.unsubscribe(&rx_id_for_task).await;
        });

        Self { rx_id, task }
    }

    pub async fn shutdown(self, plugin_manager: &PluginManager) {
        plugin_manager.unsubscribe(&self.rx_id).await;
        self.task.abort();
        let _ = self.task.await;
    }
}

fn build_plugin_reply(
    query_manager: &QueryManager,
    event: &InternalEvent,
    plugin_name: &str,
) -> Option<InternalEventPayload> {
    match handle_shared_plugin_event(
        query_manager,
        &event.payload,
        SharedPluginEventContext {
            plugin_name,
            workspace_id: event.context.workspace_id.as_deref(),
        },
    ) {
        GroupedPluginEvent::Handled(payload) => payload,
        GroupedPluginEvent::ToHandle(host_request) => match host_request {
            HostRequest::ErrorResponse(resp) => {
                eprintln!("[plugin:{}] error: {}", plugin_name, resp.error);
                None
            }
            HostRequest::ReloadResponse(_) => None,
            HostRequest::ShowToast(req) => {
                eprintln!("[plugin:{}] {}", plugin_name, req.message);
                Some(InternalEventPayload::ShowToastResponse(EmptyPayload {}))
            }
            HostRequest::ListOpenWorkspaces(_) => {
                let workspaces = match query_manager.connect().list_workspaces() {
                    Ok(workspaces) => workspaces
                        .into_iter()
                        .map(|w| WorkspaceInfo { id: w.id.clone(), name: w.name, label: w.id })
                        .collect(),
                    Err(err) => {
                        return Some(InternalEventPayload::ErrorResponse(ErrorResponse {
                            error: format!("Failed to list workspaces in CLI: {err}"),
                        }));
                    }
                };
                Some(InternalEventPayload::ListOpenWorkspacesResponse(ListOpenWorkspacesResponse {
                    workspaces,
                }))
            }
            req => Some(InternalEventPayload::ErrorResponse(ErrorResponse {
                error: format!("Unsupported plugin request in CLI: {}", req.type_name()),
            })),
        },
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use tempfile::TempDir;
    use yaak_plugins::events::{GetKeyValueRequest, PluginContext, WindowInfoRequest};

    fn query_manager_for_test() -> (QueryManager, TempDir) {
        let temp_dir = TempDir::new().expect("Failed to create temp dir");
        let db_path = temp_dir.path().join("db.sqlite");
        let blob_path = temp_dir.path().join("blobs.sqlite");
        let (query_manager, _blob_manager, _rx) =
            yaak_models::init_standalone(&db_path, &blob_path).expect("Failed to initialize DB");
        (query_manager, temp_dir)
    }

    fn event(payload: InternalEventPayload) -> InternalEvent {
        InternalEvent {
            id: "evt_1".to_string(),
            plugin_ref_id: "plugin_ref_1".to_string(),
            plugin_name: "@yaak/test-plugin".to_string(),
            reply_id: None,
            context: PluginContext::new_empty(),
            payload,
        }
    }

    #[test]
    fn key_value_requests_round_trip() {
        let (query_manager, _temp_dir) = query_manager_for_test();
        let plugin_name = "@yaak/test-plugin";

        let get_missing = build_plugin_reply(
            &query_manager,
            &event(InternalEventPayload::GetKeyValueRequest(GetKeyValueRequest {
                key: "missing".to_string(),
            })),
            plugin_name,
        );
        match get_missing {
            Some(InternalEventPayload::GetKeyValueResponse(r)) => assert_eq!(r.value, None),
            other => panic!("unexpected payload for missing get: {other:?}"),
        }

        let set = build_plugin_reply(
            &query_manager,
            &event(InternalEventPayload::SetKeyValueRequest(
                yaak_plugins::events::SetKeyValueRequest {
                    key: "token".to_string(),
                    value: "{\"access_token\":\"abc\"}".to_string(),
                },
            )),
            plugin_name,
        );
        assert!(matches!(set, Some(InternalEventPayload::SetKeyValueResponse(_))));

        let get_present = build_plugin_reply(
            &query_manager,
            &event(InternalEventPayload::GetKeyValueRequest(GetKeyValueRequest {
                key: "token".to_string(),
            })),
            plugin_name,
        );
        match get_present {
            Some(InternalEventPayload::GetKeyValueResponse(r)) => {
                assert_eq!(r.value, Some("{\"access_token\":\"abc\"}".to_string()))
            }
            other => panic!("unexpected payload for present get: {other:?}"),
        }

        let delete = build_plugin_reply(
            &query_manager,
            &event(InternalEventPayload::DeleteKeyValueRequest(
                yaak_plugins::events::DeleteKeyValueRequest { key: "token".to_string() },
            )),
            plugin_name,
        );
        match delete {
            Some(InternalEventPayload::DeleteKeyValueResponse(r)) => assert!(r.deleted),
            other => panic!("unexpected payload for delete: {other:?}"),
        }
    }

    #[test]
    fn unsupported_request_gets_error_reply() {
        let (query_manager, _temp_dir) = query_manager_for_test();
        let payload = build_plugin_reply(
            &query_manager,
            &event(InternalEventPayload::WindowInfoRequest(WindowInfoRequest {
                label: "main".to_string(),
            })),
            "@yaak/test-plugin",
        );

        match payload {
            Some(InternalEventPayload::ErrorResponse(err)) => {
                assert!(err.error.contains("Unsupported plugin request in CLI"));
                assert!(err.error.contains("window_info_request"));
            }
            other => panic!("unexpected payload for unsupported request: {other:?}"),
        }
    }
}
crates-cli/yaak-cli/src/utils/confirm.rs (new file, 16 lines)
@@ -0,0 +1,16 @@
use std::io::{self, IsTerminal, Write};

pub fn confirm_delete(resource_name: &str, resource_id: &str) -> bool {
    if !io::stdin().is_terminal() {
        eprintln!("Refusing to delete in non-interactive mode without --yes");
        std::process::exit(1);
    }

    print!("Delete {resource_name} {resource_id}? [y/N]: ");
    io::stdout().flush().expect("Failed to flush stdout");

    let mut input = String::new();
    io::stdin().read_line(&mut input).expect("Failed to read confirmation");

    matches!(input.trim().to_lowercase().as_str(), "y" | "yes")
}
crates-cli/yaak-cli/src/utils/json.rs (new file, 107 lines)
@@ -0,0 +1,107 @@
use serde::Serialize;
use serde::de::DeserializeOwned;
use serde_json::{Map, Value};

type JsonResult<T> = std::result::Result<T, String>;

pub fn is_json_shorthand(input: &str) -> bool {
    input.trim_start().starts_with('{')
}

pub fn parse_json_object(raw: &str, context: &str) -> JsonResult<Value> {
    let value: Value = serde_json::from_str(raw)
        .map_err(|error| format!("Invalid JSON for {context}: {error}"))?;

    if !value.is_object() {
        return Err(format!("JSON payload for {context} must be an object"));
    }

    Ok(value)
}

pub fn parse_optional_json(
    json_flag: Option<String>,
    json_shorthand: Option<String>,
    context: &str,
) -> JsonResult<Option<Value>> {
    match (json_flag, json_shorthand) {
        (Some(_), Some(_)) => {
            Err(format!("Cannot provide both --json and positional JSON for {context}"))
        }
        (Some(raw), None) => parse_json_object(&raw, context).map(Some),
        (None, Some(raw)) => parse_json_object(&raw, context).map(Some),
        (None, None) => Ok(None),
    }
}

pub fn parse_required_json(
    json_flag: Option<String>,
    json_shorthand: Option<String>,
    context: &str,
) -> JsonResult<Value> {
    parse_optional_json(json_flag, json_shorthand, context)?
        .ok_or_else(|| format!("Missing JSON payload for {context}. Use --json or positional JSON"))
}

pub fn require_id(payload: &Value, context: &str) -> JsonResult<String> {
    payload
        .get("id")
        .and_then(|value| value.as_str())
        .filter(|value| !value.is_empty())
        .map(|value| value.to_string())
        .ok_or_else(|| format!("{context} requires a non-empty \"id\" field"))
}

pub fn validate_create_id(payload: &Value, context: &str) -> JsonResult<()> {
    let Some(id_value) = payload.get("id") else {
        return Ok(());
    };

    match id_value {
        Value::String(id) if id.is_empty() => Ok(()),
        _ => Err(format!("{context} create JSON must omit \"id\" or set it to an empty string")),
    }
}

pub fn apply_merge_patch<T>(existing: &T, patch: &Value, id: &str, context: &str) -> JsonResult<T>
where
    T: Serialize + DeserializeOwned,
{
    let mut base = serde_json::to_value(existing)
        .map_err(|error| format!("Failed to serialize existing model for {context}: {error}"))?;
    merge_patch(&mut base, patch);

    let Some(base_object) = base.as_object_mut() else {
        return Err(format!("Merged payload for {context} must be an object"));
    };
    base_object.insert("id".to_string(), Value::String(id.to_string()));

    serde_json::from_value(base)
        .map_err(|error| format!("Failed to deserialize merged payload for {context}: {error}"))
}

fn merge_patch(target: &mut Value, patch: &Value) {
    match patch {
        Value::Object(patch_map) => {
            if !target.is_object() {
                *target = Value::Object(Map::new());
            }

            let target_map =
                target.as_object_mut().expect("merge_patch target expected to be object");

            for (key, patch_value) in patch_map {
                if patch_value.is_null() {
                    target_map.remove(key);
                    continue;
                }

                let target_entry = target_map.entry(key.clone()).or_insert(Value::Null);
                merge_patch(target_entry, patch_value);
            }
        }
        _ => {
            *target = patch.clone();
        }
    }
}
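The `merge_patch` helper above follows JSON Merge Patch (RFC 7396) semantics: an object patch merges recursively, a `null` patch value deletes the key, and any non-object patch value replaces the target wholesale. As a minimal illustrative sketch of those semantics (not part of this diff, and using a toy `Json` enum in place of `serde_json::Value` so it runs with only the standard library):

```rust
use std::collections::BTreeMap;

// Toy JSON model: just enough variants to show merge-patch behavior.
#[derive(Clone, Debug, PartialEq)]
enum Json {
    Null,
    Str(String),
    Object(BTreeMap<String, Json>),
}

// Same shape as the merge_patch above: recurse on objects, delete on
// null, replace wholesale on anything else.
fn merge_patch(target: &mut Json, patch: &Json) {
    match patch {
        Json::Object(patch_map) => {
            if !matches!(target, Json::Object(_)) {
                *target = Json::Object(BTreeMap::new());
            }
            if let Json::Object(target_map) = target {
                for (key, patch_value) in patch_map {
                    if matches!(patch_value, Json::Null) {
                        target_map.remove(key); // null deletes the key
                        continue;
                    }
                    let entry = target_map.entry(key.clone()).or_insert(Json::Null);
                    merge_patch(entry, patch_value);
                }
            }
        }
        _ => *target = patch.clone(), // non-object patch replaces wholesale
    }
}

fn main() {
    let mut doc = Json::Object(BTreeMap::from([
        ("name".to_string(), Json::Str("Old".into())),
        ("color".to_string(), Json::Str("#ff0000".into())),
    ]));
    let patch = Json::Object(BTreeMap::from([
        ("name".to_string(), Json::Str("New".into())),
        ("color".to_string(), Json::Null),
    ]));
    merge_patch(&mut doc, &patch);

    // "name" was updated, "color" was removed.
    assert_eq!(
        doc,
        Json::Object(BTreeMap::from([("name".to_string(), Json::Str("New".into()))]))
    );
    println!("{doc:?}");
}
```

This is why the CLI's `update` commands can clear a field (e.g. an environment color) by passing `"color": null` in the patch, while omitted fields keep their existing values.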
crates-cli/yaak-cli/src/utils/mod.rs (new file, 2 lines)
@@ -0,0 +1,2 @@
pub mod confirm;
pub mod json;
crates-cli/yaak-cli/tests/common/http_server.rs (new file, 42 lines)
@@ -0,0 +1,42 @@
use std::io::{Read, Write};
use std::net::TcpListener;
use std::thread;

pub struct TestHttpServer {
    pub url: String,
    handle: Option<thread::JoinHandle<()>>,
}

impl TestHttpServer {
    pub fn spawn_ok(body: &'static str) -> Self {
        let listener = TcpListener::bind("127.0.0.1:0").expect("Failed to bind test HTTP server");
        let addr = listener.local_addr().expect("Failed to get local addr");
        let url = format!("http://{addr}/test");
        let body_bytes = body.as_bytes().to_vec();

        let handle = thread::spawn(move || {
            if let Ok((mut stream, _)) = listener.accept() {
                let mut request_buf = [0u8; 4096];
                let _ = stream.read(&mut request_buf);

                let response = format!(
                    "HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\nContent-Length: {}\r\nConnection: close\r\n\r\n",
                    body_bytes.len()
                );
                let _ = stream.write_all(response.as_bytes());
                let _ = stream.write_all(&body_bytes);
                let _ = stream.flush();
            }
        });

        Self { url, handle: Some(handle) }
    }
}

impl Drop for TestHttpServer {
    fn drop(&mut self) {
        if let Some(handle) = self.handle.take() {
            let _ = handle.join();
        }
    }
}
crates-cli/yaak-cli/tests/common/mod.rs (new file, 106 lines)
@@ -0,0 +1,106 @@
#![allow(dead_code)]

pub mod http_server;

use assert_cmd::Command;
use assert_cmd::cargo::cargo_bin_cmd;
use std::path::Path;
use yaak_models::models::{Folder, GrpcRequest, HttpRequest, WebsocketRequest, Workspace};
use yaak_models::query_manager::QueryManager;
use yaak_models::util::UpdateSource;

pub fn cli_cmd(data_dir: &Path) -> Command {
    let mut cmd = cargo_bin_cmd!("yaakcli");
    cmd.arg("--data-dir").arg(data_dir);
    cmd
}

pub fn parse_created_id(stdout: &[u8], label: &str) -> String {
    String::from_utf8_lossy(stdout)
        .trim()
        .split_once(": ")
        .map(|(_, id)| id.to_string())
        .unwrap_or_else(|| panic!("Expected id in '{label}' output"))
}

pub fn query_manager(data_dir: &Path) -> QueryManager {
    let db_path = data_dir.join("db.sqlite");
    let blob_path = data_dir.join("blobs.sqlite");
    let (query_manager, _blob_manager, _rx) =
        yaak_models::init_standalone(&db_path, &blob_path).expect("Failed to initialize DB");
    query_manager
}

pub fn seed_workspace(data_dir: &Path, workspace_id: &str) {
    let workspace = Workspace {
        id: workspace_id.to_string(),
        name: "Seed Workspace".to_string(),
        description: "Seeded for integration tests".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_workspace(&workspace, &UpdateSource::Sync)
        .expect("Failed to seed workspace");
}

pub fn seed_request(data_dir: &Path, workspace_id: &str, request_id: &str) {
    let request = HttpRequest {
        id: request_id.to_string(),
        workspace_id: workspace_id.to_string(),
        name: "Seeded Request".to_string(),
        method: "GET".to_string(),
        url: "https://example.com".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_http_request(&request, &UpdateSource::Sync)
        .expect("Failed to seed request");
}

pub fn seed_folder(data_dir: &Path, workspace_id: &str, folder_id: &str) {
    let folder = Folder {
        id: folder_id.to_string(),
        workspace_id: workspace_id.to_string(),
        name: "Seed Folder".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_folder(&folder, &UpdateSource::Sync)
        .expect("Failed to seed folder");
}

pub fn seed_grpc_request(data_dir: &Path, workspace_id: &str, request_id: &str) {
    let request = GrpcRequest {
        id: request_id.to_string(),
        workspace_id: workspace_id.to_string(),
        name: "Seeded gRPC Request".to_string(),
        url: "https://example.com".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_grpc_request(&request, &UpdateSource::Sync)
        .expect("Failed to seed gRPC request");
}

pub fn seed_websocket_request(data_dir: &Path, workspace_id: &str, request_id: &str) {
    let request = WebsocketRequest {
        id: request_id.to_string(),
        workspace_id: workspace_id.to_string(),
        name: "Seeded WebSocket Request".to_string(),
        url: "wss://example.com/socket".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_websocket_request(&request, &UpdateSource::Sync)
        .expect("Failed to seed WebSocket request");
}
80
crates-cli/yaak-cli/tests/environment_commands.rs
Normal file
@@ -0,0 +1,80 @@
|
|||||||
|
mod common;
|
||||||
|
|
||||||
|
use common::{cli_cmd, parse_created_id, query_manager, seed_workspace};
|
||||||
|
use predicates::str::contains;
|
||||||
|
use tempfile::TempDir;
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn create_list_show_delete_round_trip() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let data_dir = temp_dir.path();
|
||||||
|
seed_workspace(data_dir, "wk_test");
|
||||||
|
|
||||||
|
cli_cmd(data_dir)
|
||||||
|
.args(["environment", "list", "wk_test"])
|
||||||
|
.assert()
|
||||||
|
.success()
|
||||||
|
.stdout(contains("Global Variables"));
|
||||||
|
|
||||||
|
let create_assert = cli_cmd(data_dir)
|
||||||
|
.args(["environment", "create", "wk_test", "--name", "Production"])
|
||||||
|
.assert()
|
||||||
|
.success();
|
||||||
|
    let environment_id = parse_created_id(&create_assert.get_output().stdout, "environment create");

    cli_cmd(data_dir)
        .args(["environment", "list", "wk_test"])
        .assert()
        .success()
        .stdout(contains(&environment_id))
        .stdout(contains("Production"));

    cli_cmd(data_dir)
        .args(["environment", "show", &environment_id])
        .assert()
        .success()
        .stdout(contains(format!("\"id\": \"{environment_id}\"")))
        .stdout(contains("\"parentModel\": \"environment\""));

    cli_cmd(data_dir)
        .args(["environment", "delete", &environment_id, "--yes"])
        .assert()
        .success()
        .stdout(contains(format!("Deleted environment: {environment_id}")));

    assert!(query_manager(data_dir).connect().get_environment(&environment_id).is_err());
}

#[test]
fn json_create_and_update_merge_patch_round_trip() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");

    let create_assert = cli_cmd(data_dir)
        .args([
            "environment",
            "create",
            r#"{"workspaceId":"wk_test","name":"Json Environment"}"#,
        ])
        .assert()
        .success();
    let environment_id = parse_created_id(&create_assert.get_output().stdout, "environment create");

    cli_cmd(data_dir)
        .args([
            "environment",
            "update",
            &format!(r##"{{"id":"{}","color":"#00ff00"}}"##, environment_id),
        ])
        .assert()
        .success()
        .stdout(contains(format!("Updated environment: {environment_id}")));

    cli_cmd(data_dir)
        .args(["environment", "show", &environment_id])
        .assert()
        .success()
        .stdout(contains("\"name\": \"Json Environment\""))
        .stdout(contains("\"color\": \"#00ff00\""));
}
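The `update` subcommands in these tests apply a JSON merge patch: fields present in the payload are set, fields absent are left untouched. The actual CLI implementation is not shown in this diff, but the semantics it exercises (RFC 7386) can be sketched with a minimal, std-only value type; the `Value` enum and `merge_patch` function below are illustrative assumptions, not the project's real types.

```rust
use std::collections::BTreeMap;

// Minimal JSON-ish value, just enough to show merge-patch semantics.
#[derive(Clone, Debug, PartialEq)]
enum Value {
    Null,
    Str(String),
    Obj(BTreeMap<String, Value>),
}

// RFC 7386 merge patch: objects merge recursively, `null` deletes a key,
// any other value replaces the target wholesale.
fn merge_patch(target: Value, patch: Value) -> Value {
    match patch {
        Value::Obj(patch_map) => {
            let mut base = match target {
                Value::Obj(map) => map,
                _ => BTreeMap::new(), // a non-object target is discarded
            };
            for (key, patch_val) in patch_map {
                if patch_val == Value::Null {
                    base.remove(&key); // null means "delete this field"
                } else {
                    let existing = base.remove(&key).unwrap_or(Value::Null);
                    base.insert(key, merge_patch(existing, patch_val));
                }
            }
            Value::Obj(base)
        }
        other => other, // scalars replace the target entirely
    }
}

fn main() {
    // Mirrors the test above: patch adds "color", leaves "name" untouched.
    let mut env = BTreeMap::new();
    env.insert("name".to_string(), Value::Str("Json Environment".to_string()));
    let mut patch = BTreeMap::new();
    patch.insert("color".to_string(), Value::Str("#00ff00".to_string()));

    let merged = merge_patch(Value::Obj(env), Value::Obj(patch));
    if let Value::Obj(map) = &merged {
        assert_eq!(map.get("name"), Some(&Value::Str("Json Environment".to_string())));
        assert_eq!(map.get("color"), Some(&Value::Str("#00ff00".to_string())));
    }
    println!("merged: {merged:?}");
}
```

This is why the round-trip assertions check both the patched field (`color`) and an unpatched one (`name`): a merge patch must not clobber fields it does not mention.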
74  crates-cli/yaak-cli/tests/folder_commands.rs  Normal file
@@ -0,0 +1,74 @@
mod common;

use common::{cli_cmd, parse_created_id, query_manager, seed_workspace};
use predicates::str::contains;
use tempfile::TempDir;

#[test]
fn create_list_show_delete_round_trip() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");

    let create_assert = cli_cmd(data_dir)
        .args(["folder", "create", "wk_test", "--name", "Auth"])
        .assert()
        .success();
    let folder_id = parse_created_id(&create_assert.get_output().stdout, "folder create");

    cli_cmd(data_dir)
        .args(["folder", "list", "wk_test"])
        .assert()
        .success()
        .stdout(contains(&folder_id))
        .stdout(contains("Auth"));

    cli_cmd(data_dir)
        .args(["folder", "show", &folder_id])
        .assert()
        .success()
        .stdout(contains(format!("\"id\": \"{folder_id}\"")))
        .stdout(contains("\"workspaceId\": \"wk_test\""));

    cli_cmd(data_dir)
        .args(["folder", "delete", &folder_id, "--yes"])
        .assert()
        .success()
        .stdout(contains(format!("Deleted folder: {folder_id}")));

    assert!(query_manager(data_dir).connect().get_folder(&folder_id).is_err());
}

#[test]
fn json_create_and_update_merge_patch_round_trip() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");

    let create_assert = cli_cmd(data_dir)
        .args([
            "folder",
            "create",
            r#"{"workspaceId":"wk_test","name":"Json Folder"}"#,
        ])
        .assert()
        .success();
    let folder_id = parse_created_id(&create_assert.get_output().stdout, "folder create");

    cli_cmd(data_dir)
        .args([
            "folder",
            "update",
            &format!(r#"{{"id":"{}","description":"Folder Description"}}"#, folder_id),
        ])
        .assert()
        .success()
        .stdout(contains(format!("Updated folder: {folder_id}")));

    cli_cmd(data_dir)
        .args(["folder", "show", &folder_id])
        .assert()
        .success()
        .stdout(contains("\"name\": \"Json Folder\""))
        .stdout(contains("\"description\": \"Folder Description\""));
}
224  crates-cli/yaak-cli/tests/request_commands.rs  Normal file
@@ -0,0 +1,224 @@
mod common;

use common::http_server::TestHttpServer;
use common::{
    cli_cmd, parse_created_id, query_manager, seed_grpc_request, seed_request,
    seed_websocket_request, seed_workspace,
};
use predicates::str::contains;
use tempfile::TempDir;
use yaak_models::models::HttpResponseState;

#[test]
fn show_and_delete_yes_round_trip() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");

    let create_assert = cli_cmd(data_dir)
        .args([
            "request",
            "create",
            "wk_test",
            "--name",
            "Smoke Test",
            "--url",
            "https://example.com",
        ])
        .assert()
        .success();

    let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");

    cli_cmd(data_dir)
        .args(["request", "show", &request_id])
        .assert()
        .success()
        .stdout(contains(format!("\"id\": \"{request_id}\"")))
        .stdout(contains("\"workspaceId\": \"wk_test\""));

    cli_cmd(data_dir)
        .args(["request", "delete", &request_id, "--yes"])
        .assert()
        .success()
        .stdout(contains(format!("Deleted request: {request_id}")));

    assert!(query_manager(data_dir).connect().get_http_request(&request_id).is_err());
}

#[test]
fn delete_without_yes_fails_in_non_interactive_mode() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");
    seed_request(data_dir, "wk_test", "rq_seed_delete_noninteractive");

    cli_cmd(data_dir)
        .args(["request", "delete", "rq_seed_delete_noninteractive"])
        .assert()
        .failure()
        .code(1)
        .stderr(contains("Refusing to delete in non-interactive mode without --yes"));

    assert!(
        query_manager(data_dir).connect().get_http_request("rq_seed_delete_noninteractive").is_ok()
    );
}
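The guard this test exercises refuses destructive actions when stdin is not a terminal and `--yes` was not passed, so CI pipelines fail fast instead of hanging on a prompt. A plausible std-only sketch of such a guard follows; `confirm_delete` and its shape are hypothetical, not the CLI's actual code (`std::io::IsTerminal` is stable since Rust 1.70).

```rust
use std::io::IsTerminal;

// Hypothetical sketch of a destructive-command guard: `--yes` always
// proceeds; without it, a non-TTY stdin (CI, pipes) must refuse rather
// than block on an interactive prompt.
fn confirm_delete(yes_flag: bool, prompt: impl Fn() -> bool) -> Result<(), String> {
    if yes_flag {
        return Ok(()); // explicit consent via the flag
    }
    if !std::io::stdin().is_terminal() {
        // Same error string the test asserts on stderr.
        return Err("Refusing to delete in non-interactive mode without --yes".to_string());
    }
    // Interactive session: fall back to asking the user.
    if prompt() { Ok(()) } else { Err("Aborted".to_string()) }
}

fn main() {
    // With --yes the guard never consults the terminal or the prompt.
    assert!(confirm_delete(true, || false).is_ok());
    println!("guard ok");
}
```

Note the test also verifies the seeded request still exists afterwards, confirming the refusal happened before any database write.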
#[test]
fn json_create_and_update_merge_patch_round_trip() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");

    let create_assert = cli_cmd(data_dir)
        .args([
            "request",
            "create",
            r#"{"workspaceId":"wk_test","name":"Json Request","url":"https://example.com"}"#,
        ])
        .assert()
        .success();
    let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");

    cli_cmd(data_dir)
        .args([
            "request",
            "update",
            &format!(r#"{{"id":"{}","name":"Renamed Request"}}"#, request_id),
        ])
        .assert()
        .success()
        .stdout(contains(format!("Updated request: {request_id}")));

    cli_cmd(data_dir)
        .args(["request", "show", &request_id])
        .assert()
        .success()
        .stdout(contains("\"name\": \"Renamed Request\""))
        .stdout(contains("\"url\": \"https://example.com\""));
}

#[test]
fn update_requires_id_in_json_payload() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();

    cli_cmd(data_dir)
        .args(["request", "update", r#"{"name":"No ID"}"#])
        .assert()
        .failure()
        .stderr(contains("request update requires a non-empty \"id\" field"));
}

#[test]
fn create_allows_workspace_only_with_empty_defaults() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");

    let create_assert = cli_cmd(data_dir).args(["request", "create", "wk_test"]).assert().success();
    let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");

    let request = query_manager(data_dir)
        .connect()
        .get_http_request(&request_id)
        .expect("Failed to load created request");
    assert_eq!(request.workspace_id, "wk_test");
    assert_eq!(request.method, "GET");
    assert_eq!(request.name, "");
    assert_eq!(request.url, "");
}

#[test]
fn request_send_persists_response_body_and_events() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");

    let server = TestHttpServer::spawn_ok("hello from integration test");

    let create_assert = cli_cmd(data_dir)
        .args([
            "request",
            "create",
            "wk_test",
            "--name",
            "Send Test",
            "--url",
            &server.url,
        ])
        .assert()
        .success();
    let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");

    cli_cmd(data_dir)
        .args(["request", "send", &request_id])
        .assert()
        .success()
        .stdout(contains("HTTP 200 OK"))
        .stdout(contains("hello from integration test"));

    let qm = query_manager(data_dir);
    let db = qm.connect();
    let responses =
        db.list_http_responses_for_request(&request_id, None).expect("Failed to load responses");
    assert_eq!(responses.len(), 1, "expected exactly one persisted response");

    let response = &responses[0];
    assert_eq!(response.status, 200);
    assert!(matches!(response.state, HttpResponseState::Closed));
    assert!(response.error.is_none());

    let body_path =
        response.body_path.as_ref().expect("expected persisted response body path").to_string();
    let body = std::fs::read_to_string(&body_path).expect("Failed to read response body file");
    assert_eq!(body, "hello from integration test");

    let events =
        db.list_http_response_events(&response.id).expect("Failed to load response events");
    assert!(!events.is_empty(), "expected at least one persisted response event");
}

#[test]
fn request_schema_http_outputs_json_schema() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();

    cli_cmd(data_dir)
        .args(["request", "schema", "http"])
        .assert()
        .success()
        .stdout(contains("\"type\": \"object\""))
        .stdout(contains("\"authentication\""));
}

#[test]
fn request_send_grpc_returns_explicit_nyi_error() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");
    seed_grpc_request(data_dir, "wk_test", "gr_seed_nyi");

    cli_cmd(data_dir)
        .args(["request", "send", "gr_seed_nyi"])
        .assert()
        .failure()
        .code(1)
        .stderr(contains("gRPC request send is not implemented yet in yaak-cli"));
}

#[test]
fn request_send_websocket_returns_explicit_nyi_error() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");
    seed_websocket_request(data_dir, "wk_test", "wr_seed_nyi");

    cli_cmd(data_dir)
        .args(["request", "send", "wr_seed_nyi"])
        .assert()
        .failure()
        .code(1)
        .stderr(contains("WebSocket request send is not implemented yet in yaak-cli"));
}
81  crates-cli/yaak-cli/tests/send_commands.rs  Normal file
@@ -0,0 +1,81 @@
mod common;

use common::http_server::TestHttpServer;
use common::{cli_cmd, query_manager, seed_folder, seed_workspace};
use predicates::str::contains;
use tempfile::TempDir;
use yaak_models::models::HttpRequest;
use yaak_models::util::UpdateSource;

#[test]
fn top_level_send_workspace_sends_http_requests_and_prints_summary() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");

    let server = TestHttpServer::spawn_ok("workspace bulk send");
    let request = HttpRequest {
        id: "rq_workspace_send".to_string(),
        workspace_id: "wk_test".to_string(),
        name: "Workspace Send".to_string(),
        method: "GET".to_string(),
        url: server.url.clone(),
        ..Default::default()
    };
    query_manager(data_dir)
        .connect()
        .upsert_http_request(&request, &UpdateSource::Sync)
        .expect("Failed to seed workspace request");

    cli_cmd(data_dir)
        .args(["send", "wk_test"])
        .assert()
        .success()
        .stdout(contains("HTTP 200 OK"))
        .stdout(contains("workspace bulk send"))
        .stdout(contains("Send summary: 1 succeeded, 0 failed"));
}

#[test]
fn top_level_send_folder_sends_http_requests_and_prints_summary() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();
    seed_workspace(data_dir, "wk_test");
    seed_folder(data_dir, "wk_test", "fl_test");

    let server = TestHttpServer::spawn_ok("folder bulk send");
    let request = HttpRequest {
        id: "rq_folder_send".to_string(),
        workspace_id: "wk_test".to_string(),
        folder_id: Some("fl_test".to_string()),
        name: "Folder Send".to_string(),
        method: "GET".to_string(),
        url: server.url.clone(),
        ..Default::default()
    };
    query_manager(data_dir)
        .connect()
        .upsert_http_request(&request, &UpdateSource::Sync)
        .expect("Failed to seed folder request");

    cli_cmd(data_dir)
        .args(["send", "fl_test"])
        .assert()
        .success()
        .stdout(contains("HTTP 200 OK"))
        .stdout(contains("folder bulk send"))
        .stdout(contains("Send summary: 1 succeeded, 0 failed"));
}

#[test]
fn top_level_send_unknown_id_fails_with_clear_error() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();

    cli_cmd(data_dir)
        .args(["send", "does_not_exist"])
        .assert()
        .failure()
        .code(1)
        .stderr(contains("Could not resolve ID 'does_not_exist' as request, folder, or workspace"));
}
59  crates-cli/yaak-cli/tests/workspace_commands.rs  Normal file
@@ -0,0 +1,59 @@
mod common;

use common::{cli_cmd, parse_created_id, query_manager};
use predicates::str::contains;
use tempfile::TempDir;

#[test]
fn create_show_delete_round_trip() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();

    let create_assert =
        cli_cmd(data_dir).args(["workspace", "create", "--name", "WS One"]).assert().success();
    let workspace_id = parse_created_id(&create_assert.get_output().stdout, "workspace create");

    cli_cmd(data_dir)
        .args(["workspace", "show", &workspace_id])
        .assert()
        .success()
        .stdout(contains(format!("\"id\": \"{workspace_id}\"")))
        .stdout(contains("\"name\": \"WS One\""));

    cli_cmd(data_dir)
        .args(["workspace", "delete", &workspace_id, "--yes"])
        .assert()
        .success()
        .stdout(contains(format!("Deleted workspace: {workspace_id}")));

    assert!(query_manager(data_dir).connect().get_workspace(&workspace_id).is_err());
}

#[test]
fn json_create_and_update_merge_patch_round_trip() {
    let temp_dir = TempDir::new().expect("Failed to create temp dir");
    let data_dir = temp_dir.path();

    let create_assert = cli_cmd(data_dir)
        .args(["workspace", "create", r#"{"name":"Json Workspace"}"#])
        .assert()
        .success();
    let workspace_id = parse_created_id(&create_assert.get_output().stdout, "workspace create");

    cli_cmd(data_dir)
        .args([
            "workspace",
            "update",
            &format!(r#"{{"id":"{}","description":"Updated via JSON"}}"#, workspace_id),
        ])
        .assert()
        .success()
        .stdout(contains(format!("Updated workspace: {workspace_id}")));

    cli_cmd(data_dir)
        .args(["workspace", "show", &workspace_id])
        .assert()
        .success()
        .stdout(contains("\"name\": \"Json Workspace\""))
        .stdout(contains("\"description\": \"Updated via JSON\""));
}
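Every test file above leans on a shared `parse_created_id` helper from the `common` module, whose implementation does not appear in this diff. Based on the "Created …: <id>" lines the tests assert on, a plausible sketch (purely illustrative; the real helper may differ) is:

```rust
// Plausible sketch of the tests' shared parse_created_id helper: scan CLI
// stdout for a line like "Created workspace: wk_abc123" and return the
// trailing ID, panicking with the calling context on failure.
fn parse_created_id(stdout: &[u8], context: &str) -> String {
    let text = String::from_utf8_lossy(stdout);
    text.lines()
        .filter(|line| line.starts_with("Created "))
        .find_map(|line| line.rsplit_once(": ").map(|(_, id)| id.trim().to_string()))
        .filter(|id| !id.is_empty())
        .unwrap_or_else(|| panic!("no created ID found in output of `{context}`"))
}

fn main() {
    let out = b"Created workspace: wk_123\n";
    assert_eq!(parse_created_id(out, "workspace create"), "wk_123");
    println!("parsed: {}", parse_created_id(out, "workspace create"));
}
```

Passing the command name as `context` keeps panic messages attributable when several CLI invocations appear in one test.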
@@ -1,21 +1,3 @@
-[workspace]
-members = [
-    "yaak-crypto",
-    "yaak-fonts",
-    "yaak-git",
-    "yaak-grpc",
-    "yaak-http",
-    "yaak-license",
-    "yaak-mac-window",
-    "yaak-models",
-    "yaak-plugins",
-    "yaak-sse",
-    "yaak-sync",
-    "yaak-templates",
-    "yaak-tls",
-    "yaak-ws",
-]
-
 [package]
 name = "yaak-app"
 version = "0.0.0"
@@ -28,11 +10,6 @@ publish = false
 name = "tauri_app_lib"
 crate-type = ["staticlib", "cdylib", "lib"]
-
-[profile.release]
-# Currently disabled due to:
-# Warn Failed to add bundler type to the binary: __TAURI_BUNDLE_TYPE variable not found in binary. Make sure tauri crate and tauri-cli are up to date and that symbol stripping is disabled (https://doc.rust-lang.org/cargo/reference/profiles.html#strip). Updater plugin may not be able to update this package. This shouldn't normally happen, please report it to https://github.com/tauri-apps/tauri/issues
-strip = false
 
 [features]
 cargo-clippy = []
 default = []
@@ -53,6 +30,8 @@ eventsource-client = { git = "https://github.com/yaakapp/rust-eventsource-client
 http = { version = "1.2.0", default-features = false }
 log = { workspace = true }
 md5 = "0.8.0"
+r2d2 = "0.8.10"
+r2d2_sqlite = "0.25.0"
 mime_guess = "2.0.5"
 rand = "0.9.0"
 reqwest = { workspace = true, features = ["multipart", "gzip", "brotli", "deflate", "json", "rustls-tls-manual-roots-no-provider", "socks", "http2"] }
@@ -73,50 +52,27 @@ tauri-plugin-window-state = "2.4.1"
 thiserror = { workspace = true }
 tokio = { workspace = true, features = ["sync"] }
 tokio-stream = "0.1.17"
+tokio-tungstenite = { version = "0.26.2", default-features = false }
+url = "2"
 tokio-util = { version = "0.7", features = ["codec"] }
 ts-rs = { workspace = true }
 uuid = "1.12.1"
+yaak-api = { workspace = true }
 yaak-common = { workspace = true }
+yaak-tauri-utils = { workspace = true }
+yaak-core = { workspace = true }
+yaak = { workspace = true }
 yaak-crypto = { workspace = true }
 yaak-fonts = { workspace = true }
-yaak-git = { path = "yaak-git" }
+yaak-git = { workspace = true }
-yaak-grpc = { path = "yaak-grpc" }
+yaak-grpc = { workspace = true }
 yaak-http = { workspace = true }
-yaak-license = { path = "yaak-license", optional = true }
+yaak-license = { workspace = true, optional = true }
-yaak-mac-window = { path = "yaak-mac-window" }
+yaak-mac-window = { workspace = true }
 yaak-models = { workspace = true }
 yaak-plugins = { workspace = true }
 yaak-sse = { workspace = true }
 yaak-sync = { workspace = true }
 yaak-templates = { workspace = true }
 yaak-tls = { workspace = true }
-yaak-ws = { path = "yaak-ws" }
+yaak-ws = { workspace = true }
-
-[workspace.dependencies]
-chrono = "0.4.42"
-hex = "0.4.3"
-keyring = "3.6.3"
-reqwest = "0.12.20"
-rustls = { version = "0.23.34", default-features = false }
-rustls-platform-verifier = "0.6.2"
-serde = "1.0.228"
-serde_json = "1.0.145"
-sha2 = "0.10.9"
-log = "0.4.29"
-tauri = "2.9.5"
-tauri-plugin = "2.5.2"
-tauri-plugin-dialog = "2.4.2"
-tauri-plugin-shell = "2.3.3"
-thiserror = "2.0.17"
-tokio = "1.48.0"
-ts-rs = "11.1.0"
-yaak-common = { path = "yaak-common" }
-yaak-crypto = { path = "yaak-crypto" }
-yaak-fonts = { path = "yaak-fonts" }
-yaak-http = { path = "yaak-http" }
-yaak-models = { path = "yaak-models" }
-yaak-plugins = { path = "yaak-plugins" }
-yaak-sse = { path = "yaak-sse" }
-yaak-sync = { path = "yaak-sync" }
-yaak-templates = { path = "yaak-templates" }
-yaak-tls = { path = "yaak-tls" }
3  crates-tauri/yaak-app/bindings/gen_watch.ts  generated  Normal file
@@ -0,0 +1,3 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

export type WatchResult = { unlistenEvent: string, };
@@ -10,6 +10,8 @@ export type UpdateResponse = { "type": "ack" } | { "type": "action", action: Upd
 export type UpdateResponseAction = "install" | "skip";
 
+export type WatchResult = { unlistenEvent: string, };
+
 export type YaakNotification = { timestamp: string, timeout: number | null, id: string, title: string | null, message: string, color: string | null, action: YaakNotificationAction | null, };
 
 export type YaakNotificationAction = { label: string, url: string, };
5  crates-tauri/yaak-app/bindings/plugins_ext.ts  generated  Normal file
@@ -0,0 +1,5 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

export type PluginUpdateInfo = { name: string, currentVersion: string, latestVersion: string, };

export type PluginUpdateNotification = { updateCount: number, plugins: Array<PluginUpdateInfo>, };
@@ -51,13 +51,7 @@
     "opener:allow-open-url",
     "opener:allow-reveal-item-in-dir",
     "shell:allow-open",
-    "yaak-crypto:default",
     "yaak-fonts:default",
-    "yaak-git:default",
-    "yaak-mac-window:default",
-    "yaak-models:default",
-    "yaak-plugins:default",
-    "yaak-sync:default",
-    "yaak-ws:default"
+    "yaak-mac-window:default"
   ]
 }
(binary image assets: Before/After dimensions and file sizes unchanged)