Mirror of https://github.com/mountain-loop/yaak.git
Synced 2026-04-12 20:19:37 +02:00

Compare commits: 2 commits, `main`...`actions-sy`

| Author | SHA1 | Date |
|---|---|---|
|  | 986143c4ae |  |
|  | 50b0e23d53 |  |
# Claude Context: Detaching Tauri from Yaak

## Goal

Make Yaak runnable as a standalone CLI without Tauri as a dependency. The core Rust crates in `crates/` should be usable independently, while Tauri-specific code lives in `crates-tauri/`.

## Project Structure

```
crates/        # Core crates - should NOT depend on Tauri
crates-tauri/  # Tauri-specific crates (yaak-app, yaak-tauri-utils, etc.)
crates-cli/    # CLI crate (yaak-cli)
```

## Completed Work

### 1. Folder Restructure

- Moved Tauri-dependent app code to `crates-tauri/yaak-app/`
- Created `crates-tauri/yaak-tauri-utils/` for shared Tauri utilities (window traits, api_client, error handling)
- Created `crates-cli/yaak-cli/` for the standalone CLI

### 2. Decoupled Crates (no longer depend on Tauri)

- **yaak-models**: Uses the `init_standalone()` pattern for CLI database access
- **yaak-http**: Removed the Tauri plugin; `HttpConnectionManager` is initialized in yaak-app setup
- **yaak-common**: Contains only Tauri-free utilities (serde, platform)
- **yaak-grpc**: Replaced `AppHandle` with a `GrpcConfig` struct; uses `tokio::process::Command` instead of a Tauri sidecar

### 3. CLI Implementation

- Basic CLI at `crates-cli/yaak-cli/src/main.rs`
- Commands: workspaces, requests, send (by ID), get (ad-hoc URL), create
- Uses the same database as the Tauri app via `yaak_models::init_standalone()`

## Remaining Work

### Crates Still Depending on Tauri (in `crates/`)

1. **yaak-git** (3 files) - Moderate complexity
2. **yaak-plugins** (13 files) - **Hardest**; deeply integrated with Tauri for plugin-to-window communication
3. **yaak-sync** (4 files) - Moderate complexity
4. **yaak-ws** (5 files) - Moderate complexity

### Pattern for Decoupling

1. Remove the Tauri plugin `init()` function from the crate
2. Move commands to `yaak-app/src/commands.rs` or keep them inline in `lib.rs`
3. Move extension traits (e.g., `SomethingManagerExt`) to yaak-app or yaak-tauri-utils
...
7. Replace `tauri::async_runtime::block_on` with `tokio::runtime::Handle::current().block_on()`

## Key Files

- `crates-tauri/yaak-app/src/lib.rs` - Main Tauri app; its setup block initializes the managers
- `crates-tauri/yaak-app/src/commands.rs` - Migrated Tauri commands
- `crates-tauri/yaak-app/src/models_ext.rs` - Database plugin and extension traits
- `crates/yaak-models/src/lib.rs` - Contains `init_standalone()` for CLI usage

## Git Branch

Working on the `detach-tauri` branch.

## Recent Commits

```
c40cff40 Remove Tauri dependencies from yaak-crypto and yaak-grpc
df495f1d Move Tauri utilities from yaak-common to yaak-tauri-utils
...
e718a5f1 Refactor models_ext to use init_standalone from yaak-models
```

## Testing

- Run `cargo check -p <crate>` to verify a crate builds without Tauri
- Run `npm run app-dev` to verify the Tauri app still works
- Run `cargo run -p yaak-cli -- --help` to test the CLI
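The per-crate check above can be looped over the decoupled crates. A minimal dry-run sketch (crate names are taken from the "Decoupled Crates" list above; the script only prints the commands, so it is safe to run anywhere — remove the `echo` to actually run the checks):

```shell
# Dry-run sketch: print a `cargo check` command for each crate the
# doc lists as decoupled. Remove `echo` to execute the checks.
for crate in yaak-models yaak-http yaak-common yaak-grpc; do
  echo "cargo check -p $crate"
done
```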
**.claude/commands/release/check-out-pr.md** (new file, 51 lines)

---
description: Review a PR in a new worktree
allowed-tools: Bash(git worktree:*), Bash(gh pr:*)
---

Review a GitHub pull request in a new git worktree.

## Usage

```
/review-pr <PR_NUMBER>
```

## What to do

1. List all open pull requests and ask the user to select one
2. Get PR information using `gh pr view <PR_NUMBER> --json number,headRefName`
3. Extract the branch name from the PR
4. Create a new worktree at `../yaak-worktrees/pr-<PR_NUMBER>` using `git worktree add`, with a timeout of at least 300000ms (5 minutes), since the post-checkout hook runs a bootstrap script
5. Check out the PR branch in the new worktree using `gh pr checkout <PR_NUMBER>`
6. The post-checkout hook will automatically:
   - Create `.env.local` with unique ports
   - Copy editor config folders
   - Run `npm install && npm run bootstrap`
7. Inform the user:
   - Where the worktree was created
   - What ports were assigned
   - How to access it (cd command)
   - How to run the dev server
   - How to remove the worktree when done

## Example Output

```
Created worktree for PR #123 at ../yaak-worktrees/pr-123
Branch: feature-auth
Ports: Vite (1421), MCP (64344)

To start working:
  cd ../yaak-worktrees/pr-123
  npm run app-dev

To remove when done:
  git worktree remove ../yaak-worktrees/pr-123
```

## Error Handling

- If the PR doesn't exist, show a helpful error
- If the worktree already exists, inform the user and ask whether to remove and recreate it
- If the `gh` CLI is not available, tell the user to install it
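The steps above boil down to a short command sequence. A dry-run sketch for a hypothetical PR number (commands are echoed rather than executed, so nothing here touches a real repository):

```shell
# Dry-run sketch of the review flow above for a hypothetical PR #123.
# Commands are printed, not run.
PR=123
WORKTREE="../yaak-worktrees/pr-$PR"
echo "git worktree add $WORKTREE"
echo "cd $WORKTREE && gh pr checkout $PR"
echo "cd $WORKTREE && npm run app-dev"
```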
Generate formatted release notes for Yaak releases by analyzing git history and ...

## What to do

1. Identify the version tag and the previous version
2. Retrieve all commits between the versions
   - If the version is a beta version, retrieve commits between it and the previous beta version
   - If the version is a stable version, retrieve commits between it and the previous stable version
3. Fetch PR descriptions for linked issues to find:
...
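The beta-vs-stable branch in step 2 is just a suffix test on the version string. A minimal sketch (version strings here are hypothetical examples):

```shell
# Sketch: classify a version as beta or stable, per step 2 above.
# A pre-release suffix (anything after "-") means beta.
classify() {
  case "$1" in
    *-*) echo "beta" ;;
    *)   echo "stable" ;;
  esac
}
classify "2026.2.1-beta.1"   # -> beta
classify "2026.2.1"          # -> stable
```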
The skill generates markdown-formatted release notes following this structure:

...

**IMPORTANT**: Always add blank lines around the markdown code fence, and output the markdown code block last.
**IMPORTANT**: PRs by `@gschier` should not mention the @username.

## After Generating Release Notes

After outputting the release notes, ask the user if they would like to create a draft GitHub release with these notes. If they confirm, create the release using:

```bash
gh release create <tag> --draft --prerelease --title "<tag>" --notes '<release notes>'
```

Removed in this change:

- **IMPORTANT**: These are app release notes. Exclude CLI-only changes (commits prefixed with `cli:` or only touching `crates-cli/`) since the CLI has its own release process.
- **IMPORTANT**: The release title format is "Release XXXX", where XXXX is the version WITHOUT the `v` prefix. For example, tag `v2026.2.1-beta.1` gets the title "Release 2026.2.1-beta.1".
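The CLI-exclusion rule mentioned above is a simple prefix filter over commit subjects. A sketch over hypothetical commit lines (the hashes and messages are made up for illustration):

```shell
# Sketch: drop commits whose subject carries the "cli:" prefix,
# per the app-release-notes exclusion rule. Commit lines are hypothetical.
printf '%s\n' \
  "abc1234 cli: add send command" \
  "def5678 Fix request rename bug" \
| grep -v ' cli:'   # -> def5678 Fix request rename bug
```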
**.claude/skills/worktree.md** (new file, 35 lines)

# Worktree Management Skill

## Creating Worktrees

When creating git worktrees for this project, ALWAYS use the path format:

```
../yaak-worktrees/<NAME>
```

For example:

- `git worktree add ../yaak-worktrees/feature-auth`
- `git worktree add ../yaak-worktrees/bugfix-login`
- `git worktree add ../yaak-worktrees/refactor-api`

## What Happens Automatically

The post-checkout hook will automatically:

1. Create `.env.local` with unique ports (YAAK_DEV_PORT and YAAK_PLUGIN_MCP_SERVER_PORT)
2. Copy gitignored editor config folders (.zed, .idea, etc.)
3. Run `npm install && npm run bootstrap`

## Deleting Worktrees

```bash
git worktree remove ../yaak-worktrees/<NAME>
```

## Port Assignments

- Main worktree: 1420 (Vite), 64343 (MCP)
- First worktree: 1421, 64344
- Second worktree: 1422, 64345
- etc.

Each worktree can run `npm run app-dev` simultaneously without conflicts.
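The port scheme above is a fixed offset per worktree index. A sketch (index 0 is the main worktree; how the hook actually derives the index is not shown in the doc, so the `index` variable here is illustrative):

```shell
# Sketch: derive a worktree's ports from its index, following the
# table above (index 0 = main worktree, 1 = first worktree, ...).
index=2
vite_port=$((1420 + index))
mcp_port=$((64343 + index))
echo "Vite: $vite_port, MCP: $mcp_port"   # -> Vite: 1422, MCP: 64345
```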
Deleted (49 lines):

---
name: release-generate-release-notes
description: Generate Yaak release notes from git history and PR metadata, including feedback links and full changelog compare links. Use when asked to run or replace the old Claude generate-release-notes command.
---

# Generate Release Notes

Generate formatted markdown release notes for a Yaak tag.

## Workflow

1. Determine the target tag.
2. Determine the previous comparable tag:
   - Beta tag: compare against the previous beta (if the root version is the same) or stable tag.
   - Stable tag: compare against the previous stable tag.
3. Collect commits in range:
   - `git log --oneline <prev_tag>..<target_tag>`
4. For linked PRs, fetch metadata:
   - `gh pr view <PR_NUMBER> --json number,title,body,author,url`
5. Extract useful details:
   - Feedback URLs (`feedback.yaak.app`)
   - Plugin install links or other notable context
6. Format notes using Yaak style:
   - Changelog badge at top
   - Bulleted items with PR links where available
   - Feedback links where available
   - Full changelog compare link at bottom

## Formatting Rules

- Wrap final notes in a markdown code fence.
- Keep a blank line before and after the code fence.
- Output the markdown code block last.
- Do not append `by @gschier` for PRs authored by `@gschier`.
- These are app release notes. Exclude CLI-only changes (commits prefixed with `cli:` or only touching `crates-cli/`) since the CLI has its own release process.

## Release Creation Prompt

After producing notes, ask whether to create a draft GitHub release.

If confirmed and the release does not yet exist, run:

`gh release create <tag> --draft --prerelease --title "Release <version_without_v>" --notes '<release notes>'`

If a draft release for the tag already exists, update it instead:

`gh release edit <tag> --title "Release <version_without_v>" --notes-file <path_to_notes>`

Use the title format `Release <version_without_v>`, e.g. `v2026.2.1-beta.1` -> `Release 2026.2.1-beta.1`.
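The title rule above is just a `v`-prefix strip on the tag. A minimal sketch using the example tag from the doc:

```shell
# Sketch: build the release title by stripping the tag's leading "v",
# per the `Release <version_without_v>` rule above.
tag="v2026.2.1-beta.1"
title="Release ${tag#v}"
echo "$title"   # -> Release 2026.2.1-beta.1
```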
**.github/ISSUE_TEMPLATE/bug_report.md** (vendored, 24 lines changed)

---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:

1. Go to '...'
2. Click on '....'
3. Scroll down to '....'

...

If applicable, add screenshots to help explain your problem.

**Desktop (please complete the following information):**

- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]

**Smartphone (please complete the following information):**

- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]

**Additional context**
Add any other context about the problem here.
**.github/pull_request_template.md** (vendored, deleted, 19 lines)

## Summary

<!-- Describe the bug and the fix in 1-3 sentences. -->

## Submission

- [ ] This PR is a bug fix or small-scope improvement.
- [ ] If this PR is not a bug fix or small-scope improvement, I linked an approved feedback item below.
- [ ] I have read and followed [`CONTRIBUTING.md`](CONTRIBUTING.md).
- [ ] I tested this change locally.
- [ ] I added or updated tests when reasonable.

Approved feedback item (required if not a bug fix or small-scope improvement):

<!-- https://yaak.app/feedback/... -->

## Related

<!-- Link related issues, discussions, or feedback items. -->
**.github/workflows/ci.yml** (vendored, 9 lines changed)

```yaml
    runs-on: macos-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4  # was: voidzero-dev/setup-vp@v1 (node-version 24, cache)
      - uses: dtolnay/rust-toolchain@stable
      - uses: Swatinem/rust-cache@v2
        with:
          shared-key: ci
          cache-on-failure: true

      - run: npm ci            # was: vp install
      - run: npm run bootstrap
      - run: npm run lint
      - name: Run JS Tests
        run: npm test          # was: vp test
      - name: Run Rust Tests
        run: cargo test --all
```
**.github/workflows/claude.yml** (vendored, 1 line changed)

```yaml
          # See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
          # or https://code.claude.com/docs/en/cli-reference for available options
          # claude_args: '--allowed-tools Bash(gh pr:*)'
```
**.github/workflows/flathub.yml** (vendored, deleted, 52 lines)

```yaml
name: Update Flathub
on:
  release:
    types: [published]

permissions:
  contents: read

jobs:
  update-flathub:
    name: Update Flathub manifest
    runs-on: ubuntu-latest
    # Only run for stable releases (skip betas/pre-releases)
    if: ${{ !github.event.release.prerelease }}
    steps:
      - name: Checkout app repo
        uses: actions/checkout@v4

      - name: Checkout Flathub repo
        uses: actions/checkout@v4
        with:
          repository: flathub/app.yaak.Yaak
          token: ${{ secrets.FLATHUB_TOKEN }}
          path: flathub-repo

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "22"

      - name: Install source generators
        run: |
          pip install flatpak-node-generator tomlkit aiohttp
          git clone --depth 1 https://github.com/flatpak/flatpak-builder-tools flatpak/flatpak-builder-tools

      - name: Run update-manifest.sh
        run: bash flatpak/update-manifest.sh "${{ github.event.release.tag_name }}" flathub-repo

      - name: Commit and push to Flathub
        working-directory: flathub-repo
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add -A
          git diff --cached --quiet && echo "No changes to commit" && exit 0
          git commit -m "Update to ${{ github.event.release.tag_name }}"
          git push
```
**.github/workflows/release-api-npm.yml** (vendored, deleted, 59 lines)

```yaml
name: Release API to NPM

on:
  push:
    tags: [yaak-api-*]
  workflow_dispatch:
    inputs:
      version:
        description: API version to publish (for example 0.9.0 or v0.9.0)
        required: true
        type: string

permissions:
  contents: read

jobs:
  publish-npm:
    name: Publish @yaakapp/api
    runs-on: ubuntu-latest
    permissions:
      contents: read
      id-token: write

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: lts/*
          registry-url: https://registry.npmjs.org

      - name: Install dependencies
        run: npm ci

      - name: Set @yaakapp/api version
        shell: bash
        env:
          WORKFLOW_VERSION: ${{ inputs.version }}
        run: |
          set -euo pipefail
          if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
            VERSION="$WORKFLOW_VERSION"
          else
            VERSION="${GITHUB_REF_NAME#yaak-api-}"
          fi
          VERSION="${VERSION#v}"
          echo "Preparing @yaakapp/api version: $VERSION"
          cd packages/plugin-runtime-types
          npm version "$VERSION" --no-git-tag-version --allow-same-version

      - name: Build @yaakapp/api
        working-directory: packages/plugin-runtime-types
        run: npm run build

      - name: Publish @yaakapp/api
        working-directory: packages/plugin-runtime-types
        run: npm publish --provenance --access public
```
**.github/workflows/release-cli-npm.yml** (vendored, deleted, 218 lines)

```yaml
name: Release CLI to NPM

on:
  push:
    tags: [yaak-cli-*]
  workflow_dispatch:
    inputs:
      version:
        description: CLI version to publish (for example 0.4.0 or v0.4.0)
        required: true
        type: string

permissions:
  contents: read

jobs:
  prepare-vendored-assets:
    name: Prepare vendored plugin assets
    runs-on: ubuntu-latest

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: lts/*

      - name: Install Rust stable
        uses: dtolnay/rust-toolchain@stable

      - name: Install dependencies
        run: npm ci

      - name: Build plugin assets
        env:
          SKIP_WASM_BUILD: "1"
        run: |
          npm run build
          npm run vendor:vendor-plugins

      - name: Upload vendored assets
        uses: actions/upload-artifact@v4
        with:
          name: vendored-assets
          path: |
            crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs
            crates-tauri/yaak-app/vendored/plugins
          if-no-files-found: error

  build-binaries:
    name: Build ${{ matrix.pkg }}
    needs: prepare-vendored-assets
    runs-on: ${{ matrix.runner }}
    strategy:
      fail-fast: false
      matrix:
        include:
          - pkg: cli-darwin-arm64
            runner: macos-latest
            target: aarch64-apple-darwin
            binary: yaak
          - pkg: cli-darwin-x64
            runner: macos-latest
            target: x86_64-apple-darwin
            binary: yaak
          - pkg: cli-linux-arm64
            runner: ubuntu-22.04-arm
            target: aarch64-unknown-linux-gnu
            binary: yaak
          - pkg: cli-linux-x64
            runner: ubuntu-22.04
            target: x86_64-unknown-linux-gnu
            binary: yaak
          - pkg: cli-win32-arm64
            runner: windows-latest
            target: aarch64-pc-windows-msvc
            binary: yaak.exe
          - pkg: cli-win32-x64
            runner: windows-latest
            target: x86_64-pc-windows-msvc
            binary: yaak.exe

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Install Rust stable
        uses: dtolnay/rust-toolchain@stable
        with:
          targets: ${{ matrix.target }}

      - name: Restore Rust cache
        uses: Swatinem/rust-cache@v2
        with:
          shared-key: release-cli-npm
          cache-on-failure: true

      - name: Install Linux build dependencies
        if: startsWith(matrix.runner, 'ubuntu')
        run: |
          sudo apt-get update
          sudo apt-get install -y pkg-config libdbus-1-dev

      - name: Download vendored assets
        uses: actions/download-artifact@v4
        with:
          name: vendored-assets
          path: crates-tauri/yaak-app/vendored

      - name: Set CLI build version
        shell: bash
        env:
          WORKFLOW_VERSION: ${{ inputs.version }}
        run: |
          set -euo pipefail
          if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
            VERSION="$WORKFLOW_VERSION"
          else
            VERSION="${GITHUB_REF_NAME#yaak-cli-}"
          fi
          VERSION="${VERSION#v}"
          echo "Building yaak version: $VERSION"
          echo "YAAK_CLI_VERSION=$VERSION" >> "$GITHUB_ENV"

      - name: Build yaak
        run: cargo build --locked --release -p yaak-cli --bin yaak --target ${{ matrix.target }}

      - name: Stage binary artifact
        shell: bash
        run: |
          set -euo pipefail
          mkdir -p "npm/dist/${{ matrix.pkg }}"
          cp "target/${{ matrix.target }}/release/${{ matrix.binary }}" "npm/dist/${{ matrix.pkg }}/${{ matrix.binary }}"

      - name: Upload binary artifact
        uses: actions/upload-artifact@v4
        with:
          name: ${{ matrix.pkg }}
          path: npm/dist/${{ matrix.pkg }}/${{ matrix.binary }}
          if-no-files-found: error

  publish-npm:
    name: Publish @yaakapp/cli packages
    needs: build-binaries
    runs-on: ubuntu-latest
    permissions:
      contents: read
      id-token: write

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: lts/*
          registry-url: https://registry.npmjs.org

      - name: Download binary artifacts
        uses: actions/download-artifact@v4
        with:
          pattern: cli-*
          path: npm/dist
          merge-multiple: false

      - name: Prepare npm packages
        shell: bash
        env:
          WORKFLOW_VERSION: ${{ inputs.version }}
        run: |
          set -euo pipefail
          if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
            VERSION="$WORKFLOW_VERSION"
          else
            VERSION="${GITHUB_REF_NAME#yaak-cli-}"
          fi
          VERSION="${VERSION#v}"
          if [[ "$VERSION" == *-* ]]; then
            PRERELEASE="${VERSION#*-}"
            NPM_TAG="${PRERELEASE%%.*}"
          else
            NPM_TAG="latest"
          fi
          echo "Preparing CLI npm packages for version: $VERSION"
          echo "Publishing with npm dist-tag: $NPM_TAG"
          echo "NPM_TAG=$NPM_TAG" >> "$GITHUB_ENV"
          YAAK_CLI_VERSION="$VERSION" node npm/prepare-publish.js

      - name: Publish @yaakapp/cli-darwin-arm64
        run: npm publish --provenance --access public --tag "$NPM_TAG"
        working-directory: npm/cli-darwin-arm64

      - name: Publish @yaakapp/cli-darwin-x64
        run: npm publish --provenance --access public --tag "$NPM_TAG"
        working-directory: npm/cli-darwin-x64

      - name: Publish @yaakapp/cli-linux-arm64
        run: npm publish --provenance --access public --tag "$NPM_TAG"
        working-directory: npm/cli-linux-arm64

      - name: Publish @yaakapp/cli-linux-x64
        run: npm publish --provenance --access public --tag "$NPM_TAG"
        working-directory: npm/cli-linux-x64

      - name: Publish @yaakapp/cli-win32-arm64
        run: npm publish --provenance --access public --tag "$NPM_TAG"
        working-directory: npm/cli-win32-arm64

      - name: Publish @yaakapp/cli-win32-x64
        run: npm publish --provenance --access public --tag "$NPM_TAG"
        working-directory: npm/cli-win32-x64

      - name: Publish @yaakapp/cli
        run: npm publish --provenance --access public --tag "$NPM_TAG"
        working-directory: npm/cli
```
```diff
@@ -1,4 +1,4 @@
-name: Release App Artifacts
+name: Generate Artifacts
 on:
   push:
     tags: [v*]
@@ -50,11 +50,8 @@ jobs:
       - name: Checkout yaakapp/app
        uses: actions/checkout@v4

-      - name: Setup Vite+
-        uses: voidzero-dev/setup-vp@v1
-        with:
-          node-version: "24"
-          cache: true
+      - name: Setup Node
+        uses: actions/setup-node@v4

      - name: install Rust stable
       uses: dtolnay/rust-toolchain@stable
@@ -90,15 +87,15 @@ jobs:
          echo $dir >> $env:GITHUB_PATH
          & $exe --version

-      - run: vp install
+      - run: npm ci
       - run: npm run bootstrap
        env:
          YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
      - run: npm run lint
      - name: Run JS Tests
-        run: vp test
+        run: npm test
      - name: Run Rust Tests
-        run: cargo test --all --exclude yaak-cli
+        run: cargo test --all

      - name: Set version
       run: npm run replace-version
@@ -156,27 +153,3 @@ jobs:
          releaseDraft: true
          prerelease: true
          args: "${{ matrix.args }} --config ./crates-tauri/yaak-app/tauri.release.conf.json"
-
-      # Build a per-machine NSIS installer for enterprise deployment (PDQ, SCCM, Intune)
-      - name: Build and upload machine-wide installer (Windows only)
-        if: matrix.os == 'windows'
-        shell: pwsh
-        env:
-          YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-          AZURE_CLIENT_ID: ${{ secrets.AZURE_CLIENT_ID }}
-          AZURE_CLIENT_SECRET: ${{ secrets.AZURE_CLIENT_SECRET }}
-          AZURE_TENANT_ID: ${{ secrets.AZURE_TENANT_ID }}
-          TAURI_SIGNING_PRIVATE_KEY: ${{ secrets.TAURI_PRIVATE_KEY }}
-          TAURI_SIGNING_PRIVATE_KEY_PASSWORD: ${{ secrets.TAURI_KEY_PASSWORD }}
-        run: |
-          Get-ChildItem -Recurse -Path target -File -Filter "*.exe.sig" | Remove-Item -Force
-          npx tauri bundle ${{ matrix.args }} --bundles nsis --config ./crates-tauri/yaak-app/tauri.release.conf.json --config '{"bundle":{"createUpdaterArtifacts":true,"windows":{"nsis":{"installMode":"perMachine"}}}}'
-          $setup = Get-ChildItem -Recurse -Path target -Filter "*setup*.exe" | Select-Object -First 1
-          $setupSig = "$($setup.FullName).sig"
-          $dest = $setup.FullName -replace '-setup\.exe$', '-setup-machine.exe'
-          $destSig = "$dest.sig"
-          Copy-Item $setup.FullName $dest
-          Copy-Item $setupSig $destSig
-          gh release upload "${{ github.ref_name }}" "$dest" --clobber
-          gh release upload "${{ github.ref_name }}" "$destSig" --clobber
```
.github/workflows/sponsors.yml (10 changes, vendored)

```diff
@@ -16,23 +16,23 @@ jobs:
        uses: JamesIves/github-sponsors-readme-action@v1
        with:
          token: ${{ secrets.SPONSORS_PAT }}
-          file: "README.md"
+          file: 'README.md'
          maximum: 1999
          template: '<a href="https://github.com/{{{ login }}}"><img src="{{{ avatarUrl }}}" width="50px" alt="User avatar: {{{ login }}}" /></a> '
          active-only: false
          include-private: true
-          marker: "sponsors-base"
+          marker: 'sponsors-base'

      - name: Generate Sponsors
       uses: JamesIves/github-sponsors-readme-action@v1
        with:
          token: ${{ secrets.SPONSORS_PAT }}
-          file: "README.md"
+          file: 'README.md'
          minimum: 2000
          template: '<a href="https://github.com/{{{ login }}}"><img src="{{{ avatarUrl }}}" width="80px" alt="User avatar: {{{ login }}}" /></a> '
          active-only: false
          include-private: true
-          marker: "sponsors-premium"
+          marker: 'sponsors-premium'

      # ⚠️ Note: You can use any deployment step here to automatically push the README
      # changes back to your branch.
@@ -41,4 +41,4 @@ jobs:
        with:
          branch: main
          force: false
-          folder: "."
+          folder: '.'
```
.gitignore (13 changes, vendored)

```diff
@@ -44,16 +44,3 @@ crates-tauri/yaak-app/tauri.worktree.conf.json
 # Tauri auto-generated permission files
 **/permissions/autogenerated
 **/permissions/schemas
-
-# Flatpak build artifacts
-flatpak-repo/
-.flatpak-builder/
-flatpak/flatpak-builder-tools/
-flatpak/cargo-sources.json
-flatpak/node-sources.json
-
-# Local Codex desktop env state
-.codex/environments/environment.toml
-
-# Claude Code local settings
-.claude/settings.local.json
```
```diff
@@ -1 +0,0 @@
-24.14.0
```
.npmrc (2 changes)

```diff
@@ -1,2 +0,0 @@
-# vite-plugin-wasm has not yet declared Vite 8 in its peerDependencies
-legacy-peer-deps=true
```
```diff
@@ -1,2 +0,0 @@
-**/bindings/**
-crates/yaak-templates/pkg/**
```
```diff
@@ -1 +0,0 @@
-vp lint
```
.vscode/extensions.json (6 changes, vendored)

```diff
@@ -1,7 +1,3 @@
 {
-  "recommendations": [
-    "rust-lang.rust-analyzer",
-    "bradlc.vscode-tailwindcss",
-    "VoidZero.vite-plus-extension-pack"
-  ]
+  "recommendations": ["biomejs.biome", "rust-lang.rust-analyzer", "bradlc.vscode-tailwindcss"]
 }
```
.vscode/settings.json (8 changes, vendored)

```diff
@@ -1,8 +1,6 @@
 {
-  "editor.defaultFormatter": "oxc.oxc-vscode",
+  "editor.defaultFormatter": "biomejs.biome",
   "editor.formatOnSave": true,
-  "editor.formatOnSaveMode": "file",
-  "editor.codeActionsOnSave": {
-    "source.fixAll.oxc": "explicit"
-  }
+  "biome.enabled": true,
+  "biome.lint.format.enable": true
 }
```
```diff
@@ -1,2 +0,0 @@
-- Tag safety: app releases use `v*` tags and CLI releases use `yaak-cli-*` tags; always confirm which one is requested before retagging.
-- Do not commit, push, or tag without explicit approval
```
```diff
@@ -1,16 +0,0 @@
-# Contributing to Yaak
-
-Yaak accepts community pull requests for:
-
-- Bug fixes
-- Small-scope improvements directly tied to existing behavior
-
-Pull requests that introduce broad new features, major redesigns, or large refactors are out of scope unless explicitly approved first.
-
-## Approval for Non-Bugfix Changes
-
-If your PR is not a bug fix or small-scope improvement, include a link to the approved [feedback item](https://yaak.app/feedback) where contribution approval was explicitly stated.
-
-## Development Setup
-
-For local setup and development workflows, see [`DEVELOPMENT.md`](DEVELOPMENT.md).
```
Cargo.lock (2663 changes, generated) — file diff suppressed because it is too large.
Cargo.toml (53 changes)

```diff
@@ -1,30 +1,30 @@
 [workspace]
 resolver = "2"
 members = [
-    "crates/yaak",
     # Shared crates (no Tauri dependency)
+    "crates/yaak-actions",
+    "crates/yaak-actions-builtin",
     "crates/yaak-core",
     "crates/yaak-common",
     "crates/yaak-crypto",
     "crates/yaak-git",
     "crates/yaak-grpc",
     "crates/yaak-http",
     "crates/yaak-models",
     "crates/yaak-plugins",
     "crates/yaak-sse",
     "crates/yaak-sync",
     "crates/yaak-templates",
     "crates/yaak-tls",
     "crates/yaak-ws",
-    "crates/yaak-api",
     # CLI crates
     "crates-cli/yaak-cli",
     # Tauri-specific crates
     "crates-tauri/yaak-app",
     "crates-tauri/yaak-fonts",
     "crates-tauri/yaak-license",
     "crates-tauri/yaak-mac-window",
     "crates-tauri/yaak-tauri-utils",
 ]

 [workspace.dependencies]
@@ -35,7 +35,6 @@ log = "0.4.29"
 reqwest = "0.12.20"
 rustls = { version = "0.23.34", default-features = false }
 rustls-platform-verifier = "0.6.2"
-schemars = { version = "0.8.22", features = ["chrono"] }
 serde = "1.0.228"
 serde_json = "1.0.145"
 sha2 = "0.10.9"
@@ -48,8 +47,9 @@ tokio = "1.48.0"
 ts-rs = "11.1.0"

 # Internal crates - shared
+yaak-actions = { path = "crates/yaak-actions" }
+yaak-actions-builtin = { path = "crates/yaak-actions-builtin" }
 yaak-core = { path = "crates/yaak-core" }
-yaak = { path = "crates/yaak" }
 yaak-common = { path = "crates/yaak-common" }
 yaak-crypto = { path = "crates/yaak-crypto" }
 yaak-git = { path = "crates/yaak-git" }
@@ -62,7 +62,6 @@ yaak-sync = { path = "crates/yaak-sync" }
 yaak-templates = { path = "crates/yaak-templates" }
 yaak-tls = { path = "crates/yaak-tls" }
 yaak-ws = { path = "crates/yaak-ws" }
-yaak-api = { path = "crates/yaak-api" }

 # Internal crates - Tauri-specific
 yaak-fonts = { path = "crates-tauri/yaak-fonts" }
```
````diff
@@ -1,26 +1,24 @@
 # Developer Setup

 Yaak is a combined Node.js and Rust monorepo. It is a [Tauri](https://tauri.app) project, so
 uses Rust and HTML/CSS/JS for the main application but there is also a plugin system powered
 by a Node.js sidecar that communicates to the app over gRPC.

 Because of the moving parts, there are a few setup steps required before development can
 begin.

 ## Prerequisites

 Make sure you have the following tools installed:

-- [Node.js](https://nodejs.org/en/download/package-manager) (v24+)
+- [Node.js](https://nodejs.org/en/download/package-manager)
 - [Rust](https://www.rust-lang.org/tools/install)
-- [Vite+](https://vite.dev/guide/vite-plus) (`vp` CLI)

 Check the installations with the following commands:

 ```shell
 node -v
 npm -v
-vp --version
 rustc --version
 ```
@@ -47,12 +45,12 @@ npm start
 ## SQLite Migrations

 New migrations can be created from the `src-tauri/` directory:

 ```shell
 npm run migration
 ```

 Rerun the app to apply the migrations.

 _Note: For safety, development builds use a separate database location from production builds._

@@ -63,9 +61,9 @@ _Note: For safety, development builds use a separate database location from prod
 lezer-generator components/core/Editor/<LANG>/<LANG>.grammar > components/core/Editor/<LANG>/<LANG>.ts
 ```

-## Linting and Formatting
+## Linting & Formatting

-This repo uses [Vite+](https://vite.dev/guide/vite-plus) for linting (oxlint) and formatting (oxfmt).
+This repo uses Biome for linting and formatting (replacing ESLint + Prettier).

 - Lint the entire repo:

@@ -73,6 +71,12 @@ This repo uses [Vite+](https://vite.dev/guide/vite-plus) for linting (oxlint) an
 npm run lint
 ```

+- Auto-fix lint issues where possible:
+
+```sh
+npm run lint:fix
+```
+
 - Format code:

 ```sh
@@ -80,7 +84,5 @@ npm run format
 ```

 Notes:
-- A pre-commit hook runs `vp lint` automatically on commit.
-- Some workspace packages also run `tsc --noEmit` for type-checking.
-- VS Code users should install the recommended extensions for format-on-save support.
+- Many workspace packages also expose the same scripts (`lint`, `lint:fix`, and `format`).
+- TypeScript type-checking still runs separately via `tsc --noEmit` in relevant packages.
````
README.md (22 changes)

```diff
@@ -16,19 +16,23 @@
 </p>
 <br>



 <p align="center">
 <!-- sponsors-premium --><a href="https://github.com/MVST-Solutions"><img src="https://github.com/MVST-Solutions.png" width="80px" alt="User avatar: MVST-Solutions" /></a> <a href="https://github.com/dharsanb"><img src="https://github.com/dharsanb.png" width="80px" alt="User avatar: dharsanb" /></a> <a href="https://github.com/railwayapp"><img src="https://github.com/railwayapp.png" width="80px" alt="User avatar: railwayapp" /></a> <a href="https://github.com/caseyamcl"><img src="https://github.com/caseyamcl.png" width="80px" alt="User avatar: caseyamcl" /></a> <a href="https://github.com/bytebase"><img src="https://github.com/bytebase.png" width="80px" alt="User avatar: bytebase" /></a> <a href="https://github.com/"><img src="https://raw.githubusercontent.com/JamesIves/github-sponsors-readme-action/dev/.github/assets/placeholder.png" width="80px" alt="User avatar: " /></a> <!-- sponsors-premium -->
 </p>
 <p align="center">
-<!-- sponsors-base --><a href="https://github.com/seanwash"><img src="https://github.com/seanwash.png" width="50px" alt="User avatar: seanwash" /></a> <a href="https://github.com/jerath"><img src="https://github.com/jerath.png" width="50px" alt="User avatar: jerath" /></a> <a href="https://github.com/itsa-sh"><img src="https://github.com/itsa-sh.png" width="50px" alt="User avatar: itsa-sh" /></a> <a href="https://github.com/dmmulroy"><img src="https://github.com/dmmulroy.png" width="50px" alt="User avatar: dmmulroy" /></a> <a href="https://github.com/timcole"><img src="https://github.com/timcole.png" width="50px" alt="User avatar: timcole" /></a> <a href="https://github.com/VLZH"><img src="https://github.com/VLZH.png" width="50px" alt="User avatar: VLZH" /></a> <a href="https://github.com/terasaka2k"><img src="https://github.com/terasaka2k.png" width="50px" alt="User avatar: terasaka2k" /></a> <a href="https://github.com/andriyor"><img src="https://github.com/andriyor.png" width="50px" alt="User avatar: andriyor" /></a> <a href="https://github.com/majudhu"><img src="https://github.com/majudhu.png" width="50px" alt="User avatar: majudhu" /></a> <a href="https://github.com/axelrindle"><img src="https://github.com/axelrindle.png" width="50px" alt="User avatar: axelrindle" /></a> <a href="https://github.com/jirizverina"><img src="https://github.com/jirizverina.png" width="50px" alt="User avatar: jirizverina" /></a> <a href="https://github.com/chip-well"><img src="https://github.com/chip-well.png" width="50px" alt="User avatar: chip-well" /></a> <a href="https://github.com/GRAYAH"><img src="https://github.com/GRAYAH.png" width="50px" alt="User avatar: GRAYAH" /></a> <a href="https://github.com/flashblaze"><img src="https://github.com/flashblaze.png" width="50px" alt="User avatar: flashblaze" /></a> <a href="https://github.com/Frostist"><img src="https://github.com/Frostist.png" width="50px" alt="User avatar: Frostist" /></a> <!-- sponsors-base -->
+<!-- sponsors-base --><a href="https://github.com/seanwash"><img src="https://github.com/seanwash.png" width="50px" alt="User avatar: seanwash" /></a> <a href="https://github.com/jerath"><img src="https://github.com/jerath.png" width="50px" alt="User avatar: jerath" /></a> <a href="https://github.com/itsa-sh"><img src="https://github.com/itsa-sh.png" width="50px" alt="User avatar: itsa-sh" /></a> <a href="https://github.com/dmmulroy"><img src="https://github.com/dmmulroy.png" width="50px" alt="User avatar: dmmulroy" /></a> <a href="https://github.com/timcole"><img src="https://github.com/timcole.png" width="50px" alt="User avatar: timcole" /></a> <a href="https://github.com/VLZH"><img src="https://github.com/VLZH.png" width="50px" alt="User avatar: VLZH" /></a> <a href="https://github.com/terasaka2k"><img src="https://github.com/terasaka2k.png" width="50px" alt="User avatar: terasaka2k" /></a> <a href="https://github.com/andriyor"><img src="https://github.com/andriyor.png" width="50px" alt="User avatar: andriyor" /></a> <a href="https://github.com/majudhu"><img src="https://github.com/majudhu.png" width="50px" alt="User avatar: majudhu" /></a> <a href="https://github.com/axelrindle"><img src="https://github.com/axelrindle.png" width="50px" alt="User avatar: axelrindle" /></a> <a href="https://github.com/jirizverina"><img src="https://github.com/jirizverina.png" width="50px" alt="User avatar: jirizverina" /></a> <a href="https://github.com/chip-well"><img src="https://github.com/chip-well.png" width="50px" alt="User avatar: chip-well" /></a> <a href="https://github.com/GRAYAH"><img src="https://github.com/GRAYAH.png" width="50px" alt="User avatar: GRAYAH" /></a> <!-- sponsors-base -->
 </p>

 ![screenshot](https://github.com/user-attachments/assets/5a9b1b84-37e5-47c0-a46c-fb09f32db7f8)


 ## Features

 Yaak is an offline-first API client designed to stay out of your way while giving you everything you need when you need it.
 Built with [Tauri](https://tauri.app), Rust, and React, it’s fast, lightweight, and private. No telemetry, no VC funding, and no cloud lock-in.


 ### 🌐 Work with any API

@@ -37,29 +41,25 @@ Built with [Tauri](https://tauri.app), Rust, and React, it’s fast, lightweight
 - Filter and inspect responses with JSONPath or XPath.

 ### 🔐 Stay secure

 - Use OAuth 2.0, JWT, Basic Auth, or custom plugins for authentication.
 - Secure sensitive values with encrypted secrets.
 - Store secrets in your OS keychain.

 ### ☁️ Organize & collaborate

 - Group requests into workspaces and nested folders.
 - Use environment variables to switch between dev, staging, and prod.
 - Mirror workspaces to your filesystem for versioning in Git or syncing with Dropbox.

 ### 🧩 Extend & customize

 - Insert dynamic values like UUIDs or timestamps with template tags.
 - Pick from built-in themes or build your own.
 - Create plugins to extend authentication, template tags, or the UI.


 ## Contribution Policy

-> [!IMPORTANT]
-> Community PRs are currently limited to bug fixes and small-scope improvements.
-> If your PR is out of scope, link an approved feedback item from [yaak.app/feedback](https://yaak.app/feedback).
-> See [`CONTRIBUTING.md`](CONTRIBUTING.md) for policy details and [`DEVELOPMENT.md`](DEVELOPMENT.md) for local setup.
+Yaak is open source but only accepting contributions for bug fixes. To get started,
+visit [`DEVELOPMENT.md`](DEVELOPMENT.md) for tips on setting up your environment.

 ## Useful Resources
```
biome.json (53 changes, new file)

```diff
@@ -0,0 +1,53 @@
+{
+  "$schema": "https://biomejs.dev/schemas/2.3.11/schema.json",
+  "linter": {
+    "enabled": true,
+    "rules": {
+      "recommended": true,
+      "a11y": {
+        "useKeyWithClickEvents": "off"
+      }
+    }
+  },
+  "formatter": {
+    "enabled": true,
+    "indentStyle": "space",
+    "indentWidth": 2,
+    "lineWidth": 100,
+    "bracketSpacing": true
+  },
+  "css": {
+    "parser": {
+      "tailwindDirectives": true
+    },
+    "linter": {
+      "enabled": false
+    }
+  },
+  "javascript": {
+    "formatter": {
+      "quoteStyle": "single",
+      "jsxQuoteStyle": "double",
+      "trailingCommas": "all",
+      "semicolons": "always"
+    }
+  },
+  "files": {
+    "includes": [
+      "**",
+      "!**/node_modules",
+      "!**/dist",
+      "!**/build",
+      "!target",
+      "!scripts",
+      "!crates",
+      "!crates-tauri",
+      "!src-web/tailwind.config.cjs",
+      "!src-web/postcss.config.cjs",
+      "!src-web/vite.config.ts",
+      "!src-web/routeTree.gen.ts",
+      "!packages/plugin-runtime-types/lib",
+      "!**/bindings"
+    ]
+  }
+}
```
```diff
@@ -5,50 +5,20 @@ edition = "2024"
 publish = false

 [[bin]]
-name = "yaak"
+name = "yaakcli"
 path = "src/main.rs"

 [dependencies]
-arboard = "3"
-base64 = "0.22"
 clap = { version = "4", features = ["derive"] }
-console = "0.15"
 dirs = "6"
 env_logger = "0.11"
-futures = "0.3"
-inquire = { version = "0.7", features = ["editor"] }
-hex = { workspace = true }
-include_dir = "0.7"
-keyring = { workspace = true, features = ["apple-native", "windows-native", "sync-secret-service"] }
 log = { workspace = true }
-rand = "0.8"
-reqwest = { workspace = true }
-rolldown = "0.1.0"
-oxc_resolver = "=11.10.0"
-schemars = { workspace = true }
-serde = { workspace = true }
 serde_json = { workspace = true }
-sha2 = { workspace = true }
-tokio = { workspace = true, features = [
-    "rt-multi-thread",
-    "macros",
-    "io-util",
-    "net",
-    "signal",
-    "time",
-] }
-walkdir = "2"
-webbrowser = "1"
-zip = "4"
-yaak = { workspace = true }
-yaak-api = { workspace = true }
+tokio = { workspace = true, features = ["rt-multi-thread", "macros"] }
+yaak-actions = { workspace = true }
+yaak-actions-builtin = { workspace = true }
 yaak-crypto = { workspace = true }
 yaak-http = { workspace = true }
 yaak-models = { workspace = true }
 yaak-plugins = { workspace = true }
 yaak-templates = { workspace = true }
-
-[dev-dependencies]
-assert_cmd = "2"
-predicates = "3"
-tempfile = "3"
```
````diff
@@ -1,66 +0,0 @@
-# Yaak CLI
-
-The `yaak` CLI for publishing plugins and creating/updating/sending requests.
-
-## Installation
-
-```sh
-npm install @yaakapp/cli
-```
-
-## Agentic Workflows
-
-The `yaak` CLI is primarily meant to be used by AI agents, and has the following features:
-
-- `schema` subcommands to get the JSON Schema for any model (eg. `yaak request schema http`)
-- `--json '{...}'` input format to create and update data
-- `--verbose` mode for extracting debug info while sending requests
-- The ability to send entire workspaces and folders (Supports `--parallel` and `--fail-fast`)
-
-### Example Prompts
-
-Use the `yaak` CLI with agents like Claude or Codex to do useful things for you.
-
-Here are some example prompts:
-
-```text
-Scan my API routes and create a workspace (using yaak cli) with
-all the requests needed for me to do manual testing?
-```
-
-```text
-Send all the GraphQL requests in my workspace
-```
-
-## Description
-
-Here's the current print of `yaak --help`
-
-```text
-Yaak CLI - API client from the command line
-
-Usage: yaak [OPTIONS] <COMMAND>
-
-Commands:
-  auth         Authentication commands
-  plugin       Plugin development and publishing commands
-  send         Send a request, folder, or workspace by ID
-  workspace    Workspace commands
-  request      Request commands
-  folder       Folder commands
-  environment  Environment commands
-
-Options:
-      --data-dir <DATA_DIR>        Use a custom data directory
-  -e, --environment <ENVIRONMENT>  Environment ID to use for variable substitution
-  -v, --verbose                    Enable verbose send output (events and streamed response body)
-      --log [<LEVEL>]              Enable CLI logging; optionally set level (error|warn|info|debug|trace) [possible values: error, warn, info, debug, trace]
-  -h, --help                       Print help
-  -V, --version                    Print version
-
-Agent Hints:
-- Template variable syntax is ${[ my_var ]}, not {{ ... }}
-- Template function syntax is ${[ namespace.my_func(a='aaa',b='bbb') ]}
-- View JSONSchema for models before creating or updating (eg. `yaak request schema http`)
-- Deletion requires confirmation (--yes for non-interactive environments)
-```
````
@@ -1,475 +0,0 @@
use clap::{Args, Parser, Subcommand, ValueEnum};
use std::path::PathBuf;

#[derive(Parser)]
#[command(name = "yaak")]
#[command(about = "Yaak CLI - API client from the command line")]
#[command(version = crate::version::cli_version())]
#[command(disable_help_subcommand = true)]
#[command(after_help = r#"Agent Hints:
- Template variable syntax is ${[ my_var ]}, not {{ ... }}
- Template function syntax is ${[ namespace.my_func(a='aaa',b='bbb') ]}
- View JSONSchema for models before creating or updating (eg. `yaak request schema http`)
- Deletion requires confirmation (--yes for non-interactive environments)
"#)]
pub struct Cli {
    /// Use a custom data directory
    #[arg(long, global = true)]
    pub data_dir: Option<PathBuf>,

    /// Environment ID to use for variable substitution
    #[arg(long, short, global = true)]
    pub environment: Option<String>,

    /// Cookie jar ID to use when sending requests
    #[arg(long = "cookie-jar", global = true, value_name = "COOKIE_JAR_ID")]
    pub cookie_jar: Option<String>,

    /// Enable verbose send output (events and streamed response body)
    #[arg(long, short, global = true)]
    pub verbose: bool,

    /// Enable CLI logging; optionally set level (error|warn|info|debug|trace)
    #[arg(long, global = true, value_name = "LEVEL", num_args = 0..=1, ignore_case = true)]
    pub log: Option<Option<LogLevel>>,

    #[command(subcommand)]
    pub command: Commands,
}

#[derive(Subcommand)]
pub enum Commands {
    /// Authentication commands
    Auth(AuthArgs),

    /// Plugin development and publishing commands
    Plugin(PluginArgs),

    #[command(hide = true)]
    Build(PluginPathArg),

    #[command(hide = true)]
    Dev(PluginPathArg),

    /// Backward-compatible alias for `plugin generate`
    #[command(hide = true)]
    Generate(GenerateArgs),

    /// Backward-compatible alias for `plugin publish`
    #[command(hide = true)]
    Publish(PluginPathArg),

    /// Send a request, folder, or workspace by ID
    Send(SendArgs),

    /// Cookie jar commands
    CookieJar(CookieJarArgs),

    /// Workspace commands
    Workspace(WorkspaceArgs),

    /// Request commands
    Request(RequestArgs),

    /// Folder commands
    Folder(FolderArgs),

    /// Environment commands
    Environment(EnvironmentArgs),
}

#[derive(Args)]
pub struct SendArgs {
    /// Request, folder, or workspace ID
    pub id: String,

    /// Execute requests in parallel
    #[arg(long)]
    pub parallel: bool,

    /// Stop on first request failure when sending folders/workspaces
    #[arg(long, conflicts_with = "parallel")]
    pub fail_fast: bool,
}

#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct CookieJarArgs {
    #[command(subcommand)]
    pub command: CookieJarCommands,
}

#[derive(Subcommand)]
pub enum CookieJarCommands {
    /// List cookie jars in a workspace
    List {
        /// Workspace ID (optional when exactly one workspace exists)
        workspace_id: Option<String>,
    },
}

#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct WorkspaceArgs {
    #[command(subcommand)]
    pub command: WorkspaceCommands,
}

#[derive(Subcommand)]
pub enum WorkspaceCommands {
    /// List all workspaces
    List,

    /// Output JSON schema for workspace create/update payloads
    Schema {
        /// Pretty-print schema JSON output
        #[arg(long)]
        pretty: bool,
    },

    /// Show a workspace as JSON
    Show {
        /// Workspace ID
        workspace_id: String,
    },

    /// Create a workspace
    Create {
        /// Workspace name
        #[arg(short, long)]
        name: Option<String>,

        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Update a workspace
    Update {
        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Delete a workspace
    Delete {
        /// Workspace ID
        workspace_id: String,

        /// Skip confirmation prompt
        #[arg(short, long)]
        yes: bool,
    },
}

#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct RequestArgs {
    #[command(subcommand)]
    pub command: RequestCommands,
}

#[derive(Subcommand)]
pub enum RequestCommands {
    /// List requests in a workspace
    List {
        /// Workspace ID (optional when exactly one workspace exists)
        workspace_id: Option<String>,
    },

    /// Show a request as JSON
    Show {
        /// Request ID
        request_id: String,
    },

    /// Send a request by ID
    Send {
        /// Request ID
        request_id: String,
    },

    /// Output JSON schema for request create/update payloads
    Schema {
        #[arg(value_enum)]
        request_type: RequestSchemaType,

        /// Pretty-print schema JSON output
        #[arg(long)]
        pretty: bool,
    },

    /// Create a new HTTP request
    Create {
        /// Workspace ID (or positional JSON payload shorthand)
        workspace_id: Option<String>,

        /// Request name
        #[arg(short, long)]
        name: Option<String>,

        /// HTTP method
        #[arg(short, long)]
        method: Option<String>,

        /// URL
        #[arg(short, long)]
        url: Option<String>,

        /// JSON payload
        #[arg(long)]
        json: Option<String>,
    },

    /// Update an HTTP request
    Update {
        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Delete a request
    Delete {
        /// Request ID
        request_id: String,

        /// Skip confirmation prompt
        #[arg(short, long)]
        yes: bool,
    },
}

#[derive(Clone, Copy, Debug, ValueEnum)]
pub enum RequestSchemaType {
    Http,
    Grpc,
    Websocket,
}

#[derive(Clone, Copy, Debug, ValueEnum)]
pub enum LogLevel {
    Error,
    Warn,
    Info,
    Debug,
    Trace,
}

impl LogLevel {
    pub fn as_filter(self) -> log::LevelFilter {
        match self {
            LogLevel::Error => log::LevelFilter::Error,
            LogLevel::Warn => log::LevelFilter::Warn,
            LogLevel::Info => log::LevelFilter::Info,
            LogLevel::Debug => log::LevelFilter::Debug,
            LogLevel::Trace => log::LevelFilter::Trace,
        }
    }
}

#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct FolderArgs {
    #[command(subcommand)]
    pub command: FolderCommands,
}

#[derive(Subcommand)]
pub enum FolderCommands {
    /// List folders in a workspace
    List {
        /// Workspace ID (optional when exactly one workspace exists)
        workspace_id: Option<String>,
    },

    /// Show a folder as JSON
    Show {
        /// Folder ID
        folder_id: String,
    },

    /// Create a folder
    Create {
        /// Workspace ID (or positional JSON payload shorthand)
        workspace_id: Option<String>,

        /// Folder name
        #[arg(short, long)]
        name: Option<String>,

        /// JSON payload
        #[arg(long)]
        json: Option<String>,
    },

    /// Update a folder
    Update {
        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Delete a folder
    Delete {
        /// Folder ID
        folder_id: String,

        /// Skip confirmation prompt
        #[arg(short, long)]
        yes: bool,
    },
}

#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct EnvironmentArgs {
    #[command(subcommand)]
    pub command: EnvironmentCommands,
}

#[derive(Subcommand)]
pub enum EnvironmentCommands {
    /// List environments in a workspace
    List {
        /// Workspace ID (optional when exactly one workspace exists)
        workspace_id: Option<String>,
    },

    /// Output JSON schema for environment create/update payloads
    Schema {
        /// Pretty-print schema JSON output
        #[arg(long)]
        pretty: bool,
    },

    /// Show an environment as JSON
    Show {
        /// Environment ID
        environment_id: String,
    },

    /// Create an environment
    #[command(after_help = r#"Modes (choose one):
1) yaak environment create <workspace_id> --name <name>
2) yaak environment create --json '{"workspaceId":"wk_abc","name":"Production"}'
3) yaak environment create '{"workspaceId":"wk_abc","name":"Production"}'
4) yaak environment create <workspace_id> --json '{"name":"Production"}'
"#)]
    Create {
        /// Workspace ID for flag-based mode, or positional JSON payload shorthand
        #[arg(value_name = "WORKSPACE_ID_OR_JSON")]
        workspace_id: Option<String>,

        /// Environment name
        #[arg(short, long)]
        name: Option<String>,

        /// JSON payload (use instead of WORKSPACE_ID/--name)
        #[arg(long)]
        json: Option<String>,
    },

    /// Update an environment
    Update {
        /// JSON payload
        #[arg(long, conflicts_with = "json_input")]
        json: Option<String>,

        /// JSON payload shorthand
        #[arg(value_name = "JSON", conflicts_with = "json")]
        json_input: Option<String>,
    },

    /// Delete an environment
    Delete {
        /// Environment ID
        environment_id: String,

        /// Skip confirmation prompt
        #[arg(short, long)]
        yes: bool,
    },
}

#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct AuthArgs {
    #[command(subcommand)]
    pub command: AuthCommands,
}

#[derive(Subcommand)]
pub enum AuthCommands {
    /// Login to Yaak via web browser
    Login,

    /// Sign out of the Yaak CLI
    Logout,

    /// Print the current logged-in user's info
    Whoami,
}

#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct PluginArgs {
    #[command(subcommand)]
    pub command: PluginCommands,
}

#[derive(Subcommand)]
pub enum PluginCommands {
    /// Transpile code into a runnable plugin bundle
    Build(PluginPathArg),

    /// Build plugin bundle continuously when the filesystem changes
    Dev(PluginPathArg),

    /// Generate a "Hello World" Yaak plugin
    Generate(GenerateArgs),

    /// Install a plugin from a local directory or from the registry
    Install(InstallPluginArgs),

    /// Publish a Yaak plugin version to the plugin registry
    Publish(PluginPathArg),
}

#[derive(Args, Clone)]
pub struct PluginPathArg {
    /// Path to plugin directory (defaults to current working directory)
    pub path: Option<PathBuf>,
}

#[derive(Args, Clone)]
pub struct GenerateArgs {
    /// Plugin name (defaults to a generated name in interactive mode)
    #[arg(long)]
    pub name: Option<String>,

    /// Output directory for the generated plugin (defaults to ./<name> in interactive mode)
    #[arg(long)]
    pub dir: Option<PathBuf>,
}

#[derive(Args, Clone)]
pub struct InstallPluginArgs {
    /// Local plugin directory path, or registry plugin spec (@org/plugin[@version])
    pub source: String,
}
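The `--log` flag above uses clap's `num_args = 0..=1` with an `Option<Option<LogLevel>>` field, which yields three states: flag absent (`None`), bare `--log` (`Some(None)`), and `--log <level>` (`Some(Some(level))`). A small sketch of how such a value might be decoded; the `"info"` default for a bare `--log` is an assumption here, as the real default is chosen elsewhere in the CLI:

```rust
// Decode the tri-state `--log [LEVEL]` value.
// None            -> logging disabled
// Some(None)      -> bare `--log`: enable at an assumed default level
// Some(Some(lvl)) -> `--log <lvl>`: enable at that level
fn effective_log_level(flag: Option<Option<&str>>) -> Option<&str> {
    match flag {
        None => None,
        Some(None) => Some("info"), // assumed default, not Yaak's actual choice
        Some(Some(level)) => Some(level),
    }
}

fn main() {
    println!("{:?}", effective_log_level(None));                 // None
    println!("{:?}", effective_log_level(Some(None)));           // Some("info")
    println!("{:?}", effective_log_level(Some(Some("debug")))); // Some("debug")
}
```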
@@ -1,528 +0,0 @@
use crate::cli::{AuthArgs, AuthCommands};
use crate::ui;
use crate::utils::http;
use base64::Engine as _;
use keyring::Entry;
use rand::RngCore;
use rand::rngs::OsRng;
use reqwest::Url;
use serde_json::Value;
use sha2::{Digest, Sha256};
use std::io::{self, IsTerminal, Write};
use std::time::Duration;
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tokio::net::{TcpListener, TcpStream};

const OAUTH_CLIENT_ID: &str = "a1fe44800c2d7e803cad1b4bf07a291c";
const KEYRING_USER: &str = "yaak";
const AUTH_TIMEOUT: Duration = Duration::from_secs(300);
const MAX_REQUEST_BYTES: usize = 16 * 1024;

type CommandResult<T = ()> = std::result::Result<T, String>;

#[derive(Clone, Copy, Debug, Eq, PartialEq)]
enum Environment {
    Production,
    Staging,
    Development,
}

impl Environment {
    fn app_base_url(self) -> &'static str {
        match self {
            Environment::Production => "https://yaak.app",
            Environment::Staging => "https://todo.yaak.app",
            Environment::Development => "http://localhost:9444",
        }
    }

    fn api_base_url(self) -> &'static str {
        match self {
            Environment::Production => "https://api.yaak.app",
            Environment::Staging => "https://todo.yaak.app",
            Environment::Development => "http://localhost:9444",
        }
    }

    fn keyring_service(self) -> &'static str {
        match self {
            Environment::Production => "app.yaak.cli.Token",
            Environment::Staging => "app.yaak.cli.staging.Token",
            Environment::Development => "app.yaak.cli.dev.Token",
        }
    }
}

struct OAuthFlow {
    app_base_url: String,
    auth_url: Url,
    token_url: String,
    redirect_url: String,
    state: String,
    code_verifier: String,
}

pub async fn run(args: AuthArgs) -> i32 {
    let result = match args.command {
        AuthCommands::Login => login().await,
        AuthCommands::Logout => logout(),
        AuthCommands::Whoami => whoami().await,
    };

    match result {
        Ok(()) => 0,
        Err(error) => {
            ui::error(&error);
            1
        }
    }
}

async fn login() -> CommandResult {
    let environment = current_environment();

    let listener = TcpListener::bind("127.0.0.1:0")
        .await
        .map_err(|e| format!("Failed to start OAuth callback server: {e}"))?;
    let port = listener
        .local_addr()
        .map_err(|e| format!("Failed to determine callback server port: {e}"))?
        .port();

    let oauth = build_oauth_flow(environment, port)?;

    ui::info(&format!("Initiating login to {}", oauth.auth_url));
    if !confirm_open_browser()? {
        ui::info("Login canceled");
        return Ok(());
    }

    if let Err(err) = webbrowser::open(oauth.auth_url.as_ref()) {
        ui::warning(&format!("Failed to open browser: {err}"));
        ui::info(&format!("Open this URL manually:\n{}", oauth.auth_url));
    }
    ui::info("Waiting for authentication...");

    let code = tokio::select! {
        result = receive_oauth_code(listener, &oauth.state, &oauth.app_base_url) => result?,
        _ = tokio::signal::ctrl_c() => {
            return Err("Interrupted by user".to_string());
        }
        _ = tokio::time::sleep(AUTH_TIMEOUT) => {
            return Err("Timeout waiting for authentication".to_string());
        }
    };

    let token = exchange_access_token(&oauth, &code).await?;
    store_auth_token(environment, &token)?;
    ui::success("Authentication successful!");
    Ok(())
}

fn logout() -> CommandResult {
    delete_auth_token(current_environment())?;
    ui::success("Signed out of Yaak");
    Ok(())
}

async fn whoami() -> CommandResult {
    let environment = current_environment();
    let token = match get_auth_token(environment)? {
        Some(token) => token,
        None => {
            ui::warning("Not logged in");
            ui::info("Please run `yaak auth login`");
            return Ok(());
        }
    };

    let url = format!("{}/api/v1/whoami", environment.api_base_url());
    let response = http::build_client(Some(&token))?
        .get(url)
        .send()
        .await
        .map_err(|e| format!("Failed to call whoami endpoint: {e}"))?;

    let status = response.status();
    let body =
        response.text().await.map_err(|e| format!("Failed to read whoami response body: {e}"))?;

    if !status.is_success() {
        if status.as_u16() == 401 {
            let _ = delete_auth_token(environment);
            return Err(
                "Unauthorized to access CLI. Run `yaak auth login` to refresh credentials."
                    .to_string(),
            );
        }
        return Err(http::parse_api_error(status.as_u16(), &body));
    }

    println!("{body}");
    Ok(())
}

fn current_environment() -> Environment {
    let value = std::env::var("ENVIRONMENT").ok();
    parse_environment(value.as_deref())
}

fn parse_environment(value: Option<&str>) -> Environment {
    match value {
        Some("staging") => Environment::Staging,
        Some("development") => Environment::Development,
        _ => Environment::Production,
    }
}

fn build_oauth_flow(environment: Environment, callback_port: u16) -> CommandResult<OAuthFlow> {
    let code_verifier = random_hex(32);
    let state = random_hex(24);
    let redirect_url = format!("http://127.0.0.1:{callback_port}/oauth/callback");

    let code_challenge = base64::engine::general_purpose::URL_SAFE_NO_PAD
        .encode(Sha256::digest(code_verifier.as_bytes()));

    let mut auth_url = Url::parse(&format!("{}/login/oauth/authorize", environment.app_base_url()))
        .map_err(|e| format!("Failed to build OAuth authorize URL: {e}"))?;
    auth_url
        .query_pairs_mut()
        .append_pair("response_type", "code")
        .append_pair("client_id", OAUTH_CLIENT_ID)
        .append_pair("redirect_uri", &redirect_url)
        .append_pair("state", &state)
        .append_pair("code_challenge_method", "S256")
        .append_pair("code_challenge", &code_challenge);

    Ok(OAuthFlow {
        app_base_url: environment.app_base_url().to_string(),
        auth_url,
        token_url: format!("{}/login/oauth/access_token", environment.app_base_url()),
        redirect_url,
        state,
        code_verifier,
    })
}

async fn receive_oauth_code(
    listener: TcpListener,
    expected_state: &str,
    app_base_url: &str,
) -> CommandResult<String> {
    loop {
        let (mut stream, _) = listener
            .accept()
            .await
            .map_err(|e| format!("OAuth callback server accept error: {e}"))?;

        match parse_callback_request(&mut stream).await {
            Ok((state, code)) => {
                if state != expected_state {
                    let _ = write_bad_request(&mut stream, "Invalid OAuth state").await;
                    continue;
                }

                let success_redirect = format!("{app_base_url}/login/oauth/success");
                write_redirect(&mut stream, &success_redirect)
                    .await
                    .map_err(|e| format!("Failed responding to OAuth callback: {e}"))?;
                return Ok(code);
            }
            Err(error) => {
                let _ = write_bad_request(&mut stream, &error).await;
                if error.starts_with("OAuth provider returned error:") {
                    return Err(error);
                }
            }
        }
    }
}

async fn parse_callback_request(stream: &mut TcpStream) -> CommandResult<(String, String)> {
    let target = read_http_target(stream).await?;
    if !target.starts_with("/oauth/callback") {
        return Err("Expected /oauth/callback path".to_string());
    }

    let url = Url::parse(&format!("http://127.0.0.1{target}"))
        .map_err(|e| format!("Failed to parse callback URL: {e}"))?;
    let mut state: Option<String> = None;
    let mut code: Option<String> = None;
    let mut oauth_error: Option<String> = None;
    let mut oauth_error_description: Option<String> = None;

    for (k, v) in url.query_pairs() {
        if k == "state" {
            state = Some(v.into_owned());
        } else if k == "code" {
            code = Some(v.into_owned());
        } else if k == "error" {
            oauth_error = Some(v.into_owned());
        } else if k == "error_description" {
            oauth_error_description = Some(v.into_owned());
        }
    }

    if let Some(error) = oauth_error {
        let mut message = format!("OAuth provider returned error: {error}");
        if let Some(description) = oauth_error_description.filter(|d| !d.is_empty()) {
            message.push_str(&format!(" ({description})"));
        }
        return Err(message);
    }

    let state = state.ok_or_else(|| "Missing 'state' query parameter".to_string())?;
    let code = code.ok_or_else(|| "Missing 'code' query parameter".to_string())?;

    if code.is_empty() {
        return Err("Missing 'code' query parameter".to_string());
    }

    Ok((state, code))
}

async fn read_http_target(stream: &mut TcpStream) -> CommandResult<String> {
    let mut buf = vec![0_u8; MAX_REQUEST_BYTES];
    let mut total_read = 0_usize;

    loop {
        let n = stream
            .read(&mut buf[total_read..])
            .await
            .map_err(|e| format!("Failed reading callback request: {e}"))?;
        if n == 0 {
            break;
        }
        total_read += n;

        if buf[..total_read].windows(4).any(|w| w == b"\r\n\r\n") {
            break;
        }

        if total_read == MAX_REQUEST_BYTES {
            return Err("OAuth callback request too large".to_string());
        }
    }

    let req = String::from_utf8_lossy(&buf[..total_read]);
    let request_line =
        req.lines().next().ok_or_else(|| "Invalid callback request line".to_string())?;
    let mut parts = request_line.split_whitespace();
    let method = parts.next().unwrap_or_default();
    let target = parts.next().unwrap_or_default();

    if method != "GET" {
        return Err(format!("Expected GET callback request, got '{method}'"));
    }
    if target.is_empty() {
        return Err("Missing callback request target".to_string());
    }

    Ok(target.to_string())
}

async fn write_bad_request(stream: &mut TcpStream, message: &str) -> std::io::Result<()> {
    let body = format!("Failed to authenticate: {message}");
    let response = format!(
        "HTTP/1.1 400 Bad Request\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: {}\r\nConnection: close\r\n\r\n{}",
        body.len(),
        body
    );
    stream.write_all(response.as_bytes()).await?;
    stream.shutdown().await
}

async fn write_redirect(stream: &mut TcpStream, location: &str) -> std::io::Result<()> {
    let response = format!(
        "HTTP/1.1 302 Found\r\nLocation: {location}\r\nContent-Length: 0\r\nConnection: close\r\n\r\n"
    );
    stream.write_all(response.as_bytes()).await?;
    stream.shutdown().await
}

async fn exchange_access_token(oauth: &OAuthFlow, code: &str) -> CommandResult<String> {
    let response = http::build_client(None)?
        .post(&oauth.token_url)
        .form(&[
            ("grant_type", "authorization_code"),
            ("client_id", OAUTH_CLIENT_ID),
            ("code", code),
            ("redirect_uri", oauth.redirect_url.as_str()),
            ("code_verifier", oauth.code_verifier.as_str()),
        ])
        .send()
        .await
        .map_err(|e| format!("Failed to exchange OAuth code for access token: {e}"))?;

    let status = response.status();
    let body =
        response.text().await.map_err(|e| format!("Failed to read token response body: {e}"))?;

    if !status.is_success() {
        return Err(format!(
            "Failed to fetch access token: status={} body={}",
            status.as_u16(),
            body
        ));
    }

    let parsed: Value =
        serde_json::from_str(&body).map_err(|e| format!("Invalid token response JSON: {e}"))?;
    let token = parsed
        .get("access_token")
        .and_then(Value::as_str)
        .filter(|s| !s.is_empty())
        .ok_or_else(|| format!("Token response missing access_token: {body}"))?;

    Ok(token.to_string())
}

fn keyring_entry(environment: Environment) -> CommandResult<Entry> {
    Entry::new(environment.keyring_service(), KEYRING_USER)
        .map_err(|e| format!("Failed to initialize auth keyring entry: {e}"))
}

fn get_auth_token(environment: Environment) -> CommandResult<Option<String>> {
    let entry = keyring_entry(environment)?;
    match entry.get_password() {
        Ok(token) => Ok(Some(token)),
        Err(keyring::Error::NoEntry) => Ok(None),
        Err(err) => Err(format!("Failed to read auth token: {err}")),
    }
}

fn store_auth_token(environment: Environment, token: &str) -> CommandResult {
    let entry = keyring_entry(environment)?;
    entry.set_password(token).map_err(|e| format!("Failed to store auth token: {e}"))
}

fn delete_auth_token(environment: Environment) -> CommandResult {
    let entry = keyring_entry(environment)?;
    match entry.delete_credential() {
        Ok(()) | Err(keyring::Error::NoEntry) => Ok(()),
        Err(err) => Err(format!("Failed to delete auth token: {err}")),
    }
}

fn random_hex(bytes: usize) -> String {
    let mut data = vec![0_u8; bytes];
    OsRng.fill_bytes(&mut data);
    hex::encode(data)
}

fn confirm_open_browser() -> CommandResult<bool> {
    if !io::stdin().is_terminal() {
        return Ok(true);
    }

    loop {
        print!("Open default browser? [Y/n]: ");
        io::stdout().flush().map_err(|e| format!("Failed to flush stdout: {e}"))?;

        let mut input = String::new();
        io::stdin().read_line(&mut input).map_err(|e| format!("Failed to read input: {e}"))?;

        match input.trim().to_ascii_lowercase().as_str() {
            "" | "y" | "yes" => return Ok(true),
            "n" | "no" => return Ok(false),
            _ => ui::warning("Please answer y or n"),
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn environment_mapping() {
        assert_eq!(parse_environment(Some("staging")), Environment::Staging);
        assert_eq!(parse_environment(Some("development")), Environment::Development);
        assert_eq!(parse_environment(Some("production")), Environment::Production);
        assert_eq!(parse_environment(None), Environment::Production);
    }

    #[tokio::test]
    async fn parses_callback_request() {
        let listener = TcpListener::bind("127.0.0.1:0").await.expect("bind");
        let addr = listener.local_addr().expect("local addr");

        let server = tokio::spawn(async move {
            let (mut stream, _) = listener.accept().await.expect("accept");
            parse_callback_request(&mut stream).await
        });

        let mut client = TcpStream::connect(addr).await.expect("connect");
        client
            .write_all(
                b"GET /oauth/callback?code=abc123&state=xyz HTTP/1.1\r\nHost: localhost\r\n\r\n",
            )
            .await
            .expect("write");

        let parsed = server.await.expect("join").expect("parse");
        assert_eq!(parsed.0, "xyz");
        assert_eq!(parsed.1, "abc123");
    }

    #[tokio::test]
    async fn parse_callback_request_oauth_error() {
        let listener = TcpListener::bind("127.0.0.1:0").await.expect("bind");
        let addr = listener.local_addr().expect("local addr");

        let server = tokio::spawn(async move {
            let (mut stream, _) = listener.accept().await.expect("accept");
            parse_callback_request(&mut stream).await
        });

        let mut client = TcpStream::connect(addr).await.expect("connect");
        client
            .write_all(
                b"GET /oauth/callback?error=access_denied&error_description=User%20denied&state=xyz HTTP/1.1\r\nHost: localhost\r\n\r\n",
            )
            .await
            .expect("write");

        let err = server.await.expect("join").expect_err("should fail");
        assert!(err.contains("OAuth provider returned error: access_denied"));
        assert!(err.contains("User denied"));
    }

    #[tokio::test]
    async fn receive_oauth_code_fails_fast_on_provider_error() {
        let listener = TcpListener::bind("127.0.0.1:0").await.expect("bind");
        let addr = listener.local_addr().expect("local addr");

        let server = tokio::spawn(async move {
            receive_oauth_code(listener, "expected-state", "http://localhost:9444").await
        });

        let mut client = TcpStream::connect(addr).await.expect("connect");
        client
            .write_all(
                b"GET /oauth/callback?error=access_denied&state=expected-state HTTP/1.1\r\nHost: localhost\r\n\r\n",
            )
            .await
            .expect("write");

        let result = tokio::time::timeout(std::time::Duration::from_secs(2), server)
            .await
            .expect("should not timeout")
            .expect("join");
        let err = result.expect_err("should return oauth error");
        assert!(err.contains("OAuth provider returned error: access_denied"));
    }

    #[test]
    fn builds_oauth_flow_with_pkce() {
        let flow = build_oauth_flow(Environment::Development, 8080).expect("flow");
        assert!(flow.auth_url.as_str().contains("code_challenge_method=S256"));
        assert!(
            flow.auth_url
                .as_str()
                .contains("redirect_uri=http%3A%2F%2F127.0.0.1%3A8080%2Foauth%2Fcallback")
|
|
||||||
);
|
|
||||||
assert_eq!(flow.redirect_url, "http://127.0.0.1:8080/oauth/callback");
|
|
||||||
assert_eq!(flow.token_url, "http://localhost:9444/login/oauth/access_token");
|
|
||||||
}
|
|
||||||
}
|
|
||||||
@@ -1,42 +0,0 @@
use crate::cli::{CookieJarArgs, CookieJarCommands};
use crate::context::CliContext;
use crate::utils::workspace::resolve_workspace_id;

type CommandResult<T = ()> = std::result::Result<T, String>;

pub fn run(ctx: &CliContext, args: CookieJarArgs) -> i32 {
    let result = match args.command {
        CookieJarCommands::List { workspace_id } => list(ctx, workspace_id.as_deref()),
    };

    match result {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

fn list(ctx: &CliContext, workspace_id: Option<&str>) -> CommandResult {
    let workspace_id = resolve_workspace_id(ctx, workspace_id, "cookie-jar list")?;
    let cookie_jars = ctx
        .db()
        .list_cookie_jars(&workspace_id)
        .map_err(|e| format!("Failed to list cookie jars: {e}"))?;

    if cookie_jars.is_empty() {
        println!("No cookie jars found in workspace {}", workspace_id);
    } else {
        for cookie_jar in cookie_jars {
            println!(
                "{} - {} ({} cookies)",
                cookie_jar.id,
                cookie_jar.name,
                cookie_jar.cookies.len()
            );
        }
    }

    Ok(())
}
@@ -1,176 +0,0 @@
use crate::cli::{EnvironmentArgs, EnvironmentCommands};
use crate::context::CliContext;
use crate::utils::confirm::confirm_delete;
use crate::utils::json::{
    apply_merge_patch, is_json_shorthand, merge_workspace_id_arg, parse_optional_json,
    parse_required_json, require_id, validate_create_id,
};
use crate::utils::schema::append_agent_hints;
use crate::utils::workspace::resolve_workspace_id;
use schemars::schema_for;
use yaak_models::models::Environment;
use yaak_models::util::UpdateSource;

type CommandResult<T = ()> = std::result::Result<T, String>;

pub fn run(ctx: &CliContext, args: EnvironmentArgs) -> i32 {
    let result = match args.command {
        EnvironmentCommands::List { workspace_id } => list(ctx, workspace_id.as_deref()),
        EnvironmentCommands::Schema { pretty } => schema(pretty),
        EnvironmentCommands::Show { environment_id } => show(ctx, &environment_id),
        EnvironmentCommands::Create { workspace_id, name, json } => {
            create(ctx, workspace_id, name, json)
        }
        EnvironmentCommands::Update { json, json_input } => update(ctx, json, json_input),
        EnvironmentCommands::Delete { environment_id, yes } => delete(ctx, &environment_id, yes),
    };

    match result {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

fn schema(pretty: bool) -> CommandResult {
    let mut schema = serde_json::to_value(schema_for!(Environment))
        .map_err(|e| format!("Failed to serialize environment schema: {e}"))?;
    append_agent_hints(&mut schema);

    let output =
        if pretty { serde_json::to_string_pretty(&schema) } else { serde_json::to_string(&schema) }
            .map_err(|e| format!("Failed to format environment schema JSON: {e}"))?;
    println!("{output}");
    Ok(())
}

fn list(ctx: &CliContext, workspace_id: Option<&str>) -> CommandResult {
    let workspace_id = resolve_workspace_id(ctx, workspace_id, "environment list")?;
    let environments = ctx
        .db()
        .list_environments_ensure_base(&workspace_id)
        .map_err(|e| format!("Failed to list environments: {e}"))?;

    if environments.is_empty() {
        println!("No environments found in workspace {}", workspace_id);
    } else {
        for environment in environments {
            println!("{} - {} ({})", environment.id, environment.name, environment.parent_model);
        }
    }
    Ok(())
}

fn show(ctx: &CliContext, environment_id: &str) -> CommandResult {
    let environment = ctx
        .db()
        .get_environment(environment_id)
        .map_err(|e| format!("Failed to get environment: {e}"))?;
    let output = serde_json::to_string_pretty(&environment)
        .map_err(|e| format!("Failed to serialize environment: {e}"))?;
    println!("{output}");
    Ok(())
}

fn create(
    ctx: &CliContext,
    workspace_id: Option<String>,
    name: Option<String>,
    json: Option<String>,
) -> CommandResult {
    let json_shorthand =
        workspace_id.as_deref().filter(|v| is_json_shorthand(v)).map(str::to_owned);
    let workspace_id_arg = workspace_id.filter(|v| !is_json_shorthand(v));

    let payload = parse_optional_json(json, json_shorthand, "environment create")?;

    if let Some(payload) = payload {
        if name.is_some() {
            return Err("environment create cannot combine --name with JSON payload".to_string());
        }

        validate_create_id(&payload, "environment")?;
        let mut environment: Environment = serde_json::from_value(payload)
            .map_err(|e| format!("Failed to parse environment create JSON: {e}"))?;
        let fallback_workspace_id =
            if workspace_id_arg.is_none() && environment.workspace_id.is_empty() {
                Some(resolve_workspace_id(ctx, None, "environment create")?)
            } else {
                None
            };
        merge_workspace_id_arg(
            workspace_id_arg.as_deref().or(fallback_workspace_id.as_deref()),
            &mut environment.workspace_id,
            "environment create",
        )?;

        if environment.parent_model.is_empty() {
            environment.parent_model = "environment".to_string();
        }

        let created = ctx
            .db()
            .upsert_environment(&environment, &UpdateSource::Sync)
            .map_err(|e| format!("Failed to create environment: {e}"))?;

        println!("Created environment: {}", created.id);
        return Ok(());
    }

    let workspace_id =
        resolve_workspace_id(ctx, workspace_id_arg.as_deref(), "environment create")?;
    let name = name.ok_or_else(|| {
        "environment create requires --name unless JSON payload is provided".to_string()
    })?;

    let environment = Environment {
        workspace_id,
        name,
        parent_model: "environment".to_string(),
        ..Default::default()
    };

    let created = ctx
        .db()
        .upsert_environment(&environment, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to create environment: {e}"))?;

    println!("Created environment: {}", created.id);
    Ok(())
}

fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) -> CommandResult {
    let patch = parse_required_json(json, json_input, "environment update")?;
    let id = require_id(&patch, "environment update")?;

    let existing = ctx
        .db()
        .get_environment(&id)
        .map_err(|e| format!("Failed to get environment for update: {e}"))?;
    let updated = apply_merge_patch(&existing, &patch, &id, "environment update")?;

    let saved = ctx
        .db()
        .upsert_environment(&updated, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to update environment: {e}"))?;

    println!("Updated environment: {}", saved.id);
    Ok(())
}

fn delete(ctx: &CliContext, environment_id: &str, yes: bool) -> CommandResult {
    if !yes && !confirm_delete("environment", environment_id) {
        println!("Aborted");
        return Ok(());
    }

    let deleted = ctx
        .db()
        .delete_environment_by_id(environment_id, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to delete environment: {e}"))?;

    println!("Deleted environment: {}", deleted.id);
    Ok(())
}
@@ -1,144 +0,0 @@
use crate::cli::{FolderArgs, FolderCommands};
use crate::context::CliContext;
use crate::utils::confirm::confirm_delete;
use crate::utils::json::{
    apply_merge_patch, is_json_shorthand, merge_workspace_id_arg, parse_optional_json,
    parse_required_json, require_id, validate_create_id,
};
use crate::utils::workspace::resolve_workspace_id;
use yaak_models::models::Folder;
use yaak_models::util::UpdateSource;

type CommandResult<T = ()> = std::result::Result<T, String>;

pub fn run(ctx: &CliContext, args: FolderArgs) -> i32 {
    let result = match args.command {
        FolderCommands::List { workspace_id } => list(ctx, workspace_id.as_deref()),
        FolderCommands::Show { folder_id } => show(ctx, &folder_id),
        FolderCommands::Create { workspace_id, name, json } => {
            create(ctx, workspace_id, name, json)
        }
        FolderCommands::Update { json, json_input } => update(ctx, json, json_input),
        FolderCommands::Delete { folder_id, yes } => delete(ctx, &folder_id, yes),
    };

    match result {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

fn list(ctx: &CliContext, workspace_id: Option<&str>) -> CommandResult {
    let workspace_id = resolve_workspace_id(ctx, workspace_id, "folder list")?;
    let folders =
        ctx.db().list_folders(&workspace_id).map_err(|e| format!("Failed to list folders: {e}"))?;
    if folders.is_empty() {
        println!("No folders found in workspace {}", workspace_id);
    } else {
        for folder in folders {
            println!("{} - {}", folder.id, folder.name);
        }
    }
    Ok(())
}

fn show(ctx: &CliContext, folder_id: &str) -> CommandResult {
    let folder =
        ctx.db().get_folder(folder_id).map_err(|e| format!("Failed to get folder: {e}"))?;
    let output = serde_json::to_string_pretty(&folder)
        .map_err(|e| format!("Failed to serialize folder: {e}"))?;
    println!("{output}");
    Ok(())
}

fn create(
    ctx: &CliContext,
    workspace_id: Option<String>,
    name: Option<String>,
    json: Option<String>,
) -> CommandResult {
    let json_shorthand =
        workspace_id.as_deref().filter(|v| is_json_shorthand(v)).map(str::to_owned);
    let workspace_id_arg = workspace_id.filter(|v| !is_json_shorthand(v));

    let payload = parse_optional_json(json, json_shorthand, "folder create")?;

    if let Some(payload) = payload {
        if name.is_some() {
            return Err("folder create cannot combine --name with JSON payload".to_string());
        }

        validate_create_id(&payload, "folder")?;
        let mut folder: Folder = serde_json::from_value(payload)
            .map_err(|e| format!("Failed to parse folder create JSON: {e}"))?;
        let fallback_workspace_id = if workspace_id_arg.is_none() && folder.workspace_id.is_empty()
        {
            Some(resolve_workspace_id(ctx, None, "folder create")?)
        } else {
            None
        };
        merge_workspace_id_arg(
            workspace_id_arg.as_deref().or(fallback_workspace_id.as_deref()),
            &mut folder.workspace_id,
            "folder create",
        )?;

        let created = ctx
            .db()
            .upsert_folder(&folder, &UpdateSource::Sync)
            .map_err(|e| format!("Failed to create folder: {e}"))?;

        println!("Created folder: {}", created.id);
        return Ok(());
    }

    let workspace_id = resolve_workspace_id(ctx, workspace_id_arg.as_deref(), "folder create")?;
    let name = name.ok_or_else(|| {
        "folder create requires --name unless JSON payload is provided".to_string()
    })?;

    let folder = Folder { workspace_id, name, ..Default::default() };

    let created = ctx
        .db()
        .upsert_folder(&folder, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to create folder: {e}"))?;

    println!("Created folder: {}", created.id);
    Ok(())
}

fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) -> CommandResult {
    let patch = parse_required_json(json, json_input, "folder update")?;
    let id = require_id(&patch, "folder update")?;

    let existing =
        ctx.db().get_folder(&id).map_err(|e| format!("Failed to get folder for update: {e}"))?;
    let updated = apply_merge_patch(&existing, &patch, &id, "folder update")?;

    let saved = ctx
        .db()
        .upsert_folder(&updated, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to update folder: {e}"))?;

    println!("Updated folder: {}", saved.id);
    Ok(())
}

fn delete(ctx: &CliContext, folder_id: &str, yes: bool) -> CommandResult {
    if !yes && !confirm_delete("folder", folder_id) {
        println!("Aborted");
        return Ok(());
    }

    let deleted = ctx
        .db()
        .delete_folder_by_id(folder_id, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to delete folder: {e}"))?;

    println!("Deleted folder: {}", deleted.id);
    Ok(())
}
@@ -1,8 +0,0 @@
pub mod auth;
pub mod cookie_jar;
pub mod environment;
pub mod folder;
pub mod plugin;
pub mod request;
pub mod send;
pub mod workspace;
@@ -1,680 +0,0 @@
use crate::cli::{GenerateArgs, InstallPluginArgs, PluginPathArg};
use crate::context::CliContext;
use crate::ui;
use crate::utils::http;
use keyring::Entry;
use rand::Rng;
use rolldown::{
    BundleEvent, Bundler, BundlerOptions, ExperimentalOptions, InputItem, LogLevel, OutputFormat,
    Platform, WatchOption, Watcher, WatcherEvent,
};
use serde::Deserialize;
use std::collections::HashSet;
use std::fs;
use std::io::{self, IsTerminal, Read, Write};
use std::path::{Path, PathBuf};
use std::sync::Arc;
use tokio::sync::Mutex;
use walkdir::WalkDir;
use yaak_api::{ApiClientKind, yaak_api_client};
use yaak_models::models::{Plugin, PluginSource};
use yaak_models::util::UpdateSource;
use yaak_plugins::events::PluginContext;
use yaak_plugins::install::download_and_install;
use zip::CompressionMethod;
use zip::write::SimpleFileOptions;

type CommandResult<T = ()> = std::result::Result<T, String>;

const KEYRING_USER: &str = "yaak";

#[derive(Clone, Copy, Debug, Eq, PartialEq)]
enum Environment {
    Production,
    Staging,
    Development,
}

impl Environment {
    fn api_base_url(self) -> &'static str {
        match self {
            Environment::Production => "https://api.yaak.app",
            Environment::Staging => "https://todo.yaak.app",
            Environment::Development => "http://localhost:9444",
        }
    }

    fn keyring_service(self) -> &'static str {
        match self {
            Environment::Production => "app.yaak.cli.Token",
            Environment::Staging => "app.yaak.cli.staging.Token",
            Environment::Development => "app.yaak.cli.dev.Token",
        }
    }
}

pub async fn run_build(args: PluginPathArg) -> i32 {
    match build(args).await {
        Ok(()) => 0,
        Err(error) => {
            ui::error(&error);
            1
        }
    }
}

pub async fn run_install(context: &CliContext, args: InstallPluginArgs) -> i32 {
    match install(context, args).await {
        Ok(()) => 0,
        Err(error) => {
            ui::error(&error);
            1
        }
    }
}

pub async fn run_dev(args: PluginPathArg) -> i32 {
    match dev(args).await {
        Ok(()) => 0,
        Err(error) => {
            ui::error(&error);
            1
        }
    }
}

pub async fn run_generate(args: GenerateArgs) -> i32 {
    match generate(args) {
        Ok(()) => 0,
        Err(error) => {
            ui::error(&error);
            1
        }
    }
}

pub async fn run_publish(args: PluginPathArg) -> i32 {
    match publish(args).await {
        Ok(()) => 0,
        Err(error) => {
            ui::error(&error);
            1
        }
    }
}

async fn build(args: PluginPathArg) -> CommandResult {
    let plugin_dir = resolve_plugin_dir(args.path)?;
    ensure_plugin_build_inputs(&plugin_dir)?;

    ui::info(&format!("Building plugin {}...", plugin_dir.display()));
    let warnings = build_plugin_bundle(&plugin_dir).await?;
    for warning in warnings {
        ui::warning(&warning);
    }
    ui::success(&format!("Built plugin bundle at {}", plugin_dir.join("build/index.js").display()));
    Ok(())
}

async fn dev(args: PluginPathArg) -> CommandResult {
    let plugin_dir = resolve_plugin_dir(args.path)?;
    ensure_plugin_build_inputs(&plugin_dir)?;

    ui::info(&format!("Watching plugin {}...", plugin_dir.display()));

    let bundler = Bundler::new(bundler_options(&plugin_dir, true))
        .map_err(|err| format!("Failed to initialize Rolldown watcher: {err}"))?;
    let watcher = Watcher::new(vec![Arc::new(Mutex::new(bundler))], None)
        .map_err(|err| format!("Failed to start Rolldown watcher: {err}"))?;
    let emitter = watcher.emitter();
    let watch_root = plugin_dir.clone();
    let _event_logger = tokio::spawn(async move {
        loop {
            let event = {
                let rx = emitter.rx.lock().await;
                rx.recv()
            };

            let Ok(event) = event else {
                break;
            };

            match event {
                WatcherEvent::Change(change) => {
                    let changed_path = Path::new(change.path.as_str());
                    let display_path = changed_path
                        .strip_prefix(&watch_root)
                        .map(|p| p.display().to_string())
                        .unwrap_or_else(|_| {
                            changed_path
                                .file_name()
                                .map(|name| name.to_string_lossy().into_owned())
                                .unwrap_or_else(|| "unknown".to_string())
                        });
                    ui::info(&format!("Rebuilding plugin {display_path}"));
                }
                WatcherEvent::Event(BundleEvent::BundleEnd(_)) => {}
                WatcherEvent::Event(BundleEvent::Error(event)) => {
                    if event.error.diagnostics.is_empty() {
                        ui::error("Plugin build failed");
                    } else {
                        for diagnostic in event.error.diagnostics {
                            ui::error(&diagnostic.to_string());
                        }
                    }
                }
                WatcherEvent::Close => break,
                _ => {}
            }
        }
    });

    watcher.start().await;
    Ok(())
}

fn generate(args: GenerateArgs) -> CommandResult {
    let default_name = random_name();
    let name = match args.name {
        Some(name) => name,
        None => prompt_with_default("Plugin name", &default_name)?,
    };

    let default_dir = format!("./{name}");
    let output_dir = match args.dir {
        Some(dir) => dir,
        None => PathBuf::from(prompt_with_default("Plugin dir", &default_dir)?),
    };

    if output_dir.exists() {
        return Err(format!("Plugin directory already exists: {}", output_dir.display()));
    }

    ui::info(&format!("Generating plugin in {}", output_dir.display()));
    fs::create_dir_all(output_dir.join("src"))
        .map_err(|e| format!("Failed creating plugin directory {}: {e}", output_dir.display()))?;

    write_file(&output_dir.join(".gitignore"), TEMPLATE_GITIGNORE)?;
    write_file(
        &output_dir.join("package.json"),
        &TEMPLATE_PACKAGE_JSON.replace("yaak-plugin-name", &name),
    )?;
    write_file(&output_dir.join("tsconfig.json"), TEMPLATE_TSCONFIG)?;
    write_file(&output_dir.join("README.md"), &TEMPLATE_README.replace("yaak-plugin-name", &name))?;
    write_file(
        &output_dir.join("src/index.ts"),
        &TEMPLATE_INDEX_TS.replace("yaak-plugin-name", &name),
    )?;
    write_file(&output_dir.join("src/index.test.ts"), TEMPLATE_INDEX_TEST_TS)?;

    ui::success("Plugin scaffold generated");
    ui::info("Next steps:");
    println!("  1. cd {}", output_dir.display());
    println!("  2. npm install");
    println!("  3. yaak plugin build");
    Ok(())
}

async fn publish(args: PluginPathArg) -> CommandResult {
    let plugin_dir = resolve_plugin_dir(args.path)?;
    ensure_plugin_build_inputs(&plugin_dir)?;

    let environment = current_environment();
    let token = get_auth_token(environment)?
        .ok_or_else(|| "Not logged in. Run `yaak auth login`.".to_string())?;

    ui::info(&format!("Building plugin {}...", plugin_dir.display()));
    let warnings = build_plugin_bundle(&plugin_dir).await?;
    for warning in warnings {
        ui::warning(&warning);
    }

    ui::info("Archiving plugin");
    let archive = create_publish_archive(&plugin_dir)?;

    ui::info("Uploading plugin");
    let url = format!("{}/api/v1/plugins/publish", environment.api_base_url());
    let response = http::build_client(Some(&token))?
        .post(url)
        .header(reqwest::header::CONTENT_TYPE, "application/zip")
        .body(archive)
        .send()
        .await
        .map_err(|e| format!("Failed to upload plugin: {e}"))?;

    let status = response.status();
    let body =
        response.text().await.map_err(|e| format!("Failed reading publish response body: {e}"))?;

    if !status.is_success() {
        return Err(http::parse_api_error(status.as_u16(), &body));
    }

    let published: PublishResponse = serde_json::from_str(&body)
        .map_err(|e| format!("Failed parsing publish response JSON: {e}\nResponse: {body}"))?;
    ui::success(&format!("Plugin published {}", published.version));
    println!("  -> {}", published.url);
    Ok(())
}

async fn install(context: &CliContext, args: InstallPluginArgs) -> CommandResult {
    if args.source.starts_with('@') {
        let (name, version) =
            parse_registry_install_spec(args.source.as_str()).ok_or_else(|| {
                "Invalid registry plugin spec. Expected format: @org/plugin or @org/plugin@version"
                    .to_string()
            })?;
        return install_from_registry(context, name, version).await;
    }

    install_from_directory(context, args.source.as_str()).await
}

async fn install_from_registry(
    context: &CliContext,
    name: String,
    version: Option<String>,
) -> CommandResult {
    let current_version = crate::version::cli_version();
    let http_client = yaak_api_client(ApiClientKind::Cli, current_version)
        .map_err(|err| format!("Failed to initialize API client: {err}"))?;
    let installing_version = version.clone().unwrap_or_else(|| "latest".to_string());
    ui::info(&format!("Installing registry plugin {name}@{installing_version}"));

    let plugin_context = PluginContext::new(Some("cli".to_string()), None);
    let installed = download_and_install(
        context.plugin_manager(),
        context.query_manager(),
        &http_client,
        &plugin_context,
        name.as_str(),
        version,
    )
    .await
    .map_err(|err| format!("Failed to install plugin: {err}"))?;

    ui::success(&format!("Installed plugin {}@{}", installed.name, installed.version));
    Ok(())
}

async fn install_from_directory(context: &CliContext, source: &str) -> CommandResult {
    let plugin_dir = resolve_plugin_dir(Some(PathBuf::from(source)))?;
    let plugin_dir_str = plugin_dir
        .to_str()
        .ok_or_else(|| {
            format!("Plugin directory path is not valid UTF-8: {}", plugin_dir.display())
        })?
        .to_string();
    ui::info(&format!("Installing plugin from directory {}", plugin_dir.display()));

    let plugin = context
        .db()
        .upsert_plugin(
            &Plugin {
                directory: plugin_dir_str,
                url: None,
                enabled: true,
                source: PluginSource::Filesystem,
                ..Default::default()
            },
            &UpdateSource::Background,
        )
        .map_err(|err| format!("Failed to save plugin in database: {err}"))?;

    let plugin_context = PluginContext::new(Some("cli".to_string()), None);
    context
        .plugin_manager()
        .add_plugin(&plugin_context, &plugin)
        .await
        .map_err(|err| format!("Failed to load plugin runtime: {err}"))?;

    ui::success(&format!("Installed plugin from {}", plugin.directory));
    Ok(())
}

fn parse_registry_install_spec(source: &str) -> Option<(String, Option<String>)> {
    if !source.starts_with('@') || !source.contains('/') {
        return None;
    }

    let rest = source.get(1..)?;
    let version_split = rest.rfind('@').map(|idx| idx + 1);
    let (name, version) = match version_split {
        Some(at_idx) => {
            let (name, version) = source.split_at(at_idx);
            let version = version.strip_prefix('@').unwrap_or_default();
            if version.is_empty() {
                return None;
            }
            (name.to_string(), Some(version.to_string()))
        }
        None => (source.to_string(), None),
    };

    if !name.starts_with('@') {
        return None;
    }

    let without_scope = name.get(1..)?;
    let (scope, plugin_name) = without_scope.split_once('/')?;
    if scope.is_empty() || plugin_name.is_empty() {
        return None;
    }

    Some((name, version))
}

#[derive(Deserialize)]
struct PublishResponse {
    version: String,
    url: String,
}

async fn build_plugin_bundle(plugin_dir: &Path) -> CommandResult<Vec<String>> {
    prepare_build_output_dir(plugin_dir)?;
    let mut bundler = Bundler::new(bundler_options(plugin_dir, false))
        .map_err(|err| format!("Failed to initialize Rolldown: {err}"))?;
    let output = bundler.write().await.map_err(|err| format!("Plugin build failed:\n{err}"))?;

    Ok(output.warnings.into_iter().map(|w| w.to_string()).collect())
}

fn prepare_build_output_dir(plugin_dir: &Path) -> CommandResult {
    let build_dir = plugin_dir.join("build");
    if build_dir.exists() {
        fs::remove_dir_all(&build_dir)
            .map_err(|e| format!("Failed to clean build directory {}: {e}", build_dir.display()))?;
    }
    fs::create_dir_all(&build_dir)
        .map_err(|e| format!("Failed to create build directory {}: {e}", build_dir.display()))
}

fn bundler_options(plugin_dir: &Path, watch: bool) -> BundlerOptions {
    BundlerOptions {
        input: Some(vec![InputItem { import: "./src/index.ts".to_string(), ..Default::default() }]),
        cwd: Some(plugin_dir.to_path_buf()),
        file: Some("build/index.js".to_string()),
        format: Some(OutputFormat::Cjs),
        platform: Some(Platform::Node),
        log_level: Some(LogLevel::Info),
        experimental: watch
            .then_some(ExperimentalOptions { incremental_build: Some(true), ..Default::default() }),
        watch: watch.then_some(WatchOption::default()),
        ..Default::default()
    }
}

fn resolve_plugin_dir(path: Option<PathBuf>) -> CommandResult<PathBuf> {
    let cwd =
        std::env::current_dir().map_err(|e| format!("Failed to read current directory: {e}"))?;
    let candidate = match path {
        Some(path) if path.is_absolute() => path,
        Some(path) => cwd.join(path),
        None => cwd,
    };

    if !candidate.exists() {
        return Err(format!("Plugin directory does not exist: {}", candidate.display()));
    }
    if !candidate.is_dir() {
        return Err(format!("Plugin path is not a directory: {}", candidate.display()));
    }

    candidate
        .canonicalize()
        .map_err(|e| format!("Failed to resolve plugin directory {}: {e}", candidate.display()))
}

fn ensure_plugin_build_inputs(plugin_dir: &Path) -> CommandResult {
    let package_json = plugin_dir.join("package.json");
    if !package_json.is_file() {
        return Err(format!(
            "{} does not exist. Ensure that you are in a plugin directory.",
            package_json.display()
        ));
    }

    let entry = plugin_dir.join("src/index.ts");
|
||||||
if !entry.is_file() {
|
|
||||||
return Err(format!("Required entrypoint missing: {}", entry.display()));
|
|
||||||
}
|
|
||||||
|
|
||||||
Ok(())
|
|
||||||
}
|
|
||||||
|
|
||||||
fn create_publish_archive(plugin_dir: &Path) -> CommandResult<Vec<u8>> {
|
|
||||||
let required_files = [
|
|
||||||
"README.md",
|
|
||||||
"package.json",
|
|
||||||
"build/index.js",
|
|
||||||
"src/index.ts",
|
|
||||||
];
|
|
||||||
let optional_files = ["package-lock.json"];
|
|
||||||
|
|
||||||
let mut selected = HashSet::new();
|
|
||||||
for required in required_files {
|
|
||||||
let required_path = plugin_dir.join(required);
|
|
||||||
if !required_path.is_file() {
|
|
||||||
return Err(format!("Missing required file: {required}"));
|
|
||||||
}
|
|
||||||
selected.insert(required.to_string());
|
|
||||||
}
|
|
||||||
for optional in optional_files {
|
|
||||||
selected.insert(optional.to_string());
|
|
||||||
}
|
|
||||||
|
|
||||||
let cursor = std::io::Cursor::new(Vec::new());
|
|
||||||
let mut zip = zip::ZipWriter::new(cursor);
|
|
||||||
let options = SimpleFileOptions::default().compression_method(CompressionMethod::Deflated);
|
|
||||||
|
|
||||||
for entry in WalkDir::new(plugin_dir) {
|
|
||||||
let entry = entry.map_err(|e| format!("Failed walking plugin directory: {e}"))?;
|
|
||||||
if !entry.file_type().is_file() {
|
|
||||||
continue;
|
|
||||||
}
|
|
||||||
|
|
||||||
let path = entry.path();
|
|
||||||
let rel = path
|
|
||||||
.strip_prefix(plugin_dir)
|
|
||||||
.map_err(|e| format!("Failed deriving relative path for {}: {e}", path.display()))?;
|
|
||||||
let rel = rel.to_string_lossy().replace('\\', "/");
|
|
||||||
|
|
||||||
let keep = rel.starts_with("src/") || rel.starts_with("build/") || selected.contains(&rel);
|
|
||||||
if !keep {
|
|
||||||
continue;
|
|
||||||
}
|
|
||||||
|
|
||||||
zip.start_file(rel, options).map_err(|e| format!("Failed adding file to archive: {e}"))?;
|
|
||||||
let mut file = fs::File::open(path)
|
|
||||||
.map_err(|e| format!("Failed opening file {}: {e}", path.display()))?;
|
|
||||||
let mut contents = Vec::new();
|
|
||||||
file.read_to_end(&mut contents)
|
|
||||||
.map_err(|e| format!("Failed reading file {}: {e}", path.display()))?;
|
|
||||||
zip.write_all(&contents).map_err(|e| format!("Failed writing archive contents: {e}"))?;
|
|
||||||
}
|
|
||||||
|
|
||||||
let cursor = zip.finish().map_err(|e| format!("Failed finalizing plugin archive: {e}"))?;
|
|
||||||
Ok(cursor.into_inner())
|
|
||||||
}
|
|
||||||
|
|
||||||
fn write_file(path: &Path, contents: &str) -> CommandResult {
|
|
||||||
if let Some(parent) = path.parent() {
|
|
||||||
fs::create_dir_all(parent)
|
|
||||||
.map_err(|e| format!("Failed creating directory {}: {e}", parent.display()))?;
|
|
||||||
}
|
|
||||||
fs::write(path, contents).map_err(|e| format!("Failed writing file {}: {e}", path.display()))
|
|
||||||
}
|
|
||||||
|
|
||||||
fn prompt_with_default(label: &str, default: &str) -> CommandResult<String> {
|
|
||||||
if !io::stdin().is_terminal() {
|
|
||||||
return Ok(default.to_string());
|
|
||||||
}
|
|
||||||
|
|
||||||
print!("{label} [{default}]: ");
|
|
||||||
io::stdout().flush().map_err(|e| format!("Failed to flush stdout: {e}"))?;
|
|
||||||
|
|
||||||
let mut input = String::new();
|
|
||||||
io::stdin().read_line(&mut input).map_err(|e| format!("Failed to read input: {e}"))?;
|
|
||||||
let trimmed = input.trim();
|
|
||||||
|
|
||||||
if trimmed.is_empty() { Ok(default.to_string()) } else { Ok(trimmed.to_string()) }
|
|
||||||
}
|
|
||||||
|
|
||||||
fn current_environment() -> Environment {
|
|
||||||
match std::env::var("ENVIRONMENT").as_deref() {
|
|
||||||
Ok("staging") => Environment::Staging,
|
|
||||||
Ok("development") => Environment::Development,
|
|
||||||
_ => Environment::Production,
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
fn keyring_entry(environment: Environment) -> CommandResult<Entry> {
|
|
||||||
Entry::new(environment.keyring_service(), KEYRING_USER)
|
|
||||||
.map_err(|e| format!("Failed to initialize auth keyring entry: {e}"))
|
|
||||||
}
|
|
||||||
|
|
||||||
fn get_auth_token(environment: Environment) -> CommandResult<Option<String>> {
|
|
||||||
let entry = keyring_entry(environment)?;
|
|
||||||
match entry.get_password() {
|
|
||||||
Ok(token) => Ok(Some(token)),
|
|
||||||
Err(keyring::Error::NoEntry) => Ok(None),
|
|
||||||
Err(err) => Err(format!("Failed to read auth token: {err}")),
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
fn random_name() -> String {
|
|
||||||
const ADJECTIVES: &[&str] = &[
|
|
||||||
"young", "youthful", "yellow", "yielding", "yappy", "yawning", "yummy", "yucky", "yearly",
|
|
||||||
"yester", "yeasty", "yelling",
|
|
||||||
];
|
|
||||||
const NOUNS: &[&str] = &[
|
|
||||||
"yak", "yarn", "year", "yell", "yoke", "yoga", "yam", "yacht", "yodel",
|
|
||||||
];
|
|
||||||
|
|
||||||
let mut rng = rand::thread_rng();
|
|
||||||
let adjective = ADJECTIVES[rng.gen_range(0..ADJECTIVES.len())];
|
|
||||||
let noun = NOUNS[rng.gen_range(0..NOUNS.len())];
|
|
||||||
format!("{adjective}-{noun}")
|
|
||||||
}
|
|
||||||
|
|
||||||
const TEMPLATE_GITIGNORE: &str = "node_modules\n";
|
|
||||||
|
|
||||||
const TEMPLATE_PACKAGE_JSON: &str = r#"{
|
|
||||||
"name": "yaak-plugin-name",
|
|
||||||
"private": true,
|
|
||||||
"version": "0.0.1",
|
|
||||||
"scripts": {
|
|
||||||
"build": "yaak plugin build",
|
|
||||||
"dev": "yaak plugin dev"
|
|
||||||
},
|
|
||||||
"devDependencies": {
|
|
||||||
"@types/node": "^24.10.1",
|
|
||||||
"typescript": "^5.9.3",
|
|
||||||
"vitest": "^4.0.14"
|
|
||||||
},
|
|
||||||
"dependencies": {
|
|
||||||
"@yaakapp/api": "^0.7.0"
|
|
||||||
}
|
|
||||||
}
|
|
||||||
"#;
|
|
||||||
|
|
||||||
const TEMPLATE_TSCONFIG: &str = r#"{
|
|
||||||
"compilerOptions": {
|
|
||||||
"target": "es2021",
|
|
||||||
"lib": ["DOM", "DOM.Iterable", "ESNext"],
|
|
||||||
"useDefineForClassFields": true,
|
|
||||||
"allowJs": false,
|
|
||||||
"skipLibCheck": true,
|
|
||||||
"esModuleInterop": false,
|
|
||||||
"allowSyntheticDefaultImports": true,
|
|
||||||
"strict": true,
|
|
||||||
"noUncheckedIndexedAccess": true,
|
|
||||||
"forceConsistentCasingInFileNames": true,
|
|
||||||
"module": "ESNext",
|
|
||||||
"moduleResolution": "Node",
|
|
||||||
"resolveJsonModule": true,
|
|
||||||
"isolatedModules": true,
|
|
||||||
"noEmit": true,
|
|
||||||
"jsx": "react-jsx"
|
|
||||||
},
|
|
||||||
"include": ["src"]
|
|
||||||
}
|
|
||||||
"#;
|
|
||||||
|
|
||||||
const TEMPLATE_README: &str = r#"# yaak-plugin-name
|
|
||||||
|
|
||||||
Describe what your plugin does.
|
|
||||||
"#;
|
|
||||||
|
|
||||||
const TEMPLATE_INDEX_TS: &str = r#"import type { PluginDefinition } from "@yaakapp/api";
|
|
||||||
|
|
||||||
export const plugin: PluginDefinition = {
|
|
||||||
httpRequestActions: [
|
|
||||||
{
|
|
||||||
label: "Hello, From Plugin",
|
|
||||||
icon: "info",
|
|
||||||
async onSelect(ctx, args) {
|
|
||||||
await ctx.toast.show({
|
|
||||||
color: "success",
|
|
||||||
message: `You clicked the request ${args.httpRequest.id}`,
|
|
||||||
});
|
|
||||||
},
|
|
||||||
},
|
|
||||||
],
|
|
||||||
};
|
|
||||||
"#;
|
|
||||||
|
|
||||||
const TEMPLATE_INDEX_TEST_TS: &str = r#"import { describe, expect, test } from "vitest";
|
|
||||||
import { plugin } from "./index";
|
|
||||||
|
|
||||||
describe("Example Plugin", () => {
|
|
||||||
test("Exports plugin object", () => {
|
|
||||||
expect(plugin).toBeTypeOf("object");
|
|
||||||
});
|
|
||||||
});
|
|
||||||
"#;
|
|
||||||
|
|
||||||
#[cfg(test)]
|
|
||||||
mod tests {
|
|
||||||
use super::create_publish_archive;
|
|
||||||
use std::collections::HashSet;
|
|
||||||
use std::fs;
|
|
||||||
use std::io::Cursor;
|
|
||||||
use tempfile::TempDir;
|
|
||||||
use zip::ZipArchive;
|
|
||||||
|
|
||||||
#[test]
|
|
||||||
fn publish_archive_includes_required_and_optional_files() {
|
|
||||||
let dir = TempDir::new().expect("temp dir");
|
|
||||||
let root = dir.path();
|
|
||||||
|
|
||||||
fs::create_dir_all(root.join("src")).expect("create src");
|
|
||||||
fs::create_dir_all(root.join("build")).expect("create build");
|
|
||||||
fs::create_dir_all(root.join("ignored")).expect("create ignored");
|
|
||||||
|
|
||||||
fs::write(root.join("README.md"), "# Demo\n").expect("write README");
|
|
||||||
fs::write(root.join("package.json"), "{}").expect("write package.json");
|
|
||||||
fs::write(root.join("package-lock.json"), "{}").expect("write package-lock.json");
|
|
||||||
fs::write(root.join("src/index.ts"), "export const plugin = {};\n")
|
|
||||||
.expect("write src/index.ts");
|
|
||||||
fs::write(root.join("build/index.js"), "exports.plugin = {};\n")
|
|
||||||
.expect("write build/index.js");
|
|
||||||
fs::write(root.join("ignored/secret.txt"), "do-not-ship").expect("write ignored file");
|
|
||||||
|
|
||||||
let archive = create_publish_archive(root).expect("create archive");
|
|
||||||
let mut zip = ZipArchive::new(Cursor::new(archive)).expect("open zip");
|
|
||||||
|
|
||||||
let mut names = HashSet::new();
|
|
||||||
for i in 0..zip.len() {
|
|
||||||
let file = zip.by_index(i).expect("zip entry");
|
|
||||||
names.insert(file.name().to_string());
|
|
||||||
}
|
|
||||||
|
|
||||||
assert!(names.contains("README.md"));
|
|
||||||
assert!(names.contains("package.json"));
|
|
||||||
assert!(names.contains("package-lock.json"));
|
|
||||||
assert!(names.contains("src/index.ts"));
|
|
||||||
assert!(names.contains("build/index.js"));
|
|
||||||
assert!(!names.contains("ignored/secret.txt"));
|
|
||||||
}
|
|
||||||
}
|
|
||||||
@@ -1,548 +0,0 @@
use crate::cli::{RequestArgs, RequestCommands, RequestSchemaType};
use crate::context::CliContext;
use crate::utils::confirm::confirm_delete;
use crate::utils::json::{
    apply_merge_patch, is_json_shorthand, merge_workspace_id_arg, parse_optional_json,
    parse_required_json, require_id, validate_create_id,
};
use crate::utils::schema::append_agent_hints;
use crate::utils::workspace::resolve_workspace_id;
use schemars::schema_for;
use serde_json::{Map, Value, json};
use std::collections::HashMap;
use std::io::Write;
use tokio::sync::mpsc;
use yaak::send::{SendHttpRequestByIdWithPluginsParams, send_http_request_by_id_with_plugins};
use yaak_http::sender::HttpResponseEvent as SenderHttpResponseEvent;
use yaak_models::models::{GrpcRequest, HttpRequest, WebsocketRequest};
use yaak_models::queries::any_request::AnyRequest;
use yaak_models::util::UpdateSource;
use yaak_plugins::events::{FormInput, FormInputBase, JsonPrimitive, PluginContext};

type CommandResult<T = ()> = std::result::Result<T, String>;

pub async fn run(
    ctx: &CliContext,
    args: RequestArgs,
    environment: Option<&str>,
    cookie_jar_id: Option<&str>,
    verbose: bool,
) -> i32 {
    let result = match args.command {
        RequestCommands::List { workspace_id } => list(ctx, workspace_id.as_deref()),
        RequestCommands::Show { request_id } => show(ctx, &request_id),
        RequestCommands::Send { request_id } => {
            return match send_request_by_id(ctx, &request_id, environment, cookie_jar_id, verbose)
                .await
            {
                Ok(()) => 0,
                Err(error) => {
                    eprintln!("Error: {error}");
                    1
                }
            };
        }
        RequestCommands::Schema { request_type, pretty } => {
            return match schema(ctx, request_type, pretty).await {
                Ok(()) => 0,
                Err(error) => {
                    eprintln!("Error: {error}");
                    1
                }
            };
        }
        RequestCommands::Create { workspace_id, name, method, url, json } => {
            create(ctx, workspace_id, name, method, url, json)
        }
        RequestCommands::Update { json, json_input } => update(ctx, json, json_input),
        RequestCommands::Delete { request_id, yes } => delete(ctx, &request_id, yes),
    };

    match result {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

fn list(ctx: &CliContext, workspace_id: Option<&str>) -> CommandResult {
    let workspace_id = resolve_workspace_id(ctx, workspace_id, "request list")?;
    let requests = ctx
        .db()
        .list_http_requests(&workspace_id)
        .map_err(|e| format!("Failed to list requests: {e}"))?;
    if requests.is_empty() {
        println!("No requests found in workspace {}", workspace_id);
    } else {
        for request in requests {
            println!("{} - {} {}", request.id, request.method, request.name);
        }
    }
    Ok(())
}

async fn schema(ctx: &CliContext, request_type: RequestSchemaType, pretty: bool) -> CommandResult {
    let mut schema = match request_type {
        RequestSchemaType::Http => serde_json::to_value(schema_for!(HttpRequest))
            .map_err(|e| format!("Failed to serialize HTTP request schema: {e}"))?,
        RequestSchemaType::Grpc => serde_json::to_value(schema_for!(GrpcRequest))
            .map_err(|e| format!("Failed to serialize gRPC request schema: {e}"))?,
        RequestSchemaType::Websocket => serde_json::to_value(schema_for!(WebsocketRequest))
            .map_err(|e| format!("Failed to serialize WebSocket request schema: {e}"))?,
    };

    enrich_schema_guidance(&mut schema, request_type);
    append_agent_hints(&mut schema);

    if let Err(error) = merge_auth_schema_from_plugins(ctx, &mut schema).await {
        eprintln!("Warning: Failed to enrich authentication schema from plugins: {error}");
    }

    let output =
        if pretty { serde_json::to_string_pretty(&schema) } else { serde_json::to_string(&schema) }
            .map_err(|e| format!("Failed to format schema JSON: {e}"))?;
    println!("{output}");
    Ok(())
}

fn enrich_schema_guidance(schema: &mut Value, request_type: RequestSchemaType) {
    if !matches!(request_type, RequestSchemaType::Http) {
        return;
    }

    let Some(properties) = schema.get_mut("properties").and_then(Value::as_object_mut) else {
        return;
    };

    if let Some(url_schema) = properties.get_mut("url").and_then(Value::as_object_mut) {
        append_description(
            url_schema,
            "For path segments like `/foo/:id/comments/:commentId`, put concrete values in `urlParameters` using names without `:` (for example `id`, `commentId`).",
        );
    }
}

fn append_description(schema: &mut Map<String, Value>, extra: &str) {
    match schema.get_mut("description") {
        Some(Value::String(existing)) if !existing.trim().is_empty() => {
            if !existing.ends_with(' ') {
                existing.push(' ');
            }
            existing.push_str(extra);
        }
        _ => {
            schema.insert("description".to_string(), Value::String(extra.to_string()));
        }
    }
}

async fn merge_auth_schema_from_plugins(
    ctx: &CliContext,
    schema: &mut Value,
) -> Result<(), String> {
    let plugin_context = PluginContext::new_empty();
    let plugin_manager = ctx.plugin_manager();
    let summaries = plugin_manager
        .get_http_authentication_summaries(&plugin_context)
        .await
        .map_err(|e| e.to_string())?;

    let mut auth_variants = Vec::new();
    for (_, summary) in summaries {
        let config = match plugin_manager
            .get_http_authentication_config(
                &plugin_context,
                &summary.name,
                HashMap::<String, JsonPrimitive>::new(),
                "yaakcli_request_schema",
            )
            .await
        {
            Ok(config) => config,
            Err(error) => {
                eprintln!(
                    "Warning: Failed to load auth config for strategy '{}': {}",
                    summary.name, error
                );
                continue;
            }
        };

        auth_variants.push(auth_variant_schema(&summary.name, &summary.label, &config.args));
    }

    let Some(properties) = schema.get_mut("properties").and_then(Value::as_object_mut) else {
        return Ok(());
    };

    let Some(auth_schema) = properties.get_mut("authentication") else {
        return Ok(());
    };

    if !auth_variants.is_empty() {
        let mut one_of = vec![auth_schema.clone()];
        one_of.extend(auth_variants);
        *auth_schema = json!({ "oneOf": one_of });
    }

    Ok(())
}

fn auth_variant_schema(auth_name: &str, auth_label: &str, args: &[FormInput]) -> Value {
    let mut properties = Map::new();
    let mut required = Vec::new();
    for input in args {
        add_input_schema(input, &mut properties, &mut required);
    }

    let mut schema = json!({
        "title": auth_label,
        "description": format!("Authentication values for strategy '{}'", auth_name),
        "type": "object",
        "properties": properties,
        "additionalProperties": true
    });

    if !required.is_empty() {
        schema["required"] = json!(required);
    }

    schema
}

fn add_input_schema(
    input: &FormInput,
    properties: &mut Map<String, Value>,
    required: &mut Vec<String>,
) {
    match input {
        FormInput::Text(v) => add_base_schema(
            &v.base,
            json!({
                "type": "string",
                "writeOnly": v.password.unwrap_or(false),
            }),
            properties,
            required,
        ),
        FormInput::Editor(v) => add_base_schema(
            &v.base,
            json!({
                "type": "string",
                "x-editorLanguage": v.language.clone(),
            }),
            properties,
            required,
        ),
        FormInput::Select(v) => {
            let options: Vec<Value> =
                v.options.iter().map(|o| Value::String(o.value.clone())).collect();
            add_base_schema(
                &v.base,
                json!({
                    "type": "string",
                    "enum": options,
                }),
                properties,
                required,
            );
        }
        FormInput::Checkbox(v) => {
            add_base_schema(&v.base, json!({ "type": "boolean" }), properties, required);
        }
        FormInput::File(v) => {
            if v.multiple.unwrap_or(false) {
                add_base_schema(
                    &v.base,
                    json!({
                        "type": "array",
                        "items": { "type": "string" },
                    }),
                    properties,
                    required,
                );
            } else {
                add_base_schema(&v.base, json!({ "type": "string" }), properties, required);
            }
        }
        FormInput::HttpRequest(v) => {
            add_base_schema(&v.base, json!({ "type": "string" }), properties, required);
        }
        FormInput::KeyValue(v) => {
            add_base_schema(
                &v.base,
                json!({
                    "type": "object",
                    "additionalProperties": true,
                }),
                properties,
                required,
            );
        }
        FormInput::Accordion(v) => {
            if let Some(children) = &v.inputs {
                for child in children {
                    add_input_schema(child, properties, required);
                }
            }
        }
        FormInput::HStack(v) => {
            if let Some(children) = &v.inputs {
                for child in children {
                    add_input_schema(child, properties, required);
                }
            }
        }
        FormInput::Banner(v) => {
            if let Some(children) = &v.inputs {
                for child in children {
                    add_input_schema(child, properties, required);
                }
            }
        }
        FormInput::Markdown(_) => {}
    }
}

fn add_base_schema(
    base: &FormInputBase,
    mut schema: Value,
    properties: &mut Map<String, Value>,
    required: &mut Vec<String>,
) {
    if base.hidden.unwrap_or(false) || base.name.trim().is_empty() {
        return;
    }

    if let Some(description) = &base.description {
        schema["description"] = Value::String(description.clone());
    }
    if let Some(label) = &base.label {
        schema["title"] = Value::String(label.clone());
    }
    if let Some(default_value) = &base.default_value {
        schema["default"] = Value::String(default_value.clone());
    }

    let name = base.name.clone();
    properties.insert(name.clone(), schema);
    if !base.optional.unwrap_or(false) {
        required.push(name);
    }
}

fn create(
    ctx: &CliContext,
    workspace_id: Option<String>,
    name: Option<String>,
    method: Option<String>,
    url: Option<String>,
    json: Option<String>,
) -> CommandResult {
    let json_shorthand =
        workspace_id.as_deref().filter(|v| is_json_shorthand(v)).map(str::to_owned);
    let workspace_id_arg = workspace_id.filter(|v| !is_json_shorthand(v));

    let payload = parse_optional_json(json, json_shorthand, "request create")?;

    if let Some(payload) = payload {
        if name.is_some() || method.is_some() || url.is_some() {
            return Err("request create cannot combine simple flags with JSON payload".to_string());
        }

        validate_create_id(&payload, "request")?;
        let mut request: HttpRequest = serde_json::from_value(payload)
            .map_err(|e| format!("Failed to parse request create JSON: {e}"))?;
        let fallback_workspace_id = if workspace_id_arg.is_none() && request.workspace_id.is_empty()
        {
            Some(resolve_workspace_id(ctx, None, "request create")?)
        } else {
            None
        };
        merge_workspace_id_arg(
            workspace_id_arg.as_deref().or(fallback_workspace_id.as_deref()),
            &mut request.workspace_id,
            "request create",
        )?;

        let created = ctx
            .db()
            .upsert_http_request(&request, &UpdateSource::Sync)
            .map_err(|e| format!("Failed to create request: {e}"))?;

        println!("Created request: {}", created.id);
        return Ok(());
    }

    let workspace_id = resolve_workspace_id(ctx, workspace_id_arg.as_deref(), "request create")?;
    let name = name.unwrap_or_default();
    let url = url.unwrap_or_default();
    let method = method.unwrap_or_else(|| "GET".to_string());

    let request = HttpRequest {
        workspace_id,
        name,
        method: method.to_uppercase(),
        url,
        ..Default::default()
    };

    let created = ctx
        .db()
        .upsert_http_request(&request, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to create request: {e}"))?;

    println!("Created request: {}", created.id);
    Ok(())
}

fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) -> CommandResult {
    let patch = parse_required_json(json, json_input, "request update")?;
    let id = require_id(&patch, "request update")?;

    let existing = ctx
        .db()
        .get_http_request(&id)
        .map_err(|e| format!("Failed to get request for update: {e}"))?;
    let updated = apply_merge_patch(&existing, &patch, &id, "request update")?;

    let saved = ctx
        .db()
        .upsert_http_request(&updated, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to update request: {e}"))?;

    println!("Updated request: {}", saved.id);
    Ok(())
}

fn show(ctx: &CliContext, request_id: &str) -> CommandResult {
    let request =
        ctx.db().get_http_request(request_id).map_err(|e| format!("Failed to get request: {e}"))?;
    let output = serde_json::to_string_pretty(&request)
        .map_err(|e| format!("Failed to serialize request: {e}"))?;
    println!("{output}");
    Ok(())
}

fn delete(ctx: &CliContext, request_id: &str, yes: bool) -> CommandResult {
    if !yes && !confirm_delete("request", request_id) {
        println!("Aborted");
        return Ok(());
    }

    let deleted = ctx
        .db()
        .delete_http_request_by_id(request_id, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to delete request: {e}"))?;
    println!("Deleted request: {}", deleted.id);
    Ok(())
}

/// Send a request by ID and print response in the same format as legacy `send`.
pub async fn send_request_by_id(
    ctx: &CliContext,
    request_id: &str,
    environment: Option<&str>,
    cookie_jar_id: Option<&str>,
    verbose: bool,
) -> Result<(), String> {
    let request =
        ctx.db().get_any_request(request_id).map_err(|e| format!("Failed to get request: {e}"))?;
    match request {
        AnyRequest::HttpRequest(http_request) => {
            send_http_request_by_id(
                ctx,
                &http_request.id,
                &http_request.workspace_id,
                environment,
                cookie_jar_id,
                verbose,
            )
            .await
        }
        AnyRequest::GrpcRequest(_) => {
            Err("gRPC request send is not implemented yet in yaak-cli".to_string())
        }
        AnyRequest::WebsocketRequest(_) => {
            Err("WebSocket request send is not implemented yet in yaak-cli".to_string())
        }
    }
}

async fn send_http_request_by_id(
    ctx: &CliContext,
    request_id: &str,
    workspace_id: &str,
    environment: Option<&str>,
    cookie_jar_id: Option<&str>,
    verbose: bool,
) -> Result<(), String> {
    let cookie_jar_id = resolve_cookie_jar_id(ctx, workspace_id, cookie_jar_id)?;

    let plugin_context =
        PluginContext::new(Some("cli".to_string()), Some(workspace_id.to_string()));

    let (event_tx, mut event_rx) = mpsc::channel::<SenderHttpResponseEvent>(100);
    let (body_chunk_tx, mut body_chunk_rx) = mpsc::unbounded_channel::<Vec<u8>>();
    let event_handle = tokio::spawn(async move {
        while let Some(event) = event_rx.recv().await {
            if verbose && !matches!(event, SenderHttpResponseEvent::ChunkReceived { .. }) {
                println!("{}", event);
            }
        }
    });
    let body_handle = tokio::task::spawn_blocking(move || {
        let mut stdout = std::io::stdout();
        while let Some(chunk) = body_chunk_rx.blocking_recv() {
            if stdout.write_all(&chunk).is_err() {
                break;
            }
            let _ = stdout.flush();
        }
    });
    let response_dir = ctx.data_dir().join("responses");

    let result = send_http_request_by_id_with_plugins(SendHttpRequestByIdWithPluginsParams {
        query_manager: ctx.query_manager(),
        blob_manager: ctx.blob_manager(),
        request_id,
        environment_id: environment,
        update_source: UpdateSource::Sync,
        cookie_jar_id,
        response_dir: &response_dir,
        emit_events_to: Some(event_tx),
        emit_response_body_chunks_to: Some(body_chunk_tx),
        plugin_manager: ctx.plugin_manager(),
        encryption_manager: ctx.encryption_manager.clone(),
        plugin_context: &plugin_context,
        cancelled_rx: None,
        connection_manager: None,
    })
    .await;

    let _ = event_handle.await;
    let _ = body_handle.await;
    result.map_err(|e| e.to_string())?;
    Ok(())
}

pub(crate) fn resolve_cookie_jar_id(
    ctx: &CliContext,
    workspace_id: &str,
    explicit_cookie_jar_id: Option<&str>,
) -> Result<Option<String>, String> {
    if let Some(cookie_jar_id) = explicit_cookie_jar_id {
        return Ok(Some(cookie_jar_id.to_string()));
    }

    let default_cookie_jar = ctx
        .db()
        .list_cookie_jars(workspace_id)
        .map_err(|e| format!("Failed to list cookie jars: {e}"))?
        .into_iter()
        .min_by_key(|jar| jar.created_at)
        .map(|jar| jar.id);
    Ok(default_cookie_jar)
}
@@ -1,242 +0,0 @@
use crate::cli::SendArgs;
use crate::commands::request;
use crate::context::CliContext;
use futures::future::join_all;
use yaak_models::queries::any_request::AnyRequest;

enum ExecutionMode {
    Sequential,
    Parallel,
}

pub async fn run(
    ctx: &CliContext,
    args: SendArgs,
    environment: Option<&str>,
    cookie_jar_id: Option<&str>,
    verbose: bool,
) -> i32 {
    match send_target(ctx, args, environment, cookie_jar_id, verbose).await {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

async fn send_target(
    ctx: &CliContext,
    args: SendArgs,
    environment: Option<&str>,
    cookie_jar_id: Option<&str>,
    verbose: bool,
) -> Result<(), String> {
    let mode = if args.parallel { ExecutionMode::Parallel } else { ExecutionMode::Sequential };

    if let Ok(request) = ctx.db().get_any_request(&args.id) {
        let workspace_id = match &request {
            AnyRequest::HttpRequest(r) => r.workspace_id.clone(),
            AnyRequest::GrpcRequest(r) => r.workspace_id.clone(),
            AnyRequest::WebsocketRequest(r) => r.workspace_id.clone(),
        };
        let resolved_cookie_jar_id =
            request::resolve_cookie_jar_id(ctx, &workspace_id, cookie_jar_id)?;

        return request::send_request_by_id(
            ctx,
            &args.id,
            environment,
            resolved_cookie_jar_id.as_deref(),
            verbose,
        )
        .await;
    }

    if let Ok(folder) = ctx.db().get_folder(&args.id) {
        let resolved_cookie_jar_id =
            request::resolve_cookie_jar_id(ctx, &folder.workspace_id, cookie_jar_id)?;

        let request_ids = collect_folder_request_ids(ctx, &args.id)?;
        if request_ids.is_empty() {
            println!("No requests found in folder {}", args.id);
            return Ok(());
        }
        return send_many(
            ctx,
            request_ids,
            mode,
            args.fail_fast,
            environment,
            resolved_cookie_jar_id.as_deref(),
            verbose,
        )
        .await;
    }

    if let Ok(workspace) = ctx.db().get_workspace(&args.id) {
        let resolved_cookie_jar_id =
            request::resolve_cookie_jar_id(ctx, &workspace.id, cookie_jar_id)?;

        let request_ids = collect_workspace_request_ids(ctx, &args.id)?;
        if request_ids.is_empty() {
            println!("No requests found in workspace {}", args.id);
            return Ok(());
        }
        return send_many(
            ctx,
            request_ids,
            mode,
            args.fail_fast,
            environment,
            resolved_cookie_jar_id.as_deref(),
            verbose,
        )
        .await;
    }

    Err(format!("Could not resolve ID '{}' as request, folder, or workspace", args.id))
}

fn collect_folder_request_ids(ctx: &CliContext, folder_id: &str) -> Result<Vec<String>, String> {
    let mut ids = Vec::new();

    let mut http_ids = ctx
        .db()
        .list_http_requests_for_folder_recursive(folder_id)
        .map_err(|e| format!("Failed to list HTTP requests in folder: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut http_ids);

    let mut grpc_ids = ctx
        .db()
        .list_grpc_requests_for_folder_recursive(folder_id)
        .map_err(|e| format!("Failed to list gRPC requests in folder: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut grpc_ids);

    let mut websocket_ids = ctx
        .db()
        .list_websocket_requests_for_folder_recursive(folder_id)
        .map_err(|e| format!("Failed to list WebSocket requests in folder: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut websocket_ids);

    Ok(ids)
}

fn collect_workspace_request_ids(
    ctx: &CliContext,
    workspace_id: &str,
) -> Result<Vec<String>, String> {
    let mut ids = Vec::new();

    let mut http_ids = ctx
        .db()
        .list_http_requests(workspace_id)
        .map_err(|e| format!("Failed to list HTTP requests in workspace: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut http_ids);

    let mut grpc_ids = ctx
        .db()
        .list_grpc_requests(workspace_id)
        .map_err(|e| format!("Failed to list gRPC requests in workspace: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut grpc_ids);

    let mut websocket_ids = ctx
        .db()
        .list_websocket_requests(workspace_id)
        .map_err(|e| format!("Failed to list WebSocket requests in workspace: {e}"))?
        .into_iter()
        .map(|r| r.id)
        .collect::<Vec<_>>();
    ids.append(&mut websocket_ids);

    Ok(ids)
}

async fn send_many(
    ctx: &CliContext,
    request_ids: Vec<String>,
    mode: ExecutionMode,
    fail_fast: bool,
    environment: Option<&str>,
    cookie_jar_id: Option<&str>,
    verbose: bool,
) -> Result<(), String> {
    let mut success_count = 0usize;
    let mut failures: Vec<(String, String)> = Vec::new();

    match mode {
        ExecutionMode::Sequential => {
            for request_id in request_ids {
                match request::send_request_by_id(
                    ctx,
                    &request_id,
                    environment,
                    cookie_jar_id,
                    verbose,
                )
                .await
                {
                    Ok(()) => success_count += 1,
                    Err(error) => {
                        failures.push((request_id, error));
                        if fail_fast {
                            break;
                        }
                    }
                }
            }
        }
        ExecutionMode::Parallel => {
            let tasks = request_ids
                .iter()
                .map(|request_id| async move {
                    (
                        request_id.clone(),
                        request::send_request_by_id(
                            ctx,
                            request_id,
                            environment,
                            cookie_jar_id,
                            verbose,
                        )
                        .await,
                    )
                })
                .collect::<Vec<_>>();

            for (request_id, result) in join_all(tasks).await {
                match result {
                    Ok(()) => success_count += 1,
                    Err(error) => failures.push((request_id, error)),
                }
            }
        }
    }

    let failure_count = failures.len();
    println!("Send summary: {success_count} succeeded, {failure_count} failed");

    if failure_count == 0 {
        return Ok(());
    }

    for (request_id, error) in failures {
        eprintln!("  {}: {}", request_id, error);
    }
    Err("One or more requests failed".to_string())
}
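The summary logic in `send_many` reduces to counting successes and collecting failures, with fail-fast stopping only the sequential path early. A condensed, self-contained sketch of that aggregation (plain `Result` values stand in for actual request sends):

```rust
// Condensed sketch of send_many's bookkeeping: count successes, collect
// failures, and stop at the first failure when fail_fast is set.
fn aggregate(
    results: Vec<(String, Result<(), String>)>,
    fail_fast: bool,
) -> (usize, Vec<(String, String)>) {
    let mut success_count = 0usize;
    let mut failures = Vec::new();
    for (id, result) in results {
        match result {
            Ok(()) => success_count += 1,
            Err(error) => {
                failures.push((id, error));
                if fail_fast {
                    break; // remaining requests are skipped entirely
                }
            }
        }
    }
    (success_count, failures)
}

fn main() {
    let results = vec![
        ("rq_1".to_string(), Ok(())),
        ("rq_2".to_string(), Err("timeout".to_string())),
        ("rq_3".to_string(), Ok(())),
    ];
    // Without fail-fast, rq_3 still counts: 2 succeeded, 1 failed.
    let (ok, failed) = aggregate(results.clone(), false);
    assert_eq!((ok, failed.len()), (2, 1));
    // With fail-fast, rq_3 is skipped after rq_2 fails: 1 succeeded.
    let (ok, failed) = aggregate(results, true);
    assert_eq!((ok, failed.len()), (1, 1));
    println!("Send summary: {ok} succeeded, {} failed", failed.len());
}
```

In the real parallel path all sends are already in flight when results are collected, which is why fail-fast only applies to sequential mode.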
@@ -1,138 +0,0 @@
use crate::cli::{WorkspaceArgs, WorkspaceCommands};
use crate::context::CliContext;
use crate::utils::confirm::confirm_delete;
use crate::utils::json::{
    apply_merge_patch, parse_optional_json, parse_required_json, require_id, validate_create_id,
};
use crate::utils::schema::append_agent_hints;
use schemars::schema_for;
use yaak_models::models::Workspace;
use yaak_models::util::UpdateSource;

type CommandResult<T = ()> = std::result::Result<T, String>;

pub fn run(ctx: &CliContext, args: WorkspaceArgs) -> i32 {
    let result = match args.command {
        WorkspaceCommands::List => list(ctx),
        WorkspaceCommands::Schema { pretty } => schema(pretty),
        WorkspaceCommands::Show { workspace_id } => show(ctx, &workspace_id),
        WorkspaceCommands::Create { name, json, json_input } => create(ctx, name, json, json_input),
        WorkspaceCommands::Update { json, json_input } => update(ctx, json, json_input),
        WorkspaceCommands::Delete { workspace_id, yes } => delete(ctx, &workspace_id, yes),
    };

    match result {
        Ok(()) => 0,
        Err(error) => {
            eprintln!("Error: {error}");
            1
        }
    }
}

fn schema(pretty: bool) -> CommandResult {
    let mut schema = serde_json::to_value(schema_for!(Workspace))
        .map_err(|e| format!("Failed to serialize workspace schema: {e}"))?;
    append_agent_hints(&mut schema);

    let output =
        if pretty { serde_json::to_string_pretty(&schema) } else { serde_json::to_string(&schema) }
            .map_err(|e| format!("Failed to format workspace schema JSON: {e}"))?;
    println!("{output}");
    Ok(())
}

fn list(ctx: &CliContext) -> CommandResult {
    let workspaces =
        ctx.db().list_workspaces().map_err(|e| format!("Failed to list workspaces: {e}"))?;
    if workspaces.is_empty() {
        println!("No workspaces found");
    } else {
        for workspace in workspaces {
            println!("{} - {}", workspace.id, workspace.name);
        }
    }
    Ok(())
}

fn show(ctx: &CliContext, workspace_id: &str) -> CommandResult {
    let workspace = ctx
        .db()
        .get_workspace(workspace_id)
        .map_err(|e| format!("Failed to get workspace: {e}"))?;
    let output = serde_json::to_string_pretty(&workspace)
        .map_err(|e| format!("Failed to serialize workspace: {e}"))?;
    println!("{output}");
    Ok(())
}

fn create(
    ctx: &CliContext,
    name: Option<String>,
    json: Option<String>,
    json_input: Option<String>,
) -> CommandResult {
    let payload = parse_optional_json(json, json_input, "workspace create")?;

    if let Some(payload) = payload {
        if name.is_some() {
            return Err("workspace create cannot combine --name with JSON payload".to_string());
        }

        validate_create_id(&payload, "workspace")?;
        let workspace: Workspace = serde_json::from_value(payload)
            .map_err(|e| format!("Failed to parse workspace create JSON: {e}"))?;

        let created = ctx
            .db()
            .upsert_workspace(&workspace, &UpdateSource::Sync)
            .map_err(|e| format!("Failed to create workspace: {e}"))?;
        println!("Created workspace: {}", created.id);
        return Ok(());
    }

    let name = name.ok_or_else(|| {
        "workspace create requires --name unless JSON payload is provided".to_string()
    })?;

    let workspace = Workspace { name, ..Default::default() };
    let created = ctx
        .db()
        .upsert_workspace(&workspace, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to create workspace: {e}"))?;
    println!("Created workspace: {}", created.id);
    Ok(())
}

fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) -> CommandResult {
    let patch = parse_required_json(json, json_input, "workspace update")?;
    let id = require_id(&patch, "workspace update")?;

    let existing = ctx
        .db()
        .get_workspace(&id)
        .map_err(|e| format!("Failed to get workspace for update: {e}"))?;
    let updated = apply_merge_patch(&existing, &patch, &id, "workspace update")?;

    let saved = ctx
        .db()
        .upsert_workspace(&updated, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to update workspace: {e}"))?;

    println!("Updated workspace: {}", saved.id);
    Ok(())
}

fn delete(ctx: &CliContext, workspace_id: &str, yes: bool) -> CommandResult {
    if !yes && !confirm_delete("workspace", workspace_id) {
        println!("Aborted");
        return Ok(());
    }

    let deleted = ctx
        .db()
        .delete_workspace_by_id(workspace_id, &UpdateSource::Sync)
        .map_err(|e| format!("Failed to delete workspace: {e}"))?;
    println!("Deleted workspace: {}", deleted.id);
    Ok(())
}
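The `update` command applies the incoming JSON as a merge patch over the stored workspace via the `apply_merge_patch` helper. A minimal sketch of RFC 7386-style merge semantics over a string map, assuming that is the behavior the helper follows (the real helper patches `serde_json` values on the full model):

```rust
use std::collections::BTreeMap;

// Minimal sketch of merge-patch semantics, as in RFC 7386: present keys
// overwrite, None (JSON null) deletes, absent keys are left untouched.
// The real apply_merge_patch helper works on serde_json values instead.
fn merge_patch(
    existing: &BTreeMap<String, String>,
    patch: &BTreeMap<String, Option<String>>,
) -> BTreeMap<String, String> {
    let mut merged = existing.clone();
    for (key, value) in patch {
        match value {
            Some(v) => {
                merged.insert(key.clone(), v.clone());
            }
            None => {
                merged.remove(key);
            }
        }
    }
    merged
}

fn main() {
    let mut existing = BTreeMap::new();
    existing.insert("name".to_string(), "Old Name".to_string());
    existing.insert("description".to_string(), "stale".to_string());

    let mut patch = BTreeMap::new();
    patch.insert("name".to_string(), Some("New Name".to_string()));
    patch.insert("description".to_string(), None); // null: remove the field

    let merged = merge_patch(&existing, &patch);
    assert_eq!(merged.get("name").map(String::as_str), Some("New Name"));
    assert!(!merged.contains_key("description"));
    println!("ok");
}
```

This is why `update` requires the `id` field up front: the patch identifies the record but the merge itself never changes it.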
@@ -1,149 +0,0 @@
use crate::plugin_events::CliPluginEventBridge;
use include_dir::{Dir, include_dir};
use std::fs;
use std::path::{Path, PathBuf};
use std::sync::Arc;
use tokio::sync::Mutex;
use yaak_crypto::manager::EncryptionManager;
use yaak_models::blob_manager::BlobManager;
use yaak_models::db_context::DbContext;
use yaak_models::query_manager::QueryManager;
use yaak_plugins::events::PluginContext;
use yaak_plugins::manager::PluginManager;

const EMBEDDED_PLUGIN_RUNTIME: &str = include_str!(concat!(
    env!("CARGO_MANIFEST_DIR"),
    "/../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs"
));
static EMBEDDED_VENDORED_PLUGINS: Dir<'_> =
    include_dir!("$CARGO_MANIFEST_DIR/../../crates-tauri/yaak-app/vendored/plugins");

#[derive(Clone, Debug, Default)]
pub struct CliExecutionContext {
    pub request_id: Option<String>,
    pub workspace_id: Option<String>,
    pub environment_id: Option<String>,
    pub cookie_jar_id: Option<String>,
}

pub struct CliContext {
    data_dir: PathBuf,
    query_manager: QueryManager,
    blob_manager: BlobManager,
    pub encryption_manager: Arc<EncryptionManager>,
    plugin_manager: Option<Arc<PluginManager>>,
    plugin_event_bridge: Mutex<Option<CliPluginEventBridge>>,
}

impl CliContext {
    pub fn new(data_dir: PathBuf, app_id: &str) -> Self {
        let db_path = data_dir.join("db.sqlite");
        let blob_path = data_dir.join("blobs.sqlite");
        let (query_manager, blob_manager, _rx) =
            match yaak_models::init_standalone(&db_path, &blob_path) {
                Ok(v) => v,
                Err(err) => {
                    eprintln!("Error: Failed to initialize database: {err}");
                    std::process::exit(1);
                }
            };
        let encryption_manager = Arc::new(EncryptionManager::new(query_manager.clone(), app_id));

        Self {
            data_dir,
            query_manager,
            blob_manager,
            encryption_manager,
            plugin_manager: None,
            plugin_event_bridge: Mutex::new(None),
        }
    }

    pub async fn init_plugins(&mut self, execution_context: CliExecutionContext) {
        let vendored_plugin_dir = self.data_dir.join("vendored-plugins");
        let installed_plugin_dir = self.data_dir.join("installed-plugins");
        let node_bin_path = PathBuf::from("node");

        prepare_embedded_vendored_plugins(&vendored_plugin_dir)
            .expect("Failed to prepare bundled plugins");

        let plugin_runtime_main =
            std::env::var("YAAK_PLUGIN_RUNTIME").map(PathBuf::from).unwrap_or_else(|_| {
                prepare_embedded_plugin_runtime(&self.data_dir)
                    .expect("Failed to prepare embedded plugin runtime")
            });

        match PluginManager::new(
            vendored_plugin_dir,
            installed_plugin_dir,
            node_bin_path,
            plugin_runtime_main,
            &self.query_manager,
            &PluginContext::new_empty(),
            false,
        )
        .await
        {
            Ok(plugin_manager) => {
                let plugin_manager = Arc::new(plugin_manager);
                let plugin_event_bridge = CliPluginEventBridge::start(
                    plugin_manager.clone(),
                    self.query_manager.clone(),
                    self.blob_manager.clone(),
                    self.encryption_manager.clone(),
                    self.data_dir.clone(),
                    execution_context,
                )
                .await;
                self.plugin_manager = Some(plugin_manager);
                *self.plugin_event_bridge.lock().await = Some(plugin_event_bridge);
            }
            Err(err) => {
                eprintln!("Warning: Failed to initialize plugins: {err}");
            }
        }
    }

    pub fn data_dir(&self) -> &Path {
        &self.data_dir
    }

    pub fn db(&self) -> DbContext<'_> {
        self.query_manager.connect()
    }

    pub fn query_manager(&self) -> &QueryManager {
        &self.query_manager
    }

    pub fn blob_manager(&self) -> &BlobManager {
        &self.blob_manager
    }

    pub fn plugin_manager(&self) -> Arc<PluginManager> {
        self.plugin_manager.clone().expect("Plugin manager was not initialized for this command")
    }

    pub async fn shutdown(&self) {
        if let Some(plugin_manager) = &self.plugin_manager {
            if let Some(plugin_event_bridge) = self.plugin_event_bridge.lock().await.take() {
                plugin_event_bridge.shutdown(plugin_manager).await;
            }
            plugin_manager.terminate().await;
        }
    }
}

fn prepare_embedded_plugin_runtime(data_dir: &Path) -> std::io::Result<PathBuf> {
    let runtime_dir = data_dir.join("vendored").join("plugin-runtime");
    fs::create_dir_all(&runtime_dir)?;
    let runtime_main = runtime_dir.join("index.cjs");
    fs::write(&runtime_main, EMBEDDED_PLUGIN_RUNTIME)?;
    Ok(runtime_main)
}

fn prepare_embedded_vendored_plugins(vendored_plugin_dir: &Path) -> std::io::Result<()> {
    fs::create_dir_all(vendored_plugin_dir)?;
    EMBEDDED_VENDORED_PLUGINS.extract(vendored_plugin_dir)?;
    Ok(())
}
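`prepare_embedded_plugin_runtime` materializes the compiled-in runtime onto disk so the external Node process can load it by path. The pattern, reduced to stdlib pieces (a short literal stands in for the `include_str!` payload):

```rust
use std::fs;
use std::path::{Path, PathBuf};

// Same materialization pattern as prepare_embedded_plugin_runtime: ensure
// the directory exists, write the embedded payload, return the file path.
// EMBEDDED is a stand-in literal for the include_str!(...) constant.
const EMBEDDED: &str = "console.log('plugin runtime');";

fn materialize(data_dir: &Path) -> std::io::Result<PathBuf> {
    let runtime_dir = data_dir.join("vendored").join("plugin-runtime");
    fs::create_dir_all(&runtime_dir)?;
    let runtime_main = runtime_dir.join("index.cjs");
    fs::write(&runtime_main, EMBEDDED)?;
    Ok(runtime_main)
}

fn main() -> std::io::Result<()> {
    let data_dir = std::env::temp_dir().join("yaak-cli-demo");
    let path = materialize(&data_dir)?;
    // The file round-trips: what was compiled in is now on disk.
    assert_eq!(fs::read_to_string(&path)?, EMBEDDED);
    println!("wrote {}", path.display());
    Ok(())
}
```

Writing unconditionally on every startup keeps the on-disk copy in sync with the binary, at the cost of one small write per run.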
@@ -1,283 +1,290 @@
|
|||||||
mod cli;
|
use clap::{Parser, Subcommand};
|
||||||
mod commands;
|
|
||||||
mod context;
|
|
||||||
mod plugin_events;
|
|
||||||
mod ui;
|
|
||||||
mod utils;
|
|
||||||
mod version;
|
|
||||||
mod version_check;
|
|
||||||
|
|
||||||
use clap::Parser;
|
|
||||||
use cli::{Cli, Commands, PluginCommands, RequestCommands};
|
|
||||||
use context::{CliContext, CliExecutionContext};
|
|
||||||
use std::path::PathBuf;
|
use std::path::PathBuf;
|
||||||
use yaak_models::queries::any_request::AnyRequest;
|
use std::sync::Arc;
|
||||||
|
use tokio::sync::mpsc;
|
||||||
|
use yaak_http::sender::{HttpSender, ReqwestSender};
|
||||||
|
use yaak_http::types::{SendableHttpRequest, SendableHttpRequestOptions};
|
||||||
|
use yaak_models::models::HttpRequest;
|
||||||
|
use yaak_models::util::UpdateSource;
|
||||||
|
use yaak_plugins::events::PluginContext;
|
||||||
|
use yaak_plugins::manager::PluginManager;
|
||||||
|
|
||||||
|
#[derive(Parser)]
|
||||||
|
#[command(name = "yaakcli")]
|
||||||
|
#[command(about = "Yaak CLI - API client from the command line")]
|
||||||
|
struct Cli {
|
||||||
|
/// Use a custom data directory
|
||||||
|
#[arg(long, global = true)]
|
||||||
|
data_dir: Option<PathBuf>,
|
||||||
|
|
||||||
|
/// Environment ID to use for variable substitution
|
||||||
|
#[arg(long, short, global = true)]
|
||||||
|
environment: Option<String>,
|
||||||
|
|
||||||
|
/// Enable verbose logging
|
||||||
|
#[arg(long, short, global = true)]
|
||||||
|
verbose: bool,
|
||||||
|
|
||||||
|
#[command(subcommand)]
|
||||||
|
command: Commands,
|
||||||
|
}
|
||||||
|
|
||||||
|
#[derive(Subcommand)]
|
||||||
|
enum Commands {
|
||||||
|
/// List all workspaces
|
||||||
|
Workspaces,
|
||||||
|
/// List requests in a workspace
|
||||||
|
Requests {
|
||||||
|
/// Workspace ID
|
||||||
|
workspace_id: String,
|
||||||
|
},
|
||||||
|
/// Send an HTTP request by ID
|
||||||
|
Send {
|
||||||
|
/// Request ID
|
||||||
|
request_id: String,
|
||||||
|
},
|
||||||
|
/// Send a GET request to a URL
|
||||||
|
Get {
|
||||||
|
/// URL to request
|
||||||
|
url: String,
|
||||||
|
},
|
||||||
|
/// Create a new HTTP request
|
||||||
|
Create {
|
||||||
|
/// Workspace ID
|
||||||
|
workspace_id: String,
|
||||||
|
/// Request name
|
||||||
|
#[arg(short, long)]
|
||||||
|
name: String,
|
||||||
|
/// HTTP method
|
||||||
|
#[arg(short, long, default_value = "GET")]
|
||||||
|
method: String,
|
||||||
|
/// URL
|
||||||
|
#[arg(short, long)]
|
||||||
|
url: String,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
#[tokio::main]
|
#[tokio::main]
|
||||||
async fn main() {
|
async fn main() {
|
||||||
let Cli { data_dir, environment, cookie_jar, verbose, log, command } = Cli::parse();
|
let cli = Cli::parse();
|
||||||
|
|
||||||
if let Some(log_level) = log {
|
// Initialize logging
|
||||||
match log_level {
|
if cli.verbose {
|
||||||
Some(level) => {
|
env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info")).init();
|
||||||
env_logger::Builder::new().filter_level(level.as_filter()).init();
|
|
||||||
}
|
|
||||||
None => {
|
|
||||||
env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info"))
|
|
||||||
.init();
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Use the same app_id for both data directory and keyring
|
||||||
let app_id = if cfg!(debug_assertions) { "app.yaak.desktop.dev" } else { "app.yaak.desktop" };
|
let app_id = if cfg!(debug_assertions) { "app.yaak.desktop.dev" } else { "app.yaak.desktop" };
|
||||||
|
|
||||||
let data_dir = data_dir.unwrap_or_else(|| resolve_data_dir(app_id));
|
let data_dir = cli.data_dir.unwrap_or_else(|| {
|
||||||
|
dirs::data_dir().expect("Could not determine data directory").join(app_id)
|
||||||
|
});
|
||||||
|
|
||||||
version_check::maybe_check_for_updates().await;
|
let db_path = data_dir.join("db.sqlite");
|
||||||
|
let blob_path = data_dir.join("blobs.sqlite");
|
||||||
|
|
||||||
let exit_code = match command {
|
let (query_manager, _blob_manager, _rx) =
|
||||||
Commands::Auth(args) => commands::auth::run(args).await,
|
yaak_models::init_standalone(&db_path, &blob_path).expect("Failed to initialize database");
|
||||||
Commands::Plugin(args) => match args.command {
|
|
||||||
PluginCommands::Build(args) => commands::plugin::run_build(args).await,
|
let db = query_manager.connect();
|
||||||
PluginCommands::Dev(args) => commands::plugin::run_dev(args).await,
|
|
||||||
PluginCommands::Generate(args) => commands::plugin::run_generate(args).await,
|
// Initialize plugin manager for template functions
|
||||||
PluginCommands::Publish(args) => commands::plugin::run_publish(args).await,
|
let vendored_plugin_dir = data_dir.join("vendored-plugins");
|
||||||
PluginCommands::Install(install_args) => {
|
let installed_plugin_dir = data_dir.join("installed-plugins");
|
||||||
let mut context = CliContext::new(data_dir.clone(), app_id);
|
|
||||||
context.init_plugins(CliExecutionContext::default()).await;
|
// Use system node for CLI (must be in PATH)
|
||||||
let exit_code = commands::plugin::run_install(&context, install_args).await;
|
let node_bin_path = PathBuf::from("node");
|
||||||
context.shutdown().await;
|
|
||||||
exit_code
|
// Find the plugin runtime - check YAAK_PLUGIN_RUNTIME env var, then fallback to development path
|
||||||
}
|
let plugin_runtime_main =
|
||||||
},
|
std::env::var("YAAK_PLUGIN_RUNTIME").map(PathBuf::from).unwrap_or_else(|_| {
|
||||||
Commands::Build(args) => commands::plugin::run_build(args).await,
|
// Development fallback: look relative to crate root
|
||||||
Commands::Dev(args) => commands::plugin::run_dev(args).await,
|
PathBuf::from(env!("CARGO_MANIFEST_DIR"))
|
||||||
Commands::Generate(args) => commands::plugin::run_generate(args).await,
|
.join("../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs")
|
||||||
Commands::Publish(args) => commands::plugin::run_publish(args).await,
|
});
|
||||||
Commands::Send(args) => {
|
|
||||||
let mut context = CliContext::new(data_dir.clone(), app_id);
|
// Create plugin manager (plugins may not be available in CLI context)
|
||||||
match resolve_send_execution_context(
|
let plugin_manager = Arc::new(
|
||||||
&context,
|
PluginManager::new(
|
||||||
&args.id,
|
vendored_plugin_dir.clone(),
|
||||||
environment.as_deref(),
|
installed_plugin_dir.clone(),
|
||||||
cookie_jar.as_deref(),
|
node_bin_path.clone(),
|
||||||
) {
|
plugin_runtime_main,
|
||||||
Ok(execution_context) => {
|
false,
|
||||||
context.init_plugins(execution_context).await;
|
)
|
||||||
let exit_code = commands::send::run(
|
.await,
|
||||||
&context,
|
);
|
||||||
args,
|
|
||||||
environment.as_deref(),
|
// Initialize plugins from database
|
||||||
cookie_jar.as_deref(),
|
let plugins = db.list_plugins().unwrap_or_default();
|
||||||
verbose,
|
if !plugins.is_empty() {
|
||||||
)
|
let errors =
|
||||||
.await;
|
plugin_manager.initialize_all_plugins(plugins, &PluginContext::new_empty()).await;
|
||||||
context.shutdown().await;
|
for (plugin_dir, error_msg) in errors {
|
||||||
exit_code
|
eprintln!("Warning: Failed to initialize plugin '{}': {}", plugin_dir, error_msg);
|
||||||
}
|
}
|
||||||
Err(error) => {
|
}
|
||||||
eprintln!("Error: {error}");
|
|
||||||
1
|
match cli.command {
|
||||||
|
Commands::Workspaces => {
|
||||||
|
let workspaces = db.list_workspaces().expect("Failed to list workspaces");
|
||||||
|
if workspaces.is_empty() {
|
||||||
|
println!("No workspaces found");
|
||||||
|
} else {
|
||||||
|
for ws in workspaces {
|
||||||
|
println!("{} - {}", ws.id, ws.name);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
Commands::CookieJar(args) => {
|
Commands::Requests { workspace_id } => {
|
||||||
let context = CliContext::new(data_dir.clone(), app_id);
|
let requests = db.list_http_requests(&workspace_id).expect("Failed to list requests");
|
||||||
let exit_code = commands::cookie_jar::run(&context, args);
|
if requests.is_empty() {
|
||||||
context.shutdown().await;
|
println!("No requests found in workspace {}", workspace_id);
|
||||||
exit_code
|
} else {
|
||||||
|
for req in requests {
|
||||||
|
println!("{} - {} {}", req.id, req.method, req.name);
|
||||||
|
}
|
||||||
|
}
|
||||||
}
|
}
|
||||||
Commands::Workspace(args) => {
|
Commands::Send { request_id } => {
|
||||||
let context = CliContext::new(data_dir.clone(), app_id);
|
use yaak_actions::{
|
||||||
let exit_code = commands::workspace::run(&context, args);
|
ActionExecutor, ActionId, ActionParams, ActionResult, ActionTarget, CurrentContext,
|
||||||
context.shutdown().await;
|
|
||||||
exit_code
|
|
||||||
}
|
|
||||||
Commands::Request(args) => {
|
|
||||||
let mut context = CliContext::new(data_dir.clone(), app_id);
|
|
||||||
let execution_context_result = match &args.command {
|
|
||||||
RequestCommands::Send { request_id } => resolve_request_execution_context(
|
|
||||||
&context,
|
|
||||||
request_id,
|
|
||||||
environment.as_deref(),
|
|
||||||
cookie_jar.as_deref(),
|
|
||||||
),
|
|
||||||
_ => Ok(CliExecutionContext::default()),
|
|
||||||
};
|
};
|
||||||
match execution_context_result {
|
use yaak_actions_builtin::{BuiltinActionDependencies, register_http_actions};
|
||||||
Ok(execution_context) => {
|
|
||||||
let with_plugins = matches!(
|
// Create dependencies
|
||||||
&args.command,
|
let deps = BuiltinActionDependencies::new_standalone(
|
||||||
RequestCommands::Send { .. } | RequestCommands::Schema { .. }
|
&db_path,
|
||||||
);
|
&blob_path,
|
||||||
if with_plugins {
|
&app_id,
|
||||||
context.init_plugins(execution_context).await;
|
vendored_plugin_dir.clone(),
|
||||||
|
installed_plugin_dir.clone(),
|
||||||
|
node_bin_path.clone(),
|
||||||
|
)
|
||||||
|
.await
|
||||||
|
.expect("Failed to initialize dependencies");
|
||||||
|
|
||||||
|
// Create executor and register actions
|
||||||
|
let executor = ActionExecutor::new();
|
||||||
|
executor.register_builtin_groups().await.expect("Failed to register groups");
|
||||||
|
register_http_actions(&executor, &deps).await.expect("Failed to register HTTP actions");
|
||||||
|
|
||||||
|
// Prepare context
|
||||||
|
let context = CurrentContext {
|
||||||
|
target: Some(ActionTarget::HttpRequest { id: request_id.clone() }),
|
||||||
|
environment_id: cli.environment.clone(),
|
||||||
|
workspace_id: None,
|
||||||
|
has_window: false,
|
||||||
|
can_prompt: false,
|
||||||
|
};
|
||||||
|
|
||||||
|
// Prepare params
|
||||||
|
let params = ActionParams {
|
||||||
|
data: serde_json::json!({
|
||||||
|
"render": true,
|
||||||
|
"follow_redirects": false,
|
||||||
|
"timeout_ms": 30000,
|
||||||
|
}),
|
||||||
|
};
|
||||||
|
|
||||||
|
// Invoke action
|
||||||
|
let action_id = ActionId::builtin("http", "send-request");
|
||||||
|
let result = executor.invoke(&action_id, context, params).await.expect("Action failed");
|
||||||
|
|
||||||
|
// Handle result
|
||||||
|
match result {
|
||||||
|
ActionResult::Success { data, message } => {
|
||||||
|
if let Some(msg) = message {
|
||||||
|
println!("{}", msg);
|
||||||
|
}
|
||||||
|
if let Some(data) = data {
|
||||||
|
println!("{}", serde_json::to_string_pretty(&data).unwrap());
|
||||||
}
|
}
|
||||||
let exit_code = commands::request::run(
|
|
||||||
&context,
|
|
||||||
args,
|
|
||||||
environment.as_deref(),
|
|
||||||
cookie_jar.as_deref(),
|
|
||||||
verbose,
|
|
||||||
)
|
|
||||||
.await;
|
|
||||||
context.shutdown().await;
|
|
||||||
exit_code
|
|
||||||
}
|
}
|
||||||
Err(error) => {
|
ActionResult::RequiresInput { .. } => {
|
||||||
eprintln!("Error: {error}");
|
eprintln!("Action requires input (not supported in CLI)");
|
||||||
1
|
}
|
||||||
|
ActionResult::Cancelled => {
|
||||||
|
eprintln!("Action cancelled");
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
Commands::Folder(args) => {
    let context = CliContext::new(data_dir.clone(), app_id);
    let exit_code = commands::folder::run(&context, args);
    context.shutdown().await;
    exit_code
}

Commands::Get { url } => {
    if cli.verbose {
        println!("> GET {}", url);
    }

    // Build a simple GET request
    let sendable = SendableHttpRequest {
        url: url.clone(),
        method: "GET".to_string(),
        headers: vec![],
        body: None,
        options: SendableHttpRequestOptions::default(),
    };

    // Create event channel for progress
    let (event_tx, mut event_rx) = mpsc::channel(100);

    // Spawn task to print events if verbose
    let verbose = cli.verbose;
    let verbose_handle = if verbose {
        Some(tokio::spawn(async move {
            while let Some(event) = event_rx.recv().await {
                println!("{}", event);
            }
        }))
    } else {
        tokio::spawn(async move { while event_rx.recv().await.is_some() {} });
        None
    };

    // Send the request
    let sender = ReqwestSender::new().expect("Failed to create HTTP client");
    let response = sender.send(sendable, event_tx).await.expect("Failed to send request");

    if let Some(handle) = verbose_handle {
        let _ = handle.await;
    }

    // Print response
    if verbose {
        println!();
    }
    println!(
        "HTTP {} {}",
        response.status,
        response.status_reason.as_deref().unwrap_or("")
    );

    if verbose {
        for (name, value) in &response.headers {
            println!("{}: {}", name, value);
        }
        println!();
    }

    // Print body
    let (body, _stats) = response.text().await.expect("Failed to read response body");
    println!("{}", body);
}
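The verbose path above drains progress events from an mpsc channel on a spawned task while the request is in flight. A thread-based, std-only sketch of that pattern (the real code uses `tokio::sync::mpsc` and `tokio::spawn`; `run_events` is a hypothetical name for illustration):

```rust
use std::sync::mpsc;
use std::thread;

// Consumer drains events until the channel closes; dropping the sender ends its loop.
fn run_events(events: &[&str]) -> usize {
    let (event_tx, event_rx) = mpsc::channel::<String>();

    let printer = thread::spawn(move || {
        let mut seen = 0;
        while let Ok(event) = event_rx.recv() {
            println!("{event}");
            seen += 1;
        }
        seen
    });

    for event in events {
        event_tx.send((*event).to_string()).unwrap();
    }
    drop(event_tx); // closing the channel lets the consumer finish

    printer.join().unwrap()
}

fn main() {
    assert_eq!(run_events(&["Connected", "Sent request"]), 2);
}
```

The same shutdown idea appears in the async version: the event sender is moved into `send`, so the channel closes when the request completes and the printing task exits on its own.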
Commands::Environment(args) => {
    let context = CliContext::new(data_dir.clone(), app_id);
    let exit_code = commands::environment::run(&context, args);
    context.shutdown().await;
    exit_code
}

Commands::Create { workspace_id, name, method, url } => {
    let request = HttpRequest {
        workspace_id,
        name,
        method: method.to_uppercase(),
        url,
        ..Default::default()
    };

    let created = db
        .upsert_http_request(&request, &UpdateSource::Sync)
        .expect("Failed to create request");

    println!("Created request: {}", created.id);
}
};

if exit_code != 0 {
    std::process::exit(exit_code);
}
}
// Terminate plugin manager gracefully
plugin_manager.terminate().await;
}

fn resolve_send_execution_context(
    context: &CliContext,
    id: &str,
    environment: Option<&str>,
    explicit_cookie_jar_id: Option<&str>,
) -> Result<CliExecutionContext, String> {
    if let Ok(request) = context.db().get_any_request(id) {
        let (request_id, workspace_id) = match request {
            AnyRequest::HttpRequest(r) => (Some(r.id), r.workspace_id),
            AnyRequest::GrpcRequest(r) => (Some(r.id), r.workspace_id),
            AnyRequest::WebsocketRequest(r) => (Some(r.id), r.workspace_id),
        };
        let cookie_jar_id = resolve_cookie_jar_id(context, &workspace_id, explicit_cookie_jar_id)?;
        return Ok(CliExecutionContext {
            request_id,
            workspace_id: Some(workspace_id),
            environment_id: environment.map(str::to_string),
            cookie_jar_id,
        });
    }

    if let Ok(folder) = context.db().get_folder(id) {
        let cookie_jar_id =
            resolve_cookie_jar_id(context, &folder.workspace_id, explicit_cookie_jar_id)?;
        return Ok(CliExecutionContext {
            request_id: None,
            workspace_id: Some(folder.workspace_id),
            environment_id: environment.map(str::to_string),
            cookie_jar_id,
        });
    }

    if let Ok(workspace) = context.db().get_workspace(id) {
        let cookie_jar_id = resolve_cookie_jar_id(context, &workspace.id, explicit_cookie_jar_id)?;
        return Ok(CliExecutionContext {
            request_id: None,
            workspace_id: Some(workspace.id),
            environment_id: environment.map(str::to_string),
            cookie_jar_id,
        });
    }

    Err(format!("Could not resolve ID '{}' as request, folder, or workspace", id))
}

fn resolve_request_execution_context(
    context: &CliContext,
    request_id: &str,
    environment: Option<&str>,
    explicit_cookie_jar_id: Option<&str>,
) -> Result<CliExecutionContext, String> {
    let request = context
        .db()
        .get_any_request(request_id)
        .map_err(|e| format!("Failed to get request: {e}"))?;

    let workspace_id = match request {
        AnyRequest::HttpRequest(r) => r.workspace_id,
        AnyRequest::GrpcRequest(r) => r.workspace_id,
        AnyRequest::WebsocketRequest(r) => r.workspace_id,
    };
    let cookie_jar_id = resolve_cookie_jar_id(context, &workspace_id, explicit_cookie_jar_id)?;

    Ok(CliExecutionContext {
        request_id: Some(request_id.to_string()),
        workspace_id: Some(workspace_id),
        environment_id: environment.map(str::to_string),
        cookie_jar_id,
    })
}

fn resolve_cookie_jar_id(
    context: &CliContext,
    workspace_id: &str,
    explicit_cookie_jar_id: Option<&str>,
) -> Result<Option<String>, String> {
    if let Some(cookie_jar_id) = explicit_cookie_jar_id {
        return Ok(Some(cookie_jar_id.to_string()));
    }

    let default_cookie_jar = context
        .db()
        .list_cookie_jars(workspace_id)
        .map_err(|e| format!("Failed to list cookie jars: {e}"))?
        .into_iter()
        .min_by_key(|jar| jar.created_at)
        .map(|jar| jar.id);
    Ok(default_cookie_jar)
}

fn resolve_data_dir(app_id: &str) -> PathBuf {
    if let Some(dir) = wsl_data_dir(app_id) {
        return dir;
    }
    dirs::data_dir().expect("Could not determine data directory").join(app_id)
}

/// Detect WSL and resolve the Windows AppData\Roaming path for the Yaak data directory.
fn wsl_data_dir(app_id: &str) -> Option<PathBuf> {
    if !cfg!(target_os = "linux") {
        return None;
    }

    let proc_version = std::fs::read_to_string("/proc/version").ok()?;
    let is_wsl = proc_version.to_lowercase().contains("microsoft");
    if !is_wsl {
        return None;
    }

    // We're in WSL, so try to resolve the Yaak app's data directory in Windows

    // Get the Windows %APPDATA% path via cmd.exe
    let appdata_output =
        std::process::Command::new("cmd.exe").args(["/C", "echo", "%APPDATA%"]).output().ok()?;

    let win_path = String::from_utf8(appdata_output.stdout).ok()?.trim().to_string();
    if win_path.is_empty() || win_path == "%APPDATA%" {
        return None;
    }

    // Convert Windows path to WSL path using wslpath (handles custom mount points)
    let wslpath_output = std::process::Command::new("wslpath").arg(&win_path).output().ok()?;

    let wsl_appdata = String::from_utf8(wslpath_output.stdout).ok()?.trim().to_string();
    if wsl_appdata.is_empty() {
        return None;
    }

    let wsl_path = PathBuf::from(wsl_appdata).join(app_id);

    if wsl_path.exists() { Some(wsl_path) } else { None }
}
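When no cookie jar is passed explicitly, `resolve_cookie_jar_id` above falls back to the workspace's oldest jar via `min_by_key` on the creation timestamp. A minimal sketch of that selection rule (the `Jar` struct and `default_jar` name are hypothetical stand-ins for the real model types):

```rust
#[derive(Debug)]
struct Jar {
    id: &'static str,
    created_at: u64,
}

// Pick the earliest-created jar, mirroring the min_by_key fallback above.
fn default_jar(jars: Vec<Jar>) -> Option<&'static str> {
    jars.into_iter().min_by_key(|jar| jar.created_at).map(|jar| jar.id)
}

fn main() {
    let jars = vec![
        Jar { id: "cj_b", created_at: 200 },
        Jar { id: "cj_a", created_at: 100 },
    ];
    assert_eq!(default_jar(jars), Some("cj_a"));
    assert_eq!(default_jar(vec![]), None);
}
```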
File diff suppressed because it is too large
@@ -1,42 +0,0 @@
use console::style;
use std::io::{self, IsTerminal};

pub fn info(message: &str) {
    if io::stdout().is_terminal() {
        println!("{:<8} {}", style("INFO").cyan().bold(), style(message).cyan());
    } else {
        println!("INFO {message}");
    }
}

pub fn warning(message: &str) {
    if io::stdout().is_terminal() {
        println!("{:<8} {}", style("WARNING").yellow().bold(), style(message).yellow());
    } else {
        println!("WARNING {message}");
    }
}

pub fn warning_stderr(message: &str) {
    if io::stderr().is_terminal() {
        eprintln!("{:<8} {}", style("WARNING").yellow().bold(), style(message).yellow());
    } else {
        eprintln!("WARNING {message}");
    }
}

pub fn success(message: &str) {
    if io::stdout().is_terminal() {
        println!("{:<8} {}", style("SUCCESS").green().bold(), style(message).green());
    } else {
        println!("SUCCESS {message}");
    }
}

pub fn error(message: &str) {
    if io::stderr().is_terminal() {
        eprintln!("{:<8} {}", style("ERROR").red().bold(), style(message).red());
    } else {
        eprintln!("Error: {message}");
    }
}
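The helpers above only apply ANSI styling when the stream is a terminal, so piped output stays machine-readable. A std-only sketch of that branch (the padded-vs-plain prefix strings are illustrative; the real code styles with the `console` crate):

```rust
use std::io::{self, IsTerminal};

// Styled, column-aligned prefix on a TTY; plain prefix otherwise.
fn render(message: &str, is_tty: bool) -> String {
    if is_tty {
        format!("INFO     {message}")
    } else {
        format!("INFO {message}")
    }
}

fn main() {
    // IsTerminal (std since 1.70) decides which branch real output takes.
    let tty = io::stdout().is_terminal();
    println!("{}", render("hello", tty));
    assert_eq!(render("hello", false), "INFO hello");
}
```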
@@ -1,16 +0,0 @@
use std::io::{self, IsTerminal, Write};

pub fn confirm_delete(resource_name: &str, resource_id: &str) -> bool {
    if !io::stdin().is_terminal() {
        eprintln!("Refusing to delete in non-interactive mode without --yes");
        std::process::exit(1);
    }

    print!("Delete {resource_name} {resource_id}? [y/N]: ");
    io::stdout().flush().expect("Failed to flush stdout");

    let mut input = String::new();
    io::stdin().read_line(&mut input).expect("Failed to read confirmation");

    matches!(input.trim().to_lowercase().as_str(), "y" | "yes")
}
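The final `matches!` above is the whole consent rule: only "y" or "yes", case-insensitively and ignoring surrounding whitespace, counts as confirmation, and an empty line defaults to "No". Extracted as a standalone sketch (`is_confirmed` is a hypothetical name):

```rust
// Same normalization as confirm_delete: trim, lowercase, then accept y/yes only.
fn is_confirmed(input: &str) -> bool {
    matches!(input.trim().to_lowercase().as_str(), "y" | "yes")
}

fn main() {
    assert!(is_confirmed(" YES \n"));
    assert!(is_confirmed("y"));
    assert!(!is_confirmed("")); // empty line defaults to No
    assert!(!is_confirmed("n"));
}
```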
@@ -1,47 +0,0 @@
use reqwest::Client;
use reqwest::header::{HeaderMap, HeaderName, HeaderValue, USER_AGENT};
use serde_json::Value;

pub fn build_client(session_token: Option<&str>) -> Result<Client, String> {
    let mut headers = HeaderMap::new();
    let user_agent = HeaderValue::from_str(&user_agent())
        .map_err(|e| format!("Failed to build user-agent header: {e}"))?;
    headers.insert(USER_AGENT, user_agent);

    if let Some(token) = session_token {
        let token_value = HeaderValue::from_str(token)
            .map_err(|e| format!("Failed to build session header: {e}"))?;
        headers.insert(HeaderName::from_static("x-yaak-session"), token_value);
    }

    Client::builder()
        .default_headers(headers)
        .build()
        .map_err(|e| format!("Failed to initialize HTTP client: {e}"))
}

pub fn parse_api_error(status: u16, body: &str) -> String {
    if let Ok(value) = serde_json::from_str::<Value>(body) {
        if let Some(message) = value.get("message").and_then(Value::as_str) {
            return message.to_string();
        }
        if let Some(error) = value.get("error").and_then(Value::as_str) {
            return error.to_string();
        }
    }

    format!("API error {status}: {body}")
}

fn user_agent() -> String {
    format!("YaakCli/{} ({})", crate::version::cli_version(), ua_platform())
}

fn ua_platform() -> &'static str {
    match std::env::consts::OS {
        "windows" => "Win",
        "macos" => "Mac", // std reports "macos", not "darwin"
        "linux" => "Linux",
        _ => "Unknown",
    }
}
@@ -1,131 +0,0 @@
use serde::Serialize;
use serde::de::DeserializeOwned;
use serde_json::{Map, Value};

type JsonResult<T> = std::result::Result<T, String>;

pub fn is_json_shorthand(input: &str) -> bool {
    input.trim_start().starts_with('{')
}

pub fn parse_json_object(raw: &str, context: &str) -> JsonResult<Value> {
    let value: Value = serde_json::from_str(raw)
        .map_err(|error| format!("Invalid JSON for {context}: {error}"))?;

    if !value.is_object() {
        return Err(format!("JSON payload for {context} must be an object"));
    }

    Ok(value)
}

pub fn parse_optional_json(
    json_flag: Option<String>,
    json_shorthand: Option<String>,
    context: &str,
) -> JsonResult<Option<Value>> {
    match (json_flag, json_shorthand) {
        (Some(_), Some(_)) => {
            Err(format!("Cannot provide both --json and positional JSON for {context}"))
        }
        (Some(raw), None) => parse_json_object(&raw, context).map(Some),
        (None, Some(raw)) => parse_json_object(&raw, context).map(Some),
        (None, None) => Ok(None),
    }
}

pub fn parse_required_json(
    json_flag: Option<String>,
    json_shorthand: Option<String>,
    context: &str,
) -> JsonResult<Value> {
    parse_optional_json(json_flag, json_shorthand, context)?
        .ok_or_else(|| format!("Missing JSON payload for {context}. Use --json or positional JSON"))
}

pub fn require_id(payload: &Value, context: &str) -> JsonResult<String> {
    payload
        .get("id")
        .and_then(|value| value.as_str())
        .filter(|value| !value.is_empty())
        .map(|value| value.to_string())
        .ok_or_else(|| format!("{context} requires a non-empty \"id\" field"))
}

pub fn validate_create_id(payload: &Value, context: &str) -> JsonResult<()> {
    let Some(id_value) = payload.get("id") else {
        return Ok(());
    };

    match id_value {
        Value::String(id) if id.is_empty() => Ok(()),
        _ => Err(format!("{context} create JSON must omit \"id\" or set it to an empty string")),
    }
}

pub fn merge_workspace_id_arg(
    workspace_id_from_arg: Option<&str>,
    payload_workspace_id: &mut String,
    context: &str,
) -> JsonResult<()> {
    if let Some(workspace_id_arg) = workspace_id_from_arg {
        if payload_workspace_id.is_empty() {
            *payload_workspace_id = workspace_id_arg.to_string();
        } else if payload_workspace_id != workspace_id_arg {
            return Err(format!(
                "{context} got conflicting workspace_id values between positional arg and JSON payload"
            ));
        }
    }

    if payload_workspace_id.is_empty() {
        return Err(format!(
            "{context} requires non-empty \"workspaceId\" in JSON payload or positional workspace_id"
        ));
    }

    Ok(())
}

pub fn apply_merge_patch<T>(existing: &T, patch: &Value, id: &str, context: &str) -> JsonResult<T>
where
    T: Serialize + DeserializeOwned,
{
    let mut base = serde_json::to_value(existing)
        .map_err(|error| format!("Failed to serialize existing model for {context}: {error}"))?;
    merge_patch(&mut base, patch);

    let Some(base_object) = base.as_object_mut() else {
        return Err(format!("Merged payload for {context} must be an object"));
    };
    base_object.insert("id".to_string(), Value::String(id.to_string()));

    serde_json::from_value(base)
        .map_err(|error| format!("Failed to deserialize merged payload for {context}: {error}"))
}

fn merge_patch(target: &mut Value, patch: &Value) {
    match patch {
        Value::Object(patch_map) => {
            if !target.is_object() {
                *target = Value::Object(Map::new());
            }

            let target_map =
                target.as_object_mut().expect("merge_patch target expected to be object");

            for (key, patch_value) in patch_map {
                if patch_value.is_null() {
                    target_map.remove(key);
                    continue;
                }

                let target_entry = target_map.entry(key.clone()).or_insert(Value::Null);
                merge_patch(target_entry, patch_value);
            }
        }
        _ => {
            *target = patch.clone();
        }
    }
}
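`merge_patch` above follows RFC 7396-style semantics: `null` deletes a key, objects merge recursively, and any other value replaces the target. A std-only miniature of the same recursion, using a tiny hypothetical `Val` enum in place of `serde_json::Value`:

```rust
use std::collections::BTreeMap;

#[derive(Clone, Debug, PartialEq)]
enum Val {
    Null,
    Str(String),
    Obj(BTreeMap<String, Val>),
}

// Null deletes a key, objects merge recursively, everything else replaces.
fn merge_patch(target: &mut Val, patch: &Val) {
    match patch {
        Val::Obj(patch_map) => {
            if !matches!(target, Val::Obj(_)) {
                *target = Val::Obj(BTreeMap::new());
            }
            let Val::Obj(target_map) = target else { unreachable!() };
            for (key, patch_value) in patch_map {
                if matches!(patch_value, Val::Null) {
                    target_map.remove(key);
                    continue;
                }
                let entry = target_map.entry(key.clone()).or_insert(Val::Null);
                merge_patch(entry, patch_value);
            }
        }
        _ => *target = patch.clone(),
    }
}

fn main() {
    let mut target = Val::Obj(BTreeMap::from([
        ("name".to_string(), Val::Str("Old".to_string())),
        ("url".to_string(), Val::Str("https://example.com".to_string())),
    ]));
    let patch = Val::Obj(BTreeMap::from([
        ("name".to_string(), Val::Str("New".to_string())),
        ("url".to_string(), Val::Null), // null removes the key
    ]));
    merge_patch(&mut target, &patch);
    let Val::Obj(map) = &target else { unreachable!() };
    assert_eq!(map.get("name"), Some(&Val::Str("New".to_string())));
    assert!(map.get("url").is_none());
}
```

Note how `apply_merge_patch` then re-inserts the `id` after merging, so a patch can never change a model's identity.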
@@ -1,5 +0,0 @@
pub mod confirm;
pub mod http;
pub mod json;
pub mod schema;
pub mod workspace;
@@ -1,15 +0,0 @@
use serde_json::{Value, json};

pub fn append_agent_hints(schema: &mut Value) {
    let Some(schema_obj) = schema.as_object_mut() else {
        return;
    };

    schema_obj.insert(
        "x-yaak-agent-hints".to_string(),
        json!({
            "templateVariableSyntax": "${[ my_var ]}",
            "templateFunctionSyntax": "${[ namespace.my_func(a='aaa',b='bbb') ]}",
        }),
    );
}
@@ -1,19 +0,0 @@
use crate::context::CliContext;

pub fn resolve_workspace_id(
    ctx: &CliContext,
    workspace_id: Option<&str>,
    command_name: &str,
) -> Result<String, String> {
    if let Some(workspace_id) = workspace_id {
        return Ok(workspace_id.to_string());
    }

    let workspaces =
        ctx.db().list_workspaces().map_err(|e| format!("Failed to list workspaces: {e}"))?;
    match workspaces.as_slice() {
        [] => Err(format!("No workspaces found. {command_name} requires a workspace ID.")),
        [workspace] => Ok(workspace.id.clone()),
        _ => Err(format!("Multiple workspaces found. {command_name} requires a workspace ID.")),
    }
}
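The slice-pattern match above encodes the fallback rule compactly: zero workspaces and more than one are both errors, and exactly one is used implicitly. The same dispatch as a standalone sketch (`pick` is a hypothetical name):

```rust
// Mirrors resolve_workspace_id's implicit-workspace rule via slice patterns.
fn pick(ids: &[&str]) -> Result<String, String> {
    match ids {
        [] => Err("No workspaces found".to_string()),
        [only] => Ok(only.to_string()),
        _ => Err("Multiple workspaces found".to_string()),
    }
}

fn main() {
    assert_eq!(pick(&["wk_1"]), Ok("wk_1".to_string()));
    assert!(pick(&[]).is_err());
    assert!(pick(&["wk_1", "wk_2"]).is_err());
}
```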
@@ -1,3 +0,0 @@
pub fn cli_version() -> &'static str {
    option_env!("YAAK_CLI_VERSION").unwrap_or(env!("CARGO_PKG_VERSION"))
}
@@ -1,226 +0,0 @@
use crate::ui;
use crate::version;
use serde::{Deserialize, Serialize};
use std::fs;
use std::io::IsTerminal;
use std::path::{Path, PathBuf};
use std::time::{Duration, SystemTime, UNIX_EPOCH};
use yaak_api::{ApiClientKind, yaak_api_client};

const CACHE_FILE_NAME: &str = "cli-version-check.json";
const CHECK_INTERVAL_SECS: u64 = 24 * 60 * 60;
const REQUEST_TIMEOUT: Duration = Duration::from_millis(800);

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
#[serde(default)]
struct VersionCheckResponse {
    outdated: bool,
    latest_version: Option<String>,
    upgrade_hint: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(default)]
struct CacheRecord {
    checked_at_epoch_secs: u64,
    response: VersionCheckResponse,
    last_warned_at_epoch_secs: Option<u64>,
    last_warned_version: Option<String>,
}

impl Default for CacheRecord {
    fn default() -> Self {
        Self {
            checked_at_epoch_secs: 0,
            response: VersionCheckResponse::default(),
            last_warned_at_epoch_secs: None,
            last_warned_version: None,
        }
    }
}

#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
struct VersionCheckRequest<'a> {
    current_version: &'a str,
    channel: String,
    install_source: String,
    platform: &'a str,
    arch: &'a str,
}

pub async fn maybe_check_for_updates() {
    if should_skip_check() {
        return;
    }

    let now = unix_epoch_secs();
    let cache_path = cache_path();
    let cached = read_cache(&cache_path);

    if let Some(cache) = cached.as_ref().filter(|c| !is_expired(c.checked_at_epoch_secs, now)) {
        let mut record = cache.clone();
        maybe_warn_outdated(&mut record, now);
        write_cache(&cache_path, &record);
        return;
    }

    let fresh = fetch_version_check().await;
    match fresh {
        Some(response) => {
            let mut record = CacheRecord {
                checked_at_epoch_secs: now,
                response: response.clone(),
                last_warned_at_epoch_secs: cached
                    .as_ref()
                    .and_then(|c| c.last_warned_at_epoch_secs),
                last_warned_version: cached.as_ref().and_then(|c| c.last_warned_version.clone()),
            };
            maybe_warn_outdated(&mut record, now);
            write_cache(&cache_path, &record);
        }
        None => {
            let fallback = cached.as_ref().map(|cache| cache.response.clone()).unwrap_or_default();
            let mut record = CacheRecord {
                checked_at_epoch_secs: now,
                response: fallback,
                last_warned_at_epoch_secs: cached
                    .as_ref()
                    .and_then(|c| c.last_warned_at_epoch_secs),
                last_warned_version: cached.as_ref().and_then(|c| c.last_warned_version.clone()),
            };
            maybe_warn_outdated(&mut record, now);
            write_cache(&cache_path, &record);
        }
    }
}

fn should_skip_check() -> bool {
    if std::env::var("YAAK_CLI_NO_UPDATE_CHECK")
        .is_ok_and(|v| v == "1" || v.eq_ignore_ascii_case("true"))
    {
        return true;
    }

    if std::env::var("CI").is_ok() {
        return true;
    }

    !std::io::stdout().is_terminal()
}

async fn fetch_version_check() -> Option<VersionCheckResponse> {
    let api_url = format!("{}/cli/check", update_base_url());
    let current_version = version::cli_version();
    let payload = VersionCheckRequest {
        current_version,
        channel: release_channel(current_version),
        install_source: install_source(),
        platform: std::env::consts::OS,
        arch: std::env::consts::ARCH,
    };

    let client = yaak_api_client(ApiClientKind::Cli, current_version).ok()?;
    let request = client.post(api_url).json(&payload);

    let response = tokio::time::timeout(REQUEST_TIMEOUT, request.send()).await.ok()?.ok()?;
    if !response.status().is_success() {
        return None;
    }

    tokio::time::timeout(REQUEST_TIMEOUT, response.json::<VersionCheckResponse>()).await.ok()?.ok()
}

fn release_channel(version: &str) -> String {
    version
        .split_once('-')
        .and_then(|(_, suffix)| suffix.split('.').next())
        .unwrap_or("stable")
        .to_string()
}

fn install_source() -> String {
    std::env::var("YAAK_CLI_INSTALL_SOURCE")
        .ok()
        .filter(|s| !s.trim().is_empty())
        .unwrap_or_else(|| "source".to_string())
}

fn update_base_url() -> &'static str {
    match std::env::var("ENVIRONMENT").ok().as_deref() {
        Some("development") => "http://localhost:9444",
        _ => "https://update.yaak.app",
    }
}

fn maybe_warn_outdated(record: &mut CacheRecord, now: u64) {
    if !record.response.outdated {
        return;
    }

    let latest =
        record.response.latest_version.clone().unwrap_or_else(|| "a newer release".to_string());
    let warn_suppressed = record.last_warned_version.as_deref() == Some(latest.as_str())
        && record.last_warned_at_epoch_secs.is_some_and(|t| !is_expired(t, now));
    if warn_suppressed {
        return;
    }

    let hint = record.response.upgrade_hint.clone().unwrap_or_else(default_upgrade_hint);
    ui::warning_stderr(&format!("A newer Yaak CLI version is available ({latest}). {hint}"));
    record.last_warned_version = Some(latest);
    record.last_warned_at_epoch_secs = Some(now);
}

fn default_upgrade_hint() -> String {
    if install_source() == "npm" {
        let channel = release_channel(version::cli_version());
        if channel == "stable" {
            return "Run `npm install -g @yaakapp/cli@latest` to update.".to_string();
        }
        return format!("Run `npm install -g @yaakapp/cli@{channel}` to update.");
    }

    "Update your Yaak CLI installation to the latest release.".to_string()
}

fn cache_path() -> PathBuf {
    std::env::temp_dir().join("yaak-cli").join(format!("{}-{CACHE_FILE_NAME}", environment_name()))
}

fn environment_name() -> &'static str {
    match std::env::var("ENVIRONMENT").ok().as_deref() {
        Some("staging") => "staging",
        Some("development") => "development",
        _ => "production",
    }
}

fn read_cache(path: &Path) -> Option<CacheRecord> {
    let contents = fs::read_to_string(path).ok()?;
    serde_json::from_str::<CacheRecord>(&contents).ok()
}

fn write_cache(path: &Path, record: &CacheRecord) {
    let Some(parent) = path.parent() else {
        return;
    };
    if fs::create_dir_all(parent).is_err() {
        return;
    }
    let Ok(json) = serde_json::to_string(record) else {
        return;
    };
    let _ = fs::write(path, json);
}

fn is_expired(checked_at_epoch_secs: u64, now: u64) -> bool {
    now.saturating_sub(checked_at_epoch_secs) >= CHECK_INTERVAL_SECS
}

fn unix_epoch_secs() -> u64 {
    SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap_or_else(|_| Duration::from_secs(0))
        .as_secs()
}
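`release_channel` above derives the update channel from a semver pre-release tag: everything between the first `-` and the next `.` is the channel, and a version without a suffix is "stable". The rule isolated as a runnable sketch:

```rust
// "2.1.0" -> "stable"; "2.1.0-beta.3" -> "beta".
fn release_channel(version: &str) -> String {
    version
        .split_once('-')
        .and_then(|(_, suffix)| suffix.split('.').next())
        .unwrap_or("stable")
        .to_string()
}

fn main() {
    assert_eq!(release_channel("2.1.0"), "stable");
    assert_eq!(release_channel("2.1.0-beta.3"), "beta");
    assert_eq!(release_channel("2.1.0-alpha.1"), "alpha");
}
```

The channel then feeds both the check payload and the npm upgrade hint (`@yaakapp/cli@{channel}`), so pre-release users are steered to their own dist-tag.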
@@ -1,64 +0,0 @@
use std::io::{Read, Write};
use std::net::{SocketAddr, TcpListener, TcpStream};
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::thread;
use std::time::Duration;

pub struct TestHttpServer {
    pub url: String,
    addr: SocketAddr,
    shutdown: Arc<AtomicBool>,
    handle: Option<thread::JoinHandle<()>>,
}

impl TestHttpServer {
    pub fn spawn_ok(body: &'static str) -> Self {
        let listener = TcpListener::bind("127.0.0.1:0").expect("Failed to bind test HTTP server");
        let addr = listener.local_addr().expect("Failed to get local addr");
        let url = format!("http://{addr}/test");
        listener.set_nonblocking(true).expect("Failed to set test server listener nonblocking");

        let shutdown = Arc::new(AtomicBool::new(false));
        let shutdown_signal = Arc::clone(&shutdown);
        let body_bytes = body.as_bytes().to_vec();

        let handle = thread::spawn(move || {
            while !shutdown_signal.load(Ordering::Relaxed) {
                match listener.accept() {
                    Ok((mut stream, _)) => {
                        let _ = stream.set_read_timeout(Some(Duration::from_secs(1)));
                        let mut request_buf = [0u8; 4096];
                        let _ = stream.read(&mut request_buf);

                        let response = format!(
                            "HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\nContent-Length: {}\r\nConnection: close\r\n\r\n",
                            body_bytes.len()
                        );
                        let _ = stream.write_all(response.as_bytes());
                        let _ = stream.write_all(&body_bytes);
                        let _ = stream.flush();
                        break;
                    }
                    Err(err) if err.kind() == std::io::ErrorKind::WouldBlock => {
                        thread::sleep(Duration::from_millis(10));
                    }
                    Err(_) => break,
                }
            }
        });

        Self { url, addr, shutdown, handle: Some(handle) }
    }
}

impl Drop for TestHttpServer {
    fn drop(&mut self) {
        self.shutdown.store(true, Ordering::Relaxed);
        let _ = TcpStream::connect(self.addr);

        if let Some(handle) = self.handle.take() {
            let _ = handle.join();
        }
    }
}
@@ -1,106 +0,0 @@
#![allow(dead_code)]

pub mod http_server;

use assert_cmd::Command;
use assert_cmd::cargo::cargo_bin_cmd;
use std::path::Path;
use yaak_models::models::{Folder, GrpcRequest, HttpRequest, WebsocketRequest, Workspace};
use yaak_models::query_manager::QueryManager;
use yaak_models::util::UpdateSource;

pub fn cli_cmd(data_dir: &Path) -> Command {
    let mut cmd = cargo_bin_cmd!("yaak");
    cmd.arg("--data-dir").arg(data_dir);
    cmd
}

pub fn parse_created_id(stdout: &[u8], label: &str) -> String {
    String::from_utf8_lossy(stdout)
        .trim()
        .split_once(": ")
        .map(|(_, id)| id.to_string())
        .unwrap_or_else(|| panic!("Expected id in '{label}' output"))
}

pub fn query_manager(data_dir: &Path) -> QueryManager {
    let db_path = data_dir.join("db.sqlite");
    let blob_path = data_dir.join("blobs.sqlite");
    let (query_manager, _blob_manager, _rx) =
        yaak_models::init_standalone(&db_path, &blob_path).expect("Failed to initialize DB");
    query_manager
}

pub fn seed_workspace(data_dir: &Path, workspace_id: &str) {
    let workspace = Workspace {
        id: workspace_id.to_string(),
        name: "Seed Workspace".to_string(),
        description: "Seeded for integration tests".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_workspace(&workspace, &UpdateSource::Sync)
        .expect("Failed to seed workspace");
}

pub fn seed_request(data_dir: &Path, workspace_id: &str, request_id: &str) {
    let request = HttpRequest {
        id: request_id.to_string(),
        workspace_id: workspace_id.to_string(),
        name: "Seeded Request".to_string(),
        method: "GET".to_string(),
        url: "https://example.com".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_http_request(&request, &UpdateSource::Sync)
        .expect("Failed to seed request");
}

pub fn seed_folder(data_dir: &Path, workspace_id: &str, folder_id: &str) {
    let folder = Folder {
        id: folder_id.to_string(),
        workspace_id: workspace_id.to_string(),
        name: "Seed Folder".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_folder(&folder, &UpdateSource::Sync)
        .expect("Failed to seed folder");
}

pub fn seed_grpc_request(data_dir: &Path, workspace_id: &str, request_id: &str) {
    let request = GrpcRequest {
        id: request_id.to_string(),
        workspace_id: workspace_id.to_string(),
        name: "Seeded gRPC Request".to_string(),
        url: "https://example.com".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_grpc_request(&request, &UpdateSource::Sync)
        .expect("Failed to seed gRPC request");
}

pub fn seed_websocket_request(data_dir: &Path, workspace_id: &str, request_id: &str) {
    let request = WebsocketRequest {
        id: request_id.to_string(),
        workspace_id: workspace_id.to_string(),
        name: "Seeded WebSocket Request".to_string(),
        url: "wss://example.com/socket".to_string(),
        ..Default::default()
    };

    query_manager(data_dir)
        .connect()
        .upsert_websocket_request(&request, &UpdateSource::Sync)
        .expect("Failed to seed WebSocket request");
}
@@ -1,146 +0,0 @@
-mod common;
-
-use common::{cli_cmd, parse_created_id, query_manager, seed_workspace};
-use predicates::str::contains;
-use tempfile::TempDir;
-
-#[test]
-fn create_list_show_delete_round_trip() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    cli_cmd(data_dir)
-        .args(["environment", "list", "wk_test"])
-        .assert()
-        .success()
-        .stdout(contains("Global Variables"));
-
-    let create_assert = cli_cmd(data_dir)
-        .args(["environment", "create", "wk_test", "--name", "Production"])
-        .assert()
-        .success();
-    let environment_id = parse_created_id(&create_assert.get_output().stdout, "environment create");
-
-    cli_cmd(data_dir)
-        .args(["environment", "list", "wk_test"])
-        .assert()
-        .success()
-        .stdout(contains(&environment_id))
-        .stdout(contains("Production"));
-
-    cli_cmd(data_dir)
-        .args(["environment", "show", &environment_id])
-        .assert()
-        .success()
-        .stdout(contains(format!("\"id\": \"{environment_id}\"")))
-        .stdout(contains("\"parentModel\": \"environment\""));
-
-    cli_cmd(data_dir)
-        .args(["environment", "delete", &environment_id, "--yes"])
-        .assert()
-        .success()
-        .stdout(contains(format!("Deleted environment: {environment_id}")));
-
-    assert!(query_manager(data_dir).connect().get_environment(&environment_id).is_err());
-}
-
-#[test]
-fn json_create_and_update_merge_patch_round_trip() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let create_assert = cli_cmd(data_dir)
-        .args([
-            "environment",
-            "create",
-            r#"{"workspaceId":"wk_test","name":"Json Environment"}"#,
-        ])
-        .assert()
-        .success();
-    let environment_id = parse_created_id(&create_assert.get_output().stdout, "environment create");
-
-    cli_cmd(data_dir)
-        .args([
-            "environment",
-            "update",
-            &format!(r##"{{"id":"{}","color":"#00ff00"}}"##, environment_id),
-        ])
-        .assert()
-        .success()
-        .stdout(contains(format!("Updated environment: {environment_id}")));
-
-    cli_cmd(data_dir)
-        .args(["environment", "show", &environment_id])
-        .assert()
-        .success()
-        .stdout(contains("\"name\": \"Json Environment\""))
-        .stdout(contains("\"color\": \"#00ff00\""));
-}
-
-#[test]
-fn create_merges_positional_workspace_id_into_json_payload() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let create_assert = cli_cmd(data_dir)
-        .args([
-            "environment",
-            "create",
-            "wk_test",
-            "--json",
-            r#"{"name":"Merged Environment"}"#,
-        ])
-        .assert()
-        .success();
-    let environment_id = parse_created_id(&create_assert.get_output().stdout, "environment create");
-
-    cli_cmd(data_dir)
-        .args(["environment", "show", &environment_id])
-        .assert()
-        .success()
-        .stdout(contains("\"workspaceId\": \"wk_test\""))
-        .stdout(contains("\"name\": \"Merged Environment\""));
-}
-
-#[test]
-fn create_rejects_conflicting_workspace_ids_between_arg_and_json() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-    seed_workspace(data_dir, "wk_other");
-
-    cli_cmd(data_dir)
-        .args([
-            "environment",
-            "create",
-            "wk_test",
-            "--json",
-            r#"{"workspaceId":"wk_other","name":"Mismatch"}"#,
-        ])
-        .assert()
-        .failure()
-        .stderr(contains(
-            "environment create got conflicting workspace_id values between positional arg and JSON payload",
-        ));
-}
-
-#[test]
-fn environment_schema_outputs_json_schema() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-
-    cli_cmd(data_dir)
-        .args(["environment", "schema"])
-        .assert()
-        .success()
-        .stdout(contains("\"type\":\"object\""))
-        .stdout(contains("\"x-yaak-agent-hints\""))
-        .stdout(contains("\"templateVariableSyntax\":\"${[ my_var ]}\""))
-        .stdout(contains(
-            "\"templateFunctionSyntax\":\"${[ namespace.my_func(a='aaa',b='bbb') ]}\"",
-        ))
-        .stdout(contains("\"workspaceId\""));
-}
@@ -1,122 +0,0 @@
-mod common;
-
-use common::{cli_cmd, parse_created_id, query_manager, seed_workspace};
-use predicates::str::contains;
-use tempfile::TempDir;
-
-#[test]
-fn create_list_show_delete_round_trip() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let create_assert = cli_cmd(data_dir)
-        .args(["folder", "create", "wk_test", "--name", "Auth"])
-        .assert()
-        .success();
-    let folder_id = parse_created_id(&create_assert.get_output().stdout, "folder create");
-
-    cli_cmd(data_dir)
-        .args(["folder", "list", "wk_test"])
-        .assert()
-        .success()
-        .stdout(contains(&folder_id))
-        .stdout(contains("Auth"));
-
-    cli_cmd(data_dir)
-        .args(["folder", "show", &folder_id])
-        .assert()
-        .success()
-        .stdout(contains(format!("\"id\": \"{folder_id}\"")))
-        .stdout(contains("\"workspaceId\": \"wk_test\""));
-
-    cli_cmd(data_dir)
-        .args(["folder", "delete", &folder_id, "--yes"])
-        .assert()
-        .success()
-        .stdout(contains(format!("Deleted folder: {folder_id}")));
-
-    assert!(query_manager(data_dir).connect().get_folder(&folder_id).is_err());
-}
-
-#[test]
-fn json_create_and_update_merge_patch_round_trip() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let create_assert = cli_cmd(data_dir)
-        .args([
-            "folder",
-            "create",
-            r#"{"workspaceId":"wk_test","name":"Json Folder"}"#,
-        ])
-        .assert()
-        .success();
-    let folder_id = parse_created_id(&create_assert.get_output().stdout, "folder create");
-
-    cli_cmd(data_dir)
-        .args([
-            "folder",
-            "update",
-            &format!(r#"{{"id":"{}","description":"Folder Description"}}"#, folder_id),
-        ])
-        .assert()
-        .success()
-        .stdout(contains(format!("Updated folder: {folder_id}")));
-
-    cli_cmd(data_dir)
-        .args(["folder", "show", &folder_id])
-        .assert()
-        .success()
-        .stdout(contains("\"name\": \"Json Folder\""))
-        .stdout(contains("\"description\": \"Folder Description\""));
-}
-
-#[test]
-fn create_merges_positional_workspace_id_into_json_payload() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let create_assert = cli_cmd(data_dir)
-        .args([
-            "folder",
-            "create",
-            "wk_test",
-            "--json",
-            r#"{"name":"Merged Folder"}"#,
-        ])
-        .assert()
-        .success();
-    let folder_id = parse_created_id(&create_assert.get_output().stdout, "folder create");
-
-    cli_cmd(data_dir)
-        .args(["folder", "show", &folder_id])
-        .assert()
-        .success()
-        .stdout(contains("\"workspaceId\": \"wk_test\""))
-        .stdout(contains("\"name\": \"Merged Folder\""));
-}
-
-#[test]
-fn create_rejects_conflicting_workspace_ids_between_arg_and_json() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-    seed_workspace(data_dir, "wk_other");
-
-    cli_cmd(data_dir)
-        .args([
-            "folder",
-            "create",
-            "wk_test",
-            "--json",
-            r#"{"workspaceId":"wk_other","name":"Mismatch"}"#,
-        ])
-        .assert()
-        .failure()
-        .stderr(contains(
-            "folder create got conflicting workspace_id values between positional arg and JSON payload",
-        ));
-}
@@ -1,291 +0,0 @@
-mod common;
-
-use common::http_server::TestHttpServer;
-use common::{
-    cli_cmd, parse_created_id, query_manager, seed_grpc_request, seed_request,
-    seed_websocket_request, seed_workspace,
-};
-use predicates::str::contains;
-use tempfile::TempDir;
-use yaak_models::models::HttpResponseState;
-
-#[test]
-fn show_and_delete_yes_round_trip() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let create_assert = cli_cmd(data_dir)
-        .args([
-            "request",
-            "create",
-            "wk_test",
-            "--name",
-            "Smoke Test",
-            "--url",
-            "https://example.com",
-        ])
-        .assert()
-        .success();
-
-    let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");
-
-    cli_cmd(data_dir)
-        .args(["request", "show", &request_id])
-        .assert()
-        .success()
-        .stdout(contains(format!("\"id\": \"{request_id}\"")))
-        .stdout(contains("\"workspaceId\": \"wk_test\""));
-
-    cli_cmd(data_dir)
-        .args(["request", "delete", &request_id, "--yes"])
-        .assert()
-        .success()
-        .stdout(contains(format!("Deleted request: {request_id}")));
-
-    assert!(query_manager(data_dir).connect().get_http_request(&request_id).is_err());
-}
-
-#[test]
-fn delete_without_yes_fails_in_non_interactive_mode() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-    seed_request(data_dir, "wk_test", "rq_seed_delete_noninteractive");
-
-    cli_cmd(data_dir)
-        .args(["request", "delete", "rq_seed_delete_noninteractive"])
-        .assert()
-        .failure()
-        .code(1)
-        .stderr(contains("Refusing to delete in non-interactive mode without --yes"));
-
-    assert!(
-        query_manager(data_dir).connect().get_http_request("rq_seed_delete_noninteractive").is_ok()
-    );
-}
-
-#[test]
-fn json_create_and_update_merge_patch_round_trip() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let create_assert = cli_cmd(data_dir)
-        .args([
-            "request",
-            "create",
-            r#"{"workspaceId":"wk_test","name":"Json Request","url":"https://example.com"}"#,
-        ])
-        .assert()
-        .success();
-    let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");
-
-    cli_cmd(data_dir)
-        .args([
-            "request",
-            "update",
-            &format!(r#"{{"id":"{}","name":"Renamed Request"}}"#, request_id),
-        ])
-        .assert()
-        .success()
-        .stdout(contains(format!("Updated request: {request_id}")));
-
-    cli_cmd(data_dir)
-        .args(["request", "show", &request_id])
-        .assert()
-        .success()
-        .stdout(contains("\"name\": \"Renamed Request\""))
-        .stdout(contains("\"url\": \"https://example.com\""));
-}
-
-#[test]
-fn update_requires_id_in_json_payload() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-
-    cli_cmd(data_dir)
-        .args(["request", "update", r#"{"name":"No ID"}"#])
-        .assert()
-        .failure()
-        .stderr(contains("request update requires a non-empty \"id\" field"));
-}
-
-#[test]
-fn create_allows_workspace_only_with_empty_defaults() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let create_assert = cli_cmd(data_dir).args(["request", "create", "wk_test"]).assert().success();
-    let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");
-
-    let request = query_manager(data_dir)
-        .connect()
-        .get_http_request(&request_id)
-        .expect("Failed to load created request");
-    assert_eq!(request.workspace_id, "wk_test");
-    assert_eq!(request.method, "GET");
-    assert_eq!(request.name, "");
-    assert_eq!(request.url, "");
-}
-
-#[test]
-fn create_merges_positional_workspace_id_into_json_payload() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let create_assert = cli_cmd(data_dir)
-        .args([
-            "request",
-            "create",
-            "wk_test",
-            "--json",
-            r#"{"name":"Merged Request","url":"https://example.com"}"#,
-        ])
-        .assert()
-        .success();
-    let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");
-
-    cli_cmd(data_dir)
-        .args(["request", "show", &request_id])
-        .assert()
-        .success()
-        .stdout(contains("\"workspaceId\": \"wk_test\""))
-        .stdout(contains("\"name\": \"Merged Request\""));
-}
-
-#[test]
-fn create_rejects_conflicting_workspace_ids_between_arg_and_json() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-    seed_workspace(data_dir, "wk_other");
-
-    cli_cmd(data_dir)
-        .args([
-            "request",
-            "create",
-            "wk_test",
-            "--json",
-            r#"{"workspaceId":"wk_other","name":"Mismatch"}"#,
-        ])
-        .assert()
-        .failure()
-        .stderr(contains(
-            "request create got conflicting workspace_id values between positional arg and JSON payload",
-        ));
-}
-
-#[test]
-fn request_send_persists_response_body_and_events() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let server = TestHttpServer::spawn_ok("hello from integration test");
-
-    let create_assert = cli_cmd(data_dir)
-        .args([
-            "request",
-            "create",
-            "wk_test",
-            "--name",
-            "Send Test",
-            "--url",
-            &server.url,
-        ])
-        .assert()
-        .success();
-    let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");
-
-    cli_cmd(data_dir)
-        .args(["request", "send", &request_id])
-        .assert()
-        .success()
-        .stdout(contains("hello from integration test"));
-
-    let qm = query_manager(data_dir);
-    let db = qm.connect();
-    let responses =
-        db.list_http_responses_for_request(&request_id, None).expect("Failed to load responses");
-    assert_eq!(responses.len(), 1, "expected exactly one persisted response");
-
-    let response = &responses[0];
-    assert_eq!(response.status, 200);
-    assert!(matches!(response.state, HttpResponseState::Closed));
-    assert!(response.error.is_none());
-
-    let body_path =
-        response.body_path.as_ref().expect("expected persisted response body path").to_string();
-    let body = std::fs::read_to_string(&body_path).expect("Failed to read response body file");
-    assert_eq!(body, "hello from integration test");
-
-    let events =
-        db.list_http_response_events(&response.id).expect("Failed to load response events");
-    assert!(!events.is_empty(), "expected at least one persisted response event");
-}
-
-#[test]
-fn request_schema_http_outputs_json_schema() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-
-    cli_cmd(data_dir)
-        .args(["request", "schema", "http"])
-        .assert()
-        .success()
-        .stdout(contains("\"type\":\"object\""))
-        .stdout(contains("\"x-yaak-agent-hints\""))
-        .stdout(contains("\"templateVariableSyntax\":\"${[ my_var ]}\""))
-        .stdout(contains(
-            "\"templateFunctionSyntax\":\"${[ namespace.my_func(a='aaa',b='bbb') ]}\"",
-        ))
-        .stdout(contains("\"authentication\":"))
-        .stdout(contains("/foo/:id/comments/:commentId"))
-        .stdout(contains("put concrete values in `urlParameters`"));
-}
-
-#[test]
-fn request_schema_http_pretty_prints_with_flag() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-
-    cli_cmd(data_dir)
-        .args(["request", "schema", "http", "--pretty"])
-        .assert()
-        .success()
-        .stdout(contains("\"type\": \"object\""))
-        .stdout(contains("\"authentication\""));
-}
-
-#[test]
-fn request_send_grpc_returns_explicit_nyi_error() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-    seed_grpc_request(data_dir, "wk_test", "gr_seed_nyi");
-
-    cli_cmd(data_dir)
-        .args(["request", "send", "gr_seed_nyi"])
-        .assert()
-        .failure()
-        .code(1)
-        .stderr(contains("gRPC request send is not implemented yet in yaak-cli"));
-}
-
-#[test]
-fn request_send_websocket_returns_explicit_nyi_error() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-    seed_websocket_request(data_dir, "wk_test", "wr_seed_nyi");
-
-    cli_cmd(data_dir)
-        .args(["request", "send", "wr_seed_nyi"])
-        .assert()
-        .failure()
-        .code(1)
-        .stderr(contains("WebSocket request send is not implemented yet in yaak-cli"));
-}
@@ -1,79 +0,0 @@
-mod common;
-
-use common::http_server::TestHttpServer;
-use common::{cli_cmd, query_manager, seed_folder, seed_workspace};
-use predicates::str::contains;
-use tempfile::TempDir;
-use yaak_models::models::HttpRequest;
-use yaak_models::util::UpdateSource;
-
-#[test]
-fn top_level_send_workspace_sends_http_requests_and_prints_summary() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-
-    let server = TestHttpServer::spawn_ok("workspace bulk send");
-    let request = HttpRequest {
-        id: "rq_workspace_send".to_string(),
-        workspace_id: "wk_test".to_string(),
-        name: "Workspace Send".to_string(),
-        method: "GET".to_string(),
-        url: server.url.clone(),
-        ..Default::default()
-    };
-    query_manager(data_dir)
-        .connect()
-        .upsert_http_request(&request, &UpdateSource::Sync)
-        .expect("Failed to seed workspace request");
-
-    cli_cmd(data_dir)
-        .args(["send", "wk_test"])
-        .assert()
-        .success()
-        .stdout(contains("workspace bulk send"))
-        .stdout(contains("Send summary: 1 succeeded, 0 failed"));
-}
-
-#[test]
-fn top_level_send_folder_sends_http_requests_and_prints_summary() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-    seed_workspace(data_dir, "wk_test");
-    seed_folder(data_dir, "wk_test", "fl_test");
-
-    let server = TestHttpServer::spawn_ok("folder bulk send");
-    let request = HttpRequest {
-        id: "rq_folder_send".to_string(),
-        workspace_id: "wk_test".to_string(),
-        folder_id: Some("fl_test".to_string()),
-        name: "Folder Send".to_string(),
-        method: "GET".to_string(),
-        url: server.url.clone(),
-        ..Default::default()
-    };
-    query_manager(data_dir)
-        .connect()
-        .upsert_http_request(&request, &UpdateSource::Sync)
-        .expect("Failed to seed folder request");
-
-    cli_cmd(data_dir)
-        .args(["send", "fl_test"])
-        .assert()
-        .success()
-        .stdout(contains("folder bulk send"))
-        .stdout(contains("Send summary: 1 succeeded, 0 failed"));
-}
-
-#[test]
-fn top_level_send_unknown_id_fails_with_clear_error() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-
-    cli_cmd(data_dir)
-        .args(["send", "does_not_exist"])
-        .assert()
-        .failure()
-        .code(1)
-        .stderr(contains("Could not resolve ID 'does_not_exist' as request, folder, or workspace"));
-}
@@ -1,77 +0,0 @@
-mod common;
-
-use common::{cli_cmd, parse_created_id, query_manager};
-use predicates::str::contains;
-use tempfile::TempDir;
-
-#[test]
-fn create_show_delete_round_trip() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-
-    let create_assert =
-        cli_cmd(data_dir).args(["workspace", "create", "--name", "WS One"]).assert().success();
-    let workspace_id = parse_created_id(&create_assert.get_output().stdout, "workspace create");
-
-    cli_cmd(data_dir)
-        .args(["workspace", "show", &workspace_id])
-        .assert()
-        .success()
-        .stdout(contains(format!("\"id\": \"{workspace_id}\"")))
-        .stdout(contains("\"name\": \"WS One\""));
-
-    cli_cmd(data_dir)
-        .args(["workspace", "delete", &workspace_id, "--yes"])
-        .assert()
-        .success()
-        .stdout(contains(format!("Deleted workspace: {workspace_id}")));
-
-    assert!(query_manager(data_dir).connect().get_workspace(&workspace_id).is_err());
-}
-
-#[test]
-fn json_create_and_update_merge_patch_round_trip() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-
-    let create_assert = cli_cmd(data_dir)
-        .args(["workspace", "create", r#"{"name":"Json Workspace"}"#])
-        .assert()
-        .success();
-    let workspace_id = parse_created_id(&create_assert.get_output().stdout, "workspace create");
-
-    cli_cmd(data_dir)
-        .args([
-            "workspace",
-            "update",
-            &format!(r#"{{"id":"{}","description":"Updated via JSON"}}"#, workspace_id),
-        ])
-        .assert()
-        .success()
-        .stdout(contains(format!("Updated workspace: {workspace_id}")));
-
-    cli_cmd(data_dir)
-        .args(["workspace", "show", &workspace_id])
-        .assert()
-        .success()
-        .stdout(contains("\"name\": \"Json Workspace\""))
-        .stdout(contains("\"description\": \"Updated via JSON\""));
-}
-
-#[test]
-fn workspace_schema_outputs_json_schema() {
-    let temp_dir = TempDir::new().expect("Failed to create temp dir");
-    let data_dir = temp_dir.path();
-
-    cli_cmd(data_dir)
-        .args(["workspace", "schema"])
-        .assert()
-        .success()
-        .stdout(contains("\"type\":\"object\""))
-        .stdout(contains("\"x-yaak-agent-hints\""))
-        .stdout(contains("\"templateVariableSyntax\":\"${[ my_var ]}\""))
-        .stdout(contains(
-            "\"templateFunctionSyntax\":\"${[ namespace.my_func(a='aaa',b='bbb') ]}\"",
-        ))
-        .stdout(contains("\"name\""));
-}
@@ -30,21 +30,11 @@ eventsource-client = { git = "https://github.com/yaakapp/rust-eventsource-client
 http = { version = "1.2.0", default-features = false }
 log = { workspace = true }
 md5 = "0.8.0"
-pretty_graphql = "0.2"
 r2d2 = "0.8.10"
 r2d2_sqlite = "0.25.0"
 mime_guess = "2.0.5"
 rand = "0.9.0"
-reqwest = { workspace = true, features = [
-    "multipart",
-    "gzip",
-    "brotli",
-    "deflate",
-    "json",
-    "rustls-tls-manual-roots-no-provider",
-    "socks",
-    "http2",
-] }
+reqwest = { workspace = true, features = ["multipart", "gzip", "brotli", "deflate", "json", "rustls-tls-manual-roots-no-provider", "socks", "http2"] }
 serde = { workspace = true, features = ["derive"] }
 serde_json = { workspace = true, features = ["raw_value"] }
 tauri = { workspace = true, features = ["devtools", "protocol-asset"] }
@@ -67,11 +57,9 @@ url = "2"
 tokio-util = { version = "0.7", features = ["codec"] }
 ts-rs = { workspace = true }
 uuid = "1.12.1"
-yaak-api = { workspace = true }
 yaak-common = { workspace = true }
 yaak-tauri-utils = { workspace = true }
 yaak-core = { workspace = true }
-yaak = { workspace = true }
 yaak-crypto = { workspace = true }
 yaak-fonts = { workspace = true }
 yaak-git = { workspace = true }
@@ -1,7 +1,9 @@
 {
   "identifier": "default",
   "description": "Default capabilities for all build variants",
-  "windows": ["*"],
+  "windows": [
+    "*"
+  ],
   "permissions": [
     "core:app:allow-identifier",
     "core:event:allow-emit",
@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/tauri",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "bindings/index.ts"
 }
@@ -2,6 +2,7 @@ use crate::PluginContextExt;
 use crate::error::Result;
 use std::sync::Arc;
 use tauri::{AppHandle, Manager, Runtime, State, WebviewWindow, command};
+use tauri_plugin_dialog::{DialogExt, MessageDialogKind};
 use yaak_crypto::manager::EncryptionManager;
 use yaak_models::models::HttpRequestHeader;
 use yaak_models::queries::workspaces::default_headers;
@@ -22,6 +23,20 @@ impl<'a, R: Runtime, M: Manager<R>> EncryptionManagerExt<'a, R> for M {
     }
 }
 
+#[command]
+pub(crate) async fn cmd_show_workspace_key<R: Runtime>(
+    window: WebviewWindow<R>,
+    workspace_id: &str,
+) -> Result<()> {
+    let key = window.crypto().reveal_workspace_key(workspace_id)?;
+    window
+        .dialog()
+        .message(format!("Your workspace key is \n\n{}", key))
+        .kind(MessageDialogKind::Info)
+        .show(|_v| {});
+    Ok(())
+}
+
 #[command]
 pub(crate) async fn cmd_decrypt_template<R: Runtime>(
     window: WebviewWindow<R>,
@@ -36,7 +36,7 @@ pub enum Error {
     PluginError(#[from] yaak_plugins::error::Error),
 
     #[error(transparent)]
-    ApiError(#[from] yaak_api::Error),
+    TauriUtilsError(#[from] yaak_tauri_utils::error::Error),
 
     #[error(transparent)]
     ClipboardError(#[from] tauri_plugin_clipboard_manager::Error),
@@ -9,8 +9,8 @@ use yaak_git::{
     BranchDeleteResult, CloneResult, GitCommit, GitRemote, GitStatusSummary, PullResult,
     PushResult, git_add, git_add_credential, git_add_remote, git_checkout_branch, git_clone,
     git_commit, git_create_branch, git_delete_branch, git_delete_remote_branch, git_fetch_all,
-    git_init, git_log, git_merge_branch, git_pull, git_pull_force_reset, git_pull_merge, git_push,
-    git_remotes, git_rename_branch, git_reset_changes, git_rm_remote, git_status, git_unstage,
+    git_init, git_log, git_merge_branch, git_pull, git_push, git_remotes, git_rename_branch,
+    git_rm_remote, git_status, git_unstage,
 };
 
 // NOTE: All of these commands are async to prevent blocking work from locking up the UI
@@ -89,20 +89,6 @@ pub async fn cmd_git_pull(dir: &Path) -> Result<PullResult> {
     Ok(git_pull(dir).await?)
 }
 
-#[command]
-pub async fn cmd_git_pull_force_reset(
-    dir: &Path,
-    remote: &str,
-    branch: &str,
-) -> Result<PullResult> {
-    Ok(git_pull_force_reset(dir, remote, branch).await?)
-}
-
-#[command]
-pub async fn cmd_git_pull_merge(dir: &Path, remote: &str, branch: &str) -> Result<PullResult> {
-    Ok(git_pull_merge(dir, remote, branch).await?)
-}
-
 #[command]
 pub async fn cmd_git_add(dir: &Path, rela_paths: Vec<PathBuf>) -> Result<()> {
     for path in rela_paths {
@@ -119,11 +105,6 @@ pub async fn cmd_git_unstage(dir: &Path, rela_paths: Vec<PathBuf>) -> Result<()>
     Ok(())
 }
 
-#[command]
-pub async fn cmd_git_reset_changes(dir: &Path) -> Result<()> {
-    Ok(git_reset_changes(dir).await?)
-}
-
 #[command]
 pub async fn cmd_git_add_credential(
     remote_url: &str,
@@ -3,18 +3,45 @@ use crate::error::Error::GenericError;
 use crate::error::Result;
 use crate::models_ext::BlobManagerExt;
 use crate::models_ext::QueryManagerExt;
-use log::warn;
+use crate::render::render_http_request;
+use log::{debug, warn};
+use std::pin::Pin;
 use std::sync::Arc;
-use std::time::Instant;
+use std::sync::atomic::{AtomicI32, Ordering};
+use std::time::{Duration, Instant};
 use tauri::{AppHandle, Manager, Runtime, WebviewWindow};
+use tokio::fs::{File, create_dir_all};
+use tokio::io::{AsyncRead, AsyncReadExt, AsyncWriteExt};
 use tokio::sync::watch::Receiver;
-use yaak::send::{SendHttpRequestWithPluginsParams, send_http_request_with_plugins};
+use tokio_util::bytes::Bytes;
 use yaak_crypto::manager::EncryptionManager;
-use yaak_http::manager::HttpConnectionManager;
-use yaak_models::models::{CookieJar, Environment, HttpRequest, HttpResponse, HttpResponseState};
+use yaak_http::client::{
+    HttpConnectionOptions, HttpConnectionProxySetting, HttpConnectionProxySettingAuth,
+};
+use yaak_http::cookies::CookieStore;
+use yaak_http::manager::{CachedClient, HttpConnectionManager};
+use yaak_http::sender::ReqwestSender;
+use yaak_http::tee_reader::TeeReader;
+use yaak_http::transaction::HttpTransaction;
+use yaak_http::types::{
+    SendableBody, SendableHttpRequest, SendableHttpRequestOptions, append_query_params,
+};
+use yaak_models::blob_manager::BodyChunk;
+use yaak_models::models::{
+    CookieJar, Environment, HttpRequest, HttpResponse, HttpResponseEvent, HttpResponseHeader,
+    HttpResponseState, ProxySetting, ProxySettingAuth,
+};
 use yaak_models::util::UpdateSource;
-use yaak_plugins::events::PluginContext;
+use yaak_plugins::events::{
+    CallHttpAuthenticationRequest, HttpHeader, PluginContext, RenderPurpose,
+};
 use yaak_plugins::manager::PluginManager;
+use yaak_plugins::template_callback::PluginTemplateCallback;
+use yaak_templates::RenderOptions;
+use yaak_tls::find_client_certificate;
 
+/// Chunk size for storing request bodies (1MB)
+const REQUEST_BODY_CHUNK_SIZE: usize = 1024 * 1024;
+
 /// Context for managing response state during HTTP transactions.
 /// Handles both persisted responses (stored in DB) and ephemeral responses (in-memory only).
@@ -141,31 +168,135 @@ async fn send_http_request_inner<R: Runtime>(
     let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
     let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
     let connection_manager = app_handle.state::<HttpConnectionManager>();
+    let settings = window.db().get_settings();
+    let workspace_id = &unrendered_request.workspace_id;
+    let folder_id = unrendered_request.folder_id.as_deref();
     let environment_id = environment.map(|e| e.id);
-    let cookie_jar_id = cookie_jar.as_ref().map(|jar| jar.id.clone());
+    let workspace = window.db().get_workspace(workspace_id)?;
+    let (resolved, auth_context_id) = resolve_http_request(window, unrendered_request)?;
+    let cb = PluginTemplateCallback::new(
+        plugin_manager.clone(),
+        encryption_manager.clone(),
+        &plugin_context,
+        RenderPurpose::Send,
+    );
+    let env_chain =
+        window.db().resolve_environments(&workspace.id, folder_id, environment_id.as_deref())?;
+    let request = render_http_request(&resolved, env_chain, &cb, &RenderOptions::throw()).await?;
 
-    let response_dir = app_handle.path().app_data_dir()?.join("responses");
-    let result = send_http_request_with_plugins(SendHttpRequestWithPluginsParams {
-        query_manager: app_handle.db_manager().inner(),
-        blob_manager: app_handle.blob_manager().inner(),
-        request: unrendered_request.clone(),
-        environment_id: environment_id.as_deref(),
-        update_source: response_ctx.update_source.clone(),
-        cookie_jar_id,
-        response_dir: &response_dir,
-        emit_events_to: None,
-        emit_response_body_chunks_to: None,
-        existing_response: Some(response_ctx.response().clone()),
-        plugin_manager,
-        encryption_manager,
+    // Build the sendable request using the new SendableHttpRequest type
+    let options = SendableHttpRequestOptions {
+        follow_redirects: workspace.setting_follow_redirects,
+        timeout: if workspace.setting_request_timeout > 0 {
+            Some(Duration::from_millis(workspace.setting_request_timeout.unsigned_abs() as u64))
+        } else {
+            None
+        },
+    };
+    let mut sendable_request = SendableHttpRequest::from_http_request(&request, options).await?;
+
+    debug!("Sending request to {} {}", sendable_request.method, sendable_request.url);
+
+    let proxy_setting = match settings.proxy {
+        None => HttpConnectionProxySetting::System,
+        Some(ProxySetting::Disabled) => HttpConnectionProxySetting::Disabled,
+        Some(ProxySetting::Enabled { http, https, auth, bypass, disabled }) => {
+            if disabled {
+                HttpConnectionProxySetting::System
+            } else {
+                HttpConnectionProxySetting::Enabled {
+                    http,
+                    https,
+                    bypass,
+                    auth: match auth {
+                        None => None,
+                        Some(ProxySettingAuth { user, password }) => {
+                            Some(HttpConnectionProxySettingAuth { user, password })
+                        }
+                    },
+                }
+            }
+        }
+    };
+
+    let client_certificate =
+        find_client_certificate(&sendable_request.url, &settings.client_certificates);
+
+    // Create cookie store if a cookie jar is specified
+    let maybe_cookie_store = match cookie_jar.clone() {
+        Some(CookieJar { id, .. }) => {
+            // NOTE: We need to refetch the cookie jar because a chained request might have
+            // updated cookies when we rendered the request.
+            let cj = window.db().get_cookie_jar(&id)?;
+            let cookie_store = CookieStore::from_cookies(cj.cookies.clone());
+            Some((cookie_store, cj))
+        }
+        None => None,
+    };
+
+    let cached_client = connection_manager
+        .get_client(&HttpConnectionOptions {
+            id: plugin_context.id.clone(),
+            validate_certificates: workspace.setting_validate_certificates,
+            proxy: proxy_setting,
+            client_certificate,
+            dns_overrides: workspace.setting_dns_overrides.clone(),
+        })
+        .await?;
+
+    // Apply authentication to the request
+    apply_authentication(
+        &window,
+        &mut sendable_request,
+        &request,
+        auth_context_id,
+        &plugin_manager,
         plugin_context,
-        cancelled_rx: Some(cancelled_rx.clone()),
-        connection_manager: Some(connection_manager.inner()),
-    })
-    .await
-    .map_err(|e| GenericError(e.to_string()))?;
+    )
+    .await?;
 
-    Ok(result.response)
+    let cookie_store = maybe_cookie_store.as_ref().map(|(cs, _)| cs.clone());
+    let result = execute_transaction(
+        cached_client,
+        sendable_request,
+        response_ctx,
+        cancelled_rx.clone(),
+        cookie_store,
+    )
+    .await;
+
+    // Wait for blob writing to complete and check for errors
+    let final_result = match result {
+        Ok((response, maybe_blob_write_handle)) => {
+            // Check if blob writing failed
+            if let Some(handle) = maybe_blob_write_handle {
+                if let Ok(Err(e)) = handle.await {
+                    // Update response with the storage error
+                    let _ = response_ctx.update(|r| {
+                        let error_msg =
+                            format!("Request succeeded but failed to store request body: {}", e);
+                        r.error = Some(match &r.error {
+                            Some(existing) => format!("{}; {}", existing, error_msg),
+                            None => error_msg,
+                        });
+                    });
+                }
+            }
+            Ok(response)
+        }
+        Err(e) => Err(e),
+    };
+
+    // Persist cookies back to the database after the request completes
+    if let Some((cookie_store, mut cj)) = maybe_cookie_store {
+        let cookies = cookie_store.get_all_cookies();
+        cj.cookies = cookies;
+        if let Err(e) = window.db().upsert_cookie_jar(&cj, &UpdateSource::Background) {
+            warn!("Failed to persist cookies to database: {}", e);
+        }
+    }
+
+    final_result
 }
 
 pub fn resolve_http_request<R: Runtime>(
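The options block in the hunk above turns a signed per-workspace timeout setting into an `Option<Duration>`, where a non-positive value means "no timeout". A minimal standalone sketch of that mapping (the struct and field names here are illustrative placeholders, not the real yaak-models types):

```rust
use std::time::Duration;

// Hypothetical stand-in for the workspace settings used above.
struct WorkspaceSettings {
    request_timeout_ms: i64, // <= 0 means "no timeout"
}

// A non-positive timeout disables the limit; a positive one becomes a
// Duration, mirroring the `unsigned_abs()` conversion in the diff.
fn request_timeout(settings: &WorkspaceSettings) -> Option<Duration> {
    if settings.request_timeout_ms > 0 {
        Some(Duration::from_millis(settings.request_timeout_ms.unsigned_abs()))
    } else {
        None
    }
}

fn main() {
    assert_eq!(
        request_timeout(&WorkspaceSettings { request_timeout_ms: 5000 }),
        Some(Duration::from_millis(5000))
    );
    assert_eq!(request_timeout(&WorkspaceSettings { request_timeout_ms: 0 }), None);
    println!("ok");
}
```

Using `unsigned_abs()` rather than a plain cast avoids undefined-looking results if the stored setting were ever negative.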
@@ -184,3 +315,395 @@
 
     Ok((new_request, authentication_context_id))
 }
+
+async fn execute_transaction<R: Runtime>(
+    cached_client: CachedClient,
+    mut sendable_request: SendableHttpRequest,
+    response_ctx: &mut ResponseContext<R>,
+    mut cancelled_rx: Receiver<bool>,
+    cookie_store: Option<CookieStore>,
+) -> Result<(HttpResponse, Option<tauri::async_runtime::JoinHandle<Result<()>>>)> {
+    let app_handle = &response_ctx.app_handle.clone();
+    let response_id = response_ctx.response().id.clone();
+    let workspace_id = response_ctx.response().workspace_id.clone();
+    let is_persisted = response_ctx.is_persisted();
+
+    // Keep a reference to the resolver for DNS timing events
+    let resolver = cached_client.resolver.clone();
+
+    let sender = ReqwestSender::with_client(cached_client.client);
+    let transaction = match cookie_store {
+        Some(cs) => HttpTransaction::with_cookie_store(sender, cs),
+        None => HttpTransaction::new(sender),
+    };
+    let start = Instant::now();
+
+    // Capture request headers before sending
+    let request_headers: Vec<HttpResponseHeader> = sendable_request
+        .headers
+        .iter()
+        .map(|(name, value)| HttpResponseHeader { name: name.clone(), value: value.clone() })
+        .collect();
+
+    // Update response with headers info
+    response_ctx.update(|r| {
+        r.url = sendable_request.url.clone();
+        r.request_headers = request_headers;
+    })?;
+
+    // Create bounded channel for receiving events and spawn a task to store them in DB
+    // Buffer size of 100 events provides back pressure if DB writes are slow
+    let (event_tx, mut event_rx) =
+        tokio::sync::mpsc::channel::<yaak_http::sender::HttpResponseEvent>(100);
+
+    // Set the event sender on the DNS resolver so it can emit DNS timing events
+    resolver.set_event_sender(Some(event_tx.clone())).await;
+
+    // Shared state to capture DNS timing from the event processing task
+    let dns_elapsed = Arc::new(AtomicI32::new(0));
+
+    // Write events to DB in a task (only for persisted responses)
+    if is_persisted {
+        let response_id = response_id.clone();
+        let app_handle = app_handle.clone();
+        let update_source = response_ctx.update_source.clone();
+        let workspace_id = workspace_id.clone();
+        let dns_elapsed = dns_elapsed.clone();
+        tokio::spawn(async move {
+            while let Some(event) = event_rx.recv().await {
+                // Capture DNS timing when we see a DNS event
+                if let yaak_http::sender::HttpResponseEvent::DnsResolved { duration, .. } = &event {
+                    dns_elapsed.store(*duration as i32, Ordering::SeqCst);
+                }
+                let db_event = HttpResponseEvent::new(&response_id, &workspace_id, event.into());
+                let _ = app_handle.db().upsert_http_response_event(&db_event, &update_source);
+            }
+        });
+    } else {
+        // For ephemeral responses, just drain the events but still capture DNS timing
+        let dns_elapsed = dns_elapsed.clone();
+        tokio::spawn(async move {
+            while let Some(event) = event_rx.recv().await {
+                if let yaak_http::sender::HttpResponseEvent::DnsResolved { duration, .. } = &event {
+                    dns_elapsed.store(*duration as i32, Ordering::SeqCst);
+                }
+            }
+        });
+    };
+
+    // Capture request body as it's sent (only for persisted responses)
+    let body_id = format!("{}.request", response_id);
+    let maybe_blob_write_handle = match sendable_request.body {
+        Some(SendableBody::Bytes(bytes)) => {
+            if is_persisted {
+                write_bytes_to_db_sync(response_ctx, &body_id, bytes.clone())?;
+            }
+            sendable_request.body = Some(SendableBody::Bytes(bytes));
+            None
+        }
+        Some(SendableBody::Stream(stream)) => {
+            // Wrap stream with TeeReader to capture data as it's read
+            // Use unbounded channel to ensure all data is captured without blocking the HTTP request
+            let (body_chunk_tx, body_chunk_rx) = tokio::sync::mpsc::unbounded_channel::<Vec<u8>>();
+            let tee_reader = TeeReader::new(stream, body_chunk_tx);
+            let pinned: Pin<Box<dyn AsyncRead + Send + 'static>> = Box::pin(tee_reader);
+
+            let handle = if is_persisted {
+                // Spawn task to write request body chunks to blob DB
+                let app_handle = app_handle.clone();
+                let response_id = response_id.clone();
+                let workspace_id = workspace_id.clone();
+                let body_id = body_id.clone();
+                let update_source = response_ctx.update_source.clone();
+                Some(tauri::async_runtime::spawn(async move {
+                    write_stream_chunks_to_db(
+                        app_handle,
+                        &body_id,
+                        &workspace_id,
+                        &response_id,
+                        &update_source,
+                        body_chunk_rx,
+                    )
+                    .await
+                }))
+            } else {
+                // For ephemeral responses, just drain the body chunks
+                tauri::async_runtime::spawn(async move {
+                    let mut rx = body_chunk_rx;
+                    while rx.recv().await.is_some() {}
+                });
+                None
+            };
+
+            sendable_request.body = Some(SendableBody::Stream(pinned));
+            handle
+        }
+        None => {
+            sendable_request.body = None;
+            None
+        }
+    };
+
+    // Execute the transaction with cancellation support
+    // This returns the response with headers, but body is not yet consumed
+    // Events (headers, settings, chunks) are sent through the channel
+    let mut http_response = transaction
+        .execute_with_cancellation(sendable_request, cancelled_rx.clone(), event_tx)
+        .await?;
+
+    // Prepare the response path before consuming the body
+    let body_path = if response_id.is_empty() {
+        // Ephemeral responses: use OS temp directory for automatic cleanup
+        let temp_dir = std::env::temp_dir().join("yaak-ephemeral-responses");
+        create_dir_all(&temp_dir).await?;
+        temp_dir.join(uuid::Uuid::new_v4().to_string())
+    } else {
+        // Persisted responses: use app data directory
+        let dir = app_handle.path().app_data_dir()?;
+        let base_dir = dir.join("responses");
+        create_dir_all(&base_dir).await?;
+        base_dir.join(&response_id)
+    };
+
+    // Extract metadata before consuming the body (headers are available immediately)
+    // Url might change, so update again
+    response_ctx.update(|r| {
+        r.body_path = Some(body_path.to_string_lossy().to_string());
+        r.elapsed_headers = start.elapsed().as_millis() as i32;
+        r.status = http_response.status as i32;
+        r.status_reason = http_response.status_reason.clone();
+        r.url = http_response.url.clone();
+        r.remote_addr = http_response.remote_addr.clone();
+        r.version = http_response.version.clone();
+        r.headers = http_response
+            .headers
+            .iter()
+            .map(|(name, value)| HttpResponseHeader { name: name.clone(), value: value.clone() })
+            .collect();
+        r.content_length = http_response.content_length.map(|l| l as i32);
+        r.state = HttpResponseState::Connected;
+        r.request_headers = http_response
+            .request_headers
+            .iter()
+            .map(|(n, v)| HttpResponseHeader { name: n.clone(), value: v.clone() })
+            .collect();
+    })?;
+
+    // Get the body stream for manual consumption
+    let mut body_stream = http_response.into_body_stream()?;
+
+    // Open file for writing
+    let mut file = File::options()
+        .create(true)
+        .truncate(true)
+        .write(true)
+        .open(&body_path)
+        .await
+        .map_err(|e| GenericError(format!("Failed to open file: {}", e)))?;
+
+    // Stream body to file, with throttled DB updates to avoid excessive writes
+    let mut written_bytes: usize = 0;
+    let mut last_update_time = start;
+    let mut buf = [0u8; 8192];
+
+    // Throttle settings: update DB at most every 100ms
+    const UPDATE_INTERVAL_MS: u128 = 100;
+
+    loop {
+        // Check for cancellation. If we already have headers/body, just close cleanly without error
+        if *cancelled_rx.borrow() {
+            break;
+        }
+
+        // Use select! to race between reading and cancellation, so cancellation is immediate
+        let read_result = tokio::select! {
+            biased;
+            _ = cancelled_rx.changed() => {
+                break;
+            }
+            result = body_stream.read(&mut buf) => result,
+        };
+
+        match read_result {
+            Ok(0) => break, // EOF
+            Ok(n) => {
+                file.write_all(&buf[..n])
+                    .await
+                    .map_err(|e| GenericError(format!("Failed to write to file: {}", e)))?;
+                file.flush()
+                    .await
+                    .map_err(|e| GenericError(format!("Failed to flush file: {}", e)))?;
+                written_bytes += n;
+
+                // Throttle DB updates: only update if enough time has passed
+                let now = Instant::now();
+                let elapsed_since_update = now.duration_since(last_update_time).as_millis();
+
+                if elapsed_since_update >= UPDATE_INTERVAL_MS {
+                    response_ctx.update(|r| {
+                        r.elapsed = start.elapsed().as_millis() as i32;
+                        r.content_length = Some(written_bytes as i32);
+                    })?;
+                    last_update_time = now;
+                }
+            }
+            Err(e) => {
+                return Err(GenericError(format!("Failed to read response body: {}", e)));
+            }
+        }
+    }
+
+    // Final update with closed state and accurate byte count
+    response_ctx.update(|r| {
+        r.elapsed = start.elapsed().as_millis() as i32;
+        r.elapsed_dns = dns_elapsed.load(Ordering::SeqCst);
+        r.content_length = Some(written_bytes as i32);
+        r.state = HttpResponseState::Closed;
+    })?;
+
+    // Clear the event sender from the resolver since this request is done
+    resolver.set_event_sender(None).await;
+
+    Ok((response_ctx.response().clone(), maybe_blob_write_handle))
+}
+
+fn write_bytes_to_db_sync<R: Runtime>(
+    response_ctx: &mut ResponseContext<R>,
+    body_id: &str,
+    data: Bytes,
+) -> Result<()> {
+    if data.is_empty() {
+        return Ok(());
+    }
+
+    // Write in chunks if data is large
+    let mut offset = 0;
+    let mut chunk_index = 0;
+    while offset < data.len() {
+        let end = std::cmp::min(offset + REQUEST_BODY_CHUNK_SIZE, data.len());
+        let chunk_data = data.slice(offset..end).to_vec();
+        let chunk = BodyChunk::new(body_id, chunk_index, chunk_data);
+        response_ctx.app_handle.blobs().insert_chunk(&chunk)?;
+        offset = end;
+        chunk_index += 1;
+    }
+
+    // Update the response with the total request body size
+    response_ctx.update(|r| {
+        r.request_content_length = Some(data.len() as i32);
+    })?;
+
+    Ok(())
+}
+
+async fn write_stream_chunks_to_db<R: Runtime>(
+    app_handle: AppHandle<R>,
+    body_id: &str,
+    workspace_id: &str,
+    response_id: &str,
+    update_source: &UpdateSource,
+    mut rx: tokio::sync::mpsc::UnboundedReceiver<Vec<u8>>,
+) -> Result<()> {
+    let mut buffer = Vec::with_capacity(REQUEST_BODY_CHUNK_SIZE);
+    let mut chunk_index = 0;
+    let mut total_bytes: usize = 0;
+
+    while let Some(data) = rx.recv().await {
+        total_bytes += data.len();
+        buffer.extend_from_slice(&data);
+
+        // Flush when buffer reaches chunk size
+        while buffer.len() >= REQUEST_BODY_CHUNK_SIZE {
+            debug!("Writing chunk {chunk_index} to DB");
+            let chunk_data: Vec<u8> = buffer.drain(..REQUEST_BODY_CHUNK_SIZE).collect();
+            let chunk = BodyChunk::new(body_id, chunk_index, chunk_data);
+            app_handle.blobs().insert_chunk(&chunk)?;
+            app_handle.db().upsert_http_response_event(
+                &HttpResponseEvent::new(
+                    response_id,
+                    workspace_id,
+                    yaak_http::sender::HttpResponseEvent::ChunkSent {
+                        bytes: REQUEST_BODY_CHUNK_SIZE,
+                    }
+                    .into(),
+                ),
+                update_source,
+            )?;
+            chunk_index += 1;
+        }
+    }
+
+    // Flush remaining data
+    if !buffer.is_empty() {
+        let chunk = BodyChunk::new(body_id, chunk_index, buffer);
+        debug!("Flushing remaining data {chunk_index} {}", chunk.data.len());
+        app_handle.blobs().insert_chunk(&chunk)?;
+        app_handle.db().upsert_http_response_event(
+            &HttpResponseEvent::new(
+                response_id,
+                workspace_id,
+                yaak_http::sender::HttpResponseEvent::ChunkSent { bytes: chunk.data.len() }.into(),
+            ),
+            update_source,
+        )?;
+    }
+
+    // Update the response with the total request body size
+    app_handle.with_tx(|tx| {
+        debug!("Updating final body length {total_bytes}");
+        if let Ok(mut response) = tx.get_http_response(&response_id) {
+            response.request_content_length = Some(total_bytes as i32);
+            tx.update_http_response_if_id(&response, update_source)?;
+        }
+        Ok(())
+    })?;
+
+    Ok(())
+}
+
+async fn apply_authentication<R: Runtime>(
+    _window: &WebviewWindow<R>,
+    sendable_request: &mut SendableHttpRequest,
+    request: &HttpRequest,
+    auth_context_id: String,
+    plugin_manager: &PluginManager,
+    plugin_context: &PluginContext,
+) -> Result<()> {
+    match &request.authentication_type {
+        None => {
+            // No authentication found. Not even inherited
+        }
+        Some(authentication_type) if authentication_type == "none" => {
+            // Explicitly no authentication
+        }
+        Some(authentication_type) => {
+            let req = CallHttpAuthenticationRequest {
+                context_id: format!("{:x}", md5::compute(auth_context_id)),
+                values: serde_json::from_value(serde_json::to_value(&request.authentication)?)?,
+                url: sendable_request.url.clone(),
+                method: sendable_request.method.clone(),
+                headers: sendable_request
+                    .headers
+                    .iter()
+                    .map(|(name, value)| HttpHeader {
+                        name: name.to_string(),
+                        value: value.to_string(),
+                    })
+                    .collect(),
+            };
+            let plugin_result = plugin_manager
+                .call_http_authentication(plugin_context, &authentication_type, req)
+                .await?;
+
+            for header in plugin_result.set_headers.unwrap_or_default() {
+                sendable_request.insert_header((header.name, header.value));
+            }
+
+            if let Some(params) = plugin_result.set_query_parameters {
+                let params = params.into_iter().map(|p| (p.name, p.value)).collect::<Vec<_>>();
+                sendable_request.url = append_query_params(&sendable_request.url, params);
+            }
+        }
+    }
+    Ok(())
+}
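The stream-capture code above accumulates incoming byte packets in a buffer, drains full fixed-size chunks as they become available, and flushes the remainder once the stream ends. A self-contained sketch of that drain pattern, with the DB writes removed (the function name and the tiny 4-byte chunk size are illustrative only; the diff uses 1 MiB chunks):

```rust
// Chunk size kept tiny for the demo; the real code uses 1024 * 1024.
const CHUNK_SIZE: usize = 4;

// Buffers incoming byte packets and yields fixed-size chunks, flushing the
// final partial chunk at the end — the same extend/drain loop as above.
fn chunk_packets(packets: &[&[u8]]) -> Vec<Vec<u8>> {
    let mut buffer: Vec<u8> = Vec::with_capacity(CHUNK_SIZE);
    let mut chunks = Vec::new();
    for data in packets {
        buffer.extend_from_slice(data);
        // Drain as many full chunks as the buffer currently holds
        while buffer.len() >= CHUNK_SIZE {
            chunks.push(buffer.drain(..CHUNK_SIZE).collect());
        }
    }
    if !buffer.is_empty() {
        chunks.push(buffer); // final partial chunk
    }
    chunks
}

fn main() {
    // 7 bytes arriving in uneven packets become one full chunk plus a remainder
    let chunks = chunk_packets(&[b"ab" as &[u8], b"cde", b"fg"]);
    assert_eq!(chunks, vec![b"abcd".to_vec(), b"efg".to_vec()]);
    println!("ok");
}
```

Because the buffer is drained in place, memory stays bounded by one chunk regardless of how many packets arrive, which is why the diff can safely feed it from an unbounded channel.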
@@ -31,16 +31,14 @@ use tauri_plugin_window_state::{AppHandleExt, StateFlags};
 use tokio::sync::Mutex;
 use tokio::task::block_in_place;
 use tokio::time;
-use yaak_common::command::new_checked_command;
 use yaak_crypto::manager::EncryptionManager;
 use yaak_grpc::manager::{GrpcConfig, GrpcHandle};
-use yaak_templates::strip_json_comments::strip_json_comments;
 use yaak_grpc::{Code, ServiceDefinition, serialize_message};
 use yaak_mac_window::AppHandleMacWindowExt;
 use yaak_models::models::{
     AnyModel, CookieJar, Environment, GrpcConnection, GrpcConnectionState, GrpcEvent,
-    GrpcEventType, HttpRequest, HttpResponse, HttpResponseEvent, HttpResponseState, Workspace,
-    WorkspaceMeta,
+    GrpcEventType, GrpcRequest, HttpRequest, HttpResponse, HttpResponseEvent, HttpResponseState,
+    Plugin, Workspace, WorkspaceMeta,
 };
 use yaak_models::util::{BatchUpsertResult, UpdateSource, get_workspace_export_resources};
 use yaak_plugins::events::{
@@ -99,7 +97,6 @@ impl<R: Runtime> PluginContextExt<R> for WebviewWindow<R> {
 struct AppMetaData {
     is_dev: bool,
     version: String,
-    cli_version: Option<String>,
     name: String,
     app_data_dir: String,
     app_log_dir: String,
@@ -116,11 +113,9 @@ async fn cmd_metadata(app_handle: AppHandle) -> YaakResult<AppMetaData> {
     let vendored_plugin_dir =
         app_handle.path().resolve("vendored/plugins", BaseDirectory::Resource)?;
     let default_project_dir = app_handle.path().home_dir()?.join("YaakProjects");
-    let cli_version = detect_cli_version().await;
     Ok(AppMetaData {
         is_dev: is_dev(),
         version: app_handle.package_info().version.to_string(),
-        cli_version,
         name: app_handle.package_info().name.to_string(),
         app_data_dir: app_data_dir.to_string_lossy().to_string(),
         app_log_dir: app_log_dir.to_string_lossy().to_string(),
@@ -131,24 +126,6 @@ async fn cmd_metadata(app_handle: AppHandle) -> YaakResult<AppMetaData> {
     })
 }
 
-async fn detect_cli_version() -> Option<String> {
-    detect_cli_version_for_binary("yaak").await
-}
-
-async fn detect_cli_version_for_binary(program: &str) -> Option<String> {
-    let mut cmd = new_checked_command(program, "--version").await.ok()?;
-    let out = cmd.arg("--version").output().await.ok()?;
-    if !out.status.success() {
-        return None;
-    }
-
-    let line = String::from_utf8(out.stdout).ok()?;
-    let line = line.lines().find(|l| !l.trim().is_empty())?.trim();
-    let mut parts = line.split_whitespace();
-    let _name = parts.next();
-    Some(parts.next().unwrap_or(line).to_string())
-}
-
 #[tauri::command]
 async fn cmd_template_tokens_to_string<R: Runtime>(
     window: WebviewWindow<R>,
@@ -434,7 +411,6 @@ async fn cmd_grpc_go<R: Runtime>(
                             result.expect("Failed to render template")
                         })
                     });
-                    let msg = strip_json_comments(&msg);
                     in_msg_tx.try_send(msg.clone()).unwrap();
                 }
                 Ok(IncomingMsg::Commit) => {
@@ -470,7 +446,6 @@ async fn cmd_grpc_go<R: Runtime>(
                     &RenderOptions { error_behavior: RenderErrorBehavior::Throw },
                 )
                 .await?;
-                let msg = strip_json_comments(&msg);
 
                 app_handle.db().upsert_grpc_event(
                     &GrpcEvent {
@@ -872,14 +847,6 @@ async fn cmd_format_json(text: &str) -> YaakResult<String> {
     Ok(format_json(text, " "))
 }
 
-#[tauri::command]
-async fn cmd_format_graphql(text: &str) -> YaakResult<String> {
-    match pretty_graphql::format_text(text, &Default::default()) {
-        Ok(formatted) => Ok(formatted),
-        Err(_) => Ok(text.to_string()),
-    }
-}
-
 #[tauri::command]
 async fn cmd_http_response_body<R: Runtime>(
     window: WebviewWindow<R>,
@@ -1129,8 +1096,7 @@ async fn cmd_get_http_authentication_config<R: Runtime>(
     // Convert HashMap<String, JsonPrimitive> to serde_json::Value for rendering
     let values_json: serde_json::Value = serde_json::to_value(&values)?;
     let rendered_json =
-        render_json_value(values_json, environment_chain, &cb, &RenderOptions::return_empty())
-            .await?;
+        render_json_value(values_json, environment_chain, &cb, &RenderOptions::throw()).await?;
 
     // Convert back to HashMap<String, JsonPrimitive>
     let rendered_values: HashMap<String, JsonPrimitive> = serde_json::from_value(rendered_json)?;
@@ -1305,6 +1271,35 @@ async fn cmd_save_response<R: Runtime>(
     Ok(())
 }
 
+#[tauri::command]
+async fn cmd_send_folder<R: Runtime>(
+    app_handle: AppHandle<R>,
+    window: WebviewWindow<R>,
+    environment_id: Option<String>,
+    cookie_jar_id: Option<String>,
+    folder_id: &str,
+) -> YaakResult<()> {
+    let requests = app_handle.db().list_http_requests_for_folder_recursive(folder_id)?;
+    for request in requests {
+        let app_handle = app_handle.clone();
+        let window = window.clone();
+        let environment_id = environment_id.clone();
+        let cookie_jar_id = cookie_jar_id.clone();
+        tokio::spawn(async move {
+            let _ = cmd_send_http_request(
+                app_handle,
+                window,
+                environment_id.as_deref(),
+                cookie_jar_id.as_deref(),
+                request,
+            )
+            .await;
+        });
+    }
+
+    Ok(())
+}
+
 #[tauri::command]
 async fn cmd_send_http_request<R: Runtime>(
     app_handle: AppHandle<R>,
@@ -1378,17 +1373,62 @@ async fn cmd_send_http_request<R: Runtime>(
     Ok(r)
 }
 
+#[tauri::command]
+async fn cmd_install_plugin<R: Runtime>(
+    directory: &str,
+    url: Option<String>,
+    plugin_manager: State<'_, PluginManager>,
+    app_handle: AppHandle<R>,
+    window: WebviewWindow<R>,
+) -> YaakResult<Plugin> {
+    let plugin = app_handle.db().upsert_plugin(
+        &Plugin { directory: directory.into(), url, enabled: true, ..Default::default() },
+        &UpdateSource::from_window_label(window.label()),
+    )?;
+
+    plugin_manager
+        .add_plugin(
+            &PluginContext::new(Some(window.label().to_string()), window.workspace_id()),
+            &plugin,
+        )
+        .await?;
+
+    Ok(plugin)
+}
+
+#[tauri::command]
+async fn cmd_create_grpc_request<R: Runtime>(
+    workspace_id: &str,
+    name: &str,
+    sort_priority: f64,
+    folder_id: Option<&str>,
+    app_handle: AppHandle<R>,
+    window: WebviewWindow<R>,
+) -> YaakResult<GrpcRequest> {
+    Ok(app_handle.db().upsert_grpc_request(
+        &GrpcRequest {
+            workspace_id: workspace_id.to_string(),
+            name: name.to_string(),
+            folder_id: folder_id.map(|s| s.to_string()),
+            sort_priority,
+            ..Default::default()
+        },
+        &UpdateSource::from_window_label(window.label()),
+    )?)
+}
+
 #[tauri::command]
 async fn cmd_reload_plugins<R: Runtime>(
     app_handle: AppHandle<R>,
     window: WebviewWindow<R>,
     plugin_manager: State<'_, PluginManager>,
-) -> YaakResult<Vec<(String, String)>> {
+) -> YaakResult<()> {
     let plugins = app_handle.db().list_plugins()?;
     let plugin_context =
         PluginContext::new(Some(window.label().to_string()), window.workspace_id());
-    let errors = plugin_manager.initialize_all_plugins(plugins, &plugin_context).await;
-    Ok(errors)
+    let _errors = plugin_manager.initialize_all_plugins(plugins, &plugin_context).await;
+    // Note: errors are returned but we don't show toasts here since this is a manual reload
+    Ok(())
 }
 
 #[tauri::command]
@@ -1639,6 +1679,7 @@ pub fn run() {
             cmd_call_folder_action,
            cmd_call_grpc_request_action,
             cmd_check_for_updates,
+            cmd_create_grpc_request,
             cmd_curl_to_request,
             cmd_delete_all_grpc_connections,
             cmd_delete_all_http_responses,
@@ -1648,7 +1689,6 @@ pub fn run() {
             cmd_http_request_body,
             cmd_http_response_body,
             cmd_format_json,
-            cmd_format_graphql,
             cmd_get_http_authentication_summaries,
             cmd_get_http_authentication_config,
             cmd_get_sse_events,
@@ -1662,6 +1702,7 @@ pub fn run() {
             cmd_workspace_actions,
             cmd_folder_actions,
             cmd_import_data,
+            cmd_install_plugin,
             cmd_metadata,
             cmd_new_child_window,
             cmd_new_main_window,
@@ -1672,6 +1713,7 @@ pub fn run() {
             cmd_save_response,
             cmd_send_ephemeral_request,
             cmd_send_http_request,
+            cmd_send_folder,
             cmd_template_function_config,
             cmd_template_function_summaries,
             cmd_template_tokens_to_string,
@@ -1686,6 +1728,7 @@ pub fn run() {
             crate::commands::cmd_reveal_workspace_key,
             crate::commands::cmd_secure_template,
             crate::commands::cmd_set_workspace_key,
+            crate::commands::cmd_show_workspace_key,
             //
             // Models commands
             models_ext::models_delete,
@@ -1719,19 +1762,14 @@ pub fn run() {
             git_ext::cmd_git_fetch_all,
             git_ext::cmd_git_push,
             git_ext::cmd_git_pull,
-            git_ext::cmd_git_pull_force_reset,
-            git_ext::cmd_git_pull_merge,
             git_ext::cmd_git_add,
             git_ext::cmd_git_unstage,
-            git_ext::cmd_git_reset_changes,
             git_ext::cmd_git_add_credential,
             git_ext::cmd_git_remotes,
             git_ext::cmd_git_add_remote,
             git_ext::cmd_git_rm_remote,
             //
             // Plugin commands
-            plugins_ext::cmd_plugin_init_errors,
-            plugins_ext::cmd_plugins_install_from_directory,
             plugins_ext::cmd_plugins_search,
             plugins_ext::cmd_plugins_install,
             plugins_ext::cmd_plugins_uninstall,
@@ -1739,7 +1777,14 @@ pub fn run() {
             plugins_ext::cmd_plugins_update_all,
             //
             // WebSocket commands
+            ws_ext::cmd_ws_upsert_request,
+            ws_ext::cmd_ws_duplicate_request,
+            ws_ext::cmd_ws_delete_request,
+            ws_ext::cmd_ws_delete_connection,
             ws_ext::cmd_ws_delete_connections,
+            ws_ext::cmd_ws_list_events,
+            ws_ext::cmd_ws_list_requests,
+            ws_ext::cmd_ws_list_connections,
             ws_ext::cmd_ws_send,
             ws_ext::cmd_ws_close,
             ws_ext::cmd_ws_connect,
@@ -3,9 +3,6 @@
 //! This module provides the Tauri plugin initialization and extension traits
 //! that allow accessing QueryManager and BlobManager from Tauri's Manager types.
 
-use chrono::Utc;
-use log::error;
-use std::time::Duration;
 use tauri::plugin::TauriPlugin;
 use tauri::{Emitter, Manager, Runtime, State};
 use tauri_plugin_dialog::{DialogExt, MessageDialogKind};
@@ -15,75 +12,6 @@ use yaak_models::error::Result;
 use yaak_models::models::{AnyModel, GraphQlIntrospection, GrpcEvent, Settings, WebsocketEvent};
 use yaak_models::query_manager::QueryManager;
 use yaak_models::util::UpdateSource;
-use yaak_plugins::manager::PluginManager;
-
-const MODEL_CHANGES_RETENTION_HOURS: i64 = 1;
-const MODEL_CHANGES_POLL_INTERVAL_MS: u64 = 1000;
-const MODEL_CHANGES_POLL_BATCH_SIZE: usize = 200;
-
-struct ModelChangeCursor {
-    created_at: String,
-    id: i64,
-}
-
-impl ModelChangeCursor {
-    fn from_launch_time() -> Self {
-        Self {
-            created_at: Utc::now().naive_utc().format("%Y-%m-%d %H:%M:%S%.3f").to_string(),
-            id: 0,
-        }
-    }
-}
-
-fn drain_model_changes_batch<R: Runtime>(
-    query_manager: &QueryManager,
-    app_handle: &tauri::AppHandle<R>,
-    cursor: &mut ModelChangeCursor,
-) -> bool {
-    let changes = match query_manager.connect().list_model_changes_since(
-        &cursor.created_at,
-        cursor.id,
-        MODEL_CHANGES_POLL_BATCH_SIZE,
-    ) {
-        Ok(changes) => changes,
-        Err(err) => {
-            error!("Failed to poll model_changes rows: {err:?}");
-            return false;
-        }
-    };
-
-    if changes.is_empty() {
-        return false;
-    }
-
-    let fetched_count = changes.len();
-    for change in changes {
-        cursor.created_at = change.created_at;
-        cursor.id = change.id;
-
-        // Local window-originated writes are forwarded immediately from the
-        // in-memory model event channel.
-        if matches!(change.payload.update_source, UpdateSource::Window { .. }) {
-            continue;
-        }
-        if let Err(err) = app_handle.emit("model_write", change.payload) {
-            error!("Failed to emit model_write event: {err:?}");
-        }
-    }
-
-    fetched_count == MODEL_CHANGES_POLL_BATCH_SIZE
-}
-
-async fn run_model_change_poller<R: Runtime>(
-    query_manager: QueryManager,
-    app_handle: tauri::AppHandle<R>,
-    mut cursor: ModelChangeCursor,
-) {
-    loop {
-        while drain_model_changes_batch(&query_manager, &app_handle, &mut cursor) {}
-        tokio::time::sleep(Duration::from_millis(MODEL_CHANGES_POLL_INTERVAL_MS)).await;
-    }
-}
-
 /// Extension trait for accessing the QueryManager from Tauri Manager types.
 pub trait QueryManagerExt<'a, R> {
@@ -256,32 +184,23 @@ pub(crate) fn models_upsert_graphql_introspection<R: Runtime>(
 }
 
 #[tauri::command]
-pub(crate) async fn models_workspace_models<R: Runtime>(
+pub(crate) fn models_workspace_models<R: Runtime>(
     window: WebviewWindow<R>,
     workspace_id: Option<&str>,
-    plugin_manager: State<'_, PluginManager>,
 ) -> Result<String> {
+    let db = window.db();
     let mut l: Vec<AnyModel> = Vec::new();
 
-    // Add the global models
-    {
-        let db = window.db();
-        l.push(db.get_settings().into());
-        l.append(&mut db.list_workspaces()?.into_iter().map(Into::into).collect());
-        l.append(&mut db.list_key_values()?.into_iter().map(Into::into).collect());
-    }
+    // Add the settings
+    l.push(db.get_settings().into());
 
-    let plugins = {
-        let db = window.db();
-        db.list_plugins()?
-    };
+    // Add global models
+    l.append(&mut db.list_workspaces()?.into_iter().map(Into::into).collect());
+    l.append(&mut db.list_key_values()?.into_iter().map(Into::into).collect());
+    l.append(&mut db.list_plugins()?.into_iter().map(Into::into).collect());
 
-    let plugins = plugin_manager.resolve_plugins_for_runtime_from_db(plugins).await;
-    l.append(&mut plugins.into_iter().map(Into::into).collect());
-
     // Add the workspace children
     if let Some(wid) = workspace_id {
-        let db = window.db();
         l.append(&mut db.list_cookie_jars(wid)?.into_iter().map(Into::into).collect());
         l.append(&mut db.list_environments_ensure_base(wid)?.into_iter().map(Into::into).collect());
         l.append(&mut db.list_folders(wid)?.into_iter().map(Into::into).collect());
@@ -343,37 +262,14 @@ pub fn init<R: Runtime>() -> TauriPlugin<R> {
             }
         };
 
-        let db = query_manager.connect();
-        if let Err(err) = db.prune_model_changes_older_than_hours(MODEL_CHANGES_RETENTION_HOURS)
-        {
-            error!("Failed to prune model_changes rows on startup: {err:?}");
-        }
-        // Only stream writes that happen after this app launch.
-        let cursor = ModelChangeCursor::from_launch_time();
-
-        let poll_query_manager = query_manager.clone();
-
         app_handle.manage(query_manager);
         app_handle.manage(blob_manager);
 
-        // Poll model_changes so all writers (including external CLI processes) update the UI.
-        let app_handle_poll = app_handle.clone();
-        let query_manager = poll_query_manager;
-        tauri::async_runtime::spawn(async move {
-            run_model_change_poller(query_manager, app_handle_poll, cursor).await;
-        });
-
-        // Fast path for local app writes initiated by frontend windows. This keeps the
-        // current sync-model UX snappy, while DB polling handles external writers (CLI).
-        let app_handle_local = app_handle.clone();
+        // Forward model change events to the frontend
+        let app_handle = app_handle.clone();
         tauri::async_runtime::spawn(async move {
             for payload in rx {
-                if !matches!(payload.update_source, UpdateSource::Window { .. }) {
-                    continue;
-                }
-                if let Err(err) = app_handle_local.emit("model_write", payload) {
-                    error!("Failed to emit local model_write event: {err:?}");
-                }
+                app_handle.emit("model_write", payload).unwrap();
             }
         });
 
@@ -8,9 +8,9 @@ use serde::{Deserialize, Serialize};
 use std::time::Instant;
 use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow};
 use ts_rs::TS;
-use yaak_api::{ApiClientKind, yaak_api_client};
 use yaak_common::platform::get_os_str;
 use yaak_models::util::UpdateSource;
+use yaak_tauri_utils::api_client::yaak_api_client;
 
 // Check for updates every hour
 const MAX_UPDATE_CHECK_SECONDS: u64 = 60 * 60;
@@ -101,8 +101,7 @@ impl YaakNotifier {
         let license_check = "disabled".to_string();
 
         let launch_info = get_or_upsert_launch_info(app_handle);
-        let app_version = app_handle.package_info().version.to_string();
-        let req = yaak_api_client(ApiClientKind::App, &app_version)?
+        let req = yaak_api_client(app_handle)?
             .request(Method::GET, "https://notify.yaak.app/notifications")
             .query(&[
                 ("version", &launch_info.current_version),
@@ -12,23 +12,21 @@ use chrono::Utc;
 use cookie::Cookie;
 use log::error;
 use std::sync::Arc;
-use tauri::{AppHandle, Emitter, Listener, Manager, Runtime};
+use tauri::{AppHandle, Emitter, Manager, Runtime};
 use tauri_plugin_clipboard_manager::ClipboardExt;
 use tauri_plugin_opener::OpenerExt;
-use yaak::plugin_events::{
-    GroupedPluginEvent, HostRequest, SharedPluginEventContext, handle_shared_plugin_event,
-};
 use yaak_crypto::manager::EncryptionManager;
-use yaak_models::models::{HttpResponse, Plugin};
+use yaak_models::models::{AnyModel, HttpResponse, Plugin};
 use yaak_models::queries::any_request::AnyRequest;
 use yaak_models::util::UpdateSource;
 use yaak_plugins::error::Error::PluginErr;
 use yaak_plugins::events::{
-    Color, EmptyPayload, ErrorResponse, GetCookieValueResponse, Icon, InternalEvent,
-    InternalEventPayload, ListCookieNamesResponse, ListOpenWorkspacesResponse,
-    RenderGrpcRequestResponse, RenderHttpRequestResponse, SendHttpRequestResponse,
-    ShowToastRequest, TemplateRenderResponse, WindowInfoResponse, WindowNavigateEvent,
-    WorkspaceInfo,
+    Color, DeleteKeyValueResponse, EmptyPayload, ErrorResponse, FindHttpResponsesResponse,
+    GetCookieValueResponse, GetHttpRequestByIdResponse, GetKeyValueResponse, Icon, InternalEvent,
+    InternalEventPayload, ListCookieNamesResponse, ListHttpRequestsResponse,
+    ListWorkspacesResponse, RenderGrpcRequestResponse, RenderHttpRequestResponse,
+    SendHttpRequestResponse, SetKeyValueResponse, ShowToastRequest, TemplateRenderResponse,
+    WindowInfoResponse, WindowNavigateEvent, WorkspaceInfo,
 };
 use yaak_plugins::manager::PluginManager;
 use yaak_plugins::plugin_handle::PluginHandle;
@@ -43,155 +41,124 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
 ) -> Result<Option<InternalEventPayload>> {
     // log::debug!("Got event to app {event:?}");
     let plugin_context = event.context.to_owned();
-    let plugin_name = plugin_handle.info().name;
-    let fallback_workspace_id = plugin_context.workspace_id.clone().or_else(|| {
-        plugin_context
-            .label
-            .as_ref()
-            .and_then(|label| app_handle.get_webview_window(label))
-            .and_then(|window| workspace_from_window(&window).map(|workspace| workspace.id))
-    });
-
-    match handle_shared_plugin_event(
-        app_handle.db_manager().inner(),
-        &event.payload,
-        SharedPluginEventContext {
-            plugin_name: &plugin_name,
-            workspace_id: fallback_workspace_id.as_deref(),
-        },
-    ) {
-        GroupedPluginEvent::Handled(payload) => Ok(payload),
-        GroupedPluginEvent::ToHandle(host_request) => {
-            handle_host_plugin_request(
-                app_handle,
-                event,
-                plugin_handle,
-                &plugin_context,
-                host_request,
-            )
-            .await
-        }
-    }
-}
-
-async fn handle_host_plugin_request<R: Runtime>(
-    app_handle: &AppHandle<R>,
-    event: &InternalEvent,
-    plugin_handle: &PluginHandle,
-    plugin_context: &yaak_plugins::events::PluginContext,
-    host_request: HostRequest<'_>,
-) -> Result<Option<InternalEventPayload>> {
-    match host_request {
-        HostRequest::ErrorResponse(resp) => {
-            error!("Plugin error: {}: {:?}", resp.error, resp);
-            let toast_event = plugin_handle.build_event_to_send(
-                plugin_context,
-                &InternalEventPayload::ShowToastRequest(ShowToastRequest {
-                    message: format!(
-                        "Plugin error from {}: {}",
-                        plugin_handle.info().name,
-                        resp.error
-                    ),
-                    color: Some(Color::Danger),
-                    timeout: Some(30000),
-                    ..Default::default()
-                }),
-                None,
-            );
-            Box::pin(handle_plugin_event(app_handle, &toast_event, plugin_handle)).await
-        }
-        HostRequest::ReloadResponse(req) => {
-            let plugins = app_handle.db().list_plugins()?;
-            for plugin in plugins {
-                if plugin.directory != plugin_handle.dir {
-                    continue;
-                }
-
-                let new_plugin = Plugin { updated_at: Utc::now().naive_utc(), ..plugin };
-                app_handle.db().upsert_plugin(&new_plugin, &UpdateSource::Plugin)?;
-            }
-
-            if !req.silent {
-                let info = plugin_handle.info();
-                let toast_event = plugin_handle.build_event_to_send(
-                    plugin_context,
-                    &InternalEventPayload::ShowToastRequest(ShowToastRequest {
-                        message: format!("Reloaded plugin {}@{}", info.name, info.version),
-                        icon: Some(Icon::Info),
-                        timeout: Some(5000),
-                        ..Default::default()
-                    }),
-                    None,
-                );
-                Box::pin(handle_plugin_event(app_handle, &toast_event, plugin_handle)).await
-            } else {
-                Ok(None)
-            }
-        }
-        HostRequest::CopyText(req) => {
+    match event.clone().payload {
+        InternalEventPayload::CopyTextRequest(req) => {
             app_handle.clipboard().write_text(req.text.as_str())?;
             Ok(Some(InternalEventPayload::CopyTextResponse(EmptyPayload {})))
         }
-        HostRequest::ShowToast(req) => {
-            match &plugin_context.label {
+        InternalEventPayload::ShowToastRequest(req) => {
+            match plugin_context.label {
                 Some(label) => app_handle.emit_to(label, "show_toast", req)?,
                 None => app_handle.emit("show_toast", req)?,
             };
             Ok(Some(InternalEventPayload::ShowToastResponse(EmptyPayload {})))
         }
-        HostRequest::PromptText(_) => {
-            let window = get_window_from_plugin_context(app_handle, plugin_context)?;
+        InternalEventPayload::PromptTextRequest(_) => {
+            let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
             Ok(call_frontend(&window, event).await)
         }
-        HostRequest::PromptForm(_) => {
-            let window = get_window_from_plugin_context(app_handle, plugin_context)?;
-            if event.reply_id.is_some() {
-                window.emit_to(window.label(), "plugin_event", event.clone())?;
-                Ok(None)
-            } else {
-                window.emit_to(window.label(), "plugin_event", event.clone()).unwrap();
-
-                let event_id = event.id.clone();
-                let plugin_handle = plugin_handle.clone();
-                let plugin_context = plugin_context.clone();
-                let window = window.clone();
-
-                tauri::async_runtime::spawn(async move {
-                    let (tx, mut rx) = tokio::sync::mpsc::channel::<InternalEvent>(128);
-
-                    let listener_id = window.listen(event_id, move |ev: tauri::Event| {
-                        let resp: InternalEvent = serde_json::from_str(ev.payload()).unwrap();
-                        let _ = tx.try_send(resp);
-                    });
-
-                    while let Some(resp) = rx.recv().await {
-                        let is_done = matches!(
-                            &resp.payload,
-                            InternalEventPayload::PromptFormResponse(r) if r.done.unwrap_or(false)
-                        );
-
-                        let event_to_send = plugin_handle.build_event_to_send(
-                            &plugin_context,
-                            &resp.payload,
-                            Some(resp.reply_id.unwrap_or_default()),
-                        );
-                        if let Err(e) = plugin_handle.send(&event_to_send).await {
-                            log::warn!("Failed to forward form response to plugin: {:?}", e);
-                        }
-
-                        if is_done {
-                            break;
-                        }
-                    }
-
-                    window.unlisten(listener_id);
-                });
-
-                Ok(None)
-            }
+        InternalEventPayload::PromptFormRequest(_) => {
+            let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
+            Ok(call_frontend(&window, event).await)
         }
-        HostRequest::RenderGrpcRequest(req) => {
-            let window = get_window_from_plugin_context(app_handle, plugin_context)?;
+        InternalEventPayload::FindHttpResponsesRequest(req) => {
+            let http_responses = app_handle
+                .db()
+                .list_http_responses_for_request(&req.request_id, req.limit.map(|l| l as u64))
+                .unwrap_or_default();
+            Ok(Some(InternalEventPayload::FindHttpResponsesResponse(FindHttpResponsesResponse {
+                http_responses,
+            })))
+        }
+        InternalEventPayload::ListHttpRequestsRequest(req) => {
+            let w = get_window_from_plugin_context(app_handle, &plugin_context)?;
+            let workspace = workspace_from_window(&w)
+                .ok_or(PluginErr("Failed to get workspace from window".into()))?;
+
+            let http_requests = if let Some(folder_id) = req.folder_id {
+                app_handle.db().list_http_requests_for_folder_recursive(&folder_id)?
+            } else {
+                app_handle.db().list_http_requests(&workspace.id)?
+            };
+
+            Ok(Some(InternalEventPayload::ListHttpRequestsResponse(ListHttpRequestsResponse {
+                http_requests,
+            })))
+        }
+        InternalEventPayload::ListFoldersRequest(_req) => {
+            let w = get_window_from_plugin_context(app_handle, &plugin_context)?;
+            let workspace = workspace_from_window(&w)
+                .ok_or(PluginErr("Failed to get workspace from window".into()))?;
+            let folders = app_handle.db().list_folders(&workspace.id)?;
+
+            Ok(Some(InternalEventPayload::ListFoldersResponse(
+                yaak_plugins::events::ListFoldersResponse { folders },
+            )))
+        }
+        InternalEventPayload::UpsertModelRequest(req) => {
|
||||||
|
use AnyModel::*;
|
||||||
|
let model = match &req.model {
|
||||||
|
HttpRequest(m) => {
|
||||||
|
HttpRequest(app_handle.db().upsert_http_request(m, &UpdateSource::Plugin)?)
|
||||||
|
}
|
||||||
|
GrpcRequest(m) => {
|
||||||
|
GrpcRequest(app_handle.db().upsert_grpc_request(m, &UpdateSource::Plugin)?)
|
||||||
|
}
|
||||||
|
WebsocketRequest(m) => WebsocketRequest(
|
||||||
|
app_handle.db().upsert_websocket_request(m, &UpdateSource::Plugin)?,
|
||||||
|
),
|
||||||
|
Folder(m) => Folder(app_handle.db().upsert_folder(m, &UpdateSource::Plugin)?),
|
||||||
|
Environment(m) => {
|
||||||
|
Environment(app_handle.db().upsert_environment(m, &UpdateSource::Plugin)?)
|
||||||
|
}
|
||||||
|
Workspace(m) => {
|
||||||
|
Workspace(app_handle.db().upsert_workspace(m, &UpdateSource::Plugin)?)
|
||||||
|
}
|
||||||
|
_ => {
|
||||||
|
return Err(PluginErr("Upsert not supported for this model type".into()).into());
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
Ok(Some(InternalEventPayload::UpsertModelResponse(
|
||||||
|
yaak_plugins::events::UpsertModelResponse { model },
|
||||||
|
)))
|
||||||
|
}
|
||||||
|
InternalEventPayload::DeleteModelRequest(req) => {
|
||||||
|
let model = match req.model.as_str() {
|
||||||
|
"http_request" => AnyModel::HttpRequest(
|
||||||
|
app_handle.db().delete_http_request_by_id(&req.id, &UpdateSource::Plugin)?,
|
||||||
|
),
|
||||||
|
"grpc_request" => AnyModel::GrpcRequest(
|
||||||
|
app_handle.db().delete_grpc_request_by_id(&req.id, &UpdateSource::Plugin)?,
|
||||||
|
),
|
||||||
|
"websocket_request" => AnyModel::WebsocketRequest(
|
||||||
|
app_handle
|
||||||
|
.db()
|
||||||
|
.delete_websocket_request_by_id(&req.id, &UpdateSource::Plugin)?,
|
||||||
|
),
|
||||||
|
"folder" => AnyModel::Folder(
|
||||||
|
app_handle.db().delete_folder_by_id(&req.id, &UpdateSource::Plugin)?,
|
||||||
|
),
|
||||||
|
"environment" => AnyModel::Environment(
|
||||||
|
app_handle.db().delete_environment_by_id(&req.id, &UpdateSource::Plugin)?,
|
||||||
|
),
|
||||||
|
_ => {
|
||||||
|
return Err(PluginErr("Delete not supported for this model type".into()).into());
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
Ok(Some(InternalEventPayload::DeleteModelResponse(
|
||||||
|
yaak_plugins::events::DeleteModelResponse { model },
|
||||||
|
)))
|
||||||
|
}
|
||||||
|
InternalEventPayload::GetHttpRequestByIdRequest(req) => {
|
||||||
|
let http_request = app_handle.db().get_http_request(&req.id).ok();
|
||||||
|
Ok(Some(InternalEventPayload::GetHttpRequestByIdResponse(GetHttpRequestByIdResponse {
|
||||||
|
http_request,
|
||||||
|
})))
|
||||||
|
}
|
||||||
|
InternalEventPayload::RenderGrpcRequestRequest(req) => {
|
||||||
|
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
|
||||||
|
|
||||||
let workspace =
|
let workspace =
|
||||||
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
||||||
@@ -206,8 +173,8 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
let cb = PluginTemplateCallback::new(
|
let cb = PluginTemplateCallback::new(
|
||||||
plugin_manager,
|
plugin_manager,
|
||||||
encryption_manager,
|
encryption_manager,
|
||||||
plugin_context,
|
&plugin_context,
|
||||||
req.purpose.clone(),
|
req.purpose,
|
||||||
);
|
);
|
||||||
let opt = RenderOptions { error_behavior: RenderErrorBehavior::Throw };
|
let opt = RenderOptions { error_behavior: RenderErrorBehavior::Throw };
|
||||||
let grpc_request =
|
let grpc_request =
|
||||||
@@ -216,8 +183,8 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
grpc_request,
|
grpc_request,
|
||||||
})))
|
})))
|
||||||
}
|
}
|
||||||
HostRequest::RenderHttpRequest(req) => {
|
InternalEventPayload::RenderHttpRequestRequest(req) => {
|
||||||
let window = get_window_from_plugin_context(app_handle, plugin_context)?;
|
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
|
||||||
|
|
||||||
let workspace =
|
let workspace =
|
||||||
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
||||||
@@ -232,18 +199,18 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
let cb = PluginTemplateCallback::new(
|
let cb = PluginTemplateCallback::new(
|
||||||
plugin_manager,
|
plugin_manager,
|
||||||
encryption_manager,
|
encryption_manager,
|
||||||
plugin_context,
|
&plugin_context,
|
||||||
req.purpose.clone(),
|
req.purpose,
|
||||||
);
|
);
|
||||||
let opt = &RenderOptions { error_behavior: RenderErrorBehavior::Throw };
|
let opt = &RenderOptions { error_behavior: RenderErrorBehavior::Throw };
|
||||||
let http_request =
|
let http_request =
|
||||||
render_http_request(&req.http_request, environment_chain, &cb, opt).await?;
|
render_http_request(&req.http_request, environment_chain, &cb, &opt).await?;
|
||||||
Ok(Some(InternalEventPayload::RenderHttpRequestResponse(RenderHttpRequestResponse {
|
Ok(Some(InternalEventPayload::RenderHttpRequestResponse(RenderHttpRequestResponse {
|
||||||
http_request,
|
http_request,
|
||||||
})))
|
})))
|
||||||
}
|
}
|
||||||
HostRequest::TemplateRender(req) => {
|
InternalEventPayload::TemplateRenderRequest(req) => {
|
||||||
let window = get_window_from_plugin_context(app_handle, plugin_context)?;
|
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
|
||||||
|
|
||||||
let workspace =
|
let workspace =
|
||||||
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
||||||
@@ -268,16 +235,65 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
let cb = PluginTemplateCallback::new(
|
let cb = PluginTemplateCallback::new(
|
||||||
plugin_manager,
|
plugin_manager,
|
||||||
encryption_manager,
|
encryption_manager,
|
||||||
plugin_context,
|
&plugin_context,
|
||||||
req.purpose.clone(),
|
req.purpose,
|
||||||
);
|
);
|
||||||
let opt = RenderOptions { error_behavior: RenderErrorBehavior::Throw };
|
let opt = RenderOptions { error_behavior: RenderErrorBehavior::Throw };
|
||||||
let data = render_json_value(req.data.clone(), environment_chain, &cb, &opt).await?;
|
let data = render_json_value(req.data, environment_chain, &cb, &opt).await?;
|
||||||
Ok(Some(InternalEventPayload::TemplateRenderResponse(TemplateRenderResponse { data })))
|
Ok(Some(InternalEventPayload::TemplateRenderResponse(TemplateRenderResponse { data })))
|
||||||
}
|
}
|
||||||
HostRequest::SendHttpRequest(req) => {
|
InternalEventPayload::ErrorResponse(resp) => {
|
||||||
let window = get_window_from_plugin_context(app_handle, plugin_context)?;
|
error!("Plugin error: {}: {:?}", resp.error, resp);
|
||||||
let mut http_request = req.http_request.clone();
|
let toast_event = plugin_handle.build_event_to_send(
|
||||||
|
&plugin_context,
|
||||||
|
&InternalEventPayload::ShowToastRequest(ShowToastRequest {
|
||||||
|
message: format!(
|
||||||
|
"Plugin error from {}: {}",
|
||||||
|
plugin_handle.info().name,
|
||||||
|
resp.error
|
||||||
|
),
|
||||||
|
color: Some(Color::Danger),
|
||||||
|
timeout: Some(30000),
|
||||||
|
..Default::default()
|
||||||
|
}),
|
||||||
|
None,
|
||||||
|
);
|
||||||
|
Box::pin(handle_plugin_event(app_handle, &toast_event, plugin_handle)).await
|
||||||
|
}
|
||||||
|
InternalEventPayload::ReloadResponse(req) => {
|
||||||
|
let plugins = app_handle.db().list_plugins()?;
|
||||||
|
for plugin in plugins {
|
||||||
|
if plugin.directory != plugin_handle.dir {
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
let new_plugin = Plugin {
|
||||||
|
updated_at: Utc::now().naive_utc(), // TODO: Add reloaded_at field to use instead
|
||||||
|
..plugin
|
||||||
|
};
|
||||||
|
app_handle.db().upsert_plugin(&new_plugin, &UpdateSource::Plugin)?;
|
||||||
|
}
|
||||||
|
|
||||||
|
if !req.silent {
|
||||||
|
let info = plugin_handle.info();
|
||||||
|
let toast_event = plugin_handle.build_event_to_send(
|
||||||
|
&plugin_context,
|
||||||
|
&InternalEventPayload::ShowToastRequest(ShowToastRequest {
|
||||||
|
message: format!("Reloaded plugin {}@{}", info.name, info.version),
|
||||||
|
icon: Some(Icon::Info),
|
||||||
|
timeout: Some(3000),
|
||||||
|
..Default::default()
|
||||||
|
}),
|
||||||
|
None,
|
||||||
|
);
|
||||||
|
Box::pin(handle_plugin_event(app_handle, &toast_event, plugin_handle)).await
|
||||||
|
} else {
|
||||||
|
Ok(None)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
InternalEventPayload::SendHttpRequestRequest(req) => {
|
||||||
|
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
|
||||||
|
let mut http_request = req.http_request;
|
||||||
let workspace =
|
let workspace =
|
||||||
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
||||||
let cookie_jar = cookie_jar_from_window(&window);
|
let cookie_jar = cookie_jar_from_window(&window);
|
||||||
@@ -297,7 +313,7 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
workspace_id: http_request.workspace_id.clone(),
|
workspace_id: http_request.workspace_id.clone(),
|
||||||
..Default::default()
|
..Default::default()
|
||||||
},
|
},
|
||||||
&UpdateSource::from_window_label(window.label()),
|
&UpdateSource::Plugin,
|
||||||
&blobs,
|
&blobs,
|
||||||
)?
|
)?
|
||||||
};
|
};
|
||||||
@@ -308,8 +324,8 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
&http_response,
|
&http_response,
|
||||||
environment,
|
environment,
|
||||||
cookie_jar,
|
cookie_jar,
|
||||||
&mut tokio::sync::watch::channel(false).1,
|
&mut tokio::sync::watch::channel(false).1, // No-op cancel channel
|
||||||
plugin_context,
|
&plugin_context,
|
||||||
)
|
)
|
||||||
.await?;
|
.await?;
|
||||||
|
|
||||||
@@ -317,7 +333,7 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
http_response,
|
http_response,
|
||||||
})))
|
})))
|
||||||
}
|
}
|
||||||
HostRequest::OpenWindow(req) => {
|
InternalEventPayload::OpenWindowRequest(req) => {
|
||||||
let (navigation_tx, mut navigation_rx) = tokio::sync::mpsc::channel(128);
|
let (navigation_tx, mut navigation_rx) = tokio::sync::mpsc::channel(128);
|
||||||
let (close_tx, mut close_rx) = tokio::sync::mpsc::channel(128);
|
let (close_tx, mut close_rx) = tokio::sync::mpsc::channel(128);
|
||||||
let win_config = CreateWindowConfig {
|
let win_config = CreateWindowConfig {
|
||||||
@@ -332,7 +348,7 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
};
|
};
|
||||||
if let Err(e) = create_window(app_handle, win_config) {
|
if let Err(e) = create_window(app_handle, win_config) {
|
||||||
let error_event = plugin_handle.build_event_to_send(
|
let error_event = plugin_handle.build_event_to_send(
|
||||||
plugin_context,
|
&plugin_context,
|
||||||
&InternalEventPayload::ErrorResponse(ErrorResponse {
|
&InternalEventPayload::ErrorResponse(ErrorResponse {
|
||||||
error: format!("Failed to create window: {:?}", e),
|
error: format!("Failed to create window: {:?}", e),
|
||||||
}),
|
}),
|
||||||
@@ -350,7 +366,7 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
while let Some(url) = navigation_rx.recv().await {
|
while let Some(url) = navigation_rx.recv().await {
|
||||||
let url = url.to_string();
|
let url = url.to_string();
|
||||||
let event_to_send = plugin_handle.build_event_to_send(
|
let event_to_send = plugin_handle.build_event_to_send(
|
||||||
&plugin_context,
|
&plugin_context, // NOTE: Sending existing context on purpose here
|
||||||
&InternalEventPayload::WindowNavigateEvent(WindowNavigateEvent { url }),
|
&InternalEventPayload::WindowNavigateEvent(WindowNavigateEvent { url }),
|
||||||
Some(event_id.clone()),
|
Some(event_id.clone()),
|
||||||
);
|
);
|
||||||
@@ -364,7 +380,7 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
let plugin_handle = plugin_handle.clone();
|
let plugin_handle = plugin_handle.clone();
|
||||||
let plugin_context = plugin_context.clone();
|
let plugin_context = plugin_context.clone();
|
||||||
tauri::async_runtime::spawn(async move {
|
tauri::async_runtime::spawn(async move {
|
||||||
while close_rx.recv().await.is_some() {
|
while let Some(_) = close_rx.recv().await {
|
||||||
let event_to_send = plugin_handle.build_event_to_send(
|
let event_to_send = plugin_handle.build_event_to_send(
|
||||||
&plugin_context,
|
&plugin_context,
|
||||||
&InternalEventPayload::WindowCloseEvent,
|
&InternalEventPayload::WindowCloseEvent,
|
||||||
@@ -377,33 +393,35 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
|
|
||||||
Ok(None)
|
Ok(None)
|
||||||
}
|
}
|
||||||
HostRequest::CloseWindow(req) => {
|
InternalEventPayload::CloseWindowRequest(req) => {
|
||||||
if let Some(window) = app_handle.webview_windows().get(&req.label) {
|
if let Some(window) = app_handle.webview_windows().get(&req.label) {
|
||||||
window.close()?;
|
window.close()?;
|
||||||
}
|
}
|
||||||
Ok(None)
|
Ok(None)
|
||||||
}
|
}
|
||||||
HostRequest::OpenExternalUrl(req) => {
|
InternalEventPayload::OpenExternalUrlRequest(req) => {
|
||||||
app_handle.opener().open_url(&req.url, None::<&str>)?;
|
app_handle.opener().open_url(&req.url, None::<&str>)?;
|
||||||
Ok(Some(InternalEventPayload::OpenExternalUrlResponse(EmptyPayload {})))
|
Ok(Some(InternalEventPayload::OpenExternalUrlResponse(EmptyPayload {})))
|
||||||
}
|
}
|
||||||
HostRequest::ListOpenWorkspaces(_) => {
|
InternalEventPayload::SetKeyValueRequest(req) => {
|
||||||
let mut workspaces = Vec::new();
|
let name = plugin_handle.info().name;
|
||||||
for (_, window) in app_handle.webview_windows() {
|
app_handle.db().set_plugin_key_value(&name, &req.key, &req.value);
|
||||||
if let Some(workspace) = workspace_from_window(&window) {
|
Ok(Some(InternalEventPayload::SetKeyValueResponse(SetKeyValueResponse {})))
|
||||||
workspaces.push(WorkspaceInfo {
|
}
|
||||||
id: workspace.id.clone(),
|
InternalEventPayload::GetKeyValueRequest(req) => {
|
||||||
name: workspace.name.clone(),
|
let name = plugin_handle.info().name;
|
||||||
label: window.label().to_string(),
|
let value = app_handle.db().get_plugin_key_value(&name, &req.key).map(|v| v.value);
|
||||||
});
|
Ok(Some(InternalEventPayload::GetKeyValueResponse(GetKeyValueResponse { value })))
|
||||||
}
|
}
|
||||||
}
|
InternalEventPayload::DeleteKeyValueRequest(req) => {
|
||||||
Ok(Some(InternalEventPayload::ListOpenWorkspacesResponse(ListOpenWorkspacesResponse {
|
let name = plugin_handle.info().name;
|
||||||
workspaces,
|
let deleted = app_handle.db().delete_plugin_key_value(&name, &req.key)?;
|
||||||
|
Ok(Some(InternalEventPayload::DeleteKeyValueResponse(DeleteKeyValueResponse {
|
||||||
|
deleted,
|
||||||
})))
|
})))
|
||||||
}
|
}
|
||||||
HostRequest::ListCookieNames(_) => {
|
InternalEventPayload::ListCookieNamesRequest(_req) => {
|
||||||
let window = get_window_from_plugin_context(app_handle, plugin_context)?;
|
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
|
||||||
let names = match cookie_jar_from_window(&window) {
|
let names = match cookie_jar_from_window(&window) {
|
||||||
None => Vec::new(),
|
None => Vec::new(),
|
||||||
Some(j) => j
|
Some(j) => j
|
||||||
@@ -416,8 +434,8 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
names,
|
names,
|
||||||
})))
|
})))
|
||||||
}
|
}
|
||||||
HostRequest::GetCookieValue(req) => {
|
InternalEventPayload::GetCookieValueRequest(req) => {
|
||||||
let window = get_window_from_plugin_context(app_handle, plugin_context)?;
|
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
|
||||||
let value = match cookie_jar_from_window(&window) {
|
let value = match cookie_jar_from_window(&window) {
|
||||||
None => None,
|
None => None,
|
||||||
Some(j) => j.cookies.into_iter().find_map(|c| match Cookie::parse(c.raw_cookie) {
|
Some(j) => j.cookies.into_iter().find_map(|c| match Cookie::parse(c.raw_cookie) {
|
||||||
@@ -429,11 +447,12 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
};
|
};
|
||||||
Ok(Some(InternalEventPayload::GetCookieValueResponse(GetCookieValueResponse { value })))
|
Ok(Some(InternalEventPayload::GetCookieValueResponse(GetCookieValueResponse { value })))
|
||||||
}
|
}
|
||||||
HostRequest::WindowInfo(req) => {
|
InternalEventPayload::WindowInfoRequest(req) => {
|
||||||
let w = app_handle
|
let w = app_handle
|
||||||
.get_webview_window(&req.label)
|
.get_webview_window(&req.label)
|
||||||
.ok_or(PluginErr(format!("Failed to find window for {}", req.label)))?;
|
.ok_or(PluginErr(format!("Failed to find window for {}", req.label)))?;
|
||||||
|
|
||||||
|
// Actually look up the data so we never return an invalid ID
|
||||||
let environment_id = environment_from_window(&w).map(|m| m.id);
|
let environment_id = environment_from_window(&w).map(|m| m.id);
|
||||||
let workspace_id = workspace_from_window(&w).map(|m| m.id);
|
let workspace_id = workspace_from_window(&w).map(|m| m.id);
|
||||||
let request_id =
|
let request_id =
|
||||||
@@ -451,13 +470,25 @@ async fn handle_host_plugin_request<R: Runtime>(
|
|||||||
environment_id,
|
environment_id,
|
||||||
})))
|
})))
|
||||||
}
|
}
|
||||||
HostRequest::OtherRequest(req) => {
|
|
||||||
Ok(Some(InternalEventPayload::ErrorResponse(ErrorResponse {
|
InternalEventPayload::ListWorkspacesRequest(_) => {
|
||||||
error: format!(
|
let mut workspaces = Vec::new();
|
||||||
"Unsupported plugin request in app host handler: {}",
|
|
||||||
req.type_name()
|
for (_, window) in app_handle.webview_windows() {
|
||||||
),
|
if let Some(workspace) = workspace_from_window(&window) {
|
||||||
|
workspaces.push(WorkspaceInfo {
|
||||||
|
id: workspace.id.clone(),
|
||||||
|
name: workspace.name.clone(),
|
||||||
|
label: window.label().to_string(),
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
Ok(Some(InternalEventPayload::ListWorkspacesResponse(ListWorkspacesResponse {
|
||||||
|
workspaces,
|
||||||
})))
|
})))
|
||||||
}
|
}
|
||||||
|
|
||||||
|
_ => Ok(None),
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -21,17 +21,17 @@ use tauri::{
 };
 use tokio::sync::Mutex;
 use ts_rs::TS;
-use yaak_api::{ApiClientKind, yaak_api_client};
-use yaak_models::models::{Plugin, PluginSource};
+use yaak_models::models::Plugin;
 use yaak_models::util::UpdateSource;
 use yaak_plugins::api::{
     PluginNameVersion, PluginSearchResponse, PluginUpdatesResponse, check_plugin_updates,
     search_plugins,
 };
-use yaak_plugins::events::PluginContext;
+use yaak_plugins::events::{Color, Icon, PluginContext, ShowToastRequest};
 use yaak_plugins::install::{delete_and_uninstall, download_and_install};
 use yaak_plugins::manager::PluginManager;
 use yaak_plugins::plugin_meta::get_plugin_meta;
+use yaak_tauri_utils::api_client::yaak_api_client;

 static EXITING: AtomicBool = AtomicBool::new(false);

@@ -72,8 +72,7 @@ impl PluginUpdater {

        info!("Checking for plugin updates");

-        let app_version = window.app_handle().package_info().version.to_string();
-        let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+        let http_client = yaak_api_client(window.app_handle())?;
        let plugins = window.app_handle().db().list_plugins()?;
        let updates = check_plugin_updates(&http_client, plugins.clone()).await?;

@@ -137,8 +136,7 @@ pub async fn cmd_plugins_search<R: Runtime>(
    app_handle: AppHandle<R>,
    query: &str,
) -> Result<PluginSearchResponse> {
-    let app_version = app_handle.package_info().version.to_string();
-    let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let http_client = yaak_api_client(&app_handle)?;
    Ok(search_plugins(&http_client, query).await?)
}

@@ -149,8 +147,7 @@ pub async fn cmd_plugins_install<R: Runtime>(
    version: Option<String>,
) -> Result<()> {
    let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
-    let app_version = window.app_handle().package_info().version.to_string();
-    let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let http_client = yaak_api_client(window.app_handle())?;
    let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
    let plugin_context = window.plugin_context();
    download_and_install(
@@ -165,28 +162,6 @@ pub async fn cmd_plugins_install<R: Runtime>(
    Ok(())
}

-#[command]
-pub async fn cmd_plugins_install_from_directory<R: Runtime>(
-    window: WebviewWindow<R>,
-    directory: &str,
-) -> Result<Plugin> {
-    let plugin = window.db().upsert_plugin(
-        &Plugin {
-            directory: directory.into(),
-            url: None,
-            enabled: true,
-            source: PluginSource::Filesystem,
-            ..Default::default()
-        },
-        &UpdateSource::from_window_label(window.label()),
-    )?;
-
-    let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
-    plugin_manager.add_plugin(&window.plugin_context(), &plugin).await?;
-
-    Ok(plugin)
-}
-
 #[command]
 pub async fn cmd_plugins_uninstall<R: Runtime>(
    plugin_id: &str,
@@ -198,19 +173,11 @@ pub async fn cmd_plugins_uninstall<R: Runtime>(
    Ok(delete_and_uninstall(plugin_manager, &query_manager, &plugin_context, plugin_id).await?)
}

-#[command]
-pub async fn cmd_plugin_init_errors(
-    plugin_manager: State<'_, PluginManager>,
-) -> Result<Vec<(String, String)>> {
-    Ok(plugin_manager.take_init_errors().await)
-}
-
 #[command]
 pub async fn cmd_plugins_updates<R: Runtime>(
    app_handle: AppHandle<R>,
) -> Result<PluginUpdatesResponse> {
-    let app_version = app_handle.package_info().version.to_string();
-    let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let http_client = yaak_api_client(&app_handle)?;
    let plugins = app_handle.db().list_plugins()?;
    Ok(check_plugin_updates(&http_client, plugins).await?)
}
@@ -219,8 +186,7 @@ pub async fn cmd_plugins_updates<R: Runtime>(
 pub async fn cmd_plugins_update_all<R: Runtime>(
    window: WebviewWindow<R>,
) -> Result<Vec<PluginNameVersion>> {
-    let app_version = window.app_handle().package_info().version.to_string();
-    let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let http_client = yaak_api_client(window.app_handle())?;
    let plugins = window.db().list_plugins()?;

    // Get list of available updates (already filtered to only registry plugins)
@@ -297,8 +263,6 @@ pub fn init<R: Runtime>() -> TauriPlugin<R> {
            .join("index.cjs");

            let dev_mode = is_dev();
-            let query_manager =
-                app_handle.state::<yaak_models::query_manager::QueryManager>().inner().clone();

            // Create plugin manager asynchronously
            let app_handle_clone = app_handle.clone();
@@ -308,12 +272,53 @@ pub fn init<R: Runtime>() -> TauriPlugin<R> {
                installed_plugin_dir,
                node_bin_path,
                plugin_runtime_main,
-                &query_manager,
-                &PluginContext::new_empty(),
                dev_mode,
            )
-            .await
-            .expect("Failed to start plugin runtime");
+            .await;
+
+            // Initialize all plugins after manager is created
+            let bundled_dirs = manager
+                .list_bundled_plugin_dirs()
+                .await
+                .expect("Failed to list bundled plugins");
+
+            // Ensure all bundled plugins make it into the database
+            let db = app_handle_clone.db();
+            for dir in &bundled_dirs {
+                if db.get_plugin_by_directory(dir).is_none() {
+                    db.upsert_plugin(
+                        &Plugin {
+                            directory: dir.clone(),
+                            enabled: true,
+                            url: None,
+                            ..Default::default()
+                        },
+                        &UpdateSource::Background,
+                    )
+                    .expect("Failed to upsert bundled plugin");
+                }
+            }
+
+            // Get all plugins from database and initialize
+            let plugins = db.list_plugins().expect("Failed to list plugins from database");
+            drop(db); // Explicitly drop the connection before await
+
+            let errors =
+                manager.initialize_all_plugins(plugins, &PluginContext::new_empty()).await;
+
+            // Show toast for any failed plugins
+            for (plugin_dir, error_msg) in errors {
+                let plugin_name = plugin_dir.split('/').last().unwrap_or(&plugin_dir);
+                let toast = ShowToastRequest {
+                    message: format!("Failed to start plugin '{}': {}", plugin_name, error_msg),
+                    color: Some(Color::Danger),
+                    icon: Some(Icon::AlertTriangle),
+                    timeout: Some(10000),
+                };
+                if let Err(emit_err) = app_handle_clone.emit("show_toast", toast) {
+                    error!("Failed to emit toast for plugin error: {emit_err:?}");
+                }
+            }
+
            app_handle_clone.manage(manager);
        });
@@ -1,6 +1,10 @@
|
|||||||
|
use log::info;
|
||||||
use serde_json::Value;
|
use serde_json::Value;
|
||||||
pub use yaak::render::{render_grpc_request, render_http_request};
|
use std::collections::BTreeMap;
|
||||||
use yaak_models::models::Environment;
|
use yaak_http::path_placeholders::apply_path_placeholders;
|
||||||
|
use yaak_models::models::{
|
||||||
|
Environment, GrpcRequest, HttpRequest, HttpRequestHeader, HttpUrlParameter,
|
||||||
|
};
|
||||||
use yaak_models::render::make_vars_hashmap;
|
use yaak_models::render::make_vars_hashmap;
|
||||||
use yaak_templates::{RenderOptions, TemplateCallback, parse_and_render, render_json_value_raw};
|
use yaak_templates::{RenderOptions, TemplateCallback, parse_and_render, render_json_value_raw};
|
||||||
|
|
||||||
@@ -23,3 +27,137 @@ pub async fn render_json_value<T: TemplateCallback>(
     let vars = &make_vars_hashmap(environment_chain);
     render_json_value_raw(value, vars, cb, opt).await
 }
+
+pub async fn render_grpc_request<T: TemplateCallback>(
+    r: &GrpcRequest,
+    environment_chain: Vec<Environment>,
+    cb: &T,
+    opt: &RenderOptions,
+) -> yaak_templates::error::Result<GrpcRequest> {
+    let vars = &make_vars_hashmap(environment_chain);
+
+    let mut metadata = Vec::new();
+    for p in r.metadata.clone() {
+        metadata.push(HttpRequestHeader {
+            enabled: p.enabled,
+            name: parse_and_render(p.name.as_str(), vars, cb, &opt).await?,
+            value: parse_and_render(p.value.as_str(), vars, cb, &opt).await?,
+            id: p.id,
+        })
+    }
+
+    let authentication = {
+        let mut disabled = false;
+        let mut auth = BTreeMap::new();
+        match r.authentication.get("disabled") {
+            Some(Value::Bool(true)) => {
+                disabled = true;
+            }
+            Some(Value::String(tmpl)) => {
+                disabled = parse_and_render(tmpl.as_str(), vars, cb, &opt)
+                    .await
+                    .unwrap_or_default()
+                    .is_empty();
+                info!(
+                    "Rendering authentication.disabled as a template: {disabled} from \"{tmpl}\""
+                );
+            }
+            _ => {}
+        }
+        if disabled {
+            auth.insert("disabled".to_string(), Value::Bool(true));
+        } else {
+            for (k, v) in r.authentication.clone() {
+                if k == "disabled" {
+                    auth.insert(k, Value::Bool(false));
+                } else {
+                    auth.insert(k, render_json_value_raw(v, vars, cb, &opt).await?);
+                }
+            }
+        }
+        auth
+    };
+
+    let url = parse_and_render(r.url.as_str(), vars, cb, &opt).await?;
+
+    Ok(GrpcRequest { url, metadata, authentication, ..r.to_owned() })
+}
+
+pub async fn render_http_request<T: TemplateCallback>(
+    r: &HttpRequest,
+    environment_chain: Vec<Environment>,
+    cb: &T,
+    opt: &RenderOptions,
+) -> yaak_templates::error::Result<HttpRequest> {
+    let vars = &make_vars_hashmap(environment_chain);
+
+    let mut url_parameters = Vec::new();
+    for p in r.url_parameters.clone() {
+        if !p.enabled {
+            continue;
+        }
+        url_parameters.push(HttpUrlParameter {
+            enabled: p.enabled,
+            name: parse_and_render(p.name.as_str(), vars, cb, &opt).await?,
+            value: parse_and_render(p.value.as_str(), vars, cb, &opt).await?,
+            id: p.id,
+        })
+    }
+
+    let mut headers = Vec::new();
+    for p in r.headers.clone() {
+        if !p.enabled {
+            continue;
+        }
+        headers.push(HttpRequestHeader {
+            enabled: p.enabled,
+            name: parse_and_render(p.name.as_str(), vars, cb, &opt).await?,
+            value: parse_and_render(p.value.as_str(), vars, cb, &opt).await?,
+            id: p.id,
+        })
+    }
+
+    let mut body = BTreeMap::new();
+    for (k, v) in r.body.clone() {
+        body.insert(k, render_json_value_raw(v, vars, cb, &opt).await?);
+    }
+
+    let authentication = {
+        let mut disabled = false;
+        let mut auth = BTreeMap::new();
+        match r.authentication.get("disabled") {
+            Some(Value::Bool(true)) => {
+                disabled = true;
+            }
+            Some(Value::String(tmpl)) => {
+                disabled = parse_and_render(tmpl.as_str(), vars, cb, &opt)
+                    .await
+                    .unwrap_or_default()
+                    .is_empty();
+                info!(
+                    "Rendering authentication.disabled as a template: {disabled} from \"{tmpl}\""
+                );
+            }
+            _ => {}
+        }
+        if disabled {
+            auth.insert("disabled".to_string(), Value::Bool(true));
+        } else {
+            for (k, v) in r.authentication.clone() {
+                if k == "disabled" {
+                    auth.insert(k, Value::Bool(false));
+                } else {
+                    auth.insert(k, render_json_value_raw(v, vars, cb, &opt).await?);
+                }
+            }
+        }
+        auth
+    };
+
+    let url = parse_and_render(r.url.clone().as_str(), vars, cb, &opt).await?;
+
+    // This doesn't fit perfectly with the concept of "rendering" but it kind of does
+    let (url, url_parameters) = apply_path_placeholders(&url, &url_parameters);
+
+    Ok(HttpRequest { url, url_parameters, headers, body, authentication, ..r.to_owned() })
+}
@@ -15,9 +15,6 @@ use ts_rs::TS;
 use yaak_models::util::generate_id;
 use yaak_plugins::manager::PluginManager;

-use url::Url;
-use yaak_api::get_system_proxy_url;
-
 use crate::error::Error::GenericError;
 use crate::is_dev;

@@ -90,13 +87,8 @@ impl YaakUpdater {
         info!("Checking for updates mode={} autodl={}", mode, auto_download);

         let w = window.clone();
-        let mut updater_builder = w.updater_builder();
-        if let Some(proxy_url) = get_system_proxy_url() {
-            if let Ok(url) = Url::parse(&proxy_url) {
-                updater_builder = updater_builder.proxy(url);
-            }
-        }
-        let update_check_result = updater_builder
+        let update_check_result = w
+            .updater_builder()
             .on_before_exit(move || {
                 // Kill plugin manager before exit or NSIS installer will fail to replace sidecar
                 // while it's running.
@@ -119,7 +111,6 @@ impl YaakUpdater {
                     UpdateTrigger::User => "user",
                 },
             )?
-            .header("X-Install-Mode", detect_install_mode().unwrap_or("unknown"))?
             .build()?
             .check()
             .await;
@@ -362,22 +353,6 @@ pub async fn download_update_idempotent<R: Runtime>(
     Ok(dl_path)
 }

-/// Detect the installer type so the update server can serve the correct artifact.
-fn detect_install_mode() -> Option<&'static str> {
-    #[cfg(target_os = "windows")]
-    {
-        if let Ok(exe) = std::env::current_exe() {
-            let path = exe.to_string_lossy().to_lowercase();
-            if path.starts_with(r"c:\program files") {
-                return Some("nsis-machine");
-            }
-        }
-        return Some("nsis");
-    }
-    #[allow(unreachable_code)]
-    None
-}
-
 pub async fn install_update_maybe_download<R: Runtime>(
     window: &WebviewWindow<R>,
     update: &Update,
@@ -8,11 +8,11 @@ use std::fs;
 use std::sync::Arc;
 use tauri::{AppHandle, Emitter, Manager, Runtime, Url};
 use tauri_plugin_dialog::{DialogExt, MessageDialogButtons, MessageDialogKind};
-use yaak_api::{ApiClientKind, yaak_api_client};
 use yaak_models::util::generate_id;
 use yaak_plugins::events::{Color, ShowToastRequest};
 use yaak_plugins::install::download_and_install;
 use yaak_plugins::manager::PluginManager;
+use yaak_tauri_utils::api_client::yaak_api_client;

 pub(crate) async fn handle_deep_link<R: Runtime>(
     app_handle: &AppHandle<R>,
@@ -46,8 +46,7 @@ pub(crate) async fn handle_deep_link<R: Runtime>(

             let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
             let query_manager = app_handle.db_manager();
-            let app_version = app_handle.package_info().version.to_string();
-            let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+            let http_client = yaak_api_client(app_handle)?;
             let plugin_context = window.plugin_context();
             let pv = download_and_install(
                 plugin_manager,
@@ -87,9 +86,7 @@ pub(crate) async fn handle_deep_link<R: Runtime>(
                 return Ok(());
             }

-            let app_version = app_handle.package_info().version.to_string();
-            let resp =
-                yaak_api_client(ApiClientKind::App, &app_version)?.get(file_url).send().await?;
+            let resp = yaak_api_client(app_handle)?.get(file_url).send().await?;
             let json = resp.bytes().await?;
             let p = app_handle
                 .path()
@@ -162,16 +162,11 @@ pub(crate) fn create_window<R: Runtime>(
                 "dev.reset_size" => webview_window
                     .set_size(LogicalSize::new(DEFAULT_WINDOW_WIDTH, DEFAULT_WINDOW_HEIGHT))
                     .unwrap(),
-                "dev.reset_size_16x9" => {
+                "dev.reset_size_record" => {
                     let width = webview_window.outer_size().unwrap().width;
                     let height = width * 9 / 16;
                     webview_window.set_size(PhysicalSize::new(width, height)).unwrap()
                 }
-                "dev.reset_size_16x10" => {
-                    let width = webview_window.outer_size().unwrap().width;
-                    let height = width * 10 / 16;
-                    webview_window.set_size(PhysicalSize::new(width, height)).unwrap()
-                }
                 "dev.refresh" => webview_window.eval("location.reload()").unwrap(),
                 "dev.generate_theme_css" => {
                     w.emit("generate_theme_css", true).unwrap();
@@ -153,11 +153,9 @@ pub fn app_menu<R: Runtime>(app_handle: &AppHandle<R>) -> tauri::Result<Menu<R>>
                 .build(app_handle)?,
             &MenuItemBuilder::with_id("dev.reset_size".to_string(), "Reset Size")
                 .build(app_handle)?,
-            &MenuItemBuilder::with_id("dev.reset_size_16x9".to_string(), "Resize to 16x9")
-                .build(app_handle)?,
             &MenuItemBuilder::with_id(
-                "dev.reset_size_16x10".to_string(),
-                "Resize to 16x10",
+                "dev.reset_size_record".to_string(),
+                "Reset Size 16x9",
             )
             .build(app_handle)?,
             &MenuItemBuilder::with_id(
@@ -24,11 +24,56 @@ use yaak_models::util::UpdateSource;
 use yaak_plugins::events::{CallHttpAuthenticationRequest, HttpHeader, RenderPurpose};
 use yaak_plugins::manager::PluginManager;
 use yaak_plugins::template_callback::PluginTemplateCallback;
-use yaak_templates::strip_json_comments::maybe_strip_json_comments;
 use yaak_templates::{RenderErrorBehavior, RenderOptions};
 use yaak_tls::find_client_certificate;
 use yaak_ws::{WebsocketManager, render_websocket_request};

+#[command]
+pub async fn cmd_ws_upsert_request<R: Runtime>(
+    request: WebsocketRequest,
+    app_handle: AppHandle<R>,
+    window: WebviewWindow<R>,
+) -> Result<WebsocketRequest> {
+    Ok(app_handle
+        .db()
+        .upsert_websocket_request(&request, &UpdateSource::from_window_label(window.label()))?)
+}
+
+#[command]
+pub async fn cmd_ws_duplicate_request<R: Runtime>(
+    request_id: &str,
+    app_handle: AppHandle<R>,
+    window: WebviewWindow<R>,
+) -> Result<WebsocketRequest> {
+    let db = app_handle.db();
+    let request = db.get_websocket_request(request_id)?;
+    Ok(db.duplicate_websocket_request(&request, &UpdateSource::from_window_label(window.label()))?)
+}
+
+#[command]
+pub async fn cmd_ws_delete_request<R: Runtime>(
+    request_id: &str,
+    app_handle: AppHandle<R>,
+    window: WebviewWindow<R>,
+) -> Result<WebsocketRequest> {
+    Ok(app_handle.db().delete_websocket_request_by_id(
+        request_id,
+        &UpdateSource::from_window_label(window.label()),
+    )?)
+}
+
+#[command]
+pub async fn cmd_ws_delete_connection<R: Runtime>(
+    connection_id: &str,
+    app_handle: AppHandle<R>,
+    window: WebviewWindow<R>,
+) -> Result<WebsocketConnection> {
+    Ok(app_handle.db().delete_websocket_connection_by_id(
+        connection_id,
+        &UpdateSource::from_window_label(window.label()),
+    )?)
+}
+
 #[command]
 pub async fn cmd_ws_delete_connections<R: Runtime>(
     request_id: &str,
@@ -41,6 +86,30 @@ pub async fn cmd_ws_delete_connections<R: Runtime>(
     )?)
 }

+#[command]
+pub async fn cmd_ws_list_events<R: Runtime>(
+    connection_id: &str,
+    app_handle: AppHandle<R>,
+) -> Result<Vec<WebsocketEvent>> {
+    Ok(app_handle.db().list_websocket_events(connection_id)?)
+}
+
+#[command]
+pub async fn cmd_ws_list_requests<R: Runtime>(
+    workspace_id: &str,
+    app_handle: AppHandle<R>,
+) -> Result<Vec<WebsocketRequest>> {
+    Ok(app_handle.db().list_websocket_requests(workspace_id)?)
+}
+
+#[command]
+pub async fn cmd_ws_list_connections<R: Runtime>(
+    workspace_id: &str,
+    app_handle: AppHandle<R>,
+) -> Result<Vec<WebsocketConnection>> {
+    Ok(app_handle.db().list_websocket_connections(workspace_id)?)
+}
+
 #[command]
 pub async fn cmd_ws_send<R: Runtime>(
     connection_id: &str,
@@ -73,10 +142,8 @@ pub async fn cmd_ws_send<R: Runtime>(
     )
     .await?;

-    let message = maybe_strip_json_comments(&request.message);
-
     let mut ws_manager = ws_manager.lock().await;
-    ws_manager.send(&connection.id, Message::Text(message.clone().into())).await?;
+    ws_manager.send(&connection.id, Message::Text(request.message.clone().into())).await?;

     app_handle.db().upsert_websocket_event(
         &WebsocketEvent {
@@ -85,7 +152,7 @@ pub async fn cmd_ws_send<R: Runtime>(
             workspace_id: connection.workspace_id.clone(),
             is_server: false,
             message_type: WebsocketEventType::Text,
-            message: message.into(),
+            message: request.message.into(),
             ..Default::default()
         },
         &UpdateSource::from_window_label(window.label()),
@@ -14,7 +14,10 @@
     "assetProtocol": {
       "enable": true,
       "scope": {
-        "allow": ["$APPDATA/responses/*", "$RESOURCE/static/*"]
+        "allow": [
+          "$APPDATA/responses/*",
+          "$RESOURCE/static/*"
+        ]
       }
     }
   }
@@ -22,7 +25,9 @@
   "plugins": {
     "deep-link": {
       "desktop": {
-        "schemes": ["yaak"]
+        "schemes": [
+          "yaak"
+        ]
       }
     }
   },
@@ -1,6 +1,9 @@
 {
   "build": {
-    "features": ["updater", "license"]
+    "features": [
+      "updater",
+      "license"
+    ]
   },
   "app": {
     "security": {
@@ -8,15 +11,21 @@
         "default",
         {
           "identifier": "release",
-          "windows": ["*"],
-          "permissions": ["yaak-license:default"]
+          "windows": [
+            "*"
+          ],
+          "permissions": [
+            "yaak-license:default"
+          ]
         }
       ]
     }
   },
   "plugins": {
     "updater": {
-      "endpoints": ["https://update.yaak.app/check/{{target}}/{{arch}}/{{current_version}}"],
+      "endpoints": [
+        "https://update.yaak.app/check/{{target}}/{{arch}}/{{current_version}}"
+      ],
       "pubkey": "dW50cnVzdGVkIGNvbW1lbnQ6IG1pbmlzaWduIHB1YmxpYyBrZXk6IEVGRkFGMjQxRUNEOTQ3MzAKUldRd1I5bnNRZkw2NzRtMnRlWTN3R24xYUR3aGRsUjJzWGwvdHdEcGljb3ZJMUNlMjFsaHlqVU4K"
     }
   },
@@ -30,7 +39,14 @@
     "createUpdaterArtifacts": true,
     "longDescription": "A cross-platform desktop app for interacting with REST, GraphQL, and gRPC",
     "shortDescription": "Play with APIs, intuitively",
-    "targets": ["app", "appimage", "deb", "dmg", "nsis", "rpm"],
+    "targets": [
+      "app",
+      "appimage",
+      "deb",
+      "dmg",
+      "nsis",
+      "rpm"
+    ],
     "macOS": {
       "minimumSystemVersion": "13.0",
       "exceptionDomain": "",
@@ -42,16 +58,10 @@
     },
     "linux": {
       "deb": {
-        "desktopTemplate": "./template.desktop",
-        "files": {
-          "/usr/share/metainfo/app.yaak.Yaak.metainfo.xml": "../../flatpak/app.yaak.Yaak.metainfo.xml"
-        }
+        "desktopTemplate": "./template.desktop"
       },
       "rpm": {
-        "desktopTemplate": "./template.desktop",
-        "files": {
-          "/usr/share/metainfo/app.yaak.Yaak.metainfo.xml": "../../flatpak/app.yaak.Yaak.metainfo.xml"
-        }
+        "desktopTemplate": "./template.desktop"
       }
     }
   }
@@ -1,14 +1,14 @@
-import { useQuery } from "@tanstack/react-query";
-import { invoke } from "@tauri-apps/api/core";
-import { Fonts } from "./bindings/gen_fonts";
+import { useQuery } from '@tanstack/react-query';
+import { invoke } from '@tauri-apps/api/core';
+import { Fonts } from './bindings/gen_fonts';

 export async function listFonts() {
-  return invoke<Fonts>("plugin:yaak-fonts|list", {});
+  return invoke<Fonts>('plugin:yaak-fonts|list', {});
 }

 export function useFonts() {
   return useQuery({
-    queryKey: ["list_fonts"],
+    queryKey: ['list_fonts'],
     queryFn: () => listFonts(),
   });
 }
@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/fonts",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "index.ts"
 }
@@ -16,7 +16,7 @@ thiserror = { workspace = true }
 ts-rs = { workspace = true }
 yaak-common = { workspace = true }
 yaak-models = { workspace = true }
-yaak-api = { workspace = true }
+yaak-tauri-utils = { workspace = true }

 [build-dependencies]
 tauri-plugin = { workspace = true, features = ["build"] }
@@ -1,35 +1,35 @@
-import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
-import { invoke } from "@tauri-apps/api/core";
-import { listen } from "@tauri-apps/api/event";
-import { appInfo } from "@yaakapp/app/lib/appInfo";
-import { useEffect } from "react";
-import { LicenseCheckStatus } from "./bindings/license";
+import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
+import { invoke } from '@tauri-apps/api/core';
+import { listen } from '@tauri-apps/api/event';
+import { appInfo } from '@yaakapp/app/lib/appInfo';
+import { useEffect } from 'react';
+import { LicenseCheckStatus } from './bindings/license';

-export * from "./bindings/license";
+export * from './bindings/license';

-const CHECK_QUERY_KEY = ["license.check"];
+const CHECK_QUERY_KEY = ['license.check'];

 export function useLicense() {
   const queryClient = useQueryClient();
   const activate = useMutation<void, string, { licenseKey: string }>({
-    mutationKey: ["license.activate"],
-    mutationFn: (payload) => invoke("plugin:yaak-license|activate", payload),
+    mutationKey: ['license.activate'],
+    mutationFn: (payload) => invoke('plugin:yaak-license|activate', payload),
     onSuccess: () => queryClient.invalidateQueries({ queryKey: CHECK_QUERY_KEY }),
   });

   const deactivate = useMutation<void, string, void>({
-    mutationKey: ["license.deactivate"],
-    mutationFn: () => invoke("plugin:yaak-license|deactivate"),
+    mutationKey: ['license.deactivate'],
+    mutationFn: () => invoke('plugin:yaak-license|deactivate'),
     onSuccess: () => queryClient.invalidateQueries({ queryKey: CHECK_QUERY_KEY }),
   });

   // Check the license again after a license is activated
   useEffect(() => {
-    const unlisten = listen("license-activated", async () => {
+    const unlisten = listen('license-activated', async () => {
       await queryClient.invalidateQueries({ queryKey: CHECK_QUERY_KEY });
     });
     return () => {
-      void unlisten.then((fn) => fn());
+      unlisten.then((fn) => fn());
     };
   }, []);

@@ -41,7 +41,7 @@ export function useLicense() {
       if (!appInfo.featureLicense) {
         return null;
       }
-      return invoke<LicenseCheckStatus>("plugin:yaak-license|check");
+      return invoke<LicenseCheckStatus>('plugin:yaak-license|check');
     },
   });

@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/license",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "index.ts"
 }
@@ -16,7 +16,7 @@ pub enum Error {
     ModelError(#[from] yaak_models::error::Error),

     #[error(transparent)]
-    ApiError(#[from] yaak_api::Error),
+    TauriUtilsError(#[from] yaak_tauri_utils::error::Error),

     #[error("Internal server error")]
     ServerError,
|
|||||||
use std::time::Duration;
|
use std::time::Duration;
|
||||||
use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow, is_dev};
|
use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow, is_dev};
|
||||||
use ts_rs::TS;
|
use ts_rs::TS;
|
||||||
use yaak_api::{ApiClientKind, yaak_api_client};
|
|
||||||
use yaak_common::platform::get_os_str;
|
use yaak_common::platform::get_os_str;
|
||||||
use yaak_models::db_context::DbContext;
|
use yaak_models::db_context::DbContext;
|
||||||
use yaak_models::query_manager::QueryManager;
|
use yaak_models::query_manager::QueryManager;
|
||||||
use yaak_models::util::UpdateSource;
|
use yaak_models::util::UpdateSource;
|
||||||
|
use yaak_tauri_utils::api_client::yaak_api_client;
|
||||||
|
|
||||||
/// Extension trait for accessing the QueryManager from Tauri Manager types.
|
/// Extension trait for accessing the QueryManager from Tauri Manager types.
|
||||||
/// This is needed temporarily until all crates are refactored to not use Tauri.
|
/// This is needed temporarily until all crates are refactored to not use Tauri.
|
||||||
@@ -118,12 +118,11 @@ pub async fn activate_license<R: Runtime>(
     license_key: &str,
 ) -> Result<()> {
     info!("Activating license {}", license_key);
-    let app_version = window.app_handle().package_info().version.to_string();
-    let client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let client = reqwest::Client::new();
     let payload = ActivateLicenseRequestPayload {
         license_key: license_key.to_string(),
         app_platform: get_os_str().to_string(),
-        app_version,
+        app_version: window.app_handle().package_info().version.to_string(),
     };
     let response = client.post(build_url("/licenses/activate")).json(&payload).send().await?;

@@ -156,11 +155,12 @@ pub async fn deactivate_license<R: Runtime>(window: &WebviewWindow<R>) -> Result
     let app_handle = window.app_handle();
     let activation_id = get_activation_id(app_handle).await;

-    let app_version = window.app_handle().package_info().version.to_string();
-    let client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let client = reqwest::Client::new();
     let path = format!("/licenses/activations/{}/deactivate", activation_id);
-    let payload =
-        DeactivateLicenseRequestPayload { app_platform: get_os_str().to_string(), app_version };
+    let payload = DeactivateLicenseRequestPayload {
+        app_platform: get_os_str().to_string(),
+        app_version: window.app_handle().package_info().version.to_string(),
+    };
     let response = client.post(build_url(&path)).json(&payload).send().await?;

     if response.status().is_client_error() {
@@ -186,9 +186,10 @@ pub async fn deactivate_license<R: Runtime>(window: &WebviewWindow<R>) -> Result
 }

 pub async fn check_license<R: Runtime>(window: &WebviewWindow<R>) -> Result<LicenseCheckStatus> {
-    let app_version = window.app_handle().package_info().version.to_string();
-    let payload =
-        CheckActivationRequestPayload { app_platform: get_os_str().to_string(), app_version };
+    let payload = CheckActivationRequestPayload {
+        app_platform: get_os_str().to_string(),
+        app_version: window.package_info().version.to_string(),
+    };
     let activation_id = get_activation_id(window.app_handle()).await;

     let settings = window.db().get_settings();
@@ -203,7 +204,7 @@ pub async fn check_license<R: Runtime>(window: &WebviewWindow<R>) -> Result<Lice
         (true, _) => {
             info!("Checking license activation");
             // A license has been activated, so let's check the license server
-            let client = yaak_api_client(ApiClientKind::App, &payload.app_version)?;
+            let client = yaak_api_client(window.app_handle())?;
             let path = format!("/licenses/activations/{activation_id}/check-v2");
             let response = client.post(build_url(&path)).json(&payload).send().await?;

@@ -1,9 +1,9 @@
-import { invoke } from "@tauri-apps/api/core";
+import { invoke } from '@tauri-apps/api/core';

 export function setWindowTitle(title: string) {
-  invoke("plugin:yaak-mac-window|set_title", { title }).catch(console.error);
+  invoke('plugin:yaak-mac-window|set_title', { title }).catch(console.error);
 }

 export function setWindowTheme(bgColor: string) {
-  invoke("plugin:yaak-mac-window|set_theme", { bgColor }).catch(console.error);
+  invoke('plugin:yaak-mac-window|set_theme', { bgColor }).catch(console.error);
 }
@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/mac-window",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "index.ts"
 }
@@ -1,3 +1,6 @@
 [default]
 description = "Default permissions for the plugin"
-permissions = ["allow-set-title", "allow-set-theme"]
+permissions = [
+  "allow-set-title",
+  "allow-set-theme",
+]
@@ -12,11 +12,6 @@ unsafe impl Sync for UnsafeWindowHandle {}

 const WINDOW_CONTROL_PAD_X: f64 = 13.0;
 const WINDOW_CONTROL_PAD_Y: f64 = 18.0;
-/// Extra pixels to add to the title bar height when the default title bar is
-/// already as tall as button_height + PAD_Y (i.e. macOS Tahoe 26+, where the
-/// default is 32px and 14 + 18 = 32). On pre-Tahoe this is unused because the
-/// default title bar is shorter than button_height + PAD_Y.
-const TITLEBAR_EXTRA_HEIGHT: f64 = 4.0;
 const MAIN_WINDOW_PREFIX: &str = "main_";

 pub(crate) fn update_window_title<R: Runtime>(window: Window<R>, title: String) {
@@ -100,29 +95,12 @@ fn position_traffic_lights(ns_window_handle: UnsafeWindowHandle, x: f64, y: f64,
         ns_window.standardWindowButton_(NSWindowButton::NSWindowMiniaturizeButton);
     let zoom = ns_window.standardWindowButton_(NSWindowButton::NSWindowZoomButton);
 
+    let title_bar_container_view = close.superview().superview();
+
     let close_rect: NSRect = msg_send![close, frame];
     let button_height = close_rect.size.height;
 
-    let title_bar_container_view = close.superview().superview();
-
-    // Capture the OS default title bar height on the first call, before
-    // we've modified it. This avoids the height growing on repeated calls.
-    use std::sync::OnceLock;
-    static DEFAULT_TITLEBAR_HEIGHT: OnceLock<f64> = OnceLock::new();
-    let default_height =
-        *DEFAULT_TITLEBAR_HEIGHT.get_or_init(|| NSView::frame(title_bar_container_view).size.height);
-
-    // On pre-Tahoe, button_height + y is larger than the default title bar
-    // height, so the resize works as before. On Tahoe (26+), the default is
-    // already 32px and button_height + y = 32, so nothing changes. In that
-    // case, add TITLEBAR_EXTRA_HEIGHT extra pixels to push the buttons down.
-    let desired = button_height + y;
-    let title_bar_frame_height = if desired > default_height {
-        desired
-    } else {
-        default_height + TITLEBAR_EXTRA_HEIGHT
-    };
+    let title_bar_frame_height = button_height + y;
 
     let mut title_bar_rect = NSView::frame(title_bar_container_view);
     title_bar_rect.size.height = title_bar_frame_height;
     title_bar_rect.origin.y = NSView::frame(ns_window).size.height - title_bar_frame_height;
@@ -6,4 +6,8 @@ publish = false
 
 [dependencies]
 tauri = { workspace = true }
+reqwest = { workspace = true, features = ["gzip"] }
+thiserror = { workspace = true }
+serde = { workspace = true, features = ["derive"] }
 regex = "1.11.0"
+yaak-common = { workspace = true }
24  crates-tauri/yaak-tauri-utils/src/api_client.rs  Normal file
@@ -0,0 +1,24 @@
+use crate::error::Result;
+use reqwest::Client;
+use std::time::Duration;
+use tauri::http::{HeaderMap, HeaderValue};
+use tauri::{AppHandle, Runtime};
+use yaak_common::platform::{get_ua_arch, get_ua_platform};
+
+pub fn yaak_api_client<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Client> {
+    let platform = get_ua_platform();
+    let version = app_handle.package_info().version.clone();
+    let arch = get_ua_arch();
+    let ua = format!("Yaak/{version} ({platform}; {arch})");
+    let mut default_headers = HeaderMap::new();
+    default_headers.insert("Accept", HeaderValue::from_str("application/json").unwrap());
+
+    let client = reqwest::ClientBuilder::new()
+        .timeout(Duration::from_secs(20))
+        .default_headers(default_headers)
+        .gzip(true)
+        .user_agent(ua)
+        .build()?;
+
+    Ok(client)
+}
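The User-Agent built in `yaak_api_client` follows a fixed `Yaak/{version} ({platform}; {arch})` shape. A minimal std-only sketch of that formatting, with illustrative inputs (the real values come from Tauri's package info and `yaak_common::platform`, not the literals used here):

```rust
// Sketch of the User-Agent string produced in yaak_api_client.
// The version/platform/arch arguments below are made-up examples.
fn user_agent(version: &str, platform: &str, arch: &str) -> String {
    format!("Yaak/{version} ({platform}; {arch})")
}

fn main() {
    let ua = user_agent("9.9.9", "Macintosh", "arm64");
    assert_eq!(ua, "Yaak/9.9.9 (Macintosh; arm64)");
    println!("{ua}");
}
```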
19  crates-tauri/yaak-tauri-utils/src/error.rs  Normal file
@@ -0,0 +1,19 @@
+use serde::{Serialize, Serializer};
+use thiserror::Error;
+
+#[derive(Error, Debug)]
+pub enum Error {
+    #[error(transparent)]
+    ReqwestError(#[from] reqwest::Error),
+}
+
+impl Serialize for Error {
+    fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
+    where
+        S: Serializer,
+    {
+        serializer.serialize_str(self.to_string().as_ref())
+    }
+}
+
+pub type Result<T> = std::result::Result<T, Error>;
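The error type above serializes to its display string so the frontend receives a plain message. A std-only sketch of the same pattern (here `thiserror`/`serde` are replaced by a hand-written `Display` impl and `to_string()`, and the wrapped error is a plain `String` standing in for `reqwest::Error`):

```rust
use std::fmt;

// Std-only sketch of the error.rs pattern: a crate Error enum, a Result
// alias, and exposure of the error to callers as its display string.
#[derive(Debug)]
enum Error {
    // Stand-in for the ReqwestError(#[from] reqwest::Error) variant.
    Reqwest(String),
}

impl fmt::Display for Error {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            Error::Reqwest(msg) => write!(f, "{msg}"),
        }
    }
}

type Result<T> = std::result::Result<T, Error>;

fn might_fail(ok: bool) -> Result<u32> {
    if ok {
        Ok(200)
    } else {
        Err(Error::Reqwest("connection refused".into()))
    }
}

fn main() {
    assert_eq!(might_fail(true).unwrap(), 200);
    // The serialized form in the real crate is exactly this display string.
    assert_eq!(might_fail(false).unwrap_err().to_string(), "connection refused");
    println!("ok");
}
```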
@@ -1 +1,3 @@
+pub mod api_client;
+pub mod error;
 pub mod window;
@@ -1,22 +1,18 @@
 [package]
-name = "yaak"
+name = "yaak-actions-builtin"
 version = "0.1.0"
 edition = "2024"
+authors = ["Gregory Schier"]
 publish = false
 
 [dependencies]
-async-trait = "0.1"
-log = { workspace = true }
-md5 = "0.8.0"
-serde_json = { workspace = true }
-thiserror = { workspace = true }
-tokio = { workspace = true, features = ["sync", "rt"] }
+yaak-actions = { workspace = true }
 yaak-http = { workspace = true }
-yaak-crypto = { workspace = true }
 yaak-models = { workspace = true }
-yaak-plugins = { workspace = true }
 yaak-templates = { workspace = true }
-yaak-tls = { workspace = true }
-
-[dev-dependencies]
-tempfile = "3"
+yaak-plugins = { workspace = true }
+yaak-crypto = { workspace = true }
+serde = { workspace = true }
+serde_json = { workspace = true }
+tokio = { workspace = true, features = ["sync", "rt-multi-thread"] }
+log = { workspace = true }
88  crates/yaak-actions-builtin/src/dependencies.rs  Normal file
@@ -0,0 +1,88 @@
+//! Dependency injection for built-in actions.
+
+use std::path::{Path, PathBuf};
+use std::sync::Arc;
+use yaak_crypto::manager::EncryptionManager;
+use yaak_models::query_manager::QueryManager;
+use yaak_plugins::events::PluginContext;
+use yaak_plugins::manager::PluginManager;
+
+/// Dependencies needed by built-in action implementations.
+///
+/// This struct bundles all the dependencies that action handlers need,
+/// providing a clean way to initialize them in different contexts
+/// (CLI, Tauri app, MCP server, etc.).
+pub struct BuiltinActionDependencies {
+    pub query_manager: Arc<QueryManager>,
+    pub plugin_manager: Arc<PluginManager>,
+    pub encryption_manager: Arc<EncryptionManager>,
+}
+
+impl BuiltinActionDependencies {
+    /// Create dependencies for standalone usage (CLI, MCP server, etc.)
+    ///
+    /// This initializes all the necessary managers following the same pattern
+    /// as the yaak-cli implementation.
+    pub async fn new_standalone(
+        db_path: &Path,
+        blob_path: &Path,
+        app_id: &str,
+        plugin_vendored_dir: PathBuf,
+        plugin_installed_dir: PathBuf,
+        node_path: PathBuf,
+    ) -> Result<Self, Box<dyn std::error::Error>> {
+        // Initialize database
+        let (query_manager, _, _) = yaak_models::init_standalone(db_path, blob_path)?;
+
+        // Initialize encryption manager (takes QueryManager by value)
+        let encryption_manager = Arc::new(EncryptionManager::new(
+            query_manager.clone(),
+            app_id.to_string(),
+        ));
+
+        let query_manager = Arc::new(query_manager);
+
+        // Find plugin runtime
+        let plugin_runtime_main = std::env::var("YAAK_PLUGIN_RUNTIME")
+            .map(PathBuf::from)
+            .unwrap_or_else(|_| {
+                // Development fallback
+                PathBuf::from(env!("CARGO_MANIFEST_DIR"))
+                    .join("../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs")
+            });
+
+        // Initialize plugin manager
+        let plugin_manager = Arc::new(
+            PluginManager::new(
+                plugin_vendored_dir,
+                plugin_installed_dir,
+                node_path,
+                plugin_runtime_main,
+                false, // not sandboxed in CLI
+            )
+            .await,
+        );
+
+        // Initialize plugins from database
+        let db = query_manager.connect();
+        let plugins = db.list_plugins().unwrap_or_default();
+        if !plugins.is_empty() {
+            let errors = plugin_manager
+                .initialize_all_plugins(plugins, &PluginContext::new_empty())
+                .await;
+            for (plugin_dir, error_msg) in errors {
+                log::warn!(
+                    "Failed to initialize plugin '{}': {}",
+                    plugin_dir,
+                    error_msg
+                );
+            }
+        }
+
+        Ok(Self {
+            query_manager,
+            plugin_manager,
+            encryption_manager,
+        })
+    }
+}
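The plugin-runtime lookup in `new_standalone` prefers the `YAAK_PLUGIN_RUNTIME` env var and falls back to a vendored development path. A std-only sketch of that pattern, with the var and fallback made parameters so it can be exercised in isolation (the demo names are hypothetical):

```rust
use std::env;
use std::path::PathBuf;

// Sketch of the env-var-with-fallback lookup used for the plugin runtime:
// use the variable if set, otherwise fall back to a bundled default path.
fn resolve_plugin_runtime(var: &str, fallback: &str) -> PathBuf {
    env::var(var).map(PathBuf::from).unwrap_or_else(|_| PathBuf::from(fallback))
}

fn main() {
    // An almost certainly unset variable exercises the fallback branch.
    let p = resolve_plugin_runtime(
        "YAAK_PLUGIN_RUNTIME_DEMO_UNSET",
        "vendored/plugin-runtime/index.cjs",
    );
    assert_eq!(p, PathBuf::from("vendored/plugin-runtime/index.cjs"));
    println!("{}", p.display());
}
```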
24  crates/yaak-actions-builtin/src/http/mod.rs  Normal file
@@ -0,0 +1,24 @@
+//! HTTP action implementations.
+
+pub mod send;
+
+use crate::BuiltinActionDependencies;
+use yaak_actions::{ActionError, ActionExecutor, ActionSource};
+
+/// Register all HTTP-related actions with the executor.
+pub async fn register_http_actions(
+    executor: &ActionExecutor,
+    deps: &BuiltinActionDependencies,
+) -> Result<(), ActionError> {
+    let handler = send::HttpSendActionHandler {
+        query_manager: deps.query_manager.clone(),
+        plugin_manager: deps.plugin_manager.clone(),
+        encryption_manager: deps.encryption_manager.clone(),
+    };
+
+    executor
+        .register(send::metadata(), ActionSource::Builtin, handler)
+        .await?;
+
+    Ok(())
+}
293  crates/yaak-actions-builtin/src/http/send.rs  Normal file
@@ -0,0 +1,293 @@
+//! HTTP send action implementation.
+
+use std::collections::BTreeMap;
+use std::sync::Arc;
+use serde_json::{json, Value};
+use tokio::sync::mpsc;
+use yaak_actions::{
+    ActionError, ActionGroupId, ActionHandler, ActionId, ActionMetadata,
+    ActionParams, ActionResult, ActionScope, CurrentContext,
+    RequiredContext,
+};
+use yaak_crypto::manager::EncryptionManager;
+use yaak_http::path_placeholders::apply_path_placeholders;
+use yaak_http::sender::{HttpSender, ReqwestSender};
+use yaak_http::types::{SendableHttpRequest, SendableHttpRequestOptions};
+use yaak_models::models::{HttpRequest, HttpRequestHeader, HttpUrlParameter};
+use yaak_models::query_manager::QueryManager;
+use yaak_models::render::make_vars_hashmap;
+use yaak_plugins::events::{PluginContext, RenderPurpose};
+use yaak_plugins::manager::PluginManager;
+use yaak_plugins::template_callback::PluginTemplateCallback;
+use yaak_templates::{parse_and_render, render_json_value_raw, RenderOptions};
+
+/// Handler for HTTP send action.
+pub struct HttpSendActionHandler {
+    pub query_manager: Arc<QueryManager>,
+    pub plugin_manager: Arc<PluginManager>,
+    pub encryption_manager: Arc<EncryptionManager>,
+}
+
+/// Metadata for the HTTP send action.
+pub fn metadata() -> ActionMetadata {
+    ActionMetadata {
+        id: ActionId::builtin("http", "send-request"),
+        label: "Send HTTP Request".to_string(),
+        description: Some("Execute an HTTP request and return the response".to_string()),
+        icon: Some("play".to_string()),
+        scope: ActionScope::HttpRequest,
+        keyboard_shortcut: None,
+        requires_selection: true,
+        enabled_condition: None,
+        group_id: Some(ActionGroupId::builtin("send")),
+        order: 10,
+        required_context: RequiredContext::requires_target(),
+    }
+}
+
+impl ActionHandler for HttpSendActionHandler {
+    fn handle(
+        &self,
+        context: CurrentContext,
+        params: ActionParams,
+    ) -> std::pin::Pin<
+        Box<dyn std::future::Future<Output = Result<ActionResult, ActionError>> + Send + 'static>,
+    > {
+        let query_manager = self.query_manager.clone();
+        let plugin_manager = self.plugin_manager.clone();
+        let encryption_manager = self.encryption_manager.clone();
+
+        Box::pin(async move {
+            // Extract request_id from context
+            let request_id = context
+                .target
+                .as_ref()
+                .ok_or_else(|| {
+                    ActionError::ContextMissing {
+                        missing_fields: vec!["target".to_string()],
+                    }
+                })?
+                .id()
+                .ok_or_else(|| {
+                    ActionError::ContextMissing {
+                        missing_fields: vec!["target.id".to_string()],
+                    }
+                })?
+                .to_string();
+
+            // Fetch request and environment from database (synchronous)
+            let (request, environment_chain) = {
+                let db = query_manager.connect();
+
+                // Fetch HTTP request from database
+                let request = db.get_http_request(&request_id).map_err(|e| {
+                    ActionError::Internal(format!("Failed to fetch request {}: {}", request_id, e))
+                })?;
+
+                // Resolve environment chain for variable substitution
+                let environment_chain = if let Some(env_id) = &context.environment_id {
+                    db.resolve_environments(
+                        &request.workspace_id,
+                        request.folder_id.as_deref(),
+                        Some(env_id),
+                    )
+                    .unwrap_or_default()
+                } else {
+                    db.resolve_environments(
+                        &request.workspace_id,
+                        request.folder_id.as_deref(),
+                        None,
+                    )
+                    .unwrap_or_default()
+                };
+
+                (request, environment_chain)
+            }; // db is dropped here
+
+            // Create template callback with plugin support
+            let plugin_context = PluginContext::new(None, Some(request.workspace_id.clone()));
+            let template_callback = PluginTemplateCallback::new(
+                plugin_manager,
+                encryption_manager,
+                &plugin_context,
+                RenderPurpose::Send,
+            );
+
+            // Render templates in the request
+            let rendered_request = render_http_request(
+                &request,
+                environment_chain,
+                &template_callback,
+                &RenderOptions::throw(),
+            )
+            .await
+            .map_err(|e| ActionError::Internal(format!("Failed to render request: {}", e)))?;
+
+            // Build sendable request
+            let options = SendableHttpRequestOptions {
+                timeout: params
+                    .data
+                    .get("timeout_ms")
+                    .and_then(|v| v.as_u64())
+                    .map(|ms| std::time::Duration::from_millis(ms)),
+                follow_redirects: params
+                    .data
+                    .get("follow_redirects")
+                    .and_then(|v| v.as_bool())
+                    .unwrap_or(false),
+            };
+
+            let sendable = SendableHttpRequest::from_http_request(&rendered_request, options)
+                .await
+                .map_err(|e| ActionError::Internal(format!("Failed to build request: {}", e)))?;
+
+            // Create event channel
+            let (event_tx, mut event_rx) = mpsc::channel(100);
+
+            // Spawn task to drain events
+            let _event_handle = tokio::spawn(async move {
+                while event_rx.recv().await.is_some() {
+                    // For now, just drain events
+                    // In the future, we could log them or emit them to UI
+                }
+            });
+
+            // Send the request
+            let sender = ReqwestSender::new()
+                .map_err(|e| ActionError::Internal(format!("Failed to create HTTP client: {}", e)))?;
+            let response = sender
+                .send(sendable, event_tx)
+                .await
+                .map_err(|e| ActionError::Internal(format!("Failed to send request: {}", e)))?;
+
+            // Consume response body
+            let status = response.status;
+            let status_reason = response.status_reason.clone();
+            let headers = response.headers.clone();
+            let url = response.url.clone();
+
+            let (body_text, stats) = response
+                .text()
+                .await
+                .map_err(|e| ActionError::Internal(format!("Failed to read response body: {}", e)))?;
+
+            // Return success result with response data
+            Ok(ActionResult::Success {
+                data: Some(json!({
+                    "status": status,
+                    "statusReason": status_reason,
+                    "headers": headers,
+                    "body": body_text,
+                    "contentLength": stats.size_decompressed,
+                    "url": url,
+                })),
+                message: Some(format!("HTTP {}", status)),
+            })
+        })
+    }
+}
+
+/// Helper function to render templates in an HTTP request.
+/// Copied from yaak-cli implementation.
+async fn render_http_request(
+    r: &HttpRequest,
+    environment_chain: Vec<yaak_models::models::Environment>,
+    cb: &PluginTemplateCallback,
+    opt: &RenderOptions,
+) -> Result<HttpRequest, String> {
+    let vars = &make_vars_hashmap(environment_chain);
+
+    let mut url_parameters = Vec::new();
+    for p in r.url_parameters.clone() {
+        if !p.enabled {
+            continue;
+        }
+        url_parameters.push(HttpUrlParameter {
+            enabled: p.enabled,
+            name: parse_and_render(p.name.as_str(), vars, cb, opt)
+                .await
+                .map_err(|e| e.to_string())?,
+            value: parse_and_render(p.value.as_str(), vars, cb, opt)
+                .await
+                .map_err(|e| e.to_string())?,
+            id: p.id,
+        })
+    }
+
+    let mut headers = Vec::new();
+    for p in r.headers.clone() {
+        if !p.enabled {
+            continue;
+        }
+        headers.push(HttpRequestHeader {
+            enabled: p.enabled,
+            name: parse_and_render(p.name.as_str(), vars, cb, opt)
+                .await
+                .map_err(|e| e.to_string())?,
+            value: parse_and_render(p.value.as_str(), vars, cb, opt)
+                .await
+                .map_err(|e| e.to_string())?,
+            id: p.id,
+        })
+    }
+
+    let mut body = BTreeMap::new();
+    for (k, v) in r.body.clone() {
+        body.insert(
+            k,
+            render_json_value_raw(v, vars, cb, opt)
+                .await
+                .map_err(|e| e.to_string())?,
+        );
+    }
+
+    let authentication = {
+        let mut disabled = false;
+        let mut auth = BTreeMap::new();
+        match r.authentication.get("disabled") {
+            Some(Value::Bool(true)) => {
+                disabled = true;
+            }
+            Some(Value::String(tmpl)) => {
+                disabled = parse_and_render(tmpl.as_str(), vars, cb, opt)
+                    .await
+                    .unwrap_or_default()
+                    .is_empty();
+            }
+            _ => {}
+        }
+        if disabled {
+            auth.insert("disabled".to_string(), Value::Bool(true));
+        } else {
+            for (k, v) in r.authentication.clone() {
+                if k == "disabled" {
+                    auth.insert(k, Value::Bool(false));
+                } else {
+                    auth.insert(
+                        k,
+                        render_json_value_raw(v, vars, cb, opt)
+                            .await
+                            .map_err(|e| e.to_string())?,
+                    );
+                }
+            }
+        }
+        auth
+    };
+
+    let url = parse_and_render(r.url.clone().as_str(), vars, cb, opt)
+        .await
+        .map_err(|e| e.to_string())?;
+
+    // Apply path placeholders (e.g., /users/:id -> /users/123)
+    let (url, url_parameters) = apply_path_placeholders(&url, &url_parameters);
+
+    Ok(HttpRequest {
+        url,
+        url_parameters,
+        headers,
+        body,
+        authentication,
+        ..r.to_owned()
+    })
+}
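The rendered URL above runs through `apply_path_placeholders` from yaak-http (e.g. `/users/:id` becomes `/users/123`). The real implementation is not shown in this diff, so the following is only a hedged, std-only sketch of what that substitution plausibly does: replace `:name` path segments with matching URL parameters and return the parameters that were not consumed. Names and semantics here are assumptions for illustration.

```rust
// Hypothetical sketch of path-placeholder substitution (the real
// yaak_http::path_placeholders::apply_path_placeholders may differ):
// a `:name` segment is replaced by the value of the matching parameter,
// and consumed parameters are removed from the returned list.
fn apply_placeholders(url: &str, params: &[(&str, &str)]) -> (String, Vec<(String, String)>) {
    let mut remaining: Vec<(String, String)> =
        params.iter().map(|(k, v)| (k.to_string(), v.to_string())).collect();
    let rendered = url
        .split('/')
        .map(|seg| {
            if let Some(name) = seg.strip_prefix(':') {
                if let Some(pos) = remaining.iter().position(|(k, _)| k == name) {
                    // Consume the parameter and substitute its value.
                    return remaining.remove(pos).1;
                }
            }
            seg.to_string()
        })
        .collect::<Vec<_>>()
        .join("/");
    (rendered, remaining)
}

fn main() {
    let (url, rest) = apply_placeholders("/users/:id/posts", &[("id", "123"), ("page", "2")]);
    assert_eq!(url, "/users/123/posts");
    assert_eq!(rest, vec![("page".to_string(), "2".to_string())]);
    println!("{url}");
}
```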
11  crates/yaak-actions-builtin/src/lib.rs  Normal file
@@ -0,0 +1,11 @@
+//! Built-in action implementations for Yaak.
+//!
+//! This crate provides concrete implementations of built-in actions using
+//! the yaak-actions framework. It depends on domain-specific crates like
+//! yaak-http, yaak-models, yaak-plugins, etc.
+
+pub mod dependencies;
+pub mod http;
+
+pub use dependencies::BuiltinActionDependencies;
+pub use http::register_http_actions;
Some files were not shown because too many files have changed in this diff.