Compare commits

..

5 Commits

Author          SHA1        Message                                                      Date
Gregory Schier  0702864a11  Add contribution policy docs and PR checklist template       2026-02-20 14:08:34 -08:00
Gregory Schier  487e66faa4  Use workspace dependency for schemars                        2026-02-20 09:01:36 -08:00
Gregory Schier  f71a3ea8fe  Remove completed CLI plan doc                                2026-02-20 08:57:11 -08:00
Gregory Schier  39fc9e81cd  Refine shared plugin event routing API                       2026-02-20 08:30:41 -08:00
Gregory Schier  a4f96fca11  Implement CLI send flows and refactor plugin event handling  2026-02-20 08:06:48 -08:00
768 changed files with 16503 additions and 24812 deletions

View File

@@ -1,11 +1,9 @@
# Claude Context: Detaching Tauri from Yaak
## Goal
Make Yaak runnable as a standalone CLI without Tauri as a dependency. The core Rust crates in `crates/` should be usable independently, while Tauri-specific code lives in `crates-tauri/`.
## Project Structure
```
crates/ # Core crates - should NOT depend on Tauri
crates-tauri/ # Tauri-specific crates (yaak-app, yaak-tauri-utils, etc.)
@@ -15,13 +13,11 @@ crates-cli/ # CLI crate (yaak-cli)
## Completed Work
### 1. Folder Restructure
- Moved Tauri-dependent app code to `crates-tauri/yaak-app/`
- Created `crates-tauri/yaak-tauri-utils/` for shared Tauri utilities (window traits, api_client, error handling)
- Created `crates-cli/yaak-cli/` for the standalone CLI
### 2. Decoupled Crates (no longer depend on Tauri)
- **yaak-models**: Uses `init_standalone()` pattern for CLI database access
- **yaak-http**: Removed Tauri plugin, HttpConnectionManager initialized in yaak-app setup
- **yaak-common**: Only contains Tauri-free utilities (serde, platform)
@@ -29,7 +25,6 @@ crates-cli/ # CLI crate (yaak-cli)
- **yaak-grpc**: Replaced AppHandle with GrpcConfig struct, uses tokio::process::Command instead of Tauri sidecar
### 3. CLI Implementation
- Basic CLI at `crates-cli/yaak-cli/src/main.rs`
- Commands: workspaces, requests, send (by ID), get (ad-hoc URL), create
- Uses same database as Tauri app via `yaak_models::init_standalone()`
@@ -37,14 +32,12 @@ crates-cli/ # CLI crate (yaak-cli)
## Remaining Work
### Crates Still Depending on Tauri (in `crates/`)
1. **yaak-git** (3 files) - Moderate complexity
2. **yaak-plugins** (13 files) - **Hardest** - deeply integrated with Tauri for plugin-to-window communication
3. **yaak-sync** (4 files) - Moderate complexity
4. **yaak-ws** (5 files) - Moderate complexity
### Pattern for Decoupling
1. Remove Tauri plugin `init()` function from the crate
2. Move commands to `yaak-app/src/commands.rs` or keep inline in `lib.rs`
3. Move extension traits (e.g., `SomethingManagerExt`) to yaak-app or yaak-tauri-utils
@@ -54,7 +47,6 @@ crates-cli/ # CLI crate (yaak-cli)
7. Replace `tauri::async_runtime::block_on` with `tokio::runtime::Handle::current().block_on()`
## Key Files
- `crates-tauri/yaak-app/src/lib.rs` - Main Tauri app, setup block initializes managers
- `crates-tauri/yaak-app/src/commands.rs` - Migrated Tauri commands
- `crates-tauri/yaak-app/src/models_ext.rs` - Database plugin and extension traits
@@ -62,11 +54,9 @@ crates-cli/ # CLI crate (yaak-cli)
- `crates/yaak-models/src/lib.rs` - Contains `init_standalone()` for CLI usage
## Git Branch
Working on `detach-tauri` branch.
## Recent Commits
```
c40cff40 Remove Tauri dependencies from yaak-crypto and yaak-grpc
df495f1d Move Tauri utilities from yaak-common to yaak-tauri-utils
@@ -77,7 +67,6 @@ e718a5f1 Refactor models_ext to use init_standalone from yaak-models
```
## Testing
- Run `cargo check -p <crate>` to verify a crate builds without Tauri
- Run `npm run app-dev` to test the Tauri app still works
- Run `cargo run -p yaak-cli -- --help` to test the CLI
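The per-crate verification described above can be scripted. A minimal sketch — the `check_all` helper is invented for illustration, and the crate list is taken from the "Decoupled Crates" section:

```shell
# Illustrative helper: print (or run) `cargo check` for each crate the
# document lists as Tauri-free. Run from the repository root.
decoupled_crates="yaak-models yaak-http yaak-common yaak-crypto yaak-grpc"

check_all() {
  for crate in $decoupled_crates; do
    echo "cargo check -p $crate"
    # Uncomment to actually run the check:
    # cargo check -p "$crate" || return 1
  done
}

check_all
```

In dry-run form this just lists the commands; removing the comment makes it fail fast on the first crate that no longer builds without Tauri.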

View File

@@ -0,0 +1,62 @@
---
description: Review a PR in a new worktree
allowed-tools: Bash(git worktree:*), Bash(gh pr:*), Bash(git branch:*)
---
Check out a GitHub pull request for review.
## Usage
```
/check-out-pr <PR_NUMBER>
```
## What to do
1. If no PR number is provided, list all open pull requests and ask the user to select one
2. Get PR information using `gh pr view <PR_NUMBER> --json number,headRefName`
3. **Ask the user** whether they want to:
- **A) Check out in current directory** — simple `gh pr checkout <PR_NUMBER>`
- **B) Create a new worktree** — isolated copy at `../yaak-worktrees/pr-<PR_NUMBER>`
4. Follow the appropriate path below
## Option A: Check out in current directory
1. Run `gh pr checkout <PR_NUMBER>`
2. Inform the user which branch they're now on
## Option B: Create a new worktree
1. Create a new worktree at `../yaak-worktrees/pr-<PR_NUMBER>` using `git worktree add` with a timeout of at least 300000ms (5 minutes) since the post-checkout hook runs a bootstrap script
2. Check out the PR branch in the new worktree using `gh pr checkout <PR_NUMBER>`
3. The post-checkout hook will automatically:
- Create `.env.local` with unique ports
- Copy editor config folders
- Run `npm install && npm run bootstrap`
4. Inform the user:
- Where the worktree was created
- What ports were assigned
- How to access it (cd command)
- How to run the dev server
- How to remove the worktree when done
### Example worktree output
```
Created worktree for PR #123 at ../yaak-worktrees/pr-123
Branch: feature-auth
Ports: Vite (1421), MCP (64344)
To start working:
cd ../yaak-worktrees/pr-123
npm run app-dev
To remove when done:
git worktree remove ../yaak-worktrees/pr-123
```
## Error Handling
- If the PR doesn't exist, show a helpful error
- If the worktree already exists, inform the user and ask if they want to remove and recreate it
- If `gh` CLI is not available, inform the user to install it

View File

@@ -8,7 +8,7 @@ Generate formatted release notes for Yaak releases by analyzing git history and
## What to do
1. Identifies the version tag and previous version
2. Retrieves all commits between versions
- If the version is a beta version, it retrieves commits between the beta version and previous beta version
- If the version is a stable version, it retrieves commits between the stable version and the previous stable version
3. Fetches PR descriptions for linked issues to find:
@@ -37,7 +36,6 @@ The skill generates markdown-formatted release notes following this structure:
**IMPORTANT**: Always add blank lines around the markdown code fence and output the markdown code block last
**IMPORTANT**: PRs by `@gschier` should not mention the @username
**IMPORTANT**: These are app release notes. Exclude CLI-only changes (commits prefixed with `cli:` or only touching `crates-cli/`) since the CLI has its own release process.
## After Generating Release Notes

View File

@@ -0,0 +1,35 @@
# Worktree Management Skill
## Creating Worktrees
When creating git worktrees for this project, ALWAYS use the path format:
```
../yaak-worktrees/<NAME>
```
For example:
- `git worktree add ../yaak-worktrees/feature-auth`
- `git worktree add ../yaak-worktrees/bugfix-login`
- `git worktree add ../yaak-worktrees/refactor-api`
## What Happens Automatically
The post-checkout hook will automatically:
1. Create `.env.local` with unique ports (YAAK_DEV_PORT and YAAK_PLUGIN_MCP_SERVER_PORT)
2. Copy gitignored editor config folders (.zed, .idea, etc.)
3. Run `npm install && npm run bootstrap`
## Deleting Worktrees
```bash
git worktree remove ../yaak-worktrees/<NAME>
```
## Port Assignments
- Main worktree: 1420 (Vite), 64343 (MCP)
- First worktree: 1421, 64344
- Second worktree: 1422, 64345
- etc.
Each worktree can run `npm run app-dev` simultaneously without conflicts.
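The port pattern above is simple offset arithmetic. A sketch — the `worktree_ports` helper name is hypothetical; the base ports 1420 (Vite) and 64343 (MCP) come from this document:

```shell
# Compute the Vite and MCP ports for worktree N, following the documented
# pattern: 0 = main worktree, 1 = first extra worktree, and so on.
worktree_ports() {
  index="$1"
  echo "vite=$((1420 + index)) mcp=$((64343 + index))"
}

worktree_ports 1
```

For example, `worktree_ports 2` yields the second extra worktree's ports, matching the table above.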

View File

@@ -0,0 +1,46 @@
---
name: release-check-out-pr
description: Check out a GitHub pull request for review in this repo, either in the current directory or in a new isolated worktree at ../yaak-worktrees/pr-<PR_NUMBER>. Use when asked to run or replace the old Claude check-out-pr command.
---
# Check Out PR
Check out a PR by number and let the user choose between current-directory checkout and isolated worktree checkout.
## Workflow
1. Confirm `gh` CLI is available.
2. If no PR number is provided, list open PRs (`gh pr list`) and ask the user to choose one.
3. Read PR metadata:
- `gh pr view <PR_NUMBER> --json number,headRefName`
4. Ask the user to choose:
- Option A: check out in the current directory
- Option B: create a new worktree at `../yaak-worktrees/pr-<PR_NUMBER>`
## Option A: Current Directory
1. Run:
- `gh pr checkout <PR_NUMBER>`
2. Report the checked-out branch.
## Option B: New Worktree
1. Use path:
- `../yaak-worktrees/pr-<PR_NUMBER>`
2. Create the worktree with a timeout of at least 5 minutes because checkout hooks run bootstrap.
3. In the new worktree, run:
- `gh pr checkout <PR_NUMBER>`
4. Report:
- Worktree path
- Assigned ports from `.env.local` if present
- How to start work:
- `cd ../yaak-worktrees/pr-<PR_NUMBER>`
- `npm run app-dev`
- How to remove when done:
- `git worktree remove ../yaak-worktrees/pr-<PR_NUMBER>`
## Error Handling
- If PR does not exist, show a clear error.
- If worktree already exists, ask whether to reuse it or remove/recreate it.
- If `gh` is missing, instruct the user to install/authenticate it.

View File

@@ -32,7 +32,6 @@ Generate formatted markdown release notes for a Yaak tag.
- Keep a blank line before and after the code fence.
- Output the markdown code block last.
- Do not append `by @gschier` for PRs authored by `@gschier`.
- These are app release notes. Exclude CLI-only changes (commits prefixed with `cli:` or only touching `crates-cli/`) since the CLI has its own release process.
## Release Creation Prompt

View File

@@ -0,0 +1,37 @@
---
name: worktree-management
description: Manage Yaak git worktrees using the standard ../yaak-worktrees/<NAME> layout, including creation, removal, and expected automatic setup behavior and port assignments.
---
# Worktree Management
Use the Yaak-standard worktree path layout and lifecycle commands.
## Path Convention
Always create worktrees under:
`../yaak-worktrees/<NAME>`
Examples:
- `git worktree add ../yaak-worktrees/feature-auth`
- `git worktree add ../yaak-worktrees/bugfix-login`
- `git worktree add ../yaak-worktrees/refactor-api`
## Automatic Setup After Checkout
Project git hooks automatically:
1. Create `.env.local` with unique `YAAK_DEV_PORT` and `YAAK_PLUGIN_MCP_SERVER_PORT`
2. Copy gitignored editor config folders
3. Run `npm install && npm run bootstrap`
## Remove Worktree
`git worktree remove ../yaak-worktrees/<NAME>`
## Port Pattern
- Main worktree: Vite `1420`, MCP `64343`
- First extra worktree: `1421`, `64344`
- Second extra worktree: `1422`, `64345`
- Continue incrementally for additional worktrees

View File

@@ -1,9 +1,10 @@
---
name: Bug report
about: Create a report to help us improve
- title: ""
- labels: ""
- assignees: ""
+ title: ''
+ labels: ''
+ assignees: ''
---
**Describe the bug**
@@ -11,7 +12,6 @@ A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
@@ -24,17 +24,15 @@ A clear and concise description of what you expected to happen.
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]
**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.

View File

@@ -11,7 +11,6 @@
- [ ] I added or updated tests when reasonable.
Approved feedback item (required if not a bug fix or small-scope improvement):
<!-- https://yaak.app/feedback/... -->
## Related

View File

@@ -14,20 +14,17 @@ jobs:
runs-on: macos-latest
steps:
- uses: actions/checkout@v4
- - uses: voidzero-dev/setup-vp@v1
-   with:
-     node-version: "24"
-     cache: true
+ - uses: actions/setup-node@v4
- uses: dtolnay/rust-toolchain@stable
- uses: Swatinem/rust-cache@v2
with:
shared-key: ci
cache-on-failure: true
- - run: vp install
+ - run: npm ci
- run: npm run bootstrap
- run: npm run lint
- name: Run JS Tests
- run: vp test
+ run: npm test
- name: Run Rust Tests
run: cargo test --all

View File

@@ -47,3 +47,4 @@ jobs:
# See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
# or https://code.claude.com/docs/en/cli-reference for available options
# claude_args: '--allowed-tools Bash(gh pr:*)'

View File

@@ -1,59 +0,0 @@
name: Release API to NPM
on:
push:
tags: [yaak-api-*]
workflow_dispatch:
inputs:
version:
description: API version to publish (for example 0.9.0 or v0.9.0)
required: true
type: string
permissions:
contents: read
jobs:
publish-npm:
name: Publish @yaakapp/api
runs-on: ubuntu-latest
permissions:
contents: read
id-token: write
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Node
uses: actions/setup-node@v4
with:
node-version: lts/*
registry-url: https://registry.npmjs.org
- name: Install dependencies
run: npm ci
- name: Set @yaakapp/api version
shell: bash
env:
WORKFLOW_VERSION: ${{ inputs.version }}
run: |
set -euo pipefail
if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
VERSION="$WORKFLOW_VERSION"
else
VERSION="${GITHUB_REF_NAME#yaak-api-}"
fi
VERSION="${VERSION#v}"
echo "Preparing @yaakapp/api version: $VERSION"
cd packages/plugin-runtime-types
npm version "$VERSION" --no-git-tag-version --allow-same-version
- name: Build @yaakapp/api
working-directory: packages/plugin-runtime-types
run: npm run build
- name: Publish @yaakapp/api
working-directory: packages/plugin-runtime-types
run: npm publish --provenance --access public

View File

@@ -1,218 +0,0 @@
name: Release CLI to NPM
on:
push:
tags: [yaak-cli-*]
workflow_dispatch:
inputs:
version:
description: CLI version to publish (for example 0.4.0 or v0.4.0)
required: true
type: string
permissions:
contents: read
jobs:
prepare-vendored-assets:
name: Prepare vendored plugin assets
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Node
uses: actions/setup-node@v4
with:
node-version: lts/*
- name: Install Rust stable
uses: dtolnay/rust-toolchain@stable
- name: Install dependencies
run: npm ci
- name: Build plugin assets
env:
SKIP_WASM_BUILD: "1"
run: |
npm run build
npm run vendor:vendor-plugins
- name: Upload vendored assets
uses: actions/upload-artifact@v4
with:
name: vendored-assets
path: |
crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs
crates-tauri/yaak-app/vendored/plugins
if-no-files-found: error
build-binaries:
name: Build ${{ matrix.pkg }}
needs: prepare-vendored-assets
runs-on: ${{ matrix.runner }}
strategy:
fail-fast: false
matrix:
include:
- pkg: cli-darwin-arm64
runner: macos-latest
target: aarch64-apple-darwin
binary: yaak
- pkg: cli-darwin-x64
runner: macos-latest
target: x86_64-apple-darwin
binary: yaak
- pkg: cli-linux-arm64
runner: ubuntu-22.04-arm
target: aarch64-unknown-linux-gnu
binary: yaak
- pkg: cli-linux-x64
runner: ubuntu-22.04
target: x86_64-unknown-linux-gnu
binary: yaak
- pkg: cli-win32-arm64
runner: windows-latest
target: aarch64-pc-windows-msvc
binary: yaak.exe
- pkg: cli-win32-x64
runner: windows-latest
target: x86_64-pc-windows-msvc
binary: yaak.exe
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install Rust stable
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.target }}
- name: Restore Rust cache
uses: Swatinem/rust-cache@v2
with:
shared-key: release-cli-npm
cache-on-failure: true
- name: Install Linux build dependencies
if: startsWith(matrix.runner, 'ubuntu')
run: |
sudo apt-get update
sudo apt-get install -y pkg-config libdbus-1-dev
- name: Download vendored assets
uses: actions/download-artifact@v4
with:
name: vendored-assets
path: crates-tauri/yaak-app/vendored
- name: Set CLI build version
shell: bash
env:
WORKFLOW_VERSION: ${{ inputs.version }}
run: |
set -euo pipefail
if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
VERSION="$WORKFLOW_VERSION"
else
VERSION="${GITHUB_REF_NAME#yaak-cli-}"
fi
VERSION="${VERSION#v}"
echo "Building yaak version: $VERSION"
echo "YAAK_CLI_VERSION=$VERSION" >> "$GITHUB_ENV"
- name: Build yaak
run: cargo build --locked --release -p yaak-cli --bin yaak --target ${{ matrix.target }}
- name: Stage binary artifact
shell: bash
run: |
set -euo pipefail
mkdir -p "npm/dist/${{ matrix.pkg }}"
cp "target/${{ matrix.target }}/release/${{ matrix.binary }}" "npm/dist/${{ matrix.pkg }}/${{ matrix.binary }}"
- name: Upload binary artifact
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.pkg }}
path: npm/dist/${{ matrix.pkg }}/${{ matrix.binary }}
if-no-files-found: error
publish-npm:
name: Publish @yaakapp/cli packages
needs: build-binaries
runs-on: ubuntu-latest
permissions:
contents: read
id-token: write
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Node
uses: actions/setup-node@v4
with:
node-version: lts/*
registry-url: https://registry.npmjs.org
- name: Download binary artifacts
uses: actions/download-artifact@v4
with:
pattern: cli-*
path: npm/dist
merge-multiple: false
- name: Prepare npm packages
shell: bash
env:
WORKFLOW_VERSION: ${{ inputs.version }}
run: |
set -euo pipefail
if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
VERSION="$WORKFLOW_VERSION"
else
VERSION="${GITHUB_REF_NAME#yaak-cli-}"
fi
VERSION="${VERSION#v}"
if [[ "$VERSION" == *-* ]]; then
PRERELEASE="${VERSION#*-}"
NPM_TAG="${PRERELEASE%%.*}"
else
NPM_TAG="latest"
fi
echo "Preparing CLI npm packages for version: $VERSION"
echo "Publishing with npm dist-tag: $NPM_TAG"
echo "NPM_TAG=$NPM_TAG" >> "$GITHUB_ENV"
YAAK_CLI_VERSION="$VERSION" node npm/prepare-publish.js
- name: Publish @yaakapp/cli-darwin-arm64
run: npm publish --provenance --access public --tag "$NPM_TAG"
working-directory: npm/cli-darwin-arm64
- name: Publish @yaakapp/cli-darwin-x64
run: npm publish --provenance --access public --tag "$NPM_TAG"
working-directory: npm/cli-darwin-x64
- name: Publish @yaakapp/cli-linux-arm64
run: npm publish --provenance --access public --tag "$NPM_TAG"
working-directory: npm/cli-linux-arm64
- name: Publish @yaakapp/cli-linux-x64
run: npm publish --provenance --access public --tag "$NPM_TAG"
working-directory: npm/cli-linux-x64
- name: Publish @yaakapp/cli-win32-arm64
run: npm publish --provenance --access public --tag "$NPM_TAG"
working-directory: npm/cli-win32-arm64
- name: Publish @yaakapp/cli-win32-x64
run: npm publish --provenance --access public --tag "$NPM_TAG"
working-directory: npm/cli-win32-x64
- name: Publish @yaakapp/cli
run: npm publish --provenance --access public --tag "$NPM_TAG"
working-directory: npm/cli
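The version and dist-tag derivation used by the workflow shown above (removed in this diff) can be sketched standalone. The `version_and_tag` function name is invented; the parameter-expansion logic mirrors the workflow's shell steps:

```shell
# Derive the npm version and dist-tag from a CLI release tag, mirroring
# the prefix-stripping and prerelease handling in the workflow above.
version_and_tag() {
  ref="$1"                      # e.g. "yaak-cli-v0.4.0-beta.1"
  version="${ref#yaak-cli-}"    # strip the tag prefix
  version="${version#v}"        # strip an optional leading "v"
  case "$version" in
    *-*)                        # prerelease: dist-tag is its identifier
      prerelease="${version#*-}"
      npm_tag="${prerelease%%.*}" ;;
    *)
      npm_tag="latest" ;;
  esac
  echo "$version $npm_tag"
}

version_and_tag "yaak-cli-v0.4.0-beta.1"  # → 0.4.0-beta.1 beta
```

A stable tag like `yaak-cli-v0.4.0` therefore publishes as `0.4.0` under the `latest` dist-tag, while prerelease tags publish under their prerelease identifier (e.g. `beta`).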

View File

@@ -1,4 +1,4 @@
- name: Release App Artifacts
+ name: Generate Artifacts
on:
push:
tags: [v*]
@@ -50,11 +50,8 @@ jobs:
- name: Checkout yaakapp/app
uses: actions/checkout@v4
- - name: Setup Vite+
-   uses: voidzero-dev/setup-vp@v1
-   with:
-     node-version: "24"
-     cache: true
+ - name: Setup Node
+   uses: actions/setup-node@v4
- name: install Rust stable
uses: dtolnay/rust-toolchain@stable
@@ -90,15 +87,15 @@ jobs:
echo $dir >> $env:GITHUB_PATH
& $exe --version
- - run: vp install
+ - run: npm ci
- run: npm run bootstrap
env:
YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
- run: npm run lint
- name: Run JS Tests
- run: vp test
+ run: npm test
- name: Run Rust Tests
- run: cargo test --all --exclude yaak-cli
+ run: cargo test --all
- name: Set version
run: npm run replace-version

View File

@@ -16,23 +16,23 @@ jobs:
uses: JamesIves/github-sponsors-readme-action@v1
with:
token: ${{ secrets.SPONSORS_PAT }}
- file: "README.md"
+ file: 'README.md'
maximum: 1999
template: '<a href="https://github.com/{{{ login }}}"><img src="{{{ avatarUrl }}}" width="50px" alt="User avatar: {{{ login }}}" /></a>&nbsp;&nbsp;'
active-only: false
include-private: true
- marker: "sponsors-base"
+ marker: 'sponsors-base'
- name: Generate Sponsors
uses: JamesIves/github-sponsors-readme-action@v1
with:
token: ${{ secrets.SPONSORS_PAT }}
- file: "README.md"
+ file: 'README.md'
minimum: 2000
template: '<a href="https://github.com/{{{ login }}}"><img src="{{{ avatarUrl }}}" width="80px" alt="User avatar: {{{ login }}}" /></a>&nbsp;&nbsp;'
active-only: false
include-private: true
- marker: "sponsors-premium"
+ marker: 'sponsors-premium'
# ⚠️ Note: You can use any deployment step here to automatically push the README
# changes back to your branch.
@@ -41,4 +41,4 @@ jobs:
with:
branch: main
force: false
- folder: "."
+ folder: '.'

.gitignore (vendored, 3 changed lines)
View File

@@ -54,6 +54,3 @@ flatpak/node-sources.json
# Local Codex desktop env state
.codex/environments/environment.toml
# Claude Code local settings
.claude/settings.local.json

View File

@@ -1 +0,0 @@
24.14.0

.npmrc (2 changed lines)
View File

@@ -1,2 +0,0 @@
# vite-plugin-wasm has not yet declared Vite 8 in its peerDependencies
legacy-peer-deps=true

View File

@@ -1,2 +0,0 @@
**/bindings/**
crates/yaak-templates/pkg/**

View File

@@ -1 +0,0 @@
vp lint

View File

@@ -1,7 +1,3 @@
{
- "recommendations": [
-   "rust-lang.rust-analyzer",
-   "bradlc.vscode-tailwindcss",
-   "VoidZero.vite-plus-extension-pack"
- ]
+ "recommendations": ["biomejs.biome", "rust-lang.rust-analyzer", "bradlc.vscode-tailwindcss"]
}

View File

@@ -1,8 +1,6 @@
{
- "editor.defaultFormatter": "oxc.oxc-vscode",
+ "editor.defaultFormatter": "biomejs.biome",
"editor.formatOnSave": true,
- "editor.formatOnSaveMode": "file",
- "editor.codeActionsOnSave": {
-   "source.fixAll.oxc": "explicit"
- }
+ "biome.enabled": true,
+ "biome.lint.format.enable": true
}

View File

@@ -1,2 +0,0 @@
- Tag safety: app releases use `v*` tags and CLI releases use `yaak-cli-*` tags; always confirm which one is requested before retagging.
- Do not commit, push, or tag without explicit approval

Cargo.lock (generated, 2168 changed lines)

File diff suppressed because it is too large

View File

@@ -1,30 +1,30 @@
[workspace]
resolver = "2"
members = [
"crates/yaak",
# Shared crates (no Tauri dependency)
"crates/yaak-core",
"crates/yaak-common",
"crates/yaak-crypto",
"crates/yaak-git",
"crates/yaak-grpc",
"crates/yaak-http",
"crates/yaak-models",
"crates/yaak-plugins",
"crates/yaak-sse",
"crates/yaak-sync",
"crates/yaak-templates",
"crates/yaak-tls",
"crates/yaak-ws",
"crates/yaak-api",
# CLI crates
"crates-cli/yaak-cli",
# Tauri-specific crates
"crates-tauri/yaak-app",
"crates-tauri/yaak-fonts",
"crates-tauri/yaak-license",
"crates-tauri/yaak-mac-window",
"crates-tauri/yaak-tauri-utils",
]
[workspace.dependencies]

View File

@@ -1,26 +1,24 @@
# Developer Setup
Yaak is a combined Node.js and Rust monorepo. It is a [Tauri](https://tauri.app) project, so it
uses Rust and HTML/CSS/JS for the main application, but there is also a plugin system powered
by a Node.js sidecar that communicates with the app over gRPC.
Because of the moving parts, there are a few setup steps required before development can
begin.
## Prerequisites
Make sure you have the following tools installed:
- - [Node.js](https://nodejs.org/en/download/package-manager) (v24+)
+ - [Node.js](https://nodejs.org/en/download/package-manager)
- [Rust](https://www.rust-lang.org/tools/install)
- - [Vite+](https://vite.dev/guide/vite-plus) (`vp` CLI)
Check the installations with the following commands:
```shell
node -v
npm -v
- vp --version
rustc --version
```
@@ -47,12 +45,12 @@ npm start
## SQLite Migrations
New migrations can be created from the `src-tauri/` directory:
```shell
npm run migration
```
Rerun the app to apply the migrations.
_Note: For safety, development builds use a separate database location from production builds._
@@ -63,9 +61,9 @@ _Note: For safety, development builds use a separate database location from prod
lezer-generator components/core/Editor/<LANG>/<LANG>.grammar > components/core/Editor/<LANG>/<LANG>.ts
```
- ## Linting and Formatting
+ ## Linting & Formatting
- This repo uses [Vite+](https://vite.dev/guide/vite-plus) for linting (oxlint) and formatting (oxfmt).
+ This repo uses Biome for linting and formatting (replacing ESLint + Prettier).
- Lint the entire repo:
@@ -73,6 +71,12 @@ This repo uses [Vite+](https://vite.dev/guide/vite-plus) for linting (oxlint) an
npm run lint
```
+ - Auto-fix lint issues where possible:
+ ```sh
+ npm run lint:fix
+ ```
- Format code:
```sh
@@ -80,7 +84,5 @@ npm run format
```
Notes:
- Many workspace packages also expose the same scripts (`lint`, `lint:fix`, and `format`).
- - A pre-commit hook runs `vp lint` automatically on commit.
+ - TypeScript type-checking still runs separately via `tsc --noEmit` in relevant packages.
- Some workspace packages also run `tsc --noEmit` for type-checking.
- VS Code users should install the recommended extensions for format-on-save support.
@@ -16,19 +16,23 @@
</p>
<br>
<p align="center">
<!-- sponsors-premium --><a href="https://github.com/MVST-Solutions"><img src="https:&#x2F;&#x2F;github.com&#x2F;MVST-Solutions.png" width="80px" alt="User avatar: MVST-Solutions" /></a>&nbsp;&nbsp;<a href="https://github.com/dharsanb"><img src="https:&#x2F;&#x2F;github.com&#x2F;dharsanb.png" width="80px" alt="User avatar: dharsanb" /></a>&nbsp;&nbsp;<a href="https://github.com/railwayapp"><img src="https:&#x2F;&#x2F;github.com&#x2F;railwayapp.png" width="80px" alt="User avatar: railwayapp" /></a>&nbsp;&nbsp;<a href="https://github.com/caseyamcl"><img src="https:&#x2F;&#x2F;github.com&#x2F;caseyamcl.png" width="80px" alt="User avatar: caseyamcl" /></a>&nbsp;&nbsp;<a href="https://github.com/bytebase"><img src="https:&#x2F;&#x2F;github.com&#x2F;bytebase.png" width="80px" alt="User avatar: bytebase" /></a>&nbsp;&nbsp;<a href="https://github.com/"><img src="https:&#x2F;&#x2F;raw.githubusercontent.com&#x2F;JamesIves&#x2F;github-sponsors-readme-action&#x2F;dev&#x2F;.github&#x2F;assets&#x2F;placeholder.png" width="80px" alt="User avatar: " /></a>&nbsp;&nbsp;<!-- sponsors-premium --> <!-- sponsors-premium --><a href="https://github.com/MVST-Solutions"><img src="https:&#x2F;&#x2F;github.com&#x2F;MVST-Solutions.png" width="80px" alt="User avatar: MVST-Solutions" /></a>&nbsp;&nbsp;<a href="https://github.com/dharsanb"><img src="https:&#x2F;&#x2F;github.com&#x2F;dharsanb.png" width="80px" alt="User avatar: dharsanb" /></a>&nbsp;&nbsp;<a href="https://github.com/railwayapp"><img src="https:&#x2F;&#x2F;github.com&#x2F;railwayapp.png" width="80px" alt="User avatar: railwayapp" /></a>&nbsp;&nbsp;<a href="https://github.com/caseyamcl"><img src="https:&#x2F;&#x2F;github.com&#x2F;caseyamcl.png" width="80px" alt="User avatar: caseyamcl" /></a>&nbsp;&nbsp;<a href="https://github.com/bytebase"><img src="https:&#x2F;&#x2F;github.com&#x2F;bytebase.png" width="80px" alt="User avatar: bytebase" /></a>&nbsp;&nbsp;<a href="https://github.com/"><img 
src="https:&#x2F;&#x2F;raw.githubusercontent.com&#x2F;JamesIves&#x2F;github-sponsors-readme-action&#x2F;dev&#x2F;.github&#x2F;assets&#x2F;placeholder.png" width="80px" alt="User avatar: " /></a>&nbsp;&nbsp;<!-- sponsors-premium -->
</p>
<p align="center">
<!-- sponsors-base --><a href="https://github.com/seanwash"><img src="https:&#x2F;&#x2F;github.com&#x2F;seanwash.png" width="50px" alt="User avatar: seanwash" /></a>&nbsp;&nbsp;<a href="https://github.com/jerath"><img src="https:&#x2F;&#x2F;github.com&#x2F;jerath.png" width="50px" alt="User avatar: jerath" /></a>&nbsp;&nbsp;<a href="https://github.com/itsa-sh"><img src="https:&#x2F;&#x2F;github.com&#x2F;itsa-sh.png" width="50px" alt="User avatar: itsa-sh" /></a>&nbsp;&nbsp;<a href="https://github.com/dmmulroy"><img src="https:&#x2F;&#x2F;github.com&#x2F;dmmulroy.png" width="50px" alt="User avatar: dmmulroy" /></a>&nbsp;&nbsp;<a href="https://github.com/timcole"><img src="https:&#x2F;&#x2F;github.com&#x2F;timcole.png" width="50px" alt="User avatar: timcole" /></a>&nbsp;&nbsp;<a href="https://github.com/VLZH"><img src="https:&#x2F;&#x2F;github.com&#x2F;VLZH.png" width="50px" alt="User avatar: VLZH" /></a>&nbsp;&nbsp;<a href="https://github.com/terasaka2k"><img src="https:&#x2F;&#x2F;github.com&#x2F;terasaka2k.png" width="50px" alt="User avatar: terasaka2k" /></a>&nbsp;&nbsp;<a href="https://github.com/andriyor"><img src="https:&#x2F;&#x2F;github.com&#x2F;andriyor.png" width="50px" alt="User avatar: andriyor" /></a>&nbsp;&nbsp;<a href="https://github.com/majudhu"><img src="https:&#x2F;&#x2F;github.com&#x2F;majudhu.png" width="50px" alt="User avatar: majudhu" /></a>&nbsp;&nbsp;<a href="https://github.com/axelrindle"><img src="https:&#x2F;&#x2F;github.com&#x2F;axelrindle.png" width="50px" alt="User avatar: axelrindle" /></a>&nbsp;&nbsp;<a href="https://github.com/jirizverina"><img src="https:&#x2F;&#x2F;github.com&#x2F;jirizverina.png" width="50px" alt="User avatar: jirizverina" /></a>&nbsp;&nbsp;<a href="https://github.com/chip-well"><img src="https:&#x2F;&#x2F;github.com&#x2F;chip-well.png" width="50px" alt="User avatar: chip-well" /></a>&nbsp;&nbsp;<a href="https://github.com/GRAYAH"><img src="https:&#x2F;&#x2F;github.com&#x2F;GRAYAH.png" width="50px" alt="User 
avatar: GRAYAH" /></a>&nbsp;&nbsp;<a href="https://github.com/flashblaze"><img src="https:&#x2F;&#x2F;github.com&#x2F;flashblaze.png" width="50px" alt="User avatar: flashblaze" /></a>&nbsp;&nbsp;<a href="https://github.com/Frostist"><img src="https:&#x2F;&#x2F;github.com&#x2F;Frostist.png" width="50px" alt="User avatar: Frostist" /></a>&nbsp;&nbsp;<!-- sponsors-base --> <!-- sponsors-base --><a href="https://github.com/seanwash"><img src="https:&#x2F;&#x2F;github.com&#x2F;seanwash.png" width="50px" alt="User avatar: seanwash" /></a>&nbsp;&nbsp;<a href="https://github.com/jerath"><img src="https:&#x2F;&#x2F;github.com&#x2F;jerath.png" width="50px" alt="User avatar: jerath" /></a>&nbsp;&nbsp;<a href="https://github.com/itsa-sh"><img src="https:&#x2F;&#x2F;github.com&#x2F;itsa-sh.png" width="50px" alt="User avatar: itsa-sh" /></a>&nbsp;&nbsp;<a href="https://github.com/dmmulroy"><img src="https:&#x2F;&#x2F;github.com&#x2F;dmmulroy.png" width="50px" alt="User avatar: dmmulroy" /></a>&nbsp;&nbsp;<a href="https://github.com/timcole"><img src="https:&#x2F;&#x2F;github.com&#x2F;timcole.png" width="50px" alt="User avatar: timcole" /></a>&nbsp;&nbsp;<a href="https://github.com/VLZH"><img src="https:&#x2F;&#x2F;github.com&#x2F;VLZH.png" width="50px" alt="User avatar: VLZH" /></a>&nbsp;&nbsp;<a href="https://github.com/terasaka2k"><img src="https:&#x2F;&#x2F;github.com&#x2F;terasaka2k.png" width="50px" alt="User avatar: terasaka2k" /></a>&nbsp;&nbsp;<a href="https://github.com/andriyor"><img src="https:&#x2F;&#x2F;github.com&#x2F;andriyor.png" width="50px" alt="User avatar: andriyor" /></a>&nbsp;&nbsp;<a href="https://github.com/majudhu"><img src="https:&#x2F;&#x2F;github.com&#x2F;majudhu.png" width="50px" alt="User avatar: majudhu" /></a>&nbsp;&nbsp;<a href="https://github.com/axelrindle"><img src="https:&#x2F;&#x2F;github.com&#x2F;axelrindle.png" width="50px" alt="User avatar: axelrindle" /></a>&nbsp;&nbsp;<a href="https://github.com/jirizverina"><img 
src="https:&#x2F;&#x2F;github.com&#x2F;jirizverina.png" width="50px" alt="User avatar: jirizverina" /></a>&nbsp;&nbsp;<a href="https://github.com/chip-well"><img src="https:&#x2F;&#x2F;github.com&#x2F;chip-well.png" width="50px" alt="User avatar: chip-well" /></a>&nbsp;&nbsp;<a href="https://github.com/GRAYAH"><img src="https:&#x2F;&#x2F;github.com&#x2F;GRAYAH.png" width="50px" alt="User avatar: GRAYAH" /></a>&nbsp;&nbsp;<a href="https://github.com/flashblaze"><img src="https:&#x2F;&#x2F;github.com&#x2F;flashblaze.png" width="50px" alt="User avatar: flashblaze" /></a>&nbsp;&nbsp;<!-- sponsors-base -->
</p>
![Yaak API Client](https://yaak.app/static/screenshot.png)
## Features
Yaak is an offline-first API client designed to stay out of your way while giving you everything you need when you need it.
Built with [Tauri](https://tauri.app), Rust, and React, it's fast, lightweight, and private. No telemetry, no VC funding, and no cloud lock-in.
### 🌐 Work with any API
@@ -37,23 +41,21 @@ Built with [Tauri](https://tauri.app), Rust, and React, its fast, lightweight
- Filter and inspect responses with JSONPath or XPath.
### 🔐 Stay secure
- Use OAuth 2.0, JWT, Basic Auth, or custom plugins for authentication.
- Secure sensitive values with encrypted secrets.
- Store secrets in your OS keychain.
### ☁️ Organize & collaborate
- Group requests into workspaces and nested folders.
- Use environment variables to switch between dev, staging, and prod.
- Mirror workspaces to your filesystem for versioning in Git or syncing with Dropbox.
### 🧩 Extend & customize
- Insert dynamic values like UUIDs or timestamps with template tags.
- Pick from built-in themes or build your own.
- Create plugins to extend authentication, template tags, or the UI.
## Contribution Policy
> [!IMPORTANT]
biome.json Normal file
@@ -0,0 +1,54 @@
{
"$schema": "https://biomejs.dev/schemas/2.3.11/schema.json",
"linter": {
"enabled": true,
"rules": {
"recommended": true,
"a11y": {
"useKeyWithClickEvents": "off"
}
}
},
"formatter": {
"enabled": true,
"indentStyle": "space",
"indentWidth": 2,
"lineWidth": 100,
"bracketSpacing": true
},
"css": {
"parser": {
"tailwindDirectives": true
},
"linter": {
"enabled": false
}
},
"javascript": {
"formatter": {
"quoteStyle": "single",
"jsxQuoteStyle": "double",
"trailingCommas": "all",
"semicolons": "always"
}
},
"files": {
"includes": [
"**",
"!**/node_modules",
"!**/dist",
"!**/build",
"!target",
"!scripts",
"!crates",
"!crates-tauri",
"!src-web/tailwind.config.cjs",
"!src-web/postcss.config.cjs",
"!src-web/vite.config.ts",
"!src-web/routeTree.gen.ts",
"!packages/plugin-runtime-types/lib",
"!**/bindings",
"!flatpak"
]
}
}
@@ -5,43 +5,20 @@ edition = "2024"
publish = false
[[bin]]
name = "yaak" name = "yaakcli"
path = "src/main.rs"
[dependencies]
arboard = "3"
base64 = "0.22"
clap = { version = "4", features = ["derive"] }
console = "0.15"
dirs = "6"
env_logger = "0.11"
futures = "0.3"
inquire = { version = "0.7", features = ["editor"] }
hex = { workspace = true }
include_dir = "0.7"
keyring = { workspace = true, features = ["apple-native", "windows-native", "sync-secret-service"] }
log = { workspace = true }
rand = "0.8"
reqwest = { workspace = true }
rolldown = "0.1.0"
oxc_resolver = "=11.10.0"
schemars = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
sha2 = { workspace = true } tokio = { workspace = true, features = ["rt-multi-thread", "macros"] }
tokio = { workspace = true, features = [
"rt-multi-thread",
"macros",
"io-util",
"net",
"signal",
"time",
] }
walkdir = "2"
webbrowser = "1"
zip = "4"
yaak = { workspace = true }
yaak-api = { workspace = true }
yaak-crypto = { workspace = true }
yaak-http = { workspace = true }
yaak-models = { workspace = true }
@@ -1,66 +1,87 @@
# Yaak CLI # yaak-cli
The `yaak` CLI for publishing plugins and creating/updating/sending requests. Command-line interface for Yaak.
## Installation ## Command Overview
```sh Current top-level commands:
npm install @yaakapp/cli
```
## Agentic Workflows
The `yaak` CLI is primarily meant to be used by AI agents, and has the following features:
- `schema` subcommands to get the JSON Schema for any model (eg. `yaak request schema http`)
- `--json '{...}'` input format to create and update data
- `--verbose` mode for extracting debug info while sending requests
- The ability to send entire workspaces and folders (Supports `--parallel` and `--fail-fast`)
### Example Prompts
Use the `yaak` CLI with agents like Claude or Codex to do useful things for you.
Here are some example prompts:
```text
Scan my API routes and create a workspace (using yaak cli) with yaakcli send <request_id>
all the requests needed for me to do manual testing? yaakcli workspace list
yaakcli workspace show <workspace_id>
yaakcli workspace create --name <name>
yaakcli workspace create --json '{"name":"My Workspace"}'
yaakcli workspace create '{"name":"My Workspace"}'
yaakcli workspace update --json '{"id":"wk_abc","description":"Updated"}'
yaakcli workspace delete <workspace_id> [--yes]
yaakcli request list <workspace_id>
yaakcli request show <request_id>
yaakcli request send <request_id>
yaakcli request create <workspace_id> --name <name> --url <url> [--method GET]
yaakcli request create --json '{"workspaceId":"wk_abc","name":"Users","url":"https://api.example.com/users"}'
yaakcli request create '{"workspaceId":"wk_abc","name":"Users","url":"https://api.example.com/users"}'
yaakcli request update --json '{"id":"rq_abc","name":"Users v2"}'
yaakcli request delete <request_id> [--yes]
yaakcli folder list <workspace_id>
yaakcli folder show <folder_id>
yaakcli folder create <workspace_id> --name <name>
yaakcli folder create --json '{"workspaceId":"wk_abc","name":"Auth"}'
yaakcli folder create '{"workspaceId":"wk_abc","name":"Auth"}'
yaakcli folder update --json '{"id":"fl_abc","name":"Auth v2"}'
yaakcli folder delete <folder_id> [--yes]
yaakcli environment list <workspace_id>
yaakcli environment show <environment_id>
yaakcli environment create <workspace_id> --name <name>
yaakcli environment create --json '{"workspaceId":"wk_abc","name":"Production"}'
yaakcli environment create '{"workspaceId":"wk_abc","name":"Production"}'
yaakcli environment update --json '{"id":"ev_abc","color":"#00ff00"}'
yaakcli environment delete <environment_id> [--yes]
```
```text Global options:
Send all the GraphQL requests in my workspace
- `--data-dir <path>`: use a custom data directory
- `-e, --environment <id>`: environment to use during request rendering/sending
- `-v, --verbose`: verbose logging and send output
Notes:
- `send` is currently a shortcut for sending an HTTP request ID.
- `delete` commands prompt for confirmation unless `--yes` is provided.
- In non-interactive mode, `delete` commands require `--yes`.
- `create` and `update` commands support `--json` and positional JSON shorthand.
- `update` uses JSON Merge Patch semantics (RFC 7386) for partial updates.
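The merge-patch semantics noted above can be sketched in Rust over a toy JSON type. This is a minimal illustration of RFC 7386, not the CLI's actual implementation (which operates on real JSON documents):

```rust
use std::collections::BTreeMap;

// Toy JSON value, just enough to demonstrate merge-patch behavior.
#[derive(Clone, Debug, PartialEq)]
enum Json {
    Null,
    Str(String),
    Obj(BTreeMap<String, Json>),
}

// RFC 7386: a null in the patch deletes a key, nested objects merge
// recursively, and any other value replaces the target outright.
fn merge_patch(target: Json, patch: Json) -> Json {
    match patch {
        Json::Obj(patch_members) => {
            let mut merged = match target {
                Json::Obj(members) => members,
                _ => BTreeMap::new(), // non-object targets are discarded
            };
            for (key, value) in patch_members {
                if let Json::Null = value {
                    merged.remove(&key); // null means "delete this key"
                } else {
                    let current = merged.remove(&key).unwrap_or(Json::Null);
                    merged.insert(key, merge_patch(current, value));
                }
            }
            Json::Obj(merged)
        }
        // A non-object patch replaces the target entirely.
        other => other,
    }
}
```

So a patch like `{"name":"Users v2","description":null}` renames the request and drops its description in one `update` call.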
## Examples
```bash
yaakcli workspace list
yaakcli workspace create --name "My Workspace"
yaakcli workspace show wk_abc
yaakcli workspace update --json '{"id":"wk_abc","description":"Team workspace"}'
yaakcli request list wk_abc
yaakcli request show rq_abc
yaakcli request create wk_abc --name "Users" --url "https://api.example.com/users"
yaakcli request update --json '{"id":"rq_abc","name":"Users v2"}'
yaakcli request send rq_abc -e ev_abc
yaakcli request delete rq_abc --yes
yaakcli folder create wk_abc --name "Auth"
yaakcli folder update --json '{"id":"fl_abc","name":"Auth v2"}'
yaakcli environment create wk_abc --name "Production"
yaakcli environment update --json '{"id":"ev_abc","color":"#00ff00"}'
```
## Description ## Roadmap
Here's the current output of `yaak --help` Planned command expansion (request schema and polymorphic send) is tracked in `PLAN.md`.
```text When command behavior changes, update this README and verify with:
Yaak CLI - API client from the command line
Usage: yaak [OPTIONS] <COMMAND> ```bash
cargo run -q -p yaak-cli -- --help
Commands: cargo run -q -p yaak-cli -- request --help
auth Authentication commands cargo run -q -p yaak-cli -- workspace --help
plugin Plugin development and publishing commands cargo run -q -p yaak-cli -- folder --help
send Send a request, folder, or workspace by ID cargo run -q -p yaak-cli -- environment --help
workspace Workspace commands
request Request commands
folder Folder commands
environment Environment commands
Options:
--data-dir <DATA_DIR> Use a custom data directory
-e, --environment <ENVIRONMENT> Environment ID to use for variable substitution
-v, --verbose Enable verbose send output (events and streamed response body)
--log [<LEVEL>] Enable CLI logging; optionally set level (error|warn|info|debug|trace) [possible values: error, warn, info, debug, trace]
-h, --help Print help
-V, --version Print version
Agent Hints:
- Template variable syntax is ${[ my_var ]}, not {{ ... }}
- Template function syntax is ${[ namespace.my_func(a='aaa',b='bbb') ]}
- View JSONSchema for models before creating or updating (eg. `yaak request schema http`)
- Deletion requires confirmation (--yes for non-interactive environments)
```
@@ -2,16 +2,8 @@ use clap::{Args, Parser, Subcommand, ValueEnum};
use std::path::PathBuf;
#[derive(Parser)]
#[command(name = "yaak")] #[command(name = "yaakcli")]
#[command(about = "Yaak CLI - API client from the command line")]
#[command(version = crate::version::cli_version())]
#[command(disable_help_subcommand = true)]
#[command(after_help = r#"Agent Hints:
- Template variable syntax is ${[ my_var ]}, not {{ ... }}
- Template function syntax is ${[ namespace.my_func(a='aaa',b='bbb') ]}
- View JSONSchema for models before creating or updating (eg. `yaak request schema http`)
- Deletion requires confirmation (--yes for non-interactive environments)
"#)]
pub struct Cli {
/// Use a custom data directory
#[arg(long, global = true)]
@@ -21,50 +13,19 @@ pub struct Cli {
#[arg(long, short, global = true)]
pub environment: Option<String>,
/// Cookie jar ID to use when sending requests /// Enable verbose logging
#[arg(long = "cookie-jar", global = true, value_name = "COOKIE_JAR_ID")]
pub cookie_jar: Option<String>,
/// Enable verbose send output (events and streamed response body)
#[arg(long, short, global = true)]
pub verbose: bool,
/// Enable CLI logging; optionally set level (error|warn|info|debug|trace)
#[arg(long, global = true, value_name = "LEVEL", num_args = 0..=1, ignore_case = true)]
pub log: Option<Option<LogLevel>>,
#[command(subcommand)]
pub command: Commands,
}
#[derive(Subcommand)]
pub enum Commands {
/// Authentication commands
Auth(AuthArgs),
/// Plugin development and publishing commands
Plugin(PluginArgs),
#[command(hide = true)]
Build(PluginPathArg),
#[command(hide = true)]
Dev(PluginPathArg),
/// Backward-compatible alias for `plugin generate`
#[command(hide = true)]
Generate(GenerateArgs),
/// Backward-compatible alias for `plugin publish`
#[command(hide = true)]
Publish(PluginPathArg),
/// Send a request, folder, or workspace by ID
Send(SendArgs),
/// Cookie jar commands
CookieJar(CookieJarArgs),
/// Workspace commands
Workspace(WorkspaceArgs),
@@ -83,8 +44,12 @@ pub struct SendArgs {
/// Request, folder, or workspace ID
pub id: String,
/// Execute requests sequentially (default)
#[arg(long, conflicts_with = "parallel")]
pub sequential: bool,
/// Execute requests in parallel
#[arg(long)] #[arg(long, conflicts_with = "sequential")]
pub parallel: bool,
/// Stop on first request failure when sending folders/workspaces
@@ -93,23 +58,6 @@ pub struct SendArgs {
}
#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct CookieJarArgs {
#[command(subcommand)]
pub command: CookieJarCommands,
}
#[derive(Subcommand)]
pub enum CookieJarCommands {
/// List cookie jars in a workspace
List {
/// Workspace ID (optional when exactly one workspace exists)
workspace_id: Option<String>,
},
}
#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct WorkspaceArgs {
#[command(subcommand)]
pub command: WorkspaceCommands,
@@ -120,13 +68,6 @@ pub enum WorkspaceCommands {
/// List all workspaces
List,
/// Output JSON schema for workspace create/update payloads
Schema {
/// Pretty-print schema JSON output
#[arg(long)]
pretty: bool,
},
/// Show a workspace as JSON
Show {
/// Workspace ID
@@ -171,7 +112,6 @@ pub enum WorkspaceCommands {
}
#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct RequestArgs {
#[command(subcommand)]
pub command: RequestCommands,
@@ -181,8 +121,8 @@ pub struct RequestArgs {
pub enum RequestCommands {
/// List requests in a workspace
List {
/// Workspace ID (optional when exactly one workspace exists) /// Workspace ID
workspace_id: Option<String>, workspace_id: String,
},
/// Show a request as JSON
@@ -201,10 +141,6 @@ pub enum RequestCommands {
Schema {
#[arg(value_enum)]
request_type: RequestSchemaType,
/// Pretty-print schema JSON output
#[arg(long)]
pretty: bool,
},
/// Create a new HTTP request
@@ -258,29 +194,7 @@ pub enum RequestSchemaType {
Websocket,
}
#[derive(Clone, Copy, Debug, ValueEnum)]
pub enum LogLevel {
Error,
Warn,
Info,
Debug,
Trace,
}
impl LogLevel {
pub fn as_filter(self) -> log::LevelFilter {
match self {
LogLevel::Error => log::LevelFilter::Error,
LogLevel::Warn => log::LevelFilter::Warn,
LogLevel::Info => log::LevelFilter::Info,
LogLevel::Debug => log::LevelFilter::Debug,
LogLevel::Trace => log::LevelFilter::Trace,
}
}
}
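The removed `--log` flag used clap's optional-value pattern (`num_args = 0..=1`), which surfaces in Rust as `Option<Option<LogLevel>>`. A minimal sketch of how such a value might be resolved (the `Info` default here is an assumption for illustration, not taken from the source):

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
enum LogLevel {
    Error,
    Warn,
    Info,
    Debug,
    Trace,
}

// None          => flag absent, logging stays off
// Some(None)    => bare `--log`, fall back to an assumed default
// Some(Some(l)) => explicit `--log <level>`
fn resolve_log(flag: Option<Option<LogLevel>>) -> Option<LogLevel> {
    match flag {
        None => None,
        Some(None) => Some(LogLevel::Info), // assumed default level
        Some(explicit) => explicit,
    }
}
```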
#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct FolderArgs {
#[command(subcommand)]
pub command: FolderCommands,
@@ -290,8 +204,8 @@ pub struct FolderArgs {
pub enum FolderCommands {
/// List folders in a workspace
List {
/// Workspace ID (optional when exactly one workspace exists) /// Workspace ID
workspace_id: Option<String>, workspace_id: String,
},
/// Show a folder as JSON
@@ -337,7 +251,6 @@ pub enum FolderCommands {
}
#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct EnvironmentArgs {
#[command(subcommand)]
pub command: EnvironmentCommands,
@@ -347,15 +260,8 @@ pub struct EnvironmentArgs {
pub enum EnvironmentCommands {
/// List environments in a workspace
List {
/// Workspace ID (optional when exactly one workspace exists) /// Workspace ID
workspace_id: Option<String>, workspace_id: String,
},
/// Output JSON schema for environment create/update payloads
Schema {
/// Pretty-print schema JSON output
#[arg(long)]
pretty: bool,
},
/// Show an environment as JSON
@@ -365,22 +271,15 @@ pub enum EnvironmentCommands {
},
/// Create an environment
#[command(after_help = r#"Modes (choose one):
1) yaak environment create <workspace_id> --name <name>
2) yaak environment create --json '{"workspaceId":"wk_abc","name":"Production"}'
3) yaak environment create '{"workspaceId":"wk_abc","name":"Production"}'
4) yaak environment create <workspace_id> --json '{"name":"Production"}'
"#)]
Create {
/// Workspace ID for flag-based mode, or positional JSON payload shorthand /// Workspace ID (or positional JSON payload shorthand)
#[arg(value_name = "WORKSPACE_ID_OR_JSON")]
workspace_id: Option<String>,
/// Environment name
#[arg(short, long)]
name: Option<String>,
/// JSON payload (use instead of WORKSPACE_ID/--name) /// JSON payload
#[arg(long)]
json: Option<String>,
},
@@ -406,70 +305,3 @@ pub enum EnvironmentCommands {
yes: bool,
},
}
#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct AuthArgs {
#[command(subcommand)]
pub command: AuthCommands,
}
#[derive(Subcommand)]
pub enum AuthCommands {
/// Login to Yaak via web browser
Login,
/// Sign out of the Yaak CLI
Logout,
/// Print the current logged-in user's info
Whoami,
}
#[derive(Args)]
#[command(disable_help_subcommand = true)]
pub struct PluginArgs {
#[command(subcommand)]
pub command: PluginCommands,
}
#[derive(Subcommand)]
pub enum PluginCommands {
/// Transpile code into a runnable plugin bundle
Build(PluginPathArg),
/// Build plugin bundle continuously when the filesystem changes
Dev(PluginPathArg),
/// Generate a "Hello World" Yaak plugin
Generate(GenerateArgs),
/// Install a plugin from a local directory or from the registry
Install(InstallPluginArgs),
/// Publish a Yaak plugin version to the plugin registry
Publish(PluginPathArg),
}
#[derive(Args, Clone)]
pub struct PluginPathArg {
/// Path to plugin directory (defaults to current working directory)
pub path: Option<PathBuf>,
}
#[derive(Args, Clone)]
pub struct GenerateArgs {
/// Plugin name (defaults to a generated name in interactive mode)
#[arg(long)]
pub name: Option<String>,
/// Output directory for the generated plugin (defaults to ./<name> in interactive mode)
#[arg(long)]
pub dir: Option<PathBuf>,
}
#[derive(Args, Clone)]
pub struct InstallPluginArgs {
/// Local plugin directory path, or registry plugin spec (@org/plugin[@version])
pub source: String,
}
@@ -1,528 +0,0 @@
use crate::cli::{AuthArgs, AuthCommands};
use crate::ui;
use crate::utils::http;
use base64::Engine as _;
use keyring::Entry;
use rand::RngCore;
use rand::rngs::OsRng;
use reqwest::Url;
use serde_json::Value;
use sha2::{Digest, Sha256};
use std::io::{self, IsTerminal, Write};
use std::time::Duration;
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tokio::net::{TcpListener, TcpStream};
const OAUTH_CLIENT_ID: &str = "a1fe44800c2d7e803cad1b4bf07a291c";
const KEYRING_USER: &str = "yaak";
const AUTH_TIMEOUT: Duration = Duration::from_secs(300);
const MAX_REQUEST_BYTES: usize = 16 * 1024;
type CommandResult<T = ()> = std::result::Result<T, String>;
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
enum Environment {
Production,
Staging,
Development,
}
impl Environment {
fn app_base_url(self) -> &'static str {
match self {
Environment::Production => "https://yaak.app",
Environment::Staging => "https://todo.yaak.app",
Environment::Development => "http://localhost:9444",
}
}
fn api_base_url(self) -> &'static str {
match self {
Environment::Production => "https://api.yaak.app",
Environment::Staging => "https://todo.yaak.app",
Environment::Development => "http://localhost:9444",
}
}
fn keyring_service(self) -> &'static str {
match self {
Environment::Production => "app.yaak.cli.Token",
Environment::Staging => "app.yaak.cli.staging.Token",
Environment::Development => "app.yaak.cli.dev.Token",
}
}
}
struct OAuthFlow {
app_base_url: String,
auth_url: Url,
token_url: String,
redirect_url: String,
state: String,
code_verifier: String,
}
pub async fn run(args: AuthArgs) -> i32 {
let result = match args.command {
AuthCommands::Login => login().await,
AuthCommands::Logout => logout(),
AuthCommands::Whoami => whoami().await,
};
match result {
Ok(()) => 0,
Err(error) => {
ui::error(&error);
1
}
}
}
async fn login() -> CommandResult {
let environment = current_environment();
let listener = TcpListener::bind("127.0.0.1:0")
.await
.map_err(|e| format!("Failed to start OAuth callback server: {e}"))?;
let port = listener
.local_addr()
.map_err(|e| format!("Failed to determine callback server port: {e}"))?
.port();
let oauth = build_oauth_flow(environment, port)?;
ui::info(&format!("Initiating login to {}", oauth.auth_url));
if !confirm_open_browser()? {
ui::info("Login canceled");
return Ok(());
}
if let Err(err) = webbrowser::open(oauth.auth_url.as_ref()) {
ui::warning(&format!("Failed to open browser: {err}"));
ui::info(&format!("Open this URL manually:\n{}", oauth.auth_url));
}
ui::info("Waiting for authentication...");
let code = tokio::select! {
result = receive_oauth_code(listener, &oauth.state, &oauth.app_base_url) => result?,
_ = tokio::signal::ctrl_c() => {
return Err("Interrupted by user".to_string());
}
_ = tokio::time::sleep(AUTH_TIMEOUT) => {
return Err("Timeout waiting for authentication".to_string());
}
};
let token = exchange_access_token(&oauth, &code).await?;
store_auth_token(environment, &token)?;
ui::success("Authentication successful!");
Ok(())
}
fn logout() -> CommandResult {
delete_auth_token(current_environment())?;
ui::success("Signed out of Yaak");
Ok(())
}
async fn whoami() -> CommandResult {
let environment = current_environment();
let token = match get_auth_token(environment)? {
Some(token) => token,
None => {
ui::warning("Not logged in");
ui::info("Please run `yaak auth login`");
return Ok(());
}
};
let url = format!("{}/api/v1/whoami", environment.api_base_url());
let response = http::build_client(Some(&token))?
.get(url)
.send()
.await
.map_err(|e| format!("Failed to call whoami endpoint: {e}"))?;
let status = response.status();
let body =
response.text().await.map_err(|e| format!("Failed to read whoami response body: {e}"))?;
if !status.is_success() {
if status.as_u16() == 401 {
let _ = delete_auth_token(environment);
return Err(
"Unauthorized to access CLI. Run `yaak auth login` to refresh credentials."
.to_string(),
);
}
return Err(http::parse_api_error(status.as_u16(), &body));
}
println!("{body}");
Ok(())
}
fn current_environment() -> Environment {
let value = std::env::var("ENVIRONMENT").ok();
parse_environment(value.as_deref())
}
fn parse_environment(value: Option<&str>) -> Environment {
match value {
Some("staging") => Environment::Staging,
Some("development") => Environment::Development,
_ => Environment::Production,
}
}
fn build_oauth_flow(environment: Environment, callback_port: u16) -> CommandResult<OAuthFlow> {
let code_verifier = random_hex(32);
let state = random_hex(24);
let redirect_url = format!("http://127.0.0.1:{callback_port}/oauth/callback");
let code_challenge = base64::engine::general_purpose::URL_SAFE_NO_PAD
.encode(Sha256::digest(code_verifier.as_bytes()));
let mut auth_url = Url::parse(&format!("{}/login/oauth/authorize", environment.app_base_url()))
.map_err(|e| format!("Failed to build OAuth authorize URL: {e}"))?;
auth_url
.query_pairs_mut()
.append_pair("response_type", "code")
.append_pair("client_id", OAUTH_CLIENT_ID)
.append_pair("redirect_uri", &redirect_url)
.append_pair("state", &state)
.append_pair("code_challenge_method", "S256")
.append_pair("code_challenge", &code_challenge);
Ok(OAuthFlow {
app_base_url: environment.app_base_url().to_string(),
auth_url,
token_url: format!("{}/login/oauth/access_token", environment.app_base_url()),
redirect_url,
state,
code_verifier,
})
}
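The challenge built above is the URL-safe, unpadded base64 of the SHA-256 digest of the verifier (the S256 method from RFC 7636). The hashing comes from the `sha2` crate; the base64url step alone can be sketched with the standard library only:

```rust
// URL-safe base64 without padding, as required for PKCE code challenges.
// Each 3-byte group becomes four 6-bit symbols; short tail groups emit
// only the symbols they cover instead of '=' padding.
fn base64url_nopad(bytes: &[u8]) -> String {
    const ALPHABET: &[u8] =
        b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_";
    let mut out = String::new();
    for chunk in bytes.chunks(3) {
        let b = [chunk[0], *chunk.get(1).unwrap_or(&0), *chunk.get(2).unwrap_or(&0)];
        let n = ((b[0] as u32) << 16) | ((b[1] as u32) << 8) | b[2] as u32;
        let symbols = [(n >> 18) & 63, (n >> 12) & 63, (n >> 6) & 63, n & 63];
        let keep = match chunk.len() {
            1 => 2, // one input byte covers two symbols
            2 => 3, // two input bytes cover three symbols
            _ => 4,
        };
        for &s in &symbols[..keep] {
            out.push(ALPHABET[s as usize] as char);
        }
    }
    out
}
```

For example, `base64url_nopad(b"hi")` yields `aGk` (standard base64 `aGk=` without the padding), and high bytes map to the `-`/`_` alphabet instead of `+`/`/`.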
async fn receive_oauth_code(
listener: TcpListener,
expected_state: &str,
app_base_url: &str,
) -> CommandResult<String> {
loop {
let (mut stream, _) = listener
.accept()
.await
.map_err(|e| format!("OAuth callback server accept error: {e}"))?;
match parse_callback_request(&mut stream).await {
Ok((state, code)) => {
if state != expected_state {
let _ = write_bad_request(&mut stream, "Invalid OAuth state").await;
continue;
}
let success_redirect = format!("{app_base_url}/login/oauth/success");
write_redirect(&mut stream, &success_redirect)
.await
.map_err(|e| format!("Failed responding to OAuth callback: {e}"))?;
return Ok(code);
}
Err(error) => {
let _ = write_bad_request(&mut stream, &error).await;
if error.starts_with("OAuth provider returned error:") {
return Err(error);
}
}
}
}
}
async fn parse_callback_request(stream: &mut TcpStream) -> CommandResult<(String, String)> {
let target = read_http_target(stream).await?;
if !target.starts_with("/oauth/callback") {
return Err("Expected /oauth/callback path".to_string());
}
let url = Url::parse(&format!("http://127.0.0.1{target}"))
.map_err(|e| format!("Failed to parse callback URL: {e}"))?;
let mut state: Option<String> = None;
let mut code: Option<String> = None;
let mut oauth_error: Option<String> = None;
let mut oauth_error_description: Option<String> = None;
for (k, v) in url.query_pairs() {
if k == "state" {
state = Some(v.into_owned());
} else if k == "code" {
code = Some(v.into_owned());
} else if k == "error" {
oauth_error = Some(v.into_owned());
} else if k == "error_description" {
oauth_error_description = Some(v.into_owned());
}
}
if let Some(error) = oauth_error {
let mut message = format!("OAuth provider returned error: {error}");
if let Some(description) = oauth_error_description.filter(|d| !d.is_empty()) {
message.push_str(&format!(" ({description})"));
}
return Err(message);
}
let state = state.ok_or_else(|| "Missing 'state' query parameter".to_string())?;
let code = code.ok_or_else(|| "Missing 'code' query parameter".to_string())?;
if code.is_empty() {
return Err("Missing 'code' query parameter".to_string());
}
Ok((state, code))
}
async fn read_http_target(stream: &mut TcpStream) -> CommandResult<String> {
let mut buf = vec![0_u8; MAX_REQUEST_BYTES];
let mut total_read = 0_usize;
loop {
let n = stream
.read(&mut buf[total_read..])
.await
.map_err(|e| format!("Failed reading callback request: {e}"))?;
if n == 0 {
break;
}
total_read += n;
if buf[..total_read].windows(4).any(|w| w == b"\r\n\r\n") {
break;
}
if total_read == MAX_REQUEST_BYTES {
return Err("OAuth callback request too large".to_string());
}
}
let req = String::from_utf8_lossy(&buf[..total_read]);
let request_line =
req.lines().next().ok_or_else(|| "Invalid callback request line".to_string())?;
let mut parts = request_line.split_whitespace();
let method = parts.next().unwrap_or_default();
let target = parts.next().unwrap_or_default();
if method != "GET" {
return Err(format!("Expected GET callback request, got '{method}'"));
}
if target.is_empty() {
return Err("Missing callback request target".to_string());
}
Ok(target.to_string())
}
async fn write_bad_request(stream: &mut TcpStream, message: &str) -> std::io::Result<()> {
let body = format!("Failed to authenticate: {message}");
let response = format!(
"HTTP/1.1 400 Bad Request\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: {}\r\nConnection: close\r\n\r\n{}",
body.len(),
body
);
stream.write_all(response.as_bytes()).await?;
stream.shutdown().await
}
async fn write_redirect(stream: &mut TcpStream, location: &str) -> std::io::Result<()> {
let response = format!(
"HTTP/1.1 302 Found\r\nLocation: {location}\r\nContent-Length: 0\r\nConnection: close\r\n\r\n"
);
stream.write_all(response.as_bytes()).await?;
stream.shutdown().await
}
async fn exchange_access_token(oauth: &OAuthFlow, code: &str) -> CommandResult<String> {
let response = http::build_client(None)?
.post(&oauth.token_url)
.form(&[
("grant_type", "authorization_code"),
("client_id", OAUTH_CLIENT_ID),
("code", code),
("redirect_uri", oauth.redirect_url.as_str()),
("code_verifier", oauth.code_verifier.as_str()),
])
.send()
.await
.map_err(|e| format!("Failed to exchange OAuth code for access token: {e}"))?;
let status = response.status();
let body =
response.text().await.map_err(|e| format!("Failed to read token response body: {e}"))?;
if !status.is_success() {
return Err(format!(
"Failed to fetch access token: status={} body={}",
status.as_u16(),
body
));
}
let parsed: Value =
serde_json::from_str(&body).map_err(|e| format!("Invalid token response JSON: {e}"))?;
let token = parsed
.get("access_token")
.and_then(Value::as_str)
.filter(|s| !s.is_empty())
.ok_or_else(|| format!("Token response missing access_token: {body}"))?;
Ok(token.to_string())
}
fn keyring_entry(environment: Environment) -> CommandResult<Entry> {
Entry::new(environment.keyring_service(), KEYRING_USER)
.map_err(|e| format!("Failed to initialize auth keyring entry: {e}"))
}
fn get_auth_token(environment: Environment) -> CommandResult<Option<String>> {
let entry = keyring_entry(environment)?;
match entry.get_password() {
Ok(token) => Ok(Some(token)),
Err(keyring::Error::NoEntry) => Ok(None),
Err(err) => Err(format!("Failed to read auth token: {err}")),
}
}
fn store_auth_token(environment: Environment, token: &str) -> CommandResult {
let entry = keyring_entry(environment)?;
entry.set_password(token).map_err(|e| format!("Failed to store auth token: {e}"))
}
fn delete_auth_token(environment: Environment) -> CommandResult {
let entry = keyring_entry(environment)?;
match entry.delete_credential() {
Ok(()) | Err(keyring::Error::NoEntry) => Ok(()),
Err(err) => Err(format!("Failed to delete auth token: {err}")),
}
}
fn random_hex(bytes: usize) -> String {
let mut data = vec![0_u8; bytes];
OsRng.fill_bytes(&mut data);
hex::encode(data)
}
fn confirm_open_browser() -> CommandResult<bool> {
if !io::stdin().is_terminal() {
return Ok(true);
}
loop {
print!("Open default browser? [Y/n]: ");
io::stdout().flush().map_err(|e| format!("Failed to flush stdout: {e}"))?;
let mut input = String::new();
io::stdin().read_line(&mut input).map_err(|e| format!("Failed to read input: {e}"))?;
match input.trim().to_ascii_lowercase().as_str() {
"" | "y" | "yes" => return Ok(true),
"n" | "no" => return Ok(false),
_ => ui::warning("Please answer y or n"),
}
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn environment_mapping() {
assert_eq!(parse_environment(Some("staging")), Environment::Staging);
assert_eq!(parse_environment(Some("development")), Environment::Development);
assert_eq!(parse_environment(Some("production")), Environment::Production);
assert_eq!(parse_environment(None), Environment::Production);
}
#[tokio::test]
async fn parses_callback_request() {
let listener = TcpListener::bind("127.0.0.1:0").await.expect("bind");
let addr = listener.local_addr().expect("local addr");
let server = tokio::spawn(async move {
let (mut stream, _) = listener.accept().await.expect("accept");
parse_callback_request(&mut stream).await
});
let mut client = TcpStream::connect(addr).await.expect("connect");
client
.write_all(
b"GET /oauth/callback?code=abc123&state=xyz HTTP/1.1\r\nHost: localhost\r\n\r\n",
)
.await
.expect("write");
let parsed = server.await.expect("join").expect("parse");
assert_eq!(parsed.0, "xyz");
assert_eq!(parsed.1, "abc123");
}
#[tokio::test]
async fn parse_callback_request_oauth_error() {
let listener = TcpListener::bind("127.0.0.1:0").await.expect("bind");
let addr = listener.local_addr().expect("local addr");
let server = tokio::spawn(async move {
let (mut stream, _) = listener.accept().await.expect("accept");
parse_callback_request(&mut stream).await
});
let mut client = TcpStream::connect(addr).await.expect("connect");
client
.write_all(
b"GET /oauth/callback?error=access_denied&error_description=User%20denied&state=xyz HTTP/1.1\r\nHost: localhost\r\n\r\n",
)
.await
.expect("write");
let err = server.await.expect("join").expect_err("should fail");
assert!(err.contains("OAuth provider returned error: access_denied"));
assert!(err.contains("User denied"));
}
#[tokio::test]
async fn receive_oauth_code_fails_fast_on_provider_error() {
let listener = TcpListener::bind("127.0.0.1:0").await.expect("bind");
let addr = listener.local_addr().expect("local addr");
let server = tokio::spawn(async move {
receive_oauth_code(listener, "expected-state", "http://localhost:9444").await
});
let mut client = TcpStream::connect(addr).await.expect("connect");
client
.write_all(
b"GET /oauth/callback?error=access_denied&state=expected-state HTTP/1.1\r\nHost: localhost\r\n\r\n",
)
.await
.expect("write");
let result = tokio::time::timeout(std::time::Duration::from_secs(2), server)
.await
.expect("should not timeout")
.expect("join");
let err = result.expect_err("should return oauth error");
assert!(err.contains("OAuth provider returned error: access_denied"));
}
#[test]
fn builds_oauth_flow_with_pkce() {
let flow = build_oauth_flow(Environment::Development, 8080).expect("flow");
assert!(flow.auth_url.as_str().contains("code_challenge_method=S256"));
assert!(
flow.auth_url
.as_str()
.contains("redirect_uri=http%3A%2F%2F127.0.0.1%3A8080%2Foauth%2Fcallback")
);
assert_eq!(flow.redirect_url, "http://127.0.0.1:8080/oauth/callback");
assert_eq!(flow.token_url, "http://localhost:9444/login/oauth/access_token");
}
}
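
The OAuth callback server above reads only the HTTP request line before deciding how to respond. Stripped of the async socket I/O, the request-line validation can be sketched as a standalone function (a hypothetical `parse_request_target` helper mirroring the checks in `read_http_target`; not part of the crate):

```rust
// Hypothetical sketch of the request-line parsing performed by
// `read_http_target`: given the raw bytes of the loopback callback
// request, return the GET target or a descriptive error.
fn parse_request_target(raw: &[u8]) -> Result<String, String> {
    let req = String::from_utf8_lossy(raw);
    // Only the first line ("GET /path HTTP/1.1") matters here.
    let request_line =
        req.lines().next().ok_or_else(|| "Invalid callback request line".to_string())?;
    let mut parts = request_line.split_whitespace();
    let method = parts.next().unwrap_or_default();
    let target = parts.next().unwrap_or_default();
    if method != "GET" {
        return Err(format!("Expected GET callback request, got '{method}'"));
    }
    if target.is_empty() {
        return Err("Missing callback request target".to_string());
    }
    Ok(target.to_string())
}

fn main() {
    let target = parse_request_target(
        b"GET /oauth/callback?code=abc&state=xyz HTTP/1.1\r\nHost: localhost\r\n\r\n",
    )
    .expect("valid callback request");
    println!("{target}"); // prints "/oauth/callback?code=abc&state=xyz"
}
```

Anything other than a well-formed `GET` request line is rejected before query parameters are examined, which is the same ordering the async implementation relies on.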


@@ -1,42 +0,0 @@
use crate::cli::{CookieJarArgs, CookieJarCommands};
use crate::context::CliContext;
use crate::utils::workspace::resolve_workspace_id;
type CommandResult<T = ()> = std::result::Result<T, String>;
pub fn run(ctx: &CliContext, args: CookieJarArgs) -> i32 {
let result = match args.command {
CookieJarCommands::List { workspace_id } => list(ctx, workspace_id.as_deref()),
};
match result {
Ok(()) => 0,
Err(error) => {
eprintln!("Error: {error}");
1
}
}
}
fn list(ctx: &CliContext, workspace_id: Option<&str>) -> CommandResult {
let workspace_id = resolve_workspace_id(ctx, workspace_id, "cookie-jar list")?;
let cookie_jars = ctx
.db()
.list_cookie_jars(&workspace_id)
.map_err(|e| format!("Failed to list cookie jars: {e}"))?;
if cookie_jars.is_empty() {
println!("No cookie jars found in workspace {}", workspace_id);
} else {
for cookie_jar in cookie_jars {
println!(
"{} - {} ({} cookies)",
cookie_jar.id,
cookie_jar.name,
cookie_jar.cookies.len()
);
}
}
Ok(())
}


@@ -2,12 +2,9 @@ use crate::cli::{EnvironmentArgs, EnvironmentCommands};
 use crate::context::CliContext;
 use crate::utils::confirm::confirm_delete;
 use crate::utils::json::{
-    apply_merge_patch, is_json_shorthand, merge_workspace_id_arg, parse_optional_json,
-    parse_required_json, require_id, validate_create_id,
+    apply_merge_patch, is_json_shorthand, parse_optional_json, parse_required_json, require_id,
+    validate_create_id,
 };
-use crate::utils::schema::append_agent_hints;
-use crate::utils::workspace::resolve_workspace_id;
-use schemars::schema_for;
 use yaak_models::models::Environment;
 use yaak_models::util::UpdateSource;
@@ -15,8 +12,7 @@ type CommandResult<T = ()> = std::result::Result<T, String>;
 pub fn run(ctx: &CliContext, args: EnvironmentArgs) -> i32 {
     let result = match args.command {
-        EnvironmentCommands::List { workspace_id } => list(ctx, workspace_id.as_deref()),
-        EnvironmentCommands::Schema { pretty } => schema(pretty),
+        EnvironmentCommands::List { workspace_id } => list(ctx, &workspace_id),
         EnvironmentCommands::Show { environment_id } => show(ctx, &environment_id),
         EnvironmentCommands::Create { workspace_id, name, json } => {
             create(ctx, workspace_id, name, json)
@@ -34,23 +30,10 @@ pub fn run(ctx: &CliContext, args: EnvironmentArgs) -> i32 {
     }
 }
 
-fn schema(pretty: bool) -> CommandResult {
-    let mut schema = serde_json::to_value(schema_for!(Environment))
-        .map_err(|e| format!("Failed to serialize environment schema: {e}"))?;
-    append_agent_hints(&mut schema);
-    let output =
-        if pretty { serde_json::to_string_pretty(&schema) } else { serde_json::to_string(&schema) }
-            .map_err(|e| format!("Failed to format environment schema JSON: {e}"))?;
-    println!("{output}");
-    Ok(())
-}
-
-fn list(ctx: &CliContext, workspace_id: Option<&str>) -> CommandResult {
-    let workspace_id = resolve_workspace_id(ctx, workspace_id, "environment list")?;
+fn list(ctx: &CliContext, workspace_id: &str) -> CommandResult {
     let environments = ctx
         .db()
-        .list_environments_ensure_base(&workspace_id)
+        .list_environments_ensure_base(workspace_id)
         .map_err(|e| format!("Failed to list environments: {e}"))?;
 
     if environments.is_empty() {
@@ -80,11 +63,17 @@ fn create(
     name: Option<String>,
     json: Option<String>,
 ) -> CommandResult {
-    let json_shorthand =
-        workspace_id.as_deref().filter(|v| is_json_shorthand(v)).map(str::to_owned);
-    let workspace_id_arg = workspace_id.filter(|v| !is_json_shorthand(v));
+    if json.is_some() && workspace_id.as_deref().is_some_and(|v| !is_json_shorthand(v)) {
+        return Err(
+            "environment create cannot combine workspace_id with --json payload".to_string()
+        );
+    }
 
-    let payload = parse_optional_json(json, json_shorthand, "environment create")?;
+    let payload = parse_optional_json(
+        json,
+        workspace_id.clone().filter(|v| is_json_shorthand(v)),
+        "environment create",
+    )?;
 
     if let Some(payload) = payload {
         if name.is_some() {
@@ -94,17 +83,10 @@ fn create(
         validate_create_id(&payload, "environment")?;
         let mut environment: Environment = serde_json::from_value(payload)
             .map_err(|e| format!("Failed to parse environment create JSON: {e}"))?;
-        let fallback_workspace_id =
-            if workspace_id_arg.is_none() && environment.workspace_id.is_empty() {
-                Some(resolve_workspace_id(ctx, None, "environment create")?)
-            } else {
-                None
-            };
-        merge_workspace_id_arg(
-            workspace_id_arg.as_deref().or(fallback_workspace_id.as_deref()),
-            &mut environment.workspace_id,
-            "environment create",
-        )?;
+        if environment.workspace_id.is_empty() {
+            return Err("environment create JSON requires non-empty \"workspaceId\"".to_string());
+        }
 
         if environment.parent_model.is_empty() {
             environment.parent_model = "environment".to_string();
@@ -119,8 +101,9 @@ fn create(
         return Ok(());
     }
 
-    let workspace_id =
-        resolve_workspace_id(ctx, workspace_id_arg.as_deref(), "environment create")?;
+    let workspace_id = workspace_id.ok_or_else(|| {
+        "environment create requires workspace_id unless JSON payload is provided".to_string()
+    })?;
     let name = name.ok_or_else(|| {
         "environment create requires --name unless JSON payload is provided".to_string()
     })?;


@@ -2,10 +2,9 @@ use crate::cli::{FolderArgs, FolderCommands};
 use crate::context::CliContext;
 use crate::utils::confirm::confirm_delete;
 use crate::utils::json::{
-    apply_merge_patch, is_json_shorthand, merge_workspace_id_arg, parse_optional_json,
-    parse_required_json, require_id, validate_create_id,
+    apply_merge_patch, is_json_shorthand, parse_optional_json, parse_required_json, require_id,
+    validate_create_id,
 };
-use crate::utils::workspace::resolve_workspace_id;
 use yaak_models::models::Folder;
 use yaak_models::util::UpdateSource;
@@ -13,7 +12,7 @@ type CommandResult<T = ()> = std::result::Result<T, String>;
 pub fn run(ctx: &CliContext, args: FolderArgs) -> i32 {
     let result = match args.command {
-        FolderCommands::List { workspace_id } => list(ctx, workspace_id.as_deref()),
+        FolderCommands::List { workspace_id } => list(ctx, &workspace_id),
         FolderCommands::Show { folder_id } => show(ctx, &folder_id),
         FolderCommands::Create { workspace_id, name, json } => {
             create(ctx, workspace_id, name, json)
@@ -31,10 +30,9 @@ pub fn run(ctx: &CliContext, args: FolderArgs) -> i32 {
     }
 }
 
-fn list(ctx: &CliContext, workspace_id: Option<&str>) -> CommandResult {
-    let workspace_id = resolve_workspace_id(ctx, workspace_id, "folder list")?;
+fn list(ctx: &CliContext, workspace_id: &str) -> CommandResult {
     let folders =
-        ctx.db().list_folders(&workspace_id).map_err(|e| format!("Failed to list folders: {e}"))?;
+        ctx.db().list_folders(workspace_id).map_err(|e| format!("Failed to list folders: {e}"))?;
 
     if folders.is_empty() {
         println!("No folders found in workspace {}", workspace_id);
     } else {
@@ -60,11 +58,15 @@ fn create(
     name: Option<String>,
     json: Option<String>,
 ) -> CommandResult {
-    let json_shorthand =
-        workspace_id.as_deref().filter(|v| is_json_shorthand(v)).map(str::to_owned);
-    let workspace_id_arg = workspace_id.filter(|v| !is_json_shorthand(v));
+    if json.is_some() && workspace_id.as_deref().is_some_and(|v| !is_json_shorthand(v)) {
+        return Err("folder create cannot combine workspace_id with --json payload".to_string());
+    }
 
-    let payload = parse_optional_json(json, json_shorthand, "folder create")?;
+    let payload = parse_optional_json(
+        json,
+        workspace_id.clone().filter(|v| is_json_shorthand(v)),
+        "folder create",
+    )?;
 
     if let Some(payload) = payload {
         if name.is_some() {
@@ -72,19 +74,12 @@ fn create(
         }
         validate_create_id(&payload, "folder")?;
-        let mut folder: Folder = serde_json::from_value(payload)
+        let folder: Folder = serde_json::from_value(payload)
             .map_err(|e| format!("Failed to parse folder create JSON: {e}"))?;
-        let fallback_workspace_id = if workspace_id_arg.is_none() && folder.workspace_id.is_empty()
-        {
-            Some(resolve_workspace_id(ctx, None, "folder create")?)
-        } else {
-            None
-        };
-        merge_workspace_id_arg(
-            workspace_id_arg.as_deref().or(fallback_workspace_id.as_deref()),
-            &mut folder.workspace_id,
-            "folder create",
-        )?;
+        if folder.workspace_id.is_empty() {
+            return Err("folder create JSON requires non-empty \"workspaceId\"".to_string());
+        }
 
         let created = ctx
             .db()
@@ -95,7 +90,9 @@ fn create(
         return Ok(());
     }
 
-    let workspace_id = resolve_workspace_id(ctx, workspace_id_arg.as_deref(), "folder create")?;
+    let workspace_id = workspace_id.ok_or_else(|| {
+        "folder create requires workspace_id unless JSON payload is provided".to_string()
+    })?;
     let name = name.ok_or_else(|| {
         "folder create requires --name unless JSON payload is provided".to_string()
     })?;


@@ -1,8 +1,5 @@
-pub mod auth;
-pub mod cookie_jar;
 pub mod environment;
 pub mod folder;
-pub mod plugin;
 pub mod request;
 pub mod send;
 pub mod workspace;


@@ -1,680 +0,0 @@
use crate::cli::{GenerateArgs, InstallPluginArgs, PluginPathArg};
use crate::context::CliContext;
use crate::ui;
use crate::utils::http;
use keyring::Entry;
use rand::Rng;
use rolldown::{
BundleEvent, Bundler, BundlerOptions, ExperimentalOptions, InputItem, LogLevel, OutputFormat,
Platform, WatchOption, Watcher, WatcherEvent,
};
use serde::Deserialize;
use std::collections::HashSet;
use std::fs;
use std::io::{self, IsTerminal, Read, Write};
use std::path::{Path, PathBuf};
use std::sync::Arc;
use tokio::sync::Mutex;
use walkdir::WalkDir;
use yaak_api::{ApiClientKind, yaak_api_client};
use yaak_models::models::{Plugin, PluginSource};
use yaak_models::util::UpdateSource;
use yaak_plugins::events::PluginContext;
use yaak_plugins::install::download_and_install;
use zip::CompressionMethod;
use zip::write::SimpleFileOptions;
type CommandResult<T = ()> = std::result::Result<T, String>;
const KEYRING_USER: &str = "yaak";
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
enum Environment {
Production,
Staging,
Development,
}
impl Environment {
fn api_base_url(self) -> &'static str {
match self {
Environment::Production => "https://api.yaak.app",
Environment::Staging => "https://todo.yaak.app",
Environment::Development => "http://localhost:9444",
}
}
fn keyring_service(self) -> &'static str {
match self {
Environment::Production => "app.yaak.cli.Token",
Environment::Staging => "app.yaak.cli.staging.Token",
Environment::Development => "app.yaak.cli.dev.Token",
}
}
}
pub async fn run_build(args: PluginPathArg) -> i32 {
match build(args).await {
Ok(()) => 0,
Err(error) => {
ui::error(&error);
1
}
}
}
pub async fn run_install(context: &CliContext, args: InstallPluginArgs) -> i32 {
match install(context, args).await {
Ok(()) => 0,
Err(error) => {
ui::error(&error);
1
}
}
}
pub async fn run_dev(args: PluginPathArg) -> i32 {
match dev(args).await {
Ok(()) => 0,
Err(error) => {
ui::error(&error);
1
}
}
}
pub async fn run_generate(args: GenerateArgs) -> i32 {
match generate(args) {
Ok(()) => 0,
Err(error) => {
ui::error(&error);
1
}
}
}
pub async fn run_publish(args: PluginPathArg) -> i32 {
match publish(args).await {
Ok(()) => 0,
Err(error) => {
ui::error(&error);
1
}
}
}
async fn build(args: PluginPathArg) -> CommandResult {
let plugin_dir = resolve_plugin_dir(args.path)?;
ensure_plugin_build_inputs(&plugin_dir)?;
ui::info(&format!("Building plugin {}...", plugin_dir.display()));
let warnings = build_plugin_bundle(&plugin_dir).await?;
for warning in warnings {
ui::warning(&warning);
}
ui::success(&format!("Built plugin bundle at {}", plugin_dir.join("build/index.js").display()));
Ok(())
}
async fn dev(args: PluginPathArg) -> CommandResult {
let plugin_dir = resolve_plugin_dir(args.path)?;
ensure_plugin_build_inputs(&plugin_dir)?;
ui::info(&format!("Watching plugin {}...", plugin_dir.display()));
let bundler = Bundler::new(bundler_options(&plugin_dir, true))
.map_err(|err| format!("Failed to initialize Rolldown watcher: {err}"))?;
let watcher = Watcher::new(vec![Arc::new(Mutex::new(bundler))], None)
.map_err(|err| format!("Failed to start Rolldown watcher: {err}"))?;
let emitter = watcher.emitter();
let watch_root = plugin_dir.clone();
let _event_logger = tokio::spawn(async move {
loop {
let event = {
let rx = emitter.rx.lock().await;
rx.recv()
};
let Ok(event) = event else {
break;
};
match event {
WatcherEvent::Change(change) => {
let changed_path = Path::new(change.path.as_str());
let display_path = changed_path
.strip_prefix(&watch_root)
.map(|p| p.display().to_string())
.unwrap_or_else(|_| {
changed_path
.file_name()
.map(|name| name.to_string_lossy().into_owned())
.unwrap_or_else(|| "unknown".to_string())
});
ui::info(&format!("Rebuilding plugin {display_path}"));
}
WatcherEvent::Event(BundleEvent::BundleEnd(_)) => {}
WatcherEvent::Event(BundleEvent::Error(event)) => {
if event.error.diagnostics.is_empty() {
ui::error("Plugin build failed");
} else {
for diagnostic in event.error.diagnostics {
ui::error(&diagnostic.to_string());
}
}
}
WatcherEvent::Close => break,
_ => {}
}
}
});
watcher.start().await;
Ok(())
}
fn generate(args: GenerateArgs) -> CommandResult {
let default_name = random_name();
let name = match args.name {
Some(name) => name,
None => prompt_with_default("Plugin name", &default_name)?,
};
let default_dir = format!("./{name}");
let output_dir = match args.dir {
Some(dir) => dir,
None => PathBuf::from(prompt_with_default("Plugin dir", &default_dir)?),
};
if output_dir.exists() {
return Err(format!("Plugin directory already exists: {}", output_dir.display()));
}
ui::info(&format!("Generating plugin in {}", output_dir.display()));
fs::create_dir_all(output_dir.join("src"))
.map_err(|e| format!("Failed creating plugin directory {}: {e}", output_dir.display()))?;
write_file(&output_dir.join(".gitignore"), TEMPLATE_GITIGNORE)?;
write_file(
&output_dir.join("package.json"),
&TEMPLATE_PACKAGE_JSON.replace("yaak-plugin-name", &name),
)?;
write_file(&output_dir.join("tsconfig.json"), TEMPLATE_TSCONFIG)?;
write_file(&output_dir.join("README.md"), &TEMPLATE_README.replace("yaak-plugin-name", &name))?;
write_file(
&output_dir.join("src/index.ts"),
&TEMPLATE_INDEX_TS.replace("yaak-plugin-name", &name),
)?;
write_file(&output_dir.join("src/index.test.ts"), TEMPLATE_INDEX_TEST_TS)?;
ui::success("Plugin scaffold generated");
ui::info("Next steps:");
println!(" 1. cd {}", output_dir.display());
println!(" 2. npm install");
println!(" 3. yaak plugin build");
Ok(())
}
async fn publish(args: PluginPathArg) -> CommandResult {
let plugin_dir = resolve_plugin_dir(args.path)?;
ensure_plugin_build_inputs(&plugin_dir)?;
let environment = current_environment();
let token = get_auth_token(environment)?
.ok_or_else(|| "Not logged in. Run `yaak auth login`.".to_string())?;
ui::info(&format!("Building plugin {}...", plugin_dir.display()));
let warnings = build_plugin_bundle(&plugin_dir).await?;
for warning in warnings {
ui::warning(&warning);
}
ui::info("Archiving plugin");
let archive = create_publish_archive(&plugin_dir)?;
ui::info("Uploading plugin");
let url = format!("{}/api/v1/plugins/publish", environment.api_base_url());
let response = http::build_client(Some(&token))?
.post(url)
.header(reqwest::header::CONTENT_TYPE, "application/zip")
.body(archive)
.send()
.await
.map_err(|e| format!("Failed to upload plugin: {e}"))?;
let status = response.status();
let body =
response.text().await.map_err(|e| format!("Failed reading publish response body: {e}"))?;
if !status.is_success() {
return Err(http::parse_api_error(status.as_u16(), &body));
}
let published: PublishResponse = serde_json::from_str(&body)
.map_err(|e| format!("Failed parsing publish response JSON: {e}\nResponse: {body}"))?;
ui::success(&format!("Plugin published {}", published.version));
println!(" -> {}", published.url);
Ok(())
}
async fn install(context: &CliContext, args: InstallPluginArgs) -> CommandResult {
if args.source.starts_with('@') {
let (name, version) =
parse_registry_install_spec(args.source.as_str()).ok_or_else(|| {
"Invalid registry plugin spec. Expected format: @org/plugin or @org/plugin@version"
.to_string()
})?;
return install_from_registry(context, name, version).await;
}
install_from_directory(context, args.source.as_str()).await
}
async fn install_from_registry(
context: &CliContext,
name: String,
version: Option<String>,
) -> CommandResult {
let current_version = crate::version::cli_version();
let http_client = yaak_api_client(ApiClientKind::Cli, current_version)
.map_err(|err| format!("Failed to initialize API client: {err}"))?;
let installing_version = version.clone().unwrap_or_else(|| "latest".to_string());
ui::info(&format!("Installing registry plugin {name}@{installing_version}"));
let plugin_context = PluginContext::new(Some("cli".to_string()), None);
let installed = download_and_install(
context.plugin_manager(),
context.query_manager(),
&http_client,
&plugin_context,
name.as_str(),
version,
)
.await
.map_err(|err| format!("Failed to install plugin: {err}"))?;
ui::success(&format!("Installed plugin {}@{}", installed.name, installed.version));
Ok(())
}
async fn install_from_directory(context: &CliContext, source: &str) -> CommandResult {
let plugin_dir = resolve_plugin_dir(Some(PathBuf::from(source)))?;
let plugin_dir_str = plugin_dir
.to_str()
.ok_or_else(|| {
format!("Plugin directory path is not valid UTF-8: {}", plugin_dir.display())
})?
.to_string();
ui::info(&format!("Installing plugin from directory {}", plugin_dir.display()));
let plugin = context
.db()
.upsert_plugin(
&Plugin {
directory: plugin_dir_str,
url: None,
enabled: true,
source: PluginSource::Filesystem,
..Default::default()
},
&UpdateSource::Background,
)
.map_err(|err| format!("Failed to save plugin in database: {err}"))?;
let plugin_context = PluginContext::new(Some("cli".to_string()), None);
context
.plugin_manager()
.add_plugin(&plugin_context, &plugin)
.await
.map_err(|err| format!("Failed to load plugin runtime: {err}"))?;
ui::success(&format!("Installed plugin from {}", plugin.directory));
Ok(())
}
fn parse_registry_install_spec(source: &str) -> Option<(String, Option<String>)> {
if !source.starts_with('@') || !source.contains('/') {
return None;
}
let rest = source.get(1..)?;
let version_split = rest.rfind('@').map(|idx| idx + 1);
let (name, version) = match version_split {
Some(at_idx) => {
let (name, version) = source.split_at(at_idx);
let version = version.strip_prefix('@').unwrap_or_default();
if version.is_empty() {
return None;
}
(name.to_string(), Some(version.to_string()))
}
None => (source.to_string(), None),
};
if !name.starts_with('@') {
return None;
}
let without_scope = name.get(1..)?;
let (scope, plugin_name) = without_scope.split_once('/')?;
if scope.is_empty() || plugin_name.is_empty() {
return None;
}
Some((name, version))
}
#[derive(Deserialize)]
struct PublishResponse {
version: String,
url: String,
}
async fn build_plugin_bundle(plugin_dir: &Path) -> CommandResult<Vec<String>> {
prepare_build_output_dir(plugin_dir)?;
let mut bundler = Bundler::new(bundler_options(plugin_dir, false))
.map_err(|err| format!("Failed to initialize Rolldown: {err}"))?;
let output = bundler.write().await.map_err(|err| format!("Plugin build failed:\n{err}"))?;
Ok(output.warnings.into_iter().map(|w| w.to_string()).collect())
}
fn prepare_build_output_dir(plugin_dir: &Path) -> CommandResult {
let build_dir = plugin_dir.join("build");
if build_dir.exists() {
fs::remove_dir_all(&build_dir)
.map_err(|e| format!("Failed to clean build directory {}: {e}", build_dir.display()))?;
}
fs::create_dir_all(&build_dir)
.map_err(|e| format!("Failed to create build directory {}: {e}", build_dir.display()))
}
fn bundler_options(plugin_dir: &Path, watch: bool) -> BundlerOptions {
BundlerOptions {
input: Some(vec![InputItem { import: "./src/index.ts".to_string(), ..Default::default() }]),
cwd: Some(plugin_dir.to_path_buf()),
file: Some("build/index.js".to_string()),
format: Some(OutputFormat::Cjs),
platform: Some(Platform::Node),
log_level: Some(LogLevel::Info),
experimental: watch
.then_some(ExperimentalOptions { incremental_build: Some(true), ..Default::default() }),
watch: watch.then_some(WatchOption::default()),
..Default::default()
}
}
fn resolve_plugin_dir(path: Option<PathBuf>) -> CommandResult<PathBuf> {
let cwd =
std::env::current_dir().map_err(|e| format!("Failed to read current directory: {e}"))?;
let candidate = match path {
Some(path) if path.is_absolute() => path,
Some(path) => cwd.join(path),
None => cwd,
};
if !candidate.exists() {
return Err(format!("Plugin directory does not exist: {}", candidate.display()));
}
if !candidate.is_dir() {
return Err(format!("Plugin path is not a directory: {}", candidate.display()));
}
candidate
.canonicalize()
.map_err(|e| format!("Failed to resolve plugin directory {}: {e}", candidate.display()))
}
fn ensure_plugin_build_inputs(plugin_dir: &Path) -> CommandResult {
let package_json = plugin_dir.join("package.json");
if !package_json.is_file() {
return Err(format!(
"{} does not exist. Ensure that you are in a plugin directory.",
package_json.display()
));
}
let entry = plugin_dir.join("src/index.ts");
if !entry.is_file() {
return Err(format!("Required entrypoint missing: {}", entry.display()));
}
Ok(())
}
fn create_publish_archive(plugin_dir: &Path) -> CommandResult<Vec<u8>> {
let required_files = [
"README.md",
"package.json",
"build/index.js",
"src/index.ts",
];
let optional_files = ["package-lock.json"];
let mut selected = HashSet::new();
for required in required_files {
let required_path = plugin_dir.join(required);
if !required_path.is_file() {
return Err(format!("Missing required file: {required}"));
}
selected.insert(required.to_string());
}
for optional in optional_files {
selected.insert(optional.to_string());
}
let cursor = std::io::Cursor::new(Vec::new());
let mut zip = zip::ZipWriter::new(cursor);
let options = SimpleFileOptions::default().compression_method(CompressionMethod::Deflated);
for entry in WalkDir::new(plugin_dir) {
let entry = entry.map_err(|e| format!("Failed walking plugin directory: {e}"))?;
if !entry.file_type().is_file() {
continue;
}
let path = entry.path();
let rel = path
.strip_prefix(plugin_dir)
.map_err(|e| format!("Failed deriving relative path for {}: {e}", path.display()))?;
let rel = rel.to_string_lossy().replace('\\', "/");
let keep = rel.starts_with("src/") || rel.starts_with("build/") || selected.contains(&rel);
if !keep {
continue;
}
zip.start_file(rel, options).map_err(|e| format!("Failed adding file to archive: {e}"))?;
let mut file = fs::File::open(path)
.map_err(|e| format!("Failed opening file {}: {e}", path.display()))?;
let mut contents = Vec::new();
file.read_to_end(&mut contents)
.map_err(|e| format!("Failed reading file {}: {e}", path.display()))?;
zip.write_all(&contents).map_err(|e| format!("Failed writing archive contents: {e}"))?;
}
let cursor = zip.finish().map_err(|e| format!("Failed finalizing plugin archive: {e}"))?;
Ok(cursor.into_inner())
}
fn write_file(path: &Path, contents: &str) -> CommandResult {
if let Some(parent) = path.parent() {
fs::create_dir_all(parent)
.map_err(|e| format!("Failed creating directory {}: {e}", parent.display()))?;
}
fs::write(path, contents).map_err(|e| format!("Failed writing file {}: {e}", path.display()))
}
fn prompt_with_default(label: &str, default: &str) -> CommandResult<String> {
if !io::stdin().is_terminal() {
return Ok(default.to_string());
}
print!("{label} [{default}]: ");
io::stdout().flush().map_err(|e| format!("Failed to flush stdout: {e}"))?;
let mut input = String::new();
io::stdin().read_line(&mut input).map_err(|e| format!("Failed to read input: {e}"))?;
let trimmed = input.trim();
if trimmed.is_empty() { Ok(default.to_string()) } else { Ok(trimmed.to_string()) }
}
fn current_environment() -> Environment {
match std::env::var("ENVIRONMENT").as_deref() {
Ok("staging") => Environment::Staging,
Ok("development") => Environment::Development,
_ => Environment::Production,
}
}
fn keyring_entry(environment: Environment) -> CommandResult<Entry> {
Entry::new(environment.keyring_service(), KEYRING_USER)
.map_err(|e| format!("Failed to initialize auth keyring entry: {e}"))
}
fn get_auth_token(environment: Environment) -> CommandResult<Option<String>> {
let entry = keyring_entry(environment)?;
match entry.get_password() {
Ok(token) => Ok(Some(token)),
Err(keyring::Error::NoEntry) => Ok(None),
Err(err) => Err(format!("Failed to read auth token: {err}")),
}
}
fn random_name() -> String {
const ADJECTIVES: &[&str] = &[
"young", "youthful", "yellow", "yielding", "yappy", "yawning", "yummy", "yucky", "yearly",
"yester", "yeasty", "yelling",
];
const NOUNS: &[&str] = &[
"yak", "yarn", "year", "yell", "yoke", "yoga", "yam", "yacht", "yodel",
];
let mut rng = rand::thread_rng();
let adjective = ADJECTIVES[rng.gen_range(0..ADJECTIVES.len())];
let noun = NOUNS[rng.gen_range(0..NOUNS.len())];
format!("{adjective}-{noun}")
}
const TEMPLATE_GITIGNORE: &str = "node_modules\n";
const TEMPLATE_PACKAGE_JSON: &str = r#"{
"name": "yaak-plugin-name",
"private": true,
"version": "0.0.1",
"scripts": {
"build": "yaak plugin build",
"dev": "yaak plugin dev"
},
"devDependencies": {
"@types/node": "^24.10.1",
"typescript": "^5.9.3",
"vitest": "^4.0.14"
},
"dependencies": {
"@yaakapp/api": "^0.7.0"
}
}
"#;
const TEMPLATE_TSCONFIG: &str = r#"{
"compilerOptions": {
"target": "es2021",
"lib": ["DOM", "DOM.Iterable", "ESNext"],
"useDefineForClassFields": true,
"allowJs": false,
"skipLibCheck": true,
"esModuleInterop": false,
"allowSyntheticDefaultImports": true,
"strict": true,
"noUncheckedIndexedAccess": true,
"forceConsistentCasingInFileNames": true,
"module": "ESNext",
"moduleResolution": "Node",
"resolveJsonModule": true,
"isolatedModules": true,
"noEmit": true,
"jsx": "react-jsx"
},
"include": ["src"]
}
"#;
const TEMPLATE_README: &str = r#"# yaak-plugin-name
Describe what your plugin does.
"#;
const TEMPLATE_INDEX_TS: &str = r#"import type { PluginDefinition } from "@yaakapp/api";
export const plugin: PluginDefinition = {
httpRequestActions: [
{
label: "Hello, From Plugin",
icon: "info",
async onSelect(ctx, args) {
await ctx.toast.show({
color: "success",
message: `You clicked the request ${args.httpRequest.id}`,
});
},
},
],
};
"#;
const TEMPLATE_INDEX_TEST_TS: &str = r#"import { describe, expect, test } from "vitest";
import { plugin } from "./index";
describe("Example Plugin", () => {
test("Exports plugin object", () => {
expect(plugin).toBeTypeOf("object");
});
});
"#;
#[cfg(test)]
mod tests {
use super::create_publish_archive;
use std::collections::HashSet;
use std::fs;
use std::io::Cursor;
use tempfile::TempDir;
use zip::ZipArchive;
#[test]
fn publish_archive_includes_required_and_optional_files() {
let dir = TempDir::new().expect("temp dir");
let root = dir.path();
fs::create_dir_all(root.join("src")).expect("create src");
fs::create_dir_all(root.join("build")).expect("create build");
fs::create_dir_all(root.join("ignored")).expect("create ignored");
fs::write(root.join("README.md"), "# Demo\n").expect("write README");
fs::write(root.join("package.json"), "{}").expect("write package.json");
fs::write(root.join("package-lock.json"), "{}").expect("write package-lock.json");
fs::write(root.join("src/index.ts"), "export const plugin = {};\n")
.expect("write src/index.ts");
fs::write(root.join("build/index.js"), "exports.plugin = {};\n")
.expect("write build/index.js");
fs::write(root.join("ignored/secret.txt"), "do-not-ship").expect("write ignored file");
let archive = create_publish_archive(root).expect("create archive");
let mut zip = ZipArchive::new(Cursor::new(archive)).expect("open zip");
let mut names = HashSet::new();
for i in 0..zip.len() {
let file = zip.by_index(i).expect("zip entry");
names.insert(file.name().to_string());
}
assert!(names.contains("README.md"));
assert!(names.contains("package.json"));
assert!(names.contains("package-lock.json"));
assert!(names.contains("src/index.ts"));
assert!(names.contains("build/index.js"));
assert!(!names.contains("ignored/secret.txt"));
}
}

View File

@@ -2,18 +2,14 @@ use crate::cli::{RequestArgs, RequestCommands, RequestSchemaType};
 use crate::context::CliContext;
 use crate::utils::confirm::confirm_delete;
 use crate::utils::json::{
-    apply_merge_patch, is_json_shorthand, merge_workspace_id_arg, parse_optional_json,
-    parse_required_json, require_id, validate_create_id,
+    apply_merge_patch, is_json_shorthand, parse_optional_json, parse_required_json, require_id,
+    validate_create_id,
 };
-use crate::utils::schema::append_agent_hints;
-use crate::utils::workspace::resolve_workspace_id;
 use schemars::schema_for;
 use serde_json::{Map, Value, json};
 use std::collections::HashMap;
-use std::io::Write;
 use tokio::sync::mpsc;
 use yaak::send::{SendHttpRequestByIdWithPluginsParams, send_http_request_by_id_with_plugins};
-use yaak_http::sender::HttpResponseEvent as SenderHttpResponseEvent;
 use yaak_models::models::{GrpcRequest, HttpRequest, WebsocketRequest};
 use yaak_models::queries::any_request::AnyRequest;
 use yaak_models::util::UpdateSource;
@@ -25,16 +21,13 @@ pub async fn run(
     ctx: &CliContext,
     args: RequestArgs,
     environment: Option<&str>,
-    cookie_jar_id: Option<&str>,
     verbose: bool,
 ) -> i32 {
     let result = match args.command {
-        RequestCommands::List { workspace_id } => list(ctx, workspace_id.as_deref()),
+        RequestCommands::List { workspace_id } => list(ctx, &workspace_id),
         RequestCommands::Show { request_id } => show(ctx, &request_id),
         RequestCommands::Send { request_id } => {
-            return match send_request_by_id(ctx, &request_id, environment, cookie_jar_id, verbose)
-                .await
-            {
+            return match send_request_by_id(ctx, &request_id, environment, verbose).await {
                 Ok(()) => 0,
                 Err(error) => {
                     eprintln!("Error: {error}");
@@ -42,8 +35,8 @@ pub async fn run(
                 }
             };
         }
-        RequestCommands::Schema { request_type, pretty } => {
-            return match schema(ctx, request_type, pretty).await {
+        RequestCommands::Schema { request_type } => {
+            return match schema(ctx, request_type).await {
                 Ok(()) => 0,
                 Err(error) => {
                     eprintln!("Error: {error}");
@@ -67,11 +60,10 @@ pub async fn run(
     }
 }
 
-fn list(ctx: &CliContext, workspace_id: Option<&str>) -> CommandResult {
-    let workspace_id = resolve_workspace_id(ctx, workspace_id, "request list")?;
+fn list(ctx: &CliContext, workspace_id: &str) -> CommandResult {
     let requests = ctx
         .db()
-        .list_http_requests(&workspace_id)
+        .list_http_requests(workspace_id)
         .map_err(|e| format!("Failed to list requests: {e}"))?;
     if requests.is_empty() {
         println!("No requests found in workspace {}", workspace_id);
@@ -83,7 +75,7 @@ fn list(ctx: &CliContext, workspace_id: Option<&str>) -> CommandResult {
     Ok(())
 }
 
-async fn schema(ctx: &CliContext, request_type: RequestSchemaType, pretty: bool) -> CommandResult {
+async fn schema(ctx: &CliContext, request_type: RequestSchemaType) -> CommandResult {
     let mut schema = match request_type {
         RequestSchemaType::Http => serde_json::to_value(schema_for!(HttpRequest))
             .map_err(|e| format!("Failed to serialize HTTP request schema: {e}"))?,
@@ -93,51 +85,16 @@ async fn schema(ctx: &CliContext, request_type: RequestSchemaType, pretty: bool)
             .map_err(|e| format!("Failed to serialize WebSocket request schema: {e}"))?,
     };
 
-    enrich_schema_guidance(&mut schema, request_type);
-    append_agent_hints(&mut schema);
-
     if let Err(error) = merge_auth_schema_from_plugins(ctx, &mut schema).await {
         eprintln!("Warning: Failed to enrich authentication schema from plugins: {error}");
     }
 
-    let output =
-        if pretty { serde_json::to_string_pretty(&schema) } else { serde_json::to_string(&schema) }
-            .map_err(|e| format!("Failed to format schema JSON: {e}"))?;
+    let output = serde_json::to_string_pretty(&schema)
+        .map_err(|e| format!("Failed to format schema JSON: {e}"))?;
     println!("{output}");
     Ok(())
 }
 
-fn enrich_schema_guidance(schema: &mut Value, request_type: RequestSchemaType) {
-    if !matches!(request_type, RequestSchemaType::Http) {
-        return;
-    }
-
-    let Some(properties) = schema.get_mut("properties").and_then(Value::as_object_mut) else {
-        return;
-    };
-
-    if let Some(url_schema) = properties.get_mut("url").and_then(Value::as_object_mut) {
-        append_description(
-            url_schema,
-            "For path segments like `/foo/:id/comments/:commentId`, put concrete values in `urlParameters` using names without `:` (for example `id`, `commentId`).",
-        );
-    }
-}
-
-fn append_description(schema: &mut Map<String, Value>, extra: &str) {
-    match schema.get_mut("description") {
-        Some(Value::String(existing)) if !existing.trim().is_empty() => {
-            if !existing.ends_with(' ') {
-                existing.push(' ');
-            }
-            existing.push_str(extra);
-        }
-        _ => {
-            schema.insert("description".to_string(), Value::String(extra.to_string()));
-        }
-    }
-}
-
 async fn merge_auth_schema_from_plugins(
     ctx: &CliContext,
     schema: &mut Value,
@@ -341,11 +298,15 @@ fn create(
     url: Option<String>,
     json: Option<String>,
 ) -> CommandResult {
-    let json_shorthand =
-        workspace_id.as_deref().filter(|v| is_json_shorthand(v)).map(str::to_owned);
-    let workspace_id_arg = workspace_id.filter(|v| !is_json_shorthand(v));
-    let payload = parse_optional_json(json, json_shorthand, "request create")?;
+    if json.is_some() && workspace_id.as_deref().is_some_and(|v| !is_json_shorthand(v)) {
+        return Err("request create cannot combine workspace_id with --json payload".to_string());
+    }
+    let payload = parse_optional_json(
+        json,
+        workspace_id.clone().filter(|v| is_json_shorthand(v)),
+        "request create",
+    )?;
 
     if let Some(payload) = payload {
         if name.is_some() || method.is_some() || url.is_some() {
@@ -353,19 +314,12 @@ fn create(
         }
 
         validate_create_id(&payload, "request")?;
-        let mut request: HttpRequest = serde_json::from_value(payload)
+        let request: HttpRequest = serde_json::from_value(payload)
            .map_err(|e| format!("Failed to parse request create JSON: {e}"))?;
-        let fallback_workspace_id = if workspace_id_arg.is_none() && request.workspace_id.is_empty()
-        {
-            Some(resolve_workspace_id(ctx, None, "request create")?)
-        } else {
-            None
-        };
-        merge_workspace_id_arg(
-            workspace_id_arg.as_deref().or(fallback_workspace_id.as_deref()),
-            &mut request.workspace_id,
-            "request create",
-        )?;
+
+        if request.workspace_id.is_empty() {
+            return Err("request create JSON requires non-empty \"workspaceId\"".to_string());
+        }
 
         let created = ctx
             .db()
@@ -376,7 +330,9 @@ fn create(
         return Ok(());
     }
 
-    let workspace_id = resolve_workspace_id(ctx, workspace_id_arg.as_deref(), "request create")?;
+    let workspace_id = workspace_id.ok_or_else(|| {
+        "request create requires workspace_id unless JSON payload is provided".to_string()
+    })?;
     let name = name.unwrap_or_default();
     let url = url.unwrap_or_default();
     let method = method.unwrap_or_else(|| "GET".to_string());
@@ -445,7 +401,6 @@ pub async fn send_request_by_id(
     ctx: &CliContext,
     request_id: &str,
     environment: Option<&str>,
-    cookie_jar_id: Option<&str>,
     verbose: bool,
 ) -> Result<(), String> {
     let request =
@@ -457,7 +412,6 @@ pub async fn send_request_by_id(
             &http_request.id,
             &http_request.workspace_id,
             environment,
-            cookie_jar_id,
             verbose,
         )
         .await
@@ -476,32 +430,18 @@ async fn send_http_request_by_id(
     request_id: &str,
     workspace_id: &str,
     environment: Option<&str>,
-    cookie_jar_id: Option<&str>,
     verbose: bool,
 ) -> Result<(), String> {
-    let cookie_jar_id = resolve_cookie_jar_id(ctx, workspace_id, cookie_jar_id)?;
-    let plugin_context =
-        PluginContext::new(Some("cli".to_string()), Some(workspace_id.to_string()));
-    let (event_tx, mut event_rx) = mpsc::channel::<SenderHttpResponseEvent>(100);
-    let (body_chunk_tx, mut body_chunk_rx) = mpsc::unbounded_channel::<Vec<u8>>();
+    let plugin_context = PluginContext::new(None, Some(workspace_id.to_string()));
+    let (event_tx, mut event_rx) = mpsc::channel(100);
     let event_handle = tokio::spawn(async move {
         while let Some(event) = event_rx.recv().await {
-            if verbose && !matches!(event, SenderHttpResponseEvent::ChunkReceived { .. }) {
+            if verbose {
                 println!("{}", event);
             }
         }
     });
-    let body_handle = tokio::task::spawn_blocking(move || {
-        let mut stdout = std::io::stdout();
-        while let Some(chunk) = body_chunk_rx.blocking_recv() {
-            if stdout.write_all(&chunk).is_err() {
-                break;
-            }
-            let _ = stdout.flush();
-        }
-    });
 
     let response_dir = ctx.data_dir().join("responses");
     let result = send_http_request_by_id_with_plugins(SendHttpRequestByIdWithPluginsParams {
@@ -510,10 +450,9 @@ async fn send_http_request_by_id(
         request_id,
         environment_id: environment,
         update_source: UpdateSource::Sync,
-        cookie_jar_id,
+        cookie_jar_id: None,
         response_dir: &response_dir,
         emit_events_to: Some(event_tx),
-        emit_response_body_chunks_to: Some(body_chunk_tx),
         plugin_manager: ctx.plugin_manager(),
         encryption_manager: ctx.encryption_manager.clone(),
         plugin_context: &plugin_context,
@@ -523,26 +462,24 @@ async fn send_http_request_by_id(
     .await;
 
     let _ = event_handle.await;
-    let _ = body_handle.await;
-    result.map_err(|e| e.to_string())?;
+    let result = result.map_err(|e| e.to_string())?;
+
+    if verbose {
+        println!();
+    }
+    println!(
+        "HTTP {} {}",
+        result.response.status,
+        result.response.status_reason.as_deref().unwrap_or("")
+    );
+    if verbose {
+        for header in &result.response.headers {
+            println!("{}: {}", header.name, header.value);
+        }
+        println!();
+    }
+    let body = String::from_utf8(result.response_body)
+        .map_err(|e| format!("Failed to read response body: {e}"))?;
+    println!("{}", body);
     Ok(())
 }
-
-pub(crate) fn resolve_cookie_jar_id(
-    ctx: &CliContext,
-    workspace_id: &str,
-    explicit_cookie_jar_id: Option<&str>,
-) -> Result<Option<String>, String> {
-    if let Some(cookie_jar_id) = explicit_cookie_jar_id {
-        return Ok(Some(cookie_jar_id.to_string()));
-    }
-
-    let default_cookie_jar = ctx
-        .db()
-        .list_cookie_jars(workspace_id)
-        .map_err(|e| format!("Failed to list cookie jars: {e}"))?
-        .into_iter()
-        .min_by_key(|jar| jar.created_at)
-        .map(|jar| jar.id);
-
-    Ok(default_cookie_jar)
-}

View File

@@ -2,7 +2,6 @@ use crate::cli::SendArgs;
 use crate::commands::request;
 use crate::context::CliContext;
 use futures::future::join_all;
-use yaak_models::queries::any_request::AnyRequest;
 
 enum ExecutionMode {
     Sequential,
@@ -13,10 +12,9 @@ pub async fn run(
     ctx: &CliContext,
     args: SendArgs,
     environment: Option<&str>,
-    cookie_jar_id: Option<&str>,
     verbose: bool,
 ) -> i32 {
-    match send_target(ctx, args, environment, cookie_jar_id, verbose).await {
+    match send_target(ctx, args, environment, verbose).await {
         Ok(()) => 0,
         Err(error) => {
             eprintln!("Error: {error}");
@@ -29,70 +27,30 @@ async fn send_target(
     ctx: &CliContext,
     args: SendArgs,
     environment: Option<&str>,
-    cookie_jar_id: Option<&str>,
     verbose: bool,
 ) -> Result<(), String> {
     let mode = if args.parallel { ExecutionMode::Parallel } else { ExecutionMode::Sequential };
 
-    if let Ok(request) = ctx.db().get_any_request(&args.id) {
-        let workspace_id = match &request {
-            AnyRequest::HttpRequest(r) => r.workspace_id.clone(),
-            AnyRequest::GrpcRequest(r) => r.workspace_id.clone(),
-            AnyRequest::WebsocketRequest(r) => r.workspace_id.clone(),
-        };
-        let resolved_cookie_jar_id =
-            request::resolve_cookie_jar_id(ctx, &workspace_id, cookie_jar_id)?;
-        return request::send_request_by_id(
-            ctx,
-            &args.id,
-            environment,
-            resolved_cookie_jar_id.as_deref(),
-            verbose,
-        )
-        .await;
+    if ctx.db().get_any_request(&args.id).is_ok() {
+        return request::send_request_by_id(ctx, &args.id, environment, verbose).await;
     }
 
-    if let Ok(folder) = ctx.db().get_folder(&args.id) {
-        let resolved_cookie_jar_id =
-            request::resolve_cookie_jar_id(ctx, &folder.workspace_id, cookie_jar_id)?;
+    if ctx.db().get_folder(&args.id).is_ok() {
         let request_ids = collect_folder_request_ids(ctx, &args.id)?;
         if request_ids.is_empty() {
             println!("No requests found in folder {}", args.id);
             return Ok(());
         }
-        return send_many(
-            ctx,
-            request_ids,
-            mode,
-            args.fail_fast,
-            environment,
-            resolved_cookie_jar_id.as_deref(),
-            verbose,
-        )
-        .await;
+        return send_many(ctx, request_ids, mode, args.fail_fast, environment, verbose).await;
     }
 
-    if let Ok(workspace) = ctx.db().get_workspace(&args.id) {
-        let resolved_cookie_jar_id =
-            request::resolve_cookie_jar_id(ctx, &workspace.id, cookie_jar_id)?;
+    if ctx.db().get_workspace(&args.id).is_ok() {
         let request_ids = collect_workspace_request_ids(ctx, &args.id)?;
         if request_ids.is_empty() {
            println!("No requests found in workspace {}", args.id);
            return Ok(());
        }
-        return send_many(
-            ctx,
-            request_ids,
-            mode,
-            args.fail_fast,
-            environment,
-            resolved_cookie_jar_id.as_deref(),
-            verbose,
-        )
-        .await;
+        return send_many(ctx, request_ids, mode, args.fail_fast, environment, verbose).await;
     }
 
     Err(format!("Could not resolve ID '{}' as request, folder, or workspace", args.id))
@@ -173,7 +131,6 @@ async fn send_many(
     mode: ExecutionMode,
     fail_fast: bool,
     environment: Option<&str>,
-    cookie_jar_id: Option<&str>,
     verbose: bool,
 ) -> Result<(), String> {
     let mut success_count = 0usize;
@@ -182,15 +139,7 @@ async fn send_many(
     match mode {
         ExecutionMode::Sequential => {
             for request_id in request_ids {
-                match request::send_request_by_id(
-                    ctx,
-                    &request_id,
-                    environment,
-                    cookie_jar_id,
-                    verbose,
-                )
-                .await
-                {
+                match request::send_request_by_id(ctx, &request_id, environment, verbose).await {
                     Ok(()) => success_count += 1,
                     Err(error) => {
                         failures.push((request_id, error));
@@ -207,14 +156,7 @@ async fn send_many(
                 .map(|request_id| async move {
                     (
                         request_id.clone(),
-                        request::send_request_by_id(
-                            ctx,
-                            request_id,
-                            environment,
-                            cookie_jar_id,
-                            verbose,
-                        )
-                        .await,
+                        request::send_request_by_id(ctx, request_id, environment, verbose).await,
                     )
                 })
                .collect::<Vec<_>>();

View File

@@ -4,8 +4,6 @@ use crate::utils::confirm::confirm_delete;
 use crate::utils::json::{
     apply_merge_patch, parse_optional_json, parse_required_json, require_id, validate_create_id,
 };
-use crate::utils::schema::append_agent_hints;
-use schemars::schema_for;
 use yaak_models::models::Workspace;
 use yaak_models::util::UpdateSource;
 
@@ -14,7 +12,6 @@ type CommandResult<T = ()> = std::result::Result<T, String>;
 pub fn run(ctx: &CliContext, args: WorkspaceArgs) -> i32 {
     let result = match args.command {
         WorkspaceCommands::List => list(ctx),
-        WorkspaceCommands::Schema { pretty } => schema(pretty),
         WorkspaceCommands::Show { workspace_id } => show(ctx, &workspace_id),
         WorkspaceCommands::Create { name, json, json_input } => create(ctx, name, json, json_input),
         WorkspaceCommands::Update { json, json_input } => update(ctx, json, json_input),
@@ -30,18 +27,6 @@ pub fn run(ctx: &CliContext, args: WorkspaceArgs) -> i32 {
     }
 }
 
-fn schema(pretty: bool) -> CommandResult {
-    let mut schema = serde_json::to_value(schema_for!(Workspace))
-        .map_err(|e| format!("Failed to serialize workspace schema: {e}"))?;
-    append_agent_hints(&mut schema);
-    let output =
-        if pretty { serde_json::to_string_pretty(&schema) } else { serde_json::to_string(&schema) }
-            .map_err(|e| format!("Failed to format workspace schema JSON: {e}"))?;
-    println!("{output}");
-    Ok(())
-}
-
 fn list(ctx: &CliContext) -> CommandResult {
     let workspaces =
         ctx.db().list_workspaces().map_err(|e| format!("Failed to list workspaces: {e}"))?;

View File

@@ -1,6 +1,4 @@
 use crate::plugin_events::CliPluginEventBridge;
-use include_dir::{Dir, include_dir};
-use std::fs;
 use std::path::{Path, PathBuf};
 use std::sync::Arc;
 use tokio::sync::Mutex;
@@ -11,21 +9,6 @@ use yaak_models::query_manager::QueryManager;
 use yaak_plugins::events::PluginContext;
 use yaak_plugins::manager::PluginManager;
 
-const EMBEDDED_PLUGIN_RUNTIME: &str = include_str!(concat!(
-    env!("CARGO_MANIFEST_DIR"),
-    "/../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs"
-));
-
-static EMBEDDED_VENDORED_PLUGINS: Dir<'_> =
-    include_dir!("$CARGO_MANIFEST_DIR/../../crates-tauri/yaak-app/vendored/plugins");
-
-#[derive(Clone, Debug, Default)]
-pub struct CliExecutionContext {
-    pub request_id: Option<String>,
-    pub workspace_id: Option<String>,
-    pub environment_id: Option<String>,
-    pub cookie_jar_id: Option<String>,
-}
-
 pub struct CliContext {
     data_dir: PathBuf,
     query_manager: QueryManager,
@@ -36,71 +19,68 @@ pub struct CliContext {
 }
 
 impl CliContext {
-    pub fn new(data_dir: PathBuf, app_id: &str) -> Self {
+    pub async fn initialize(data_dir: PathBuf, app_id: &str, with_plugins: bool) -> Self {
         let db_path = data_dir.join("db.sqlite");
         let blob_path = data_dir.join("blobs.sqlite");
-        let (query_manager, blob_manager, _rx) =
-            match yaak_models::init_standalone(&db_path, &blob_path) {
-                Ok(v) => v,
-                Err(err) => {
-                    eprintln!("Error: Failed to initialize database: {err}");
-                    std::process::exit(1);
-                }
-            };
+
+        let (query_manager, blob_manager, _rx) = yaak_models::init_standalone(&db_path, &blob_path)
+            .expect("Failed to initialize database");
         let encryption_manager = Arc::new(EncryptionManager::new(query_manager.clone(), app_id));
+
+        let plugin_manager = if with_plugins {
+            let vendored_plugin_dir = data_dir.join("vendored-plugins");
+            let installed_plugin_dir = data_dir.join("installed-plugins");
+            let node_bin_path = PathBuf::from("node");
+            let plugin_runtime_main =
+                std::env::var("YAAK_PLUGIN_RUNTIME").map(PathBuf::from).unwrap_or_else(|_| {
+                    PathBuf::from(env!("CARGO_MANIFEST_DIR"))
+                        .join("../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs")
+                });
+            let plugin_manager = Arc::new(
+                PluginManager::new(
+                    vendored_plugin_dir,
+                    installed_plugin_dir,
+                    node_bin_path,
+                    plugin_runtime_main,
+                    false,
+                )
+                .await,
+            );
+
+            let plugins = query_manager.connect().list_plugins().unwrap_or_default();
+            if !plugins.is_empty() {
+                let errors = plugin_manager
+                    .initialize_all_plugins(plugins, &PluginContext::new_empty())
+                    .await;
+                for (plugin_dir, error_msg) in errors {
+                    eprintln!(
+                        "Warning: Failed to initialize plugin '{}': {}",
+                        plugin_dir, error_msg
+                    );
+                }
+            }
+
+            Some(plugin_manager)
+        } else {
+            None
+        };
+
+        let plugin_event_bridge = if let Some(plugin_manager) = &plugin_manager {
+            Some(CliPluginEventBridge::start(plugin_manager.clone(), query_manager.clone()).await)
+        } else {
+            None
+        };
+
         Self {
             data_dir,
             query_manager,
             blob_manager,
             encryption_manager,
-            plugin_manager: None,
-            plugin_event_bridge: Mutex::new(None),
+            plugin_manager,
+            plugin_event_bridge: Mutex::new(plugin_event_bridge),
         }
     }
-
-    pub async fn init_plugins(&mut self, execution_context: CliExecutionContext) {
-        let vendored_plugin_dir = self.data_dir.join("vendored-plugins");
-        let installed_plugin_dir = self.data_dir.join("installed-plugins");
-        let node_bin_path = PathBuf::from("node");
-        prepare_embedded_vendored_plugins(&vendored_plugin_dir)
-            .expect("Failed to prepare bundled plugins");
-        let plugin_runtime_main =
-            std::env::var("YAAK_PLUGIN_RUNTIME").map(PathBuf::from).unwrap_or_else(|_| {
-                prepare_embedded_plugin_runtime(&self.data_dir)
-                    .expect("Failed to prepare embedded plugin runtime")
-            });
-        match PluginManager::new(
-            vendored_plugin_dir,
-            installed_plugin_dir,
-            node_bin_path,
-            plugin_runtime_main,
-            &self.query_manager,
-            &PluginContext::new_empty(),
-            false,
-        )
-        .await
-        {
-            Ok(plugin_manager) => {
-                let plugin_manager = Arc::new(plugin_manager);
-                let plugin_event_bridge = CliPluginEventBridge::start(
-                    plugin_manager.clone(),
-                    self.query_manager.clone(),
-                    self.blob_manager.clone(),
-                    self.encryption_manager.clone(),
-                    self.data_dir.clone(),
-                    execution_context,
-                )
-                .await;
-                self.plugin_manager = Some(plugin_manager);
-                *self.plugin_event_bridge.lock().await = Some(plugin_event_bridge);
-            }
-            Err(err) => {
-                eprintln!("Warning: Failed to initialize plugins: {err}");
-            }
-        }
-    }
@@ -133,17 +113,3 @@ impl CliContext {
         }
     }
 }
-
-fn prepare_embedded_plugin_runtime(data_dir: &Path) -> std::io::Result<PathBuf> {
-    let runtime_dir = data_dir.join("vendored").join("plugin-runtime");
-    fs::create_dir_all(&runtime_dir)?;
-    let runtime_main = runtime_dir.join("index.cjs");
-    fs::write(&runtime_main, EMBEDDED_PLUGIN_RUNTIME)?;
-    Ok(runtime_main)
-}
-
-fn prepare_embedded_vendored_plugins(vendored_plugin_dir: &Path) -> std::io::Result<()> {
-    fs::create_dir_all(vendored_plugin_dir)?;
-    EMBEDDED_VENDORED_PLUGINS.extract(vendored_plugin_dir)?;
-    Ok(())
-}

View File

@@ -2,282 +2,51 @@ mod cli;
 mod commands;
 mod context;
 mod plugin_events;
-mod ui;
 mod utils;
-mod version;
-mod version_check;
 
 use clap::Parser;
-use cli::{Cli, Commands, PluginCommands, RequestCommands};
-use context::{CliContext, CliExecutionContext};
-use std::path::PathBuf;
-use yaak_models::queries::any_request::AnyRequest;
+use cli::{Cli, Commands, RequestCommands};
+use context::CliContext;
 
 #[tokio::main]
 async fn main() {
-    let Cli { data_dir, environment, cookie_jar, verbose, log, command } = Cli::parse();
-    if let Some(log_level) = log {
-        match log_level {
-            Some(level) => {
-                env_logger::Builder::new().filter_level(level.as_filter()).init();
-            }
-            None => {
-                env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info"))
-                    .init();
-            }
-        }
+    let Cli { data_dir, environment, verbose, command } = Cli::parse();
+    if verbose {
+        env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info")).init();
     }
 
     let app_id = if cfg!(debug_assertions) { "app.yaak.desktop.dev" } else { "app.yaak.desktop" };
-    let data_dir = data_dir.unwrap_or_else(|| resolve_data_dir(app_id));
+    let data_dir = data_dir.unwrap_or_else(|| {
+        dirs::data_dir().expect("Could not determine data directory").join(app_id)
+    });
 
-    version_check::maybe_check_for_updates().await;
+    let needs_plugins = matches!(
+        &command,
+        Commands::Send(_)
+            | Commands::Request(cli::RequestArgs {
+                command: RequestCommands::Send { .. } | RequestCommands::Schema { .. },
+            })
+    );
+    let context = CliContext::initialize(data_dir, app_id, needs_plugins).await;
 
     let exit_code = match command {
-        Commands::Auth(args) => commands::auth::run(args).await,
-        Commands::Plugin(args) => match args.command {
-            PluginCommands::Build(args) => commands::plugin::run_build(args).await,
-            PluginCommands::Dev(args) => commands::plugin::run_dev(args).await,
-            PluginCommands::Generate(args) => commands::plugin::run_generate(args).await,
-            PluginCommands::Publish(args) => commands::plugin::run_publish(args).await,
-            PluginCommands::Install(install_args) => {
-                let mut context = CliContext::new(data_dir.clone(), app_id);
-                context.init_plugins(CliExecutionContext::default()).await;
-                let exit_code = commands::plugin::run_install(&context, install_args).await;
-                context.shutdown().await;
-                exit_code
-            }
-        },
+        Commands::Build(args) => commands::plugin::run_build(args).await,
+        Commands::Dev(args) => commands::plugin::run_dev(args).await,
+        Commands::Generate(args) => commands::plugin::run_generate(args).await,
+        Commands::Publish(args) => commands::plugin::run_publish(args).await,
Commands::Send(args) => { Commands::Send(args) => {
let mut context = CliContext::new(data_dir.clone(), app_id); commands::send::run(&context, args, environment.as_deref(), verbose).await
match resolve_send_execution_context(
&context,
&args.id,
environment.as_deref(),
cookie_jar.as_deref(),
) {
Ok(execution_context) => {
context.init_plugins(execution_context).await;
let exit_code = commands::send::run(
&context,
args,
environment.as_deref(),
cookie_jar.as_deref(),
verbose,
)
.await;
context.shutdown().await;
exit_code
}
Err(error) => {
eprintln!("Error: {error}");
1
}
}
}
Commands::CookieJar(args) => {
let context = CliContext::new(data_dir.clone(), app_id);
let exit_code = commands::cookie_jar::run(&context, args);
context.shutdown().await;
exit_code
}
Commands::Workspace(args) => {
let context = CliContext::new(data_dir.clone(), app_id);
let exit_code = commands::workspace::run(&context, args);
context.shutdown().await;
exit_code
} }
Commands::Workspace(args) => commands::workspace::run(&context, args),
Commands::Request(args) => { Commands::Request(args) => {
let mut context = CliContext::new(data_dir.clone(), app_id); commands::request::run(&context, args, environment.as_deref(), verbose).await
let execution_context_result = match &args.command {
RequestCommands::Send { request_id } => resolve_request_execution_context(
&context,
request_id,
environment.as_deref(),
cookie_jar.as_deref(),
),
_ => Ok(CliExecutionContext::default()),
};
match execution_context_result {
Ok(execution_context) => {
let with_plugins = matches!(
&args.command,
RequestCommands::Send { .. } | RequestCommands::Schema { .. }
);
if with_plugins {
context.init_plugins(execution_context).await;
}
let exit_code = commands::request::run(
&context,
args,
environment.as_deref(),
cookie_jar.as_deref(),
verbose,
)
.await;
context.shutdown().await;
exit_code
}
Err(error) => {
eprintln!("Error: {error}");
1
}
}
}
Commands::Folder(args) => {
let context = CliContext::new(data_dir.clone(), app_id);
let exit_code = commands::folder::run(&context, args);
context.shutdown().await;
exit_code
}
Commands::Environment(args) => {
let context = CliContext::new(data_dir.clone(), app_id);
let exit_code = commands::environment::run(&context, args);
context.shutdown().await;
exit_code
} }
Commands::Folder(args) => commands::folder::run(&context, args),
Commands::Environment(args) => commands::environment::run(&context, args),
}; };
context.shutdown().await;
if exit_code != 0 { if exit_code != 0 {
std::process::exit(exit_code); std::process::exit(exit_code);
} }
} }
fn resolve_send_execution_context(
context: &CliContext,
id: &str,
environment: Option<&str>,
explicit_cookie_jar_id: Option<&str>,
) -> Result<CliExecutionContext, String> {
if let Ok(request) = context.db().get_any_request(id) {
let (request_id, workspace_id) = match request {
AnyRequest::HttpRequest(r) => (Some(r.id), r.workspace_id),
AnyRequest::GrpcRequest(r) => (Some(r.id), r.workspace_id),
AnyRequest::WebsocketRequest(r) => (Some(r.id), r.workspace_id),
};
let cookie_jar_id = resolve_cookie_jar_id(context, &workspace_id, explicit_cookie_jar_id)?;
return Ok(CliExecutionContext {
request_id,
workspace_id: Some(workspace_id),
environment_id: environment.map(str::to_string),
cookie_jar_id,
});
}
if let Ok(folder) = context.db().get_folder(id) {
let cookie_jar_id =
resolve_cookie_jar_id(context, &folder.workspace_id, explicit_cookie_jar_id)?;
return Ok(CliExecutionContext {
request_id: None,
workspace_id: Some(folder.workspace_id),
environment_id: environment.map(str::to_string),
cookie_jar_id,
});
}
if let Ok(workspace) = context.db().get_workspace(id) {
let cookie_jar_id = resolve_cookie_jar_id(context, &workspace.id, explicit_cookie_jar_id)?;
return Ok(CliExecutionContext {
request_id: None,
workspace_id: Some(workspace.id),
environment_id: environment.map(str::to_string),
cookie_jar_id,
});
}
Err(format!("Could not resolve ID '{}' as request, folder, or workspace", id))
}
fn resolve_request_execution_context(
context: &CliContext,
request_id: &str,
environment: Option<&str>,
explicit_cookie_jar_id: Option<&str>,
) -> Result<CliExecutionContext, String> {
let request = context
.db()
.get_any_request(request_id)
.map_err(|e| format!("Failed to get request: {e}"))?;
let workspace_id = match request {
AnyRequest::HttpRequest(r) => r.workspace_id,
AnyRequest::GrpcRequest(r) => r.workspace_id,
AnyRequest::WebsocketRequest(r) => r.workspace_id,
};
let cookie_jar_id = resolve_cookie_jar_id(context, &workspace_id, explicit_cookie_jar_id)?;
Ok(CliExecutionContext {
request_id: Some(request_id.to_string()),
workspace_id: Some(workspace_id),
environment_id: environment.map(str::to_string),
cookie_jar_id,
})
}
fn resolve_cookie_jar_id(
context: &CliContext,
workspace_id: &str,
explicit_cookie_jar_id: Option<&str>,
) -> Result<Option<String>, String> {
if let Some(cookie_jar_id) = explicit_cookie_jar_id {
return Ok(Some(cookie_jar_id.to_string()));
}
let default_cookie_jar = context
.db()
.list_cookie_jars(workspace_id)
.map_err(|e| format!("Failed to list cookie jars: {e}"))?
.into_iter()
.min_by_key(|jar| jar.created_at)
.map(|jar| jar.id);
Ok(default_cookie_jar)
}
fn resolve_data_dir(app_id: &str) -> PathBuf {
if let Some(dir) = wsl_data_dir(app_id) {
return dir;
}
dirs::data_dir().expect("Could not determine data directory").join(app_id)
}
/// Detect WSL and resolve the Windows AppData\Roaming path for the Yaak data directory.
fn wsl_data_dir(app_id: &str) -> Option<PathBuf> {
if !cfg!(target_os = "linux") {
return None;
}
let proc_version = std::fs::read_to_string("/proc/version").ok()?;
let is_wsl = proc_version.to_lowercase().contains("microsoft");
if !is_wsl {
return None;
}
// We're in WSL, so try to resolve the Yaak app's data directory in Windows
// Get the Windows %APPDATA% path via cmd.exe
let appdata_output =
std::process::Command::new("cmd.exe").args(["/C", "echo", "%APPDATA%"]).output().ok()?;
let win_path = String::from_utf8(appdata_output.stdout).ok()?.trim().to_string();
if win_path.is_empty() || win_path == "%APPDATA%" {
return None;
}
// Convert Windows path to WSL path using wslpath (handles custom mount points)
let wslpath_output = std::process::Command::new("wslpath").arg(&win_path).output().ok()?;
let wsl_appdata = String::from_utf8(wslpath_output.stdout).ok()?.trim().to_string();
if wsl_appdata.is_empty() {
return None;
}
let wsl_path = PathBuf::from(wsl_appdata).join(app_id);
if wsl_path.exists() { Some(wsl_path) } else { None }
}
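The WSL branch above ultimately hinges on a single string check against `/proc/version`. A minimal, testable restatement of that predicate (the `is_wsl_kernel` name is ours, not from the diff):

```rust
// Hypothetical helper mirroring the detection step in wsl_data_dir:
// WSL kernels report "microsoft" (case-insensitively) in /proc/version.
fn is_wsl_kernel(proc_version: &str) -> bool {
    proc_version.to_lowercase().contains("microsoft")
}

fn main() {
    // A typical WSL2 kernel string matches.
    assert!(is_wsl_kernel(
        "Linux version 5.15.90.1-microsoft-standard-WSL2 (gcc ...)"
    ));
    // A stock distro kernel does not.
    assert!(!is_wsl_kernel("Linux version 6.8.0-45-generic (buildd@lcy02)"));
    println!("ok");
}
```

Keeping the predicate pure like this would also make it unit-testable without an actual WSL environment, since the `/proc/version` read and the `wslpath` invocation stay at the edges.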
File diff suppressed because it is too large
View File
@@ -1,42 +0,0 @@
use console::style;
use std::io::{self, IsTerminal};
pub fn info(message: &str) {
if io::stdout().is_terminal() {
println!("{:<8} {}", style("INFO").cyan().bold(), style(message).cyan());
} else {
println!("INFO {message}");
}
}
pub fn warning(message: &str) {
if io::stdout().is_terminal() {
println!("{:<8} {}", style("WARNING").yellow().bold(), style(message).yellow());
} else {
println!("WARNING {message}");
}
}
pub fn warning_stderr(message: &str) {
if io::stderr().is_terminal() {
eprintln!("{:<8} {}", style("WARNING").yellow().bold(), style(message).yellow());
} else {
eprintln!("WARNING {message}");
}
}
pub fn success(message: &str) {
if io::stdout().is_terminal() {
println!("{:<8} {}", style("SUCCESS").green().bold(), style(message).green());
} else {
println!("SUCCESS {message}");
}
}
pub fn error(message: &str) {
if io::stderr().is_terminal() {
eprintln!("{:<8} {}", style("ERROR").red().bold(), style(message).red());
} else {
eprintln!("Error: {message}");
}
}
View File
@@ -1,47 +0,0 @@
use reqwest::Client;
use reqwest::header::{HeaderMap, HeaderName, HeaderValue, USER_AGENT};
use serde_json::Value;
pub fn build_client(session_token: Option<&str>) -> Result<Client, String> {
let mut headers = HeaderMap::new();
let user_agent = HeaderValue::from_str(&user_agent())
.map_err(|e| format!("Failed to build user-agent header: {e}"))?;
headers.insert(USER_AGENT, user_agent);
if let Some(token) = session_token {
let token_value = HeaderValue::from_str(token)
.map_err(|e| format!("Failed to build session header: {e}"))?;
headers.insert(HeaderName::from_static("x-yaak-session"), token_value);
}
Client::builder()
.default_headers(headers)
.build()
.map_err(|e| format!("Failed to initialize HTTP client: {e}"))
}
pub fn parse_api_error(status: u16, body: &str) -> String {
if let Ok(value) = serde_json::from_str::<Value>(body) {
if let Some(message) = value.get("message").and_then(Value::as_str) {
return message.to_string();
}
if let Some(error) = value.get("error").and_then(Value::as_str) {
return error.to_string();
}
}
format!("API error {status}: {body}")
}
fn user_agent() -> String {
format!("YaakCli/{} ({})", crate::version::cli_version(), ua_platform())
}
fn ua_platform() -> &'static str {
match std::env::consts::OS {
"windows" => "Win",
"darwin" => "Mac",
"linux" => "Linux",
_ => "Unknown",
}
}
View File
@@ -63,30 +63,6 @@ pub fn validate_create_id(payload: &Value, context: &str) -> JsonResult<()> {
} }
} }
pub fn merge_workspace_id_arg(
workspace_id_from_arg: Option<&str>,
payload_workspace_id: &mut String,
context: &str,
) -> JsonResult<()> {
if let Some(workspace_id_arg) = workspace_id_from_arg {
if payload_workspace_id.is_empty() {
*payload_workspace_id = workspace_id_arg.to_string();
} else if payload_workspace_id != workspace_id_arg {
return Err(format!(
"{context} got conflicting workspace_id values between positional arg and JSON payload"
));
}
}
if payload_workspace_id.is_empty() {
return Err(format!(
"{context} requires non-empty \"workspaceId\" in JSON payload or positional workspace_id"
));
}
Ok(())
}
pub fn apply_merge_patch<T>(existing: &T, patch: &Value, id: &str, context: &str) -> JsonResult<T> pub fn apply_merge_patch<T>(existing: &T, patch: &Value, id: &str, context: &str) -> JsonResult<T>
where where
T: Serialize + DeserializeOwned, T: Serialize + DeserializeOwned,
View File
@@ -1,5 +1,2 @@
pub mod confirm; pub mod confirm;
pub mod http;
pub mod json; pub mod json;
pub mod schema;
pub mod workspace;
View File
@@ -1,15 +0,0 @@
use serde_json::{Value, json};
pub fn append_agent_hints(schema: &mut Value) {
let Some(schema_obj) = schema.as_object_mut() else {
return;
};
schema_obj.insert(
"x-yaak-agent-hints".to_string(),
json!({
"templateVariableSyntax": "${[ my_var ]}",
"templateFunctionSyntax": "${[ namespace.my_func(a='aaa',b='bbb') ]}",
}),
);
}
View File
@@ -1,19 +0,0 @@
use crate::context::CliContext;
pub fn resolve_workspace_id(
ctx: &CliContext,
workspace_id: Option<&str>,
command_name: &str,
) -> Result<String, String> {
if let Some(workspace_id) = workspace_id {
return Ok(workspace_id.to_string());
}
let workspaces =
ctx.db().list_workspaces().map_err(|e| format!("Failed to list workspaces: {e}"))?;
match workspaces.as_slice() {
[] => Err(format!("No workspaces found. {command_name} requires a workspace ID.")),
[workspace] => Ok(workspace.id.clone()),
_ => Err(format!("Multiple workspaces found. {command_name} requires a workspace ID.")),
}
}
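The disambiguation rule in `resolve_workspace_id` (explicit ID wins; a lone workspace is used implicitly; zero or many is an error) can be sketched in isolation with a slice pattern. The `resolve_id` name and `&[&str]` signature here are ours, simplified from the CLI version:

```rust
// Sketch of the workspace-resolution rule, assuming IDs are plain strings.
fn resolve_id(explicit: Option<&str>, known: &[&str]) -> Result<String, String> {
    if let Some(id) = explicit {
        return Ok(id.to_string()); // an explicit ID always wins
    }
    match known {
        [] => Err("no workspaces found".to_string()),
        [only] => Ok(only.to_string()), // exactly one: use it implicitly
        _ => Err("multiple workspaces found; pass an ID".to_string()),
    }
}

fn main() {
    assert_eq!(resolve_id(None, &["wk_1"]), Ok("wk_1".to_string()));
    assert_eq!(resolve_id(Some("wk_9"), &["wk_1", "wk_2"]), Ok("wk_9".to_string()));
    assert!(resolve_id(None, &["wk_1", "wk_2"]).is_err());
    assert!(resolve_id(None, &[]).is_err());
}
```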
View File
@@ -1,3 +0,0 @@
pub fn cli_version() -> &'static str {
option_env!("YAAK_CLI_VERSION").unwrap_or(env!("CARGO_PKG_VERSION"))
}
View File
@@ -1,226 +0,0 @@
use crate::ui;
use crate::version;
use serde::{Deserialize, Serialize};
use std::fs;
use std::io::IsTerminal;
use std::path::{Path, PathBuf};
use std::time::{Duration, SystemTime, UNIX_EPOCH};
use yaak_api::{ApiClientKind, yaak_api_client};
const CACHE_FILE_NAME: &str = "cli-version-check.json";
const CHECK_INTERVAL_SECS: u64 = 24 * 60 * 60;
const REQUEST_TIMEOUT: Duration = Duration::from_millis(800);
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
#[serde(default)]
struct VersionCheckResponse {
outdated: bool,
latest_version: Option<String>,
upgrade_hint: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(default)]
struct CacheRecord {
checked_at_epoch_secs: u64,
response: VersionCheckResponse,
last_warned_at_epoch_secs: Option<u64>,
last_warned_version: Option<String>,
}
impl Default for CacheRecord {
fn default() -> Self {
Self {
checked_at_epoch_secs: 0,
response: VersionCheckResponse::default(),
last_warned_at_epoch_secs: None,
last_warned_version: None,
}
}
}
#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
struct VersionCheckRequest<'a> {
current_version: &'a str,
channel: String,
install_source: String,
platform: &'a str,
arch: &'a str,
}
pub async fn maybe_check_for_updates() {
if should_skip_check() {
return;
}
let now = unix_epoch_secs();
let cache_path = cache_path();
let cached = read_cache(&cache_path);
if let Some(cache) = cached.as_ref().filter(|c| !is_expired(c.checked_at_epoch_secs, now)) {
let mut record = cache.clone();
maybe_warn_outdated(&mut record, now);
write_cache(&cache_path, &record);
return;
}
let fresh = fetch_version_check().await;
match fresh {
Some(response) => {
let mut record = CacheRecord {
checked_at_epoch_secs: now,
response: response.clone(),
last_warned_at_epoch_secs: cached
.as_ref()
.and_then(|c| c.last_warned_at_epoch_secs),
last_warned_version: cached.as_ref().and_then(|c| c.last_warned_version.clone()),
};
maybe_warn_outdated(&mut record, now);
write_cache(&cache_path, &record);
}
None => {
let fallback = cached.as_ref().map(|cache| cache.response.clone()).unwrap_or_default();
let mut record = CacheRecord {
checked_at_epoch_secs: now,
response: fallback,
last_warned_at_epoch_secs: cached
.as_ref()
.and_then(|c| c.last_warned_at_epoch_secs),
last_warned_version: cached.as_ref().and_then(|c| c.last_warned_version.clone()),
};
maybe_warn_outdated(&mut record, now);
write_cache(&cache_path, &record);
}
}
}
fn should_skip_check() -> bool {
if std::env::var("YAAK_CLI_NO_UPDATE_CHECK")
.is_ok_and(|v| v == "1" || v.eq_ignore_ascii_case("true"))
{
return true;
}
if std::env::var("CI").is_ok() {
return true;
}
!std::io::stdout().is_terminal()
}
async fn fetch_version_check() -> Option<VersionCheckResponse> {
let api_url = format!("{}/cli/check", update_base_url());
let current_version = version::cli_version();
let payload = VersionCheckRequest {
current_version,
channel: release_channel(current_version),
install_source: install_source(),
platform: std::env::consts::OS,
arch: std::env::consts::ARCH,
};
let client = yaak_api_client(ApiClientKind::Cli, current_version).ok()?;
let request = client.post(api_url).json(&payload);
let response = tokio::time::timeout(REQUEST_TIMEOUT, request.send()).await.ok()?.ok()?;
if !response.status().is_success() {
return None;
}
tokio::time::timeout(REQUEST_TIMEOUT, response.json::<VersionCheckResponse>()).await.ok()?.ok()
}
fn release_channel(version: &str) -> String {
version
.split_once('-')
.and_then(|(_, suffix)| suffix.split('.').next())
.unwrap_or("stable")
.to_string()
}
fn install_source() -> String {
std::env::var("YAAK_CLI_INSTALL_SOURCE")
.ok()
.filter(|s| !s.trim().is_empty())
.unwrap_or_else(|| "source".to_string())
}
fn update_base_url() -> &'static str {
match std::env::var("ENVIRONMENT").ok().as_deref() {
Some("development") => "http://localhost:9444",
_ => "https://update.yaak.app",
}
}
fn maybe_warn_outdated(record: &mut CacheRecord, now: u64) {
if !record.response.outdated {
return;
}
let latest =
record.response.latest_version.clone().unwrap_or_else(|| "a newer release".to_string());
let warn_suppressed = record.last_warned_version.as_deref() == Some(latest.as_str())
&& record.last_warned_at_epoch_secs.is_some_and(|t| !is_expired(t, now));
if warn_suppressed {
return;
}
let hint = record.response.upgrade_hint.clone().unwrap_or_else(default_upgrade_hint);
ui::warning_stderr(&format!("A newer Yaak CLI version is available ({latest}). {hint}"));
record.last_warned_version = Some(latest);
record.last_warned_at_epoch_secs = Some(now);
}
fn default_upgrade_hint() -> String {
if install_source() == "npm" {
let channel = release_channel(version::cli_version());
if channel == "stable" {
return "Run `npm install -g @yaakapp/cli@latest` to update.".to_string();
}
return format!("Run `npm install -g @yaakapp/cli@{channel}` to update.");
}
"Update your Yaak CLI installation to the latest release.".to_string()
}
fn cache_path() -> PathBuf {
std::env::temp_dir().join("yaak-cli").join(format!("{}-{CACHE_FILE_NAME}", environment_name()))
}
fn environment_name() -> &'static str {
match std::env::var("ENVIRONMENT").ok().as_deref() {
Some("staging") => "staging",
Some("development") => "development",
_ => "production",
}
}
fn read_cache(path: &Path) -> Option<CacheRecord> {
let contents = fs::read_to_string(path).ok()?;
serde_json::from_str::<CacheRecord>(&contents).ok()
}
fn write_cache(path: &Path, record: &CacheRecord) {
let Some(parent) = path.parent() else {
return;
};
if fs::create_dir_all(parent).is_err() {
return;
}
let Ok(json) = serde_json::to_string(record) else {
return;
};
let _ = fs::write(path, json);
}
fn is_expired(checked_at_epoch_secs: u64, now: u64) -> bool {
now.saturating_sub(checked_at_epoch_secs) >= CHECK_INTERVAL_SECS
}
fn unix_epoch_secs() -> u64 {
SystemTime::now()
.duration_since(UNIX_EPOCH)
.unwrap_or_else(|_| Duration::from_secs(0))
.as_secs()
}
View File
@@ -1,14 +1,9 @@
use std::io::{Read, Write}; use std::io::{Read, Write};
use std::net::{SocketAddr, TcpListener, TcpStream}; use std::net::TcpListener;
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::thread; use std::thread;
use std::time::Duration;
pub struct TestHttpServer { pub struct TestHttpServer {
pub url: String, pub url: String,
addr: SocketAddr,
shutdown: Arc<AtomicBool>,
handle: Option<thread::JoinHandle<()>>, handle: Option<thread::JoinHandle<()>>,
} }
@@ -17,46 +12,29 @@ impl TestHttpServer {
let listener = TcpListener::bind("127.0.0.1:0").expect("Failed to bind test HTTP server"); let listener = TcpListener::bind("127.0.0.1:0").expect("Failed to bind test HTTP server");
let addr = listener.local_addr().expect("Failed to get local addr"); let addr = listener.local_addr().expect("Failed to get local addr");
let url = format!("http://{addr}/test"); let url = format!("http://{addr}/test");
listener.set_nonblocking(true).expect("Failed to set test server listener nonblocking");
let shutdown = Arc::new(AtomicBool::new(false));
let shutdown_signal = Arc::clone(&shutdown);
let body_bytes = body.as_bytes().to_vec(); let body_bytes = body.as_bytes().to_vec();
let handle = thread::spawn(move || { let handle = thread::spawn(move || {
while !shutdown_signal.load(Ordering::Relaxed) { if let Ok((mut stream, _)) = listener.accept() {
match listener.accept() { let mut request_buf = [0u8; 4096];
Ok((mut stream, _)) => { let _ = stream.read(&mut request_buf);
let _ = stream.set_read_timeout(Some(Duration::from_secs(1)));
let mut request_buf = [0u8; 4096];
let _ = stream.read(&mut request_buf);
let response = format!( let response = format!(
"HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\nContent-Length: {}\r\nConnection: close\r\n\r\n", "HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\nContent-Length: {}\r\nConnection: close\r\n\r\n",
body_bytes.len() body_bytes.len()
); );
let _ = stream.write_all(response.as_bytes()); let _ = stream.write_all(response.as_bytes());
let _ = stream.write_all(&body_bytes); let _ = stream.write_all(&body_bytes);
let _ = stream.flush(); let _ = stream.flush();
break;
}
Err(err) if err.kind() == std::io::ErrorKind::WouldBlock => {
thread::sleep(Duration::from_millis(10));
}
Err(_) => break,
}
} }
}); });
Self { url, addr, shutdown, handle: Some(handle) } Self { url, handle: Some(handle) }
} }
} }
impl Drop for TestHttpServer { impl Drop for TestHttpServer {
fn drop(&mut self) { fn drop(&mut self) {
self.shutdown.store(true, Ordering::Relaxed);
let _ = TcpStream::connect(self.addr);
if let Some(handle) = self.handle.take() { if let Some(handle) = self.handle.take() {
let _ = handle.join(); let _ = handle.join();
} }
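The simplified `TestHttpServer` above accepts exactly one connection, writes a fixed response, and lets its thread end so `join()` in `Drop` cannot hang after a successful request. A standalone sketch of that one-shot pattern, driven by a raw TCP client (the `roundtrip` name is ours):

```rust
use std::io::{Read, Write};
use std::net::{TcpListener, TcpStream};
use std::thread;

// One-shot server in the style of the simplified TestHttpServer: accept a
// single connection, consume the request, answer with a fixed body, and let
// the connection drop to signal end-of-response.
fn roundtrip(body: &[u8]) -> String {
    let listener = TcpListener::bind("127.0.0.1:0").expect("bind");
    let addr = listener.local_addr().expect("local addr");
    let body_bytes = body.to_vec();
    let handle = thread::spawn(move || {
        if let Ok((mut stream, _)) = listener.accept() {
            let mut request_buf = [0u8; 1024];
            let _ = stream.read(&mut request_buf); // consume the request
            let header = format!(
                "HTTP/1.1 200 OK\r\nContent-Length: {}\r\nConnection: close\r\n\r\n",
                body_bytes.len()
            );
            let _ = stream.write_all(header.as_bytes());
            let _ = stream.write_all(&body_bytes);
        } // stream drops here, closing the connection
    });

    let mut client = TcpStream::connect(addr).expect("connect");
    client.write_all(b"GET /test HTTP/1.1\r\nHost: x\r\n\r\n").expect("write");
    let mut reply = String::new();
    client.read_to_string(&mut reply).expect("read"); // returns at EOF
    handle.join().expect("join");
    reply
}

fn main() {
    let reply = roundtrip(b"hello");
    assert!(reply.starts_with("HTTP/1.1 200 OK"));
    assert!(reply.ends_with("hello"));
}
```

Note the tradeoff the diff makes: the one-shot design assumes every test issues exactly one request. The removed shutdown-flag version also covered the case where no request ever arrives, which with this simpler form would leave `accept()` blocked and `join()` waiting.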
View File
@@ -10,7 +10,7 @@ use yaak_models::query_manager::QueryManager;
use yaak_models::util::UpdateSource; use yaak_models::util::UpdateSource;
pub fn cli_cmd(data_dir: &Path) -> Command { pub fn cli_cmd(data_dir: &Path) -> Command {
let mut cmd = cargo_bin_cmd!("yaak"); let mut cmd = cargo_bin_cmd!("yaakcli");
cmd.arg("--data-dir").arg(data_dir); cmd.arg("--data-dir").arg(data_dir);
cmd cmd
} }
View File
@@ -78,69 +78,3 @@ fn json_create_and_update_merge_patch_round_trip() {
.stdout(contains("\"name\": \"Json Environment\"")) .stdout(contains("\"name\": \"Json Environment\""))
.stdout(contains("\"color\": \"#00ff00\"")); .stdout(contains("\"color\": \"#00ff00\""));
} }
#[test]
fn create_merges_positional_workspace_id_into_json_payload() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
let create_assert = cli_cmd(data_dir)
.args([
"environment",
"create",
"wk_test",
"--json",
r#"{"name":"Merged Environment"}"#,
])
.assert()
.success();
let environment_id = parse_created_id(&create_assert.get_output().stdout, "environment create");
cli_cmd(data_dir)
.args(["environment", "show", &environment_id])
.assert()
.success()
.stdout(contains("\"workspaceId\": \"wk_test\""))
.stdout(contains("\"name\": \"Merged Environment\""));
}
#[test]
fn create_rejects_conflicting_workspace_ids_between_arg_and_json() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
seed_workspace(data_dir, "wk_other");
cli_cmd(data_dir)
.args([
"environment",
"create",
"wk_test",
"--json",
r#"{"workspaceId":"wk_other","name":"Mismatch"}"#,
])
.assert()
.failure()
.stderr(contains(
"environment create got conflicting workspace_id values between positional arg and JSON payload",
));
}
#[test]
fn environment_schema_outputs_json_schema() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
cli_cmd(data_dir)
.args(["environment", "schema"])
.assert()
.success()
.stdout(contains("\"type\":\"object\""))
.stdout(contains("\"x-yaak-agent-hints\""))
.stdout(contains("\"templateVariableSyntax\":\"${[ my_var ]}\""))
.stdout(contains(
"\"templateFunctionSyntax\":\"${[ namespace.my_func(a='aaa',b='bbb') ]}\"",
))
.stdout(contains("\"workspaceId\""));
}
View File
@@ -72,51 +72,3 @@ fn json_create_and_update_merge_patch_round_trip() {
.stdout(contains("\"name\": \"Json Folder\"")) .stdout(contains("\"name\": \"Json Folder\""))
.stdout(contains("\"description\": \"Folder Description\"")); .stdout(contains("\"description\": \"Folder Description\""));
} }
#[test]
fn create_merges_positional_workspace_id_into_json_payload() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
let create_assert = cli_cmd(data_dir)
.args([
"folder",
"create",
"wk_test",
"--json",
r#"{"name":"Merged Folder"}"#,
])
.assert()
.success();
let folder_id = parse_created_id(&create_assert.get_output().stdout, "folder create");
cli_cmd(data_dir)
.args(["folder", "show", &folder_id])
.assert()
.success()
.stdout(contains("\"workspaceId\": \"wk_test\""))
.stdout(contains("\"name\": \"Merged Folder\""));
}
#[test]
fn create_rejects_conflicting_workspace_ids_between_arg_and_json() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
seed_workspace(data_dir, "wk_other");
cli_cmd(data_dir)
.args([
"folder",
"create",
"wk_test",
"--json",
r#"{"workspaceId":"wk_other","name":"Mismatch"}"#,
])
.assert()
.failure()
.stderr(contains(
"folder create got conflicting workspace_id values between positional arg and JSON payload",
));
}
View File
@@ -130,54 +130,6 @@ fn create_allows_workspace_only_with_empty_defaults() {
assert_eq!(request.url, ""); assert_eq!(request.url, "");
} }
#[test]
fn create_merges_positional_workspace_id_into_json_payload() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
let create_assert = cli_cmd(data_dir)
.args([
"request",
"create",
"wk_test",
"--json",
r#"{"name":"Merged Request","url":"https://example.com"}"#,
])
.assert()
.success();
let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");
cli_cmd(data_dir)
.args(["request", "show", &request_id])
.assert()
.success()
.stdout(contains("\"workspaceId\": \"wk_test\""))
.stdout(contains("\"name\": \"Merged Request\""));
}
#[test]
fn create_rejects_conflicting_workspace_ids_between_arg_and_json() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
seed_workspace(data_dir, "wk_other");
cli_cmd(data_dir)
.args([
"request",
"create",
"wk_test",
"--json",
r#"{"workspaceId":"wk_other","name":"Mismatch"}"#,
])
.assert()
.failure()
.stderr(contains(
"request create got conflicting workspace_id values between positional arg and JSON payload",
));
}
#[test] #[test]
fn request_send_persists_response_body_and_events() { fn request_send_persists_response_body_and_events() {
let temp_dir = TempDir::new().expect("Failed to create temp dir"); let temp_dir = TempDir::new().expect("Failed to create temp dir");
@@ -204,6 +156,7 @@ fn request_send_persists_response_body_and_events() {
.args(["request", "send", &request_id]) .args(["request", "send", &request_id])
.assert() .assert()
.success() .success()
.stdout(contains("HTTP 200 OK"))
.stdout(contains("hello from integration test")); .stdout(contains("hello from integration test"));
let qm = query_manager(data_dir); let qm = query_manager(data_dir);
@@ -236,26 +189,6 @@ fn request_schema_http_outputs_json_schema() {
.args(["request", "schema", "http"]) .args(["request", "schema", "http"])
.assert() .assert()
.success() .success()
.stdout(contains("\"type\":\"object\""))
.stdout(contains("\"x-yaak-agent-hints\""))
.stdout(contains("\"templateVariableSyntax\":\"${[ my_var ]}\""))
.stdout(contains(
"\"templateFunctionSyntax\":\"${[ namespace.my_func(a='aaa',b='bbb') ]}\"",
))
.stdout(contains("\"authentication\":"))
.stdout(contains("/foo/:id/comments/:commentId"))
.stdout(contains("put concrete values in `urlParameters`"));
}
#[test]
fn request_schema_http_pretty_prints_with_flag() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
cli_cmd(data_dir)
.args(["request", "schema", "http", "--pretty"])
.assert()
.success()
.stdout(contains("\"type\": \"object\"")) .stdout(contains("\"type\": \"object\""))
.stdout(contains("\"authentication\"")); .stdout(contains("\"authentication\""));
} }
View File
@@ -31,6 +31,7 @@ fn top_level_send_workspace_sends_http_requests_and_prints_summary() {
.args(["send", "wk_test"]) .args(["send", "wk_test"])
.assert() .assert()
.success() .success()
.stdout(contains("HTTP 200 OK"))
.stdout(contains("workspace bulk send")) .stdout(contains("workspace bulk send"))
.stdout(contains("Send summary: 1 succeeded, 0 failed")); .stdout(contains("Send summary: 1 succeeded, 0 failed"));
} }
@@ -61,6 +62,7 @@ fn top_level_send_folder_sends_http_requests_and_prints_summary() {
.args(["send", "fl_test"]) .args(["send", "fl_test"])
.assert() .assert()
.success() .success()
.stdout(contains("HTTP 200 OK"))
.stdout(contains("folder bulk send")) .stdout(contains("folder bulk send"))
.stdout(contains("Send summary: 1 succeeded, 0 failed")); .stdout(contains("Send summary: 1 succeeded, 0 failed"));
} }
View File
@@ -57,21 +57,3 @@ fn json_create_and_update_merge_patch_round_trip() {
.stdout(contains("\"name\": \"Json Workspace\"")) .stdout(contains("\"name\": \"Json Workspace\""))
.stdout(contains("\"description\": \"Updated via JSON\"")); .stdout(contains("\"description\": \"Updated via JSON\""));
} }
#[test]
fn workspace_schema_outputs_json_schema() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
cli_cmd(data_dir)
.args(["workspace", "schema"])
.assert()
.success()
.stdout(contains("\"type\":\"object\""))
.stdout(contains("\"x-yaak-agent-hints\""))
.stdout(contains("\"templateVariableSyntax\":\"${[ my_var ]}\""))
.stdout(contains(
"\"templateFunctionSyntax\":\"${[ namespace.my_func(a='aaa',b='bbb') ]}\"",
))
.stdout(contains("\"name\""));
}
View File
@@ -30,21 +30,11 @@ eventsource-client = { git = "https://github.com/yaakapp/rust-eventsource-client
http = { version = "1.2.0", default-features = false } http = { version = "1.2.0", default-features = false }
log = { workspace = true } log = { workspace = true }
md5 = "0.8.0" md5 = "0.8.0"
pretty_graphql = "0.2"
r2d2 = "0.8.10" r2d2 = "0.8.10"
r2d2_sqlite = "0.25.0" r2d2_sqlite = "0.25.0"
mime_guess = "2.0.5" mime_guess = "2.0.5"
rand = "0.9.0" rand = "0.9.0"
reqwest = { workspace = true, features = [ reqwest = { workspace = true, features = ["multipart", "gzip", "brotli", "deflate", "json", "rustls-tls-manual-roots-no-provider", "socks", "http2"] }
"multipart",
"gzip",
"brotli",
"deflate",
"json",
"rustls-tls-manual-roots-no-provider",
"socks",
"http2",
] }
serde = { workspace = true, features = ["derive"] } serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true, features = ["raw_value"] } serde_json = { workspace = true, features = ["raw_value"] }
tauri = { workspace = true, features = ["devtools", "protocol-asset"] } tauri = { workspace = true, features = ["devtools", "protocol-asset"] }
View File
@@ -1,7 +1,9 @@
{ {
"identifier": "default", "identifier": "default",
"description": "Default capabilities for all build variants", "description": "Default capabilities for all build variants",
"windows": ["*"], "windows": [
"*"
],
"permissions": [ "permissions": [
"core:app:allow-identifier", "core:app:allow-identifier",
"core:event:allow-emit", "core:event:allow-emit",
View File
@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/tauri",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "bindings/index.ts"
 }

View File

@@ -154,7 +154,6 @@ async fn send_http_request_inner<R: Runtime>(
         cookie_jar_id,
         response_dir: &response_dir,
         emit_events_to: None,
-        emit_response_body_chunks_to: None,
         existing_response: Some(response_ctx.response().clone()),
         plugin_manager,
         encryption_manager,

View File

@@ -31,16 +31,14 @@ use tauri_plugin_window_state::{AppHandleExt, StateFlags};
 use tokio::sync::Mutex;
 use tokio::task::block_in_place;
 use tokio::time;
-use yaak_common::command::new_checked_command;
 use yaak_crypto::manager::EncryptionManager;
 use yaak_grpc::manager::{GrpcConfig, GrpcHandle};
-use yaak_templates::strip_json_comments::strip_json_comments;
 use yaak_grpc::{Code, ServiceDefinition, serialize_message};
 use yaak_mac_window::AppHandleMacWindowExt;
 use yaak_models::models::{
     AnyModel, CookieJar, Environment, GrpcConnection, GrpcConnectionState, GrpcEvent,
-    GrpcEventType, HttpRequest, HttpResponse, HttpResponseEvent, HttpResponseState, Workspace,
-    WorkspaceMeta,
+    GrpcEventType, HttpRequest, HttpResponse, HttpResponseEvent, HttpResponseState, Plugin,
+    Workspace, WorkspaceMeta,
 };
 use yaak_models::util::{BatchUpsertResult, UpdateSource, get_workspace_export_resources};
 use yaak_plugins::events::{
@@ -99,7 +97,6 @@ impl<R: Runtime> PluginContextExt<R> for WebviewWindow<R> {
 struct AppMetaData {
     is_dev: bool,
     version: String,
-    cli_version: Option<String>,
     name: String,
     app_data_dir: String,
     app_log_dir: String,
@@ -116,11 +113,9 @@ async fn cmd_metadata(app_handle: AppHandle) -> YaakResult<AppMetaData> {
     let vendored_plugin_dir =
         app_handle.path().resolve("vendored/plugins", BaseDirectory::Resource)?;
     let default_project_dir = app_handle.path().home_dir()?.join("YaakProjects");
-    let cli_version = detect_cli_version().await;
     Ok(AppMetaData {
         is_dev: is_dev(),
         version: app_handle.package_info().version.to_string(),
-        cli_version,
         name: app_handle.package_info().name.to_string(),
         app_data_dir: app_data_dir.to_string_lossy().to_string(),
         app_log_dir: app_log_dir.to_string_lossy().to_string(),
@@ -131,24 +126,6 @@ async fn cmd_metadata(app_handle: AppHandle) -> YaakResult<AppMetaData> {
     })
 }
 
-async fn detect_cli_version() -> Option<String> {
-    detect_cli_version_for_binary("yaak").await
-}
-
-async fn detect_cli_version_for_binary(program: &str) -> Option<String> {
-    let mut cmd = new_checked_command(program, "--version").await.ok()?;
-    let out = cmd.arg("--version").output().await.ok()?;
-    if !out.status.success() {
-        return None;
-    }
-    let line = String::from_utf8(out.stdout).ok()?;
-    let line = line.lines().find(|l| !l.trim().is_empty())?.trim();
-    let mut parts = line.split_whitespace();
-    let _name = parts.next();
-    Some(parts.next().unwrap_or(line).to_string())
-}
-
 #[tauri::command]
 async fn cmd_template_tokens_to_string<R: Runtime>(
     window: WebviewWindow<R>,
@@ -434,7 +411,6 @@ async fn cmd_grpc_go<R: Runtime>(
                         result.expect("Failed to render template")
                     })
                 });
-                let msg = strip_json_comments(&msg);
                 in_msg_tx.try_send(msg.clone()).unwrap();
             }
             Ok(IncomingMsg::Commit) => {
@@ -470,7 +446,6 @@ async fn cmd_grpc_go<R: Runtime>(
                 &RenderOptions { error_behavior: RenderErrorBehavior::Throw },
             )
             .await?;
-            let msg = strip_json_comments(&msg);
 
             app_handle.db().upsert_grpc_event(
                 &GrpcEvent {
@@ -872,14 +847,6 @@ async fn cmd_format_json(text: &str) -> YaakResult<String> {
     Ok(format_json(text, " "))
 }
 
-#[tauri::command]
-async fn cmd_format_graphql(text: &str) -> YaakResult<String> {
-    match pretty_graphql::format_text(text, &Default::default()) {
-        Ok(formatted) => Ok(formatted),
-        Err(_) => Ok(text.to_string()),
-    }
-}
-
 #[tauri::command]
 async fn cmd_http_response_body<R: Runtime>(
     window: WebviewWindow<R>,
@@ -1378,17 +1345,41 @@ async fn cmd_send_http_request<R: Runtime>(
     Ok(r)
 }
 
+#[tauri::command]
+async fn cmd_install_plugin<R: Runtime>(
+    directory: &str,
+    url: Option<String>,
+    plugin_manager: State<'_, PluginManager>,
+    app_handle: AppHandle<R>,
+    window: WebviewWindow<R>,
+) -> YaakResult<Plugin> {
+    let plugin = app_handle.db().upsert_plugin(
+        &Plugin { directory: directory.into(), url, enabled: true, ..Default::default() },
+        &UpdateSource::from_window_label(window.label()),
+    )?;
+    plugin_manager
+        .add_plugin(
+            &PluginContext::new(Some(window.label().to_string()), window.workspace_id()),
+            &plugin,
+        )
+        .await?;
+    Ok(plugin)
+}
+
 #[tauri::command]
 async fn cmd_reload_plugins<R: Runtime>(
     app_handle: AppHandle<R>,
     window: WebviewWindow<R>,
     plugin_manager: State<'_, PluginManager>,
-) -> YaakResult<Vec<(String, String)>> {
+) -> YaakResult<()> {
     let plugins = app_handle.db().list_plugins()?;
     let plugin_context =
         PluginContext::new(Some(window.label().to_string()), window.workspace_id());
-    let errors = plugin_manager.initialize_all_plugins(plugins, &plugin_context).await;
-    Ok(errors)
+    let _errors = plugin_manager.initialize_all_plugins(plugins, &plugin_context).await;
+    // Note: errors are returned but we don't show toasts here since this is a manual reload
+    Ok(())
 }
#[tauri::command] #[tauri::command]
@@ -1648,7 +1639,6 @@ pub fn run() {
             cmd_http_request_body,
             cmd_http_response_body,
             cmd_format_json,
-            cmd_format_graphql,
             cmd_get_http_authentication_summaries,
             cmd_get_http_authentication_config,
             cmd_get_sse_events,
@@ -1662,6 +1652,7 @@ pub fn run() {
             cmd_workspace_actions,
             cmd_folder_actions,
             cmd_import_data,
+            cmd_install_plugin,
             cmd_metadata,
             cmd_new_child_window,
             cmd_new_main_window,
@@ -1730,8 +1721,6 @@ pub fn run() {
             git_ext::cmd_git_rm_remote,
             //
             // Plugin commands
-            plugins_ext::cmd_plugin_init_errors,
-            plugins_ext::cmd_plugins_install_from_directory,
             plugins_ext::cmd_plugins_search,
             plugins_ext::cmd_plugins_install,
             plugins_ext::cmd_plugins_uninstall,

View File

@@ -15,7 +15,6 @@ use yaak_models::error::Result;
 use yaak_models::models::{AnyModel, GraphQlIntrospection, GrpcEvent, Settings, WebsocketEvent};
 use yaak_models::query_manager::QueryManager;
 use yaak_models::util::UpdateSource;
-use yaak_plugins::manager::PluginManager;
 
 const MODEL_CHANGES_RETENTION_HOURS: i64 = 1;
 const MODEL_CHANGES_POLL_INTERVAL_MS: u64 = 1000;
@@ -256,32 +255,23 @@ pub(crate) fn models_upsert_graphql_introspection<R: Runtime>(
 }
 
 #[tauri::command]
-pub(crate) async fn models_workspace_models<R: Runtime>(
+pub(crate) fn models_workspace_models<R: Runtime>(
     window: WebviewWindow<R>,
     workspace_id: Option<&str>,
-    plugin_manager: State<'_, PluginManager>,
 ) -> Result<String> {
+    let db = window.db();
     let mut l: Vec<AnyModel> = Vec::new();
 
-    // Add the global models
-    {
-        let db = window.db();
-        l.push(db.get_settings().into());
-        l.append(&mut db.list_workspaces()?.into_iter().map(Into::into).collect());
-        l.append(&mut db.list_key_values()?.into_iter().map(Into::into).collect());
-    }
+    // Add the settings
+    l.push(db.get_settings().into());
 
-    let plugins = {
-        let db = window.db();
-        db.list_plugins()?
-    };
-    let plugins = plugin_manager.resolve_plugins_for_runtime_from_db(plugins).await;
-    l.append(&mut plugins.into_iter().map(Into::into).collect());
+    // Add global models
+    l.append(&mut db.list_workspaces()?.into_iter().map(Into::into).collect());
+    l.append(&mut db.list_key_values()?.into_iter().map(Into::into).collect());
+    l.append(&mut db.list_plugins()?.into_iter().map(Into::into).collect());
 
     // Add the workspace children
     if let Some(wid) = workspace_id {
-        let db = window.db();
         l.append(&mut db.list_cookie_jars(wid)?.into_iter().map(Into::into).collect());
         l.append(&mut db.list_environments_ensure_base(wid)?.into_iter().map(Into::into).collect());
         l.append(&mut db.list_folders(wid)?.into_iter().map(Into::into).collect());

View File

@@ -8,7 +8,7 @@ use serde::{Deserialize, Serialize};
 use std::time::Instant;
 use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow};
 use ts_rs::TS;
-use yaak_api::{ApiClientKind, yaak_api_client};
+use yaak_api::yaak_api_client;
 use yaak_common::platform::get_os_str;
 use yaak_models::util::UpdateSource;
@@ -102,7 +102,7 @@ impl YaakNotifier {
         let launch_info = get_or_upsert_launch_info(app_handle);
         let app_version = app_handle.package_info().version.to_string();
-        let req = yaak_api_client(ApiClientKind::App, &app_version)?
+        let req = yaak_api_client(&app_version)?
             .request(Method::GET, "https://notify.yaak.app/notifications")
             .query(&[
                 ("version", &launch_info.current_version),

View File

@@ -19,13 +19,13 @@ use yaak::plugin_events::{
     GroupedPluginEvent, HostRequest, SharedPluginEventContext, handle_shared_plugin_event,
 };
 use yaak_crypto::manager::EncryptionManager;
-use yaak_models::models::{HttpResponse, Plugin};
+use yaak_models::models::{AnyModel, HttpResponse, Plugin};
 use yaak_models::queries::any_request::AnyRequest;
 use yaak_models::util::UpdateSource;
 use yaak_plugins::error::Error::PluginErr;
 use yaak_plugins::events::{
-    Color, EmptyPayload, ErrorResponse, GetCookieValueResponse, Icon, InternalEvent,
-    InternalEventPayload, ListCookieNamesResponse, ListOpenWorkspacesResponse,
+    Color, EmptyPayload, ErrorResponse, FindHttpResponsesResponse, GetCookieValueResponse, Icon,
+    InternalEvent, InternalEventPayload, ListCookieNamesResponse, ListOpenWorkspacesResponse,
     RenderGrpcRequestResponse, RenderHttpRequestResponse, SendHttpRequestResponse,
     ShowToastRequest, TemplateRenderResponse, WindowInfoResponse, WindowNavigateEvent,
     WorkspaceInfo,
@@ -118,7 +118,7 @@ async fn handle_host_plugin_request<R: Runtime>(
                 &InternalEventPayload::ShowToastRequest(ShowToastRequest {
                     message: format!("Reloaded plugin {}@{}", info.name, info.version),
                     icon: Some(Icon::Info),
-                    timeout: Some(5000),
+                    timeout: Some(3000),
                     ..Default::default()
                 }),
                 None,
@@ -190,6 +190,71 @@ async fn handle_host_plugin_request<R: Runtime>(
                 Ok(None)
             }
         }
+        HostRequest::FindHttpResponses(req) => {
+            let http_responses = app_handle
+                .db()
+                .list_http_responses_for_request(&req.request_id, req.limit.map(|l| l as u64))
+                .unwrap_or_default();
+            Ok(Some(InternalEventPayload::FindHttpResponsesResponse(FindHttpResponsesResponse {
+                http_responses,
+            })))
+        }
+        HostRequest::UpsertModel(req) => {
+            use AnyModel::*;
+            let model = match &req.model {
+                HttpRequest(m) => {
+                    HttpRequest(app_handle.db().upsert_http_request(m, &UpdateSource::Plugin)?)
+                }
+                GrpcRequest(m) => {
+                    GrpcRequest(app_handle.db().upsert_grpc_request(m, &UpdateSource::Plugin)?)
+                }
+                WebsocketRequest(m) => WebsocketRequest(
+                    app_handle.db().upsert_websocket_request(m, &UpdateSource::Plugin)?,
+                ),
+                Folder(m) => Folder(app_handle.db().upsert_folder(m, &UpdateSource::Plugin)?),
+                Environment(m) => {
+                    Environment(app_handle.db().upsert_environment(m, &UpdateSource::Plugin)?)
+                }
+                Workspace(m) => {
+                    Workspace(app_handle.db().upsert_workspace(m, &UpdateSource::Plugin)?)
+                }
+                _ => {
+                    return Err(PluginErr("Upsert not supported for this model type".into()).into());
+                }
+            };
+            Ok(Some(InternalEventPayload::UpsertModelResponse(
+                yaak_plugins::events::UpsertModelResponse { model },
+            )))
+        }
+        HostRequest::DeleteModel(req) => {
+            let model = match req.model.as_str() {
+                "http_request" => AnyModel::HttpRequest(
+                    app_handle.db().delete_http_request_by_id(&req.id, &UpdateSource::Plugin)?,
+                ),
+                "grpc_request" => AnyModel::GrpcRequest(
+                    app_handle.db().delete_grpc_request_by_id(&req.id, &UpdateSource::Plugin)?,
+                ),
+                "websocket_request" => AnyModel::WebsocketRequest(
+                    app_handle
+                        .db()
+                        .delete_websocket_request_by_id(&req.id, &UpdateSource::Plugin)?,
+                ),
+                "folder" => AnyModel::Folder(
+                    app_handle.db().delete_folder_by_id(&req.id, &UpdateSource::Plugin)?,
+                ),
+                "environment" => AnyModel::Environment(
+                    app_handle.db().delete_environment_by_id(&req.id, &UpdateSource::Plugin)?,
+                ),
+                _ => {
+                    return Err(PluginErr("Delete not supported for this model type".into()).into());
+                }
+            };
+            Ok(Some(InternalEventPayload::DeleteModelResponse(
+                yaak_plugins::events::DeleteModelResponse { model },
+            )))
+        }
         HostRequest::RenderGrpcRequest(req) => {
             let window = get_window_from_plugin_context(app_handle, plugin_context)?;
@@ -297,7 +362,7 @@ async fn handle_host_plugin_request<R: Runtime>(
                     workspace_id: http_request.workspace_id.clone(),
                     ..Default::default()
                 },
-                &UpdateSource::from_window_label(window.label()),
+                &UpdateSource::Plugin,
                 &blobs,
             )?
         };

View File

@@ -21,14 +21,14 @@ use tauri::{
 };
 use tokio::sync::Mutex;
 use ts_rs::TS;
-use yaak_api::{ApiClientKind, yaak_api_client};
-use yaak_models::models::{Plugin, PluginSource};
+use yaak_api::yaak_api_client;
+use yaak_models::models::Plugin;
 use yaak_models::util::UpdateSource;
 use yaak_plugins::api::{
     PluginNameVersion, PluginSearchResponse, PluginUpdatesResponse, check_plugin_updates,
     search_plugins,
 };
-use yaak_plugins::events::PluginContext;
+use yaak_plugins::events::{Color, Icon, PluginContext, ShowToastRequest};
 use yaak_plugins::install::{delete_and_uninstall, download_and_install};
 use yaak_plugins::manager::PluginManager;
 use yaak_plugins::plugin_meta::get_plugin_meta;
@@ -73,7 +73,7 @@ impl PluginUpdater {
         info!("Checking for plugin updates");
         let app_version = window.app_handle().package_info().version.to_string();
-        let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+        let http_client = yaak_api_client(&app_version)?;
         let plugins = window.app_handle().db().list_plugins()?;
         let updates = check_plugin_updates(&http_client, plugins.clone()).await?;
@@ -138,7 +138,7 @@ pub async fn cmd_plugins_search<R: Runtime>(
     query: &str,
 ) -> Result<PluginSearchResponse> {
     let app_version = app_handle.package_info().version.to_string();
-    let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let http_client = yaak_api_client(&app_version)?;
     Ok(search_plugins(&http_client, query).await?)
 }
@@ -150,7 +150,7 @@ pub async fn cmd_plugins_install<R: Runtime>(
 ) -> Result<()> {
     let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
     let app_version = window.app_handle().package_info().version.to_string();
-    let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let http_client = yaak_api_client(&app_version)?;
     let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
     let plugin_context = window.plugin_context();
     download_and_install(
@@ -165,28 +165,6 @@ pub async fn cmd_plugins_install<R: Runtime>(
     Ok(())
 }
 
-#[command]
-pub async fn cmd_plugins_install_from_directory<R: Runtime>(
-    window: WebviewWindow<R>,
-    directory: &str,
-) -> Result<Plugin> {
-    let plugin = window.db().upsert_plugin(
-        &Plugin {
-            directory: directory.into(),
-            url: None,
-            enabled: true,
-            source: PluginSource::Filesystem,
-            ..Default::default()
-        },
-        &UpdateSource::from_window_label(window.label()),
-    )?;
-    let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
-    plugin_manager.add_plugin(&window.plugin_context(), &plugin).await?;
-    Ok(plugin)
-}
-
 #[command]
 pub async fn cmd_plugins_uninstall<R: Runtime>(
     plugin_id: &str,
@@ -198,19 +176,12 @@ pub async fn cmd_plugins_uninstall<R: Runtime>(
     Ok(delete_and_uninstall(plugin_manager, &query_manager, &plugin_context, plugin_id).await?)
 }
 
-#[command]
-pub async fn cmd_plugin_init_errors(
-    plugin_manager: State<'_, PluginManager>,
-) -> Result<Vec<(String, String)>> {
-    Ok(plugin_manager.take_init_errors().await)
-}
-
 #[command]
 pub async fn cmd_plugins_updates<R: Runtime>(
     app_handle: AppHandle<R>,
 ) -> Result<PluginUpdatesResponse> {
     let app_version = app_handle.package_info().version.to_string();
-    let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let http_client = yaak_api_client(&app_version)?;
     let plugins = app_handle.db().list_plugins()?;
     Ok(check_plugin_updates(&http_client, plugins).await?)
 }
@@ -220,7 +191,7 @@ pub async fn cmd_plugins_update_all<R: Runtime>(
     window: WebviewWindow<R>,
 ) -> Result<Vec<PluginNameVersion>> {
     let app_version = window.app_handle().package_info().version.to_string();
-    let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let http_client = yaak_api_client(&app_version)?;
     let plugins = window.db().list_plugins()?;
 
     // Get list of available updates (already filtered to only registry plugins)
@@ -297,8 +268,6 @@ pub fn init<R: Runtime>() -> TauriPlugin<R> {
                 .join("index.cjs");
 
             let dev_mode = is_dev();
-            let query_manager =
-                app_handle.state::<yaak_models::query_manager::QueryManager>().inner().clone();
 
             // Create plugin manager asynchronously
             let app_handle_clone = app_handle.clone();
@@ -308,12 +277,53 @@ pub fn init<R: Runtime>() -> TauriPlugin<R> {
                     installed_plugin_dir,
                     node_bin_path,
                     plugin_runtime_main,
-                    &query_manager,
-                    &PluginContext::new_empty(),
                     dev_mode,
                 )
-                .await
-                .expect("Failed to start plugin runtime");
+                .await;
+
+                // Initialize all plugins after manager is created
+                let bundled_dirs = manager
+                    .list_bundled_plugin_dirs()
+                    .await
+                    .expect("Failed to list bundled plugins");
+
+                // Ensure all bundled plugins make it into the database
+                let db = app_handle_clone.db();
+                for dir in &bundled_dirs {
+                    if db.get_plugin_by_directory(dir).is_none() {
+                        db.upsert_plugin(
+                            &Plugin {
+                                directory: dir.clone(),
+                                enabled: true,
+                                url: None,
+                                ..Default::default()
+                            },
+                            &UpdateSource::Background,
+                        )
+                        .expect("Failed to upsert bundled plugin");
+                    }
+                }
+
+                // Get all plugins from database and initialize
+                let plugins = db.list_plugins().expect("Failed to list plugins from database");
+                drop(db); // Explicitly drop the connection before await
+
+                let errors =
+                    manager.initialize_all_plugins(plugins, &PluginContext::new_empty()).await;
+
+                // Show toast for any failed plugins
+                for (plugin_dir, error_msg) in errors {
+                    let plugin_name = plugin_dir.split('/').last().unwrap_or(&plugin_dir);
+                    let toast = ShowToastRequest {
+                        message: format!("Failed to start plugin '{}': {}", plugin_name, error_msg),
+                        color: Some(Color::Danger),
+                        icon: Some(Icon::AlertTriangle),
+                        timeout: Some(10000),
+                    };
+                    if let Err(emit_err) = app_handle_clone.emit("show_toast", toast) {
+                        error!("Failed to emit toast for plugin error: {emit_err:?}");
+                    }
+                }
+
                 app_handle_clone.manage(manager);
             });

View File

@@ -1,6 +1,8 @@
+use log::info;
 use serde_json::Value;
-pub use yaak::render::{render_grpc_request, render_http_request};
-use yaak_models::models::Environment;
+use std::collections::BTreeMap;
+pub use yaak::render::render_http_request;
+use yaak_models::models::{Environment, GrpcRequest, HttpRequestHeader};
 use yaak_models::render::make_vars_hashmap;
 use yaak_templates::{RenderOptions, TemplateCallback, parse_and_render, render_json_value_raw};
@@ -23,3 +25,61 @@ pub async fn render_json_value<T: TemplateCallback>(
     let vars = &make_vars_hashmap(environment_chain);
     render_json_value_raw(value, vars, cb, opt).await
 }
+
+pub async fn render_grpc_request<T: TemplateCallback>(
+    r: &GrpcRequest,
+    environment_chain: Vec<Environment>,
+    cb: &T,
+    opt: &RenderOptions,
+) -> yaak_templates::error::Result<GrpcRequest> {
+    let vars = &make_vars_hashmap(environment_chain);
+
+    let mut metadata = Vec::new();
+    for p in r.metadata.clone() {
+        if !p.enabled {
+            continue;
+        }
+        metadata.push(HttpRequestHeader {
+            enabled: p.enabled,
+            name: parse_and_render(p.name.as_str(), vars, cb, &opt).await?,
+            value: parse_and_render(p.value.as_str(), vars, cb, &opt).await?,
+            id: p.id,
+        })
+    }
+
+    let authentication = {
+        let mut disabled = false;
+        let mut auth = BTreeMap::new();
+
+        match r.authentication.get("disabled") {
+            Some(Value::Bool(true)) => {
+                disabled = true;
+            }
+            Some(Value::String(tmpl)) => {
+                disabled = parse_and_render(tmpl.as_str(), vars, cb, &opt)
+                    .await
+                    .unwrap_or_default()
+                    .is_empty();
+                info!(
+                    "Rendering authentication.disabled as a template: {disabled} from \"{tmpl}\""
+                );
+            }
+            _ => {}
+        }
+
+        if disabled {
+            auth.insert("disabled".to_string(), Value::Bool(true));
+        } else {
+            for (k, v) in r.authentication.clone() {
+                if k == "disabled" {
+                    auth.insert(k, Value::Bool(false));
+                } else {
+                    auth.insert(k, render_json_value_raw(v, vars, cb, &opt).await?);
+                }
+            }
+        }
+
+        auth
+    };
+
+    let url = parse_and_render(r.url.as_str(), vars, cb, &opt).await?;
+    Ok(GrpcRequest { url, metadata, authentication, ..r.to_owned() })
+}

View File

@@ -8,7 +8,7 @@ use std::fs;
 use std::sync::Arc;
 use tauri::{AppHandle, Emitter, Manager, Runtime, Url};
 use tauri_plugin_dialog::{DialogExt, MessageDialogButtons, MessageDialogKind};
-use yaak_api::{ApiClientKind, yaak_api_client};
+use yaak_api::yaak_api_client;
 use yaak_models::util::generate_id;
 use yaak_plugins::events::{Color, ShowToastRequest};
 use yaak_plugins::install::download_and_install;
@@ -47,7 +47,7 @@ pub(crate) async fn handle_deep_link<R: Runtime>(
             let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
             let query_manager = app_handle.db_manager();
             let app_version = app_handle.package_info().version.to_string();
-            let http_client = yaak_api_client(ApiClientKind::App, &app_version)?;
+            let http_client = yaak_api_client(&app_version)?;
             let plugin_context = window.plugin_context();
             let pv = download_and_install(
                 plugin_manager,
@@ -88,8 +88,7 @@ pub(crate) async fn handle_deep_link<R: Runtime>(
             }
 
             let app_version = app_handle.package_info().version.to_string();
-            let resp =
-                yaak_api_client(ApiClientKind::App, &app_version)?.get(file_url).send().await?;
+            let resp = yaak_api_client(&app_version)?.get(file_url).send().await?;
             let json = resp.bytes().await?;
             let p = app_handle
                 .path()

View File

@@ -24,7 +24,6 @@ use yaak_models::util::UpdateSource;
 use yaak_plugins::events::{CallHttpAuthenticationRequest, HttpHeader, RenderPurpose};
 use yaak_plugins::manager::PluginManager;
 use yaak_plugins::template_callback::PluginTemplateCallback;
-use yaak_templates::strip_json_comments::maybe_strip_json_comments;
 use yaak_templates::{RenderErrorBehavior, RenderOptions};
 use yaak_tls::find_client_certificate;
 use yaak_ws::{WebsocketManager, render_websocket_request};
@@ -73,10 +72,8 @@ pub async fn cmd_ws_send<R: Runtime>(
     )
     .await?;
 
-    let message = maybe_strip_json_comments(&request.message);
-
     let mut ws_manager = ws_manager.lock().await;
-    ws_manager.send(&connection.id, Message::Text(message.clone().into())).await?;
+    ws_manager.send(&connection.id, Message::Text(request.message.clone().into())).await?;
 
     app_handle.db().upsert_websocket_event(
         &WebsocketEvent {
@@ -85,7 +82,7 @@ pub async fn cmd_ws_send<R: Runtime>(
             workspace_id: connection.workspace_id.clone(),
             is_server: false,
             message_type: WebsocketEventType::Text,
-            message: message.into(),
+            message: request.message.into(),
             ..Default::default()
         },
         &UpdateSource::from_window_label(window.label()),

View File

@@ -14,7 +14,10 @@
     "assetProtocol": {
       "enable": true,
       "scope": {
-        "allow": ["$APPDATA/responses/*", "$RESOURCE/static/*"]
+        "allow": [
+          "$APPDATA/responses/*",
+          "$RESOURCE/static/*"
+        ]
       }
     }
   }
@@ -22,7 +25,9 @@
   "plugins": {
     "deep-link": {
       "desktop": {
-        "schemes": ["yaak"]
+        "schemes": [
+          "yaak"
+        ]
       }
     }
   },

View File

@@ -16,7 +16,9 @@
   },
   "plugins": {
     "updater": {
-      "endpoints": ["https://update.yaak.app/check/{{target}}/{{arch}}/{{current_version}}"],
+      "endpoints": [
+        "https://update.yaak.app/check/{{target}}/{{arch}}/{{current_version}}"
+      ],
       "pubkey": "dW50cnVzdGVkIGNvbW1lbnQ6IG1pbmlzaWduIHB1YmxpYyBrZXk6IEVGRkFGMjQxRUNEOTQ3MzAKUldRd1I5bnNRZkw2NzRtMnRlWTN3R24xYUR3aGRsUjJzWGwvdHdEcGljb3ZJMUNlMjFsaHlqVU4K"
     }
   },

View File

@@ -1,14 +1,14 @@
-import { useQuery } from "@tanstack/react-query";
-import { invoke } from "@tauri-apps/api/core";
-import { Fonts } from "./bindings/gen_fonts";
+import { useQuery } from '@tanstack/react-query';
+import { invoke } from '@tauri-apps/api/core';
+import { Fonts } from './bindings/gen_fonts';
 
 export async function listFonts() {
-  return invoke<Fonts>("plugin:yaak-fonts|list", {});
+  return invoke<Fonts>('plugin:yaak-fonts|list', {});
 }
 
 export function useFonts() {
   return useQuery({
-    queryKey: ["list_fonts"],
+    queryKey: ['list_fonts'],
     queryFn: () => listFonts(),
   });
 }

View File

@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/fonts",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "index.ts"
 }

View File

@@ -1,35 +1,35 @@
-import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
-import { invoke } from "@tauri-apps/api/core";
-import { listen } from "@tauri-apps/api/event";
-import { appInfo } from "@yaakapp/app/lib/appInfo";
-import { useEffect } from "react";
-import { LicenseCheckStatus } from "./bindings/license";
+import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query';
+import { invoke } from '@tauri-apps/api/core';
+import { listen } from '@tauri-apps/api/event';
+import { appInfo } from '@yaakapp/app/lib/appInfo';
+import { useEffect } from 'react';
+import { LicenseCheckStatus } from './bindings/license';

-export * from "./bindings/license";
+export * from './bindings/license';

-const CHECK_QUERY_KEY = ["license.check"];
+const CHECK_QUERY_KEY = ['license.check'];

 export function useLicense() {
   const queryClient = useQueryClient();

   const activate = useMutation<void, string, { licenseKey: string }>({
-    mutationKey: ["license.activate"],
-    mutationFn: (payload) => invoke("plugin:yaak-license|activate", payload),
+    mutationKey: ['license.activate'],
+    mutationFn: (payload) => invoke('plugin:yaak-license|activate', payload),
     onSuccess: () => queryClient.invalidateQueries({ queryKey: CHECK_QUERY_KEY }),
   });

   const deactivate = useMutation<void, string, void>({
-    mutationKey: ["license.deactivate"],
-    mutationFn: () => invoke("plugin:yaak-license|deactivate"),
+    mutationKey: ['license.deactivate'],
+    mutationFn: () => invoke('plugin:yaak-license|deactivate'),
     onSuccess: () => queryClient.invalidateQueries({ queryKey: CHECK_QUERY_KEY }),
   });

   // Check the license again after a license is activated
   useEffect(() => {
-    const unlisten = listen("license-activated", async () => {
+    const unlisten = listen('license-activated', async () => {
       await queryClient.invalidateQueries({ queryKey: CHECK_QUERY_KEY });
     });
     return () => {
-      void unlisten.then((fn) => fn());
+      unlisten.then((fn) => fn());
     };
   }, []);
@@ -41,7 +41,7 @@ export function useLicense() {
       if (!appInfo.featureLicense) {
         return null;
       }
-      return invoke<LicenseCheckStatus>("plugin:yaak-license|check");
+      return invoke<LicenseCheckStatus>('plugin:yaak-license|check');
     },
   });


@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/license",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "index.ts"
 }


@@ -7,7 +7,7 @@ use std::ops::Add;
 use std::time::Duration;
 use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow, is_dev};
 use ts_rs::TS;
-use yaak_api::{ApiClientKind, yaak_api_client};
+use yaak_api::yaak_api_client;
 use yaak_common::platform::get_os_str;
 use yaak_models::db_context::DbContext;
 use yaak_models::query_manager::QueryManager;
@@ -119,7 +119,7 @@ pub async fn activate_license<R: Runtime>(
 ) -> Result<()> {
     info!("Activating license {}", license_key);
     let app_version = window.app_handle().package_info().version.to_string();
-    let client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let client = yaak_api_client(&app_version)?;
     let payload = ActivateLicenseRequestPayload {
         license_key: license_key.to_string(),
         app_platform: get_os_str().to_string(),
@@ -157,7 +157,7 @@ pub async fn deactivate_license<R: Runtime>(window: &WebviewWindow<R>) -> Result
     let activation_id = get_activation_id(app_handle).await;
     let app_version = window.app_handle().package_info().version.to_string();
-    let client = yaak_api_client(ApiClientKind::App, &app_version)?;
+    let client = yaak_api_client(&app_version)?;
     let path = format!("/licenses/activations/{}/deactivate", activation_id);
     let payload =
         DeactivateLicenseRequestPayload { app_platform: get_os_str().to_string(), app_version };
@@ -203,7 +203,7 @@ pub async fn check_license<R: Runtime>(window: &WebviewWindow<R>) -> Result<Lice
         (true, _) => {
             info!("Checking license activation");
             // A license has been activated, so let's check the license server
-            let client = yaak_api_client(ApiClientKind::App, &payload.app_version)?;
+            let client = yaak_api_client(&payload.app_version)?;
             let path = format!("/licenses/activations/{activation_id}/check-v2");
             let response = client.post(build_url(&path)).json(&payload).send().await?;


@@ -1,9 +1,9 @@
-import { invoke } from "@tauri-apps/api/core";
+import { invoke } from '@tauri-apps/api/core';

 export function setWindowTitle(title: string) {
-  invoke("plugin:yaak-mac-window|set_title", { title }).catch(console.error);
+  invoke('plugin:yaak-mac-window|set_title', { title }).catch(console.error);
 }

 export function setWindowTheme(bgColor: string) {
-  invoke("plugin:yaak-mac-window|set_theme", { bgColor }).catch(console.error);
+  invoke('plugin:yaak-mac-window|set_theme', { bgColor }).catch(console.error);
 }


@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/mac-window",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "index.ts"
 }


@@ -1,3 +1,6 @@
 [default]
 description = "Default permissions for the plugin"
-permissions = ["allow-set-title", "allow-set-theme"]
+permissions = [
+    "allow-set-title",
+    "allow-set-theme",
+]


@@ -12,11 +12,6 @@ unsafe impl Sync for UnsafeWindowHandle {}
 const WINDOW_CONTROL_PAD_X: f64 = 13.0;
 const WINDOW_CONTROL_PAD_Y: f64 = 18.0;

-/// Extra pixels to add to the title bar height when the default title bar is
-/// already as tall as button_height + PAD_Y (i.e. macOS Tahoe 26+, where the
-/// default is 32px and 14 + 18 = 32). On pre-Tahoe this is unused because the
-/// default title bar is shorter than button_height + PAD_Y.
-const TITLEBAR_EXTRA_HEIGHT: f64 = 4.0;
-
 const MAIN_WINDOW_PREFIX: &str = "main_";

 pub(crate) fn update_window_title<R: Runtime>(window: Window<R>, title: String) {
@@ -100,29 +95,12 @@ fn position_traffic_lights(ns_window_handle: UnsafeWindowHandle, x: f64, y: f64,
     ns_window.standardWindowButton_(NSWindowButton::NSWindowMiniaturizeButton);
     let zoom = ns_window.standardWindowButton_(NSWindowButton::NSWindowZoomButton);

+    let title_bar_container_view = close.superview().superview();
     let close_rect: NSRect = msg_send![close, frame];
     let button_height = close_rect.size.height;
-    let title_bar_container_view = close.superview().superview();
-
-    // Capture the OS default title bar height on the first call, before
-    // we've modified it. This avoids the height growing on repeated calls.
-    use std::sync::OnceLock;
-    static DEFAULT_TITLEBAR_HEIGHT: OnceLock<f64> = OnceLock::new();
-    let default_height =
-        *DEFAULT_TITLEBAR_HEIGHT.get_or_init(|| NSView::frame(title_bar_container_view).size.height);
-
-    // On pre-Tahoe, button_height + y is larger than the default title bar
-    // height, so the resize works as before. On Tahoe (26+), the default is
-    // already 32px and button_height + y = 32, so nothing changes. In that
-    // case, add TITLEBAR_EXTRA_HEIGHT extra pixels to push the buttons down.
-    let desired = button_height + y;
-    let title_bar_frame_height = if desired > default_height {
-        desired
-    } else {
-        default_height + TITLEBAR_EXTRA_HEIGHT
-    };
+    let title_bar_frame_height = button_height + y;

     let mut title_bar_rect = NSView::frame(title_bar_container_view);
     title_bar_rect.size.height = title_bar_frame_height;
     title_bar_rect.origin.y = NSView::frame(ns_window).size.height - title_bar_frame_height;
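The comments in the removed hunk describe a small height rule. As a quick illustration of just that arithmetic (names are illustrative, not from the codebase):

```typescript
// Sketch of the title-bar height rule from the hunk above, reduced to plain
// arithmetic. Pre-Tahoe: buttonHeight + padY exceeds the default height, so the
// bar is resized to that. Tahoe 26+: the default already equals it, so a few
// extra pixels are added to push the buttons down.
const TITLEBAR_EXTRA_HEIGHT = 4.0;

function titleBarHeight(buttonHeight: number, padY: number, defaultHeight: number): number {
  const desired = buttonHeight + padY;
  return desired > defaultHeight ? desired : defaultHeight + TITLEBAR_EXTRA_HEIGHT;
}
```

With the numbers from the comment: `titleBarHeight(14, 18, 28)` resizes to 32, while on Tahoe (`defaultHeight` already 32) it becomes 36.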


@@ -8,24 +8,14 @@ use reqwest::header::{HeaderMap, HeaderValue};
 use std::time::Duration;
 use yaak_common::platform::{get_ua_arch, get_ua_platform};

-#[derive(Debug, Clone, Copy, Eq, PartialEq)]
-pub enum ApiClientKind {
-    App,
-    Cli,
-}
-
 /// Build a reqwest Client configured for Yaak's own API calls.
 ///
 /// Includes a custom User-Agent, JSON accept header, 20s timeout, gzip,
 /// and automatic OS-level proxy detection via sysproxy.
-pub fn yaak_api_client(kind: ApiClientKind, version: &str) -> Result<Client> {
+pub fn yaak_api_client(version: &str) -> Result<Client> {
     let platform = get_ua_platform();
     let arch = get_ua_arch();
-    let product = match kind {
-        ApiClientKind::App => "Yaak",
-        ApiClientKind::Cli => "YaakCli",
-    };
-    let ua = format!("{product}/{version} ({platform}; {arch})");
+    let ua = format!("Yaak/{version} ({platform}; {arch})");

     let mut default_headers = HeaderMap::new();
     default_headers.insert("Accept", HeaderValue::from_str("application/json").unwrap());


@@ -1,6 +1,4 @@
-use std::ffi::{OsStr, OsString};
-use std::io::{self, ErrorKind};
-use std::process::Stdio;
+use std::ffi::OsStr;

 #[cfg(target_os = "windows")]
 const CREATE_NO_WINDOW: u32 = 0x0800_0000;
@@ -16,27 +14,3 @@ pub fn new_xplatform_command<S: AsRef<OsStr>>(program: S) -> tokio::process::Com
     }
     cmd
 }
-
-/// Creates a command only if the binary exists and can be invoked with the given probe argument.
-pub async fn new_checked_command<S: AsRef<OsStr>>(
-    program: S,
-    probe_arg: &str,
-) -> io::Result<tokio::process::Command> {
-    let program: OsString = program.as_ref().to_os_string();
-    let mut probe = new_xplatform_command(&program);
-    probe.arg(probe_arg).stdin(Stdio::null()).stdout(Stdio::null()).stderr(Stdio::null());
-    let status = probe.status().await?;
-    if !status.success() {
-        return Err(io::Error::new(
-            ErrorKind::NotFound,
-            format!(
-                "'{}' is not available on PATH or failed to execute",
-                program.to_string_lossy()
-            ),
-        ));
-    }
-    Ok(new_xplatform_command(&program))
-}


@@ -21,10 +21,3 @@ pub fn get_str_map<'a>(v: &'a BTreeMap<String, Value>, key: &str) -> &'a str {
         Some(v) => v.as_str().unwrap_or_default(),
     }
 }
-
-pub fn get_bool_map(v: &BTreeMap<String, Value>, key: &str, fallback: bool) -> bool {
-    match v.get(key) {
-        None => fallback,
-        Some(v) => v.as_bool().unwrap_or(fallback),
-    }
-}


@@ -1,17 +1,17 @@
-import { invoke } from "@tauri-apps/api/core";
+import { invoke } from '@tauri-apps/api/core';

 export function enableEncryption(workspaceId: string) {
-  return invoke<void>("cmd_enable_encryption", { workspaceId });
+  return invoke<void>('cmd_enable_encryption', { workspaceId });
 }

 export function revealWorkspaceKey(workspaceId: string) {
-  return invoke<string>("cmd_reveal_workspace_key", { workspaceId });
+  return invoke<string>('cmd_reveal_workspace_key', { workspaceId });
 }

 export function setWorkspaceKey(args: { workspaceId: string; key: string }) {
-  return invoke<void>("cmd_set_workspace_key", args);
+  return invoke<void>('cmd_set_workspace_key', args);
 }

 export function disableEncryption(workspaceId: string) {
-  return invoke<void>("cmd_disable_encryption", { workspaceId });
+  return invoke<void>('cmd_disable_encryption', { workspaceId });
 }


@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/crypto",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "index.ts"
 }


@@ -1,66 +1,60 @@
-import { useQuery } from "@tanstack/react-query";
-import { invoke } from "@tauri-apps/api/core";
-import { createFastMutation } from "@yaakapp/app/hooks/useFastMutation";
-import { queryClient } from "@yaakapp/app/lib/queryClient";
-import { useMemo } from "react";
-import {
-  BranchDeleteResult,
-  CloneResult,
-  GitCommit,
-  GitRemote,
-  GitStatusSummary,
-  PullResult,
-  PushResult,
-} from "./bindings/gen_git";
-import { showToast } from "@yaakapp/app/lib/toast";
+import { useQuery } from '@tanstack/react-query';
+import { invoke } from '@tauri-apps/api/core';
+import { createFastMutation } from '@yaakapp/app/hooks/useFastMutation';
+import { queryClient } from '@yaakapp/app/lib/queryClient';
+import { useMemo } from 'react';
+import { BranchDeleteResult, CloneResult, GitCommit, GitRemote, GitStatusSummary, PullResult, PushResult } from './bindings/gen_git';
+import { showToast } from '@yaakapp/app/lib/toast';

-export * from "./bindings/gen_git";
-export * from "./bindings/gen_models";
+export * from './bindings/gen_git';
+export * from './bindings/gen_models';

 export interface GitCredentials {
   username: string;
   password: string;
 }

-export type DivergedStrategy = "force_reset" | "merge" | "cancel";
-export type UncommittedChangesStrategy = "reset" | "cancel";
+export type DivergedStrategy = 'force_reset' | 'merge' | 'cancel';
+export type UncommittedChangesStrategy = 'reset' | 'cancel';

 export interface GitCallbacks {
   addRemote: () => Promise<GitRemote | null>;
   promptCredentials: (
-    result: Extract<PushResult, { type: "needs_credentials" }>,
+    result: Extract<PushResult, { type: 'needs_credentials' }>,
   ) => Promise<GitCredentials | null>;
-  promptDiverged: (result: Extract<PullResult, { type: "diverged" }>) => Promise<DivergedStrategy>;
+  promptDiverged: (
+    result: Extract<PullResult, { type: 'diverged' }>,
+  ) => Promise<DivergedStrategy>;
   promptUncommittedChanges: () => Promise<UncommittedChangesStrategy>;
   forceSync: () => Promise<void>;
 }

-const onSuccess = () => queryClient.invalidateQueries({ queryKey: ["git"] });
+const onSuccess = () => queryClient.invalidateQueries({ queryKey: ['git'] });

 export function useGit(dir: string, callbacks: GitCallbacks, refreshKey?: string) {
   const mutations = useMemo(() => gitMutations(dir, callbacks), [dir, callbacks]);
   const fetchAll = useQuery<void, string>({
-    queryKey: ["git", "fetch_all", dir, refreshKey],
-    queryFn: () => invoke("cmd_git_fetch_all", { dir }),
+    queryKey: ['git', 'fetch_all', dir, refreshKey],
+    queryFn: () => invoke('cmd_git_fetch_all', { dir }),
     refetchInterval: 10 * 60_000,
   });
   return [
     {
       remotes: useQuery<GitRemote[], string>({
-        queryKey: ["git", "remotes", dir, refreshKey],
+        queryKey: ['git', 'remotes', dir, refreshKey],
         queryFn: () => getRemotes(dir),
         placeholderData: (prev) => prev,
       }),
       log: useQuery<GitCommit[], string>({
-        queryKey: ["git", "log", dir, refreshKey],
-        queryFn: () => invoke("cmd_git_log", { dir }),
+        queryKey: ['git', 'log', dir, refreshKey],
+        queryFn: () => invoke('cmd_git_log', { dir }),
         placeholderData: (prev) => prev,
       }),
       status: useQuery<GitStatusSummary, string>({
         refetchOnMount: true,
-        queryKey: ["git", "status", dir, refreshKey, fetchAll.dataUpdatedAt],
-        queryFn: () => invoke("cmd_git_status", { dir }),
+        queryKey: ['git', 'status', dir, refreshKey, fetchAll.dataUpdatedAt],
+        queryFn: () => invoke('cmd_git_status', { dir }),
         placeholderData: (prev) => prev,
       }),
     },
@@ -73,167 +67,151 @@ export const gitMutations = (dir: string, callbacks: GitCallbacks) => {
     const remotes = await getRemotes(dir);
     if (remotes.length === 0) {
       const remote = await callbacks.addRemote();
-      if (remote == null) throw new Error("No remote found");
+      if (remote == null) throw new Error('No remote found');
     }

-    const result = await invoke<PushResult>("cmd_git_push", { dir });
-    if (result.type !== "needs_credentials") return result;
+    const result = await invoke<PushResult>('cmd_git_push', { dir });
+    if (result.type !== 'needs_credentials') return result;

     // Needs credentials, prompt for them
     const creds = await callbacks.promptCredentials(result);
-    if (creds == null) throw new Error("Canceled");
-    await invoke("cmd_git_add_credential", {
+    if (creds == null) throw new Error('Canceled');
+    await invoke('cmd_git_add_credential', {
       remoteUrl: result.url,
       username: creds.username,
       password: creds.password,
     });

     // Push again
-    return invoke<PushResult>("cmd_git_push", { dir });
+    return invoke<PushResult>('cmd_git_push', { dir });
   };

   const handleError = (err: unknown) => {
     showToast({
-      id: err instanceof Error ? err.message : String(err),
-      message: err instanceof Error ? err.message : String(err),
-      color: "danger",
+      id: `${err}`,
+      message: `${err}`,
+      color: 'danger',
       timeout: 5000,
     });
-  };
+  }

   return {
     init: createFastMutation<void, string, void>({
-      mutationKey: ["git", "init"],
-      mutationFn: () => invoke("cmd_git_initialize", { dir }),
+      mutationKey: ['git', 'init'],
+      mutationFn: () => invoke('cmd_git_initialize', { dir }),
       onSuccess,
     }),
     add: createFastMutation<void, string, { relaPaths: string[] }>({
-      mutationKey: ["git", "add", dir],
-      mutationFn: (args) => invoke("cmd_git_add", { dir, ...args }),
+      mutationKey: ['git', 'add', dir],
+      mutationFn: (args) => invoke('cmd_git_add', { dir, ...args }),
       onSuccess,
     }),
     addRemote: createFastMutation<GitRemote, string, GitRemote>({
-      mutationKey: ["git", "add-remote"],
-      mutationFn: (args) => invoke("cmd_git_add_remote", { dir, ...args }),
+      mutationKey: ['git', 'add-remote'],
+      mutationFn: (args) => invoke('cmd_git_add_remote', { dir, ...args }),
       onSuccess,
     }),
     rmRemote: createFastMutation<void, string, { name: string }>({
-      mutationKey: ["git", "rm-remote", dir],
-      mutationFn: (args) => invoke("cmd_git_rm_remote", { dir, ...args }),
+      mutationKey: ['git', 'rm-remote', dir],
+      mutationFn: (args) => invoke('cmd_git_rm_remote', { dir, ...args }),
       onSuccess,
     }),
     createBranch: createFastMutation<void, string, { branch: string; base?: string }>({
-      mutationKey: ["git", "branch", dir],
-      mutationFn: (args) => invoke("cmd_git_branch", { dir, ...args }),
+      mutationKey: ['git', 'branch', dir],
+      mutationFn: (args) => invoke('cmd_git_branch', { dir, ...args }),
       onSuccess,
     }),
     mergeBranch: createFastMutation<void, string, { branch: string }>({
-      mutationKey: ["git", "merge", dir],
-      mutationFn: (args) => invoke("cmd_git_merge_branch", { dir, ...args }),
+      mutationKey: ['git', 'merge', dir],
+      mutationFn: (args) => invoke('cmd_git_merge_branch', { dir, ...args }),
       onSuccess,
     }),
-    deleteBranch: createFastMutation<
-      BranchDeleteResult,
-      string,
-      { branch: string; force?: boolean }
-    >({
-      mutationKey: ["git", "delete-branch", dir],
-      mutationFn: (args) => invoke("cmd_git_delete_branch", { dir, ...args }),
+    deleteBranch: createFastMutation<BranchDeleteResult, string, { branch: string, force?: boolean }>({
+      mutationKey: ['git', 'delete-branch', dir],
+      mutationFn: (args) => invoke('cmd_git_delete_branch', { dir, ...args }),
       onSuccess,
     }),
     deleteRemoteBranch: createFastMutation<void, string, { branch: string }>({
-      mutationKey: ["git", "delete-remote-branch", dir],
-      mutationFn: (args) => invoke("cmd_git_delete_remote_branch", { dir, ...args }),
+      mutationKey: ['git', 'delete-remote-branch', dir],
+      mutationFn: (args) => invoke('cmd_git_delete_remote_branch', { dir, ...args }),
      onSuccess,
     }),
-    renameBranch: createFastMutation<void, string, { oldName: string; newName: string }>({
-      mutationKey: ["git", "rename-branch", dir],
-      mutationFn: (args) => invoke("cmd_git_rename_branch", { dir, ...args }),
+    renameBranch: createFastMutation<void, string, { oldName: string, newName: string }>({
+      mutationKey: ['git', 'rename-branch', dir],
+      mutationFn: (args) => invoke('cmd_git_rename_branch', { dir, ...args }),
       onSuccess,
     }),
     checkout: createFastMutation<string, string, { branch: string; force: boolean }>({
-      mutationKey: ["git", "checkout", dir],
-      mutationFn: (args) => invoke("cmd_git_checkout", { dir, ...args }),
+      mutationKey: ['git', 'checkout', dir],
+      mutationFn: (args) => invoke('cmd_git_checkout', { dir, ...args }),
       onSuccess,
     }),
     commit: createFastMutation<void, string, { message: string }>({
-      mutationKey: ["git", "commit", dir],
-      mutationFn: (args) => invoke("cmd_git_commit", { dir, ...args }),
+      mutationKey: ['git', 'commit', dir],
+      mutationFn: (args) => invoke('cmd_git_commit', { dir, ...args }),
       onSuccess,
     }),
     commitAndPush: createFastMutation<PushResult, string, { message: string }>({
-      mutationKey: ["git", "commit_push", dir],
+      mutationKey: ['git', 'commit_push', dir],
       mutationFn: async (args) => {
-        await invoke("cmd_git_commit", { dir, ...args });
+        await invoke('cmd_git_commit', { dir, ...args });
         return push();
       },
       onSuccess,
     }),
     push: createFastMutation<PushResult, string, void>({
-      mutationKey: ["git", "push", dir],
+      mutationKey: ['git', 'push', dir],
       mutationFn: push,
       onSuccess,
     }),
     pull: createFastMutation<PullResult, string, void>({
-      mutationKey: ["git", "pull", dir],
+      mutationKey: ['git', 'pull', dir],
       async mutationFn() {
-        const result = await invoke<PullResult>("cmd_git_pull", { dir });
-        if (result.type === "needs_credentials") {
+        const result = await invoke<PullResult>('cmd_git_pull', { dir });
+        if (result.type === 'needs_credentials') {
           const creds = await callbacks.promptCredentials(result);
-          if (creds == null) throw new Error("Canceled");
-          await invoke("cmd_git_add_credential", {
+          if (creds == null) throw new Error('Canceled');
+          await invoke('cmd_git_add_credential', {
             remoteUrl: result.url,
             username: creds.username,
             password: creds.password,
           });
           // Pull again after credentials
-          return invoke<PullResult>("cmd_git_pull", { dir });
+          return invoke<PullResult>('cmd_git_pull', { dir });
         }

-        if (result.type === "uncommitted_changes") {
-          void callbacks
-            .promptUncommittedChanges()
-            .then(async (strategy) => {
-              if (strategy === "cancel") return;
-              await invoke("cmd_git_reset_changes", { dir });
-              return invoke<PullResult>("cmd_git_pull", { dir });
-            })
-            .then(async () => {
-              await onSuccess();
-              await callbacks.forceSync();
-            }, handleError);
+        if (result.type === 'uncommitted_changes') {
+          callbacks.promptUncommittedChanges().then(async (strategy) => {
+            if (strategy === 'cancel') return;
+            await invoke('cmd_git_reset_changes', { dir });
+            return invoke<PullResult>('cmd_git_pull', { dir });
+          }).then(async () => { onSuccess(); await callbacks.forceSync(); }, handleError);
         }

-        if (result.type === "diverged") {
-          void callbacks
-            .promptDiverged(result)
-            .then((strategy) => {
-              if (strategy === "cancel") return;
-              if (strategy === "force_reset") {
-                return invoke<PullResult>("cmd_git_pull_force_reset", {
-                  dir,
-                  remote: result.remote,
-                  branch: result.branch,
-                });
-              }
-              return invoke<PullResult>("cmd_git_pull_merge", {
-                dir,
-                remote: result.remote,
-                branch: result.branch,
-              });
-            })
-            .then(async () => {
-              await onSuccess();
-              await callbacks.forceSync();
-            }, handleError);
+        if (result.type === 'diverged') {
+          callbacks.promptDiverged(result).then((strategy) => {
+            if (strategy === 'cancel') return;
+            if (strategy === 'force_reset') {
+              return invoke<PullResult>('cmd_git_pull_force_reset', {
+                dir,
+                remote: result.remote,
+                branch: result.branch,
+              });
+            }
+            return invoke<PullResult>('cmd_git_pull_merge', {
+              dir,
+              remote: result.remote,
+              branch: result.branch,
+            });
          }).then(async () => { onSuccess(); await callbacks.forceSync(); }, handleError);
         }

         return result;
@@ -241,20 +219,20 @@ export const gitMutations = (dir: string, callbacks: GitCallbacks) => {
       onSuccess,
     }),
     unstage: createFastMutation<void, string, { relaPaths: string[] }>({
-      mutationKey: ["git", "unstage", dir],
-      mutationFn: (args) => invoke("cmd_git_unstage", { dir, ...args }),
+      mutationKey: ['git', 'unstage', dir],
+      mutationFn: (args) => invoke('cmd_git_unstage', { dir, ...args }),
       onSuccess,
     }),
     resetChanges: createFastMutation<void, string, void>({
-      mutationKey: ["git", "reset-changes", dir],
-      mutationFn: () => invoke("cmd_git_reset_changes", { dir }),
+      mutationKey: ['git', 'reset-changes', dir],
+      mutationFn: () => invoke('cmd_git_reset_changes', { dir }),
       onSuccess,
     }),
   } as const;
 };

 async function getRemotes(dir: string) {
-  return invoke<GitRemote[]>("cmd_git_remotes", { dir });
+  return invoke<GitRemote[]>('cmd_git_remotes', { dir });
 }

 /**
@@ -263,24 +241,21 @@ async function getRemotes(dir: string) {
 export async function gitClone(
   url: string,
   dir: string,
-  promptCredentials: (args: {
-    url: string;
-    error: string | null;
-  }) => Promise<GitCredentials | null>,
+  promptCredentials: (args: { url: string; error: string | null }) => Promise<GitCredentials | null>,
 ): Promise<CloneResult> {
-  const result = await invoke<CloneResult>("cmd_git_clone", { url, dir });
-  if (result.type !== "needs_credentials") return result;
+  const result = await invoke<CloneResult>('cmd_git_clone', { url, dir });
+  if (result.type !== 'needs_credentials') return result;

   // Prompt for credentials
   const creds = await promptCredentials({ url: result.url, error: result.error });
-  if (creds == null) return { type: "cancelled" };
+  if (creds == null) return { type: 'cancelled' };

   // Store credentials and retry
-  await invoke("cmd_git_add_credential", {
+  await invoke('cmd_git_add_credential', {
     remoteUrl: result.url,
     username: creds.username,
     password: creds.password,
   });
-  return invoke<CloneResult>("cmd_git_clone", { url, dir });
+  return invoke<CloneResult>('cmd_git_clone', { url, dir });
 }
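`gitClone`, `push`, and `pull` above all follow the same "attempt → prompt for credentials → store → retry once" shape. A minimal generic sketch of that pattern, with the Tauri `invoke` calls replaced by stand-in functions (all names here are illustrative):

```typescript
// Result shape modeled on the bindings above; only 'needs_credentials' matters
// to the retry logic.
type AttemptResult =
  | { type: 'ok' }
  | { type: 'cancelled' }
  | { type: 'needs_credentials'; url: string };

async function withCredentialRetry(
  attempt: () => Promise<AttemptResult>,
  prompt: (url: string) => Promise<{ username: string; password: string } | null>,
  store: (url: string, username: string, password: string) => Promise<void>,
): Promise<AttemptResult> {
  const result = await attempt();
  if (result.type !== 'needs_credentials') return result;

  // Prompt the user; bail out if they cancel
  const creds = await prompt(result.url);
  if (creds == null) return { type: 'cancelled' };

  // Persist the credentials, then retry exactly once
  await store(result.url, creds.username, creds.password);
  return attempt();
}
```

The key design point, mirrored from the code above, is that the retry happens at most once: if the stored credentials are still wrong, the second `needs_credentials` result is returned to the caller rather than looping.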


@@ -1,6 +1,6 @@
 {
   "name": "@yaakapp-internal/git",
-  "version": "1.0.0",
   "private": true,
+  "version": "1.0.0",
   "main": "index.ts"
 }


@@ -1,8 +1,9 @@
 use crate::error::Error::GitNotFound;
 use crate::error::Result;
 use std::path::Path;
+use std::process::Stdio;
 use tokio::process::Command;
-use yaak_common::command::new_checked_command;
+use yaak_common::command::new_xplatform_command;

 /// Create a git command that runs in the specified directory
 pub(crate) async fn new_binary_command(dir: &Path) -> Result<Command> {
@@ -13,5 +14,17 @@ pub(crate) async fn new_binary_command(dir: &Path) -> Result<Command> {

 /// Create a git command without a specific directory (for global operations)
 pub(crate) async fn new_binary_command_global() -> Result<Command> {
-    new_checked_command("git", "--version").await.map_err(|_| GitNotFound)
+    // 1. Probe that `git` exists and is runnable
+    let mut probe = new_xplatform_command("git");
+    probe.arg("--version").stdin(Stdio::null()).stdout(Stdio::null()).stderr(Stdio::null());
+    let status = probe.status().await.map_err(|_| GitNotFound)?;
+    if !status.success() {
+        return Err(GitNotFound);
+    }
+
+    // 2. Build the reusable git command
+    let cmd = new_xplatform_command("git");
+    Ok(cmd)
 }
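The probe-before-use idea above (run `git --version` with all stdio silenced, and treat a spawn failure or non-zero exit as "git is not installed") translates directly to other runtimes. A Node sketch of the same check (the helper is hypothetical, not part of the codebase):

```typescript
import { spawnSync } from 'node:child_process';

// Returns true only when the binary can be spawned AND the probe invocation
// exits successfully, mirroring the two failure modes handled above
// (spawn error -> GitNotFound, non-zero status -> GitNotFound).
function binaryAvailable(binary: string, probeArg = '--version'): boolean {
  const result = spawnSync(binary, [probeArg], { stdio: 'ignore' });
  return result.error === undefined && result.status === 0;
}
```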


@@ -55,7 +55,6 @@ mod tests {
         let mut out = Vec::new();
         super::collect_any_types(json, &mut out);
-        out.sort();
         assert_eq!(out, vec!["foo.bar", "mount_source.MountSourceRBDVolume"]);
     }
 }


@@ -19,12 +19,7 @@ hyper-util = { version = "0.1.17", default-features = false, features = ["client
 log = { workspace = true }
 mime_guess = "2.0.5"
 regex = "1.11.1"
-reqwest = { workspace = true, features = [
-    "rustls-tls-manual-roots-no-provider",
-    "socks",
-    "http2",
-    "stream",
-] }
+reqwest = { workspace = true, features = ["rustls-tls-manual-roots-no-provider", "socks", "http2", "stream"] }
 serde = { workspace = true, features = ["derive"] }
 serde_json = { workspace = true }
 thiserror = { workspace = true }
@@ -34,5 +29,4 @@ tower-service = "0.3.3"
 urlencoding = "2.1.3"
 yaak-common = { workspace = true }
 yaak-models = { workspace = true }
-yaak-templates = { workspace = true }
 yaak-tls = { workspace = true }

View File

@@ -1,6 +1,7 @@
 use crate::client::HttpConnectionOptions;
 use crate::dns::LocalhostResolver;
 use crate::error::Result;
+use log::info;
 use reqwest::Client;
 use std::collections::BTreeMap;
 use std::sync::Arc;
@@ -35,6 +36,7 @@ impl HttpConnectionManager {
         connections.retain(|_, (_, last_used)| last_used.elapsed() <= self.ttl);

         if let Some((cached, last_used)) = connections.get_mut(&id) {
+            info!("Re-using HTTP client {id}");
             *last_used = Instant::now();
             return Ok(CachedClient {
                 client: cached.client.clone(),

View File

@@ -30,8 +30,6 @@ pub enum HttpResponseEvent {
         url: String,
         status: u16,
         behavior: RedirectBehavior,
-        dropped_body: bool,
-        dropped_headers: Vec<String>,
     },
     SendUrl {
         method: String,
@@ -69,28 +67,12 @@ impl Display for HttpResponseEvent {
         match self {
             HttpResponseEvent::Setting(name, value) => write!(f, "* Setting {}={}", name, value),
             HttpResponseEvent::Info(s) => write!(f, "* {}", s),
-            HttpResponseEvent::Redirect {
-                url,
-                status,
-                behavior,
-                dropped_body,
-                dropped_headers,
-            } => {
+            HttpResponseEvent::Redirect { url, status, behavior } => {
                 let behavior_str = match behavior {
                     RedirectBehavior::Preserve => "preserve",
                     RedirectBehavior::DropBody => "drop body",
                 };
-                let body_str = if *dropped_body { ", body dropped" } else { "" };
-                let headers_str = if dropped_headers.is_empty() {
-                    String::new()
-                } else {
-                    format!(", headers dropped: {}", dropped_headers.join(", "))
-                };
-                write!(
-                    f,
-                    "* Redirect {} -> {} ({}{}{})",
-                    status, url, behavior_str, body_str, headers_str
-                )
+                write!(f, "* Redirect {} -> {} ({})", status, url, behavior_str)
             }
             HttpResponseEvent::SendUrl {
                 method,
@@ -148,21 +130,13 @@ impl From<HttpResponseEvent> for yaak_models::models::HttpResponseEventData {
         match event {
             HttpResponseEvent::Setting(name, value) => D::Setting { name, value },
             HttpResponseEvent::Info(message) => D::Info { message },
-            HttpResponseEvent::Redirect {
-                url,
-                status,
-                behavior,
-                dropped_body,
-                dropped_headers,
-            } => D::Redirect {
+            HttpResponseEvent::Redirect { url, status, behavior } => D::Redirect {
                 url,
                 status,
                 behavior: match behavior {
                     RedirectBehavior::Preserve => "preserve".to_string(),
                     RedirectBehavior::DropBody => "drop_body".to_string(),
                 },
-                dropped_body,
-                dropped_headers,
             },
             HttpResponseEvent::SendUrl {
                 method,

View File

@@ -1,7 +1,7 @@
 use crate::cookies::CookieStore;
 use crate::error::Result;
 use crate::sender::{HttpResponse, HttpResponseEvent, HttpSender, RedirectBehavior};
-use crate::types::{SendableBody, SendableHttpRequest};
+use crate::types::SendableHttpRequest;
 use log::debug;
 use tokio::sync::mpsc;
 use tokio::sync::watch::Receiver;
@@ -87,11 +87,6 @@ impl<S: HttpSender> HttpTransaction<S> {
             };

             // Build request for this iteration
-            let preserved_body = match &current_body {
-                Some(SendableBody::Bytes(b)) => Some(SendableBody::Bytes(b.clone())),
-                _ => None,
-            };
-            let request_had_body = current_body.is_some();
             let req = SendableHttpRequest {
                 url: current_url.clone(),
                 method: current_method.clone(),
@@ -187,6 +182,8 @@ impl<S: HttpSender> HttpTransaction<S> {
                 format!("{}/{}", base_path, location)
             };

+            Self::remove_sensitive_headers(&mut current_headers, &previous_url, &current_url);
+
             // Determine redirect behavior based on status code and method
             let behavior = if status == 303 {
                 // 303 See Other always changes to GET
@@ -200,8 +197,11 @@ impl<S: HttpSender> HttpTransaction<S> {
                 RedirectBehavior::Preserve
             };

-            let mut dropped_headers =
-                Self::remove_sensitive_headers(&mut current_headers, &previous_url, &current_url);
+            send_event(HttpResponseEvent::Redirect {
+                url: current_url.clone(),
+                status,
+                behavior: behavior.clone(),
+            });

             // Handle method changes for certain redirect codes
             if matches!(behavior, RedirectBehavior::DropBody) {
@@ -211,40 +211,13 @@ impl<S: HttpSender> HttpTransaction<S> {
                 // Remove content-related headers
                 current_headers.retain(|h| {
                     let name_lower = h.0.to_lowercase();
-                    let should_drop =
-                        name_lower.starts_with("content-") || name_lower == "transfer-encoding";
-                    if should_drop {
-                        Self::push_header_if_missing(&mut dropped_headers, &h.0);
-                    }
-                    !should_drop
+                    !name_lower.starts_with("content-") && name_lower != "transfer-encoding"
                 });
             }

-            // Restore body for Preserve redirects (307/308), drop for others.
-            // Stream bodies can't be replayed (same limitation as reqwest).
-            current_body = if matches!(behavior, RedirectBehavior::Preserve) {
-                if request_had_body && preserved_body.is_none() {
-                    // Stream body was consumed and can't be replayed (same as reqwest)
-                    return Err(crate::error::Error::RequestError(
-                        "Cannot follow redirect: request body was a stream and cannot be resent"
-                            .to_string(),
-                    ));
-                }
-                preserved_body
-            } else {
-                None
-            };
-
-            // Body was dropped if the request had one but we can't resend it
-            let dropped_body = request_had_body && current_body.is_none();
-
-            send_event(HttpResponseEvent::Redirect {
-                url: current_url.clone(),
-                status,
-                behavior: behavior.clone(),
-                dropped_body,
-                dropped_headers,
-            });
+            // Reset body for next iteration (since it was moved in the send call)
+            // For redirects that change method to GET or for all redirects since body was consumed
+            current_body = None;

             redirect_count += 1;
         }
@@ -258,8 +231,7 @@ impl<S: HttpSender> HttpTransaction<S> {
         headers: &mut Vec<(String, String)>,
         previous_url: &str,
         next_url: &str,
-    ) -> Vec<String> {
-        let mut dropped_headers = Vec::new();
+    ) {
         let previous_host = Url::parse(previous_url).ok().and_then(|u| {
             u.host_str().map(|h| format!("{}:{}", h, u.port_or_known_default().unwrap_or(0)))
         });
@@ -269,24 +241,13 @@ impl<S: HttpSender> HttpTransaction<S> {
         if previous_host != next_host {
             headers.retain(|h| {
                 let name_lower = h.0.to_lowercase();
-                let should_drop = name_lower == "authorization"
-                    || name_lower == "cookie"
-                    || name_lower == "cookie2"
-                    || name_lower == "proxy-authorization"
-                    || name_lower == "www-authenticate";
-                if should_drop {
-                    Self::push_header_if_missing(&mut dropped_headers, &h.0);
-                }
-                !should_drop
+                name_lower != "authorization"
+                    && name_lower != "cookie"
+                    && name_lower != "cookie2"
+                    && name_lower != "proxy-authorization"
+                    && name_lower != "www-authenticate"
             });
         }
-
-        dropped_headers
-    }
-
-    fn push_header_if_missing(headers: &mut Vec<String>, name: &str) {
-        if !headers.iter().any(|h| h.eq_ignore_ascii_case(name)) {
-            headers.push(name.to_string());
-        }
     }

     /// Check if a status code indicates a redirect
View File

@@ -9,9 +9,8 @@ use std::collections::BTreeMap;
 use std::pin::Pin;
 use std::time::Duration;
 use tokio::io::AsyncRead;
-use yaak_common::serde::{get_bool, get_bool_map, get_str, get_str_map};
+use yaak_common::serde::{get_bool, get_str, get_str_map};
 use yaak_models::models::HttpRequest;
-use yaak_templates::strip_json_comments::{maybe_strip_json_comments, strip_json_comments};

 pub(crate) const MULTIPART_BOUNDARY: &str = "------YaakFormBoundary";
@@ -135,69 +134,16 @@ pub fn append_query_params(url: &str, params: Vec<(String, String)>) -> String {
     result
 }

-fn strip_query_params(url: &str, names: &[&str]) -> String {
-    // Split off fragment
-    let (base_and_query, fragment) = if let Some(hash_pos) = url.find('#') {
-        (&url[..hash_pos], Some(&url[hash_pos..]))
-    } else {
-        (url, None)
-    };
-
-    let result = if let Some(q_pos) = base_and_query.find('?') {
-        let base = &base_and_query[..q_pos];
-        let query = &base_and_query[q_pos + 1..];
-        let filtered: Vec<&str> = query
-            .split('&')
-            .filter(|pair| {
-                let key = pair.split('=').next().unwrap_or("");
-                let decoded = urlencoding::decode(key).unwrap_or_default();
-                !names.contains(&decoded.as_ref())
-            })
-            .collect();
-        if filtered.is_empty() {
-            base.to_string()
-        } else {
-            format!("{}?{}", base, filtered.join("&"))
-        }
-    } else {
-        base_and_query.to_string()
-    };
-
-    match fragment {
-        Some(f) => format!("{}{}", result, f),
-        None => result,
-    }
-}
-
 fn build_url(r: &HttpRequest) -> String {
     let (url_string, params) = apply_path_placeholders(&ensure_proto(&r.url), &r.url_parameters);
-    let mut url = append_query_params(
+    append_query_params(
         &url_string,
         params
             .iter()
             .filter(|p| p.enabled && !p.name.is_empty())
             .map(|p| (p.name.clone(), p.value.clone()))
             .collect(),
-    );
-
-    // GraphQL GET requests encode query/variables as URL query parameters
-    if r.method.to_lowercase() == "get" && r.body_type.as_deref() == Some("graphql") {
-        url = append_graphql_query_params(&url, &r.body);
-    }
-
-    url
-}
-
-fn append_graphql_query_params(url: &str, body: &BTreeMap<String, serde_json::Value>) -> String {
-    let query = get_str_map(body, "query").to_string();
-    let variables = strip_json_comments(&get_str_map(body, "variables"));
-    let mut params = vec![("query".to_string(), query)];
-    if !variables.trim().is_empty() {
-        params.push(("variables".to_string(), variables));
-    }
-    // Strip existing query/variables params to avoid duplicates
-    let url = strip_query_params(url, &["query", "variables"]);
-    append_query_params(&url, params)
+    )
 }

 fn build_headers(r: &HttpRequest) -> Vec<(String, String)> {
@@ -226,10 +172,12 @@ async fn build_body(
     let (body, content_type) = match body_type.as_str() {
         "binary" => (build_binary_body(&body).await?, None),
-        "graphql" => (build_graphql_body(&method, &body), None),
-        "application/x-www-form-urlencoded" => (build_form_body(&body), None),
+        "graphql" => (build_graphql_body(&method, &body), Some("application/json".to_string())),
+        "application/x-www-form-urlencoded" => {
+            (build_form_body(&body), Some("application/x-www-form-urlencoded".to_string()))
+        }
         "multipart/form-data" => build_multipart_body(&body, &headers).await?,
-        _ if body.contains_key("text") => (build_text_body(&body, body_type), None),
+        _ if body.contains_key("text") => (build_text_body(&body), None),
         t => {
             warn!("Unsupported body type: {}", t);
             (None, None)
@@ -304,20 +252,13 @@ async fn build_binary_body(
     }))
 }

-fn build_text_body(body: &BTreeMap<String, serde_json::Value>, body_type: &str) -> Option<SendableBodyWithMeta> {
+fn build_text_body(body: &BTreeMap<String, serde_json::Value>) -> Option<SendableBodyWithMeta> {
     let text = get_str_map(body, "text");
     if text.is_empty() {
-        return None;
-    }
-
-    let send_comments = get_bool_map(body, "sendJsonComments", false);
-    let text = if !send_comments && body_type == "application/json" {
-        maybe_strip_json_comments(text)
+        None
     } else {
-        text.to_string()
-    };
-
-    Some(SendableBodyWithMeta::Bytes(Bytes::from(text)))
+        Some(SendableBodyWithMeta::Bytes(Bytes::from(text.to_string())))
+    }
 }

 fn build_graphql_body(
@@ -325,7 +266,7 @@ fn build_graphql_body(
     body: &BTreeMap<String, serde_json::Value>,
 ) -> Option<SendableBodyWithMeta> {
     let query = get_str_map(body, "query");
-    let variables = strip_json_comments(&get_str_map(body, "variables"));
+    let variables = get_str_map(body, "variables");

     if method.to_lowercase() == "get" {
         // GraphQL GET requests use query parameters, not a body
@@ -743,7 +684,7 @@ mod tests {
         let mut body = BTreeMap::new();
         body.insert("text".to_string(), json!("Hello, World!"));
-        let result = build_text_body(&body, "application/json");
+        let result = build_text_body(&body);
         match result {
             Some(SendableBodyWithMeta::Bytes(bytes)) => {
                 assert_eq!(bytes, Bytes::from("Hello, World!"))
@@ -757,7 +698,7 @@ mod tests {
         let mut body = BTreeMap::new();
         body.insert("text".to_string(), json!(""));
-        let result = build_text_body(&body, "application/json");
+        let result = build_text_body(&body);
         assert!(result.is_none());
     }

@@ -765,57 +706,10 @@ mod tests {
     async fn test_text_body_missing() {
         let body = BTreeMap::new();
-        let result = build_text_body(&body, "application/json");
+        let result = build_text_body(&body);
         assert!(result.is_none());
     }

-    #[tokio::test]
-    async fn test_text_body_strips_json_comments_by_default() {
-        let mut body = BTreeMap::new();
-        body.insert("text".to_string(), json!("{\n  // comment\n  \"foo\": \"bar\"\n}"));
-        let result = build_text_body(&body, "application/json");
-        match result {
-            Some(SendableBodyWithMeta::Bytes(bytes)) => {
-                let text = String::from_utf8_lossy(&bytes);
-                assert!(!text.contains("// comment"));
-                assert!(text.contains("\"foo\": \"bar\""));
-            }
-            _ => panic!("Expected Some(SendableBody::Bytes)"),
-        }
-    }
-
-    #[tokio::test]
-    async fn test_text_body_send_json_comments_when_opted_in() {
-        let mut body = BTreeMap::new();
-        body.insert("text".to_string(), json!("{\n  // comment\n  \"foo\": \"bar\"\n}"));
-        body.insert("sendJsonComments".to_string(), json!(true));
-        let result = build_text_body(&body, "application/json");
-        match result {
-            Some(SendableBodyWithMeta::Bytes(bytes)) => {
-                let text = String::from_utf8_lossy(&bytes);
-                assert!(text.contains("// comment"));
-            }
-            _ => panic!("Expected Some(SendableBody::Bytes)"),
-        }
-    }
-
-    #[tokio::test]
-    async fn test_text_body_no_strip_for_non_json() {
-        let mut body = BTreeMap::new();
-        body.insert("text".to_string(), json!("// not json\nsome text"));
-        let result = build_text_body(&body, "text/plain");
-        match result {
-            Some(SendableBodyWithMeta::Bytes(bytes)) => {
-                let text = String::from_utf8_lossy(&bytes);
-                assert!(text.contains("// not json"));
-            }
-            _ => panic!("Expected Some(SendableBody::Bytes)"),
-        }
-    }
-
     #[tokio::test]
     async fn test_form_urlencoded_body() -> Result<()> {
         let mut body = BTreeMap::new();

View File

@@ -49,7 +49,7 @@ export type HttpResponseEvent = { model: "http_response_event", id: string, crea
  * This mirrors `yaak_http::sender::HttpResponseEvent` but with serde support.
  * The `From` impl is in yaak-http to avoid circular dependencies.
  */
-export type HttpResponseEventData = { "type": "setting", name: string, value: string, } | { "type": "info", message: string, } | { "type": "redirect", url: string, status: number, behavior: string, dropped_body: boolean, dropped_headers: Array<string>, } | { "type": "send_url", method: string, scheme: string, username: string, password: string, host: string, port: number, path: string, query: string, fragment: string, } | { "type": "receive_url", version: string, status: string, } | { "type": "header_up", name: string, value: string, } | { "type": "header_down", name: string, value: string, } | { "type": "chunk_sent", bytes: number, } | { "type": "chunk_received", bytes: number, } | { "type": "dns_resolved", hostname: string, addresses: Array<string>, duration: bigint, overridden: boolean, };
+export type HttpResponseEventData = { "type": "setting", name: string, value: string, } | { "type": "info", message: string, } | { "type": "redirect", url: string, status: number, behavior: string, } | { "type": "send_url", method: string, scheme: string, username: string, password: string, host: string, port: number, path: string, query: string, fragment: string, } | { "type": "receive_url", version: string, status: string, } | { "type": "header_up", name: string, value: string, } | { "type": "header_down", name: string, value: string, } | { "type": "chunk_sent", bytes: number, } | { "type": "chunk_received", bytes: number, } | { "type": "dns_resolved", hostname: string, addresses: Array<string>, duration: bigint, overridden: boolean, };

 export type HttpResponseHeader = { name: string, value: string, };
@@ -67,9 +67,7 @@ export type ParentAuthentication = { authentication: Record<string, any>, authen
 export type ParentHeaders = { headers: Array<HttpRequestHeader>, };

-export type Plugin = { model: "plugin", id: string, createdAt: string, updatedAt: string, checkedAt: string | null, directory: string, enabled: boolean, url: string | null, source: PluginSource, };
-
-export type PluginSource = "bundled" | "filesystem" | "registry";
+export type Plugin = { model: "plugin", id: string, createdAt: string, updatedAt: string, checkedAt: string | null, directory: string, enabled: boolean, url: string | null, };

 export type PluginKeyValue = { model: "plugin_key_value", createdAt: string, updatedAt: string, pluginName: string, key: string, value: string, };

View File

@@ -1,39 +1,35 @@
import { atom } from "jotai"; import { atom } from 'jotai';
import { selectAtom } from "jotai/utils"; import { selectAtom } from 'jotai/utils';
import type { AnyModel } from "../bindings/gen_models"; import type { AnyModel } from '../bindings/gen_models';
import { ExtractModel } from "./types"; import { ExtractModel } from './types';
import { newStoreData } from "./util"; import { newStoreData } from './util';
export const modelStoreDataAtom = atom(newStoreData()); export const modelStoreDataAtom = atom(newStoreData());
export const cookieJarsAtom = createOrderedModelAtom("cookie_jar", "name", "asc"); export const cookieJarsAtom = createOrderedModelAtom('cookie_jar', 'name', 'asc');
export const environmentsAtom = createOrderedModelAtom("environment", "sortPriority", "asc"); export const environmentsAtom = createOrderedModelAtom('environment', 'sortPriority', 'asc');
export const foldersAtom = createModelAtom("folder"); export const foldersAtom = createModelAtom('folder');
export const grpcConnectionsAtom = createOrderedModelAtom("grpc_connection", "createdAt", "desc"); export const grpcConnectionsAtom = createOrderedModelAtom('grpc_connection', 'createdAt', 'desc');
export const grpcEventsAtom = createOrderedModelAtom("grpc_event", "createdAt", "asc"); export const grpcEventsAtom = createOrderedModelAtom('grpc_event', 'createdAt', 'asc');
export const grpcRequestsAtom = createModelAtom("grpc_request"); export const grpcRequestsAtom = createModelAtom('grpc_request');
export const httpRequestsAtom = createModelAtom("http_request"); export const httpRequestsAtom = createModelAtom('http_request');
export const httpResponsesAtom = createOrderedModelAtom("http_response", "createdAt", "desc"); export const httpResponsesAtom = createOrderedModelAtom('http_response', 'createdAt', 'desc');
export const httpResponseEventsAtom = createOrderedModelAtom( export const httpResponseEventsAtom = createOrderedModelAtom('http_response_event', 'createdAt', 'asc');
"http_response_event", export const keyValuesAtom = createModelAtom('key_value');
"createdAt", export const pluginsAtom = createModelAtom('plugin');
"asc", export const settingsAtom = createSingularModelAtom('settings');
); export const websocketRequestsAtom = createModelAtom('websocket_request');
export const keyValuesAtom = createModelAtom("key_value"); export const websocketEventsAtom = createOrderedModelAtom('websocket_event', 'createdAt', 'asc');
export const pluginsAtom = createModelAtom("plugin");
export const settingsAtom = createSingularModelAtom("settings");
export const websocketRequestsAtom = createModelAtom("websocket_request");
export const websocketEventsAtom = createOrderedModelAtom("websocket_event", "createdAt", "asc");
export const websocketConnectionsAtom = createOrderedModelAtom( export const websocketConnectionsAtom = createOrderedModelAtom(
"websocket_connection", 'websocket_connection',
"createdAt", 'createdAt',
"desc", 'desc',
); );
export const workspaceMetasAtom = createModelAtom("workspace_meta"); export const workspaceMetasAtom = createModelAtom('workspace_meta');
export const workspacesAtom = createOrderedModelAtom("workspace", "name", "asc"); export const workspacesAtom = createOrderedModelAtom('workspace', 'name', 'asc');
export function createModelAtom<M extends AnyModel["model"]>(modelType: M) { export function createModelAtom<M extends AnyModel['model']>(modelType: M) {
return selectAtom( return selectAtom(
modelStoreDataAtom, modelStoreDataAtom,
(data) => Object.values(data[modelType] ?? {}), (data) => Object.values(data[modelType] ?? {}),
@@ -41,19 +37,19 @@ export function createModelAtom<M extends AnyModel["model"]>(modelType: M) {
); );
} }
export function createSingularModelAtom<M extends AnyModel["model"]>(modelType: M) { export function createSingularModelAtom<M extends AnyModel['model']>(modelType: M) {
return selectAtom(modelStoreDataAtom, (data) => { return selectAtom(modelStoreDataAtom, (data) => {
const modelData = Object.values(data[modelType] ?? {}); const modelData = Object.values(data[modelType] ?? {});
const item = modelData[0]; const item = modelData[0];
if (item == null) throw new Error("Failed creating singular model with no data: " + modelType); if (item == null) throw new Error('Failed creating singular model with no data: ' + modelType);
return item; return item;
}); });
} }
export function createOrderedModelAtom<M extends AnyModel["model"]>( export function createOrderedModelAtom<M extends AnyModel['model']>(
modelType: M, modelType: M,
field: keyof ExtractModel<AnyModel, M>, field: keyof ExtractModel<AnyModel, M>,
order: "asc" | "desc", order: 'asc' | 'desc',
) { ) {
return selectAtom( return selectAtom(
modelStoreDataAtom, modelStoreDataAtom,
@@ -62,7 +58,7 @@ export function createOrderedModelAtom<M extends AnyModel["model"]>(
return Object.values(modelData).sort( return Object.values(modelData).sort(
(a: ExtractModel<AnyModel, M>, b: ExtractModel<AnyModel, M>) => { (a: ExtractModel<AnyModel, M>, b: ExtractModel<AnyModel, M>) => {
const n = a[field] > b[field] ? 1 : -1; const n = a[field] > b[field] ? 1 : -1;
return order === "desc" ? n * -1 : n; return order === 'desc' ? n * -1 : n;
}, },
); );
}, },

View File

@@ -1,11 +1,11 @@
import { AnyModel } from "../bindings/gen_models"; import { AnyModel } from '../bindings/gen_models';
export * from "../bindings/gen_models"; export * from '../bindings/gen_models';
export * from "../bindings/gen_util"; export * from '../bindings/gen_util';
export * from "./store"; export * from './store';
export * from "./atoms"; export * from './atoms';
export function modelTypeLabel(m: AnyModel): string { export function modelTypeLabel(m: AnyModel): string {
const capitalize = (str: string) => str.charAt(0).toUpperCase() + str.slice(1); const capitalize = (str: string) => str.charAt(0).toUpperCase() + str.slice(1);
return m.model.split("_").map(capitalize).join(" "); return m.model.split('_').map(capitalize).join(' ');
} }

View File

@@ -1,10 +1,10 @@
import { invoke } from "@tauri-apps/api/core"; import { invoke } from '@tauri-apps/api/core';
import { getCurrentWebviewWindow } from "@tauri-apps/api/webviewWindow"; import { getCurrentWebviewWindow } from '@tauri-apps/api/webviewWindow';
import { resolvedModelName } from "@yaakapp/app/lib/resolvedModelName"; import { resolvedModelName } from '@yaakapp/app/lib/resolvedModelName';
import { AnyModel, ModelPayload } from "../bindings/gen_models"; import { AnyModel, ModelPayload } from '../bindings/gen_models';
import { modelStoreDataAtom } from "./atoms"; import { modelStoreDataAtom } from './atoms';
import { ExtractModel, JotaiStore, ModelStoreData } from "./types"; import { ExtractModel, JotaiStore, ModelStoreData } from './types';
import { newStoreData } from "./util"; import { newStoreData } from './util';
let _store: JotaiStore | null = null; let _store: JotaiStore | null = null;
@@ -12,11 +12,11 @@ export function initModelStore(store: JotaiStore) {
_store = store; _store = store;
getCurrentWebviewWindow() getCurrentWebviewWindow()
.listen<ModelPayload>("model_write", ({ payload }) => { .listen<ModelPayload>('model_write', ({ payload }) => {
if (shouldIgnoreModel(payload)) return; if (shouldIgnoreModel(payload)) return;
mustStore().set(modelStoreDataAtom, (prev: ModelStoreData) => { mustStore().set(modelStoreDataAtom, (prev: ModelStoreData) => {
if (payload.change.type === "upsert") { if (payload.change.type === 'upsert') {
return { return {
...prev, ...prev,
[payload.model.model]: { [payload.model.model]: {
@@ -36,7 +36,7 @@ export function initModelStore(store: JotaiStore) {
function mustStore(): JotaiStore { function mustStore(): JotaiStore {
if (_store == null) { if (_store == null) {
throw new Error("Model store was not initialized"); throw new Error('Model store was not initialized');
} }
return _store; return _store;
@@ -45,8 +45,8 @@ function mustStore(): JotaiStore {
let _activeWorkspaceId: string | null = null; let _activeWorkspaceId: string | null = null;
export async function changeModelStoreWorkspace(workspaceId: string | null) { export async function changeModelStoreWorkspace(workspaceId: string | null) {
console.log("Syncing models with new workspace", workspaceId); console.log('Syncing models with new workspace', workspaceId);
const workspaceModelsStr = await invoke<string>("models_workspace_models", { const workspaceModelsStr = await invoke<string>('models_workspace_models', {
workspaceId, // NOTE: if no workspace id provided, it will just fetch global models workspaceId, // NOTE: if no workspace id provided, it will just fetch global models
}); });
const workspaceModels = JSON.parse(workspaceModelsStr) as AnyModel[]; const workspaceModels = JSON.parse(workspaceModelsStr) as AnyModel[];
@@ -57,12 +57,12 @@ export async function changeModelStoreWorkspace(workspaceId: string | null) {
mustStore().set(modelStoreDataAtom, data); mustStore().set(modelStoreDataAtom, data);
console.log("Synced model store with workspace", workspaceId, data); console.log('Synced model store with workspace', workspaceId, data);
_activeWorkspaceId = workspaceId; _activeWorkspaceId = workspaceId;
} }
export function listModels<M extends AnyModel["model"], T extends ExtractModel<AnyModel, M>>( export function listModels<M extends AnyModel['model'], T extends ExtractModel<AnyModel, M>>(
modelType: M | ReadonlyArray<M>, modelType: M | ReadonlyArray<M>,
): T[] { ): T[] {
let data = mustStore().get(modelStoreDataAtom); let data = mustStore().get(modelStoreDataAtom);
@@ -70,7 +70,7 @@ export function listModels<M extends AnyModel["model"], T extends ExtractModel<A
return types.flatMap((t) => Object.values(data[t]) as T[]); return types.flatMap((t) => Object.values(data[t]) as T[]);
} }
export function getModel<M extends AnyModel["model"], T extends ExtractModel<AnyModel, M>>( export function getModel<M extends AnyModel['model'], T extends ExtractModel<AnyModel, M>>(
modelType: M | ReadonlyArray<M>, modelType: M | ReadonlyArray<M>,
id: string, id: string,
): T | null { ): T | null {
@@ -83,17 +83,18 @@ export function getModel<M extends AnyModel["model"], T extends ExtractModel<Any
return null; return null;
} }
export function getAnyModel(id: string): AnyModel | null { export function getAnyModel(
id: string,
): AnyModel | null {
let data = mustStore().get(modelStoreDataAtom); let data = mustStore().get(modelStoreDataAtom);
for (const t of Object.keys(data)) { for (const t of Object.keys(data)) {
// oxlint-disable-next-line no-explicit-any
let v = (data as any)[t]?.[id]; let v = (data as any)[t]?.[id];
if (v?.model === t) return v; if (v?.model === t) return v;
} }
return null; return null;
} }
export function patchModelById<M extends AnyModel["model"], T extends ExtractModel<AnyModel, M>>( export function patchModelById<M extends AnyModel['model'], T extends ExtractModel<AnyModel, M>>(
model: M, model: M,
id: string, id: string,
patch: Partial<T> | ((prev: T) => T), patch: Partial<T> | ((prev: T) => T),
@@ -103,54 +104,54 @@ export function patchModelById<M extends AnyModel["model"], T extends ExtractMod
throw new Error(`Failed to get model to patch id=${id} model=${model}`); throw new Error(`Failed to get model to patch id=${id} model=${model}`);
} }
const newModel = typeof patch === "function" ? patch(prev) : { ...prev, ...patch }; const newModel = typeof patch === 'function' ? patch(prev) : { ...prev, ...patch };
return updateModel(newModel); return updateModel(newModel);
} }
export async function patchModel<M extends AnyModel["model"], T extends ExtractModel<AnyModel, M>>( export async function patchModel<M extends AnyModel['model'], T extends ExtractModel<AnyModel, M>>(
base: Pick<T, "id" | "model">, base: Pick<T, 'id' | 'model'>,
patch: Partial<T>, patch: Partial<T>,
): Promise<string> { ): Promise<string> {
return patchModelById<M, T>(base.model, base.id, patch); return patchModelById<M, T>(base.model, base.id, patch);
} }
export async function updateModel<M extends AnyModel["model"], T extends ExtractModel<AnyModel, M>>( export async function updateModel<M extends AnyModel['model'], T extends ExtractModel<AnyModel, M>>(
model: T, model: T,
): Promise<string> { ): Promise<string> {
return invoke<string>("models_upsert", { model }); return invoke<string>('models_upsert', { model });
} }
export async function deleteModelById< export async function deleteModelById<
M extends AnyModel["model"], M extends AnyModel['model'],
T extends ExtractModel<AnyModel, M>, T extends ExtractModel<AnyModel, M>,
>(modelType: M | M[], id: string) { >(modelType: M | M[], id: string) {
let model = getModel<M, T>(modelType, id); let model = getModel<M, T>(modelType, id);
await deleteModel(model); await deleteModel(model);
} }
export async function deleteModel<M extends AnyModel["model"], T extends ExtractModel<AnyModel, M>>( export async function deleteModel<M extends AnyModel['model'], T extends ExtractModel<AnyModel, M>>(
model: T | null, model: T | null,
) { ) {
if (model == null) { if (model == null) {
throw new Error("Failed to delete null model"); throw new Error('Failed to delete null model');
} }
await invoke<string>("models_delete", { model }); await invoke<string>('models_delete', { model });
} }
export function duplicateModel<M extends AnyModel["model"], T extends ExtractModel<AnyModel, M>>( export function duplicateModel<M extends AnyModel['model'], T extends ExtractModel<AnyModel, M>>(
model: T | null, model: T | null,
) { ) {
if (model == null) { if (model == null) {
throw new Error("Failed to duplicate null model"); throw new Error('Failed to duplicate null model');
} }
// If the model has a name, try to duplicate it with a name that doesn't conflict // If the model has a name, try to duplicate it with a name that doesn't conflict
let name = "name" in model ? resolvedModelName(model) : undefined; let name = 'name' in model ? resolvedModelName(model) : undefined;
if (name != null) { if (name != null) {
const existingModels = listModels(model.model); const existingModels = listModels(model.model);
for (let i = 0; i < 100; i++) { for (let i = 0; i < 100; i++) {
const hasConflict = existingModels.some((m) => { const hasConflict = existingModels.some((m) => {
if ("folderId" in m && "folderId" in model && model.folderId !== m.folderId) { if ('folderId' in m && 'folderId' in model && model.folderId !== m.folderId) {
return false; return false;
} else if (resolvedModelName(m) !== name) { } else if (resolvedModelName(m) !== name) {
return false; return false;
@@ -164,7 +165,7 @@ export function duplicateModel<M extends AnyModel["model"], T extends ExtractMod
// Name conflict. Try another one // Name conflict. Try another one
const m: RegExpMatchArray | null = name.match(/ Copy( (?<n>\d+))?$/); const m: RegExpMatchArray | null = name.match(/ Copy( (?<n>\d+))?$/);
if (m != null && m.groups?.n == null) { if (m != null && m.groups?.n == null) {
name = name.substring(0, m.index) + " Copy 2"; name = name.substring(0, m.index) + ' Copy 2';
} else if (m != null && m.groups?.n != null) { } else if (m != null && m.groups?.n != null) {
name = name.substring(0, m.index) + ` Copy ${parseInt(m.groups.n) + 1}`; name = name.substring(0, m.index) + ` Copy ${parseInt(m.groups.n) + 1}`;
} else { } else {
@@ -173,23 +174,23 @@ export function duplicateModel<M extends AnyModel["model"], T extends ExtractMod
} }
} }
return invoke<string>("models_duplicate", { model: { ...model, name } }); return invoke<string>('models_duplicate', { model: { ...model, name } });
} }
export async function createGlobalModel<T extends Exclude<AnyModel, { workspaceId: string }>>( export async function createGlobalModel<T extends Exclude<AnyModel, { workspaceId: string }>>(
patch: Partial<T> & Pick<T, "model">, patch: Partial<T> & Pick<T, 'model'>,
): Promise<string> { ): Promise<string> {
return invoke<string>("models_upsert", { model: patch }); return invoke<string>('models_upsert', { model: patch });
} }
export async function createWorkspaceModel<T extends Extract<AnyModel, { workspaceId: string }>>( export async function createWorkspaceModel<T extends Extract<AnyModel, { workspaceId: string }>>(
patch: Partial<T> & Pick<T, "model" | "workspaceId">, patch: Partial<T> & Pick<T, 'model' | 'workspaceId'>,
): Promise<string> { ): Promise<string> {
return invoke<string>("models_upsert", { model: patch }); return invoke<string>('models_upsert', { model: patch });
} }
export function replaceModelsInStore< export function replaceModelsInStore<
M extends AnyModel["model"], M extends AnyModel['model'],
T extends Extract<AnyModel, { model: M }>, T extends Extract<AnyModel, { model: M }>,
>(model: M, models: T[]) { >(model: M, models: T[]) {
const newModels: Record<string, T> = {}; const newModels: Record<string, T> = {};
@@ -206,7 +207,7 @@ export function replaceModelsInStore<
} }
export function mergeModelsInStore< export function mergeModelsInStore<
M extends AnyModel["model"], M extends AnyModel['model'],
T extends Extract<AnyModel, { model: M }>, T extends Extract<AnyModel, { model: M }>,
>(model: M, models: T[], filter?: (model: T) => boolean) { >(model: M, models: T[], filter?: (model: T) => boolean) {
mustStore().set(modelStoreDataAtom, (prev: ModelStoreData) => { mustStore().set(modelStoreDataAtom, (prev: ModelStoreData) => {
@@ -235,7 +236,7 @@ export function mergeModelsInStore<
function shouldIgnoreModel({ model, updateSource }: ModelPayload) { function shouldIgnoreModel({ model, updateSource }: ModelPayload) {
// Never ignore updates from non-user sources // Never ignore updates from non-user sources
if (updateSource.type !== "window") { if (updateSource.type !== 'window') {
return false; return false;
} }
@@ -245,11 +246,11 @@ function shouldIgnoreModel({ model, updateSource }: ModelPayload) {
} }
// Only sync models that belong to this workspace, if a workspace ID is present // Only sync models that belong to this workspace, if a workspace ID is present
if ("workspaceId" in model && model.workspaceId !== _activeWorkspaceId) { if ('workspaceId' in model && model.workspaceId !== _activeWorkspaceId) {
return true; return true;
} }
if (model.model === "key_value" && model.namespace === "no_sync") { if (model.model === 'key_value' && model.namespace === 'no_sync') {
return true; return true;
} }