Compare commits


1 Commit

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Gregory Schier | 72d862ed22 | Disable editor state | 2025-01-21 11:58:52 -08:00 |

1328 changed files with 233124 additions and 89265 deletions


@@ -1,72 +0,0 @@
# Claude Context: Detaching Tauri from Yaak
## Goal
Make Yaak runnable as a standalone CLI without Tauri as a dependency. The core Rust crates in `crates/` should be usable independently, while Tauri-specific code lives in `crates-tauri/`.
## Project Structure
```
crates/ # Core crates - should NOT depend on Tauri
crates-tauri/ # Tauri-specific crates (yaak-app, yaak-tauri-utils, etc.)
crates-cli/ # CLI crate (yaak-cli)
```
## Completed Work
### 1. Folder Restructure
- Moved Tauri-dependent app code to `crates-tauri/yaak-app/`
- Created `crates-tauri/yaak-tauri-utils/` for shared Tauri utilities (window traits, api_client, error handling)
- Created `crates-cli/yaak-cli/` for the standalone CLI
### 2. Decoupled Crates (no longer depend on Tauri)
- **yaak-models**: Uses `init_standalone()` pattern for CLI database access
- **yaak-http**: Removed Tauri plugin, HttpConnectionManager initialized in yaak-app setup
- **yaak-common**: Only contains Tauri-free utilities (serde, platform)
- **yaak-crypto**: Removed Tauri plugin, EncryptionManager initialized in yaak-app setup, commands moved to yaak-app
- **yaak-grpc**: Replaced AppHandle with GrpcConfig struct, uses tokio::process::Command instead of Tauri sidecar
### 3. CLI Implementation
- Basic CLI at `crates-cli/yaak-cli/src/main.rs`
- Commands: workspaces, requests, send (by ID), get (ad-hoc URL), create
- Uses the same database as the Tauri app via `yaak_models::init_standalone()` (see the sketch after this list)
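
A minimal sketch of what that standalone access might look like. The exact signature of `init_standalone()` and the listing helper used here are assumptions, not the real API:

```rust
use std::path::PathBuf;

// Hypothetical wrapper around the standalone database path. `yaak_models::init_standalone()`
// is named above, but its signature and the `list_workspaces` helper are assumed.
async fn list_workspaces(data_dir: PathBuf) -> Result<(), Box<dyn std::error::Error>> {
    // Assumed: opens (or creates) the same SQLite database the Tauri app uses.
    let db = yaak_models::init_standalone(&data_dir).await?;
    for ws in db.list_workspaces()? {
        println!("{}\t{}", ws.id, ws.name); // assumed fields on the workspace model
    }
    Ok(())
}
```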
## Remaining Work
### Crates Still Depending on Tauri (in `crates/`)
1. **yaak-git** (3 files) - Moderate complexity
2. **yaak-plugins** (13 files) - **Hardest** - deeply integrated with Tauri for plugin-to-window communication
3. **yaak-sync** (4 files) - Moderate complexity
4. **yaak-ws** (5 files) - Moderate complexity
### Pattern for Decoupling
1. Remove Tauri plugin `init()` function from the crate
2. Move commands to `yaak-app/src/commands.rs` or keep inline in `lib.rs`
3. Move extension traits (e.g., `SomethingManagerExt`) to yaak-app or yaak-tauri-utils
4. Initialize managers in yaak-app's `.setup()` block (see the sketch after this list)
5. Remove `tauri` from Cargo.toml dependencies
6. Update `crates-tauri/yaak-app/capabilities/default.json` to remove the plugin permission
7. Replace `tauri::async_runtime::block_on` with `tokio::runtime::Handle::current().block_on()`
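
A rough sketch of steps 4 and 7. The manager constructors are assumptions; only the Tauri builder/`Manager` API and the Tokio handle call are standard:

```rust
use tauri::Manager;

fn build() -> tauri::Builder<tauri::Wry> {
    tauri::Builder::default().setup(|app| {
        // Managers that used to be registered by per-crate Tauri plugins are now
        // created here and placed into Tauri's managed state (constructors assumed).
        app.manage(HttpConnectionManager::new());
        app.manage(EncryptionManager::new());
        Ok(())
    })
}

// Step 7: block on a future using the ambient Tokio runtime instead of
// `tauri::async_runtime::block_on`.
fn block_on_sync<F: std::future::Future>(fut: F) -> F::Output {
    tokio::runtime::Handle::current().block_on(fut)
}
```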
## Key Files
- `crates-tauri/yaak-app/src/lib.rs` - Main Tauri app, setup block initializes managers
- `crates-tauri/yaak-app/src/commands.rs` - Migrated Tauri commands
- `crates-tauri/yaak-app/src/models_ext.rs` - Database plugin and extension traits
- `crates-tauri/yaak-tauri-utils/src/window.rs` - WorkspaceWindowTrait for window state
- `crates/yaak-models/src/lib.rs` - Contains `init_standalone()` for CLI usage
## Git Branch
Working on `detach-tauri` branch.
## Recent Commits
```
c40cff40 Remove Tauri dependencies from yaak-crypto and yaak-grpc
df495f1d Move Tauri utilities from yaak-common to yaak-tauri-utils
481e0273 Remove Tauri dependencies from yaak-http and yaak-common
10568ac3 Add HTTP request sending to yaak-cli
bcb7d600 Add yaak-cli stub with basic database access
e718a5f1 Refactor models_ext to use init_standalone from yaak-models
```
## Testing
- Run `cargo check -p <crate>` to verify a crate builds without Tauri
- Run `npm run app-dev` to test the Tauri app still works
- Run `cargo run -p yaak-cli -- --help` to test the CLI


@@ -1,62 +0,0 @@
---
description: Review a PR in a new worktree
allowed-tools: Bash(git worktree:*), Bash(gh pr:*), Bash(git branch:*)
---
Check out a GitHub pull request for review.
## Usage
```
/check-out-pr <PR_NUMBER>
```
## What to do
1. If no PR number is provided, list all open pull requests and ask the user to select one
2. Get PR information using `gh pr view <PR_NUMBER> --json number,headRefName`
3. **Ask the user** whether they want to:
- **A) Check out in current directory** — simple `gh pr checkout <PR_NUMBER>`
- **B) Create a new worktree** — isolated copy at `../yaak-worktrees/pr-<PR_NUMBER>`
4. Follow the appropriate path below
## Option A: Check out in current directory
1. Run `gh pr checkout <PR_NUMBER>`
2. Inform the user which branch they're now on
## Option B: Create a new worktree
1. Create a new worktree at `../yaak-worktrees/pr-<PR_NUMBER>` using `git worktree add` with a timeout of at least 300000ms (5 minutes) since the post-checkout hook runs a bootstrap script
2. Checkout the PR branch in the new worktree using `gh pr checkout <PR_NUMBER>`
3. The post-checkout hook will automatically:
- Create `.env.local` with unique ports
- Copy editor config folders
- Run `npm install && npm run bootstrap`
4. Inform the user:
- Where the worktree was created
- What ports were assigned
- How to access it (cd command)
- How to run the dev server
- How to remove the worktree when done
### Example worktree output
```
Created worktree for PR #123 at ../yaak-worktrees/pr-123
Branch: feature-auth
Ports: Vite (1421), MCP (64344)
To start working:
cd ../yaak-worktrees/pr-123
npm run app-dev
To remove when done:
git worktree remove ../yaak-worktrees/pr-123
```
## Error Handling
- If the PR doesn't exist, show a helpful error
- If the worktree already exists, inform the user and ask if they want to remove and recreate it
- If `gh` CLI is not available, inform the user to install it


@@ -1,49 +0,0 @@
---
description: Generate formatted release notes for Yaak releases
allowed-tools: Bash(git tag:*)
---
Generate formatted release notes for Yaak releases by analyzing git history and pull request descriptions.
## What to do
1. Identify the version tag and the previous version
2. Retrieve all commits between the two versions
   - For a beta version, compare against the previous beta version
   - For a stable version, compare against the previous stable version
3. Fetch PR descriptions for linked issues to find:
   - Feedback URLs (feedback.yaak.app)
   - Additional context and descriptions
   - Installation links for plugins
4. Format the release notes using the standard Yaak format:
   - Changelog badge at the top
   - Bulleted list of changes with PR links
   - Feedback links where available
   - Full changelog comparison link at the bottom
## Output Format
The skill generates markdown-formatted release notes following this structure:
```markdown
[![Changelog](https://img.shields.io/badge/Changelog-VERSION-blue)](https://yaak.app/changelog/VERSION)
- Feature/fix description by @username in [#123](https://github.com/mountain-loop/yaak/pull/123)
- [Linked feedback item](https://feedback.yaak.app/p/item) by @username in [#456](https://github.com/mountain-loop/yaak/pull/456)
- A simple item that doesn't have a feedback or PR link
**Full Changelog**: https://github.com/mountain-loop/yaak/compare/vPREV...vCURRENT
```
**IMPORTANT**: Always add blank lines around the markdown code fence, and output the markdown code block last
**IMPORTANT**: PRs by `@gschier` should not mention the @username
## After Generating Release Notes
After outputting the release notes, ask the user if they would like to create a draft GitHub release with these notes. If they confirm, create the release using:
```bash
gh release create <tag> --draft --prerelease --title "Release <version>" --notes '<release notes>'
```
**IMPORTANT**: The release title format is "Release XXXX" where XXXX is the version WITHOUT the `v` prefix. For example, tag `v2026.2.1-beta.1` gets title "Release 2026.2.1-beta.1".


@@ -1,27 +0,0 @@
# Project Rules
## General Development
- **NEVER** commit or push without explicit confirmation
## Build and Lint
- **ALWAYS** run `npm run lint` after modifying TypeScript or JavaScript files
- Run `npm run bootstrap` after changing plugin runtime or MCP server code
## Plugin System
### Backend Constraints
- Always use `UpdateSource::Plugin` when calling database methods from plugin events (see the sketch after this list)
- Never send timestamps (`createdAt`, `updatedAt`) from TypeScript - Rust backend controls these
- Backend uses `NaiveDateTime` (no timezone) so avoid sending ISO timestamp strings
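
A sketch of the intended call shape. Only `UpdateSource::Plugin` and its import path come from this repo; `Db`, `DbError`, `HttpRequest`, and the upsert method are placeholder names:

```rust
use yaak_models::util::UpdateSource;

// Hypothetical handler for a plugin-originated update. The key points from the rules:
// tag the write with UpdateSource::Plugin, and never supply timestamps from the caller.
fn apply_plugin_update(db: &Db, mut req: HttpRequest) -> Result<HttpRequest, DbError> {
    req.created_at = None; // backend owns timestamps (NaiveDateTime, no timezone)
    req.updated_at = None;
    db.upsert_http_request(&req, &UpdateSource::Plugin) // assumed method name
}
```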
### MCP Server
- MCP server has **no active window context** - cannot call `window.workspaceId()`
- Get workspace ID from `workspaceCtx.yaak.workspace.list()` instead
## Rust Type Generation
- Run `cargo test --package yaak-plugins` (and for other crates) to regenerate TypeScript bindings after modifying Rust event types


@@ -1,35 +0,0 @@
# Worktree Management Skill
## Creating Worktrees
When creating git worktrees for this project, ALWAYS use the path format:
```
../yaak-worktrees/<NAME>
```
For example:
- `git worktree add ../yaak-worktrees/feature-auth`
- `git worktree add ../yaak-worktrees/bugfix-login`
- `git worktree add ../yaak-worktrees/refactor-api`
## What Happens Automatically
The post-checkout hook will automatically:
1. Create `.env.local` with unique ports (YAAK_DEV_PORT and YAAK_PLUGIN_MCP_SERVER_PORT)
2. Copy gitignored editor config folders (.zed, .idea, etc.)
3. Run `npm install && npm run bootstrap`
## Deleting Worktrees
```bash
git worktree remove ../yaak-worktrees/<NAME>
```
## Port Assignments
- Main worktree: 1420 (Vite), 64343 (MCP)
- First worktree: 1421, 64344
- Second worktree: 1422, 64345
- etc.
Each worktree can run `npm run app-dev` simultaneously without conflicts.

.eslintignore Normal file

@@ -0,0 +1,6 @@
node_modules/
dist/
.eslintrc.cjs
.prettierrc.cjs
src-web/postcss.config.cjs
src-web/vite.config.ts

.eslintrc.cjs Normal file

@@ -0,0 +1,49 @@
module.exports = {
extends: [
'eslint:recommended',
'plugin:react/recommended',
'plugin:react-hooks/recommended',
'plugin:import/recommended',
'plugin:jsx-a11y/recommended',
'plugin:@typescript-eslint/recommended',
'eslint-config-prettier',
],
plugins: ['react-refresh'],
parser: '@typescript-eslint/parser',
parserOptions: {
project: ['./tsconfig.json'],
},
ignorePatterns: [
'scripts/**/*',
'packages/plugin-runtime/**/*',
'packages/plugin-runtime-types/**/*',
'src-tauri/**/*',
'src-web/tailwind.config.cjs',
'src-web/vite.config.ts',
],
settings: {
react: {
version: 'detect',
},
'import/resolver': {
node: {
paths: ['src-web'],
extensions: ['.ts', '.tsx'],
},
},
},
rules: {
'react-refresh/only-export-components': 'error',
'jsx-a11y/no-autofocus': 'off',
'react/react-in-jsx-scope': 'off',
'import/no-unresolved': 'off',
'@typescript-eslint/consistent-type-imports': [
'error',
{
prefer: 'type-imports',
disallowTypeAnnotations: true,
fixStyle: 'separate-type-imports',
},
],
},
};

.gitattributes vendored

@@ -1,7 +1,2 @@
crates-tauri/yaak-app/vendored/**/* linguist-generated=true
crates-tauri/yaak-app/gen/schemas/**/* linguist-generated=true
**/bindings/* linguist-generated=true
crates/yaak-templates/pkg/* linguist-generated=true
# Ensure consistent line endings for test files that check exact content
crates/yaak-http/tests/test.txt text eol=lf
src-tauri/vendored/**/* linguist-generated=true
src-tauri/gen/schemas/**/* linguist-generated=true

.github/FUNDING.yml vendored

@@ -1,3 +0,0 @@
# These are supported funding model platforms
github: gschier


@@ -1,38 +0,0 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]
**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.

.github/workflows/ci-js.yml vendored Normal file

@@ -0,0 +1,18 @@
on:
pull_request:
branches: [develop]
name: CI (JS)
jobs:
test:
name: Lint/Test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 20
- run: npm ci
- run: npm run lint
- run: npm test

.github/workflows/ci-rust.yml vendored Normal file

@@ -0,0 +1,36 @@
on:
pull_request:
branches: [develop]
paths:
- src-tauri/**
- .github/workflows/**
name: CI (Rust)
defaults:
run:
working-directory: src-tauri
jobs:
test:
name: Check/Test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev
- uses: dtolnay/rust-toolchain@stable
- uses: actions/cache@v3
continue-on-error: false
with:
path: |
~/.cargo/bin/
~/.cargo/registry/index/
~/.cargo/registry/cache/
~/.cargo/git/db/
target/
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
restore-keys: ${{ runner.os }}-cargo-
- run: cargo check --all
- run: cargo test --all


@@ -1,30 +0,0 @@
on:
pull_request:
push:
branches:
- main
name: Lint and Test
permissions:
contents: read
jobs:
test:
name: Lint/Test
runs-on: macos-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
- uses: dtolnay/rust-toolchain@stable
- uses: Swatinem/rust-cache@v2
with:
shared-key: ci
cache-on-failure: true
- run: npm ci
- run: npm run bootstrap
- run: npm run lint
- name: Run JS Tests
run: npm test
- name: Run Rust Tests
run: cargo test --all


@@ -1,50 +0,0 @@
name: Claude Code
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
issues:
types: [opened, assigned]
pull_request_review:
types: [submitted]
jobs:
claude:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
(github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
issues: read
id-token: write
actions: read # Required for Claude to read CI results on PRs
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Run Claude Code
id: claude
uses: anthropics/claude-code-action@v1
with:
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
# This is an optional setting that allows Claude to read CI results on PRs
additional_permissions: |
actions: read
# Optional: Give a custom prompt to Claude. If this is not specified, Claude will perform the instructions specified in the comment that tagged it.
# prompt: 'Update the pull request description to include a summary of changes.'
# Optional: Add claude_args to customize behavior and configuration
# See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
# or https://code.claude.com/docs/en/cli-reference for available options
# claude_args: '--allowed-tools Bash(gh pr:*)'


@@ -1,52 +0,0 @@
name: Update Flathub
on:
release:
types: [published]
permissions:
contents: read
jobs:
update-flathub:
name: Update Flathub manifest
runs-on: ubuntu-latest
# Only run for stable releases (skip betas/pre-releases)
if: ${{ !github.event.release.prerelease }}
steps:
- name: Checkout app repo
uses: actions/checkout@v4
- name: Checkout Flathub repo
uses: actions/checkout@v4
with:
repository: flathub/app.yaak.Yaak
token: ${{ secrets.FLATHUB_TOKEN }}
path: flathub-repo
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.12"
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: "22"
- name: Install source generators
run: |
pip install flatpak-node-generator tomlkit aiohttp
git clone --depth 1 https://github.com/flatpak/flatpak-builder-tools flatpak/flatpak-builder-tools
- name: Run update-manifest.sh
run: bash flatpak/update-manifest.sh "${{ github.event.release.tag_name }}" flathub-repo
- name: Commit and push to Flathub
working-directory: flathub-repo
run: |
git config user.name "github-actions[bot]"
git config user.email "github-actions[bot]@users.noreply.github.com"
git add -A
git diff --cached --quiet && echo "No changes to commit" && exit 0
git commit -m "Update to ${{ github.event.release.tag_name }}"
git push


@@ -1,7 +1,10 @@
name: Generate Artifacts
on:
push:
tags: [v*]
tags: [ v* ]
env:
YAAK_PLUGINS_DIR: checkout/plugins
jobs:
build-artifacts:
@@ -13,37 +16,18 @@ jobs:
fail-fast: false
matrix:
include:
- platform: "macos-latest" # for Arm-based Macs (M1 and above).
args: "--target aarch64-apple-darwin"
yaak_arch: "arm64"
os: "macos"
targets: "aarch64-apple-darwin"
- platform: "macos-latest" # for Intel-based Macs.
args: "--target x86_64-apple-darwin"
yaak_arch: "x64"
os: "macos"
targets: "x86_64-apple-darwin"
- platform: "ubuntu-22.04"
args: ""
yaak_arch: "x64"
os: "ubuntu"
targets: ""
- platform: "ubuntu-22.04-arm"
args: ""
yaak_arch: "arm64"
os: "ubuntu"
targets: ""
- platform: "windows-latest"
args: ""
yaak_arch: "x64"
os: "windows"
targets: ""
# Windows ARM64
- platform: "windows-latest"
args: "--target aarch64-pc-windows-msvc"
yaak_arch: "arm64"
os: "windows"
targets: "aarch64-pc-windows-msvc"
- platform: 'macos-latest' # for Arm-based Macs (M1 and above).
args: '--target aarch64-apple-darwin'
yaak_arch: 'arm64'
- platform: 'macos-latest' # for Intel-based Macs.
args: '--target x86_64-apple-darwin'
yaak_arch: 'x64'
- platform: 'ubuntu-22.04'
args: ''
yaak_arch: 'x64'
- platform: 'windows-latest'
args: ''
yaak_arch: 'x64'
runs-on: ${{ matrix.platform }}
timeout-minutes: 40
steps:
@@ -52,81 +36,67 @@ jobs:
- name: Setup Node
uses: actions/setup-node@v4
with:
node-version: 22
- name: install Rust stable
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.targets }}
# Those targets are only used on macos runners so it's in an `if` to slightly speed up windows and linux builds.
targets: ${{ matrix.platform == 'macos-latest' && 'aarch64-apple-darwin,x86_64-apple-darwin' || '' }}
- uses: Swatinem/rust-cache@v2
- uses: actions/cache@v3
continue-on-error: false
with:
shared-key: ci
cache-on-failure: true
path: |
~/.cargo/bin/
~/.cargo/registry/index/
~/.cargo/registry/cache/
~/.cargo/git/db/
src-tauri/target/
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
restore-keys: ${{ runner.os }}-cargo-
- name: install dependencies (Linux only)
if: matrix.os == 'ubuntu'
- name: install dependencies (ubuntu only)
if: matrix.platform == 'ubuntu-22.04' # This must match the platform value defined above.
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf
- name: install dependencies (windows only)
if: matrix.platform == 'windows-latest'
run: cargo install --force trusted-signing-cli
- name: Install NPM Dependencies
run: |
npm ci
npm install @yaakapp/cli
- name: Install Protoc for plugin-runtime
uses: arduino/setup-protoc@v3
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
- name: Install trusted-signing-cli (Windows only)
if: matrix.os == 'windows'
shell: pwsh
run: |
$ErrorActionPreference = 'Stop'
$dir = "$env:USERPROFILE\trusted-signing"
New-Item -ItemType Directory -Force -Path $dir | Out-Null
$url = "https://github.com/Levminer/trusted-signing-cli/releases/download/0.8.0/trusted-signing-cli.exe"
$exe = Join-Path $dir "trusted-signing-cli.exe"
Invoke-WebRequest -Uri $url -OutFile $exe
echo $dir >> $env:GITHUB_PATH
& $exe --version
- name: Run JS build
run: npm run build
- run: npm ci
- run: npm run bootstrap
env:
YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
- run: npm run lint
- name: Run JS Tests
run: npm test
- name: Run Rust Tests
run: cargo test --all
- name: Run lint
run: npm run lint
- name: Checkout yaakapp/plugins
uses: actions/checkout@v4
with:
repository: yaakapp/plugins
path: ${{ env.YAAK_PLUGINS_DIR }}
- name: Set version
run: npm run replace-version
env:
YAAK_VERSION: ${{ github.ref_name }}
- name: Sign vendored binaries (macOS only)
if: matrix.os == 'macos'
env:
APPLE_CERTIFICATE: ${{ secrets.APPLE_CERTIFICATE }}
APPLE_CERTIFICATE_PASSWORD: ${{ secrets.APPLE_CERTIFICATE_PASSWORD }}
APPLE_SIGNING_IDENTITY: ${{ secrets.APPLE_SIGNING_IDENTITY }}
KEYCHAIN_PASSWORD: ${{ secrets.KEYCHAIN_PASSWORD }}
run: |
# Create keychain
KEYCHAIN_PATH=$RUNNER_TEMP/app-signing.keychain-db
security create-keychain -p "$KEYCHAIN_PASSWORD" $KEYCHAIN_PATH
security set-keychain-settings -lut 21600 $KEYCHAIN_PATH
security unlock-keychain -p "$KEYCHAIN_PASSWORD" $KEYCHAIN_PATH
# Import certificate
echo "$APPLE_CERTIFICATE" | base64 --decode > certificate.p12
security import certificate.p12 -P "$APPLE_CERTIFICATE_PASSWORD" -A -t cert -f pkcs12 -k $KEYCHAIN_PATH
security list-keychain -d user -s $KEYCHAIN_PATH
# Sign vendored binaries with hardened runtime and their specific entitlements
codesign --force --options runtime --entitlements crates-tauri/yaak-app/macos/entitlements.yaakprotoc.plist --sign "$APPLE_SIGNING_IDENTITY" crates-tauri/yaak-app/vendored/protoc/yaakprotoc || true
codesign --force --options runtime --entitlements crates-tauri/yaak-app/macos/entitlements.yaaknode.plist --sign "$APPLE_SIGNING_IDENTITY" crates-tauri/yaak-app/vendored/node/yaaknode || true
- uses: tauri-apps/tauri-action@v0
env:
YAAK_PLUGINS_DIR: ${{ env.YAAK_PLUGINS_DIR }}
YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
ENABLE_CODE_SIGNING: ${{ secrets.APPLE_CERTIFICATE }}
@@ -135,21 +105,21 @@ jobs:
TAURI_SIGNING_PRIVATE_KEY_PASSWORD: ${{ secrets.TAURI_KEY_PASSWORD }}
# Apple signing stuff
APPLE_CERTIFICATE: ${{ matrix.os == 'macos' && secrets.APPLE_CERTIFICATE }}
APPLE_CERTIFICATE_PASSWORD: ${{ matrix.os == 'macos' && secrets.APPLE_CERTIFICATE_PASSWORD }}
APPLE_ID: ${{ matrix.os == 'macos' && secrets.APPLE_ID }}
APPLE_PASSWORD: ${{ matrix.os == 'macos' && secrets.APPLE_PASSWORD }}
APPLE_SIGNING_IDENTITY: ${{ matrix.os == 'macos' && secrets.APPLE_SIGNING_IDENTITY }}
APPLE_TEAM_ID: ${{ matrix.os == 'macos' && secrets.APPLE_TEAM_ID }}
APPLE_CERTIFICATE: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_CERTIFICATE }}
APPLE_CERTIFICATE_PASSWORD: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_CERTIFICATE_PASSWORD }}
APPLE_ID: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_ID }}
APPLE_PASSWORD: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_PASSWORD }}
APPLE_SIGNING_IDENTITY: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_SIGNING_IDENTITY }}
APPLE_TEAM_ID: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_TEAM_ID }}
# Windows signing stuff
AZURE_CLIENT_ID: ${{ matrix.os == 'windows' && secrets.AZURE_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ matrix.os == 'windows' && secrets.AZURE_CLIENT_SECRET }}
AZURE_TENANT_ID: ${{ matrix.os == 'windows' && secrets.AZURE_TENANT_ID }}
AZURE_CLIENT_ID: ${{ matrix.platform == 'windows-latest' && secrets.AZURE_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ matrix.platform == 'windows-latest' && secrets.AZURE_CLIENT_SECRET }}
AZURE_TENANT_ID: ${{ matrix.platform == 'windows-latest' && secrets.AZURE_TENANT_ID }}
with:
tagName: "v__VERSION__"
releaseName: "Release __VERSION__"
releaseBody: "[Changelog __VERSION__](https://yaak.app/blog/__VERSION__)"
tagName: 'v__VERSION__'
releaseName: 'Release __VERSION__'
releaseBody: '[Changelog __VERSION__](https://yaak.app/blog/__VERSION__)'
releaseDraft: true
prerelease: true
args: "${{ matrix.args }} --config ./crates-tauri/yaak-app/tauri.release.conf.json"
prerelease: false
args: ${{ matrix.args }}


@@ -1,44 +0,0 @@
name: Generate Sponsors README
on:
workflow_dispatch:
schedule:
- cron: 30 15 * * 0-6
permissions:
contents: write
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- name: Checkout 🛎️
uses: actions/checkout@v2
- name: Generate Sponsors
uses: JamesIves/github-sponsors-readme-action@v1
with:
token: ${{ secrets.SPONSORS_PAT }}
file: 'README.md'
maximum: 1999
template: '<a href="https://github.com/{{{ login }}}"><img src="{{{ avatarUrl }}}" width="50px" alt="User avatar: {{{ login }}}" /></a>&nbsp;&nbsp;'
active-only: false
include-private: true
marker: 'sponsors-base'
- name: Generate Sponsors
uses: JamesIves/github-sponsors-readme-action@v1
with:
token: ${{ secrets.SPONSORS_PAT }}
file: 'README.md'
minimum: 2000
template: '<a href="https://github.com/{{{ login }}}"><img src="{{{ avatarUrl }}}" width="80px" alt="User avatar: {{{ login }}}" /></a>&nbsp;&nbsp;'
active-only: false
include-private: true
marker: 'sponsors-premium'
# ⚠️ Note: You can use any deployment step here to automatically push the README
# changes back to your branch.
- name: Commit Changes
uses: JamesIves/github-pages-deploy-action@v4
with:
branch: main
force: false
folder: '.'

.gitignore vendored

@@ -15,8 +15,6 @@ dist-ssr
# Editor directories and files
.vscode/*
!.vscode/extensions.json
!.vscode/settings.json
!.vscode/launch.json
.idea
.DS_Store
*.suo
@@ -25,7 +23,6 @@ dist-ssr
*.sln
*.sw?
.eslintcache
out
*.sqlite
*.sqlite-*
@@ -34,23 +31,3 @@ out
.tmp
tmp
.zed
codebook.toml
target
# Per-worktree Tauri config (generated by post-checkout hook)
crates-tauri/yaak-app/tauri.worktree.conf.json
# Tauri auto-generated permission files
**/permissions/autogenerated
**/permissions/schemas
# Flatpak build artifacts
flatpak-repo/
.flatpak-builder/
flatpak/flatpak-builder-tools/
flatpak/cargo-sources.json
flatpak/node-sources.json
# Local Codex desktop env state
.codex/environments/environment.toml


@@ -1 +0,0 @@
node scripts/git-hooks/post-checkout.mjs "$@"

.prettierignore Normal file

@@ -0,0 +1,4 @@
node_modules/
dist/
out/
.prettierrc.cjs

.prettierrc.js Normal file

@@ -0,0 +1,8 @@
export default {
"trailingComma": "all",
"tabWidth": 2,
"semi": true,
"singleQuote": true,
"printWidth": 100,
"bracketSpacing": true
}


@@ -1,3 +0,0 @@
{
"recommendations": ["biomejs.biome", "rust-lang.rust-analyzer", "bradlc.vscode-tailwindcss"]
}

.vscode/launch.json vendored

@@ -1,26 +0,0 @@
{
"version": "0.2.0",
"configurations": [
{
"type": "node",
"request": "launch",
"name": "Dev App",
"runtimeExecutable": "npm",
"runtimeArgs": ["run", "start"]
},
{
"type": "node",
"request": "launch",
"name": "Build App",
"runtimeExecutable": "npm",
"runtimeArgs": ["run", "start"]
},
{
"type": "node",
"request": "launch",
"name": "Bootstrap",
"runtimeExecutable": "npm",
"runtimeArgs": ["run", "bootstrap"]
}
]
}


@@ -1,6 +0,0 @@
{
"editor.defaultFormatter": "biomejs.biome",
"editor.formatOnSave": true,
"biome.enabled": true,
"biome.lint.format.enable": true
}


@@ -1,73 +0,0 @@
[workspace]
resolver = "2"
members = [
"crates/yaak",
# Shared crates (no Tauri dependency)
"crates/yaak-core",
"crates/yaak-common",
"crates/yaak-crypto",
"crates/yaak-git",
"crates/yaak-grpc",
"crates/yaak-http",
"crates/yaak-models",
"crates/yaak-plugins",
"crates/yaak-sse",
"crates/yaak-sync",
"crates/yaak-templates",
"crates/yaak-tls",
"crates/yaak-ws",
"crates/yaak-api",
# CLI crates
"crates-cli/yaak-cli",
# Tauri-specific crates
"crates-tauri/yaak-app",
"crates-tauri/yaak-fonts",
"crates-tauri/yaak-license",
"crates-tauri/yaak-mac-window",
"crates-tauri/yaak-tauri-utils",
]
[workspace.dependencies]
chrono = "0.4.42"
hex = "0.4.3"
keyring = "3.6.3"
log = "0.4.29"
reqwest = "0.12.20"
rustls = { version = "0.23.34", default-features = false }
rustls-platform-verifier = "0.6.2"
serde = "1.0.228"
serde_json = "1.0.145"
sha2 = "0.10.9"
tauri = "2.9.5"
tauri-plugin = "2.5.2"
tauri-plugin-dialog = "2.4.2"
tauri-plugin-shell = "2.3.3"
thiserror = "2.0.17"
tokio = "1.48.0"
ts-rs = "11.1.0"
# Internal crates - shared
yaak-core = { path = "crates/yaak-core" }
yaak = { path = "crates/yaak" }
yaak-common = { path = "crates/yaak-common" }
yaak-crypto = { path = "crates/yaak-crypto" }
yaak-git = { path = "crates/yaak-git" }
yaak-grpc = { path = "crates/yaak-grpc" }
yaak-http = { path = "crates/yaak-http" }
yaak-models = { path = "crates/yaak-models" }
yaak-plugins = { path = "crates/yaak-plugins" }
yaak-sse = { path = "crates/yaak-sse" }
yaak-sync = { path = "crates/yaak-sync" }
yaak-templates = { path = "crates/yaak-templates" }
yaak-tls = { path = "crates/yaak-tls" }
yaak-ws = { path = "crates/yaak-ws" }
yaak-api = { path = "crates/yaak-api" }
# Internal crates - Tauri-specific
yaak-fonts = { path = "crates-tauri/yaak-fonts" }
yaak-license = { path = "crates-tauri/yaak-license" }
yaak-mac-window = { path = "crates-tauri/yaak-mac-window" }
yaak-tauri-utils = { path = "crates-tauri/yaak-tauri-utils" }
[profile.release]
strip = false


@@ -34,6 +34,8 @@ Run the `bootstrap` command to do some initial setup:
npm run bootstrap
```
_NOTE: Run with `YAAK_PLUGINS_DIR=<Path to yaakapp/plugins>` to re-build bundled plugins_
## Run the App
After bootstrapping, start the app in development mode:
@@ -42,47 +44,19 @@ After bootstrapping, start the app in development mode:
npm start
```
_NOTE: If working on bundled plugins, run with `YAAK_PLUGINS_DIR=<Path to yaakapp/plugins>`_
## SQLite Migrations
New migrations can be created from the `src-tauri/` directory:
```shell
npm run migration
cd src-tauri
sqlx migrate add migration-name
```
Rerun the app to apply the migrations.
Run the app to apply the migrations.
_Note: For safety, development builds use a separate database location from production builds._
If nothing happens, try `cargo clean` and run the app again.
## Lezer Grammar Generation
```sh
# Example
lezer-generator components/core/Editor/<LANG>/<LANG>.grammar > components/core/Editor/<LANG>/<LANG>.ts
```
## Linting & Formatting
This repo uses Biome for linting and formatting (replacing ESLint + Prettier).
- Lint the entire repo:
```sh
npm run lint
```
- Auto-fix lint issues where possible:
```sh
npm run lint:fix
```
- Format code:
```sh
npm run format
```
Notes:
- Many workspace packages also expose the same scripts (`lint`, `lint:fix`, and `format`).
- TypeScript type-checking still runs separately via `tsc --noEmit` in relevant packages.
_Note: Development builds use a separate database location from production builds._


@@ -1,70 +1,26 @@
<p align="center">
<a href="https://github.com/JamesIves/github-sponsors-readme-action">
<img width="200px" src="https://github.com/mountain-loop/yaak/raw/main/crates-tauri/yaak-app/icons/icon.png">
</a>
</p>
# Yaak API Client
<h1 align="center">
💫 Yaak ➟ Desktop API Client 💫
</h1>
Yaak is a desktop API client for organizing and executing REST, GraphQL, and gRPC
requests. It's built using [Tauri](https://tauri.app), Rust, and ReactJS.
<p align="center">
A fast, privacy-first API client for REST, GraphQL, SSE, WebSocket, and gRPC built with Tauri, Rust, and React.
</p>
<p align="center">
Development is funded by community-purchased <a href="https://yaak.app/pricing">licenses</a>. You can also <a href="https://github.com/sponsors/gschier">become a sponsor</a> to have your logo appear below. 💖
</p>
<br>
![screenshot](https://github.com/user-attachments/assets/f18e963f-0b68-4ecb-b8b8-cb71aa9aec02)
## Feedback and Bug Reports
All feedback, bug reports, questions, and feature requests should be reported to
[feedback.yaak.app](https://feedback.yaak.app). Issues will be duplicated
in this repository if applicable.
<p align="center">
<!-- sponsors-premium --><a href="https://github.com/MVST-Solutions"><img src="https:&#x2F;&#x2F;github.com&#x2F;MVST-Solutions.png" width="80px" alt="User avatar: MVST-Solutions" /></a>&nbsp;&nbsp;<a href="https://github.com/dharsanb"><img src="https:&#x2F;&#x2F;github.com&#x2F;dharsanb.png" width="80px" alt="User avatar: dharsanb" /></a>&nbsp;&nbsp;<a href="https://github.com/railwayapp"><img src="https:&#x2F;&#x2F;github.com&#x2F;railwayapp.png" width="80px" alt="User avatar: railwayapp" /></a>&nbsp;&nbsp;<a href="https://github.com/caseyamcl"><img src="https:&#x2F;&#x2F;github.com&#x2F;caseyamcl.png" width="80px" alt="User avatar: caseyamcl" /></a>&nbsp;&nbsp;<a href="https://github.com/bytebase"><img src="https:&#x2F;&#x2F;github.com&#x2F;bytebase.png" width="80px" alt="User avatar: bytebase" /></a>&nbsp;&nbsp;<a href="https://github.com/"><img src="https:&#x2F;&#x2F;raw.githubusercontent.com&#x2F;JamesIves&#x2F;github-sponsors-readme-action&#x2F;dev&#x2F;.github&#x2F;assets&#x2F;placeholder.png" width="80px" alt="User avatar: " /></a>&nbsp;&nbsp;<!-- sponsors-premium -->
</p>
<p align="center">
<!-- sponsors-base --><a href="https://github.com/seanwash"><img src="https:&#x2F;&#x2F;github.com&#x2F;seanwash.png" width="50px" alt="User avatar: seanwash" /></a>&nbsp;&nbsp;<a href="https://github.com/jerath"><img src="https:&#x2F;&#x2F;github.com&#x2F;jerath.png" width="50px" alt="User avatar: jerath" /></a>&nbsp;&nbsp;<a href="https://github.com/itsa-sh"><img src="https:&#x2F;&#x2F;github.com&#x2F;itsa-sh.png" width="50px" alt="User avatar: itsa-sh" /></a>&nbsp;&nbsp;<a href="https://github.com/dmmulroy"><img src="https:&#x2F;&#x2F;github.com&#x2F;dmmulroy.png" width="50px" alt="User avatar: dmmulroy" /></a>&nbsp;&nbsp;<a href="https://github.com/timcole"><img src="https:&#x2F;&#x2F;github.com&#x2F;timcole.png" width="50px" alt="User avatar: timcole" /></a>&nbsp;&nbsp;<a href="https://github.com/VLZH"><img src="https:&#x2F;&#x2F;github.com&#x2F;VLZH.png" width="50px" alt="User avatar: VLZH" /></a>&nbsp;&nbsp;<a href="https://github.com/terasaka2k"><img src="https:&#x2F;&#x2F;github.com&#x2F;terasaka2k.png" width="50px" alt="User avatar: terasaka2k" /></a>&nbsp;&nbsp;<a href="https://github.com/andriyor"><img src="https:&#x2F;&#x2F;github.com&#x2F;andriyor.png" width="50px" alt="User avatar: andriyor" /></a>&nbsp;&nbsp;<a href="https://github.com/majudhu"><img src="https:&#x2F;&#x2F;github.com&#x2F;majudhu.png" width="50px" alt="User avatar: majudhu" /></a>&nbsp;&nbsp;<a href="https://github.com/axelrindle"><img src="https:&#x2F;&#x2F;github.com&#x2F;axelrindle.png" width="50px" alt="User avatar: axelrindle" /></a>&nbsp;&nbsp;<a href="https://github.com/jirizverina"><img src="https:&#x2F;&#x2F;github.com&#x2F;jirizverina.png" width="50px" alt="User avatar: jirizverina" /></a>&nbsp;&nbsp;<a href="https://github.com/chip-well"><img src="https:&#x2F;&#x2F;github.com&#x2F;chip-well.png" width="50px" alt="User avatar: chip-well" /></a>&nbsp;&nbsp;<a href="https://github.com/GRAYAH"><img src="https:&#x2F;&#x2F;github.com&#x2F;GRAYAH.png" width="50px" alt="User avatar: GRAYAH" /></a>&nbsp;&nbsp;<a href="https://github.com/flashblaze"><img src="https:&#x2F;&#x2F;github.com&#x2F;flashblaze.png" width="50px" alt="User avatar: flashblaze" /></a>&nbsp;&nbsp;<!-- sponsors-base -->
</p>
![Yaak API Client](https://yaak.app/static/screenshot.png)
## Features
Yaak is an offline-first API client designed to stay out of your way while giving you everything you need when you need it.
Built with [Tauri](https://tauri.app), Rust, and React, it's fast, lightweight, and private. No telemetry, no VC funding, and no cloud lock-in.
### 🌐 Work with any API
- Import collections from Postman, Insomnia, OpenAPI, Swagger, or Curl.
- Send requests via REST, GraphQL, gRPC, WebSocket, or Server-Sent Events.
- Filter and inspect responses with JSONPath or XPath.
### 🔐 Stay secure
- Use OAuth 2.0, JWT, Basic Auth, or custom plugins for authentication.
- Secure sensitive values with encrypted secrets.
- Store secrets in your OS keychain.
### ☁️ Organize & collaborate
- Group requests into workspaces and nested folders.
- Use environment variables to switch between dev, staging, and prod.
- Mirror workspaces to your filesystem for versioning in Git or syncing with Dropbox.
### 🧩 Extend & customize
- Insert dynamic values like UUIDs or timestamps with template tags.
- Pick from built-in themes or build your own.
- Create plugins to extend authentication, template tags, or the UI.
## Community Projects
- [`yaak2postman`](https://github.com/BiteCraft/yaak2postman) CLI for converting Yaak data
exports to Postman-compatible collections
## Contribution Policy
Yaak is open source but only accepting contributions for bug fixes. To get started,
visit [`DEVELOPMENT.md`](DEVELOPMENT.md) for tips on setting up your environment.
Yaak is open source, but only accepting contributions for bug fixes. See the
[`good first issue`](https://github.com/yaakapp/app/labels/good%20first%20issue) label for
issues that are more approachable for contribution.
## Useful Resources
- [Feedback and Bug Reports](https://feedback.yaak.app)
- [Documentation](https://yaak.app/docs)
- [Yaak vs Postman](https://yaak.app/alternatives/postman)
- [Yaak vs Bruno](https://yaak.app/alternatives/bruno)
- [Yaak vs Insomnia](https://yaak.app/alternatives/insomnia)
To get started, visit [`DEVELOPMENT.md`](DEVELOPMENT.md) for tips on setting up your
environment.


@@ -1,54 +0,0 @@
{
"$schema": "https://biomejs.dev/schemas/2.3.11/schema.json",
"linter": {
"enabled": true,
"rules": {
"recommended": true,
"a11y": {
"useKeyWithClickEvents": "off"
}
}
},
"formatter": {
"enabled": true,
"indentStyle": "space",
"indentWidth": 2,
"lineWidth": 100,
"bracketSpacing": true
},
"css": {
"parser": {
"tailwindDirectives": true
},
"linter": {
"enabled": false
}
},
"javascript": {
"formatter": {
"quoteStyle": "single",
"jsxQuoteStyle": "double",
"trailingCommas": "all",
"semicolons": "always"
}
},
"files": {
"includes": [
"**",
"!**/node_modules",
"!**/dist",
"!**/build",
"!target",
"!scripts",
"!crates",
"!crates-tauri",
"!src-web/tailwind.config.cjs",
"!src-web/postcss.config.cjs",
"!src-web/vite.config.ts",
"!src-web/routeTree.gen.ts",
"!packages/plugin-runtime-types/lib",
"!**/bindings",
"!flatpak"
]
}
}


@@ -1,29 +0,0 @@
[package]
name = "yaak-cli"
version = "0.1.0"
edition = "2024"
publish = false
[[bin]]
name = "yaakcli"
path = "src/main.rs"
[dependencies]
clap = { version = "4", features = ["derive"] }
dirs = "6"
env_logger = "0.11"
log = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
tokio = { workspace = true, features = ["rt-multi-thread", "macros"] }
yaak = { workspace = true }
yaak-crypto = { workspace = true }
yaak-http = { workspace = true }
yaak-models = { workspace = true }
yaak-plugins = { workspace = true }
yaak-templates = { workspace = true }
[dev-dependencies]
assert_cmd = "2"
predicates = "3"
tempfile = "3"


@@ -1,340 +0,0 @@
# CLI Command Architecture Plan
## Goal
Redesign the yaak-cli command structure to use a resource-oriented `<resource> <action>`
pattern that scales well, is discoverable, and supports both human and LLM workflows.
## Status Snapshot
Current branch state:
- Modular CLI structure with command modules and shared `CliContext`
- Resource/action hierarchy in place for:
- `workspace list|show|create|update|delete`
- `request list|show|create|update|send|delete`
- `folder list|show|create|update|delete`
- `environment list|show|create|update|delete`
- Top-level `send` exists as a request-send shortcut (not yet flexible request/folder/workspace resolution)
- Legacy `get` command removed
- JSON create/update flow implemented (`--json` and positional JSON shorthand)
- No `request schema` command yet
Progress checklist:
- [x] Phase 1 complete
- [x] Phase 2 complete
- [x] Phase 3 complete
- [ ] Phase 4 complete
- [ ] Phase 5 complete
- [ ] Phase 6 complete
## Command Architecture
### Design Principles
- **Resource-oriented**: top-level commands are nouns, subcommands are verbs
- **Polymorphic requests**: `request` covers HTTP, gRPC, and WebSocket — the CLI
resolves the type via `get_any_request` and adapts behavior accordingly
- **Simple creation, full-fidelity via JSON**: human-friendly flags for basic creation,
`--json` for full control (targeted at LLM and scripting workflows)
- **Runtime schema introspection**: `request schema` outputs JSON Schema for the request
models, with dynamic auth fields populated from loaded plugins at runtime
- **Destructive actions require confirmation**: `delete` commands prompt for user
confirmation before proceeding. Can be bypassed with `--yes` / `-y` for scripting
### Commands
```
# Top-level shortcut
yaakcli send <id> [-e <env_id>] # id can be a request, folder, or workspace
# Resource commands
yaakcli workspace list
yaakcli workspace show <id>
yaakcli workspace create --name <name>
yaakcli workspace create --json '{"name": "My Workspace"}'
yaakcli workspace create '{"name": "My Workspace"}' # positional JSON shorthand
yaakcli workspace update --json '{"id": "wk_abc", "name": "New Name"}'
yaakcli workspace delete <id>
yaakcli request list <workspace_id>
yaakcli request show <id>
yaakcli request create <workspace_id> --name <name> --url <url> [--method GET]
yaakcli request create --json '{"workspaceId": "wk_abc", "url": "..."}'
yaakcli request update --json '{"id": "rq_abc", "url": "https://new.com"}'
yaakcli request send <id> [-e <env_id>]
yaakcli request delete <id>
yaakcli request schema <http|grpc|websocket>
yaakcli folder list <workspace_id>
yaakcli folder show <id>
yaakcli folder create <workspace_id> --name <name>
yaakcli folder create --json '{"workspaceId": "wk_abc", "name": "Auth"}'
yaakcli folder update --json '{"id": "fl_abc", "name": "New Name"}'
yaakcli folder delete <id>
yaakcli environment list <workspace_id>
yaakcli environment show <id>
yaakcli environment create <workspace_id> --name <name>
yaakcli environment create --json '{"workspaceId": "wk_abc", "name": "Production"}'
yaakcli environment update --json '{"id": "ev_abc", ...}'
yaakcli environment delete <id>
```
### `send` — Top-Level Shortcut
`yaakcli send <id>` is a convenience alias that accepts any sendable ID. It tries
each type in order via DB lookups (short-circuiting on first match):
1. Request (HTTP, gRPC, or WebSocket via `get_any_request`)
2. Folder (sends all requests in the folder)
3. Workspace (sends all requests in the workspace)
ID prefixes exist (e.g. `rq_`, `fl_`, `wk_`) but are not relied upon — resolution
is purely by DB lookup.
`request send <id>` is the same but restricted to request IDs only.
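
The resolution might look roughly like this. Only `get_any_request` comes from this plan; the folder/workspace lookup helpers and types are placeholder names:

```rust
// Sketch only: short-circuit on the first matching record, in the documented order.
enum SendTarget {
    Request(AnyRequest),
    Folder(Folder),
    Workspace(Workspace),
}

fn resolve_send_target(db: &Db, id: &str) -> Option<SendTarget> {
    if let Some(r) = db.get_any_request(id) {
        return Some(SendTarget::Request(r));
    }
    if let Some(f) = db.get_folder(id) {
        return Some(SendTarget::Folder(f));
    }
    db.get_workspace(id).map(SendTarget::Workspace)
}
```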
### Request Send — Polymorphic Behavior
`send` means "execute this request" regardless of protocol:
- **HTTP**: send request, print response, exit
- **gRPC**: invoke the method; for streaming, stream output to stdout until done/Ctrl+C
- **WebSocket**: connect, stream messages to stdout until closed/Ctrl+C
### `request schema` — Runtime JSON Schema
Outputs a JSON Schema describing the full request shape, including dynamic fields:
1. Generate base schema from `schemars::JsonSchema` derive on the Rust model structs
2. Load plugins, collect auth strategy definitions and their form inputs
3. Merge plugin-defined auth fields into the `authentication` property as a `oneOf`
4. Output the combined schema as JSON
This lets an LLM call `schema`, read the shape, and construct valid JSON for
`create --json` or `update --json`.
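
A rough sketch of the base-schema step using `schemars`; the plugin-auth merge is elided, and the struct below is only a stand-in for the real request models:

```rust
use schemars::{schema_for, JsonSchema};

// Placeholder model: the plan is to derive `JsonSchema` on the real `HttpRequest`,
// `GrpcRequest`, and `WebsocketRequest` structs. This just shows the flow.
#[derive(JsonSchema)]
struct HttpRequestSketch {
    url: String,
    method: String,
    // The real schema would have plugin-defined auth fields merged in as a `oneOf`.
    authentication_type: Option<String>,
}

fn print_schema() -> serde_json::Result<()> {
    let schema = schema_for!(HttpRequestSketch);
    println!("{}", serde_json::to_string_pretty(&schema)?);
    Ok(())
}
```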
## Implementation Steps
### Phase 1: Restructure commands (no new functionality)
Refactor `main.rs` into the new resource/action pattern using clap subcommand nesting.
Existing behavior stays the same, just reorganized. Remove the `get` command.
1. Create module structure: `commands/workspace.rs`, `commands/request.rs`, etc.
2. Define nested clap enums:
```rust
enum Commands {
Send(SendArgs),
Workspace(WorkspaceArgs),
Request(RequestArgs),
Folder(FolderArgs),
Environment(EnvironmentArgs),
}
```
3. Move existing `Workspaces` logic into `workspace list`
4. Move existing `Requests` logic into `request list`
5. Move existing `Send` logic into `request send`
6. Move existing `Create` logic into `request create`
7. Delete the `Get` command entirely
8. Extract shared setup (DB init, plugin init, encryption) into a reusable context struct (see the sketch after this list)
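
Step 8 might produce something shaped like this; all field types are placeholders, only the `CliContext` name is taken from the branch:

```rust
// Hypothetical shared context passed to every command handler.
pub struct CliContext {
    pub db: DbHandle,                // from yaak_models::init_standalone() (assumed type)
    pub plugins: PluginRuntime,      // plugin manager handle (assumed type)
    pub crypto: EncryptionManager,   // encryption manager (assumed type)
    pub environment: Option<String>, // -e/--environment flag, if provided
    pub verbose: bool,               // -v/--verbose flag
}
```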
### Phase 2: Add missing CRUD commands
Status: complete
1. `workspace show <id>`
2. `workspace create --name <name>` (and `--json`)
3. `workspace update --json`
4. `workspace delete <id>`
5. `request show <id>` (JSON output of the full request model)
6. `request delete <id>`
7. `folder list <workspace_id>`
8. `folder show <id>`
9. `folder create <workspace_id> --name <name>` (and `--json`)
10. `folder update --json`
11. `folder delete <id>`
12. `environment list <workspace_id>`
13. `environment show <id>`
14. `environment create <workspace_id> --name <name>` (and `--json`)
15. `environment update --json`
16. `environment delete <id>`
### Phase 3: JSON input for create/update
Both commands accept JSON via `--json <string>` or as a positional argument (detected
by leading `{`). They follow the same upsert pattern as the plugin API.
- **`create --json`**: JSON must include `workspaceId`. Must NOT include `id` (or
use empty string `""`). Deserializes into the model with defaults for missing fields,
then upserts (insert).
- **`update --json`**: JSON must include `id`. Performs a fetch-merge-upsert:
1. Fetch the existing model from DB
2. Serialize it to `serde_json::Value`
3. Deep-merge the user's partial JSON on top (JSON Merge Patch / RFC 7386 semantics)
4. Deserialize back into the typed model
5. Upsert (update)
This matches how the MCP server plugin already does it (fetch existing, spread, override),
but the CLI handles the merge server-side so callers don't have to.
Setting a field to `null` removes it (for `Option<T>` fields), per RFC 7386.
Implementation:
1. Add `--json` flag and positional JSON detection to `create` commands
2. Add `update` commands with required `--json` flag
3. Implement a JSON merge utility (or use the `json-patch` crate; see the sketch after this list)
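
A minimal sketch of the merge step using the `json-patch` crate's RFC 7386 merge; the surrounding fetch and upsert calls are omitted:

```rust
use serde::{de::DeserializeOwned, Serialize};
use serde_json::Value;

// Merge a partial JSON payload onto an existing model and return the updated model.
// `null` values in `partial` clear the corresponding optional fields (RFC 7386).
fn merge_update<T: Serialize + DeserializeOwned>(
    existing: &T,
    partial: &Value,
) -> serde_json::Result<T> {
    let mut doc = serde_json::to_value(existing)?;
    json_patch::merge(&mut doc, partial);
    serde_json::from_value(doc)
}
```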
### Phase 4: Runtime schema generation
1. Add `schemars` dependency to `yaak-models`
2. Derive `JsonSchema` on `HttpRequest`, `GrpcRequest`, `WebsocketRequest`, and their
nested types (`HttpRequestHeader`, `HttpUrlParameter`, etc.)
3. Implement `request schema` command:
- Generate base schema from schemars
- Query plugins for auth strategy form inputs
- Convert plugin form inputs into JSON Schema properties
- Merge into the `authentication` field
- Print to stdout
### Phase 5: Polymorphic send
1. Update `request send` to use `get_any_request` to resolve the request type
2. Match on `AnyRequest` variant and dispatch to the appropriate sender:
- `AnyRequest::HttpRequest` — existing HTTP send logic
- `AnyRequest::GrpcRequest` — gRPC invoke (future implementation)
- `AnyRequest::WebsocketRequest` — WebSocket connect (future implementation)
3. gRPC and WebSocket send can initially return "not yet implemented" errors (see the sketch after this list)
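
The dispatch could look roughly like this; the exact variants and payloads of `AnyRequest` are assumed, and only the HTTP path is wired up initially:

```rust
// Sketch only: `AnyRequest` is referenced in this plan, but its shape is assumed here.
async fn send_any(ctx: &CliContext, req: AnyRequest) -> Result<(), Box<dyn std::error::Error>> {
    match req {
        AnyRequest::HttpRequest(r) => send_http(ctx, r).await, // existing HTTP send path
        AnyRequest::GrpcRequest(_) => Err("gRPC send is not yet implemented".into()),
        AnyRequest::WebsocketRequest(_) => Err("WebSocket send is not yet implemented".into()),
    }
}
```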
### Phase 6: Top-level `send` and folder/workspace send
1. Add top-level `yaakcli send <id>` command
2. Resolve ID by trying DB lookups in order: any_request → folder → workspace
3. For folder: list all requests in folder, send each
4. For workspace: list all requests in workspace, send each
5. Add execution options: `--sequential` (default), `--parallel`, `--fail-fast`
## Execution Plan (PR Slices)
### PR 1: Command tree refactor + compatibility aliases
Scope:
1. Introduce `commands/` modules and a `CliContext` for shared setup
2. Add new clap hierarchy (`workspace`, `request`, `folder`, `environment`)
3. Route existing behavior into:
- `workspace list`
- `request list <workspace_id>`
- `request send <id>`
- `request create <workspace_id> ...`
4. Keep compatibility aliases temporarily:
- `workspaces` -> `workspace list`
- `requests <workspace_id>` -> `request list <workspace_id>`
- `create ...` -> `request create ...`
5. Remove `get` and update help text
Acceptance criteria:
- `yaakcli --help` shows noun/verb structure
- Existing list/send/create workflows still work
- No behavior change in HTTP send output format
### PR 2: CRUD surface area
Scope:
1. Implement `show/create/update/delete` for `workspace`, `request`, `folder`, `environment`
2. Ensure delete commands require confirmation by default (`--yes` bypass)
3. Normalize output format for list/show/create/update/delete responses
Acceptance criteria:
- Every command listed in the "Commands" section parses and executes
- Delete commands are safe by default in interactive terminals
- `--yes` supports non-interactive scripts
### PR 3: JSON input + merge patch semantics
Scope:
1. Add shared parser for `--json` and positional JSON shorthand
2. Add `create --json` and `update --json` for all mutable resources
3. Implement server-side RFC 7386 merge patch behavior
4. Add guardrails:
- `create --json`: reject non-empty `id`
- `update --json`: require `id`
Acceptance criteria:
- Partial `update --json` only modifies provided keys
- `null` clears optional values
- Invalid JSON and missing required fields return actionable errors
### PR 4: `request schema` and plugin auth integration
Scope:
1. Add `schemars` to `yaak-models` and derive `JsonSchema` for request models
2. Implement `request schema <http|grpc|websocket>`
3. Merge plugin auth form inputs into `authentication` schema at runtime
Acceptance criteria:
- Command prints valid JSON schema
- Schema reflects installed auth providers at runtime
- No panic when plugins fail to initialize (degrade gracefully)
### PR 5: Polymorphic request send
Scope:
1. Replace request resolution in `request send` with `get_any_request`
2. Dispatch by request type
3. Keep HTTP fully functional
4. Return explicit NYI errors for gRPC/WebSocket until implemented
Acceptance criteria:
- HTTP behavior remains unchanged
- gRPC/WebSocket IDs are recognized and return explicit status
### PR 6: Top-level `send` + bulk execution
Scope:
1. Add top-level `send <id>` for request/folder/workspace IDs
2. Implement folder/workspace fan-out execution
3. Add execution controls: `--sequential`, `--parallel`, `--fail-fast`
Acceptance criteria:
- Correct ID dispatch order: request -> folder -> workspace
- Deterministic summary output (success/failure counts)
- Non-zero exit code when any request fails (unless explicitly configured otherwise)
## Validation Matrix
1. CLI parsing tests for every command path (including aliases while retained)
2. Integration tests against temp SQLite DB for CRUD flows
3. Snapshot tests for output text where scripting compatibility matters
4. Manual smoke tests:
- Send HTTP request with template/rendered vars
- JSON create/update for each resource
- Delete confirmation and `--yes`
- Top-level `send` on request/folder/workspace
## Open Questions
1. Should compatibility aliases (`workspaces`, `requests`, `create`) be removed immediately or after one release cycle?
2. For bulk `send`, should default behavior stop on first failure or continue and summarize?
3. Should command output default to human-readable text with an optional `--format json`, or return JSON by default for `show`/`list`?
4. For `request schema`, should plugin-derived auth fields be namespaced by plugin ID to avoid collisions?
## Crate Changes
- **yaak-cli**: restructure into modules, new clap hierarchy
- **yaak-models**: add `schemars` dependency, derive `JsonSchema` on model structs
(current derives: `Debug, Clone, PartialEq, Serialize, Deserialize, Default, TS`)


@@ -1,87 +0,0 @@
# yaak-cli
Command-line interface for Yaak.
## Command Overview
Current top-level commands:
```text
yaakcli send <request_id>
yaakcli workspace list
yaakcli workspace show <workspace_id>
yaakcli workspace create --name <name>
yaakcli workspace create --json '{"name":"My Workspace"}'
yaakcli workspace create '{"name":"My Workspace"}'
yaakcli workspace update --json '{"id":"wk_abc","description":"Updated"}'
yaakcli workspace delete <workspace_id> [--yes]
yaakcli request list <workspace_id>
yaakcli request show <request_id>
yaakcli request send <request_id>
yaakcli request create <workspace_id> --name <name> --url <url> [--method GET]
yaakcli request create --json '{"workspaceId":"wk_abc","name":"Users","url":"https://api.example.com/users"}'
yaakcli request create '{"workspaceId":"wk_abc","name":"Users","url":"https://api.example.com/users"}'
yaakcli request update --json '{"id":"rq_abc","name":"Users v2"}'
yaakcli request delete <request_id> [--yes]
yaakcli folder list <workspace_id>
yaakcli folder show <folder_id>
yaakcli folder create <workspace_id> --name <name>
yaakcli folder create --json '{"workspaceId":"wk_abc","name":"Auth"}'
yaakcli folder create '{"workspaceId":"wk_abc","name":"Auth"}'
yaakcli folder update --json '{"id":"fl_abc","name":"Auth v2"}'
yaakcli folder delete <folder_id> [--yes]
yaakcli environment list <workspace_id>
yaakcli environment show <environment_id>
yaakcli environment create <workspace_id> --name <name>
yaakcli environment create --json '{"workspaceId":"wk_abc","name":"Production"}'
yaakcli environment create '{"workspaceId":"wk_abc","name":"Production"}'
yaakcli environment update --json '{"id":"ev_abc","color":"#00ff00"}'
yaakcli environment delete <environment_id> [--yes]
```
Global options:
- `--data-dir <path>`: use a custom data directory
- `-e, --environment <id>`: environment to use during request rendering/sending
- `-v, --verbose`: verbose logging and send output
Notes:
- `send` is currently a shortcut for sending an HTTP request ID.
- `delete` commands prompt for confirmation unless `--yes` is provided.
- In non-interactive mode, `delete` commands require `--yes`.
- `create` and `update` commands support `--json` and positional JSON shorthand.
- `update` uses JSON Merge Patch semantics (RFC 7386) for partial updates.
## Examples
```bash
yaakcli workspace list
yaakcli workspace create --name "My Workspace"
yaakcli workspace show wk_abc
yaakcli workspace update --json '{"id":"wk_abc","description":"Team workspace"}'
yaakcli request list wk_abc
yaakcli request show rq_abc
yaakcli request create wk_abc --name "Users" --url "https://api.example.com/users"
yaakcli request update --json '{"id":"rq_abc","name":"Users v2"}'
yaakcli request send rq_abc -e ev_abc
yaakcli request delete rq_abc --yes
yaakcli folder create wk_abc --name "Auth"
yaakcli folder update --json '{"id":"fl_abc","name":"Auth v2"}'
yaakcli environment create wk_abc --name "Production"
yaakcli environment update --json '{"id":"ev_abc","color":"#00ff00"}'
```
## Roadmap
Planned command expansion (request schema and polymorphic send) is tracked in `PLAN.md`.
When command behavior changes, update this README and verify with:
```bash
cargo run -q -p yaak-cli -- --help
cargo run -q -p yaak-cli -- request --help
cargo run -q -p yaak-cli -- workspace --help
cargo run -q -p yaak-cli -- folder --help
cargo run -q -p yaak-cli -- environment --help
```


@@ -1,282 +0,0 @@
use clap::{Args, Parser, Subcommand};
use std::path::PathBuf;
#[derive(Parser)]
#[command(name = "yaakcli")]
#[command(about = "Yaak CLI - API client from the command line")]
pub struct Cli {
/// Use a custom data directory
#[arg(long, global = true)]
pub data_dir: Option<PathBuf>,
/// Environment ID to use for variable substitution
#[arg(long, short, global = true)]
pub environment: Option<String>,
/// Enable verbose logging
#[arg(long, short, global = true)]
pub verbose: bool,
#[command(subcommand)]
pub command: Commands,
}
#[derive(Subcommand)]
pub enum Commands {
/// Send an HTTP request by ID
Send(SendArgs),
/// Workspace commands
Workspace(WorkspaceArgs),
/// Request commands
Request(RequestArgs),
/// Folder commands
Folder(FolderArgs),
/// Environment commands
Environment(EnvironmentArgs),
}
#[derive(Args)]
pub struct SendArgs {
/// Request ID
pub request_id: String,
}
#[derive(Args)]
pub struct WorkspaceArgs {
#[command(subcommand)]
pub command: WorkspaceCommands,
}
#[derive(Subcommand)]
pub enum WorkspaceCommands {
/// List all workspaces
List,
/// Show a workspace as JSON
Show {
/// Workspace ID
workspace_id: String,
},
/// Create a workspace
Create {
/// Workspace name
#[arg(short, long)]
name: Option<String>,
/// JSON payload
#[arg(long, conflicts_with = "json_input")]
json: Option<String>,
/// JSON payload shorthand
#[arg(value_name = "JSON", conflicts_with = "json")]
json_input: Option<String>,
},
/// Update a workspace
Update {
/// JSON payload
#[arg(long, conflicts_with = "json_input")]
json: Option<String>,
/// JSON payload shorthand
#[arg(value_name = "JSON", conflicts_with = "json")]
json_input: Option<String>,
},
/// Delete a workspace
Delete {
/// Workspace ID
workspace_id: String,
/// Skip confirmation prompt
#[arg(short, long)]
yes: bool,
},
}
#[derive(Args)]
pub struct RequestArgs {
#[command(subcommand)]
pub command: RequestCommands,
}
#[derive(Subcommand)]
pub enum RequestCommands {
/// List requests in a workspace
List {
/// Workspace ID
workspace_id: String,
},
/// Show a request as JSON
Show {
/// Request ID
request_id: String,
},
/// Send an HTTP request by ID
Send {
/// Request ID
request_id: String,
},
/// Create a new HTTP request
Create {
/// Workspace ID (or positional JSON payload shorthand)
workspace_id: Option<String>,
/// Request name
#[arg(short, long)]
name: Option<String>,
/// HTTP method
#[arg(short, long)]
method: Option<String>,
/// URL
#[arg(short, long)]
url: Option<String>,
/// JSON payload
#[arg(long)]
json: Option<String>,
},
/// Update an HTTP request
Update {
/// JSON payload
#[arg(long, conflicts_with = "json_input")]
json: Option<String>,
/// JSON payload shorthand
#[arg(value_name = "JSON", conflicts_with = "json")]
json_input: Option<String>,
},
/// Delete a request
Delete {
/// Request ID
request_id: String,
/// Skip confirmation prompt
#[arg(short, long)]
yes: bool,
},
}
#[derive(Args)]
pub struct FolderArgs {
#[command(subcommand)]
pub command: FolderCommands,
}
#[derive(Subcommand)]
pub enum FolderCommands {
/// List folders in a workspace
List {
/// Workspace ID
workspace_id: String,
},
/// Show a folder as JSON
Show {
/// Folder ID
folder_id: String,
},
/// Create a folder
Create {
/// Workspace ID (or positional JSON payload shorthand)
workspace_id: Option<String>,
/// Folder name
#[arg(short, long)]
name: Option<String>,
/// JSON payload
#[arg(long)]
json: Option<String>,
},
/// Update a folder
Update {
/// JSON payload
#[arg(long, conflicts_with = "json_input")]
json: Option<String>,
/// JSON payload shorthand
#[arg(value_name = "JSON", conflicts_with = "json")]
json_input: Option<String>,
},
/// Delete a folder
Delete {
/// Folder ID
folder_id: String,
/// Skip confirmation prompt
#[arg(short, long)]
yes: bool,
},
}
#[derive(Args)]
pub struct EnvironmentArgs {
#[command(subcommand)]
pub command: EnvironmentCommands,
}
#[derive(Subcommand)]
pub enum EnvironmentCommands {
/// List environments in a workspace
List {
/// Workspace ID
workspace_id: String,
},
/// Show an environment as JSON
Show {
/// Environment ID
environment_id: String,
},
/// Create an environment
Create {
/// Workspace ID (or positional JSON payload shorthand)
workspace_id: Option<String>,
/// Environment name
#[arg(short, long)]
name: Option<String>,
/// JSON payload
#[arg(long)]
json: Option<String>,
},
/// Update an environment
Update {
/// JSON payload
#[arg(long, conflicts_with = "json_input")]
json: Option<String>,
/// JSON payload shorthand
#[arg(value_name = "JSON", conflicts_with = "json")]
json_input: Option<String>,
},
/// Delete an environment
Delete {
/// Environment ID
environment_id: String,
/// Skip confirmation prompt
#[arg(short, long)]
yes: bool,
},
}

View File

@@ -1,16 +0,0 @@
use std::io::{self, IsTerminal, Write};
pub fn confirm_delete(resource_name: &str, resource_id: &str) -> bool {
if !io::stdin().is_terminal() {
eprintln!("Refusing to delete in non-interactive mode without --yes");
std::process::exit(1);
}
print!("Delete {resource_name} {resource_id}? [y/N]: ");
io::stdout().flush().expect("Failed to flush stdout");
let mut input = String::new();
io::stdin().read_line(&mut input).expect("Failed to read confirmation");
matches!(input.trim().to_lowercase().as_str(), "y" | "yes")
}

View File

@@ -1,134 +0,0 @@
use crate::cli::{EnvironmentArgs, EnvironmentCommands};
use crate::commands::confirm::confirm_delete;
use crate::commands::json::{
apply_merge_patch, is_json_shorthand, parse_optional_json, parse_required_json, require_id,
validate_create_id,
};
use crate::context::CliContext;
use yaak_models::models::Environment;
use yaak_models::util::UpdateSource;
pub fn run(ctx: &CliContext, args: EnvironmentArgs) {
match args.command {
EnvironmentCommands::List { workspace_id } => list(ctx, &workspace_id),
EnvironmentCommands::Show { environment_id } => show(ctx, &environment_id),
EnvironmentCommands::Create { workspace_id, name, json } => {
create(ctx, workspace_id, name, json)
}
EnvironmentCommands::Update { json, json_input } => update(ctx, json, json_input),
EnvironmentCommands::Delete { environment_id, yes } => delete(ctx, &environment_id, yes),
}
}
fn list(ctx: &CliContext, workspace_id: &str) {
let environments =
ctx.db().list_environments_ensure_base(workspace_id).expect("Failed to list environments");
if environments.is_empty() {
println!("No environments found in workspace {}", workspace_id);
} else {
for environment in environments {
println!("{} - {} ({})", environment.id, environment.name, environment.parent_model);
}
}
}
fn show(ctx: &CliContext, environment_id: &str) {
let environment = ctx.db().get_environment(environment_id).expect("Failed to get environment");
let output =
serde_json::to_string_pretty(&environment).expect("Failed to serialize environment");
println!("{output}");
}
fn create(
ctx: &CliContext,
workspace_id: Option<String>,
name: Option<String>,
json: Option<String>,
) {
if json.is_some() && workspace_id.as_deref().is_some_and(|v| !is_json_shorthand(v)) {
panic!("environment create cannot combine workspace_id with --json payload");
}
let payload = parse_optional_json(
json,
workspace_id.clone().filter(|v| is_json_shorthand(v)),
"environment create",
);
if let Some(payload) = payload {
if name.is_some() {
panic!("environment create cannot combine --name with JSON payload");
}
validate_create_id(&payload, "environment");
let mut environment: Environment =
serde_json::from_value(payload).expect("Failed to parse environment create JSON");
if environment.workspace_id.is_empty() {
panic!("environment create JSON requires non-empty \"workspaceId\"");
}
if environment.parent_model.is_empty() {
environment.parent_model = "environment".to_string();
}
let created = ctx
.db()
.upsert_environment(&environment, &UpdateSource::Sync)
.expect("Failed to create environment");
println!("Created environment: {}", created.id);
return;
}
let workspace_id = workspace_id.unwrap_or_else(|| {
panic!("environment create requires workspace_id unless JSON payload is provided")
});
let name = name.unwrap_or_else(|| {
panic!("environment create requires --name unless JSON payload is provided")
});
let environment = Environment {
workspace_id,
name,
parent_model: "environment".to_string(),
..Default::default()
};
let created = ctx
.db()
.upsert_environment(&environment, &UpdateSource::Sync)
.expect("Failed to create environment");
println!("Created environment: {}", created.id);
}
fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) {
let patch = parse_required_json(json, json_input, "environment update");
let id = require_id(&patch, "environment update");
let existing = ctx.db().get_environment(&id).expect("Failed to get environment for update");
let updated = apply_merge_patch(&existing, &patch, &id, "environment update");
let saved = ctx
.db()
.upsert_environment(&updated, &UpdateSource::Sync)
.expect("Failed to update environment");
println!("Updated environment: {}", saved.id);
}
fn delete(ctx: &CliContext, environment_id: &str, yes: bool) {
if !yes && !confirm_delete("environment", environment_id) {
println!("Aborted");
return;
}
let deleted = ctx
.db()
.delete_environment_by_id(environment_id, &UpdateSource::Sync)
.expect("Failed to delete environment");
println!("Deleted environment: {}", deleted.id);
}

View File

@@ -1,115 +0,0 @@
use crate::cli::{FolderArgs, FolderCommands};
use crate::commands::confirm::confirm_delete;
use crate::commands::json::{
apply_merge_patch, is_json_shorthand, parse_optional_json, parse_required_json, require_id,
validate_create_id,
};
use crate::context::CliContext;
use yaak_models::models::Folder;
use yaak_models::util::UpdateSource;
pub fn run(ctx: &CliContext, args: FolderArgs) {
match args.command {
FolderCommands::List { workspace_id } => list(ctx, &workspace_id),
FolderCommands::Show { folder_id } => show(ctx, &folder_id),
FolderCommands::Create { workspace_id, name, json } => {
create(ctx, workspace_id, name, json)
}
FolderCommands::Update { json, json_input } => update(ctx, json, json_input),
FolderCommands::Delete { folder_id, yes } => delete(ctx, &folder_id, yes),
}
}
fn list(ctx: &CliContext, workspace_id: &str) {
let folders = ctx.db().list_folders(workspace_id).expect("Failed to list folders");
if folders.is_empty() {
println!("No folders found in workspace {}", workspace_id);
} else {
for folder in folders {
println!("{} - {}", folder.id, folder.name);
}
}
}
fn show(ctx: &CliContext, folder_id: &str) {
let folder = ctx.db().get_folder(folder_id).expect("Failed to get folder");
let output = serde_json::to_string_pretty(&folder).expect("Failed to serialize folder");
println!("{output}");
}
fn create(
ctx: &CliContext,
workspace_id: Option<String>,
name: Option<String>,
json: Option<String>,
) {
if json.is_some() && workspace_id.as_deref().is_some_and(|v| !is_json_shorthand(v)) {
panic!("folder create cannot combine workspace_id with --json payload");
}
let payload = parse_optional_json(
json,
workspace_id.clone().filter(|v| is_json_shorthand(v)),
"folder create",
);
if let Some(payload) = payload {
if name.is_some() {
panic!("folder create cannot combine --name with JSON payload");
}
validate_create_id(&payload, "folder");
let folder: Folder =
serde_json::from_value(payload).expect("Failed to parse folder create JSON");
if folder.workspace_id.is_empty() {
panic!("folder create JSON requires non-empty \"workspaceId\"");
}
let created =
ctx.db().upsert_folder(&folder, &UpdateSource::Sync).expect("Failed to create folder");
println!("Created folder: {}", created.id);
return;
}
let workspace_id = workspace_id.unwrap_or_else(|| {
panic!("folder create requires workspace_id unless JSON payload is provided")
});
let name = name
.unwrap_or_else(|| panic!("folder create requires --name unless JSON payload is provided"));
let folder = Folder { workspace_id, name, ..Default::default() };
let created =
ctx.db().upsert_folder(&folder, &UpdateSource::Sync).expect("Failed to create folder");
println!("Created folder: {}", created.id);
}
fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) {
let patch = parse_required_json(json, json_input, "folder update");
let id = require_id(&patch, "folder update");
let existing = ctx.db().get_folder(&id).expect("Failed to get folder for update");
let updated = apply_merge_patch(&existing, &patch, &id, "folder update");
let saved =
ctx.db().upsert_folder(&updated, &UpdateSource::Sync).expect("Failed to update folder");
println!("Updated folder: {}", saved.id);
}
fn delete(ctx: &CliContext, folder_id: &str, yes: bool) {
if !yes && !confirm_delete("folder", folder_id) {
println!("Aborted");
return;
}
let deleted = ctx
.db()
.delete_folder_by_id(folder_id, &UpdateSource::Sync)
.expect("Failed to delete folder");
println!("Deleted folder: {}", deleted.id);
}

View File

@@ -1,108 +0,0 @@
use serde::Serialize;
use serde::de::DeserializeOwned;
use serde_json::{Map, Value};
pub fn is_json_shorthand(input: &str) -> bool {
input.trim_start().starts_with('{')
}
pub fn parse_json_object(raw: &str, context: &str) -> Value {
let value: Value = serde_json::from_str(raw)
.unwrap_or_else(|error| panic!("Invalid JSON for {context}: {error}"));
if !value.is_object() {
panic!("JSON payload for {context} must be an object");
}
value
}
pub fn parse_optional_json(
json_flag: Option<String>,
json_shorthand: Option<String>,
context: &str,
) -> Option<Value> {
match (json_flag, json_shorthand) {
(Some(_), Some(_)) => {
panic!("Cannot provide both --json and positional JSON for {context}")
}
(Some(raw), None) => Some(parse_json_object(&raw, context)),
(None, Some(raw)) => Some(parse_json_object(&raw, context)),
(None, None) => None,
}
}
pub fn parse_required_json(
json_flag: Option<String>,
json_shorthand: Option<String>,
context: &str,
) -> Value {
parse_optional_json(json_flag, json_shorthand, context).unwrap_or_else(|| {
panic!("Missing JSON payload for {context}. Use --json or positional JSON")
})
}
pub fn require_id(payload: &Value, context: &str) -> String {
payload
.get("id")
.and_then(|value| value.as_str())
.filter(|value| !value.is_empty())
.map(|value| value.to_string())
.unwrap_or_else(|| panic!("{context} requires a non-empty \"id\" field"))
}
pub fn validate_create_id(payload: &Value, context: &str) {
let Some(id_value) = payload.get("id") else {
return;
};
match id_value {
Value::String(id) if id.is_empty() => {}
_ => panic!("{context} create JSON must omit \"id\" or set it to an empty string"),
}
}
pub fn apply_merge_patch<T>(existing: &T, patch: &Value, id: &str, context: &str) -> T
where
T: Serialize + DeserializeOwned,
{
let mut base = serde_json::to_value(existing).unwrap_or_else(|error| {
panic!("Failed to serialize existing model for {context}: {error}")
});
merge_patch(&mut base, patch);
let Some(base_object) = base.as_object_mut() else {
panic!("Merged payload for {context} must be an object");
};
base_object.insert("id".to_string(), Value::String(id.to_string()));
serde_json::from_value(base).unwrap_or_else(|error| {
panic!("Failed to deserialize merged payload for {context}: {error}")
})
}
fn merge_patch(target: &mut Value, patch: &Value) {
match patch {
Value::Object(patch_map) => {
if !target.is_object() {
*target = Value::Object(Map::new());
}
let target_map =
target.as_object_mut().expect("merge_patch target expected to be object");
for (key, patch_value) in patch_map {
if patch_value.is_null() {
target_map.remove(key);
continue;
}
let target_entry = target_map.entry(key.clone()).or_insert(Value::Null);
merge_patch(target_entry, patch_value);
}
}
_ => {
*target = patch.clone();
}
}
}

View File

@@ -1,7 +0,0 @@
pub mod confirm;
pub mod environment;
pub mod folder;
pub mod json;
pub mod request;
pub mod send;
pub mod workspace;

View File

@@ -1,224 +0,0 @@
use crate::cli::{RequestArgs, RequestCommands};
use crate::commands::confirm::confirm_delete;
use crate::commands::json::{
apply_merge_patch, is_json_shorthand, parse_optional_json, parse_required_json, require_id,
validate_create_id,
};
use crate::context::CliContext;
use tokio::sync::mpsc;
use yaak::send::{SendHttpRequestByIdWithPluginsParams, send_http_request_by_id_with_plugins};
use yaak_models::models::HttpRequest;
use yaak_models::util::UpdateSource;
use yaak_plugins::events::PluginContext;
pub async fn run(
ctx: &CliContext,
args: RequestArgs,
environment: Option<&str>,
verbose: bool,
) -> i32 {
match args.command {
RequestCommands::List { workspace_id } => {
list(ctx, &workspace_id);
0
}
RequestCommands::Show { request_id } => {
show(ctx, &request_id);
0
}
RequestCommands::Send { request_id } => {
match send_request_by_id(ctx, &request_id, environment, verbose).await {
Ok(()) => 0,
Err(error) => {
eprintln!("Error: {error}");
1
}
}
}
RequestCommands::Create { workspace_id, name, method, url, json } => {
create(ctx, workspace_id, name, method, url, json);
0
}
RequestCommands::Update { json, json_input } => {
update(ctx, json, json_input);
0
}
RequestCommands::Delete { request_id, yes } => {
delete(ctx, &request_id, yes);
0
}
}
}
fn list(ctx: &CliContext, workspace_id: &str) {
let requests = ctx.db().list_http_requests(workspace_id).expect("Failed to list requests");
if requests.is_empty() {
println!("No requests found in workspace {}", workspace_id);
} else {
for request in requests {
println!("{} - {} {}", request.id, request.method, request.name);
}
}
}
fn create(
ctx: &CliContext,
workspace_id: Option<String>,
name: Option<String>,
method: Option<String>,
url: Option<String>,
json: Option<String>,
) {
if json.is_some() && workspace_id.as_deref().is_some_and(|v| !is_json_shorthand(v)) {
panic!("request create cannot combine workspace_id with --json payload");
}
let payload = parse_optional_json(
json,
workspace_id.clone().filter(|v| is_json_shorthand(v)),
"request create",
);
if let Some(payload) = payload {
if name.is_some() || method.is_some() || url.is_some() {
panic!("request create cannot combine simple flags with JSON payload");
}
validate_create_id(&payload, "request");
let request: HttpRequest =
serde_json::from_value(payload).expect("Failed to parse request create JSON");
if request.workspace_id.is_empty() {
panic!("request create JSON requires non-empty \"workspaceId\"");
}
let created = ctx
.db()
.upsert_http_request(&request, &UpdateSource::Sync)
.expect("Failed to create request");
println!("Created request: {}", created.id);
return;
}
let workspace_id = workspace_id.unwrap_or_else(|| {
panic!("request create requires workspace_id unless JSON payload is provided")
});
let name = name.unwrap_or_else(|| {
panic!("request create requires --name unless JSON payload is provided")
});
let url = url
.unwrap_or_else(|| panic!("request create requires --url unless JSON payload is provided"));
let method = method.unwrap_or_else(|| "GET".to_string());
let request = HttpRequest {
workspace_id,
name,
method: method.to_uppercase(),
url,
..Default::default()
};
let created = ctx
.db()
.upsert_http_request(&request, &UpdateSource::Sync)
.expect("Failed to create request");
println!("Created request: {}", created.id);
}
fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) {
let patch = parse_required_json(json, json_input, "request update");
let id = require_id(&patch, "request update");
let existing = ctx.db().get_http_request(&id).expect("Failed to get request for update");
let updated = apply_merge_patch(&existing, &patch, &id, "request update");
let saved = ctx
.db()
.upsert_http_request(&updated, &UpdateSource::Sync)
.expect("Failed to update request");
println!("Updated request: {}", saved.id);
}
fn show(ctx: &CliContext, request_id: &str) {
let request = ctx.db().get_http_request(request_id).expect("Failed to get request");
let output = serde_json::to_string_pretty(&request).expect("Failed to serialize request");
println!("{output}");
}
fn delete(ctx: &CliContext, request_id: &str, yes: bool) {
if !yes && !confirm_delete("request", request_id) {
println!("Aborted");
return;
}
let deleted = ctx
.db()
.delete_http_request_by_id(request_id, &UpdateSource::Sync)
.expect("Failed to delete request");
println!("Deleted request: {}", deleted.id);
}
/// Send a request by ID and print response in the same format as legacy `send`.
pub async fn send_request_by_id(
ctx: &CliContext,
request_id: &str,
environment: Option<&str>,
verbose: bool,
) -> Result<(), String> {
let request =
ctx.db().get_http_request(request_id).map_err(|e| format!("Failed to get request: {e}"))?;
let plugin_context = PluginContext::new(None, Some(request.workspace_id.clone()));
let (event_tx, mut event_rx) = mpsc::channel(100);
let event_handle = tokio::spawn(async move {
while let Some(event) = event_rx.recv().await {
if verbose {
println!("{}", event);
}
}
});
let response_dir = ctx.data_dir().join("responses");
let result = send_http_request_by_id_with_plugins(SendHttpRequestByIdWithPluginsParams {
query_manager: ctx.query_manager(),
blob_manager: ctx.blob_manager(),
request_id,
environment_id: environment,
update_source: UpdateSource::Sync,
cookie_jar_id: None,
response_dir: &response_dir,
emit_events_to: Some(event_tx),
plugin_manager: ctx.plugin_manager(),
encryption_manager: ctx.encryption_manager.clone(),
plugin_context: &plugin_context,
cancelled_rx: None,
connection_manager: None,
})
.await;
let _ = event_handle.await;
let result = result.map_err(|e| e.to_string())?;
if verbose {
println!();
}
println!(
"HTTP {} {}",
result.response.status,
result.response.status_reason.as_deref().unwrap_or("")
);
if verbose {
for header in &result.response.headers {
println!("{}: {}", header.name, header.value);
}
println!();
}
let body = String::from_utf8(result.response_body)
.map_err(|e| format!("Failed to read response body: {e}"))?;
println!("{}", body);
Ok(())
}

View File

@@ -1,18 +0,0 @@
use crate::cli::SendArgs;
use crate::commands::request;
use crate::context::CliContext;
pub async fn run(
ctx: &CliContext,
args: SendArgs,
environment: Option<&str>,
verbose: bool,
) -> i32 {
match request::send_request_by_id(ctx, &args.request_id, environment, verbose).await {
Ok(()) => 0,
Err(error) => {
eprintln!("Error: {error}");
1
}
}
}

View File

@@ -1,100 +0,0 @@
use crate::cli::{WorkspaceArgs, WorkspaceCommands};
use crate::commands::confirm::confirm_delete;
use crate::commands::json::{
apply_merge_patch, parse_optional_json, parse_required_json, require_id, validate_create_id,
};
use crate::context::CliContext;
use yaak_models::models::Workspace;
use yaak_models::util::UpdateSource;
pub fn run(ctx: &CliContext, args: WorkspaceArgs) {
match args.command {
WorkspaceCommands::List => list(ctx),
WorkspaceCommands::Show { workspace_id } => show(ctx, &workspace_id),
WorkspaceCommands::Create { name, json, json_input } => create(ctx, name, json, json_input),
WorkspaceCommands::Update { json, json_input } => update(ctx, json, json_input),
WorkspaceCommands::Delete { workspace_id, yes } => delete(ctx, &workspace_id, yes),
}
}
fn list(ctx: &CliContext) {
let workspaces = ctx.db().list_workspaces().expect("Failed to list workspaces");
if workspaces.is_empty() {
println!("No workspaces found");
} else {
for workspace in workspaces {
println!("{} - {}", workspace.id, workspace.name);
}
}
}
fn show(ctx: &CliContext, workspace_id: &str) {
let workspace = ctx.db().get_workspace(workspace_id).expect("Failed to get workspace");
let output = serde_json::to_string_pretty(&workspace).expect("Failed to serialize workspace");
println!("{output}");
}
fn create(
ctx: &CliContext,
name: Option<String>,
json: Option<String>,
json_input: Option<String>,
) {
let payload = parse_optional_json(json, json_input, "workspace create");
if let Some(payload) = payload {
if name.is_some() {
panic!("workspace create cannot combine --name with JSON payload");
}
validate_create_id(&payload, "workspace");
let workspace: Workspace =
serde_json::from_value(payload).expect("Failed to parse workspace create JSON");
let created = ctx
.db()
.upsert_workspace(&workspace, &UpdateSource::Sync)
.expect("Failed to create workspace");
println!("Created workspace: {}", created.id);
return;
}
let name = name.unwrap_or_else(|| {
panic!("workspace create requires --name unless JSON payload is provided")
});
let workspace = Workspace { name, ..Default::default() };
let created = ctx
.db()
.upsert_workspace(&workspace, &UpdateSource::Sync)
.expect("Failed to create workspace");
println!("Created workspace: {}", created.id);
}
fn update(ctx: &CliContext, json: Option<String>, json_input: Option<String>) {
let patch = parse_required_json(json, json_input, "workspace update");
let id = require_id(&patch, "workspace update");
let existing = ctx.db().get_workspace(&id).expect("Failed to get workspace for update");
let updated = apply_merge_patch(&existing, &patch, &id, "workspace update");
let saved = ctx
.db()
.upsert_workspace(&updated, &UpdateSource::Sync)
.expect("Failed to update workspace");
println!("Updated workspace: {}", saved.id);
}
fn delete(ctx: &CliContext, workspace_id: &str, yes: bool) {
if !yes && !confirm_delete("workspace", workspace_id) {
println!("Aborted");
return;
}
let deleted = ctx
.db()
.delete_workspace_by_id(workspace_id, &UpdateSource::Sync)
.expect("Failed to delete workspace");
println!("Deleted workspace: {}", deleted.id);
}

View File

@@ -1,96 +0,0 @@
use std::path::{Path, PathBuf};
use std::sync::Arc;
use yaak_crypto::manager::EncryptionManager;
use yaak_models::blob_manager::BlobManager;
use yaak_models::db_context::DbContext;
use yaak_models::query_manager::QueryManager;
use yaak_plugins::events::PluginContext;
use yaak_plugins::manager::PluginManager;
pub struct CliContext {
data_dir: PathBuf,
query_manager: QueryManager,
blob_manager: BlobManager,
pub encryption_manager: Arc<EncryptionManager>,
plugin_manager: Option<Arc<PluginManager>>,
}
impl CliContext {
pub async fn initialize(data_dir: PathBuf, app_id: &str, with_plugins: bool) -> Self {
let db_path = data_dir.join("db.sqlite");
let blob_path = data_dir.join("blobs.sqlite");
let (query_manager, blob_manager, _rx) = yaak_models::init_standalone(&db_path, &blob_path)
.expect("Failed to initialize database");
let encryption_manager = Arc::new(EncryptionManager::new(query_manager.clone(), app_id));
let plugin_manager = if with_plugins {
let vendored_plugin_dir = data_dir.join("vendored-plugins");
let installed_plugin_dir = data_dir.join("installed-plugins");
let node_bin_path = PathBuf::from("node");
let plugin_runtime_main =
std::env::var("YAAK_PLUGIN_RUNTIME").map(PathBuf::from).unwrap_or_else(|_| {
PathBuf::from(env!("CARGO_MANIFEST_DIR"))
.join("../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs")
});
let plugin_manager = Arc::new(
PluginManager::new(
vendored_plugin_dir,
installed_plugin_dir,
node_bin_path,
plugin_runtime_main,
false,
)
.await,
);
let plugins = query_manager.connect().list_plugins().unwrap_or_default();
if !plugins.is_empty() {
let errors = plugin_manager
.initialize_all_plugins(plugins, &PluginContext::new_empty())
.await;
for (plugin_dir, error_msg) in errors {
eprintln!(
"Warning: Failed to initialize plugin '{}': {}",
plugin_dir, error_msg
);
}
}
Some(plugin_manager)
} else {
None
};
Self { data_dir, query_manager, blob_manager, encryption_manager, plugin_manager }
}
pub fn data_dir(&self) -> &Path {
&self.data_dir
}
pub fn db(&self) -> DbContext<'_> {
self.query_manager.connect()
}
pub fn query_manager(&self) -> &QueryManager {
&self.query_manager
}
pub fn blob_manager(&self) -> &BlobManager {
&self.blob_manager
}
pub fn plugin_manager(&self) -> Arc<PluginManager> {
self.plugin_manager.clone().expect("Plugin manager was not initialized for this command")
}
pub async fn shutdown(&self) {
if let Some(plugin_manager) = &self.plugin_manager {
plugin_manager.terminate().await;
}
}
}

View File

@@ -1,57 +0,0 @@
mod cli;
mod commands;
mod context;
use clap::Parser;
use cli::{Cli, Commands, RequestCommands};
use context::CliContext;
#[tokio::main]
async fn main() {
let Cli { data_dir, environment, verbose, command } = Cli::parse();
if verbose {
env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info")).init();
}
let app_id = if cfg!(debug_assertions) { "app.yaak.desktop.dev" } else { "app.yaak.desktop" };
let data_dir = data_dir.unwrap_or_else(|| {
dirs::data_dir().expect("Could not determine data directory").join(app_id)
});
let needs_plugins = matches!(
&command,
Commands::Send(_)
| Commands::Request(cli::RequestArgs { command: RequestCommands::Send { .. } })
);
let context = CliContext::initialize(data_dir, app_id, needs_plugins).await;
let exit_code = match command {
Commands::Send(args) => {
commands::send::run(&context, args, environment.as_deref(), verbose).await
}
Commands::Workspace(args) => {
commands::workspace::run(&context, args);
0
}
Commands::Request(args) => {
commands::request::run(&context, args, environment.as_deref(), verbose).await
}
Commands::Folder(args) => {
commands::folder::run(&context, args);
0
}
Commands::Environment(args) => {
commands::environment::run(&context, args);
0
}
};
context.shutdown().await;
if exit_code != 0 {
std::process::exit(exit_code);
}
}

View File

@@ -1,42 +0,0 @@
use std::io::{Read, Write};
use std::net::TcpListener;
use std::thread;
pub struct TestHttpServer {
pub url: String,
handle: Option<thread::JoinHandle<()>>,
}
impl TestHttpServer {
pub fn spawn_ok(body: &'static str) -> Self {
let listener = TcpListener::bind("127.0.0.1:0").expect("Failed to bind test HTTP server");
let addr = listener.local_addr().expect("Failed to get local addr");
let url = format!("http://{addr}/test");
let body_bytes = body.as_bytes().to_vec();
let handle = thread::spawn(move || {
if let Ok((mut stream, _)) = listener.accept() {
let mut request_buf = [0u8; 4096];
let _ = stream.read(&mut request_buf);
let response = format!(
"HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\nContent-Length: {}\r\nConnection: close\r\n\r\n",
body_bytes.len()
);
let _ = stream.write_all(response.as_bytes());
let _ = stream.write_all(&body_bytes);
let _ = stream.flush();
}
});
Self { url, handle: Some(handle) }
}
}
impl Drop for TestHttpServer {
fn drop(&mut self) {
if let Some(handle) = self.handle.take() {
let _ = handle.join();
}
}
}

View File

@@ -1,62 +0,0 @@
#![allow(dead_code)]
pub mod http_server;
use assert_cmd::Command;
use assert_cmd::cargo::cargo_bin_cmd;
use std::path::Path;
use yaak_models::models::{HttpRequest, Workspace};
use yaak_models::query_manager::QueryManager;
use yaak_models::util::UpdateSource;
pub fn cli_cmd(data_dir: &Path) -> Command {
let mut cmd = cargo_bin_cmd!("yaakcli");
cmd.arg("--data-dir").arg(data_dir);
cmd
}
pub fn parse_created_id(stdout: &[u8], label: &str) -> String {
String::from_utf8_lossy(stdout)
.trim()
.split_once(": ")
.map(|(_, id)| id.to_string())
.unwrap_or_else(|| panic!("Expected id in '{label}' output"))
}
pub fn query_manager(data_dir: &Path) -> QueryManager {
let db_path = data_dir.join("db.sqlite");
let blob_path = data_dir.join("blobs.sqlite");
let (query_manager, _blob_manager, _rx) =
yaak_models::init_standalone(&db_path, &blob_path).expect("Failed to initialize DB");
query_manager
}
pub fn seed_workspace(data_dir: &Path, workspace_id: &str) {
let workspace = Workspace {
id: workspace_id.to_string(),
name: "Seed Workspace".to_string(),
description: "Seeded for integration tests".to_string(),
..Default::default()
};
query_manager(data_dir)
.connect()
.upsert_workspace(&workspace, &UpdateSource::Sync)
.expect("Failed to seed workspace");
}
pub fn seed_request(data_dir: &Path, workspace_id: &str, request_id: &str) {
let request = HttpRequest {
id: request_id.to_string(),
workspace_id: workspace_id.to_string(),
name: "Seeded Request".to_string(),
method: "GET".to_string(),
url: "https://example.com".to_string(),
..Default::default()
};
query_manager(data_dir)
.connect()
.upsert_http_request(&request, &UpdateSource::Sync)
.expect("Failed to seed request");
}

View File

@@ -1,80 +0,0 @@
mod common;
use common::{cli_cmd, parse_created_id, query_manager, seed_workspace};
use predicates::str::contains;
use tempfile::TempDir;
#[test]
fn create_list_show_delete_round_trip() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
cli_cmd(data_dir)
.args(["environment", "list", "wk_test"])
.assert()
.success()
.stdout(contains("Global Variables"));
let create_assert = cli_cmd(data_dir)
.args(["environment", "create", "wk_test", "--name", "Production"])
.assert()
.success();
let environment_id = parse_created_id(&create_assert.get_output().stdout, "environment create");
cli_cmd(data_dir)
.args(["environment", "list", "wk_test"])
.assert()
.success()
.stdout(contains(&environment_id))
.stdout(contains("Production"));
cli_cmd(data_dir)
.args(["environment", "show", &environment_id])
.assert()
.success()
.stdout(contains(format!("\"id\": \"{environment_id}\"")))
.stdout(contains("\"parentModel\": \"environment\""));
cli_cmd(data_dir)
.args(["environment", "delete", &environment_id, "--yes"])
.assert()
.success()
.stdout(contains(format!("Deleted environment: {environment_id}")));
assert!(query_manager(data_dir).connect().get_environment(&environment_id).is_err());
}
#[test]
fn json_create_and_update_merge_patch_round_trip() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
let create_assert = cli_cmd(data_dir)
.args([
"environment",
"create",
r#"{"workspaceId":"wk_test","name":"Json Environment"}"#,
])
.assert()
.success();
let environment_id = parse_created_id(&create_assert.get_output().stdout, "environment create");
cli_cmd(data_dir)
.args([
"environment",
"update",
&format!(r##"{{"id":"{}","color":"#00ff00"}}"##, environment_id),
])
.assert()
.success()
.stdout(contains(format!("Updated environment: {environment_id}")));
cli_cmd(data_dir)
.args(["environment", "show", &environment_id])
.assert()
.success()
.stdout(contains("\"name\": \"Json Environment\""))
.stdout(contains("\"color\": \"#00ff00\""));
}

View File

@@ -1,74 +0,0 @@
mod common;
use common::{cli_cmd, parse_created_id, query_manager, seed_workspace};
use predicates::str::contains;
use tempfile::TempDir;
#[test]
fn create_list_show_delete_round_trip() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
let create_assert = cli_cmd(data_dir)
.args(["folder", "create", "wk_test", "--name", "Auth"])
.assert()
.success();
let folder_id = parse_created_id(&create_assert.get_output().stdout, "folder create");
cli_cmd(data_dir)
.args(["folder", "list", "wk_test"])
.assert()
.success()
.stdout(contains(&folder_id))
.stdout(contains("Auth"));
cli_cmd(data_dir)
.args(["folder", "show", &folder_id])
.assert()
.success()
.stdout(contains(format!("\"id\": \"{folder_id}\"")))
.stdout(contains("\"workspaceId\": \"wk_test\""));
cli_cmd(data_dir)
.args(["folder", "delete", &folder_id, "--yes"])
.assert()
.success()
.stdout(contains(format!("Deleted folder: {folder_id}")));
assert!(query_manager(data_dir).connect().get_folder(&folder_id).is_err());
}
#[test]
fn json_create_and_update_merge_patch_round_trip() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
let create_assert = cli_cmd(data_dir)
.args([
"folder",
"create",
r#"{"workspaceId":"wk_test","name":"Json Folder"}"#,
])
.assert()
.success();
let folder_id = parse_created_id(&create_assert.get_output().stdout, "folder create");
cli_cmd(data_dir)
.args([
"folder",
"update",
&format!(r#"{{"id":"{}","description":"Folder Description"}}"#, folder_id),
])
.assert()
.success()
.stdout(contains(format!("Updated folder: {folder_id}")));
cli_cmd(data_dir)
.args(["folder", "show", &folder_id])
.assert()
.success()
.stdout(contains("\"name\": \"Json Folder\""))
.stdout(contains("\"description\": \"Folder Description\""));
}

View File

@@ -1,159 +0,0 @@
mod common;
use common::http_server::TestHttpServer;
use common::{cli_cmd, parse_created_id, query_manager, seed_request, seed_workspace};
use predicates::str::contains;
use tempfile::TempDir;
use yaak_models::models::HttpResponseState;
#[test]
fn show_and_delete_yes_round_trip() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
let create_assert = cli_cmd(data_dir)
.args([
"request",
"create",
"wk_test",
"--name",
"Smoke Test",
"--url",
"https://example.com",
])
.assert()
.success();
let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");
cli_cmd(data_dir)
.args(["request", "show", &request_id])
.assert()
.success()
.stdout(contains(format!("\"id\": \"{request_id}\"")))
.stdout(contains("\"workspaceId\": \"wk_test\""));
cli_cmd(data_dir)
.args(["request", "delete", &request_id, "--yes"])
.assert()
.success()
.stdout(contains(format!("Deleted request: {request_id}")));
assert!(query_manager(data_dir).connect().get_http_request(&request_id).is_err());
}
#[test]
fn delete_without_yes_fails_in_non_interactive_mode() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
seed_request(data_dir, "wk_test", "rq_seed_delete_noninteractive");
cli_cmd(data_dir)
.args(["request", "delete", "rq_seed_delete_noninteractive"])
.assert()
.failure()
.code(1)
.stderr(contains("Refusing to delete in non-interactive mode without --yes"));
assert!(
query_manager(data_dir).connect().get_http_request("rq_seed_delete_noninteractive").is_ok()
);
}
#[test]
fn json_create_and_update_merge_patch_round_trip() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
let create_assert = cli_cmd(data_dir)
.args([
"request",
"create",
r#"{"workspaceId":"wk_test","name":"Json Request","url":"https://example.com"}"#,
])
.assert()
.success();
let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");
cli_cmd(data_dir)
.args([
"request",
"update",
&format!(r#"{{"id":"{}","name":"Renamed Request"}}"#, request_id),
])
.assert()
.success()
.stdout(contains(format!("Updated request: {request_id}")));
cli_cmd(data_dir)
.args(["request", "show", &request_id])
.assert()
.success()
.stdout(contains("\"name\": \"Renamed Request\""))
.stdout(contains("\"url\": \"https://example.com\""));
}
#[test]
fn update_requires_id_in_json_payload() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
cli_cmd(data_dir)
.args(["request", "update", r#"{"name":"No ID"}"#])
.assert()
.failure()
.stderr(contains("request update requires a non-empty \"id\" field"));
}
#[test]
fn request_send_persists_response_body_and_events() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
seed_workspace(data_dir, "wk_test");
let server = TestHttpServer::spawn_ok("hello from integration test");
let create_assert = cli_cmd(data_dir)
.args([
"request",
"create",
"wk_test",
"--name",
"Send Test",
"--url",
&server.url,
])
.assert()
.success();
let request_id = parse_created_id(&create_assert.get_output().stdout, "request create");
cli_cmd(data_dir)
.args(["request", "send", &request_id])
.assert()
.success()
.stdout(contains("HTTP 200 OK"))
.stdout(contains("hello from integration test"));
let qm = query_manager(data_dir);
let db = qm.connect();
let responses =
db.list_http_responses_for_request(&request_id, None).expect("Failed to load responses");
assert_eq!(responses.len(), 1, "expected exactly one persisted response");
let response = &responses[0];
assert_eq!(response.status, 200);
assert!(matches!(response.state, HttpResponseState::Closed));
assert!(response.error.is_none());
let body_path =
response.body_path.as_ref().expect("expected persisted response body path").to_string();
let body = std::fs::read_to_string(&body_path).expect("Failed to read response body file");
assert_eq!(body, "hello from integration test");
let events =
db.list_http_response_events(&response.id).expect("Failed to load response events");
assert!(!events.is_empty(), "expected at least one persisted response event");
}

View File

@@ -1,59 +0,0 @@
mod common;
use common::{cli_cmd, parse_created_id, query_manager};
use predicates::str::contains;
use tempfile::TempDir;
#[test]
fn create_show_delete_round_trip() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
let create_assert =
cli_cmd(data_dir).args(["workspace", "create", "--name", "WS One"]).assert().success();
let workspace_id = parse_created_id(&create_assert.get_output().stdout, "workspace create");
cli_cmd(data_dir)
.args(["workspace", "show", &workspace_id])
.assert()
.success()
.stdout(contains(format!("\"id\": \"{workspace_id}\"")))
.stdout(contains("\"name\": \"WS One\""));
cli_cmd(data_dir)
.args(["workspace", "delete", &workspace_id, "--yes"])
.assert()
.success()
.stdout(contains(format!("Deleted workspace: {workspace_id}")));
assert!(query_manager(data_dir).connect().get_workspace(&workspace_id).is_err());
}
#[test]
fn json_create_and_update_merge_patch_round_trip() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let data_dir = temp_dir.path();
let create_assert = cli_cmd(data_dir)
.args(["workspace", "create", r#"{"name":"Json Workspace"}"#])
.assert()
.success();
let workspace_id = parse_created_id(&create_assert.get_output().stdout, "workspace create");
cli_cmd(data_dir)
.args([
"workspace",
"update",
&format!(r#"{{"id":"{}","description":"Updated via JSON"}}"#, workspace_id),
])
.assert()
.success()
.stdout(contains(format!("Updated workspace: {workspace_id}")));
cli_cmd(data_dir)
.args(["workspace", "show", &workspace_id])
.assert()
.success()
.stdout(contains("\"name\": \"Json Workspace\""))
.stdout(contains("\"description\": \"Updated via JSON\""));
}

View File

@@ -1,78 +0,0 @@
[package]
name = "yaak-app"
version = "0.0.0"
edition = "2024"
authors = ["Gregory Schier"]
publish = false
# Produce a library for mobile support
[lib]
name = "tauri_app_lib"
crate-type = ["staticlib", "cdylib", "lib"]
[features]
cargo-clippy = []
default = []
updater = []
license = ["yaak-license"]
[build-dependencies]
tauri-build = { version = "2.5.3", features = [] }
[target.'cfg(target_os = "linux")'.dependencies]
openssl-sys = { version = "0.9.105", features = ["vendored"] } # For Ubuntu installation to work
[dependencies]
charset = "0.1.5"
chrono = { workspace = true, features = ["serde"] }
cookie = "0.18.1"
eventsource-client = { git = "https://github.com/yaakapp/rust-eventsource-client", version = "0.14.0" }
http = { version = "1.2.0", default-features = false }
log = { workspace = true }
md5 = "0.8.0"
r2d2 = "0.8.10"
r2d2_sqlite = "0.25.0"
mime_guess = "2.0.5"
rand = "0.9.0"
reqwest = { workspace = true, features = ["multipart", "gzip", "brotli", "deflate", "json", "rustls-tls-manual-roots-no-provider", "socks", "http2"] }
serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true, features = ["raw_value"] }
tauri = { workspace = true, features = ["devtools", "protocol-asset"] }
tauri-plugin-clipboard-manager = "2.3.2"
tauri-plugin-deep-link = "2.4.5"
tauri-plugin-dialog = { workspace = true }
tauri-plugin-fs = "2.4.4"
tauri-plugin-log = { version = "2.7.1", features = ["colored"] }
tauri-plugin-opener = "2.5.2"
tauri-plugin-os = "2.3.2"
tauri-plugin-shell = { workspace = true }
tauri-plugin-single-instance = { version = "2.3.6", features = ["deep-link"] }
tauri-plugin-updater = "2.9.0"
tauri-plugin-window-state = "2.4.1"
thiserror = { workspace = true }
tokio = { workspace = true, features = ["sync"] }
tokio-stream = "0.1.17"
tokio-tungstenite = { version = "0.26.2", default-features = false }
url = "2"
tokio-util = { version = "0.7", features = ["codec"] }
ts-rs = { workspace = true }
uuid = "1.12.1"
yaak-api = { workspace = true }
yaak-common = { workspace = true }
yaak-tauri-utils = { workspace = true }
yaak-core = { workspace = true }
yaak = { workspace = true }
yaak-crypto = { workspace = true }
yaak-fonts = { workspace = true }
yaak-git = { workspace = true }
yaak-grpc = { workspace = true }
yaak-http = { workspace = true }
yaak-license = { workspace = true, optional = true }
yaak-mac-window = { workspace = true }
yaak-models = { workspace = true }
yaak-plugins = { workspace = true }
yaak-sse = { workspace = true }
yaak-sync = { workspace = true }
yaak-templates = { workspace = true }
yaak-tls = { workspace = true }
yaak-ws = { workspace = true }

View File

@@ -1,3 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type WatchResult = { unlistenEvent: string, };

View File

@@ -1,17 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type PluginUpdateInfo = { name: string, currentVersion: string, latestVersion: string, };
export type PluginUpdateNotification = { updateCount: number, plugins: Array<PluginUpdateInfo>, };
export type UpdateInfo = { replyEventId: string, version: string, downloaded: boolean, };
export type UpdateResponse = { "type": "ack" } | { "type": "action", action: UpdateResponseAction, };
export type UpdateResponseAction = "install" | "skip";
export type WatchResult = { unlistenEvent: string, };
export type YaakNotification = { timestamp: string, timeout: number | null, id: string, title: string | null, message: string, color: string | null, action: YaakNotificationAction | null, };
export type YaakNotificationAction = { label: string, url: string, };

View File

@@ -1,5 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type PluginUpdateInfo = { name: string, currentVersion: string, latestVersion: string, };
export type PluginUpdateNotification = { updateCount: number, plugins: Array<PluginUpdateInfo>, };

Binary file not shown (before: 2.9 KiB).

Binary file not shown (before: 5.6 KiB).

View File

@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<!-- Enable for NodeJS/V8 JIT compiler -->
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<!-- Allow loading plugins signed with different Team IDs (e.g., 1Password) -->
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
</dict>
</plist>

View File

@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
</dict>
</plist>

View File

@@ -1,6 +0,0 @@
{
"name": "@yaakapp-internal/tauri",
"private": true,
"version": "1.0.0",
"main": "bindings/index.ts"
}

View File

@@ -1,100 +0,0 @@
use crate::PluginContextExt;
use crate::error::Result;
use std::sync::Arc;
use tauri::{AppHandle, Manager, Runtime, State, WebviewWindow, command};
use yaak_crypto::manager::EncryptionManager;
use yaak_models::models::HttpRequestHeader;
use yaak_models::queries::workspaces::default_headers;
use yaak_plugins::events::GetThemesResponse;
use yaak_plugins::manager::PluginManager;
use yaak_plugins::native_template_functions::{
decrypt_secure_template_function, encrypt_secure_template_function,
};
/// Extension trait for accessing the EncryptionManager from Tauri Manager types.
pub trait EncryptionManagerExt<'a, R> {
fn crypto(&'a self) -> State<'a, EncryptionManager>;
}
impl<'a, R: Runtime, M: Manager<R>> EncryptionManagerExt<'a, R> for M {
fn crypto(&'a self) -> State<'a, EncryptionManager> {
self.state::<EncryptionManager>()
}
}
#[command]
pub(crate) async fn cmd_decrypt_template<R: Runtime>(
window: WebviewWindow<R>,
template: &str,
) -> Result<String> {
let encryption_manager = window.app_handle().state::<EncryptionManager>();
let plugin_context = window.plugin_context();
Ok(decrypt_secure_template_function(&encryption_manager, &plugin_context, template)?)
}
#[command]
pub(crate) async fn cmd_secure_template<R: Runtime>(
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
template: &str,
) -> Result<String> {
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let plugin_context = window.plugin_context();
Ok(encrypt_secure_template_function(
plugin_manager,
encryption_manager,
&plugin_context,
template,
)?)
}
#[command]
pub(crate) async fn cmd_get_themes<R: Runtime>(
window: WebviewWindow<R>,
plugin_manager: State<'_, PluginManager>,
) -> Result<Vec<GetThemesResponse>> {
Ok(plugin_manager.get_themes(&window.plugin_context()).await?)
}
#[command]
pub(crate) async fn cmd_enable_encryption<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: &str,
) -> Result<()> {
window.crypto().ensure_workspace_key(workspace_id)?;
window.crypto().reveal_workspace_key(workspace_id)?;
Ok(())
}
#[command]
pub(crate) async fn cmd_reveal_workspace_key<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: &str,
) -> Result<String> {
Ok(window.crypto().reveal_workspace_key(workspace_id)?)
}
#[command]
pub(crate) async fn cmd_set_workspace_key<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: &str,
key: &str,
) -> Result<()> {
window.crypto().set_human_key(workspace_id, key)?;
Ok(())
}
#[command]
pub(crate) async fn cmd_disable_encryption<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: &str,
) -> Result<()> {
window.crypto().disable_encryption(workspace_id)?;
Ok(())
}
#[command]
pub(crate) fn cmd_default_headers() -> Vec<HttpRequestHeader> {
default_headers()
}

View File

@@ -1,20 +0,0 @@
use mime_guess::{Mime, mime};
use std::path::Path;
use std::str::FromStr;
use tokio::fs;
pub async fn read_response_body(body_path: impl AsRef<Path>, content_type: &str) -> Option<String> {
let body = fs::read(body_path).await.ok()?;
let body_charset = parse_charset(content_type).unwrap_or("utf-8".to_string());
if let Some(decoder) = charset::Charset::for_label(body_charset.as_bytes()) {
let (cow, _real_encoding, _exist_replace) = decoder.decode(&body);
return cow.into_owned().into();
}
Some(String::from_utf8_lossy(&body).to_string())
}
fn parse_charset(content_type: &str) -> Option<String> {
let mime: Mime = Mime::from_str(content_type).ok()?;
mime.get_param(mime::CHARSET).map(|v| v.to_string())
}

View File

@@ -1,78 +0,0 @@
use serde::{Serialize, Serializer};
use std::io;
use thiserror::Error;
#[derive(Error, Debug)]
pub enum Error {
#[error(transparent)]
TemplateError(#[from] yaak_templates::error::Error),
#[error(transparent)]
ModelError(#[from] yaak_models::error::Error),
#[error(transparent)]
SyncError(#[from] yaak_sync::error::Error),
#[error(transparent)]
CryptoError(#[from] yaak_crypto::error::Error),
#[error(transparent)]
HttpError(#[from] yaak_http::error::Error),
#[error(transparent)]
GitError(#[from] yaak_git::error::Error),
#[error(transparent)]
TokioTimeoutElapsed(#[from] tokio::time::error::Elapsed),
#[error(transparent)]
WebsocketError(#[from] yaak_ws::error::Error),
#[cfg(feature = "license")]
#[error(transparent)]
LicenseError(#[from] yaak_license::error::Error),
#[error(transparent)]
PluginError(#[from] yaak_plugins::error::Error),
#[error(transparent)]
ApiError(#[from] yaak_api::Error),
#[error(transparent)]
ClipboardError(#[from] tauri_plugin_clipboard_manager::Error),
#[error(transparent)]
OpenerError(#[from] tauri_plugin_opener::Error),
#[error("Updater error: {0}")]
UpdaterError(#[from] tauri_plugin_updater::Error),
#[error("JSON error: {0}")]
JsonError(#[from] serde_json::error::Error),
#[error("Tauri error: {0}")]
TauriError(#[from] tauri::Error),
#[error("Event source error: {0}")]
EventSourceError(#[from] eventsource_client::Error),
#[error("I/O error: {0}")]
IOError(#[from] io::Error),
#[error("Request error: {0}")]
RequestError(#[from] reqwest::Error),
#[error("{0}")]
GenericError(String),
}
impl Serialize for Error {
fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
where
S: Serializer,
{
serializer.serialize_str(self.to_string().as_ref())
}
}
pub type Result<T> = std::result::Result<T, Error>;

View File

@@ -1,149 +0,0 @@
//! Tauri-specific extensions for yaak-git.
//!
//! This module provides the Tauri commands for git functionality.
use crate::error::Result;
use std::path::{Path, PathBuf};
use tauri::command;
use yaak_git::{
BranchDeleteResult, CloneResult, GitCommit, GitRemote, GitStatusSummary, PullResult,
PushResult, git_add, git_add_credential, git_add_remote, git_checkout_branch, git_clone,
git_commit, git_create_branch, git_delete_branch, git_delete_remote_branch, git_fetch_all,
git_init, git_log, git_merge_branch, git_pull, git_pull_force_reset, git_pull_merge, git_push,
git_remotes, git_rename_branch, git_reset_changes, git_rm_remote, git_status, git_unstage,
};
// NOTE: All of these commands are async to prevent blocking work from locking up the UI
#[command]
pub async fn cmd_git_checkout(dir: &Path, branch: &str, force: bool) -> Result<String> {
Ok(git_checkout_branch(dir, branch, force).await?)
}
#[command]
pub async fn cmd_git_branch(dir: &Path, branch: &str, base: Option<&str>) -> Result<()> {
Ok(git_create_branch(dir, branch, base).await?)
}
#[command]
pub async fn cmd_git_delete_branch(
dir: &Path,
branch: &str,
force: Option<bool>,
) -> Result<BranchDeleteResult> {
Ok(git_delete_branch(dir, branch, force.unwrap_or(false)).await?)
}
#[command]
pub async fn cmd_git_delete_remote_branch(dir: &Path, branch: &str) -> Result<()> {
Ok(git_delete_remote_branch(dir, branch).await?)
}
#[command]
pub async fn cmd_git_merge_branch(dir: &Path, branch: &str) -> Result<()> {
Ok(git_merge_branch(dir, branch).await?)
}
#[command]
pub async fn cmd_git_rename_branch(dir: &Path, old_name: &str, new_name: &str) -> Result<()> {
Ok(git_rename_branch(dir, old_name, new_name).await?)
}
#[command]
pub async fn cmd_git_status(dir: &Path) -> Result<GitStatusSummary> {
Ok(git_status(dir)?)
}
#[command]
pub async fn cmd_git_log(dir: &Path) -> Result<Vec<GitCommit>> {
Ok(git_log(dir)?)
}
#[command]
pub async fn cmd_git_initialize(dir: &Path) -> Result<()> {
Ok(git_init(dir)?)
}
#[command]
pub async fn cmd_git_clone(url: &str, dir: &Path) -> Result<CloneResult> {
Ok(git_clone(url, dir).await?)
}
#[command]
pub async fn cmd_git_commit(dir: &Path, message: &str) -> Result<()> {
Ok(git_commit(dir, message).await?)
}
#[command]
pub async fn cmd_git_fetch_all(dir: &Path) -> Result<()> {
Ok(git_fetch_all(dir).await?)
}
#[command]
pub async fn cmd_git_push(dir: &Path) -> Result<PushResult> {
Ok(git_push(dir).await?)
}
#[command]
pub async fn cmd_git_pull(dir: &Path) -> Result<PullResult> {
Ok(git_pull(dir).await?)
}
#[command]
pub async fn cmd_git_pull_force_reset(
dir: &Path,
remote: &str,
branch: &str,
) -> Result<PullResult> {
Ok(git_pull_force_reset(dir, remote, branch).await?)
}
#[command]
pub async fn cmd_git_pull_merge(dir: &Path, remote: &str, branch: &str) -> Result<PullResult> {
Ok(git_pull_merge(dir, remote, branch).await?)
}
#[command]
pub async fn cmd_git_add(dir: &Path, rela_paths: Vec<PathBuf>) -> Result<()> {
for path in rela_paths {
git_add(dir, &path)?;
}
Ok(())
}
#[command]
pub async fn cmd_git_unstage(dir: &Path, rela_paths: Vec<PathBuf>) -> Result<()> {
for path in rela_paths {
git_unstage(dir, &path)?;
}
Ok(())
}
#[command]
pub async fn cmd_git_reset_changes(dir: &Path) -> Result<()> {
Ok(git_reset_changes(dir).await?)
}
#[command]
pub async fn cmd_git_add_credential(
remote_url: &str,
username: &str,
password: &str,
) -> Result<()> {
Ok(git_add_credential(remote_url, username, password).await?)
}
#[command]
pub async fn cmd_git_remotes(dir: &Path) -> Result<Vec<GitRemote>> {
Ok(git_remotes(dir)?)
}
#[command]
pub async fn cmd_git_add_remote(dir: &Path, name: &str, url: &str) -> Result<GitRemote> {
Ok(git_add_remote(dir, name, url)?)
}
#[command]
pub async fn cmd_git_rm_remote(dir: &Path, name: &str) -> Result<()> {
Ok(git_rm_remote(dir, name)?)
}

View File

@@ -1,98 +0,0 @@
use std::collections::BTreeMap;
use crate::PluginContextExt;
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use KeyAndValueRef::{Ascii, Binary};
use tauri::{Manager, Runtime, WebviewWindow};
use yaak_grpc::{KeyAndValueRef, MetadataMap};
use yaak_models::models::GrpcRequest;
use yaak_plugins::events::{CallHttpAuthenticationRequest, HttpHeader};
use yaak_plugins::manager::PluginManager;
pub(crate) fn metadata_to_map(metadata: MetadataMap) -> BTreeMap<String, String> {
let mut entries = BTreeMap::new();
for r in metadata.iter() {
match r {
Ascii(k, v) => entries.insert(k.to_string(), v.to_str().unwrap().to_string()),
Binary(k, v) => entries.insert(k.to_string(), format!("{:?}", v)),
};
}
entries
}
pub(crate) fn resolve_grpc_request<R: Runtime>(
window: &WebviewWindow<R>,
request: &GrpcRequest,
) -> Result<(GrpcRequest, String)> {
let mut new_request = request.clone();
let (authentication_type, authentication, authentication_context_id) =
window.db().resolve_auth_for_grpc_request(request)?;
new_request.authentication_type = authentication_type;
new_request.authentication = authentication;
let metadata = window.db().resolve_metadata_for_grpc_request(request)?;
new_request.metadata = metadata;
Ok((new_request, authentication_context_id))
}
pub(crate) async fn build_metadata<R: Runtime>(
window: &WebviewWindow<R>,
request: &GrpcRequest,
authentication_context_id: &str,
) -> Result<BTreeMap<String, String>> {
let plugin_manager = window.state::<PluginManager>();
let mut metadata = BTreeMap::new();
// Add the rest of metadata
for h in request.metadata.clone() {
if h.name.is_empty() && h.value.is_empty() {
continue;
}
if !h.enabled {
continue;
}
metadata.insert(h.name, h.value);
}
match request.authentication_type.clone() {
None => {
// No authentication found. Not even inherited
}
Some(authentication_type) if authentication_type == "none" => {
// Explicitly no authentication
}
Some(authentication_type) => {
let auth = request.authentication.clone();
let plugin_req = CallHttpAuthenticationRequest {
context_id: format!("{:x}", md5::compute(authentication_context_id)),
values: serde_json::from_value(serde_json::to_value(&auth)?)?,
method: "POST".to_string(),
url: request.url.clone(),
headers: metadata
.iter()
.map(|(name, value)| HttpHeader {
name: name.to_string(),
value: value.to_string(),
})
.collect(),
};
let plugin_result = plugin_manager
.call_http_authentication(
&window.plugin_context(),
&authentication_type,
plugin_req,
)
.await?;
for header in plugin_result.set_headers.unwrap_or_default() {
metadata.insert(header.name, header.value);
}
}
}
Ok(metadata)
}

View File

@@ -1,74 +0,0 @@
use crate::models_ext::QueryManagerExt;
use chrono::{NaiveDateTime, Utc};
use log::debug;
use std::sync::OnceLock;
use tauri::{AppHandle, Runtime};
use yaak_models::util::UpdateSource;
const NAMESPACE: &str = "analytics";
const NUM_LAUNCHES_KEY: &str = "num_launches";
const LAST_VERSION_KEY: &str = "last_tracked_version";
const PREV_VERSION_KEY: &str = "last_tracked_version_prev";
const VERSION_SINCE_KEY: &str = "last_tracked_version_since";
#[derive(Default, Debug, Clone)]
pub struct LaunchEventInfo {
pub current_version: String,
pub previous_version: String,
pub launched_after_update: bool,
pub version_since: NaiveDateTime,
pub user_since: NaiveDateTime,
pub num_launches: i32,
}
static LAUNCH_INFO: OnceLock<LaunchEventInfo> = OnceLock::new();
pub fn get_or_upsert_launch_info<R: Runtime>(app_handle: &AppHandle<R>) -> &LaunchEventInfo {
LAUNCH_INFO.get_or_init(|| {
let now = Utc::now().naive_utc();
let mut info = LaunchEventInfo {
version_since: app_handle.db().get_key_value_dte(NAMESPACE, VERSION_SINCE_KEY, now),
current_version: app_handle.package_info().version.to_string(),
user_since: app_handle.db().get_settings().created_at,
num_launches: app_handle.db().get_key_value_int(NAMESPACE, NUM_LAUNCHES_KEY, 0) + 1,
// The rest will be set below
..Default::default()
};
app_handle
.with_tx(|tx| {
// Load the previously tracked version
let curr_db = tx.get_key_value_str(NAMESPACE, LAST_VERSION_KEY, "");
let prev_db = tx.get_key_value_str(NAMESPACE, PREV_VERSION_KEY, "");
// We just updated if the app version is different from the last tracked version we stored
if !curr_db.is_empty() && info.current_version != curr_db {
info.launched_after_update = true;
}
// If we just updated, track the previous version as the "previous" current version
if info.launched_after_update {
info.previous_version = curr_db.clone();
info.version_since = now;
} else {
info.previous_version = prev_db.clone();
}
// Rotate stored versions: move previous into the "prev" slot before overwriting
let source = &UpdateSource::Background;
tx.set_key_value_str(NAMESPACE, PREV_VERSION_KEY, &info.previous_version, source);
tx.set_key_value_str(NAMESPACE, LAST_VERSION_KEY, &info.current_version, source);
tx.set_key_value_dte(NAMESPACE, VERSION_SINCE_KEY, info.version_since, source);
tx.set_key_value_int(NAMESPACE, NUM_LAUNCHES_KEY, info.num_launches, source);
Ok(())
})
.unwrap();
debug!("Initialized launch info");
info
})
}
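
Because the computed info is cached in a OnceLock, later callers get the same snapshot without touching the database again. A minimal usage sketch, assuming an AppHandle is in scope:

let info = get_or_upsert_launch_info(&app_handle);
if info.launched_after_update {
    log::info!("Updated from {} to {}", info.previous_version, info.current_version);
}
log::debug!("Launch number {}", info.num_launches);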


@@ -1,185 +0,0 @@
use crate::PluginContextExt;
use crate::error::Error::GenericError;
use crate::error::Result;
use crate::models_ext::BlobManagerExt;
use crate::models_ext::QueryManagerExt;
use log::warn;
use std::sync::Arc;
use std::time::Instant;
use tauri::{AppHandle, Manager, Runtime, WebviewWindow};
use tokio::sync::watch::Receiver;
use yaak::send::{SendHttpRequestWithPluginsParams, send_http_request_with_plugins};
use yaak_crypto::manager::EncryptionManager;
use yaak_http::manager::HttpConnectionManager;
use yaak_models::models::{CookieJar, Environment, HttpRequest, HttpResponse, HttpResponseState};
use yaak_models::util::UpdateSource;
use yaak_plugins::events::PluginContext;
use yaak_plugins::manager::PluginManager;
/// Context for managing response state during HTTP transactions.
/// Handles both persisted responses (stored in DB) and ephemeral responses (in-memory only).
struct ResponseContext<R: Runtime> {
app_handle: AppHandle<R>,
response: HttpResponse,
update_source: UpdateSource,
}
impl<R: Runtime> ResponseContext<R> {
fn new(app_handle: AppHandle<R>, response: HttpResponse, update_source: UpdateSource) -> Self {
Self { app_handle, response, update_source }
}
/// Whether this response is persisted (has a non-empty ID)
fn is_persisted(&self) -> bool {
!self.response.id.is_empty()
}
/// Update the response state. For persisted responses, fetches from DB, applies the
/// closure, and updates the DB. For ephemeral responses, just applies the closure
/// to the in-memory response.
fn update<F>(&mut self, func: F) -> Result<()>
where
F: FnOnce(&mut HttpResponse),
{
if self.is_persisted() {
let r = self.app_handle.with_tx(|tx| {
let mut r = tx.get_http_response(&self.response.id)?;
func(&mut r);
tx.update_http_response_if_id(&r, &self.update_source)?;
Ok(r)
})?;
self.response = r;
Ok(())
} else {
func(&mut self.response);
Ok(())
}
}
/// Get the current response state
fn response(&self) -> &HttpResponse {
&self.response
}
}
pub async fn send_http_request<R: Runtime>(
window: &WebviewWindow<R>,
unrendered_request: &HttpRequest,
og_response: &HttpResponse,
environment: Option<Environment>,
cookie_jar: Option<CookieJar>,
cancelled_rx: &mut Receiver<bool>,
) -> Result<HttpResponse> {
send_http_request_with_context(
window,
unrendered_request,
og_response,
environment,
cookie_jar,
cancelled_rx,
&window.plugin_context(),
)
.await
}
pub async fn send_http_request_with_context<R: Runtime>(
window: &WebviewWindow<R>,
unrendered_request: &HttpRequest,
og_response: &HttpResponse,
environment: Option<Environment>,
cookie_jar: Option<CookieJar>,
cancelled_rx: &Receiver<bool>,
plugin_context: &PluginContext,
) -> Result<HttpResponse> {
let app_handle = window.app_handle().clone();
let update_source = UpdateSource::from_window_label(window.label());
let mut response_ctx =
ResponseContext::new(app_handle.clone(), og_response.clone(), update_source);
// Execute the inner send logic and handle errors consistently
let start = Instant::now();
let result = send_http_request_inner(
window,
unrendered_request,
environment,
cookie_jar,
cancelled_rx,
plugin_context,
&mut response_ctx,
)
.await;
match result {
Ok(response) => Ok(response),
Err(e) => {
let error = e.to_string();
let elapsed = start.elapsed().as_millis() as i32;
warn!("Failed to send request: {error:?}");
let _ = response_ctx.update(|r| {
r.state = HttpResponseState::Closed;
r.elapsed = elapsed;
if r.elapsed_headers == 0 {
r.elapsed_headers = elapsed;
}
r.error = Some(error);
});
Ok(response_ctx.response().clone())
}
}
}
async fn send_http_request_inner<R: Runtime>(
window: &WebviewWindow<R>,
unrendered_request: &HttpRequest,
environment: Option<Environment>,
cookie_jar: Option<CookieJar>,
cancelled_rx: &Receiver<bool>,
plugin_context: &PluginContext,
response_ctx: &mut ResponseContext<R>,
) -> Result<HttpResponse> {
let app_handle = window.app_handle().clone();
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let connection_manager = app_handle.state::<HttpConnectionManager>();
let environment_id = environment.map(|e| e.id);
let cookie_jar_id = cookie_jar.as_ref().map(|jar| jar.id.clone());
let response_dir = app_handle.path().app_data_dir()?.join("responses");
let result = send_http_request_with_plugins(SendHttpRequestWithPluginsParams {
query_manager: app_handle.db_manager().inner(),
blob_manager: app_handle.blob_manager().inner(),
request: unrendered_request.clone(),
environment_id: environment_id.as_deref(),
update_source: response_ctx.update_source.clone(),
cookie_jar_id,
response_dir: &response_dir,
emit_events_to: None,
existing_response: Some(response_ctx.response().clone()),
plugin_manager,
encryption_manager,
plugin_context,
cancelled_rx: Some(cancelled_rx.clone()),
connection_manager: Some(connection_manager.inner()),
})
.await
.map_err(|e| GenericError(e.to_string()))?;
Ok(result.response)
}
pub fn resolve_http_request<R: Runtime>(
window: &WebviewWindow<R>,
request: &HttpRequest,
) -> Result<(HttpRequest, String)> {
let mut new_request = request.clone();
let (authentication_type, authentication, authentication_context_id) =
window.db().resolve_auth_for_http_request(request)?;
new_request.authentication_type = authentication_type;
new_request.authentication = authentication;
let headers = window.db().resolve_headers_for_http_request(request)?;
new_request.headers = headers;
Ok((new_request, authentication_context_id))
}
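
A rough sketch of driving the sender from a command, assuming the request, window, and a placeholder response are already available; the watch channel is how callers signal cancellation to the in-flight request:

let (cancel_tx, mut cancel_rx) = tokio::sync::watch::channel(false);
let response = send_http_request(
    &window,
    &http_request,
    &HttpResponse::default(), // empty ID, so it stays ephemeral (not persisted)
    None,                     // environment
    None,                     // cookie jar
    &mut cancel_rx,
)
.await?;
// Sending `true` on cancel_tx from elsewhere signals cancellation.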


@@ -1,129 +0,0 @@
use crate::PluginContextExt;
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use log::info;
use std::collections::BTreeMap;
use std::fs::read_to_string;
use tauri::{Manager, Runtime, WebviewWindow};
use yaak_core::WorkspaceContext;
use yaak_models::models::{
Environment, Folder, GrpcRequest, HttpRequest, WebsocketRequest, Workspace,
};
use yaak_models::util::{BatchUpsertResult, UpdateSource, maybe_gen_id, maybe_gen_id_opt};
use yaak_plugins::manager::PluginManager;
use yaak_tauri_utils::window::WorkspaceWindowTrait;
pub(crate) async fn import_data<R: Runtime>(
window: &WebviewWindow<R>,
file_path: &str,
) -> Result<BatchUpsertResult> {
let plugin_manager = window.state::<PluginManager>();
let file =
read_to_string(file_path).unwrap_or_else(|_| panic!("Unable to read file {}", file_path));
let file_contents = file.as_str();
let import_result = plugin_manager.import_data(&window.plugin_context(), file_contents).await?;
let mut id_map: BTreeMap<String, String> = BTreeMap::new();
// Create WorkspaceContext from window
let ctx = WorkspaceContext {
workspace_id: window.workspace_id(),
environment_id: window.environment_id(),
cookie_jar_id: window.cookie_jar_id(),
request_id: None,
};
let resources = import_result.resources;
let workspaces: Vec<Workspace> = resources
.workspaces
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<Workspace>(&ctx, v.id.as_str(), &mut id_map);
v
})
.collect();
let environments: Vec<Environment> = resources
.environments
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<Environment>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
match (v.parent_model.as_str(), v.parent_id.clone().as_deref()) {
("folder", Some(parent_id)) => {
v.parent_id = Some(maybe_gen_id::<Folder>(&ctx, &parent_id, &mut id_map));
}
("", _) => {
// Fix any empty ones
v.parent_model = "workspace".to_string();
}
_ => {
// Parent ID only required for the folder case
v.parent_id = None;
}
};
v
})
.collect();
let folders: Vec<Folder> = resources
.folders
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<Folder>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
v.folder_id = maybe_gen_id_opt::<Folder>(&ctx, v.folder_id, &mut id_map);
v
})
.collect();
let http_requests: Vec<HttpRequest> = resources
.http_requests
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<HttpRequest>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
v.folder_id = maybe_gen_id_opt::<Folder>(&ctx, v.folder_id, &mut id_map);
v
})
.collect();
let grpc_requests: Vec<GrpcRequest> = resources
.grpc_requests
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<GrpcRequest>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
v.folder_id = maybe_gen_id_opt::<Folder>(&ctx, v.folder_id, &mut id_map);
v
})
.collect();
let websocket_requests: Vec<WebsocketRequest> = resources
.websocket_requests
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<WebsocketRequest>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
v.folder_id = maybe_gen_id_opt::<Folder>(&ctx, v.folder_id, &mut id_map);
v
})
.collect();
info!("Importing data");
let upserted = window.with_tx(|tx| {
tx.batch_upsert(
workspaces,
environments,
folders,
http_requests,
grpc_requests,
websocket_requests,
&UpdateSource::Import,
)
})?;
Ok(upserted)
}
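
The importer delegates parsing to the plugin system, remaps every imported ID through maybe_gen_id, and then writes everything in one transaction. A minimal, hypothetical call from a command handler, with an illustrative file path:

let upserted: BatchUpsertResult = import_data(&window, "/tmp/yaak-export.json").await?;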

File diff suppressed because it is too large


@@ -1,373 +0,0 @@
//! Tauri-specific extensions for yaak-models.
//!
//! This module provides the Tauri plugin initialization and extension traits
//! that allow accessing QueryManager and BlobManager from Tauri's Manager types.
use chrono::Utc;
use log::error;
use std::time::Duration;
use tauri::plugin::TauriPlugin;
use tauri::{Emitter, Manager, Runtime, State};
use tauri_plugin_dialog::{DialogExt, MessageDialogKind};
use yaak_models::blob_manager::BlobManager;
use yaak_models::db_context::DbContext;
use yaak_models::error::Result;
use yaak_models::models::{AnyModel, GraphQlIntrospection, GrpcEvent, Settings, WebsocketEvent};
use yaak_models::query_manager::QueryManager;
use yaak_models::util::UpdateSource;
const MODEL_CHANGES_RETENTION_HOURS: i64 = 1;
const MODEL_CHANGES_POLL_INTERVAL_MS: u64 = 250;
const MODEL_CHANGES_POLL_BATCH_SIZE: usize = 200;
struct ModelChangeCursor {
created_at: String,
id: i64,
}
impl ModelChangeCursor {
fn from_launch_time() -> Self {
Self {
created_at: Utc::now().naive_utc().format("%Y-%m-%d %H:%M:%S%.3f").to_string(),
id: 0,
}
}
}
fn drain_model_changes_batch<R: Runtime>(
query_manager: &QueryManager,
app_handle: &tauri::AppHandle<R>,
cursor: &mut ModelChangeCursor,
) -> bool {
let changes = match query_manager.connect().list_model_changes_since(
&cursor.created_at,
cursor.id,
MODEL_CHANGES_POLL_BATCH_SIZE,
) {
Ok(changes) => changes,
Err(err) => {
error!("Failed to poll model_changes rows: {err:?}");
return false;
}
};
if changes.is_empty() {
return false;
}
let fetched_count = changes.len();
for change in changes {
cursor.created_at = change.created_at;
cursor.id = change.id;
// Local window-originated writes are forwarded immediately from the
// in-memory model event channel.
if matches!(change.payload.update_source, UpdateSource::Window { .. }) {
continue;
}
if let Err(err) = app_handle.emit("model_write", change.payload) {
error!("Failed to emit model_write event: {err:?}");
}
}
fetched_count == MODEL_CHANGES_POLL_BATCH_SIZE
}
async fn run_model_change_poller<R: Runtime>(
query_manager: QueryManager,
app_handle: tauri::AppHandle<R>,
mut cursor: ModelChangeCursor,
) {
loop {
while drain_model_changes_batch(&query_manager, &app_handle, &mut cursor) {}
tokio::time::sleep(Duration::from_millis(MODEL_CHANGES_POLL_INTERVAL_MS)).await;
}
}
/// Extension trait for accessing the QueryManager from Tauri Manager types.
pub trait QueryManagerExt<'a, R> {
fn db_manager(&'a self) -> State<'a, QueryManager>;
fn db(&'a self) -> DbContext<'a>;
fn with_tx<F, T>(&'a self, func: F) -> Result<T>
where
F: FnOnce(&DbContext) -> Result<T>;
}
impl<'a, R: Runtime, M: Manager<R>> QueryManagerExt<'a, R> for M {
fn db_manager(&'a self) -> State<'a, QueryManager> {
self.state::<QueryManager>()
}
fn db(&'a self) -> DbContext<'a> {
let qm = self.state::<QueryManager>();
qm.inner().connect()
}
fn with_tx<F, T>(&'a self, func: F) -> Result<T>
where
F: FnOnce(&DbContext) -> Result<T>,
{
let qm = self.state::<QueryManager>();
qm.inner().with_tx(func)
}
}
/// Extension trait for accessing the BlobManager from Tauri Manager types.
pub trait BlobManagerExt<'a, R> {
fn blob_manager(&'a self) -> State<'a, BlobManager>;
fn blobs(&'a self) -> yaak_models::blob_manager::BlobContext;
}
impl<'a, R: Runtime, M: Manager<R>> BlobManagerExt<'a, R> for M {
fn blob_manager(&'a self) -> State<'a, BlobManager> {
self.state::<BlobManager>()
}
fn blobs(&'a self) -> yaak_models::blob_manager::BlobContext {
let manager = self.state::<BlobManager>();
manager.inner().connect()
}
}
// Commands for yaak-models
use tauri::WebviewWindow;
#[tauri::command]
pub(crate) fn models_upsert<R: Runtime>(
window: WebviewWindow<R>,
model: AnyModel,
) -> Result<String> {
use yaak_models::error::Error::GenericError;
let db = window.db();
let blobs = window.blob_manager();
let source = &UpdateSource::from_window_label(window.label());
let id = match model {
AnyModel::CookieJar(m) => db.upsert_cookie_jar(&m, source)?.id,
AnyModel::Environment(m) => db.upsert_environment(&m, source)?.id,
AnyModel::Folder(m) => db.upsert_folder(&m, source)?.id,
AnyModel::GrpcRequest(m) => db.upsert_grpc_request(&m, source)?.id,
AnyModel::HttpRequest(m) => db.upsert_http_request(&m, source)?.id,
AnyModel::HttpResponse(m) => db.upsert_http_response(&m, source, &blobs)?.id,
AnyModel::KeyValue(m) => db.upsert_key_value(&m, source)?.id,
AnyModel::Plugin(m) => db.upsert_plugin(&m, source)?.id,
AnyModel::Settings(m) => db.upsert_settings(&m, source)?.id,
AnyModel::WebsocketRequest(m) => db.upsert_websocket_request(&m, source)?.id,
AnyModel::Workspace(m) => db.upsert_workspace(&m, source)?.id,
AnyModel::WorkspaceMeta(m) => db.upsert_workspace_meta(&m, source)?.id,
a => return Err(GenericError(format!("Cannot upsert AnyModel {a:?}"))),
};
Ok(id)
}
#[tauri::command]
pub(crate) fn models_delete<R: Runtime>(
window: WebviewWindow<R>,
model: AnyModel,
) -> Result<String> {
use yaak_models::error::Error::GenericError;
let blobs = window.blob_manager();
// Use transaction for deletions because it might recurse
window.with_tx(|tx| {
let source = &UpdateSource::from_window_label(window.label());
let id = match model {
AnyModel::CookieJar(m) => tx.delete_cookie_jar(&m, source)?.id,
AnyModel::Environment(m) => tx.delete_environment(&m, source)?.id,
AnyModel::Folder(m) => tx.delete_folder(&m, source)?.id,
AnyModel::GrpcConnection(m) => tx.delete_grpc_connection(&m, source)?.id,
AnyModel::GrpcRequest(m) => tx.delete_grpc_request(&m, source)?.id,
AnyModel::HttpRequest(m) => tx.delete_http_request(&m, source)?.id,
AnyModel::HttpResponse(m) => tx.delete_http_response(&m, source, &blobs)?.id,
AnyModel::Plugin(m) => tx.delete_plugin(&m, source)?.id,
AnyModel::WebsocketConnection(m) => tx.delete_websocket_connection(&m, source)?.id,
AnyModel::WebsocketRequest(m) => tx.delete_websocket_request(&m, source)?.id,
AnyModel::Workspace(m) => tx.delete_workspace(&m, source)?.id,
a => return Err(GenericError(format!("Cannot delete AnyModel {a:?}"))),
};
Ok(id)
})
}
#[tauri::command]
pub(crate) fn models_duplicate<R: Runtime>(
window: WebviewWindow<R>,
model: AnyModel,
) -> Result<String> {
use yaak_models::error::Error::GenericError;
// Use transaction for duplications because it might recurse
window.with_tx(|tx| {
let source = &UpdateSource::from_window_label(window.label());
let id = match model {
AnyModel::Environment(m) => tx.duplicate_environment(&m, source)?.id,
AnyModel::Folder(m) => tx.duplicate_folder(&m, source)?.id,
AnyModel::GrpcRequest(m) => tx.duplicate_grpc_request(&m, source)?.id,
AnyModel::HttpRequest(m) => tx.duplicate_http_request(&m, source)?.id,
AnyModel::WebsocketRequest(m) => tx.duplicate_websocket_request(&m, source)?.id,
a => return Err(GenericError(format!("Cannot duplicate AnyModel {a:?}"))),
};
Ok(id)
})
}
#[tauri::command]
pub(crate) fn models_websocket_events<R: Runtime>(
app_handle: tauri::AppHandle<R>,
connection_id: &str,
) -> Result<Vec<WebsocketEvent>> {
Ok(app_handle.db().list_websocket_events(connection_id)?)
}
#[tauri::command]
pub(crate) fn models_grpc_events<R: Runtime>(
app_handle: tauri::AppHandle<R>,
connection_id: &str,
) -> Result<Vec<GrpcEvent>> {
Ok(app_handle.db().list_grpc_events(connection_id)?)
}
#[tauri::command]
pub(crate) fn models_get_settings<R: Runtime>(app_handle: tauri::AppHandle<R>) -> Result<Settings> {
Ok(app_handle.db().get_settings())
}
#[tauri::command]
pub(crate) fn models_get_graphql_introspection<R: Runtime>(
app_handle: tauri::AppHandle<R>,
request_id: &str,
) -> Result<Option<GraphQlIntrospection>> {
Ok(app_handle.db().get_graphql_introspection(request_id))
}
#[tauri::command]
pub(crate) fn models_upsert_graphql_introspection<R: Runtime>(
app_handle: tauri::AppHandle<R>,
request_id: &str,
workspace_id: &str,
content: Option<String>,
window: WebviewWindow<R>,
) -> Result<GraphQlIntrospection> {
let source = UpdateSource::from_window_label(window.label());
Ok(app_handle.db().upsert_graphql_introspection(workspace_id, request_id, content, &source)?)
}
#[tauri::command]
pub(crate) fn models_workspace_models<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: Option<&str>,
) -> Result<String> {
let db = window.db();
let mut l: Vec<AnyModel> = Vec::new();
// Add the settings
l.push(db.get_settings().into());
// Add global models
l.append(&mut db.list_workspaces()?.into_iter().map(Into::into).collect());
l.append(&mut db.list_key_values()?.into_iter().map(Into::into).collect());
l.append(&mut db.list_plugins()?.into_iter().map(Into::into).collect());
// Add the workspace children
if let Some(wid) = workspace_id {
l.append(&mut db.list_cookie_jars(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_environments_ensure_base(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_folders(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_grpc_connections(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_grpc_requests(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_http_requests(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_http_responses(wid, None)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_websocket_connections(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_websocket_requests(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_workspace_metas(wid)?.into_iter().map(Into::into).collect());
}
let j = serde_json::to_string(&l)?;
Ok(escape_str_for_webview(&j))
}
fn escape_str_for_webview(input: &str) -> String {
input
.chars()
.map(|c| {
let code = c as u32;
// ASCII passes through unchanged
if code <= 0x7F {
c.to_string()
// Other BMP characters become a single \uXXXX escape
} else if code <= 0xFFFF {
format!("\\u{:04X}", code)
// Beyond the BMP, encode as a UTF-16 surrogate pair
} else {
let high = ((code - 0x10000) >> 10) + 0xD800;
let low = ((code - 0x10000) & 0x3FF) + 0xDC00;
format!("\\u{:04X}\\u{:04X}", high, low)
}
})
.collect()
}
/// Initialize database managers as a plugin (for initialization order).
/// Commands are in the main invoke_handler.
/// This must be registered before other plugins that depend on the database.
pub fn init<R: Runtime>() -> TauriPlugin<R> {
tauri::plugin::Builder::new("yaak-models-db")
.setup(|app_handle, _api| {
let app_path = app_handle.path().app_data_dir().unwrap();
let db_path = app_path.join("db.sqlite");
let blob_path = app_path.join("blobs.sqlite");
let (query_manager, blob_manager, rx) =
match yaak_models::init_standalone(&db_path, &blob_path) {
Ok(result) => result,
Err(e) => {
app_handle
.dialog()
.message(e.to_string())
.kind(MessageDialogKind::Error)
.blocking_show();
return Err(Box::from(e.to_string()));
}
};
let db = query_manager.connect();
if let Err(err) = db.prune_model_changes_older_than_hours(MODEL_CHANGES_RETENTION_HOURS)
{
error!("Failed to prune model_changes rows on startup: {err:?}");
}
// Only stream writes that happen after this app launch.
let cursor = ModelChangeCursor::from_launch_time();
let poll_query_manager = query_manager.clone();
app_handle.manage(query_manager);
app_handle.manage(blob_manager);
// Poll model_changes so all writers (including external CLI processes) update the UI.
let app_handle_poll = app_handle.clone();
let query_manager = poll_query_manager;
tauri::async_runtime::spawn(async move {
run_model_change_poller(query_manager, app_handle_poll, cursor).await;
});
// Fast path for local app writes initiated by frontend windows. This keeps the
// current sync-model UX snappy, while DB polling handles external writers (CLI).
let app_handle_local = app_handle.clone();
tauri::async_runtime::spawn(async move {
for payload in rx {
if !matches!(payload.update_source, UpdateSource::Window { .. }) {
continue;
}
if let Err(err) = app_handle_local.emit("model_write", payload) {
error!("Failed to emit local model_write event: {err:?}");
}
}
});
Ok(())
})
.build()
}
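
The extension traits above are what the migrated commands build on: any Tauri Manager (AppHandle, WebviewWindow, and so on) can reach the database through db(), with_tx(), or blobs(). A small illustrative command, not part of the original module:

#[tauri::command]
fn count_workspaces<R: Runtime>(app_handle: tauri::AppHandle<R>) -> Result<usize> {
    // db() resolves the managed QueryManager and opens a connection.
    Ok(app_handle.db().list_workspaces()?.len())
}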


@@ -1,173 +0,0 @@
use crate::error::Result;
use crate::history::get_or_upsert_launch_info;
use crate::models_ext::QueryManagerExt;
use chrono::{DateTime, Utc};
use log::{debug, info};
use reqwest::Method;
use serde::{Deserialize, Serialize};
use std::time::Instant;
use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow};
use ts_rs::TS;
use yaak_api::yaak_api_client;
use yaak_common::platform::get_os_str;
use yaak_models::util::UpdateSource;
// Check for updates every hour
const MAX_UPDATE_CHECK_SECONDS: u64 = 60 * 60;
const KV_NAMESPACE: &str = "notifications";
const KV_KEY: &str = "seen";
// Create updater struct
pub struct YaakNotifier {
last_check: Option<Instant>,
}
#[derive(Debug, Clone, Serialize, Deserialize, Default, TS)]
#[serde(default, rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct YaakNotification {
timestamp: DateTime<Utc>,
timeout: Option<f64>,
id: String,
title: Option<String>,
message: String,
color: Option<String>,
action: Option<YaakNotificationAction>,
}
#[derive(Debug, Clone, Serialize, Deserialize, Default, TS)]
#[serde(default, rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct YaakNotificationAction {
label: String,
url: String,
}
impl YaakNotifier {
pub fn new() -> Self {
Self { last_check: None }
}
pub async fn seen<R: Runtime>(&mut self, window: &WebviewWindow<R>, id: &str) -> Result<()> {
let app_handle = window.app_handle();
let mut seen = get_kv(app_handle).await?;
seen.push(id.to_string());
debug!("Marked notification as seen {}", id);
let seen_json = serde_json::to_string(&seen)?;
window.db().set_key_value_raw(
KV_NAMESPACE,
KV_KEY,
seen_json.as_str(),
&UpdateSource::from_window_label(window.label()),
);
Ok(())
}
pub async fn maybe_check<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<()> {
let app_handle = window.app_handle();
if let Some(i) = self.last_check
&& i.elapsed().as_secs() < MAX_UPDATE_CHECK_SECONDS
{
return Ok(());
}
self.last_check = Some(Instant::now());
if !app_handle.db().get_settings().check_notifications {
info!("Notifications are disabled. Skipping check.");
return Ok(());
}
debug!("Checking for notifications");
#[cfg(feature = "license")]
let license_check = {
use yaak_license::{LicenseCheckStatus, check_license};
match check_license(window).await {
Ok(LicenseCheckStatus::PersonalUse { .. }) => "personal",
Ok(LicenseCheckStatus::Active { .. }) => "commercial",
Ok(LicenseCheckStatus::PastDue { .. }) => "past_due",
Ok(LicenseCheckStatus::Inactive { .. }) => "invalid_license",
Ok(LicenseCheckStatus::Trialing { .. }) => "trialing",
Ok(LicenseCheckStatus::Expired { .. }) => "expired",
Ok(LicenseCheckStatus::Error { .. }) => "error",
Err(_) => "unknown",
}
.to_string()
};
#[cfg(not(feature = "license"))]
let license_check = "disabled".to_string();
let launch_info = get_or_upsert_launch_info(app_handle);
let app_version = app_handle.package_info().version.to_string();
let req = yaak_api_client(&app_version)?
.request(Method::GET, "https://notify.yaak.app/notifications")
.query(&[
("version", &launch_info.current_version),
("version_prev", &launch_info.previous_version),
("launches", &launch_info.num_launches.to_string()),
("installed", &launch_info.user_since.format("%Y-%m-%d").to_string()),
("license", &license_check),
("updates", &get_updater_status(app_handle).to_string()),
("platform", &get_os_str().to_string()),
]);
let resp = req.send().await?;
if resp.status() != 200 {
debug!("Skipping notification status code {}", resp.status());
return Ok(());
}
for notification in resp.json::<Vec<YaakNotification>>().await? {
let seen = get_kv(app_handle).await?;
if seen.contains(&notification.id) {
debug!("Already seen notification {}", notification.id);
continue;
}
debug!("Got notification {:?}", notification);
let _ = app_handle.emit_to(window.label(), "notification", notification.clone());
break; // Only show one notification
}
Ok(())
}
}
async fn get_kv<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Vec<String>> {
match app_handle.db().get_key_value_raw("notifications", "seen") {
None => Ok(Vec::new()),
Some(v) => Ok(serde_json::from_str(&v.value)?),
}
}
#[allow(unused)]
fn get_updater_status<R: Runtime>(app_handle: &AppHandle<R>) -> &'static str {
#[cfg(not(feature = "updater"))]
{
// Updater is not enabled as a Rust feature
return "missing";
}
#[cfg(all(feature = "updater", target_os = "linux"))]
{
let settings = app_handle.db().get_settings();
if !settings.autoupdate {
// Updates are explicitly disabled
"disabled"
} else if std::env::var("APPIMAGE").is_err() {
// Updates are enabled, but unsupported
"unsupported"
} else {
// Updates are enabled and supported
"enabled"
}
}
#[cfg(all(feature = "updater", not(target_os = "linux")))]
{
let settings = app_handle.db().get_settings();
if settings.autoupdate { "enabled" } else { "disabled" }
}
}
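
A minimal sketch of poking the notifier, similar to how the PluginUpdater is poked on window focus later in this diff; errors are logged rather than surfaced:

let mut notifier = YaakNotifier::new();
if let Err(e) = notifier.maybe_check(&window).await {
    log::warn!("Failed to check notifications: {e:?}");
}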


@@ -1,542 +0,0 @@
use crate::error::Result;
use crate::http_request::send_http_request_with_context;
use crate::models_ext::BlobManagerExt;
use crate::models_ext::QueryManagerExt;
use crate::render::{render_grpc_request, render_http_request, render_json_value};
use crate::window::{CreateWindowConfig, create_window};
use crate::{
call_frontend, cookie_jar_from_window, environment_from_window, get_window_from_plugin_context,
workspace_from_window,
};
use chrono::Utc;
use cookie::Cookie;
use log::error;
use std::sync::Arc;
use tauri::{AppHandle, Emitter, Listener, Manager, Runtime};
use tauri_plugin_clipboard_manager::ClipboardExt;
use tauri_plugin_opener::OpenerExt;
use yaak_crypto::manager::EncryptionManager;
use yaak_models::models::{AnyModel, HttpResponse, Plugin};
use yaak_models::queries::any_request::AnyRequest;
use yaak_models::util::UpdateSource;
use yaak_plugins::error::Error::PluginErr;
use yaak_plugins::events::{
Color, DeleteKeyValueResponse, EmptyPayload, ErrorResponse, FindHttpResponsesResponse,
GetCookieValueResponse, GetHttpRequestByIdResponse, GetKeyValueResponse, Icon, InternalEvent,
InternalEventPayload, ListCookieNamesResponse, ListHttpRequestsResponse,
ListWorkspacesResponse, RenderGrpcRequestResponse, RenderHttpRequestResponse,
SendHttpRequestResponse, SetKeyValueResponse, ShowToastRequest, TemplateRenderResponse,
WindowInfoResponse, WindowNavigateEvent, WorkspaceInfo,
};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::plugin_handle::PluginHandle;
use yaak_plugins::template_callback::PluginTemplateCallback;
use yaak_tauri_utils::window::WorkspaceWindowTrait;
use yaak_templates::{RenderErrorBehavior, RenderOptions};
pub(crate) async fn handle_plugin_event<R: Runtime>(
app_handle: &AppHandle<R>,
event: &InternalEvent,
plugin_handle: &PluginHandle,
) -> Result<Option<InternalEventPayload>> {
// log::debug!("Got event to app {event:?}");
let plugin_context = event.context.to_owned();
match event.clone().payload {
InternalEventPayload::CopyTextRequest(req) => {
app_handle.clipboard().write_text(req.text.as_str())?;
Ok(Some(InternalEventPayload::CopyTextResponse(EmptyPayload {})))
}
InternalEventPayload::ShowToastRequest(req) => {
match plugin_context.label {
Some(label) => app_handle.emit_to(label, "show_toast", req)?,
None => app_handle.emit("show_toast", req)?,
};
Ok(Some(InternalEventPayload::ShowToastResponse(EmptyPayload {})))
}
InternalEventPayload::PromptTextRequest(_) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
Ok(call_frontend(&window, event).await)
}
InternalEventPayload::PromptFormRequest(_) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
if event.reply_id.is_some() {
// Follow-up update from plugin runtime with resolved inputs — forward to frontend
window.emit_to(window.label(), "plugin_event", event.clone())?;
Ok(None)
} else {
// Initial request — set up bidirectional communication
window.emit_to(window.label(), "plugin_event", event.clone()).unwrap();
let event_id = event.id.clone();
let plugin_handle = plugin_handle.clone();
let plugin_context = plugin_context.clone();
let window = window.clone();
// Spawn async task to handle bidirectional form communication
tauri::async_runtime::spawn(async move {
let (tx, mut rx) = tokio::sync::mpsc::channel::<InternalEvent>(128);
// Listen for replies from the frontend
let listener_id = window.listen(event_id, move |ev: tauri::Event| {
let resp: InternalEvent = serde_json::from_str(ev.payload()).unwrap();
let _ = tx.try_send(resp);
});
// Forward each reply to the plugin runtime
while let Some(resp) = rx.recv().await {
let is_done = matches!(
&resp.payload,
InternalEventPayload::PromptFormResponse(r) if r.done.unwrap_or(false)
);
let event_to_send = plugin_handle.build_event_to_send(
&plugin_context,
&resp.payload,
Some(resp.reply_id.unwrap_or_default()),
);
if let Err(e) = plugin_handle.send(&event_to_send).await {
log::warn!("Failed to forward form response to plugin: {:?}", e);
}
if is_done {
break;
}
}
window.unlisten(listener_id);
});
Ok(None)
}
}
InternalEventPayload::FindHttpResponsesRequest(req) => {
let http_responses = app_handle
.db()
.list_http_responses_for_request(&req.request_id, req.limit.map(|l| l as u64))
.unwrap_or_default();
Ok(Some(InternalEventPayload::FindHttpResponsesResponse(FindHttpResponsesResponse {
http_responses,
})))
}
InternalEventPayload::ListHttpRequestsRequest(req) => {
let w = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace = workspace_from_window(&w)
.ok_or(PluginErr("Failed to get workspace from window".into()))?;
let http_requests = if let Some(folder_id) = req.folder_id {
app_handle.db().list_http_requests_for_folder_recursive(&folder_id)?
} else {
app_handle.db().list_http_requests(&workspace.id)?
};
Ok(Some(InternalEventPayload::ListHttpRequestsResponse(ListHttpRequestsResponse {
http_requests,
})))
}
InternalEventPayload::ListFoldersRequest(_req) => {
let w = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace = workspace_from_window(&w)
.ok_or(PluginErr("Failed to get workspace from window".into()))?;
let folders = app_handle.db().list_folders(&workspace.id)?;
Ok(Some(InternalEventPayload::ListFoldersResponse(
yaak_plugins::events::ListFoldersResponse { folders },
)))
}
InternalEventPayload::UpsertModelRequest(req) => {
use AnyModel::*;
let model = match &req.model {
HttpRequest(m) => {
HttpRequest(app_handle.db().upsert_http_request(m, &UpdateSource::Plugin)?)
}
GrpcRequest(m) => {
GrpcRequest(app_handle.db().upsert_grpc_request(m, &UpdateSource::Plugin)?)
}
WebsocketRequest(m) => WebsocketRequest(
app_handle.db().upsert_websocket_request(m, &UpdateSource::Plugin)?,
),
Folder(m) => Folder(app_handle.db().upsert_folder(m, &UpdateSource::Plugin)?),
Environment(m) => {
Environment(app_handle.db().upsert_environment(m, &UpdateSource::Plugin)?)
}
Workspace(m) => {
Workspace(app_handle.db().upsert_workspace(m, &UpdateSource::Plugin)?)
}
_ => {
return Err(PluginErr("Upsert not supported for this model type".into()).into());
}
};
Ok(Some(InternalEventPayload::UpsertModelResponse(
yaak_plugins::events::UpsertModelResponse { model },
)))
}
InternalEventPayload::DeleteModelRequest(req) => {
let model = match req.model.as_str() {
"http_request" => AnyModel::HttpRequest(
app_handle.db().delete_http_request_by_id(&req.id, &UpdateSource::Plugin)?,
),
"grpc_request" => AnyModel::GrpcRequest(
app_handle.db().delete_grpc_request_by_id(&req.id, &UpdateSource::Plugin)?,
),
"websocket_request" => AnyModel::WebsocketRequest(
app_handle
.db()
.delete_websocket_request_by_id(&req.id, &UpdateSource::Plugin)?,
),
"folder" => AnyModel::Folder(
app_handle.db().delete_folder_by_id(&req.id, &UpdateSource::Plugin)?,
),
"environment" => AnyModel::Environment(
app_handle.db().delete_environment_by_id(&req.id, &UpdateSource::Plugin)?,
),
_ => {
return Err(PluginErr("Delete not supported for this model type".into()).into());
}
};
Ok(Some(InternalEventPayload::DeleteModelResponse(
yaak_plugins::events::DeleteModelResponse { model },
)))
}
InternalEventPayload::GetHttpRequestByIdRequest(req) => {
let http_request = app_handle.db().get_http_request(&req.id).ok();
Ok(Some(InternalEventPayload::GetHttpRequestByIdResponse(GetHttpRequestByIdResponse {
http_request,
})))
}
InternalEventPayload::RenderGrpcRequestRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace =
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
let environment_id = environment_from_window(&window).map(|e| e.id);
let environment_chain = window.db().resolve_environments(
&workspace.id,
req.grpc_request.folder_id.as_deref(),
environment_id.as_deref(),
)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let cb = PluginTemplateCallback::new(
plugin_manager,
encryption_manager,
&plugin_context,
req.purpose,
);
let opt = RenderOptions { error_behavior: RenderErrorBehavior::Throw };
let grpc_request =
render_grpc_request(&req.grpc_request, environment_chain, &cb, &opt).await?;
Ok(Some(InternalEventPayload::RenderGrpcRequestResponse(RenderGrpcRequestResponse {
grpc_request,
})))
}
InternalEventPayload::RenderHttpRequestRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace =
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
let environment_id = environment_from_window(&window).map(|e| e.id);
let environment_chain = window.db().resolve_environments(
&workspace.id,
req.http_request.folder_id.as_deref(),
environment_id.as_deref(),
)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let cb = PluginTemplateCallback::new(
plugin_manager,
encryption_manager,
&plugin_context,
req.purpose,
);
let opt = &RenderOptions { error_behavior: RenderErrorBehavior::Throw };
let http_request =
render_http_request(&req.http_request, environment_chain, &cb, &opt).await?;
Ok(Some(InternalEventPayload::RenderHttpRequestResponse(RenderHttpRequestResponse {
http_request,
})))
}
InternalEventPayload::TemplateRenderRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace =
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
let environment_id = environment_from_window(&window).map(|e| e.id);
let folder_id = if let Some(id) = window.request_id() {
match window.db().get_any_request(&id) {
Ok(AnyRequest::HttpRequest(r)) => r.folder_id,
Ok(AnyRequest::GrpcRequest(r)) => r.folder_id,
Ok(AnyRequest::WebsocketRequest(r)) => r.folder_id,
Err(_) => None,
}
} else {
None
};
let environment_chain = window.db().resolve_environments(
&workspace.id,
folder_id.as_deref(),
environment_id.as_deref(),
)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let cb = PluginTemplateCallback::new(
plugin_manager,
encryption_manager,
&plugin_context,
req.purpose,
);
let opt = RenderOptions { error_behavior: RenderErrorBehavior::Throw };
let data = render_json_value(req.data, environment_chain, &cb, &opt).await?;
Ok(Some(InternalEventPayload::TemplateRenderResponse(TemplateRenderResponse { data })))
}
InternalEventPayload::ErrorResponse(resp) => {
error!("Plugin error: {}: {:?}", resp.error, resp);
let toast_event = plugin_handle.build_event_to_send(
&plugin_context,
&InternalEventPayload::ShowToastRequest(ShowToastRequest {
message: format!(
"Plugin error from {}: {}",
plugin_handle.info().name,
resp.error
),
color: Some(Color::Danger),
timeout: Some(30000),
..Default::default()
}),
None,
);
Box::pin(handle_plugin_event(app_handle, &toast_event, plugin_handle)).await
}
InternalEventPayload::ReloadResponse(req) => {
let plugins = app_handle.db().list_plugins()?;
for plugin in plugins {
if plugin.directory != plugin_handle.dir {
continue;
}
let new_plugin = Plugin {
updated_at: Utc::now().naive_utc(), // TODO: Add reloaded_at field to use instead
..plugin
};
app_handle.db().upsert_plugin(&new_plugin, &UpdateSource::Plugin)?;
}
if !req.silent {
let info = plugin_handle.info();
let toast_event = plugin_handle.build_event_to_send(
&plugin_context,
&InternalEventPayload::ShowToastRequest(ShowToastRequest {
message: format!("Reloaded plugin {}@{}", info.name, info.version),
icon: Some(Icon::Info),
timeout: Some(3000),
..Default::default()
}),
None,
);
Box::pin(handle_plugin_event(app_handle, &toast_event, plugin_handle)).await
} else {
Ok(None)
}
}
InternalEventPayload::SendHttpRequestRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let mut http_request = req.http_request;
let workspace =
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
let cookie_jar = cookie_jar_from_window(&window);
let environment = environment_from_window(&window);
if http_request.workspace_id.is_empty() {
http_request.workspace_id = workspace.id;
}
let http_response = if http_request.id.is_empty() {
HttpResponse::default()
} else {
let blobs = window.blob_manager();
window.db().upsert_http_response(
&HttpResponse {
request_id: http_request.id.clone(),
workspace_id: http_request.workspace_id.clone(),
..Default::default()
},
&UpdateSource::Plugin,
&blobs,
)?
};
let http_response = send_http_request_with_context(
&window,
&http_request,
&http_response,
environment,
cookie_jar,
&mut tokio::sync::watch::channel(false).1, // No-op cancel channel
&plugin_context,
)
.await?;
Ok(Some(InternalEventPayload::SendHttpRequestResponse(SendHttpRequestResponse {
http_response,
})))
}
InternalEventPayload::OpenWindowRequest(req) => {
let (navigation_tx, mut navigation_rx) = tokio::sync::mpsc::channel(128);
let (close_tx, mut close_rx) = tokio::sync::mpsc::channel(128);
let win_config = CreateWindowConfig {
url: &req.url,
label: &req.label,
title: &req.title.clone().unwrap_or_default(),
navigation_tx: Some(navigation_tx),
close_tx: Some(close_tx),
inner_size: req.size.clone().map(|s| (s.width, s.height)),
data_dir_key: req.data_dir_key.clone(),
..Default::default()
};
if let Err(e) = create_window(app_handle, win_config) {
let error_event = plugin_handle.build_event_to_send(
&plugin_context,
&InternalEventPayload::ErrorResponse(ErrorResponse {
error: format!("Failed to create window: {:?}", e),
}),
None,
);
return Box::pin(handle_plugin_event(app_handle, &error_event, plugin_handle))
.await;
}
{
let event_id = event.id.clone();
let plugin_handle = plugin_handle.clone();
let plugin_context = plugin_context.clone();
tauri::async_runtime::spawn(async move {
while let Some(url) = navigation_rx.recv().await {
let url = url.to_string();
let event_to_send = plugin_handle.build_event_to_send(
&plugin_context, // NOTE: Sending existing context on purpose here
&InternalEventPayload::WindowNavigateEvent(WindowNavigateEvent { url }),
Some(event_id.clone()),
);
plugin_handle.send(&event_to_send).await.unwrap();
}
});
}
{
let event_id = event.id.clone();
let plugin_handle = plugin_handle.clone();
let plugin_context = plugin_context.clone();
tauri::async_runtime::spawn(async move {
while let Some(_) = close_rx.recv().await {
let event_to_send = plugin_handle.build_event_to_send(
&plugin_context,
&InternalEventPayload::WindowCloseEvent,
Some(event_id.clone()),
);
plugin_handle.send(&event_to_send).await.unwrap();
}
});
}
Ok(None)
}
InternalEventPayload::CloseWindowRequest(req) => {
if let Some(window) = app_handle.webview_windows().get(&req.label) {
window.close()?;
}
Ok(None)
}
InternalEventPayload::OpenExternalUrlRequest(req) => {
app_handle.opener().open_url(&req.url, None::<&str>)?;
Ok(Some(InternalEventPayload::OpenExternalUrlResponse(EmptyPayload {})))
}
InternalEventPayload::SetKeyValueRequest(req) => {
let name = plugin_handle.info().name;
app_handle.db().set_plugin_key_value(&name, &req.key, &req.value);
Ok(Some(InternalEventPayload::SetKeyValueResponse(SetKeyValueResponse {})))
}
InternalEventPayload::GetKeyValueRequest(req) => {
let name = plugin_handle.info().name;
let value = app_handle.db().get_plugin_key_value(&name, &req.key).map(|v| v.value);
Ok(Some(InternalEventPayload::GetKeyValueResponse(GetKeyValueResponse { value })))
}
InternalEventPayload::DeleteKeyValueRequest(req) => {
let name = plugin_handle.info().name;
let deleted = app_handle.db().delete_plugin_key_value(&name, &req.key)?;
Ok(Some(InternalEventPayload::DeleteKeyValueResponse(DeleteKeyValueResponse {
deleted,
})))
}
InternalEventPayload::ListCookieNamesRequest(_req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let names = match cookie_jar_from_window(&window) {
None => Vec::new(),
Some(j) => j
.cookies
.into_iter()
.filter_map(|c| Cookie::parse(c.raw_cookie).ok().map(|c| c.name().to_string()))
.collect(),
};
Ok(Some(InternalEventPayload::ListCookieNamesResponse(ListCookieNamesResponse {
names,
})))
}
InternalEventPayload::GetCookieValueRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let value = match cookie_jar_from_window(&window) {
None => None,
Some(j) => j.cookies.into_iter().find_map(|c| match Cookie::parse(c.raw_cookie) {
Ok(c) if c.name().to_string().eq(&req.name) => {
Some(c.value_trimmed().to_string())
}
_ => None,
}),
};
Ok(Some(InternalEventPayload::GetCookieValueResponse(GetCookieValueResponse { value })))
}
InternalEventPayload::WindowInfoRequest(req) => {
let w = app_handle
.get_webview_window(&req.label)
.ok_or(PluginErr(format!("Failed to find window for {}", req.label)))?;
// Actually look up the data so we never return an invalid ID
let environment_id = environment_from_window(&w).map(|m| m.id);
let workspace_id = workspace_from_window(&w).map(|m| m.id);
let request_id =
match app_handle.db().get_any_request(&w.request_id().unwrap_or_default()) {
Ok(AnyRequest::HttpRequest(r)) => Some(r.id),
Ok(AnyRequest::WebsocketRequest(r)) => Some(r.id),
Ok(AnyRequest::GrpcRequest(r)) => Some(r.id),
Err(_) => None,
};
Ok(Some(InternalEventPayload::WindowInfoResponse(WindowInfoResponse {
label: w.label().to_string(),
request_id,
workspace_id,
environment_id,
})))
}
InternalEventPayload::ListWorkspacesRequest(_) => {
let mut workspaces = Vec::new();
for (_, window) in app_handle.webview_windows() {
if let Some(workspace) = workspace_from_window(&window) {
workspaces.push(WorkspaceInfo {
id: workspace.id.clone(),
name: workspace.name.clone(),
label: window.label().to_string(),
});
}
}
Ok(Some(InternalEventPayload::ListWorkspacesResponse(ListWorkspacesResponse {
workspaces,
})))
}
_ => Ok(None),
}
}


@@ -1,364 +0,0 @@
//! Tauri-specific plugin management code.
//!
//! This module contains all Tauri integration for the plugin system:
//! - Plugin initialization and lifecycle management
//! - Tauri commands for plugin search/install/uninstall
//! - Plugin update checking
use crate::PluginContextExt;
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use log::{error, info, warn};
use serde::Serialize;
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::time::{Duration, Instant};
use tauri::path::BaseDirectory;
use tauri::plugin::{Builder, TauriPlugin};
use tauri::{
AppHandle, Emitter, Manager, RunEvent, Runtime, State, WebviewWindow, WindowEvent, command,
is_dev,
};
use tokio::sync::Mutex;
use ts_rs::TS;
use yaak_api::yaak_api_client;
use yaak_models::models::Plugin;
use yaak_models::util::UpdateSource;
use yaak_plugins::api::{
PluginNameVersion, PluginSearchResponse, PluginUpdatesResponse, check_plugin_updates,
search_plugins,
};
use yaak_plugins::events::{Color, Icon, PluginContext, ShowToastRequest};
use yaak_plugins::install::{delete_and_uninstall, download_and_install};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::plugin_meta::get_plugin_meta;
static EXITING: AtomicBool = AtomicBool::new(false);
// ============================================================================
// Plugin Updater
// ============================================================================
const MAX_UPDATE_CHECK_HOURS: u64 = 12;
pub struct PluginUpdater {
last_check: Option<Instant>,
}
#[derive(Debug, Clone, PartialEq, Serialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct PluginUpdateNotification {
pub update_count: usize,
pub plugins: Vec<PluginUpdateInfo>,
}
#[derive(Debug, Clone, PartialEq, Serialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct PluginUpdateInfo {
pub name: String,
pub current_version: String,
pub latest_version: String,
}
impl PluginUpdater {
pub fn new() -> Self {
Self { last_check: None }
}
pub async fn check_now<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<bool> {
self.last_check = Some(Instant::now());
info!("Checking for plugin updates");
let app_version = window.app_handle().package_info().version.to_string();
let http_client = yaak_api_client(&app_version)?;
let plugins = window.app_handle().db().list_plugins()?;
let updates = check_plugin_updates(&http_client, plugins.clone()).await?;
if updates.plugins.is_empty() {
info!("No plugin updates available");
return Ok(false);
}
// Get current plugin versions to build notification
let mut update_infos = Vec::new();
for update in &updates.plugins {
if let Some(plugin) = plugins.iter().find(|p| {
if let Ok(meta) = get_plugin_meta(&std::path::Path::new(&p.directory)) {
meta.name == update.name
} else {
false
}
}) {
if let Ok(meta) = get_plugin_meta(&std::path::Path::new(&plugin.directory)) {
update_infos.push(PluginUpdateInfo {
name: update.name.clone(),
current_version: meta.version,
latest_version: update.version.clone(),
});
}
}
}
let notification =
PluginUpdateNotification { update_count: update_infos.len(), plugins: update_infos };
info!("Found {} plugin update(s)", notification.update_count);
if let Err(e) = window.emit_to(window.label(), "plugin_updates_available", &notification) {
error!("Failed to emit plugin_updates_available event: {}", e);
}
Ok(true)
}
pub async fn maybe_check<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<bool> {
let update_period_seconds = MAX_UPDATE_CHECK_HOURS * 60 * 60;
if let Some(i) = self.last_check
&& i.elapsed().as_secs() < update_period_seconds
{
return Ok(false);
}
self.check_now(window).await
}
}
// ============================================================================
// Tauri Commands
// ============================================================================
#[command]
pub async fn cmd_plugins_search<R: Runtime>(
app_handle: AppHandle<R>,
query: &str,
) -> Result<PluginSearchResponse> {
let app_version = app_handle.package_info().version.to_string();
let http_client = yaak_api_client(&app_version)?;
Ok(search_plugins(&http_client, query).await?)
}
#[command]
pub async fn cmd_plugins_install<R: Runtime>(
window: WebviewWindow<R>,
name: &str,
version: Option<String>,
) -> Result<()> {
let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
let app_version = window.app_handle().package_info().version.to_string();
let http_client = yaak_api_client(&app_version)?;
let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
let plugin_context = window.plugin_context();
download_and_install(
plugin_manager,
&query_manager,
&http_client,
&plugin_context,
name,
version,
)
.await?;
Ok(())
}
#[command]
pub async fn cmd_plugins_uninstall<R: Runtime>(
plugin_id: &str,
window: WebviewWindow<R>,
) -> Result<Plugin> {
let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
let plugin_context = window.plugin_context();
Ok(delete_and_uninstall(plugin_manager, &query_manager, &plugin_context, plugin_id).await?)
}
#[command]
pub async fn cmd_plugins_updates<R: Runtime>(
app_handle: AppHandle<R>,
) -> Result<PluginUpdatesResponse> {
let app_version = app_handle.package_info().version.to_string();
let http_client = yaak_api_client(&app_version)?;
let plugins = app_handle.db().list_plugins()?;
Ok(check_plugin_updates(&http_client, plugins).await?)
}
#[command]
pub async fn cmd_plugins_update_all<R: Runtime>(
window: WebviewWindow<R>,
) -> Result<Vec<PluginNameVersion>> {
let app_version = window.app_handle().package_info().version.to_string();
let http_client = yaak_api_client(&app_version)?;
let plugins = window.db().list_plugins()?;
// Get list of available updates (already filtered to only registry plugins)
let updates = check_plugin_updates(&http_client, plugins).await?;
if updates.plugins.is_empty() {
return Ok(Vec::new());
}
let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
let plugin_context = window.plugin_context();
let mut updated = Vec::new();
for update in updates.plugins {
info!("Updating plugin: {} to version {}", update.name, update.version);
match download_and_install(
plugin_manager.clone(),
&query_manager,
&http_client,
&plugin_context,
&update.name,
Some(update.version.clone()),
)
.await
{
Ok(_) => {
info!("Successfully updated plugin: {}", update.name);
updated.push(update.clone());
}
Err(e) => {
log::error!("Failed to update plugin {}: {:?}", update.name, e);
}
}
}
Ok(updated)
}
// ============================================================================
// Tauri Plugin Initialization
// ============================================================================
pub fn init<R: Runtime>() -> TauriPlugin<R> {
Builder::new("yaak-plugins")
.setup(|app_handle, _| {
// Resolve paths for plugin manager
let vendored_plugin_dir = app_handle
.path()
.resolve("vendored/plugins", BaseDirectory::Resource)
.expect("failed to resolve plugin directory resource");
let installed_plugin_dir = app_handle
.path()
.app_data_dir()
.expect("failed to get app data dir")
.join("installed-plugins");
#[cfg(target_os = "windows")]
let node_bin_name = "yaaknode.exe";
#[cfg(not(target_os = "windows"))]
let node_bin_name = "yaaknode";
let node_bin_path = app_handle
.path()
.resolve(format!("vendored/node/{}", node_bin_name), BaseDirectory::Resource)
.expect("failed to resolve yaaknode binary");
let plugin_runtime_main = app_handle
.path()
.resolve("vendored/plugin-runtime", BaseDirectory::Resource)
.expect("failed to resolve plugin runtime")
.join("index.cjs");
let dev_mode = is_dev();
// Create plugin manager asynchronously
let app_handle_clone = app_handle.clone();
tauri::async_runtime::block_on(async move {
let manager = PluginManager::new(
vendored_plugin_dir,
installed_plugin_dir,
node_bin_path,
plugin_runtime_main,
dev_mode,
)
.await;
// Initialize all plugins after manager is created
let bundled_dirs = manager
.list_bundled_plugin_dirs()
.await
.expect("Failed to list bundled plugins");
// Ensure all bundled plugins make it into the database
let db = app_handle_clone.db();
for dir in &bundled_dirs {
if db.get_plugin_by_directory(dir).is_none() {
db.upsert_plugin(
&Plugin {
directory: dir.clone(),
enabled: true,
url: None,
..Default::default()
},
&UpdateSource::Background,
)
.expect("Failed to upsert bundled plugin");
}
}
// Get all plugins from database and initialize
let plugins = db.list_plugins().expect("Failed to list plugins from database");
drop(db); // Explicitly drop the connection before await
let errors =
manager.initialize_all_plugins(plugins, &PluginContext::new_empty()).await;
// Show toast for any failed plugins
for (plugin_dir, error_msg) in errors {
let plugin_name = plugin_dir.split('/').last().unwrap_or(&plugin_dir);
let toast = ShowToastRequest {
message: format!("Failed to start plugin '{}': {}", plugin_name, error_msg),
color: Some(Color::Danger),
icon: Some(Icon::AlertTriangle),
timeout: Some(10000),
};
if let Err(emit_err) = app_handle_clone.emit("show_toast", toast) {
error!("Failed to emit toast for plugin error: {emit_err:?}");
}
}
app_handle_clone.manage(manager);
});
let plugin_updater = PluginUpdater::new();
app_handle.manage(Mutex::new(plugin_updater));
Ok(())
})
.on_event(|app, e| match e {
RunEvent::ExitRequested { api, .. } => {
if EXITING.swap(true, Ordering::SeqCst) {
return; // Only exit once to prevent infinite recursion
}
api.prevent_exit();
tauri::async_runtime::block_on(async move {
info!("Exiting plugin runtime due to app exit");
let manager: State<PluginManager> = app.state();
manager.terminate().await;
app.exit(0);
});
}
RunEvent::WindowEvent { event: WindowEvent::Focused(true), label, .. } => {
// Check for plugin updates on window focus
let w = app.get_webview_window(&label).unwrap();
let h = app.clone();
tauri::async_runtime::spawn(async move {
tokio::time::sleep(Duration::from_secs(3)).await;
let val: State<'_, Mutex<PluginUpdater>> = h.state();
if let Err(e) = val.lock().await.maybe_check(&w).await {
warn!("Failed to check for plugin updates {e:?}");
}
});
}
_ => {}
})
.build()
}
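
The focus handler above is the only place maybe_check runs on its own; forcing an immediate check is just a call to check_now on the managed updater. A hedged sketch, assuming an AppHandle and WebviewWindow are in scope:

// Hypothetical: trigger an immediate plugin-update check for a window.
let updater: State<'_, Mutex<PluginUpdater>> = app_handle.state();
let found_updates = updater.lock().await.check_now(&window).await?; // true when a notification was emitted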


@@ -1,85 +0,0 @@
use log::info;
use serde_json::Value;
use std::collections::BTreeMap;
pub use yaak::render::render_http_request;
use yaak_models::models::{Environment, GrpcRequest, HttpRequestHeader};
use yaak_models::render::make_vars_hashmap;
use yaak_templates::{RenderOptions, TemplateCallback, parse_and_render, render_json_value_raw};
pub async fn render_template<T: TemplateCallback>(
template: &str,
environment_chain: Vec<Environment>,
cb: &T,
opt: &RenderOptions,
) -> yaak_templates::error::Result<String> {
let vars = &make_vars_hashmap(environment_chain);
parse_and_render(template, vars, cb, &opt).await
}
pub async fn render_json_value<T: TemplateCallback>(
value: Value,
environment_chain: Vec<Environment>,
cb: &T,
opt: &RenderOptions,
) -> yaak_templates::error::Result<Value> {
let vars = &make_vars_hashmap(environment_chain);
render_json_value_raw(value, vars, cb, opt).await
}
pub async fn render_grpc_request<T: TemplateCallback>(
r: &GrpcRequest,
environment_chain: Vec<Environment>,
cb: &T,
opt: &RenderOptions,
) -> yaak_templates::error::Result<GrpcRequest> {
let vars = &make_vars_hashmap(environment_chain);
let mut metadata = Vec::new();
for p in r.metadata.clone() {
if !p.enabled {
continue;
}
metadata.push(HttpRequestHeader {
enabled: p.enabled,
name: parse_and_render(p.name.as_str(), vars, cb, &opt).await?,
value: parse_and_render(p.value.as_str(), vars, cb, &opt).await?,
id: p.id,
})
}
let authentication = {
let mut disabled = false;
let mut auth = BTreeMap::new();
match r.authentication.get("disabled") {
Some(Value::Bool(true)) => {
disabled = true;
}
Some(Value::String(tmpl)) => {
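// The "disabled" value may itself be a template; note that an empty render
// result is what marks the authentication as disabled here.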
disabled = parse_and_render(tmpl.as_str(), vars, cb, &opt)
.await
.unwrap_or_default()
.is_empty();
info!(
"Rendering authentication.disabled as a template: {disabled} from \"{tmpl}\""
);
}
_ => {}
}
if disabled {
auth.insert("disabled".to_string(), Value::Bool(true));
} else {
for (k, v) in r.authentication.clone() {
if k == "disabled" {
auth.insert(k, Value::Bool(false));
} else {
auth.insert(k, render_json_value_raw(v, vars, cb, &opt).await?);
}
}
}
auth
};
let url = parse_and_render(r.url.as_str(), vars, cb, &opt).await?;
Ok(GrpcRequest { url, metadata, authentication, ..r.to_owned() })
}

View File

@@ -1,104 +0,0 @@
//! Tauri-specific extensions for yaak-sync.
//!
//! This module provides the Tauri commands for sync functionality.
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use chrono::Utc;
use log::warn;
use serde::{Deserialize, Serialize};
use std::path::Path;
use tauri::ipc::Channel;
use tauri::{AppHandle, Listener, Runtime, command};
use tokio::sync::watch;
use ts_rs::TS;
use yaak_sync::error::Error::InvalidSyncDirectory;
use yaak_sync::sync::{
FsCandidate, SyncOp, apply_sync_ops, apply_sync_state_ops, compute_sync_ops, get_db_candidates,
get_fs_candidates,
};
use yaak_sync::watch::{WatchEvent, watch_directory};
#[command]
pub(crate) async fn cmd_sync_calculate<R: Runtime>(
app_handle: AppHandle<R>,
workspace_id: &str,
sync_dir: &Path,
) -> Result<Vec<SyncOp>> {
if !sync_dir.exists() {
return Err(InvalidSyncDirectory(sync_dir.to_string_lossy().to_string()).into());
}
let db = app_handle.db();
let version = app_handle.package_info().version.to_string();
let db_candidates = get_db_candidates(&db, &version, workspace_id, sync_dir)?;
let fs_candidates = get_fs_candidates(sync_dir)?
.into_iter()
// Only keep items in the same workspace
.filter(|fs| fs.model.workspace_id() == workspace_id)
.collect::<Vec<FsCandidate>>();
Ok(compute_sync_ops(db_candidates, fs_candidates))
}
#[command]
pub(crate) async fn cmd_sync_calculate_fs(dir: &Path) -> Result<Vec<SyncOp>> {
let db_candidates = Vec::new();
let fs_candidates = get_fs_candidates(dir)?;
Ok(compute_sync_ops(db_candidates, fs_candidates))
}
#[command]
pub(crate) async fn cmd_sync_apply<R: Runtime>(
app_handle: AppHandle<R>,
sync_ops: Vec<SyncOp>,
sync_dir: &Path,
workspace_id: &str,
) -> Result<()> {
let db = app_handle.db();
let sync_state_ops = apply_sync_ops(&db, workspace_id, sync_dir, sync_ops)?;
apply_sync_state_ops(&db, workspace_id, sync_dir, sync_state_ops)?;
Ok(())
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub(crate) struct WatchResult {
unlisten_event: String,
}
#[command]
pub(crate) async fn cmd_sync_watch<R: Runtime>(
app_handle: AppHandle<R>,
sync_dir: &Path,
workspace_id: &str,
channel: Channel<WatchEvent>,
) -> Result<WatchResult> {
let (cancel_tx, cancel_rx) = watch::channel(());
// Create a callback that forwards events to the Tauri channel
let callback = move |event: WatchEvent| {
if let Err(e) = channel.send(event) {
warn!("Failed to send watch event: {:?}", e);
}
};
watch_directory(&sync_dir, callback, cancel_rx).await?;
let app_handle_inner = app_handle.clone();
let unlisten_event =
format!("watch-unlisten-{}-{}", workspace_id, Utc::now().timestamp_millis());
// TODO: Figure out a way to unlisten when the client app refreshes or closes. Perhaps with
// a heartbeat mechanism, or by ensuring only a single subscription per workspace (at least
// then this won't create `n` subs). We could also maybe have a global fs watcher that we keep
// adding to here.
app_handle.listen_any(unlisten_event.clone(), move |event| {
app_handle_inner.unlisten(event.id());
if let Err(e) = cancel_tx.send(()) {
warn!("Failed to send cancel signal to watcher {e:?}");
}
});
Ok(WatchResult { unlisten_event })
}
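
For reference, a minimal sketch (not part of this diff) of the same calculate-then-apply flow driven without Tauri. It assumes a `DbContext` is already available from a standalone database setup and that these `yaak_sync` functions return the crate's own `Result`; the wrapper function itself is hypothetical.

```rust
use std::path::Path;
use yaak_models::db_context::DbContext;
use yaak_sync::sync::{
    FsCandidate, apply_sync_ops, apply_sync_state_ops, compute_sync_ops, get_db_candidates,
    get_fs_candidates,
};

fn sync_workspace(
    db: &DbContext<'_>,
    app_version: &str,
    workspace_id: &str,
    sync_dir: &Path,
) -> yaak_sync::error::Result<()> {
    // Gather candidates the same way cmd_sync_calculate does
    let db_candidates = get_db_candidates(db, app_version, workspace_id, sync_dir)?;
    let fs_candidates = get_fs_candidates(sync_dir)?
        .into_iter()
        .filter(|fs| fs.model.workspace_id() == workspace_id)
        .collect::<Vec<FsCandidate>>();

    // Apply ops the same way cmd_sync_apply does
    let sync_ops = compute_sync_ops(db_candidates, fs_candidates);
    let sync_state_ops = apply_sync_ops(db, workspace_id, sync_dir, sync_ops)?;
    apply_sync_state_ops(db, workspace_id, sync_dir, sync_state_ops)?;
    Ok(())
}
```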

View File

@@ -1,388 +0,0 @@
use std::fmt::{Display, Formatter};
use std::path::PathBuf;
use std::time::{Duration, Instant};
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use log::{debug, error, info, warn};
use serde::{Deserialize, Serialize};
use tauri::{Emitter, Listener, Manager, Runtime, WebviewWindow};
use tauri_plugin_dialog::{DialogExt, MessageDialogButtons};
use tauri_plugin_updater::{Update, UpdaterExt};
use tokio::task::block_in_place;
use tokio::time::sleep;
use ts_rs::TS;
use yaak_models::util::generate_id;
use yaak_plugins::manager::PluginManager;
use url::Url;
use yaak_api::get_system_proxy_url;
use crate::error::Error::GenericError;
use crate::is_dev;
const MAX_UPDATE_CHECK_HOURS_STABLE: u64 = 12;
const MAX_UPDATE_CHECK_HOURS_BETA: u64 = 3;
const MAX_UPDATE_CHECK_HOURS_ALPHA: u64 = 1;
// Create updater struct
pub struct YaakUpdater {
last_check: Option<Instant>,
}
pub enum UpdateMode {
Stable,
Beta,
Alpha,
}
impl Display for UpdateMode {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
let s = match self {
UpdateMode::Stable => "stable",
UpdateMode::Beta => "beta",
UpdateMode::Alpha => "alpha",
};
write!(f, "{}", s)
}
}
impl UpdateMode {
pub fn new(mode: &str) -> UpdateMode {
match mode {
"beta" => UpdateMode::Beta,
"alpha" => UpdateMode::Alpha,
_ => UpdateMode::Stable,
}
}
}
#[derive(PartialEq)]
pub enum UpdateTrigger {
Background,
User,
}
impl YaakUpdater {
pub fn new() -> Self {
Self { last_check: None }
}
pub async fn check_now<R: Runtime>(
&mut self,
window: &WebviewWindow<R>,
mode: UpdateMode,
auto_download: bool,
update_trigger: UpdateTrigger,
) -> Result<bool> {
// Only AppImage builds support updates on Linux, so skip the check if this isn't one
#[cfg(target_os = "linux")]
{
if std::env::var("APPIMAGE").is_err() {
return Ok(false);
}
}
let settings = window.db().get_settings();
let update_key = format!("{:x}", md5::compute(settings.id));
self.last_check = Some(Instant::now());
info!("Checking for updates mode={} autodl={}", mode, auto_download);
let w = window.clone();
let mut updater_builder = w.updater_builder();
if let Some(proxy_url) = get_system_proxy_url() {
if let Ok(url) = Url::parse(&proxy_url) {
updater_builder = updater_builder.proxy(url);
}
}
let update_check_result = updater_builder
.on_before_exit(move || {
// Kill plugin manager before exit or NSIS installer will fail to replace sidecar
// while it's running.
// NOTE: This is only called on Windows
let w = w.clone();
block_in_place(|| {
tauri::async_runtime::block_on(async move {
info!("Shutting down plugin manager before update");
let plugin_manager = w.state::<PluginManager>();
plugin_manager.terminate().await;
});
});
})
.header("X-Update-Mode", mode.to_string())?
.header("X-Update-Key", update_key)?
.header(
"X-Update-Trigger",
match update_trigger {
UpdateTrigger::Background => "background",
UpdateTrigger::User => "user",
},
)?
.build()?
.check()
.await;
let result = match update_check_result? {
None => false,
Some(update) => {
let w = window.clone();
tauri::async_runtime::spawn(async move {
// Force native updater if specified (useful if a release broke the UI)
let native_install_mode =
update.raw_json.get("install_mode").map(|v| v.as_str()).unwrap_or_default()
== Some("native");
if native_install_mode {
start_native_update(&w, &update).await;
return;
}
// If it's a background update, try downloading it first
if update_trigger == UpdateTrigger::Background && auto_download {
info!("Downloading update {} in background", update.version);
if let Err(e) = download_update_idempotent(&w, &update).await {
error!("Failed to download {}: {}", update.version, e);
}
}
match start_integrated_update(&w, &update).await {
Ok(UpdateResponseAction::Skip) => {
info!("Confirmed {}: skipped", update.version);
}
Ok(UpdateResponseAction::Install) => {
info!("Confirmed {}: install", update.version);
if let Err(e) = install_update_maybe_download(&w, &update).await {
error!("Failed to install: {e}");
return;
};
info!("Installed {}", update.version);
finish_integrated_update(&w, &update).await;
}
Err(e) => {
warn!("Failed to notify frontend, falling back: {e}",);
start_native_update(&w, &update).await;
}
};
});
true
}
};
Ok(result)
}
pub async fn maybe_check<R: Runtime>(
&mut self,
window: &WebviewWindow<R>,
auto_download: bool,
mode: UpdateMode,
) -> Result<bool> {
let update_period_seconds = match mode {
UpdateMode::Stable => MAX_UPDATE_CHECK_HOURS_STABLE,
UpdateMode::Beta => MAX_UPDATE_CHECK_HOURS_BETA,
UpdateMode::Alpha => MAX_UPDATE_CHECK_HOURS_ALPHA,
} * (60 * 60);
if let Some(i) = self.last_check
&& i.elapsed().as_secs() < update_period_seconds
{
return Ok(false);
}
// Don't auto-check in development (a manual user trigger can still check)
if is_dev() {
return Ok(false);
}
self.check_now(window, mode, auto_download, UpdateTrigger::Background).await
}
}
#[derive(Debug, Clone, PartialEq, Serialize, Default, TS)]
#[serde(default, rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
struct UpdateInfo {
reply_event_id: String,
version: String,
downloaded: bool,
}
#[derive(Debug, Clone, PartialEq, Deserialize, TS)]
#[serde(rename_all = "camelCase", tag = "type")]
#[ts(export, export_to = "index.ts")]
enum UpdateResponse {
Ack,
Action { action: UpdateResponseAction },
}
#[derive(Debug, Clone, PartialEq, Deserialize, TS)]
#[serde(rename_all = "snake_case")]
#[ts(export, export_to = "index.ts")]
enum UpdateResponseAction {
Install,
Skip,
}
async fn finish_integrated_update<R: Runtime>(window: &WebviewWindow<R>, update: &Update) {
if let Err(e) = window.emit_to(window.label(), "update_installed", update.version.to_string()) {
warn!("Failed to notify frontend of update install: {}", e);
}
}
async fn start_integrated_update<R: Runtime>(
window: &WebviewWindow<R>,
update: &Update,
) -> Result<UpdateResponseAction> {
let download_path = ensure_download_path(window, update)?;
debug!("Download path: {}", download_path.display());
let downloaded = download_path.exists();
let ack_wait = Duration::from_secs(3);
let reply_id = generate_id();
// 1) Start listening BEFORE emitting to avoid missing a fast reply
let (tx, mut rx) = tokio::sync::mpsc::unbounded_channel::<UpdateResponse>();
let w_for_listener = window.clone();
let event_id = w_for_listener.listen(reply_id.clone(), move |ev| {
match serde_json::from_str::<UpdateResponse>(ev.payload()) {
Ok(UpdateResponse::Ack) => {
let _ = tx.send(UpdateResponse::Ack);
}
Ok(UpdateResponse::Action { action }) => {
let _ = tx.send(UpdateResponse::Action { action });
}
Err(e) => {
warn!("Failed to parse update reply from frontend: {e:?}");
}
}
});
// Make sure we always unlisten
struct Unlisten<'a, R: Runtime> {
win: &'a WebviewWindow<R>,
id: tauri::EventId,
}
impl<'a, R: Runtime> Drop for Unlisten<'a, R> {
fn drop(&mut self) {
self.win.unlisten(self.id);
}
}
let _guard = Unlisten { win: window, id: event_id };
// 2) Emit the event now that listener is in place
let info =
UpdateInfo { version: update.version.to_string(), downloaded, reply_event_id: reply_id };
window
.emit_to(window.label(), "update_available", &info)
.map_err(|e| GenericError(format!("Failed to emit update_available: {e}")))?;
// 3) Two-stage timeout: first wait for ack, then wait for final action
// --- Phase 1: wait for ACK with timeout ---
let ack_timer = sleep(ack_wait);
tokio::pin!(ack_timer);
loop {
tokio::select! {
msg = rx.recv() => match msg {
Some(UpdateResponse::Ack) => break, // proceed to Phase 2
Some(UpdateResponse::Action{action}) => return Ok(action), // user was fast
None => return Err(GenericError("frontend channel closed before ack".into())),
},
_ = &mut ack_timer => {
return Err(GenericError("timed out waiting for frontend ack".into()));
}
}
}
// --- Phase 2: wait forever for final action ---
loop {
match rx.recv().await {
Some(UpdateResponse::Action { action }) => return Ok(action),
Some(UpdateResponse::Ack) => { /* ignore extra acks */ }
None => return Err(GenericError("frontend channel closed before action".into())),
}
}
}
async fn start_native_update<R: Runtime>(window: &WebviewWindow<R>, update: &Update) {
// If the frontend doesn't respond, fall back to native dialogs
let confirmed = window
.dialog()
.message(format!(
"{} is available. Would you like to download and install it now?",
update.version
))
.buttons(MessageDialogButtons::OkCancelCustom("Download".to_string(), "Later".to_string()))
.title("Update Available")
.blocking_show();
if !confirmed {
return;
}
match update.download_and_install(|_, _| {}, || {}).await {
Ok(()) => {
if window
.dialog()
.message("Would you like to restart the app?")
.title("Update Installed")
.buttons(MessageDialogButtons::OkCancelCustom(
"Restart".to_string(),
"Later".to_string(),
))
.blocking_show()
{
window.app_handle().request_restart();
}
}
Err(e) => {
window.dialog().message(format!("The update failed to install: {}", e));
}
}
}
pub async fn download_update_idempotent<R: Runtime>(
window: &WebviewWindow<R>,
update: &Update,
) -> Result<PathBuf> {
let dl_path = ensure_download_path(window, update)?;
if dl_path.exists() {
info!("{} already downloaded to {}", update.version, dl_path.display());
return Ok(dl_path);
}
info!("{} downloading: {}", update.version, dl_path.display());
let dl_bytes = update.download(|_, _| {}, || {}).await?;
std::fs::write(&dl_path, dl_bytes)
.map_err(|e| GenericError(format!("Failed to write update: {e}")))?;
info!("{} downloaded", update.version);
Ok(dl_path)
}
pub async fn install_update_maybe_download<R: Runtime>(
window: &WebviewWindow<R>,
update: &Update,
) -> Result<()> {
let dl_path = download_update_idempotent(window, update).await?;
let update_bytes = std::fs::read(&dl_path)?;
update.install(update_bytes.as_slice())?;
Ok(())
}
pub fn ensure_download_path<R: Runtime>(
window: &WebviewWindow<R>,
update: &Update,
) -> Result<PathBuf> {
// Ensure dir exists
let base_dir = window.path().app_cache_dir()?.join("updates");
std::fs::create_dir_all(&base_dir)?;
// Generate name based on signature
let sig_digest = md5::compute(&update.signature);
let name = format!("yaak-{}-{:x}", update.version, sig_digest);
let dl_path = base_dir.join(name);
Ok(dl_path)
}
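
As an aside, a minimal sketch (mirroring the serde attributes on `UpdateResponse` and `UpdateResponseAction` above) of the JSON payload shapes the frontend would emit on the `reply_event_id` event; the concrete strings are inferred from those attributes, not taken from frontend code.

```rust
use serde::Deserialize;

#[derive(Debug, Deserialize, PartialEq)]
#[serde(rename_all = "camelCase", tag = "type")]
enum UpdateResponse {
    Ack,
    Action { action: UpdateResponseAction },
}

#[derive(Debug, Deserialize, PartialEq)]
#[serde(rename_all = "snake_case")]
enum UpdateResponseAction {
    Install,
    Skip,
}

fn main() {
    // Phase 1: the ack sent as soon as the frontend receives `update_available`
    let ack: UpdateResponse = serde_json::from_str(r#"{"type":"ack"}"#).unwrap();
    assert_eq!(ack, UpdateResponse::Ack);

    // Phase 2: the final decision once the user picks an action
    let action: UpdateResponse =
        serde_json::from_str(r#"{"type":"action","action":"install"}"#).unwrap();
    assert_eq!(action, UpdateResponse::Action { action: UpdateResponseAction::Install });
}
```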

View File

@@ -1,136 +0,0 @@
use crate::PluginContextExt;
use crate::error::Result;
use crate::import::import_data;
use crate::models_ext::QueryManagerExt;
use log::{info, warn};
use std::collections::HashMap;
use std::fs;
use std::sync::Arc;
use tauri::{AppHandle, Emitter, Manager, Runtime, Url};
use tauri_plugin_dialog::{DialogExt, MessageDialogButtons, MessageDialogKind};
use yaak_api::yaak_api_client;
use yaak_models::util::generate_id;
use yaak_plugins::events::{Color, ShowToastRequest};
use yaak_plugins::install::download_and_install;
use yaak_plugins::manager::PluginManager;
pub(crate) async fn handle_deep_link<R: Runtime>(
app_handle: &AppHandle<R>,
url: &Url,
) -> Result<()> {
let command = url.domain().unwrap_or_default();
info!("Yaak URI scheme invoked {}?{}", command, url.query().unwrap_or_default());
let query_map: HashMap<String, String> = url.query_pairs().into_owned().collect();
let windows = app_handle.webview_windows();
let (_, window) = windows.iter().next().unwrap();
match command {
"install-plugin" => {
let name = query_map.get("name").unwrap();
let version = query_map.get("version").cloned();
_ = window.set_focus();
let confirmed_install = app_handle
.dialog()
.message(format!("Install plugin {name} {version:?}?"))
.kind(MessageDialogKind::Info)
.buttons(MessageDialogButtons::OkCancelCustom(
"Install".to_string(),
"Cancel".to_string(),
))
.blocking_show();
if !confirmed_install {
// Cancelled installation
return Ok(());
}
let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
let query_manager = app_handle.db_manager();
let app_version = app_handle.package_info().version.to_string();
let http_client = yaak_api_client(&app_version)?;
let plugin_context = window.plugin_context();
let pv = download_and_install(
plugin_manager,
&query_manager,
&http_client,
&plugin_context,
name,
version,
)
.await?;
app_handle.emit(
"show_toast",
ShowToastRequest {
message: format!("Installed {name}@{}", pv.version),
color: Some(Color::Success),
icon: None,
timeout: Some(5000),
},
)?;
}
"import-data" => {
let mut file_path = query_map.get("path").map(|s| s.to_owned());
let name = query_map.get("name").map(|s| s.to_owned()).unwrap_or("data".to_string());
_ = window.set_focus();
if let Some(file_url) = query_map.get("url") {
let confirmed_import = app_handle
.dialog()
.message(format!("Import {name} from {file_url}?"))
.kind(MessageDialogKind::Info)
.buttons(MessageDialogButtons::OkCancelCustom(
"Import".to_string(),
"Cancel".to_string(),
))
.blocking_show();
if !confirmed_import {
return Ok(());
}
let app_version = app_handle.package_info().version.to_string();
let resp = yaak_api_client(&app_version)?.get(file_url).send().await?;
let json = resp.bytes().await?;
let p = app_handle
.path()
.temp_dir()?
.join(format!("import-{}", generate_id()))
.to_string_lossy()
.to_string();
fs::write(&p, json)?;
file_path = Some(p);
}
let file_path = match file_path {
Some(p) => p,
None => {
app_handle.emit(
"show_toast",
ShowToastRequest {
message: "Failed to import data".to_string(),
color: Some(Color::Danger),
icon: None,
timeout: None,
},
)?;
return Ok(());
}
};
let results = import_data(window, &file_path).await?;
window.emit(
"show_toast",
ShowToastRequest {
message: format!("Imported data for {} workspaces", results.workspaces.len()),
color: Some(Color::Success),
icon: None,
timeout: Some(5000),
},
)?;
}
_ => {
warn!("Unknown deep link command: {command}");
}
}
Ok(())
}
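
For illustration, a small sketch (with a hypothetical plugin name and version) of how a `yaak://` deep link decomposes into the command and query arguments consumed above: the URL "domain" is the command and the query pairs are the arguments.

```rust
use std::collections::HashMap;
use url::Url;

fn main() {
    // Hypothetical deep link; only the structure matters here.
    let url = Url::parse("yaak://install-plugin?name=example-plugin&version=1.2.0").unwrap();

    // The handler treats the domain as the command name...
    assert_eq!(url.domain(), Some("install-plugin"));

    // ...and collects the query pairs into a map of arguments.
    let query: HashMap<String, String> = url.query_pairs().into_owned().collect();
    assert_eq!(query.get("name").map(String::as_str), Some("example-plugin"));
    assert_eq!(query.get("version").map(String::as_str), Some("1.2.0"));
}
```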

View File

@@ -1,289 +0,0 @@
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use crate::window_menu::app_menu;
use log::{info, warn};
use rand::random;
use tauri::{
AppHandle, Emitter, LogicalSize, Manager, PhysicalSize, Runtime, WebviewUrl, WebviewWindow,
WindowEvent,
};
use tauri_plugin_opener::OpenerExt;
use tokio::sync::mpsc;
const DEFAULT_WINDOW_WIDTH: f64 = 1100.0;
const DEFAULT_WINDOW_HEIGHT: f64 = 600.0;
const MIN_WINDOW_WIDTH: f64 = 300.0;
const MIN_WINDOW_HEIGHT: f64 = 300.0;
pub(crate) const MAIN_WINDOW_PREFIX: &str = "main_";
const OTHER_WINDOW_PREFIX: &str = "other_";
#[derive(Default, Debug)]
pub(crate) struct CreateWindowConfig<'s> {
pub url: &'s str,
pub label: &'s str,
pub title: &'s str,
pub inner_size: Option<(f64, f64)>,
pub position: Option<(f64, f64)>,
pub navigation_tx: Option<mpsc::Sender<String>>,
pub close_tx: Option<mpsc::Sender<()>>,
pub data_dir_key: Option<String>,
pub hide_titlebar: bool,
}
pub(crate) fn create_window<R: Runtime>(
handle: &AppHandle<R>,
config: CreateWindowConfig,
) -> Result<WebviewWindow<R>> {
#[allow(unused_variables)]
let menu = app_menu(handle)?;
// Setting the menu causes the window to not be clickable in AppImage builds, so skip it on Linux
#[cfg(not(target_os = "linux"))]
handle.set_menu(menu).expect("Failed to set app menu");
info!("Create new window label={}", config.label);
let mut win_builder =
tauri::WebviewWindowBuilder::new(handle, config.label, WebviewUrl::App(config.url.into()))
.title(config.title)
.resizable(true)
.visible(false) // To prevent theme flashing, the frontend code calls show() immediately after configuring the theme
.fullscreen(false)
.min_inner_size(MIN_WINDOW_WIDTH, MIN_WINDOW_HEIGHT);
if let Some(key) = config.data_dir_key {
#[cfg(not(target_os = "macos"))]
{
use std::fs;
let safe_key = format!("{:x}", md5::compute(key.as_bytes()));
let dir = handle.path().app_data_dir()?.join("window-sessions").join(safe_key);
fs::create_dir_all(&dir)?;
win_builder = win_builder.data_directory(dir);
}
// macOS doesn't support `data_directory()`, so we must use this function instead
#[cfg(target_os = "macos")]
{
let hash = md5::compute(key.as_bytes());
let mut id = [0u8; 16];
id.copy_from_slice(&hash[..16]); // Take the first 16 bytes of the hash
win_builder = win_builder.data_store_identifier(id);
}
}
if let Some((w, h)) = config.inner_size {
win_builder = win_builder.inner_size(w, h);
} else {
win_builder = win_builder.inner_size(600.0, 600.0);
}
if let Some((x, y)) = config.position {
win_builder = win_builder.position(x, y);
} else {
win_builder = win_builder.center();
}
if let Some(tx) = config.navigation_tx {
win_builder = win_builder.on_navigation(move |url| {
let url = url.to_string();
let tx = tx.clone();
tauri::async_runtime::block_on(async move {
tx.send(url).await.unwrap();
});
true
});
}
let settings = handle.db().get_settings();
if config.hide_titlebar && !settings.use_native_titlebar {
#[cfg(target_os = "macos")]
{
use tauri::TitleBarStyle;
win_builder = win_builder.hidden_title(true).title_bar_style(TitleBarStyle::Overlay);
}
#[cfg(not(target_os = "macos"))]
{
win_builder = win_builder.decorations(false);
}
}
if let Some(w) = handle.webview_windows().get(config.label) {
info!("Webview with label {} already exists. Focusing existing", config.label);
w.set_focus()?;
return Ok(w.to_owned());
}
let win = win_builder.build()?;
if let Some(tx) = config.close_tx {
win.on_window_event(move |event| match event {
WindowEvent::CloseRequested { .. } => {
let tx = tx.clone();
tauri::async_runtime::spawn(async move {
tx.send(()).await.unwrap();
});
}
_ => {}
});
}
let webview_window = win.clone();
win.on_menu_event(move |w, event| {
if !w.is_focused().unwrap() {
return;
}
let event_id = event.id().0.as_str();
match event_id {
"hacked_quit" => {
// Cmd+Q on macOS doesn't trigger `CloseRequested` so we use a custom Quit menu
// and trigger close() for each window.
w.webview_windows().iter().for_each(|(_, w)| {
info!("Closing window {}", w.label());
let _ = w.close();
});
}
"close" => w.close().unwrap(),
"zoom_reset" => w.emit("zoom_reset", true).unwrap(),
"zoom_in" => w.emit("zoom_in", true).unwrap(),
"zoom_out" => w.emit("zoom_out", true).unwrap(),
"settings" => w.emit("settings", true).unwrap(),
"open_feedback" => {
if let Err(e) =
w.app_handle().opener().open_url("https://yaak.app/feedback", None::<&str>)
{
warn!("Failed to open feedback {e:?}")
}
}
// Commands for development
"dev.reset_size" => webview_window
.set_size(LogicalSize::new(DEFAULT_WINDOW_WIDTH, DEFAULT_WINDOW_HEIGHT))
.unwrap(),
"dev.reset_size_16x9" => {
let width = webview_window.outer_size().unwrap().width;
let height = width * 9 / 16;
webview_window.set_size(PhysicalSize::new(width, height)).unwrap()
}
"dev.reset_size_16x10" => {
let width = webview_window.outer_size().unwrap().width;
let height = width * 10 / 16;
webview_window.set_size(PhysicalSize::new(width, height)).unwrap()
}
"dev.refresh" => webview_window.eval("location.reload()").unwrap(),
"dev.generate_theme_css" => {
w.emit("generate_theme_css", true).unwrap();
}
"dev.toggle_devtools" => {
if webview_window.is_devtools_open() {
webview_window.close_devtools();
} else {
webview_window.open_devtools();
}
}
_ => {}
}
});
Ok(win)
}
pub(crate) fn create_main_window(handle: &AppHandle, url: &str) -> Result<WebviewWindow> {
let mut counter = 0;
let label = loop {
let label = format!("{MAIN_WINDOW_PREFIX}{counter}");
match handle.webview_windows().get(label.as_str()) {
None => break Some(label),
Some(_) => counter += 1,
}
}
.expect("Failed to generate label for new window");
let config = CreateWindowConfig {
url,
label: label.as_str(),
title: "Yaak",
inner_size: Some((DEFAULT_WINDOW_WIDTH, DEFAULT_WINDOW_HEIGHT)),
position: Some((
// Offset by random amount so it's easier to differentiate
100.0 + random::<f64>() * 20.0,
100.0 + random::<f64>() * 20.0,
)),
hide_titlebar: true,
..Default::default()
};
create_window(handle, config)
}
pub(crate) fn create_child_window(
parent_window: &WebviewWindow,
url: &str,
label: &str,
title: &str,
inner_size: (f64, f64),
) -> Result<WebviewWindow> {
let app_handle = parent_window.app_handle();
let label = format!("{OTHER_WINDOW_PREFIX}_{label}");
let scale_factor = parent_window.scale_factor()?;
let current_pos = parent_window.inner_position()?.to_logical::<f64>(scale_factor);
let current_size = parent_window.inner_size()?.to_logical::<f64>(scale_factor);
// Position the new window in the middle of the parent
let position = (
current_pos.x + current_size.width / 2.0 - inner_size.0 / 2.0,
current_pos.y + current_size.height / 2.0 - inner_size.1 / 2.0,
);
let config = CreateWindowConfig {
label: label.as_str(),
title,
url,
inner_size: Some(inner_size),
position: Some(position),
hide_titlebar: true,
..Default::default()
};
let child_window = create_window(&app_handle, config)?;
// NOTE: These listeners will remain active even when the windows close. Unfortunately,
// there's no way to unlisten to events for now, so we just have to be defensive.
{
let parent_window = parent_window.clone();
let child_window = child_window.clone();
child_window.clone().on_window_event(move |e| match e {
// When the new window is destroyed, bring the other up behind it
WindowEvent::Destroyed => {
if let Some(w) = parent_window.get_webview_window(child_window.label()) {
w.set_focus().unwrap();
}
}
_ => {}
});
}
{
let parent_window = parent_window.clone();
let child_window = child_window.clone();
parent_window.clone().on_window_event(move |e| match e {
// When the parent window is closed, close the child
WindowEvent::CloseRequested { .. } => child_window.close().unwrap(),
// When the parent window is focused, bring the child above
WindowEvent::Focused(focus) => {
if *focus {
if let Some(w) = parent_window.get_webview_window(child_window.label()) {
w.set_focus().unwrap();
};
}
}
_ => {}
});
}
Ok(child_window)
}
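
As a quick check of the centering math in `create_child_window`, a tiny sketch with hypothetical numbers: the child is placed so its center coincides with the parent's center.

```rust
// Mirrors the position calculation above, using logical coordinates.
fn center_in_parent(
    parent_pos: (f64, f64),
    parent_size: (f64, f64),
    child_size: (f64, f64),
) -> (f64, f64) {
    (
        parent_pos.0 + parent_size.0 / 2.0 - child_size.0 / 2.0,
        parent_pos.1 + parent_size.1 / 2.0 - child_size.1 / 2.0,
    )
}

fn main() {
    // Parent at (100, 100) sized 1100x600 with a 500x400 child -> child at (400, 200)
    assert_eq!(
        center_in_parent((100.0, 100.0), (1100.0, 600.0), (500.0, 400.0)),
        (400.0, 200.0)
    );
}
```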

View File

@@ -1,447 +0,0 @@
//! WebSocket Tauri command wrappers
//! These wrap the core yaak-ws functionality for Tauri IPC.
use crate::PluginContextExt;
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use http::HeaderMap;
use log::{debug, info, warn};
use std::str::FromStr;
use std::sync::Arc;
use tauri::http::HeaderValue;
use tauri::{AppHandle, Manager, Runtime, State, WebviewWindow, command};
use tokio::sync::{Mutex, mpsc};
use tokio_tungstenite::tungstenite::Message;
use url::Url;
use yaak_crypto::manager::EncryptionManager;
use yaak_http::cookies::CookieStore;
use yaak_http::path_placeholders::apply_path_placeholders;
use yaak_models::models::{
HttpResponseHeader, WebsocketConnection, WebsocketConnectionState, WebsocketEvent,
WebsocketEventType, WebsocketRequest,
};
use yaak_models::util::UpdateSource;
use yaak_plugins::events::{CallHttpAuthenticationRequest, HttpHeader, RenderPurpose};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::template_callback::PluginTemplateCallback;
use yaak_templates::{RenderErrorBehavior, RenderOptions};
use yaak_tls::find_client_certificate;
use yaak_ws::{WebsocketManager, render_websocket_request};
#[command]
pub async fn cmd_ws_delete_connections<R: Runtime>(
request_id: &str,
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
) -> Result<()> {
Ok(app_handle.db().delete_all_websocket_connections_for_request(
request_id,
&UpdateSource::from_window_label(window.label()),
)?)
}
#[command]
pub async fn cmd_ws_send<R: Runtime>(
connection_id: &str,
environment_id: Option<&str>,
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
ws_manager: State<'_, Mutex<WebsocketManager>>,
) -> Result<WebsocketConnection> {
let connection = app_handle.db().get_websocket_connection(connection_id)?;
let unrendered_request = app_handle.db().get_websocket_request(&connection.request_id)?;
let environment_chain = app_handle.db().resolve_environments(
&unrendered_request.workspace_id,
unrendered_request.folder_id.as_deref(),
environment_id,
)?;
let (resolved_request, _auth_context_id) =
resolve_websocket_request(&window, &unrendered_request)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let request = render_websocket_request(
&resolved_request,
environment_chain,
&PluginTemplateCallback::new(
plugin_manager,
encryption_manager,
&window.plugin_context(),
RenderPurpose::Send,
),
&RenderOptions { error_behavior: RenderErrorBehavior::Throw },
)
.await?;
let mut ws_manager = ws_manager.lock().await;
ws_manager.send(&connection.id, Message::Text(request.message.clone().into())).await?;
app_handle.db().upsert_websocket_event(
&WebsocketEvent {
connection_id: connection.id.clone(),
request_id: request.id.clone(),
workspace_id: connection.workspace_id.clone(),
is_server: false,
message_type: WebsocketEventType::Text,
message: request.message.into(),
..Default::default()
},
&UpdateSource::from_window_label(window.label()),
)?;
Ok(connection)
}
#[command]
pub async fn cmd_ws_close<R: Runtime>(
connection_id: &str,
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
ws_manager: State<'_, Mutex<WebsocketManager>>,
) -> Result<WebsocketConnection> {
let connection = {
let db = app_handle.db();
let connection = db.get_websocket_connection(connection_id)?;
db.upsert_websocket_connection(
&WebsocketConnection { state: WebsocketConnectionState::Closing, ..connection },
&UpdateSource::from_window_label(window.label()),
)?
};
let mut ws_manager = ws_manager.lock().await;
if let Err(e) = ws_manager.close(&connection.id).await {
warn!("Failed to close WebSocket connection: {e:?}");
};
Ok(connection)
}
#[command]
pub async fn cmd_ws_connect<R: Runtime>(
request_id: &str,
environment_id: Option<&str>,
cookie_jar_id: Option<&str>,
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
_plugin_manager: State<'_, PluginManager>,
ws_manager: State<'_, Mutex<WebsocketManager>>,
) -> Result<WebsocketConnection> {
let unrendered_request = app_handle.db().get_websocket_request(request_id)?;
let environment_chain = app_handle.db().resolve_environments(
&unrendered_request.workspace_id,
unrendered_request.folder_id.as_deref(),
environment_id,
)?;
let workspace = app_handle.db().get_workspace(&unrendered_request.workspace_id)?;
let settings = app_handle.db().get_settings();
let (resolved_request, auth_context_id) =
resolve_websocket_request(&window, &unrendered_request)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let request = render_websocket_request(
&resolved_request,
environment_chain,
&PluginTemplateCallback::new(
plugin_manager.clone(),
encryption_manager.clone(),
&window.plugin_context(),
RenderPurpose::Send,
),
&RenderOptions { error_behavior: RenderErrorBehavior::Throw },
)
.await?;
let connection = app_handle.db().upsert_websocket_connection(
&WebsocketConnection {
workspace_id: request.workspace_id.clone(),
request_id: request_id.to_string(),
..Default::default()
},
&UpdateSource::from_window_label(window.label()),
)?;
let (mut url, url_parameters) = apply_path_placeholders(&request.url, &request.url_parameters);
if !url.starts_with("ws://") && !url.starts_with("wss://") {
url.insert_str(0, "ws://");
}
// Add URL parameters to URL
let mut url = match Url::parse(&url) {
Ok(url) => url,
Err(e) => {
return Ok(app_handle.db().upsert_websocket_connection(
&WebsocketConnection {
error: Some(format!("Failed to parse URL {}", e.to_string())),
state: WebsocketConnectionState::Closed,
..connection
},
&UpdateSource::from_window_label(window.label()),
)?);
}
};
let mut headers = HeaderMap::new();
for h in request.headers.clone() {
if h.name.is_empty() && h.value.is_empty() {
continue;
}
if !h.enabled {
continue;
}
headers.insert(
http::HeaderName::from_str(&h.name).unwrap(),
HeaderValue::from_str(&h.value).unwrap(),
);
}
match request.authentication_type {
None => {
// No authentication configured, not even inherited
}
Some(authentication_type) if authentication_type == "none" => {
// Explicitly no authentication
}
Some(authentication_type) => {
let auth = request.authentication.clone();
let plugin_req = CallHttpAuthenticationRequest {
context_id: format!("{:x}", md5::compute(auth_context_id)),
values: serde_json::from_value(serde_json::to_value(&auth).unwrap()).unwrap(),
method: "POST".to_string(),
url: request.url.clone(),
headers: request
.headers
.clone()
.into_iter()
.map(|h| HttpHeader { name: h.name, value: h.value })
.collect(),
};
let plugin_result = plugin_manager
.call_http_authentication(
&window.plugin_context(),
&authentication_type,
plugin_req,
)
.await?;
for header in plugin_result.set_headers.unwrap_or_default() {
match (
http::HeaderName::from_str(&header.name),
HeaderValue::from_str(&header.value),
) {
(Ok(name), Ok(value)) => {
headers.insert(name, value);
}
_ => continue,
};
}
if let Some(params) = plugin_result.set_query_parameters {
let mut query_pairs = url.query_pairs_mut();
for p in params {
query_pairs.append_pair(&p.name, &p.value);
}
}
}
}
// Add cookies to WS HTTP Upgrade
if let Some(id) = cookie_jar_id {
let cookie_jar = app_handle.db().get_cookie_jar(&id)?;
let store = CookieStore::from_cookies(cookie_jar.cookies);
// Convert WS URL -> HTTP URL because the cookie store matches Path/HttpOnly/Secure
// attributes against HTTP(S) URLs, and WS upgrades are plain HTTP requests anyway
let http_url = convert_ws_url_to_http(&url);
if let Some(cookie_header_value) = store.get_cookie_header(&http_url) {
debug!("Inserting cookies into WS upgrade to {}: {}", url, cookie_header_value);
headers.insert(
http::HeaderName::from_static("cookie"),
HeaderValue::from_str(&cookie_header_value).unwrap(),
);
}
}
let (receive_tx, mut receive_rx) = mpsc::channel::<Message>(128);
let mut ws_manager = ws_manager.lock().await;
{
let valid_query_pairs = url_parameters
.into_iter()
.filter(|p| p.enabled && !p.name.is_empty())
.collect::<Vec<_>>();
// NOTE: Only mutate query pairs if there are any, or it will append an empty `?` to the URL
if !valid_query_pairs.is_empty() {
let mut query_pairs = url.query_pairs_mut();
for p in valid_query_pairs {
query_pairs.append_pair(p.name.as_str(), p.value.as_str());
}
}
}
let client_cert = find_client_certificate(url.as_str(), &settings.client_certificates);
let response = match ws_manager
.connect(
&connection.id,
url.as_str(),
headers,
receive_tx,
workspace.setting_validate_certificates,
client_cert,
)
.await
{
Ok(r) => r,
Err(e) => {
return Ok(app_handle.db().upsert_websocket_connection(
&WebsocketConnection {
error: Some(e.to_string()),
state: WebsocketConnectionState::Closed,
..connection
},
&UpdateSource::from_window_label(window.label()),
)?);
}
};
app_handle.db().upsert_websocket_event(
&WebsocketEvent {
connection_id: connection.id.clone(),
request_id: request.id.clone(),
workspace_id: connection.workspace_id.clone(),
is_server: false,
message_type: WebsocketEventType::Open,
..Default::default()
},
&UpdateSource::from_window_label(window.label()),
)?;
let response_headers = response
.headers()
.into_iter()
.map(|(name, value)| HttpResponseHeader {
name: name.to_string(),
value: value.to_str().unwrap().to_string(),
})
.collect::<Vec<HttpResponseHeader>>();
let connection = app_handle.db().upsert_websocket_connection(
&WebsocketConnection {
state: WebsocketConnectionState::Connected,
headers: response_headers,
status: response.status().as_u16() as i32,
url: request.url.clone(),
..connection
},
&UpdateSource::from_window_label(window.label()),
)?;
{
let connection_id = connection.id.clone();
let request_id = request.id.to_string();
let workspace_id = request.workspace_id.clone();
let connection = connection.clone();
let window_label = window.label().to_string();
let mut has_written_close = false;
tokio::spawn(async move {
while let Some(message) = receive_rx.recv().await {
if let Message::Close(_) = message {
has_written_close = true;
}
app_handle
.db()
.upsert_websocket_event(
&WebsocketEvent {
connection_id: connection_id.clone(),
request_id: request_id.clone(),
workspace_id: workspace_id.clone(),
is_server: true,
message_type: match message {
Message::Text(_) => WebsocketEventType::Text,
Message::Binary(_) => WebsocketEventType::Binary,
Message::Ping(_) => WebsocketEventType::Ping,
Message::Pong(_) => WebsocketEventType::Pong,
Message::Close(_) => WebsocketEventType::Close,
// Raw frame will never happen during a read
Message::Frame(_) => WebsocketEventType::Frame,
},
message: message.into_data().into(),
..Default::default()
},
&UpdateSource::from_window_label(&window_label),
)
.unwrap();
}
info!("Websocket connection closed");
if !has_written_close {
app_handle
.db()
.upsert_websocket_event(
&WebsocketEvent {
connection_id: connection_id.clone(),
request_id: request_id.clone(),
workspace_id: workspace_id.clone(),
is_server: true,
message_type: WebsocketEventType::Close,
..Default::default()
},
&UpdateSource::from_window_label(&window_label),
)
.unwrap();
}
app_handle
.db()
.upsert_websocket_connection(
&WebsocketConnection {
workspace_id: request.workspace_id.clone(),
request_id: request_id.to_string(),
state: WebsocketConnectionState::Closed,
..connection
},
&UpdateSource::from_window_label(&window_label),
)
.unwrap();
});
}
Ok(connection)
}
/// Resolve inherited authentication and headers for a websocket request
fn resolve_websocket_request<R: Runtime>(
window: &WebviewWindow<R>,
request: &WebsocketRequest,
) -> Result<(WebsocketRequest, String)> {
let mut new_request = request.clone();
let (authentication_type, authentication, authentication_context_id) =
window.db().resolve_auth_for_websocket_request(request)?;
new_request.authentication_type = authentication_type;
new_request.authentication = authentication;
let headers = window.db().resolve_headers_for_websocket_request(request)?;
new_request.headers = headers;
Ok((new_request, authentication_context_id))
}
/// Convert WS URL to HTTP URL for cookie filtering
/// WebSocket upgrade requests are HTTP requests initially, so HttpOnly cookies should apply
fn convert_ws_url_to_http(ws_url: &Url) -> Url {
let mut http_url = ws_url.clone();
match ws_url.scheme() {
"ws" => {
http_url.set_scheme("http").expect("Failed to set http scheme");
}
"wss" => {
http_url.set_scheme("https").expect("Failed to set https scheme");
}
_ => {
// Already HTTP/HTTPS, no conversion needed
}
}
http_url
}
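
For illustration, a minimal sketch (with a hypothetical URL) of the scheme swap performed by `convert_ws_url_to_http` above: only the scheme changes, so the host, path, and query used for cookie matching are preserved.

```rust
use url::Url;

fn main() {
    let mut url = Url::parse("wss://example.com/chat?room=1").unwrap();
    // wss -> https is permitted because both are "special" schemes in the URL spec.
    url.set_scheme("https").expect("Failed to set https scheme");
    assert_eq!(url.as_str(), "https://example.com/chat?room=1");
}
```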

Binary file not shown.


View File

@@ -1,51 +0,0 @@
{
"productName": "Yaak",
"version": "0.0.0",
"identifier": "app.yaak.desktop",
"build": {
"beforeBuildCommand": "npm run tauri-before-build",
"beforeDevCommand": "npm run tauri-before-dev",
"devUrl": "http://localhost:1420",
"frontendDist": "../../dist"
},
"app": {
"withGlobalTauri": false,
"security": {
"assetProtocol": {
"enable": true,
"scope": {
"allow": [
"$APPDATA/responses/*",
"$RESOURCE/static/*"
]
}
}
}
},
"plugins": {
"deep-link": {
"desktop": {
"schemes": [
"yaak"
]
}
}
},
"bundle": {
"icon": [
"icons/release/32x32.png",
"icons/release/128x128.png",
"icons/release/128x128@2x.png",
"icons/release/icon.icns",
"icons/release/icon.ico"
],
"resources": [
"static",
"vendored/protoc/include",
"vendored/plugins",
"vendored/plugin-runtime",
"vendored/node/yaaknode*",
"vendored/protoc/yaakprotoc*"
]
}
}

View File

@@ -1,60 +0,0 @@
{
"build": {
"features": ["updater", "license"]
},
"app": {
"security": {
"capabilities": [
"default",
{
"identifier": "release",
"windows": ["*"],
"permissions": ["yaak-license:default"]
}
]
}
},
"plugins": {
"updater": {
"endpoints": [
"https://update.yaak.app/check/{{target}}/{{arch}}/{{current_version}}"
],
"pubkey": "dW50cnVzdGVkIGNvbW1lbnQ6IG1pbmlzaWduIHB1YmxpYyBrZXk6IEVGRkFGMjQxRUNEOTQ3MzAKUldRd1I5bnNRZkw2NzRtMnRlWTN3R24xYUR3aGRsUjJzWGwvdHdEcGljb3ZJMUNlMjFsaHlqVU4K"
}
},
"bundle": {
"publisher": "Yaak",
"license": "MIT",
"copyright": "Yaak",
"homepage": "https://yaak.app",
"active": true,
"category": "DeveloperTool",
"createUpdaterArtifacts": true,
"longDescription": "A cross-platform desktop app for interacting with REST, GraphQL, and gRPC",
"shortDescription": "Play with APIs, intuitively",
"targets": ["app", "appimage", "deb", "dmg", "nsis", "rpm"],
"macOS": {
"minimumSystemVersion": "13.0",
"exceptionDomain": "",
"entitlements": "macos/entitlements.plist",
"frameworks": []
},
"windows": {
"signCommand": "trusted-signing-cli -e https://eus.codesigning.azure.net/ -a Yaak -c yaakapp %1"
},
"linux": {
"deb": {
"desktopTemplate": "./template.desktop",
"files": {
"/usr/share/metainfo/app.yaak.Yaak.metainfo.xml": "../../flatpak/app.yaak.Yaak.metainfo.xml"
}
},
"rpm": {
"desktopTemplate": "./template.desktop",
"files": {
"/usr/share/metainfo/app.yaak.Yaak.metainfo.xml": "../../flatpak/app.yaak.Yaak.metainfo.xml"
}
}
}
}
}

View File

@@ -1,9 +0,0 @@
[Desktop Entry]
Categories={{categories}}
Comment={{comment}}
Exec={{exec}}
Icon={{icon}}
Name={{name}}
StartupWMClass={{exec}}
Terminal=false
Type=Application

View File

@@ -1,16 +0,0 @@
[package]
name = "yaak-fonts"
links = "yaak-fonts"
version = "0.1.0"
edition = "2021"
publish = false
[dependencies]
font-loader = "0.11.0"
tauri = { workspace = true }
ts-rs = { workspace = true }
serde = "1.0"
thiserror = { workspace = true }
[build-dependencies]
tauri-plugin = { workspace = true, features = ["build"] }

View File

@@ -1,3 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type Fonts = { editorFonts: Array<string>, uiFonts: Array<string>, };

View File

@@ -1,14 +0,0 @@
import { useQuery } from '@tanstack/react-query';
import { invoke } from '@tauri-apps/api/core';
import { Fonts } from './bindings/gen_fonts';
export async function listFonts() {
return invoke<Fonts>('plugin:yaak-fonts|list', {});
}
export function useFonts() {
return useQuery({
queryKey: ['list_fonts'],
queryFn: () => listFonts(),
});
}

View File

@@ -1,6 +0,0 @@
{
"name": "@yaakapp-internal/fonts",
"private": true,
"version": "1.0.0",
"main": "index.ts"
}

View File

@@ -1,38 +0,0 @@
use crate::Result;
use font_loader::system_fonts;
use serde::{Deserialize, Serialize};
use std::collections::HashSet;
use tauri::command;
use ts_rs::TS;
#[derive(Default, Debug, Clone, Serialize, Deserialize, TS, PartialEq)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "gen_fonts.ts")]
pub struct Fonts {
pub editor_fonts: Vec<String>,
pub ui_fonts: Vec<String>,
}
#[command]
pub(crate) async fn list() -> Result<Fonts> {
let mut ui_fonts = HashSet::new();
let mut editor_fonts = HashSet::new();
let mut property = system_fonts::FontPropertyBuilder::new().monospace().build();
for font in &system_fonts::query_specific(&mut property) {
editor_fonts.insert(font.to_string());
}
for font in &system_fonts::query_all() {
if !editor_fonts.contains(font) {
ui_fonts.insert(font.to_string());
}
}
let mut ui_fonts: Vec<String> = ui_fonts.into_iter().collect();
let mut editor_fonts: Vec<String> = editor_fonts.into_iter().collect();
ui_fonts.sort();
editor_fonts.sort();
Ok(Fonts { ui_fonts, editor_fonts })
}

View File

@@ -1,15 +0,0 @@
use serde::{ser::Serializer, Serialize};
#[derive(Debug, thiserror::Error)]
pub enum Error {}
impl Serialize for Error {
fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
where
S: Serializer,
{
serializer.serialize_str(self.to_string().as_ref())
}
}
pub type Result<T> = std::result::Result<T, Error>;

View File

@@ -1,15 +0,0 @@
use tauri::{
generate_handler,
plugin::{Builder, TauriPlugin},
Runtime,
};
mod commands;
mod error;
use crate::commands::list;
pub use error::{Error, Result};
pub fn init<R: Runtime>() -> TauriPlugin<R> {
Builder::new("yaak-fonts").invoke_handler(generate_handler![list]).build()
}

View File

@@ -1,11 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type APIErrorResponsePayload = { error: string, message: string, };
export type ActivateLicenseRequestPayload = { licenseKey: string, appVersion: string, appPlatform: string, };
export type ActivateLicenseResponsePayload = { activationId: string, };
export type DeactivateLicenseRequestPayload = { appVersion: string, appPlatform: string, };
export type LicenseCheckStatus = { "status": "personal_use", "data": { trial_ended: string, } } | { "status": "trialing", "data": { end: string, } } | { "status": "error", "data": { message: string, code: string, } } | { "status": "active", "data": { periodEnd: string, cancelAt: string | null, } } | { "status": "inactive", "data": { status: string, } } | { "status": "expired", "data": { changes: number, changesUrl: string | null, billingUrl: string, periodEnd: string, } } | { "status": "past_due", "data": { billingUrl: string, periodEnd: string, } };

View File

@@ -1,5 +0,0 @@
const COMMANDS: &[&str] = &["activate", "deactivate", "check"];
fn main() {
tauri_plugin::Builder::new(COMMANDS).build();
}

View File

@@ -1,3 +0,0 @@
[default]
description = "Default permissions for the plugin"
permissions = ["allow-check", "allow-activate", "allow-deactivate"]

View File

@@ -1,18 +0,0 @@
use crate::error::Result;
use crate::{LicenseCheckStatus, activate_license, check_license, deactivate_license};
use tauri::{Runtime, WebviewWindow, command};
#[command]
pub async fn check<R: Runtime>(window: WebviewWindow<R>) -> Result<LicenseCheckStatus> {
check_license(&window).await
}
#[command]
pub async fn activate<R: Runtime>(license_key: &str, window: WebviewWindow<R>) -> Result<()> {
activate_license(&window, license_key).await
}
#[command]
pub async fn deactivate<R: Runtime>(window: WebviewWindow<R>) -> Result<()> {
deactivate_license(&window).await
}

View File

@@ -1,17 +0,0 @@
use tauri::{
Runtime, generate_handler,
plugin::{Builder, TauriPlugin},
};
mod commands;
pub mod error;
mod license;
use crate::commands::{activate, check, deactivate};
pub use license::*;
pub fn init<R: Runtime>() -> TauriPlugin<R> {
Builder::new("yaak-license")
.invoke_handler(generate_handler![check, activate, deactivate])
.build()
}

View File

@@ -1,242 +0,0 @@
use crate::error::Error::{ClientError, JsonError, ServerError};
use crate::error::Result;
use chrono::{DateTime, Utc};
use log::{info, warn};
use serde::{Deserialize, Serialize};
use std::ops::Add;
use std::time::Duration;
use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow, is_dev};
use ts_rs::TS;
use yaak_api::yaak_api_client;
use yaak_common::platform::get_os_str;
use yaak_models::db_context::DbContext;
use yaak_models::query_manager::QueryManager;
use yaak_models::util::UpdateSource;
/// Extension trait for accessing the QueryManager from Tauri Manager types.
/// This is needed temporarily until all crates are refactored to not use Tauri.
trait QueryManagerExt<'a, R> {
fn db(&'a self) -> DbContext<'a>;
}
impl<'a, R: Runtime, M: Manager<R>> QueryManagerExt<'a, R> for M {
fn db(&'a self) -> DbContext<'a> {
let qm = self.state::<QueryManager>();
qm.inner().connect()
}
}
const KV_NAMESPACE: &str = "license";
const KV_ACTIVATION_ID_KEY: &str = "activation_id";
const TRIAL_SECONDS: u64 = 3600 * 24 * 30;
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "gen_models.ts")]
pub struct CheckActivationRequestPayload {
pub app_version: String,
pub app_platform: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "license.ts")]
pub struct ActivateLicenseRequestPayload {
pub license_key: String,
pub app_version: String,
pub app_platform: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "license.ts")]
pub struct DeactivateLicenseRequestPayload {
pub app_version: String,
pub app_platform: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "license.ts")]
pub struct ActivateLicenseResponsePayload {
pub activation_id: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "license.ts")]
pub struct APIErrorResponsePayload {
pub error: String,
pub message: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "snake_case", tag = "status", content = "data")]
#[ts(export, export_to = "license.ts")]
pub enum LicenseCheckStatus {
// Local Types
PersonalUse {
trial_ended: DateTime<Utc>,
},
Trialing {
end: DateTime<Utc>,
},
Error {
message: String,
code: String,
},
// Server Types
Active {
#[serde(rename = "periodEnd")]
period_end: DateTime<Utc>,
#[serde(rename = "cancelAt")]
cancel_at: Option<DateTime<Utc>>,
},
Inactive {
status: String,
},
Expired {
changes: i32,
#[serde(rename = "changesUrl")]
changes_url: Option<String>,
#[serde(rename = "billingUrl")]
billing_url: String,
#[serde(rename = "periodEnd")]
period_end: DateTime<Utc>,
},
PastDue {
#[serde(rename = "billingUrl")]
billing_url: String,
#[serde(rename = "periodEnd")]
period_end: DateTime<Utc>,
},
}
pub async fn activate_license<R: Runtime>(
window: &WebviewWindow<R>,
license_key: &str,
) -> Result<()> {
info!("Activating license {}", license_key);
let app_version = window.app_handle().package_info().version.to_string();
let client = yaak_api_client(&app_version)?;
let payload = ActivateLicenseRequestPayload {
license_key: license_key.to_string(),
app_platform: get_os_str().to_string(),
app_version,
};
let response = client.post(build_url("/licenses/activate")).json(&payload).send().await?;
if response.status().is_client_error() {
let body: APIErrorResponsePayload = response.json().await?;
return Err(ClientError { message: body.message, error: body.error });
}
if response.status().is_server_error() {
return Err(ServerError);
}
let body: ActivateLicenseResponsePayload = response.json().await?;
window.app_handle().db().set_key_value_str(
KV_ACTIVATION_ID_KEY,
KV_NAMESPACE,
body.activation_id.as_str(),
&UpdateSource::from_window_label(window.label()),
);
if let Err(e) = window.emit("license-activated", true) {
warn!("Failed to emit check-license event: {}", e);
}
Ok(())
}
pub async fn deactivate_license<R: Runtime>(window: &WebviewWindow<R>) -> Result<()> {
info!("Deactivating activation");
let app_handle = window.app_handle();
let activation_id = get_activation_id(app_handle).await;
let app_version = window.app_handle().package_info().version.to_string();
let client = yaak_api_client(&app_version)?;
let path = format!("/licenses/activations/{}/deactivate", activation_id);
let payload =
DeactivateLicenseRequestPayload { app_platform: get_os_str().to_string(), app_version };
let response = client.post(build_url(&path)).json(&payload).send().await?;
if response.status().is_client_error() {
let body: APIErrorResponsePayload = response.json().await?;
return Err(ClientError { message: body.message, error: body.error });
}
if response.status().is_server_error() {
return Err(ServerError);
}
app_handle.db().delete_key_value(
KV_ACTIVATION_ID_KEY,
KV_NAMESPACE,
&UpdateSource::from_window_label(window.label()),
)?;
if let Err(e) = app_handle.emit("license-deactivated", true) {
warn!("Failed to emit deactivate-license event: {}", e);
}
Ok(())
}
pub async fn check_license<R: Runtime>(window: &WebviewWindow<R>) -> Result<LicenseCheckStatus> {
let app_version = window.app_handle().package_info().version.to_string();
let payload =
CheckActivationRequestPayload { app_platform: get_os_str().to_string(), app_version };
let activation_id = get_activation_id(window.app_handle()).await;
let settings = window.db().get_settings();
let trial_end = settings.created_at.add(Duration::from_secs(TRIAL_SECONDS)).and_utc();
let has_activation_id = !activation_id.is_empty();
let trial_period_active = Utc::now() < trial_end;
match (has_activation_id, trial_period_active) {
(false, true) => Ok(LicenseCheckStatus::Trialing { end: trial_end }),
(false, false) => Ok(LicenseCheckStatus::PersonalUse { trial_ended: trial_end }),
(true, _) => {
info!("Checking license activation");
// A license has been activated, so let's check the license server
let client = yaak_api_client(&payload.app_version)?;
let path = format!("/licenses/activations/{activation_id}/check-v2");
let response = client.post(build_url(&path)).json(&payload).send().await?;
if response.status().is_client_error() {
let body: APIErrorResponsePayload = response.json().await?;
return Err(ClientError { message: body.message, error: body.error });
}
if response.status().is_server_error() {
warn!("Failed to check license {}", response.status());
return Err(ServerError);
}
let body_text = response.text().await?;
match serde_json::from_str::<LicenseCheckStatus>(&body_text) {
Ok(b) => Ok(b),
Err(e) => {
warn!("Failed to decode server response: {} {:?}", body_text, e);
Err(JsonError(e))
}
}
}
}
}
fn build_url(path: &str) -> String {
if is_dev() {
format!("http://localhost:9444{path}")
} else {
format!("https://license.yaak.app{path}")
}
}
pub async fn get_activation_id<R: Runtime>(app_handle: &AppHandle<R>) -> String {
app_handle.db().get_key_value_str(KV_ACTIVATION_ID_KEY, KV_NAMESPACE, "")
}
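
As an aside, a minimal sketch of the local decision in `check_license` above: when no activation ID is stored, only the 30-day window after the settings row was created determines the status (the helper and its string labels are hypothetical).

```rust
use chrono::{DateTime, Duration, Utc};

fn local_license_state(
    created_at: DateTime<Utc>,
    has_activation_id: bool,
    now: DateTime<Utc>,
) -> &'static str {
    let trial_end = created_at + Duration::days(30); // TRIAL_SECONDS = 3600 * 24 * 30
    match (has_activation_id, now < trial_end) {
        (false, true) => "trialing",
        (false, false) => "personal_use",
        (true, _) => "ask the license server",
    }
}
```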

View File

@@ -1,22 +0,0 @@
[package]
name = "yaak-mac-window"
links = "yaak-mac-window"
version = "0.1.0"
edition = "2024"
publish = false
[lints.rust]
unexpected_cfgs = { level = "warn", check-cfg = ['cfg(feature, values("cargo-clippy"))'] }
[build-dependencies]
tauri-plugin = { workspace = true, features = ["build"] }
[target.'cfg(target_os = "macos")'.dependencies]
cocoa = "0.26.0"
log = { workspace = true }
objc = "0.2.7"
rand = "0.9.0"
csscolorparser = "0.7.2"
[dependencies]
tauri = { workspace = true }

View File

@@ -1,5 +0,0 @@
const COMMANDS: &[&str] = &["set_title", "set_theme"];
fn main() {
tauri_plugin::Builder::new(COMMANDS).build();
}

View File

@@ -1,9 +0,0 @@
import { invoke } from '@tauri-apps/api/core';
export function setWindowTitle(title: string) {
invoke('plugin:yaak-mac-window|set_title', { title }).catch(console.error);
}
export function setWindowTheme(bgColor: string) {
invoke('plugin:yaak-mac-window|set_theme', { bgColor }).catch(console.error);
}

View File

@@ -1,6 +0,0 @@
{
"name": "@yaakapp-internal/mac-window",
"private": true,
"version": "1.0.0",
"main": "index.ts"
}

View File

@@ -1,6 +0,0 @@
[default]
description = "Default permissions for the plugin"
permissions = [
"allow-set-title",
"allow-set-theme",
]

View File

@@ -1,36 +0,0 @@
use tauri::{Runtime, Window, command};
#[command]
pub(crate) fn set_title<R: Runtime>(window: Window<R>, title: &str) {
#[cfg(target_os = "macos")]
{
crate::mac::update_window_title(window, title.to_string());
}
#[cfg(not(target_os = "macos"))]
{
let _ = window.set_title(title);
}
}
#[command]
#[allow(unused)]
pub(crate) fn set_theme<R: Runtime>(window: Window<R>, bg_color: &str) {
#[cfg(target_os = "macos")]
{
use log::warn;
match csscolorparser::parse(bg_color.trim()) {
Ok(color) => {
crate::mac::update_window_theme(window, color);
}
Err(err) => {
warn!("Failed to parse background color '{}': {}", bg_color, err)
}
}
}
#[cfg(not(target_os = "macos"))]
{
// Nothing yet for non-Mac platforms
}
}

View File

@@ -1,43 +0,0 @@
mod commands;
#[cfg(target_os = "macos")]
mod mac;
use crate::commands::{set_theme, set_title};
use std::sync::atomic::AtomicBool;
use tauri::{Manager, Runtime, generate_handler, plugin, plugin::TauriPlugin};
pub trait AppHandleMacWindowExt {
/// Sets whether to use the native titlebar
fn set_native_titlebar(&self, enable: bool);
}
impl<R: Runtime> AppHandleMacWindowExt for tauri::AppHandle<R> {
fn set_native_titlebar(&self, enable: bool) {
self.state::<PluginState>()
.native_titlebar
.store(enable, std::sync::atomic::Ordering::Relaxed);
}
}
pub(crate) struct PluginState {
native_titlebar: AtomicBool,
}
pub fn init<R: Runtime>() -> TauriPlugin<R> {
let mut builder = plugin::Builder::new("yaak-mac-window")
.setup(move |app, _| {
app.manage(PluginState { native_titlebar: AtomicBool::new(false) });
Ok(())
})
.invoke_handler(generate_handler![set_title, set_theme]);
#[cfg(target_os = "macos")]
{
builder = builder.on_window_ready(move |window| {
mac::setup_traffic_light_positioner(&window);
});
}
builder.build()
}

Some files were not shown because too many files have changed in this diff.