Compare commits

..

1 Commits

Author SHA1 Message Date
Gregory Schier
f640a7d9c0 Transform CPU 2025-01-21 12:05:44 -08:00
1296 changed files with 233037 additions and 85263 deletions

View File

@@ -1,72 +0,0 @@
# Claude Context: Detaching Tauri from Yaak
## Goal
Make Yaak runnable as a standalone CLI without Tauri as a dependency. The core Rust crates in `crates/` should be usable independently, while Tauri-specific code lives in `crates-tauri/`.
## Project Structure
```
crates/ # Core crates - should NOT depend on Tauri
crates-tauri/ # Tauri-specific crates (yaak-app, yaak-tauri-utils, etc.)
crates-cli/ # CLI crate (yaak-cli)
```
## Completed Work
### 1. Folder Restructure
- Moved Tauri-dependent app code to `crates-tauri/yaak-app/`
- Created `crates-tauri/yaak-tauri-utils/` for shared Tauri utilities (window traits, api_client, error handling)
- Created `crates-cli/yaak-cli/` for the standalone CLI
### 2. Decoupled Crates (no longer depend on Tauri)
- **yaak-models**: Uses `init_standalone()` pattern for CLI database access
- **yaak-http**: Removed Tauri plugin, HttpConnectionManager initialized in yaak-app setup
- **yaak-common**: Only contains Tauri-free utilities (serde, platform)
- **yaak-crypto**: Removed Tauri plugin, EncryptionManager initialized in yaak-app setup, commands moved to yaak-app
- **yaak-grpc**: Replaced AppHandle with GrpcConfig struct, uses tokio::process::Command instead of Tauri sidecar
### 3. CLI Implementation
- Basic CLI at `crates-cli/yaak-cli/src/main.rs`
- Commands: workspaces, requests, send (by ID), get (ad-hoc URL), create
- Uses same database as Tauri app via `yaak_models::init_standalone()`
## Remaining Work
### Crates Still Depending on Tauri (in `crates/`)
1. **yaak-git** (3 files) - Moderate complexity
2. **yaak-plugins** (13 files) - **Hardest** - deeply integrated with Tauri for plugin-to-window communication
3. **yaak-sync** (4 files) - Moderate complexity
4. **yaak-ws** (5 files) - Moderate complexity
### Pattern for Decoupling
1. Remove Tauri plugin `init()` function from the crate
2. Move commands to `yaak-app/src/commands.rs` or keep inline in `lib.rs`
3. Move extension traits (e.g., `SomethingManagerExt`) to yaak-app or yaak-tauri-utils
4. Initialize managers in yaak-app's `.setup()` block
5. Remove `tauri` from Cargo.toml dependencies
6. Update `crates-tauri/yaak-app/capabilities/default.json` to remove the plugin permission
7. Replace `tauri::async_runtime::block_on` with `tokio::runtime::Handle::current().block_on()`
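A minimal sketch of the step-7 swap, assuming an ambient tokio runtime; `load_name` and the IDs are hypothetical stand-ins rather than real Yaak functions:
```rust
// Hypothetical async helper standing in for a real async database query.
async fn load_name(id: &str) -> String {
    format!("workspace:{id}")
}

// Sync call site: replaces `tauri::async_runtime::block_on`. It needs a tokio
// runtime to be "current" and must not run on a thread that is polling tasks.
fn load_name_blocking(id: &str) -> String {
    tokio::runtime::Handle::current().block_on(load_name(id))
}

fn main() {
    let rt = tokio::runtime::Runtime::new().expect("tokio runtime");
    rt.block_on(async {
        // Hop to a blocking thread before calling the sync helper; blocking
        // threads inherit the runtime context, so Handle::current() resolves.
        let name = tokio::task::spawn_blocking(|| load_name_blocking("wk_123"))
            .await
            .expect("blocking task");
        println!("{name}");
    });
}
```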
## Key Files
- `crates-tauri/yaak-app/src/lib.rs` - Main Tauri app, setup block initializes managers
- `crates-tauri/yaak-app/src/commands.rs` - Migrated Tauri commands
- `crates-tauri/yaak-app/src/models_ext.rs` - Database plugin and extension traits
- `crates-tauri/yaak-tauri-utils/src/window.rs` - WorkspaceWindowTrait for window state
- `crates/yaak-models/src/lib.rs` - Contains `init_standalone()` for CLI usage
## Git Branch
Working on `detach-tauri` branch.
## Recent Commits
```
c40cff40 Remove Tauri dependencies from yaak-crypto and yaak-grpc
df495f1d Move Tauri utilities from yaak-common to yaak-tauri-utils
481e0273 Remove Tauri dependencies from yaak-http and yaak-common
10568ac3 Add HTTP request sending to yaak-cli
bcb7d600 Add yaak-cli stub with basic database access
e718a5f1 Refactor models_ext to use init_standalone from yaak-models
```
## Testing
- Run `cargo check -p <crate>` to verify a crate builds without Tauri
- Run `npm run app-dev` to test the Tauri app still works
- Run `cargo run -p yaak-cli -- --help` to test the CLI

View File

@@ -1,51 +0,0 @@
---
description: Review a PR in a new worktree
allowed-tools: Bash(git worktree:*), Bash(gh pr:*)
---
Review a GitHub pull request in a new git worktree.
## Usage
```
/review-pr <PR_NUMBER>
```
## What to do
1. List all open pull requests and ask the user to select one
2. Get PR information using `gh pr view <PR_NUMBER> --json number,headRefName`
3. Extract the branch name from the PR
4. Create a new worktree at `../yaak-worktrees/pr-<PR_NUMBER>` using `git worktree add` with a timeout of at least 300000ms (5 minutes) since the post-checkout hook runs a bootstrap script
5. Checkout the PR branch in the new worktree using `gh pr checkout <PR_NUMBER>`
6. The post-checkout hook will automatically:
- Create `.env.local` with unique ports
- Copy editor config folders
- Run `npm install && npm run bootstrap`
7. Inform the user:
- Where the worktree was created
- What ports were assigned
- How to access it (cd command)
- How to run the dev server
- How to remove the worktree when done
## Example Output
```
Created worktree for PR #123 at ../yaak-worktrees/pr-123
Branch: feature-auth
Ports: Vite (1421), MCP (64344)
To start working:
cd ../yaak-worktrees/pr-123
npm run app-dev
To remove when done:
git worktree remove ../yaak-worktrees/pr-123
```
## Error Handling
- If the PR doesn't exist, show a helpful error
- If the worktree already exists, inform the user and ask if they want to remove and recreate it
- If `gh` CLI is not available, inform the user to install it

View File

@@ -1,47 +0,0 @@
---
description: Generate formatted release notes for Yaak releases
allowed-tools: Bash(git tag:*)
---
Generate formatted release notes for Yaak releases by analyzing git history and pull request descriptions.
## What to do
1. Identifies the version tag and previous version
2. Retrieves all commits between versions
- If the version is a beta version, it retrieves commits between the beta version and previous beta version
- If the version is a stable version, it retrieves commits between the stable version and the previous stable version
3. Fetches PR descriptions for linked issues to find:
- Feedback URLs (feedback.yaak.app)
- Additional context and descriptions
- Installation links for plugins
4. Formats the release notes using the standard Yaak format:
- Changelog badge at the top
- Bulleted list of changes with PR links
- Feedback links where available
- Full changelog comparison link at the bottom
## Output Format
The skill generates markdown-formatted release notes following this structure:
```markdown
[![Changelog](https://img.shields.io/badge/Changelog-VERSION-blue)](https://yaak.app/changelog/VERSION)
- Feature/fix description by @username in [#123](https://github.com/mountain-loop/yaak/pull/123)
- [Linked feedback item](https://feedback.yaak.app/p/item) by @username in [#456](https://github.com/mountain-loop/yaak/pull/456)
- A simple item that doesn't have a feedback or PR link
**Full Changelog**: https://github.com/mountain-loop/yaak/compare/vPREV...vCURRENT
```
**IMPORTANT**: Always add blank lines around the markdown code fence and output the markdown code block last
**IMPORTANT**: PRs by `@gschier` should not mention the @username
## After Generating Release Notes
After outputting the release notes, ask the user if they would like to create a draft GitHub release with these notes. If they confirm, create the release using:
```bash
gh release create <tag> --draft --prerelease --title "<tag>" --notes '<release notes>'
```

View File

@@ -1,27 +0,0 @@
# Project Rules
## General Development
- **NEVER** commit or push without explicit confirmation
## Build and Lint
- **ALWAYS** run `npm run lint` after modifying TypeScript or JavaScript files
- Run `npm run bootstrap` after changing plugin runtime or MCP server code
## Plugin System
### Backend Constraints
- Always use `UpdateSource::Plugin` when calling database methods from plugin events
- Never send timestamps (`createdAt`, `updatedAt`) from TypeScript - Rust backend controls these
- Backend uses `NaiveDateTime` (no timezone) so avoid sending ISO timestamp strings
### MCP Server
- MCP server has **no active window context** - cannot call `window.workspaceId()`
- Get workspace ID from `workspaceCtx.yaak.workspace.list()` instead
## Rust Type Generation
- Run `cargo test --package yaak-plugins` (and the equivalent for other crates) to regenerate TypeScript bindings after modifying Rust event types

View File

@@ -1,35 +0,0 @@
# Worktree Management Skill
## Creating Worktrees
When creating git worktrees for this project, ALWAYS use the path format:
```
../yaak-worktrees/<NAME>
```
For example:
- `git worktree add ../yaak-worktrees/feature-auth`
- `git worktree add ../yaak-worktrees/bugfix-login`
- `git worktree add ../yaak-worktrees/refactor-api`
## What Happens Automatically
The post-checkout hook will automatically:
1. Create `.env.local` with unique ports (YAAK_DEV_PORT and YAAK_PLUGIN_MCP_SERVER_PORT)
2. Copy gitignored editor config folders (.zed, .idea, etc.)
3. Run `npm install && npm run bootstrap`
## Deleting Worktrees
```bash
git worktree remove ../yaak-worktrees/<NAME>
```
## Port Assignments
- Main worktree: 1420 (Vite), 64343 (MCP)
- First worktree: 1421, 64344
- Second worktree: 1422, 64345
- etc.
Each worktree can run `npm run app-dev` simultaneously without conflicts.

6
.eslintignore Normal file
View File

@@ -0,0 +1,6 @@
node_modules/
dist/
.eslintrc.cjs
.prettierrc.cjs
src-web/postcss.config.cjs
src-web/vite.config.ts

49
.eslintrc.cjs Normal file
View File

@@ -0,0 +1,49 @@
module.exports = {
extends: [
'eslint:recommended',
'plugin:react/recommended',
'plugin:react-hooks/recommended',
'plugin:import/recommended',
'plugin:jsx-a11y/recommended',
'plugin:@typescript-eslint/recommended',
'eslint-config-prettier',
],
plugins: ['react-refresh'],
parser: '@typescript-eslint/parser',
parserOptions: {
project: ['./tsconfig.json'],
},
ignorePatterns: [
'scripts/**/*',
'packages/plugin-runtime/**/*',
'packages/plugin-runtime-types/**/*',
'src-tauri/**/*',
'src-web/tailwind.config.cjs',
'src-web/vite.config.ts',
],
settings: {
react: {
version: 'detect',
},
'import/resolver': {
node: {
paths: ['src-web'],
extensions: ['.ts', '.tsx'],
},
},
},
rules: {
'react-refresh/only-export-components': 'error',
'jsx-a11y/no-autofocus': 'off',
'react/react-in-jsx-scope': 'off',
'import/no-unresolved': 'off',
'@typescript-eslint/consistent-type-imports': [
'error',
{
prefer: 'type-imports',
disallowTypeAnnotations: true,
fixStyle: 'separate-type-imports',
},
],
},
};

9
.gitattributes vendored
View File

@@ -1,7 +1,2 @@
crates-tauri/yaak-app/vendored/**/* linguist-generated=true
crates-tauri/yaak-app/gen/schemas/**/* linguist-generated=true
**/bindings/* linguist-generated=true
crates/yaak-templates/pkg/* linguist-generated=true
# Ensure consistent line endings for test files that check exact content
crates/yaak-http/tests/test.txt text eol=lf
src-tauri/vendored/**/* linguist-generated=true
src-tauri/gen/schemas/**/* linguist-generated=true

3
.github/FUNDING.yml vendored
View File

@@ -1,3 +0,0 @@
# These are supported funding model platforms
github: gschier

View File

@@ -1,38 +0,0 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]
**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.

18
.github/workflows/ci-js.yml vendored Normal file
View File

@@ -0,0 +1,18 @@
on:
pull_request:
branches: [develop]
name: CI (JS)
jobs:
test:
name: Lint/Test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 20
- run: npm ci
- run: npm run lint
- run: npm test

36
.github/workflows/ci-rust.yml vendored Normal file
View File

@@ -0,0 +1,36 @@
on:
pull_request:
branches: [develop]
paths:
- src-tauri/**
- .github/workflows/**
name: CI (Rust)
defaults:
run:
working-directory: src-tauri
jobs:
test:
name: Check/Test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev
- uses: dtolnay/rust-toolchain@stable
- uses: actions/cache@v3
continue-on-error: false
with:
path: |
~/.cargo/bin/
~/.cargo/registry/index/
~/.cargo/registry/cache/
~/.cargo/git/db/
target/
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
restore-keys: ${{ runner.os }}-cargo-
- run: cargo check --all
- run: cargo test --all

View File

@@ -1,30 +0,0 @@
on:
pull_request:
push:
branches:
- main
name: Lint and Test
permissions:
contents: read
jobs:
test:
name: Lint/Test
runs-on: macos-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
- uses: dtolnay/rust-toolchain@stable
- uses: Swatinem/rust-cache@v2
with:
shared-key: ci
cache-on-failure: true
- run: npm ci
- run: npm run bootstrap
- run: npm run lint
- name: Run JS Tests
run: npm test
- name: Run Rust Tests
run: cargo test --all

View File

@@ -1,50 +0,0 @@
name: Claude Code
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
issues:
types: [opened, assigned]
pull_request_review:
types: [submitted]
jobs:
claude:
if: |
(github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
(github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
(github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: read
issues: read
id-token: write
actions: read # Required for Claude to read CI results on PRs
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Run Claude Code
id: claude
uses: anthropics/claude-code-action@v1
with:
claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
# This is an optional setting that allows Claude to read CI results on PRs
additional_permissions: |
actions: read
# Optional: Give a custom prompt to Claude. If this is not specified, Claude will perform the instructions specified in the comment that tagged it.
# prompt: 'Update the pull request description to include a summary of changes.'
# Optional: Add claude_args to customize behavior and configuration
# See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
# or https://code.claude.com/docs/en/cli-reference for available options
# claude_args: '--allowed-tools Bash(gh pr:*)'

View File

@@ -1,7 +1,10 @@
name: Generate Artifacts
on:
push:
tags: [v*]
tags: [ v* ]
env:
YAAK_PLUGINS_DIR: checkout/plugins
jobs:
build-artifacts:
@@ -13,37 +16,18 @@ jobs:
fail-fast: false
matrix:
include:
- platform: "macos-latest" # for Arm-based Macs (M1 and above).
args: "--target aarch64-apple-darwin"
yaak_arch: "arm64"
os: "macos"
targets: "aarch64-apple-darwin"
- platform: "macos-latest" # for Intel-based Macs.
args: "--target x86_64-apple-darwin"
yaak_arch: "x64"
os: "macos"
targets: "x86_64-apple-darwin"
- platform: "ubuntu-22.04"
args: ""
yaak_arch: "x64"
os: "ubuntu"
targets: ""
- platform: "ubuntu-22.04-arm"
args: ""
yaak_arch: "arm64"
os: "ubuntu"
targets: ""
- platform: "windows-latest"
args: ""
yaak_arch: "x64"
os: "windows"
targets: ""
# Windows ARM64
- platform: "windows-latest"
args: "--target aarch64-pc-windows-msvc"
yaak_arch: "arm64"
os: "windows"
targets: "aarch64-pc-windows-msvc"
- platform: 'macos-latest' # for Arm-based Macs (M1 and above).
args: '--target aarch64-apple-darwin'
yaak_arch: 'arm64'
- platform: 'macos-latest' # for Intel-based Macs.
args: '--target x86_64-apple-darwin'
yaak_arch: 'x64'
- platform: 'ubuntu-22.04'
args: ''
yaak_arch: 'x64'
- platform: 'windows-latest'
args: ''
yaak_arch: 'x64'
runs-on: ${{ matrix.platform }}
timeout-minutes: 40
steps:
@@ -52,81 +36,67 @@ jobs:
- name: Setup Node
uses: actions/setup-node@v4
with:
node-version: 22
- name: install Rust stable
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.targets }}
# Those targets are only used on macos runners so it's in an `if` to slightly speed up windows and linux builds.
targets: ${{ matrix.platform == 'macos-latest' && 'aarch64-apple-darwin,x86_64-apple-darwin' || '' }}
- uses: Swatinem/rust-cache@v2
- uses: actions/cache@v3
continue-on-error: false
with:
shared-key: ci
cache-on-failure: true
path: |
~/.cargo/bin/
~/.cargo/registry/index/
~/.cargo/registry/cache/
~/.cargo/git/db/
src-tauri/target/
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
restore-keys: ${{ runner.os }}-cargo-
- name: install dependencies (Linux only)
if: matrix.os == 'ubuntu'
- name: install dependencies (ubuntu only)
if: matrix.platform == 'ubuntu-22.04' # This must match the platform value defined above.
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf xdg-utils
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf
- name: install dependencies (windows only)
if: matrix.platform == 'windows-latest'
run: cargo install --force trusted-signing-cli
- name: Install NPM Dependencies
run: |
npm ci
npm install @yaakapp/cli
- name: Install Protoc for plugin-runtime
uses: arduino/setup-protoc@v3
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
- name: Install trusted-signing-cli (Windows only)
if: matrix.os == 'windows'
shell: pwsh
run: |
$ErrorActionPreference = 'Stop'
$dir = "$env:USERPROFILE\trusted-signing"
New-Item -ItemType Directory -Force -Path $dir | Out-Null
$url = "https://github.com/Levminer/trusted-signing-cli/releases/download/0.8.0/trusted-signing-cli.exe"
$exe = Join-Path $dir "trusted-signing-cli.exe"
Invoke-WebRequest -Uri $url -OutFile $exe
echo $dir >> $env:GITHUB_PATH
& $exe --version
- name: Run JS build
run: npm run build
- run: npm ci
- run: npm run bootstrap
env:
YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
- run: npm run lint
- name: Run JS Tests
run: npm test
- name: Run Rust Tests
run: cargo test --all
- name: Run lint
run: npm run lint
- name: Checkout yaakapp/plugins
uses: actions/checkout@v4
with:
repository: yaakapp/plugins
path: ${{ env.YAAK_PLUGINS_DIR }}
- name: Set version
run: npm run replace-version
env:
YAAK_VERSION: ${{ github.ref_name }}
- name: Sign vendored binaries (macOS only)
if: matrix.os == 'macos'
env:
APPLE_CERTIFICATE: ${{ secrets.APPLE_CERTIFICATE }}
APPLE_CERTIFICATE_PASSWORD: ${{ secrets.APPLE_CERTIFICATE_PASSWORD }}
APPLE_SIGNING_IDENTITY: ${{ secrets.APPLE_SIGNING_IDENTITY }}
KEYCHAIN_PASSWORD: ${{ secrets.KEYCHAIN_PASSWORD }}
run: |
# Create keychain
KEYCHAIN_PATH=$RUNNER_TEMP/app-signing.keychain-db
security create-keychain -p "$KEYCHAIN_PASSWORD" $KEYCHAIN_PATH
security set-keychain-settings -lut 21600 $KEYCHAIN_PATH
security unlock-keychain -p "$KEYCHAIN_PASSWORD" $KEYCHAIN_PATH
# Import certificate
echo "$APPLE_CERTIFICATE" | base64 --decode > certificate.p12
security import certificate.p12 -P "$APPLE_CERTIFICATE_PASSWORD" -A -t cert -f pkcs12 -k $KEYCHAIN_PATH
security list-keychain -d user -s $KEYCHAIN_PATH
# Sign vendored binaries with hardened runtime and their specific entitlements
codesign --force --options runtime --entitlements crates-tauri/yaak-app/macos/entitlements.yaakprotoc.plist --sign "$APPLE_SIGNING_IDENTITY" crates-tauri/yaak-app/vendored/protoc/yaakprotoc || true
codesign --force --options runtime --entitlements crates-tauri/yaak-app/macos/entitlements.yaaknode.plist --sign "$APPLE_SIGNING_IDENTITY" crates-tauri/yaak-app/vendored/node/yaaknode || true
- uses: tauri-apps/tauri-action@v0
env:
YAAK_PLUGINS_DIR: ${{ env.YAAK_PLUGINS_DIR }}
YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
ENABLE_CODE_SIGNING: ${{ secrets.APPLE_CERTIFICATE }}
@@ -135,21 +105,21 @@ jobs:
TAURI_SIGNING_PRIVATE_KEY_PASSWORD: ${{ secrets.TAURI_KEY_PASSWORD }}
# Apple signing stuff
APPLE_CERTIFICATE: ${{ matrix.os == 'macos' && secrets.APPLE_CERTIFICATE }}
APPLE_CERTIFICATE_PASSWORD: ${{ matrix.os == 'macos' && secrets.APPLE_CERTIFICATE_PASSWORD }}
APPLE_ID: ${{ matrix.os == 'macos' && secrets.APPLE_ID }}
APPLE_PASSWORD: ${{ matrix.os == 'macos' && secrets.APPLE_PASSWORD }}
APPLE_SIGNING_IDENTITY: ${{ matrix.os == 'macos' && secrets.APPLE_SIGNING_IDENTITY }}
APPLE_TEAM_ID: ${{ matrix.os == 'macos' && secrets.APPLE_TEAM_ID }}
APPLE_CERTIFICATE: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_CERTIFICATE }}
APPLE_CERTIFICATE_PASSWORD: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_CERTIFICATE_PASSWORD }}
APPLE_ID: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_ID }}
APPLE_PASSWORD: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_PASSWORD }}
APPLE_SIGNING_IDENTITY: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_SIGNING_IDENTITY }}
APPLE_TEAM_ID: ${{ matrix.platform == 'macos-latest' && secrets.APPLE_TEAM_ID }}
# Windows signing stuff
AZURE_CLIENT_ID: ${{ matrix.os == 'windows' && secrets.AZURE_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ matrix.os == 'windows' && secrets.AZURE_CLIENT_SECRET }}
AZURE_TENANT_ID: ${{ matrix.os == 'windows' && secrets.AZURE_TENANT_ID }}
AZURE_CLIENT_ID: ${{ matrix.platform == 'windows-latest' && secrets.AZURE_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ matrix.platform == 'windows-latest' && secrets.AZURE_CLIENT_SECRET }}
AZURE_TENANT_ID: ${{ matrix.platform == 'windows-latest' && secrets.AZURE_TENANT_ID }}
with:
tagName: "v__VERSION__"
releaseName: "Release __VERSION__"
releaseBody: "[Changelog __VERSION__](https://yaak.app/blog/__VERSION__)"
tagName: 'v__VERSION__'
releaseName: 'Release __VERSION__'
releaseBody: '[Changelog __VERSION__](https://yaak.app/blog/__VERSION__)'
releaseDraft: true
prerelease: true
args: "${{ matrix.args }} --config ./crates-tauri/yaak-app/tauri.release.conf.json"
prerelease: false
args: ${{ matrix.args }}

View File

@@ -1,44 +0,0 @@
name: Generate Sponsors README
on:
workflow_dispatch:
schedule:
- cron: 30 15 * * 0-6
permissions:
contents: write
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- name: Checkout 🛎️
uses: actions/checkout@v2
- name: Generate Sponsors
uses: JamesIves/github-sponsors-readme-action@v1
with:
token: ${{ secrets.SPONSORS_PAT }}
file: 'README.md'
maximum: 1999
template: '<a href="https://github.com/{{{ login }}}"><img src="{{{ avatarUrl }}}" width="50px" alt="User avatar: {{{ login }}}" /></a>&nbsp;&nbsp;'
active-only: false
include-private: true
marker: 'sponsors-base'
- name: Generate Sponsors
uses: JamesIves/github-sponsors-readme-action@v1
with:
token: ${{ secrets.SPONSORS_PAT }}
file: 'README.md'
minimum: 2000
template: '<a href="https://github.com/{{{ login }}}"><img src="{{{ avatarUrl }}}" width="80px" alt="User avatar: {{{ login }}}" /></a>&nbsp;&nbsp;'
active-only: false
include-private: true
marker: 'sponsors-premium'
# ⚠️ Note: You can use any deployment step here to automatically push the README
# changes back to your branch.
- name: Commit Changes
uses: JamesIves/github-pages-deploy-action@v4
with:
branch: main
force: false
folder: '.'

13
.gitignore vendored
View File

@@ -15,8 +15,6 @@ dist-ssr
# Editor directories and files
.vscode/*
!.vscode/extensions.json
!.vscode/settings.json
!.vscode/launch.json
.idea
.DS_Store
*.suo
@@ -25,7 +23,6 @@ dist-ssr
*.sln
*.sw?
.eslintcache
out
*.sqlite
*.sqlite-*
@@ -34,13 +31,3 @@ out
.tmp
tmp
.zed
codebook.toml
target
# Per-worktree Tauri config (generated by post-checkout hook)
crates-tauri/yaak-app/tauri.worktree.conf.json
# Tauri auto-generated permission files
**/permissions/autogenerated
**/permissions/schemas

View File

@@ -1 +0,0 @@
node scripts/git-hooks/post-checkout.mjs "$@"

4
.prettierignore Normal file
View File

@@ -0,0 +1,4 @@
node_modules/
dist/
out/
.prettierrc.cjs

8
.prettierrc.js Normal file
View File

@@ -0,0 +1,8 @@
export default {
"trailingComma": "all",
"tabWidth": 2,
"semi": true,
"singleQuote": true,
"printWidth": 100,
"bracketSpacing": true
}

View File

@@ -1,3 +0,0 @@
{
"recommendations": ["biomejs.biome", "rust-lang.rust-analyzer", "bradlc.vscode-tailwindcss"]
}

26
.vscode/launch.json vendored
View File

@@ -1,26 +0,0 @@
{
"version": "0.2.0",
"configurations": [
{
"type": "node",
"request": "launch",
"name": "Dev App",
"runtimeExecutable": "npm",
"runtimeArgs": ["run", "start"]
},
{
"type": "node",
"request": "launch",
"name": "Build App",
"runtimeExecutable": "npm",
"runtimeArgs": ["run", "start"]
},
{
"type": "node",
"request": "launch",
"name": "Bootstrap",
"runtimeExecutable": "npm",
"runtimeArgs": ["run", "bootstrap"]
}
]
}

View File

@@ -1,6 +0,0 @@
{
"editor.defaultFormatter": "biomejs.biome",
"editor.formatOnSave": true,
"biome.enabled": true,
"biome.lint.format.enable": true
}

View File

@@ -1,69 +0,0 @@
[workspace]
resolver = "2"
members = [
# Shared crates (no Tauri dependency)
"crates/yaak-core",
"crates/yaak-common",
"crates/yaak-crypto",
"crates/yaak-git",
"crates/yaak-grpc",
"crates/yaak-http",
"crates/yaak-models",
"crates/yaak-plugins",
"crates/yaak-sse",
"crates/yaak-sync",
"crates/yaak-templates",
"crates/yaak-tls",
"crates/yaak-ws",
# CLI crates
"crates-cli/yaak-cli",
# Tauri-specific crates
"crates-tauri/yaak-app",
"crates-tauri/yaak-fonts",
"crates-tauri/yaak-license",
"crates-tauri/yaak-mac-window",
"crates-tauri/yaak-tauri-utils",
]
[workspace.dependencies]
chrono = "0.4.42"
hex = "0.4.3"
keyring = "3.6.3"
log = "0.4.29"
reqwest = "0.12.20"
rustls = { version = "0.23.34", default-features = false }
rustls-platform-verifier = "0.6.2"
serde = "1.0.228"
serde_json = "1.0.145"
sha2 = "0.10.9"
tauri = "2.9.5"
tauri-plugin = "2.5.2"
tauri-plugin-dialog = "2.4.2"
tauri-plugin-shell = "2.3.3"
thiserror = "2.0.17"
tokio = "1.48.0"
ts-rs = "11.1.0"
# Internal crates - shared
yaak-core = { path = "crates/yaak-core" }
yaak-common = { path = "crates/yaak-common" }
yaak-crypto = { path = "crates/yaak-crypto" }
yaak-git = { path = "crates/yaak-git" }
yaak-grpc = { path = "crates/yaak-grpc" }
yaak-http = { path = "crates/yaak-http" }
yaak-models = { path = "crates/yaak-models" }
yaak-plugins = { path = "crates/yaak-plugins" }
yaak-sse = { path = "crates/yaak-sse" }
yaak-sync = { path = "crates/yaak-sync" }
yaak-templates = { path = "crates/yaak-templates" }
yaak-tls = { path = "crates/yaak-tls" }
yaak-ws = { path = "crates/yaak-ws" }
# Internal crates - Tauri-specific
yaak-fonts = { path = "crates-tauri/yaak-fonts" }
yaak-license = { path = "crates-tauri/yaak-license" }
yaak-mac-window = { path = "crates-tauri/yaak-mac-window" }
yaak-tauri-utils = { path = "crates-tauri/yaak-tauri-utils" }
[profile.release]
strip = false

View File

@@ -34,6 +34,8 @@ Run the `bootstrap` command to do some initial setup:
npm run bootstrap
```
_NOTE: Run with `YAAK_PLUGINS_DIR=<Path to yaakapp/plugins>` to re-build bundled plugins_
## Run the App
After bootstrapping, start the app in development mode:
@@ -42,47 +44,19 @@ After bootstrapping, start the app in development mode:
npm start
```
_NOTE: If working on bundled plugins, run with `YAAK_PLUGINS_DIR=<Path to yaakapp/plugins>`_
## SQLite Migrations
New migrations can be created from the `src-tauri/` directory:
```shell
npm run migration
cd src-tauri
sqlx migrate add migration-name
```
Rerun the app to apply the migrations.
Run the app to apply the migrations.
_Note: For safety, development builds use a separate database location from production builds._
If nothing happens, try `cargo clean` and run the app again.
## Lezer Grammar Generation
```sh
# Example
lezer-generator components/core/Editor/<LANG>/<LANG>.grammar > components/core/Editor/<LANG>/<LANG>.ts
```
## Linting & Formatting
This repo uses Biome for linting and formatting (replacing ESLint + Prettier).
- Lint the entire repo:
```sh
npm run lint
```
- Auto-fix lint issues where possible:
```sh
npm run lint:fix
```
- Format code:
```sh
npm run format
```
Notes:
- Many workspace packages also expose the same scripts (`lint`, `lint:fix`, and `format`).
- TypeScript type-checking still runs separately via `tsc --noEmit` in relevant packages.
_Note: Development builds use a separate database location from production builds._

View File

@@ -1,27 +0,0 @@
# MCP Client Plan
## Goal
Add an MCP client mode to Yaak so users can connect to and debug MCP servers.
## Core Design
- **Protocol layer:** Implement JSONRPC framing, message IDs, and notifications as the common core.
- **Transport interface:** Define an async trait with `connect`, `send`, `receive`, and `close` methods.
- **Transports:**
- Start with **Standard I/O** for local development.
- Reuse the existing HTTP stack for **HTTP streaming** next.
- Leave hooks for **WebSocket** support later.
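A rough sketch of the transport trait described above, assuming `serde_json` for message values and Rust's native `async fn` in traits; names are illustrative, not the actual Yaak API:
```rust
/// Stand-in for a framed JSON-RPC message; a real type would model IDs,
/// methods, params, and notifications explicitly.
type JsonRpcMessage = serde_json::Value;

/// Transport abstraction shared by stdio, HTTP streaming, and WebSocket.
pub trait McpTransport {
    async fn connect(&mut self) -> std::io::Result<()>;
    async fn send(&mut self, msg: JsonRpcMessage) -> std::io::Result<()>;
    /// `Ok(None)` means the peer closed the connection.
    async fn receive(&mut self) -> std::io::Result<Option<JsonRpcMessage>>;
    async fn close(&mut self) -> std::io::Result<()>;
}
```
Each concrete transport (stdio first, then HTTP streaming, then WebSocket) would implement this trait while the JSON-RPC layer stays transport-agnostic.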
## Integration
- Register MCP as a new request type alongside REST, GraphQL, gRPC, and WebSocket.
- Allow per-request transport selection (stdio or HTTP).
- Map inbound messages into a new MCP response model that feeds existing timeline and debug views.
## Testing and Dogfooding
- Convert Yaak's own MCP server to Standard I/O for local testing.
- Use it internally to validate protocol behavior and message flow.
- Add unit and integration tests for JSONRPC messaging and transport abstractions.
## Future Refinements
- Add WebSocket transport support once core paths are stable.
- Extend timelines for protocol-level visualization layered over raw transport events.
- Implement version and capability negotiation between client and server.

View File

@@ -1,70 +1,26 @@
<p align="center">
<a href="https://github.com/JamesIves/github-sponsors-readme-action">
<img width="200px" src="https://github.com/mountain-loop/yaak/raw/main/crates-tauri/yaak-app/icons/icon.png">
</a>
</p>
# Yaak API Client
<h1 align="center">
💫 Yaak ➟ Desktop API Client 💫
</h1>
Yaak is a desktop API client for organizing and executing REST, GraphQL, and gRPC
requests. It's built using [Tauri](https://tauri.app), Rust, and ReactJS.
<p align="center">
A fast, privacy-first API client for REST, GraphQL, SSE, WebSocket, and gRPC built with Tauri, Rust, and React.
</p>
<p align="center">
Development is funded by community-purchased <a href="https://yaak.app/pricing">licenses</a>. You can also <a href="https://github.com/sponsors/gschier">become a sponsor</a> to have your logo appear below. 💖
</p>
<br>
![screenshot](https://github.com/user-attachments/assets/f18e963f-0b68-4ecb-b8b8-cb71aa9aec02)
## Feedback and Bug Reports
All feedback, bug reports, questions, and feature requests should be reported to
[feedback.yaak.app](https://feedback.yaak.app). Issues will be duplicated
in this repository if applicable.
<p align="center">
<!-- sponsors-premium --><a href="https://github.com/MVST-Solutions"><img src="https:&#x2F;&#x2F;github.com&#x2F;MVST-Solutions.png" width="80px" alt="User avatar: MVST-Solutions" /></a>&nbsp;&nbsp;<a href="https://github.com/dharsanb"><img src="https:&#x2F;&#x2F;github.com&#x2F;dharsanb.png" width="80px" alt="User avatar: dharsanb" /></a>&nbsp;&nbsp;<a href="https://github.com/railwayapp"><img src="https:&#x2F;&#x2F;github.com&#x2F;railwayapp.png" width="80px" alt="User avatar: railwayapp" /></a>&nbsp;&nbsp;<a href="https://github.com/caseyamcl"><img src="https:&#x2F;&#x2F;github.com&#x2F;caseyamcl.png" width="80px" alt="User avatar: caseyamcl" /></a>&nbsp;&nbsp;<a href="https://github.com/bytebase"><img src="https:&#x2F;&#x2F;github.com&#x2F;bytebase.png" width="80px" alt="User avatar: bytebase" /></a>&nbsp;&nbsp;<a href="https://github.com/"><img src="https:&#x2F;&#x2F;raw.githubusercontent.com&#x2F;JamesIves&#x2F;github-sponsors-readme-action&#x2F;dev&#x2F;.github&#x2F;assets&#x2F;placeholder.png" width="80px" alt="User avatar: " /></a>&nbsp;&nbsp;<!-- sponsors-premium -->
</p>
<p align="center">
<!-- sponsors-base --><a href="https://github.com/seanwash"><img src="https:&#x2F;&#x2F;github.com&#x2F;seanwash.png" width="50px" alt="User avatar: seanwash" /></a>&nbsp;&nbsp;<a href="https://github.com/jerath"><img src="https:&#x2F;&#x2F;github.com&#x2F;jerath.png" width="50px" alt="User avatar: jerath" /></a>&nbsp;&nbsp;<a href="https://github.com/itsa-sh"><img src="https:&#x2F;&#x2F;github.com&#x2F;itsa-sh.png" width="50px" alt="User avatar: itsa-sh" /></a>&nbsp;&nbsp;<a href="https://github.com/dmmulroy"><img src="https:&#x2F;&#x2F;github.com&#x2F;dmmulroy.png" width="50px" alt="User avatar: dmmulroy" /></a>&nbsp;&nbsp;<a href="https://github.com/timcole"><img src="https:&#x2F;&#x2F;github.com&#x2F;timcole.png" width="50px" alt="User avatar: timcole" /></a>&nbsp;&nbsp;<a href="https://github.com/VLZH"><img src="https:&#x2F;&#x2F;github.com&#x2F;VLZH.png" width="50px" alt="User avatar: VLZH" /></a>&nbsp;&nbsp;<a href="https://github.com/terasaka2k"><img src="https:&#x2F;&#x2F;github.com&#x2F;terasaka2k.png" width="50px" alt="User avatar: terasaka2k" /></a>&nbsp;&nbsp;<a href="https://github.com/andriyor"><img src="https:&#x2F;&#x2F;github.com&#x2F;andriyor.png" width="50px" alt="User avatar: andriyor" /></a>&nbsp;&nbsp;<a href="https://github.com/majudhu"><img src="https:&#x2F;&#x2F;github.com&#x2F;majudhu.png" width="50px" alt="User avatar: majudhu" /></a>&nbsp;&nbsp;<a href="https://github.com/axelrindle"><img src="https:&#x2F;&#x2F;github.com&#x2F;axelrindle.png" width="50px" alt="User avatar: axelrindle" /></a>&nbsp;&nbsp;<a href="https://github.com/jirizverina"><img src="https:&#x2F;&#x2F;github.com&#x2F;jirizverina.png" width="50px" alt="User avatar: jirizverina" /></a>&nbsp;&nbsp;<a href="https://github.com/chip-well"><img src="https:&#x2F;&#x2F;github.com&#x2F;chip-well.png" width="50px" alt="User avatar: chip-well" /></a>&nbsp;&nbsp;<a href="https://github.com/GRAYAH"><img src="https:&#x2F;&#x2F;github.com&#x2F;GRAYAH.png" width="50px" alt="User avatar: GRAYAH" /></a>&nbsp;&nbsp;<a href="https://github.com/flashblaze"><img src="https:&#x2F;&#x2F;github.com&#x2F;flashblaze.png" width="50px" alt="User avatar: flashblaze" /></a>&nbsp;&nbsp;<!-- sponsors-base -->
</p>
![Yaak API Client](https://yaak.app/static/screenshot.png)
## Features
Yaak is an offline-first API client designed to stay out of your way while giving you everything you need when you need it.
Built with [Tauri](https://tauri.app), Rust, and React, it's fast, lightweight, and private. No telemetry, no VC funding, and no cloud lock-in.
### 🌐 Work with any API
- Import collections from Postman, Insomnia, OpenAPI, Swagger, or Curl.
- Send requests via REST, GraphQL, gRPC, WebSocket, or Server-Sent Events.
- Filter and inspect responses with JSONPath or XPath.
### 🔐 Stay secure
- Use OAuth 2.0, JWT, Basic Auth, or custom plugins for authentication.
- Secure sensitive values with encrypted secrets.
- Store secrets in your OS keychain.
### ☁️ Organize & collaborate
- Group requests into workspaces and nested folders.
- Use environment variables to switch between dev, staging, and prod.
- Mirror workspaces to your filesystem for versioning in Git or syncing with Dropbox.
### 🧩 Extend & customize
- Insert dynamic values like UUIDs or timestamps with template tags.
- Pick from built-in themes or build your own.
- Create plugins to extend authentication, template tags, or the UI.
## Community Projects
- [`yaak2postman`](https://github.com/BiteCraft/yaak2postman) CLI for converting Yaak data
exports to Postman-compatible collections
## Contribution Policy
Yaak is open source but only accepting contributions for bug fixes. To get started,
visit [`DEVELOPMENT.md`](DEVELOPMENT.md) for tips on setting up your environment.
Yaak is open source, but only accepting contributions for bug fixes. See the
[`good first issue`](https://github.com/yaakapp/app/labels/good%20first%20issue) label for
issues that are more approachable for contribution.
## Useful Resources
- [Feedback and Bug Reports](https://feedback.yaak.app)
- [Documentation](https://yaak.app/docs)
- [Yaak vs Postman](https://yaak.app/alternatives/postman)
- [Yaak vs Bruno](https://yaak.app/alternatives/bruno)
- [Yaak vs Insomnia](https://yaak.app/alternatives/insomnia)
To get started, visit [`DEVELOPMENT.md`](DEVELOPMENT.md) for tips on setting up your
environment.

View File

@@ -1,53 +0,0 @@
{
"$schema": "https://biomejs.dev/schemas/2.3.11/schema.json",
"linter": {
"enabled": true,
"rules": {
"recommended": true,
"a11y": {
"useKeyWithClickEvents": "off"
}
}
},
"formatter": {
"enabled": true,
"indentStyle": "space",
"indentWidth": 2,
"lineWidth": 100,
"bracketSpacing": true
},
"css": {
"parser": {
"tailwindDirectives": true
},
"linter": {
"enabled": false
}
},
"javascript": {
"formatter": {
"quoteStyle": "single",
"jsxQuoteStyle": "double",
"trailingCommas": "all",
"semicolons": "always"
}
},
"files": {
"includes": [
"**",
"!**/node_modules",
"!**/dist",
"!**/build",
"!target",
"!scripts",
"!crates",
"!crates-tauri",
"!src-web/tailwind.config.cjs",
"!src-web/postcss.config.cjs",
"!src-web/vite.config.ts",
"!src-web/routeTree.gen.ts",
"!packages/plugin-runtime-types/lib",
"!**/bindings"
]
}
}

View File

@@ -1,22 +0,0 @@
[package]
name = "yaak-cli"
version = "0.1.0"
edition = "2024"
publish = false
[[bin]]
name = "yaakcli"
path = "src/main.rs"
[dependencies]
clap = { version = "4", features = ["derive"] }
dirs = "6"
env_logger = "0.11"
log = { workspace = true }
serde_json = { workspace = true }
tokio = { workspace = true, features = ["rt-multi-thread", "macros"] }
yaak-crypto = { workspace = true }
yaak-http = { workspace = true }
yaak-models = { workspace = true }
yaak-plugins = { workspace = true }
yaak-templates = { workspace = true }

View File

@@ -1,198 +0,0 @@
# CLI Command Architecture Plan
## Goal
Redesign the yaak-cli command structure to use a resource-oriented `<resource> <action>`
pattern that scales well, is discoverable, and supports both human and LLM workflows.
## Command Architecture
### Design Principles
- **Resource-oriented**: top-level commands are nouns, subcommands are verbs
- **Polymorphic requests**: `request` covers HTTP, gRPC, and WebSocket — the CLI
resolves the type via `get_any_request` and adapts behavior accordingly
- **Simple creation, full-fidelity via JSON**: human-friendly flags for basic creation,
`--json` for full control (targeted at LLM and scripting workflows)
- **Runtime schema introspection**: `request schema` outputs JSON Schema for the request
models, with dynamic auth fields populated from loaded plugins at runtime
- **Destructive actions require confirmation**: `delete` commands prompt for user
confirmation before proceeding. Can be bypassed with `--yes` / `-y` for scripting
### Commands
```
# Top-level shortcut
yaakcli send <id> [-e <env_id>] # id can be a request, folder, or workspace
# Resource commands
yaakcli workspace list
yaakcli workspace show <id>
yaakcli workspace create --name <name>
yaakcli workspace create --json '{"name": "My Workspace"}'
yaakcli workspace create '{"name": "My Workspace"}' # positional JSON shorthand
yaakcli workspace update --json '{"id": "wk_abc", "name": "New Name"}'
yaakcli workspace delete <id>
yaakcli request list <workspace_id>
yaakcli request show <id>
yaakcli request create <workspace_id> --name <name> --url <url> [--method GET]
yaakcli request create --json '{"workspaceId": "wk_abc", "url": "..."}'
yaakcli request update --json '{"id": "rq_abc", "url": "https://new.com"}'
yaakcli request send <id> [-e <env_id>]
yaakcli request delete <id>
yaakcli request schema <http|grpc|websocket>
yaakcli folder list <workspace_id>
yaakcli folder show <id>
yaakcli folder create <workspace_id> --name <name>
yaakcli folder create --json '{"workspaceId": "wk_abc", "name": "Auth"}'
yaakcli folder update --json '{"id": "fl_abc", "name": "New Name"}'
yaakcli folder delete <id>
yaakcli environment list <workspace_id>
yaakcli environment show <id>
yaakcli environment create <workspace_id> --name <name>
yaakcli environment create --json '{"workspaceId": "wk_abc", "name": "Production"}'
yaakcli environment update --json '{"id": "ev_abc", ...}'
yaakcli environment delete <id>
```
### `send` — Top-Level Shortcut
`yaakcli send <id>` is a convenience alias that accepts any sendable ID. It tries
each type in order via DB lookups (short-circuiting on first match):
1. Request (HTTP, gRPC, or WebSocket via `get_any_request`)
2. Folder (sends all requests in the folder)
3. Workspace (sends all requests in the workspace)
ID prefixes exist (e.g. `rq_`, `fl_`, `wk_`) but are not relied upon — resolution
is purely by DB lookup.
`request send <id>` is the same but restricted to request IDs only.
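A sketch of that resolution cascade; `FakeDb` and its sets are stand-ins for the real yaak-models lookups (`get_any_request`, folder and workspace getters):
```rust
use std::collections::HashSet;

/// Stand-in for the real database handle; actual lookups hit SQLite via
/// yaak-models rather than in-memory sets.
struct FakeDb {
    requests: HashSet<String>,
    folders: HashSet<String>,
    workspaces: HashSet<String>,
}

enum SendTarget {
    Request(String),
    Folder(String),
    Workspace(String),
}

/// Try each type in order and short-circuit on the first match.
fn resolve_send_target(db: &FakeDb, id: &str) -> Option<SendTarget> {
    if db.requests.contains(id) {
        return Some(SendTarget::Request(id.to_string()));
    }
    if db.folders.contains(id) {
        return Some(SendTarget::Folder(id.to_string()));
    }
    if db.workspaces.contains(id) {
        return Some(SendTarget::Workspace(id.to_string()));
    }
    None
}

fn main() {
    let db = FakeDb {
        requests: HashSet::from(["rq_abc".to_string()]),
        folders: HashSet::new(),
        workspaces: HashSet::from(["wk_123".to_string()]),
    };
    assert!(matches!(resolve_send_target(&db, "rq_abc"), Some(SendTarget::Request(_))));
    assert!(matches!(resolve_send_target(&db, "wk_123"), Some(SendTarget::Workspace(_))));
    assert!(resolve_send_target(&db, "missing").is_none());
}
```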
### Request Send — Polymorphic Behavior
`send` means "execute this request" regardless of protocol:
- **HTTP**: send request, print response, exit
- **gRPC**: invoke the method; for streaming, stream output to stdout until done/Ctrl+C
- **WebSocket**: connect, stream messages to stdout until closed/Ctrl+C
### `request schema` — Runtime JSON Schema
Outputs a JSON Schema describing the full request shape, including dynamic fields:
1. Generate base schema from `schemars::JsonSchema` derive on the Rust model structs
2. Load plugins, collect auth strategy definitions and their form inputs
3. Merge plugin-defined auth fields into the `authentication` property as a `oneOf`
4. Output the combined schema as JSON
This lets an LLM call `schema`, read the shape, and construct valid JSON for
`create --json` or `update --json`.
## Implementation Steps
### Phase 1: Restructure commands (no new functionality)
Refactor `main.rs` into the new resource/action pattern using clap subcommand nesting.
Existing behavior stays the same, just reorganized. Remove the `get` command.
1. Create module structure: `commands/workspace.rs`, `commands/request.rs`, etc.
2. Define nested clap enums:
```rust
enum Commands {
Send(SendArgs),
Workspace(WorkspaceArgs),
Request(RequestArgs),
Folder(FolderArgs),
Environment(EnvironmentArgs),
}
```
3. Move existing `Workspaces` logic into `workspace list`
4. Move existing `Requests` logic into `request list`
5. Move existing `Send` logic into `request send`
6. Move existing `Create` logic into `request create`
7. Delete the `Get` command entirely
8. Extract shared setup (DB init, plugin init, encryption) into a reusable context struct
### Phase 2: Add missing CRUD commands
1. `workspace show <id>`
2. `workspace create --name <name>` (and `--json`)
3. `workspace update --json`
4. `workspace delete <id>`
5. `request show <id>` (JSON output of the full request model)
6. `request delete <id>`
7. `folder list <workspace_id>`
8. `folder show <id>`
9. `folder create <workspace_id> --name <name>` (and `--json`)
10. `folder update --json`
11. `folder delete <id>`
12. `environment list <workspace_id>`
13. `environment show <id>`
14. `environment create <workspace_id> --name <name>` (and `--json`)
15. `environment update --json`
16. `environment delete <id>`
### Phase 3: JSON input for create/update
Both commands accept JSON via `--json <string>` or as a positional argument (detected
by leading `{`). They follow the same upsert pattern as the plugin API.
- **`create --json`**: JSON must include `workspaceId`. Must NOT include `id` (or
use empty string `""`). Deserializes into the model with defaults for missing fields,
then upserts (insert).
- **`update --json`**: JSON must include `id`. Performs a fetch-merge-upsert:
1. Fetch the existing model from DB
2. Serialize it to `serde_json::Value`
3. Deep-merge the user's partial JSON on top (JSON Merge Patch / RFC 7386 semantics)
4. Deserialize back into the typed model
5. Upsert (update)
This matches how the MCP server plugin already does it (fetch existing, spread, override),
but the CLI handles the merge server-side so callers don't have to.
Setting a field to `null` removes it (for `Option<T>` fields), per RFC 7386.
Implementation:
1. Add `--json` flag and positional JSON detection to `create` commands
2. Add `update` commands with required `--json` flag
3. Implement JSON merge utility (or use `json-patch` crate)
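A minimal sketch of that merge utility (step 3 above) using `serde_json`; the request fields in the example are illustrative:
```rust
use serde_json::{json, Value};

/// RFC 7386 merge: objects merge recursively, `null` removes a key, and any
/// other patch value replaces the target outright.
fn merge_patch(target: &mut Value, patch: &Value) {
    match patch {
        Value::Object(patch_map) => {
            if !target.is_object() {
                *target = Value::Object(Default::default());
            }
            let target_map = target.as_object_mut().unwrap();
            for (key, patch_val) in patch_map {
                if patch_val.is_null() {
                    target_map.remove(key);
                } else {
                    merge_patch(
                        target_map.entry(key.clone()).or_insert(Value::Null),
                        patch_val,
                    );
                }
            }
        }
        _ => *target = patch.clone(),
    }
}

fn main() {
    let mut existing = json!({"id": "rq_abc", "url": "https://old.com", "description": "old"});
    let patch = json!({"url": "https://new.com", "description": null});
    merge_patch(&mut existing, &patch);
    assert_eq!(existing, json!({"id": "rq_abc", "url": "https://new.com"}));
}
```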
### Phase 4: Runtime schema generation
1. Add `schemars` dependency to `yaak-models`
2. Derive `JsonSchema` on `HttpRequest`, `GrpcRequest`, `WebsocketRequest`, and their
nested types (`HttpRequestHeader`, `HttpUrlParameter`, etc.)
3. Implement `request schema` command:
- Generate base schema from schemars
- Query plugins for auth strategy form inputs
- Convert plugin form inputs into JSON Schema properties
- Merge into the `authentication` field
- Print to stdout
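A small sketch of steps 1 and 2 with `schemars`; the struct is a trimmed, hypothetical stand-in for the real `HttpRequest` model:
```rust
use schemars::{schema_for, JsonSchema};
use serde::{Deserialize, Serialize};

/// Trimmed stand-in for the real model in yaak-models, which has many more
/// fields (headers, body, folder_id, etc.).
#[derive(Debug, Default, Serialize, Deserialize, JsonSchema)]
#[serde(rename_all = "camelCase")]
struct HttpRequestSketch {
    workspace_id: String,
    name: String,
    method: String,
    url: String,
    /// Plugin-defined auth fields would be merged into this property's schema.
    authentication: serde_json::Value,
}

fn main() {
    let schema = schema_for!(HttpRequestSketch);
    println!("{}", serde_json::to_string_pretty(&schema).unwrap());
}
```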
### Phase 5: Polymorphic send
1. Update `request send` to use `get_any_request` to resolve the request type
2. Match on `AnyRequest` variant and dispatch to the appropriate sender:
- `AnyRequest::HttpRequest` — existing HTTP send logic
- `AnyRequest::GrpcRequest` — gRPC invoke (future implementation)
- `AnyRequest::WebsocketRequest` — WebSocket connect (future implementation)
3. gRPC and WebSocket send can initially return "not yet implemented" errors
### Phase 6: Top-level `send` and folder/workspace send
1. Add top-level `yaakcli send <id>` command
2. Resolve ID by trying DB lookups in order: any_request → folder → workspace
3. For folder: list all requests in folder, send each
4. For workspace: list all requests in workspace, send each
5. Add execution options: `--sequential` (default), `--parallel`, `--fail-fast`
## Crate Changes
- **yaak-cli**: restructure into modules, new clap hierarchy
- **yaak-models**: add `schemars` dependency, derive `JsonSchema` on model structs
(current derives: `Debug, Clone, PartialEq, Serialize, Deserialize, Default, TS`)

View File

@@ -1,409 +0,0 @@
use clap::{Parser, Subcommand};
use log::info;
use serde_json::Value;
use std::collections::BTreeMap;
use std::path::PathBuf;
use std::sync::Arc;
use tokio::sync::mpsc;
use yaak_crypto::manager::EncryptionManager;
use yaak_http::path_placeholders::apply_path_placeholders;
use yaak_http::sender::{HttpSender, ReqwestSender};
use yaak_http::types::{SendableHttpRequest, SendableHttpRequestOptions};
use yaak_models::models::{HttpRequest, HttpRequestHeader, HttpUrlParameter};
use yaak_models::render::make_vars_hashmap;
use yaak_models::util::UpdateSource;
use yaak_plugins::events::{PluginContext, RenderPurpose};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::template_callback::PluginTemplateCallback;
use yaak_templates::{RenderOptions, parse_and_render, render_json_value_raw};
#[derive(Parser)]
#[command(name = "yaakcli")]
#[command(about = "Yaak CLI - API client from the command line")]
struct Cli {
/// Use a custom data directory
#[arg(long, global = true)]
data_dir: Option<PathBuf>,
/// Environment ID to use for variable substitution
#[arg(long, short, global = true)]
environment: Option<String>,
/// Enable verbose logging
#[arg(long, short, global = true)]
verbose: bool,
#[command(subcommand)]
command: Commands,
}
#[derive(Subcommand)]
enum Commands {
/// List all workspaces
Workspaces,
/// List requests in a workspace
Requests {
/// Workspace ID
workspace_id: String,
},
/// Send an HTTP request by ID
Send {
/// Request ID
request_id: String,
},
/// Send a GET request to a URL
Get {
/// URL to request
url: String,
},
/// Create a new HTTP request
Create {
/// Workspace ID
workspace_id: String,
/// Request name
#[arg(short, long)]
name: String,
/// HTTP method
#[arg(short, long, default_value = "GET")]
method: String,
/// URL
#[arg(short, long)]
url: String,
},
}
/// Render an HTTP request with template variables and plugin functions
async fn render_http_request(
r: &HttpRequest,
environment_chain: Vec<yaak_models::models::Environment>,
cb: &PluginTemplateCallback,
opt: &RenderOptions,
) -> yaak_templates::error::Result<HttpRequest> {
let vars = &make_vars_hashmap(environment_chain);
let mut url_parameters = Vec::new();
for p in r.url_parameters.clone() {
if !p.enabled {
continue;
}
url_parameters.push(HttpUrlParameter {
enabled: p.enabled,
name: parse_and_render(p.name.as_str(), vars, cb, opt).await?,
value: parse_and_render(p.value.as_str(), vars, cb, opt).await?,
id: p.id,
})
}
let mut headers = Vec::new();
for p in r.headers.clone() {
if !p.enabled {
continue;
}
headers.push(HttpRequestHeader {
enabled: p.enabled,
name: parse_and_render(p.name.as_str(), vars, cb, opt).await?,
value: parse_and_render(p.value.as_str(), vars, cb, opt).await?,
id: p.id,
})
}
let mut body = BTreeMap::new();
for (k, v) in r.body.clone() {
body.insert(k, render_json_value_raw(v, vars, cb, opt).await?);
}
let authentication = {
let mut disabled = false;
let mut auth = BTreeMap::new();
match r.authentication.get("disabled") {
Some(Value::Bool(true)) => {
disabled = true;
}
Some(Value::String(tmpl)) => {
disabled = parse_and_render(tmpl.as_str(), vars, cb, opt)
.await
.unwrap_or_default()
.is_empty();
info!(
"Rendering authentication.disabled as a template: {disabled} from \"{tmpl}\""
);
}
_ => {}
}
if disabled {
auth.insert("disabled".to_string(), Value::Bool(true));
} else {
for (k, v) in r.authentication.clone() {
if k == "disabled" {
auth.insert(k, Value::Bool(false));
} else {
auth.insert(k, render_json_value_raw(v, vars, cb, opt).await?);
}
}
}
auth
};
let url = parse_and_render(r.url.clone().as_str(), vars, cb, opt).await?;
// Apply path placeholders (e.g., /users/:id -> /users/123)
let (url, url_parameters) = apply_path_placeholders(&url, &url_parameters);
Ok(HttpRequest { url, url_parameters, headers, body, authentication, ..r.to_owned() })
}
#[tokio::main]
async fn main() {
let cli = Cli::parse();
// Initialize logging
if cli.verbose {
env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info")).init();
}
// Use the same app_id for both data directory and keyring
let app_id = if cfg!(debug_assertions) { "app.yaak.desktop.dev" } else { "app.yaak.desktop" };
let data_dir = cli.data_dir.unwrap_or_else(|| {
dirs::data_dir().expect("Could not determine data directory").join(app_id)
});
let db_path = data_dir.join("db.sqlite");
let blob_path = data_dir.join("blobs.sqlite");
let (query_manager, _blob_manager, _rx) =
yaak_models::init_standalone(&db_path, &blob_path).expect("Failed to initialize database");
let db = query_manager.connect();
// Initialize encryption manager for secure() template function
// Use the same app_id as the Tauri app for keyring access
let encryption_manager = Arc::new(EncryptionManager::new(query_manager.clone(), app_id));
// Initialize plugin manager for template functions
let vendored_plugin_dir = data_dir.join("vendored-plugins");
let installed_plugin_dir = data_dir.join("installed-plugins");
// Use system node for CLI (must be in PATH)
let node_bin_path = PathBuf::from("node");
// Find the plugin runtime - check YAAK_PLUGIN_RUNTIME env var, then fallback to development path
let plugin_runtime_main =
std::env::var("YAAK_PLUGIN_RUNTIME").map(PathBuf::from).unwrap_or_else(|_| {
// Development fallback: look relative to crate root
PathBuf::from(env!("CARGO_MANIFEST_DIR"))
.join("../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs")
});
// Create plugin manager (plugins may not be available in CLI context)
let plugin_manager = Arc::new(
PluginManager::new(
vendored_plugin_dir,
installed_plugin_dir,
node_bin_path,
plugin_runtime_main,
false,
)
.await,
);
// Initialize plugins from database
let plugins = db.list_plugins().unwrap_or_default();
if !plugins.is_empty() {
let errors =
plugin_manager.initialize_all_plugins(plugins, &PluginContext::new_empty()).await;
for (plugin_dir, error_msg) in errors {
eprintln!("Warning: Failed to initialize plugin '{}': {}", plugin_dir, error_msg);
}
}
match cli.command {
Commands::Workspaces => {
let workspaces = db.list_workspaces().expect("Failed to list workspaces");
if workspaces.is_empty() {
println!("No workspaces found");
} else {
for ws in workspaces {
println!("{} - {}", ws.id, ws.name);
}
}
}
Commands::Requests { workspace_id } => {
let requests = db.list_http_requests(&workspace_id).expect("Failed to list requests");
if requests.is_empty() {
println!("No requests found in workspace {}", workspace_id);
} else {
for req in requests {
println!("{} - {} {}", req.id, req.method, req.name);
}
}
}
Commands::Send { request_id } => {
let request = db.get_http_request(&request_id).expect("Failed to get request");
// Resolve environment chain for variable substitution
let environment_chain = db
.resolve_environments(
&request.workspace_id,
request.folder_id.as_deref(),
cli.environment.as_deref(),
)
.unwrap_or_default();
// Create template callback with plugin support
let plugin_context = PluginContext::new(None, Some(request.workspace_id.clone()));
let template_callback = PluginTemplateCallback::new(
plugin_manager.clone(),
encryption_manager.clone(),
&plugin_context,
RenderPurpose::Send,
);
// Render templates in the request
let rendered_request = render_http_request(
&request,
environment_chain,
&template_callback,
&RenderOptions::throw(),
)
.await
.expect("Failed to render request templates");
if cli.verbose {
println!("> {} {}", rendered_request.method, rendered_request.url);
}
// Convert to sendable request
let sendable = SendableHttpRequest::from_http_request(
&rendered_request,
SendableHttpRequestOptions::default(),
)
.await
.expect("Failed to build request");
// Create event channel for progress
let (event_tx, mut event_rx) = mpsc::channel(100);
// Spawn task to print events if verbose
let verbose = cli.verbose;
let verbose_handle = if verbose {
Some(tokio::spawn(async move {
while let Some(event) = event_rx.recv().await {
println!("{}", event);
}
}))
} else {
// Drain events silently
tokio::spawn(async move { while event_rx.recv().await.is_some() {} });
None
};
// Send the request
let sender = ReqwestSender::new().expect("Failed to create HTTP client");
let response = sender.send(sendable, event_tx).await.expect("Failed to send request");
// Wait for event handler to finish
if let Some(handle) = verbose_handle {
let _ = handle.await;
}
// Print response
if verbose {
println!();
}
println!(
"HTTP {} {}",
response.status,
response.status_reason.as_deref().unwrap_or("")
);
if verbose {
for (name, value) in &response.headers {
println!("{}: {}", name, value);
}
println!();
}
// Print body
let (body, _stats) = response.text().await.expect("Failed to read response body");
println!("{}", body);
}
Commands::Get { url } => {
if cli.verbose {
println!("> GET {}", url);
}
// Build a simple GET request
let sendable = SendableHttpRequest {
url: url.clone(),
method: "GET".to_string(),
headers: vec![],
body: None,
options: SendableHttpRequestOptions::default(),
};
// Create event channel for progress
let (event_tx, mut event_rx) = mpsc::channel(100);
// Spawn task to print events if verbose
let verbose = cli.verbose;
let verbose_handle = if verbose {
Some(tokio::spawn(async move {
while let Some(event) = event_rx.recv().await {
println!("{}", event);
}
}))
} else {
tokio::spawn(async move { while event_rx.recv().await.is_some() {} });
None
};
// Send the request
let sender = ReqwestSender::new().expect("Failed to create HTTP client");
let response = sender.send(sendable, event_tx).await.expect("Failed to send request");
if let Some(handle) = verbose_handle {
let _ = handle.await;
}
// Print response
if verbose {
println!();
}
println!(
"HTTP {} {}",
response.status,
response.status_reason.as_deref().unwrap_or("")
);
if verbose {
for (name, value) in &response.headers {
println!("{}: {}", name, value);
}
println!();
}
// Print body
let (body, _stats) = response.text().await.expect("Failed to read response body");
println!("{}", body);
}
Commands::Create { workspace_id, name, method, url } => {
let request = HttpRequest {
workspace_id,
name,
method: method.to_uppercase(),
url,
..Default::default()
};
let created = db
.upsert_http_request(&request, &UpdateSource::Sync)
.expect("Failed to create request");
println!("Created request: {}", created.id);
}
}
// Terminate plugin manager gracefully
plugin_manager.terminate().await;
}

View File

@@ -1,76 +0,0 @@
[package]
name = "yaak-app"
version = "0.0.0"
edition = "2024"
authors = ["Gregory Schier"]
publish = false
# Produce a library for mobile support
[lib]
name = "tauri_app_lib"
crate-type = ["staticlib", "cdylib", "lib"]
[features]
cargo-clippy = []
default = []
updater = []
license = ["yaak-license"]
[build-dependencies]
tauri-build = { version = "2.5.3", features = [] }
[target.'cfg(target_os = "linux")'.dependencies]
openssl-sys = { version = "0.9.105", features = ["vendored"] } # For Ubuntu installation to work
[dependencies]
charset = "0.1.5"
chrono = { workspace = true, features = ["serde"] }
cookie = "0.18.1"
eventsource-client = { git = "https://github.com/yaakapp/rust-eventsource-client", version = "0.14.0" }
http = { version = "1.2.0", default-features = false }
log = { workspace = true }
md5 = "0.8.0"
r2d2 = "0.8.10"
r2d2_sqlite = "0.25.0"
mime_guess = "2.0.5"
rand = "0.9.0"
reqwest = { workspace = true, features = ["multipart", "gzip", "brotli", "deflate", "json", "rustls-tls-manual-roots-no-provider", "socks", "http2"] }
serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true, features = ["raw_value"] }
tauri = { workspace = true, features = ["devtools", "protocol-asset"] }
tauri-plugin-clipboard-manager = "2.3.2"
tauri-plugin-deep-link = "2.4.5"
tauri-plugin-dialog = { workspace = true }
tauri-plugin-fs = "2.4.4"
tauri-plugin-log = { version = "2.7.1", features = ["colored"] }
tauri-plugin-opener = "2.5.2"
tauri-plugin-os = "2.3.2"
tauri-plugin-shell = { workspace = true }
tauri-plugin-single-instance = { version = "2.3.6", features = ["deep-link"] }
tauri-plugin-updater = "2.9.0"
tauri-plugin-window-state = "2.4.1"
thiserror = { workspace = true }
tokio = { workspace = true, features = ["sync"] }
tokio-stream = "0.1.17"
tokio-tungstenite = { version = "0.26.2", default-features = false }
url = "2"
tokio-util = { version = "0.7", features = ["codec"] }
ts-rs = { workspace = true }
uuid = "1.12.1"
yaak-common = { workspace = true }
yaak-tauri-utils = { workspace = true }
yaak-core = { workspace = true }
yaak-crypto = { workspace = true }
yaak-fonts = { workspace = true }
yaak-git = { workspace = true }
yaak-grpc = { workspace = true }
yaak-http = { workspace = true }
yaak-license = { workspace = true, optional = true }
yaak-mac-window = { workspace = true }
yaak-models = { workspace = true }
yaak-plugins = { workspace = true }
yaak-sse = { workspace = true }
yaak-sync = { workspace = true }
yaak-templates = { workspace = true }
yaak-tls = { workspace = true }
yaak-ws = { workspace = true }

View File

@@ -1,3 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type WatchResult = { unlistenEvent: string, };

View File

@@ -1,17 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type PluginUpdateInfo = { name: string, currentVersion: string, latestVersion: string, };
export type PluginUpdateNotification = { updateCount: number, plugins: Array<PluginUpdateInfo>, };
export type UpdateInfo = { replyEventId: string, version: string, downloaded: boolean, };
export type UpdateResponse = { "type": "ack" } | { "type": "action", action: UpdateResponseAction, };
export type UpdateResponseAction = "install" | "skip";
export type WatchResult = { unlistenEvent: string, };
export type YaakNotification = { timestamp: string, timeout: number | null, id: string, title: string | null, message: string, color: string | null, action: YaakNotificationAction | null, };
export type YaakNotificationAction = { label: string, url: string, };

View File

@@ -1,5 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type PluginUpdateInfo = { name: string, currentVersion: string, latestVersion: string, };
export type PluginUpdateNotification = { updateCount: number, plugins: Array<PluginUpdateInfo>, };

Binary file not shown (removed image, 2.9 KiB).

Binary file not shown (removed image, 5.6 KiB).

View File

@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<!-- Enable for NodeJS/V8 JIT compiler -->
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<!-- Allow loading plugins signed with different Team IDs (e.g., 1Password) -->
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
</dict>
</plist>

View File

@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
</dict>
</plist>

View File

@@ -1,6 +0,0 @@
{
"name": "@yaakapp-internal/tauri",
"private": true,
"version": "1.0.0",
"main": "bindings/index.ts"
}

View File

@@ -1,100 +0,0 @@
use crate::PluginContextExt;
use crate::error::Result;
use std::sync::Arc;
use tauri::{AppHandle, Manager, Runtime, State, WebviewWindow, command};
use yaak_crypto::manager::EncryptionManager;
use yaak_models::models::HttpRequestHeader;
use yaak_models::queries::workspaces::default_headers;
use yaak_plugins::events::GetThemesResponse;
use yaak_plugins::manager::PluginManager;
use yaak_plugins::native_template_functions::{
decrypt_secure_template_function, encrypt_secure_template_function,
};
/// Extension trait for accessing the EncryptionManager from Tauri Manager types.
pub trait EncryptionManagerExt<'a, R> {
fn crypto(&'a self) -> State<'a, EncryptionManager>;
}
impl<'a, R: Runtime, M: Manager<R>> EncryptionManagerExt<'a, R> for M {
fn crypto(&'a self) -> State<'a, EncryptionManager> {
self.state::<EncryptionManager>()
}
}
#[command]
pub(crate) async fn cmd_decrypt_template<R: Runtime>(
window: WebviewWindow<R>,
template: &str,
) -> Result<String> {
let encryption_manager = window.app_handle().state::<EncryptionManager>();
let plugin_context = window.plugin_context();
Ok(decrypt_secure_template_function(&encryption_manager, &plugin_context, template)?)
}
#[command]
pub(crate) async fn cmd_secure_template<R: Runtime>(
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
template: &str,
) -> Result<String> {
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let plugin_context = window.plugin_context();
Ok(encrypt_secure_template_function(
plugin_manager,
encryption_manager,
&plugin_context,
template,
)?)
}
#[command]
pub(crate) async fn cmd_get_themes<R: Runtime>(
window: WebviewWindow<R>,
plugin_manager: State<'_, PluginManager>,
) -> Result<Vec<GetThemesResponse>> {
Ok(plugin_manager.get_themes(&window.plugin_context()).await?)
}
#[command]
pub(crate) async fn cmd_enable_encryption<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: &str,
) -> Result<()> {
window.crypto().ensure_workspace_key(workspace_id)?;
window.crypto().reveal_workspace_key(workspace_id)?;
Ok(())
}
#[command]
pub(crate) async fn cmd_reveal_workspace_key<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: &str,
) -> Result<String> {
Ok(window.crypto().reveal_workspace_key(workspace_id)?)
}
#[command]
pub(crate) async fn cmd_set_workspace_key<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: &str,
key: &str,
) -> Result<()> {
window.crypto().set_human_key(workspace_id, key)?;
Ok(())
}
#[command]
pub(crate) async fn cmd_disable_encryption<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: &str,
) -> Result<()> {
window.crypto().disable_encryption(workspace_id)?;
Ok(())
}
#[command]
pub(crate) fn cmd_default_headers() -> Vec<HttpRequestHeader> {
default_headers()
}

View File

@@ -1,20 +0,0 @@
use mime_guess::{Mime, mime};
use std::path::Path;
use std::str::FromStr;
use tokio::fs;
pub async fn read_response_body(body_path: impl AsRef<Path>, content_type: &str) -> Option<String> {
let body = fs::read(body_path).await.ok()?;
let body_charset = parse_charset(content_type).unwrap_or("utf-8".to_string());
if let Some(decoder) = charset::Charset::for_label(body_charset.as_bytes()) {
let (cow, _real_encoding, _exist_replace) = decoder.decode(&body);
return cow.into_owned().into();
}
Some(String::from_utf8_lossy(&body).to_string())
}
fn parse_charset(content_type: &str) -> Option<String> {
let mime: Mime = Mime::from_str(content_type).ok()?;
mime.get_param(mime::CHARSET).map(|v| v.to_string())
}
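
A minimal usage sketch of `read_response_body`; the file path and content type below are hypothetical and only illustrate the charset fallback behaviour:

```
// Hypothetical usage: decode a saved body using the charset from its Content-Type,
// falling back to lossy UTF-8 when the charset label is missing or unknown.
#[tokio::main]
async fn main() {
    let body = read_response_body(
        "/tmp/example-response-body",    // hypothetical path to a saved body
        "text/html; charset=ISO-8859-1", // charset label handed to the decoder
    )
    .await;
    println!("decoded {} bytes", body.map(|b| b.len()).unwrap_or(0));
}
```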

View File

@@ -1,78 +0,0 @@
use serde::{Serialize, Serializer};
use std::io;
use thiserror::Error;
#[derive(Error, Debug)]
pub enum Error {
#[error(transparent)]
TemplateError(#[from] yaak_templates::error::Error),
#[error(transparent)]
ModelError(#[from] yaak_models::error::Error),
#[error(transparent)]
SyncError(#[from] yaak_sync::error::Error),
#[error(transparent)]
CryptoError(#[from] yaak_crypto::error::Error),
#[error(transparent)]
HttpError(#[from] yaak_http::error::Error),
#[error(transparent)]
GitError(#[from] yaak_git::error::Error),
#[error(transparent)]
TokioTimeoutElapsed(#[from] tokio::time::error::Elapsed),
#[error(transparent)]
WebsocketError(#[from] yaak_ws::error::Error),
#[cfg(feature = "license")]
#[error(transparent)]
LicenseError(#[from] yaak_license::error::Error),
#[error(transparent)]
PluginError(#[from] yaak_plugins::error::Error),
#[error(transparent)]
TauriUtilsError(#[from] yaak_tauri_utils::error::Error),
#[error(transparent)]
ClipboardError(#[from] tauri_plugin_clipboard_manager::Error),
#[error(transparent)]
OpenerError(#[from] tauri_plugin_opener::Error),
#[error("Updater error: {0}")]
UpdaterError(#[from] tauri_plugin_updater::Error),
#[error("JSON error: {0}")]
JsonError(#[from] serde_json::error::Error),
#[error("Tauri error: {0}")]
TauriError(#[from] tauri::Error),
#[error("Event source error: {0}")]
EventSourceError(#[from] eventsource_client::Error),
#[error("I/O error: {0}")]
IOError(#[from] io::Error),
#[error("Request error: {0}")]
RequestError(#[from] reqwest::Error),
#[error("{0}")]
GenericError(String),
}
impl Serialize for Error {
fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
where
S: Serializer,
{
serializer.serialize_str(self.to_string().as_ref())
}
}
pub type Result<T> = std::result::Result<T, Error>;
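
Because the `Serialize` impl flattens every variant to its `Display` string, an error crossing the Tauri command boundary reaches the frontend as a plain JSON string. A minimal sketch of that behaviour, assuming `Error` and `serde_json` are in scope:

```
// Sketch: any variant serializes as its Display string.
fn main() {
    let err = Error::GenericError("Request canceled".to_string());
    let json = serde_json::to_string(&err).unwrap();
    assert_eq!(json, "\"Request canceled\"");
}
```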

View File

@@ -1,149 +0,0 @@
//! Tauri-specific extensions for yaak-git.
//!
//! This module provides the Tauri commands for git functionality.
use crate::error::Result;
use std::path::{Path, PathBuf};
use tauri::command;
use yaak_git::{
BranchDeleteResult, CloneResult, GitCommit, GitRemote, GitStatusSummary, PullResult,
PushResult, git_add, git_add_credential, git_add_remote, git_checkout_branch, git_clone,
git_commit, git_create_branch, git_delete_branch, git_delete_remote_branch, git_fetch_all,
git_init, git_log, git_merge_branch, git_pull, git_pull_force_reset, git_pull_merge, git_push,
git_remotes, git_rename_branch, git_reset_changes, git_rm_remote, git_status, git_unstage,
};
// NOTE: All of these commands are async to prevent blocking work from locking up the UI
#[command]
pub async fn cmd_git_checkout(dir: &Path, branch: &str, force: bool) -> Result<String> {
Ok(git_checkout_branch(dir, branch, force).await?)
}
#[command]
pub async fn cmd_git_branch(dir: &Path, branch: &str, base: Option<&str>) -> Result<()> {
Ok(git_create_branch(dir, branch, base).await?)
}
#[command]
pub async fn cmd_git_delete_branch(
dir: &Path,
branch: &str,
force: Option<bool>,
) -> Result<BranchDeleteResult> {
Ok(git_delete_branch(dir, branch, force.unwrap_or(false)).await?)
}
#[command]
pub async fn cmd_git_delete_remote_branch(dir: &Path, branch: &str) -> Result<()> {
Ok(git_delete_remote_branch(dir, branch).await?)
}
#[command]
pub async fn cmd_git_merge_branch(dir: &Path, branch: &str) -> Result<()> {
Ok(git_merge_branch(dir, branch).await?)
}
#[command]
pub async fn cmd_git_rename_branch(dir: &Path, old_name: &str, new_name: &str) -> Result<()> {
Ok(git_rename_branch(dir, old_name, new_name).await?)
}
#[command]
pub async fn cmd_git_status(dir: &Path) -> Result<GitStatusSummary> {
Ok(git_status(dir)?)
}
#[command]
pub async fn cmd_git_log(dir: &Path) -> Result<Vec<GitCommit>> {
Ok(git_log(dir)?)
}
#[command]
pub async fn cmd_git_initialize(dir: &Path) -> Result<()> {
Ok(git_init(dir)?)
}
#[command]
pub async fn cmd_git_clone(url: &str, dir: &Path) -> Result<CloneResult> {
Ok(git_clone(url, dir).await?)
}
#[command]
pub async fn cmd_git_commit(dir: &Path, message: &str) -> Result<()> {
Ok(git_commit(dir, message).await?)
}
#[command]
pub async fn cmd_git_fetch_all(dir: &Path) -> Result<()> {
Ok(git_fetch_all(dir).await?)
}
#[command]
pub async fn cmd_git_push(dir: &Path) -> Result<PushResult> {
Ok(git_push(dir).await?)
}
#[command]
pub async fn cmd_git_pull(dir: &Path) -> Result<PullResult> {
Ok(git_pull(dir).await?)
}
#[command]
pub async fn cmd_git_pull_force_reset(
dir: &Path,
remote: &str,
branch: &str,
) -> Result<PullResult> {
Ok(git_pull_force_reset(dir, remote, branch).await?)
}
#[command]
pub async fn cmd_git_pull_merge(dir: &Path, remote: &str, branch: &str) -> Result<PullResult> {
Ok(git_pull_merge(dir, remote, branch).await?)
}
#[command]
pub async fn cmd_git_add(dir: &Path, rela_paths: Vec<PathBuf>) -> Result<()> {
for path in rela_paths {
git_add(dir, &path)?;
}
Ok(())
}
#[command]
pub async fn cmd_git_unstage(dir: &Path, rela_paths: Vec<PathBuf>) -> Result<()> {
for path in rela_paths {
git_unstage(dir, &path)?;
}
Ok(())
}
#[command]
pub async fn cmd_git_reset_changes(dir: &Path) -> Result<()> {
Ok(git_reset_changes(dir).await?)
}
#[command]
pub async fn cmd_git_add_credential(
remote_url: &str,
username: &str,
password: &str,
) -> Result<()> {
Ok(git_add_credential(remote_url, username, password).await?)
}
#[command]
pub async fn cmd_git_remotes(dir: &Path) -> Result<Vec<GitRemote>> {
Ok(git_remotes(dir)?)
}
#[command]
pub async fn cmd_git_add_remote(dir: &Path, name: &str, url: &str) -> Result<GitRemote> {
Ok(git_add_remote(dir, name, url)?)
}
#[command]
pub async fn cmd_git_rm_remote(dir: &Path, name: &str) -> Result<()> {
Ok(git_rm_remote(dir, name)?)
}

View File

@@ -1,98 +0,0 @@
use std::collections::BTreeMap;
use crate::PluginContextExt;
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use KeyAndValueRef::{Ascii, Binary};
use tauri::{Manager, Runtime, WebviewWindow};
use yaak_grpc::{KeyAndValueRef, MetadataMap};
use yaak_models::models::GrpcRequest;
use yaak_plugins::events::{CallHttpAuthenticationRequest, HttpHeader};
use yaak_plugins::manager::PluginManager;
pub(crate) fn metadata_to_map(metadata: MetadataMap) -> BTreeMap<String, String> {
let mut entries = BTreeMap::new();
for r in metadata.iter() {
match r {
Ascii(k, v) => entries.insert(k.to_string(), v.to_str().unwrap().to_string()),
Binary(k, v) => entries.insert(k.to_string(), format!("{:?}", v)),
};
}
entries
}
pub(crate) fn resolve_grpc_request<R: Runtime>(
window: &WebviewWindow<R>,
request: &GrpcRequest,
) -> Result<(GrpcRequest, String)> {
let mut new_request = request.clone();
let (authentication_type, authentication, authentication_context_id) =
window.db().resolve_auth_for_grpc_request(request)?;
new_request.authentication_type = authentication_type;
new_request.authentication = authentication;
let metadata = window.db().resolve_metadata_for_grpc_request(request)?;
new_request.metadata = metadata;
Ok((new_request, authentication_context_id))
}
pub(crate) async fn build_metadata<R: Runtime>(
window: &WebviewWindow<R>,
request: &GrpcRequest,
authentication_context_id: &str,
) -> Result<BTreeMap<String, String>> {
let plugin_manager = window.state::<PluginManager>();
let mut metadata = BTreeMap::new();
// Add the rest of metadata
for h in request.metadata.clone() {
if h.name.is_empty() && h.value.is_empty() {
continue;
}
if !h.enabled {
continue;
}
metadata.insert(h.name, h.value);
}
match request.authentication_type.clone() {
None => {
// No authentication found. Not even inherited
}
Some(authentication_type) if authentication_type == "none" => {
// Explicitly no authentication
}
Some(authentication_type) => {
let auth = request.authentication.clone();
let plugin_req = CallHttpAuthenticationRequest {
context_id: format!("{:x}", md5::compute(authentication_context_id)),
values: serde_json::from_value(serde_json::to_value(&auth)?)?,
method: "POST".to_string(),
url: request.url.clone(),
headers: metadata
.iter()
.map(|(name, value)| HttpHeader {
name: name.to_string(),
value: value.to_string(),
})
.collect(),
};
let plugin_result = plugin_manager
.call_http_authentication(
&window.plugin_context(),
&authentication_type,
plugin_req,
)
.await?;
for header in plugin_result.set_headers.unwrap_or_default() {
metadata.insert(header.name, header.value);
}
}
}
Ok(metadata)
}
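
A small sketch of `metadata_to_map`, assuming `MetadataMap` is tonic's metadata type re-exported by yaak-grpc: ASCII entries are copied through as strings, while binary entries are debug-formatted.

```
// Sketch only: flatten a tiny MetadataMap (assumes tonic's metadata API).
fn example() {
    let mut metadata = yaak_grpc::MetadataMap::new();
    metadata.insert("x-api-key", "secret".parse().unwrap());
    let map = metadata_to_map(metadata);
    assert_eq!(map.get("x-api-key").map(String::as_str), Some("secret"));
}
```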

View File

@@ -1,74 +0,0 @@
use crate::models_ext::QueryManagerExt;
use chrono::{NaiveDateTime, Utc};
use log::debug;
use std::sync::OnceLock;
use tauri::{AppHandle, Runtime};
use yaak_models::util::UpdateSource;
const NAMESPACE: &str = "analytics";
const NUM_LAUNCHES_KEY: &str = "num_launches";
const LAST_VERSION_KEY: &str = "last_tracked_version";
const PREV_VERSION_KEY: &str = "last_tracked_version_prev";
const VERSION_SINCE_KEY: &str = "last_tracked_version_since";
#[derive(Default, Debug, Clone)]
pub struct LaunchEventInfo {
pub current_version: String,
pub previous_version: String,
pub launched_after_update: bool,
pub version_since: NaiveDateTime,
pub user_since: NaiveDateTime,
pub num_launches: i32,
}
static LAUNCH_INFO: OnceLock<LaunchEventInfo> = OnceLock::new();
pub fn get_or_upsert_launch_info<R: Runtime>(app_handle: &AppHandle<R>) -> &LaunchEventInfo {
LAUNCH_INFO.get_or_init(|| {
let now = Utc::now().naive_utc();
let mut info = LaunchEventInfo {
version_since: app_handle.db().get_key_value_dte(NAMESPACE, VERSION_SINCE_KEY, now),
current_version: app_handle.package_info().version.to_string(),
user_since: app_handle.db().get_settings().created_at,
num_launches: app_handle.db().get_key_value_int(NAMESPACE, NUM_LAUNCHES_KEY, 0) + 1,
// The rest will be set below
..Default::default()
};
app_handle
.with_tx(|tx| {
// Load the previously tracked version
let curr_db = tx.get_key_value_str(NAMESPACE, LAST_VERSION_KEY, "");
let prev_db = tx.get_key_value_str(NAMESPACE, PREV_VERSION_KEY, "");
// We just updated if the app version is different from the last tracked version we stored
if !curr_db.is_empty() && info.current_version != curr_db {
info.launched_after_update = true;
}
// If we just updated, track the previous version as the "previous" current version
if info.launched_after_update {
info.previous_version = curr_db.clone();
info.version_since = now;
} else {
info.previous_version = prev_db.clone();
}
// Rotate stored versions: move previous into the "prev" slot before overwriting
let source = &UpdateSource::Background;
tx.set_key_value_str(NAMESPACE, PREV_VERSION_KEY, &info.previous_version, source);
tx.set_key_value_str(NAMESPACE, LAST_VERSION_KEY, &info.current_version, source);
tx.set_key_value_dte(NAMESPACE, VERSION_SINCE_KEY, info.version_since, source);
tx.set_key_value_int(NAMESPACE, NUM_LAUNCHES_KEY, info.num_launches, source);
Ok(())
})
.unwrap();
debug!("Initialized launch info");
info
})
}
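
The update-detection rule above reduces to a small pure check; a standalone sketch with hypothetical version strings:

```
// Standalone sketch of the "did we just update?" decision: an update is detected
// only when a previous version was stored and it differs from the current one.
fn launched_after_update(current_version: &str, last_tracked: &str) -> bool {
    !last_tracked.is_empty() && current_version != last_tracked
}

fn main() {
    assert!(!launched_after_update("2025.2.0", ""));         // first tracked launch
    assert!(!launched_after_update("2025.2.0", "2025.2.0")); // same version as last time
    assert!(launched_after_update("2025.3.0", "2025.2.0"));  // version changed
}
```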

View File

@@ -1,722 +0,0 @@
use crate::PluginContextExt;
use crate::error::Error::GenericError;
use crate::error::Result;
use crate::models_ext::BlobManagerExt;
use crate::models_ext::QueryManagerExt;
use crate::render::render_http_request;
use log::{debug, warn};
use std::pin::Pin;
use std::sync::Arc;
use std::sync::atomic::{AtomicI32, Ordering};
use std::time::{Duration, Instant};
use tauri::{AppHandle, Manager, Runtime, WebviewWindow};
use tokio::fs::{File, create_dir_all};
use tokio::io::{AsyncRead, AsyncReadExt, AsyncWriteExt};
use tokio::sync::watch::Receiver;
use tokio_util::bytes::Bytes;
use yaak_crypto::manager::EncryptionManager;
use yaak_http::client::{
HttpConnectionOptions, HttpConnectionProxySetting, HttpConnectionProxySettingAuth,
};
use yaak_http::cookies::CookieStore;
use yaak_http::manager::{CachedClient, HttpConnectionManager};
use yaak_http::sender::ReqwestSender;
use yaak_http::tee_reader::TeeReader;
use yaak_http::transaction::HttpTransaction;
use yaak_http::types::{
SendableBody, SendableHttpRequest, SendableHttpRequestOptions, append_query_params,
};
use yaak_models::blob_manager::BodyChunk;
use yaak_models::models::{
CookieJar, Environment, HttpRequest, HttpResponse, HttpResponseEvent, HttpResponseHeader,
HttpResponseState, ProxySetting, ProxySettingAuth,
};
use yaak_models::util::UpdateSource;
use yaak_plugins::events::{
CallHttpAuthenticationRequest, HttpHeader, PluginContext, RenderPurpose,
};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::template_callback::PluginTemplateCallback;
use yaak_templates::RenderOptions;
use yaak_tls::find_client_certificate;
/// Chunk size for storing request bodies (1MB)
const REQUEST_BODY_CHUNK_SIZE: usize = 1024 * 1024;
/// Context for managing response state during HTTP transactions.
/// Handles both persisted responses (stored in DB) and ephemeral responses (in-memory only).
struct ResponseContext<R: Runtime> {
app_handle: AppHandle<R>,
response: HttpResponse,
update_source: UpdateSource,
}
impl<R: Runtime> ResponseContext<R> {
fn new(app_handle: AppHandle<R>, response: HttpResponse, update_source: UpdateSource) -> Self {
Self { app_handle, response, update_source }
}
/// Whether this response is persisted (has a non-empty ID)
fn is_persisted(&self) -> bool {
!self.response.id.is_empty()
}
/// Update the response state. For persisted responses, fetches from DB, applies the
/// closure, and updates the DB. For ephemeral responses, just applies the closure
/// to the in-memory response.
fn update<F>(&mut self, func: F) -> Result<()>
where
F: FnOnce(&mut HttpResponse),
{
if self.is_persisted() {
let r = self.app_handle.with_tx(|tx| {
let mut r = tx.get_http_response(&self.response.id)?;
func(&mut r);
tx.update_http_response_if_id(&r, &self.update_source)?;
Ok(r)
})?;
self.response = r;
Ok(())
} else {
func(&mut self.response);
Ok(())
}
}
/// Get the current response state
fn response(&self) -> &HttpResponse {
&self.response
}
}
pub async fn send_http_request<R: Runtime>(
window: &WebviewWindow<R>,
unrendered_request: &HttpRequest,
og_response: &HttpResponse,
environment: Option<Environment>,
cookie_jar: Option<CookieJar>,
cancelled_rx: &mut Receiver<bool>,
) -> Result<HttpResponse> {
send_http_request_with_context(
window,
unrendered_request,
og_response,
environment,
cookie_jar,
cancelled_rx,
&window.plugin_context(),
)
.await
}
pub async fn send_http_request_with_context<R: Runtime>(
window: &WebviewWindow<R>,
unrendered_request: &HttpRequest,
og_response: &HttpResponse,
environment: Option<Environment>,
cookie_jar: Option<CookieJar>,
cancelled_rx: &Receiver<bool>,
plugin_context: &PluginContext,
) -> Result<HttpResponse> {
let app_handle = window.app_handle().clone();
let update_source = UpdateSource::from_window_label(window.label());
let mut response_ctx =
ResponseContext::new(app_handle.clone(), og_response.clone(), update_source);
// Execute the inner send logic and handle errors consistently
let start = Instant::now();
let result = send_http_request_inner(
window,
unrendered_request,
environment,
cookie_jar,
cancelled_rx,
plugin_context,
&mut response_ctx,
)
.await;
match result {
Ok(response) => Ok(response),
Err(e) => {
let error = e.to_string();
let elapsed = start.elapsed().as_millis() as i32;
warn!("Failed to send request: {error:?}");
let _ = response_ctx.update(|r| {
r.state = HttpResponseState::Closed;
r.elapsed = elapsed;
if r.elapsed_headers == 0 {
r.elapsed_headers = elapsed;
}
r.error = Some(error);
});
Ok(response_ctx.response().clone())
}
}
}
async fn send_http_request_inner<R: Runtime>(
window: &WebviewWindow<R>,
unrendered_request: &HttpRequest,
environment: Option<Environment>,
cookie_jar: Option<CookieJar>,
cancelled_rx: &Receiver<bool>,
plugin_context: &PluginContext,
response_ctx: &mut ResponseContext<R>,
) -> Result<HttpResponse> {
let app_handle = window.app_handle().clone();
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let connection_manager = app_handle.state::<HttpConnectionManager>();
let settings = window.db().get_settings();
let workspace_id = &unrendered_request.workspace_id;
let folder_id = unrendered_request.folder_id.as_deref();
let environment_id = environment.map(|e| e.id);
let workspace = window.db().get_workspace(workspace_id)?;
let (resolved, auth_context_id) = resolve_http_request(window, unrendered_request)?;
let cb = PluginTemplateCallback::new(
plugin_manager.clone(),
encryption_manager.clone(),
&plugin_context,
RenderPurpose::Send,
);
let env_chain =
window.db().resolve_environments(&workspace.id, folder_id, environment_id.as_deref())?;
let mut cancel_rx = cancelled_rx.clone();
let render_options = RenderOptions::throw();
let request = tokio::select! {
result = render_http_request(&resolved, env_chain, &cb, &render_options) => result?,
_ = cancel_rx.changed() => {
return Err(GenericError("Request canceled".to_string()));
}
};
// Build the sendable request using the new SendableHttpRequest type
let options = SendableHttpRequestOptions {
follow_redirects: workspace.setting_follow_redirects,
timeout: if workspace.setting_request_timeout > 0 {
Some(Duration::from_millis(workspace.setting_request_timeout.unsigned_abs() as u64))
} else {
None
},
};
let mut sendable_request = SendableHttpRequest::from_http_request(&request, options).await?;
debug!("Sending request to {} {}", sendable_request.method, sendable_request.url);
let proxy_setting = match settings.proxy {
None => HttpConnectionProxySetting::System,
Some(ProxySetting::Disabled) => HttpConnectionProxySetting::Disabled,
Some(ProxySetting::Enabled { http, https, auth, bypass, disabled }) => {
if disabled {
HttpConnectionProxySetting::System
} else {
HttpConnectionProxySetting::Enabled {
http,
https,
bypass,
auth: match auth {
None => None,
Some(ProxySettingAuth { user, password }) => {
Some(HttpConnectionProxySettingAuth { user, password })
}
},
}
}
}
};
let client_certificate =
find_client_certificate(&sendable_request.url, &settings.client_certificates);
// Create cookie store if a cookie jar is specified
let maybe_cookie_store = match cookie_jar.clone() {
Some(CookieJar { id, .. }) => {
// NOTE: We need to refetch the cookie jar because a chained request might have
// updated cookies when we rendered the request.
let cj = window.db().get_cookie_jar(&id)?;
let cookie_store = CookieStore::from_cookies(cj.cookies.clone());
Some((cookie_store, cj))
}
None => None,
};
let cached_client = connection_manager
.get_client(&HttpConnectionOptions {
id: plugin_context.id.clone(),
validate_certificates: workspace.setting_validate_certificates,
proxy: proxy_setting,
client_certificate,
dns_overrides: workspace.setting_dns_overrides.clone(),
})
.await?;
// Apply authentication to the request, racing against cancellation since
// auth plugins (e.g. OAuth2) can block indefinitely waiting for user action.
let mut cancel_rx = cancelled_rx.clone();
tokio::select! {
result = apply_authentication(
&window,
&mut sendable_request,
&request,
auth_context_id,
&plugin_manager,
plugin_context,
) => result?,
_ = cancel_rx.changed() => {
return Err(GenericError("Request canceled".to_string()));
}
};
let cookie_store = maybe_cookie_store.as_ref().map(|(cs, _)| cs.clone());
let result = execute_transaction(
cached_client,
sendable_request,
response_ctx,
cancelled_rx.clone(),
cookie_store,
)
.await;
// Wait for blob writing to complete and check for errors
let final_result = match result {
Ok((response, maybe_blob_write_handle)) => {
// Check if blob writing failed
if let Some(handle) = maybe_blob_write_handle {
if let Ok(Err(e)) = handle.await {
// Update response with the storage error
let _ = response_ctx.update(|r| {
let error_msg =
format!("Request succeeded but failed to store request body: {}", e);
r.error = Some(match &r.error {
Some(existing) => format!("{}; {}", existing, error_msg),
None => error_msg,
});
});
}
}
Ok(response)
}
Err(e) => Err(e),
};
// Persist cookies back to the database after the request completes
if let Some((cookie_store, mut cj)) = maybe_cookie_store {
let cookies = cookie_store.get_all_cookies();
cj.cookies = cookies;
if let Err(e) = window.db().upsert_cookie_jar(&cj, &UpdateSource::Background) {
warn!("Failed to persist cookies to database: {}", e);
}
}
final_result
}
pub fn resolve_http_request<R: Runtime>(
window: &WebviewWindow<R>,
request: &HttpRequest,
) -> Result<(HttpRequest, String)> {
let mut new_request = request.clone();
let (authentication_type, authentication, authentication_context_id) =
window.db().resolve_auth_for_http_request(request)?;
new_request.authentication_type = authentication_type;
new_request.authentication = authentication;
let headers = window.db().resolve_headers_for_http_request(request)?;
new_request.headers = headers;
Ok((new_request, authentication_context_id))
}
async fn execute_transaction<R: Runtime>(
cached_client: CachedClient,
mut sendable_request: SendableHttpRequest,
response_ctx: &mut ResponseContext<R>,
mut cancelled_rx: Receiver<bool>,
cookie_store: Option<CookieStore>,
) -> Result<(HttpResponse, Option<tauri::async_runtime::JoinHandle<Result<()>>>)> {
let app_handle = &response_ctx.app_handle.clone();
let response_id = response_ctx.response().id.clone();
let workspace_id = response_ctx.response().workspace_id.clone();
let is_persisted = response_ctx.is_persisted();
// Keep a reference to the resolver for DNS timing events
let resolver = cached_client.resolver.clone();
let sender = ReqwestSender::with_client(cached_client.client);
let transaction = match cookie_store {
Some(cs) => HttpTransaction::with_cookie_store(sender, cs),
None => HttpTransaction::new(sender),
};
let start = Instant::now();
// Capture request headers before sending
let request_headers: Vec<HttpResponseHeader> = sendable_request
.headers
.iter()
.map(|(name, value)| HttpResponseHeader { name: name.clone(), value: value.clone() })
.collect();
// Update response with headers info
response_ctx.update(|r| {
r.url = sendable_request.url.clone();
r.request_headers = request_headers;
})?;
// Create bounded channel for receiving events and spawn a task to store them in DB
// Buffer size of 100 events provides back pressure if DB writes are slow
let (event_tx, mut event_rx) =
tokio::sync::mpsc::channel::<yaak_http::sender::HttpResponseEvent>(100);
// Set the event sender on the DNS resolver so it can emit DNS timing events
resolver.set_event_sender(Some(event_tx.clone())).await;
// Shared state to capture DNS timing from the event processing task
let dns_elapsed = Arc::new(AtomicI32::new(0));
// Write events to DB in a task (only for persisted responses)
if is_persisted {
let response_id = response_id.clone();
let app_handle = app_handle.clone();
let update_source = response_ctx.update_source.clone();
let workspace_id = workspace_id.clone();
let dns_elapsed = dns_elapsed.clone();
tokio::spawn(async move {
while let Some(event) = event_rx.recv().await {
// Capture DNS timing when we see a DNS event
if let yaak_http::sender::HttpResponseEvent::DnsResolved { duration, .. } = &event {
dns_elapsed.store(*duration as i32, Ordering::SeqCst);
}
let db_event = HttpResponseEvent::new(&response_id, &workspace_id, event.into());
let _ = app_handle.db().upsert_http_response_event(&db_event, &update_source);
}
});
} else {
// For ephemeral responses, just drain the events but still capture DNS timing
let dns_elapsed = dns_elapsed.clone();
tokio::spawn(async move {
while let Some(event) = event_rx.recv().await {
if let yaak_http::sender::HttpResponseEvent::DnsResolved { duration, .. } = &event {
dns_elapsed.store(*duration as i32, Ordering::SeqCst);
}
}
});
};
// Capture request body as it's sent (only for persisted responses)
let body_id = format!("{}.request", response_id);
let maybe_blob_write_handle = match sendable_request.body {
Some(SendableBody::Bytes(bytes)) => {
if is_persisted {
write_bytes_to_db_sync(response_ctx, &body_id, bytes.clone())?;
}
sendable_request.body = Some(SendableBody::Bytes(bytes));
None
}
Some(SendableBody::Stream(stream)) => {
// Wrap stream with TeeReader to capture data as it's read
// Use unbounded channel to ensure all data is captured without blocking the HTTP request
let (body_chunk_tx, body_chunk_rx) = tokio::sync::mpsc::unbounded_channel::<Vec<u8>>();
let tee_reader = TeeReader::new(stream, body_chunk_tx);
let pinned: Pin<Box<dyn AsyncRead + Send + 'static>> = Box::pin(tee_reader);
let handle = if is_persisted {
// Spawn task to write request body chunks to blob DB
let app_handle = app_handle.clone();
let response_id = response_id.clone();
let workspace_id = workspace_id.clone();
let body_id = body_id.clone();
let update_source = response_ctx.update_source.clone();
Some(tauri::async_runtime::spawn(async move {
write_stream_chunks_to_db(
app_handle,
&body_id,
&workspace_id,
&response_id,
&update_source,
body_chunk_rx,
)
.await
}))
} else {
// For ephemeral responses, just drain the body chunks
tauri::async_runtime::spawn(async move {
let mut rx = body_chunk_rx;
while rx.recv().await.is_some() {}
});
None
};
sendable_request.body = Some(SendableBody::Stream(pinned));
handle
}
None => {
sendable_request.body = None;
None
}
};
// Execute the transaction with cancellation support
// This returns the response with headers, but body is not yet consumed
// Events (headers, settings, chunks) are sent through the channel
let mut http_response = transaction
.execute_with_cancellation(sendable_request, cancelled_rx.clone(), event_tx)
.await?;
// Prepare the response path before consuming the body
let body_path = if response_id.is_empty() {
// Ephemeral responses: use OS temp directory for automatic cleanup
let temp_dir = std::env::temp_dir().join("yaak-ephemeral-responses");
create_dir_all(&temp_dir).await?;
temp_dir.join(uuid::Uuid::new_v4().to_string())
} else {
// Persisted responses: use app data directory
let dir = app_handle.path().app_data_dir()?;
let base_dir = dir.join("responses");
create_dir_all(&base_dir).await?;
base_dir.join(&response_id)
};
// Extract metadata before consuming the body (headers are available immediately)
// Url might change, so update again
response_ctx.update(|r| {
r.body_path = Some(body_path.to_string_lossy().to_string());
r.elapsed_headers = start.elapsed().as_millis() as i32;
r.status = http_response.status as i32;
r.status_reason = http_response.status_reason.clone();
r.url = http_response.url.clone();
r.remote_addr = http_response.remote_addr.clone();
r.version = http_response.version.clone();
r.headers = http_response
.headers
.iter()
.map(|(name, value)| HttpResponseHeader { name: name.clone(), value: value.clone() })
.collect();
r.content_length = http_response.content_length.map(|l| l as i32);
r.state = HttpResponseState::Connected;
r.request_headers = http_response
.request_headers
.iter()
.map(|(n, v)| HttpResponseHeader { name: n.clone(), value: v.clone() })
.collect();
})?;
// Get the body stream for manual consumption
let mut body_stream = http_response.into_body_stream()?;
// Open file for writing
let mut file = File::options()
.create(true)
.truncate(true)
.write(true)
.open(&body_path)
.await
.map_err(|e| GenericError(format!("Failed to open file: {}", e)))?;
// Stream body to file, with throttled DB updates to avoid excessive writes
let mut written_bytes: usize = 0;
let mut last_update_time = start;
let mut buf = [0u8; 8192];
// Throttle settings: update DB at most every 100ms
const UPDATE_INTERVAL_MS: u128 = 100;
loop {
// Check for cancellation. If we already have headers/body, just close cleanly without error
if *cancelled_rx.borrow() {
break;
}
// Use select! to race between reading and cancellation, so cancellation is immediate
let read_result = tokio::select! {
biased;
_ = cancelled_rx.changed() => {
break;
}
result = body_stream.read(&mut buf) => result,
};
match read_result {
Ok(0) => break, // EOF
Ok(n) => {
file.write_all(&buf[..n])
.await
.map_err(|e| GenericError(format!("Failed to write to file: {}", e)))?;
file.flush()
.await
.map_err(|e| GenericError(format!("Failed to flush file: {}", e)))?;
written_bytes += n;
// Throttle DB updates: only update if enough time has passed
let now = Instant::now();
let elapsed_since_update = now.duration_since(last_update_time).as_millis();
if elapsed_since_update >= UPDATE_INTERVAL_MS {
response_ctx.update(|r| {
r.elapsed = start.elapsed().as_millis() as i32;
r.content_length = Some(written_bytes as i32);
})?;
last_update_time = now;
}
}
Err(e) => {
return Err(GenericError(format!("Failed to read response body: {}", e)));
}
}
}
// Final update with closed state and accurate byte count
response_ctx.update(|r| {
r.elapsed = start.elapsed().as_millis() as i32;
r.elapsed_dns = dns_elapsed.load(Ordering::SeqCst);
r.content_length = Some(written_bytes as i32);
r.state = HttpResponseState::Closed;
})?;
// Clear the event sender from the resolver since this request is done
resolver.set_event_sender(None).await;
Ok((response_ctx.response().clone(), maybe_blob_write_handle))
}
fn write_bytes_to_db_sync<R: Runtime>(
response_ctx: &mut ResponseContext<R>,
body_id: &str,
data: Bytes,
) -> Result<()> {
if data.is_empty() {
return Ok(());
}
// Write in chunks if data is large
let mut offset = 0;
let mut chunk_index = 0;
while offset < data.len() {
let end = std::cmp::min(offset + REQUEST_BODY_CHUNK_SIZE, data.len());
let chunk_data = data.slice(offset..end).to_vec();
let chunk = BodyChunk::new(body_id, chunk_index, chunk_data);
response_ctx.app_handle.blobs().insert_chunk(&chunk)?;
offset = end;
chunk_index += 1;
}
// Update the response with the total request body size
response_ctx.update(|r| {
r.request_content_length = Some(data.len() as i32);
})?;
Ok(())
}
async fn write_stream_chunks_to_db<R: Runtime>(
app_handle: AppHandle<R>,
body_id: &str,
workspace_id: &str,
response_id: &str,
update_source: &UpdateSource,
mut rx: tokio::sync::mpsc::UnboundedReceiver<Vec<u8>>,
) -> Result<()> {
let mut buffer = Vec::with_capacity(REQUEST_BODY_CHUNK_SIZE);
let mut chunk_index = 0;
let mut total_bytes: usize = 0;
while let Some(data) = rx.recv().await {
total_bytes += data.len();
buffer.extend_from_slice(&data);
// Flush when buffer reaches chunk size
while buffer.len() >= REQUEST_BODY_CHUNK_SIZE {
debug!("Writing chunk {chunk_index} to DB");
let chunk_data: Vec<u8> = buffer.drain(..REQUEST_BODY_CHUNK_SIZE).collect();
let chunk = BodyChunk::new(body_id, chunk_index, chunk_data);
app_handle.blobs().insert_chunk(&chunk)?;
app_handle.db().upsert_http_response_event(
&HttpResponseEvent::new(
response_id,
workspace_id,
yaak_http::sender::HttpResponseEvent::ChunkSent {
bytes: REQUEST_BODY_CHUNK_SIZE,
}
.into(),
),
update_source,
)?;
chunk_index += 1;
}
}
// Flush remaining data
if !buffer.is_empty() {
let chunk = BodyChunk::new(body_id, chunk_index, buffer);
debug!("Flushing remaining data {chunk_index} {}", chunk.data.len());
app_handle.blobs().insert_chunk(&chunk)?;
app_handle.db().upsert_http_response_event(
&HttpResponseEvent::new(
response_id,
workspace_id,
yaak_http::sender::HttpResponseEvent::ChunkSent { bytes: chunk.data.len() }.into(),
),
update_source,
)?;
}
// Update the response with the total request body size
app_handle.with_tx(|tx| {
debug!("Updating final body length {total_bytes}");
if let Ok(mut response) = tx.get_http_response(&response_id) {
response.request_content_length = Some(total_bytes as i32);
tx.update_http_response_if_id(&response, update_source)?;
}
Ok(())
})?;
Ok(())
}
async fn apply_authentication<R: Runtime>(
_window: &WebviewWindow<R>,
sendable_request: &mut SendableHttpRequest,
request: &HttpRequest,
auth_context_id: String,
plugin_manager: &PluginManager,
plugin_context: &PluginContext,
) -> Result<()> {
match &request.authentication_type {
None => {
// No authentication found. Not even inherited
}
Some(authentication_type) if authentication_type == "none" => {
// Explicitly no authentication
}
Some(authentication_type) => {
let req = CallHttpAuthenticationRequest {
context_id: format!("{:x}", md5::compute(auth_context_id)),
values: serde_json::from_value(serde_json::to_value(&request.authentication)?)?,
url: sendable_request.url.clone(),
method: sendable_request.method.clone(),
headers: sendable_request
.headers
.iter()
.map(|(name, value)| HttpHeader {
name: name.to_string(),
value: value.to_string(),
})
.collect(),
};
let plugin_result = plugin_manager
.call_http_authentication(plugin_context, &authentication_type, req)
.await?;
for header in plugin_result.set_headers.unwrap_or_default() {
sendable_request.insert_header((header.name, header.value));
}
if let Some(params) = plugin_result.set_query_parameters {
let params = params.into_iter().map(|p| (p.name, p.value)).collect::<Vec<_>>();
sendable_request.url = append_query_params(&sendable_request.url, params);
}
}
}
Ok(())
}
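
The request-body chunking in `write_bytes_to_db_sync` slices the payload into fixed 1 MB pieces plus a final remainder; a standalone sketch of just that arithmetic:

```
// Sketch of the chunking loop: a 2.5 MB payload yields chunks of 1 MB, 1 MB, 0.5 MB.
fn main() {
    const CHUNK: usize = 1024 * 1024;
    let data = vec![0u8; CHUNK * 2 + CHUNK / 2];
    let mut sizes = Vec::new();
    let mut offset = 0;
    while offset < data.len() {
        let end = std::cmp::min(offset + CHUNK, data.len());
        sizes.push(end - offset);
        offset = end;
    }
    assert_eq!(sizes, vec![CHUNK, CHUNK, CHUNK / 2]);
}
```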

View File

@@ -1,129 +0,0 @@
use crate::PluginContextExt;
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use log::info;
use std::collections::BTreeMap;
use std::fs::read_to_string;
use tauri::{Manager, Runtime, WebviewWindow};
use yaak_core::WorkspaceContext;
use yaak_models::models::{
Environment, Folder, GrpcRequest, HttpRequest, WebsocketRequest, Workspace,
};
use yaak_models::util::{BatchUpsertResult, UpdateSource, maybe_gen_id, maybe_gen_id_opt};
use yaak_plugins::manager::PluginManager;
use yaak_tauri_utils::window::WorkspaceWindowTrait;
pub(crate) async fn import_data<R: Runtime>(
window: &WebviewWindow<R>,
file_path: &str,
) -> Result<BatchUpsertResult> {
let plugin_manager = window.state::<PluginManager>();
let file =
read_to_string(file_path).unwrap_or_else(|_| panic!("Unable to read file {}", file_path));
let file_contents = file.as_str();
let import_result = plugin_manager.import_data(&window.plugin_context(), file_contents).await?;
let mut id_map: BTreeMap<String, String> = BTreeMap::new();
// Create WorkspaceContext from window
let ctx = WorkspaceContext {
workspace_id: window.workspace_id(),
environment_id: window.environment_id(),
cookie_jar_id: window.cookie_jar_id(),
request_id: None,
};
let resources = import_result.resources;
let workspaces: Vec<Workspace> = resources
.workspaces
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<Workspace>(&ctx, v.id.as_str(), &mut id_map);
v
})
.collect();
let environments: Vec<Environment> = resources
.environments
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<Environment>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
match (v.parent_model.as_str(), v.parent_id.clone().as_deref()) {
("folder", Some(parent_id)) => {
v.parent_id = Some(maybe_gen_id::<Folder>(&ctx, &parent_id, &mut id_map));
}
("", _) => {
// Fix any empty ones
v.parent_model = "workspace".to_string();
}
_ => {
// Parent ID only required for the folder case
v.parent_id = None;
}
};
v
})
.collect();
let folders: Vec<Folder> = resources
.folders
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<Folder>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
v.folder_id = maybe_gen_id_opt::<Folder>(&ctx, v.folder_id, &mut id_map);
v
})
.collect();
let http_requests: Vec<HttpRequest> = resources
.http_requests
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<HttpRequest>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
v.folder_id = maybe_gen_id_opt::<Folder>(&ctx, v.folder_id, &mut id_map);
v
})
.collect();
let grpc_requests: Vec<GrpcRequest> = resources
.grpc_requests
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<GrpcRequest>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
v.folder_id = maybe_gen_id_opt::<Folder>(&ctx, v.folder_id, &mut id_map);
v
})
.collect();
let websocket_requests: Vec<WebsocketRequest> = resources
.websocket_requests
.into_iter()
.map(|mut v| {
v.id = maybe_gen_id::<WebsocketRequest>(&ctx, v.id.as_str(), &mut id_map);
v.workspace_id = maybe_gen_id::<Workspace>(&ctx, v.workspace_id.as_str(), &mut id_map);
v.folder_id = maybe_gen_id_opt::<Folder>(&ctx, v.folder_id, &mut id_map);
v
})
.collect();
info!("Importing data");
let upserted = window.with_tx(|tx| {
tx.batch_upsert(
workspaces,
environments,
folders,
http_requests,
grpc_requests,
websocket_requests,
&UpdateSource::Import,
)
})?;
Ok(upserted)
}

File diff suppressed because it is too large.

View File

@@ -1,279 +0,0 @@
//! Tauri-specific extensions for yaak-models.
//!
//! This module provides the Tauri plugin initialization and extension traits
//! that allow accessing QueryManager and BlobManager from Tauri's Manager types.
use tauri::plugin::TauriPlugin;
use tauri::{Emitter, Manager, Runtime, State};
use tauri_plugin_dialog::{DialogExt, MessageDialogKind};
use yaak_models::blob_manager::BlobManager;
use yaak_models::db_context::DbContext;
use yaak_models::error::Result;
use yaak_models::models::{AnyModel, GraphQlIntrospection, GrpcEvent, Settings, WebsocketEvent};
use yaak_models::query_manager::QueryManager;
use yaak_models::util::UpdateSource;
/// Extension trait for accessing the QueryManager from Tauri Manager types.
pub trait QueryManagerExt<'a, R> {
fn db_manager(&'a self) -> State<'a, QueryManager>;
fn db(&'a self) -> DbContext<'a>;
fn with_tx<F, T>(&'a self, func: F) -> Result<T>
where
F: FnOnce(&DbContext) -> Result<T>;
}
impl<'a, R: Runtime, M: Manager<R>> QueryManagerExt<'a, R> for M {
fn db_manager(&'a self) -> State<'a, QueryManager> {
self.state::<QueryManager>()
}
fn db(&'a self) -> DbContext<'a> {
let qm = self.state::<QueryManager>();
qm.inner().connect()
}
fn with_tx<F, T>(&'a self, func: F) -> Result<T>
where
F: FnOnce(&DbContext) -> Result<T>,
{
let qm = self.state::<QueryManager>();
qm.inner().with_tx(func)
}
}
/// Extension trait for accessing the BlobManager from Tauri Manager types.
pub trait BlobManagerExt<'a, R> {
fn blob_manager(&'a self) -> State<'a, BlobManager>;
fn blobs(&'a self) -> yaak_models::blob_manager::BlobContext;
}
impl<'a, R: Runtime, M: Manager<R>> BlobManagerExt<'a, R> for M {
fn blob_manager(&'a self) -> State<'a, BlobManager> {
self.state::<BlobManager>()
}
fn blobs(&'a self) -> yaak_models::blob_manager::BlobContext {
let manager = self.state::<BlobManager>();
manager.inner().connect()
}
}
// Commands for yaak-models
use tauri::WebviewWindow;
#[tauri::command]
pub(crate) fn models_upsert<R: Runtime>(
window: WebviewWindow<R>,
model: AnyModel,
) -> Result<String> {
use yaak_models::error::Error::GenericError;
let db = window.db();
let blobs = window.blob_manager();
let source = &UpdateSource::from_window_label(window.label());
let id = match model {
AnyModel::CookieJar(m) => db.upsert_cookie_jar(&m, source)?.id,
AnyModel::Environment(m) => db.upsert_environment(&m, source)?.id,
AnyModel::Folder(m) => db.upsert_folder(&m, source)?.id,
AnyModel::GrpcRequest(m) => db.upsert_grpc_request(&m, source)?.id,
AnyModel::HttpRequest(m) => db.upsert_http_request(&m, source)?.id,
AnyModel::HttpResponse(m) => db.upsert_http_response(&m, source, &blobs)?.id,
AnyModel::KeyValue(m) => db.upsert_key_value(&m, source)?.id,
AnyModel::Plugin(m) => db.upsert_plugin(&m, source)?.id,
AnyModel::Settings(m) => db.upsert_settings(&m, source)?.id,
AnyModel::WebsocketRequest(m) => db.upsert_websocket_request(&m, source)?.id,
AnyModel::Workspace(m) => db.upsert_workspace(&m, source)?.id,
AnyModel::WorkspaceMeta(m) => db.upsert_workspace_meta(&m, source)?.id,
a => return Err(GenericError(format!("Cannot upsert AnyModel {a:?}"))),
};
Ok(id)
}
#[tauri::command]
pub(crate) fn models_delete<R: Runtime>(
window: WebviewWindow<R>,
model: AnyModel,
) -> Result<String> {
use yaak_models::error::Error::GenericError;
let blobs = window.blob_manager();
// Use transaction for deletions because it might recurse
window.with_tx(|tx| {
let source = &UpdateSource::from_window_label(window.label());
let id = match model {
AnyModel::CookieJar(m) => tx.delete_cookie_jar(&m, source)?.id,
AnyModel::Environment(m) => tx.delete_environment(&m, source)?.id,
AnyModel::Folder(m) => tx.delete_folder(&m, source)?.id,
AnyModel::GrpcConnection(m) => tx.delete_grpc_connection(&m, source)?.id,
AnyModel::GrpcRequest(m) => tx.delete_grpc_request(&m, source)?.id,
AnyModel::HttpRequest(m) => tx.delete_http_request(&m, source)?.id,
AnyModel::HttpResponse(m) => tx.delete_http_response(&m, source, &blobs)?.id,
AnyModel::Plugin(m) => tx.delete_plugin(&m, source)?.id,
AnyModel::WebsocketConnection(m) => tx.delete_websocket_connection(&m, source)?.id,
AnyModel::WebsocketRequest(m) => tx.delete_websocket_request(&m, source)?.id,
AnyModel::Workspace(m) => tx.delete_workspace(&m, source)?.id,
a => return Err(GenericError(format!("Cannot delete AnyModel {a:?}"))),
};
Ok(id)
})
}
#[tauri::command]
pub(crate) fn models_duplicate<R: Runtime>(
window: WebviewWindow<R>,
model: AnyModel,
) -> Result<String> {
use yaak_models::error::Error::GenericError;
// Use transaction for duplications because it might recurse
window.with_tx(|tx| {
let source = &UpdateSource::from_window_label(window.label());
let id = match model {
AnyModel::Environment(m) => tx.duplicate_environment(&m, source)?.id,
AnyModel::Folder(m) => tx.duplicate_folder(&m, source)?.id,
AnyModel::GrpcRequest(m) => tx.duplicate_grpc_request(&m, source)?.id,
AnyModel::HttpRequest(m) => tx.duplicate_http_request(&m, source)?.id,
AnyModel::WebsocketRequest(m) => tx.duplicate_websocket_request(&m, source)?.id,
a => return Err(GenericError(format!("Cannot duplicate AnyModel {a:?}"))),
};
Ok(id)
})
}
#[tauri::command]
pub(crate) fn models_websocket_events<R: Runtime>(
app_handle: tauri::AppHandle<R>,
connection_id: &str,
) -> Result<Vec<WebsocketEvent>> {
Ok(app_handle.db().list_websocket_events(connection_id)?)
}
#[tauri::command]
pub(crate) fn models_grpc_events<R: Runtime>(
app_handle: tauri::AppHandle<R>,
connection_id: &str,
) -> Result<Vec<GrpcEvent>> {
Ok(app_handle.db().list_grpc_events(connection_id)?)
}
#[tauri::command]
pub(crate) fn models_get_settings<R: Runtime>(app_handle: tauri::AppHandle<R>) -> Result<Settings> {
Ok(app_handle.db().get_settings())
}
#[tauri::command]
pub(crate) fn models_get_graphql_introspection<R: Runtime>(
app_handle: tauri::AppHandle<R>,
request_id: &str,
) -> Result<Option<GraphQlIntrospection>> {
Ok(app_handle.db().get_graphql_introspection(request_id))
}
#[tauri::command]
pub(crate) fn models_upsert_graphql_introspection<R: Runtime>(
app_handle: tauri::AppHandle<R>,
request_id: &str,
workspace_id: &str,
content: Option<String>,
window: WebviewWindow<R>,
) -> Result<GraphQlIntrospection> {
let source = UpdateSource::from_window_label(window.label());
Ok(app_handle.db().upsert_graphql_introspection(workspace_id, request_id, content, &source)?)
}
#[tauri::command]
pub(crate) fn models_workspace_models<R: Runtime>(
window: WebviewWindow<R>,
workspace_id: Option<&str>,
) -> Result<String> {
let db = window.db();
let mut l: Vec<AnyModel> = Vec::new();
// Add the settings
l.push(db.get_settings().into());
// Add global models
l.append(&mut db.list_workspaces()?.into_iter().map(Into::into).collect());
l.append(&mut db.list_key_values()?.into_iter().map(Into::into).collect());
l.append(&mut db.list_plugins()?.into_iter().map(Into::into).collect());
// Add the workspace children
if let Some(wid) = workspace_id {
l.append(&mut db.list_cookie_jars(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_environments_ensure_base(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_folders(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_grpc_connections(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_grpc_requests(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_http_requests(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_http_responses(wid, None)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_websocket_connections(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_websocket_requests(wid)?.into_iter().map(Into::into).collect());
l.append(&mut db.list_workspace_metas(wid)?.into_iter().map(Into::into).collect());
}
let j = serde_json::to_string(&l)?;
Ok(escape_str_for_webview(&j))
}
fn escape_str_for_webview(input: &str) -> String {
input
.chars()
.map(|c| {
let code = c as u32;
// ASCII
if code <= 0x7F {
c.to_string()
// BMP characters are escaped as a single \uXXXX sequence
} else if code <= 0xFFFF {
format!("\\u{:04X}", code)
// Characters beyond the BMP are escaped as a surrogate pair
} else {
let high = ((code - 0x10000) >> 10) + 0xD800;
let low = ((code - 0x10000) & 0x3FF) + 0xDC00;
format!("\\u{:04X}\\u{:04X}", high, low)
}
})
.collect()
}
/// Initialize database managers as a plugin (for initialization order).
/// Commands are in the main invoke_handler.
/// This must be registered before other plugins that depend on the database.
pub fn init<R: Runtime>() -> TauriPlugin<R> {
tauri::plugin::Builder::new("yaak-models-db")
.setup(|app_handle, _api| {
let app_path = app_handle.path().app_data_dir().unwrap();
let db_path = app_path.join("db.sqlite");
let blob_path = app_path.join("blobs.sqlite");
let (query_manager, blob_manager, rx) =
match yaak_models::init_standalone(&db_path, &blob_path) {
Ok(result) => result,
Err(e) => {
app_handle
.dialog()
.message(e.to_string())
.kind(MessageDialogKind::Error)
.blocking_show();
return Err(Box::from(e.to_string()));
}
};
app_handle.manage(query_manager);
app_handle.manage(blob_manager);
// Forward model change events to the frontend
let app_handle = app_handle.clone();
tauri::async_runtime::spawn(async move {
for payload in rx {
app_handle.emit("model_write", payload).unwrap();
}
});
Ok(())
})
.build()
}
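
The surrogate-pair math in `escape_str_for_webview` can be checked against a known code point; a standalone sketch using U+1F600, whose UTF-16 encoding is D83D DE00:

```
// Sketch: verify the surrogate-pair formula for U+1F600 (😀).
fn main() {
    let code: u32 = 0x1F600;
    let high = ((code - 0x10000) >> 10) + 0xD800; // expect 0xD83D
    let low = ((code - 0x10000) & 0x3FF) + 0xDC00; // expect 0xDE00
    assert_eq!(format!("\\u{:04X}\\u{:04X}", high, low), "\\uD83D\\uDE00");
}
```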

View File

@@ -1,172 +0,0 @@
use crate::error::Result;
use crate::history::get_or_upsert_launch_info;
use crate::models_ext::QueryManagerExt;
use chrono::{DateTime, Utc};
use log::{debug, info};
use reqwest::Method;
use serde::{Deserialize, Serialize};
use std::time::Instant;
use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow};
use ts_rs::TS;
use yaak_common::platform::get_os_str;
use yaak_models::util::UpdateSource;
use yaak_tauri_utils::api_client::yaak_api_client;
// Check for new notifications at most once per hour
const MAX_UPDATE_CHECK_SECONDS: u64 = 60 * 60;
const KV_NAMESPACE: &str = "notifications";
const KV_KEY: &str = "seen";
// Notifier that periodically fetches in-app notifications
pub struct YaakNotifier {
last_check: Option<Instant>,
}
#[derive(Debug, Clone, Serialize, Deserialize, Default, TS)]
#[serde(default, rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct YaakNotification {
timestamp: DateTime<Utc>,
timeout: Option<f64>,
id: String,
title: Option<String>,
message: String,
color: Option<String>,
action: Option<YaakNotificationAction>,
}
#[derive(Debug, Clone, Serialize, Deserialize, Default, TS)]
#[serde(default, rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct YaakNotificationAction {
label: String,
url: String,
}
impl YaakNotifier {
pub fn new() -> Self {
Self { last_check: None }
}
pub async fn seen<R: Runtime>(&mut self, window: &WebviewWindow<R>, id: &str) -> Result<()> {
let app_handle = window.app_handle();
let mut seen = get_kv(app_handle).await?;
seen.push(id.to_string());
debug!("Marked notification as seen {}", id);
let seen_json = serde_json::to_string(&seen)?;
window.db().set_key_value_raw(
KV_NAMESPACE,
KV_KEY,
seen_json.as_str(),
&UpdateSource::from_window_label(window.label()),
);
Ok(())
}
pub async fn maybe_check<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<()> {
let app_handle = window.app_handle();
if let Some(i) = self.last_check
&& i.elapsed().as_secs() < MAX_UPDATE_CHECK_SECONDS
{
return Ok(());
}
self.last_check = Some(Instant::now());
if !app_handle.db().get_settings().check_notifications {
info!("Notifications are disabled. Skipping check.");
return Ok(());
}
debug!("Checking for notifications");
#[cfg(feature = "license")]
let license_check = {
use yaak_license::{LicenseCheckStatus, check_license};
match check_license(window).await {
Ok(LicenseCheckStatus::PersonalUse { .. }) => "personal",
Ok(LicenseCheckStatus::Active { .. }) => "commercial",
Ok(LicenseCheckStatus::PastDue { .. }) => "past_due",
Ok(LicenseCheckStatus::Inactive { .. }) => "invalid_license",
Ok(LicenseCheckStatus::Trialing { .. }) => "trialing",
Ok(LicenseCheckStatus::Expired { .. }) => "expired",
Ok(LicenseCheckStatus::Error { .. }) => "error",
Err(_) => "unknown",
}
.to_string()
};
#[cfg(not(feature = "license"))]
let license_check = "disabled".to_string();
let launch_info = get_or_upsert_launch_info(app_handle);
let req = yaak_api_client(app_handle)?
.request(Method::GET, "https://notify.yaak.app/notifications")
.query(&[
("version", &launch_info.current_version),
("version_prev", &launch_info.previous_version),
("launches", &launch_info.num_launches.to_string()),
("installed", &launch_info.user_since.format("%Y-%m-%d").to_string()),
("license", &license_check),
("updates", &get_updater_status(app_handle).to_string()),
("platform", &get_os_str().to_string()),
]);
let resp = req.send().await?;
if resp.status() != 200 {
debug!("Skipping notification status code {}", resp.status());
return Ok(());
}
for notification in resp.json::<Vec<YaakNotification>>().await? {
let seen = get_kv(app_handle).await?;
if seen.contains(&notification.id) {
debug!("Already seen notification {}", notification.id);
continue;
}
debug!("Got notification {:?}", notification);
let _ = app_handle.emit_to(window.label(), "notification", notification.clone());
break; // Only show one notification
}
Ok(())
}
}
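// A sketch (an assumption, not taken from the original source) of how the notifier is
// expected to be driven by the host app: manage it as shared state and call
// `maybe_check` on a periodic trigger such as window focus, e.g.
//
//   app_handle.manage(tokio::sync::Mutex::new(YaakNotifier::new()));
//   // later, e.g. on WindowEvent::Focused(true):
//   // app_handle.state::<Mutex<YaakNotifier>>().lock().await.maybe_check(&window).await?;
//
// `maybe_check` itself rate-limits to at most one request per MAX_UPDATE_CHECK_SECONDS.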
async fn get_kv<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Vec<String>> {
match app_handle.db().get_key_value_raw(KV_NAMESPACE, KV_KEY) {
None => Ok(Vec::new()),
Some(v) => Ok(serde_json::from_str(&v.value)?),
}
}
#[allow(unused)]
fn get_updater_status<R: Runtime>(app_handle: &AppHandle<R>) -> &'static str {
#[cfg(not(feature = "updater"))]
{
// Updater is not enabled as a Rust feature
return "missing";
}
#[cfg(all(feature = "updater", target_os = "linux"))]
{
let settings = app_handle.db().get_settings();
if !settings.autoupdate {
// Updates are explicitly disabled
"disabled"
} else if std::env::var("APPIMAGE").is_err() {
// Updates are enabled, but unsupported
"unsupported"
} else {
// Updates are enabled and supported
"enabled"
}
}
#[cfg(all(feature = "updater", not(target_os = "linux")))]
{
let settings = app_handle.db().get_settings();
if settings.autoupdate { "enabled" } else { "disabled" }
}
}

View File

@@ -1,542 +0,0 @@
use crate::error::Result;
use crate::http_request::send_http_request_with_context;
use crate::models_ext::BlobManagerExt;
use crate::models_ext::QueryManagerExt;
use crate::render::{render_grpc_request, render_http_request, render_json_value};
use crate::window::{CreateWindowConfig, create_window};
use crate::{
call_frontend, cookie_jar_from_window, environment_from_window, get_window_from_plugin_context,
workspace_from_window,
};
use chrono::Utc;
use cookie::Cookie;
use log::error;
use std::sync::Arc;
use tauri::{AppHandle, Emitter, Listener, Manager, Runtime};
use tauri_plugin_clipboard_manager::ClipboardExt;
use tauri_plugin_opener::OpenerExt;
use yaak_crypto::manager::EncryptionManager;
use yaak_models::models::{AnyModel, HttpResponse, Plugin};
use yaak_models::queries::any_request::AnyRequest;
use yaak_models::util::UpdateSource;
use yaak_plugins::error::Error::PluginErr;
use yaak_plugins::events::{
Color, DeleteKeyValueResponse, EmptyPayload, ErrorResponse, FindHttpResponsesResponse,
GetCookieValueResponse, GetHttpRequestByIdResponse, GetKeyValueResponse, Icon, InternalEvent,
InternalEventPayload, ListCookieNamesResponse, ListHttpRequestsResponse,
ListWorkspacesResponse, RenderGrpcRequestResponse, RenderHttpRequestResponse,
SendHttpRequestResponse, SetKeyValueResponse, ShowToastRequest, TemplateRenderResponse,
WindowInfoResponse, WindowNavigateEvent, WorkspaceInfo,
};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::plugin_handle::PluginHandle;
use yaak_plugins::template_callback::PluginTemplateCallback;
use yaak_tauri_utils::window::WorkspaceWindowTrait;
use yaak_templates::{RenderErrorBehavior, RenderOptions};
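/// Handles a single event received from the plugin runtime and returns the reply
/// payload to send back, if any. Window-scoped requests are resolved via the plugin
/// context's window label; everything else goes through the app handle's database.
///
/// Sketch of the expected caller loop (an assumption, not taken from the original
/// source): for each `InternalEvent` received from a `PluginHandle`, call this
/// function and, when it returns `Some(payload)`, wrap the payload with
/// `plugin_handle.build_event_to_send(..)` and send it back via `plugin_handle.send(..)`.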
pub(crate) async fn handle_plugin_event<R: Runtime>(
app_handle: &AppHandle<R>,
event: &InternalEvent,
plugin_handle: &PluginHandle,
) -> Result<Option<InternalEventPayload>> {
// log::debug!("Got event to app {event:?}");
let plugin_context = event.context.to_owned();
match event.clone().payload {
InternalEventPayload::CopyTextRequest(req) => {
app_handle.clipboard().write_text(req.text.as_str())?;
Ok(Some(InternalEventPayload::CopyTextResponse(EmptyPayload {})))
}
InternalEventPayload::ShowToastRequest(req) => {
match plugin_context.label {
Some(label) => app_handle.emit_to(label, "show_toast", req)?,
None => app_handle.emit("show_toast", req)?,
};
Ok(Some(InternalEventPayload::ShowToastResponse(EmptyPayload {})))
}
InternalEventPayload::PromptTextRequest(_) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
Ok(call_frontend(&window, event).await)
}
InternalEventPayload::PromptFormRequest(_) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
if event.reply_id.is_some() {
// Follow-up update from plugin runtime with resolved inputs — forward to frontend
window.emit_to(window.label(), "plugin_event", event.clone())?;
Ok(None)
} else {
// Initial request — set up bidirectional communication
window.emit_to(window.label(), "plugin_event", event.clone()).unwrap();
let event_id = event.id.clone();
let plugin_handle = plugin_handle.clone();
let plugin_context = plugin_context.clone();
let window = window.clone();
// Spawn async task to handle bidirectional form communication
tauri::async_runtime::spawn(async move {
let (tx, mut rx) = tokio::sync::mpsc::channel::<InternalEvent>(128);
// Listen for replies from the frontend
let listener_id = window.listen(event_id, move |ev: tauri::Event| {
let resp: InternalEvent = serde_json::from_str(ev.payload()).unwrap();
let _ = tx.try_send(resp);
});
// Forward each reply to the plugin runtime
while let Some(resp) = rx.recv().await {
let is_done = matches!(
&resp.payload,
InternalEventPayload::PromptFormResponse(r) if r.done.unwrap_or(false)
);
let event_to_send = plugin_handle.build_event_to_send(
&plugin_context,
&resp.payload,
Some(resp.reply_id.unwrap_or_default()),
);
if let Err(e) = plugin_handle.send(&event_to_send).await {
log::warn!("Failed to forward form response to plugin: {:?}", e);
}
if is_done {
break;
}
}
window.unlisten(listener_id);
});
Ok(None)
}
}
InternalEventPayload::FindHttpResponsesRequest(req) => {
let http_responses = app_handle
.db()
.list_http_responses_for_request(&req.request_id, req.limit.map(|l| l as u64))
.unwrap_or_default();
Ok(Some(InternalEventPayload::FindHttpResponsesResponse(FindHttpResponsesResponse {
http_responses,
})))
}
InternalEventPayload::ListHttpRequestsRequest(req) => {
let w = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace = workspace_from_window(&w)
.ok_or(PluginErr("Failed to get workspace from window".into()))?;
let http_requests = if let Some(folder_id) = req.folder_id {
app_handle.db().list_http_requests_for_folder_recursive(&folder_id)?
} else {
app_handle.db().list_http_requests(&workspace.id)?
};
Ok(Some(InternalEventPayload::ListHttpRequestsResponse(ListHttpRequestsResponse {
http_requests,
})))
}
InternalEventPayload::ListFoldersRequest(_req) => {
let w = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace = workspace_from_window(&w)
.ok_or(PluginErr("Failed to get workspace from window".into()))?;
let folders = app_handle.db().list_folders(&workspace.id)?;
Ok(Some(InternalEventPayload::ListFoldersResponse(
yaak_plugins::events::ListFoldersResponse { folders },
)))
}
InternalEventPayload::UpsertModelRequest(req) => {
use AnyModel::*;
let model = match &req.model {
HttpRequest(m) => {
HttpRequest(app_handle.db().upsert_http_request(m, &UpdateSource::Plugin)?)
}
GrpcRequest(m) => {
GrpcRequest(app_handle.db().upsert_grpc_request(m, &UpdateSource::Plugin)?)
}
WebsocketRequest(m) => WebsocketRequest(
app_handle.db().upsert_websocket_request(m, &UpdateSource::Plugin)?,
),
Folder(m) => Folder(app_handle.db().upsert_folder(m, &UpdateSource::Plugin)?),
Environment(m) => {
Environment(app_handle.db().upsert_environment(m, &UpdateSource::Plugin)?)
}
Workspace(m) => {
Workspace(app_handle.db().upsert_workspace(m, &UpdateSource::Plugin)?)
}
_ => {
return Err(PluginErr("Upsert not supported for this model type".into()).into());
}
};
Ok(Some(InternalEventPayload::UpsertModelResponse(
yaak_plugins::events::UpsertModelResponse { model },
)))
}
InternalEventPayload::DeleteModelRequest(req) => {
let model = match req.model.as_str() {
"http_request" => AnyModel::HttpRequest(
app_handle.db().delete_http_request_by_id(&req.id, &UpdateSource::Plugin)?,
),
"grpc_request" => AnyModel::GrpcRequest(
app_handle.db().delete_grpc_request_by_id(&req.id, &UpdateSource::Plugin)?,
),
"websocket_request" => AnyModel::WebsocketRequest(
app_handle
.db()
.delete_websocket_request_by_id(&req.id, &UpdateSource::Plugin)?,
),
"folder" => AnyModel::Folder(
app_handle.db().delete_folder_by_id(&req.id, &UpdateSource::Plugin)?,
),
"environment" => AnyModel::Environment(
app_handle.db().delete_environment_by_id(&req.id, &UpdateSource::Plugin)?,
),
_ => {
return Err(PluginErr("Delete not supported for this model type".into()).into());
}
};
Ok(Some(InternalEventPayload::DeleteModelResponse(
yaak_plugins::events::DeleteModelResponse { model },
)))
}
InternalEventPayload::GetHttpRequestByIdRequest(req) => {
let http_request = app_handle.db().get_http_request(&req.id).ok();
Ok(Some(InternalEventPayload::GetHttpRequestByIdResponse(GetHttpRequestByIdResponse {
http_request,
})))
}
InternalEventPayload::RenderGrpcRequestRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace =
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
let environment_id = environment_from_window(&window).map(|e| e.id);
let environment_chain = window.db().resolve_environments(
&workspace.id,
req.grpc_request.folder_id.as_deref(),
environment_id.as_deref(),
)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let cb = PluginTemplateCallback::new(
plugin_manager,
encryption_manager,
&plugin_context,
req.purpose,
);
let opt = RenderOptions { error_behavior: RenderErrorBehavior::Throw };
let grpc_request =
render_grpc_request(&req.grpc_request, environment_chain, &cb, &opt).await?;
Ok(Some(InternalEventPayload::RenderGrpcRequestResponse(RenderGrpcRequestResponse {
grpc_request,
})))
}
InternalEventPayload::RenderHttpRequestRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace =
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
let environment_id = environment_from_window(&window).map(|e| e.id);
let environment_chain = window.db().resolve_environments(
&workspace.id,
req.http_request.folder_id.as_deref(),
environment_id.as_deref(),
)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let cb = PluginTemplateCallback::new(
plugin_manager,
encryption_manager,
&plugin_context,
req.purpose,
);
let opt = &RenderOptions { error_behavior: RenderErrorBehavior::Throw };
let http_request =
render_http_request(&req.http_request, environment_chain, &cb, &opt).await?;
Ok(Some(InternalEventPayload::RenderHttpRequestResponse(RenderHttpRequestResponse {
http_request,
})))
}
InternalEventPayload::TemplateRenderRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let workspace =
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
let environment_id = environment_from_window(&window).map(|e| e.id);
let folder_id = if let Some(id) = window.request_id() {
match window.db().get_any_request(&id) {
Ok(AnyRequest::HttpRequest(r)) => r.folder_id,
Ok(AnyRequest::GrpcRequest(r)) => r.folder_id,
Ok(AnyRequest::WebsocketRequest(r)) => r.folder_id,
Err(_) => None,
}
} else {
None
};
let environment_chain = window.db().resolve_environments(
&workspace.id,
folder_id.as_deref(),
environment_id.as_deref(),
)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let cb = PluginTemplateCallback::new(
plugin_manager,
encryption_manager,
&plugin_context,
req.purpose,
);
let opt = RenderOptions { error_behavior: RenderErrorBehavior::Throw };
let data = render_json_value(req.data, environment_chain, &cb, &opt).await?;
Ok(Some(InternalEventPayload::TemplateRenderResponse(TemplateRenderResponse { data })))
}
InternalEventPayload::ErrorResponse(resp) => {
error!("Plugin error: {}: {:?}", resp.error, resp);
let toast_event = plugin_handle.build_event_to_send(
&plugin_context,
&InternalEventPayload::ShowToastRequest(ShowToastRequest {
message: format!(
"Plugin error from {}: {}",
plugin_handle.info().name,
resp.error
),
color: Some(Color::Danger),
timeout: Some(30000),
..Default::default()
}),
None,
);
Box::pin(handle_plugin_event(app_handle, &toast_event, plugin_handle)).await
}
InternalEventPayload::ReloadResponse(req) => {
let plugins = app_handle.db().list_plugins()?;
for plugin in plugins {
if plugin.directory != plugin_handle.dir {
continue;
}
let new_plugin = Plugin {
updated_at: Utc::now().naive_utc(), // TODO: Add reloaded_at field to use instead
..plugin
};
app_handle.db().upsert_plugin(&new_plugin, &UpdateSource::Plugin)?;
}
if !req.silent {
let info = plugin_handle.info();
let toast_event = plugin_handle.build_event_to_send(
&plugin_context,
&InternalEventPayload::ShowToastRequest(ShowToastRequest {
message: format!("Reloaded plugin {}@{}", info.name, info.version),
icon: Some(Icon::Info),
timeout: Some(3000),
..Default::default()
}),
None,
);
Box::pin(handle_plugin_event(app_handle, &toast_event, plugin_handle)).await
} else {
Ok(None)
}
}
InternalEventPayload::SendHttpRequestRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let mut http_request = req.http_request;
let workspace =
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
let cookie_jar = cookie_jar_from_window(&window);
let environment = environment_from_window(&window);
if http_request.workspace_id.is_empty() {
http_request.workspace_id = workspace.id;
}
let http_response = if http_request.id.is_empty() {
HttpResponse::default()
} else {
let blobs = window.blob_manager();
window.db().upsert_http_response(
&HttpResponse {
request_id: http_request.id.clone(),
workspace_id: http_request.workspace_id.clone(),
..Default::default()
},
&UpdateSource::Plugin,
&blobs,
)?
};
let http_response = send_http_request_with_context(
&window,
&http_request,
&http_response,
environment,
cookie_jar,
&mut tokio::sync::watch::channel(false).1, // No-op cancel channel
&plugin_context,
)
.await?;
Ok(Some(InternalEventPayload::SendHttpRequestResponse(SendHttpRequestResponse {
http_response,
})))
}
InternalEventPayload::OpenWindowRequest(req) => {
let (navigation_tx, mut navigation_rx) = tokio::sync::mpsc::channel(128);
let (close_tx, mut close_rx) = tokio::sync::mpsc::channel(128);
let win_config = CreateWindowConfig {
url: &req.url,
label: &req.label,
title: &req.title.clone().unwrap_or_default(),
navigation_tx: Some(navigation_tx),
close_tx: Some(close_tx),
inner_size: req.size.clone().map(|s| (s.width, s.height)),
data_dir_key: req.data_dir_key.clone(),
..Default::default()
};
if let Err(e) = create_window(app_handle, win_config) {
let error_event = plugin_handle.build_event_to_send(
&plugin_context,
&InternalEventPayload::ErrorResponse(ErrorResponse {
error: format!("Failed to create window: {:?}", e),
}),
None,
);
return Box::pin(handle_plugin_event(app_handle, &error_event, plugin_handle))
.await;
}
{
let event_id = event.id.clone();
let plugin_handle = plugin_handle.clone();
let plugin_context = plugin_context.clone();
tauri::async_runtime::spawn(async move {
while let Some(url) = navigation_rx.recv().await {
let url = url.to_string();
let event_to_send = plugin_handle.build_event_to_send(
&plugin_context, // NOTE: Sending existing context on purpose here
&InternalEventPayload::WindowNavigateEvent(WindowNavigateEvent { url }),
Some(event_id.clone()),
);
plugin_handle.send(&event_to_send).await.unwrap();
}
});
}
{
let event_id = event.id.clone();
let plugin_handle = plugin_handle.clone();
let plugin_context = plugin_context.clone();
tauri::async_runtime::spawn(async move {
while let Some(_) = close_rx.recv().await {
let event_to_send = plugin_handle.build_event_to_send(
&plugin_context,
&InternalEventPayload::WindowCloseEvent,
Some(event_id.clone()),
);
plugin_handle.send(&event_to_send).await.unwrap();
}
});
}
Ok(None)
}
InternalEventPayload::CloseWindowRequest(req) => {
if let Some(window) = app_handle.webview_windows().get(&req.label) {
window.close()?;
}
Ok(None)
}
InternalEventPayload::OpenExternalUrlRequest(req) => {
app_handle.opener().open_url(&req.url, None::<&str>)?;
Ok(Some(InternalEventPayload::OpenExternalUrlResponse(EmptyPayload {})))
}
InternalEventPayload::SetKeyValueRequest(req) => {
let name = plugin_handle.info().name;
app_handle.db().set_plugin_key_value(&name, &req.key, &req.value);
Ok(Some(InternalEventPayload::SetKeyValueResponse(SetKeyValueResponse {})))
}
InternalEventPayload::GetKeyValueRequest(req) => {
let name = plugin_handle.info().name;
let value = app_handle.db().get_plugin_key_value(&name, &req.key).map(|v| v.value);
Ok(Some(InternalEventPayload::GetKeyValueResponse(GetKeyValueResponse { value })))
}
InternalEventPayload::DeleteKeyValueRequest(req) => {
let name = plugin_handle.info().name;
let deleted = app_handle.db().delete_plugin_key_value(&name, &req.key)?;
Ok(Some(InternalEventPayload::DeleteKeyValueResponse(DeleteKeyValueResponse {
deleted,
})))
}
InternalEventPayload::ListCookieNamesRequest(_req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let names = match cookie_jar_from_window(&window) {
None => Vec::new(),
Some(j) => j
.cookies
.into_iter()
.filter_map(|c| Cookie::parse(c.raw_cookie).ok().map(|c| c.name().to_string()))
.collect(),
};
Ok(Some(InternalEventPayload::ListCookieNamesResponse(ListCookieNamesResponse {
names,
})))
}
InternalEventPayload::GetCookieValueRequest(req) => {
let window = get_window_from_plugin_context(app_handle, &plugin_context)?;
let value = match cookie_jar_from_window(&window) {
None => None,
Some(j) => j.cookies.into_iter().find_map(|c| match Cookie::parse(c.raw_cookie) {
Ok(c) if c.name().to_string().eq(&req.name) => {
Some(c.value_trimmed().to_string())
}
_ => None,
}),
};
Ok(Some(InternalEventPayload::GetCookieValueResponse(GetCookieValueResponse { value })))
}
InternalEventPayload::WindowInfoRequest(req) => {
let w = app_handle
.get_webview_window(&req.label)
.ok_or(PluginErr(format!("Failed to find window for {}", req.label)))?;
// Actually look up the data so we never return an invalid ID
let environment_id = environment_from_window(&w).map(|m| m.id);
let workspace_id = workspace_from_window(&w).map(|m| m.id);
let request_id =
match app_handle.db().get_any_request(&w.request_id().unwrap_or_default()) {
Ok(AnyRequest::HttpRequest(r)) => Some(r.id),
Ok(AnyRequest::WebsocketRequest(r)) => Some(r.id),
Ok(AnyRequest::GrpcRequest(r)) => Some(r.id),
Err(_) => None,
};
Ok(Some(InternalEventPayload::WindowInfoResponse(WindowInfoResponse {
label: w.label().to_string(),
request_id,
workspace_id,
environment_id,
})))
}
InternalEventPayload::ListWorkspacesRequest(_) => {
let mut workspaces = Vec::new();
for (_, window) in app_handle.webview_windows() {
if let Some(workspace) = workspace_from_window(&window) {
workspaces.push(WorkspaceInfo {
id: workspace.id.clone(),
name: workspace.name.clone(),
label: window.label().to_string(),
});
}
}
Ok(Some(InternalEventPayload::ListWorkspacesResponse(ListWorkspacesResponse {
workspaces,
})))
}
_ => Ok(None),
}
}

View File

@@ -1,359 +0,0 @@
//! Tauri-specific plugin management code.
//!
//! This module contains all Tauri integration for the plugin system:
//! - Plugin initialization and lifecycle management
//! - Tauri commands for plugin search/install/uninstall
//! - Plugin update checking
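//!
//! Registration sketch (an assumption, not taken from the original source): the plugin
//! is expected to be added to the Tauri builder with `.plugin(init())`, and the
//! `cmd_plugins_*` commands exposed through an `invoke_handler` built with
//! `tauri::generate_handler![cmd_plugins_search, cmd_plugins_install, /* etc. */]`.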
use crate::PluginContextExt;
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use log::{error, info, warn};
use serde::Serialize;
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::time::{Duration, Instant};
use tauri::path::BaseDirectory;
use tauri::plugin::{Builder, TauriPlugin};
use tauri::{
AppHandle, Emitter, Manager, RunEvent, Runtime, State, WebviewWindow, WindowEvent, command,
is_dev,
};
use tokio::sync::Mutex;
use ts_rs::TS;
use yaak_models::models::Plugin;
use yaak_models::util::UpdateSource;
use yaak_plugins::api::{
PluginNameVersion, PluginSearchResponse, PluginUpdatesResponse, check_plugin_updates,
search_plugins,
};
use yaak_plugins::events::{Color, Icon, PluginContext, ShowToastRequest};
use yaak_plugins::install::{delete_and_uninstall, download_and_install};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::plugin_meta::get_plugin_meta;
use yaak_tauri_utils::api_client::yaak_api_client;
static EXITING: AtomicBool = AtomicBool::new(false);
// ============================================================================
// Plugin Updater
// ============================================================================
const MAX_UPDATE_CHECK_HOURS: u64 = 12;
pub struct PluginUpdater {
last_check: Option<Instant>,
}
#[derive(Debug, Clone, PartialEq, Serialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct PluginUpdateNotification {
pub update_count: usize,
pub plugins: Vec<PluginUpdateInfo>,
}
#[derive(Debug, Clone, PartialEq, Serialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct PluginUpdateInfo {
pub name: String,
pub current_version: String,
pub latest_version: String,
}
impl PluginUpdater {
pub fn new() -> Self {
Self { last_check: None }
}
pub async fn check_now<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<bool> {
self.last_check = Some(Instant::now());
info!("Checking for plugin updates");
let http_client = yaak_api_client(window.app_handle())?;
let plugins = window.app_handle().db().list_plugins()?;
let updates = check_plugin_updates(&http_client, plugins.clone()).await?;
if updates.plugins.is_empty() {
info!("No plugin updates available");
return Ok(false);
}
// Get current plugin versions to build notification
let mut update_infos = Vec::new();
for update in &updates.plugins {
if let Some(plugin) = plugins.iter().find(|p| {
if let Ok(meta) = get_plugin_meta(&std::path::Path::new(&p.directory)) {
meta.name == update.name
} else {
false
}
}) {
if let Ok(meta) = get_plugin_meta(&std::path::Path::new(&plugin.directory)) {
update_infos.push(PluginUpdateInfo {
name: update.name.clone(),
current_version: meta.version,
latest_version: update.version.clone(),
});
}
}
}
let notification =
PluginUpdateNotification { update_count: update_infos.len(), plugins: update_infos };
info!("Found {} plugin update(s)", notification.update_count);
if let Err(e) = window.emit_to(window.label(), "plugin_updates_available", &notification) {
error!("Failed to emit plugin_updates_available event: {}", e);
}
Ok(true)
}
pub async fn maybe_check<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<bool> {
let update_period_seconds = MAX_UPDATE_CHECK_HOURS * 60 * 60;
if let Some(i) = self.last_check
&& i.elapsed().as_secs() < update_period_seconds
{
return Ok(false);
}
self.check_now(window).await
}
}
// ============================================================================
// Tauri Commands
// ============================================================================
#[command]
pub async fn cmd_plugins_search<R: Runtime>(
app_handle: AppHandle<R>,
query: &str,
) -> Result<PluginSearchResponse> {
let http_client = yaak_api_client(&app_handle)?;
Ok(search_plugins(&http_client, query).await?)
}
#[command]
pub async fn cmd_plugins_install<R: Runtime>(
window: WebviewWindow<R>,
name: &str,
version: Option<String>,
) -> Result<()> {
let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
let http_client = yaak_api_client(window.app_handle())?;
let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
let plugin_context = window.plugin_context();
download_and_install(
plugin_manager,
&query_manager,
&http_client,
&plugin_context,
name,
version,
)
.await?;
Ok(())
}
#[command]
pub async fn cmd_plugins_uninstall<R: Runtime>(
plugin_id: &str,
window: WebviewWindow<R>,
) -> Result<Plugin> {
let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
let plugin_context = window.plugin_context();
Ok(delete_and_uninstall(plugin_manager, &query_manager, &plugin_context, plugin_id).await?)
}
#[command]
pub async fn cmd_plugins_updates<R: Runtime>(
app_handle: AppHandle<R>,
) -> Result<PluginUpdatesResponse> {
let http_client = yaak_api_client(&app_handle)?;
let plugins = app_handle.db().list_plugins()?;
Ok(check_plugin_updates(&http_client, plugins).await?)
}
#[command]
pub async fn cmd_plugins_update_all<R: Runtime>(
window: WebviewWindow<R>,
) -> Result<Vec<PluginNameVersion>> {
let http_client = yaak_api_client(window.app_handle())?;
let plugins = window.db().list_plugins()?;
// Get list of available updates (already filtered to only registry plugins)
let updates = check_plugin_updates(&http_client, plugins).await?;
if updates.plugins.is_empty() {
return Ok(Vec::new());
}
let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
let plugin_context = window.plugin_context();
let mut updated = Vec::new();
for update in updates.plugins {
info!("Updating plugin: {} to version {}", update.name, update.version);
match download_and_install(
plugin_manager.clone(),
&query_manager,
&http_client,
&plugin_context,
&update.name,
Some(update.version.clone()),
)
.await
{
Ok(_) => {
info!("Successfully updated plugin: {}", update.name);
updated.push(update.clone());
}
Err(e) => {
log::error!("Failed to update plugin {}: {:?}", update.name, e);
}
}
}
Ok(updated)
}
// ============================================================================
// Tauri Plugin Initialization
// ============================================================================
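/// Builds the `yaak-plugins` Tauri plugin. During setup it resolves the vendored
/// plugin, node, and runtime paths, blocks on creating the `PluginManager`, seeds
/// bundled plugins into the database, initializes all plugins, and manages a
/// `PluginUpdater`. On exit it terminates the plugin runtime; on window focus it
/// schedules a plugin update check after a short delay.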
pub fn init<R: Runtime>() -> TauriPlugin<R> {
Builder::new("yaak-plugins")
.setup(|app_handle, _| {
// Resolve paths for plugin manager
let vendored_plugin_dir = app_handle
.path()
.resolve("vendored/plugins", BaseDirectory::Resource)
.expect("failed to resolve plugin directory resource");
let installed_plugin_dir = app_handle
.path()
.app_data_dir()
.expect("failed to get app data dir")
.join("installed-plugins");
#[cfg(target_os = "windows")]
let node_bin_name = "yaaknode.exe";
#[cfg(not(target_os = "windows"))]
let node_bin_name = "yaaknode";
let node_bin_path = app_handle
.path()
.resolve(format!("vendored/node/{}", node_bin_name), BaseDirectory::Resource)
.expect("failed to resolve yaaknode binary");
let plugin_runtime_main = app_handle
.path()
.resolve("vendored/plugin-runtime", BaseDirectory::Resource)
.expect("failed to resolve plugin runtime")
.join("index.cjs");
let dev_mode = is_dev();
// Create plugin manager asynchronously
let app_handle_clone = app_handle.clone();
tauri::async_runtime::block_on(async move {
let manager = PluginManager::new(
vendored_plugin_dir,
installed_plugin_dir,
node_bin_path,
plugin_runtime_main,
dev_mode,
)
.await;
// Initialize all plugins after manager is created
let bundled_dirs = manager
.list_bundled_plugin_dirs()
.await
.expect("Failed to list bundled plugins");
// Ensure all bundled plugins make it into the database
let db = app_handle_clone.db();
for dir in &bundled_dirs {
if db.get_plugin_by_directory(dir).is_none() {
db.upsert_plugin(
&Plugin {
directory: dir.clone(),
enabled: true,
url: None,
..Default::default()
},
&UpdateSource::Background,
)
.expect("Failed to upsert bundled plugin");
}
}
// Get all plugins from database and initialize
let plugins = db.list_plugins().expect("Failed to list plugins from database");
drop(db); // Explicitly drop the connection before await
let errors =
manager.initialize_all_plugins(plugins, &PluginContext::new_empty()).await;
// Show toast for any failed plugins
for (plugin_dir, error_msg) in errors {
let plugin_name = plugin_dir.split('/').last().unwrap_or(&plugin_dir);
let toast = ShowToastRequest {
message: format!("Failed to start plugin '{}': {}", plugin_name, error_msg),
color: Some(Color::Danger),
icon: Some(Icon::AlertTriangle),
timeout: Some(10000),
};
if let Err(emit_err) = app_handle_clone.emit("show_toast", toast) {
error!("Failed to emit toast for plugin error: {emit_err:?}");
}
}
app_handle_clone.manage(manager);
});
let plugin_updater = PluginUpdater::new();
app_handle.manage(Mutex::new(plugin_updater));
Ok(())
})
.on_event(|app, e| match e {
RunEvent::ExitRequested { api, .. } => {
if EXITING.swap(true, Ordering::SeqCst) {
return; // Only exit once to prevent infinite recursion
}
api.prevent_exit();
tauri::async_runtime::block_on(async move {
info!("Exiting plugin runtime due to app exit");
let manager: State<PluginManager> = app.state();
manager.terminate().await;
app.exit(0);
});
}
RunEvent::WindowEvent { event: WindowEvent::Focused(true), label, .. } => {
// Check for plugin updates on window focus
let w = app.get_webview_window(&label).unwrap();
let h = app.clone();
tauri::async_runtime::spawn(async move {
tokio::time::sleep(Duration::from_secs(3)).await;
let val: State<'_, Mutex<PluginUpdater>> = h.state();
if let Err(e) = val.lock().await.maybe_check(&w).await {
warn!("Failed to check for plugin updates {e:?}");
}
});
}
_ => {}
})
.build()
}

View File

@@ -1,163 +0,0 @@
use log::info;
use serde_json::Value;
use std::collections::BTreeMap;
use yaak_http::path_placeholders::apply_path_placeholders;
use yaak_models::models::{
Environment, GrpcRequest, HttpRequest, HttpRequestHeader, HttpUrlParameter,
};
use yaak_models::render::make_vars_hashmap;
use yaak_templates::{RenderOptions, TemplateCallback, parse_and_render, render_json_value_raw};
pub async fn render_template<T: TemplateCallback>(
template: &str,
environment_chain: Vec<Environment>,
cb: &T,
opt: &RenderOptions,
) -> yaak_templates::error::Result<String> {
let vars = &make_vars_hashmap(environment_chain);
parse_and_render(template, vars, cb, &opt).await
}
pub async fn render_json_value<T: TemplateCallback>(
value: Value,
environment_chain: Vec<Environment>,
cb: &T,
opt: &RenderOptions,
) -> yaak_templates::error::Result<Value> {
let vars = &make_vars_hashmap(environment_chain);
render_json_value_raw(value, vars, cb, opt).await
}
pub async fn render_grpc_request<T: TemplateCallback>(
r: &GrpcRequest,
environment_chain: Vec<Environment>,
cb: &T,
opt: &RenderOptions,
) -> yaak_templates::error::Result<GrpcRequest> {
let vars = &make_vars_hashmap(environment_chain);
let mut metadata = Vec::new();
for p in r.metadata.clone() {
metadata.push(HttpRequestHeader {
enabled: p.enabled,
name: parse_and_render(p.name.as_str(), vars, cb, &opt).await?,
value: parse_and_render(p.value.as_str(), vars, cb, &opt).await?,
id: p.id,
})
}
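// Render the authentication block. A string `disabled` value is itself rendered as a
// template and treated as disabled when it renders to an empty string; otherwise every
// field other than `disabled` is rendered as a JSON value.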
let authentication = {
let mut disabled = false;
let mut auth = BTreeMap::new();
match r.authentication.get("disabled") {
Some(Value::Bool(true)) => {
disabled = true;
}
Some(Value::String(tmpl)) => {
disabled = parse_and_render(tmpl.as_str(), vars, cb, &opt)
.await
.unwrap_or_default()
.is_empty();
info!(
"Rendering authentication.disabled as a template: {disabled} from \"{tmpl}\""
);
}
_ => {}
}
if disabled {
auth.insert("disabled".to_string(), Value::Bool(true));
} else {
for (k, v) in r.authentication.clone() {
if k == "disabled" {
auth.insert(k, Value::Bool(false));
} else {
auth.insert(k, render_json_value_raw(v, vars, cb, &opt).await?);
}
}
}
auth
};
let url = parse_and_render(r.url.as_str(), vars, cb, &opt).await?;
Ok(GrpcRequest { url, metadata, authentication, ..r.to_owned() })
}
pub async fn render_http_request<T: TemplateCallback>(
r: &HttpRequest,
environment_chain: Vec<Environment>,
cb: &T,
opt: &RenderOptions,
) -> yaak_templates::error::Result<HttpRequest> {
let vars = &make_vars_hashmap(environment_chain);
let mut url_parameters = Vec::new();
for p in r.url_parameters.clone() {
if !p.enabled {
continue;
}
url_parameters.push(HttpUrlParameter {
enabled: p.enabled,
name: parse_and_render(p.name.as_str(), vars, cb, &opt).await?,
value: parse_and_render(p.value.as_str(), vars, cb, &opt).await?,
id: p.id,
})
}
let mut headers = Vec::new();
for p in r.headers.clone() {
if !p.enabled {
continue;
}
headers.push(HttpRequestHeader {
enabled: p.enabled,
name: parse_and_render(p.name.as_str(), vars, cb, &opt).await?,
value: parse_and_render(p.value.as_str(), vars, cb, &opt).await?,
id: p.id,
})
}
let mut body = BTreeMap::new();
for (k, v) in r.body.clone() {
body.insert(k, render_json_value_raw(v, vars, cb, &opt).await?);
}
let authentication = {
let mut disabled = false;
let mut auth = BTreeMap::new();
match r.authentication.get("disabled") {
Some(Value::Bool(true)) => {
disabled = true;
}
Some(Value::String(tmpl)) => {
disabled = parse_and_render(tmpl.as_str(), vars, cb, &opt)
.await
.unwrap_or_default()
.is_empty();
info!(
"Rendering authentication.disabled as a template: {disabled} from \"{tmpl}\""
);
}
_ => {}
}
if disabled {
auth.insert("disabled".to_string(), Value::Bool(true));
} else {
for (k, v) in r.authentication.clone() {
if k == "disabled" {
auth.insert(k, Value::Bool(false));
} else {
auth.insert(k, render_json_value_raw(v, vars, cb, &opt).await?);
}
}
}
auth
};
let url = parse_and_render(r.url.clone().as_str(), vars, cb, &opt).await?;
// This doesn't fit perfectly with the concept of "rendering" but it kind of does
let (url, url_parameters) = apply_path_placeholders(&url, &url_parameters);
Ok(HttpRequest { url, url_parameters, headers, body, authentication, ..r.to_owned() })
}

View File

@@ -1,104 +0,0 @@
//! Tauri-specific extensions for yaak-sync.
//!
//! This module provides the Tauri commands for sync functionality.
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use chrono::Utc;
use log::warn;
use serde::{Deserialize, Serialize};
use std::path::Path;
use tauri::ipc::Channel;
use tauri::{AppHandle, Listener, Runtime, command};
use tokio::sync::watch;
use ts_rs::TS;
use yaak_sync::error::Error::InvalidSyncDirectory;
use yaak_sync::sync::{
FsCandidate, SyncOp, apply_sync_ops, apply_sync_state_ops, compute_sync_ops, get_db_candidates,
get_fs_candidates,
};
use yaak_sync::watch::{WatchEvent, watch_directory};
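/// Computes the sync operations needed to reconcile the workspace's database models
/// with the files in `sync_dir`, without applying them.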
#[command]
pub(crate) async fn cmd_sync_calculate<R: Runtime>(
app_handle: AppHandle<R>,
workspace_id: &str,
sync_dir: &Path,
) -> Result<Vec<SyncOp>> {
if !sync_dir.exists() {
return Err(InvalidSyncDirectory(sync_dir.to_string_lossy().to_string()).into());
}
let db = app_handle.db();
let version = app_handle.package_info().version.to_string();
let db_candidates = get_db_candidates(&db, &version, workspace_id, sync_dir)?;
let fs_candidates = get_fs_candidates(sync_dir)?
.into_iter()
// Only keep items in the same workspace
.filter(|fs| fs.model.workspace_id() == workspace_id)
.collect::<Vec<FsCandidate>>();
Ok(compute_sync_ops(db_candidates, fs_candidates))
}
#[command]
pub(crate) async fn cmd_sync_calculate_fs(dir: &Path) -> Result<Vec<SyncOp>> {
let db_candidates = Vec::new();
let fs_candidates = get_fs_candidates(dir)?;
Ok(compute_sync_ops(db_candidates, fs_candidates))
}
#[command]
pub(crate) async fn cmd_sync_apply<R: Runtime>(
app_handle: AppHandle<R>,
sync_ops: Vec<SyncOp>,
sync_dir: &Path,
workspace_id: &str,
) -> Result<()> {
let db = app_handle.db();
let sync_state_ops = apply_sync_ops(&db, workspace_id, sync_dir, sync_ops)?;
apply_sync_state_ops(&db, workspace_id, sync_dir, sync_state_ops)?;
Ok(())
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub(crate) struct WatchResult {
unlisten_event: String,
}
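/// Starts watching `sync_dir` for filesystem changes and forwards each `WatchEvent`
/// to the frontend over `channel`. Returns the event name the frontend can emit to
/// cancel the watcher.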
#[command]
pub(crate) async fn cmd_sync_watch<R: Runtime>(
app_handle: AppHandle<R>,
sync_dir: &Path,
workspace_id: &str,
channel: Channel<WatchEvent>,
) -> Result<WatchResult> {
let (cancel_tx, cancel_rx) = watch::channel(());
// Create a callback that forwards events to the Tauri channel
let callback = move |event: WatchEvent| {
if let Err(e) = channel.send(event) {
warn!("Failed to send watch event: {:?}", e);
}
};
watch_directory(&sync_dir, callback, cancel_rx).await?;
let app_handle_inner = app_handle.clone();
let unlisten_event =
format!("watch-unlisten-{}-{}", workspace_id, Utc::now().timestamp_millis());
// TODO: Figure out a way to unlisten when the client window refreshes or closes. Options include
// a heartbeat mechanism, or ensuring only a single subscription per workspace (so at least
// this won't create `n` subscriptions). We could also maintain a single global fs watcher that
// new watches are added to here.
app_handle.listen_any(unlisten_event.clone(), move |event| {
app_handle_inner.unlisten(event.id());
if let Err(e) = cancel_tx.send(()) {
warn!("Failed to send cancel signal to watcher {e:?}");
}
});
Ok(WatchResult { unlisten_event })
}

View File

@@ -1,380 +0,0 @@
use std::fmt::{Display, Formatter};
use std::path::PathBuf;
use std::time::{Duration, Instant};
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use log::{debug, error, info, warn};
use serde::{Deserialize, Serialize};
use tauri::{Emitter, Listener, Manager, Runtime, WebviewWindow};
use tauri_plugin_dialog::{DialogExt, MessageDialogButtons};
use tauri_plugin_updater::{Update, UpdaterExt};
use tokio::task::block_in_place;
use tokio::time::sleep;
use ts_rs::TS;
use yaak_models::util::generate_id;
use yaak_plugins::manager::PluginManager;
use crate::error::Error::GenericError;
use crate::is_dev;
const MAX_UPDATE_CHECK_HOURS_STABLE: u64 = 12;
const MAX_UPDATE_CHECK_HOURS_BETA: u64 = 3;
const MAX_UPDATE_CHECK_HOURS_ALPHA: u64 = 1;
// Create updater struct
pub struct YaakUpdater {
last_check: Option<Instant>,
}
pub enum UpdateMode {
Stable,
Beta,
Alpha,
}
impl Display for UpdateMode {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
let s = match self {
UpdateMode::Stable => "stable",
UpdateMode::Beta => "beta",
UpdateMode::Alpha => "alpha",
};
write!(f, "{}", s)
}
}
impl UpdateMode {
pub fn new(mode: &str) -> UpdateMode {
match mode {
"beta" => UpdateMode::Beta,
"alpha" => UpdateMode::Alpha,
_ => UpdateMode::Stable,
}
}
}
#[derive(PartialEq)]
pub enum UpdateTrigger {
Background,
User,
}
impl YaakUpdater {
pub fn new() -> Self {
Self { last_check: None }
}
pub async fn check_now<R: Runtime>(
&mut self,
window: &WebviewWindow<R>,
mode: UpdateMode,
auto_download: bool,
update_trigger: UpdateTrigger,
) -> Result<bool> {
// Only AppImage builds support updates on Linux, so skip the check for other packaging formats
#[cfg(target_os = "linux")]
{
if std::env::var("APPIMAGE").is_err() {
return Ok(false);
}
}
let settings = window.db().get_settings();
let update_key = format!("{:x}", md5::compute(settings.id));
self.last_check = Some(Instant::now());
info!("Checking for updates mode={} autodl={}", mode, auto_download);
let w = window.clone();
let update_check_result = w
.updater_builder()
.on_before_exit(move || {
// Kill plugin manager before exit or NSIS installer will fail to replace sidecar
// while it's running.
// NOTE: This is only called on Windows
let w = w.clone();
block_in_place(|| {
tauri::async_runtime::block_on(async move {
info!("Shutting down plugin manager before update");
let plugin_manager = w.state::<PluginManager>();
plugin_manager.terminate().await;
});
});
})
.header("X-Update-Mode", mode.to_string())?
.header("X-Update-Key", update_key)?
.header(
"X-Update-Trigger",
match update_trigger {
UpdateTrigger::Background => "background",
UpdateTrigger::User => "user",
},
)?
.build()?
.check()
.await;
let result = match update_check_result? {
None => false,
Some(update) => {
let w = window.clone();
tauri::async_runtime::spawn(async move {
// Force native updater if specified (useful if a release broke the UI)
let native_install_mode =
update.raw_json.get("install_mode").and_then(|v| v.as_str()) == Some("native");
if native_install_mode {
start_native_update(&w, &update).await;
return;
}
// If it's a background update, try downloading it first
if update_trigger == UpdateTrigger::Background && auto_download {
info!("Downloading update {} in background", update.version);
if let Err(e) = download_update_idempotent(&w, &update).await {
error!("Failed to download {}: {}", update.version, e);
}
}
match start_integrated_update(&w, &update).await {
Ok(UpdateResponseAction::Skip) => {
info!("Confirmed {}: skipped", update.version);
}
Ok(UpdateResponseAction::Install) => {
info!("Confirmed {}: install", update.version);
if let Err(e) = install_update_maybe_download(&w, &update).await {
error!("Failed to install: {e}");
return;
};
info!("Installed {}", update.version);
finish_integrated_update(&w, &update).await;
}
Err(e) => {
warn!("Failed to notify frontend, falling back: {e}",);
start_native_update(&w, &update).await;
}
};
});
true
}
};
Ok(result)
}
pub async fn maybe_check<R: Runtime>(
&mut self,
window: &WebviewWindow<R>,
auto_download: bool,
mode: UpdateMode,
) -> Result<bool> {
let update_period_seconds = match mode {
UpdateMode::Stable => MAX_UPDATE_CHECK_HOURS_STABLE,
UpdateMode::Beta => MAX_UPDATE_CHECK_HOURS_BETA,
UpdateMode::Alpha => MAX_UPDATE_CHECK_HOURS_ALPHA,
} * (60 * 60);
if let Some(i) = self.last_check
&& i.elapsed().as_secs() < update_period_seconds
{
return Ok(false);
}
// Don't check in development (a check can still be triggered manually by the user)
if is_dev() {
return Ok(false);
}
self.check_now(window, mode, auto_download, UpdateTrigger::Background).await
}
}
#[derive(Debug, Clone, PartialEq, Serialize, Default, TS)]
#[serde(default, rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
struct UpdateInfo {
reply_event_id: String,
version: String,
downloaded: bool,
}
#[derive(Debug, Clone, PartialEq, Deserialize, TS)]
#[serde(rename_all = "camelCase", tag = "type")]
#[ts(export, export_to = "index.ts")]
enum UpdateResponse {
Ack,
Action { action: UpdateResponseAction },
}
#[derive(Debug, Clone, PartialEq, Deserialize, TS)]
#[serde(rename_all = "snake_case")]
#[ts(export, export_to = "index.ts")]
enum UpdateResponseAction {
Install,
Skip,
}
async fn finish_integrated_update<R: Runtime>(window: &WebviewWindow<R>, update: &Update) {
if let Err(e) = window.emit_to(window.label(), "update_installed", update.version.to_string()) {
warn!("Failed to notify frontend of update install: {}", e);
}
}
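/// Runs the integrated (frontend-driven) update flow: emits `update_available` with a
/// one-off reply event id, waits a few seconds for the frontend to ack, then waits
/// indefinitely for the user's Install/Skip decision. Returns an error (so the caller
/// can fall back to the native dialog flow) if the frontend never acks.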
async fn start_integrated_update<R: Runtime>(
window: &WebviewWindow<R>,
update: &Update,
) -> Result<UpdateResponseAction> {
let download_path = ensure_download_path(window, update)?;
debug!("Download path: {}", download_path.display());
let downloaded = download_path.exists();
let ack_wait = Duration::from_secs(3);
let reply_id = generate_id();
// 1) Start listening BEFORE emitting to avoid missing a fast reply
let (tx, mut rx) = tokio::sync::mpsc::unbounded_channel::<UpdateResponse>();
let w_for_listener = window.clone();
let event_id = w_for_listener.listen(reply_id.clone(), move |ev| {
match serde_json::from_str::<UpdateResponse>(ev.payload()) {
Ok(UpdateResponse::Ack) => {
let _ = tx.send(UpdateResponse::Ack);
}
Ok(UpdateResponse::Action { action }) => {
let _ = tx.send(UpdateResponse::Action { action });
}
Err(e) => {
warn!("Failed to parse update reply from frontend: {e:?}");
}
}
});
// Make sure we always unlisten
struct Unlisten<'a, R: Runtime> {
win: &'a WebviewWindow<R>,
id: tauri::EventId,
}
impl<'a, R: Runtime> Drop for Unlisten<'a, R> {
fn drop(&mut self) {
self.win.unlisten(self.id);
}
}
let _guard = Unlisten { win: window, id: event_id };
// 2) Emit the event now that listener is in place
let info =
UpdateInfo { version: update.version.to_string(), downloaded, reply_event_id: reply_id };
window
.emit_to(window.label(), "update_available", &info)
.map_err(|e| GenericError(format!("Failed to emit update_available: {e}")))?;
// 3) Two-stage timeout: first wait for ack, then wait for final action
// --- Phase 1: wait for ACK with timeout ---
let ack_timer = sleep(ack_wait);
tokio::pin!(ack_timer);
loop {
tokio::select! {
msg = rx.recv() => match msg {
Some(UpdateResponse::Ack) => break, // proceed to Phase 2
Some(UpdateResponse::Action{action}) => return Ok(action), // user was fast
None => return Err(GenericError("frontend channel closed before ack".into())),
},
_ = &mut ack_timer => {
return Err(GenericError("timed out waiting for frontend ack".into()));
}
}
}
// --- Phase 2: wait forever for final action ---
loop {
match rx.recv().await {
Some(UpdateResponse::Action { action }) => return Ok(action),
Some(UpdateResponse::Ack) => { /* ignore extra acks */ }
None => return Err(GenericError("frontend channel closed before action".into())),
}
}
}
async fn start_native_update<R: Runtime>(window: &WebviewWindow<R>, update: &Update) {
// If the frontend doesn't respond, fall back to native dialogs
let confirmed = window
.dialog()
.message(format!(
"{} is available. Would you like to download and install it now?",
update.version
))
.buttons(MessageDialogButtons::OkCancelCustom("Download".to_string(), "Later".to_string()))
.title("Update Available")
.blocking_show();
if !confirmed {
return;
}
match update.download_and_install(|_, _| {}, || {}).await {
Ok(()) => {
if window
.dialog()
.message("Would you like to restart the app?")
.title("Update Installed")
.buttons(MessageDialogButtons::OkCancelCustom(
"Restart".to_string(),
"Later".to_string(),
))
.blocking_show()
{
window.app_handle().request_restart();
}
}
Err(e) => {
window.dialog().message(format!("The update failed to install: {}", e));
}
}
}
pub async fn download_update_idempotent<R: Runtime>(
window: &WebviewWindow<R>,
update: &Update,
) -> Result<PathBuf> {
let dl_path = ensure_download_path(window, update)?;
if dl_path.exists() {
info!("{} already downloaded to {}", update.version, dl_path.display());
return Ok(dl_path);
}
info!("{} downloading: {}", update.version, dl_path.display());
let dl_bytes = update.download(|_, _| {}, || {}).await?;
std::fs::write(&dl_path, dl_bytes)
.map_err(|e| GenericError(format!("Failed to write update: {e}")))?;
info!("{} downloaded", update.version);
Ok(dl_path)
}
pub async fn install_update_maybe_download<R: Runtime>(
window: &WebviewWindow<R>,
update: &Update,
) -> Result<()> {
let dl_path = download_update_idempotent(window, update).await?;
let update_bytes = std::fs::read(&dl_path)?;
update.install(update_bytes.as_slice())?;
Ok(())
}
pub fn ensure_download_path<R: Runtime>(
window: &WebviewWindow<R>,
update: &Update,
) -> Result<PathBuf> {
// Ensure dir exists
let base_dir = window.path().app_cache_dir()?.join("updates");
std::fs::create_dir_all(&base_dir)?;
// Generate name based on signature
let sig_digest = md5::compute(&update.signature);
let name = format!("yaak-{}-{:x}", update.version, sig_digest);
let dl_path = base_dir.join(name);
Ok(dl_path)
}

View File

@@ -1,134 +0,0 @@
use crate::PluginContextExt;
use crate::error::Result;
use crate::import::import_data;
use crate::models_ext::QueryManagerExt;
use log::{info, warn};
use std::collections::HashMap;
use std::fs;
use std::sync::Arc;
use tauri::{AppHandle, Emitter, Manager, Runtime, Url};
use tauri_plugin_dialog::{DialogExt, MessageDialogButtons, MessageDialogKind};
use yaak_models::util::generate_id;
use yaak_plugins::events::{Color, ShowToastRequest};
use yaak_plugins::install::download_and_install;
use yaak_plugins::manager::PluginManager;
use yaak_tauri_utils::api_client::yaak_api_client;
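/// Handles app deep links of the form `<scheme>://<command>?<query>`. Supported
/// commands: `install-plugin` (with `name` and optional `version`) and `import-data`
/// (with `path` or `url`, plus an optional display `name`).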
pub(crate) async fn handle_deep_link<R: Runtime>(
app_handle: &AppHandle<R>,
url: &Url,
) -> Result<()> {
let command = url.domain().unwrap_or_default();
info!("Yaak URI scheme invoked {}?{}", command, url.query().unwrap_or_default());
let query_map: HashMap<String, String> = url.query_pairs().into_owned().collect();
let windows = app_handle.webview_windows();
let (_, window) = windows.iter().next().unwrap();
match command {
"install-plugin" => {
let name = query_map.get("name").unwrap();
let version = query_map.get("version").cloned();
_ = window.set_focus();
let confirmed_install = app_handle
.dialog()
.message(format!("Install plugin {name} {version:?}?"))
.kind(MessageDialogKind::Info)
.buttons(MessageDialogButtons::OkCancelCustom(
"Install".to_string(),
"Cancel".to_string(),
))
.blocking_show();
if !confirmed_install {
// Cancelled installation
return Ok(());
}
let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
let query_manager = app_handle.db_manager();
let http_client = yaak_api_client(app_handle)?;
let plugin_context = window.plugin_context();
let pv = download_and_install(
plugin_manager,
&query_manager,
&http_client,
&plugin_context,
name,
version,
)
.await?;
app_handle.emit(
"show_toast",
ShowToastRequest {
message: format!("Installed {name}@{}", pv.version),
color: Some(Color::Success),
icon: None,
timeout: Some(5000),
},
)?;
}
"import-data" => {
let mut file_path = query_map.get("path").map(|s| s.to_owned());
let name = query_map.get("name").map(|s| s.to_owned()).unwrap_or("data".to_string());
_ = window.set_focus();
if let Some(file_url) = query_map.get("url") {
let confirmed_import = app_handle
.dialog()
.message(format!("Import {name} from {file_url}?"))
.kind(MessageDialogKind::Info)
.buttons(MessageDialogButtons::OkCancelCustom(
"Import".to_string(),
"Cancel".to_string(),
))
.blocking_show();
if !confirmed_import {
return Ok(());
}
let resp = yaak_api_client(app_handle)?.get(file_url).send().await?;
let json = resp.bytes().await?;
let p = app_handle
.path()
.temp_dir()?
.join(format!("import-{}", generate_id()))
.to_string_lossy()
.to_string();
fs::write(&p, json)?;
file_path = Some(p);
}
let file_path = match file_path {
Some(p) => p,
None => {
app_handle.emit(
"show_toast",
ShowToastRequest {
message: "Failed to import data".to_string(),
color: Some(Color::Danger),
icon: None,
timeout: None,
},
)?;
return Ok(());
}
};
let results = import_data(window, &file_path).await?;
window.emit(
"show_toast",
ShowToastRequest {
message: format!("Imported data for {} workspaces", results.workspaces.len()),
color: Some(Color::Success),
icon: None,
timeout: Some(5000),
},
)?;
}
_ => {
warn!("Unknown deep link command: {command}");
}
}
Ok(())
}

View File

@@ -1,289 +0,0 @@
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use crate::window_menu::app_menu;
use log::{info, warn};
use rand::random;
use tauri::{
AppHandle, Emitter, LogicalSize, Manager, PhysicalSize, Runtime, WebviewUrl, WebviewWindow,
WindowEvent,
};
use tauri_plugin_opener::OpenerExt;
use tokio::sync::mpsc;
const DEFAULT_WINDOW_WIDTH: f64 = 1100.0;
const DEFAULT_WINDOW_HEIGHT: f64 = 600.0;
const MIN_WINDOW_WIDTH: f64 = 300.0;
const MIN_WINDOW_HEIGHT: f64 = 300.0;
pub(crate) const MAIN_WINDOW_PREFIX: &str = "main_";
const OTHER_WINDOW_PREFIX: &str = "other_";
#[derive(Default, Debug)]
pub(crate) struct CreateWindowConfig<'s> {
pub url: &'s str,
pub label: &'s str,
pub title: &'s str,
pub inner_size: Option<(f64, f64)>,
pub position: Option<(f64, f64)>,
pub navigation_tx: Option<mpsc::Sender<String>>,
pub close_tx: Option<mpsc::Sender<()>>,
pub data_dir_key: Option<String>,
pub hide_titlebar: bool,
}
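/// Creates a webview window from `config`, wiring up optional navigation/close
/// channels, per-key data directories, and the shared menu-event handler. If a window
/// with the same label already exists, it is focused and returned instead.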
pub(crate) fn create_window<R: Runtime>(
handle: &AppHandle<R>,
config: CreateWindowConfig,
) -> Result<WebviewWindow<R>> {
#[allow(unused_variables)]
let menu = app_menu(handle)?;
// Setting the app menu causes the window to not be clickable in AppImage builds, so skip it on Linux
#[cfg(not(target_os = "linux"))]
handle.set_menu(menu).expect("Failed to set app menu");
info!("Create new window label={}", config.label);
let mut win_builder =
tauri::WebviewWindowBuilder::new(handle, config.label, WebviewUrl::App(config.url.into()))
.title(config.title)
.resizable(true)
.visible(false) // To prevent theme flashing, the frontend code calls show() immediately after configuring the theme
.fullscreen(false)
.min_inner_size(MIN_WINDOW_WIDTH, MIN_WINDOW_HEIGHT);
if let Some(key) = config.data_dir_key {
#[cfg(not(target_os = "macos"))]
{
use std::fs;
let safe_key = format!("{:x}", md5::compute(key.as_bytes()));
let dir = handle.path().app_data_dir()?.join("window-sessions").join(safe_key);
fs::create_dir_all(&dir)?;
win_builder = win_builder.data_directory(dir);
}
// macOS doesn't support `data_directory()` so must use this fn instead
#[cfg(target_os = "macos")]
{
let hash = md5::compute(key.as_bytes());
let mut id = [0u8; 16];
id.copy_from_slice(&hash[..16]); // Take the first 16 bytes of the hash
win_builder = win_builder.data_store_identifier(id);
}
}
if let Some((w, h)) = config.inner_size {
win_builder = win_builder.inner_size(w, h);
} else {
win_builder = win_builder.inner_size(600.0, 600.0);
}
if let Some((x, y)) = config.position {
win_builder = win_builder.position(x, y);
} else {
win_builder = win_builder.center();
}
if let Some(tx) = config.navigation_tx {
win_builder = win_builder.on_navigation(move |url| {
let url = url.to_string();
let tx = tx.clone();
tauri::async_runtime::block_on(async move {
tx.send(url).await.unwrap();
});
true
});
}
let settings = handle.db().get_settings();
if config.hide_titlebar && !settings.use_native_titlebar {
#[cfg(target_os = "macos")]
{
use tauri::TitleBarStyle;
win_builder = win_builder.hidden_title(true).title_bar_style(TitleBarStyle::Overlay);
}
#[cfg(not(target_os = "macos"))]
{
win_builder = win_builder.decorations(false);
}
}
if let Some(w) = handle.webview_windows().get(config.label) {
info!("Webview with label {} already exists. Focusing existing", config.label);
w.set_focus()?;
return Ok(w.to_owned());
}
let win = win_builder.build()?;
if let Some(tx) = config.close_tx {
win.on_window_event(move |event| match event {
WindowEvent::CloseRequested { .. } => {
let tx = tx.clone();
tauri::async_runtime::spawn(async move {
tx.send(()).await.unwrap();
});
}
_ => {}
});
}
let webview_window = win.clone();
win.on_menu_event(move |w, event| {
if !w.is_focused().unwrap() {
return;
}
let event_id = event.id().0.as_str();
match event_id {
"hacked_quit" => {
// Cmd+Q on macOS doesn't trigger `CloseRequested` so we use a custom Quit menu
// and trigger close() for each window.
w.webview_windows().iter().for_each(|(_, w)| {
info!("Closing window {}", w.label());
let _ = w.close();
});
}
"close" => w.close().unwrap(),
"zoom_reset" => w.emit("zoom_reset", true).unwrap(),
"zoom_in" => w.emit("zoom_in", true).unwrap(),
"zoom_out" => w.emit("zoom_out", true).unwrap(),
"settings" => w.emit("settings", true).unwrap(),
"open_feedback" => {
if let Err(e) =
w.app_handle().opener().open_url("https://yaak.app/feedback", None::<&str>)
{
warn!("Failed to open feedback {e:?}")
}
}
// Commands for development
"dev.reset_size" => webview_window
.set_size(LogicalSize::new(DEFAULT_WINDOW_WIDTH, DEFAULT_WINDOW_HEIGHT))
.unwrap(),
"dev.reset_size_16x9" => {
let width = webview_window.outer_size().unwrap().width;
let height = width * 9 / 16;
webview_window.set_size(PhysicalSize::new(width, height)).unwrap()
}
"dev.reset_size_16x10" => {
let width = webview_window.outer_size().unwrap().width;
let height = width * 10 / 16;
webview_window.set_size(PhysicalSize::new(width, height)).unwrap()
}
"dev.refresh" => webview_window.eval("location.reload()").unwrap(),
"dev.generate_theme_css" => {
w.emit("generate_theme_css", true).unwrap();
}
"dev.toggle_devtools" => {
if webview_window.is_devtools_open() {
webview_window.close_devtools();
} else {
webview_window.open_devtools();
}
}
_ => {}
}
});
Ok(win)
}
pub(crate) fn create_main_window(handle: &AppHandle, url: &str) -> Result<WebviewWindow> {
let mut counter = 0;
let label = loop {
let label = format!("{MAIN_WINDOW_PREFIX}{counter}");
match handle.webview_windows().get(label.as_str()) {
None => break label,
Some(_) => counter += 1,
}
};
let config = CreateWindowConfig {
url,
label: label.as_str(),
title: "Yaak",
inner_size: Some((DEFAULT_WINDOW_WIDTH, DEFAULT_WINDOW_HEIGHT)),
position: Some((
// Offset by random amount so it's easier to differentiate
100.0 + random::<f64>() * 20.0,
100.0 + random::<f64>() * 20.0,
)),
hide_titlebar: true,
..Default::default()
};
create_window(handle, config)
}
pub(crate) fn create_child_window(
parent_window: &WebviewWindow,
url: &str,
label: &str,
title: &str,
inner_size: (f64, f64),
) -> Result<WebviewWindow> {
let app_handle = parent_window.app_handle();
let label = format!("{OTHER_WINDOW_PREFIX}_{label}");
let scale_factor = parent_window.scale_factor()?;
let current_pos = parent_window.inner_position()?.to_logical::<f64>(scale_factor);
let current_size = parent_window.inner_size()?.to_logical::<f64>(scale_factor);
// Position the new window in the middle of the parent
let position = (
current_pos.x + current_size.width / 2.0 - inner_size.0 / 2.0,
current_pos.y + current_size.height / 2.0 - inner_size.1 / 2.0,
);
let config = CreateWindowConfig {
label: label.as_str(),
title,
url,
inner_size: Some(inner_size),
position: Some(position),
hide_titlebar: true,
..Default::default()
};
let child_window = create_window(&app_handle, config)?;
// NOTE: These listeners will remain active even when the windows close. Unfortunately,
// there's no way to unlisten to events for now, so we just have to be defensive.
{
let parent_window = parent_window.clone();
let child_window = child_window.clone();
child_window.clone().on_window_event(move |e| match e {
// When the new window is destroyed, bring the other up behind it
WindowEvent::Destroyed => {
if let Some(w) = parent_window.get_webview_window(child_window.label()) {
w.set_focus().unwrap();
}
}
_ => {}
});
}
{
let parent_window = parent_window.clone();
let child_window = child_window.clone();
parent_window.clone().on_window_event(move |e| match e {
// When the parent window is closed, close the child
WindowEvent::CloseRequested { .. } => child_window.close().unwrap(),
// When the parent window is focused, bring the child above
WindowEvent::Focused(focus) => {
if *focus {
if let Some(w) = parent_window.get_webview_window(child_window.label()) {
w.set_focus().unwrap();
};
}
}
_ => {}
});
}
Ok(child_window)
}
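
For orientation, a hypothetical call site for `create_child_window` might look like the sketch below; the route, label, title, and size are illustrative values, not ones taken from the app.

```rust
// Hypothetical usage sketch only (route, label, title, and size are made up).
fn open_settings_window(parent: &WebviewWindow) -> Result<WebviewWindow> {
    // The label becomes "{OTHER_WINDOW_PREFIX}_settings"; the window is centered
    // over the parent and is closed automatically when the parent closes.
    create_child_window(parent, "/settings", "settings", "Settings", (400.0, 600.0))
}
```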


@@ -1,447 +0,0 @@
//! WebSocket Tauri command wrappers
//! These wrap the core yaak-ws functionality for Tauri IPC.
use crate::PluginContextExt;
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use http::HeaderMap;
use log::{debug, info, warn};
use std::str::FromStr;
use std::sync::Arc;
use tauri::http::HeaderValue;
use tauri::{AppHandle, Manager, Runtime, State, WebviewWindow, command};
use tokio::sync::{Mutex, mpsc};
use tokio_tungstenite::tungstenite::Message;
use url::Url;
use yaak_crypto::manager::EncryptionManager;
use yaak_http::cookies::CookieStore;
use yaak_http::path_placeholders::apply_path_placeholders;
use yaak_models::models::{
HttpResponseHeader, WebsocketConnection, WebsocketConnectionState, WebsocketEvent,
WebsocketEventType, WebsocketRequest,
};
use yaak_models::util::UpdateSource;
use yaak_plugins::events::{CallHttpAuthenticationRequest, HttpHeader, RenderPurpose};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::template_callback::PluginTemplateCallback;
use yaak_templates::{RenderErrorBehavior, RenderOptions};
use yaak_tls::find_client_certificate;
use yaak_ws::{WebsocketManager, render_websocket_request};
#[command]
pub async fn cmd_ws_delete_connections<R: Runtime>(
request_id: &str,
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
) -> Result<()> {
Ok(app_handle.db().delete_all_websocket_connections_for_request(
request_id,
&UpdateSource::from_window_label(window.label()),
)?)
}
#[command]
pub async fn cmd_ws_send<R: Runtime>(
connection_id: &str,
environment_id: Option<&str>,
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
ws_manager: State<'_, Mutex<WebsocketManager>>,
) -> Result<WebsocketConnection> {
let connection = app_handle.db().get_websocket_connection(connection_id)?;
let unrendered_request = app_handle.db().get_websocket_request(&connection.request_id)?;
let environment_chain = app_handle.db().resolve_environments(
&unrendered_request.workspace_id,
unrendered_request.folder_id.as_deref(),
environment_id,
)?;
let (resolved_request, _auth_context_id) =
resolve_websocket_request(&window, &unrendered_request)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let request = render_websocket_request(
&resolved_request,
environment_chain,
&PluginTemplateCallback::new(
plugin_manager,
encryption_manager,
&window.plugin_context(),
RenderPurpose::Send,
),
&RenderOptions { error_behavior: RenderErrorBehavior::Throw },
)
.await?;
let mut ws_manager = ws_manager.lock().await;
ws_manager.send(&connection.id, Message::Text(request.message.clone().into())).await?;
app_handle.db().upsert_websocket_event(
&WebsocketEvent {
connection_id: connection.id.clone(),
request_id: request.id.clone(),
workspace_id: connection.workspace_id.clone(),
is_server: false,
message_type: WebsocketEventType::Text,
message: request.message.into(),
..Default::default()
},
&UpdateSource::from_window_label(window.label()),
)?;
Ok(connection)
}
#[command]
pub async fn cmd_ws_close<R: Runtime>(
connection_id: &str,
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
ws_manager: State<'_, Mutex<WebsocketManager>>,
) -> Result<WebsocketConnection> {
let connection = {
let db = app_handle.db();
let connection = db.get_websocket_connection(connection_id)?;
db.upsert_websocket_connection(
&WebsocketConnection { state: WebsocketConnectionState::Closing, ..connection },
&UpdateSource::from_window_label(window.label()),
)?
};
let mut ws_manager = ws_manager.lock().await;
if let Err(e) = ws_manager.close(&connection.id).await {
warn!("Failed to close WebSocket connection: {e:?}");
};
Ok(connection)
}
#[command]
pub async fn cmd_ws_connect<R: Runtime>(
request_id: &str,
environment_id: Option<&str>,
cookie_jar_id: Option<&str>,
app_handle: AppHandle<R>,
window: WebviewWindow<R>,
_plugin_manager: State<'_, PluginManager>,
ws_manager: State<'_, Mutex<WebsocketManager>>,
) -> Result<WebsocketConnection> {
let unrendered_request = app_handle.db().get_websocket_request(request_id)?;
let environment_chain = app_handle.db().resolve_environments(
&unrendered_request.workspace_id,
unrendered_request.folder_id.as_deref(),
environment_id,
)?;
let workspace = app_handle.db().get_workspace(&unrendered_request.workspace_id)?;
let settings = app_handle.db().get_settings();
let (resolved_request, auth_context_id) =
resolve_websocket_request(&window, &unrendered_request)?;
let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
let request = render_websocket_request(
&resolved_request,
environment_chain,
&PluginTemplateCallback::new(
plugin_manager.clone(),
encryption_manager.clone(),
&window.plugin_context(),
RenderPurpose::Send,
),
&RenderOptions { error_behavior: RenderErrorBehavior::Throw },
)
.await?;
let connection = app_handle.db().upsert_websocket_connection(
&WebsocketConnection {
workspace_id: request.workspace_id.clone(),
request_id: request_id.to_string(),
..Default::default()
},
&UpdateSource::from_window_label(window.label()),
)?;
let (mut url, url_parameters) = apply_path_placeholders(&request.url, &request.url_parameters);
if !url.starts_with("ws://") && !url.starts_with("wss://") {
url.insert_str(0, "ws://");
}
// Add URL parameters to URL
let mut url = match Url::parse(&url) {
Ok(url) => url,
Err(e) => {
return Ok(app_handle.db().upsert_websocket_connection(
&WebsocketConnection {
error: Some(format!("Failed to parse URL {}", e.to_string())),
state: WebsocketConnectionState::Closed,
..connection
},
&UpdateSource::from_window_label(window.label()),
)?);
}
};
let mut headers = HeaderMap::new();
for h in request.headers.clone() {
if h.name.is_empty() && h.value.is_empty() {
continue;
}
if !h.enabled {
continue;
}
headers.insert(
http::HeaderName::from_str(&h.name).unwrap(),
HeaderValue::from_str(&h.value).unwrap(),
);
}
match request.authentication_type {
None => {
// No authentication found. Not even inherited
}
Some(authentication_type) if authentication_type == "none" => {
// Explicitly no authentication
}
Some(authentication_type) => {
let auth = request.authentication.clone();
let plugin_req = CallHttpAuthenticationRequest {
context_id: format!("{:x}", md5::compute(auth_context_id)),
values: serde_json::from_value(serde_json::to_value(&auth).unwrap()).unwrap(),
method: "POST".to_string(),
url: request.url.clone(),
headers: request
.headers
.clone()
.into_iter()
.map(|h| HttpHeader { name: h.name, value: h.value })
.collect(),
};
let plugin_result = plugin_manager
.call_http_authentication(
&window.plugin_context(),
&authentication_type,
plugin_req,
)
.await?;
for header in plugin_result.set_headers.unwrap_or_default() {
match (
http::HeaderName::from_str(&header.name),
HeaderValue::from_str(&header.value),
) {
(Ok(name), Ok(value)) => {
headers.insert(name, value);
}
_ => continue,
};
}
if let Some(params) = plugin_result.set_query_parameters {
let mut query_pairs = url.query_pairs_mut();
for p in params {
query_pairs.append_pair(&p.name, &p.value);
}
}
}
}
// Add cookies to WS HTTP Upgrade
if let Some(id) = cookie_jar_id {
let cookie_jar = app_handle.db().get_cookie_jar(&id)?;
let store = CookieStore::from_cookies(cookie_jar.cookies);
// Convert the WS URL to an HTTP URL because the cookie store matches
// Path/HttpOnly/Secure attributes against http(s) URLs, and WS upgrades are really HTTP requests
let http_url = convert_ws_url_to_http(&url);
if let Some(cookie_header_value) = store.get_cookie_header(&http_url) {
debug!("Inserting cookies into WS upgrade to {}: {}", url, cookie_header_value);
headers.insert(
http::HeaderName::from_static("cookie"),
HeaderValue::from_str(&cookie_header_value).unwrap(),
);
}
}
let (receive_tx, mut receive_rx) = mpsc::channel::<Message>(128);
let mut ws_manager = ws_manager.lock().await;
{
let valid_query_pairs = url_parameters
.into_iter()
.filter(|p| p.enabled && !p.name.is_empty())
.collect::<Vec<_>>();
// NOTE: Only mutate query pairs if there are any, or it will append an empty `?` to the URL
if !valid_query_pairs.is_empty() {
let mut query_pairs = url.query_pairs_mut();
for p in valid_query_pairs {
query_pairs.append_pair(p.name.as_str(), p.value.as_str());
}
}
}
let client_cert = find_client_certificate(url.as_str(), &settings.client_certificates);
let response = match ws_manager
.connect(
&connection.id,
url.as_str(),
headers,
receive_tx,
workspace.setting_validate_certificates,
client_cert,
)
.await
{
Ok(r) => r,
Err(e) => {
return Ok(app_handle.db().upsert_websocket_connection(
&WebsocketConnection {
error: Some(e.to_string()),
state: WebsocketConnectionState::Closed,
..connection
},
&UpdateSource::from_window_label(window.label()),
)?);
}
};
app_handle.db().upsert_websocket_event(
&WebsocketEvent {
connection_id: connection.id.clone(),
request_id: request.id.clone(),
workspace_id: connection.workspace_id.clone(),
is_server: false,
message_type: WebsocketEventType::Open,
..Default::default()
},
&UpdateSource::from_window_label(window.label()),
)?;
let response_headers = response
.headers()
.into_iter()
.map(|(name, value)| HttpResponseHeader {
name: name.to_string(),
value: value.to_str().unwrap().to_string(),
})
.collect::<Vec<HttpResponseHeader>>();
let connection = app_handle.db().upsert_websocket_connection(
&WebsocketConnection {
state: WebsocketConnectionState::Connected,
headers: response_headers,
status: response.status().as_u16() as i32,
url: request.url.clone(),
..connection
},
&UpdateSource::from_window_label(window.label()),
)?;
{
let connection_id = connection.id.clone();
let request_id = request.id.to_string();
let workspace_id = request.workspace_id.clone();
let connection = connection.clone();
let window_label = window.label().to_string();
let mut has_written_close = false;
tokio::spawn(async move {
while let Some(message) = receive_rx.recv().await {
if let Message::Close(_) = message {
has_written_close = true;
}
app_handle
.db()
.upsert_websocket_event(
&WebsocketEvent {
connection_id: connection_id.clone(),
request_id: request_id.clone(),
workspace_id: workspace_id.clone(),
is_server: true,
message_type: match message {
Message::Text(_) => WebsocketEventType::Text,
Message::Binary(_) => WebsocketEventType::Binary,
Message::Ping(_) => WebsocketEventType::Ping,
Message::Pong(_) => WebsocketEventType::Pong,
Message::Close(_) => WebsocketEventType::Close,
// Raw frame will never happen during a read
Message::Frame(_) => WebsocketEventType::Frame,
},
message: message.into_data().into(),
..Default::default()
},
&UpdateSource::from_window_label(&window_label),
)
.unwrap();
}
info!("Websocket connection closed");
if !has_written_close {
app_handle
.db()
.upsert_websocket_event(
&WebsocketEvent {
connection_id: connection_id.clone(),
request_id: request_id.clone(),
workspace_id: workspace_id.clone(),
is_server: true,
message_type: WebsocketEventType::Close,
..Default::default()
},
&UpdateSource::from_window_label(&window_label),
)
.unwrap();
}
app_handle
.db()
.upsert_websocket_connection(
&WebsocketConnection {
workspace_id: request.workspace_id.clone(),
request_id: request_id.to_string(),
state: WebsocketConnectionState::Closed,
..connection
},
&UpdateSource::from_window_label(&window_label),
)
.unwrap();
});
}
Ok(connection)
}
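
As a reading aid (not part of the original file), the rows written by `cmd_ws_connect` over a connection's lifetime are summarized below.

```rust
// Reading aid: database writes performed by cmd_ws_connect.
//   1. upsert a WebsocketConnection row (default state) before dialing
//   2. on a successful upgrade: one Open event, then the connection moves to
//      Connected with the response status and headers
//   3. receive loop: one event per server message (Text/Binary/Ping/Pong/Close)
//   4. when the loop ends: a Close event is written if the server never sent
//      one, and the connection row is moved to Closed
```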
/// Resolve inherited authentication and headers for a websocket request
fn resolve_websocket_request<R: Runtime>(
window: &WebviewWindow<R>,
request: &WebsocketRequest,
) -> Result<(WebsocketRequest, String)> {
let mut new_request = request.clone();
let (authentication_type, authentication, authentication_context_id) =
window.db().resolve_auth_for_websocket_request(request)?;
new_request.authentication_type = authentication_type;
new_request.authentication = authentication;
let headers = window.db().resolve_headers_for_websocket_request(request)?;
new_request.headers = headers;
Ok((new_request, authentication_context_id))
}
/// Convert WS URL to HTTP URL for cookie filtering
/// WebSocket upgrade requests are HTTP requests initially, so HttpOnly cookies should apply
fn convert_ws_url_to_http(ws_url: &Url) -> Url {
let mut http_url = ws_url.clone();
match ws_url.scheme() {
"ws" => {
http_url.set_scheme("http").expect("Failed to set http scheme");
}
"wss" => {
http_url.set_scheme("https").expect("Failed to set https scheme");
}
_ => {
// Already HTTP/HTTPS, no conversion needed
}
}
http_url
}
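
A small test-style sketch (not part of the original file) of the scheme mapping performed by `convert_ws_url_to_http`, using made-up URLs:

```rust
#[cfg(test)]
mod convert_ws_url_tests {
    use super::*;
    use url::Url;

    #[test]
    fn maps_ws_schemes_to_http_schemes() {
        let ws = Url::parse("ws://example.com/chat").unwrap();
        assert_eq!(convert_ws_url_to_http(&ws).scheme(), "http");

        let wss = Url::parse("wss://example.com/chat").unwrap();
        assert_eq!(convert_ws_url_to_http(&wss).scheme(), "https");
    }
}
```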

Binary file not shown (removed image, 10 KiB).


@@ -1,51 +0,0 @@
{
"productName": "Yaak",
"version": "0.0.0",
"identifier": "app.yaak.desktop",
"build": {
"beforeBuildCommand": "npm run tauri-before-build",
"beforeDevCommand": "npm run tauri-before-dev",
"devUrl": "http://localhost:1420",
"frontendDist": "../../dist"
},
"app": {
"withGlobalTauri": false,
"security": {
"assetProtocol": {
"enable": true,
"scope": {
"allow": [
"$APPDATA/responses/*",
"$RESOURCE/static/*"
]
}
}
}
},
"plugins": {
"deep-link": {
"desktop": {
"schemes": [
"yaak"
]
}
}
},
"bundle": {
"icon": [
"icons/release/32x32.png",
"icons/release/128x128.png",
"icons/release/128x128@2x.png",
"icons/release/icon.icns",
"icons/release/icon.ico"
],
"resources": [
"static",
"vendored/protoc/include",
"vendored/plugins",
"vendored/plugin-runtime",
"vendored/node/yaaknode*",
"vendored/protoc/yaakprotoc*"
]
}
}


@@ -1,68 +0,0 @@
{
"build": {
"features": [
"updater",
"license"
]
},
"app": {
"security": {
"capabilities": [
"default",
{
"identifier": "release",
"windows": [
"*"
],
"permissions": [
"yaak-license:default"
]
}
]
}
},
"plugins": {
"updater": {
"endpoints": [
"https://update.yaak.app/check/{{target}}/{{arch}}/{{current_version}}"
],
"pubkey": "dW50cnVzdGVkIGNvbW1lbnQ6IG1pbmlzaWduIHB1YmxpYyBrZXk6IEVGRkFGMjQxRUNEOTQ3MzAKUldRd1I5bnNRZkw2NzRtMnRlWTN3R24xYUR3aGRsUjJzWGwvdHdEcGljb3ZJMUNlMjFsaHlqVU4K"
}
},
"bundle": {
"publisher": "Yaak",
"license": "MIT",
"copyright": "Yaak",
"homepage": "https://yaak.app",
"active": true,
"category": "DeveloperTool",
"createUpdaterArtifacts": true,
"longDescription": "A cross-platform desktop app for interacting with REST, GraphQL, and gRPC",
"shortDescription": "Play with APIs, intuitively",
"targets": [
"app",
"appimage",
"deb",
"dmg",
"nsis",
"rpm"
],
"macOS": {
"minimumSystemVersion": "13.0",
"exceptionDomain": "",
"entitlements": "macos/entitlements.plist",
"frameworks": []
},
"windows": {
"signCommand": "trusted-signing-cli -e https://eus.codesigning.azure.net/ -a Yaak -c yaakapp %1"
},
"linux": {
"deb": {
"desktopTemplate": "./template.desktop"
},
"rpm": {
"desktopTemplate": "./template.desktop"
}
}
}
}


@@ -1,9 +0,0 @@
[Desktop Entry]
Categories={{categories}}
Comment={{comment}}
Exec={{exec}}
Icon={{icon}}
Name={{name}}
StartupWMClass={{exec}}
Terminal=false
Type=Application


@@ -1,16 +0,0 @@
[package]
name = "yaak-fonts"
links = "yaak-fonts"
version = "0.1.0"
edition = "2021"
publish = false
[dependencies]
font-loader = "0.11.0"
tauri = { workspace = true }
ts-rs = { workspace = true }
serde = "1.0"
thiserror = { workspace = true }
[build-dependencies]
tauri-plugin = { workspace = true, features = ["build"] }


@@ -1,3 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type Fonts = { editorFonts: Array<string>, uiFonts: Array<string>, };


@@ -1,14 +0,0 @@
import { useQuery } from '@tanstack/react-query';
import { invoke } from '@tauri-apps/api/core';
import { Fonts } from './bindings/gen_fonts';
export async function listFonts() {
return invoke<Fonts>('plugin:yaak-fonts|list', {});
}
export function useFonts() {
return useQuery({
queryKey: ['list_fonts'],
queryFn: () => listFonts(),
});
}


@@ -1,6 +0,0 @@
{
"name": "@yaakapp-internal/fonts",
"private": true,
"version": "1.0.0",
"main": "index.ts"
}


@@ -1,38 +0,0 @@
use crate::Result;
use font_loader::system_fonts;
use serde::{Deserialize, Serialize};
use std::collections::HashSet;
use tauri::command;
use ts_rs::TS;
#[derive(Default, Debug, Clone, Serialize, Deserialize, TS, PartialEq)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "gen_fonts.ts")]
pub struct Fonts {
pub editor_fonts: Vec<String>,
pub ui_fonts: Vec<String>,
}
#[command]
pub(crate) async fn list() -> Result<Fonts> {
let mut ui_fonts = HashSet::new();
let mut editor_fonts = HashSet::new();
let mut property = system_fonts::FontPropertyBuilder::new().monospace().build();
for font in &system_fonts::query_specific(&mut property) {
editor_fonts.insert(font.to_string());
}
for font in &system_fonts::query_all() {
if !editor_fonts.contains(font) {
ui_fonts.insert(font.to_string());
}
}
let mut ui_fonts: Vec<String> = ui_fonts.into_iter().collect();
let mut editor_fonts: Vec<String> = editor_fonts.into_iter().collect();
ui_fonts.sort();
editor_fonts.sort();
Ok(Fonts { ui_fonts, editor_fonts })
}


@@ -1,15 +0,0 @@
use serde::{ser::Serializer, Serialize};
#[derive(Debug, thiserror::Error)]
pub enum Error {}
impl Serialize for Error {
fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
where
S: Serializer,
{
serializer.serialize_str(self.to_string().as_ref())
}
}
pub type Result<T> = std::result::Result<T, Error>;


@@ -1,15 +0,0 @@
use tauri::{
generate_handler,
plugin::{Builder, TauriPlugin},
Runtime,
};
mod commands;
mod error;
use crate::commands::list;
pub use error::{Error, Result};
pub fn init<R: Runtime>() -> TauriPlugin<R> {
Builder::new("yaak-fonts").invoke_handler(generate_handler![list]).build()
}


@@ -1,11 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type APIErrorResponsePayload = { error: string, message: string, };
export type ActivateLicenseRequestPayload = { licenseKey: string, appVersion: string, appPlatform: string, };
export type ActivateLicenseResponsePayload = { activationId: string, };
export type DeactivateLicenseRequestPayload = { appVersion: string, appPlatform: string, };
export type LicenseCheckStatus = { "status": "personal_use", "data": { trial_ended: string, } } | { "status": "trialing", "data": { end: string, } } | { "status": "error", "data": { message: string, code: string, } } | { "status": "active", "data": { periodEnd: string, cancelAt: string | null, } } | { "status": "inactive", "data": { status: string, } } | { "status": "expired", "data": { changes: number, changesUrl: string | null, billingUrl: string, periodEnd: string, } } | { "status": "past_due", "data": { billingUrl: string, periodEnd: string, } };


@@ -1,5 +0,0 @@
const COMMANDS: &[&str] = &["activate", "deactivate", "check"];
fn main() {
tauri_plugin::Builder::new(COMMANDS).build();
}


@@ -1,3 +0,0 @@
[default]
description = "Default permissions for the plugin"
permissions = ["allow-check", "allow-activate", "allow-deactivate"]


@@ -1,18 +0,0 @@
use crate::error::Result;
use crate::{LicenseCheckStatus, activate_license, check_license, deactivate_license};
use tauri::{Runtime, WebviewWindow, command};
#[command]
pub async fn check<R: Runtime>(window: WebviewWindow<R>) -> Result<LicenseCheckStatus> {
check_license(&window).await
}
#[command]
pub async fn activate<R: Runtime>(license_key: &str, window: WebviewWindow<R>) -> Result<()> {
activate_license(&window, license_key).await
}
#[command]
pub async fn deactivate<R: Runtime>(window: WebviewWindow<R>) -> Result<()> {
deactivate_license(&window).await
}


@@ -1,17 +0,0 @@
use tauri::{
Runtime, generate_handler,
plugin::{Builder, TauriPlugin},
};
mod commands;
pub mod error;
mod license;
use crate::commands::{activate, check, deactivate};
pub use license::*;
pub fn init<R: Runtime>() -> TauriPlugin<R> {
Builder::new("yaak-license")
.invoke_handler(generate_handler![check, activate, deactivate])
.build()
}


@@ -1,243 +0,0 @@
use crate::error::Error::{ClientError, JsonError, ServerError};
use crate::error::Result;
use chrono::{DateTime, Utc};
use log::{info, warn};
use serde::{Deserialize, Serialize};
use std::ops::Add;
use std::time::Duration;
use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow, is_dev};
use ts_rs::TS;
use yaak_common::platform::get_os_str;
use yaak_models::db_context::DbContext;
use yaak_models::query_manager::QueryManager;
use yaak_models::util::UpdateSource;
use yaak_tauri_utils::api_client::yaak_api_client;
/// Extension trait for accessing the QueryManager from Tauri Manager types.
/// This is needed temporarily until all crates are refactored to not use Tauri.
trait QueryManagerExt<'a, R> {
fn db(&'a self) -> DbContext<'a>;
}
impl<'a, R: Runtime, M: Manager<R>> QueryManagerExt<'a, R> for M {
fn db(&'a self) -> DbContext<'a> {
let qm = self.state::<QueryManager>();
qm.inner().connect()
}
}
const KV_NAMESPACE: &str = "license";
const KV_ACTIVATION_ID_KEY: &str = "activation_id";
const TRIAL_SECONDS: u64 = 3600 * 24 * 30;
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "gen_models.ts")]
pub struct CheckActivationRequestPayload {
pub app_version: String,
pub app_platform: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "license.ts")]
pub struct ActivateLicenseRequestPayload {
pub license_key: String,
pub app_version: String,
pub app_platform: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "license.ts")]
pub struct DeactivateLicenseRequestPayload {
pub app_version: String,
pub app_platform: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "license.ts")]
pub struct ActivateLicenseResponsePayload {
pub activation_id: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "license.ts")]
pub struct APIErrorResponsePayload {
pub error: String,
pub message: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(rename_all = "snake_case", tag = "status", content = "data")]
#[ts(export, export_to = "license.ts")]
pub enum LicenseCheckStatus {
// Local Types
PersonalUse {
trial_ended: DateTime<Utc>,
},
Trialing {
end: DateTime<Utc>,
},
Error {
message: String,
code: String,
},
// Server Types
Active {
#[serde(rename = "periodEnd")]
period_end: DateTime<Utc>,
#[serde(rename = "cancelAt")]
cancel_at: Option<DateTime<Utc>>,
},
Inactive {
status: String,
},
Expired {
changes: i32,
#[serde(rename = "changesUrl")]
changes_url: Option<String>,
#[serde(rename = "billingUrl")]
billing_url: String,
#[serde(rename = "periodEnd")]
period_end: DateTime<Utc>,
},
PastDue {
#[serde(rename = "billingUrl")]
billing_url: String,
#[serde(rename = "periodEnd")]
period_end: DateTime<Utc>,
},
}
pub async fn activate_license<R: Runtime>(
window: &WebviewWindow<R>,
license_key: &str,
) -> Result<()> {
info!("Activating license {}", license_key);
let client = reqwest::Client::new();
let payload = ActivateLicenseRequestPayload {
license_key: license_key.to_string(),
app_platform: get_os_str().to_string(),
app_version: window.app_handle().package_info().version.to_string(),
};
let response = client.post(build_url("/licenses/activate")).json(&payload).send().await?;
if response.status().is_client_error() {
let body: APIErrorResponsePayload = response.json().await?;
return Err(ClientError { message: body.message, error: body.error });
}
if response.status().is_server_error() {
return Err(ServerError);
}
let body: ActivateLicenseResponsePayload = response.json().await?;
window.app_handle().db().set_key_value_str(
KV_ACTIVATION_ID_KEY,
KV_NAMESPACE,
body.activation_id.as_str(),
&UpdateSource::from_window_label(window.label()),
);
if let Err(e) = window.emit("license-activated", true) {
warn!("Failed to emit check-license event: {}", e);
}
Ok(())
}
pub async fn deactivate_license<R: Runtime>(window: &WebviewWindow<R>) -> Result<()> {
info!("Deactivating activation");
let app_handle = window.app_handle();
let activation_id = get_activation_id(app_handle).await;
let client = reqwest::Client::new();
let path = format!("/licenses/activations/{}/deactivate", activation_id);
let payload = DeactivateLicenseRequestPayload {
app_platform: get_os_str().to_string(),
app_version: window.app_handle().package_info().version.to_string(),
};
let response = client.post(build_url(&path)).json(&payload).send().await?;
if response.status().is_client_error() {
let body: APIErrorResponsePayload = response.json().await?;
return Err(ClientError { message: body.message, error: body.error });
}
if response.status().is_server_error() {
return Err(ServerError);
}
app_handle.db().delete_key_value(
KV_ACTIVATION_ID_KEY,
KV_NAMESPACE,
&UpdateSource::from_window_label(window.label()),
)?;
if let Err(e) = app_handle.emit("license-deactivated", true) {
warn!("Failed to emit deactivate-license event: {}", e);
}
Ok(())
}
pub async fn check_license<R: Runtime>(window: &WebviewWindow<R>) -> Result<LicenseCheckStatus> {
let payload = CheckActivationRequestPayload {
app_platform: get_os_str().to_string(),
app_version: window.package_info().version.to_string(),
};
let activation_id = get_activation_id(window.app_handle()).await;
let settings = window.db().get_settings();
let trial_end = settings.created_at.add(Duration::from_secs(TRIAL_SECONDS)).and_utc();
let has_activation_id = !activation_id.is_empty();
let trial_period_active = Utc::now() < trial_end;
match (has_activation_id, trial_period_active) {
(false, true) => Ok(LicenseCheckStatus::Trialing { end: trial_end }),
(false, false) => Ok(LicenseCheckStatus::PersonalUse { trial_ended: trial_end }),
(true, _) => {
info!("Checking license activation");
// A license has been activated, so let's check the license server
let client = yaak_api_client(window.app_handle())?;
let path = format!("/licenses/activations/{activation_id}/check-v2");
let response = client.post(build_url(&path)).json(&payload).send().await?;
if response.status().is_client_error() {
let body: APIErrorResponsePayload = response.json().await?;
return Err(ClientError { message: body.message, error: body.error });
}
if response.status().is_server_error() {
warn!("Failed to check license {}", response.status());
return Err(ServerError);
}
let body_text = response.text().await?;
match serde_json::from_str::<LicenseCheckStatus>(&body_text) {
Ok(b) => Ok(b),
Err(e) => {
warn!("Failed to decode server response: {} {:?}", body_text, e);
Err(JsonError(e))
}
}
}
}
}
fn build_url(path: &str) -> String {
if is_dev() {
format!("http://localhost:9444{path}")
} else {
format!("https://license.yaak.app{path}")
}
}
pub async fn get_activation_id<R: Runtime>(app_handle: &AppHandle<R>) -> String {
app_handle.db().get_key_value_str(KV_ACTIVATION_ID_KEY, KV_NAMESPACE, "")
}
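
As a reading aid (not part of the original crate), the trial/activation branching in `check_license` reduces to the following.

```rust
// Reading aid: TRIAL_SECONDS = 3600 * 24 * 30 = 2_592_000 seconds, i.e. a
// 30-day trial counted from settings.created_at.
//
// (has_activation_id, trial_period_active) decides the result:
//   (false, true)  => Trialing { end: trial_end }
//   (false, false) => PersonalUse { trial_ended: trial_end }
//   (true,  _)     => POST /licenses/activations/{activation_id}/check-v2
//                     and deserialize the body as LicenseCheckStatus
```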


@@ -1,22 +0,0 @@
[package]
name = "yaak-mac-window"
links = "yaak-mac-window"
version = "0.1.0"
edition = "2024"
publish = false
[lints.rust]
unexpected_cfgs = { level = "warn", check-cfg = ['cfg(feature, values("cargo-clippy"))'] }
[build-dependencies]
tauri-plugin = { workspace = true, features = ["build"] }
[target.'cfg(target_os = "macos")'.dependencies]
cocoa = "0.26.0"
log = { workspace = true }
objc = "0.2.7"
rand = "0.9.0"
csscolorparser = "0.7.2"
[dependencies]
tauri = { workspace = true }


@@ -1,5 +0,0 @@
const COMMANDS: &[&str] = &["set_title", "set_theme"];
fn main() {
tauri_plugin::Builder::new(COMMANDS).build();
}


@@ -1,9 +0,0 @@
import { invoke } from '@tauri-apps/api/core';
export function setWindowTitle(title: string) {
invoke('plugin:yaak-mac-window|set_title', { title }).catch(console.error);
}
export function setWindowTheme(bgColor: string) {
invoke('plugin:yaak-mac-window|set_theme', { bgColor }).catch(console.error);
}


@@ -1,6 +0,0 @@
{
"name": "@yaakapp-internal/mac-window",
"private": true,
"version": "1.0.0",
"main": "index.ts"
}


@@ -1,6 +0,0 @@
[default]
description = "Default permissions for the plugin"
permissions = [
"allow-set-title",
"allow-set-theme",
]


@@ -1,36 +0,0 @@
use tauri::{Runtime, Window, command};
#[command]
pub(crate) fn set_title<R: Runtime>(window: Window<R>, title: &str) {
#[cfg(target_os = "macos")]
{
crate::mac::update_window_title(window, title.to_string());
}
#[cfg(not(target_os = "macos"))]
{
let _ = window.set_title(title);
}
}
#[command]
#[allow(unused)]
pub(crate) fn set_theme<R: Runtime>(window: Window<R>, bg_color: &str) {
#[cfg(target_os = "macos")]
{
use log::warn;
match csscolorparser::parse(bg_color.trim()) {
Ok(color) => {
crate::mac::update_window_theme(window, color);
}
Err(err) => {
warn!("Failed to parse background color '{}': {}", bg_color, err)
}
}
}
#[cfg(not(target_os = "macos"))]
{
// Nothing yet for non-Mac platforms
}
}


@@ -1,43 +0,0 @@
mod commands;
#[cfg(target_os = "macos")]
mod mac;
use crate::commands::{set_theme, set_title};
use std::sync::atomic::AtomicBool;
use tauri::{Manager, Runtime, generate_handler, plugin, plugin::TauriPlugin};
pub trait AppHandleMacWindowExt {
/// Sets whether to use the native titlebar
fn set_native_titlebar(&self, enable: bool);
}
impl<R: Runtime> AppHandleMacWindowExt for tauri::AppHandle<R> {
fn set_native_titlebar(&self, enable: bool) {
self.state::<PluginState>()
.native_titlebar
.store(enable, std::sync::atomic::Ordering::Relaxed);
}
}
pub(crate) struct PluginState {
native_titlebar: AtomicBool,
}
pub fn init<R: Runtime>() -> TauriPlugin<R> {
let mut builder = plugin::Builder::new("yaak-mac-window")
.setup(move |app, _| {
app.manage(PluginState { native_titlebar: AtomicBool::new(false) });
Ok(())
})
.invoke_handler(generate_handler![set_title, set_theme]);
#[cfg(target_os = "macos")]
{
builder = builder.on_window_ready(move |window| {
mac::setup_traffic_light_positioner(&window);
});
}
builder.build()
}


@@ -1,13 +0,0 @@
[package]
name = "yaak-tauri-utils"
version = "0.1.0"
edition = "2024"
publish = false
[dependencies]
tauri = { workspace = true }
reqwest = { workspace = true, features = ["gzip"] }
thiserror = { workspace = true }
serde = { workspace = true, features = ["derive"] }
regex = "1.11.0"
yaak-common = { workspace = true }


@@ -1,24 +0,0 @@
use crate::error::Result;
use reqwest::Client;
use std::time::Duration;
use tauri::http::{HeaderMap, HeaderValue};
use tauri::{AppHandle, Runtime};
use yaak_common::platform::{get_ua_arch, get_ua_platform};
pub fn yaak_api_client<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Client> {
let platform = get_ua_platform();
let version = app_handle.package_info().version.clone();
let arch = get_ua_arch();
let ua = format!("Yaak/{version} ({platform}; {arch})");
let mut default_headers = HeaderMap::new();
default_headers.insert("Accept", HeaderValue::from_str("application/json").unwrap());
let client = reqwest::ClientBuilder::new()
.timeout(Duration::from_secs(20))
.default_headers(default_headers)
.gzip(true)
.user_agent(ua)
.build()?;
Ok(client)
}


@@ -1,19 +0,0 @@
use serde::{Serialize, Serializer};
use thiserror::Error;
#[derive(Error, Debug)]
pub enum Error {
#[error(transparent)]
ReqwestError(#[from] reqwest::Error),
}
impl Serialize for Error {
fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
where
S: Serializer,
{
serializer.serialize_str(self.to_string().as_ref())
}
}
pub type Result<T> = std::result::Result<T, Error>;


@@ -1,3 +0,0 @@
pub mod api_client;
pub mod error;
pub mod window;


@@ -1,38 +0,0 @@
use regex::Regex;
use tauri::{Runtime, WebviewWindow};
pub trait WorkspaceWindowTrait {
fn workspace_id(&self) -> Option<String>;
fn cookie_jar_id(&self) -> Option<String>;
fn environment_id(&self) -> Option<String>;
fn request_id(&self) -> Option<String>;
}
impl<R: Runtime> WorkspaceWindowTrait for WebviewWindow<R> {
fn workspace_id(&self) -> Option<String> {
let url = self.url().unwrap();
let re = Regex::new(r"/workspaces/(?<id>\w+)").unwrap();
match re.captures(url.as_str()) {
None => None,
Some(captures) => captures.name("id").map(|c| c.as_str().to_string()),
}
}
fn cookie_jar_id(&self) -> Option<String> {
let url = self.url().unwrap();
let mut query_pairs = url.query_pairs();
query_pairs.find(|(k, _v)| k == "cookie_jar_id").map(|(_k, v)| v.to_string())
}
fn environment_id(&self) -> Option<String> {
let url = self.url().unwrap();
let mut query_pairs = url.query_pairs();
query_pairs.find(|(k, _v)| k == "environment_id").map(|(_k, v)| v.to_string())
}
fn request_id(&self) -> Option<String> {
let url = self.url().unwrap();
let mut query_pairs = url.query_pairs();
query_pairs.find(|(k, _v)| k == "request_id").map(|(_k, v)| v.to_string())
}
}
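
A standalone sketch of the regex capture used by `workspace_id()` above; the example URL is assumed, not taken from the app.

```rust
use regex::Regex;

// Illustrative only: extracts the workspace ID the same way workspace_id() does.
fn extract_workspace_id(url: &str) -> Option<String> {
    let re = Regex::new(r"/workspaces/(?<id>\w+)").unwrap();
    re.captures(url).and_then(|c| c.name("id").map(|m| m.as_str().to_string()))
}

fn main() {
    let url = "http://localhost:1420/workspaces/wk_123?environment_id=env_456";
    assert_eq!(extract_workspace_id(url), Some("wk_123".to_string()));
}
```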


@@ -1,9 +0,0 @@
[package]
name = "yaak-common"
version = "0.1.0"
edition = "2024"
publish = false
[dependencies]
serde_json = { workspace = true }
tokio = { workspace = true, features = ["process"] }


@@ -1,16 +0,0 @@
use std::ffi::OsStr;
#[cfg(target_os = "windows")]
const CREATE_NO_WINDOW: u32 = 0x0800_0000;
/// Creates a new `tokio::process::Command` that won't spawn a console window on Windows.
pub fn new_xplatform_command<S: AsRef<OsStr>>(program: S) -> tokio::process::Command {
#[allow(unused_mut)]
let mut cmd = tokio::process::Command::new(program);
#[cfg(target_os = "windows")]
{
use std::os::windows::process::CommandExt;
cmd.creation_flags(CREATE_NO_WINDOW);
}
cmd
}
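
A hedged usage sketch of `new_xplatform_command`; the binary name is a stand-in, and a Tokio runtime is assumed to drive the future.

```rust
use yaak_common::command::new_xplatform_command;

// Illustrative only: "yaakprotoc" stands in for whatever sidecar binary is spawned.
async fn protoc_version() -> std::io::Result<String> {
    let output = new_xplatform_command("yaakprotoc")
        .arg("--version")
        .output() // no console window flashes on Windows thanks to CREATE_NO_WINDOW
        .await?;
    Ok(String::from_utf8_lossy(&output.stdout).into_owned())
}
```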


@@ -1,3 +0,0 @@
pub mod command;
pub mod platform;
pub mod serde;


@@ -1,55 +0,0 @@
use crate::platform::OperatingSystem::{Linux, MacOS, Unknown, Windows};
pub enum OperatingSystem {
Windows,
MacOS,
Linux,
Unknown,
}
pub fn get_os() -> OperatingSystem {
if cfg!(target_os = "windows") {
Windows
} else if cfg!(target_os = "macos") {
MacOS
} else if cfg!(target_os = "linux") {
Linux
} else {
Unknown
}
}
pub fn get_os_str() -> &'static str {
match get_os() {
Windows => "windows",
MacOS => "macos",
Linux => "linux",
Unknown => "unknown",
}
}
pub fn get_ua_platform() -> &'static str {
if cfg!(target_os = "windows") {
"Win"
} else if cfg!(target_os = "macos") {
"Mac"
} else if cfg!(target_os = "linux") {
"Linux"
} else {
"Unknown"
}
}
pub fn get_ua_arch() -> &'static str {
if cfg!(target_arch = "x86_64") {
"x86_64"
} else if cfg!(target_arch = "x86") {
"i386"
} else if cfg!(target_arch = "arm") {
"ARM"
} else if cfg!(target_arch = "aarch64") {
"ARM64"
} else {
"Unknown"
}
}


@@ -1,23 +0,0 @@
use serde_json::Value;
use std::collections::BTreeMap;
pub fn get_bool(v: &Value, key: &str, fallback: bool) -> bool {
match v.get(key) {
None => fallback,
Some(v) => v.as_bool().unwrap_or(fallback),
}
}
pub fn get_str<'a>(v: &'a Value, key: &str) -> &'a str {
match v.get(key) {
None => "",
Some(v) => v.as_str().unwrap_or_default(),
}
}
pub fn get_str_map<'a>(v: &'a BTreeMap<String, Value>, key: &str) -> &'a str {
match v.get(key) {
None => "",
Some(v) => v.as_str().unwrap_or_default(),
}
}
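
An illustrative use of the helpers above; the JSON values are made up.

```rust
use serde_json::json;

fn main() {
    let v = json!({ "enabled": true, "name": "My Request" });
    assert!(get_bool(&v, "enabled", false));
    assert!(!get_bool(&v, "missing", false));
    assert_eq!(get_str(&v, "name"), "My Request");
    assert_eq!(get_str(&v, "missing"), "");
}
```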


@@ -1,9 +0,0 @@
[package]
name = "yaak-core"
version = "0.0.0"
edition = "2024"
authors = ["Gregory Schier"]
publish = false
[dependencies]
thiserror = { workspace = true }


@@ -1,56 +0,0 @@
use std::path::PathBuf;
/// Context for a workspace operation.
///
/// In Tauri, this is extracted from the WebviewWindow URL.
/// In CLI, this is constructed from command arguments or config.
#[derive(Debug, Clone, Default)]
pub struct WorkspaceContext {
pub workspace_id: Option<String>,
pub environment_id: Option<String>,
pub cookie_jar_id: Option<String>,
pub request_id: Option<String>,
}
impl WorkspaceContext {
pub fn new() -> Self {
Self::default()
}
pub fn with_workspace(mut self, workspace_id: impl Into<String>) -> Self {
self.workspace_id = Some(workspace_id.into());
self
}
pub fn with_environment(mut self, environment_id: impl Into<String>) -> Self {
self.environment_id = Some(environment_id.into());
self
}
pub fn with_cookie_jar(mut self, cookie_jar_id: impl Into<String>) -> Self {
self.cookie_jar_id = Some(cookie_jar_id.into());
self
}
pub fn with_request(mut self, request_id: impl Into<String>) -> Self {
self.request_id = Some(request_id.into());
self
}
}
/// Application context trait for accessing app-level resources.
///
/// This abstracts over Tauri's `AppHandle` for path resolution and app identity.
/// Implemented by Tauri's AppHandle and by CLI's own context struct.
pub trait AppContext: Send + Sync + Clone {
/// Returns the path to the application data directory.
/// This is where the database and other persistent data are stored.
fn app_data_dir(&self) -> PathBuf;
/// Returns the application identifier (e.g., "app.yaak.desktop").
/// Used for keyring access and other platform-specific features.
fn app_identifier(&self) -> &str;
/// Returns true if running in development mode.
fn is_dev(&self) -> bool;
}
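
A minimal sketch, assuming a hypothetical CLI-side context type; the struct name, field, and identifier reuse are assumptions, not the actual yaak-cli implementation.

```rust
use std::path::PathBuf;

// Hypothetical sketch: how a CLI might satisfy AppContext and build a
// WorkspaceContext from command-line flags.
#[derive(Clone)]
struct CliContext {
    data_dir: PathBuf,
}

impl AppContext for CliContext {
    fn app_data_dir(&self) -> PathBuf {
        self.data_dir.clone()
    }
    fn app_identifier(&self) -> &str {
        // Assumed: reuse the desktop identifier so keyring entries match
        "app.yaak.desktop"
    }
    fn is_dev(&self) -> bool {
        cfg!(debug_assertions)
    }
}

fn context_from_flags(workspace: &str, environment: Option<&str>) -> WorkspaceContext {
    let mut ctx = WorkspaceContext::new().with_workspace(workspace);
    if let Some(env) = environment {
        ctx = ctx.with_environment(env);
    }
    ctx
}
```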


@@ -1,15 +0,0 @@
use thiserror::Error;
pub type Result<T> = std::result::Result<T, Error>;
#[derive(Error, Debug)]
pub enum Error {
#[error("Missing required context: {0}")]
MissingContext(String),
#[error("Configuration error: {0}")]
Config(String),
#[error("IO error: {0}")]
Io(#[from] std::io::Error),
}


@@ -1,10 +0,0 @@
//! Core abstractions for Yaak that work without Tauri.
//!
//! This crate provides foundational types and traits that allow Yaak's
//! business logic to run in both Tauri (desktop app) and CLI contexts.
mod context;
mod error;
pub use context::{AppContext, WorkspaceContext};
pub use error::{Error, Result};


@@ -1,15 +0,0 @@
[package]
name = "yaak-crypto"
version = "0.1.0"
edition = "2021"
publish = false
[dependencies]
base32 = "0.5.1" # For encoding human-readable key
base64 = "0.22.1" # For encoding in the database
chacha20poly1305 = "0.10.1"
keyring = { workspace = true, features = ["apple-native", "windows-native", "sync-secret-service"] }
log = { workspace = true }
serde = { workspace = true, features = ["derive"] }
thiserror = { workspace = true }
yaak-models = { workspace = true }


@@ -1,17 +0,0 @@
import { invoke } from '@tauri-apps/api/core';
export function enableEncryption(workspaceId: string) {
return invoke<void>('cmd_enable_encryption', { workspaceId });
}
export function revealWorkspaceKey(workspaceId: string) {
return invoke<string>('cmd_reveal_workspace_key', { workspaceId });
}
export function setWorkspaceKey(args: { workspaceId: string; key: string }) {
return invoke<void>('cmd_set_workspace_key', args);
}
export function disableEncryption(workspaceId: string) {
return invoke<void>('cmd_disable_encryption', { workspaceId });
}


@@ -1,98 +0,0 @@
use crate::error::Error::{DecryptionError, EncryptionError, InvalidEncryptedData};
use crate::error::Result;
use chacha20poly1305::aead::generic_array::typenum::Unsigned;
use chacha20poly1305::aead::{Aead, AeadCore, Key, KeyInit, OsRng};
use chacha20poly1305::XChaCha20Poly1305;
const ENCRYPTION_TAG: &str = "yA4k3nC";
const ENCRYPTION_VERSION: u8 = 1;
pub(crate) fn encrypt_data(data: &[u8], key: &Key<XChaCha20Poly1305>) -> Result<Vec<u8>> {
let nonce = XChaCha20Poly1305::generate_nonce(&mut OsRng);
let cipher = XChaCha20Poly1305::new(&key);
let ciphered_data = cipher.encrypt(&nonce, data).map_err(|_| EncryptionError)?;
let mut data: Vec<u8> = Vec::new();
data.extend_from_slice(ENCRYPTION_TAG.as_bytes()); // Tag
data.push(ENCRYPTION_VERSION); // Version
data.extend_from_slice(&nonce.as_slice()); // Nonce
data.extend_from_slice(&ciphered_data); // Ciphertext
Ok(data)
}
pub(crate) fn decrypt_data(cipher_data: &[u8], key: &Key<XChaCha20Poly1305>) -> Result<Vec<u8>> {
// Layout: Yaak Tag + Version + Nonce + ... ciphertext ...
let (tag, rest) =
cipher_data.split_at_checked(ENCRYPTION_TAG.len()).ok_or(InvalidEncryptedData)?;
if tag != ENCRYPTION_TAG.as_bytes() {
return Err(InvalidEncryptedData);
}
let (version, rest) = rest.split_at_checked(1).ok_or(InvalidEncryptedData)?;
if version[0] != ENCRYPTION_VERSION {
return Err(InvalidEncryptedData);
}
let nonce_bytes = <XChaCha20Poly1305 as AeadCore>::NonceSize::to_usize();
let (nonce, ciphered_data) = rest.split_at_checked(nonce_bytes).ok_or(InvalidEncryptedData)?;
let cipher = XChaCha20Poly1305::new(&key);
cipher.decrypt(nonce.into(), ciphered_data).map_err(|_e| DecryptionError)
}
#[cfg(test)]
mod test {
use crate::encryption::{decrypt_data, encrypt_data};
use crate::error::Error::InvalidEncryptedData;
use crate::error::Result;
use chacha20poly1305::aead::OsRng;
use chacha20poly1305::{KeyInit, XChaCha20Poly1305};
#[test]
fn test_encrypt_decrypt() -> Result<()> {
let key = XChaCha20Poly1305::generate_key(OsRng);
let encrypted = encrypt_data("hello world".as_bytes(), &key)?;
let decrypted = decrypt_data(encrypted.as_slice(), &key)?;
assert_eq!(String::from_utf8(decrypted).unwrap(), "hello world");
Ok(())
}
#[test]
fn test_decrypt_empty() -> Result<()> {
let key = XChaCha20Poly1305::generate_key(OsRng);
let encrypted = encrypt_data(&[], &key)?;
assert_eq!(encrypted.len(), 48);
let decrypted = decrypt_data(encrypted.as_slice(), &key)?;
assert_eq!(String::from_utf8(decrypted).unwrap(), "");
Ok(())
}
#[test]
fn test_decrypt_bad_version() -> Result<()> {
let key = XChaCha20Poly1305::generate_key(OsRng);
let mut encrypted = encrypt_data("hello world".as_bytes(), &key)?;
encrypted[7] = 0;
let decrypted = decrypt_data(encrypted.as_slice(), &key);
assert!(matches!(decrypted, Err(InvalidEncryptedData)));
Ok(())
}
#[test]
fn test_decrypt_bad_tag() -> Result<()> {
let key = XChaCha20Poly1305::generate_key(OsRng);
let mut encrypted = encrypt_data("hello world".as_bytes(), &key)?;
encrypted[0] = 2;
let decrypted = decrypt_data(encrypted.as_slice(), &key);
assert!(matches!(decrypted, Err(InvalidEncryptedData)));
Ok(())
}
#[test]
fn test_decrypt_unencrypted_data() -> Result<()> {
let key = XChaCha20Poly1305::generate_key(OsRng);
let decrypted = decrypt_data("123".as_bytes(), &key);
assert!(matches!(decrypted, Err(InvalidEncryptedData)));
Ok(())
}
}
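
As a reading aid (not part of the original file), the 48-byte length asserted in `test_decrypt_empty` breaks down as follows.

```rust
// "yA4k3nC" tag            7 bytes
// version byte             1 byte
// XChaCha20 nonce         24 bytes
// Poly1305 auth tag       16 bytes  (an empty plaintext adds no ciphertext)
// --------------------------------
// total                   48 bytes
```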

Some files were not shown because too many files have changed in this diff.