Mirror of https://github.com/mountain-loop/yaak.git (synced 2026-02-20 06:37:48 +01:00)

Compare commits: v2025.2.0 ... v2025.3.0 (124 commits)
| SHA1 |
|---|
| 72dd768f55 |
| 862d85e48d |
| a6d03cbeeb |
| 7d1ca1c232 |
| 261911b57e |
| 245054cd7d |
| 101582e540 |
| 0a932798a0 |
| 4609c95ad5 |
| 9d54e40aa8 |
| 9ec9222216 |
| 4d1dda0786 |
| 31605881ac |
| 4cd2e9cd31 |
| 13d959799a |
| a6b18c23e1 |
| 041298b3f8 |
| b400940f0e |
| 2e144f064d |
| d8b1cadae6 |
| c2f9760d08 |
| a4c600cb48 |
| bc3a5e3e58 |
| 4c3a02ac53 |
| 1974d61aa4 |
| 0bcb092854 |
| 409620f533 |
| 3e9037f70a |
| be82b67ed3 |
| 432b366105 |
| 42e70b941d |
| 3808215210 |
| 763a60982a |
| a05679fd93 |
| 73c366dc27 |
| 0be7d0283b |
| 749df338c5 |
| 3184c1b79e |
| b52bf7cd56 |
| d962d7f94b |
| 21e2a67c1e |
| c188435524 |
| 8a7a7ba49d |
| cbc40230bb |
| bc4c3178c9 |
| 121fe5b3ea |
| 861609ddc0 |
| e5070513ac |
| f5c3798df9 |
| 469d12fede |
| 417a02744b |
| 81e78ef24c |
| dad9cebb9e |
| b3ede3d6d6 |
| 035fe54df0 |
| 5f8d99ba64 |
| 84b8d130dc |
| 94d4227bc1 |
| 77cdea2f9f |
| 8b1ca4cb47 |
| d3b8a42180 |
| 95f39c514a |
| 7cba082eb0 |
| 3b9b320be2 |
| 18664975a9 |
| bb014b7c43 |
| 9fa0650647 |
| b8c42677ca |
| 2eb3c2241c |
| 8fb7bbfe2e |
| 52eba74151 |
| e651760713 |
| 82451a26f6 |
| cc15f60fb6 |
| 2f8b2a81c7 |
| 6d4fdc91fe |
| faca29c789 |
| 1ab937aae4 |
| 45fcea1954 |
| 73554078d1 |
| a42a88de7b |
| 14a6079176 |
| 6c513616c0 |
| cdf5f1b7a5 |
| 6566857d54 |
| 2e55a1bd6d |
| e114a85c39 |
| 92be088e6c |
| f1757ae427 |
| ce885c3551 |
| 17657a4d04 |
| b7f62b78b1 |
| 006284b99c |
| bac3968aac |
| e5fa044eda |
| 5969120140 |
| 8801936ad2 |
| 1d37d46130 |
| 445c30f3a9 |
| 5fedea38c2 |
| d86549f492 |
| 4c4eaba7d2 |
| cf8f8743bb |
| aa75636026 |
| 2c41b243b6 |
| 6aea343d4f |
| c300e8cbd5 |
| 6e25c26e9f |
| dce1455be7 |
| 736025b12f |
| cb9e9a67a3 |
| 93c323458f |
| 6f8c03d8c1 |
| afd4228fcf |
| d478e5a12e |
| 0db9ebe67d |
| 80ea5e6b91 |
| cb773babe1 |
| b9ed554aca |
| f42f3d0e27 |
| 93ba5b6e5c |
| be11d5968e |
| 0828599e4f |
| f47d22c395 |
.github/workflows/release.yml (vendored, 2 changes)

@@ -65,7 +65,7 @@ jobs:

- name: install dependencies (windows only)
if: matrix.platform == 'windows-latest'
run: cargo install --force trusted-signing-cli
run: cargo install --force trusted-signing-cli --version 0.5.0

- name: Install NPM Dependencies
run: |
@@ -60,3 +60,10 @@ Run the app to apply the migrations.

If nothing happens, try `cargo clean` and run the app again.

_Note: Development builds use a separate database location from production builds._

## Lezer Grammer Generation

```sh
# Example
lezer-generator components/core/Editor/<LANG>/<LANG>.grammar > components/core/Editor/<LANG>/<LANG>.ts
```
README.md (25 changes)
@@ -3,7 +3,12 @@

Yaak is a desktop API client for interacting with REST, GraphQL, Server Sent Events (SSE), WebSocket, and gRPC
APIs. It's built using [Tauri](https://tauri.app), Rust, and ReactJS.

## Contribution Policy

Yaak is open source, but only accepting contributions for bug fixes. To get started,
visit [`DEVELOPMENT.md`](DEVELOPMENT.md) for tips on setting up your environment.

## Feature Overview

@@ -14,6 +19,7 @@ APIs. It's built using [Tauri](https://tauri.app), Rust, and ReactJS.

- ⛓️ Chain together multiple requests to dynamically reference values.<br/>
- 📂 Organize requests into workspaces and nested folders.<br/>
- 🧮 Use environment variables to easily switch between Prod and Dev.<br/>
- 🛡️ Secure arbitrary text values with end-to-end encryption<br/>
- 🏷️ Send dynamic values like UUIDs or timestamps using template tags.<br/>
- 🎨 Choose from many of the included themes, or make your own.<br/>
- 💽 Mirror workspace data to a directory for integration with Git or Dropbox.<br/>

@@ -21,17 +27,8 @@ APIs. It's built using [Tauri](https://tauri.app), Rust, and ReactJS.

- 🔌 Create your own plugins for authentication, template tags, and more!<br/>
- 🛜 Configure a proxy to access firewall-blocked APIs

## Feedback and Bug Reports
## Useful Resources

All feedback, bug reports, questions, and feature requests should be reported to
[feedback.yaak.app](https://feedback.yaak.app).

## Community Projects

- [`yaak2postman`](https://github.com/BiteCraft/yaak2postman) CLI for converting Yaak data
  exports to Postman-compatible collections

## Contribution Policy

Yaak is open source, but only accepting contributions for bug fixes. To get started,
visit [`DEVELOPMENT.md`](DEVELOPMENT.md) for tips on setting up your environment.
- [Feedback and Bug Reports](https://feedback.yaak.app)
- [Documentation](https://feedback.yaak.app/help)
- [Yaak vs Postman](https://yaak.app/blog/postman-alternative)
package-lock.json (generated, 3845 changes): file diff suppressed because it is too large.
package.json (15 changes)
@@ -10,8 +10,10 @@

"packages/plugin-runtime",
"packages/plugin-runtime-types",
"packages/common-lib",
"src-tauri/yaak-license",
"src-tauri/yaak-crypto",
"src-tauri/yaak-git",
"src-tauri/yaak-license",
"src-tauri/yaak-mac-window",
"src-tauri/yaak-models",
"src-tauri/yaak-plugins",
"src-tauri/yaak-sse",

@@ -35,10 +37,13 @@

"tauri-before-build": "npm run bootstrap && npm run --workspaces --if-present build",
"tauri-before-dev": "npm run --workspaces --if-present dev"
},
"dependencies": {
"jotai": "^2.12.2"
},
"devDependencies": {
"@tauri-apps/cli": "^2.2.7",
"@typescript-eslint/eslint-plugin": "^8.18.1",
"@typescript-eslint/parser": "^8.18.1",
"@tauri-apps/cli": "^2.4.1",
"@typescript-eslint/eslint-plugin": "^8.27.0",
"@typescript-eslint/parser": "^8.27.0",
"eslint": "^8",
"eslint-config-prettier": "^8",
"eslint-plugin-import": "^2.31.0",

@@ -48,6 +53,6 @@

"nodejs-file-downloader": "^4.13.0",
"npm-run-all": "^4.1.5",
"prettier": "^3.4.2",
"typescript": "^5.7.2"
"typescript": "^5.8.2"
}
}
@@ -1,6 +1,6 @@

{
"name": "@yaakapp/api",
"version": "0.5.0",
"version": "0.6.0",
"main": "lib/index.js",
"typings": "./lib/index.d.ts",
"files": [

@@ -20,8 +20,6 @@

"@types/node": "^22.5.4"
},
"devDependencies": {
"cpy-cli": "^5.0.0",
"npm-run-all": "^4.1.5",
"typescript": "^5.6.2"
"cpy-cli": "^5.0.0"
}
}
@@ -104,7 +104,11 @@ hideLabel?: boolean,

/**
* The default value
*/
defaultValue?: string, disabled?: boolean, };
defaultValue?: string, disabled?: boolean,
/**
* Longer description of the input, likely shown in a tooltip
*/
description?: string, };

export type FormInputCheckbox = {
/**

@@ -131,7 +135,11 @@ hideLabel?: boolean,

/**
* The default value
*/
defaultValue?: string, disabled?: boolean, };
defaultValue?: string, disabled?: boolean,
/**
* Longer description of the input, likely shown in a tooltip
*/
description?: string, };

export type FormInputEditor = {
/**

@@ -170,7 +178,11 @@ hideLabel?: boolean,

/**
* The default value
*/
defaultValue?: string, disabled?: boolean, };
defaultValue?: string, disabled?: boolean,
/**
* Longer description of the input, likely shown in a tooltip
*/
description?: string, };

export type FormInputFile = {
/**

@@ -205,7 +217,11 @@ hideLabel?: boolean,

/**
* The default value
*/
defaultValue?: string, disabled?: boolean, };
defaultValue?: string, disabled?: boolean,
/**
* Longer description of the input, likely shown in a tooltip
*/
description?: string, };

export type FormInputHttpRequest = {
/**

@@ -232,7 +248,11 @@ hideLabel?: boolean,

/**
* The default value
*/
defaultValue?: string, disabled?: boolean, };
defaultValue?: string, disabled?: boolean,
/**
* Longer description of the input, likely shown in a tooltip
*/
description?: string, };

export type FormInputMarkdown = { content: string, hidden?: boolean, };

@@ -265,7 +285,11 @@ hideLabel?: boolean,

/**
* The default value
*/
defaultValue?: string, disabled?: boolean, };
defaultValue?: string, disabled?: boolean,
/**
* Longer description of the input, likely shown in a tooltip
*/
description?: string, };

export type FormInputSelectOption = { label: string, value: string, };

@@ -306,10 +330,18 @@ hideLabel?: boolean,

/**
* The default value
*/
defaultValue?: string, disabled?: boolean, };
defaultValue?: string, disabled?: boolean,
/**
* Longer description of the input, likely shown in a tooltip
*/
description?: string, };

export type GenericCompletionOption = { label: string, detail?: string, info?: string, type?: CompletionOptionType, boost?: number, };

export type GetCookieValueRequest = { name: string, };

export type GetCookieValueResponse = { value: string | null, };

export type GetHttpAuthenticationConfigRequest = { contextId: string, values: { [key in string]?: JsonPrimitive }, };

export type GetHttpAuthenticationConfigResponse = { args: Array<FormInput>, pluginRefId: string, actions?: Array<HttpAuthenticationAction>, };

@@ -344,18 +376,24 @@ export type ImportResources = { workspaces: Array<Workspace>, environments: Arra

export type ImportResponse = { resources: ImportResources, };

export type InternalEvent = { id: string, pluginRefId: string, pluginName: string, replyId: string | null, windowContext: WindowContext, payload: InternalEventPayload, };
export type InternalEvent = { id: string, pluginRefId: string, pluginName: string, replyId: string | null, windowContext: PluginWindowContext, payload: InternalEventPayload, };
export type InternalEventPayload = { "type": "boot_request" } & BootRequest | { "type": "boot_response" } & BootResponse | { "type": "reload_request" } & EmptyPayload | { "type": "reload_response" } & EmptyPayload | { "type": "terminate_request" } | { "type": "terminate_response" } | { "type": "import_request" } & ImportRequest | { "type": "import_response" } & ImportResponse | { "type": "filter_request" } & FilterRequest | { "type": "filter_response" } & FilterResponse | { "type": "export_http_request_request" } & ExportHttpRequestRequest | { "type": "export_http_request_response" } & ExportHttpRequestResponse | { "type": "send_http_request_request" } & SendHttpRequestRequest | { "type": "send_http_request_response" } & SendHttpRequestResponse | { "type": "get_http_request_actions_request" } & EmptyPayload | { "type": "get_http_request_actions_response" } & GetHttpRequestActionsResponse | { "type": "call_http_request_action_request" } & CallHttpRequestActionRequest | { "type": "get_template_functions_request" } | { "type": "get_template_functions_response" } & GetTemplateFunctionsResponse | { "type": "call_template_function_request" } & CallTemplateFunctionRequest | { "type": "call_template_function_response" } & CallTemplateFunctionResponse | { "type": "get_http_authentication_summary_request" } & EmptyPayload | { "type": "get_http_authentication_summary_response" } & GetHttpAuthenticationSummaryResponse | { "type": "get_http_authentication_config_request" } & GetHttpAuthenticationConfigRequest | { "type": "get_http_authentication_config_response" } & GetHttpAuthenticationConfigResponse | { "type": "call_http_authentication_request" } & CallHttpAuthenticationRequest | { "type": "call_http_authentication_response" } & CallHttpAuthenticationResponse | { "type": "call_http_authentication_action_request" } & CallHttpAuthenticationActionRequest | { "type": "call_http_authentication_action_response" } & EmptyPayload | { "type": "copy_text_request" } & CopyTextRequest | { "type": "copy_text_response" } & EmptyPayload | { "type": "render_http_request_request" } & RenderHttpRequestRequest | { "type": "render_http_request_response" } & RenderHttpRequestResponse | { "type": "get_key_value_request" } & GetKeyValueRequest | { "type": "get_key_value_response" } & GetKeyValueResponse | { "type": "set_key_value_request" } & SetKeyValueRequest | { "type": "set_key_value_response" } & SetKeyValueResponse | { "type": "delete_key_value_request" } & DeleteKeyValueRequest | { "type": "delete_key_value_response" } & DeleteKeyValueResponse | { "type": "open_window_request" } & OpenWindowRequest | { "type": "window_navigate_event" } & WindowNavigateEvent | { "type": "window_close_event" } | { "type": "close_window_request" } & CloseWindowRequest | { "type": "template_render_request" } & TemplateRenderRequest | { "type": "template_render_response" } & TemplateRenderResponse | { "type": "show_toast_request" } & ShowToastRequest | { "type": "show_toast_response" } & EmptyPayload | { "type": "prompt_text_request" } & PromptTextRequest | { "type": "prompt_text_response" } & PromptTextResponse | { "type": "get_http_request_by_id_request" } & GetHttpRequestByIdRequest | { "type": "get_http_request_by_id_response" } & GetHttpRequestByIdResponse | { "type": "find_http_responses_request" } & FindHttpResponsesRequest | { "type": "find_http_responses_response" } & FindHttpResponsesResponse | { "type": "empty_response" } & EmptyPayload | { "type": "error_response" } & ErrorResponse;
export type InternalEventPayload = { "type": "boot_request" } & BootRequest | { "type": "boot_response" } & BootResponse | { "type": "reload_request" } & EmptyPayload | { "type": "reload_response" } & EmptyPayload | { "type": "terminate_request" } | { "type": "terminate_response" } | { "type": "import_request" } & ImportRequest | { "type": "import_response" } & ImportResponse | { "type": "filter_request" } & FilterRequest | { "type": "filter_response" } & FilterResponse | { "type": "export_http_request_request" } & ExportHttpRequestRequest | { "type": "export_http_request_response" } & ExportHttpRequestResponse | { "type": "send_http_request_request" } & SendHttpRequestRequest | { "type": "send_http_request_response" } & SendHttpRequestResponse | { "type": "list_cookie_names_request" } & ListCookieNamesRequest | { "type": "list_cookie_names_response" } & ListCookieNamesResponse | { "type": "get_cookie_value_request" } & GetCookieValueRequest | { "type": "get_cookie_value_response" } & GetCookieValueResponse | { "type": "get_http_request_actions_request" } & EmptyPayload | { "type": "get_http_request_actions_response" } & GetHttpRequestActionsResponse | { "type": "call_http_request_action_request" } & CallHttpRequestActionRequest | { "type": "get_template_functions_request" } | { "type": "get_template_functions_response" } & GetTemplateFunctionsResponse | { "type": "call_template_function_request" } & CallTemplateFunctionRequest | { "type": "call_template_function_response" } & CallTemplateFunctionResponse | { "type": "get_http_authentication_summary_request" } & EmptyPayload | { "type": "get_http_authentication_summary_response" } & GetHttpAuthenticationSummaryResponse | { "type": "get_http_authentication_config_request" } & GetHttpAuthenticationConfigRequest | { "type": "get_http_authentication_config_response" } & GetHttpAuthenticationConfigResponse | { "type": "call_http_authentication_request" } & CallHttpAuthenticationRequest | { "type": "call_http_authentication_response" } & CallHttpAuthenticationResponse | { "type": "call_http_authentication_action_request" } & CallHttpAuthenticationActionRequest | { "type": "call_http_authentication_action_response" } & EmptyPayload | { "type": "copy_text_request" } & CopyTextRequest | { "type": "copy_text_response" } & EmptyPayload | { "type": "render_http_request_request" } & RenderHttpRequestRequest | { "type": "render_http_request_response" } & RenderHttpRequestResponse | { "type": "get_key_value_request" } & GetKeyValueRequest | { "type": "get_key_value_response" } & GetKeyValueResponse | { "type": "set_key_value_request" } & SetKeyValueRequest | { "type": "set_key_value_response" } & SetKeyValueResponse | { "type": "delete_key_value_request" } & DeleteKeyValueRequest | { "type": "delete_key_value_response" } & DeleteKeyValueResponse | { "type": "open_window_request" } & OpenWindowRequest | { "type": "window_navigate_event" } & WindowNavigateEvent | { "type": "window_close_event" } | { "type": "close_window_request" } & CloseWindowRequest | { "type": "template_render_request" } & TemplateRenderRequest | { "type": "template_render_response" } & TemplateRenderResponse | { "type": "show_toast_request" } & ShowToastRequest | { "type": "show_toast_response" } & EmptyPayload | { "type": "prompt_text_request" } & PromptTextRequest | { "type": "prompt_text_response" } & PromptTextResponse | { "type": "get_http_request_by_id_request" } & GetHttpRequestByIdRequest | { "type": "get_http_request_by_id_response" } & GetHttpRequestByIdResponse | { "type": "find_http_responses_request" } & FindHttpResponsesRequest | { "type": "find_http_responses_response" } & FindHttpResponsesResponse | { "type": "empty_response" } & EmptyPayload | { "type": "error_response" } & ErrorResponse;

export type JsonPrimitive = string | number | boolean | null;

export type ListCookieNamesRequest = {};

export type ListCookieNamesResponse = { names: Array<string>, };

export type OpenWindowRequest = { url: string,
/**
* Label for the window. If not provided, a random one will be generated.
*/
label: string, title?: string, size?: WindowSize, dataDirKey?: string, };

export type PluginWindowContext = { "type": "none" } | { "type": "label", label: string, workspace_id: string | null, };

export type PromptTextRequest = { id: string, title: string, label: string, description?: string, defaultValue?: string, placeholder?: string,
/**
* Text to add to the confirmation button

@@ -393,14 +431,17 @@ export type TemplateFunction = { name: string, description?: string,

* Also support alternative names. This is useful for not breaking existing
* tags when changing the `name` property
*/
aliases?: Array<string>, args: Array<FormInput>, };
aliases?: Array<string>, args: Array<TemplateFunctionArg>, };

/**
* Similar to FormInput, but contains
*/
export type TemplateFunctionArg = FormInput;

export type TemplateRenderRequest = { data: JsonValue, purpose: RenderPurpose, };

export type TemplateRenderResponse = { data: JsonValue, };

export type WindowContext = { "type": "none" } | { "type": "label", label: string, };

export type WindowNavigateEvent = { url: string, };

export type WindowSize = { width: number, height: number, };
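The `description` field added to the form inputs above is documented as a tooltip hint. As a rough illustration only (this snippet is not part of the diff; the plugin entry-point shape and the `type` discriminant value are assumptions), a template function argument might carry it like this:

```ts
import type { PluginDefinition } from '@yaakapp/api';

export const plugin: PluginDefinition = {
  templateFunctions: [
    {
      name: 'upper',
      description: 'Convert text to upper case',
      args: [
        {
          type: 'text',            // assumed discriminant for a text input
          name: 'value',
          label: 'Value',
          defaultValue: '',
          // New in this release: rendered as a tooltip next to the input
          description: 'Text to convert to upper case',
        },
      ],
      async onRender(_ctx, args) {
        // args.values mirrors the payload.args.values shape used by the runtime
        return String(args.values.value ?? '').toUpperCase();
      },
    },
  ],
};
```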
@@ -1,14 +1,12 @@

// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

export type Environment = { model: "environment", id: string, workspaceId: string, environmentId: string | null, createdAt: string, updatedAt: string, name: string, variables: Array<EnvironmentVariable>, };
export type Environment = { model: "environment", id: string, workspaceId: string, createdAt: string, updatedAt: string, name: string, public: boolean, base: boolean, variables: Array<EnvironmentVariable>, };

export type EnvironmentVariable = { enabled?: boolean, name: string, value: string, id?: string, };

export type Folder = { model: "folder", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, name: string, description: string, sortPriority: number, };
export type Folder = { model: "folder", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authentication: Record<string, any>, authenticationType: string | null, description: string, headers: Array<HttpRequestHeader>, name: string, sortPriority: number, };

export type GrpcMetadataEntry = { enabled?: boolean, name: string, value: string, id?: string, };

export type GrpcRequest = { model: "grpc_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authenticationType: string | null, authentication: Record<string, any>, description: string, message: string, metadata: Array<GrpcMetadataEntry>, method: string | null, name: string, service: string | null, sortPriority: number, url: string, };
export type GrpcRequest = { model: "grpc_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authenticationType: string | null, authentication: Record<string, any>, description: string, message: string, metadata: Array<HttpRequestHeader>, method: string | null, name: string, service: string | null, sortPriority: number, url: string, };

export type HttpRequest = { model: "http_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authentication: Record<string, any>, authenticationType: string | null, body: Record<string, any>, bodyType: string | null, description: string, headers: Array<HttpRequestHeader>, method: string, name: string, sortPriority: number, url: string, urlParameters: Array<HttpUrlParameter>, };

@@ -24,4 +22,4 @@ export type HttpUrlParameter = { enabled?: boolean, name: string, value: string,

export type WebsocketRequest = { model: "websocket_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authentication: Record<string, any>, authenticationType: string | null, description: string, headers: Array<HttpRequestHeader>, message: string, name: string, sortPriority: number, url: string, urlParameters: Array<HttpUrlParameter>, };

export type Workspace = { model: "workspace", id: string, createdAt: string, updatedAt: string, name: string, description: string, settingValidateCertificates: boolean, settingFollowRedirects: boolean, settingRequestTimeout: number, };
export type Workspace = { model: "workspace", id: string, createdAt: string, updatedAt: string, authentication: Record<string, any>, authenticationType: string | null, description: string, headers: Array<HttpRequestHeader>, name: string, encryptionKeyChallenge: string | null, settingValidateCertificates: boolean, settingFollowRedirects: boolean, settingRequestTimeout: number, };
@@ -1,3 +1,3 @@

// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

export type JsonValue = number | string | Array<JsonValue> | { [key in string]?: JsonValue };
export type JsonValue = number | string | boolean | Array<JsonValue> | { [key in string]?: JsonValue } | null;
@@ -3,3 +3,7 @@ export type * from './themes';

export * from './bindings/gen_models';
export * from './bindings/gen_events';

// Some extras for utility

export type { PartialImportResources } from './plugins/ImporterPlugin';
@@ -1,8 +1,11 @@

import type {
FindHttpResponsesRequest,
FindHttpResponsesResponse,
GetCookieValueRequest,
GetCookieValueResponse,
GetHttpRequestByIdRequest,
GetHttpRequestByIdResponse,
ListCookieNamesResponse,
OpenWindowRequest,
PromptTextRequest,
PromptTextResponse,

@@ -34,10 +37,14 @@ export interface Context {

openUrl(
args: OpenWindowRequest & {
onNavigate?: (args: { url: string }) => void;
onClose: () => void;
onClose?: () => void;
},
): Promise<{ close: () => void }>;
};
cookies: {
listNames(): Promise<ListCookieNamesResponse['names']>;
getValue(args: GetCookieValueRequest): Promise<GetCookieValueResponse['value']>;
};
httpRequest: {
send(args: SendHttpRequestRequest): Promise<SendHttpRequestResponse['httpResponse']>;
getById(args: GetHttpRequestByIdRequest): Promise<GetHttpRequestByIdResponse['httpRequest']>;
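For context, the new `cookies` methods on `Context` can be exercised from a plugin roughly like this. This is an illustrative sketch only, not part of the diff; the surrounding plugin wiring and the cookie name are assumptions:

```ts
import type { Context } from '@yaakapp/api';

// Look up a cookie by name using the new cookie APIs added above.
// getValue resolves to string | null per GetCookieValueResponse.
export async function readSessionCookie(ctx: Context): Promise<string | null> {
  const names = await ctx.cookies.listNames();
  if (!names.includes('session_id')) return null; // 'session_id' is just an example name
  return await ctx.cookies.getValue({ name: 'session_id' });
}
```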
@@ -1,6 +1,6 @@

import type { Context } from './Context';

export type FilterPluginResponse = { filtered: string };
type FilterPluginResponse = { filtered: string };

export type FilterPlugin = {
name: string;
@@ -1,15 +1,21 @@

import { Environment, Folder, GrpcRequest, HttpRequest, Workspace } from '../bindings/gen_models';
import type { AtLeast } from '../helpers';
import { ImportResources } from '../bindings/gen_events';
import { AtLeast } from '../helpers';
import type { Context } from './Context';

type ImportPluginResponse = null | {
resources: {
workspaces: AtLeast<Workspace, 'name' | 'id' | 'model'>[];
environments: AtLeast<Environment, 'name' | 'id' | 'model' | 'workspaceId'>[];
folders: AtLeast<Folder, 'name' | 'id' | 'model' | 'workspaceId'>[];
httpRequests: AtLeast<HttpRequest, 'name' | 'id' | 'model' | 'workspaceId'>[];
grpcRequests: AtLeast<GrpcRequest, 'name' | 'id' | 'model' | 'workspaceId'>[];
};
type RootFields = 'name' | 'id' | 'model';
type CommonFields = RootFields | 'workspaceId';

export type PartialImportResources = {
workspaces: Array<AtLeast<ImportResources['workspaces'][0], RootFields>>;
environments: Array<AtLeast<ImportResources['environments'][0], CommonFields>>;
folders: Array<AtLeast<ImportResources['folders'][0], CommonFields>>;
httpRequests: Array<AtLeast<ImportResources['httpRequests'][0], CommonFields>>;
grpcRequests: Array<AtLeast<ImportResources['grpcRequests'][0], CommonFields>>;
websocketRequests: Array<AtLeast<ImportResources['websocketRequests'][0], CommonFields>>;
};

export type ImportPluginResponse = null | {
resources: PartialImportResources;
};

export type ImporterPlugin = {
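Illustrative sketch only (not part of the diff): a value built against the new `PartialImportResources` shape, where each resource needs at least its `model`, `id`, `name`, and, for non-workspace resources, `workspaceId`. The IDs and names below are hypothetical:

```ts
import type { PartialImportResources } from '@yaakapp/api';

const resources: PartialImportResources = {
  workspaces: [{ model: 'workspace', id: 'wk_demo', name: 'Imported Workspace' }],
  environments: [],
  folders: [],
  httpRequests: [
    { model: 'http_request', id: 'rq_demo', workspaceId: 'wk_demo', name: 'List Users' },
  ],
  grpcRequests: [],
  websocketRequests: [],
};
```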
@@ -15,7 +15,6 @@ export class PluginHandle {

bootRequest: this.bootRequest,
};
this.#instance = new PluginInstance(workerData, pluginToAppEvents);
console.log('Created plugin worker for ', this.bootRequest.dir);
}

sendToWorker(event: InternalEvent) {
@@ -1,9 +1,10 @@

import type {
import {
BootRequest,
Context,
DeleteKeyValueResponse,
FindHttpResponsesResponse,
FormInput,
GetCookieValueRequest,
GetCookieValueResponse,
GetHttpRequestByIdResponse,
GetKeyValueResponse,
HttpAuthenticationAction,

@@ -11,20 +12,20 @@ import type {

InternalEvent,
InternalEventPayload,
JsonPrimitive,
PluginDefinition,
ListCookieNamesResponse,
PluginWindowContext,
PromptTextResponse,
RenderHttpRequestResponse,
SendHttpRequestResponse,
TemplateFunction,
TemplateFunctionArg,
TemplateRenderResponse,
WindowContext,
} from '@yaakapp/api';
} from '@yaakapp-internal/plugins';
import { Context, PluginDefinition } from '@yaakapp/api';
import console from 'node:console';
import { readFileSync, type Stats, statSync, watch } from 'node:fs';
import path from 'node:path';
// import util from 'node:util';
import { EventChannel } from './EventChannel';
// import { interceptStdout } from './interceptStdout';
import { migrateTemplateFunctionSelectOptions } from './migrations';

export interface PluginWorkerData {

@@ -45,12 +46,12 @@ export class PluginInstance {

this.#appToPluginEvents = new EventChannel();

// Forward incoming events to onMessage()
this.#appToPluginEvents.listen(async event => {
this.#appToPluginEvents.listen(async (event) => {
await this.#onMessage(event);
})
});

// Reload plugin if the JS or package.json changes
const windowContextNone: WindowContext = { type: 'none' };
const windowContextNone: PluginWindowContext = { type: 'none' };
const fileChangeCallback = async () => {
this.#importModule();
return this.#sendPayload(windowContextNone, { type: 'reload_response' }, null);

@@ -261,10 +262,10 @@ export class PluginInstance {

payload.type === 'call_template_function_request' &&
Array.isArray(this.#mod?.templateFunctions)
) {
const action = this.#mod.templateFunctions.find((a) => a.name === payload.name);
if (typeof action?.onRender === 'function') {
applyFormInputDefaults(action.args, payload.args.values);
const result = await action.onRender(ctx, payload.args);
const fn = this.#mod.templateFunctions.find((a) => a.name === payload.name);
if (typeof fn?.onRender === 'function') {
applyFormInputDefaults(fn.args, payload.args.values);
const result = await fn.onRender(ctx, payload.args);
this.#sendPayload(
windowContext,
{

@@ -317,7 +318,7 @@ export class PluginInstance {

}

#buildEventToSend(
windowContext: WindowContext,
windowContext: PluginWindowContext,
payload: InternalEventPayload,
replyId: string | null = null,
): InternalEvent {

@@ -332,7 +333,7 @@ export class PluginInstance {

}

#sendPayload(
windowContext: WindowContext,
windowContext: PluginWindowContext,
payload: InternalEventPayload,
replyId: string | null,
): string {

@@ -348,12 +349,12 @@ export class PluginInstance {

this.#pluginToAppEvents.emit(event);
}

#sendEmpty(windowContext: WindowContext, replyId: string | null = null): string {
#sendEmpty(windowContext: PluginWindowContext, replyId: string | null = null): string {
return this.#sendPayload(windowContext, { type: 'empty_response' }, replyId);
}

#sendAndWaitForReply<T extends Omit<InternalEventPayload, 'type'>>(
windowContext: WindowContext,
windowContext: PluginWindowContext,
payload: InternalEventPayload,
): Promise<T> {
// 1. Build event to send

@@ -379,7 +380,7 @@ export class PluginInstance {

}

#sendAndListenForEvents(
windowContext: WindowContext,
windowContext: PluginWindowContext,
payload: InternalEventPayload,
onEvent: (event: InternalEventPayload) => void,
): void {

@@ -495,6 +496,27 @@ export class PluginInstance {

return httpRequest;
},
},
cookies: {
getValue: async (args: GetCookieValueRequest) => {
const payload = {
type: 'get_cookie_value_request',
...args,
} as const;
const { value } = await this.#sendAndWaitForReply<GetCookieValueResponse>(
event.windowContext,
payload,
);
return value;
},
listNames: async () => {
const payload = { type: 'list_cookie_names_request' } as const;
const { names } = await this.#sendAndWaitForReply<ListCookieNamesResponse>(
event.windowContext,
payload,
);
return names;
},
},
templates: {
/**
* Invoke Yaak's template engine to render a value. If the value is a nested type

@@ -551,7 +573,7 @@ function genId(len = 5): string {

/** Recursively apply form input defaults to a set of values */
function applyFormInputDefaults(
inputs: FormInput[],
inputs: TemplateFunctionArg[],
values: { [p: string]: JsonPrimitive | undefined },
) {
for (const input of inputs) {

@@ -580,18 +602,3 @@ function watchFile(filepath: string, cb: (filepath: string) => void) {

watchedFiles[filepath] = stat;
});
}

// function prefixStdout(s: string) {
//   if (!s.includes('%s')) {
//     throw new Error('Console prefix must contain a "%s" replacer');
//   }
//   interceptStdout((text: string) => {
//     const lines = text.split(/\n/);
//     let newText = '';
//     for (let i = 0; i < lines.length; i++) {
//       if (lines[i] == '') continue;
//       newText += util.format(s, lines[i]) + '\n';
//     }
//     return newText.trimEnd();
//   });
// }
@@ -1,4 +1,4 @@

edition = "2018"
edition = "2024"

# Widths
chain_width = 100
@@ -9,12 +9,14 @@ const NODE_VERSION = 'v22.9.0';

// `${process.platform}_${process.arch}`
const MAC_ARM = 'darwin_arm64';
const MAC_X64 = 'darwin_x64';
const LNX_ARM = 'linux_arm64';
const LNX_X64 = 'linux_x64';
const WIN_X64 = 'win32_x64';

const URL_MAP = {
[MAC_ARM]: `https://nodejs.org/download/release/${NODE_VERSION}/node-${NODE_VERSION}-darwin-arm64.tar.gz`,
[MAC_X64]: `https://nodejs.org/download/release/${NODE_VERSION}/node-${NODE_VERSION}-darwin-x64.tar.gz`,
[LNX_ARM]: `https://nodejs.org/download/release/${NODE_VERSION}/node-${NODE_VERSION}-linux-arm64.tar.gz`,
[LNX_X64]: `https://nodejs.org/download/release/${NODE_VERSION}/node-${NODE_VERSION}-linux-x64.tar.gz`,
[WIN_X64]: `https://nodejs.org/download/release/${NODE_VERSION}/node-${NODE_VERSION}-win-x64.zip`,
};

@@ -22,15 +24,17 @@ const URL_MAP = {

const SRC_BIN_MAP = {
[MAC_ARM]: `node-${NODE_VERSION}-darwin-arm64/bin/node`,
[MAC_X64]: `node-${NODE_VERSION}-darwin-x64/bin/node`,
[LNX_ARM]: `node-${NODE_VERSION}-linux-arm64/bin/node`,
[LNX_X64]: `node-${NODE_VERSION}-linux-x64/bin/node`,
[WIN_X64]: `node-${NODE_VERSION}-win-x64/node.exe`,
};

const DST_BIN_MAP = {
darwin_arm64: 'yaaknode-aarch64-apple-darwin',
darwin_x64: 'yaaknode-x86_64-apple-darwin',
linux_x64: 'yaaknode-x86_64-unknown-linux-gnu',
win32_x64: 'yaaknode-x86_64-pc-windows-msvc.exe',
[MAC_ARM]: 'yaaknode-aarch64-apple-darwin',
[MAC_X64]: 'yaaknode-x86_64-apple-darwin',
[LNX_ARM]: 'yaaknode-aarch64-unknown-linux-gnu',
[LNX_X64]: 'yaaknode-x86_64-unknown-linux-gnu',
[WIN_X64]: 'yaaknode-x86_64-pc-windows-msvc.exe',
};

const key = `${process.platform}_${process.env.YAAK_TARGET_ARCH ?? process.arch}`;
@@ -9,12 +9,14 @@ const VERSION = '28.3';

// `${process.platform}_${process.arch}`
const MAC_ARM = 'darwin_arm64';
const MAC_X64 = 'darwin_x64';
const LNX_ARM = 'linux_arm64';
const LNX_X64 = 'linux_x64';
const WIN_X64 = 'win32_x64';

const URL_MAP = {
[MAC_ARM]: `https://github.com/protocolbuffers/protobuf/releases/download/v${VERSION}/protoc-${VERSION}-osx-aarch_64.zip`,
[MAC_X64]: `https://github.com/protocolbuffers/protobuf/releases/download/v${VERSION}/protoc-${VERSION}-osx-x86_64.zip`,
[LNX_ARM]: `https://github.com/protocolbuffers/protobuf/releases/download/v${VERSION}/protoc-${VERSION}-linux-aarch_64.zip`,
[LNX_X64]: `https://github.com/protocolbuffers/protobuf/releases/download/v${VERSION}/protoc-${VERSION}-linux-x86_64.zip`,
[WIN_X64]: `https://github.com/protocolbuffers/protobuf/releases/download/v${VERSION}/protoc-${VERSION}-win64.zip`,
};

@@ -22,6 +24,7 @@ const URL_MAP = {

const SRC_BIN_MAP = {
[MAC_ARM]: 'bin/protoc',
[MAC_X64]: 'bin/protoc',
[LNX_ARM]: 'bin/protoc',
[LNX_X64]: 'bin/protoc',
[WIN_X64]: 'bin/protoc.exe',
};

@@ -29,6 +32,7 @@ const SRC_BIN_MAP = {

const DST_BIN_MAP = {
[MAC_ARM]: 'yaakprotoc-aarch64-apple-darwin',
[MAC_X64]: 'yaakprotoc-x86_64-apple-darwin',
[LNX_ARM]: 'yaakprotoc-aarch64-unknown-linux-gnu',
[LNX_X64]: 'yaakprotoc-x86_64-unknown-linux-gnu',
[WIN_X64]: 'yaakprotoc-x86_64-pc-windows-msvc.exe',
};
src-tauri/.gitignore (vendored, 5 changes)

@@ -4,3 +4,8 @@ target/

vendored/*
!vendored/plugins

gen/*

**/permissions/autogenerated
**/permissions/schemas
src-tauri/Cargo.lock (generated, 1436 changes): file diff suppressed because it is too large.
@@ -1,9 +1,11 @@

[workspace]
members = [
"yaak-crypto",
"yaak-git",
"yaak-grpc",
"yaak-http",
"yaak-license",
"yaak-mac-window",
"yaak-models",
"yaak-plugins",
"yaak-sse",

@@ -15,7 +17,7 @@ members = [

[package]
name = "yaak-app"
version = "0.0.0"
edition = "2021"
edition = "2024"
authors = ["Gregory Schier"]
publish = false

@@ -31,51 +33,49 @@ strip = true # Automatically strip symbols from the binary.

cargo-clippy = []

[build-dependencies]
tauri-build = { version = "2.0.5", features = [] }

[target.'cfg(target_os = "macos")'.dependencies]
objc = "0.2.7"
cocoa = "0.26.0"
tauri-build = { version = "2.1.1", features = [] }

[target.'cfg(target_os = "linux")'.dependencies]
openssl-sys = { version = "0.9.105", features = ["vendored"] } # For Ubuntu installation to work

[dependencies]
chrono = { version = "0.4.31", features = ["serde"] }
cookie = "0.18.1"
encoding_rs = "0.8.35"
eventsource-client = { git = "https://github.com/yaakapp/rust-eventsource-client", version = "0.14.0" }
hex_color = "3.0.0"
http = { version = "1.2.0", default-features = false }
log = "0.4.21"
log = "0.4.27"
md5 = "0.7.0"
mime_guess = "2.0.5"
rand = "0.9.0"
regex = "1.10.2"
reqwest = { workspace = true, features = ["multipart", "cookies", "gzip", "brotli", "deflate", "json", "rustls-tls-manual-roots-no-provider"] }
reqwest_cookie_store = "0.8.0"
rustls = { version = "0.23.22", default-features = false, features = ["custom-provider", "ring"] }
rustls-platform-verifier = "0.5.0"
rustls = { version = "0.23.25", default-features = false, features = ["custom-provider", "ring"] }
rustls-platform-verifier = "0.5.1"
serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true, features = ["raw_value"] }
tauri = { workspace = true, features = ["devtools", "protocol-asset"] }
tauri-plugin-clipboard-manager = "2.2.1"
tauri-plugin-clipboard-manager = "2.2.2"
tauri-plugin-dialog = "2.2.0"
tauri-plugin-fs = "2.2.0"
tauri-plugin-log = { version = "2.2.1", features = ["colored"] }
tauri-plugin-opener = "2.2.5"
tauri-plugin-os = "2.2.0"
tauri-plugin-log = { version = "2.3.1", features = ["colored"] }
tauri-plugin-opener = "2.2.6"
tauri-plugin-os = "2.2.1"
tauri-plugin-shell = { workspace = true }
tauri-plugin-single-instance = "2.2.1"
tauri-plugin-updater = "2.5.0"
tauri-plugin-single-instance = "2.2.2"
tauri-plugin-updater = "2.6.1"
tauri-plugin-window-state = "2.2.1"
tokio = { version = "1.43.0", features = ["sync"] }
tokio-stream = "0.1.17"
thiserror = { workspace = true }
tokio = { workspace = true, features = ["sync"] }
tokio-stream = "0.1.17"
uuid = "1.12.1"
yaak-common = { workspace = true }
yaak-crypto = { workspace = true }
yaak-git = { path = "yaak-git" }
yaak-grpc = { path = "yaak-grpc" }
yaak-http = { workspace = true }
yaak-license = { path = "yaak-license" }
yaak-mac-window = { path = "yaak-mac-window" }
yaak-models = { workspace = true }
yaak-plugins = { workspace = true }
yaak-sse = { workspace = true }

@@ -84,17 +84,20 @@ yaak-templates = { workspace = true }

yaak-ws = { path = "yaak-ws" }

[workspace.dependencies]
reqwest = "0.12.12"
serde = "1.0.215"
serde_json = "1.0.132"
tauri = "2.2.5"
tauri-plugin = "2.0.4"
tauri-plugin-shell = "2.2.0"
thiserror = "2.0.3"
ts-rs = "10.0.0"
reqwest = "0.12.15"
serde = "1.0.219"
serde_json = "1.0.140"
tauri = "2.4.1"
tauri-plugin = "2.1.1"
tauri-plugin-shell = "2.2.1"
tokio = "1.44.2"
thiserror = "2.0.12"
ts-rs = "10.1.0"
yaak-common = { path = "yaak-common" }
yaak-http = { path = "yaak-http" }
yaak-models = { path = "yaak-models" }
yaak-plugins = { path = "yaak-plugins" }
yaak-sync = { path = "yaak-sync" }
yaak-sse = { path = "yaak-sse" }
yaak-sync = { path = "yaak-sync" }
yaak-templates = { path = "yaak-templates" }
yaak-crypto = { path = "yaak-crypto" }
@@ -7,6 +7,7 @@

"*"
],
"permissions": [
"core:app:allow-identifier",
"core:event:allow-emit",
"core:event:allow-listen",
"core:event:allow-unlisten",

@@ -50,8 +51,11 @@

"opener:allow-open-url",
"opener:allow-reveal-item-in-dir",
"shell:allow-open",
"yaak-license:default",
"yaak-crypto:default",
"yaak-git:default",
"yaak-license:default",
"yaak-mac-window:default",
"yaak-models:default",
"yaak-sync:default",
"yaak-ws:default"
]
src-tauri/gen/schemas/acl-manifests.json (generated, 1 change): file diff suppressed because one or more lines are too long.

src-tauri/gen/schemas/capabilities.json (generated, 1 change)

@@ -1 +0,0 @@

{"main":{"identifier":"main","description":"Main permissions","local":true,"windows":["*"],"permissions":["core:event:allow-emit","core:event:allow-listen","core:event:allow-unlisten","os:allow-os-type","clipboard-manager:allow-clear","clipboard-manager:allow-write-text","clipboard-manager:allow-read-text","dialog:allow-open","dialog:allow-save","fs:allow-read-dir","fs:allow-read-file","fs:allow-read-text-file",{"identifier":"fs:scope","allow":[{"path":"$APPDATA"},{"path":"$APPDATA/**"}]},"clipboard-manager:allow-read-text","clipboard-manager:allow-write-text","core:webview:allow-set-webview-zoom","core:window:allow-close","core:window:allow-internal-toggle-maximize","core:window:allow-is-fullscreen","core:window:allow-is-maximized","core:window:allow-maximize","core:window:allow-minimize","core:window:allow-set-decorations","core:window:allow-set-title","core:window:allow-show","core:window:allow-start-dragging","core:window:allow-theme","core:window:allow-unmaximize","opener:allow-default-urls","opener:allow-open-path","opener:allow-open-url","opener:allow-reveal-item-in-dir","shell:allow-open","yaak-license:default","yaak-git:default","yaak-sync:default","yaak-ws:default"]}}
src-tauri/gen/schemas/desktop-schema.json (generated, 5953 changes): file diff suppressed because it is too large.
src-tauri/gen/schemas/linux-schema.json (generated, 5643 changes): file diff suppressed because it is too large.
src-tauri/gen/schemas/macOS-schema.json (generated, 5953 changes): file diff suppressed because it is too large.
src-tauri/gen/schemas/windows-schema.json (generated, 5643 changes): file diff suppressed because it is too large.
@@ -0,0 +1,2 @@

ALTER TABLE settings
ADD COLUMN hide_window_controls BOOLEAN DEFAULT FALSE NOT NULL;
src-tauri/migrations/20250326193143_key-value-id.sql (new file, 43 lines)

@@ -0,0 +1,43 @@

-- 1. Create the new table with `id` as the primary key
CREATE TABLE key_values_new
(
id TEXT PRIMARY KEY,
model TEXT DEFAULT 'key_value' NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
deleted_at DATETIME,
namespace TEXT NOT NULL,
key TEXT NOT NULL,
value TEXT NOT NULL
);

-- 2. Copy data from the old table
INSERT INTO key_values_new (id, model, created_at, updated_at, deleted_at, namespace, key, value)
SELECT (
-- This is the best way to generate a random string in SQLite, apparently
'kv_' || SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1) ||
SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1) ||
SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1) ||
SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1) ||
SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1) ||
SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1) ||
SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1) ||
SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1) ||
SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1) ||
SUBSTR('abcdefghijkmnopqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23457789', (ABS(RANDOM()) % 57) + 1, 1)
) AS id,
model,
created_at,
updated_at,
deleted_at,
namespace,
key,
value
FROM key_values;

-- 3. Drop the old table
DROP TABLE key_values;

-- 4. Rename the new table
ALTER TABLE key_values_new
RENAME TO key_values;
src-tauri/migrations/20250401122407_encrypted-key.sql (new file, 1 line)

@@ -0,0 +1 @@

ALTER TABLE workspace_metas ADD COLUMN encryption_key TEXT NULL DEFAULT NULL;

@@ -0,0 +1 @@

ALTER TABLE workspaces ADD COLUMN encryption_key_challenge TEXT NULL;
src-tauri/migrations/20250424152740_remove-fks.sql (new file, 245 lines)

@@ -0,0 +1,245 @@

-- NOTE: SQLite does not support dropping foreign keys, so we need to create new
-- tables and copy data instead. To prevent cascade deletes from wrecking stuff,
-- we start with the leaf tables and finish with the parent tables (eg. folder).

----------------------------
-- Remove http request FK --
----------------------------

CREATE TABLE http_requests_dg_tmp
(
id TEXT NOT NULL
PRIMARY KEY,
model TEXT DEFAULT 'http_request' NOT NULL,
workspace_id TEXT NOT NULL
REFERENCES workspaces
ON DELETE CASCADE,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
deleted_at DATETIME,
name TEXT NOT NULL,
url TEXT NOT NULL,
method TEXT NOT NULL,
headers TEXT NOT NULL,
body_type TEXT,
sort_priority REAL DEFAULT 0 NOT NULL,
authentication TEXT DEFAULT '{}' NOT NULL,
authentication_type TEXT,
folder_id TEXT,
body TEXT DEFAULT '{}' NOT NULL,
url_parameters TEXT DEFAULT '[]' NOT NULL,
description TEXT DEFAULT '' NOT NULL
);

INSERT INTO http_requests_dg_tmp(id, model, workspace_id, created_at, updated_at, deleted_at, name, url, method,
headers, body_type, sort_priority, authentication, authentication_type, folder_id,
body, url_parameters, description)
SELECT id,
model,
workspace_id,
created_at,
updated_at,
deleted_at,
name,
url,
method,
headers,
body_type,
sort_priority,
authentication,
authentication_type,
folder_id,
body,
url_parameters,
description
FROM http_requests;

DROP TABLE http_requests;

ALTER TABLE http_requests_dg_tmp
RENAME TO http_requests;

----------------------------
-- Remove grpc request FK --
----------------------------

CREATE TABLE grpc_requests_dg_tmp
(
id TEXT NOT NULL
PRIMARY KEY,
model TEXT DEFAULT 'grpc_request' NOT NULL,
workspace_id TEXT NOT NULL
REFERENCES workspaces
ON DELETE CASCADE,
folder_id TEXT,
created_at DATETIME DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) NOT NULL,
updated_at DATETIME DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) NOT NULL,
name TEXT NOT NULL,
sort_priority REAL NOT NULL,
url TEXT NOT NULL,
service TEXT,
method TEXT,
message TEXT NOT NULL,
authentication TEXT DEFAULT '{}' NOT NULL,
authentication_type TEXT,
metadata TEXT DEFAULT '[]' NOT NULL,
description TEXT DEFAULT '' NOT NULL
);

INSERT INTO grpc_requests_dg_tmp(id, model, workspace_id, folder_id, created_at, updated_at, name, sort_priority, url,
service, method, message, authentication, authentication_type, metadata, description)
SELECT id,
model,
workspace_id,
folder_id,
created_at,
updated_at,
name,
sort_priority,
url,
service,
method,
message,
authentication,
authentication_type,
metadata,
description
FROM grpc_requests;

DROP TABLE grpc_requests;

ALTER TABLE grpc_requests_dg_tmp
RENAME TO grpc_requests;

---------------------------------
-- Remove websocket request FK --
---------------------------------

CREATE TABLE websocket_requests_dg_tmp
(
id TEXT NOT NULL
PRIMARY KEY,
model TEXT DEFAULT 'websocket_request' NOT NULL,
workspace_id TEXT NOT NULL
REFERENCES workspaces
ON DELETE CASCADE,
folder_id TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
deleted_at DATETIME,
authentication TEXT DEFAULT '{}' NOT NULL,
authentication_type TEXT,
description TEXT NOT NULL,
name TEXT NOT NULL,
url TEXT NOT NULL,
headers TEXT NOT NULL,
message TEXT NOT NULL,
sort_priority REAL NOT NULL,
url_parameters TEXT DEFAULT '[]' NOT NULL
);

INSERT INTO websocket_requests_dg_tmp(id, model, workspace_id, folder_id, created_at, updated_at, deleted_at,
authentication, authentication_type, description, name, url, headers, message,
sort_priority, url_parameters)
SELECT id,
model,
workspace_id,
folder_id,
created_at,
updated_at,
deleted_at,
authentication,
authentication_type,
description,
name,
url,
headers,
message,
sort_priority,
url_parameters
FROM websocket_requests;

DROP TABLE websocket_requests;

ALTER TABLE websocket_requests_dg_tmp
RENAME TO websocket_requests;

PRAGMA foreign_keys = ON;

---------------------------
-- Remove environment FK --
---------------------------

CREATE TABLE environments_dg_tmp
(
id TEXT NOT NULL
PRIMARY KEY,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
deleted_at DATETIME,
workspace_id TEXT NOT NULL
REFERENCES workspaces
ON DELETE CASCADE,
name TEXT NOT NULL,
variables DEFAULT '[]' NOT NULL,
model TEXT DEFAULT 'environment',
environment_id TEXT
);

INSERT INTO environments_dg_tmp(id, created_at, updated_at, deleted_at, workspace_id, name, variables, model,
environment_id)
SELECT id,
created_at,
updated_at,
deleted_at,
workspace_id,
name,
variables,
model,
environment_id
FROM environments;

DROP TABLE environments;

ALTER TABLE environments_dg_tmp
RENAME TO environments;

----------------------
-- Remove folder FK --
----------------------

CREATE TABLE folders_dg_tmp
(
id TEXT NOT NULL
PRIMARY KEY,
model TEXT DEFAULT 'folder' NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL,
deleted_at DATETIME,
workspace_id TEXT NOT NULL
REFERENCES workspaces
ON DELETE CASCADE,
folder_id TEXT,
name TEXT NOT NULL,
sort_priority REAL DEFAULT 0 NOT NULL,
description TEXT DEFAULT '' NOT NULL
);

INSERT INTO folders_dg_tmp(id, model, created_at, updated_at, deleted_at, workspace_id, folder_id, name, sort_priority,
description)
SELECT id,
model,
created_at,
updated_at,
deleted_at,
workspace_id,
folder_id,
name,
sort_priority,
description
FROM folders;

DROP TABLE folders;

ALTER TABLE folders_dg_tmp
RENAME TO folders;
@@ -0,0 +1,11 @@

-- There used to be sync code that skipped over environments because we didn't
-- want to sync potentially insecure data. With encryption, it is now possible
-- to sync environments securely. However, there were already sync states in the
-- DB that marked environments as "Synced". Running the sync code on these envs
-- would mark them as deleted by FS (exist in SyncState but not on FS).
--
-- To undo this mess, we have this migration to delete all environment-related
-- sync states so we can sync from a clean slate.
DELETE
FROM sync_states
WHERE model_id LIKE 'ev_%';
src-tauri/migrations/20250508161145_public-environments.sql (new file, 20 lines)

@@ -0,0 +1,20 @@

-- Add a public column to represent whether an environment can be shared or exported
ALTER TABLE environments
ADD COLUMN public BOOLEAN DEFAULT FALSE;

-- Add a base column to represent whether an environment is a base or sub environment. We used to
-- do this with environment_id, but we need a more flexible solution now that envs can be optionally
-- synced. E.g., it's now possible to only import a sub environment from a different client without
-- its base environment "parent."
ALTER TABLE environments
ADD COLUMN base BOOLEAN DEFAULT FALSE;

-- SQLite doesn't support dynamic default values, so we update `base` based on the value of
-- environment_id.
UPDATE environments
SET base = TRUE
WHERE environment_id IS NULL;

-- Finally, we drop the old `environment_id` column that will no longer be used
ALTER TABLE environments
DROP COLUMN environment_id;
15
src-tauri/migrations/20250516182745_default-attrs.sql
Normal file
@@ -0,0 +1,15 @@
-- Auth
ALTER TABLE workspaces
    ADD COLUMN authentication TEXT NOT NULL DEFAULT '{}';
ALTER TABLE folders
    ADD COLUMN authentication TEXT NOT NULL DEFAULT '{}';
ALTER TABLE workspaces
    ADD COLUMN authentication_type TEXT;
ALTER TABLE folders
    ADD COLUMN authentication_type TEXT;

-- Headers
ALTER TABLE workspaces
    ADD COLUMN headers TEXT NOT NULL DEFAULT '[]';
ALTER TABLE folders
    ADD COLUMN headers TEXT NOT NULL DEFAULT '[]';
40
src-tauri/src/commands.rs
Normal file
@@ -0,0 +1,40 @@
use crate::error::Result;
use tauri::{command, AppHandle, Manager, Runtime, WebviewWindow};
use tauri_plugin_dialog::{DialogExt, MessageDialogKind};
use yaak_crypto::manager::EncryptionManagerExt;
use yaak_plugins::events::PluginWindowContext;
use yaak_plugins::native_template_functions::{decrypt_secure_template_function, encrypt_secure_template_function};

#[command]
pub(crate) async fn cmd_show_workspace_key<R: Runtime>(
    window: WebviewWindow<R>,
    workspace_id: &str,
) -> Result<()> {
    let key = window.crypto().reveal_workspace_key(workspace_id)?;
    window
        .dialog()
        .message(format!("Your workspace key is \n\n{}", key))
        .kind(MessageDialogKind::Info)
        .show(|_v| {});
    Ok(())
}

#[command]
pub(crate) async fn cmd_decrypt_template<R: Runtime>(
    window: WebviewWindow<R>,
    template: &str,
) -> Result<String> {
    let app_handle = window.app_handle();
    let window_context = &PluginWindowContext::new(&window);
    Ok(decrypt_secure_template_function(&app_handle, window_context, template)?)
}

#[command]
pub(crate) async fn cmd_secure_template<R: Runtime>(
    app_handle: AppHandle<R>,
    window: WebviewWindow<R>,
    template: &str,
) -> Result<String> {
    let window_context = &PluginWindowContext::new(&window);
    Ok(encrypt_secure_template_function(&app_handle, window_context, template)?)
}
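These commands only take effect once they are registered with the Tauri invoke handler. That wiring lives in `src-tauri/src/lib.rs`, whose diff is suppressed further down in this comparison, so the following is only a minimal sketch of what such registration typically looks like, not the actual lib.rs contents.

```rust
// Hypothetical sketch: exposing the new commands to the frontend via the
// standard Tauri invoke handler. The real setup in lib.rs is not shown here.
use crate::commands::{cmd_decrypt_template, cmd_secure_template, cmd_show_workspace_key};

pub fn run() {
    tauri::Builder::default()
        // Register the commands so the webview can call them with `invoke(...)`
        .invoke_handler(tauri::generate_handler![
            cmd_show_workspace_key,
            cmd_decrypt_template,
            cmd_secure_template
        ])
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}
```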
@@ -1,29 +1,48 @@
use std::io;
use serde::{Serialize, Serializer};
use thiserror::Error;

#[derive(Error, Debug)]
pub enum Error {
    #[error("Render error: {0}")]
    #[error(transparent)]
    TemplateError(#[from] yaak_templates::error::Error),

    #[error("Model error: {0}")]
    #[error(transparent)]
    ModelError(#[from] yaak_models::error::Error),

    #[error("Sync error: {0}")]
    #[error(transparent)]
    SyncError(#[from] yaak_sync::error::Error),

    #[error(transparent)]
    CryptoError(#[from] yaak_crypto::error::Error),

    #[error("Git error: {0}")]
    #[error(transparent)]
    GitError(#[from] yaak_git::error::Error),

    #[error("Websocket error: {0}")]
    #[error(transparent)]
    WebsocketError(#[from] yaak_ws::error::Error),

    #[error("License error: {0}")]
    #[error(transparent)]
    LicenseError(#[from] yaak_license::error::Error),

    #[error("Plugin error: {0}")]
    #[error(transparent)]
    PluginError(#[from] yaak_plugins::error::Error),

    #[error("Updater error: {0}")]
    UpdaterError(#[from] tauri_plugin_updater::Error),

    #[error("JSON error: {0}")]
    JsonError(#[from] serde_json::error::Error),

    #[error("Tauri error: {0}")]
    TauriError(#[from] tauri::Error),

    #[error("Event source error: {0}")]
    EventSourceError(#[from] eventsource_client::Error),

    #[error("I/O error: {0}")]
    IOError(#[from] io::Error),

    #[error("Request error: {0}")]
    RequestError(#[from] reqwest::Error),
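This hunk replaces several prefixed `#[error("...: {0}")]` messages with `#[error(transparent)]`, which forwards the source error's `Display` (and `source()`) instead of wrapping it in an extra prefix. A minimal sketch of the difference, using made-up error types purely for illustration:

```rust
use thiserror::Error;

// Hypothetical inner error, not part of the Yaak codebase.
#[derive(Error, Debug)]
#[error("underlying failure")]
struct Inner;

#[derive(Error, Debug)]
enum Wrapped {
    // Prefixed: Display becomes "Model error: underlying failure"
    #[error("Model error: {0}")]
    Prefixed(#[source] Inner),

    // Transparent: Display is just "underlying failure"; source() is delegated too
    #[error(transparent)]
    Transparent(#[from] Inner),
}

fn main() {
    println!("{}", Wrapped::Prefixed(Inner));    // Model error: underlying failure
    println!("{}", Wrapped::Transparent(Inner)); // underlying failure
}
```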
@@ -1,10 +1,15 @@
|
||||
use std::collections::BTreeMap;
|
||||
|
||||
use crate::error::Result;
|
||||
use tauri::{Manager, Runtime, WebviewWindow};
|
||||
use yaak_grpc::{KeyAndValueRef, MetadataMap};
|
||||
use yaak_models::models::GrpcRequest;
|
||||
use yaak_models::query_manager::QueryManagerExt;
|
||||
use yaak_plugins::events::{CallHttpAuthenticationRequest, HttpHeader};
|
||||
use yaak_plugins::manager::PluginManager;
|
||||
use KeyAndValueRef::{Ascii, Binary};
|
||||
|
||||
use yaak_grpc::{KeyAndValueRef, MetadataMap};
|
||||
|
||||
pub fn metadata_to_map(metadata: MetadataMap) -> BTreeMap<String, String> {
|
||||
pub(crate) fn metadata_to_map(metadata: MetadataMap) -> BTreeMap<String, String> {
|
||||
let mut entries = BTreeMap::new();
|
||||
for r in metadata.iter() {
|
||||
match r {
|
||||
@@ -14,3 +19,74 @@ pub fn metadata_to_map(metadata: MetadataMap) -> BTreeMap<String, String> {
|
||||
}
|
||||
entries
|
||||
}
|
||||
|
||||
pub(crate) fn resolve_grpc_request<R: Runtime>(
|
||||
window: &WebviewWindow<R>,
|
||||
request: &GrpcRequest,
|
||||
) -> Result<GrpcRequest> {
|
||||
let mut new_request = request.clone();
|
||||
|
||||
let (authentication_type, authentication) =
|
||||
window.db().resolve_auth_for_grpc_request(request)?;
|
||||
new_request.authentication_type = authentication_type;
|
||||
new_request.authentication = authentication;
|
||||
|
||||
let metadata = window.db().resolve_metadata_for_grpc_request(request)?;
|
||||
new_request.metadata = metadata;
|
||||
|
||||
Ok(new_request)
|
||||
}
|
||||
|
||||
pub(crate) async fn build_metadata<R: Runtime>(
|
||||
window: &WebviewWindow<R>,
|
||||
request: &GrpcRequest,
|
||||
) -> Result<BTreeMap<String, String>> {
|
||||
let plugin_manager = window.state::<PluginManager>();
|
||||
let mut metadata = BTreeMap::new();
|
||||
|
||||
// Add the rest of metadata
|
||||
for h in request.metadata.clone() {
|
||||
if h.name.is_empty() && h.value.is_empty() {
|
||||
continue;
|
||||
}
|
||||
|
||||
if !h.enabled {
|
||||
continue;
|
||||
}
|
||||
|
||||
metadata.insert(h.name, h.value);
|
||||
}
|
||||
|
||||
match request.authentication_type.clone() {
|
||||
None => {
|
||||
// No authentication found. Not even inherited
|
||||
}
|
||||
Some(authentication_type) if authentication_type == "none" => {
|
||||
// Explicitly no authentication
|
||||
}
|
||||
Some(authentication_type) => {
|
||||
let auth = request.authentication.clone();
|
||||
let plugin_req = CallHttpAuthenticationRequest {
|
||||
context_id: format!("{:x}", md5::compute(request.id.clone())),
|
||||
values: serde_json::from_value(serde_json::to_value(&auth).unwrap()).unwrap(),
|
||||
method: "POST".to_string(),
|
||||
url: request.url.clone(),
|
||||
headers: metadata
|
||||
.iter()
|
||||
.map(|(name, value)| HttpHeader {
|
||||
name: name.to_string(),
|
||||
value: value.to_string(),
|
||||
})
|
||||
.collect(),
|
||||
};
|
||||
let plugin_result = plugin_manager
|
||||
.call_http_authentication(&window, &authentication_type, plugin_req)
|
||||
.await?;
|
||||
for header in plugin_result.set_headers {
|
||||
metadata.insert(header.name, header.value);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Ok(metadata)
|
||||
}
|
||||
|
||||
@@ -1,9 +1,6 @@
|
||||
use tauri::{Manager, Runtime, WebviewWindow};
|
||||
|
||||
use yaak_models::queries::{
|
||||
get_key_value_int, get_key_value_string,
|
||||
set_key_value_int, set_key_value_string, UpdateSource,
|
||||
};
|
||||
use tauri::{AppHandle, Runtime};
|
||||
use yaak_models::query_manager::QueryManagerExt;
|
||||
use yaak_models::util::UpdateSource;
|
||||
|
||||
const NAMESPACE: &str = "analytics";
|
||||
const NUM_LAUNCHES_KEY: &str = "num_launches";
|
||||
@@ -16,33 +13,32 @@ pub struct LaunchEventInfo {
|
||||
pub num_launches: i32,
|
||||
}
|
||||
|
||||
pub async fn store_launch_history<R: Runtime>(w: &WebviewWindow<R>) -> LaunchEventInfo {
|
||||
pub async fn store_launch_history<R: Runtime>(app_handle: &AppHandle<R>) -> LaunchEventInfo {
|
||||
let last_tracked_version_key = "last_tracked_version";
|
||||
|
||||
let mut info = LaunchEventInfo::default();
|
||||
|
||||
info.num_launches = get_num_launches(w).await + 1;
|
||||
info.previous_version = get_key_value_string(w, NAMESPACE, last_tracked_version_key, "").await;
|
||||
info.current_version = w.package_info().version.to_string();
|
||||
info.num_launches = get_num_launches(app_handle).await + 1;
|
||||
info.current_version = app_handle.package_info().version.to_string();
|
||||
|
||||
if info.previous_version.is_empty() {
|
||||
} else {
|
||||
info.launched_after_update = info.current_version != info.previous_version;
|
||||
};
|
||||
app_handle
|
||||
.with_tx(|tx| {
|
||||
info.previous_version =
|
||||
tx.get_key_value_string(NAMESPACE, last_tracked_version_key, "");
|
||||
|
||||
// Update key values
|
||||
if !info.previous_version.is_empty() {
|
||||
info.launched_after_update = info.current_version != info.previous_version;
|
||||
};
|
||||
|
||||
set_key_value_string(
|
||||
w,
|
||||
NAMESPACE,
|
||||
last_tracked_version_key,
|
||||
info.current_version.as_str(),
|
||||
&UpdateSource::Background,
|
||||
)
|
||||
.await;
|
||||
|
||||
set_key_value_int(w, NAMESPACE, NUM_LAUNCHES_KEY, info.num_launches, &UpdateSource::Background)
|
||||
.await;
|
||||
// Update key values
|
||||
|
||||
let source = &UpdateSource::Background;
|
||||
let version = info.current_version.as_str();
|
||||
tx.set_key_value_string(NAMESPACE, last_tracked_version_key, version, source);
|
||||
tx.set_key_value_int(NAMESPACE, NUM_LAUNCHES_KEY, info.num_launches, source);
|
||||
Ok(())
|
||||
})
|
||||
.unwrap();
|
||||
|
||||
info
|
||||
}
|
||||
@@ -59,6 +55,6 @@ pub fn get_os() -> &'static str {
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn get_num_launches<R: Runtime>(w: &WebviewWindow<R>) -> i32 {
|
||||
get_key_value_int(w, NAMESPACE, NUM_LAUNCHES_KEY, 0).await
|
||||
pub async fn get_num_launches<R: Runtime>(app_handle: &AppHandle<R>) -> i32 {
|
||||
app_handle.db().get_key_value_int(NAMESPACE, NUM_LAUNCHES_KEY, 0)
|
||||
}
|
||||
|
||||
@@ -3,14 +3,14 @@ use crate::error::Result;
|
||||
use crate::render::render_http_request;
|
||||
use crate::response_err;
|
||||
use http::header::{ACCEPT, USER_AGENT};
|
||||
use http::{HeaderMap, HeaderName, HeaderValue, Uri};
|
||||
use http::{HeaderMap, HeaderName, HeaderValue};
|
||||
use log::{debug, error, warn};
|
||||
use mime_guess::Mime;
|
||||
use reqwest::redirect::Policy;
|
||||
use reqwest::{multipart, Proxy, Url};
|
||||
use reqwest::{Method, Response};
|
||||
use rustls::crypto::ring;
|
||||
use reqwest::{Proxy, Url, multipart};
|
||||
use rustls::ClientConfig;
|
||||
use rustls::crypto::ring;
|
||||
use rustls_platform_verifier::BuilderVerifierExt;
|
||||
use serde_json::Value;
|
||||
use std::collections::BTreeMap;
|
||||
@@ -20,20 +20,18 @@ use std::sync::Arc;
|
||||
use std::time::Duration;
|
||||
use tauri::{Manager, Runtime, WebviewWindow};
|
||||
use tokio::fs;
|
||||
use tokio::fs::{create_dir_all, File};
|
||||
use tokio::fs::{File, create_dir_all};
|
||||
use tokio::io::AsyncWriteExt;
|
||||
use tokio::sync::watch::Receiver;
|
||||
use tokio::sync::{oneshot, Mutex};
|
||||
use tokio::sync::{Mutex, oneshot};
|
||||
use yaak_models::models::{
|
||||
Cookie, CookieJar, Environment, HttpRequest, HttpResponse, HttpResponseHeader,
|
||||
HttpResponseState, ProxySetting, ProxySettingAuth,
|
||||
};
|
||||
use yaak_models::queries::{
|
||||
get_base_environment, get_http_response, get_or_create_settings, get_workspace,
|
||||
update_response_if_id, upsert_cookie_jar, UpdateSource,
|
||||
};
|
||||
use yaak_models::query_manager::QueryManagerExt;
|
||||
use yaak_models::util::UpdateSource;
|
||||
use yaak_plugins::events::{
|
||||
CallHttpAuthenticationRequest, HttpHeader, RenderPurpose, WindowContext,
|
||||
CallHttpAuthenticationRequest, HttpHeader, PluginWindowContext, RenderPurpose,
|
||||
};
|
||||
use yaak_plugins::manager::PluginManager;
|
||||
use yaak_plugins::template_callback::PluginTemplateCallback;
|
||||
@@ -46,32 +44,55 @@ pub async fn send_http_request<R: Runtime>(
|
||||
cookie_jar: Option<CookieJar>,
|
||||
cancelled_rx: &mut Receiver<bool>,
|
||||
) -> Result<HttpResponse> {
|
||||
let plugin_manager = window.state::<PluginManager>();
|
||||
let workspace = get_workspace(window, &unrendered_request.workspace_id).await?;
|
||||
let base_environment = get_base_environment(window, &unrendered_request.workspace_id).await?;
|
||||
let settings = get_or_create_settings(window).await;
|
||||
let cb = PluginTemplateCallback::new(
|
||||
window.app_handle(),
|
||||
&WindowContext::from_window(window),
|
||||
RenderPurpose::Send,
|
||||
);
|
||||
let app_handle = window.app_handle().clone();
|
||||
let plugin_manager = app_handle.state::<PluginManager>();
|
||||
let (settings, workspace) = {
|
||||
let db = window.db();
|
||||
let settings = db.get_settings();
|
||||
let workspace = db.get_workspace(&unrendered_request.workspace_id)?;
|
||||
(settings, workspace)
|
||||
};
|
||||
let base_environment =
|
||||
app_handle.db().get_base_environment(&unrendered_request.workspace_id)?;
|
||||
|
||||
let response_id = og_response.id.clone();
|
||||
let response = Arc::new(Mutex::new(og_response.clone()));
|
||||
|
||||
let request = match render_http_request(
|
||||
&unrendered_request,
|
||||
&base_environment,
|
||||
environment.as_ref(),
|
||||
&cb,
|
||||
)
|
||||
.await
|
||||
{
|
||||
let cb = PluginTemplateCallback::new(
|
||||
window.app_handle(),
|
||||
&PluginWindowContext::new(window),
|
||||
RenderPurpose::Send,
|
||||
);
|
||||
let update_source = UpdateSource::from_window(window);
|
||||
|
||||
let resolved_request = match resolve_http_request(window, unrendered_request) {
|
||||
Ok(r) => r,
|
||||
Err(e) => return Ok(response_err(&*response.lock().await, e.to_string(), window).await),
|
||||
Err(e) => {
|
||||
return Ok(response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
e.to_string(),
|
||||
&update_source,
|
||||
));
|
||||
}
|
||||
};
|
||||
|
||||
let mut url_string = request.url;
|
||||
let request =
|
||||
match render_http_request(&resolved_request, &base_environment, environment.as_ref(), &cb)
|
||||
.await
|
||||
{
|
||||
Ok(r) => r,
|
||||
Err(e) => {
|
||||
return Ok(response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
e.to_string(),
|
||||
&update_source,
|
||||
));
|
||||
}
|
||||
};
|
||||
|
||||
let mut url_string = request.url.clone();
|
||||
|
||||
url_string = ensure_proto(&url_string);
|
||||
if !url_string.starts_with("http://") && !url_string.starts_with("https://") {
|
||||
@@ -110,7 +131,12 @@ pub async fn send_http_request<R: Runtime>(
|
||||
|
||||
match settings.proxy {
|
||||
Some(ProxySetting::Disabled) => client_builder = client_builder.no_proxy(),
|
||||
Some(ProxySetting::Enabled { http, https, auth }) => {
|
||||
Some(ProxySetting::Enabled {
|
||||
http,
|
||||
https,
|
||||
auth,
|
||||
disabled,
|
||||
}) if !disabled => {
|
||||
debug!("Using proxy http={http} https={https}");
|
||||
let mut proxy = Proxy::custom(move |url| {
|
||||
let http = if http.is_empty() { None } else { Some(http.to_owned()) };
|
||||
@@ -130,12 +156,15 @@ pub async fn send_http_request<R: Runtime>(
|
||||
|
||||
client_builder = client_builder.proxy(proxy);
|
||||
}
|
||||
None => {} // Nothing to do for this one, as it is the default
|
||||
_ => {} // Nothing to do for this one, as it is the default
|
||||
}
|
||||
|
||||
// Add cookie store if specified
|
||||
let maybe_cookie_manager = match cookie_jar.clone() {
|
||||
Some(cj) => {
|
||||
Some(CookieJar { id, .. }) => {
|
||||
// NOTE: WE need to refetch the cookie jar because a chained request might have
|
||||
// updated cookies when we rendered the request.
|
||||
let cj = window.db().get_cookie_jar(&id)?;
|
||||
// HACK: Can't construct Cookie without serde, so we have to do this
|
||||
let cookies = cj
|
||||
.cookies
|
||||
@@ -174,27 +203,15 @@ pub async fn send_http_request<R: Runtime>(
|
||||
query_params.push((p.name, p.value));
|
||||
}
|
||||
|
||||
let uri = match Uri::from_str(url_string.as_str()) {
|
||||
let url = match Url::from_str(&url_string) {
|
||||
Ok(u) => u,
|
||||
Err(e) => {
|
||||
return Ok(response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
format!("Failed to parse URL \"{}\": {}", url_string, e.to_string()),
|
||||
window,
|
||||
)
|
||||
.await);
|
||||
}
|
||||
};
|
||||
// Yes, we're parsing both URI and URL because they could return different errors
|
||||
let url = match Url::from_str(uri.to_string().as_str()) {
|
||||
Ok(u) => u,
|
||||
Err(e) => {
|
||||
return Ok(response_err(
|
||||
&*response.lock().await,
|
||||
format!("Failed to parse URL \"{}\": {}", url_string, e.to_string()),
|
||||
window,
|
||||
)
|
||||
.await);
|
||||
&update_source,
|
||||
));
|
||||
}
|
||||
};
|
||||
|
||||
@@ -249,7 +266,7 @@ pub async fn send_http_request<R: Runtime>(
|
||||
}
|
||||
|
||||
let request_body = request.body.clone();
|
||||
if let Some(body_type) = &request.body_type {
|
||||
if let Some(body_type) = &request.body_type.clone() {
|
||||
if body_type == "graphql" {
|
||||
let query = get_str_h(&request_body, "query");
|
||||
let variables = get_str_h(&request_body, "variables");
|
||||
@@ -296,7 +313,12 @@ pub async fn send_http_request<R: Runtime>(
|
||||
request_builder = request_builder.body(f);
|
||||
}
|
||||
Err(e) => {
|
||||
return Ok(response_err(&*response.lock().await, e, window).await);
|
||||
return Ok(response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
e,
|
||||
&update_source,
|
||||
));
|
||||
}
|
||||
}
|
||||
} else if body_type == "multipart/form-data" && request_body.contains_key("form") {
|
||||
@@ -323,11 +345,11 @@ pub async fn send_http_request<R: Runtime>(
|
||||
Ok(f) => multipart::Part::bytes(f),
|
||||
Err(e) => {
|
||||
return Ok(response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
e.to_string(),
|
||||
window,
|
||||
)
|
||||
.await);
|
||||
&update_source,
|
||||
));
|
||||
}
|
||||
}
|
||||
};
|
||||
@@ -340,11 +362,11 @@ pub async fn send_http_request<R: Runtime>(
|
||||
Ok(p) => p,
|
||||
Err(e) => {
|
||||
return Ok(response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
format!("Invalid mime for multi-part entry {e:?}"),
|
||||
window,
|
||||
)
|
||||
.await);
|
||||
&update_source,
|
||||
));
|
||||
}
|
||||
};
|
||||
} else if !file_path.is_empty() {
|
||||
@@ -356,16 +378,16 @@ pub async fn send_http_request<R: Runtime>(
|
||||
Ok(p) => p,
|
||||
Err(e) => {
|
||||
return Ok(response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
format!("Invalid mime for multi-part entry {e:?}"),
|
||||
window,
|
||||
)
|
||||
.await);
|
||||
&update_source,
|
||||
));
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
// Set file path if not empty
|
||||
// Set a file path if it is not empty
|
||||
if !file_path.is_empty() {
|
||||
let filename = PathBuf::from(file_path)
|
||||
.file_name()
|
||||
@@ -388,6 +410,15 @@ pub async fn send_http_request<R: Runtime>(
|
||||
} else {
|
||||
warn!("Unsupported body type: {}", body_type);
|
||||
}
|
||||
} else {
|
||||
// No body set
|
||||
let method = request.method.to_ascii_lowercase();
|
||||
let is_body_method = method == "post" || method == "put" || method == "patch";
|
||||
// Add Content-Length for methods that commonly accept a body because some servers
|
||||
// will error if they don't receive it.
|
||||
if is_body_method && !headers.contains_key("content-length") {
|
||||
headers.insert("Content-Length", HeaderValue::from_static("0"));
|
||||
}
|
||||
}
|
||||
|
||||
// Add headers last, because previous steps may modify them
|
||||
@@ -397,42 +428,61 @@ pub async fn send_http_request<R: Runtime>(
|
||||
Ok(r) => r,
|
||||
Err(e) => {
|
||||
warn!("Failed to build request builder {e:?}");
|
||||
return Ok(response_err(&*response.lock().await, e.to_string(), window).await);
|
||||
return Ok(response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
e.to_string(),
|
||||
&update_source,
|
||||
));
|
||||
}
|
||||
};
|
||||
|
||||
// Apply authentication
|
||||
|
||||
if let Some(auth_name) = request.authentication_type.to_owned() {
|
||||
let req = CallHttpAuthenticationRequest {
|
||||
context_id: format!("{:x}", md5::compute(request.id)),
|
||||
values: serde_json::from_value(serde_json::to_value(&request.authentication).unwrap())
|
||||
match request.authentication_type {
|
||||
None => {
|
||||
// No authentication found. Not even inherited
|
||||
}
|
||||
Some(authentication_type) if authentication_type == "none" => {
|
||||
// Explicitly no authentication
|
||||
}
|
||||
Some(authentication_type) => {
|
||||
let req = CallHttpAuthenticationRequest {
|
||||
context_id: format!("{:x}", md5::compute(request.id)),
|
||||
values: serde_json::from_value(
|
||||
serde_json::to_value(&request.authentication).unwrap(),
|
||||
)
|
||||
.unwrap(),
|
||||
url: sendable_req.url().to_string(),
|
||||
method: sendable_req.method().to_string(),
|
||||
headers: sendable_req
|
||||
.headers()
|
||||
.iter()
|
||||
.map(|(name, value)| HttpHeader {
|
||||
name: name.to_string(),
|
||||
value: value.to_str().unwrap_or_default().to_string(),
|
||||
})
|
||||
.collect(),
|
||||
};
|
||||
let auth_result = plugin_manager.call_http_authentication(window, &auth_name, req).await;
|
||||
let plugin_result = match auth_result {
|
||||
Ok(r) => r,
|
||||
Err(e) => {
|
||||
return Ok(response_err(&*response.lock().await, e.to_string(), window).await);
|
||||
}
|
||||
};
|
||||
url: sendable_req.url().to_string(),
|
||||
method: sendable_req.method().to_string(),
|
||||
headers: sendable_req
|
||||
.headers()
|
||||
.iter()
|
||||
.map(|(name, value)| HttpHeader {
|
||||
name: name.to_string(),
|
||||
value: value.to_str().unwrap_or_default().to_string(),
|
||||
})
|
||||
.collect(),
|
||||
};
|
||||
let auth_result =
|
||||
plugin_manager.call_http_authentication(&window, &authentication_type, req).await;
|
||||
let plugin_result = match auth_result {
|
||||
Ok(r) => r,
|
||||
Err(e) => {
|
||||
return Ok(response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
e.to_string(),
|
||||
&update_source,
|
||||
));
|
||||
}
|
||||
};
|
||||
|
||||
let headers = sendable_req.headers_mut();
|
||||
for header in plugin_result.set_headers {
|
||||
headers.insert(
|
||||
HeaderName::from_str(&header.name).unwrap(),
|
||||
HeaderValue::from_str(&header.value).unwrap(),
|
||||
);
|
||||
let headers = sendable_req.headers_mut();
|
||||
for header in plugin_result.set_headers {
|
||||
headers.insert(
|
||||
HeaderName::from_str(&header.name).unwrap(),
|
||||
HeaderValue::from_str(&header.value).unwrap(),
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -448,22 +498,26 @@ pub async fn send_http_request<R: Runtime>(
|
||||
let raw_response = tokio::select! {
|
||||
Ok(r) = resp_rx => r,
|
||||
_ = cancelled_rx.changed() => {
|
||||
debug!("Request cancelled");
|
||||
return Ok(response_err(&*response.lock().await, "Request was cancelled".to_string(), window).await);
|
||||
let mut r = response.lock().await;
|
||||
r.elapsed_headers = start.elapsed().as_millis() as i32;
|
||||
r.elapsed = start.elapsed().as_millis() as i32;
|
||||
return Ok(response_err(&app_handle, &r, "Request was cancelled".to_string(), &update_source));
|
||||
}
|
||||
};
|
||||
|
||||
{
|
||||
let app_handle = app_handle.clone();
|
||||
let window = window.clone();
|
||||
let cancelled_rx = cancelled_rx.clone();
|
||||
let response_id = response_id.clone();
|
||||
let response = response.clone();
|
||||
let update_source = update_source.clone();
|
||||
tokio::spawn(async move {
|
||||
match raw_response {
|
||||
Ok(mut v) => {
|
||||
let content_length = v.content_length();
|
||||
let response_headers = v.headers().clone();
|
||||
let dir = window.app_handle().path().app_data_dir().unwrap();
|
||||
let dir = app_handle.path().app_data_dir().unwrap();
|
||||
let base_dir = dir.join("responses");
|
||||
create_dir_all(base_dir.clone()).await.expect("Failed to create responses dir");
|
||||
let body_path = if response_id.is_empty() {
|
||||
@@ -476,6 +530,7 @@ pub async fn send_http_request<R: Runtime>(
|
||||
let mut r = response.lock().await;
|
||||
r.body_path = Some(body_path.to_str().unwrap().to_string());
|
||||
r.elapsed_headers = start.elapsed().as_millis() as i32;
|
||||
r.elapsed = start.elapsed().as_millis() as i32;
|
||||
r.status = v.status().as_u16() as i32;
|
||||
r.status_reason = v.status().canonical_reason().map(|s| s.to_string());
|
||||
r.headers = response_headers
|
||||
@@ -497,8 +552,9 @@ pub async fn send_http_request<R: Runtime>(
|
||||
};
|
||||
|
||||
r.state = HttpResponseState::Connected;
|
||||
update_response_if_id(&window, &r, &UpdateSource::Window)
|
||||
.await
|
||||
app_handle
|
||||
.db()
|
||||
.update_http_response_if_id(&r, &update_source)
|
||||
.expect("Failed to update response after connected");
|
||||
}
|
||||
|
||||
@@ -526,21 +582,27 @@ pub async fn send_http_request<R: Runtime>(
|
||||
f.flush().await.expect("Failed to flush file");
|
||||
written_bytes += bytes.len();
|
||||
r.content_length = Some(written_bytes as i32);
|
||||
update_response_if_id(&window, &r, &UpdateSource::Window)
|
||||
.await
|
||||
app_handle
|
||||
.db()
|
||||
.update_http_response_if_id(&r, &update_source)
|
||||
.expect("Failed to update response");
|
||||
}
|
||||
Ok(None) => {
|
||||
break;
|
||||
}
|
||||
Err(e) => {
|
||||
response_err(&*response.lock().await, e.to_string(), &window).await;
|
||||
response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
e.to_string(),
|
||||
&update_source,
|
||||
);
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Set final content length
|
||||
// Set the final content length
|
||||
{
|
||||
let mut r = response.lock().await;
|
||||
r.content_length = match content_length {
|
||||
@@ -548,8 +610,9 @@ pub async fn send_http_request<R: Runtime>(
|
||||
None => Some(written_bytes as i32),
|
||||
};
|
||||
r.state = HttpResponseState::Closed;
|
||||
update_response_if_id(&window, &r, &UpdateSource::Window)
|
||||
.await
|
||||
app_handle
|
||||
.db()
|
||||
.update_http_response_if_id(&r, &UpdateSource::from_window(&window))
|
||||
.expect("Failed to update response");
|
||||
};
|
||||
|
||||
@@ -574,8 +637,9 @@ pub async fn send_http_request<R: Runtime>(
|
||||
})
|
||||
.collect::<Vec<_>>();
|
||||
cookie_jar.cookies = json_cookies;
|
||||
if let Err(e) =
|
||||
upsert_cookie_jar(&window, &cookie_jar, &UpdateSource::Window).await
|
||||
if let Err(e) = app_handle
|
||||
.db()
|
||||
.upsert_cookie_jar(&cookie_jar, &UpdateSource::from_window(&window))
|
||||
{
|
||||
error!("Failed to update cookie jar: {}", e);
|
||||
};
|
||||
@@ -583,7 +647,12 @@ pub async fn send_http_request<R: Runtime>(
|
||||
}
|
||||
Err(e) => {
|
||||
warn!("Failed to execute request {e}");
|
||||
response_err(&*response.lock().await, format!("{e} → {e:?}"), &window).await;
|
||||
response_err(
|
||||
&app_handle,
|
||||
&*response.lock().await,
|
||||
format!("{e} → {e:?}"),
|
||||
&update_source,
|
||||
);
|
||||
}
|
||||
};
|
||||
|
||||
@@ -592,22 +661,43 @@ pub async fn send_http_request<R: Runtime>(
|
||||
});
|
||||
};
|
||||
|
||||
let app_handle = app_handle.clone();
|
||||
Ok(tokio::select! {
|
||||
Ok(r) = done_rx => r,
|
||||
_ = cancelled_rx.changed() => {
|
||||
match get_http_response(window, response_id.as_str()).await {
|
||||
match app_handle.with_db(|c| c.get_http_response(&response_id)) {
|
||||
Ok(mut r) => {
|
||||
r.state = HttpResponseState::Closed;
|
||||
update_response_if_id(&window, &r, &UpdateSource::Window).await.expect("Failed to update response")
|
||||
r.elapsed = start.elapsed().as_millis() as i32;
|
||||
r.elapsed_headers = start.elapsed().as_millis() as i32;
|
||||
app_handle.db().update_http_response_if_id(&r, &UpdateSource::from_window(window))
|
||||
.expect("Failed to update response")
|
||||
},
|
||||
_ => {
|
||||
response_err(&*response.lock().await, "Ephemeral request was cancelled".to_string(), &window).await
|
||||
response_err(&app_handle, &*response.lock().await, "Ephemeral request was cancelled".to_string(), &update_source)
|
||||
}.clone(),
|
||||
}
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
fn resolve_http_request<R: Runtime>(
|
||||
window: &WebviewWindow<R>,
|
||||
request: &HttpRequest,
|
||||
) -> Result<HttpRequest> {
|
||||
let mut new_request = request.clone();
|
||||
|
||||
let (authentication_type, authentication) =
|
||||
window.db().resolve_auth_for_http_request(request)?;
|
||||
new_request.authentication_type = authentication_type;
|
||||
new_request.authentication = authentication;
|
||||
|
||||
let headers = window.db().resolve_headers_for_http_request(request)?;
|
||||
new_request.headers = headers;
|
||||
|
||||
Ok(new_request)
|
||||
}
|
||||
|
||||
fn ensure_proto(url_str: &str) -> String {
|
||||
if url_str.starts_with("http://") || url_str.starts_with("https://") {
|
||||
return url_str.to_string();
|
||||
|
||||
1690
src-tauri/src/lib.rs
File diff suppressed because it is too large
@@ -1,13 +1,16 @@
|
||||
use std::time::SystemTime;
|
||||
|
||||
use crate::error::Result;
|
||||
use crate::history::{get_num_launches, get_os};
|
||||
use chrono::{DateTime, Duration, Utc};
|
||||
use log::debug;
|
||||
use reqwest::Method;
|
||||
use serde::{Deserialize, Serialize};
|
||||
use serde_json::Value;
|
||||
use tauri::{Emitter, Manager, Runtime, WebviewWindow};
|
||||
use yaak_models::queries::{get_key_value_raw, set_key_value_raw, UpdateSource};
|
||||
use tauri::{AppHandle, Emitter, Manager, Runtime, WebviewWindow};
|
||||
use yaak_license::{LicenseCheckStatus, check_license};
|
||||
use yaak_models::query_manager::QueryManagerExt;
|
||||
use yaak_models::util::UpdateSource;
|
||||
|
||||
// Check for updates every hour
|
||||
const MAX_UPDATE_CHECK_SECONDS: u64 = 60 * 60;
|
||||
@@ -43,16 +46,23 @@ impl YaakNotifier {
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn seen<R: Runtime>(&mut self, w: &WebviewWindow<R>, id: &str) -> Result<(), String> {
|
||||
let mut seen = get_kv(w).await?;
|
||||
pub async fn seen<R: Runtime>(&mut self, window: &WebviewWindow<R>, id: &str) -> Result<()> {
|
||||
let app_handle = window.app_handle();
|
||||
let mut seen = get_kv(app_handle).await?;
|
||||
seen.push(id.to_string());
|
||||
debug!("Marked notification as seen {}", id);
|
||||
let seen_json = serde_json::to_string(&seen).map_err(|e| e.to_string())?;
|
||||
set_key_value_raw(w, KV_NAMESPACE, KV_KEY, seen_json.as_str(), &UpdateSource::Window).await;
|
||||
let seen_json = serde_json::to_string(&seen)?;
|
||||
window.db().set_key_value_raw(
|
||||
KV_NAMESPACE,
|
||||
KV_KEY,
|
||||
seen_json.as_str(),
|
||||
&UpdateSource::from_window(window),
|
||||
);
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub async fn check<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<(), String> {
|
||||
pub async fn check<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<()> {
|
||||
let app_handle = window.app_handle();
|
||||
let ignore_check = self.last_check.elapsed().unwrap().as_secs() < MAX_UPDATE_CHECK_SECONDS;
|
||||
|
||||
if ignore_check {
|
||||
@@ -61,22 +71,31 @@ impl YaakNotifier {
|
||||
|
||||
self.last_check = SystemTime::now();
|
||||
|
||||
let num_launches = get_num_launches(window).await;
|
||||
let info = window.app_handle().package_info().clone();
|
||||
let license_check = match check_license(window).await? {
|
||||
LicenseCheckStatus::PersonalUse { .. } => "personal".to_string(),
|
||||
LicenseCheckStatus::CommercialUse => "commercial".to_string(),
|
||||
LicenseCheckStatus::InvalidLicense => "invalid_license".to_string(),
|
||||
LicenseCheckStatus::Trialing { .. } => "trialing".to_string(),
|
||||
};
|
||||
let settings = window.db().get_settings();
|
||||
let num_launches = get_num_launches(app_handle).await;
|
||||
let info = app_handle.package_info().clone();
|
||||
let req = reqwest::Client::default()
|
||||
.request(Method::GET, "https://notify.yaak.app/notifications")
|
||||
.query(&[
|
||||
("version", info.version.to_string().as_str()),
|
||||
("launches", num_launches.to_string().as_str()),
|
||||
("platform", get_os())
|
||||
("installed", settings.created_at.format("%Y-%m-%d").to_string().as_str()),
|
||||
("license", &license_check),
|
||||
("platform", get_os()),
|
||||
]);
|
||||
let resp = req.send().await.map_err(|e| e.to_string())?;
|
||||
let resp = req.send().await?;
|
||||
if resp.status() != 200 {
|
||||
debug!("Skipping notification status code {}", resp.status());
|
||||
return Ok(());
|
||||
}
|
||||
|
||||
let result = resp.json::<Value>().await.map_err(|e| e.to_string())?;
|
||||
let result = resp.json::<Value>().await?;
|
||||
|
||||
// Support both single and multiple notifications.
|
||||
// TODO: Remove support for single after April 2025
|
||||
@@ -90,23 +109,24 @@ impl YaakNotifier {
|
||||
|
||||
for notification in notifications {
|
||||
let age = notification.timestamp.signed_duration_since(Utc::now());
|
||||
let seen = get_kv(window).await?;
|
||||
let seen = get_kv(app_handle).await?;
|
||||
if seen.contains(¬ification.id) || (age > Duration::days(2)) {
|
||||
debug!("Already seen notification {}", notification.id);
|
||||
return Ok(());
|
||||
continue;
|
||||
}
|
||||
debug!("Got notification {:?}", notification);
|
||||
|
||||
let _ = window.emit_to(window.label(), "notification", notification.clone());
|
||||
let _ = app_handle.emit_to(window.label(), "notification", notification.clone());
|
||||
break; // Only show one notification
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
async fn get_kv<R: Runtime>(w: &WebviewWindow<R>) -> Result<Vec<String>, String> {
|
||||
match get_key_value_raw(w, "notifications", "seen").await {
|
||||
async fn get_kv<R: Runtime>(app_handle: &AppHandle<R>) -> Result<Vec<String>> {
|
||||
match app_handle.db().get_key_value_raw("notifications", "seen") {
|
||||
None => Ok(Vec::new()),
|
||||
Some(v) => serde_json::from_str(&v.value).map_err(|e| e.to_string()),
|
||||
Some(v) => Ok(serde_json::from_str(&v.value)?),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,25 +1,24 @@
|
||||
use crate::http_request::send_http_request;
|
||||
use crate::render::{render_http_request, render_json_value};
|
||||
use crate::window::{create_window, CreateWindowConfig};
|
||||
use crate::window::{CreateWindowConfig, create_window};
|
||||
use crate::{
|
||||
call_frontend, cookie_jar_from_window, environment_from_window, get_window_from_window_context,
|
||||
workspace_from_window,
|
||||
};
|
||||
use chrono::Utc;
|
||||
use cookie::Cookie;
|
||||
use log::warn;
|
||||
use tauri::{AppHandle, Emitter, Manager, Runtime, State};
|
||||
use tauri_plugin_clipboard_manager::ClipboardExt;
|
||||
use yaak_models::models::{HttpResponse, Plugin};
|
||||
use yaak_models::queries::{
|
||||
create_default_http_response, delete_plugin_key_value, get_base_environment, get_http_request,
|
||||
get_plugin_key_value, list_http_responses_for_request, list_plugins, set_plugin_key_value,
|
||||
upsert_plugin, UpdateSource,
|
||||
};
|
||||
use yaak_models::query_manager::QueryManagerExt;
|
||||
use yaak_models::util::UpdateSource;
|
||||
use yaak_plugins::events::{
|
||||
Color, DeleteKeyValueResponse, EmptyPayload, FindHttpResponsesResponse,
|
||||
Color, DeleteKeyValueResponse, EmptyPayload, FindHttpResponsesResponse, GetCookieValueResponse,
|
||||
GetHttpRequestByIdResponse, GetKeyValueResponse, Icon, InternalEvent, InternalEventPayload,
|
||||
RenderHttpRequestResponse, SendHttpRequestResponse, SetKeyValueResponse, ShowToastRequest,
|
||||
TemplateRenderResponse, WindowContext, WindowNavigateEvent,
|
||||
ListCookieNamesResponse, PluginWindowContext, RenderHttpRequestResponse,
|
||||
SendHttpRequestResponse, SetKeyValueResponse, ShowToastRequest, TemplateRenderResponse,
|
||||
WindowNavigateEvent,
|
||||
};
|
||||
use yaak_plugins::manager::PluginManager;
|
||||
use yaak_plugins::plugin_handle::PluginHandle;
|
||||
@@ -42,7 +41,7 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
}
|
||||
InternalEventPayload::ShowToastRequest(req) => {
|
||||
match window_context {
|
||||
WindowContext::Label { label } => app_handle
|
||||
PluginWindowContext::Label { label, .. } => app_handle
|
||||
.emit_to(label, "show_toast", req)
|
||||
.expect("Failed to emit show_toast to window"),
|
||||
_ => app_handle.emit("show_toast", req).expect("Failed to emit show_toast"),
|
||||
@@ -55,19 +54,16 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
call_frontend(window, event).await
|
||||
}
|
||||
InternalEventPayload::FindHttpResponsesRequest(req) => {
|
||||
let http_responses = list_http_responses_for_request(
|
||||
app_handle,
|
||||
req.request_id.as_str(),
|
||||
req.limit.map(|l| l as i64),
|
||||
)
|
||||
.await
|
||||
.unwrap_or_default();
|
||||
let http_responses = app_handle
|
||||
.db()
|
||||
.list_http_responses_for_request(&req.request_id, req.limit.map(|l| l as u64))
|
||||
.unwrap_or_default();
|
||||
Some(InternalEventPayload::FindHttpResponsesResponse(FindHttpResponsesResponse {
|
||||
http_responses,
|
||||
}))
|
||||
}
|
||||
InternalEventPayload::GetHttpRequestByIdRequest(req) => {
|
||||
let http_request = get_http_request(app_handle, req.id.as_str()).await.unwrap();
|
||||
let http_request = app_handle.db().get_http_request(&req.id).ok();
|
||||
Some(InternalEventPayload::GetHttpRequestByIdResponse(GetHttpRequestByIdResponse {
|
||||
http_request,
|
||||
}))
|
||||
@@ -76,12 +72,12 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
let window = get_window_from_window_context(app_handle, &window_context)
|
||||
.expect("Failed to find window for render http request");
|
||||
|
||||
let workspace = workspace_from_window(&window)
|
||||
.await
|
||||
.expect("Failed to get workspace_id from window URL");
|
||||
let environment = environment_from_window(&window).await;
|
||||
let base_environment = get_base_environment(&window, workspace.id.as_str())
|
||||
.await
|
||||
let workspace =
|
||||
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
||||
let environment = environment_from_window(&window);
|
||||
let base_environment = app_handle
|
||||
.db()
|
||||
.get_base_environment(&workspace.id)
|
||||
.expect("Failed to get base environment");
|
||||
let cb = PluginTemplateCallback::new(app_handle, &window_context, req.purpose);
|
||||
let http_request = render_http_request(
|
||||
@@ -100,12 +96,12 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
let window = get_window_from_window_context(app_handle, &window_context)
|
||||
.expect("Failed to find window for render");
|
||||
|
||||
let workspace = workspace_from_window(&window)
|
||||
.await
|
||||
.expect("Failed to get workspace_id from window URL");
|
||||
let environment = environment_from_window(&window).await;
|
||||
let base_environment = get_base_environment(&window, workspace.id.as_str())
|
||||
.await
|
||||
let workspace =
|
||||
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
||||
let environment = environment_from_window(&window);
|
||||
let base_environment = app_handle
|
||||
.db()
|
||||
.get_base_environment(&workspace.id)
|
||||
.expect("Failed to get base environment");
|
||||
let cb = PluginTemplateCallback::new(app_handle, &window_context, req.purpose);
|
||||
let data = render_json_value(req.data, &base_environment, environment.as_ref(), &cb)
|
||||
@@ -114,10 +110,8 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
Some(InternalEventPayload::TemplateRenderResponse(TemplateRenderResponse { data }))
|
||||
}
|
||||
InternalEventPayload::ErrorResponse(resp) => {
|
||||
let window = get_window_from_window_context(app_handle, &window_context)
|
||||
.expect("Failed to find window for plugin reload");
|
||||
let toast_event = plugin_handle.build_event_to_send(
|
||||
&WindowContext::from_window(&window),
|
||||
&window_context,
|
||||
&InternalEventPayload::ShowToastRequest(ShowToastRequest {
|
||||
message: format!(
|
||||
"Plugin error from {}: {}",
|
||||
@@ -133,9 +127,7 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
None
|
||||
}
|
||||
InternalEventPayload::ReloadResponse(_) => {
|
||||
let window = get_window_from_window_context(app_handle, &window_context)
|
||||
.expect("Failed to find window for plugin reload");
|
||||
let plugins = list_plugins(app_handle).await.unwrap();
|
||||
let plugins = app_handle.db().list_plugins().unwrap();
|
||||
for plugin in plugins {
|
||||
if plugin.directory != plugin_handle.dir {
|
||||
continue;
|
||||
@@ -145,10 +137,10 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
updated_at: Utc::now().naive_utc(), // TODO: Add reloaded_at field to use instead
|
||||
..plugin
|
||||
};
|
||||
upsert_plugin(&window, new_plugin, &UpdateSource::Plugin).await.unwrap();
|
||||
app_handle.db().upsert_plugin(&new_plugin, &UpdateSource::Plugin).unwrap();
|
||||
}
|
||||
let toast_event = plugin_handle.build_event_to_send(
|
||||
&WindowContext::from_window(&window),
|
||||
&window_context,
|
||||
&InternalEventPayload::ShowToastRequest(ShowToastRequest {
|
||||
message: format!("Reloaded plugin {}", plugin_handle.dir),
|
||||
icon: Some(Icon::Info),
|
||||
@@ -163,32 +155,35 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
let window = get_window_from_window_context(app_handle, &window_context)
|
||||
.expect("Failed to find window for sending HTTP request");
|
||||
let mut http_request = req.http_request;
|
||||
let workspace = workspace_from_window(&window)
|
||||
.await
|
||||
.expect("Failed to get workspace_id from window URL");
|
||||
let cookie_jar = cookie_jar_from_window(&window).await;
|
||||
let environment = environment_from_window(&window).await;
|
||||
let workspace =
|
||||
workspace_from_window(&window).expect("Failed to get workspace_id from window URL");
|
||||
let cookie_jar = cookie_jar_from_window(&window);
|
||||
let environment = environment_from_window(&window);
|
||||
|
||||
if http_request.workspace_id.is_empty() {
|
||||
http_request.workspace_id = workspace.id;
|
||||
}
|
||||
|
||||
let resp = if http_request.id.is_empty() {
|
||||
HttpResponse::new()
|
||||
let http_response = if http_request.id.is_empty() {
|
||||
HttpResponse::default()
|
||||
} else {
|
||||
create_default_http_response(
|
||||
&window,
|
||||
http_request.id.as_str(),
|
||||
&UpdateSource::Plugin,
|
||||
)
|
||||
.await
|
||||
.unwrap()
|
||||
window
|
||||
.db()
|
||||
.upsert_http_response(
|
||||
&HttpResponse {
|
||||
request_id: http_request.id.clone(),
|
||||
workspace_id: http_request.workspace_id.clone(),
|
||||
..Default::default()
|
||||
},
|
||||
&UpdateSource::Plugin,
|
||||
)
|
||||
.unwrap()
|
||||
};
|
||||
|
||||
let result = send_http_request(
|
||||
&window,
|
||||
&http_request,
|
||||
&resp,
|
||||
&http_response,
|
||||
environment,
|
||||
cookie_jar,
|
||||
&mut tokio::sync::watch::channel(false).1, // No-op cancel channel
|
||||
@@ -223,13 +218,12 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
{
|
||||
let event_id = event.id.clone();
|
||||
let plugin_handle = plugin_handle.clone();
|
||||
let label = label.clone();
|
||||
let window_context = window_context.clone();
|
||||
tauri::async_runtime::spawn(async move {
|
||||
while let Some(url) = navigation_rx.recv().await {
|
||||
let url = url.to_string();
|
||||
let label = label.clone();
|
||||
let event_to_send = plugin_handle.build_event_to_send(
|
||||
&WindowContext::Label { label },
|
||||
&window_context, // NOTE: Sending existing context on purpose here
|
||||
&InternalEventPayload::WindowNavigateEvent(WindowNavigateEvent { url }),
|
||||
Some(event_id.clone()),
|
||||
);
|
||||
@@ -241,12 +235,11 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
{
|
||||
let event_id = event.id.clone();
|
||||
let plugin_handle = plugin_handle.clone();
|
||||
let label = label.clone();
|
||||
let window_context = window_context.clone();
|
||||
tauri::async_runtime::spawn(async move {
|
||||
while let Some(_) = close_rx.recv().await {
|
||||
let label = label.clone();
|
||||
let event_to_send = plugin_handle.build_event_to_send(
|
||||
&WindowContext::Label { label },
|
||||
&window_context,
|
||||
&InternalEventPayload::WindowCloseEvent,
|
||||
Some(event_id.clone()),
|
||||
);
|
||||
@@ -265,19 +258,46 @@ pub(crate) async fn handle_plugin_event<R: Runtime>(
|
||||
}
|
||||
InternalEventPayload::SetKeyValueRequest(req) => {
|
||||
let name = plugin_handle.name().await;
|
||||
set_plugin_key_value(app_handle, &name, &req.key, &req.value).await;
|
||||
app_handle.db().set_plugin_key_value(&name, &req.key, &req.value);
|
||||
Some(InternalEventPayload::SetKeyValueResponse(SetKeyValueResponse {}))
|
||||
}
|
||||
InternalEventPayload::GetKeyValueRequest(req) => {
|
||||
let name = plugin_handle.name().await;
|
||||
let value = get_plugin_key_value(app_handle, &name, &req.key).await.map(|v| v.value);
|
||||
let value = app_handle.db().get_plugin_key_value(&name, &req.key).map(|v| v.value);
|
||||
Some(InternalEventPayload::GetKeyValueResponse(GetKeyValueResponse { value }))
|
||||
}
|
||||
InternalEventPayload::DeleteKeyValueRequest(req) => {
|
||||
let name = plugin_handle.name().await;
|
||||
let deleted = delete_plugin_key_value(app_handle, &name, &req.key).await;
|
||||
let deleted = app_handle.db().delete_plugin_key_value(&name, &req.key).unwrap();
|
||||
Some(InternalEventPayload::DeleteKeyValueResponse(DeleteKeyValueResponse { deleted }))
|
||||
}
|
||||
InternalEventPayload::ListCookieNamesRequest(_req) => {
|
||||
let window = get_window_from_window_context(app_handle, &window_context)
|
||||
.expect("Failed to find window for listing cookies");
|
||||
let names = match cookie_jar_from_window(&window) {
|
||||
None => Vec::new(),
|
||||
Some(j) => j
|
||||
.cookies
|
||||
.into_iter()
|
||||
.filter_map(|c| Cookie::parse(c.raw_cookie).ok().map(|c| c.name().to_string()))
|
||||
.collect(),
|
||||
};
|
||||
Some(InternalEventPayload::ListCookieNamesResponse(ListCookieNamesResponse { names }))
|
||||
}
|
||||
InternalEventPayload::GetCookieValueRequest(req) => {
|
||||
let window = get_window_from_window_context(app_handle, &window_context)
|
||||
.expect("Failed to find window for listing cookies");
|
||||
let value = match cookie_jar_from_window(&window) {
|
||||
None => None,
|
||||
Some(j) => j.cookies.into_iter().find_map(|c| match Cookie::parse(c.raw_cookie) {
|
||||
Ok(c) if c.name().to_string().eq(&req.name) => {
|
||||
Some(c.value_trimmed().to_string())
|
||||
}
|
||||
_ => None,
|
||||
}),
|
||||
};
|
||||
Some(InternalEventPayload::GetCookieValueResponse(GetCookieValueResponse { value }))
|
||||
}
|
||||
_ => None,
|
||||
};
|
||||
|
||||
|
||||
@@ -2,7 +2,7 @@ use serde_json::Value;
|
||||
use std::collections::{BTreeMap, HashMap};
|
||||
use yaak_http::apply_path_placeholders;
|
||||
use yaak_models::models::{
|
||||
Environment, GrpcMetadataEntry, GrpcRequest, HttpRequest, HttpRequestHeader, HttpUrlParameter,
|
||||
Environment, GrpcRequest, HttpRequest, HttpRequestHeader, HttpUrlParameter,
|
||||
};
|
||||
use yaak_models::render::make_vars_hashmap;
|
||||
use yaak_templates::{parse_and_render, render_json_value_raw, TemplateCallback};
|
||||
@@ -37,7 +37,7 @@ pub async fn render_grpc_request<T: TemplateCallback>(
|
||||
|
||||
let mut metadata = Vec::new();
|
||||
for p in r.metadata.clone() {
|
||||
metadata.push(GrpcMetadataEntry {
|
||||
metadata.push(HttpRequestHeader {
|
||||
enabled: p.enabled,
|
||||
name: render(p.name.as_str(), vars, cb).await?,
|
||||
value: render(p.value.as_str(), vars, cb).await?,
|
||||
|
||||
@@ -1,12 +1,13 @@
|
||||
use std::fmt::{Display, Formatter};
|
||||
use std::time::SystemTime;
|
||||
|
||||
use crate::error::Result;
|
||||
use log::info;
|
||||
use tauri::{AppHandle, Manager};
|
||||
use tauri::{Manager, Runtime, WebviewWindow};
|
||||
use tauri_plugin_dialog::{DialogExt, MessageDialogButtons};
|
||||
use tauri_plugin_updater::UpdaterExt;
|
||||
use tokio::task::block_in_place;
|
||||
use yaak_models::queries::get_or_create_settings;
|
||||
use yaak_models::query_manager::QueryManagerExt;
|
||||
use yaak_plugins::manager::PluginManager;
|
||||
|
||||
use crate::is_dev;
|
||||
@@ -59,30 +60,30 @@ impl YaakUpdater {
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn check_now(
|
||||
pub async fn check_now<R: Runtime>(
|
||||
&mut self,
|
||||
app_handle: &AppHandle,
|
||||
window: &WebviewWindow<R>,
|
||||
mode: UpdateMode,
|
||||
update_trigger: UpdateTrigger,
|
||||
) -> Result<bool, tauri_plugin_updater::Error> {
|
||||
let settings = get_or_create_settings(app_handle).await;
|
||||
) -> Result<bool> {
|
||||
let settings = window.db().get_settings();
|
||||
let update_key = format!("{:x}", md5::compute(settings.id));
|
||||
self.last_update_check = SystemTime::now();
|
||||
|
||||
info!("Checking for updates mode={}", mode);
|
||||
|
||||
let h = app_handle.clone();
|
||||
let update_check_result = app_handle
|
||||
let w = window.clone();
|
||||
let update_check_result = w
|
||||
.updater_builder()
|
||||
.on_before_exit(move || {
|
||||
// Kill plugin manager before exit or NSIS installer will fail to replace sidecar
|
||||
// while it's running.
|
||||
// NOTE: This is only called on Windows
|
||||
let h = h.clone();
|
||||
let w = w.clone();
|
||||
block_in_place(|| {
|
||||
tauri::async_runtime::block_on(async move {
|
||||
info!("Shutting down plugin manager before update");
|
||||
let plugin_manager = h.state::<PluginManager>();
|
||||
let plugin_manager = w.state::<PluginManager>();
|
||||
plugin_manager.terminate().await;
|
||||
});
|
||||
});
|
||||
@@ -100,11 +101,11 @@ impl YaakUpdater {
|
||||
.check()
|
||||
.await;
|
||||
|
||||
match update_check_result {
|
||||
Ok(Some(update)) => {
|
||||
let h = app_handle.clone();
|
||||
app_handle
|
||||
.dialog()
|
||||
let result = match update_check_result? {
|
||||
None => false,
|
||||
Some(update) => {
|
||||
let w = window.clone();
|
||||
w.dialog()
|
||||
.message(format!(
|
||||
"{} is available. Would you like to download and install it now?",
|
||||
update.version
|
||||
@@ -121,7 +122,7 @@ impl YaakUpdater {
|
||||
tauri::async_runtime::spawn(async move {
|
||||
match update.download_and_install(|_, _| {}, || {}).await {
|
||||
Ok(_) => {
|
||||
if h.dialog()
|
||||
if w.dialog()
|
||||
.message("Would you like to restart the app?")
|
||||
.title("Update Installed")
|
||||
.buttons(MessageDialogButtons::OkCancelCustom(
|
||||
@@ -130,27 +131,27 @@ impl YaakUpdater {
|
||||
))
|
||||
.blocking_show()
|
||||
{
|
||||
h.restart();
|
||||
w.app_handle().restart();
|
||||
}
|
||||
}
|
||||
Err(e) => {
|
||||
h.dialog()
|
||||
w.dialog()
|
||||
.message(format!("The update failed to install: {}", e));
|
||||
}
|
||||
}
|
||||
});
|
||||
});
|
||||
Ok(true)
|
||||
true
|
||||
}
|
||||
Ok(None) => Ok(false),
|
||||
Err(e) => Err(e),
|
||||
}
|
||||
};
|
||||
|
||||
Ok(result)
|
||||
}
|
||||
pub async fn maybe_check(
|
||||
pub async fn maybe_check<R: Runtime>(
|
||||
&mut self,
|
||||
app_handle: &AppHandle,
|
||||
window: &WebviewWindow<R>,
|
||||
mode: UpdateMode,
|
||||
) -> Result<bool, tauri_plugin_updater::Error> {
|
||||
) -> Result<bool> {
|
||||
let update_period_seconds = match mode {
|
||||
UpdateMode::Stable => MAX_UPDATE_CHECK_HOURS_STABLE,
|
||||
UpdateMode::Beta => MAX_UPDATE_CHECK_HOURS_BETA,
|
||||
@@ -167,6 +168,6 @@ impl YaakUpdater {
|
||||
return Ok(false);
|
||||
}
|
||||
|
||||
self.check_now(app_handle, mode, UpdateTrigger::Background).await
|
||||
self.check_now(window, mode, UpdateTrigger::Background).await
|
||||
}
|
||||
}
|
||||
|
||||
25
src-tauri/src/uri_scheme.rs
Normal file
@@ -0,0 +1,25 @@
use log::{info, warn};
use tauri::{Manager, Runtime, UriSchemeContext};

pub(crate) fn handle_uri_scheme<R: Runtime>(
    a: UriSchemeContext<R>,
    req: http::Request<Vec<u8>>,
) -> http::Response<Vec<u8>> {
    println!("------------- Yaak URI scheme invoked!");
    let uri = req.uri();
    let window = a
        .app_handle()
        .get_webview_window(a.webview_label())
        .expect("Failed to get webview window for URI scheme event");
    info!("Yaak URI scheme invoked with {uri:?} {window:?}");

    let path = uri.path();
    if path == "/data/import" {
        warn!("TODO: import data")
    } else if path == "/plugins/install" {
        warn!("TODO: install plugin")
    }

    let msg = format!("No handler found for {path}");
    tauri::http::Response::builder().status(404).body(msg.as_bytes().to_vec()).unwrap()
}
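For this handler to receive deep links, it has to be registered as a custom URI scheme on the Tauri builder. The actual registration is in `src-tauri/src/lib.rs` (diff suppressed above), so the snippet below is only a sketch under assumptions: the scheme name "yaak" and the builder setup are illustrative, and the exact builder API can vary between Tauri versions.

```rust
// Hypothetical sketch: routing yaak://... links to handle_uri_scheme().
// The real wiring lives in lib.rs and may differ from this.
tauri::Builder::default()
    .register_uri_scheme_protocol("yaak", crate::uri_scheme::handle_uri_scheme)
    .run(tauri::generate_context!())
    .expect("error while running tauri application");
```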
@@ -1,6 +1,6 @@
use crate::window_menu::app_menu;
use crate::{DEFAULT_WINDOW_HEIGHT, DEFAULT_WINDOW_WIDTH, MIN_WINDOW_HEIGHT, MIN_WINDOW_WIDTH};
use log::{info, warn};
use rand::random;
use std::process::exit;
use tauri::{
AppHandle, Emitter, LogicalSize, Manager, Runtime, WebviewUrl, WebviewWindow, WindowEvent,
@@ -8,6 +8,15 @@ use tauri::{
use tauri_plugin_opener::OpenerExt;
use tokio::sync::mpsc;

const DEFAULT_WINDOW_WIDTH: f64 = 1100.0;
const DEFAULT_WINDOW_HEIGHT: f64 = 600.0;

const MIN_WINDOW_WIDTH: f64 = 300.0;
const MIN_WINDOW_HEIGHT: f64 = 300.0;

pub(crate) const MAIN_WINDOW_PREFIX: &str = "main_";
const OTHER_WINDOW_PREFIX: &str = "other_";

#[derive(Default, Debug)]
pub(crate) struct CreateWindowConfig<'s> {
pub url: &'s str,
@@ -55,7 +64,10 @@ pub(crate) fn create_window<R: Runtime>(
// macOS doesn't support data dir so must use this fn instead
#[cfg(target_os = "macos")]
{
win_builder = win_builder.data_store_identifier(to_fixed_hash(&key));
let hash = md5::compute(key.as_bytes());
let mut id = [0u8; 16];
id.copy_from_slice(&hash[..16]); // Take the first 16 bytes of the hash
win_builder = win_builder.data_store_identifier(id);
}
}

@@ -159,9 +171,94 @@ pub(crate) fn create_window<R: Runtime>(
win
}

fn to_fixed_hash(s: &str) -> [u8; 16] {
let hash = md5::compute(s.as_bytes());
let mut fixed = [0u8; 16];
fixed.copy_from_slice(&hash[..16]); // Take the first 16 bytes of the hash
fixed
pub(crate) fn create_main_window(handle: &AppHandle, url: &str) -> WebviewWindow {
let mut counter = 0;
let label = loop {
let label = format!("{MAIN_WINDOW_PREFIX}{counter}");
match handle.webview_windows().get(label.as_str()) {
None => break Some(label),
Some(_) => counter += 1,
}
}
.expect("Failed to generate label for new window");

let config = CreateWindowConfig {
url,
label: label.as_str(),
title: "Yaak",
inner_size: Some((DEFAULT_WINDOW_WIDTH, DEFAULT_WINDOW_HEIGHT)),
position: Some((
// Offset by random amount so it's easier to differentiate
100.0 + random::<f64>() * 20.0,
100.0 + random::<f64>() * 20.0,
)),
hide_titlebar: true,
..Default::default()
};

create_window(handle, config)
}

pub(crate) fn create_child_window(parent_window: &WebviewWindow, url: &str, label: &str, title: &str, inner_size: (f64, f64)) -> WebviewWindow {
let app_handle = parent_window.app_handle();
let label = format!("{OTHER_WINDOW_PREFIX}_{label}");
let scale_factor = parent_window.scale_factor().unwrap();

let current_pos = parent_window.inner_position().unwrap().to_logical::<f64>(scale_factor);
let current_size = parent_window.inner_size().unwrap().to_logical::<f64>(scale_factor);

// Position the new window in the middle of the parent
let position = (
current_pos.x + current_size.width / 2.0 - inner_size.0 / 2.0,
current_pos.y + current_size.height / 2.0 - inner_size.1 / 2.0,
);

let config = CreateWindowConfig {
label: label.as_str(),
title,
url,
inner_size: Some(inner_size),
position: Some(position),
hide_titlebar: true,
..Default::default()
};

let child_window = create_window(&app_handle, config);

// NOTE: These listeners will remain active even when the windows close. Unfortunately,
// there's no way to unlisten to events for now, so we just have to be defensive.

{
let parent_window = parent_window.clone();
let child_window = child_window.clone();
child_window.clone().on_window_event(move |e| match e {
// When the new window is destroyed, bring the other up behind it
WindowEvent::Destroyed => {
if let Some(w) = parent_window.get_webview_window(child_window.label()) {
w.set_focus().unwrap();
}
}
_ => {}
});
}

{
let parent_window = parent_window.clone();
let child_window = child_window.clone();
parent_window.clone().on_window_event(move |e| match e {
// When the parent window is closed, close the child
WindowEvent::CloseRequested { .. } => child_window.destroy().unwrap(),
// When the parent window is focused, bring the child above
WindowEvent::Focused(focus) => {
if *focus {
if let Some(w) = parent_window.get_webview_window(child_window.label()) {
w.set_focus().unwrap();
};
}
}
_ => {}
});
}

child_window
}
@@ -32,6 +32,7 @@ var import_node_fs = require("node:fs");
|
||||
async function getAccessToken(ctx, {
|
||||
accessTokenUrl,
|
||||
scope,
|
||||
audience,
|
||||
params,
|
||||
grantType,
|
||||
credentialsInBody,
|
||||
@@ -56,6 +57,7 @@ async function getAccessToken(ctx, {
|
||||
]
|
||||
};
|
||||
if (scope) httpRequest.body.form.push({ name: "scope", value: scope });
|
||||
if (scope) httpRequest.body.form.push({ name: "audience", value: audience });
|
||||
if (credentialsInBody) {
|
||||
httpRequest.body.form.push({ name: "client_id", value: clientId });
|
||||
httpRequest.body.form.push({ name: "client_secret", value: clientSecret });
|
||||
@@ -64,10 +66,10 @@ async function getAccessToken(ctx, {
|
||||
httpRequest.headers.push({ name: "Authorization", value });
|
||||
}
|
||||
const resp = await ctx.httpRequest.send({ httpRequest });
|
||||
const body = resp.bodyPath ? (0, import_node_fs.readFileSync)(resp.bodyPath, "utf8") : "";
|
||||
if (resp.status < 200 || resp.status >= 300) {
|
||||
throw new Error("Failed to fetch access token with status=" + resp.status);
|
||||
throw new Error("Failed to fetch access token with status=" + resp.status + " and body=" + body);
|
||||
}
|
||||
const body = (0, import_node_fs.readFileSync)(resp.bodyPath ?? "", "utf8");
|
||||
let response;
|
||||
try {
|
||||
response = JSON.parse(body);
|
||||
@@ -168,10 +170,10 @@ async function getOrRefreshAccessToken(ctx, contextId, {
|
||||
await deleteToken(ctx, contextId);
|
||||
return null;
|
||||
}
|
||||
const body = resp.bodyPath ? (0, import_node_fs2.readFileSync)(resp.bodyPath, "utf8") : "";
|
||||
if (resp.status < 200 || resp.status >= 300) {
|
||||
throw new Error("Failed to fetch access token with status=" + resp.status);
|
||||
throw new Error("Failed to refresh access token with status=" + resp.status + " and body=" + body);
|
||||
}
|
||||
const body = (0, import_node_fs2.readFileSync)(resp.bodyPath ?? "", "utf8");
|
||||
let response;
|
||||
try {
|
||||
response = JSON.parse(body);
|
||||
@@ -201,6 +203,7 @@ async function getAuthorizationCode(ctx, contextId, {
|
||||
redirectUri,
|
||||
scope,
|
||||
state,
|
||||
audience,
|
||||
credentialsInBody,
|
||||
pkce
|
||||
}) {
|
||||
@@ -220,6 +223,7 @@ async function getAuthorizationCode(ctx, contextId, {
|
||||
if (redirectUri) authorizationUrl.searchParams.set("redirect_uri", redirectUri);
|
||||
if (scope) authorizationUrl.searchParams.set("scope", scope);
|
||||
if (state) authorizationUrl.searchParams.set("state", state);
|
||||
if (audience) authorizationUrl.searchParams.set("audience", audience);
|
||||
if (pkce) {
|
||||
const verifier = pkce.codeVerifier || createPkceCodeVerifier();
|
||||
const challengeMethod = pkce.challengeMethod || DEFAULT_PKCE_METHOD;
|
||||
@@ -256,6 +260,7 @@ async function getAuthorizationCode(ctx, contextId, {
|
||||
clientId,
|
||||
clientSecret,
|
||||
scope,
|
||||
audience,
|
||||
credentialsInBody,
|
||||
params: [
|
||||
{ name: "code", value: code },
|
||||
@@ -291,6 +296,7 @@ async function getClientCredentials(ctx, contextId, {
|
||||
clientId,
|
||||
clientSecret,
|
||||
scope,
|
||||
audience,
|
||||
credentialsInBody
|
||||
}) {
|
||||
const token = await getToken(ctx, contextId);
|
||||
@@ -299,6 +305,7 @@ async function getClientCredentials(ctx, contextId, {
|
||||
const response = await getAccessToken(ctx, {
|
||||
grantType: "client_credentials",
|
||||
accessTokenUrl,
|
||||
audience,
|
||||
clientId,
|
||||
clientSecret,
|
||||
scope,
|
||||
@@ -315,38 +322,46 @@ function getImplicit(ctx, contextId, {
|
||||
clientId,
|
||||
redirectUri,
|
||||
scope,
|
||||
state
|
||||
state,
|
||||
audience
|
||||
}) {
|
||||
return new Promise(async (resolve, reject) => {
|
||||
const token = await getToken(ctx, contextId);
|
||||
if (token) {
|
||||
}
|
||||
const authorizationUrl = new URL(`${authorizationUrlRaw ?? ""}`);
|
||||
authorizationUrl.searchParams.set("response_type", "code");
|
||||
authorizationUrl.searchParams.set("response_type", "token");
|
||||
authorizationUrl.searchParams.set("client_id", clientId);
|
||||
if (redirectUri) authorizationUrl.searchParams.set("redirect_uri", redirectUri);
|
||||
if (scope) authorizationUrl.searchParams.set("scope", scope);
|
||||
if (state) authorizationUrl.searchParams.set("state", state);
|
||||
if (audience) authorizationUrl.searchParams.set("audience", audience);
|
||||
if (responseType.includes("id_token")) {
|
||||
authorizationUrl.searchParams.set("nonce", String(Math.floor(Math.random() * 9999999999999) + 1));
|
||||
}
|
||||
const authorizationUrlStr = authorizationUrl.toString();
|
||||
let foundAccessToken = false;
|
||||
let { close } = await ctx.window.openUrl({
|
||||
url: authorizationUrlStr,
|
||||
label: "oauth-authorization-url",
|
||||
async onClose() {
|
||||
if (!foundAccessToken) {
|
||||
reject(new Error("Authorization window closed"));
|
||||
}
|
||||
},
|
||||
async onNavigate({ url: urlStr }) {
|
||||
const url = new URL(urlStr);
|
||||
if (url.searchParams.has("error")) {
|
||||
return reject(Error(`Failed to authorize: ${url.searchParams.get("error")}`));
|
||||
}
|
||||
close();
|
||||
const hash = url.hash.slice(1);
|
||||
const params = new URLSearchParams(hash);
|
||||
const idToken = params.get("id_token");
|
||||
if (idToken) {
|
||||
params.set("access_token", idToken);
|
||||
params.delete("id_token");
|
||||
const accessToken = params.get("access_token");
|
||||
if (!accessToken) {
|
||||
return;
|
||||
}
|
||||
foundAccessToken = true;
|
||||
close();
|
||||
const response = Object.fromEntries(params);
|
||||
try {
|
||||
resolve(await storeToken(ctx, contextId, response));
|
||||
@@ -366,6 +381,7 @@ async function getPassword(ctx, contextId, {
|
||||
username,
|
||||
password,
|
||||
credentialsInBody,
|
||||
audience,
|
||||
scope
|
||||
}) {
|
||||
const token = await getOrRefreshAccessToken(ctx, contextId, {
|
||||
@@ -383,6 +399,7 @@ async function getPassword(ctx, contextId, {
|
||||
clientId,
|
||||
clientSecret,
|
||||
scope,
|
||||
audience,
|
||||
grantType: "password",
|
||||
credentialsInBody,
|
||||
params: [
|
||||
@@ -530,6 +547,12 @@ var plugin = {
|
||||
optional: true,
|
||||
dynamic: hiddenIfNot(["authorization_code", "implicit"])
|
||||
},
|
||||
{
|
||||
type: "text",
|
||||
name: "audience",
|
||||
label: "Audience",
|
||||
optional: true
|
||||
},
|
||||
{
|
||||
type: "checkbox",
|
||||
name: "usePkce",
|
||||
@@ -635,6 +658,7 @@ var plugin = {
|
||||
clientSecret: stringArg(values, "clientSecret"),
|
||||
redirectUri: stringArgOrNull(values, "redirectUri"),
|
||||
scope: stringArgOrNull(values, "scope"),
|
||||
audience: stringArgOrNull(values, "audience"),
|
||||
state: stringArgOrNull(values, "state"),
|
||||
credentialsInBody,
|
||||
pkce: values.usePkce ? {
|
||||
@@ -650,6 +674,7 @@ var plugin = {
|
||||
redirectUri: stringArgOrNull(values, "redirectUri"),
|
||||
responseType: stringArg(values, "responseType"),
|
||||
scope: stringArgOrNull(values, "scope"),
|
||||
audience: stringArgOrNull(values, "audience"),
|
||||
state: stringArgOrNull(values, "state")
|
||||
});
|
||||
} else if (grantType === "client_credentials") {
|
||||
@@ -659,6 +684,7 @@ var plugin = {
|
||||
clientId: stringArg(values, "clientId"),
|
||||
clientSecret: stringArg(values, "clientSecret"),
|
||||
scope: stringArgOrNull(values, "scope"),
|
||||
audience: stringArgOrNull(values, "audience"),
|
||||
credentialsInBody
|
||||
});
|
||||
} else if (grantType === "password") {
|
||||
@@ -670,6 +696,7 @@ var plugin = {
|
||||
username: stringArg(values, "username"),
|
||||
password: stringArg(values, "password"),
|
||||
scope: stringArgOrNull(values, "scope"),
|
||||
audience: stringArgOrNull(values, "audience"),
|
||||
credentialsInBody
|
||||
});
|
||||
} else {
|
||||
|
||||
1287
src-tauri/vendored/plugins/filter-jsonpath/build/index.js
generated
File diff suppressed because it is too large
@@ -7,7 +7,7 @@
"dev": "yaakcli dev ./src/index.js"
},
"dependencies": {
"jsonpath-plus": "^9.0.0"
"jsonpath-plus": "^10.3.0"
},
"devDependencies": {
"@types/jsonpath": "^0.2.4"
@@ -542,20 +542,23 @@ function pairsToDataParameters(keyedPairs) {
}
for (const p of pairs) {
if (typeof p !== "string") continue;
const [name, value] = p.split("=");
if (p.startsWith("@")) {
dataParameters.push({
name: name ?? "",
value: "",
filePath: p.slice(1),
enabled: true
});
} else {
dataParameters.push({
name: name ?? "",
value: flagName === "data-urlencode" ? encodeURIComponent(value ?? "") : value ?? "",
enabled: true
});
let params = p.split("&");
for (const param of params) {
const [name, value] = param.split("=");
if (param.startsWith("@")) {
dataParameters.push({
name: name ?? "",
value: "",
filePath: param.slice(1),
enabled: true
});
} else {
dataParameters.push({
name: name ?? "",
value: flagName === "data-urlencode" ? encodeURIComponent(value ?? "") : value ?? "",
enabled: true
});
}
}
}
}
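The hunk above changes the cURL importer so that a single --data flag such as `a=1&b=2` expands into multiple form parameters instead of one. A minimal standalone sketch of that splitting behavior, assuming a simplified parameter shape (the `splitDataFlag` name and `DataParameter` type are illustrative, not the plugin's real exports):

```ts
type DataParameter = { name: string; value: string; filePath?: string; enabled: boolean };

// Split one --data / --data-urlencode value into individual parameters, as above.
function splitDataFlag(raw: string, flagName: string): DataParameter[] {
  const dataParameters: DataParameter[] = [];
  for (const param of raw.split("&")) {
    const [name, value] = param.split("=");
    if (param.startsWith("@")) {
      // "@path" means curl reads the value from a file
      dataParameters.push({ name: name ?? "", value: "", filePath: param.slice(1), enabled: true });
    } else {
      dataParameters.push({
        name: name ?? "",
        value: flagName === "data-urlencode" ? encodeURIComponent(value ?? "") : value ?? "",
        enabled: true,
      });
    }
  }
  return dataParameters;
}

// splitDataFlag("a=1&b=two words", "data-urlencode")
//   -> [{ name: "a", value: "1", ... }, { name: "b", value: "two%20words", ... }]
```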
@@ -7296,35 +7296,48 @@ __export(src_exports, {
|
||||
});
|
||||
module.exports = __toCommonJS(src_exports);
|
||||
var import_yaml = __toESM(require_dist());
|
||||
var plugin = {
|
||||
importer: {
|
||||
name: "Insomnia",
|
||||
description: "Import Insomnia workspaces",
|
||||
onImport(_ctx, args) {
|
||||
return convertInsomnia(args.text);
|
||||
}
|
||||
|
||||
// src/common.ts
|
||||
function convertSyntax(variable) {
|
||||
if (!isJSString(variable)) return variable;
|
||||
return variable.replaceAll(/{{\s*(_\.)?([^}]+)\s*}}/g, "${[$2]}");
|
||||
}
|
||||
function isJSObject(obj) {
|
||||
return Object.prototype.toString.call(obj) === "[object Object]";
|
||||
}
|
||||
function isJSString(obj) {
|
||||
return Object.prototype.toString.call(obj) === "[object String]";
|
||||
}
|
||||
function convertId(id) {
|
||||
if (id.startsWith("GENERATE_ID::")) {
|
||||
return id;
|
||||
}
|
||||
};
|
||||
function convertInsomnia(contents) {
|
||||
let parsed;
|
||||
try {
|
||||
parsed = JSON.parse(contents);
|
||||
} catch (e) {
|
||||
return `GENERATE_ID::${id}`;
|
||||
}
|
||||
function deleteUndefinedAttrs(obj) {
|
||||
if (Array.isArray(obj) && obj != null) {
|
||||
return obj.map(deleteUndefinedAttrs);
|
||||
} else if (typeof obj === "object" && obj != null) {
|
||||
return Object.fromEntries(
|
||||
Object.entries(obj).filter(([, v]) => v !== void 0).map(([k, v]) => [k, deleteUndefinedAttrs(v)])
|
||||
);
|
||||
} else {
|
||||
return obj;
|
||||
}
|
||||
try {
|
||||
parsed = parsed ?? import_yaml.default.parse(contents);
|
||||
} catch (e) {
|
||||
}
|
||||
if (!isJSObject(parsed)) return;
|
||||
if (!Array.isArray(parsed.resources)) return;
|
||||
}
|
||||
|
||||
// src/v4.ts
|
||||
function convertInsomniaV4(parsed) {
|
||||
if (!Array.isArray(parsed.resources)) return null;
|
||||
const resources = {
|
||||
workspaces: [],
|
||||
httpRequests: [],
|
||||
grpcRequests: [],
|
||||
environments: [],
|
||||
folders: []
|
||||
folders: [],
|
||||
grpcRequests: [],
|
||||
httpRequests: [],
|
||||
websocketRequests: [],
|
||||
workspaces: []
|
||||
};
|
||||
const workspacesToImport = parsed.resources.filter(isWorkspace);
|
||||
const workspacesToImport = parsed.resources.filter((r) => isJSObject(r) && r._type === "workspace");
|
||||
for (const w of workspacesToImport) {
|
||||
resources.workspaces.push({
|
||||
id: convertId(w._id),
|
||||
@@ -7335,25 +7348,25 @@ function convertInsomnia(contents) {
|
||||
description: w.description || void 0
|
||||
});
|
||||
const environmentsToImport = parsed.resources.filter(
|
||||
(r) => isEnvironment(r)
|
||||
(r) => isJSObject(r) && r._type === "environment"
|
||||
);
|
||||
resources.environments.push(
|
||||
...environmentsToImport.map((r) => importEnvironment(r, w._id))
|
||||
);
|
||||
const nextFolder = (parentId) => {
|
||||
const children = parsed.resources.filter((r) => r.parentId === parentId);
|
||||
let sortPriority = 0;
|
||||
for (const child of children) {
|
||||
if (isRequestGroup(child)) {
|
||||
if (!isJSObject(child)) continue;
|
||||
if (child._type === "request_group") {
|
||||
resources.folders.push(importFolder(child, w._id));
|
||||
nextFolder(child._id);
|
||||
} else if (isHttpRequest(child)) {
|
||||
} else if (child._type === "request") {
|
||||
resources.httpRequests.push(
|
||||
importHttpRequest(child, w._id, sortPriority++)
|
||||
importHttpRequest(child, w._id)
|
||||
);
|
||||
} else if (isGrpcRequest(child)) {
|
||||
} else if (child._type === "grpc_request") {
|
||||
resources.grpcRequests.push(
|
||||
importGrpcRequest(child, w._id, sortPriority++)
|
||||
importGrpcRequest(child, w._id)
|
||||
);
|
||||
}
|
||||
}
|
||||
@@ -7364,62 +7377,9 @@ function convertInsomnia(contents) {
|
||||
resources.grpcRequests = resources.grpcRequests.filter(Boolean);
|
||||
resources.environments = resources.environments.filter(Boolean);
|
||||
resources.workspaces = resources.workspaces.filter(Boolean);
|
||||
return { resources: deleteUndefinedAttrs(resources) };
|
||||
return { resources };
|
||||
}
|
||||
function importEnvironment(e, workspaceId) {
|
||||
return {
|
||||
id: convertId(e._id),
|
||||
createdAt: e.created ? new Date(e.created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: e.updated ? new Date(e.updated).toISOString().replace("Z", "") : void 0,
|
||||
workspaceId: convertId(workspaceId),
|
||||
environmentId: e.parentId === workspaceId ? null : convertId(e.parentId),
|
||||
model: "environment",
|
||||
name: e.name,
|
||||
variables: Object.entries(e.data).map(([name, value]) => ({
|
||||
enabled: true,
|
||||
name,
|
||||
value: `${value}`
|
||||
}))
|
||||
};
|
||||
}
|
||||
function importFolder(f, workspaceId) {
|
||||
return {
|
||||
id: convertId(f._id),
|
||||
createdAt: f.created ? new Date(f.created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: f.updated ? new Date(f.updated).toISOString().replace("Z", "") : void 0,
|
||||
folderId: f.parentId === workspaceId ? null : convertId(f.parentId),
|
||||
workspaceId: convertId(workspaceId),
|
||||
description: f.description || void 0,
|
||||
model: "folder",
|
||||
name: f.name
|
||||
};
|
||||
}
|
||||
function importGrpcRequest(r, workspaceId, sortPriority = 0) {
|
||||
const parts = r.protoMethodName.split("/").filter((p) => p !== "");
|
||||
const service = parts[0] ?? null;
|
||||
const method = parts[1] ?? null;
|
||||
return {
|
||||
id: convertId(r._id),
|
||||
createdAt: r.created ? new Date(r.created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: r.updated ? new Date(r.updated).toISOString().replace("Z", "") : void 0,
|
||||
workspaceId: convertId(workspaceId),
|
||||
folderId: r.parentId === workspaceId ? null : convertId(r.parentId),
|
||||
model: "grpc_request",
|
||||
sortPriority,
|
||||
name: r.name,
|
||||
description: r.description || void 0,
|
||||
url: convertSyntax(r.url),
|
||||
service,
|
||||
method,
|
||||
message: r.body?.text ?? "",
|
||||
metadata: (r.metadata ?? []).map((h) => ({
|
||||
enabled: !h.disabled,
|
||||
name: h.name ?? "",
|
||||
value: h.value ?? ""
|
||||
})).filter(({ name, value }) => name !== "" || value !== "")
|
||||
};
|
||||
}
|
||||
function importHttpRequest(r, workspaceId, sortPriority = 0) {
|
||||
function importHttpRequest(r, workspaceId) {
|
||||
let bodyType = null;
|
||||
let body = {};
|
||||
if (r.body.mimeType === "application/octet-stream") {
|
||||
@@ -7466,13 +7426,13 @@ function importHttpRequest(r, workspaceId, sortPriority = 0) {
|
||||
};
|
||||
}
|
||||
return {
|
||||
id: convertId(r._id),
|
||||
id: convertId(r.meta?.id ?? r._id),
|
||||
createdAt: r.created ? new Date(r.created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: r.updated ? new Date(r.updated).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: r.modified ? new Date(r.modified).toISOString().replace("Z", "") : void 0,
|
||||
workspaceId: convertId(workspaceId),
|
||||
folderId: r.parentId === workspaceId ? null : convertId(r.parentId),
|
||||
model: "http_request",
|
||||
sortPriority,
|
||||
sortPriority: r.metaSortKey,
|
||||
name: r.name,
|
||||
description: r.description || void 0,
|
||||
url: convertSyntax(r.url),
|
||||
@@ -7488,47 +7448,309 @@ function importHttpRequest(r, workspaceId, sortPriority = 0) {
|
||||
})).filter(({ name, value }) => name !== "" || value !== "")
|
||||
};
|
||||
}
|
||||
function convertSyntax(variable) {
|
||||
if (!isJSString(variable)) return variable;
|
||||
return variable.replaceAll(/{{\s*(_\.)?([^}]+)\s*}}/g, "${[$2]}");
|
||||
function importGrpcRequest(r, workspaceId) {
|
||||
const parts = r.protoMethodName.split("/").filter((p) => p !== "");
|
||||
const service = parts[0] ?? null;
|
||||
const method = parts[1] ?? null;
|
||||
return {
|
||||
id: convertId(r.meta?.id ?? r._id),
|
||||
createdAt: r.created ? new Date(r.created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: r.modified ? new Date(r.modified).toISOString().replace("Z", "") : void 0,
|
||||
workspaceId: convertId(workspaceId),
|
||||
folderId: r.parentId === workspaceId ? null : convertId(r.parentId),
|
||||
model: "grpc_request",
|
||||
sortPriority: r.metaSortKey,
|
||||
name: r.name,
|
||||
description: r.description || void 0,
|
||||
url: convertSyntax(r.url),
|
||||
service,
|
||||
method,
|
||||
message: r.body?.text ?? "",
|
||||
metadata: (r.metadata ?? []).map((h) => ({
|
||||
enabled: !h.disabled,
|
||||
name: h.name ?? "",
|
||||
value: h.value ?? ""
|
||||
})).filter(({ name, value }) => name !== "" || value !== "")
|
||||
};
|
||||
}
|
||||
function isWorkspace(obj) {
|
||||
return isJSObject(obj) && obj._type === "workspace";
|
||||
function importFolder(f, workspaceId) {
|
||||
return {
|
||||
id: convertId(f._id),
|
||||
createdAt: f.created ? new Date(f.created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: f.modified ? new Date(f.modified).toISOString().replace("Z", "") : void 0,
|
||||
folderId: f.parentId === workspaceId ? null : convertId(f.parentId),
|
||||
workspaceId: convertId(workspaceId),
|
||||
description: f.description || void 0,
|
||||
model: "folder",
|
||||
name: f.name
|
||||
};
|
||||
}
|
||||
function isRequestGroup(obj) {
|
||||
return isJSObject(obj) && obj._type === "request_group";
|
||||
function importEnvironment(e, workspaceId, isParent) {
|
||||
return {
|
||||
id: convertId(e._id),
|
||||
createdAt: e.created ? new Date(e.created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: e.modified ? new Date(e.modified).toISOString().replace("Z", "") : void 0,
|
||||
workspaceId: convertId(workspaceId),
|
||||
// @ts-ignore
|
||||
sortPriority: e.metaSortKey,
|
||||
// Will be added to Yaak later
|
||||
base: isParent ?? e.parentId === workspaceId,
|
||||
model: "environment",
|
||||
name: e.name,
|
||||
variables: Object.entries(e.data).map(([name, value]) => ({
|
||||
enabled: true,
|
||||
name,
|
||||
value: `${value}`
|
||||
}))
|
||||
};
|
||||
}
|
||||
function isHttpRequest(obj) {
|
||||
return isJSObject(obj) && obj._type === "request";
|
||||
|
||||
// src/v5.ts
|
||||
function convertInsomniaV5(parsed) {
|
||||
if (!Array.isArray(parsed.collection)) return null;
|
||||
const resources = {
|
||||
environments: [],
|
||||
folders: [],
|
||||
grpcRequests: [],
|
||||
httpRequests: [],
|
||||
websocketRequests: [],
|
||||
workspaces: []
|
||||
};
|
||||
const meta = parsed.meta ?? {};
|
||||
resources.workspaces.push({
|
||||
id: convertId(meta.id ?? "collection"),
|
||||
createdAt: meta.created ? new Date(meta.created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: meta.modified ? new Date(meta.modified).toISOString().replace("Z", "") : void 0,
|
||||
model: "workspace",
|
||||
name: parsed.name,
|
||||
description: meta.description || void 0
|
||||
});
|
||||
resources.environments.push(
|
||||
importEnvironment2(parsed.environments, meta.id, true),
|
||||
...(parsed.environments.subEnvironments ?? []).map((r) => importEnvironment2(r, meta.id))
|
||||
);
|
||||
const nextFolder = (children, parentId) => {
|
||||
for (const child of children ?? []) {
|
||||
if (!isJSObject(child)) continue;
|
||||
if (Array.isArray(child.children)) {
|
||||
resources.folders.push(importFolder2(child, meta.id, parentId));
|
||||
nextFolder(child.children, child.meta.id);
|
||||
} else if (child.method) {
|
||||
resources.httpRequests.push(
|
||||
importHttpRequest2(child, meta.id, parentId)
|
||||
);
|
||||
} else if (child.protoFileId) {
|
||||
resources.grpcRequests.push(
|
||||
importGrpcRequest2(child, meta.id, parentId)
|
||||
);
|
||||
} else if (child.url) {
|
||||
resources.websocketRequests.push(
|
||||
importWebsocketRequest(child, meta.id, parentId)
|
||||
);
|
||||
}
|
||||
}
|
||||
};
|
||||
nextFolder(parsed.collection ?? [], meta.id);
|
||||
resources.httpRequests = resources.httpRequests.filter(Boolean);
|
||||
resources.grpcRequests = resources.grpcRequests.filter(Boolean);
|
||||
resources.environments = resources.environments.filter(Boolean);
|
||||
resources.workspaces = resources.workspaces.filter(Boolean);
|
||||
return { resources };
|
||||
}
|
||||
function isGrpcRequest(obj) {
|
||||
return isJSObject(obj) && obj._type === "grpc_request";
|
||||
}
|
||||
function isEnvironment(obj) {
|
||||
return isJSObject(obj) && obj._type === "environment";
|
||||
}
|
||||
function isJSObject(obj) {
|
||||
return Object.prototype.toString.call(obj) === "[object Object]";
|
||||
}
|
||||
function isJSString(obj) {
|
||||
return Object.prototype.toString.call(obj) === "[object String]";
|
||||
}
|
||||
function convertId(id) {
|
||||
if (id.startsWith("GENERATE_ID::")) {
|
||||
return id;
|
||||
function importHttpRequest2(r, workspaceId, parentId) {
|
||||
const id = r.meta?.id ?? r._id;
|
||||
const created = r.meta?.created ?? r.created;
|
||||
const updated = r.meta?.modified ?? r.updated;
|
||||
const sortKey = r.meta?.sortKey ?? r.sortKey;
|
||||
let bodyType = null;
|
||||
let body = {};
|
||||
if (r.body?.mimeType === "application/octet-stream") {
|
||||
bodyType = "binary";
|
||||
body = { filePath: r.body.fileName ?? "" };
|
||||
} else if (r.body?.mimeType === "application/x-www-form-urlencoded") {
|
||||
bodyType = "application/x-www-form-urlencoded";
|
||||
body = {
|
||||
form: (r.body.params ?? []).map((p) => ({
|
||||
enabled: !p.disabled,
|
||||
name: p.name ?? "",
|
||||
value: p.value ?? ""
|
||||
}))
|
||||
};
|
||||
} else if (r.body?.mimeType === "multipart/form-data") {
|
||||
bodyType = "multipart/form-data";
|
||||
body = {
|
||||
form: (r.body.params ?? []).map((p) => ({
|
||||
enabled: !p.disabled,
|
||||
name: p.name ?? "",
|
||||
value: p.value ?? "",
|
||||
file: p.fileName ?? null
|
||||
}))
|
||||
};
|
||||
} else if (r.body?.mimeType === "application/graphql") {
|
||||
bodyType = "graphql";
|
||||
body = { text: convertSyntax(r.body.text ?? "") };
|
||||
} else if (r.body?.mimeType === "application/json") {
|
||||
bodyType = "application/json";
|
||||
body = { text: convertSyntax(r.body.text ?? "") };
|
||||
}
|
||||
return `GENERATE_ID::${id}`;
|
||||
return {
|
||||
id: convertId(id),
|
||||
workspaceId: convertId(workspaceId),
|
||||
createdAt: created ? new Date(created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: updated ? new Date(updated).toISOString().replace("Z", "") : void 0,
|
||||
folderId: parentId === workspaceId ? null : convertId(parentId),
|
||||
sortPriority: sortKey,
|
||||
model: "http_request",
|
||||
name: r.name,
|
||||
description: r.meta?.description || void 0,
|
||||
url: convertSyntax(r.url),
|
||||
body,
|
||||
bodyType,
|
||||
method: r.method,
|
||||
...importHeaders(r),
|
||||
...importAuthentication(r)
|
||||
};
|
||||
}
|
||||
function deleteUndefinedAttrs(obj) {
|
||||
if (Array.isArray(obj) && obj != null) {
|
||||
return obj.map(deleteUndefinedAttrs);
|
||||
} else if (typeof obj === "object" && obj != null) {
|
||||
return Object.fromEntries(
|
||||
Object.entries(obj).filter(([, v]) => v !== void 0).map(([k, v]) => [k, deleteUndefinedAttrs(v)])
|
||||
);
|
||||
} else {
|
||||
return obj;
|
||||
function importGrpcRequest2(r, workspaceId, parentId) {
|
||||
const id = r.meta?.id ?? r._id;
|
||||
const created = r.meta?.created ?? r.created;
|
||||
const updated = r.meta?.modified ?? r.updated;
|
||||
const sortKey = r.meta?.sortKey ?? r.sortKey;
|
||||
const parts = r.protoMethodName.split("/").filter((p) => p !== "");
|
||||
const service = parts[0] ?? null;
|
||||
const method = parts[1] ?? null;
|
||||
return {
|
||||
model: "grpc_request",
|
||||
id: convertId(id),
|
||||
workspaceId: convertId(workspaceId),
|
||||
createdAt: created ? new Date(created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: updated ? new Date(updated).toISOString().replace("Z", "") : void 0,
|
||||
folderId: parentId === workspaceId ? null : convertId(parentId),
|
||||
sortPriority: sortKey,
|
||||
name: r.name,
|
||||
description: r.description || void 0,
|
||||
url: convertSyntax(r.url),
|
||||
service,
|
||||
method,
|
||||
message: r.body?.text ?? "",
|
||||
metadata: (r.metadata ?? []).map((h) => ({
|
||||
enabled: !h.disabled,
|
||||
name: h.name ?? "",
|
||||
value: h.value ?? ""
|
||||
})).filter(({ name, value }) => name !== "" || value !== "")
|
||||
};
|
||||
}
|
||||
function importWebsocketRequest(r, workspaceId, parentId) {
|
||||
const id = r.meta?.id ?? r._id;
|
||||
const created = r.meta?.created ?? r.created;
|
||||
const updated = r.meta?.modified ?? r.updated;
|
||||
const sortKey = r.meta?.sortKey ?? r.sortKey;
|
||||
return {
|
||||
model: "websocket_request",
|
||||
id: convertId(id),
|
||||
workspaceId: convertId(workspaceId),
|
||||
createdAt: created ? new Date(created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: updated ? new Date(updated).toISOString().replace("Z", "") : void 0,
|
||||
folderId: parentId === workspaceId ? null : convertId(parentId),
|
||||
sortPriority: sortKey,
|
||||
name: r.name,
|
||||
description: r.description || void 0,
|
||||
url: convertSyntax(r.url),
|
||||
message: r.body?.text ?? "",
|
||||
...importHeaders(r),
|
||||
...importAuthentication(r)
|
||||
};
|
||||
}
|
||||
function importHeaders(r) {
|
||||
const headers = (r.headers ?? []).map((h) => ({
|
||||
enabled: !h.disabled,
|
||||
name: h.name ?? "",
|
||||
value: h.value ?? ""
|
||||
})).filter(({ name, value }) => name !== "" || value !== "");
|
||||
return { headers };
|
||||
}
|
||||
function importAuthentication(r) {
|
||||
let authenticationType = null;
|
||||
let authentication = {};
|
||||
if (r.authentication?.type === "bearer") {
|
||||
authenticationType = "bearer";
|
||||
authentication = {
|
||||
token: convertSyntax(r.authentication.token)
|
||||
};
|
||||
} else if (r.authentication?.type === "basic") {
|
||||
authenticationType = "basic";
|
||||
authentication = {
|
||||
username: convertSyntax(r.authentication.username),
|
||||
password: convertSyntax(r.authentication.password)
|
||||
};
|
||||
}
|
||||
return { authenticationType, authentication };
|
||||
}
|
||||
function importFolder2(f, workspaceId, parentId) {
|
||||
const id = f.meta?.id ?? f._id;
|
||||
const created = f.meta?.created ?? f.created;
|
||||
const updated = f.meta?.modified ?? f.updated;
|
||||
const sortKey = f.meta?.sortKey ?? f.sortKey;
|
||||
return {
|
||||
model: "folder",
|
||||
id: convertId(id),
|
||||
createdAt: created ? new Date(created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: updated ? new Date(updated).toISOString().replace("Z", "") : void 0,
|
||||
folderId: parentId === workspaceId ? null : convertId(parentId),
|
||||
sortPriority: sortKey,
|
||||
workspaceId: convertId(workspaceId),
|
||||
description: f.description || void 0,
|
||||
name: f.name
|
||||
};
|
||||
}
|
||||
function importEnvironment2(e, workspaceId, isParent) {
|
||||
const id = e.meta?.id ?? e._id;
|
||||
const created = e.meta?.created ?? e.created;
|
||||
const updated = e.meta?.modified ?? e.updated;
|
||||
const sortKey = e.meta?.sortKey ?? e.sortKey;
|
||||
return {
|
||||
id: convertId(id),
|
||||
createdAt: created ? new Date(created).toISOString().replace("Z", "") : void 0,
|
||||
updatedAt: updated ? new Date(updated).toISOString().replace("Z", "") : void 0,
|
||||
workspaceId: convertId(workspaceId),
|
||||
public: !e.isPrivate,
|
||||
// @ts-ignore
|
||||
sortPriority: sortKey,
|
||||
// Will be added to Yaak later
|
||||
base: isParent ?? e.parentId === workspaceId,
|
||||
model: "environment",
|
||||
name: e.name,
|
||||
variables: Object.entries(e.data ?? {}).map(([name, value]) => ({
|
||||
enabled: true,
|
||||
name,
|
||||
value: `${value}`
|
||||
}))
|
||||
};
|
||||
}
|
||||
|
||||
// src/index.ts
|
||||
var plugin = {
|
||||
importer: {
|
||||
name: "Insomnia",
|
||||
description: "Import Insomnia workspaces",
|
||||
async onImport(_ctx, args) {
|
||||
return convertInsomnia(args.text);
|
||||
}
|
||||
}
|
||||
};
|
||||
function convertInsomnia(contents) {
|
||||
let parsed;
|
||||
try {
|
||||
parsed = JSON.parse(contents);
|
||||
} catch (e) {
|
||||
}
|
||||
try {
|
||||
parsed = parsed ?? import_yaml.default.parse(contents);
|
||||
} catch (e) {
|
||||
}
|
||||
if (!isJSObject(parsed)) return null;
|
||||
const result = convertInsomniaV5(parsed) ?? convertInsomniaV4(parsed);
|
||||
return deleteUndefinedAttrs(result);
|
||||
}
|
||||
// Annotate the CommonJS export names for ESM import in node:
|
||||
0 && (module.exports = {
|
||||
|
||||
@@ -69,6 +69,12 @@ function migrateImport(contents) {
}
}
}
for (const environment of parsed.resources.environments ?? []) {
if ("environmentId" in environment) {
environment.base = environment.environmentId == null;
delete environment.environmentId;
}
}
return { resources: parsed.resources };
}
function isJSObject(obj) {
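The added loop above migrates older exports where the base environment was encoded as `environmentId: null`. A small illustration of that mapping, applied to a hypothetical environment object (the helper name and type are illustrative only):

```ts
type LegacyEnvironment = { name: string; environmentId?: string | null; base?: boolean };

function migrateEnvironment(environment: LegacyEnvironment): LegacyEnvironment {
  if ("environmentId" in environment) {
    // A null environmentId used to mark the base (workspace-level) environment
    environment.base = environment.environmentId == null;
    delete environment.environmentId;
  }
  return environment;
}

// migrateEnvironment({ name: "Global", environmentId: null })     -> { name: "Global", base: true }
// migrateEnvironment({ name: "Staging", environmentId: "env_1" }) -> { name: "Staging", base: false }
```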
47
src-tauri/vendored/plugins/template-function-cookie/build/index.js
generated
Normal file
@@ -0,0 +1,47 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
for (var name in all)
__defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
if (from && typeof from === "object" || typeof from === "function") {
for (let key of __getOwnPropNames(from))
if (!__hasOwnProp.call(to, key) && key !== except)
__defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
}
return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/index.ts
var src_exports = {};
__export(src_exports, {
plugin: () => plugin
});
module.exports = __toCommonJS(src_exports);
var plugin = {
templateFunctions: [
{
name: "cookie.value",
description: "Read the value of a cookie in the jar, by name",
args: [
{
type: "text",
name: "cookie_name",
label: "Cookie Name"
}
],
async onRender(ctx, args) {
return ctx.cookies.getValue({ name: String(args.values.cookie_name) });
}
}
]
};
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
plugin
});
9
src-tauri/vendored/plugins/template-function-cookie/package.json
generated
Normal file
@@ -0,0 +1,9 @@
{
"name": "@yaakapp/template-function-cookie",
"private": true,
"version": "0.0.1",
"scripts": {
"build": "yaakcli build ./src/index.ts",
"dev": "yaakcli dev ./src/index.js"
}
}
69
src-tauri/vendored/plugins/template-function-encode/build/index.js
generated
Normal file
@@ -0,0 +1,69 @@
|
||||
"use strict";
|
||||
var __defProp = Object.defineProperty;
|
||||
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
|
||||
var __getOwnPropNames = Object.getOwnPropertyNames;
|
||||
var __hasOwnProp = Object.prototype.hasOwnProperty;
|
||||
var __export = (target, all) => {
|
||||
for (var name in all)
|
||||
__defProp(target, name, { get: all[name], enumerable: true });
|
||||
};
|
||||
var __copyProps = (to, from, except, desc) => {
|
||||
if (from && typeof from === "object" || typeof from === "function") {
|
||||
for (let key of __getOwnPropNames(from))
|
||||
if (!__hasOwnProp.call(to, key) && key !== except)
|
||||
__defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
|
||||
}
|
||||
return to;
|
||||
};
|
||||
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
|
||||
|
||||
// src/index.ts
|
||||
var src_exports = {};
|
||||
__export(src_exports, {
|
||||
plugin: () => plugin
|
||||
});
|
||||
module.exports = __toCommonJS(src_exports);
|
||||
var plugin = {
|
||||
templateFunctions: [
|
||||
{
|
||||
name: "base64.encode",
|
||||
description: "Encode a value to base64",
|
||||
args: [{ label: "Plain Text", type: "text", name: "value", multiLine: true }],
|
||||
async onRender(_ctx, args) {
|
||||
return Buffer.from(args.values.value ?? "").toString("base64");
|
||||
}
|
||||
},
|
||||
{
|
||||
name: "base64.decode",
|
||||
description: "Decode a value from base64",
|
||||
args: [{ label: "Encoded Value", type: "text", name: "value", multiLine: true }],
|
||||
async onRender(_ctx, args) {
|
||||
return Buffer.from(args.values.value ?? "", "base64").toString("utf-8");
|
||||
}
|
||||
},
|
||||
{
|
||||
name: "url.encode",
|
||||
description: "Encode a value for use in a URL (percent-encoding)",
|
||||
args: [{ label: "Plain Text", type: "text", name: "value", multiLine: true }],
|
||||
async onRender(_ctx, args) {
|
||||
return encodeURIComponent(args.values.value ?? "");
|
||||
}
|
||||
},
|
||||
{
|
||||
name: "url.decode",
|
||||
description: "Decode a percent-encoded URL value",
|
||||
args: [{ label: "Encoded Value", type: "text", name: "value", multiLine: true }],
|
||||
async onRender(_ctx, args) {
|
||||
try {
|
||||
return decodeURIComponent(args.values.value ?? "");
|
||||
} catch {
|
||||
return "";
|
||||
}
|
||||
}
|
||||
}
|
||||
]
|
||||
};
|
||||
// Annotate the CommonJS export names for ESM import in node:
|
||||
0 && (module.exports = {
|
||||
plugin
|
||||
});
|
||||
9
src-tauri/vendored/plugins/template-function-encode/package.json
generated
Normal file
@@ -0,0 +1,9 @@
{
"name": "@yaakapp/template-function-encode",
"private": true,
"version": "0.0.1",
"scripts": {
"build": "yaakcli build ./src/index.ts",
"dev": "yaakcli dev ./src/index.js"
}
}
@@ -25,24 +25,76 @@ __export(src_exports, {
|
||||
module.exports = __toCommonJS(src_exports);
|
||||
var import_node_crypto = require("node:crypto");
|
||||
var algorithms = ["md5", "sha1", "sha256", "sha512"];
|
||||
var plugin = {
|
||||
templateFunctions: algorithms.map((algorithm) => ({
|
||||
name: `hash.${algorithm}`,
|
||||
description: "Hash a value to its hexidecimal representation",
|
||||
args: [
|
||||
{
|
||||
name: "input",
|
||||
label: "Input",
|
||||
placeholder: "input text",
|
||||
type: "text"
|
||||
}
|
||||
],
|
||||
async onRender(_ctx, args) {
|
||||
if (!args.values.input) return "";
|
||||
return (0, import_node_crypto.createHash)(algorithm).update(args.values.input, "utf-8").digest("hex");
|
||||
var encodings = ["base64", "hex"];
|
||||
var hashFunctions = algorithms.map((algorithm) => ({
|
||||
name: `hash.${algorithm}`,
|
||||
description: "Hash a value to its hexidecimal representation",
|
||||
args: [
|
||||
{
|
||||
type: "text",
|
||||
name: "input",
|
||||
label: "Input",
|
||||
placeholder: "input text",
|
||||
multiLine: true
|
||||
},
|
||||
{
|
||||
type: "select",
|
||||
name: "encoding",
|
||||
label: "Encoding",
|
||||
defaultValue: "base64",
|
||||
options: encodings.map((encoding) => ({
|
||||
label: capitalize(encoding),
|
||||
value: encoding
|
||||
}))
|
||||
}
|
||||
}))
|
||||
],
|
||||
async onRender(_ctx, args) {
|
||||
const input = String(args.values.input);
|
||||
const encoding = String(args.values.encoding);
|
||||
return (0, import_node_crypto.createHash)(algorithm).update(input, "utf-8").digest(encoding);
|
||||
}
|
||||
}));
|
||||
var hmacFunctions = algorithms.map((algorithm) => ({
|
||||
name: `hmac.${algorithm}`,
|
||||
description: "Compute the HMAC of a value",
|
||||
args: [
|
||||
{
|
||||
type: "text",
|
||||
name: "input",
|
||||
label: "Input",
|
||||
placeholder: "input text",
|
||||
multiLine: true
|
||||
},
|
||||
{
|
||||
type: "text",
|
||||
name: "key",
|
||||
label: "Key",
|
||||
password: true
|
||||
},
|
||||
{
|
||||
type: "select",
|
||||
name: "encoding",
|
||||
label: "Encoding",
|
||||
defaultValue: "base64",
|
||||
options: encodings.map((encoding) => ({
|
||||
value: encoding,
|
||||
label: capitalize(encoding)
|
||||
}))
|
||||
}
|
||||
],
|
||||
async onRender(_ctx, args) {
|
||||
const input = String(args.values.input);
|
||||
const key = String(args.values.key);
|
||||
const encoding = String(args.values.encoding);
|
||||
return (0, import_node_crypto.createHmac)(algorithm, key, {}).update(input).digest(encoding);
|
||||
}
|
||||
}));
|
||||
var plugin = {
|
||||
templateFunctions: [...hashFunctions, ...hmacFunctions]
|
||||
};
|
||||
function capitalize(str) {
|
||||
return str.charAt(0).toUpperCase() + str.slice(1);
|
||||
}
|
||||
// Annotate the CommonJS export names for ESM import in node:
|
||||
0 && (module.exports = {
|
||||
plugin
|
||||
|
||||
1769
src-tauri/vendored/plugins/template-function-json/build/index.js
generated
Normal file
File diff suppressed because it is too large
15
src-tauri/vendored/plugins/template-function-json/package.json
generated
Executable file
@@ -0,0 +1,15 @@
{
"name": "@yaakapp/template-function-json",
"private": true,
"version": "0.0.1",
"scripts": {
"build": "yaakcli build ./src/index.ts",
"dev": "yaakcli dev ./src/index.js"
},
"dependencies": {
"jsonpath-plus": "^10.3.0"
},
"devDependencies": {
"@types/jsonpath": "^0.2.4"
}
}
@@ -1,9 +1,7 @@
|
||||
"use strict";
|
||||
var __create = Object.create;
|
||||
var __defProp = Object.defineProperty;
|
||||
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
|
||||
var __getOwnPropNames = Object.getOwnPropertyNames;
|
||||
var __getProtoOf = Object.getPrototypeOf;
|
||||
var __hasOwnProp = Object.prototype.hasOwnProperty;
|
||||
var __export = (target, all) => {
|
||||
for (var name in all)
|
||||
@@ -17,14 +15,6 @@ var __copyProps = (to, from, except, desc) => {
|
||||
}
|
||||
return to;
|
||||
};
|
||||
var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
|
||||
// If the importer is in node compatibility mode or this is not an ESM
|
||||
// file that has been converted to a CommonJS file using a Babel-
|
||||
// compatible transform (i.e. "__esModule" has not been set), then set
|
||||
// "default" to the CommonJS "module.exports" for node compatibility.
|
||||
isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
|
||||
mod
|
||||
));
|
||||
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
|
||||
|
||||
// src/index.ts
|
||||
@@ -33,18 +23,26 @@ __export(src_exports, {
|
||||
plugin: () => plugin
|
||||
});
|
||||
module.exports = __toCommonJS(src_exports);
|
||||
var import_node_fs = __toESM(require("node:fs"));
|
||||
var plugin = {
|
||||
templateFunctions: [{
|
||||
name: "fs.readFile",
|
||||
args: [{ title: "Select File", type: "file", name: "path", label: "File" }],
|
||||
name: "regex.match",
|
||||
description: "Extract",
|
||||
args: [
|
||||
{
|
||||
type: "text",
|
||||
name: "regex",
|
||||
label: "Regular Expression",
|
||||
placeholder: "^w+=(?<value>w*)$",
|
||||
defaultValue: "^(.*)$",
|
||||
description: "A JavaScript regular expression, evaluated using the Node.js RegExp engine. Capture groups or named groups can be used to extract values."
|
||||
},
|
||||
{ type: "text", name: "input", label: "Input Text", multiLine: true }
|
||||
],
|
||||
async onRender(_ctx, args) {
|
||||
if (!args.values.path) return null;
|
||||
try {
|
||||
return import_node_fs.default.promises.readFile(args.values.path, "utf-8");
|
||||
} catch (err) {
|
||||
return null;
|
||||
}
|
||||
if (!args.values.regex) return "";
|
||||
const regex = new RegExp(String(args.values.regex));
|
||||
const match = args.values.input?.match(regex);
|
||||
return match?.groups ? Object.values(match.groups)[0] ?? "" : match?.[1] ?? match?.[0] ?? "";
|
||||
}
|
||||
}]
|
||||
};
|
||||
@@ -1,5 +1,5 @@
{
"name": "@yaakapp/template-function-file",
"name": "@yaakapp/template-function-regex",
"private": true,
"version": "0.0.1",
"scripts": {
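The regex.match template function shown above resolves its result in a fixed order: a named capture group wins, then the first positional group, then the whole match. A standalone sketch of that extraction order (the `extractFirstMatch` helper name is illustrative):

```ts
function extractFirstMatch(pattern: string, input: string): string {
  const regex = new RegExp(pattern);
  const match = input.match(regex);
  if (!match) return "";
  // Prefer named groups, then the first capture group, then the full match
  if (match.groups) return Object.values(match.groups)[0] ?? "";
  return match[1] ?? match[0] ?? "";
}

// extractFirstMatch("^token=(?<value>\\w+)$", "token=abc123") -> "abc123"
// extractFirstMatch("^(\\d+)-", "42-foo")                     -> "42"
```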
417
src-tauri/vendored/plugins/template-function-uuid/build/index.js
generated
Normal file
@@ -0,0 +1,417 @@
|
||||
"use strict";
|
||||
var __defProp = Object.defineProperty;
|
||||
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
|
||||
var __getOwnPropNames = Object.getOwnPropertyNames;
|
||||
var __hasOwnProp = Object.prototype.hasOwnProperty;
|
||||
var __export = (target, all) => {
|
||||
for (var name in all)
|
||||
__defProp(target, name, { get: all[name], enumerable: true });
|
||||
};
|
||||
var __copyProps = (to, from, except, desc) => {
|
||||
if (from && typeof from === "object" || typeof from === "function") {
|
||||
for (let key of __getOwnPropNames(from))
|
||||
if (!__hasOwnProp.call(to, key) && key !== except)
|
||||
__defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
|
||||
}
|
||||
return to;
|
||||
};
|
||||
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
|
||||
|
||||
// src/index.ts
|
||||
var src_exports = {};
|
||||
__export(src_exports, {
|
||||
plugin: () => plugin
|
||||
});
|
||||
module.exports = __toCommonJS(src_exports);
|
||||
|
||||
// node_modules/uuid/dist/esm/regex.js
|
||||
var regex_default = /^(?:[0-9a-f]{8}-[0-9a-f]{4}-[1-8][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}|00000000-0000-0000-0000-000000000000|ffffffff-ffff-ffff-ffff-ffffffffffff)$/i;
|
||||
|
||||
// node_modules/uuid/dist/esm/validate.js
|
||||
function validate(uuid) {
|
||||
return typeof uuid === "string" && regex_default.test(uuid);
|
||||
}
|
||||
var validate_default = validate;
|
||||
|
||||
// node_modules/uuid/dist/esm/parse.js
|
||||
function parse(uuid) {
|
||||
if (!validate_default(uuid)) {
|
||||
throw TypeError("Invalid UUID");
|
||||
}
|
||||
let v;
|
||||
return Uint8Array.of((v = parseInt(uuid.slice(0, 8), 16)) >>> 24, v >>> 16 & 255, v >>> 8 & 255, v & 255, (v = parseInt(uuid.slice(9, 13), 16)) >>> 8, v & 255, (v = parseInt(uuid.slice(14, 18), 16)) >>> 8, v & 255, (v = parseInt(uuid.slice(19, 23), 16)) >>> 8, v & 255, (v = parseInt(uuid.slice(24, 36), 16)) / 1099511627776 & 255, v / 4294967296 & 255, v >>> 24 & 255, v >>> 16 & 255, v >>> 8 & 255, v & 255);
|
||||
}
|
||||
var parse_default = parse;
|
||||
|
||||
// node_modules/uuid/dist/esm/stringify.js
|
||||
var byteToHex = [];
|
||||
for (let i = 0; i < 256; ++i) {
|
||||
byteToHex.push((i + 256).toString(16).slice(1));
|
||||
}
|
||||
function unsafeStringify(arr, offset = 0) {
|
||||
return (byteToHex[arr[offset + 0]] + byteToHex[arr[offset + 1]] + byteToHex[arr[offset + 2]] + byteToHex[arr[offset + 3]] + "-" + byteToHex[arr[offset + 4]] + byteToHex[arr[offset + 5]] + "-" + byteToHex[arr[offset + 6]] + byteToHex[arr[offset + 7]] + "-" + byteToHex[arr[offset + 8]] + byteToHex[arr[offset + 9]] + "-" + byteToHex[arr[offset + 10]] + byteToHex[arr[offset + 11]] + byteToHex[arr[offset + 12]] + byteToHex[arr[offset + 13]] + byteToHex[arr[offset + 14]] + byteToHex[arr[offset + 15]]).toLowerCase();
|
||||
}
|
||||
|
||||
// node_modules/uuid/dist/esm/rng.js
|
||||
var import_crypto = require("crypto");
|
||||
var rnds8Pool = new Uint8Array(256);
|
||||
var poolPtr = rnds8Pool.length;
|
||||
function rng() {
|
||||
if (poolPtr > rnds8Pool.length - 16) {
|
||||
(0, import_crypto.randomFillSync)(rnds8Pool);
|
||||
poolPtr = 0;
|
||||
}
|
||||
return rnds8Pool.slice(poolPtr, poolPtr += 16);
|
||||
}
|
||||
|
||||
// node_modules/uuid/dist/esm/v1.js
|
||||
var _state = {};
|
||||
function v1(options, buf, offset) {
|
||||
let bytes;
|
||||
const isV6 = options?._v6 ?? false;
|
||||
if (options) {
|
||||
const optionsKeys = Object.keys(options);
|
||||
if (optionsKeys.length === 1 && optionsKeys[0] === "_v6") {
|
||||
options = void 0;
|
||||
}
|
||||
}
|
||||
if (options) {
|
||||
bytes = v1Bytes(options.random ?? options.rng?.() ?? rng(), options.msecs, options.nsecs, options.clockseq, options.node, buf, offset);
|
||||
} else {
|
||||
const now = Date.now();
|
||||
const rnds = rng();
|
||||
updateV1State(_state, now, rnds);
|
||||
bytes = v1Bytes(rnds, _state.msecs, _state.nsecs, isV6 ? void 0 : _state.clockseq, isV6 ? void 0 : _state.node, buf, offset);
|
||||
}
|
||||
return buf ?? unsafeStringify(bytes);
|
||||
}
|
||||
function updateV1State(state, now, rnds) {
|
||||
state.msecs ??= -Infinity;
|
||||
state.nsecs ??= 0;
|
||||
if (now === state.msecs) {
|
||||
state.nsecs++;
|
||||
if (state.nsecs >= 1e4) {
|
||||
state.node = void 0;
|
||||
state.nsecs = 0;
|
||||
}
|
||||
} else if (now > state.msecs) {
|
||||
state.nsecs = 0;
|
||||
} else if (now < state.msecs) {
|
||||
state.node = void 0;
|
||||
}
|
||||
if (!state.node) {
|
||||
state.node = rnds.slice(10, 16);
|
||||
state.node[0] |= 1;
|
||||
state.clockseq = (rnds[8] << 8 | rnds[9]) & 16383;
|
||||
}
|
||||
state.msecs = now;
|
||||
return state;
|
||||
}
|
||||
function v1Bytes(rnds, msecs, nsecs, clockseq, node, buf, offset = 0) {
|
||||
if (rnds.length < 16) {
|
||||
throw new Error("Random bytes length must be >= 16");
|
||||
}
|
||||
if (!buf) {
|
||||
buf = new Uint8Array(16);
|
||||
offset = 0;
|
||||
} else {
|
||||
if (offset < 0 || offset + 16 > buf.length) {
|
||||
throw new RangeError(`UUID byte range ${offset}:${offset + 15} is out of buffer bounds`);
|
||||
}
|
||||
}
|
||||
msecs ??= Date.now();
|
||||
nsecs ??= 0;
|
||||
clockseq ??= (rnds[8] << 8 | rnds[9]) & 16383;
|
||||
node ??= rnds.slice(10, 16);
|
||||
msecs += 122192928e5;
|
||||
const tl = ((msecs & 268435455) * 1e4 + nsecs) % 4294967296;
|
||||
buf[offset++] = tl >>> 24 & 255;
|
||||
buf[offset++] = tl >>> 16 & 255;
|
||||
buf[offset++] = tl >>> 8 & 255;
|
||||
buf[offset++] = tl & 255;
|
||||
const tmh = msecs / 4294967296 * 1e4 & 268435455;
|
||||
buf[offset++] = tmh >>> 8 & 255;
|
||||
buf[offset++] = tmh & 255;
|
||||
buf[offset++] = tmh >>> 24 & 15 | 16;
|
||||
buf[offset++] = tmh >>> 16 & 255;
|
||||
buf[offset++] = clockseq >>> 8 | 128;
|
||||
buf[offset++] = clockseq & 255;
|
||||
for (let n = 0; n < 6; ++n) {
|
||||
buf[offset++] = node[n];
|
||||
}
|
||||
return buf;
|
||||
}
|
||||
var v1_default = v1;
|
||||
|
||||
// node_modules/uuid/dist/esm/v1ToV6.js
|
||||
function v1ToV6(uuid) {
|
||||
const v1Bytes2 = typeof uuid === "string" ? parse_default(uuid) : uuid;
|
||||
const v6Bytes = _v1ToV6(v1Bytes2);
|
||||
return typeof uuid === "string" ? unsafeStringify(v6Bytes) : v6Bytes;
|
||||
}
|
||||
function _v1ToV6(v1Bytes2) {
|
||||
return Uint8Array.of((v1Bytes2[6] & 15) << 4 | v1Bytes2[7] >> 4 & 15, (v1Bytes2[7] & 15) << 4 | (v1Bytes2[4] & 240) >> 4, (v1Bytes2[4] & 15) << 4 | (v1Bytes2[5] & 240) >> 4, (v1Bytes2[5] & 15) << 4 | (v1Bytes2[0] & 240) >> 4, (v1Bytes2[0] & 15) << 4 | (v1Bytes2[1] & 240) >> 4, (v1Bytes2[1] & 15) << 4 | (v1Bytes2[2] & 240) >> 4, 96 | v1Bytes2[2] & 15, v1Bytes2[3], v1Bytes2[8], v1Bytes2[9], v1Bytes2[10], v1Bytes2[11], v1Bytes2[12], v1Bytes2[13], v1Bytes2[14], v1Bytes2[15]);
|
||||
}
|
||||
|
||||
// node_modules/uuid/dist/esm/md5.js
|
||||
var import_crypto2 = require("crypto");
|
||||
function md5(bytes) {
|
||||
if (Array.isArray(bytes)) {
|
||||
bytes = Buffer.from(bytes);
|
||||
} else if (typeof bytes === "string") {
|
||||
bytes = Buffer.from(bytes, "utf8");
|
||||
}
|
||||
return (0, import_crypto2.createHash)("md5").update(bytes).digest();
|
||||
}
|
||||
var md5_default = md5;
|
||||
|
||||
// node_modules/uuid/dist/esm/v35.js
|
||||
function stringToBytes(str) {
|
||||
str = unescape(encodeURIComponent(str));
|
||||
const bytes = new Uint8Array(str.length);
|
||||
for (let i = 0; i < str.length; ++i) {
|
||||
bytes[i] = str.charCodeAt(i);
|
||||
}
|
||||
return bytes;
|
||||
}
|
||||
var DNS = "6ba7b810-9dad-11d1-80b4-00c04fd430c8";
|
||||
var URL = "6ba7b811-9dad-11d1-80b4-00c04fd430c8";
|
||||
function v35(version, hash, value, namespace, buf, offset) {
|
||||
const valueBytes = typeof value === "string" ? stringToBytes(value) : value;
|
||||
const namespaceBytes = typeof namespace === "string" ? parse_default(namespace) : namespace;
|
||||
if (typeof namespace === "string") {
|
||||
namespace = parse_default(namespace);
|
||||
}
|
||||
if (namespace?.length !== 16) {
|
||||
throw TypeError("Namespace must be array-like (16 iterable integer values, 0-255)");
|
||||
}
|
||||
let bytes = new Uint8Array(16 + valueBytes.length);
|
||||
bytes.set(namespaceBytes);
|
||||
bytes.set(valueBytes, namespaceBytes.length);
|
||||
bytes = hash(bytes);
|
||||
bytes[6] = bytes[6] & 15 | version;
|
||||
bytes[8] = bytes[8] & 63 | 128;
|
||||
if (buf) {
|
||||
offset = offset || 0;
|
||||
for (let i = 0; i < 16; ++i) {
|
||||
buf[offset + i] = bytes[i];
|
||||
}
|
||||
return buf;
|
||||
}
|
||||
return unsafeStringify(bytes);
|
||||
}
|
||||
|
||||
// node_modules/uuid/dist/esm/v3.js
|
||||
function v3(value, namespace, buf, offset) {
|
||||
return v35(48, md5_default, value, namespace, buf, offset);
|
||||
}
|
||||
v3.DNS = DNS;
|
||||
v3.URL = URL;
|
||||
var v3_default = v3;
|
||||
|
||||
// node_modules/uuid/dist/esm/native.js
|
||||
var import_crypto3 = require("crypto");
|
||||
var native_default = { randomUUID: import_crypto3.randomUUID };
|
||||
|
||||
// node_modules/uuid/dist/esm/v4.js
|
||||
function v4(options, buf, offset) {
|
||||
if (native_default.randomUUID && !buf && !options) {
|
||||
return native_default.randomUUID();
|
||||
}
|
||||
options = options || {};
|
||||
const rnds = options.random ?? options.rng?.() ?? rng();
|
||||
if (rnds.length < 16) {
|
||||
throw new Error("Random bytes length must be >= 16");
|
||||
}
|
||||
rnds[6] = rnds[6] & 15 | 64;
|
||||
rnds[8] = rnds[8] & 63 | 128;
|
||||
if (buf) {
|
||||
offset = offset || 0;
|
||||
if (offset < 0 || offset + 16 > buf.length) {
|
||||
throw new RangeError(`UUID byte range ${offset}:${offset + 15} is out of buffer bounds`);
|
||||
}
|
||||
for (let i = 0; i < 16; ++i) {
|
||||
buf[offset + i] = rnds[i];
|
||||
}
|
||||
return buf;
|
||||
}
|
||||
return unsafeStringify(rnds);
|
||||
}
|
||||
var v4_default = v4;
|
||||
|
||||
// node_modules/uuid/dist/esm/sha1.js
|
||||
var import_crypto4 = require("crypto");
|
||||
function sha1(bytes) {
|
||||
if (Array.isArray(bytes)) {
|
||||
bytes = Buffer.from(bytes);
|
||||
} else if (typeof bytes === "string") {
|
||||
bytes = Buffer.from(bytes, "utf8");
|
||||
}
|
||||
return (0, import_crypto4.createHash)("sha1").update(bytes).digest();
|
||||
}
|
||||
var sha1_default = sha1;
|
||||
|
||||
// node_modules/uuid/dist/esm/v5.js
|
||||
function v5(value, namespace, buf, offset) {
|
||||
return v35(80, sha1_default, value, namespace, buf, offset);
|
||||
}
|
||||
v5.DNS = DNS;
|
||||
v5.URL = URL;
|
||||
var v5_default = v5;
|
||||
|
||||
// node_modules/uuid/dist/esm/v6.js
|
||||
function v6(options, buf, offset) {
|
||||
options ??= {};
|
||||
offset ??= 0;
|
||||
let bytes = v1_default({ ...options, _v6: true }, new Uint8Array(16));
|
||||
bytes = v1ToV6(bytes);
|
||||
if (buf) {
|
||||
for (let i = 0; i < 16; i++) {
|
||||
buf[offset + i] = bytes[i];
|
||||
}
|
||||
return buf;
|
||||
}
|
||||
return unsafeStringify(bytes);
|
||||
}
|
||||
var v6_default = v6;
|
||||
|
||||
// node_modules/uuid/dist/esm/v7.js
|
||||
var _state2 = {};
|
||||
function v7(options, buf, offset) {
|
||||
let bytes;
|
||||
if (options) {
|
||||
bytes = v7Bytes(options.random ?? options.rng?.() ?? rng(), options.msecs, options.seq, buf, offset);
|
||||
} else {
|
||||
const now = Date.now();
|
||||
const rnds = rng();
|
||||
updateV7State(_state2, now, rnds);
|
||||
bytes = v7Bytes(rnds, _state2.msecs, _state2.seq, buf, offset);
|
||||
}
|
||||
return buf ?? unsafeStringify(bytes);
|
||||
}
|
||||
function updateV7State(state, now, rnds) {
|
||||
state.msecs ??= -Infinity;
|
||||
state.seq ??= 0;
|
||||
if (now > state.msecs) {
|
||||
state.seq = rnds[6] << 23 | rnds[7] << 16 | rnds[8] << 8 | rnds[9];
|
||||
state.msecs = now;
|
||||
} else {
|
||||
state.seq = state.seq + 1 | 0;
|
||||
if (state.seq === 0) {
|
||||
state.msecs++;
|
||||
}
|
||||
}
|
||||
return state;
|
||||
}
|
||||
function v7Bytes(rnds, msecs, seq, buf, offset = 0) {
|
||||
if (rnds.length < 16) {
|
||||
throw new Error("Random bytes length must be >= 16");
|
||||
}
|
||||
if (!buf) {
|
||||
buf = new Uint8Array(16);
|
||||
offset = 0;
|
||||
} else {
|
||||
if (offset < 0 || offset + 16 > buf.length) {
|
||||
throw new RangeError(`UUID byte range ${offset}:${offset + 15} is out of buffer bounds`);
|
||||
}
|
||||
}
|
||||
msecs ??= Date.now();
|
||||
seq ??= rnds[6] * 127 << 24 | rnds[7] << 16 | rnds[8] << 8 | rnds[9];
|
||||
buf[offset++] = msecs / 1099511627776 & 255;
|
||||
buf[offset++] = msecs / 4294967296 & 255;
|
||||
buf[offset++] = msecs / 16777216 & 255;
|
||||
buf[offset++] = msecs / 65536 & 255;
|
||||
buf[offset++] = msecs / 256 & 255;
|
||||
buf[offset++] = msecs & 255;
|
||||
buf[offset++] = 112 | seq >>> 28 & 15;
|
||||
buf[offset++] = seq >>> 20 & 255;
|
||||
buf[offset++] = 128 | seq >>> 14 & 63;
|
||||
buf[offset++] = seq >>> 6 & 255;
|
||||
buf[offset++] = seq << 2 & 255 | rnds[10] & 3;
|
||||
buf[offset++] = rnds[11];
|
||||
buf[offset++] = rnds[12];
|
||||
buf[offset++] = rnds[13];
|
||||
buf[offset++] = rnds[14];
|
||||
buf[offset++] = rnds[15];
|
||||
return buf;
|
||||
}
|
||||
var v7_default = v7;
|
||||
|
||||
// src/index.ts
|
||||
var plugin = {
|
||||
templateFunctions: [
|
||||
{
|
||||
name: "uuid.v1",
|
||||
description: "Generate a UUID V1",
|
||||
args: [],
|
||||
async onRender(_ctx, _args) {
|
||||
return v1_default();
|
||||
}
|
||||
},
|
||||
{
|
||||
name: "uuid.v3",
|
||||
description: "Generate a UUID V3",
|
||||
args: [
|
||||
{ type: "text", name: "name", label: "Name" },
|
||||
{
|
||||
type: "text",
|
||||
name: "namespace",
|
||||
label: "Namespace UUID",
|
||||
description: "A valid UUID to use as the namespace",
|
||||
placeholder: "24ced880-3bf4-11f0-8329-cd053d577f0e"
|
||||
}
|
||||
],
|
||||
async onRender(_ctx, args) {
|
||||
return v3_default(String(args.values.name), String(args.values.namespace));
|
||||
}
|
||||
},
|
||||
{
|
||||
name: "uuid.v4",
|
||||
description: "Generate a UUID V4",
|
||||
args: [],
|
||||
async onRender(_ctx, _args) {
|
||||
return v4_default();
|
||||
}
|
||||
},
|
||||
{
|
||||
name: "uuid.v5",
|
||||
description: "Generate a UUID V5",
|
||||
args: [
|
||||
{ type: "text", name: "name", label: "Name" },
|
||||
{ type: "text", name: "namespace", label: "Namespace" }
|
||||
],
|
||||
async onRender(_ctx, args) {
|
||||
return v5_default(String(args.values.name), String(args.values.namespace));
|
||||
}
|
||||
},
|
||||
{
|
||||
name: "uuid.v6",
|
||||
description: "Generate a UUID V6",
|
||||
args: [
|
||||
{
|
||||
type: "text",
|
||||
name: "timestamp",
|
||||
label: "Timestamp",
|
||||
optional: true,
|
||||
description: "Can be any format that can be parsed by JavaScript new Date(...)",
|
||||
placeholder: "2025-05-28T11:15:00Z"
|
||||
}
|
||||
],
|
||||
async onRender(_ctx, args) {
|
||||
return v6_default({ msecs: new Date(String(args.values.timestamp)).getTime() });
|
||||
}
|
||||
},
|
||||
{
|
||||
name: "uuid.v7",
|
||||
description: "Generate a UUID V7",
|
||||
args: [],
|
||||
async onRender(_ctx, _args) {
|
||||
return v7_default();
|
||||
}
|
||||
}
|
||||
]
|
||||
};
|
||||
// Annotate the CommonJS export names for ESM import in node:
|
||||
0 && (module.exports = {
|
||||
plugin
|
||||
});
src-tauri/vendored/plugins/template-function-uuid/package.json (new file, 12 lines, generated)
@@ -0,0 +1,12 @@
{
  "name": "@yaakapp/template-function-uuid",
  "private": true,
  "version": "0.0.1",
  "scripts": {
    "build": "yaakcli build ./src/index.ts",
    "dev": "yaakcli dev ./src/index.js"
  },
  "dependencies": {
    "uuid": "^11.1.0"
  }
}
src-tauri/vendored/plugins/template-function-xml/build/index.js (new file, 8380 lines, generated)
File diff suppressed because it is too large.
src-tauri/vendored/plugins/template-function-xml/package.json (new executable file, 13 lines, generated)
@@ -0,0 +1,13 @@
{
  "name": "@yaakapp/template-function-xml",
  "private": true,
  "version": "0.0.1",
  "scripts": {
    "build": "yaakcli build ./src/index.ts",
    "dev": "yaakcli dev ./src/index.js"
  },
  "dependencies": {
    "@xmldom/xmldom": "^0.8.10",
    "xpath": "^0.0.34"
  }
}
src-tauri/yaak-common/Cargo.toml (new file, 9 lines)
@@ -0,0 +1,9 @@
[package]
name = "yaak-common"
version = "0.1.0"
edition = "2024"
publish = false

[dependencies]
tauri = { workspace = true }
regex = "1.11.0"
src-tauri/yaak-common/src/lib.rs (new file, 1 line)
@@ -0,0 +1 @@
pub mod window;
src-tauri/yaak-common/src/window.rs (new file, 31 lines)
@@ -0,0 +1,31 @@
use regex::Regex;
use tauri::{Runtime, WebviewWindow};

pub trait WorkspaceWindowTrait {
    fn workspace_id(&self) -> Option<String>;
    fn cookie_jar_id(&self) -> Option<String>;
    fn environment_id(&self) -> Option<String>;
}

impl<R: Runtime> WorkspaceWindowTrait for WebviewWindow<R> {
    fn workspace_id(&self) -> Option<String> {
        let url = self.url().unwrap();
        let re = Regex::new(r"/workspaces/(?<id>\w+)").unwrap();
        match re.captures(url.as_str()) {
            None => None,
            Some(captures) => captures.name("id").map(|c| c.as_str().to_string()),
        }
    }

    fn cookie_jar_id(&self) -> Option<String> {
        let url = self.url().unwrap();
        let mut query_pairs = url.query_pairs();
        query_pairs.find(|(k, _v)| k == "cookie_jar_id").map(|(_k, v)| v.to_string())
    }

    fn environment_id(&self) -> Option<String> {
        let url = self.url().unwrap();
        let mut query_pairs = url.query_pairs();
        query_pairs.find(|(k, _v)| k == "environment_id").map(|(_k, v)| v.to_string())
    }
}
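Note: the trait above derives all three IDs from the webview window's current URL; the workspace ID comes from the `/workspaces/<id>` path segment, while the cookie jar and environment IDs are plain query parameters. A minimal sketch of the same parsing against a plain `url::Url` (the URL below is hypothetical and assumes the `regex` and `url` crates are available):

```rust
// Sketch only: exercises the same regex + query-pair parsing used by
// WorkspaceWindowTrait against a made-up window URL.
use regex::Regex;
use url::Url;

fn main() {
    // Hypothetical URL shape; the real route format is controlled by the app.
    let url =
        Url::parse("http://localhost/workspaces/wk_123?environment_id=env_9&cookie_jar_id=cj_4")
            .unwrap();

    let re = Regex::new(r"/workspaces/(?<id>\w+)").unwrap();
    let workspace_id = re
        .captures(url.as_str())
        .and_then(|c| c.name("id"))
        .map(|m| m.as_str().to_string());

    let environment_id = url
        .query_pairs()
        .find(|(k, _)| k == "environment_id")
        .map(|(_, v)| v.to_string());

    assert_eq!(workspace_id.as_deref(), Some("wk_123"));
    assert_eq!(environment_id.as_deref(), Some("env_9"));
}
```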
src-tauri/yaak-crypto/Cargo.toml (new file, 20 lines)
@@ -0,0 +1,20 @@
[package]
name = "yaak-crypto"
links = "yaak-crypto"
version = "0.1.0"
edition = "2021"
publish = false

[dependencies]
base32 = "0.5.1" # For encoding human-readable key
base64 = "0.22.1" # For encoding in the database
chacha20poly1305 = "0.10.1"
keyring = { version = "4.0.0-rc.1" }
log = "0.4.26"
serde = { workspace = true, features = ["derive"] }
tauri = { workspace = true }
thiserror = "2.0.12"
yaak-models = { workspace = true }

[build-dependencies]
tauri-plugin = { workspace = true, features = ["build"] }
src-tauri/yaak-crypto/build.rs (new file, 9 lines)
@@ -0,0 +1,9 @@
const COMMANDS: &[&str] = &[
    "enable_encryption",
    "reveal_workspace_key",
    "set_workspace_key",
];

fn main() {
    tauri_plugin::Builder::new(COMMANDS).build();
}
src-tauri/yaak-crypto/index.ts (new file, 13 lines)
@@ -0,0 +1,13 @@
import { invoke } from '@tauri-apps/api/core';

export function enableEncryption(workspaceId: string) {
  return invoke<void>('plugin:yaak-crypto|enable_encryption', { workspaceId });
}

export function revealWorkspaceKey(workspaceId: string) {
  return invoke<string>('plugin:yaak-crypto|reveal_workspace_key', { workspaceId });
}

export function setWorkspaceKey(args: { workspaceId: string; key: string }) {
  return invoke<void>('plugin:yaak-crypto|set_workspace_key', args);
}
src-tauri/yaak-crypto/package.json (new file, 6 lines)
@@ -0,0 +1,6 @@
{
  "name": "@yaakapp-internal/crypto",
  "private": true,
  "version": "1.0.0",
  "main": "index.ts"
}
src-tauri/yaak-crypto/permissions/default.toml (new file, 7 lines)
@@ -0,0 +1,7 @@
[default]
description = "Default permissions for the plugin"
permissions = [
  "allow-enable-encryption",
  "allow-reveal-workspace-key",
  "allow-set-workspace-key",
]
src-tauri/yaak-crypto/src/commands.rs (new file, 31 lines)
@@ -0,0 +1,31 @@
use crate::error::Result;
use crate::manager::EncryptionManagerExt;
use tauri::{command, Runtime, WebviewWindow};

#[command]
pub(crate) async fn enable_encryption<R: Runtime>(
    window: WebviewWindow<R>,
    workspace_id: &str,
) -> Result<()> {
    window.crypto().ensure_workspace_key(workspace_id)?;
    window.crypto().reveal_workspace_key(workspace_id)?;
    Ok(())
}

#[command]
pub(crate) async fn reveal_workspace_key<R: Runtime>(
    window: WebviewWindow<R>,
    workspace_id: &str,
) -> Result<String> {
    Ok(window.crypto().reveal_workspace_key(workspace_id)?)
}

#[command]
pub(crate) async fn set_workspace_key<R: Runtime>(
    window: WebviewWindow<R>,
    workspace_id: &str,
    key: &str,
) -> Result<()> {
    window.crypto().set_human_key(workspace_id, key)?;
    Ok(())
}
src-tauri/yaak-crypto/src/encryption.rs (new file, 98 lines)
@@ -0,0 +1,98 @@
use crate::error::Error::{DecryptionError, EncryptionError, InvalidEncryptedData};
use crate::error::Result;
use chacha20poly1305::aead::generic_array::typenum::Unsigned;
use chacha20poly1305::aead::{Aead, AeadCore, Key, KeyInit, OsRng};
use chacha20poly1305::XChaCha20Poly1305;

const ENCRYPTION_TAG: &str = "yA4k3nC";
const ENCRYPTION_VERSION: u8 = 1;

pub(crate) fn encrypt_data(data: &[u8], key: &Key<XChaCha20Poly1305>) -> Result<Vec<u8>> {
    let nonce = XChaCha20Poly1305::generate_nonce(&mut OsRng);
    let cipher = XChaCha20Poly1305::new(&key);
    let ciphered_data = cipher.encrypt(&nonce, data).map_err(|_| EncryptionError)?;

    let mut data: Vec<u8> = Vec::new();
    data.extend_from_slice(ENCRYPTION_TAG.as_bytes()); // Tag
    data.push(ENCRYPTION_VERSION); // Version
    data.extend_from_slice(&nonce.as_slice()); // Nonce
    data.extend_from_slice(&ciphered_data); // Ciphertext

    Ok(data)
}

pub(crate) fn decrypt_data(cipher_data: &[u8], key: &Key<XChaCha20Poly1305>) -> Result<Vec<u8>> {
    // Yaak Tag + ID + Version + Nonce + ... ciphertext ...
    let (tag, rest) =
        cipher_data.split_at_checked(ENCRYPTION_TAG.len()).ok_or(InvalidEncryptedData)?;
    if tag != ENCRYPTION_TAG.as_bytes() {
        return Err(InvalidEncryptedData);
    }

    let (version, rest) = rest.split_at_checked(1).ok_or(InvalidEncryptedData)?;
    if version[0] != ENCRYPTION_VERSION {
        return Err(InvalidEncryptedData);
    }

    let nonce_bytes = <XChaCha20Poly1305 as AeadCore>::NonceSize::to_usize();
    let (nonce, ciphered_data) = rest.split_at_checked(nonce_bytes).ok_or(InvalidEncryptedData)?;

    let cipher = XChaCha20Poly1305::new(&key);
    cipher.decrypt(nonce.into(), ciphered_data).map_err(|_e| DecryptionError)
}

#[cfg(test)]
mod test {
    use crate::encryption::{decrypt_data, encrypt_data};
    use crate::error::Error::InvalidEncryptedData;
    use crate::error::Result;
    use chacha20poly1305::aead::OsRng;
    use chacha20poly1305::{KeyInit, XChaCha20Poly1305};

    #[test]
    fn test_encrypt_decrypt() -> Result<()> {
        let key = XChaCha20Poly1305::generate_key(OsRng);
        let encrypted = encrypt_data("hello world".as_bytes(), &key)?;
        let decrypted = decrypt_data(encrypted.as_slice(), &key)?;
        assert_eq!(String::from_utf8(decrypted).unwrap(), "hello world");
        Ok(())
    }

    #[test]
    fn test_decrypt_empty() -> Result<()> {
        let key = XChaCha20Poly1305::generate_key(OsRng);
        let encrypted = encrypt_data(&[], &key)?;
        assert_eq!(encrypted.len(), 48);
        let decrypted = decrypt_data(encrypted.as_slice(), &key)?;
        assert_eq!(String::from_utf8(decrypted).unwrap(), "");
        Ok(())
    }

    #[test]
    fn test_decrypt_bad_version() -> Result<()> {
        let key = XChaCha20Poly1305::generate_key(OsRng);
        let mut encrypted = encrypt_data("hello world".as_bytes(), &key)?;
        encrypted[7] = 0;
        let decrypted = decrypt_data(encrypted.as_slice(), &key);
        assert!(matches!(decrypted, Err(InvalidEncryptedData)));
        Ok(())
    }

    #[test]
    fn test_decrypt_bad_tag() -> Result<()> {
        let key = XChaCha20Poly1305::generate_key(OsRng);
        let mut encrypted = encrypt_data("hello world".as_bytes(), &key)?;
        encrypted[0] = 2;
        let decrypted = decrypt_data(encrypted.as_slice(), &key);
        assert!(matches!(decrypted, Err(InvalidEncryptedData)));
        Ok(())
    }

    #[test]
    fn test_decrypt_unencrypted_data() -> Result<()> {
        let key = XChaCha20Poly1305::generate_key(OsRng);
        let decrypted = decrypt_data("123".as_bytes(), &key);
        assert!(matches!(decrypted, Err(InvalidEncryptedData)));
        Ok(())
    }
}
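Note: from the constants and the order of the `extend_from_slice` calls above, the encrypted blob is framed as the 7-byte ASCII tag `yA4k3nC`, a 1-byte version, the 24-byte XChaCha20 nonce, then the ciphertext, which for (X)ChaCha20-Poly1305 is the plaintext length plus a 16-byte authentication tag. That is why `test_decrypt_empty` expects exactly 48 bytes for an empty plaintext (7 + 1 + 24 + 0 + 16). A minimal sketch of that framing arithmetic:

```rust
// Sketch of the framing overhead implied by encrypt_data above.
const TAG_LEN: usize = "yA4k3nC".len(); // 7-byte ASCII marker
const VERSION_LEN: usize = 1; // single version byte
const NONCE_LEN: usize = 24; // XChaCha20-Poly1305 nonce size
const AEAD_TAG_LEN: usize = 16; // Poly1305 authentication tag

/// Total size of an encrypted blob for a plaintext of `n` bytes.
const fn encrypted_len(n: usize) -> usize {
    TAG_LEN + VERSION_LEN + NONCE_LEN + n + AEAD_TAG_LEN
}

fn main() {
    assert_eq!(encrypted_len(0), 48); // matches test_decrypt_empty
    assert_eq!(encrypted_len(11), 59); // "hello world"
}
```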
src-tauri/yaak-crypto/src/error.rs (new file, 50 lines)
@@ -0,0 +1,50 @@
use serde::{Serialize, Serializer};
use std::io;
use thiserror::Error;

#[derive(Error, Debug)]
pub enum Error {
    #[error(transparent)]
    DbError(#[from] yaak_models::error::Error),

    #[error("Keyring error: {0}")]
    KeyringError(#[from] keyring::Error),

    #[error("Missing workspace encryption key")]
    MissingWorkspaceKey,

    #[error("Incorrect workspace key")]
    IncorrectWorkspaceKey,

    #[error("Failed to decrypt workspace key: {0}")]
    WorkspaceKeyDecryptionError(String),

    #[error("Crypto IO error: {0}")]
    IoError(#[from] io::Error),

    #[error("Failed to encrypt data")]
    EncryptionError,

    #[error("Failed to decrypt data")]
    DecryptionError,

    #[error("Invalid encrypted data")]
    InvalidEncryptedData,

    #[error("Invalid key provided")]
    InvalidHumanKey,

    #[error("Encryption error: {0}")]
    GenericError(String),
}

impl Serialize for Error {
    fn serialize<S>(&self, serializer: S) -> std::result::Result<S::Ok, S::Error>
    where
        S: Serializer,
    {
        serializer.serialize_str(self.to_string().as_ref())
    }
}

pub type Result<T> = std::result::Result<T, Error>;
src-tauri/yaak-crypto/src/lib.rs (new file, 27 lines)
@@ -0,0 +1,27 @@
extern crate core;

use crate::commands::*;
use crate::manager::EncryptionManager;
use tauri::plugin::{Builder, TauriPlugin};
use tauri::{generate_handler, Manager, Runtime};

mod commands;
pub mod encryption;
pub mod error;
pub mod manager;
mod master_key;
mod workspace_key;

pub fn init<R: Runtime>() -> TauriPlugin<R> {
    Builder::new("yaak-crypto")
        .invoke_handler(generate_handler![
            enable_encryption,
            reveal_workspace_key,
            set_workspace_key
        ])
        .setup(|app, _api| {
            app.manage(EncryptionManager::new(app.app_handle()));
            Ok(())
        })
        .build()
}
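Note: this diff does not show where the plugin is registered with the app. Wiring it in would presumably follow the usual Tauri 2 pattern, roughly like the sketch below (hypothetical; the actual builder chain in Yaak's main entry point is not part of this diff):

```rust
// Hypothetical registration sketch; not the real main.rs change.
fn main() {
    tauri::Builder::default()
        .plugin(yaak_crypto::init())
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}
```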
src-tauri/yaak-crypto/src/manager.rs (new file, 176 lines)
@@ -0,0 +1,176 @@
|
||||
use crate::error::Error::{
|
||||
GenericError, IncorrectWorkspaceKey, MissingWorkspaceKey, WorkspaceKeyDecryptionError,
|
||||
};
|
||||
use crate::error::{Error, Result};
|
||||
use crate::master_key::MasterKey;
|
||||
use crate::workspace_key::WorkspaceKey;
|
||||
use base64::prelude::BASE64_STANDARD;
|
||||
use base64::Engine;
|
||||
use log::{info, warn};
|
||||
use std::collections::HashMap;
|
||||
use std::sync::{Arc, Mutex};
|
||||
use tauri::{AppHandle, Manager, Runtime, State};
|
||||
use yaak_models::models::{EncryptedKey, Workspace, WorkspaceMeta};
|
||||
use yaak_models::query_manager::{QueryManager, QueryManagerExt};
|
||||
use yaak_models::util::{generate_id_of_length, UpdateSource};
|
||||
|
||||
const KEY_USER: &str = "encryption-key";
|
||||
|
||||
pub trait EncryptionManagerExt<'a, R> {
|
||||
fn crypto(&'a self) -> State<'a, EncryptionManager>;
|
||||
}
|
||||
|
||||
impl<'a, R: Runtime, M: Manager<R>> EncryptionManagerExt<'a, R> for M {
|
||||
fn crypto(&'a self) -> State<'a, EncryptionManager> {
|
||||
self.state::<EncryptionManager>()
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct EncryptionManager {
|
||||
cached_master_key: Arc<Mutex<Option<MasterKey>>>,
|
||||
cached_workspace_keys: Arc<Mutex<HashMap<String, WorkspaceKey>>>,
|
||||
query_manager: QueryManager,
|
||||
app_id: String,
|
||||
}
|
||||
|
||||
impl EncryptionManager {
|
||||
pub fn new<R: Runtime>(app_handle: &AppHandle<R>) -> Self {
|
||||
Self {
|
||||
cached_master_key: Default::default(),
|
||||
cached_workspace_keys: Default::default(),
|
||||
query_manager: app_handle.db_manager().inner().clone(),
|
||||
app_id: app_handle.config().identifier.to_string(),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn encrypt(&self, workspace_id: &str, data: &[u8]) -> Result<Vec<u8>> {
|
||||
let workspace_secret = self.get_workspace_key(workspace_id)?;
|
||||
workspace_secret.encrypt(data)
|
||||
}
|
||||
|
||||
pub fn decrypt(&self, workspace_id: &str, data: &[u8]) -> Result<Vec<u8>> {
|
||||
let workspace_secret = self.get_workspace_key(workspace_id)?;
|
||||
workspace_secret.decrypt(data)
|
||||
}
|
||||
|
||||
pub fn reveal_workspace_key(&self, workspace_id: &str) -> Result<String> {
|
||||
let key = self.get_workspace_key(workspace_id)?;
|
||||
key.to_human()
|
||||
}
|
||||
|
||||
pub fn set_human_key(&self, workspace_id: &str, human_key: &str) -> Result<WorkspaceMeta> {
|
||||
let wkey = WorkspaceKey::from_human(human_key)?;
|
||||
|
||||
let workspace = self.query_manager.connect().get_workspace(workspace_id)?;
|
||||
let encryption_key_challenge = match workspace.encryption_key_challenge {
|
||||
None => return self.set_workspace_key(workspace_id, &wkey),
|
||||
Some(c) => c,
|
||||
};
|
||||
|
||||
let encryption_key_challenge = match BASE64_STANDARD.decode(encryption_key_challenge) {
|
||||
Ok(c) => c,
|
||||
Err(_) => return Err(GenericError("Failed to decode workspace challenge".to_string())),
|
||||
};
|
||||
|
||||
if let Err(_) = wkey.decrypt(encryption_key_challenge.as_slice()) {
|
||||
return Err(IncorrectWorkspaceKey);
|
||||
};
|
||||
|
||||
self.set_workspace_key(workspace_id, &wkey)
|
||||
}
|
||||
|
||||
pub(crate) fn set_workspace_key(
|
||||
&self,
|
||||
workspace_id: &str,
|
||||
wkey: &WorkspaceKey,
|
||||
) -> Result<WorkspaceMeta> {
|
||||
info!("Created workspace key for {workspace_id}");
|
||||
|
||||
let encrypted_key = BASE64_STANDARD.encode(self.get_master_key()?.encrypt(wkey.raw_key())?);
|
||||
let encrypted_key = EncryptedKey { encrypted_key };
|
||||
let encryption_key_challenge = wkey.encrypt(generate_id_of_length(50).as_bytes())?;
|
||||
let encryption_key_challenge = Some(BASE64_STANDARD.encode(encryption_key_challenge));
|
||||
|
||||
let workspace_meta = self.query_manager.with_tx::<WorkspaceMeta, Error>(|tx| {
|
||||
let workspace = tx.get_workspace(workspace_id)?;
|
||||
let workspace_meta = tx.get_or_create_workspace_meta(workspace_id)?;
|
||||
tx.upsert_workspace(
|
||||
&Workspace {
|
||||
encryption_key_challenge,
|
||||
..workspace
|
||||
},
|
||||
&UpdateSource::Background,
|
||||
)?;
|
||||
|
||||
Ok(tx.upsert_workspace_meta(
|
||||
&WorkspaceMeta {
|
||||
encryption_key: Some(encrypted_key.clone()),
|
||||
..workspace_meta
|
||||
},
|
||||
&UpdateSource::Background,
|
||||
)?)
|
||||
})?;
|
||||
|
||||
let mut cache = self.cached_workspace_keys.lock().unwrap();
|
||||
cache.insert(workspace_id.to_string(), wkey.clone());
|
||||
|
||||
Ok(workspace_meta)
|
||||
}
|
||||
|
||||
pub(crate) fn ensure_workspace_key(&self, workspace_id: &str) -> Result<WorkspaceMeta> {
|
||||
let workspace_meta =
|
||||
self.query_manager.connect().get_or_create_workspace_meta(workspace_id)?;
|
||||
|
||||
// Already exists
|
||||
if let Some(_) = workspace_meta.encryption_key {
|
||||
warn!("Tried to create workspace key when one already exists for {workspace_id}");
|
||||
return Ok(workspace_meta);
|
||||
}
|
||||
|
||||
let wkey = WorkspaceKey::create()?;
|
||||
self.set_workspace_key(workspace_id, &wkey)
|
||||
}
|
||||
|
||||
fn get_workspace_key(&self, workspace_id: &str) -> Result<WorkspaceKey> {
|
||||
{
|
||||
let cache = self.cached_workspace_keys.lock().unwrap();
|
||||
if let Some(k) = cache.get(workspace_id) {
|
||||
return Ok(k.clone());
|
||||
}
|
||||
};
|
||||
|
||||
let db = self.query_manager.connect();
|
||||
let workspace_meta = db.get_or_create_workspace_meta(workspace_id)?;
|
||||
|
||||
let key = match workspace_meta.encryption_key {
|
||||
None => return Err(MissingWorkspaceKey),
|
||||
Some(k) => k,
|
||||
};
|
||||
|
||||
let mkey = self.get_master_key()?;
|
||||
let decoded_key = BASE64_STANDARD
|
||||
.decode(key.encrypted_key)
|
||||
.map_err(|e| WorkspaceKeyDecryptionError(e.to_string()))?;
|
||||
let raw_key = mkey
|
||||
.decrypt(decoded_key.as_slice())
|
||||
.map_err(|e| WorkspaceKeyDecryptionError(e.to_string()))?;
|
||||
info!("Got existing workspace key for {workspace_id}");
|
||||
let wkey = WorkspaceKey::from_raw_key(raw_key.as_slice());
|
||||
|
||||
Ok(wkey)
|
||||
}
|
||||
|
||||
fn get_master_key(&self) -> Result<MasterKey> {
|
||||
// NOTE: This locks the key for the entire function which seems wrong, but this prevents
|
||||
// concurrent access from prompting the user for a keychain password multiple times.
|
||||
let mut master_secret = self.cached_master_key.lock().unwrap();
|
||||
if let Some(k) = master_secret.as_ref() {
|
||||
return Ok(k.to_owned());
|
||||
}
|
||||
|
||||
let mkey = MasterKey::get_or_create(&self.app_id, KEY_USER)?;
|
||||
*master_secret = Some(mkey.clone());
|
||||
Ok(mkey)
|
||||
}
|
||||
}
|
||||
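Note: other code reaches the manager through the `EncryptionManagerExt` trait above, so any Tauri `Manager` handle (app handle, window, webview) can encrypt or decrypt workspace data. A minimal usage sketch (the helper function and its name are hypothetical; the trait, `Result` alias, and `encrypt` method come from this crate as shown in the diff):

```rust
// Hypothetical helper showing how callers would use EncryptionManagerExt.
use tauri::{Runtime, WebviewWindow};
use yaak_crypto::error::Result;
use yaak_crypto::manager::EncryptionManagerExt;

fn encrypt_for_window<R: Runtime>(
    window: &WebviewWindow<R>,
    workspace_id: &str,
    plaintext: &[u8],
) -> Result<Vec<u8>> {
    // Fails with MissingWorkspaceKey until a key exists for the workspace
    // (created by enable_encryption / ensure_workspace_key).
    window.crypto().encrypt(workspace_id, plaintext)
}
```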
src-tauri/yaak-crypto/src/master_key.rs (new file, 79 lines)
@@ -0,0 +1,79 @@
|
||||
use crate::encryption::{decrypt_data, encrypt_data};
|
||||
use crate::error::Error::GenericError;
|
||||
use crate::error::Result;
|
||||
use base32::Alphabet;
|
||||
use chacha20poly1305::aead::{Key, KeyInit, OsRng};
|
||||
use chacha20poly1305::XChaCha20Poly1305;
|
||||
use keyring::{Entry, Error};
|
||||
use log::info;
|
||||
|
||||
const HUMAN_PREFIX: &str = "YKM_";
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub(crate) struct MasterKey {
|
||||
key: Key<XChaCha20Poly1305>,
|
||||
}
|
||||
|
||||
impl MasterKey {
|
||||
pub(crate) fn get_or_create(app_id: &str, user: &str) -> Result<Self> {
|
||||
let id = format!("{app_id}.EncryptionKey");
|
||||
let entry = Entry::new(&id, user)?;
|
||||
|
||||
let key = match entry.get_password() {
|
||||
Ok(encoded) => {
|
||||
let without_prefix = encoded.strip_prefix(HUMAN_PREFIX).unwrap_or(&encoded);
|
||||
let key_bytes = base32::decode(Alphabet::Crockford {}, &without_prefix)
|
||||
.ok_or(GenericError("Failed to decode master key".to_string()))?;
|
||||
Key::<XChaCha20Poly1305>::clone_from_slice(key_bytes.as_slice())
|
||||
}
|
||||
Err(Error::NoEntry) => {
|
||||
info!("Creating new master key");
|
||||
let key = XChaCha20Poly1305::generate_key(OsRng);
|
||||
let encoded = base32::encode(Alphabet::Crockford {}, key.as_slice());
|
||||
let with_prefix = format!("{HUMAN_PREFIX}{encoded}");
|
||||
entry.set_password(&with_prefix)?;
|
||||
key
|
||||
}
|
||||
Err(e) => return Err(GenericError(e.to_string())),
|
||||
};
|
||||
|
||||
Ok(Self { key })
|
||||
}
|
||||
|
||||
pub(crate) fn encrypt(&self, data: &[u8]) -> Result<Vec<u8>> {
|
||||
encrypt_data(data, &self.key)
|
||||
}
|
||||
|
||||
pub(crate) fn decrypt(&self, data: &[u8]) -> Result<Vec<u8>> {
|
||||
decrypt_data(data, &self.key)
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
pub(crate) fn test_key() -> Self {
|
||||
let key: Key<XChaCha20Poly1305> = Key::<XChaCha20Poly1305>::clone_from_slice(
|
||||
"00000000000000000000000000000000".as_bytes(),
|
||||
);
|
||||
Self { key }
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use crate::error::Result;
|
||||
use crate::master_key::MasterKey;
|
||||
|
||||
#[test]
|
||||
fn test_master_key() -> Result<()> {
|
||||
// Test out the master key
|
||||
let mkey = MasterKey::test_key();
|
||||
let encrypted = mkey.encrypt("hello".as_bytes())?;
|
||||
let decrypted = mkey.decrypt(encrypted.as_slice()).unwrap();
|
||||
assert_eq!(decrypted, "hello".as_bytes().to_vec());
|
||||
|
||||
let mkey = MasterKey::test_key();
|
||||
let decrypted = mkey.decrypt(encrypted.as_slice()).unwrap();
|
||||
assert_eq!(decrypted, "hello".as_bytes().to_vec());
|
||||
|
||||
Ok(())
|
||||
}
|
||||
}
src-tauri/yaak-crypto/src/workspace_key.rs (new file, 116 lines)
@@ -0,0 +1,116 @@
|
||||
use crate::encryption::{decrypt_data, encrypt_data};
|
||||
use crate::error::Error::InvalidHumanKey;
|
||||
use crate::error::Result;
|
||||
use base32::Alphabet;
|
||||
use chacha20poly1305::aead::{Key, KeyInit, OsRng};
|
||||
use chacha20poly1305::{KeySizeUser, XChaCha20Poly1305};
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct WorkspaceKey {
|
||||
key: Key<XChaCha20Poly1305>,
|
||||
}
|
||||
|
||||
const HUMAN_PREFIX: &str = "YK";
|
||||
|
||||
impl WorkspaceKey {
|
||||
pub(crate) fn to_human(&self) -> Result<String> {
|
||||
let encoded = base32::encode(Alphabet::Crockford {}, self.key.as_slice());
|
||||
let with_prefix = format!("{HUMAN_PREFIX}{encoded}");
|
||||
let with_separators = with_prefix
|
||||
.chars()
|
||||
.collect::<Vec<_>>()
|
||||
.chunks(6)
|
||||
.map(|chunk| chunk.iter().collect::<String>())
|
||||
.collect::<Vec<_>>()
|
||||
.join("-");
|
||||
Ok(with_separators)
|
||||
}
|
||||
|
||||
#[allow(dead_code)]
|
||||
pub(crate) fn from_human(human_key: &str) -> Result<Self> {
|
||||
let without_prefix = human_key.strip_prefix(HUMAN_PREFIX).unwrap_or(human_key);
|
||||
let without_separators = without_prefix.replace("-", "");
|
||||
let key =
|
||||
base32::decode(Alphabet::Crockford {}, &without_separators).ok_or(InvalidHumanKey)?;
|
||||
if key.len() != XChaCha20Poly1305::key_size() {
|
||||
return Err(InvalidHumanKey);
|
||||
}
|
||||
Ok(Self::from_raw_key(key.as_slice()))
|
||||
}
|
||||
|
||||
pub(crate) fn from_raw_key(key: &[u8]) -> Self {
|
||||
Self {
|
||||
key: Key::<XChaCha20Poly1305>::clone_from_slice(key),
|
||||
}
|
||||
}
|
||||
|
||||
pub(crate) fn raw_key(&self) -> &[u8] {
|
||||
self.key.as_slice()
|
||||
}
|
||||
|
||||
pub(crate) fn create() -> Result<Self> {
|
||||
let key = XChaCha20Poly1305::generate_key(OsRng);
|
||||
Ok(Self::from_raw_key(key.as_slice()))
|
||||
}
|
||||
|
||||
pub(crate) fn encrypt(&self, data: &[u8]) -> Result<Vec<u8>> {
|
||||
encrypt_data(data, &self.key)
|
||||
}
|
||||
|
||||
pub(crate) fn decrypt(&self, data: &[u8]) -> Result<Vec<u8>> {
|
||||
decrypt_data(data, &self.key)
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
pub(crate) fn test_key() -> Self {
|
||||
Self::from_raw_key("f1a2d4b3c8e799af1456be3478a4c3f2".as_bytes())
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use crate::error::Error::InvalidHumanKey;
|
||||
use crate::error::Result;
|
||||
use crate::workspace_key::WorkspaceKey;
|
||||
|
||||
#[test]
|
||||
fn test_persisted_key() -> Result<()> {
|
||||
let key = WorkspaceKey::test_key();
|
||||
let encrypted = key.encrypt("hello".as_bytes())?;
|
||||
assert_eq!(key.decrypt(encrypted.as_slice())?, "hello".as_bytes());
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_human_format() -> Result<()> {
|
||||
let key = WorkspaceKey::test_key();
|
||||
|
||||
let encrypted = key.encrypt("hello".as_bytes())?;
|
||||
assert_eq!(key.decrypt(encrypted.as_slice())?, "hello".as_bytes());
|
||||
|
||||
let human = key.to_human()?;
|
||||
assert_eq!(human, "YKCRRP-2CK46H-H36RSR-CMVKJE-B1CRRK-8D9PC9-JK6D1Q-71GK8R-SKCRS0");
|
||||
assert_eq!(
|
||||
WorkspaceKey::from_human(&human)?.decrypt(encrypted.as_slice())?,
|
||||
"hello".as_bytes()
|
||||
);
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_from_human_invalid() -> Result<()> {
|
||||
assert!(matches!(
|
||||
WorkspaceKey::from_human(
|
||||
"YKCRRP-2CK46H-H36RSR-CMVKJE-B1CRRK-8D9PC9-JK6D1Q-71GK8R-SKCRS0-H3X38D",
|
||||
),
|
||||
Err(InvalidHumanKey)
|
||||
));
|
||||
|
||||
assert!(matches!(WorkspaceKey::from_human("bad-key",), Err(InvalidHumanKey)));
|
||||
assert!(matches!(WorkspaceKey::from_human("",), Err(InvalidHumanKey)));
|
||||
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
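Note: the `to_human` encoding above turns the raw 32-byte workspace key into Crockford base32 (52 characters), prefixes it with `YK`, and splits the result into dash-separated groups of six; that is how the 9-group key asserted in `test_human_format` comes about (2 + 52 = 54 characters = 9 × 6). A standalone sketch of just the grouping step (the input string here is made up, not a real key):

```rust
// Sketch of the chunk-and-join step used by WorkspaceKey::to_human.
fn group_key(s: &str) -> String {
    s.chars()
        .collect::<Vec<_>>()
        .chunks(6)
        .map(|chunk| chunk.iter().collect::<String>())
        .collect::<Vec<_>>()
        .join("-")
}

fn main() {
    // Made-up base32-style input, grouped into sixes with dashes.
    assert_eq!(group_key("YKABCDEF012345"), "YKABCD-EF0123-45");
}
```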
@@ -2,15 +2,15 @@
name = "yaak-git"
links = "yaak-git"
version = "0.1.0"
-edition = "2021"
+edition = "2024"
publish = false

[dependencies]
chrono = { version = "0.4.38", features = ["serde"] }
-git2 = { version = "0.20.0" , features = ["vendored-libgit2", "vendored-openssl"]}
+git2 = { version = "0.20.0", features = ["vendored-libgit2", "vendored-openssl"] }
log = "0.4.22"
-serde = { version = "1.0.215", features = ["derive"] }
-serde_json = "1.0.132"
+serde = { workspace = true, features = ["derive"] }
+serde_json = { workspace = true }
serde_yaml = "0.9.34"
tauri = { workspace = true }
thiserror = { workspace = true }
@@ -19,4 +19,4 @@ yaak-models = { workspace = true }
yaak-sync = { workspace = true }

[build-dependencies]
-tauri-plugin = { version = "2.0.3", features = ["build"] }
+tauri-plugin = { workspace = true, features = ["build"] }
@@ -1,14 +1,12 @@
|
||||
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
|
||||
|
||||
export type Environment = { model: "environment", id: string, workspaceId: string, environmentId: string | null, createdAt: string, updatedAt: string, name: string, variables: Array<EnvironmentVariable>, };
|
||||
export type Environment = { model: "environment", id: string, workspaceId: string, createdAt: string, updatedAt: string, name: string, public: boolean, base: boolean, variables: Array<EnvironmentVariable>, };
|
||||
|
||||
export type EnvironmentVariable = { enabled?: boolean, name: string, value: string, id?: string, };
|
||||
|
||||
export type Folder = { model: "folder", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, name: string, description: string, sortPriority: number, };
|
||||
export type Folder = { model: "folder", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authentication: Record<string, any>, authenticationType: string | null, description: string, headers: Array<HttpRequestHeader>, name: string, sortPriority: number, };
|
||||
|
||||
export type GrpcMetadataEntry = { enabled?: boolean, name: string, value: string, id?: string, };
|
||||
|
||||
export type GrpcRequest = { model: "grpc_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authenticationType: string | null, authentication: Record<string, any>, description: string, message: string, metadata: Array<GrpcMetadataEntry>, method: string | null, name: string, service: string | null, sortPriority: number, url: string, };
|
||||
export type GrpcRequest = { model: "grpc_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authenticationType: string | null, authentication: Record<string, any>, description: string, message: string, metadata: Array<HttpRequestHeader>, method: string | null, name: string, service: string | null, sortPriority: number, url: string, };
|
||||
|
||||
export type HttpRequest = { model: "http_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authentication: Record<string, any>, authenticationType: string | null, body: Record<string, any>, bodyType: string | null, description: string, headers: Array<HttpRequestHeader>, method: string, name: string, sortPriority: number, url: string, urlParameters: Array<HttpUrlParameter>, };
|
||||
|
||||
@@ -20,4 +18,4 @@ export type SyncModel = { "type": "workspace" } & Workspace | { "type": "environ
|
||||
|
||||
export type WebsocketRequest = { model: "websocket_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authentication: Record<string, any>, authenticationType: string | null, description: string, headers: Array<HttpRequestHeader>, message: string, name: string, sortPriority: number, url: string, urlParameters: Array<HttpUrlParameter>, };
|
||||
|
||||
export type Workspace = { model: "workspace", id: string, createdAt: string, updatedAt: string, name: string, description: string, settingValidateCertificates: boolean, settingFollowRedirects: boolean, settingRequestTimeout: number, };
|
||||
export type Workspace = { model: "workspace", id: string, createdAt: string, updatedAt: string, authentication: Record<string, any>, authenticationType: string | null, description: string, headers: Array<HttpRequestHeader>, name: string, encryptionKeyChallenge: string | null, settingValidateCertificates: boolean, settingFollowRedirects: boolean, settingRequestTimeout: number, };
|
||||
|
||||
@@ -52,7 +52,7 @@ export function useGit(dir: string) {
|
||||
onSuccess,
|
||||
}),
|
||||
commitAndPush: useMutation<PushResult, string, { message: string }>({
|
||||
mutationKey: ['git', 'commitpush', dir],
|
||||
mutationKey: ['git', 'commit_push', dir],
|
||||
mutationFn: async (args) => {
|
||||
await invoke('plugin:yaak-git|commit', { dir, ...args });
|
||||
return invoke('plugin:yaak-git|push', { dir });
|
||||
@@ -79,10 +79,18 @@ export function useGit(dir: string) {
|
||||
mutationFn: (args) => invoke('plugin:yaak-git|unstage', { dir, ...args }),
|
||||
onSuccess,
|
||||
}),
|
||||
init: useGitInit(),
|
||||
},
|
||||
] as const;
|
||||
}
|
||||
|
||||
export async function gitInit(dir: string) {
|
||||
await invoke('plugin:yaak-git|initialize', { dir });
|
||||
export function useGitInit() {
|
||||
const queryClient = useQueryClient();
|
||||
const onSuccess = () => queryClient.invalidateQueries({ queryKey: ['git'] });
|
||||
|
||||
return useMutation<void, string, { dir: string }>({
|
||||
mutationKey: ['git', 'init'],
|
||||
mutationFn: (args) => invoke('plugin:yaak-git|initialize', { ...args }),
|
||||
onSuccess,
|
||||
});
|
||||
}
|
||||
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-add"
|
||||
description = "Enables the add command without any pre-configured scope."
|
||||
commands.allow = ["add"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-add"
|
||||
description = "Denies the add command without any pre-configured scope."
|
||||
commands.deny = ["add"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-branch"
|
||||
description = "Enables the branch command without any pre-configured scope."
|
||||
commands.allow = ["branch"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-branch"
|
||||
description = "Denies the branch command without any pre-configured scope."
|
||||
commands.deny = ["branch"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-checkout"
|
||||
description = "Enables the checkout command without any pre-configured scope."
|
||||
commands.allow = ["checkout"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-checkout"
|
||||
description = "Denies the checkout command without any pre-configured scope."
|
||||
commands.deny = ["checkout"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-checkout-remote"
|
||||
description = "Enables the checkout_remote command without any pre-configured scope."
|
||||
commands.allow = ["checkout_remote"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-checkout-remote"
|
||||
description = "Denies the checkout_remote command without any pre-configured scope."
|
||||
commands.deny = ["checkout_remote"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-commit"
|
||||
description = "Enables the commit command without any pre-configured scope."
|
||||
commands.allow = ["commit"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-commit"
|
||||
description = "Denies the commit command without any pre-configured scope."
|
||||
commands.deny = ["commit"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-delete-branch"
|
||||
description = "Enables the delete_branch command without any pre-configured scope."
|
||||
commands.allow = ["delete_branch"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-delete-branch"
|
||||
description = "Denies the delete_branch command without any pre-configured scope."
|
||||
commands.deny = ["delete_branch"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-fetch-all"
|
||||
description = "Enables the fetch_all command without any pre-configured scope."
|
||||
commands.allow = ["fetch_all"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-fetch-all"
|
||||
description = "Denies the fetch_all command without any pre-configured scope."
|
||||
commands.deny = ["fetch_all"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-initialize"
|
||||
description = "Enables the initialize command without any pre-configured scope."
|
||||
commands.allow = ["initialize"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-initialize"
|
||||
description = "Denies the initialize command without any pre-configured scope."
|
||||
commands.deny = ["initialize"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-log"
|
||||
description = "Enables the log command without any pre-configured scope."
|
||||
commands.allow = ["log"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-log"
|
||||
description = "Denies the log command without any pre-configured scope."
|
||||
commands.deny = ["log"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-merge-branch"
|
||||
description = "Enables the merge_branch command without any pre-configured scope."
|
||||
commands.allow = ["merge_branch"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-merge-branch"
|
||||
description = "Denies the merge_branch command without any pre-configured scope."
|
||||
commands.deny = ["merge_branch"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-pull"
|
||||
description = "Enables the pull command without any pre-configured scope."
|
||||
commands.allow = ["pull"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-pull"
|
||||
description = "Denies the pull command without any pre-configured scope."
|
||||
commands.deny = ["pull"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-push"
|
||||
description = "Enables the push command without any pre-configured scope."
|
||||
commands.allow = ["push"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-push"
|
||||
description = "Denies the push command without any pre-configured scope."
|
||||
commands.deny = ["push"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-status"
|
||||
description = "Enables the status command without any pre-configured scope."
|
||||
commands.allow = ["status"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-status"
|
||||
description = "Denies the status command without any pre-configured scope."
|
||||
commands.deny = ["status"]
|
||||
@@ -1,13 +0,0 @@
|
||||
# Automatically generated - DO NOT EDIT!
|
||||
|
||||
"$schema" = "../../schemas/schema.json"
|
||||
|
||||
[[permission]]
|
||||
identifier = "allow-unstage"
|
||||
description = "Enables the unstage command without any pre-configured scope."
|
||||
commands.allow = ["unstage"]
|
||||
|
||||
[[permission]]
|
||||
identifier = "deny-unstage"
|
||||
description = "Denies the unstage command without any pre-configured scope."
|
||||
commands.deny = ["unstage"]
|
||||
@@ -1,391 +0,0 @@
|
||||
## Default Permission
|
||||
|
||||
Default permissions for the plugin
|
||||
|
||||
- `allow-add`
|
||||
- `allow-branch`
|
||||
- `allow-checkout`
|
||||
- `allow-commit`
|
||||
- `allow-delete-branch`
|
||||
- `allow-fetch-all`
|
||||
- `allow-initialize`
|
||||
- `allow-log`
|
||||
- `allow-merge-branch`
|
||||
- `allow-pull`
|
||||
- `allow-push`
|
||||
- `allow-status`
|
||||
- `allow-unstage`
|
||||
|
||||
## Permission Table
|
||||
|
||||
<table>
|
||||
<tr>
|
||||
<th>Identifier</th>
|
||||
<th>Description</th>
|
||||
</tr>
|
||||
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-add`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the add command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-add`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the add command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-branch`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the branch command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-branch`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the branch command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-checkout`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the checkout command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-checkout`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the checkout command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-checkout-remote`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the checkout_remote command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-checkout-remote`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the checkout_remote command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-commit`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the commit command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-commit`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the commit command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-delete-branch`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the delete_branch command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-delete-branch`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the delete_branch command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-fetch-all`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the fetch_all command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-fetch-all`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the fetch_all command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-initialize`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the initialize command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-initialize`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the initialize command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-log`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the log command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-log`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the log command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-merge-branch`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the merge_branch command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-merge-branch`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the merge_branch command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-pull`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the pull command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-pull`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the pull command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-push`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the push command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-push`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the push command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-status`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the status command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-status`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the status command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:allow-unstage`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Enables the unstage command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
<tr>
|
||||
<td>
|
||||
|
||||
`yaak-git:deny-unstage`
|
||||
|
||||
</td>
|
||||
<td>
|
||||
|
||||
Denies the unstage command without any pre-configured scope.
|
||||
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
Some files were not shown because too many files have changed in this diff.