Use select input for text fields with enum property #402

Merged · 3 commits · Jan 9, 2024
1 change: 1 addition & 0 deletions src/addons/dispatch/Form.svelte
@@ -78,6 +78,7 @@
          required={required.includes(name)}
          bind:value={$values[name]}
          defaultValue={params.default}
+         choices={params.enum}
        />
      {/each}
    </fieldset>
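The change above forwards two schema fields as component props. A minimal sketch of that mapping, using a hypothetical `params` object shaped like the fixture's `access_level` property:

```javascript
// Hypothetical JSON-schema property (mirrors the access_level fixture).
const params = {
  type: "string",
  title: "Access Level",
  enum: ["public", "organization", "private"],
  default: "public",
};

// Form.svelte passes the schema's `default` and `enum` through as the
// field component's `defaultValue` and `choices` props.
const props = {
  defaultValue: params.default,
  choices: params.enum, // undefined when the schema has no enum
};
```

If a property has no `enum`, `choices` is simply `undefined`, and the `Select` field's `choices: string[] = []` default takes over.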
23 changes: 23 additions & 0 deletions src/addons/dispatch/fields/Select.svelte
@@ -0,0 +1,23 @@
<script lang="ts">
  import Field from "./Field.svelte";

  export let title: string;
  export let name: string;
  export let defaultValue: string = "";
  export let value: string = defaultValue || null;
  export let required: boolean = false;
  export let description: string = "";
  export let inline = false;
  export let choices: string[] = [];
</script>

<Field {title} {description} {inline} {required}>
  <select {name} bind:value on:input on:focus on:change on:blur>
    {#if !required}
      <option value="">---</option>
    {/if}
    {#each choices as choice}
      <option value={choice}>{choice}</option>
    {/each}
  </select>
</Field>
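The new component prepends an empty "---" placeholder only when the field is optional, so a required select has no blank choice. A sketch of that option-building logic (assumed behavior described in plain JavaScript, not the Svelte component itself):

```javascript
// Build the option list a select field would render: one option per
// enum choice, with an empty placeholder prepended for optional fields.
function selectOptions(choices, required) {
  const options = choices.map((choice) => ({ value: choice, label: choice }));
  if (!required) {
    // Optional field: allow submitting no value at all.
    options.unshift({ value: "", label: "---" });
  }
  return options;
}
```

Submitting the placeholder yields `""`, which is why the component only offers it when `required` is false.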
6 changes: 6 additions & 0 deletions src/addons/dispatch/fields/index.js
@@ -2,6 +2,7 @@ import ArrayField from "./ArrayField.svelte";
import Checkbox from "./Checkbox.svelte";
import Number from "./Number.svelte";
import Text from "./Text.svelte";
+import Select from "./Select.svelte";

// https://json-schema.org/understanding-json-schema/reference/type.html
const fields = {
@@ -31,5 +32,10 @@ export function autofield(params, fallback = Text) {
    return fallback;
  }

+  // only string enums for now
+  if (type === "string" && params.enum) {
+    return Select;
+  }

  return fields[type];
}
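The dispatch order matters: the enum check must run after the missing-type fallback but before the plain type lookup, so a string property with an `enum` gets a select rather than a text input. A minimal sketch of that logic (the real `autofield` returns Svelte components; placeholder strings stand in for them here):

```javascript
// Simplified stand-in for the fields map in
// src/addons/dispatch/fields/index.js (components replaced by names).
const fields = { string: "Text", number: "Number", boolean: "Checkbox" };

function autofield(params, fallback = "Text") {
  const { type } = params;
  if (!type) {
    return fallback; // schema gave us nothing to dispatch on
  }
  // only string enums for now
  if (type === "string" && params.enum) {
    return "Select";
  }
  return fields[type];
}
```

Extending this to non-string enums (e.g. numeric ones) would only require relaxing the `type === "string"` guard, which is presumably why the comment scopes it explicitly.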
14 changes: 9 additions & 5 deletions src/addons/fixtures/addons.json
@@ -30,6 +30,7 @@
"type": "object",
"title": "Scraper",
"required": ["site", "project"],
+      "categories": ["monitor"],
"properties": {
"site": {
"type": "string",
@@ -65,10 +66,11 @@
"description": "Recursively scrape same-domain links found on the page (Must be between 0 and 2)"
},
"access_level": {
+          "enum": ["public", "organization", "private"],
"type": "string",
"title": "Access Level",
"default": "public",
-          "description": "Access level (public, private, or organization) of documents scraped."
+          "description": "Access level of documents scraped."
},
"slack_webhook": {
"type": "string",
@@ -77,18 +79,20 @@
"description": "Enter a slack webhook to enable Slack notifications"
}
},
-      "description": "<p>This add-on will scrape and optionally crawl a given site for documents to upload to DocumentCloud. It can also alert you of given keywords appearing in those documents.</p>",
+      "description": "<p>Scrape and optionally crawl a given site for documents to upload to DocumentCloud.</p>",
"eventOptions": {
"name": "site",
"events": ["hourly", "daily", "weekly"]
}
},
"instructions": "<p>You may specify a project to scrape the documents into as well as an access level. \nScraper can alert you by email or Slack notification when given keywords appear in\ndocuments if you specify keywords to monitor. For Slack notifications, you must provide a webhook. </p>\n<p>The crawl depth is a parameter that tells the Scraper how many clicks deep away from the\nsite you specify in order to continue looking for documents. \nIf the PDFs are directly linked on the site you provide \n(1 click to get to the PDF), 0 is the crawl depth you should use. \nIf the site you provide a link to contains multiple links to other pages that have PDFs linked to those pages, \nyour crawl depth would be 1. A crawl depth of 2 is the maximum supported. </p>\n<p>The Scraper Add-On now supports Google Drive links. \nIt will upload the first 30 Google Drive documents it sees per run. \nScraper will upload the first 100 regular documents it sees per run. \nThe Scraper keeps track of which documents it has seen and already uploaded.</p>"
},
"created_at": "2022-05-17T13:49:53.635344Z",
-    "updated_at": "2023-05-12T16:48:05.637062Z",
+    "updated_at": "2024-01-08T19:55:22.662655Z",
"active": false,
"default": true,
-    "featured": false
+    "featured": true
},

{
"id": 110,
"user": 20080,