Documentation
Learn how to use Krosyn to connect your applications.
Import & Export
Export your data as CSV files and import data in bulk.
CSV Export
You can export data from list pages (Connectors, Users) as CSV files. The export dropdown offers two options:
- Export All: Downloads all records regardless of active filters or search terms.
- Export Current View: Downloads only the records matching your current filters, search query, and sort order. This is useful for exporting a specific subset of data.
The exported CSV file includes column headers and all visible fields for each record.
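Because the export includes a header row, the file can be consumed directly with a standard CSV parser. A minimal sketch in Python (the sample data here is hypothetical, standing in for a real export):

```python
import csv
import io

# Stand-in for an exported file; real exports contain the columns
# listed in the field reference sections below.
exported = 'name,status\n"My Connector","enabled"\n"Other","disabled"\n'

# DictReader maps each row onto the column headers from the first line.
rows = list(csv.DictReader(io.StringIO(exported)))
enabled = [r["name"] for r in rows if r["status"] == "enabled"]
print(enabled)  # ['My Connector']
```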
CSV Import
Import data in bulk by uploading a CSV file. To import:
- Navigate to the list page (e.g. Connectors or Users)
- Click the Import button
- Select a CSV file from your computer
- The system validates the file and shows a preview of the data
- Confirm to import the records
CSV Format Requirements
- The first row must contain column headers
- Column names should match the expected field names (see the export file for reference)
- Use UTF-8 encoding
- Both comma and semicolon delimiters are auto-detected
- Values containing commas or double quotes should be wrapped in double quotes
- Maximum file size: 2 MB
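A standard CSV library satisfies these requirements automatically. A minimal sketch in Python (row data is illustrative):

```python
import csv
import io

# Build a small import file that meets the requirements above: header
# row first, quoting left to the csv module. Write with UTF-8 encoding
# when saving to disk.
buf = io.StringIO()
writer = csv.writer(buf)  # comma delimiter; semicolon is also auto-detected
writer.writerow(["name", "status"])         # header row comes first
writer.writerow(["Orders, EU", "enabled"])  # comma in value -> auto-quoted
content = buf.getvalue()
print(content)
```

The value containing a comma comes out wrapped in double quotes, as required.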
Connector Field Reference
The following fields are available when importing or exporting connectors. All fields except `name` are optional.
| Field | Type | Description |
|---|---|---|
| `id` | integer | Used to match existing records during import. Leave blank to create new. Not editable. |
| `name` | string (required, max 120) | Connector name |
| `status` | string | `enabled` or `disabled`. New connectors default to `disabled`. |
| `example_payload` | JSON string | Sample JSON payload for testing and autocomplete |
| `trigger` | string | Expression evaluated against the payload, e.g. `equals($payload.event_type, "order.created")` |
| `process` | JSON string | Array of field mappings: `[{"key":"field_name","value":"$expression"}]` |
| `process_api_url` | URL string | Enrichment API endpoint for the Process step |
| `process_api_params` | JSON string | Query parameters: `[{"key":"param","value":"$expression"}]` |
| `process_api_headers` | JSON string | Lookup headers: `[{"key":"Authorization","value":"Bearer $secret.KEY"}]` |
| `perform` | JSON string | Request body fields: `[{"key":"field_name","value":"$expression"}]` |
| `perform_api_url` | URL string | Target API endpoint for the Perform step |
| `perform_api_method` | string | `POST`, `PUT`, `PATCH`, or `DELETE` |
| `perform_headers` | JSON string | Outbound headers: `[{"key":"Content-Type","value":"application/json"}]` |
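When building a connector row programmatically, the JSON-typed fields above must be serialized to JSON strings before they go into a CSV cell. A sketch in Python (all values are illustrative):

```python
import csv
import io
import json

# One connector row; perform is serialized with json.dumps so the CSV
# cell holds a valid JSON string.
row = {
    "name": "Order sync",
    "status": "enabled",
    "trigger": 'equals($payload.event_type, "order.created")',
    "perform": json.dumps([{"key": "email", "value": "$payload.email"}]),
    "perform_api_url": "https://api.example.com/webhook",
    "perform_api_method": "POST",
}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(row))
writer.writeheader()
writer.writerow(row)

# Round-trip: the perform cell parses back to a JSON array.
parsed = next(csv.DictReader(io.StringIO(buf.getvalue())))
print(json.loads(parsed["perform"]))
```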
User Field Reference
The following fields are available when importing or exporting users.
| Field | Type | Description |
|---|---|---|
| `name` | string (required, max 255) | User's full name |
| `email` | string (required, max 255) | Must be unique within the organization |
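User imports follow the same CSV conventions. A minimal import file (sample data) looks like:

```csv
name,email
"Jane Doe","jane@example.com"
"John Smith","john@example.com"
```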
CSV Format Example
A minimal valid connector CSV with five columns:

```csv
name,status,trigger,perform_api_url,perform_api_method
"My Connector","enabled","equals($payload.type, ""order"")","https://api.example.com/webhook","POST"
```

Note the doubled quotes inside the trigger expression. This is standard CSV escaping for values that contain double quotes.
JSON Field Format
Fields like `process`, `perform`, `process_api_params`, `process_api_headers`, and `perform_headers` contain JSON arrays. To encode these in CSV:
- Write the JSON array normally
- Wrap the entire value in double quotes
- Escape any inner double quotes by doubling them (`""`)
Example: `perform` field in CSV

```csv
"[{""key"":""email"",""value"":""$payload.email""},{""key"":""name"",""value"":""$payload.name""}]"
```

If the JSON is malformed, the import will fail for that row with an "invalid JSON" error.
Import Behavior
How the import system handles creates vs updates:
- Rows with an `id` matching an existing connector in your organization will update that record.
- Rows without an `id` (or with an ID that does not exist in your organization) create a new connector.
- New connectors are created with status `disabled` by default, regardless of the status field in the CSV.
- Maximum 100 rows per import file.
- Import is all-or-nothing: if any row fails validation, the entire import is rolled back and no records are changed.
- Trigger expressions are validated during import. Invalid expressions will cause the import to fail.
- Subscription limits are enforced. If importing would exceed your plan's connector limit, the import is rejected.
- New connectors automatically get an endpoint provisioned with bearer and secret tokens.
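Because the import is all-or-nothing, it can save round trips to check rows locally before uploading. A sketch of such a pre-check in Python (a subset of what the server enforces; the function name is hypothetical):

```python
import json

MAX_ROWS = 100  # server-side limit from the rules above

def prevalidate(rows):
    """Return a list of error strings for problems the server would reject."""
    errors = []
    if len(rows) > MAX_ROWS:
        errors.append(f"too many rows: {len(rows)} > {MAX_ROWS}")
    json_fields = ("process", "perform", "process_api_params",
                   "process_api_headers", "perform_headers")
    for i, row in enumerate(rows, start=1):
        if not row.get("name"):
            errors.append(f"row {i}: name is required")
        for field in json_fields:
            value = row.get(field)
            if value:
                try:
                    json.loads(value)
                except ValueError:
                    errors.append(f"row {i}: invalid JSON in {field}")
    return errors

print(prevalidate([{"name": "", "perform": "{not json"}]))
```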
Programmatic Import
The CSV format is designed to be machine-readable, making it suitable for integration with scripts or AI tools.
- Export first to get the exact format template. The exported CSV contains all columns with proper headers and shows how existing data is encoded.
- JSON fields must be properly escaped in CSV. Use your language's CSV library rather than building CSV strings manually.
- The delimiter is auto-detected (comma or semicolon), so either format works.
- UTF-8 encoding is required. A BOM prefix is automatically stripped if present.
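Putting these points together, a script can reuse a previous export's exact header as the template for a generated import file. A sketch in Python (the export content here stands in for a real exported file):

```python
import csv
import io

# Stand-in for a previously exported file, BOM included.
export_data = "\ufeffname,status,trigger\nExisting,enabled,\n"

# Strip the BOM locally before parsing, then reuse the header so
# generated rows line up with the columns the importer expects.
header = next(csv.reader(io.StringIO(export_data.lstrip("\ufeff"))))

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=header)
writer.writeheader()
writer.writerow({"name": "Generated connector", "status": "disabled", "trigger": ""})
print(buf.getvalue())
```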
Tips
- Use Export as a template. If you are not sure about the CSV format for imports, export your existing data first and use the exported file as a template.
- Filtered exports for reporting: Apply filters (e.g. status = active, or search for specific records) and use "Export Current View" to generate targeted reports.
- Sort order is preserved. When using "Export Current View", the sort order you have applied on the page is maintained in the exported file.