
Import & Export

Export your data as CSV files and import data in bulk.

CSV Export

You can export data from list pages (Connectors, Users) as CSV files. The export dropdown offers two options:

  • Export All: Downloads all records regardless of active filters or search terms.
  • Export Current View: Downloads only the records matching your current filters, search query, and sort order. This is useful for exporting a specific subset of data.

The exported CSV file includes column headers and all visible fields for each record.

CSV Import

Import data in bulk by uploading a CSV file. To import:

  1. Navigate to the list page (e.g. Connectors or Users)
  2. Click the Import button
  3. Select a CSV file from your computer
  4. The system validates the file and shows a preview of the data
  5. Confirm to import the records

CSV Format Requirements

  • The first row must contain column headers
  • Column names should match the expected field names (see the export file for reference)
  • Use UTF-8 encoding
  • Both comma and semicolon delimiters are auto-detected
  • Values containing commas or double quotes should be wrapped in double quotes
  • Maximum file size: 2 MB
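The requirements above are easiest to satisfy by letting a CSV library handle the encoding and quoting rather than concatenating strings. A minimal Python sketch (the file name and field values are illustrative):

```python
import csv
import os

# Build a small import file that satisfies the format requirements:
# a header row first, UTF-8 encoding, and automatic quoting for values
# that contain commas or double quotes (csv.QUOTE_MINIMAL, the default).
rows = [
    {"name": "Orders, EU", "status": "disabled"},
    {"name": 'Webhook "beta"', "status": "enabled"},
]

with open("connectors.csv", "w", encoding="utf-8", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "status"])
    writer.writeheader()
    writer.writerows(rows)

# Stay under the 2 MB limit before uploading.
assert os.path.getsize("connectors.csv") <= 2 * 1024 * 1024
```

Note that the writer quotes `"Orders, EU"` (it contains a comma) and doubles the quotes in `Webhook "beta"` without any manual escaping.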

Connector Field Reference

The following fields are available when importing or exporting connectors. All fields except name are optional.

Field               | Type                          | Description
--------------------|-------------------------------|------------
id                  | integer                       | Used to match existing records during import. Leave blank to create new. Not editable.
name                | string (required, max 120)    | Connector name
status              | string                        | enabled or disabled. New connectors default to disabled.
example_payload     | JSON string                   | Sample JSON payload for testing and autocomplete
trigger             | string                        | Expression evaluated against the payload, e.g. equals($payload.event_type, "order.created")
process             | JSON string                   | Array of field mappings: [{"key":"field_name","value":"$expression"}]
process_api_url     | URL string                    | Enrichment API endpoint for the Process step
process_api_params  | JSON string                   | Query parameters: [{"key":"param","value":"$expression"}]
process_api_headers | JSON string                   | Lookup headers: [{"key":"Authorization","value":"Bearer $secret.KEY"}]
perform             | JSON string                   | Request body fields: [{"key":"field_name","value":"$expression"}]
perform_api_url     | URL string                    | Target API endpoint for the Perform step
perform_api_method  | string                        | POST, PUT, PATCH, or DELETE
perform_headers     | JSON string                   | Outbound headers: [{"key":"Content-Type","value":"application/json"}]

User Field Reference

The following fields are available when importing or exporting users.

Field | Type                       | Description
------|----------------------------|------------
name  | string (required, max 255) | User's full name
email | string (required, max 255) | Must be unique within the organization
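Since users only require these two columns, a valid import file is short. A sketch that writes one (the name and address are placeholders):

```python
import csv

# A minimal users import file: just the two required columns,
# UTF-8 encoded, header row first.
with open("users.csv", "w", encoding="utf-8", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "email"])
    writer.writeheader()
    writer.writerow({"name": "Ada Lovelace", "email": "ada@example.com"})
```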

CSV Format Example

A minimal valid connector CSV with five fields:

name,status,trigger,perform_api_url,perform_api_method
"My Connector","enabled","equals($payload.type, ""order"")","https://api.example.com/webhook","POST"

Note the doubled quotes inside the trigger expression. This is standard CSV escaping for values that contain double quotes.
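The same row can be produced without hand-escaping anything: csv.writer applies the doubled-quote convention automatically. A sketch (the output file name is illustrative):

```python
import csv

# Reproduce the minimal connector CSV above. csv.writer quotes the
# trigger field (it contains a comma and quotes) and doubles the inner
# double quotes for us.
header = ["name", "status", "trigger", "perform_api_url", "perform_api_method"]
row = [
    "My Connector",
    "enabled",
    'equals($payload.type, "order")',
    "https://api.example.com/webhook",
    "POST",
]

with open("minimal_connector.csv", "w", encoding="utf-8", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerow(row)
```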

JSON Field Format

Fields like process, perform, process_api_params, process_api_headers, and perform_headers contain JSON arrays. To encode these in CSV:

  1. Write the JSON array normally
  2. Wrap the entire value in double quotes
  3. Escape any inner double quotes by doubling them ("")

Example: perform field in CSV

"[{""key"":""email"",""value"":""$payload.email""},{""key"":""name"",""value"":""$payload.name""}]"

If the JSON is malformed, the import will fail for that row with an "invalid JSON" error.
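In practice the three steps collapse to one line per field: serialize with json.dumps and let the CSV writer do the quoting. A sketch (file and field values are illustrative):

```python
import csv
import json

# Encode a perform field: json.dumps produces the array, csv.writer
# wraps it in quotes and doubles the inner quotes. Never escape by hand.
perform = [
    {"key": "email", "value": "$payload.email"},
    {"key": "name", "value": "$payload.name"},
]

with open("perform_field.csv", "w", encoding="utf-8", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "perform"])
    writer.writerow(["My Connector", json.dumps(perform, separators=(",", ":"))])
```

Reading the file back with a CSV parser and json.loads recovers the original array, which is also a quick way to check a file for the "invalid JSON" failure before uploading.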

Import Behavior

How the import system handles creates vs updates:

  • Rows with an id matching an existing connector in your organization will update that record.
  • Rows without an id (or with an id that does not exist in your organization) create a new connector.
  • New connectors are created with status disabled by default, regardless of the status field in the CSV.
  • Maximum 100 rows per import file.
  • Import is all-or-nothing: if any row fails validation, the entire import is rolled back and no records are changed.
  • Trigger expressions are validated during import. Invalid expressions will cause the import to fail.
  • Subscription limits are enforced. If importing would exceed your plan's connector limit, the import is rejected.
  • New connectors automatically get an endpoint provisioned with bearer and secret tokens.
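The per-row create-vs-update decision above can be summarized as a small function. This is an illustrative sketch of the documented behavior, not Krosyn's actual implementation; the function name and row shape are hypothetical, and the real import runs server-side with all-or-nothing rollback:

```python
def plan_row(row, existing_ids):
    """Return ('update', id) or ('create', None) for one CSV row.

    row: a dict parsed from the CSV; existing_ids: the set of connector
    ids already in your organization.
    """
    raw_id = (row.get("id") or "").strip()
    if raw_id and int(raw_id) in existing_ids:
        # A matching id updates the existing connector in place.
        return ("update", int(raw_id))
    # Blank or unknown id: a new connector, always created disabled
    # regardless of the status column.
    return ("create", None)
```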

Programmatic Import

The CSV format is designed to be machine-readable, making it suitable for integration with scripts or AI tools.

  • Export first to get the exact format template. The exported CSV contains all columns with proper headers and shows how existing data is encoded.
  • JSON fields must be properly escaped in CSV. Use your language's CSV library rather than building CSV strings manually.
  • The delimiter is auto-detected (comma or semicolon), so either format works.
  • UTF-8 encoding is required. A BOM prefix is automatically stripped if present.
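A script consuming an exported file can mirror the importer's behavior: Python's utf-8-sig codec strips a BOM if present, and csv.Sniffer distinguishes comma from semicolon. A sketch (the helper name is illustrative):

```python
import csv

def read_export(path):
    """Parse an exported CSV, tolerating a BOM and either delimiter."""
    with open(path, encoding="utf-8-sig", newline="") as f:
        # Sniff the delimiter from the first few KB, then rewind.
        sample = f.read(4096)
        f.seek(0)
        dialect = csv.Sniffer().sniff(sample, delimiters=",;")
        return list(csv.DictReader(f, dialect=dialect))
```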

Tips

  • Use Export as a template. If you are not sure about the CSV format for imports, export your existing data first and use the exported file as a template.
  • Filtered exports for reporting: Apply filters (e.g. status = enabled, or search for specific records) and use "Export Current View" to generate targeted reports.
  • Sort order is preserved. When using "Export Current View", the sort order you have applied on the page is maintained in the exported file.