Importing and Exporting Pipelines
Share pipelines as files, generate curl commands, and move pipelines between environments.
Exporting Pipelines
Export as JSON
1. Open Pipeline Menu
Click the ••• button on any pipeline card (in the list) or in the pipeline builder.
2. Select Export
Choose Export as JSON from the menu.
3. Download File
The pipeline definition downloads as a .json file.
File contents include:
- Pipeline name and description
- All nodes and their configurations
- Connections between nodes
- Layout information (node positions)
- Metadata (created, modified dates)
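Because the export is plain JSON, it can be inspected with a few lines of code before sharing or re-importing. A minimal sketch (the field names follow the schema shown later on this page):

```python
import json

def summarize_pipeline(path):
    """Load an exported pipeline file and return a short summary."""
    with open(path) as f:
        data = json.load(f)
    return {
        "name": data.get("name"),
        "nodes": len(data.get("nodes", [])),
        "edges": len(data.get("edges", [])),
        "modified": data.get("metadata", {}).get("modifiedAt"),
    }
```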
What’s Included vs. Excluded
✅ Included: Pipeline definition, node configurations, connections, layout
❌ NOT included: Input data, execution results, version history
Why input data isn’t included:
- Keeps files small
- Protects sensitive data
- Makes pipelines reusable with different datasets
- Easier to share across environments
Importing Pipelines
Import from File
1. Go to Pipeline List
Navigate to /dashboard/pipeline.
2. Click Import
Click the Import button in the top toolbar.
3. Select File
Choose a .json pipeline file from your computer.
4. Configure Import
- Name — Enter a name (uses filename by default)
- Tags — Add optional tags
- Duplicate or replace — Choose how to handle existing pipelines
5. Import
Click Import to create the pipeline.
Import Conflicts
If a pipeline with the same name exists:
| Option | Behavior |
|---|---|
| Create new | Adds a number suffix: “Pipeline (2)” |
| Replace | Overwrites existing pipeline |
| Skip | Keeps existing, ignores imported |
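The Create new behavior can be pictured as a simple suffixing rule. A sketch (hypothetical helper, not the product's actual code): given the set of existing pipeline names, it appends the first free numeric suffix.

```python
def next_available_name(name, existing):
    """Return name unchanged if free, else append the first free ' (n)' suffix."""
    if name not in existing:
        return name
    n = 2
    while f"{name} ({n})" in existing:
        n += 1
    return f"{name} ({n})"
```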
⚠️ Replace is permanent — Overwritten pipelines cannot be recovered unless you have version history.
Sharing via Share Links
Generate Share Link
1. Open Pipeline Menu
Click the ••• button on the pipeline.
2. Select Share
Choose Share from the menu.
3. Configure Permissions
View only:
- Recipients can see the pipeline
- Cannot edit or copy
Allow copy:
- Recipients can copy the pipeline to their account
- Can then edit their copy
4. Copy Link
Click Copy link to copy the share URL to your clipboard.
5. Share
Send the link via email, chat, or any messaging app.
Share Link Management
Viewing shared pipelines:
- Go to Sharing in the sidebar
- See all pipelines you’ve shared
- See share count and access logs
Revoking access:
- Open pipeline menu
- Select Share
- Click Revoke all links
- Confirm revocation
💡 Share links expire — Demo mode links expire after 7 days. Dashboard mode links don’t expire unless you revoke them.
cURL Export
Generate cURL Command
Run any pipeline via HTTP using curl:
1. Open Pipeline
Open the pipeline in the builder.
2. Click Export
Click Export → cURL command.
3. Copy Command
The cURL command includes:
- Pipeline definition
- Input data
- Authentication
- Endpoint URL
4. Run in Terminal
Paste into your terminal and run.
curl -X POST https://api.example.com/pipelines/execute \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "pipeline": {...},
    "input": {...}
  }'

cURL Options
Customize the command:
| Option | Description |
|---|---|
| Include input | Embed input data in command |
| Input from file | Read input from local file |
| Input from URL | Fetch input from URL |
| Output format | Choose JSON, pretty-printed, or minified |
Example with file input:
curl -X POST https://api.example.com/pipelines/execute \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  --data-binary @request.json

Here request.json contains the complete request body (the pipeline definition plus its input). Use a single data flag: curl joins multiple -d values with &, which would corrupt a JSON body.

API Execution
Execute via API
Run pipelines programmatically using the REST API.
Endpoint:
POST /api/pipelines/execute

Request body:
{
"pipelineId": "pipeline_abc123",
"input": {
"your": "data",
"goes": "here"
},
"options": {
"timeout": 30000,
"includeSteps": true
}
}

Response:
{
"success": true,
"output": {...},
"steps": [
{"stepId": "step_1", "output": {...}},
{"stepId": "step_2", "output": {...}}
],
"executionTime": 1234
}

Authentication
Include your API key in the request header:
Authorization: Bearer YOUR_API_KEY

🔑 Get your API key — Go to Dashboard → API Keys to create and manage keys.
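Putting the endpoint, request body, and auth header together, a minimal Python sketch using the standard library (the base URL and API key are placeholders; the request is only constructed here, not sent):

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder: create one under Dashboard → API Keys
BASE_URL = "https://api.example.com"  # placeholder host

# Build the request body described above.
body = {
    "pipelineId": "pipeline_abc123",
    "input": {"your": "data", "goes": "here"},
    "options": {"timeout": 30000, "includeSteps": True},
}

req = urllib.request.Request(
    f"{BASE_URL}/api/pipelines/execute",
    data=json.dumps(body).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# To actually execute: resp = urllib.request.urlopen(req)
```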
Rate Limits
| Plan | Requests per minute |
|---|---|
| Free | 10 |
| Pro | 100 |
| Enterprise | Unlimited |
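Clients can stay under these limits by reading the X-RateLimit-* response headers (listed below) and pausing once the window is exhausted. A sketch with hypothetical header values:

```python
import time

def wait_if_throttled(headers, now=None):
    """Sleep until the reset time when no requests remain in the window.

    Returns the number of seconds waited (0.0 if requests remain).
    """
    remaining = int(headers.get("X-RateLimit-Remaining", 1))
    if remaining > 0:
        return 0.0
    reset_at = int(headers.get("X-RateLimit-Reset", 0))  # Unix timestamp
    now = time.time() if now is None else now
    delay = max(0.0, reset_at - now)
    time.sleep(delay)
    return delay
```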
Check response headers for rate limit info:
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 95
X-RateLimit-Reset: 1234567890

Pipeline Versioning
Export Specific Version
Each pipeline saves versions automatically. Export any version:
- Open the pipeline
- Click the Versions tab
- Find the version to export
- Click Export
- Download the .json file
Import as New Version
When importing, choose whether to:
- Create new pipeline — Fresh pipeline with imported version
- Add version to existing — Add as a new version of current pipeline
- Replace current version — Overwrite current version
Moving Between Environments
Demo to Dashboard
Move pipelines from demo (localStorage) to dashboard (cloud):
1. Export from demo:
   - Open the pipeline in /demo/pipeline
   - Click Export
   - Download the .json file
2. Import to dashboard:
   - Go to /dashboard/pipeline
   - Click Import
   - Select the downloaded file
   - Save to cloud
Dashboard to Demo
Move pipelines from dashboard to demo:
1. Export from dashboard:
   - Open the pipeline in /dashboard/pipeline/[id]
   - Click Export
   - Download the .json file
2. Import to demo:
   - Go to /demo/pipeline
   - Click Import
   - Select the downloaded file
   - Save to localStorage
⚠️ Data loss risk — Demo mode uses localStorage, which can be cleared by browser settings. Only use demo for testing, not for important work.
Pipeline File Format
JSON Schema
{
"version": "1.0",
"name": "My Pipeline",
"description": "Does cool things",
"nodes": [
{
"id": "node_1",
"type": "input",
"position": {"x": 100, "y": 100}
},
{
"id": "node_2",
"type": "utility",
"utilityId": "cleanup.clean-json",
"config": {
"removeNulls": true,
"trimStrings": true
},
"position": {"x": 400, "y": 100}
},
{
"id": "node_3",
"type": "output",
"position": {"x": 700, "y": 100}
}
],
"edges": [
{
"id": "edge_1",
"source": "node_1",
"target": "node_2",
"sourceHandle": "output",
"targetHandle": "input"
},
{
"id": "edge_2",
"source": "node_2",
"target": "node_3",
"sourceHandle": "output",
"targetHandle": "input"
}
],
"metadata": {
"createdAt": "2024-01-15T00:00:00Z",
"modifiedAt": "2024-03-20T00:00:00Z",
"version": 5
}
}

Modifying Exported Files
You can edit exported .json files:
- Change pipeline name or description
- Modify node configurations
- Adjust node positions
- Add or remove nodes
Import back to apply changes.
⚠️ Validate JSON — After manual editing, validate JSON syntax before importing. Invalid files will fail to import.
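A quick pre-import check can catch both problems the warning mentions: invalid syntax and missing required fields. A sketch (hypothetical helper; the required fields follow the schema above):

```python
import json

def check_pipeline_file(text):
    """Return a list of problems found in an exported pipeline JSON string."""
    problems = []
    try:
        data = json.loads(text)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    # The schema requires nodes and edges arrays.
    for field in ("nodes", "edges"):
        if not isinstance(data.get(field), list):
            problems.append(f"missing or invalid '{field}' array")
    nodes = data.get("nodes") or []
    edges = data.get("edges") or []
    node_ids = {n.get("id") for n in nodes if isinstance(n, dict)}
    # Each node needs id, type, and position.
    for node in nodes:
        for key in ("id", "type", "position"):
            if key not in node:
                problems.append(f"node missing '{key}'")
    # Edges must reference valid node IDs.
    for edge in edges:
        if edge.get("source") not in node_ids or edge.get("target") not in node_ids:
            problems.append(f"edge {edge.get('id')} references unknown node")
    return problems
```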
Troubleshooting
Import Fails
“Invalid JSON format”
- Validate file syntax using a JSON validator
- Check for missing commas, brackets, or quotes
- Ensure file is complete (not truncated)
“Missing required fields”
- Verify the file includes nodes and edges arrays
- Check that each node has id, type, and position
- Ensure edges reference valid node IDs
“Utility not found”
- Utility may not exist in this environment
- Check utility ID matches available utilities
- Replace with alternative utility if needed
Export Fails
“Pipeline too large”
- Pipeline exceeds file size limits
- Remove unnecessary nodes or data
- Split into multiple smaller pipelines
“Permission denied”
- You don’t have export permission
- Pipeline may be shared with you (not owned)
- Request owner to export instead
Next Steps
- API Guide — Complete API reference
- API Quick Reference — Common endpoints
- Management — Organize and share pipelines