API Documentation

Automate your data exports

Schedule recurring exports to S3, Azure Blob, SFTP, or webhook endpoints. Define format, scope, and schedule once — then track every run with full history and error details.

Four Destinations

Push data to s3, azure-blob, sftp, or webhook destinations with per-job configuration.

Cron Scheduling

Define schedules with standard cron expressions. Run nightly, hourly, or on any custom cadence using the schedule field.
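The matching rule behind five-field cron expressions can be sketched in a few lines of Python. This toy matcher handles only the plain-number, *, and */n forms seen in the examples; it is an illustration, not the server's scheduler:

```python
from datetime import datetime

def cron_matches(expr: str, dt: datetime) -> bool:
    """Return True if dt matches the 5-field cron expression.

    Supports the common forms: '*', a plain number, and '*/n' steps.
    Fields: minute, hour, day-of-month, month, day-of-week (0 = Sunday).
    """
    fields = expr.split()
    values = [dt.minute, dt.hour, dt.day, dt.month, dt.isoweekday() % 7]
    for field, value in zip(fields, values):
        if field == "*":
            continue
        if field.startswith("*/"):
            if value % int(field[2:]) != 0:
                return False
        elif int(field) != value:
            return False
    return True

# "0 2 * * *" fires once per day at 02:00
print(cron_matches("0 2 * * *", datetime(2026, 3, 8, 2, 0)))  # True
print(cron_matches("0 2 * * *", datetime(2026, 3, 8, 3, 0)))  # False
```

A scheduler evaluates this predicate once per minute; the expression matches at most one minute slot per period.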

Multiple Formats

Export as csv, json, jsonl, or xml. Optionally compress output with gzip for large datasets.
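The difference between the json and jsonl formats, and the effect of gzip compression, can be sketched with Python's standard library (the record shapes are illustrative, not the actual export schema):

```python
import gzip
import json

records = [
    {"id": "rec-1", "name": "Shirt"},
    {"id": "rec-2", "name": "Jacket"},
]

# json: the whole file is one array
as_json = json.dumps(records)

# jsonl: one object per line, convenient for streaming consumers
as_jsonl = "\n".join(json.dumps(r) for r in records)
print(as_jsonl)
# {"id": "rec-1", "name": "Shirt"}
# {"id": "rec-2", "name": "Jacket"}

# gzip, as applied when "compress": true; decompression round-trips
compressed = gzip.compress(as_jsonl.encode("utf-8"))
assert gzip.decompress(compressed).decode("utf-8") == as_jsonl
```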

Run History

Every execution is logged with status, record count, file size, destination path, and error details for failed runs.

Endpoints

Method   Path                       Description
GET      /export-jobs               List all export jobs
POST     /export-jobs               Create a new export job
GET      /export-jobs/:id           Get export job by ID
PUT      /export-jobs/:id           Update an export job
DELETE   /export-jobs/:id           Delete an export job
POST     /export-jobs/:id/pause     Pause an active export job
POST     /export-jobs/:id/resume    Resume a paused export job
GET      /export-jobs/:id/runs      List run history for a job
POST     /export-jobs/:id/runs      Record a new run

POST   /export-jobs

Create a new scheduled export job. Define the destination, format, data scope, and cron schedule. The job starts in an active state by default.

Request Body

Field Type Description
name string Human-readable name for the export job.
description string? Optional description of the job's purpose.
destinationType "s3" | "azure-blob" | "sftp" | "webhook" Where to deliver the exported file.
destinationConfig object Destination-specific settings: bucket, path, region for S3; url for webhook; etc.
format "csv" | "json" | "jsonl" | "xml" Output file format.
schemaId string? Filter to records belonging to this schema.
catalogId string? Filter to records assigned to this catalog.
dimensionScope object? Dimension scope for value flattening, e.g. {"dim-language": "seg-en"}.
schedule string Cron expression for the schedule, e.g. "0 2 * * *" for nightly at 2 AM.
includeVariants boolean Whether to include variant records in the export.
compress boolean Gzip-compress the output file.
filenameTemplate string Output filename template. Supports {date} placeholder, e.g. "products-{date}.csv".
active boolean Whether the job is active. Defaults to true.
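As an illustration of filenameTemplate, here is a minimal renderer for the {date} placeholder. The ISO date format matches the example paths shown in the run history, but the server's exact formatting is an assumption:

```python
from datetime import date

def render_filename(template: str, run_date: date) -> str:
    """Expand the {date} placeholder in a filenameTemplate.

    Uses ISO format (YYYY-MM-DD), matching the example destination
    paths; the server-side format is assumed, not documented here.
    """
    return template.replace("{date}", run_date.isoformat())

print(render_filename("products-{date}.csv.gz", date(2026, 3, 8)))
# products-2026-03-08.csv.gz
```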

Example — Create a nightly S3 export

curl -X POST https://api.sigma-pim.com/api/v1/export-jobs \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..." \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Nightly Product Feed",
    "description": "English product catalog for the webshop",
    "destinationType": "s3",
    "destinationConfig": {
      "bucket": "sigma-exports",
      "path": "feeds/products/",
      "region": "eu-west-1"
    },
    "format": "csv",
    "schemaId": "schema-apparel",
    "dimensionScope": { "dim-language": "seg-en" },
    "schedule": "0 2 * * *",
    "includeVariants": true,
    "compress": true,
    "filenameTemplate": "products-{date}.csv.gz",
    "active": true
  }'
// Response
{
  "id": "expjob-nightly-feed",
  "name": "Nightly Product Feed",
  "description": "English product catalog for the webshop",
  "destinationType": "s3",
  "destinationConfig": {
    "bucket": "sigma-exports",
    "path": "feeds/products/",
    "region": "eu-west-1"
  },
  "format": "csv",
  "schemaId": "schema-apparel",
  "dimensionScope": { "dim-language": "seg-en" },
  "schedule": "0 2 * * *",
  "includeVariants": true,
  "compress": true,
  "filenameTemplate": "products-{date}.csv.gz",
  "active": true,
  "createdAt": "2026-03-08T10:30:00.000Z",
  "updatedAt": "2026-03-08T10:30:00.000Z"
}

GET   /export-jobs

List all export jobs configured in the workspace.

curl https://api.sigma-pim.com/api/v1/export-jobs \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..."

GET   /export-jobs/:id

Get a single export job by its ID.

curl https://api.sigma-pim.com/api/v1/export-jobs/expjob-nightly-feed \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..."

PUT   /export-jobs/:id

Update an existing export job. Send the full updated object.

curl -X PUT https://api.sigma-pim.com/api/v1/export-jobs/expjob-nightly-feed \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..." \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Nightly Product Feed (v2)",
    "schedule": "0 3 * * *",
    "compress": false,
    "filenameTemplate": "products-{date}.csv"
  }'
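Because PUT replaces the job, a client typically does a read-modify-write: fetch the current job, merge the changed fields, and send the merged object back. A minimal sketch of the merge step, with an abbreviated job payload standing in for the GET response:

```python
# `current` stands in for a (trimmed) GET /export-jobs/:id response.
current = {
    "name": "Nightly Product Feed",
    "schedule": "0 2 * * *",
    "compress": True,
    "filenameTemplate": "products-{date}.csv.gz",
}

# Fields the client wants to change.
changes = {
    "name": "Nightly Product Feed (v2)",
    "schedule": "0 3 * * *",
    "compress": False,
    "filenameTemplate": "products-{date}.csv",
}

# Dict merge: keys in `changes` override keys in `current`.
updated = {**current, **changes}
print(updated["schedule"])  # 0 3 * * *
# `updated` is what gets serialized into the PUT request body.
```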

DELETE   /export-jobs/:id

Delete an export job and all its run history.

curl -X DELETE https://api.sigma-pim.com/api/v1/export-jobs/expjob-nightly-feed \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..."
// Response
{ "deleted": true }

POST   /export-jobs/:id/pause

Pause an active export job. The schedule stops running but the job configuration is preserved.

curl -X POST https://api.sigma-pim.com/api/v1/export-jobs/expjob-nightly-feed/pause \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..."

POST   /export-jobs/:id/resume

Resume a paused export job. The schedule resumes from the next cron match.

curl -X POST https://api.sigma-pim.com/api/v1/export-jobs/expjob-nightly-feed/resume \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..."

GET   /export-jobs/:id/runs

List the execution history for an export job. Returns runs sorted by most recent first. Use the limit query parameter to control how many runs are returned.

Query Parameters

Parameter Type Default Description
limit number 50 Maximum number of runs to return.

Run Object

Field Type Description
id string Unique run identifier.
jobId string The parent export job ID.
status "running" | "success" | "failed" Current status of the run.
startedAt string ISO 8601 timestamp when the run started.
completedAt string? ISO 8601 timestamp when the run completed. Null if still running.
recordCount number? Number of records exported in this run.
fileSize number? Size of the exported file in bytes.
destinationPath string? Full path or URL where the file was delivered.
error string? Error message if the run failed.

Example — List run history

curl "https://api.sigma-pim.com/api/v1/export-jobs/expjob-nightly-feed/runs?limit=5" \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..."
// Response
[
  {
    "id": "run-20260308-020000",
    "jobId": "expjob-nightly-feed",
    "status": "success",
    "startedAt": "2026-03-08T02:00:00.000Z",
    "completedAt": "2026-03-08T02:00:14.320Z",
    "recordCount": 347,
    "fileSize": 84210,
    "destinationPath": "s3://sigma-exports/feeds/products/products-2026-03-08.csv.gz",
    "error": null
  },
  {
    "id": "run-20260307-020000",
    "jobId": "expjob-nightly-feed",
    "status": "success",
    "startedAt": "2026-03-07T02:00:00.000Z",
    "completedAt": "2026-03-07T02:00:12.880Z",
    "recordCount": 345,
    "fileSize": 83540,
    "destinationPath": "s3://sigma-exports/feeds/products/products-2026-03-07.csv.gz",
    "error": null
  },
  {
    "id": "run-20260306-020000",
    "jobId": "expjob-nightly-feed",
    "status": "failed",
    "startedAt": "2026-03-06T02:00:00.000Z",
    "completedAt": "2026-03-06T02:00:03.150Z",
    "recordCount": null,
    "fileSize": null,
    "destinationPath": null,
    "error": "S3 PutObject failed: Access Denied (403). Check IAM permissions for bucket 'sigma-exports'."
  }
]
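A client consuming the run history might flag failed runs and compute a success rate. A minimal sketch over a trimmed copy of the example response:

```python
# `runs` mirrors the example run-history response, trimmed to key fields.
runs = [
    {"id": "run-20260308-020000", "status": "success", "recordCount": 347},
    {"id": "run-20260307-020000", "status": "success", "recordCount": 345},
    {"id": "run-20260306-020000", "status": "failed", "recordCount": None,
     "error": "S3 PutObject failed: Access Denied (403)."},
]

# Failed runs carry an `error` message worth surfacing to operators.
failed = [r for r in runs if r["status"] == "failed"]

# Success rate over the returned window (treats "running" as not yet successful).
success_rate = sum(r["status"] == "success" for r in runs) / len(runs)

print(len(failed))             # 1
print(round(success_rate, 2))  # 0.67
```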

POST   /export-jobs/:id/runs

Record a new run entry for an export job. Typically called by the export runner service to log execution results.

curl -X POST https://api.sigma-pim.com/api/v1/export-jobs/expjob-nightly-feed/runs \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..." \
  -H "Content-Type: application/json" \
  -d '{
    "status": "success",
    "startedAt": "2026-03-08T02:00:00.000Z",
    "completedAt": "2026-03-08T02:00:14.320Z",
    "recordCount": 347,
    "fileSize": 84210,
    "destinationPath": "s3://sigma-exports/feeds/products/products-2026-03-08.csv.gz"
  }'