Data Exports
Exports allow you to retrieve historical submission data in bulk. Unlike webhooks, which push data in real time, exports are asynchronous background jobs designed for analytics, compliance, and archival purposes.
What Are Exports?
An export is a snapshot of your form data at a specific point in time. Because submission volumes can range from hundreds to millions of records, generating this data is resource-intensive and cannot be performed synchronously within a standard API request cycle.
When you trigger an export, Forge queues a background worker to query the data store, serialize the results, and upload the resulting artifact to a secure, temporary storage bucket.
When to Use Exports
- Analytics & BI: Ingesting bulk data into data warehouses (Snowflake, BigQuery) for trend analysis.
- System Backfills: Seeding a new CRM or internal tool with historical submission data.
- Compliance & Audits: Retrieving all records for a specific time range to satisfy legal discovery or GDPR requests.
- Migration: Moving data off the Forge platform to on-premise storage.
Supported Formats
| Format | Content Type | Description |
|---|---|---|
| JSON | application/json | Full fidelity array of objects. Preserves nested structures and metadata types. |
| CSV | text/csv | Flattened structure. Useful for Excel/Sheets. UTF-8 encoded with BOM. |
Note: In CSV exports, deeply nested JSON fields in submissions will be stringified.
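The stringification rule can be pictured with a small helper; `flatten_for_csv` is illustrative, not part of the Forge tooling:

```python
import json

def flatten_for_csv(submission: dict) -> dict:
    """Mimic the CSV export rule: nested values become JSON strings."""
    flat = {}
    for key, value in submission.items():
        if isinstance(value, (dict, list)):
            # Deeply nested fields are stringified rather than exploded
            # into extra columns.
            flat[key] = json.dumps(value)
        else:
            flat[key] = value
    return flat
```

A submission like `{"email": "a@b.c", "answers": {"q1": "yes"}}` would therefore produce an `answers` column containing the literal text `{"q1": "yes"}`.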
Export Lifecycle
Exports follow an asynchronous state machine pattern. You initiate a job, poll for status, and download the artifact upon completion.
1. Initiate Job
Send a POST request to the exports endpoint specifying the form ID and date range.
POST /v1/exports
{
"form_id": "f_12345",
"format": "json",
"date_from": "2025-01-01T00:00:00Z"
}
2. Processing
The API returns a 202 Accepted response with a Job ID. The job enters the processing state.
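From the client side, step 1 amounts to building and sending this request. A minimal sketch using only the standard library (the base URL and Bearer auth scheme are assumptions, not documented values):

```python
import json
import urllib.request

def build_export_request(base_url, api_key, form_id, fmt, date_from):
    """Construct the POST /v1/exports request from step 1.

    base_url and the Authorization scheme are illustrative assumptions;
    substitute whatever your Forge credentials actually use.
    """
    body = json.dumps({
        "form_id": form_id,
        "format": fmt,
        "date_from": date_from,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/v1/exports",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
```

Sending the request with `urllib.request.urlopen` (or any HTTP client) should yield the 202 response described below; only request construction is shown here so nothing is actually sent.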
3. Retrieval
Once the status is completed, the job resource will contain a signed download_url.
GET /v1/exports/job_98765
{
"id": "job_98765",
"status": "completed",
"download_url": "https://storage.forge.dev/exports/..."
}
Handling Large Datasets
For datasets exceeding 100,000 records, exports are processed in chunks to ensure memory safety and system stability. Do not expect instant generation; processing time scales linearly with the volume of data and the complexity of the payload.
- Timeout Safety: Because exports are async, there are no HTTP timeout risks during generation.
- Concurrency: Each account has a concurrency limit. Queued jobs will remain in pending until a worker slot is free.
- Artifact Expiry: Generated files are deleted after 24 hours to enforce security and reduce storage costs.
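Because generation time scales with volume, clients typically poll the job resource on an interval. A minimal sketch, where `fetch_job(job_id)` is a stand-in for your HTTP client returning `GET /v1/exports/{job_id}` as a dict, and a terminal `failed` status is an assumption (the lifecycle above only names pending, processing, and completed):

```python
import time

def wait_for_export(fetch_job, job_id, interval=5.0, timeout=900.0):
    """Poll the export job until it completes, fails, or we give up."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        job = fetch_job(job_id)
        if job["status"] == "completed":
            return job["download_url"]
        if job["status"] == "failed":  # assumed terminal error state
            raise RuntimeError(f"export {job_id} failed")
        # Jobs may sit in `pending` while the account's concurrency
        # limit is saturated, so keep waiting rather than erroring.
        time.sleep(interval)
    raise TimeoutError(f"export {job_id} still running after {timeout}s")
```

For very large exports, consider lengthening `interval` (or adding backoff) so you are not hammering the status endpoint for the duration of the job.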
Security & Access Controls
Exports involve sensitive bulk data. We apply strict controls to the generation and retrieval process.
Signed URLs
Download links are pre-signed and expire after a short window (15 minutes). A URL is unusable without its signature parameters, and once the signature expires the link stops working, so treat it as a short-lived secret rather than a permanent address.
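Given the short expiry, it is safer to re-read the job resource right before downloading than to cache a `download_url`. A sketch, with `fetch_job(job_id)` again standing in for your HTTP client:

```python
def fresh_download_url(fetch_job, job_id):
    """Fetch a just-signed download_url immediately before use.

    fetch_job(job_id) is assumed to return GET /v1/exports/{job_id}
    as a dict; re-reading the job yields a URL whose 15-minute
    signature window starts now, not when the job finished.
    """
    job = fetch_job(job_id)
    if job["status"] != "completed":
        raise RuntimeError(f"export {job_id} is not ready (status={job['status']})")
    return job["download_url"]
```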
Audit Logging
Every export initiation and download event is logged in the Audit Trail, including the user IP, User-Agent, and timestamp.
Common Pitfalls
Not for Real-time Usage
Do not trigger an export every time a user submits a form. Use Webhooks for real-time event processing.
Redundant Generation
Check the status of existing jobs before triggering new ones. Generating the same export multiple times wastes quota and delays processing.
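One way to implement that check is to scan recent jobs for a reusable match before creating a new one. In this sketch, `list_jobs()` is a stand-in assumed to return the account's recent export jobs (e.g. from a hypothetical `GET /v1/exports` listing, which this page does not document):

```python
# Statuses worth reusing: in-flight jobs and artifacts still within
# the 24-hour retention window.
REUSABLE_STATUSES = {"pending", "processing", "completed"}

def find_reusable_export(list_jobs, form_id, fmt):
    """Return an existing job matching the requested parameters, or None.

    Reusing a pending/processing/completed job avoids burning quota on
    a duplicate export and skips the queue entirely.
    """
    for job in list_jobs():
        if (job.get("form_id") == form_id
                and job.get("format") == fmt
                and job.get("status") in REUSABLE_STATUSES):
            return job
    return None
```

Only trigger a new `POST /v1/exports` when this returns `None` (or when the matching artifact has already expired).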