Write APIs
Government Admin users can use Write APIs to publish data to data.gov.sg automatically and programmatically, without needing to log in to the UI.
Prerequisites
You must be a government officer
Your email must have admin permissions to manage datasets
Step 1 - Generate API Keys:
Create an API key from the admin dashboard.
Store the API key in a secure location. It will be used later.
Step by step details can be found here: How to generate API keys
Step 2 - Verify API Connectivity:
Run a GET request against the endpoint below, using your admin API key in the request header:

https://api-production.data.gov.sg/v2/admin/api/auth/whoami

GET /v2/admin/api/auth/whoami HTTP/1.1
Host: api-production.data.gov.sg
Accept: */*

Successful response:

{
  "user": {
    "userId": "167",
    "email": "[email protected]"
  }
}

A 200 response means that you're connected.
Common issues:
Check your API URL - did you use the correct URL above?
Check your API key - did you use the correct API key? Try regenerating the API key and testing again
Are your network configurations set up to allow internet access? data.gov.sg is an internet platform and requires internet connectivity
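The connectivity check can be scripted. Below is a minimal sketch in Python; the `x-api-key` header name is an assumption, so confirm the exact header name against the raw API schema:

```python
import requests

WHOAMI_URL = "https://api-production.data.gov.sg/v2/admin/api/auth/whoami"

def check_connectivity(api_key: str) -> bool:
    """Return True if the whoami endpoint accepts the API key (HTTP 200)."""
    # Header name "x-api-key" is an assumption; check the raw API schema.
    resp = requests.get(WHOAMI_URL, headers={"x-api-key": api_key})
    if resp.ok:
        print("Connected as:", resp.json()["user"]["email"])
    else:
        print("Connectivity check failed:", resp.status_code)
    return resp.ok

# Usage: check_connectivity("your-api-key")
```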
Step 3 - Publish Dataset:
To push data, the dataset must already exist. It can be created from the admin dashboard.
The dataset must already contain existing data.
The dataset must be published.
For more details on publishing data: How to publish data
Using Write APIs involves three steps:
Generating upload link
Uploading file to link
Polling for upload status
Step 4 - Create Upload / Append URL:
There are two types of links you can create depending on your use case:

If you're replacing an existing dataset completely, use the create upload URL method.
If you're appending rows to an existing dataset, use the create append URL method.

When appending, please ensure the column names and data types are all the same and no new columns have been added. The append API only supports appending rows to an existing dataset, not appending columns.
GET /v2/admin/api/datasets/{datasetId}/upload-link HTTP/1.1
Host: api-production.data.gov.sg
Accept: */*

Path parameter: datasetId, e.g. d_263ee33e4d309a0bba75a1fd30ab7035
Header: your admin API key (your-api-key)

Successful response (Created):

{
"url": "https://s3.ap-southeast-1.amazonaws.com/attachments.data.gov.sg/c/6047/private/d_2a6070fa301695904e1a626434189e59/VW50aXRsZWQgRGF0YXNldA/d_2a6070fa301695904e1a626434189e59.csv?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=ASIAU7LWPY2WH4V4RGEJ%2F20231129%2Fap-southeast-1%2Fs3%2Faws4_request&X-Amz-Date=20231129T053810Z&X-Amz-Expires=3600&X-Amz-Security-Token=IQoJb3JpZ2luX2VjEHYaDmFwLXNvdXRoZWFzdC0xIkYwRAIgdQy7ZfOZaDbQIEsS5VqAd76mSGitIgNSHxm3C6sP9SwCIFS5kaaPMcCYQpVv9UdgGzwLZyujx1S844oiCHZxT3yhKqcDCM%2F%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEQBBoMMzQyMjM1MjY4NzgwIgxKiGm5wh31GOszXoIq%2BwLwXu11Ykcov8rg20fsos4%2FcGfR8JtKuUvTtjKC4IBP46v5FzBKPIlMT%2ByfrMvOYZV4kqSiN5%2F%2FeOuGMwk6tfNGY0PphSWfTnWCEVT5wap%2BFRsW7OTPFy9G3ih0X1h7NDYpK%2F1Jkh9pM3%2FyW4x7FQUnXaptNP3k%2FWArdBG3kIvH2kaOiBU9TlqvRhP0xd7dEXoqARY%2BZiOIFBoA2WknNPxQU82mzavOFj9hmeSyi5rvsFa%2BG47m02g6F46vIicnDzQSDgO7UAusH9pQPO40jXvspYAYknW7BQ6EnMcUL61fUWPCaS8zaLdvRNnb6gisasL8RC6gJsJeAPLRhijUxD8KsVDWtt3ZBho7YR5tQbdg%2FAIVmPL4xQe%2B05N9dHt5wG9ws8A2789F3SgF0yZW9tzc5IfNIxsixotOrLkZRUBXmzujAVe51sQM98dx93Y2%2B%2F9mGbQY7700ShGjAyErYvdl2CYJutJ9jI1yswaJkFyR8MkxmSf26vUY7Of0MMCcm6sGOp4BNi7cp9OFj6enS%2Fn7jhlhdFbkn4zBa%2F0e4ywK05Vqzr7mRguUmMawsUh9i%2Booc%2Bnn90PA7C6YvzZ3IBxyJLWxmzkRDpdlThbkQwxpREJ2c%2FzaPNPdUTrDvp28qAPV3bUe%2FWJTKY7Lh72U2AIRFlKUoTCpQ6ihvgIZH1GGPJsomcajXaguK%2F5yLAqFxY%2B7org%2FJlFwMXfZUSzuObK%2BHsE%3D&X-Amz-Signature=c3444a1b8e9d73d8ac7fd64da4a963860a3e9353ae956f08687a78640dedb834&X-Amz-SignedHeaders=host&x-id=PutObject"
}
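The upload-link request above can be scripted. Below is a minimal sketch using Python `requests`; the `x-api-key` header name is an assumption, so confirm it against the raw API schema:

```python
import requests

BASE_URL = "https://api-production.data.gov.sg/v2/admin/api"

def get_upload_link(dataset_id: str, api_key: str) -> str:
    """Fetch a pre-signed S3 URL for a full dataset replacement."""
    resp = requests.get(
        f"{BASE_URL}/datasets/{dataset_id}/upload-link",
        headers={"x-api-key": api_key},  # header name is an assumption
    )
    resp.raise_for_status()
    return resp.json()["url"]

# Usage: url = get_upload_link("d_263ee33e4d309a0bba75a1fd30ab7035", "your-api-key")
```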
GET /v2/admin/api/datasets/{datasetId}/append-link HTTP/1.1
Host: api-production.data.gov.sg
Accept: */*

Path parameter: datasetId, e.g. d_263ee33e4d309a0bba75a1fd30ab7035
Optional parameter: MD5 hash of the file to be uploaded. The MD5 hash should be encoded in base64, and the base64 hash should then be URI-encoded, e.g. FkXeisMYBP6UpdSiY8kOsA%3D%3D
Header: your admin API key (your-api-key)

Successful response (Created):

{
"url": "https://s3.ap-southeast-1.amazonaws.com/attachments.data.gov.sg/c/6047/private/d_2a6070fa301695904e1a626434189e59/VW50aXRsZWQgRGF0YXNldA/d_2a6070fa301695904e1a626434189e59.csv?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=ASIAU7LWPY2WH4V4RGEJ%2F20231129%2Fap-southeast-1%2Fs3%2Faws4_request&X-Amz-Date=20231129T053810Z&X-Amz-Expires=3600&X-Amz-Security-Token=IQoJb3JpZ2luX2VjEHYaDmFwLXNvdXRoZWFzdC0xIkYwRAIgdQy7ZfOZaDbQIEsS5VqAd76mSGitIgNSHxm3C6sP9SwCIFS5kaaPMcCYQpVv9UdgGzwLZyujx1S844oiCHZxT3yhKqcDCM%2F%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEQBBoMMzQyMjM1MjY4NzgwIgxKiGm5wh31GOszXoIq%2BwLwXu11Ykcov8rg20fsos4%2FcGfR8JtKuUvTtjKC4IBP46v5FzBKPIlMT%2ByfrMvOYZV4kqSiN5%2F%2FeOuGMwk6tfNGY0PphSWfTnWCEVT5wap%2BFRsW7OTPFy9G3ih0X1h7NDYpK%2F1Jkh9pM3%2FyW4x7FQUnXaptNP3k%2FWArdBG3kIvH2kaOiBU9TlqvRhP0xd7dEXoqARY%2BZiOIFBoA2WknNPxQU82mzavOFj9hmeSyi5rvsFa%2BG47m02g6F46vIicnDzQSDgO7UAusH9pQPO40jXvspYAYknW7BQ6EnMcUL61fUWPCaS8zaLdvRNnb6gisasL8RC6gJsJeAPLRhijUxD8KsVDWtt3ZBho7YR5tQbdg%2FAIVmPL4xQe%2B05N9dHt5wG9ws8A2789F3SgF0yZW9tzc5IfNIxsixotOrLkZRUBXmzujAVe51sQM98dx93Y2%2B%2F9mGbQY7700ShGjAyErYvdl2CYJutJ9jI1yswaJkFyR8MkxmSf26vUY7Of0MMCcm6sGOp4BNi7cp9OFj6enS%2Fn7jhlhdFbkn4zBa%2F0e4ywK05Vqzr7mRguUmMawsUh9i%2Booc%2Bnn90PA7C6YvzZ3IBxyJLWxmzkRDpdlThbkQwxpREJ2c%2FzaPNPdUTrDvp28qAPV3bUe%2FWJTKY7Lh72U2AIRFlKUoTCpQ6ihvgIZH1GGPJsomcajXaguK%2F5yLAqFxY%2B7org%2FJlFwMXfZUSzuObK%2BHsE%3D&X-Amz-Signature=c3444a1b8e9d73d8ac7fd64da4a963860a3e9353ae956f08687a78640dedb834&X-Amz-SignedHeaders=host&x-id=PutObject"
}
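The append-link request, including the base64-then-URI-encoded MD5 value, can be sketched as follows. Both the `x-api-key` header name and the `md5` query parameter name are assumptions (the parameter name is not shown above); confirm both against the raw API schema:

```python
import base64
import hashlib
from urllib.parse import quote

import requests

BASE_URL = "https://api-production.data.gov.sg/v2/admin/api"

def encoded_md5(content: bytes) -> str:
    """base64-encode the MD5 digest of the file, then URI-encode the result."""
    digest = base64.b64encode(hashlib.md5(content).digest()).decode()
    return quote(digest, safe="")

def get_append_link(dataset_id: str, api_key: str, csv_bytes: bytes) -> str:
    """Fetch a pre-signed S3 URL for appending rows to an existing dataset."""
    # The value is already URI-encoded, so build the query string directly
    # rather than passing params= (which would encode it a second time).
    url = f"{BASE_URL}/datasets/{dataset_id}/append-link?md5={encoded_md5(csv_bytes)}"
    resp = requests.get(url, headers={"x-api-key": api_key})  # header name assumed
    resp.raise_for_status()
    return resp.json()["url"]
```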
Step 5 - Upload Tabular File:
You'll need to make a direct PUT request to the S3 pre-signed URL returned in the response of the previous step to upload your CSV file.
The URL should start with https://s3.ap-southeast-1.amazonaws.com/
For example: https://s3.ap-southeast-1.amazonaws.com/attachments.data.gov.sg/c/6047/private/d_2a6070fa301695904e1a626434189e59/VW50aXRsZWQgRGF0YXNldA/d_2a6070fa301695904e1a626434189e59.csv?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=ASIAU7LWPY2WH4V4RGEJ%2F20231129%2Fap-southeast-1%2Fs3%2Faws4_request&X-Amz-Date=20231129T053810Z&X-Amz-Expires=3600&...
Javascript

// Upload CSV file to the pre-signed URL
const uploadCsv = async (presignedUrl, csvContent) => {
  const response = await fetch(presignedUrl, {
    method: 'PUT',
    headers: {
      'Content-Type': 'text/csv'
    },
    body: csvContent
  });
  if (response.ok) {
    console.log('Upload successful!');
  } else {
    console.error('Upload failed:', response.statusText);
  }
};

// Usage
const csvData = `name,title,email
janedoe,SWE,[email protected]
johnsmith,PM,[email protected]`;
await uploadCsv(presignedUrl, csvData);
Python

import requests

# Upload CSV file to the pre-signed URL
def upload_csv(presigned_url, csv_content):
    headers = {'Content-Type': 'text/csv'}
    response = requests.put(presigned_url, headers=headers, data=csv_content)
    response.raise_for_status()
    return response

# Usage
csv_data = """name,title,email
janedoe,SWE,[email protected]
johnsmith,PM,[email protected]"""
upload_response = upload_csv(presigned_url, csv_data)
print("Upload successful!" if upload_response.status_code == 200 else "Upload failed")
Common issues:
Check that you're using a PUT request
Check that you're using the generated S3 pre-signed URL, not the example one above
Set the header Content-Type: text/csv when uploading the CSV file
The pre-signed URLs expire after 1 hour, so you'd need to generate them again after that
Step 6 - Get Latest Ingestion Status:
GET /v2/admin/api/datasets/{datasetId}/ingestion-status HTTP/1.1
Host: api-production.data.gov.sg
Accept: */*

Path parameter: datasetId, e.g. d_263ee33e4d309a0bba75a1fd30ab7035
Header: your admin API key (your-api-key)

Successful response:

{
"version": 0,
"userId": 1,
"updatedAt": 1699953464415,
"status": "NA",
"failures": null
}

The validation and ingestion process of data.gov.sg is asynchronous: the upload request from the previous step returns before the process is complete. To get the validation and ingestion status, you need to poll this endpoint.

Legend

Order | Ingestion status | Description
---|---|---
1 | Pending | The dataset validation and ingestion process has been started.
2 | Pending Validation | The dataset has been queued for validation.
2.1 | Validation Failed | The dataset contains invalid values causing validation failure. Please review the reported errors, make the necessary changes and try again.
2.2 | Validation Passed | The dataset has been validated successfully, but not yet ingested.
3 | Pending Ingestion | The dataset has been queued for ingestion.
3.1 | Ingestion Failed | The dataset failed to be ingested. Please try again later.
3.2 | Ingestion Success | The dataset has been ingested successfully.
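Manual polling can be sketched as below. This sketch assumes the `x-api-key` header name, and that the `status` field returns the legend labels verbatim (the example response above shows "NA", so verify the actual values against the raw API schema before relying on them):

```python
import time

import requests

BASE_URL = "https://api-production.data.gov.sg/v2/admin/api"
# Terminal statuses, assuming the API returns the legend labels verbatim.
TERMINAL_STATUSES = {"Validation Failed", "Ingestion Failed", "Ingestion Success"}

def poll_ingestion_status(dataset_id: str, api_key: str,
                          interval: float = 5.0, timeout: float = 300.0) -> str:
    """Poll the ingestion-status endpoint until a terminal status or timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        resp = requests.get(
            f"{BASE_URL}/datasets/{dataset_id}/ingestion-status",
            headers={"x-api-key": api_key},  # header name is an assumption
        )
        resp.raise_for_status()
        status = resp.json()["status"]
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(interval)  # wait before re-checking the asynchronous process
    raise TimeoutError(f"Ingestion of {dataset_id} still pending after {timeout}s")
```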
Common questions:
What happens when there are concurrent writes?
If an existing dataset upload has not completed, subsequent dataset uploads will automatically fail. This prevents multiple writes from happening at the same time, which would cause unexpected behaviour.
What happens if there is one or more rows that fail validation?
We currently take the conservative approach and will reject the dataset upload if any of the rows fail validation. In these cases, we recommend fixing the dataset before retrying.
The API will not return detailed explanations on the validation error. For further debugging and investigation, please try uploading the dataset via the UI, which will show more explicit error messages to help with debugging.
What is the dataset upload limit?
5GB
Raw API Schema:
For further queries feel free to reach out to us: Contact the Data.gov.sg team