
Overview

The upload resource is used for adding data content to a table. You may only create uploads on tables that belong to unreleased versions. You may create one or more uploads per table, and their records will be "stacked" based on common variable names across the uploads.
There are several mechanisms for uploading content through this endpoint. In general, if your file is larger than 10MB, or if you are on an unreliable internet connection, it is highly recommended to use resumable uploads. If you are using a Redivis client library to perform your uploads, this is handled for you.

Simple uploads

Simple uploads should be used for smaller files that follow standard conventions. Provide the file name (and optionally, type) through query parameters, and the file's content in the request body.
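As a sketch of a simple upload: the file name goes in a query parameter and the raw file bytes form the request body. The host, table reference, and token below are placeholder assumptions; only the /api/v1 path comes from this page.

```python
import urllib.parse
import urllib.request

BASE_URL = "https://redivis.com/api/v1"  # placeholder host; adjust as needed

def build_simple_upload(table_ref: str, data: bytes, name: str, token: str):
    """Build (but don't send) a simple upload request: the file name is
    passed as a query parameter, the raw file content as the body."""
    query = urllib.parse.urlencode({"name": name})
    url = f"{BASE_URL}/tables/{urllib.parse.quote(table_ref)}/uploads?{query}"
    return urllib.request.Request(
        url,
        data=data,  # file content as the raw request body
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
```

The returned request object can then be sent with `urllib.request.urlopen` (or the equivalent in your HTTP client of choice).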

Multipart uploads

If a file is still small, but you need to provide additional metadata, you may send a multipart request body. The first part must be JSON-encoded and contain the upload's metadata, and the second part should contain the file's content.
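A hedged sketch of assembling such a two-part body by hand. The multipart flavor, boundary handling, and per-part content types shown here are illustrative assumptions, not taken from this page:

```python
import json
import uuid

def build_multipart_body(metadata: dict, file_bytes: bytes):
    """Encode a two-part body: part 1 is the JSON-encoded upload
    metadata, part 2 is the raw file content. (Sketch only; check the
    API's expected multipart conventions.)"""
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\nContent-Type: application/json\r\n\r\n"
        f"{json.dumps(metadata)}\r\n"
        f"--{boundary}\r\nContent-Type: application/octet-stream\r\n\r\n"
    ).encode() + file_bytes + f"\r\n--{boundary}--\r\n".encode()
    content_type = f"multipart/related; boundary={boundary}"
    return content_type, body
```

The returned content type (with its boundary) would be sent as the request's Content-Type header.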

Resumable uploads

For larger files or less reliable network connections, it is recommended to use resumable uploads for better fault tolerance. To perform a resumable upload, first create the upload via this method (optionally providing a JSON-encoded request body with the upload metadata), and then send the data in chunks via upload.put.
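The chunking step can be sketched as follows. The Content-Range header format shown is an assumption for illustration; see upload.put for the actual contract:

```python
def iter_chunks(data: bytes, chunk_size: int):
    """Yield (start_offset, chunk) pairs. Each chunk would be sent in a
    separate PUT to the upload's URL, so a failed chunk can be retried
    without restarting the whole transfer."""
    total = len(data)
    for start in range(0, total, chunk_size):
        yield start, data[start : start + chunk_size]

def content_range(start: int, chunk: bytes, total: int) -> str:
    """Assumed range header for one chunk, e.g. 'bytes 0-3/10'."""
    return f"bytes {start}-{start + len(chunk) - 1}/{total}"
```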

HTTP Request

POST /api/v1/tables/:tableReference/uploads

This endpoint extends the general API structure.

Path parameters

Parameter
tableReference
A qualified reference to the table. See referencing resources for more information.

Query parameters

Parameter
name
Required only if no name is provided in the request body. The name of the file being uploaded. Overrides the name in the request body if both are provided.
The file type will be auto-determined based on the ending of this name. If you'd like to manually set the type, provide it via the request body.

Request body

If you are performing a simple upload, provide the upload's data in the request body.
If you are performing a multipart or resumable upload, provide a JSON object with the upload's metadata.
Property name
Type
Description
name
string
Required only if not provided as a query parameter. The name of the file being uploaded. Must be unique (non-word characters ignored) within the version of the dataset.
type
string
Optional. The type of the file. If not provided, it will be auto-determined based on the file ending provided via the upload name. An error will occur if no valid type can be determined from the upload name. Allowed types:
    delimited - Auto-determined for files ending in .csv, .tsv, .psv, .dsv, .txt
    avro
    ndjson
    parquet
    orc
    xls
    xlsx
    dta
    sas7bdat
    sav
skipBadRecords
bool
Optional. Default false. If set to true, the upload will succeed even if some records are invalid / un-parseable. After the upload is completed, the number of skipped records will be available through the skippedBadRecords property on the upload resource.
hasHeaderRow
bool
Optional. Whether or not the file has a header present as the first row. Only applicable for delimited, xls, and xlsx types. Defaults to true.
delimiter
string
Optional. Only applicable for delimited types. The character to use as a field separator in the delimited file. If not set, the delimiter will be auto-determined based on analysis of the file.
quoteCharacter
string
Optional. Only applicable for delimited types. The character used to enclose fields containing the delimiter. Defaults to the double quote character: "
allowQuotedNewLines
bool
Optional. Default false. Only applicable for delimited types. If true, specifies that line breaks are allowed within a quoted field of the delimited file.
Normally, a line break within a field will cause an import error if this field isn't set. Setting this value to true won't cause issues with files that don't have newline characters in fields, though it will substantially increase import processing time and may lead to inaccurate error messages if set unnecessarily.
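To illustrate why this flag exists, here is a small delimited file with a line break inside a quoted field. A parser that honors quoted newlines (such as Python's csv module, shown here) reads it as a single record; without the flag, the embedded newline would be treated as the start of a new, malformed record:

```python
import csv
import io

# The second field of row 1 contains a quoted line break. Importing this
# file would require allowQuotedNewLines: true.
raw = 'id,comment\n1,"first line\nsecond line"\n2,plain\n'
rows = list(csv.reader(io.StringIO(raw)))
```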

For reference, the full JSON schema of the request body:

{
  // Required attributes
  "type": string("delimited", "avro", "ndjson", "parquet", "orc", "xls", "xlsx", "dta", "sas7bdat", "sav"),
  "mergeStrategy": string("replace"|"append"),

  // Optional attributes
  "name": string, // default: "untitled_upload.{type}"
  "hasHeaderRow": bool, // default: true
  "delimiter": string, // default: null (auto-determined)
  "quoteCharacter": string, // default: "
  "skipBadRecords": bool, // default: false
  "allowQuotedNewlines": bool // default: false
}

Authorization

Edit access to the dataset is required. Your access token must have the following scope:

    data.edit

Response body

If successful, returns a JSON representation of an upload resource.

Examples
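A hedged end-to-end sketch of creating an upload with a JSON metadata body, as in the resumable flow. The host, table reference, and token are placeholders; the metadata properties come from the request body schema above:

```python
import json
import urllib.request

BASE_URL = "https://redivis.com/api/v1"  # placeholder host; adjust as needed

def build_create_upload(table_ref: str, metadata: dict, token: str):
    """Build (but don't send) the POST that creates an upload, with the
    upload metadata as a JSON-encoded request body."""
    return urllib.request.Request(
        f"{BASE_URL}/tables/{table_ref}/uploads",
        data=json.dumps(metadata).encode(),
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

request = build_create_upload(
    "demo_user.demo_dataset.demo_table",  # hypothetical table reference
    {
        "name": "records.csv",
        "type": "delimited",
        "mergeStrategy": "replace",
        "skipBadRecords": True,
    },
    "MY_ACCESS_TOKEN",  # placeholder token with the data.edit scope
)
```

After this request succeeds, the file's content would be sent in chunks via upload.put, as described under Resumable uploads.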
