Redivis API

DataSources: patch


Overview

This endpoint is used for updating a dataSource in a workflow, such as pointing it at a different source dataset, workflow, or version, or remapping its tables.

HTTP Request

PATCH /api/v1/workflows/:workflowReference/dataSources/:dataSourceReference

Path parameters

workflowReference
A qualified reference to the workflow. See referencing resources for more information.

dataSourceReference
A qualified reference to the dataSource's current source dataset or workflow. Alternatively, the id of the dataSource can be provided.

This endpoint extends the general API structure.
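
For illustration, a concrete request path might be built as in the minimal sketch below. The workflow and dataSource references are hypothetical, and the https://redivis.com host is an assumption; see referencing resources for the exact reference formats.

# Hypothetical qualified references; substitute your own identifiers.
workflow_reference = "demo_user.example_workflow"
data_source_reference = "demo_organization.example_dataset"

url = (
    "https://redivis.com/api/v1/workflows/"
    f"{workflow_reference}/dataSources/{data_source_reference}"
)
print(url)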

Request body

Provide a JSON object with information about the dataSource.

sourceDataset (string)
A qualified reference to the (new) source dataset for this dataSource. Include version / sample information in the reference to update the dataSource's version or sample status. Either a sourceDataset or sourceWorkflow must be provided, but not both.

sourceWorkflow (string)
A qualified reference to the (new) source workflow for this dataSource. Either a sourceDataset or sourceWorkflow must be provided, but not both.

mappedTables (object)
An object whose keys represent the current tables that are referenced on the dataSource, and whose values represent the corresponding tables in the new source dataset or workflow. E.g., { "old_table:xje9": "new_table:a842" }. Reference id suffixes are optional, but take precedence over names.

In many circumstances, tables can be automatically mapped based on their name and identifier, and this parameter isn't needed. However, if a previously referenced table doesn't exist under the new data source – e.g., a table is deleted in a subsequent version – you'll need to provide this mapping so that any downstream notebooks or transforms can be updated. It is only necessary to map tables that are referenced by a downstream node, not all tables associated with the dataSource.
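
A minimal sketch of a full request is shown below, using the third-party requests library. The identifiers are hypothetical, the base URL assumes the API is served from https://redivis.com, and the REDIVIS_API_TOKEN environment variable is just one way to supply an access token.

import os
import requests

# Hypothetical identifiers; substitute your own workflow and dataSource references.
workflow_reference = "demo_user.example_workflow"
data_source_reference = "demo_organization.example_dataset"

url = (
    "https://redivis.com/api/v1/workflows/"
    f"{workflow_reference}/dataSources/{data_source_reference}"
)

payload = {
    # Point the dataSource at a (new) source dataset. Version / sample information
    # can be included in the reference (see referencing resources).
    "sourceDataset": "demo_organization.example_dataset",
    # Map a previously referenced table to its counterpart in the new source.
    # Reference id suffixes are optional, but take precedence over names.
    "mappedTables": {"old_table:xje9": "new_table:a842"},
}

response = requests.patch(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {os.environ['REDIVIS_API_TOKEN']}"},
)
print(response.status_code)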

Authorization

Edit access to the corresponding workflow is required. Your access token must have the following scope:

  • workflow.write

Learn more about authorization.

Response body

Returns the JSON-encoded "get" representation of a dataSource resource.
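
Continuing the sketch above, the returned representation can be read from the response body; its fields are described in the dataSource resource definition.

# `response` is the result of the requests.patch call sketched above.
response.raise_for_status()      # raise if the server returned an error status
data_source = response.json()    # the "get" representation of the dataSource
print(data_source)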
