Files (CSV, JSON, Excel, Feather, Parquet) Source Connector

This page contains the setup guide and reference information for the Files source connector.

Prerequisites

  • URL to access the file
  • Format
  • Reader options
  • Storage Providers

Setup guide

For Airbyte Cloud:

Setup through Airbyte Cloud is identical to the open-source setup, except that the Local Filesystem provider is disabled.

For Airbyte Open Source:

  1. Once the File source is selected, define the storage provider, the URL of the file, and its format.
  2. Depending on the chosen provider and the privacy of the data, you may have to configure additional options.

Fields description

  • For Dataset Name, use the name of the final table this file will be replicated into (letters, numbers, dashes and underscores only).
  • For File Format, use the format of the file to be replicated (Warning: some formats may be experimental; please refer to the docs).
  • For Reader Options, use a string in JSON format. The available options depend on the chosen file format and let you tune how the file is loaded. For example, {} for no options, or {"sep": " "} to set the separator to a single space ' '.
  • For URL, use the URL path to access the file to be replicated.
  • For Storage Provider, use the storage provider or location of the file(s) to be replicated:
    • [Default] Public Web
      • User-Agent: enable this option if you want to add a User-Agent header to requests.
    • GCS: Google Cloud Storage
      • Service Account JSON: to access private buckets on Google Cloud Storage, this connector needs service account JSON credentials with the proper permissions, as described here. Generate the credentials.json file and copy/paste its content into this field (JSON format expected). If accessing publicly available data, this field is not necessary.
    • S3: Amazon Web Services
      • AWS Access Key ID: to access private buckets on AWS S3, this connector needs credentials with the proper permissions. If accessing publicly available data, this field is not necessary.
      • AWS Secret Access Key: to access private buckets on AWS S3, this connector needs credentials with the proper permissions. If accessing publicly available data, this field is not necessary.
    • AzBlob: Azure Blob Storage
      • Storage Account: the globally unique name of the storage account that the desired blob sits within. See here for more details.
      • SAS Token: to access Azure Blob Storage, this connector needs credentials with the proper permissions. One option is a SAS (Shared Access Signature) token. If accessing publicly available data, this field is not necessary.
      • Shared Key: to access Azure Blob Storage, this connector needs credentials with the proper permissions. One option is a storage account shared key (also known as an account key or access key). If accessing publicly available data, this field is not necessary.
    • SSH: Secure Shell
      • User: the username to connect with.
      • Password: the password for that user.
      • Host: the host to connect to.
      • Port: the port to use on that host.
    • SCP: Secure copy protocol
      • User: the username to connect with.
      • Password: the password for that user.
      • Host: the host to connect to.
      • Port: the port to use on that host.
    • SFTP: Secure File Transfer Protocol
      • User: the username to connect with.
      • Password: the password for that user.
      • Host: the host to connect to.
      • Port: the port to use on that host.
    • Local Filesystem (limited)
      • Storage WARNING: the local storage URL available for reading must start with the local mount "/local/" at the moment, until more advanced Docker mounting options are implemented.
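
Putting the fields above together, a minimal configuration for reading a public CSV file might look like the sketch below (expressed here as a Python dict). The field keys mirror the descriptions above, but the exact key names and provider sub-keys are illustrative and may differ slightly from the connector's spec.

```python
# Illustrative sketch of a Files source configuration; field keys follow the
# descriptions above but are not guaranteed to match the connector spec exactly.
config = {
    "dataset_name": "my_public_csv",              # destination table name (letters, numbers, -, _)
    "format": "csv",                              # chosen File Format
    "reader_options": '{"sep": ","}',             # JSON string passed to the underlying reader
    "url": "https://example.com/data/file.csv",   # placeholder URL
    "provider": {"storage": "HTTPS"},             # [Default] Public Web provider (assumed key/value)
}
```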

Provider Specific Information

  • In case of Google Drive, it is necessary to use the Download URL; the format is https://drive.google.com/uc?export=download&id=[DRIVE_FILE_ID], where [DRIVE_FILE_ID] is the string found in the Share URL https://drive.google.com/file/d/[DRIVE_FILE_ID]/view?usp=sharing (see the sketch after this list).
  • In case of GCS, it is necessary to provide the content of the service account keyfile to access private buckets. See settings of BigQuery Destination
  • In case of AWS S3, the pair of aws_access_key_id and aws_secret_access_key is necessary to access private S3 buckets.
  • In case of AzBlob, it is necessary to provide the storage_account in which the blob you want to access resides. Either sas_token (info) or shared_key (info) is necessary to access private blobs.
  • In case of a locally stored file on a Windows OS, it's necessary to change the values for LOCAL_ROOT, LOCAL_DOCKER_MOUNT and HACK_LOCAL_ROOT_PARENT in the .env file to an existing absolute path on your machine (colons in the path need to be replaced with a double forward slash, //). LOCAL_ROOT & LOCAL_DOCKER_MOUNT should be the same value, and HACK_LOCAL_ROOT_PARENT should be the parent directory of the other two.
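
For Google Drive specifically, turning a Share URL into the Download URL is mechanical. The hypothetical helper below sketches the transformation; the file ID shown is a placeholder.

```python
# Hypothetical helper: turn a Google Drive Share URL into the Download URL
# format expected by this connector.
def drive_share_to_download_url(share_url: str) -> str:
    # Share URLs look like:
    # https://drive.google.com/file/d/[DRIVE_FILE_ID]/view?usp=sharing
    file_id = share_url.split("/file/d/")[1].split("/")[0]
    return f"https://drive.google.com/uc?export=download&id={file_id}"


# Placeholder file ID, for illustration only.
print(drive_share_to_download_url(
    "https://drive.google.com/file/d/1aBcD3FgHiJ/view?usp=sharing"
))
# -> https://drive.google.com/uc?export=download&id=1aBcD3FgHiJ
```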

Reader Options

The Reader in charge of loading the file format is currently based on Pandas IO Tools. It is possible to customize how the file is loaded into a Pandas DataFrame as part of this Source Connector. This is done through the reader_options, which must be valid JSON and whose available options depend on the chosen file format. See the pandas documentation for the corresponding format:

For example, if the CSV format is selected, then options from the read_csv function are available.

  • It is therefore possible to customize the delimiter (or sep), for instance to \t in the case of tab-separated files.
  • The header line can be ignored with header=0 and column names customized with names.
  • etc.

We would therefore provide the following JSON in reader_options:

{ "sep" : "\t", "header" : 0, "names": ["column1", "column2"]}

If you select the JSON format, then options from the read_json reader are available.

For example, you can use {"orient": "records"} to change how the orientation of the data is interpreted (when the data looks like [{column -> value}, … , {column -> value}]).
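
As a sketch of how the underlying pandas reader interprets this option:

```python
import io

import pandas as pd

# Sketch: {"orient": "records"} maps to pandas.read_json(..., orient="records")
# for data shaped like [{column -> value}, ..., {column -> value}].
records = io.StringIO('[{"id": 1, "name": "foo"}, {"id": 2, "name": "bar"}]')
df = pd.read_json(records, orient="records")
print(df)
#    id name
# 0   1  foo
# 1   2  bar
```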

If you need to read an Excel Binary Workbook, please select the excel_binary format in the File Format dropdown.
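
Under the hood, Excel files are read with pandas' read_excel; for binary workbooks (.xlsb) this typically relies on the pyxlsb engine. A rough sketch, assuming pyxlsb is installed and workbook.xlsb is a placeholder path:

```python
import pandas as pd

# Sketch: reading an Excel Binary Workbook with pandas.
# Assumes the optional pyxlsb dependency is installed; "workbook.xlsb" is a placeholder.
df = pd.read_excel("workbook.xlsb", engine="pyxlsb")
print(df.head())
```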

 

:::warning

This connector does not support syncing unstructured data files such as raw text, audio, or videos.

:::

Supported sync modes

 

:::info

This source produces a single table for the target file, as it currently replicates only one file at a time. Note that you should provide the `dataset_name`, which dictates how the table will be identified in the destination (since the `URL` can contain complex characters).

:::

File / Stream Compression

Storage Providers

File Formats

Changing data types of source columns

Normally, Airbyte tries to infer the data type from the source, but you can use reader_options to force specific data types. If you input {"dtype":"string"}, all columns will be forced to be parsed as strings. If you only want a specific column to be parsed as a string, simply use {"dtype" : {"column name": "string"}}.
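
As a sketch of what this does, forcing a column to the string dtype preserves values that numeric inference would mangle, such as zip codes with leading zeros; the equivalent reader_options would be {"dtype": {"zip": "string"}}.

```python
import io

import pandas as pd

# Without dtype, "00501" would be inferred as the integer 501;
# forcing the column to "string" keeps the leading zeros.
csv_data = "city,zip\nHoltsville,00501\nAdjuntas,00601\n"
df = pd.read_csv(io.StringIO(csv_data), dtype={"zip": "string"})
print(df.dtypes)  # zip is a pandas string dtype, not int64
print(df)
```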

Examples

Here is a list of examples of possible file inputs: