# Snowflake

> JDBC Snowflake Sink Connector

### Key Features

* [ ] exactly-once
* [x] cdc

### Description

Writes data through JDBC. Supports batch mode and streaming mode, as well as concurrent writing.

### Supported DataSource List

<table><thead><tr><th>Datasource</th><th>Supported Versions</th><th>Driver</th><th>Url</th><th data-hidden>Maven</th></tr></thead><tbody><tr><td>snowflake</td><td>Different dependency versions use different driver classes.</td><td>net.snowflake.client.jdbc.SnowflakeDriver</td><td>jdbc:snowflake://&#x3C;account_name>.snowflakecomputing.com</td><td><a href="https://mvnrepository.com/artifact/net.snowflake/snowflake-jdbc">Download</a></td></tr></tbody></table>

### Data Type Mapping

| Snowflake Data Type                                                               | Nexus Data Type |
| --------------------------------------------------------------------------------- | --------------- |
| BOOLEAN                                                                           | BOOLEAN         |
| <p>TINYINT<br>SMALLINT<br>BYTEINT<br></p>                                         | SHORT\_TYPE     |
| <p>INT<br>INTEGER<br></p>                                                         | INT             |
| BIGINT                                                                            | LONG            |
| <p>DECIMAL<br>NUMERIC<br>NUMBER<br></p>                                           | DECIMAL(x,y)    |
| DECIMAL(x,y) (when the designated column's specified column size is > 38)        | DECIMAL(38,18)  |
| <p>REAL<br>FLOAT4</p>                                                             | FLOAT           |
| <p>DOUBLE<br>DOUBLE PRECISION<br>FLOAT8<br>FLOAT<br></p>                          | DOUBLE          |
| <p>CHAR<br>CHARACTER<br>VARCHAR<br>STRING<br>TEXT<br>VARIANT<br>OBJECT</p>        | STRING          |
| DATE                                                                              | DATE            |
| TIME                                                                              | TIME            |
| <p>DATETIME<br>TIMESTAMP<br>TIMESTAMP\_LTZ<br>TIMESTAMP\_NTZ<br>TIMESTAMP\_TZ</p> | TIMESTAMP       |
| <p>BINARY<br>VARBINARY<br>GEOGRAPHY<br>GEOMETRY</p>                               | BYTES           |
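As an illustration of the mapping above, a Snowflake table like the following (a hypothetical example, not taken from an actual deployment) would surface in Nexus with the indicated types:

```
-- Hypothetical Snowflake DDL illustrating the type mapping above
CREATE TABLE example_types (
    flag   BOOLEAN,         -- -> BOOLEAN
    qty    SMALLINT,        -- -> SHORT_TYPE
    n      INTEGER,         -- -> INT
    big    BIGINT,          -- -> LONG
    amount NUMBER(10, 2),   -- -> DECIMAL(10,2)
    ratio  FLOAT,           -- -> DOUBLE (Snowflake FLOAT maps to DOUBLE)
    note   VARCHAR,         -- -> STRING
    ts     TIMESTAMP_NTZ    -- -> TIMESTAMP
);
```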

### Options

| Name                                            | Type    | Required | Default | Description                                                                                                                                                                                                                                         |
| ----------------------------------------------- | ------- | -------- | ------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| url                                             | String  | Yes      | -       | The URL of the JDBC connection. For example: jdbc:snowflake://\<account\_name>.snowflakecomputing.com                                                                                                                                                |
| driver                                          | String  | Yes      | -       | <p>The JDBC class name used to connect to the remote data source;<br>for Snowflake the value is <code>net.snowflake.client.jdbc.SnowflakeDriver</code>.</p>                                                                                          |
| user                                            | String  | No       | -       | Connection instance user name                                                                                                                                                                                                                       |
| password                                        | String  | No       | -       | Connection instance password                                                                                                                                                                                                                        |
| query                                           | String  | No       | -       | Use this SQL to write upstream input data to the database, e.g. `INSERT ...`. This option is mutually exclusive with `database`/`table`.                                                                                                             |
| database                                        | String  | No       | -       | <p>Use this <code>database</code> together with <code>table</code> to auto-generate SQL and write upstream input data to the database.<br>This option is mutually exclusive with <code>query</code> and has a higher priority.</p>                   |
| table                                           | String  | No       | -       | <p>Use this <code>table</code> together with <code>database</code> to auto-generate SQL and write upstream input data to the database.<br>This option is mutually exclusive with <code>query</code> and has a higher priority.</p>                   |
| primary\_keys                                   | Array   | No       | -       | This option is used to support operations such as `insert`, `delete`, and `update` when SQL is generated automatically.                                                                                                                             |
| support\_upsert\_by\_query\_primary\_key\_exist | Boolean | No       | false   | Choose whether to process update events (INSERT, UPDATE\_AFTER) with an INSERT or an UPDATE statement based on whether a row with the queried primary key already exists. This configuration is only used when the database does not support upsert syntax. **Note**: this method has low performance. |
| connection\_check\_timeout\_sec                 | Int     | No       | 30      | The time in seconds to wait for the database operation used to validate the connection to complete.                                                                                                                                                 |
| max\_retries                                    | Int     | No       | 0       | The number of retries for a failed batch submission (executeBatch).                                                                                                                                                                                 |
| batch\_size                                     | Int     | No       | 1000    | <p>For batch writing, when the number of buffered records reaches <code>batch\_size</code> or the time reaches <code>checkpoint.interval</code>,<br>the data will be flushed into the database.</p>                                                  |
| max\_commit\_attempts                           | Int     | No       | 3       | The number of retries for transaction commit failures                                                                                                                                                                                               |
| transaction\_timeout\_sec                       | Int     | No       | -1      | <p>The timeout (in seconds) after the transaction is opened; the default is -1 (never time out). Note that setting a timeout may affect<br>exactly-once semantics.</p>                                                                               |
| auto\_commit                                    | Boolean | No       | true    | Automatic transaction commit is enabled by default                                                                                                                                                                                                  |
| properties                                      | Map     | No       | -       | <p>Additional connection configuration parameters. When properties and the URL contain the same parameter, the priority is determined by the<br>specific implementation of the driver. For example, in MySQL, properties take precedence over the URL.</p> |
| common-options                                  |         | No       | -       | Sink plugin common parameters, please refer to [Sink Common Options](/data-integration-with-nexus/nexus-elements/connectors/sink/sink-common-options.md) for details                                                                                |
| enable\_upsert                                  | Boolean | No       | true    | Enable upsert by checking whether the primary key exists. If the task has no duplicate data on the key, setting this parameter to `false` can speed up data import.                                                                                  |
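For instance, the batching and connection options above could be combined in a sink configuration like the following sketch. The account name, credentials, and the `warehouse` property value are placeholders, and the exact set of parameters the Snowflake driver accepts should be checked against its documentation:

```
sink {
    jdbc {
        url = "jdbc:snowflake://<account_name>.snowflakecomputing.com"
        driver = "net.snowflake.client.jdbc.SnowflakeDriver"
        user = "xxx"
        password = "xxx"
        query = "insert into test_table(name,age) values(?,?)"

        # Flush after 500 buffered records instead of the default 1000
        batch_size = 500
        # Retry failed batch submissions up to 3 times
        max_retries = 3
        # Extra driver parameters; priority vs. URL parameters is driver-specific
        properties = {
            warehouse = "COMPUTE_WH"
        }
    }
}
```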

### Tips

> If `partition_column` is not set, the job runs with a single concurrency; if `partition_column` is set, it is executed in parallel according to the task concurrency.

### Task Example

#### Simple

> This example defines a Nexus synchronization task that automatically generates data through FakeSource and sends it to the JDBC sink. FakeSource generates 16 rows in total (row\.num=16), each with two fields, name (string type) and age (int type), so the target table test\_table will also contain 16 rows. Before running this job, you need to create the database test and the table test\_table in your Snowflake account.
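
The database and table referenced above can be created in Snowflake beforehand, for example as follows (the `public` schema is an assumption; adjust it to your own schema):

```
-- Hypothetical DDL for the example job's target table
CREATE DATABASE IF NOT EXISTS test;
CREATE TABLE IF NOT EXISTS test.public.test_table (
    name STRING,
    age  INT
);
```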

```
# Defining the runtime environment
env {
    parallelism = 1
    job.mode = "BATCH"
}
source {
    # This is an example source plugin **only for testing and demonstrating the feature of a source plugin**
    FakeSource {
        parallelism = 1
        result_table_name = "fake"
        row.num = 16
        schema = {
            fields {
                name = "string"
                age = "int"
            }
        }
    }
    # If you would like to get more information about how to configure Nexus and see full list of source plugins,
    # please go to source page
}
transform {
    # If you would like to get more information about how to configure Nexus and see full list of transform plugins,
    # please go to transform page
}
sink {
    jdbc {
        url = "jdbc:snowflake://<account_name>.snowflakecomputing.com"
        driver = "net.snowflake.client.jdbc.SnowflakeDriver"
        user = "root"
        password = "123456"
        query = "insert into test_table(name,age) values(?,?)"
    }
    # If you would like to get more information about how to configure Nexus and see full list of sink plugins,
    # please go to sink page
}
```

#### CDC (Change Data Capture) Event

> CDC change data is also supported. In this case, you need to configure database, table, and primary\_keys.

```
sink {
    jdbc {
        url = "jdbc:snowflake://<account_name>.snowflakecomputing.com"
        driver = "net.snowflake.client.jdbc.SnowflakeDriver"
        user = "root"
        password = "123456"
        generate_sink_sql = true

        # You need to configure both database and table
        database = test
        table = sink_table
        primary_keys = ["id","name"]
    }
}
```
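
A matching target table for this example might look as follows. The column set beyond the two key columns is a hypothetical assumption; also note that Snowflake primary key constraints are informational and not enforced, so the connector relies on the `primary_keys` option above rather than on the constraint itself:

```
-- Hypothetical DDL for the CDC example's target table
CREATE TABLE IF NOT EXISTS test.public.sink_table (
    id   INT,
    name STRING,
    age  INT,            -- assumed extra column; adjust to your schema
    PRIMARY KEY (id, name)
);
```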


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.selfuel.digital/data-integration-with-nexus/nexus-elements/connectors/sink/snowflake.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
