[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["難以理解","hardToUnderstand","thumb-down"],["資訊或程式碼範例有誤","incorrectInformationOrSampleCode","thumb-down"],["缺少我需要的資訊/範例","missingTheInformationSamplesINeed","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-09-04 (世界標準時間)。"],[[["\u003cp\u003eDataplex offers templates, powered by Dataflow, for various data processing tasks, including data ingestion.\u003c/p\u003e\n"],["\u003cp\u003eThe JDBC ingestion template in Dataplex facilitates copying data from a relational database into a specified Dataplex asset, which can be either Cloud Storage or BigQuery.\u003c/p\u003e\n"],["\u003cp\u003eThis template uses JDBC for database connectivity, and it supports optional encryption of sensitive parameters like username, password, and connection strings using Cloud KMS.\u003c/p\u003e\n"],["\u003cp\u003eThe template transparently handles Cloud Storage and BigQuery asset types, and data on Cloud Storage assets are Hive-style partitioned and made available as a table in Data Catalog, BigQuery, or an attached Dataproc Metastore instance.\u003c/p\u003e\n"],["\u003cp\u003eRunning the template can be done in three different ways: the Google Cloud console, gcloud command-line interface, and REST API, with various parameters to configure the data transfer.\u003c/p\u003e\n"]]],[],null,["# Ingest using templates\n\nDataplex Universal Catalog provides templates (powered by Dataflow)\nto perform common data processing tasks like data ingestion, processing, and\nmanaging data lifecycle. This guide describes how to configure and run a\ntemplate that ingests data using a JDBC connection.\n\nBefore you begin\n----------------\n\nDataplex Universal Catalog task templates are powered by Dataflow.\nBefore you use templates, enable the Dataflow APIs.\n\n[Enable the Dataflow APIs](https://console.cloud.google.com/apis/api/dataflow.googleapis.com/overview)\n| **Note:** All templates support common [Dataflow pipeline parameters](/dataflow/docs/reference/pipeline-options).\n\nTemplate: Ingest data into Dataplex Universal Catalog using a JDBC connection\n-----------------------------------------------------------------------------\n\nThe Dataplex Universal Catalog JDBC ingestion template copies data from a relational\ndatabase into a Dataplex Universal Catalog asset target. The Dataplex Universal Catalog\nasset can be a Cloud Storage asset or a BigQuery asset.\n\nThis pipeline uses JDBC to connect to the relational database. For an extra\nlayer of protection, you can also pass in a Cloud KMS key along with a\nBase64-encoded username, password, and connection string parameters encrypted\nwith the Cloud KMS key.\n\nThe template transparently handles the different asset types. Data stored on the\nCloud Storage asset is Hive-style partitioned and Dataplex Universal Catalog\n[Discovery](/dataplex/docs/discover-data) makes it automatically available as a\ntable in Data Catalog ([Deprecated](/data-catalog/docs/deprecations)),\nBigQuery (external table),\nor an attached Dataproc Metastore instance.\n\n### Template parameters\n\n### Run the template\n\n### Console\n\n1. In the Google Cloud console, go to the Dataplex Universal Catalog page:\n\n [Go to Dataplex Universal Catalog](https://console.cloud.google.com/dataplex/lakes)\n2. Navigate to the **Process** view.\n\n3. Click **Create Task**.\n\n4. Under **Ingest JDBC to Dataplex** , click **Create task**.\n\n5. 
### REST API

Submit an HTTP POST request:

```
POST https://dataflow.googleapis.com/v1b3/projects/PROJECT_ID/locations/REGION_NAME/flexTemplates:launch
{
  "launch_parameter": {
    "jobName": "JOB_NAME",
    "parameters": {
      "driverJars": "DRIVER_JARS",
      "connectionUrl": "CONNECTION_URL",
      "driverClassName": "DRIVER_CLASS_NAME",
      "connectionProperties": "CONNECTION_PROPERTIES",
      "query": "QUERY",
      "outputAsset": "OUTPUT_ASSET"
    },
    "containerSpecGcsPath": "gs://dataflow-templates-REGION_NAME/latest/flex/Dataplex_JDBC_Ingestion_Preview"
  }
}
```

Replace the following:

```
PROJECT_ID: your template project ID
REGION_NAME: region in which to run the job
JOB_NAME: a job name of your choice
DRIVER_JARS: path to your JDBC drivers
CONNECTION_URL: your JDBC connection URL string
DRIVER_CLASS_NAME: your JDBC driver class name
CONNECTION_PROPERTIES: your JDBC connection property string
QUERY: your JDBC source SQL query
OUTPUT_ASSET: your Dataplex Universal Catalog output asset ID
```

What's next
-----------

- Learn how to [manage your lake](/dataplex/docs/manage-lake).
- Learn how to [manage your zones](/dataplex/docs/manage-zone).
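For reference, one way to submit the REST API request shown earlier is with curl. This sketch assumes the request body is saved locally as launch-request.json (a hypothetical file name) and that gcloud can mint an access token for your account:

```
# Submit the flex template launch request; PROJECT_ID and REGION_NAME
# are the same placeholders as in the request body above.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d @launch-request.json \
  "https://dataflow.googleapis.com/v1b3/projects/PROJECT_ID/locations/REGION_NAME/flexTemplates:launch"
```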