
I would like to unload data from a Redshift database to an S3 bucket, which would later be copied into another database. I have written my DAG as below:

from airflow.operators import RedshiftToS3Transfer
from datetime import datetime, timedelta
from airflow import DAG

default_args = {
    'owner': 'me',
    'start_date': datetime.today(),
    'max_active_runs': 1,
}

dag = DAG(
    dag_id='redshift_S3',
    default_args=default_args,
    schedule_interval="@once",
    catchup=False
)

unload_to_S3 = RedshiftToS3Transfer(
    task_id='unload_to_S3',
    schema='schema_name',
    table='table_name',
    s3_bucket='bucket_name',
    s3_key='s3_key',
    redshift_conn_id='redshift',
    aws_conn_id='my_s3_conn',
    dag=dag
)

But I get the error "Broken DAG: cannot import name 'RedshiftToS3Transfer' from 'airflow.operators' (unknown location)". Any idea on how to import RedshiftToS3Transfer would be appreciated.


1 Answer


The right way to import this is

from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
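
For completeness, here is the DAG from the question with only the import line changed, a minimal sketch assuming Airflow 1.10.x and keeping the placeholder schema, table, bucket, key, and connection IDs from the question:

# Corrected import path for Airflow 1.10.x
from airflow.operators.redshift_to_s3_operator import RedshiftToS3Transfer
from datetime import datetime
from airflow import DAG

default_args = {
    'owner': 'me',
    'start_date': datetime.today(),
    'max_active_runs': 1,
}

dag = DAG(
    dag_id='redshift_S3',
    default_args=default_args,
    schedule_interval="@once",
    catchup=False,
)

# UNLOADs the given Redshift table to the given S3 bucket/key
unload_to_S3 = RedshiftToS3Transfer(
    task_id='unload_to_S3',
    schema='schema_name',
    table='table_name',
    s3_bucket='bucket_name',
    s3_key='s3_key',
    redshift_conn_id='redshift',
    aws_conn_id='my_s3_conn',
    dag=dag,
)

Note that on newer Airflow releases (2.x with the Amazon provider package installed), the operator is exposed as RedshiftToS3Operator from airflow.providers.amazon.aws.transfers.redshift_to_s3 instead, so the import path depends on the version you are running.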
