Accessing configuration parameters passed to Airflow through CLI

This builds on the answer provided by devj.

  1. In airflow.cfg, the following property should be set to True:
    dag_run_conf_overrides_params=True
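
For reference, this option normally lives under the [core] section of airflow.cfg (check the section name in your own configuration file), for example:

    [core]
    dag_run_conf_overrides_params = True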

  2. When defining the PythonOperator, pass the argument provide_context=True. For example:

get_row_count_operator = PythonOperator(task_id='get_row_count', python_callable=do_work, dag=dag, provide_context=True)
  3. Define the Python callable (note the use of **kwargs):
def do_work(**kwargs):    
    table_name = kwargs['dag_run'].conf.get('table_name')    
    # Rest of the code
  4. Invoke the DAG from the command line:
airflow trigger_dag read_hive --conf '{"table_name":"my_table_name"}'
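
Putting the steps together, below is a minimal, self-contained sketch of what the DAG file might look like. It assumes the Airflow 1.10.x import layout (implied by provide_context and the trigger_dag command); the DAG id read_hive matches the trigger command above, while the start_date and the print statement are illustrative assumptions.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator


    def do_work(**kwargs):
        # dag_run.conf holds the JSON dictionary passed via --conf on the CLI.
        table_name = kwargs['dag_run'].conf.get('table_name')
        print('Row count requested for table: {}'.format(table_name))


    dag = DAG(
        dag_id='read_hive',
        start_date=datetime(2019, 1, 1),
        schedule_interval=None,  # run only when triggered manually
    )

    get_row_count_operator = PythonOperator(
        task_id='get_row_count',
        python_callable=do_work,
        provide_context=True,
        dag=dag,
    )

If you are on Airflow 2.x, the equivalent trigger command is airflow dags trigger read_hive --conf '{"table_name":"my_table_name"}', and provide_context is no longer needed because the context is passed to the callable automatically.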

I have found this discussion to be helpful.
