Airflow parallelism

parallelism: not a very descriptive name. The description says it sets the maximum number of task instances for the Airflow installation, which is a bit ambiguous: if I have two hosts running Airflow workers, I have Airflow installed on two hosts, so that should be two installations, but based on context ‘per installation’ here means ‘per …
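For reference, the setting lives in the `[core]` section of `airflow.cfg`. A sketch (the values shown are Airflow’s historical defaults, and some option names were renamed in later 2.x releases, so check your version’s configuration reference):

```ini
[core]
# Maximum number of task instances that can run concurrently
# across the whole deployment (scheduler-wide, not per worker host).
parallelism = 32

# Related knobs that are easy to confuse with parallelism:
# max task instances allowed to run concurrently per DAG
dag_concurrency = 16
# max active (running) DAG runs per DAG
max_active_runs_per_dag = 16
```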

How do I restart the Airflow webserver?

I advise running Airflow in a robust way, with auto-recovery via systemd, so you can do: systemctl start airflow to start, systemctl stop airflow to stop, and systemctl restart airflow to restart. For this you’ll need a systemd ‘unit’ file. As a (working) example you can use the following; put it in /lib/systemd/system/airflow.service …
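A minimal unit file along those lines might look like this (a sketch: the `airflow` user, the virtualenv path, and the `AIRFLOW_HOME` location are assumptions — adjust them to your deployment):

```ini
[Unit]
Description=Airflow webserver daemon
After=network.target

[Service]
# Assumed: a dedicated 'airflow' user and a virtualenv at /opt/airflow/venv
User=airflow
Group=airflow
Environment=AIRFLOW_HOME=/opt/airflow
ExecStart=/opt/airflow/venv/bin/airflow webserver
# Restart automatically if the webserver crashes
Restart=on-failure
RestartSec=5s

[Install]
WantedBy=multi-user.target
```

After saving the file, run systemctl daemon-reload once so systemd picks up the new unit, and systemctl enable airflow if you want it started on boot.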

Writing to Airflow Logs

You can import the logging module into your code and write to the task logs that way: import logging, then logging.info('Hello'). Here are the other severity levels: logging.debug('This is a debug message'), logging.info('This is an info message'), logging.warning('This is a warning message'), logging.error('This is an error message'), logging.critical('This is a critical message').
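In a DAG this usually means logging from inside a task callable, since Airflow routes the standard logging output into the task’s log file (a sketch; `my_task_callable` and its wiring into a PythonOperator are illustrative, not from the original answer):

```python
import logging

# Airflow attaches its own handlers to the logging hierarchy, so a
# module-level logger is enough for messages to appear in the task log UI.
log = logging.getLogger(__name__)

def my_task_callable(**context):
    # 'context' stands in for the kwargs Airflow passes to the callable
    log.info("Task started")
    log.warning("Something looks off, but continuing")
    return "done"
```

The return value is only there to make the sketch testable; what matters is that plain `logging` calls need no Airflow-specific setup.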

Error while installing airflow: By default one of Airflow’s dependencies installs a GPL

Try the following: export AIRFLOW_GPL_UNIDECODE=yes or export SLUGIFY_USES_TEXT_UNIDECODE=yes. Using export makes the environment variable available to all subprocesses. Also, make sure you are using pip install apache-airflow[postgres] and not pip install airflow[postgres]. Which should you use: if using AIRFLOW_GPL_UNIDECODE, airflow will install a dependency that is under a GPL license, which means you won’t be …
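Put together, the fix looks like this shell session (a sketch; pick one of the two variables, the install line is shown commented so you can review it before running):

```shell
# Choose the permissively licensed slug library instead of the GPL one
export SLUGIFY_USES_TEXT_UNIDECODE=yes

# export makes the variable visible to subprocesses such as pip
sh -c 'echo "$SLUGIFY_USES_TEXT_UNIDECODE"'

# Then install with the correct package name (apache-airflow, not airflow):
# pip install 'apache-airflow[postgres]'
```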

Error!: SQLSTATE[HY000] [1045] Access denied for user 'divattrend_liink'@'localhost' (using password: YES)