## Custom Airflow plugins

Airflow allows for custom user-created plugins, which are typically found in the `$AIRFLOW_HOME/plugins` folder. Documentation on plugins can be found here.

In order to incorporate plugins into your Docker container:

- Create the plugins folder `plugins/` with your custom plugins.
- Mount the folder as a volume by doing either of the following:
  - Include the folder as a volume on the command line: `-v $(pwd)/plugins/:/usr/local/airflow/plugins`
  - Use `docker-compose-LocalExecutor.yml` or `docker-compose-CeleryExecutor.yml`, which contain support for adding the plugins folder as a volume.

## Install custom Python packages

- Create a file `requirements.txt` with the desired Python modules.
- Mount this file as a volume: `-v $(pwd)/requirements.txt:/requirements.txt` (or add it as a volume in the docker-compose file).
- The `entrypoint.sh` script executes the `pip install` command (with the `--user` option).

To open a shell or an IPython session inside the image:

```
docker run --rm -ti puckel/docker-airflow bash
docker run --rm -ti puckel/docker-airflow ipython
```

## Simplified SQL database configuration using PostgreSQL

If the executor type is set to anything other than SequentialExecutor, you'll need an SQL database. The compose files define a set of PostgreSQL configuration variables with default values; if you don't provide them explicitly, the `AIRFLOW__CORE__SQL_ALCHEMY_CONN` and `AIRFLOW__CELERY__RESULT_BACKEND` variables are set for you when needed. You can also use those variables to adapt your compose file to match an existing PostgreSQL instance managed elsewhere.

Please refer to the Airflow documentation to understand the use of extras parameters. When specifying the connection as a URI (in an `AIRFLOW_CONN_*` variable), you should specify it following the standard syntax of DB connections, where extras are passed as parameters of the URI (note that all components of the URI should be URL-encoded). This will work for hooks etc., but the connection won't show up in the "Ad-hoc Query" section unless an (empty) connection is also created in the DB.
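The URL-encoding requirement can be sketched with the Python standard library. This is a minimal example, not part of Airflow itself; the credentials, host, and extras below are made-up values for illustration only:

```python
from urllib.parse import quote, urlencode

# Hypothetical credentials -- every URI component must be URL-encoded,
# which matters when passwords contain characters like '@' or '/'.
user = "airflow"
password = "p@ss/word"
host = "postgres"
port = 5432
database = "airflow"
# Extras are passed as query parameters of the URI.
extras = {"sslmode": "require", "connect_timeout": "5"}

uri = "postgresql://{}:{}@{}:{}/{}?{}".format(
    quote(user, safe=""),
    quote(password, safe=""),
    host,
    port,
    quote(database, safe=""),
    urlencode(extras),
)
print(uri)
# postgresql://airflow:p%40ss%2Fword@postgres:5432/airflow?sslmode=require&connect_timeout=5
```

The resulting string is the kind of value you would export as an `AIRFLOW_CONN_*` environment variable.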
## Configuring Airflow

It's possible to set any configuration value for Airflow from environment variables, which are used over values from the `airflow.cfg`. The general rule is that the environment variable should be named `AIRFLOW__<SECTION>__<KEY>`; for example, `AIRFLOW__CORE__SQL_ALCHEMY_CONN` sets the `sql_alchemy_conn` config option in the `[core]` section. Check out the Airflow documentation for more details.

You can also define connections via environment variables by prefixing them with `AIRFLOW_CONN_` - for example, `AIRFLOW_CONN_POSTGRES_MASTER` for a connection called "postgres_master".

To generate a Fernet key:

```
docker run puckel/docker-airflow python -c "from cryptography.fernet import Fernet; FERNET_KEY = Fernet.generate_key().decode(); print(FERNET_KEY)"
```
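The naming rule above can be sketched as a small helper. Note that `airflow_env_var` and `resolve_option` are illustrative names invented for this example, not part of Airflow's API:

```python
import os

def airflow_env_var(section: str, key: str) -> str:
    """Build the environment-variable name for a config option,
    following the AIRFLOW__<SECTION>__<KEY> convention."""
    return "AIRFLOW__{}__{}".format(section.upper(), key.upper())

def resolve_option(section: str, key: str, cfg_value: str) -> str:
    """Environment variables take precedence over airflow.cfg values."""
    return os.environ.get(airflow_env_var(section, key), cfg_value)

print(airflow_env_var("core", "sql_alchemy_conn"))
# AIRFLOW__CORE__SQL_ALCHEMY_CONN
```

With `AIRFLOW__CORE__PARALLELISM=8` exported in the container, `resolve_option("core", "parallelism", "32")` would return `"8"`; without it, the `airflow.cfg` value `"32"` wins.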