
Airflow kubernetes executor pod template

The Kubernetes executor runs each task instance in its own pod on a Kubernetes cluster. The KubernetesExecutor itself runs as a process in the Airflow Scheduler; the scheduler does not need to be running on Kubernetes, but it does need access to a Kubernetes cluster. The KubernetesExecutor also requires a non-sqlite database as its backend.

When a DAG submits a task, the KubernetesExecutor requests a worker pod from the Kubernetes API. The worker pod then runs the task, reports the result, and terminates.

Consistent with the regular Airflow architecture, the Workers need access to the DAG files in order to execute the tasks within those DAGs and to interact with the Metadata repository. One example of an Airflow deployment running on a distributed set of five nodes in a Kubernetes cluster is shown below.

Additionally, the Kubernetes Executor enables specification of additional features on a per-task basis using the Executor config. Configuration information specific to the Kubernetes Executor, such as the worker namespace and image information, needs to be specified in the Airflow configuration file.

You can also create a custom pod_template_file on a per-task basis so that you can recycle the same base values between multiple tasks. This will replace the default pod_template_file named in airflow.cfg, and that template can in turn be overridden on a per-task basis using pod_override. For example:

apiVersion: v1
kind: Pod
metadata:
  name: dummy-name
spec:
  containers:
    - args: []
      command: []
      env:
        - name: AIRFLOW__CORE__EXECUTOR
          value: LocalExecutor
        # Hard Coded Airflow Envs
        - name: AIRFLOW__CORE__FERNET_KEY
          valueFrom:
            secretKeyRef:
              name: RELEASE-NAME-fernet-key
              key: fernet-key
        - name: AIRFLOW__CORE__SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: RELEASE-NAME-airflow-metadata
              key: connection
        - name: AIRFLOW_CONN_AIRFLOW_DB
          valueFrom:
            secretKeyRef:
              name: RELEASE-NAME-airflow-metadata
              key: connection
      envFrom: []
      image: dummy_image
      imagePullPolicy: IfNotPresent
      name: base
      ports: []
      volumeMounts:
        - mountPath: "/opt/airflow/logs"
          name: airflow-logs
        - mountPath: /opt/airflow/dags
          name: airflow-dags
          readOnly: false
        - mountPath: /opt/airflow/dags
          name: airflow-dags
          readOnly: true
          subPath: repo/tests/dags
  hostNetwork: false
  restartPolicy: Never
  securityContext:
    runAsUser: 50000
    fsGroup: 50000
  nodeSelector: {}
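A note on the environment variables that appear in the pod template: Airflow maps configuration options from airflow.cfg to environment variables using the AIRFLOW__{SECTION}__{KEY} naming convention, with double underscores separating the parts. A minimal sketch of that mapping (the helper name here is hypothetical, not an Airflow API):

```python
def airflow_env_var(section: str, key: str) -> str:
    """Build the environment variable name Airflow reads for a config option.

    Airflow maps `key` under `[section]` in airflow.cfg to
    AIRFLOW__SECTION__KEY (double underscores between the parts).
    """
    return f"AIRFLOW__{section.upper()}__{key.upper()}"

# The settings injected into the worker pod template:
print(airflow_env_var("core", "executor"))    # AIRFLOW__CORE__EXECUTOR
print(airflow_env_var("core", "fernet_key"))  # AIRFLOW__CORE__FERNET_KEY
```

Any variable following this convention that is set on the worker pod overrides the corresponding value in airflow.cfg, which is how the template hands the fernet key and metadata connection to each worker.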
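The executor-specific settings mentioned above live in the Kubernetes executor section of airflow.cfg (named [kubernetes_executor] in recent Airflow releases, [kubernetes] in older ones). A sketch of what that might look like; the namespace, image repository, tag, and template path below are placeholder values, not defaults:

```ini
[kubernetes_executor]
namespace = airflow
worker_container_repository = apache/airflow
worker_container_tag = 2.7.0
pod_template_file = /opt/airflow/pod_templates/pod_template.yaml
```

The pod_template_file option points at the base template that every worker pod starts from; per-task Executor config then refines it.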

