I'm running Airflow on Amazon Web Services using EC2 instances. The issue is that the average utilization of the instances is about 2%. I would like a scalable architecture that creates instances only for the duration of a job and then kills them. Is it possible to use AWS Batch as an executor for all Airflow jobs?

1 Answer


Airflow ships with several executors: the SequentialExecutor, LocalExecutor, DaskExecutor, CeleryExecutor, and MesosExecutor, and AIRFLOW-1899, targeted for 2.0, is working to introduce a KubernetesExecutor. None of these runs on AWS Batch. Dask and Celery don't seem to support a mode where their workers are created per task. Mesos might, and Kubernetes should, but even then you would have to scale the worker cluster yourself to account for turning off the nodes when they are not needed.
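As a workaround, you can keep a lightweight executor (say the LocalExecutor) for scheduling and offload the actual work to AWS Batch one task at a time. Here is a minimal sketch assuming Airflow 1.10's contrib AWSBatchOperator; the DAG name, job name, job definition, and job queue are hypothetical placeholders for resources you would create in AWS Batch yourself:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.awsbatch_operator import AWSBatchOperator

# Scheduling stays on the always-on Airflow instance; the actual work runs
# in a throwaway Batch container that AWS starts and stops per task.
dag = DAG(
    dag_id="batch_offload_example",  # hypothetical DAG name
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
)

heavy_task = AWSBatchOperator(
    task_id="run_heavy_job",
    job_name="my-batch-job",              # placeholders: register these in AWS Batch
    job_definition="my-job-definition",
    job_queue="my-job-queue",
    overrides={},                         # optional container overrides (command, vcpus, ...)
    region_name="us-east-1",
    dag=dag,
)

With this pattern the always-on instance only runs the scheduler and webserver, while AWS Batch launches and terminates the compute instances for the duration of each job, which gives you the per-job scaling you asked about.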

