BigData / Apache Airflow Interview Questions
What is an Airflow Dataset and how does data-driven scheduling work?
Datasets (introduced in Airflow 2.4) are logical references to data assets identified by a URI. A producer task declares a Dataset as an outlet; when that task completes successfully, Airflow marks the Dataset as updated, and any DAG scheduled on that Dataset is triggered automatically.
from airflow.datasets import Dataset
from airflow.decorators import dag, task
import pendulum

my_dataset = Dataset('s3://my-bucket/output/daily.csv')

# Producer DAG: a successful run of write_data marks my_dataset as updated
@dag(schedule='@daily', start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def producer():
    @task(outlets=[my_dataset])
    def write_data():
        ...  # write to s3://my-bucket/output/daily.csv
    write_data()
producer()

# Consumer DAG: triggered whenever my_dataset is updated
@dag(schedule=[my_dataset], start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def consumer():
    @task
    def read_data():
        ...  # read the freshly updated file
    read_data()
consumer()
This replaces fragile time-based scheduling with event-driven, data-dependency-aware scheduling.
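A consumer can also be scheduled on several Datasets at once: with a list schedule, the DAG runs only after every listed Dataset has been updated since its last run, which expresses multi-source dependencies directly. Here is a minimal sketch, assuming Airflow 2.4+ and the same TaskFlow style; the URIs and the join_report DAG are hypothetical, for illustration only:

from airflow.datasets import Dataset
from airflow.decorators import dag
import pendulum

# Hypothetical Datasets, each produced by a separate upstream DAG
orders = Dataset('s3://my-bucket/output/orders.csv')
users = Dataset('s3://my-bucket/output/users.csv')

# Runs only once BOTH orders and users have been updated since the last run
@dag(schedule=[orders, users], start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def join_report():
    ...  # join the two inputs and publish a report
join_report()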
