

  1. Modern day workflow management
     BUILDING DATA ENGINEERING PIPELINES IN PYTHON
     Oliver Willekens, Data Engineer at Data Minded

  2. What is a workflow?
     A workflow:
     - a sequence of tasks
     - scheduled at a time or triggered by an event
     Workflows orchestrate data processing pipelines.

  3. Scheduling with cron
     Cron reads “crontab” files:
     - they tabulate tasks to be executed at certain times
     - one task per line

     */15 9-17 * * 1-3,5 log_my_activity
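     For comparison, a second, hypothetical crontab entry (the script path and schedule are invented for illustration) that runs a cleanup script at 02:30 every night:

     # minute hour day-of-month month day-of-week command
     30 2 * * * /usr/local/bin/cleanup_logs.sh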


  4. Scheduling with cron
     */15 9-17 * * 1-3,5 log_my_activity

     Minutes  Hours  Days  Months  Days of the week  Command
     */15     9-17   *     *       1-3,5             log_my_activity

     That is: every 15 minutes between 09:00 and 17:45, Monday through Wednesday and Friday.

     Cron is a dinosaur. Modern workflow managers:
     - Luigi (Spotify, 2011, Python-based)
     - Azkaban (LinkedIn, 2009, Java-based)
     - Airflow (Airbnb, 2015, Python-based)


  5. Apache Airflow fulfills modern engineering needs
     1. Create and visualize complex workflows,
     2. Monitor and log workflows,
     3. Scale horizontally.

  6. The Directed Acyclic Graph (DAG)
     [Diagram: tasks as nodes connected by directed edges, with no cycles.]


  7. The Directed Acyclic Graph in code

     from datetime import datetime  # needed for start_date below
     from airflow import DAG

     my_dag = DAG(
         dag_id="publish_logs",
         schedule_interval="* * * * *",
         start_date=datetime(2010, 1, 1)
     )
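     The schedule_interval above is a cron string that fires every minute. Airflow also accepts preset aliases such as "@daily"; a minimal variation (the dag_id here is made up):

     my_daily_dag = DAG(
         dag_id="publish_logs_daily",     # hypothetical id
         schedule_interval="@daily",      # preset alias: run once a day
         start_date=datetime(2010, 1, 1)  # reuses the imports above
     )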

  8. Classes of operators
     The Airflow task:
     - an instance of an Operator class
     - inherits from BaseOperator -> must implement the execute() method
     - performs a specific action (delegation):
       BashOperator -> run a bash command/script
       PythonOperator -> run a Python script
       SparkSubmitOperator -> submit a Spark job to a cluster

  9. Expressing dependencies between operators

     dag = DAG(…)
     task1 = BashOperator(…)
     task2 = PythonOperator(…)
     task3 = PythonOperator(…)

     task1.set_downstream(task2)
     task3.set_upstream(task2)

     # equivalent, but shorter:
     # task1 >> task2
     # task3 << task2

     # Even clearer:
     # task1 >> task2 >> task3
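     Putting the last three slides together, a minimal runnable sketch of a two-task DAG (the dag_id, task ids, bash command, and callable are invented for illustration):

     from datetime import datetime
     from airflow import DAG
     from airflow.operators.bash_operator import BashOperator
     from airflow.operators.python_operator import PythonOperator

     def report():
         # hypothetical callable for the final task
         print("pipeline finished")

     dag = DAG(
         dag_id="example_pipeline",
         schedule_interval="@daily",
         start_date=datetime(2010, 1, 1)
     )

     extract = BashOperator(task_id="extract", dag=dag,
                            bash_command="echo extracting")
     publish = PythonOperator(task_id="publish", dag=dag,
                              python_callable=report)

     extract >> publish  # publish runs only after extract succeeds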

 10. Let’s practice!

 11. Building a data pipeline with Airflow

 12. Airflow’s BashOperator
     Executes bash commands. Airflow adds logging, retry options, and metrics over running these yourself.

     from airflow.operators.bash_operator import BashOperator

     bash_task = BashOperator(
         task_id='greet_world',
         dag=dag,
         bash_command='echo "Hello, world!"'
     )
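     To make the retry claim concrete: every operator inherits retry parameters from BaseOperator. A sketch (the task id, command, and retry counts are arbitrary; assumes a dag object as above):

     from datetime import timedelta

     flaky_task = BashOperator(
         task_id='fetch_remote_file',       # hypothetical task
         bash_command='curl -fO https://example.com/data.csv',
         retries=2,                         # retry up to twice on failure
         retry_delay=timedelta(minutes=5),  # wait between attempts
         dag=dag
     )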

 13. Airflow’s PythonOperator
     Executes Python callables.

     from airflow.operators.python_operator import PythonOperator
     from my_library import my_magic_function

     python_task = PythonOperator(
         dag=dag,
         task_id='perform_magic',
         python_callable=my_magic_function,
         op_kwargs={"snowflake": "*", "amount": 42}
     )
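     The op_kwargs dictionary is passed to the callable as keyword arguments. A sketch of what my_magic_function could look like (its body is invented; only the signature is implied by the slide):

     def my_magic_function(snowflake, amount):
         # called with snowflake="*" and amount=42 from op_kwargs
         print(snowflake * amount)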

 14. Running PySpark from Airflow

     BashOperator:

     spark_master = (
         "spark://"
         "spark_standalone_cluster_ip"
         ":7077")

     command = (
         "spark-submit "
         "--master {master} "
         "--py-files package1.zip "
         "/path/to/app.py"
     ).format(master=spark_master)

     BashOperator(bash_command=command, …)

     SSHOperator:

     from airflow.contrib.operators\
         .ssh_operator import SSHOperator

     task = SSHOperator(
         task_id='ssh_spark_submit',
         dag=dag,
         command=command,
         ssh_conn_id='spark_master_ssh'
     )

 15. Running PySpark from Airflow

     SparkSubmitOperator:

     from airflow.contrib.operators\
         .spark_submit_operator \
         import SparkSubmitOperator

     spark_task = SparkSubmitOperator(
         task_id='spark_submit_id',
         dag=dag,
         application="/path/to/app.py",
         py_files="package1.zip",
         conn_id='spark_default'
     )

     SSHOperator:

     from airflow.contrib.operators\
         .ssh_operator import SSHOperator

     task = SSHOperator(
         task_id='ssh_spark_submit',
         dag=dag,
         command=command,
         ssh_conn_id='spark_master_ssh'
     )

 16. Let’s practice!

 17. Deploying Airflow

 18. Installing and configuring Airflow

     export AIRFLOW_HOME=~/airflow
     pip install apache-airflow
     airflow initdb

     In airflow.cfg:

     [core]
     # lots of other configuration settings
     # …
     # The executor class that airflow should use.
     # Choices include SequentialExecutor, LocalExecutor,
     # CeleryExecutor, DaskExecutor, KubernetesExecutor
     executor = SequentialExecutor
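     Once the metadata database is initialized, you would typically start the web UI and the scheduler:

     airflow webserver  # serves the UI, on port 8080 by default
     airflow scheduler  # triggers tasks whose schedule is due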

 19. Setting up for production
     - dags: place to store the DAGs (configurable)
     - tests: unit test the possible deployment, possibly ensure consistency across DAGs
     - plugins: store custom operators and hooks
     - connections, pools, variables: provide a location for various configuration files you can import into Airflow
     A sketch of such a repository layout follows below.
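     A sketch of how such a deployment repository could be laid out (the file names are hypothetical; only the directory roles come from the slide):

     airflow-project/
     ├── dags/             # DAG definition files
     ├── plugins/          # custom operators and hooks
     ├── tests/            # unit tests, consistency checks
     ├── connections.yml   # hypothetical: connections to import
     ├── pools.yml         # hypothetical: pool definitions
     └── variables.json    # hypothetical: variables to import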

 20. Example Airflow deployment test

     from airflow.models import DagBag

     def test_dagbag_import():
         """Verify that Airflow will be able to import all DAGs in the repository."""
         dagbag = DagBag()
         number_of_failures = len(dagbag.import_errors)
         assert number_of_failures == 0, \
             "There should be no DAG failures. Got: %s" % dagbag.import_errors
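     In the same spirit, a sketch of a consistency check across DAGs (the rule enforced here, that every DAG defines at least one task, is just an example):

     from airflow.models import DagBag

     def test_dags_are_not_empty():
         """Example consistency rule: every DAG defines at least one task."""
         dagbag = DagBag()
         for dag_id, dag in dagbag.dags.items():
             assert len(dag.tasks) > 0, "DAG %s has no tasks." % dag_id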

 21. Transferring DAGs and plugins
     [Diagram-only slides in the original deck.]


 22. Let’s practice!

 23. Final thoughts

 24. What you learned
     - Define the purpose of the components of data platforms
     - Write an ingestion pipeline using Singer
     - Create and deploy pipelines for big data in Spark
     - Configure automated testing using CircleCI
     - Manage and deploy a full data pipeline with Airflow

 25. Additional resources
     External resources:
     - Singer: https://www.singer.io/
     - Apache Spark: https://spark.apache.org/
     - Pytest: https://pytest.org/en/latest/
     - Flake8: http://flake8.pycqa.org/en/latest/
     - Circle CI: https://circleci.com/
     - Apache Airflow: https://airflow.apache.org/
     DataCamp courses:
     - Software engineering: https://www.datacamp.com/courses/software-engineering-for-data-scientists-in-python
     - Spark: https://www.datacamp.com/courses/cleaning-data-with-apache-spark-in-python (and other courses)
     - Unit testing: link yet to be revealed

 26. Congratulations!
