Mapping Data Flows (MDF) are executable activities within Azure Data Factory that allow data engineers to build ETL flows without writing code. The visual designer is intuitive to work with and lets you debug your flow interactively. If you have worked with SSIS before, you will notice how similar it is, except that MDF runs in the cloud, with Spark clusters handling the processing behind the scenes for more resilient, scalable execution. This section demonstrates how to build an ETL flow for large-scale data with Azure MDF: how to source data, how to transform it, and finally, how to upsert the result into a SQL database.
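Before walking through the visual designer, it can help to see what a data flow amounts to conceptually. The sketch below is not a Mapping Data Flow itself (those are built visually and executed by ADF-managed Spark clusters); it is a minimal, hypothetical PySpark illustration of the same source → transform → upsert pattern, with the upsert implemented as a staging-table load followed by a T-SQL MERGE. All paths, table names, and connection settings are placeholder assumptions, not values from this walkthrough.

```python
# Conceptual sketch of the ETL pattern a Mapping Data Flow implements:
# source -> derived-column transform -> upsert into an Azure SQL table.
# All names and credentials below are hypothetical placeholders.

import pyodbc
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mdf-style-etl").getOrCreate()

# --- Source: read raw CSV files from a data lake folder (hypothetical path) ---
raw = spark.read.option("header", True).csv(
    "abfss://raw@mydatalake.dfs.core.windows.net/sales/"
)

# --- Transform: cast types, derive a total-amount column, drop duplicates ---
transformed = (
    raw.withColumn("Quantity", F.col("Quantity").cast("int"))
       .withColumn("UnitPrice", F.col("UnitPrice").cast("decimal(10,2)"))
       .withColumn("TotalAmount", F.col("Quantity") * F.col("UnitPrice"))
       .dropDuplicates(["OrderId"])
)

# --- Load: overwrite a staging table via JDBC, then MERGE into the target ---
jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=SalesDb"
(transformed.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.Sales_Staging")
    .option("user", "etl_user")
    .option("password", "<secret>")
    .mode("overwrite")
    .save())

# The MERGE statement performs the upsert: update matching rows, insert new ones.
merge_sql = """
MERGE dbo.Sales AS target
USING dbo.Sales_Staging AS source
    ON target.OrderId = source.OrderId
WHEN MATCHED THEN
    UPDATE SET target.Quantity = source.Quantity,
               target.UnitPrice = source.UnitPrice,
               target.TotalAmount = source.TotalAmount
WHEN NOT MATCHED THEN
    INSERT (OrderId, Quantity, UnitPrice, TotalAmount)
    VALUES (source.OrderId, source.Quantity, source.UnitPrice, source.TotalAmount);
"""

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=SalesDb;"
    "UID=etl_user;PWD=<secret>"
)
conn.cursor().execute(merge_sql)
conn.commit()
conn.close()
```

In a Mapping Data Flow you achieve the same result without any of this code: a Source transformation replaces the read, a Derived Column transformation replaces the column logic, and an Alter Row transformation combined with a Sink configured for upsert replaces the staging-and-MERGE step.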