This tutorial gives examples you can use to transform your data with Scala and Spark. The focus is on working with Spark Datasets after reading in your data and before writing it out: the Transform step of
Extract, Transform, Load (ETL).
One of the benefits of writing code with Scala on Spark is that Scala lets you write in either an object-oriented programming (OOP) style or a functional programming (FP) style. This is useful when your team includes Java developers who are only comfortable writing code in an OOP style. However, Spark is a distributed processing engine…
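To make the stylistic choice concrete, here is a minimal sketch (in plain Scala, with hypothetical names like `Order` and `largeOrderIds…`) of the same transformation written both ways. Scala's collection methods (`filter`, `map`) mirror the Spark Dataset API, so the style you pick carries over directly to Spark code.

```scala
// Hypothetical example: one transformation, two styles.
case class Order(id: Int, amount: Double)

object StyleDemo {
  val orders = List(Order(1, 40.0), Order(2, 120.0), Order(3, 75.0))

  // OOP/imperative style: loop and mutate an accumulator.
  def largeOrderIdsImperative(): List[Int] = {
    var result = List.empty[Int]
    for (o <- orders) {
      if (o.amount > 50.0) result = result :+ o.id
    }
    result
  }

  // FP style: compose pure transformations, no mutation.
  def largeOrderIdsFunctional(): List[Int] =
    orders.filter(_.amount > 50.0).map(_.id)

  def main(args: Array[String]): Unit = {
    // Both styles produce the same result: List(2, 3)
    assert(largeOrderIdsImperative() == largeOrderIdsFunctional())
    println(largeOrderIdsFunctional())
  }
}
```

On a Spark `Dataset[Order]`, the FP version would look nearly identical (`ds.filter(_.amount > 50.0).map(_.id)`), which is one reason the FP style tends to fit Spark code naturally.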
Senior Data Engineer working with Scala, Spark, Docker, and Kubernetes at IBM. Currently getting a Math degree at the University of Texas at Austin.