Struggling to conquer Apache Spark?

Learning is hard enough as it is, but bringing in a distributed computing framework written in a sophisticated programming language doesn't make it any easier. While self-study can certainly help, without a good guide things are always harder than they need to be. That's why I created Spark Tutorials: to provide simple, easy-to-follow tutorials that help you get up and running with Apache Spark quickly. You'll learn the foundational abstractions in Apache Spark, from RDDs to DataFrames and MLlib. Start with some of the articles below.

Spark Clusters on AWS EC2 - Reading and Writing S3 Data - Predicting Flight Delays with Spark Part 1

In this tutorial we're going to set up a complete predictive modeling pipeline in Spark using DataFrames, Pipelines, and MLlib. The first part of this tutorial explains some of the basic concepts we'll need to build the model, walks you through downloading the data we'll use, and finally creates our Spark cluster on Amazon AWS so we can read from and write to AWS S3!

Visit Article »

Spark MLlib - Predict Store Sales with ML Pipelines

In this tutorial we're going to build a full-stack machine learning project, going all the way from data manipulation to feature creation and, finally, serving predictions.

Visit Article »

A Simple Scala Spark Project Template and Guide

Thus far I haven't found a good project template for Apache Spark, and getting one right has taken repeated trial and error. In this tutorial, I walk through a simple project template I've created to help others get started with Apache Spark in Scala.

Visit Article »
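A project template like the one described above typically centers on a short build.sbt. Here's a minimal sketch of what that file can look like; the project name and the Spark and Scala versions shown are assumptions for illustration, not the exact ones used in the article:

```scala
// build.sbt -- minimal sbt build for a Scala Spark project
// (name and version numbers below are illustrative assumptions)
name := "spark-example"
version := "0.1.0"
scalaVersion := "2.12.18" // Spark 3.x is published for Scala 2.12/2.13

libraryDependencies ++= Seq(
  // "provided" keeps Spark out of your assembled jar,
  // since the cluster supplies Spark at runtime
  "org.apache.spark" %% "spark-core"  % "3.5.1" % "provided",
  "org.apache.spark" %% "spark-sql"   % "3.5.1" % "provided",
  "org.apache.spark" %% "spark-mllib" % "3.5.1" % "provided"
)
```

With your code under the standard src/main/scala layout alongside this file, `sbt package` produces a jar you can hand to spark-submit.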