Spark - Overview

Create a SparkConf. The master URL local[2] means run in local mode with 2 worker threads.

val conf = new SparkConf().setAppName("myAppName").setMaster("local[2]")

Create SparkContext

val sc = new SparkContext(conf)

Create SQLContext

val sqlContext = new SQLContext(sc)

Load a data file; each element of the resulting RDD is one line of the file.

val distFile = sc.textFile("src/main/resources/Titanic/train.csv")

Print some info about the RDD
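A few common inspection actions (a sketch; assumes the distFile RDD defined above):

```scala
println(distFile.count())          // number of lines in the file
println(distFile.first())          // first line (the CSV header)
distFile.take(5).foreach(println)  // print the first 5 lines
```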



Print all the lines. Use .foreach (an action) instead of .map: .map is a transformation, so it is evaluated lazily and does nothing until an action runs.
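For example (a sketch, reusing distFile from above):

```scala
// .foreach is an action: it forces evaluation and applies the function
// to each element. Note that on a cluster, println output goes to the
// executors' logs; in local mode it appears on the driver's console.
distFile.foreach(println)
```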


To join the elements of a collection into one string, use .mkString

records.foreach(row => println(row.mkString(",")))
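A standalone illustration of .mkString on plain Scala collections (the values here are made up, not from the Titanic file):

```scala
// mkString joins the elements with the given separator
val fields = Array("892", "3", "Kelly", "male")
val line = fields.mkString(",")
println(line)  // prints "892,3,Kelly,male"
```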

Write to a file (using the Java I/O API)

import java.io.FileOutputStream

val writer = new FileOutputStream("path/to/file.csv")
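A fuller sketch of writing rows out as CSV lines. PrintWriter is used here instead of a raw FileOutputStream for convenient line-oriented output; `rows` and the output path are stand-in values:

```scala
import java.io.PrintWriter

val rows = Seq(Seq("1", "Alice"), Seq("2", "Bob"))  // stand-in data
val writer = new PrintWriter("out.csv")
try {
  rows.foreach(row => writer.println(row.mkString(",")))
} finally {
  writer.close()  // always close, even if a write fails
}
```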

sbt: %% automatically appends the Scala binary version to the artifact name

libraryDependencies += "com.databricks" %% "spark-csv" % "1.0.3"

which, when building with Scala 2.11, is equivalent to

libraryDependencies += "com.databricks" % "spark-csv_2.11" % "1.0.3"
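With that dependency on the classpath, a CSV can be loaded into a DataFrame through the spark-csv data source (a sketch; assumes Spark 1.4+ and the sqlContext created earlier):

```scala
val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")       // treat the first line as a header
  .option("inferSchema", "true")  // attempt to infer column types
  .load("src/main/resources/Titanic/train.csv")
```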

Spark RDD Cache vs Checkpoint

  • cache: store the RDD in memory (by default); the lineage is kept, so lost partitions may need to be recomputed after a worker failure
  • checkpoint: save the RDD to external storage (disk, e.g. HDFS) and truncate the lineage, so no recomputation is needed
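A minimal sketch of both (the checkpoint path is a placeholder; assumes the sc and distFile defined earlier):

```scala
val records = distFile.map(_.split(","))

// cache: keep in memory; lineage is retained, so lost partitions
// can be recomputed from the source file after a worker failure
records.cache()

// checkpoint: persist to reliable external storage and truncate the
// lineage; a checkpoint directory must be set first
sc.setCheckpointDir("path/to/checkpoint-dir")
records.checkpoint()

records.count()  // an action triggers both the caching and the checkpoint
```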