Core Spark functionality. org.apache.spark.SparkContext serves as the main entry point to
Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection,
and provides most parallel operations.
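As a minimal sketch of that entry point (the app name and local master URL here are illustrative assumptions, not part of this package's documentation):

    import org.apache.spark.{SparkConf, SparkContext}

    object SparkContextExample {
      def main(args: Array[String]): Unit = {
        // App name and master URL are assumptions for a local standalone run.
        val conf = new SparkConf().setAppName("example").setMaster("local[*]")
        val sc   = new SparkContext(conf)

        // Distribute a local collection as an RDD and run a parallel operation on it.
        val doubled = sc.parallelize(1 to 100).map(_ * 2)
        println(doubled.reduce(_ + _))   // 10100

        sc.stop()
      }
    }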
In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs
of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions
contains operations available only on RDDs of Doubles; and
org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can
be saved as SequenceFiles. These operations are automatically available on any RDD of the right
type (e.g. RDD[(Int, Int)]) through implicit conversions.
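For example, a sketch of how those implicitly added operations look in practice (assuming the SparkContext sc from the sketch above):

    import org.apache.spark.rdd.RDD

    // groupByKey and join come from PairRDDFunctions, available implicitly
    // because the element type is a key-value pair.
    val pairs: RDD[(Int, Int)] = sc.parallelize(Seq((1, 2), (1, 3), (2, 4)))
    val grouped = pairs.groupByKey()   // RDD[(Int, Iterable[Int])]
    val other   = sc.parallelize(Seq((1, "a"), (2, "b")))
    val joined  = pairs.join(other)    // RDD[(Int, (Int, String))]

    // mean comes from DoubleRDDFunctions, available because the elements are Doubles.
    val doubles: RDD[Double] = sc.parallelize(Seq(1.0, 2.0, 3.0))
    println(doubles.mean())            // 2.0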
Java programmers should reference the org.apache.spark.api.java package
for Spark programming APIs in Java.
Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.
Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower-level interfaces. These are subject to change or removal in minor releases.