:: DeveloperApi :: Information about an org.apache.spark.Accumulable modified during a task or stage.
Once this is JSON serialized, the types of update and value will be lost and cast to strings. This is because the user can define an accumulator of any type, and it would be difficult to preserve the type in consumers of the event log. This does not apply to internal accumulators that represent task-level metrics.
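A minimal sketch of reading these values from a listener, assuming the Spark 2.x AccumulableInfo API where name, update and value are Options; once the same event is written to the JSON event log, update and value appear only as strings:

    import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}

    // Prints the accumulables reported for each completed stage.
    class AccumulablePrinter extends SparkListener {
      override def onStageCompleted(stageCompleted: SparkListenerStageCompleted): Unit = {
        stageCompleted.stageInfo.accumulables.values.foreach { info =>
          // name, update and value are Options in recent Spark versions
          println(s"accumulable ${info.name.getOrElse("<unnamed>")}: " +
            s"update=${info.update.getOrElse("-")}, value=${info.value.getOrElse("-")}")
        }
      }
    }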
:: DeveloperApi :: Parses and holds information about inputFormat (and files) specified as a parameter.
:: DeveloperApi :: A result of a job in the DAGScheduler.
:: DeveloperApi :: A default implementation for SparkListenerInterface that has no-op implementations for all callbacks.
Note that this is an internal interface which might change in different Spark releases.
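A minimal sketch of the intended usage: because every callback has a no-op default, a custom listener only overrides the events it cares about (registration via SparkContext.addSparkListener is assumed here):

    import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}

    // Only the callbacks of interest are overridden; everything else keeps
    // the no-op default inherited from SparkListener.
    class JobLoggingListener extends SparkListener {
      override def onJobStart(jobStart: SparkListenerJobStart): Unit =
        println(s"Job ${jobStart.jobId} started with ${jobStart.stageInfos.size} stages")

      override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
        println(s"Job ${jobEnd.jobId} finished: ${jobEnd.jobResult}")
    }

    // Registration, given an existing SparkContext `sc`:
    // sc.addSparkListener(new JobLoggingListener())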
Periodic updates from executors.
execId: executor id
accumUpdates: sequence of (taskId, stageId, stageAttemptId, accumUpdates)
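A minimal sketch of consuming these periodic updates in a listener; the tuple shape below mirrors the (taskId, stageId, stageAttemptId, accumUpdates) description above:

    import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorMetricsUpdate}

    class HeartbeatListener extends SparkListener {
      override def onExecutorMetricsUpdate(
          update: SparkListenerExecutorMetricsUpdate): Unit = {
        // Each entry describes one running task attempt on the reporting executor.
        update.accumUpdates.foreach { case (taskId, stageId, stageAttemptId, accums) =>
          println(s"executor ${update.execId}: task $taskId " +
            s"(stage $stageId.$stageAttemptId) sent ${accums.size} accumulator updates")
        }
      }
    }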
:: DeveloperApi :: Stores information about a stage to pass from the scheduler to SparkListeners.
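A minimal sketch of how a listener might read a few commonly used StageInfo fields; stageId, name, numTasks and the optional submission/completion times are assumed from the public API:

    import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}

    class StageTimingListener extends SparkListener {
      override def onStageCompleted(event: SparkListenerStageCompleted): Unit = {
        val info = event.stageInfo
        // Both timestamps are Options; the duration is only defined when both exist.
        val durationMs = for {
          start <- info.submissionTime
          end   <- info.completionTime
        } yield end - start
        println(s"Stage ${info.stageId} '${info.name}' (${info.numTasks} tasks) " +
          s"took ${durationMs.getOrElse(-1L)} ms")
      }
    }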
:: DeveloperApi :: Simple SparkListener that logs a few summary statistics when each stage completes.
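A minimal sketch of enabling it, either through the spark.extraListeners property or programmatically:

    import org.apache.spark.{SparkConf, SparkContext}

    // Attach the listener via configuration so stage summary statistics are
    // logged as each stage completes.
    val conf = new SparkConf()
      .setAppName("stats-report-demo")
      .set("spark.extraListeners", "org.apache.spark.scheduler.StatsReportListener")
    val sc = new SparkContext(conf)

    // Equivalent programmatic registration:
    // sc.addSparkListener(new org.apache.spark.scheduler.StatsReportListener())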
:: DeveloperApi :: Information about a running task attempt inside a TaskSet.
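A minimal sketch of inspecting TaskInfo from a task-end event; taskId, attemptNumber, host, executorId, duration and status are assumed from the public API:

    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

    class TaskReporter extends SparkListener {
      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        val info = taskEnd.taskInfo
        // By the time onTaskEnd fires the attempt has finished, so duration is defined.
        println(s"Task ${info.taskId} (attempt ${info.attemptNumber}) on " +
          s"${info.host}/${info.executorId} finished in ${info.duration} ms, " +
          s"status=${info.status}")
      }
    }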
A collection of deprecated constructors. This will be removed soon.
"FAIR" and "FIFO" determines which policy is used to order tasks amongst a Schedulable's sub-queues "NONE" is used when the a Schedulable has no sub-queues.
Spark's scheduling components. This includes the org.apache.spark.scheduler.DAGScheduler and the lower-level org.apache.spark.scheduler.TaskScheduler.