Spark - Installation


About

Spark is agnostic to the underlying cluster manager.

The installation procedure therefore depends on the cluster manager you choose.

Installation Types / Cluster Managers

Docker
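
For illustration, a minimal sketch of running Spark in a container. It assumes the apache/spark image published on Docker Hub and its default installation path /opt/spark; neither is stated on this page, so adjust to the image you actually use.

   # Assumption: the apache/spark image from Docker Hub
   docker pull apache/spark
   # Start an interactive Spark shell inside the container
   # (Spark is installed under /opt/spark in this image)
   docker run -it apache/spark /opt/spark/bin/spark-shell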

Configuration

See Spark - Configuration

HDFS

To enable HDFS support, set HADOOP_CONF_DIR in SPARK_HOME/conf/spark-env.sh to the directory that contains the Hadoop configuration files.
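
A minimal sketch of such an entry in SPARK_HOME/conf/spark-env.sh, assuming the Hadoop client configuration files (core-site.xml, hdfs-site.xml) live under /etc/hadoop/conf, which is an example path, not one given on this page:

   # SPARK_HOME/conf/spark-env.sh
   # Point Spark at the directory holding core-site.xml and hdfs-site.xml
   # Assumption: Hadoop configuration is located under /etc/hadoop/conf
   export HADOOP_CONF_DIR=/etc/hadoop/conf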





