Spark - HDFS


1 - About

This page describes how to configure Spark to read and write data stored in HDFS.


2 - Management

2.1 - Configuration

If you plan to read and write from HDFS using Spark, there are two Hadoop configuration files that should be included on Spark's classpath:

  * hdfs-site.xml, which provides default behaviors for the HDFS client
  * core-site.xml, which sets the default filesystem name

To make these files visible to Spark, set HADOOP_CONF_DIR in $SPARK_HOME/conf/spark-env.sh to a location containing the configuration files.
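
With HADOOP_CONF_DIR set (commonly a directory such as /etc/hadoop/conf, though the location varies per cluster), Spark picks up fs.defaultFS from core-site.xml, so plain paths resolve against HDFS rather than the local filesystem. As a rough illustration, the Scala sketch below reads a text file from HDFS and writes a filtered copy back; the application name and the /user/spark/... paths are hypothetical placeholders, not paths taken from this page.

```scala
import org.apache.spark.sql.SparkSession

object SparkHdfsSketch {

  def main(args: Array[String]): Unit = {
    // HADOOP_CONF_DIR (set in spark-env.sh) makes core-site.xml and
    // hdfs-site.xml visible, so fs.defaultFS points at the cluster's HDFS
    // and the plain paths below resolve against HDFS, not the local disk.
    val spark = SparkSession.builder()
      .appName("hdfs-read-write-sketch")
      .getOrCreate()

    // Hypothetical HDFS paths; replace with paths that exist on your cluster.
    val lines = spark.read.textFile("/user/spark/input/events.log")

    // A trivial transformation, then write the result back to HDFS.
    val nonEmpty = lines.filter(_.nonEmpty)
    nonEmpty.write.text("/user/spark/output/events-nonempty")

    spark.stop()
  }
}
```

If HADOOP_CONF_DIR is not set, HDFS can still be addressed with a fully qualified URI such as hdfs://<namenode-host>:<port>/path, but relying on the cluster configuration files keeps the code portable across environments.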

3 - Documentation / Reference