From 079849388c4aa15350d30afa3103b542b8228c71 Mon Sep 17 00:00:00 2001
From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date: Tue, 5 Jul 2022 19:42:18 +0000
Subject: [PATCH] [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci
---
 docs/using/specifics.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/docs/using/specifics.md b/docs/using/specifics.md
index c34cba05..a7f1b6b2 100644
--- a/docs/using/specifics.md
+++ b/docs/using/specifics.md
@@ -43,13 +43,14 @@ You can build a `pyspark-notebook` image (and also the downstream `all-spark-not
 
 - Spark distribution is defined by the combination of Spark, Hadoop and Scala versions and verified by the package checksum,
   see [Download Apache Spark](https://spark.apache.org/downloads.html) and the [archive repo](https://archive.apache.org/dist/spark/)
   for more information.
+- `spark_version`: The Spark version to install (`3.3.0`).
 - `hadoop_version`: The Hadoop version (`3.2`).
 - `scala_version`: The Scala version (`2.13`).
 - `spark_checksum`: The package checksum (`BFE4540...`).
 - `openjdk_version`: The version of the OpenJDK (JRE headless) distribution (`17`).
-    - This version needs to match the version supported by the Spark distribution used above.
-    - See [Spark Overview](https://spark.apache.org/docs/latest/#downloading) and [Ubuntu packages](https://packages.ubuntu.com/search?keywords=openjdk).
+  - This version needs to match the version supported by the Spark distribution used above.
+  - See [Spark Overview](https://spark.apache.org/docs/latest/#downloading) and [Ubuntu packages](https://packages.ubuntu.com/search?keywords=openjdk).
 
 - Starting with _Spark >= 3.2_ the distribution file contains Scala version, hence building older Spark will not work.
 - Building older version requires modification to the Dockerfile or using it's older version of the Dockerfile
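
For context, the build args documented in the patched section are passed to `docker build` along these lines. This is a sketch, not part of the patch: the image tag `my-pyspark-notebook` is an invented example, and the checksum is left truncated exactly as it appears in the docs — the real SHA512 value must be taken from the [archive repo](https://archive.apache.org/dist/spark/).

```shell
# Hypothetical invocation assembling the build args named in the patched docs.
# Assumes it is run from a checkout of the docker-stacks repo; requires a
# running Docker daemon, so it is not executed here.
docker build --rm --force-rm \
    -t my-pyspark-notebook ./pyspark-notebook \
    --build-arg spark_version=3.3.0 \
    --build-arg hadoop_version=3.2 \
    --build-arg scala_version=2.13 \
    --build-arg spark_checksum="BFE4540..." \
    --build-arg openjdk_version=17
```

Note that, per the patched docs, `openjdk_version` must match a JDK supported by the chosen Spark release, and the Spark/Hadoop/Scala versions together determine which distribution file (and thus which checksum) is downloaded.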