[pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci
pre-commit-ci[bot]
2022-07-05 19:42:18 +00:00
parent b96757876e
commit 079849388c

@@ -43,13 +43,14 @@ You can build a `pyspark-notebook` image (and also the downstream `all-spark-not
- The Spark distribution is defined by the combination of Spark, Hadoop, and Scala versions and is verified by the package checksum;
see [Download Apache Spark](https://spark.apache.org/downloads.html) and the [archive repo](https://archive.apache.org/dist/spark/) for more information.
- `spark_version`: The Spark version to install (`3.3.0`).
- `hadoop_version`: The Hadoop version (`3.2`).
- `scala_version`: The Scala version (`2.13`).
- `spark_checksum`: The package checksum (`BFE4540...`).
- `openjdk_version`: The version of the OpenJDK (JRE headless) distribution (`17`).
- This version needs to match the version supported by the Spark distribution used above.
- See [Spark Overview](https://spark.apache.org/docs/latest/#downloading) and [Ubuntu packages](https://packages.ubuntu.com/search?keywords=openjdk).
- Starting with _Spark >= 3.2_, the distribution file contains the Scala version, so building an older Spark version will not work.
- Building an older version requires modifying the Dockerfile or using an older version of the Dockerfile.
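
For example, a build that overrides these arguments might look like the following sketch. The image tag `my-pyspark-notebook`, the build context path `./pyspark-notebook`, and the `<spark-checksum>` placeholder are illustrative assumptions, not values taken from this repository:

```bash
# Sketch: build a custom pyspark-notebook image with explicit version arguments.
# Replace <spark-checksum> with the full checksum from the Apache archive.
docker build --rm --force-rm \
    -t my-pyspark-notebook ./pyspark-notebook \
    --build-arg spark_version=3.3.0 \
    --build-arg hadoop_version=3.2 \
    --build-arg scala_version=2.13 \
    --build-arg spark_checksum=<spark-checksum> \
    --build-arg openjdk_version=17
```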