Mirror of https://github.com/jupyter/docker-stacks.git, synced 2025-10-14 21:42:57 +00:00
Update specifics.md
@@ -42,7 +42,7 @@ ipython profile create
 You can build a `pyspark-notebook` image with a different `Spark` version by overriding the default value of the following arguments at build time.
 `all-spark-notebook` is inherited from `pyspark-notebook`, so you have to first build `pyspark-notebook` and then `all-spark-notebook` to get the same version in `all-spark-notebook`.

-- Spark distribution is defined by the combination of Spark, Hadoop and Scala versions and verified by the package checksum,
+- Spark distribution is defined by the combination of Spark, Hadoop, and Scala versions and verified by the package checksum,
   see [Download Apache Spark](https://spark.apache.org/downloads.html) and the [archive repo](https://archive.apache.org/dist/spark/) for more information.

 - `spark_version`: The Spark version to install (`3.3.0`).
@@ -55,7 +55,7 @@ You can build a `pyspark-notebook` image with a different `Spark` version by ove

 - Starting with _Spark >= 3.2_, the distribution file might contain Scala version.

-For example, here is how to build a `pyspark-notebook` image with Spark `3.2.0`, Hadoop `3.2` and OpenJDK `11`.
+For example, here is how to build a `pyspark-notebook` image with Spark `3.2.0`, Hadoop `3.2`, and OpenJDK `11`.

 ```{warning}
 This recipe is not tested and might be broken.
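The hunks above only touch prose, but the surrounding section describes overriding build arguments at build time. A minimal sketch of such an invocation, assuming the Dockerfile accepts `spark_version`, `hadoop_version`, and `openjdk_version` as `ARG`s (only `spark_version` is named in the text shown here; the other two names and the `my-pyspark-notebook` tag are assumptions to verify against the Dockerfile):

```shell
#!/bin/sh
# Hypothetical build command for a custom Spark distribution.
# `spark_version` appears in the docs above; `hadoop_version` and
# `openjdk_version` are assumed ARG names -- check the Dockerfile first.
# The command is assembled into the positional parameters and printed
# for review rather than executed directly.
set -- docker build --rm \
  --build-arg spark_version=3.2.0 \
  --build-arg hadoop_version=3.2 \
  --build-arg openjdk_version=11 \
  -t my-pyspark-notebook ./pyspark-notebook

echo "$@"
```

Dropping the `echo` and running `"$@"` directly would perform the build; per the warning in the diff, the recipe itself is untested and may need adjusting for a given Spark/Hadoop/Scala combination.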