mirror of
https://github.com/jupyter/docker-stacks.git
synced 2025-10-13 13:02:56 +00:00
Bunch of small refactorings and updates
@@ -116,7 +116,7 @@ mamba install some-package
### Using alternative channels

Conda is configured by default to use only the [`conda-forge`](https://anaconda.org/conda-forge) channel.
However, alternative channels can be used, either for a single installation by overriding the default channel in the install command, or persistently by configuring `conda` to use different channels.
The examples below show how to use the [anaconda default channels](https://repo.anaconda.com/pkgs/main) instead of `conda-forge` to install packages.
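As a sketch of the one-shot approach in a derived image (the package name `humanize` is only an illustration, not taken from the docs above), a `Dockerfile` line might look like:

```
RUN mamba install --yes --channel defaults humanize && \
    mamba clean --all -f -y
```

The `--channel defaults` flag overrides the configured `conda-forge` channel for this one install only; later installs without the flag fall back to the configured default.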
@@ -130,7 +130,7 @@ RUN $CONDA_DIR/envs/${conda_env}/bin/python -m ipykernel install --user --name=$
fix-permissions /home/$NB_USER

# any additional pip installs can be added by uncommenting the following line
# RUN $CONDA_DIR/envs/${conda_env}/bin/pip install

# prepend conda environment to path
ENV PATH $CONDA_DIR/envs/${conda_env}/bin:$PATH
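The `ENV PATH` line works by plain string prepending, so the custom environment's binaries shadow the base ones during lookup. A minimal shell sketch of the effect (the env name `python310` here is illustrative, not from the snippet above):

```shell
# Prepending the environment's bin directory makes its interpreters
# shadow the base installation during PATH lookup
CONDA_DIR=/opt/conda
conda_env=python310
PATH="$CONDA_DIR/envs/$conda_env/bin:$PATH"
# the first PATH entry is now the custom environment
echo "$PATH" | cut -d: -f1  # /opt/conda/envs/python310/bin
```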
@@ -76,7 +76,7 @@ Executing the command: jupyter notebook
Pressing `Ctrl-C` shuts down the notebook server and immediately destroys the Docker container. Files written to `~/work` in the container remain untouched. Any other changes made in the container are lost.

**Example 3** This command pulls the `jupyter/all-spark-notebook` image currently tagged `latest` from Docker Hub if an image tagged `latest` is not already present on the local host. It then starts a container named `notebook` running a JupyterLab server and exposes the server on a randomly selected port.

```
docker run -d -P --name notebook jupyter/all-spark-notebook
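# afterwards, `docker port notebook 8888` prints the host port Docker selected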
@@ -82,7 +82,7 @@ sc <- sparkR.session("local")
# Sum of the first 100 whole numbers
sdf <- createDataFrame(list(1:100))
dapplyCollect(sdf,
              function(x)
              { x <- sum(x)}
             )
# 5050
@@ -102,7 +102,7 @@ conf$spark.sql.catalogImplementation <- "in-memory"
sc <- spark_connect(master = "local", config = conf)

# Sum of the first 100 whole numbers
sdf_len(sc, 100, repartition = 1) %>%
  spark_apply(function(e) sum(e))
# 5050
```
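Both variants report `5050`, which is just the Gauss sum of the first 100 whole numbers; the expected value can be sanity-checked without Spark:

```python
# Sum of the first 100 whole numbers, as in the Spark examples above
print(sum(range(1, 101)))  # 5050
```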
@@ -171,7 +171,7 @@ sc <- sparkR.session("spark://master:7077")
# Sum of the first 100 whole numbers
sdf <- createDataFrame(list(1:100))
dapplyCollect(sdf,
              function(x)
              { x <- sum(x)}
             )
# 5050
@@ -190,7 +190,7 @@ conf$spark.sql.catalogImplementation <- "in-memory"
sc <- spark_connect(master = "spark://master:7077", config = conf)

# Sum of the first 100 whole numbers
sdf_len(sc, 100, repartition = 1) %>%
  spark_apply(function(e) sum(e))
# 5050
```