Apply suggestions from code review

Co-authored-by: Erik Sundell <erik.i.sundell@gmail.com>
commit 12ccbf5149
parent 6ff12c18a9
Author: Ayaz Salikhov
Date: 2022-02-02 12:41:16 +03:00
committed by GitHub
5 changed files with 11 additions and 5 deletions


@@ -5,7 +5,9 @@ The dependencies resolution is a difficult thing to do
 This means that packages might have old versions.
 Images are rebuilt weekly, so usually, packages receive updates quite frequently.
-_Note: We pin major.minor version of python, so it will stay the same even after `mamba update` command._
+```{note}
+We pin major.minor version of python, so it will stay the same even after `mamba update` command.
+```
 ## Outdated packages
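
For context on the note above, the referenced update would typically be run inside an already-running container. A minimal sketch, assuming the stock images where `mamba` is on the `PATH` (the container name is hypothetical):

```bash
# update every package inside a running container;
# python itself stays at its pinned major.minor version
docker exec -it my-notebook mamba update --all --yes
```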


@@ -18,7 +18,7 @@ defined in the [conftest.py](https://github.com/jupyter/docker-stacks/blob/maste
 If you want to simply run a python script in one of our images, you could add a unit test.
 Simply create `<somestack>-notebook/test/units/` directory, if it doesn't already exist and put your file there.
 These file will run automatically when tests are run.
-You could see an example for tensorflow package [here](https://github.com/jupyter/docker-stacks/blob/master/tensorflow-notebook/test/units/unit_tensorflow.py).
+You could see an example for tensorflow package [here](https://github.com/jupyter/docker-stacks/blob/HEAD/tensorflow-notebook/test/units/unit_tensorflow.py).
 ## Contributing New Tests


@@ -22,7 +22,9 @@ Pushing `Run Workflow` button will trigger this process.
 ## Adding a New Core Image to Docker Hub
-_Note: in general, we do not add new core images and ask contributors to either create a [recipe](../using/recipes.md) or [community stack](../using/stacks.md)_
+```{note}
+In general, we do not add new core images and ask contributors to either create a [recipe](../using/recipes.md) or [community stack](../using/stacks.md).
+```
 When there's a new stack definition, do the following before merging the PR with the new stack:


@@ -207,7 +207,7 @@ For example, to run the text-based `ipython` console in a container, do the foll
 docker run -it --rm jupyter/base-notebook start.sh ipython
 ```
-Or, to run Jupyter Notebook classic instead of the JupyterLab, run the following:
+Or, to run Jupyter Notebook classic instead of JupyterLab, run the following:
 ```bash
 docker run -it --rm -p 8888:8888 jupyter/base-notebook start.sh jupyter notebook
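
The `start.sh` pattern shown in this hunk generalizes to any command available in the image; one more variant, with the command chosen purely for illustration:

```bash
# drop into a plain shell instead of launching a jupyter frontend
docker run -it --rm jupyter/base-notebook start.sh bash
```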


@@ -162,7 +162,9 @@ Connection to Spark Cluster on **[Standalone Mode](https://spark.apache.org/docs
 2. Run the Docker container with `--net=host` in a location that is network addressable by all of
    your Spark workers.
    (This is a [Spark networking requirement](https://spark.apache.org/docs/latest/cluster-overview.html#components).)
-   _Note: When using `--net=host`, you must also use the flags `--pid=host -e TINI_SUBREAPER=true`. See <https://github.com/jupyter/docker-stacks/issues/64> for details._
+   ```{note}
+   When using `--net=host`, you must also use the flags `--pid=host -e TINI_SUBREAPER=true`. See <https://github.com/jupyter/docker-stacks/issues/64> for details.
+   ```
 **Note**: In the following examples we are using the Spark master URL `spark://master:7077` that shall be replaced by the URL of the Spark master.
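
Combining the flags from this note, a driver container launched against a standalone cluster might look like the following sketch (the image tag is an assumption; any Spark configuration is omitted):

```bash
# host networking lets spark workers connect back to the driver;
# --pid=host plus TINI_SUBREAPER keeps tini reaping subprocesses correctly
docker run -it --rm \
  --net=host --pid=host -e TINI_SUBREAPER=true \
  jupyter/pyspark-notebook
```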