diff --git a/docs/contributing/packages.md b/docs/contributing/packages.md
index 0932c94e..eeec2ec9 100644
--- a/docs/contributing/packages.md
+++ b/docs/contributing/packages.md
@@ -5,7 +5,9 @@ The dependencies resolution is a difficult thing to do
 This means that packages might have old versions.
 Images are rebuilt weekly, so usually, packages receive updates quite frequently.
 
-_Note: We pin major.minor version of python, so it will stay the same even after `mamba update` command._
+```{note}
+We pin major.minor version of python, so it will stay the same even after `mamba update` command.
+```
 
 ## Outdated packages
 
diff --git a/docs/contributing/tests.md b/docs/contributing/tests.md
index 0e7078f5..24d1b628 100644
--- a/docs/contributing/tests.md
+++ b/docs/contributing/tests.md
@@ -18,7 +18,7 @@ defined in the [conftest.py](https://github.com/jupyter/docker-stacks/blob/maste
 If you want to simply run a python script in one of our images, you could add a unit test.
 Simply create `-notebook/test/units/` directory, if it doesn't already exist and put your file there.
 These file will run automatically when tests are run.
-You could see an example for tensorflow package [here](https://github.com/jupyter/docker-stacks/blob/master/tensorflow-notebook/test/units/unit_tensorflow.py).
+You could see an example for tensorflow package [here](https://github.com/jupyter/docker-stacks/blob/HEAD/tensorflow-notebook/test/units/unit_tensorflow.py).
 
 ## Contributing New Tests
 
diff --git a/docs/maintaining/tasks.md b/docs/maintaining/tasks.md
index a41fff0a..2312f234 100644
--- a/docs/maintaining/tasks.md
+++ b/docs/maintaining/tasks.md
@@ -22,7 +22,9 @@ Pushing `Run Workflow` button will trigger this process.
 
 ## Adding a New Core Image to Docker Hub
 
-_Note: in general, we do not add new core images and ask contributors to either create a [recipe](../using/recipes.md) or [community stack](../using/stacks.md)_
+```{note}
+In general, we do not add new core images and ask contributors to either create a [recipe](../using/recipes.md) or [community stack](../using/stacks.md).
+```
 
 When there's a new stack definition, do the following before merging the PR with the new stack:
 
diff --git a/docs/using/common.md b/docs/using/common.md
index d95834d7..b105dd10 100644
--- a/docs/using/common.md
+++ b/docs/using/common.md
@@ -207,7 +207,7 @@ For example, to run the text-based `ipython` console in a container, do the foll
 docker run -it --rm jupyter/base-notebook start.sh ipython
 ```
 
-Or, to run Jupyter Notebook classic instead of the JupyterLab, run the following:
+Or, to run Jupyter Notebook classic instead of JupyterLab, run the following:
 
 ```bash
 docker run -it --rm -p 8888:8888 jupyter/base-notebook start.sh jupyter notebook
diff --git a/docs/using/specifics.md b/docs/using/specifics.md
index dc8d956f..90770038 100644
--- a/docs/using/specifics.md
+++ b/docs/using/specifics.md
@@ -162,7 +162,9 @@ Connection to Spark Cluster on **[Standalone Mode](https://spark.apache.org/docs
 2. Run the Docker container with `--net=host` in a location that is network addressable by all of your Spark workers.
    (This is a [Spark networking requirement](https://spark.apache.org/docs/latest/cluster-overview.html#components).)
 
-   _Note: When using `--net=host`, you must also use the flags `--pid=host -e TINI_SUBREAPER=true`. See for details._
+   ```{note}
+   When using `--net=host`, you must also use the flags `--pid=host -e TINI_SUBREAPER=true`. See for details.
+   ```
 
 **Note**: In the following examples we are using the Spark master URL `spark://master:7077` that shall be replaced by the URL of the Spark master.
 