Update links
@@ -103,7 +103,7 @@ You can bypass the provided scripts and specify an arbitrary start command. If y
## Conda Environments

-The default Python 3.x [Conda environment](http://conda.pydata.org/docs/using/envs.html) resides in `/opt/conda`. The `/opt/conda/bin` directory is part of the default `jovyan` user's `$PATH`. That directory is also whitelisted for use in `sudo` commands by the `start.sh` script.
+The default Python 3.x [Conda environment](https://conda.io/projects/conda/en/latest/user-guide/concepts/environments.html) resides in `/opt/conda`. The `/opt/conda/bin` directory is part of the default `jovyan` user's `$PATH`. That directory is also whitelisted for use in `sudo` commands by the `start.sh` script.

The `jovyan` user has full read/write access to the `/opt/conda` directory. You can use either `conda`, `mamba` or `pip` to install new packages without any additional permissions.
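For example, a minimal sketch run from a terminal inside the container (the package name is purely illustrative):

```bash
# Install into the default /opt/conda environment; no sudo is needed
# because the jovyan user owns /opt/conda.
mamba install --quiet --yes scikit-learn

# pip installs into the same environment.
pip install --quiet scikit-learn
```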
@@ -278,7 +278,7 @@ For Ubuntu 18.04 (bionic) and earlier, you may also require to workaround for a
```dockerfile
# https://git.savannah.gnu.org/cgit/man-db.git/commit/?id=8197d7824f814c5d4b992b4c8730b5b0f7ec589a
-# http://launchpadlibrarian.net/435841763/man-db_2.8.5-2_2.8.6-1.diff.gz
+# https://launchpadlibrarian.net/435841763/man-db_2.8.5-2_2.8.6-1.diff.gz

RUN echo "MANPATH_MAP ${CONDA_DIR}/bin ${CONDA_DIR}/man" >> /etc/manpath.config \
 && echo "MANPATH_MAP ${CONDA_DIR}/bin ${CONDA_DIR}/share/man" >> /etc/manpath.config \
@@ -416,14 +416,14 @@ ENV HADOOP_CONF_DIR /usr/local/hadoop-2.7.3/etc/hadoop
USER root
# Add proper open-jdk-8 not just the jre, needed for pydoop
-RUN echo 'deb http://cdn-fastly.deb.debian.org/debian jessie-backports main' > /etc/apt/sources.list.d/jessie-backports.list && \
+RUN echo 'deb https://cdn-fastly.deb.debian.org/debian jessie-backports main' > /etc/apt/sources.list.d/jessie-backports.list && \
apt-get -y update && \
apt-get install --no-install-recommends -t jessie-backports -y openjdk-8-jdk && \
rm /etc/apt/sources.list.d/jessie-backports.list && \
apt-get clean && \
rm -rf /var/lib/apt/lists/ && \
# Add hadoop binaries
-wget http://mirrors.ukfast.co.uk/sites/ftp.apache.org/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz && \
+wget https://mirrors.ukfast.co.uk/sites/ftp.apache.org/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz && \
tar -xvf hadoop-2.7.3.tar.gz -C /usr/local && \
chown -R $NB_USER:users /usr/local/hadoop-2.7.3 && \
rm -f hadoop-2.7.3.tar.gz && \
@@ -116,7 +116,7 @@ notebook
## Using JupyterHub

-You can configure JupyterHub to launcher Docker containers from the Jupyter Docker Stacks images. If you've been following the [Zero to JupyterHub with Kubernetes](http://zero-to-jupyterhub.readthedocs.io/en/latest/) guide, see the [Use an existing Docker image](http://zero-to-jupyterhub.readthedocs.io/en/latest/user-environment.html#use-an-existing-docker-image) section for details. If you have a custom JupyterHub deployment, see the [Picking or building a Docker image](https://github.com/jupyterhub/dockerspawner#picking-or-building-a-docker-image) instructions for the [dockerspawner](https://github.com/jupyterhub/dockerspawner) instead.
+You can configure JupyterHub to launch Docker containers from the Jupyter Docker Stacks images. If you've been following the [Zero to JupyterHub with Kubernetes](https://zero-to-jupyterhub.readthedocs.io/en/latest/) guide, see the [Use an existing Docker image](https://zero-to-jupyterhub.readthedocs.io/en/latest/jupyterhub/customizing/user-environment.html#choose-and-use-an-existing-docker-image) section for details. If you have a custom JupyterHub deployment, see the [Picking or building a Docker image](https://github.com/jupyterhub/dockerspawner#picking-or-building-a-docker-image) instructions for the [dockerspawner](https://github.com/jupyterhub/dockerspawner) instead.

## Using Other Tools and Services
@@ -14,7 +14,7 @@ This section provides details about the first.
## Core Stacks

The Jupyter team maintains a set of Docker image definitions in the
-[https://github.com/jupyter/docker-stacks](https://github.com/jupyter/docker-stacks) GitHub
+<https://github.com/jupyter/docker-stacks> GitHub
repository. The following sections describe these images including their contents, relationships,
and versioning strategy.
@@ -66,16 +66,16 @@ and versioning strategy.
- The [R](https://www.r-project.org/) interpreter and base environment
- [IRKernel](https://irkernel.github.io/) to support R code in Jupyter notebooks
- [tidyverse](https://www.tidyverse.org/) packages from
-[conda-forge](https://conda-forge.github.io/feedstocks)
+[conda-forge](https://conda-forge.org/feedstock-outputs/index.html)
- [devtools](https://cran.r-project.org/web/packages/devtools/index.html),
-[shiny](https://shiny.rstudio.com/), [rmarkdown](http://rmarkdown.rstudio.com/),
+[shiny](https://shiny.rstudio.com/), [rmarkdown](https://rmarkdown.rstudio.com),
[forecast](https://cran.r-project.org/web/packages/forecast/forecast.pdf),
[rsqlite](https://cran.r-project.org/web/packages/RSQLite/index.html),
[nycflights13](https://cran.r-project.org/web/packages/nycflights13/index.html),
-[caret](http://topepo.github.io/caret/index.html), [tidymodels](https://www.tidymodels.org/),
+[caret](https://topepo.github.io/caret/index.html), [tidymodels](https://www.tidymodels.org/),
[rcurl](https://cran.r-project.org/web/packages/RCurl/index.html), and
[randomforest](https://cran.r-project.org/web/packages/randomForest/randomForest.pdf) packages
-from [conda-forge](https://conda-forge.github.io/feedstocks)
+from [conda-forge](https://conda-forge.org/feedstock-outputs/index.html)

### jupyter/scipy-notebook
@@ -89,20 +89,20 @@ and versioning strategy.
- [dask](https://dask.org/), [pandas](https://pandas.pydata.org/),
[numexpr](https://github.com/pydata/numexpr), [matplotlib](https://matplotlib.org/),
[scipy](https://www.scipy.org/), [seaborn](https://seaborn.pydata.org/),
-[scikit-learn](http://scikit-learn.org/stable/), [scikit-image](http://scikit-image.org/),
-[sympy](http://www.sympy.org/en/index.html), [cython](http://cython.org/),
+[scikit-learn](https://scikit-learn.org/stable/), [scikit-image](https://scikit-image.org),
+[sympy](https://www.sympy.org/en/index.html), [cython](https://cython.org),
[patsy](https://patsy.readthedocs.io/en/latest/),
-[statsmodel](http://www.statsmodels.org/stable/index.html),
+[statsmodel](https://www.statsmodels.org/stable/index.html),
[cloudpickle](https://github.com/cloudpipe/cloudpickle),
-[dill](https://pypi.python.org/pypi/dill), [numba](https://numba.pydata.org/),
-[bokeh](https://bokeh.pydata.org/en/latest/), [sqlalchemy](https://www.sqlalchemy.org/),
-[hdf5](http://www.h5py.org/), [vincent](http://vincent.readthedocs.io/en/latest/),
+[dill](https://pypi.org/project/dill/), [numba](https://numba.pydata.org/),
+[bokeh](https://docs.bokeh.org/en/latest/), [sqlalchemy](https://www.sqlalchemy.org/),
+[hdf5](https://www.h5py.org), [vincent](https://vincent.readthedocs.io/en/latest/),
[beautifulsoup](https://www.crummy.com/software/BeautifulSoup/),
[protobuf](https://developers.google.com/protocol-buffers/docs/pythontutorial),
-[xlrd](http://www.python-excel.org/), [bottleneck](https://bottleneck.readthedocs.io/en/latest/),
+[xlrd](https://www.python-excel.org), [bottleneck](https://bottleneck.readthedocs.io/en/latest/),
and [pytables](https://www.pytables.org/) packages
- [ipywidgets](https://ipywidgets.readthedocs.io/en/stable/) and
-[ipympl](https://github.com/matplotlib/jupyter-matplotlib) for interactive visualizations and
+[ipympl](https://github.com/matplotlib/ipympl) for interactive visualizations and
plots in Python notebooks
- [Facets](https://github.com/PAIR-code/facets) for visualizing machine learning datasets
@@ -131,8 +131,8 @@ communities.
images
- The [Julia](https://julialang.org/) compiler and base environment
- [IJulia](https://github.com/JuliaLang/IJulia.jl) to support Julia code in Jupyter notebooks
-- [HDF5](https://github.com/JuliaIO/HDF5.jl), [Gadfly](http://gadflyjl.org/stable/), and
-[RDatasets](https://github.com/johnmyleswhite/RDatasets.jl) packages
+- [HDF5](https://github.com/JuliaIO/HDF5.jl), [Gadfly](https://gadflyjl.org/stable/), and
+[RDatasets](https://github.com/JuliaStats/RDatasets.jl) packages

### jupyter/pyspark-notebook
@@ -156,9 +156,9 @@ communities.
- Everything in `jupyter/pyspark-notebook` and its ancestor images
- [IRKernel](https://irkernel.github.io/) to support R code in Jupyter notebooks
- [Apache Toree](https://toree.apache.org/) and
-[spylon-kernel](https://github.com/maxpoint/spylon-kernel) to support Scala code in Jupyter
+[spylon-kernel](https://github.com/vericast/spylon-kernel) to support Scala code in Jupyter
notebooks
-- [ggplot2](https://ggplot2.tidyverse.org), [sparklyr](http://spark.rstudio.com/), and
+- [ggplot2](https://ggplot2.tidyverse.org), [sparklyr](https://spark.rstudio.com), and
[rcurl](https://cran.r-project.org/web/packages/RCurl/index.html) packages

### Image Relationships
@@ -228,7 +228,7 @@ core images and link them below.
[](https://mybinder.org/v2/gh/jbindinga/java-notebook/master).

- [sage-notebook](https://github.com/sharpTrick/sage-notebook) is a community Jupyter Docker Stack
-image with the [sagemath](https://sagemath.org) kernel on top of the minimal-notebook image. Click
+image with the [sagemath](https://www.sagemath.org) kernel on top of the minimal-notebook image. Click
here to launch it on
[](https://mybinder.org/v2/gh/sharpTrick/sage-notebook/master).
@@ -6,7 +6,7 @@ This page provides details about features specific to one or more images.
### Specific Docker Image Options

-- `-p 4040:4040` - The `jupyter/pyspark-notebook` and `jupyter/all-spark-notebook` images open [SparkUI (Spark Monitoring and Instrumentation UI)](http://spark.apache.org/docs/latest/monitoring.html) at default port `4040`, this option map `4040` port inside docker container to `4040` port on host machine . Note every new spark context that is created is put onto an incrementing port (ie. 4040, 4041, 4042, etc.), and it might be necessary to open multiple ports. For example: `docker run -d -p 8888:8888 -p 4040:4040 -p 4041:4041 jupyter/pyspark-notebook`.
+- `-p 4040:4040` - The `jupyter/pyspark-notebook` and `jupyter/all-spark-notebook` images open [SparkUI (Spark Monitoring and Instrumentation UI)](https://spark.apache.org/docs/latest/monitoring.html) at the default port `4040`; this option maps port `4040` inside the Docker container to port `4040` on the host machine. Note that every new Spark context is put onto an incrementing port (i.e. 4040, 4041, 4042, etc.), so it might be necessary to open multiple ports. For example: `docker run -d -p 8888:8888 -p 4040:4040 -p 4041:4041 jupyter/pyspark-notebook`.

### Build an Image with a Different Version of Spark
@@ -131,10 +131,10 @@ Connection to Spark Cluster on **[Standalone Mode](https://spark.apache.org/docs
0. Verify that the docker image (check the Dockerfile) and the Spark Cluster which is being
deployed run the same version of Spark.
-1. [Deploy Spark in Standalone Mode](http://spark.apache.org/docs/latest/spark-standalone.html).
+1. [Deploy Spark in Standalone Mode](https://spark.apache.org/docs/latest/spark-standalone.html).
2. Run the Docker container with `--net=host` in a location that is network addressable by all of
your Spark workers. (This is a [Spark networking
-requirement](http://spark.apache.org/docs/latest/cluster-overview.html#components).)
+requirement](https://spark.apache.org/docs/latest/cluster-overview.html#components).)
- NOTE: When using `--net=host`, you must also use the flags `--pid=host -e TINI_SUBREAPER=true`. See <https://github.com/jupyter/docker-stacks/issues/64> for details.
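As a minimal sketch of step 2 together with the note above (the volume mount is illustrative; adjust it for your deployment):

```bash
# Run the notebook container on the host network so the Spark workers can reach
# the driver; --pid=host and TINI_SUBREAPER are required alongside --net=host.
docker run -d \
  --net=host --pid=host -e TINI_SUBREAPER=true \
  -v "$PWD":/home/jovyan/work \
  jupyter/pyspark-notebook
```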
**Note**: In the following examples we use the Spark master URL `spark://master:7077`, which should be replaced with the URL of your Spark master.