Mirror of https://github.com/jupyterhub/jupyterhub.git (synced 2025-10-09 19:13:03 +00:00)

Compare commits (84 commits)
Commits in this comparison (SHA1):

fc6435825c, b3ab48eb68, a212151c09, 67ccfc7eb7, 9af103c673, 82643adfb6, 74df94d15a,
da1b9bdd80, 18675ef6df, bf9dea5522, 62e30c1d79, 1316196542, 1a377bd03a, 66a99ce881,
481debcb80, 03c25b5cac, 26c060d2c5, 7ff42f9b55, a35d8a6262, 8f39e1f8f9, ff19b799c4,
e547949aee, 31be00b49f, 4533d96002, 7f89f1a2a0, aed29e1db8, 49bee25820, 838c8eb057,
be5860822d, 5a10d304c9, fdd3746f54, 4d55a48a79, b2ece48239, 6375ba30b7, f565f8ac53,
5ec05822f1, 335b47d7c1, f922561003, 79df83f0d3, 29416463ff, dd2e1ef758, a9b8542ec7,
a4ae2ec2d8, b54bfad8c2, 724bf7c4ce, fccc954fb4, 74385a6906, dd66fe63c0, e74934cb17,
450281a90a, 6e7fc0574e, fc49aac02b, 097d883905, cb55118f70, 2a3c87945e, 2b2aacedc6,
8ebec52827, 1642cc30c8, 1645d8f0c0, 8d390819a1, c7dd18bb03, 84b7de4d21, 161df53143,
1cfd6cf12e, d40dcc35fb, a570e95602, e4e43521ee, 1b2c21a99c, e28eda6386, 39c171cce7,
c81cefd768, 325f137265, 1ae795df18, 2aacd5e28b, 6e1425e2c0, 010db6ce72, ce8d782220,
90c2b23fc0, 32685aeac1, 01c5608104, a35f6298f0, 8955d6aed4, cafbf8b990, f626d2f6e5
```diff
@@ -15,3 +15,7 @@ script:
 - py.test --cov jupyterhub jupyterhub/tests -v
 after_success:
 - codecov
+matrix:
+  include:
+    - python: 3.5
+      env: JUPYTERHUB_TEST_SUBDOMAIN_HOST=http://127.0.0.1.xip.io:8000
```
```diff
@@ -46,5 +46,7 @@ WORKDIR /srv/jupyterhub/

 EXPOSE 8000

+LABEL org.jupyter.service="jupyterhub"
+
 ONBUILD ADD jupyterhub_config.py /srv/jupyterhub/jupyterhub_config.py
 CMD ["jupyterhub", "-f", "/srv/jupyterhub/jupyterhub_config.py"]
```
README.md (23 lines changed)

```diff
@@ -25,7 +25,7 @@ Basic principles:

 ## Dependencies

-JupyterHub requires [IPython](https://ipython.org/install.html) >= 3.0 (current master) and [Python](https://www.python.org/downloads/) >= 3.3.
+JupyterHub itself requires [Python](https://www.python.org/downloads/) ≥ 3.3. To run the single-user servers (which may be on the same system as the Hub or not), [Jupyter Notebook](https://jupyter.readthedocs.org/en/latest/install.html) ≥ 4 is required.

 Install [nodejs/npm](https://www.npmjs.com/), which is available from your
 package manager. For example, install on Linux (Debian/Ubuntu) using:
@@ -50,14 +50,15 @@ Notes on the `pip` command used in the installation directions below:

 ## Installation

-JupyterHub can be installed with pip:
+JupyterHub can be installed with pip, and the proxy with npm:

+    npm install -g configurable-http-proxy
     pip3 install jupyterhub

 If you plan to run notebook servers locally, you may also need to install the
 Jupyter ~~IPython~~ notebook:

-    pip3 install notebook
+    pip3 install --upgrade notebook


 ### Development install
@@ -116,6 +117,22 @@ Some examples, meant as illustration and testing of this concept:
 - Using GitHub OAuth instead of PAM with [OAuthenticator](https://github.com/jupyter/oauthenticator)
 - Spawning single-user servers with Docker, using the [DockerSpawner](https://github.com/jupyter/dockerspawner)

+### Docker
+
+There is a ready to go [docker image](https://hub.docker.com/r/jupyter/jupyterhub/).
+It can be started with the following command:
+
+    docker run -d --name jupyter.cont [-v /home/jupyter-home:/home] jupyter/jupyterhub jupyterhub
+
+This command will create a container named `jupyter.cont` that you can stop and resume with `docker stop/start`.
+It will be listening on all interfaces at port 8000. So this is perfect to test docker on your desktop or laptop.
+If you want to run docker on a computer that has a public IP then you should (as in MUST) secure it with ssl by
+adding ssl options to your docker configuration or using a ssl enabled proxy. The `-v/--volume` option will
+allow you to store data outside the docker image (host system) so it will be persistent, even when you start
+a new image. The command `docker exec -it jupyter.cont bash` will spawn a root shell in your started docker
+container. You can use it to create system users in the container. These accounts will be used for authentication
+in jupyterhub's default configuration. In order to run without SSL, you'll need to set `--no-ssl` explicitly.
+
 # Getting help

 We encourage you to ask questions on the mailing list:
```
```diff
@@ -77,6 +77,7 @@ For simple mappings, a configurable dict `Authenticator.username_map` is used to
 c.Authenticator.username_map = {
   'service-name': 'localname'
 }
+```

 ### Validating usernames

```
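For context, the `username_map` documented in this hunk is set in `jupyterhub_config.py`. A minimal sketch, assuming the standard config-file layout; only the `'service-name': 'localname'` pair comes from the hunk, the second entry is invented to show multiple mappings:

```python
# jupyterhub_config.py -- sketch of the username_map documented above.
c = get_config()  # config object available when JupyterHub loads this file
c.Authenticator.username_map = {
    'service-name': 'localname',  # name from the authenticator -> local username (from the doc)
    'gh-ada': 'ada',              # invented extra entry, purely illustrative
}
```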
```diff
@@ -2,6 +2,18 @@

 See `git log` for a more detailed summary.

+## 0.5
+
+
+- Single-user server must be run with Jupyter Notebook ≥ 4.0
+- Require `--no-ssl` confirmation to allow the Hub to be run without SSL (e.g. behind SSL termination in nginx)
+- Add lengths to text fields for MySQL support
+- Add `Spawner.disable_user_config` for preventing user-owned configuration from modifying single-user servers.
+- Fixes for MySQL support.
+- Add ability to run each user's server on its own subdomain. Requires wildcard DNS and wildcard SSL to be feasible. Enable subdomains by setting `JupyterHub.subdomain_host = 'https://jupyterhub.domain.tld[:port]'`.
+- Use `127.0.0.1` for local communication instead of `localhost`, avoiding issues with DNS on some systems.
+- Fix race that could add users to proxy prematurely if spawning is slow.
+
 ## 0.4

 ### 0.4.1

```
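Per the subdomain entry above, enabling per-user subdomains is a single config line. A sketch with a placeholder domain; wildcard DNS and a wildcard SSL certificate for the subdomains are required, as the entry notes:

```python
# jupyterhub_config.py -- sketch based on the 0.5 changelog entry above.
# 'jupyterhub.example.org' is a placeholder; each user is then served from
# <username>.jupyterhub.example.org, which must resolve to the same host.
c = get_config()
c.JupyterHub.subdomain_host = 'https://jupyterhub.example.org'
```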
```diff
@@ -22,6 +22,9 @@ There are three main categories of processes run by the `jupyterhub` command lin

 ## JupyterHub's default behavior

+**IMPORTANT:** In its default configuration, JupyterHub runs without SSL encryption (HTTPS).
+**You should not run JupyterHub without SSL encryption on a public network.**
+See [Security documentation](#Security) for how to configure JupyterHub to use SSL.

 To start JupyterHub in its default configuration, type the following at the command line:

@@ -44,10 +47,6 @@ or any other public IP or domain pointing to your system.
 In their default configuration, the other services, the **Hub** and **Single-User Servers**,
 all communicate with each other on localhost only.

-**NOTE:** In its default configuration, JupyterHub runs without SSL encryption (HTTPS).
-You should not run JupyterHub without SSL encryption on a public network.
-See [Security documentation](#Security) for how to configure JupyterHub to use SSL.
-
 By default, starting JupyterHub will write two files to disk in the current working directory:

 - `jupyterhub.sqlite` is the sqlite database containing all of the state of the **Hub**.
@@ -148,6 +147,9 @@ c.JupyterHub.hub_port = 54321

 ## Security

+**IMPORTANT:** In its default configuration, JupyterHub runs without SSL encryption (HTTPS).
+**You should not run JupyterHub without SSL encryption on a public network.**
+
 Security is the most important aspect of configuring Jupyter. There are three main aspects of the
 security configuration:

```
```diff
@@ -1,22 +1,34 @@
-.. JupyterHub documentation master file, created by
-   sphinx-quickstart on Mon Jan 4 16:31:09 2016.
-   You can adapt this file completely to your liking, but it should at least
-   contain the root `toctree` directive.

 JupyterHub
 ==========

-.. note:: This is the official documentation for JupyterHub. This project is
-   under active development.
+JupyterHub is a server that gives multiple users access to Jupyter notebooks,
+running an independent Jupyter notebook server for each user.

-JupyterHub is a multi-user server that manages and proxies multiple instances
-of the single-user Jupyter notebook server.
+To use JupyterHub, you need a Unix server (typically Linux) running
+somewhere that is accessible to your team on the network. The JupyterHub server
+can be on an internal network at your organisation, or it can run on the public
+internet (in which case, take care with `security <getting-started.html#security>`__).
+Users access JupyterHub in a web browser, by going to the IP address or
+domain name of the server.

-Three actors:
+Different :doc:`authenticators <authenticators>` control access
+to JupyterHub. The default one (pam) uses the user accounts on the server where
+JupyterHub is running. If you use this, you will need to create a user account
+on the system for each user on your team. Using other authenticators, you can
+allow users to sign in with e.g. a Github account, or with any single-sign-on
+system your organisation has.

-* multi-user Hub (tornado process)
-* `configurable http proxy <https://github.com/jupyter/configurable-http-proxy>`_ (node-http-proxy)
-* multiple single-user IPython notebook servers (Python/IPython/tornado)
+Next, :doc:`spawners <spawners>` control how JupyterHub starts
+the individual notebook server for each user. The default spawner will
+start a notebook server on the same machine running under their system username.
+The other main option is to start each server in a separate container, often
+using Docker.
+
+JupyterHub runs as three separate parts:
+
+* The multi-user Hub (Python & Tornado)
+* A `configurable http proxy <https://github.com/jupyter/configurable-http-proxy>`_ (NodeJS)
+* Multiple single-user Jupyter notebook servers (Python & Tornado)

 Basic principles:

```
```diff
@@ -86,7 +86,7 @@ class APIHandler(BaseHandler):
         model = {
             'name': user.name,
             'admin': user.admin,
-            'server': user.server.base_url if user.running else None,
+            'server': user.url if user.running else None,
             'pending': None,
             'last_activity': user.last_activity.isoformat(),
         }
```
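For illustration, the model dict assembled in this hunk serializes to JSON roughly as below. The values are made up; the point of the change is that `server` now carries `user.url` rather than only the base URL path:

```python
import json

# Made-up example of the user model built in the hunk above.
model = {
    'name': 'alice',
    'admin': False,
    'server': 'https://alice.jupyterhub.example.org/user/alice/',  # user.url (placeholder value)
    'pending': None,
    'last_activity': '2016-01-04T16:31:09',  # isoformat() output, invented timestamp
}
print(json.dumps(model, indent=2))
```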
```diff
@@ -28,7 +28,7 @@ class ProxyAPIHandler(APIHandler):
     @gen.coroutine
     def post(self):
         """POST checks the proxy to ensure"""
-        yield self.proxy.check_routes()
+        yield self.proxy.check_routes(self.users)


     @admin_only
@@ -59,7 +59,7 @@ class ProxyAPIHandler(APIHandler):
         self.proxy.auth_token = model['auth_token']
         self.db.commit()
         self.log.info("Updated proxy at %s", server.bind_url)
-        yield self.proxy.check_routes()
+        yield self.proxy.check_routes(self.users)



```
```diff
@@ -16,6 +16,7 @@ from datetime import datetime
 from distutils.version import LooseVersion as V
 from getpass import getuser
 from subprocess import Popen
+from urllib.parse import urlparse

 if sys.version_info[:2] < (3,3):
     raise ValueError("Python < 3.3 not supported: %s" % sys.version)
@@ -42,7 +43,7 @@ here = os.path.dirname(__file__)

 import jupyterhub
 from . import handlers, apihandlers
-from .handlers.static import CacheControlStaticFilesHandler
+from .handlers.static import CacheControlStaticFilesHandler, LogoHandler

 from . import orm
 from .user import User, UserDict
@@ -50,7 +51,7 @@ from ._data import DATA_FILES_PATH
 from .log import CoroutineLogFormatter, log_request
 from .traitlets import URLPrefix, Command
 from .utils import (
-    url_path_join, localhost,
+    url_path_join,
     ISO8601_ms, ISO8601_s,
 )
 # classes for config
@@ -87,6 +88,9 @@ flags = {
     'no-db': ({'JupyterHub': {'db_url': 'sqlite:///:memory:'}},
         "disable persisting state database to disk"
     ),
+    'no-ssl': ({'JupyterHub': {'confirm_no_ssl': True}},
+        "Allow JupyterHub to run without SSL (SSL termination should be happening elsewhere)."
+    ),
 }

 SECRET_BYTES = 2048 # the number of bytes to use when generating new secrets
@@ -208,7 +212,12 @@ class JupyterHub(Application):

     def _template_paths_default(self):
         return [os.path.join(self.data_files_path, 'templates')]

+    confirm_no_ssl = Bool(False, config=True,
+        help="""Confirm that JupyterHub should be run without SSL.
+        This is **NOT RECOMMENDED** unless SSL termination is being handled by another layer.
+        """
+    )
     ssl_key = Unicode('', config=True,
         help="""Path to SSL key file for the public facing interface of the proxy

```
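The new `--no-ssl` flag is wired straight to the `confirm_no_ssl` trait added in the same block, so the command-line and config-file forms below should be equivalent. This is only appropriate when SSL is terminated in front of JupyterHub:

```python
# jupyterhub_config.py -- equivalent of running `jupyterhub --no-ssl`,
# per the flags dict and confirm_no_ssl trait added above.
c = get_config()
c.JupyterHub.confirm_no_ssl = True
```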
```diff
@@ -222,14 +231,39 @@ class JupyterHub(Application):
         """
     )
     ip = Unicode('', config=True,
-        help="The public facing ip of the proxy"
+        help="The public facing ip of the whole application (the proxy)"
     )

+    subdomain_host = Unicode('', config=True,
+        help="""Run single-user servers on subdomains of this host.
+
+        This should be the full https://hub.domain.tld[:port]
+
+        Provides additional cross-site protections for javascript served by single-user servers.
+
+        Requires <username>.hub.domain.tld to resolve to the same host as hub.domain.tld.
+
+        In general, this is most easily achieved with wildcard DNS.
+
+        When using SSL (i.e. always) this also requires a wildcard SSL certificate.
+        """)
+    def _subdomain_host_changed(self, name, old, new):
+        if new and '://' not in new:
+            # host should include '://'
+            # if not specified, assume https: You have to be really explicit about HTTP!
+            self.subdomain_host = 'https://' + new
+
     port = Integer(8000, config=True,
         help="The public facing port of the proxy"
     )
     base_url = URLPrefix('/', config=True,
         help="The base URL of the entire application"
     )
+    logo_file = Unicode('', config=True,
+        help="Specify path to a logo image to override the Jupyter logo in the banner."
+    )
+    def _logo_file_default(self):
+        return os.path.join(self.data_files_path, 'static', 'images', 'jupyter.png')
+
     jinja_environment_options = Dict(config=True,
         help="Supply extra arguments that will be passed to Jinja environment."
```
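The `_subdomain_host_changed` handler above normalizes a host given without a scheme to HTTPS. A standalone sketch of just that logic, runnable outside JupyterHub:

```python
# Standalone sketch of the normalization done by _subdomain_host_changed above:
# a subdomain_host given without a scheme is assumed to be https.
def normalize_subdomain_host(new):
    if new and '://' not in new:
        # no scheme given: assume https, you have to be explicit to get plain http
        new = 'https://' + new
    return new

print(normalize_subdomain_host('hub.example.org'))         # -> https://hub.example.org
print(normalize_subdomain_host('http://hub.example.org'))  # unchanged
```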
```diff
@@ -260,7 +294,7 @@ class JupyterHub(Application):
             token = orm.new_token()
         return token

-    proxy_api_ip = Unicode(localhost(), config=True,
+    proxy_api_ip = Unicode('127.0.0.1', config=True,
         help="The ip for the proxy API handlers"
     )
     proxy_api_port = Integer(config=True,
@@ -272,10 +306,9 @@ class JupyterHub(Application):
     hub_port = Integer(8081, config=True,
         help="The port for this process"
     )
-    hub_ip = Unicode(localhost(), config=True,
+    hub_ip = Unicode('127.0.0.1', config=True,
         help="The ip for this process"
     )
-
     hub_prefix = URLPrefix('/hub/', config=True,
         help="The prefix for the hub server. Must not be '/'"
     )
@@ -475,6 +508,8 @@ class JupyterHub(Application):
         # set default handlers
         h.extend(handlers.default_handlers)
         h.extend(apihandlers.default_handlers)
+
+        h.append((r'/logo', LogoHandler, {'path': self.logo_file}))
         self.handlers = self.add_url_prefix(self.hub_prefix, h)
         # some extra handlers, outside hub_prefix
         self.handlers.extend([
@@ -559,11 +594,15 @@ class JupyterHub(Application):
         q = self.db.query(orm.Hub)
         assert q.count() <= 1
         self._local.hub = q.first()
+        if self.subdomain_host and self._local.hub:
+            self._local.hub.host = self.subdomain_host
         return self._local.hub

     @hub.setter
     def hub(self, hub):
         self._local.hub = hub
+        if hub and self.subdomain_host:
+            hub.host = self.subdomain_host

     @property
     def proxy(self):
@@ -616,6 +655,10 @@ class JupyterHub(Application):
         server.ip = self.hub_ip
         server.port = self.hub_port
         server.base_url = self.hub_prefix
+        if self.subdomain_host:
+            if not self.subdomain_host:
+                raise ValueError("Must specify subdomain_host when using subdomains."
+                    " This should be the public domain[:port] of the Hub.")

         self.db.commit()

@@ -794,12 +837,26 @@ class JupyterHub(Application):
             '--api-port', str(self.proxy.api_server.port),
             '--default-target', self.hub.server.host,
         ]
+        if self.subdomain_host:
+            cmd.append('--host-routing')
         if self.debug_proxy:
             cmd.extend(['--log-level', 'debug'])
         if self.ssl_key:
             cmd.extend(['--ssl-key', self.ssl_key])
         if self.ssl_cert:
             cmd.extend(['--ssl-cert', self.ssl_cert])
+        # Require SSL to be used or `--no-ssl` to confirm no SSL on
+        if ' --ssl' not in ' '.join(cmd):
+            if self.confirm_no_ssl:
+                self.log.warning("Running JupyterHub without SSL."
+                    " There better be SSL termination happening somewhere else...")
+            else:
+                self.log.error(
+                    "Refusing to run JuptyterHub without SSL."
+                    " If you are terminating SSL in another layer,"
+                    " pass --no-ssl to tell JupyterHub to allow the proxy to listen on HTTP."
+                )
+                self.exit(1)
         self.log.info("Starting proxy @ %s", self.proxy.public_server.bind_url)
         self.log.debug("Proxy cmd: %s", cmd)
         try:
@@ -840,7 +897,7 @@ class JupyterHub(Application):
             )
         yield self.start_proxy()
         self.log.info("Setting up routes on new proxy")
-        yield self.proxy.add_all_users()
+        yield self.proxy.add_all_users(self.users)
         self.log.info("New proxy back up, and good to go")

     def init_tornado_settings(self):
@@ -866,6 +923,8 @@ class JupyterHub(Application):
         else:
             version_hash=datetime.now().strftime("%Y%m%d%H%M%S"),

+        subdomain_host = self.subdomain_host
+        domain = urlparse(subdomain_host).hostname
         settings = dict(
             log_function=log_request,
             config=self.config,
@@ -888,6 +947,8 @@ class JupyterHub(Application):
             template_path=self.template_paths,
             jinja2_env=jinja_env,
             version_hash=version_hash,
+            subdomain_host=subdomain_host,
+            domain=domain,
         )
         # allow configured settings to have priority
         settings.update(self.tornado_settings)
@@ -1024,7 +1085,7 @@ class JupyterHub(Application):
                 user.last_activity = max(user.last_activity, dt)

         self.db.commit()
-        yield self.proxy.check_routes(routes)
+        yield self.proxy.check_routes(self.users, routes)

     @gen.coroutine
     def start(self):
@@ -1059,7 +1120,7 @@ class JupyterHub(Application):
             self.exit(1)
             return

-        loop.add_callback(self.proxy.add_all_users)
+        loop.add_callback(self.proxy.add_all_users, self.users)

         if self.proxy_process:
             # only check / restart the proxy if we started it in the first place.
```
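The SSL guard added to `start_proxy` keys off the assembled proxy command line. A standalone sketch of the same check; the `cmd` list below is a stand-in, not the real proxy invocation:

```python
# Sketch of the SSL check added in start_proxy above.
cmd = ['configurable-http-proxy', '--port', '8000']  # stand-in command: no --ssl-key/--ssl-cert
confirm_no_ssl = False                               # i.e. --no-ssl was not passed

if ' --ssl' not in ' '.join(cmd):
    if confirm_no_ssl:
        print("warning: running without SSL; SSL termination had better happen elsewhere")
    else:
        raise SystemExit("Refusing to run without SSL; pass --no-ssl if SSL is terminated in another layer")
```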
```diff
@@ -51,6 +51,14 @@ class BaseHandler(RequestHandler):
     def version_hash(self):
         return self.settings.get('version_hash', '')

+    @property
+    def subdomain_host(self):
+        return self.settings.get('subdomain_host', '')
+
+    @property
+    def domain(self):
+        return self.settings['domain']
+
     @property
     def db(self):
         return self.settings['db']
@@ -199,42 +207,44 @@ class BaseHandler(RequestHandler):
             user = self.get_current_user()
         else:
             user = self.find_user(name)
+        kwargs = {}
+        if self.subdomain_host:
+            kwargs['domain'] = self.domain
         if user and user.server:
-            self.clear_cookie(user.server.cookie_name, path=user.server.base_url)
-        self.clear_cookie(self.hub.server.cookie_name, path=self.hub.server.base_url)
+            self.clear_cookie(user.server.cookie_name, path=user.server.base_url, **kwargs)
+        self.clear_cookie(self.hub.server.cookie_name, path=self.hub.server.base_url, **kwargs)
+
+    def _set_user_cookie(self, user, server):
+        # tornado <4.2 have a bug that consider secure==True as soon as
+        # 'secure' kwarg is passed to set_secure_cookie
+        if self.request.protocol == 'https':
+            kwargs = {'secure': True}
+        else:
+            kwargs = {}
+        if self.subdomain_host:
+            kwargs['domain'] = self.domain
+        self.log.debug("Setting cookie for %s: %s, %s", user.name, server.cookie_name, kwargs)
+        self.set_secure_cookie(
+            server.cookie_name,
+            user.cookie_id,
+            path=server.base_url,
+            **kwargs
+        )

     def set_server_cookie(self, user):
         """set the login cookie for the single-user server"""
-        # tornado <4.2 have a bug that consider secure==True as soon as
-        # 'secure' kwarg is passed to set_secure_cookie
-        if self.request.protocol == 'https':
-            kwargs = {'secure':True}
-        else:
-            kwargs = {}
-        self.set_secure_cookie(
-            user.server.cookie_name,
-            user.cookie_id,
-            path=user.server.base_url,
-            **kwargs
-        )
+        self._set_user_cookie(user, user.server)

     def set_hub_cookie(self, user):
         """set the login cookie for the Hub"""
-        # tornado <4.2 have a bug that consider secure==True as soon as
-        # 'secure' kwarg is passed to set_secure_cookie
-        if self.request.protocol == 'https':
-            kwargs = {'secure':True}
-        else:
-            kwargs = {}
-        self.set_secure_cookie(
-            self.hub.server.cookie_name,
-            user.cookie_id,
-            path=self.hub.server.base_url,
-            **kwargs
-        )
+        self._set_user_cookie(user, self.hub.server)

     def set_login_cookie(self, user):
         """Set login cookies for the Hub and single-user server."""
+        if self.subdomain_host and not self.request.host.startswith(self.domain):
+            self.log.warning(
+                "Possibly setting cookie on wrong domain: %s != %s",
+                self.request.host, self.domain)
         # create and set a new cookie token for the single-user server
         if user.server:
             self.set_server_cookie(user)
@@ -296,16 +306,23 @@ class BaseHandler(RequestHandler):
             yield gen.with_timeout(timedelta(seconds=self.slow_spawn_timeout), f)
         except gen.TimeoutError:
             if user.spawn_pending:
+                # still in Spawner.start, which is taking a long time
+                # we shouldn't poll while spawn_pending is True
+                self.log.warn("User %s server is slow to start", user.name)
+                # schedule finish for when the user finishes spawning
+                IOLoop.current().add_future(f, finish_user_spawn)
+            else:
+                # start has finished, but the server hasn't come up
+                # check if the server died while we were waiting
                 status = yield user.spawner.poll()
                 if status is None:
-                    # hit timeout, but spawn is still pending
-                    self.log.warn("User %s server is slow to start", user.name)
+                    # hit timeout, but server's running. Hope that it'll show up soon enough,
+                    # though it's possible that it started at the wrong URL
+                    self.log.warn("User %s server is slow to become responsive", user.name)
                     # schedule finish for when the user finishes spawning
                     IOLoop.current().add_future(f, finish_user_spawn)
                 else:
                     raise web.HTTPError(500, "Spawner failed to start [status=%s]" % status)
-            else:
-                raise
         else:
             yield finish_user_spawn()

```
```diff
@@ -437,22 +454,26 @@ class PrefixRedirectHandler(BaseHandler):


 class UserSpawnHandler(BaseHandler):
-    """Requests to /user/name handled by the Hub
-    should result in spawning the single-user server and
-    being redirected to the original.
+    """Redirect requests to /user/name/* handled by the Hub.
+
+    If logged in, spawn a single-user server and redirect request.
+    If a user, alice, requests /user/bob/notebooks/mynotebook.ipynb,
+    redirect her to /user/alice/notebooks/mynotebook.ipynb, which should
+    in turn call this function.
     """

     @gen.coroutine
-    def get(self, name):
+    def get(self, name, user_path):
         current_user = self.get_current_user()
         if current_user and current_user.name == name:
-            # logged in, spawn the server
+            # logged in as correct user, spawn the server
             if current_user.spawner:
                 if current_user.spawn_pending:
                     # spawn has started, but not finished
                     html = self.render_template("spawn_pending.html", user=current_user)
                     self.finish(html)
                     return

                 # spawn has supposedly finished, check on the status
                 status = yield current_user.spawner.poll()
                 if status is not None:
@@ -465,15 +486,22 @@ class UserSpawnHandler(BaseHandler):
             self.set_login_cookie(current_user)
             without_prefix = self.request.uri[len(self.hub.server.base_url):]
             target = url_path_join(self.base_url, without_prefix)
+            if self.subdomain_host:
+                target = current_user.host + target
+            self.redirect(target)
+        elif current_user:
+            # logged in as a different user, redirect
+            target = url_path_join(self.base_url, 'user', current_user.name,
+                                   user_path or '')
             self.redirect(target)
         else:
-            # not logged in to the right user,
-            # clear any cookies and reload (will redirect to login)
+            # not logged in, clear any cookies and reload
             self.clear_login_cookie()
             self.redirect(url_concat(
                 self.settings['login_url'],
-                {'next': self.request.uri,
-            }))
+                {'next': self.request.uri},
+            ))


 class CSPReportHandler(BaseHandler):
     '''Accepts a content security policy violation report'''
@@ -484,6 +512,6 @@ class CSPReportHandler(BaseHandler):
             self.request.body.decode('utf8', 'replace'))

 default_handlers = [
-    (r'/user/([^/]+)/?.*', UserSpawnHandler),
+    (r'/user/([^/]+)(/.*)?', UserSpawnHandler),
     (r'/security/csp-report', CSPReportHandler),
 ]
```
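The handler-table change at the end can be checked in isolation: the new pattern captures everything after the username as `user_path`, which the old pattern threw away. A quick sketch with Python's `re`, reusing the alice/bob URL from the docstring above:

```python
import re

old = re.compile(r'/user/([^/]+)/?.*')
new = re.compile(r'/user/([^/]+)(/.*)?')

url = '/user/bob/notebooks/mynotebook.ipynb'
print(old.fullmatch(url).groups())  # ('bob',) -- the rest of the path is lost
print(new.fullmatch(url).groups())  # ('bob', '/notebooks/mynotebook.ipynb')
```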
```diff
@@ -25,7 +25,7 @@ class RootHandler(BaseHandler):
         user = self.get_current_user()
         if user:
             if user.running:
-                url = user.server.base_url
+                url = user.url
                 self.log.debug("User is running: %s", url)
             else:
                 url = url_path_join(self.hub.server.base_url, 'home')
@@ -67,7 +67,7 @@ class SpawnHandler(BaseHandler):
         """GET renders form for spawning with user-specified options"""
         user = self.get_current_user()
         if user.running:
-            url = user.server.base_url
+            url = user.url
             self.log.debug("User is running: %s", url)
             self.redirect(url)
             return
@@ -84,7 +84,7 @@ class SpawnHandler(BaseHandler):
         """POST spawns with user-specified options"""
         user = self.get_current_user()
         if user.running:
-            url = user.server.base_url
+            url = user.url
             self.log.warning("User is already running: %s", url)
             self.redirect(url)
             return
@@ -93,15 +93,15 @@
             form_options[key] = [ bs.decode('utf8') for bs in byte_list ]
         for key, byte_list in self.request.files.items():
             form_options["%s_file"%key] = byte_list
-        options = user.spawner.options_from_form(form_options)
         try:
+            options = user.spawner.options_from_form(form_options)
             yield self.spawn_single_user(user, options=options)
         except Exception as e:
             self.log.error("Failed to spawn single-user server with form", exc_info=True)
             self.finish(self._render_form(str(e)))
             return
         self.set_login_cookie(user)
-        url = user.server.base_url
+        url = user.url
         self.redirect(url)

 class AdminHandler(BaseHandler):
```
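Why these handlers switch from `user.server.base_url` to `user.url`: with subdomains enabled, a redirect needs the per-user host as well as the path. The sketch below is only a guess at the shape of the property (the real one lives in `jupyterhub/user.py`, which is not part of this diff):

```python
# Illustrative guess, not the actual User implementation.
class FakeUser:
    host = 'https://alice.jupyterhub.example.org'  # placeholder per-user host
    base_url = '/user/alice/'

    @property
    def url(self):
        # plausible composition: host + path when subdomains are in use
        return self.host + self.base_url

print(FakeUser().url)  # https://alice.jupyterhub.example.org/user/alice/
```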
```diff
@@ -1,6 +1,7 @@
 # Copyright (c) Jupyter Development Team.
 # Distributed under the terms of the Modified BSD License.

+import os
 from tornado.web import StaticFileHandler

 class CacheControlStaticFilesHandler(StaticFileHandler):
@@ -14,4 +15,14 @@ class CacheControlStaticFilesHandler(StaticFileHandler):
     def set_extra_headers(self, path):
         if "v" not in self.request.arguments:
             self.add_header("Cache-Control", "no-cache")
+
+class LogoHandler(StaticFileHandler):
+    """A singular handler for serving the logo."""
+    def get(self):
+        return super().get('')
+
+    @classmethod
+    def get_absolute_path(cls, root, path):
+        """We only serve one file, ignore relative path"""
+        return os.path.abspath(root)
```
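With `LogoHandler` registered at `/hub/logo` (see the app.py hunk earlier), overriding the banner image should only require pointing the new `logo_file` setting at an image on disk. A sketch with a placeholder path:

```python
# jupyterhub_config.py -- placeholder path; the default is the bundled jupyter.png.
c = get_config()
c.JupyterHub.logo_file = '/srv/jupyterhub/static/my-logo.png'
```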
```diff
@@ -4,15 +4,13 @@
 # Distributed under the terms of the Modified BSD License.

 from datetime import datetime
-import errno
 import json
-import socket

 from tornado import gen
 from tornado.log import app_log
 from tornado.httpclient import HTTPRequest, AsyncHTTPClient

-from sqlalchemy.types import TypeDecorator, VARCHAR
+from sqlalchemy.types import TypeDecorator, TEXT
 from sqlalchemy import (
     inspect,
     Column, Integer, ForeignKey, Unicode, Boolean,
@@ -26,7 +24,7 @@ from sqlalchemy import create_engine

 from .utils import (
     random_port, url_path_join, wait_for_server, wait_for_http_server,
-    new_token, hash_token, compare_token, localhost,
+    new_token, hash_token, compare_token, can_connect,
 )


@@ -39,7 +37,7 @@ class JSONDict(TypeDecorator):

     """

-    impl = VARCHAR
+    impl = TEXT

     def process_bind_param(self, value, dialect):
         if value is not None:
@@ -59,26 +57,26 @@ Base.log = app_log

 class Server(Base):
     """The basic state of a server

     connection and cookie info
     """
     __tablename__ = 'servers'
     id = Column(Integer, primary_key=True)
-    proto = Column(Unicode, default='http')
-    ip = Column(Unicode, default='')
+    proto = Column(Unicode(15), default='http')
+    ip = Column(Unicode(255), default='') # could also be a DNS name
     port = Column(Integer, default=random_port)
-    base_url = Column(Unicode, default='/')
-    cookie_name = Column(Unicode, default='cookie')
+    base_url = Column(Unicode(255), default='/')
+    cookie_name = Column(Unicode(255), default='cookie')

     def __repr__(self):
         return "<Server(%s:%s)>" % (self.ip, self.port)

     @property
     def host(self):
         ip = self.ip
         if ip in {'', '0.0.0.0'}:
             # when listening on all interfaces, connect to localhost
-            ip = localhost()
+            ip = '127.0.0.1'
         return "{proto}://{ip}:{port}".format(
             proto=self.proto,
             ip=ip,
```
```diff
@@ -91,52 +89,34 @@ class Server(Base):
             host=self.host,
             uri=self.base_url,
         )

     @property
     def bind_url(self):
         """representation of URL used for binding

         Never used in APIs, only logging,
         since it can be non-connectable value, such as '', meaning all interfaces.
         """
         if self.ip in {'', '0.0.0.0'}:
-            return self.url.replace('localhost', self.ip or '*', 1)
+            return self.url.replace('127.0.0.1', self.ip or '*', 1)
         return self.url

     @gen.coroutine
     def wait_up(self, timeout=10, http=False):
         """Wait for this server to come up"""
         if http:
             yield wait_for_http_server(self.url, timeout=timeout)
         else:
-            yield wait_for_server(self.ip or localhost(), self.port, timeout=timeout)
+            yield wait_for_server(self.ip or '127.0.0.1', self.port, timeout=timeout)

     def is_up(self):
         """Is the server accepting connections?"""
-        try:
-            socket.create_connection((self.ip or localhost(), self.port))
-        except socket.error as e:
-            if e.errno == errno.ENETUNREACH:
-                try:
-                    socket.create_connection((self.ip or '127.0.0.1', self.port))
-                except socket.error as e:
-                    if e.errno == errno.ECONNREFUSED:
-                        return False
-                    else:
-                        raise
-                else:
-                    return True
-            elif e.errno == errno.ECONNREFUSED:
-                return False
-            else:
-                raise
-        else:
-            return True
+        return can_connect(self.ip or '127.0.0.1', self.port)


 class Proxy(Base):
     """A configurable-http-proxy instance.

     A proxy consists of the API server info and the public-facing server info,
     plus an auth token for configuring the proxy table.
     """
```
@@ -147,7 +127,7 @@ class Proxy(Base):
|
|||||||
public_server = relationship(Server, primaryjoin=_public_server_id == Server.id)
|
public_server = relationship(Server, primaryjoin=_public_server_id == Server.id)
|
||||||
_api_server_id = Column(Integer, ForeignKey('servers.id'))
|
_api_server_id = Column(Integer, ForeignKey('servers.id'))
|
||||||
api_server = relationship(Server, primaryjoin=_api_server_id == Server.id)
|
api_server = relationship(Server, primaryjoin=_api_server_id == Server.id)
|
||||||
|
|
||||||
def __repr__(self):
|
def __repr__(self):
|
||||||
if self.public_server:
|
if self.public_server:
|
||||||
return "<%s %s:%s>" % (
|
return "<%s %s:%s>" % (
|
||||||
@@ -155,7 +135,7 @@ class Proxy(Base):
|
|||||||
)
|
)
|
||||||
else:
|
else:
|
||||||
return "<%s [unconfigured]>" % self.__class__.__name__
|
return "<%s [unconfigured]>" % self.__class__.__name__
|
||||||
|
|
||||||
def api_request(self, path, method='GET', body=None, client=None):
|
def api_request(self, path, method='GET', body=None, client=None):
|
||||||
"""Make an authenticated API request of the proxy"""
|
"""Make an authenticated API request of the proxy"""
|
||||||
client = client or AsyncHTTPClient()
|
client = client or AsyncHTTPClient()
|
||||||
@@ -176,10 +156,14 @@ class Proxy(Base):
|
|||||||
def add_user(self, user, client=None):
|
def add_user(self, user, client=None):
|
||||||
"""Add a user's server to the proxy table."""
|
"""Add a user's server to the proxy table."""
|
||||||
self.log.info("Adding user %s to proxy %s => %s",
|
self.log.info("Adding user %s to proxy %s => %s",
|
||||||
user.name, user.server.base_url, user.server.host,
|
user.name, user.proxy_path, user.server.host,
|
||||||
)
|
)
|
||||||
|
|
||||||
yield self.api_request(user.server.base_url,
|
if user.spawn_pending:
|
||||||
|
raise RuntimeError(
|
||||||
|
"User %s's spawn is pending, shouldn't be added to the proxy yet!", user.name)
|
||||||
|
|
||||||
|
yield self.api_request(user.proxy_path,
|
||||||
method='POST',
|
method='POST',
|
||||||
body=dict(
|
body=dict(
|
||||||
target=user.server.host,
|
target=user.server.host,
|
||||||
@@ -187,30 +171,15 @@ class Proxy(Base):
|
|||||||
),
|
),
|
||||||
client=client,
|
client=client,
|
||||||
)
|
)
|
||||||
|
|
||||||
@gen.coroutine
|
@gen.coroutine
|
||||||
def delete_user(self, user, client=None):
|
def delete_user(self, user, client=None):
|
||||||
"""Remove a user's server to the proxy table."""
|
"""Remove a user's server to the proxy table."""
|
||||||
self.log.info("Removing user %s from proxy", user.name)
|
self.log.info("Removing user %s from proxy", user.name)
|
||||||
yield self.api_request(user.server.base_url,
|
yield self.api_request(user.proxy_path,
|
||||||
method='DELETE',
|
method='DELETE',
|
||||||
client=client,
|
client=client,
|
||||||
)
|
)
|
||||||
|
|
||||||
@gen.coroutine
|
|
||||||
def add_all_users(self):
|
|
||||||
"""Update the proxy table from the database.
|
|
||||||
|
|
||||||
Used when loading up a new proxy.
|
|
||||||
"""
|
|
||||||
db = inspect(self).session
|
|
||||||
futures = []
|
|
||||||
for user in db.query(User):
|
|
||||||
if (user.server):
|
|
||||||
futures.append(self.add_user(user))
|
|
||||||
# wait after submitting them all
|
|
||||||
for f in futures:
|
|
||||||
yield f
|
|
||||||
|
|
||||||
@gen.coroutine
|
@gen.coroutine
|
||||||
def get_routes(self, client=None):
|
def get_routes(self, client=None):
|
||||||
@@ -219,17 +188,42 @@ class Proxy(Base):
|
|||||||
return json.loads(resp.body.decode('utf8', 'replace'))
|
return json.loads(resp.body.decode('utf8', 'replace'))
|
||||||
|
|
||||||
@gen.coroutine
|
@gen.coroutine
|
||||||
def check_routes(self, routes=None):
|
def add_all_users(self, user_dict):
|
||||||
"""Check that all users are properly"""
|
"""Update the proxy table from the database.
|
||||||
|
|
||||||
|
Used when loading up a new proxy.
|
||||||
|
"""
|
||||||
|
db = inspect(self).session
|
||||||
|
futures = []
|
||||||
|
for orm_user in db.query(User):
|
||||||
|
user = user_dict[orm_user]
|
||||||
|
if user.running:
|
||||||
|
futures.append(self.add_user(user))
|
||||||
|
# wait after submitting them all
|
||||||
|
for f in futures:
|
||||||
|
yield f
|
||||||
|
|
||||||
|
@gen.coroutine
|
||||||
|
def check_routes(self, user_dict, routes=None):
|
||||||
|
"""Check that all users are properly routed on the proxy"""
|
||||||
if not routes:
|
if not routes:
|
||||||
routes = yield self.get_routes()
|
routes = yield self.get_routes()
|
||||||
|
|
||||||
have_routes = { r['user'] for r in routes.values() if 'user' in r }
|
have_routes = { r['user'] for r in routes.values() if 'user' in r }
|
||||||
futures = []
|
futures = []
|
||||||
db = inspect(self).session
|
db = inspect(self).session
|
||||||
for user in db.query(User).filter(User.server != None):
|
for orm_user in db.query(User).filter(User.server != None):
|
||||||
|
user = user_dict[orm_user]
|
||||||
|
if not user.running:
|
||||||
|
# Don't add users to the proxy that haven't finished starting
|
||||||
|
continue
|
||||||
|
if user.server is None:
|
||||||
|
# This should never be True, but seems to be on rare occasion.
|
||||||
|
# catch filter bug, either in sqlalchemy or my understanding of its behavior
|
||||||
|
self.log.error("User %s has no server, but wasn't filtered out.", user)
|
||||||
|
continue
|
||||||
if user.name not in have_routes:
|
if user.name not in have_routes:
|
||||||
self.log.warn("Adding missing route for %s", user.name)
|
self.log.warning("Adding missing route for %s (%s)", user.name, user.server)
|
||||||
futures.append(self.add_user(user))
|
futures.append(self.add_user(user))
|
||||||
for f in futures:
|
for f in futures:
|
||||||
yield f
|
yield f
|
||||||
@@ -238,9 +232,9 @@ class Proxy(Base):
|
|||||||
|
|
||||||
class Hub(Base):
|
class Hub(Base):
|
||||||
"""Bring it all together at the hub.
|
"""Bring it all together at the hub.
|
||||||
|
|
||||||
The Hub is a server, plus its API path suffix
|
The Hub is a server, plus its API path suffix
|
||||||
|
|
||||||
the api_url is the full URL plus the api_path suffix on the end
|
the api_url is the full URL plus the api_path suffix on the end
|
||||||
of the server base_url.
|
of the server base_url.
|
||||||
"""
|
"""
|
||||||
@@ -248,12 +242,13 @@ class Hub(Base):
|
|||||||
id = Column(Integer, primary_key=True)
|
id = Column(Integer, primary_key=True)
|
||||||
_server_id = Column(Integer, ForeignKey('servers.id'))
|
_server_id = Column(Integer, ForeignKey('servers.id'))
|
||||||
server = relationship(Server, primaryjoin=_server_id == Server.id)
|
server = relationship(Server, primaryjoin=_server_id == Server.id)
|
||||||
|
host = ''
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def api_url(self):
|
def api_url(self):
|
||||||
"""return the full API url (with proto://host...)"""
|
"""return the full API url (with proto://host...)"""
|
||||||
         return url_path_join(self.server.url, 'api')

     def __repr__(self):
         if self.server:
             return "<%s %s:%s>" % (
@@ -265,31 +260,31 @@ class Hub(Base):

 class User(Base):
     """The User table

     Each user has a single server,
     and multiple tokens used for authorization.

     API tokens grant access to the Hub's REST API.
     These are used by single-user servers to authenticate requests,
     and external services to manipulate the Hub.

     Cookies are set with a single ID.
     Resetting the Cookie ID invalidates all cookies, forcing user to login again.

     A `state` column contains a JSON dict,
     used for restoring state of a Spawner.
     """
     __tablename__ = 'users'
     id = Column(Integer, primary_key=True, autoincrement=True)
-    name = Column(Unicode)
+    name = Column(Unicode(1023))
     # should we allow multiple servers per user?
     _server_id = Column(Integer, ForeignKey('servers.id'))
     server = relationship(Server, primaryjoin=_server_id == Server.id)
     admin = Column(Boolean, default=False)
     last_activity = Column(DateTime, default=datetime.utcnow)

     api_tokens = relationship("APIToken", backref="user")
-    cookie_id = Column(Unicode, default=new_token)
+    cookie_id = Column(Unicode(1023), default=new_token)
     state = Column(JSONDict)

     other_user_cookies = set([])
@@ -307,7 +302,7 @@ class User(Base):
             cls=self.__class__.__name__,
             name=self.name,
         )

     def new_api_token(self):
         """Create a new API token"""
         assert self.id is not None
@@ -330,29 +325,29 @@ class User(Base):
 class APIToken(Base):
     """An API token"""
     __tablename__ = 'api_tokens'

     @declared_attr
     def user_id(cls):
         return Column(Integer, ForeignKey('users.id'))

     id = Column(Integer, primary_key=True)
-    hashed = Column(Unicode)
-    prefix = Column(Unicode)
+    hashed = Column(Unicode(1023))
+    prefix = Column(Unicode(1023))
     prefix_length = 4
     algorithm = "sha512"
     rounds = 16384
     salt_bytes = 8

     @property
     def token(self):
         raise AttributeError("token is write-only")

     @token.setter
     def token(self, token):
         """Store the hashed value and prefix for a token"""
         self.prefix = token[:self.prefix_length]
         self.hashed = hash_token(token, rounds=self.rounds, salt=self.salt_bytes, algorithm=self.algorithm)

     def __repr__(self):
         return "<{cls}('{pre}...', user='{u}')>".format(
             cls=self.__class__.__name__,
@@ -373,7 +368,7 @@ class APIToken(Base):
         for orm_token in prefix_match:
             if orm_token.match(token):
                 return orm_token

     def match(self, token):
         """Is this my token?"""
         return compare_token(self.hashed, token)

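The ORM hunks above swap unbounded `Unicode` columns for `Unicode(1023)`. An explicit length lets SQLAlchemy emit `VARCHAR(1023)` on dialects such as MySQL that refuse a `VARCHAR` with no length; SQLite and PostgreSQL accept either form. A minimal sketch of the difference (not part of the changeset; the model name is made up):

```python
# Illustrative only: Unicode() renders as VARCHAR with no length, which some
# dialects (notably MySQL) reject; Unicode(1023) becomes VARCHAR(1023) everywhere.
from sqlalchemy import Column, Integer, Unicode, create_engine
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class ExampleUser(Base):
    __tablename__ = 'example_users'
    id = Column(Integer, primary_key=True)
    name = Column(Unicode(1023))  # explicit length, as in the new schema

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)  # emits CREATE TABLE ... name VARCHAR(1023)
```
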
@@ -11,7 +11,7 @@ import signal
 import sys
 import grp
 from subprocess import Popen
-from tempfile import TemporaryDirectory
+from tempfile import mkdtemp

 from tornado import gen
 from tornado.ioloop import IOLoop, PeriodicCallback
@@ -22,7 +22,7 @@ from traitlets import (
 )

 from .traitlets import Command
-from .utils import random_port, localhost
+from .utils import random_port

 class Spawner(LoggingConfigurable):
     """Base class for spawning single-user notebook servers.
@@ -41,7 +41,7 @@ class Spawner(LoggingConfigurable):
     hub = Any()
     authenticator = Any()
     api_token = Unicode()
-    ip = Unicode(localhost(), config=True,
+    ip = Unicode('127.0.0.1', config=True,
         help="The IP address (or hostname) the single-user server should listen on"
     )
     start_timeout = Integer(60, config=True,
@@ -139,6 +139,25 @@ class Spawner(LoggingConfigurable):
         `%U` will be expanded to the user's username
         """
     )

+    default_url = Unicode('', config=True,
+        help="""The default URL for the single-user server.
+
+        Can be used in conjunction with --notebook-dir=/ to enable
+        full filesystem traversal, while preserving user's homedir as
+        landing page for notebook
+
+        `%U` will be expanded to the user's username
+        """
+    )
+
+    disable_user_config = Bool(False, config=True,
+        help="""Disable per-user configuration of single-user servers.
+
+        This prevents any config in users' $HOME directories
+        from having an effect on their server.
+        """
+    )
+
     def __init__(self, **kwargs):
         super(Spawner, self).__init__(**kwargs)
@@ -199,6 +218,7 @@ class Spawner(LoggingConfigurable):
             '--port=%i' % self.user.server.port,
             '--cookie-name=%s' % self.user.server.cookie_name,
             '--base-url=%s' % self.user.server.base_url,
+            '--hub-host=%s' % self.hub.host,
             '--hub-prefix=%s' % self.hub.server.base_url,
             '--hub-api-url=%s' % self.hub.api_url,
         ]
@@ -207,8 +227,14 @@ class Spawner(LoggingConfigurable):
         if self.notebook_dir:
             self.notebook_dir = self.notebook_dir.replace("%U",self.user.name)
             args.append('--notebook-dir=%s' % self.notebook_dir)
+        if self.default_url:
+            self.default_url = self.default_url.replace("%U",self.user.name)
+            args.append('--NotebookApp.default_url=%s' % self.default_url)
+
         if self.debug:
             args.append('--debug')
+        if self.disable_user_config:
+            args.append('--disable-user-config')
         args.extend(self.args)
         return args

@@ -302,12 +328,13 @@ def _try_setcwd(path):
     try:
         os.chdir(path)
     except OSError as e:
+        exc = e # break exception instance out of except scope
         print("Couldn't set CWD to %s (%s)" % (path, e), file=sys.stderr)
         path, _ = os.path.split(path)
     else:
         return
-    print("Couldn't set CWD at all (%s), using temp dir" % e, file=sys.stderr)
-    td = TemporaryDirectory().name
+    print("Couldn't set CWD at all (%s), using temp dir" % exc, file=sys.stderr)
+    td = mkdtemp()
     os.chdir(td)

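The Spawner gains two configurable traits, `default_url` and `disable_user_config`, both passed through to the single-user server on its command line in the argument-building hunks above, with `%U` expanded to the username. A sketch of how they might be set in a `jupyterhub_config.py`, assuming the usual traitlets `c.Class.trait` syntax; the values here are illustrative only:

```python
# jupyterhub_config.py -- minimal sketch of the new Spawner options.
c = get_config()  # provided by traitlets when this file is loaded

# Serve the whole filesystem but land users in their home directory;
# %U is replaced with the user's username when the args are built.
c.Spawner.notebook_dir = '/'
c.Spawner.default_url = '/tree/home/%U'

# Ignore any notebook configuration in users' $HOME directories.
c.Spawner.disable_user_config = True
```
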
@@ -1,7 +1,7 @@
 """mock utilities for testing"""

+import os
 import sys
-from datetime import timedelta
 from tempfile import NamedTemporaryFile
 import threading

@@ -13,11 +13,11 @@ from tornado import gen
 from tornado.concurrent import Future
 from tornado.ioloop import IOLoop

-from ..spawner import LocalProcessSpawner
 from ..app import JupyterHub
 from ..auth import PAMAuthenticator
 from .. import orm
-from ..utils import localhost
+from ..spawner import LocalProcessSpawner
+from ..utils import url_path_join

 from pamela import PAMError

@@ -109,9 +109,15 @@ class MockHub(JupyterHub):
     """Hub with various mock bits"""

     db_file = None
+    confirm_no_ssl = True
+
+    last_activity_interval = 2
+
+    def _subdomain_host_default(self):
+        return os.environ.get('JUPYTERHUB_TEST_SUBDOMAIN_HOST', '')

     def _ip_default(self):
-        return localhost()
+        return '127.0.0.1'

     def _authenticator_class_default(self):
         return MockPAMAuthenticator
@@ -124,7 +130,8 @@ class MockHub(JupyterHub):

     def start(self, argv=None):
         self.db_file = NamedTemporaryFile()
-        self.db_url = 'sqlite:///' + self.db_file.name
+        self.pid_file = NamedTemporaryFile(delete=False).name
+        self.db_url = self.db_file.name

         evt = threading.Event()

@@ -161,13 +168,33 @@ class MockHub(JupyterHub):
         self.db_file.close()

     def login_user(self, name):
-        r = requests.post(self.proxy.public_server.url + 'hub/login',
+        base_url = public_url(self)
+        r = requests.post(base_url + 'hub/login',
             data={
                 'username': name,
                 'password': name,
             },
             allow_redirects=False,
         )
+        r.raise_for_status()
         assert r.cookies
         return r.cookies


+def public_host(app):
+    if app.subdomain_host:
+        return app.subdomain_host
+    else:
+        return app.proxy.public_server.host
+
+
+def public_url(app):
+    return public_host(app) + app.proxy.public_server.base_url
+
+
+def user_url(user, app):
+    if app.subdomain_host:
+        host = user.host
+    else:
+        host = public_host(app)
+    return host + user.server.base_url

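`MockHub` now reads `JUPYTERHUB_TEST_SUBDOMAIN_HOST` to decide whether the test suite exercises subdomain routing, and the new `public_host`/`public_url`/`user_url` helpers pick the matching base URL. A hedged sketch of opting in when running the tests from Python (the test path and host value are assumptions for illustration):

```python
# Sketch: enable subdomain routing for a test run, as MockHub reads it above.
import os
import pytest  # assumes pytest is installed

os.environ['JUPYTERHUB_TEST_SUBDOMAIN_HOST'] = 'http://127.0.0.1.xip.io:8000'
pytest.main(['-v', 'jupyterhub/tests'])  # path assumed; adjust to your checkout
```
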
@@ -2,9 +2,8 @@

 import json
 import time
-from datetime import timedelta
 from queue import Queue
-from urllib.parse import urlparse
+from urllib.parse import urlparse, quote

 import requests

@@ -14,6 +13,7 @@ from .. import orm
 from ..user import User
 from ..utils import url_path_join as ujoin
 from . import mocking
+from .mocking import public_url, user_url


 def check_db_locks(func):
@@ -105,7 +105,7 @@ def test_auth_api(app):


 def test_referer_check(app, io_loop):
-    url = app.hub.server.url
+    url = ujoin(public_url(app), app.hub.server.base_url)
     host = urlparse(url).netloc
     user = find_user(app.db, 'admin')
     if user is None:
@@ -352,15 +352,19 @@ def test_spawn(app, io_loop):
     assert status is None

     assert user.server.base_url == '/user/%s' % name
-    r = requests.get(ujoin(app.proxy.public_server.url, user.server.base_url))
+    url = user_url(user, app)
+    print(url)
+    r = requests.get(url)
     assert r.status_code == 200
     assert r.text == user.server.base_url

-    r = requests.get(ujoin(app.proxy.public_server.url, user.server.base_url, 'args'))
+    r = requests.get(ujoin(url, 'args'))
     assert r.status_code == 200
     argv = r.json()
     for expected in ['--user=%s' % name, '--base-url=%s' % user.server.base_url]:
         assert expected in argv
+    if app.subdomain_host:
+        assert '--hub-host=%s' % app.subdomain_host in argv

     r = api_request(app, 'users', name, 'server', method='delete')
     assert r.status_code == 204
@@ -379,6 +383,7 @@ def test_slow_spawn(app, io_loop):
     name = 'zoe'
     user = add_user(db, app=app, name=name)
     r = api_request(app, 'users', name, 'server', method='post')
+    app.tornado_settings['spawner_class'] = mocking.MockSpawner
     r.raise_for_status()
     assert r.status_code == 202
     app_user = get_app_user(app, name)
@@ -428,6 +433,7 @@ def test_never_spawn(app, io_loop):
     name = 'badger'
     user = add_user(db, app=app, name=name)
     r = api_request(app, 'users', name, 'server', method='post')
+    app.tornado_settings['spawner_class'] = mocking.MockSpawner
     app_user = get_app_user(app, name)
     assert app_user.spawner is not None
     assert app_user.spawn_pending
@@ -450,6 +456,55 @@ def test_get_proxy(app, io_loop):
     assert list(reply.keys()) == ['/']


+def test_cookie(app):
+    db = app.db
+    name = 'patience'
+    user = add_user(db, app=app, name=name)
+    r = api_request(app, 'users', name, 'server', method='post')
+    assert r.status_code == 201
+    assert 'pid' in user.state
+    app_user = get_app_user(app, name)
+
+    cookies = app.login_user(name)
+    # cookie jar gives '"cookie-value"', we want 'cookie-value'
+    cookie = cookies[user.server.cookie_name][1:-1]
+    r = api_request(app, 'authorizations/cookie', user.server.cookie_name, "nothintoseehere")
+    assert r.status_code == 404
+
+    r = api_request(app, 'authorizations/cookie', user.server.cookie_name, quote(cookie, safe=''))
+    r.raise_for_status()
+    reply = r.json()
+    assert reply['name'] == name
+
+    # deprecated cookie in body:
+    r = api_request(app, 'authorizations/cookie', user.server.cookie_name, data=cookie)
+    r.raise_for_status()
+    reply = r.json()
+    assert reply['name'] == name
+
+def test_token(app):
+    name = 'book'
+    user = add_user(app.db, app=app, name=name)
+    token = user.new_api_token()
+    r = api_request(app, 'authorizations/token', token)
+    r.raise_for_status()
+    user_model = r.json()
+    assert user_model['name'] == name
+    r = api_request(app, 'authorizations/token', 'notauthorized')
+    assert r.status_code == 404
+
+
+def test_options(app):
+    r = api_request(app, 'users', method='options')
+    r.raise_for_status()
+    assert 'Access-Control-Allow-Headers' in r.headers
+
+
+def test_bad_json_body(app):
+    r = api_request(app, 'users', method='post', data='notjson')
+    assert r.status_code == 400
+
+
 def test_shutdown(app):
     r = api_request(app, 'shutdown', method='post', data=json.dumps({
         'servers': True,

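The new `test_cookie` percent-encodes the cookie value with `quote(cookie, safe='')` before placing it in the request path, since raw cookie values can contain URL-reserved characters. A quick illustration with a made-up value:

```python
# Why the cookie value is percent-encoded before going into the URL path.
from urllib.parse import quote, unquote

cookie = 'abc+def/ghi=='          # made-up value with URL-unsafe characters
escaped = quote(cookie, safe='')  # safe='' escapes '/' as well
print(escaped)                    # abc%2Bdef%2Fghi%3D%3D
assert unquote(escaped) == cookie
```
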
@@ -3,8 +3,9 @@
 import os
 import re
 import sys
-from subprocess import check_output
+from subprocess import check_output, Popen, PIPE
 from tempfile import NamedTemporaryFile, TemporaryDirectory
+from .mocking import MockHub

 def test_help_all():
     out = check_output([sys.executable, '-m', 'jupyterhub', '--help-all']).decode('utf8', 'replace')
@@ -23,10 +24,23 @@ def test_token_app():
 def test_generate_config():
     with NamedTemporaryFile(prefix='jupyterhub_config', suffix='.py') as tf:
         cfg_file = tf.name
-    out = check_output([sys.executable, '-m', 'jupyterhub',
-        '--generate-config', '-f', cfg_file]
-    ).decode('utf8', 'replace')
+    with open(cfg_file, 'w') as f:
+        f.write("c.A = 5")
+    p = Popen([sys.executable, '-m', 'jupyterhub',
+        '--generate-config', '-f', cfg_file],
+        stdout=PIPE, stdin=PIPE)
+    out, _ = p.communicate(b'n')
+    out = out.decode('utf8', 'replace')
+    assert os.path.exists(cfg_file)
+    with open(cfg_file) as f:
+        cfg_text = f.read()
+    assert cfg_text == 'c.A = 5'
+
+    p = Popen([sys.executable, '-m', 'jupyterhub',
+        '--generate-config', '-f', cfg_file],
+        stdout=PIPE, stdin=PIPE)
+    out, _ = p.communicate(b'x\ny')
+    out = out.decode('utf8', 'replace')
     assert os.path.exists(cfg_file)
     with open(cfg_file) as f:
         cfg_text = f.read()

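The rewritten `test_generate_config` drives `--generate-config` through stdin, which suggests the command now prompts before overwriting an existing config file (`n` keeps it, `y` overwrites); that behaviour is inferred from the test rather than shown directly in these hunks. A sketch of invoking it the same way:

```python
# Sketch of driving `jupyterhub --generate-config` non-interactively, mirroring
# the test above. The overwrite prompt is inferred from the test, not verified.
import sys
from subprocess import Popen, PIPE

p = Popen([sys.executable, '-m', 'jupyterhub',
           '--generate-config', '-f', 'jupyterhub_config.py'],
          stdout=PIPE, stdin=PIPE)
out, _ = p.communicate(b'y\n')  # answer the overwrite prompt, if one appears
print(out.decode('utf8', 'replace'))
```
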
@@ -20,7 +20,7 @@ def test_server(db):
     assert server.proto == 'http'
     assert isinstance(server.port, int)
     assert isinstance(server.cookie_name, str)
-    assert server.host == 'http://localhost:%i' % server.port
+    assert server.host == 'http://127.0.0.1:%i' % server.port
     assert server.url == server.host + '/'
     assert server.bind_url == 'http://*:%i/' % server.port
     server.ip = '127.0.0.1'

@@ -1,6 +1,6 @@
 """Tests for HTML pages"""

-from urllib.parse import urlparse
+from urllib.parse import urlencode, urlparse

 import requests

@@ -8,12 +8,11 @@ from ..utils import url_path_join as ujoin
 from .. import orm

 import mock
-from .mocking import FormSpawner
+from .mocking import FormSpawner, public_url, public_host, user_url
 from .test_api import api_request


 def get_page(path, app, **kw):
-    base_url = ujoin(app.proxy.public_server.host, app.hub.server.base_url)
+    base_url = ujoin(public_url(app), app.hub.server.base_url)
     print(base_url)
     return requests.get(ujoin(base_url, path), **kw)

@@ -22,15 +21,17 @@ def test_root_no_auth(app, io_loop):
     routes = io_loop.run_sync(app.proxy.get_routes)
     print(routes)
     print(app.hub.server)
-    r = requests.get(app.proxy.public_server.host)
+    url = public_url(app)
+    print(url)
+    r = requests.get(url)
     r.raise_for_status()
-    assert r.url == ujoin(app.proxy.public_server.host, app.hub.server.base_url, 'login')
+    assert r.url == ujoin(url, app.hub.server.base_url, 'login')

 def test_root_auth(app):
     cookies = app.login_user('river')
-    r = requests.get(app.proxy.public_server.host, cookies=cookies)
+    r = requests.get(public_url(app), cookies=cookies)
     r.raise_for_status()
-    assert r.url == ujoin(app.proxy.public_server.host, '/user/river')
+    assert r.url == user_url(app.users['river'], app)

 def test_home_no_auth(app):
     r = get_page('home', app, allow_redirects=False)
@@ -62,6 +63,7 @@ def test_admin(app):
     r.raise_for_status()
     assert r.url.endswith('/admin')

+
 def test_spawn_redirect(app, io_loop):
     name = 'wash'
     cookies = app.login_user(name)
@@ -100,7 +102,7 @@ def test_spawn_page(app):

 def test_spawn_form(app, io_loop):
     with mock.patch.dict(app.users.settings, {'spawner_class': FormSpawner}):
-        base_url = ujoin(app.proxy.public_server.host, app.hub.server.base_url)
+        base_url = ujoin(public_url(app), app.hub.server.base_url)
         cookies = app.login_user('jones')
         orm_u = orm.User.find(app.db, 'jones')
         u = app.users[orm_u]
@@ -121,7 +123,7 @@ def test_spawn_form(app, io_loop):

 def test_spawn_form_with_file(app, io_loop):
     with mock.patch.dict(app.users.settings, {'spawner_class': FormSpawner}):
-        base_url = ujoin(app.proxy.public_server.host, app.hub.server.base_url)
+        base_url = ujoin(public_url(app), app.hub.server.base_url)
         cookies = app.login_user('jones')
         orm_u = orm.User.find(app.db, 'jones')
         u = app.users[orm_u]
@@ -147,3 +149,88 @@ def test_spawn_form_with_file(app, io_loop):
                           'content_type': 'application/unknown'},
         }


+def test_user_redirect(app):
+    name = 'wash'
+    cookies = app.login_user(name)
+
+    r = get_page('/user/baduser', app, cookies=cookies)
+    r.raise_for_status()
+    print(urlparse(r.url))
+    path = urlparse(r.url).path
+    assert path == '/user/%s' % name
+
+    r = get_page('/user/baduser/test.ipynb', app, cookies=cookies)
+    r.raise_for_status()
+    print(urlparse(r.url))
+    path = urlparse(r.url).path
+    assert path == '/user/%s/test.ipynb' % name
+
+    r = get_page('/user/baduser/test.ipynb', app)
+    r.raise_for_status()
+    print(urlparse(r.url))
+    path = urlparse(r.url).path
+    assert path == '/hub/login'
+    query = urlparse(r.url).query
+    assert query == urlencode({'next': '/hub/user/baduser/test.ipynb'})
+
+
+def test_login_fail(app):
+    name = 'wash'
+    base_url = public_url(app)
+    r = requests.post(base_url + 'hub/login',
+        data={
+            'username': name,
+            'password': 'wrong',
+        },
+        allow_redirects=False,
+    )
+    assert not r.cookies
+
+
+def test_login_redirect(app, io_loop):
+    cookies = app.login_user('river')
+    user = app.users['river']
+    # no next_url, server running
+    io_loop.run_sync(user.spawn)
+    r = get_page('login', app, cookies=cookies, allow_redirects=False)
+    r.raise_for_status()
+    assert r.status_code == 302
+    assert '/user/river' in r.headers['Location']
+
+    # no next_url, server not running
+    io_loop.run_sync(user.stop)
+    r = get_page('login', app, cookies=cookies, allow_redirects=False)
+    r.raise_for_status()
+    assert r.status_code == 302
+    assert '/hub/' in r.headers['Location']
+
+    # next URL given, use it
+    r = get_page('login?next=/hub/admin', app, cookies=cookies, allow_redirects=False)
+    r.raise_for_status()
+    assert r.status_code == 302
+    assert r.headers['Location'].endswith('/hub/admin')
+
+
+def test_logout(app):
+    name = 'wash'
+    cookies = app.login_user(name)
+    r = requests.get(public_host(app) + app.tornado_settings['logout_url'], cookies=cookies)
+    r.raise_for_status()
+    login_url = public_host(app) + app.tornado_settings['login_url']
+    assert r.url == login_url
+    assert r.cookies == {}
+
+
+def test_static_files(app):
+    base_url = ujoin(public_url(app), app.hub.server.base_url)
+    print(base_url)
+    r = requests.get(ujoin(base_url, 'logo'))
+    r.raise_for_status()
+    assert r.headers['content-type'] == 'image/png'
+    r = requests.get(ujoin(base_url, 'static', 'images', 'jupyter.png'))
+    r.raise_for_status()
+    assert r.headers['content-type'] == 'image/png'
+    r = requests.get(ujoin(base_url, 'static', 'css', 'style.min.css'))
+    r.raise_for_status()
+    assert r.headers['content-type'] == 'text/css'

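`test_user_redirect` expects an unauthenticated request to land on `/hub/login` with the originally requested path preserved in a `next` query parameter. A small sketch of that URL shape, using the same stdlib helpers the test imports:

```python
# The login-redirect URL shape test_user_redirect checks:
# the requested path comes back as a `next` query parameter.
from urllib.parse import urlencode, urlparse, parse_qs

login_url = '/hub/login?' + urlencode({'next': '/hub/user/baduser/test.ipynb'})
print(login_url)  # /hub/login?next=%2Fhub%2Fuser%2Fbaduser%2Ftest.ipynb

parsed = urlparse(login_url)
assert parsed.path == '/hub/login'
assert parse_qs(parsed.query)['next'] == ['/hub/user/baduser/test.ipynb']
```
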
@@ -4,6 +4,7 @@ import json
 import os
 from queue import Queue
 from subprocess import Popen
+from urllib.parse import urlparse

 from .. import orm
 from .mocking import MockHub
@@ -34,6 +35,8 @@ def test_external_proxy(request, io_loop):
         '--api-port', str(proxy_port),
         '--default-target', 'http://%s:%i' % (app.hub_ip, app.hub_port),
     ]
+    if app.subdomain_host:
+        cmd.append('--host-routing')
     proxy = Popen(cmd, env=env)
     def _cleanup_proxy():
         if proxy.poll() is None:
@@ -60,7 +63,11 @@ def test_external_proxy(request, io_loop):
     r.raise_for_status()

     routes = io_loop.run_sync(app.proxy.get_routes)
-    assert sorted(routes.keys()) == ['/', '/user/river']
+    user_path = '/user/river'
+    if app.subdomain_host:
+        domain = urlparse(app.subdomain_host).hostname
+        user_path = '/%s.%s' % (name, domain) + user_path
+    assert sorted(routes.keys()) == ['/', user_path]

     # teardown the proxy and start a new one in the same place
     proxy.terminate()
@@ -76,7 +83,7 @@ def test_external_proxy(request, io_loop):

     # check that the routes are correct
     routes = io_loop.run_sync(app.proxy.get_routes)
-    assert sorted(routes.keys()) == ['/', '/user/river']
+    assert sorted(routes.keys()) == ['/', user_path]

     # teardown the proxy again, and start a new one with different auth and port
     proxy.terminate()
@@ -90,13 +97,16 @@ def test_external_proxy(request, io_loop):
         '--api-port', str(proxy_port),
         '--default-target', 'http://%s:%i' % (app.hub_ip, app.hub_port),
     ]
+    if app.subdomain_host:
+        cmd.append('--host-routing')
     proxy = Popen(cmd, env=env)
     wait_for_proxy()

     # tell the hub where the new proxy is
     r = api_request(app, 'proxy', method='patch', data=json.dumps({
         'port': proxy_port,
+        'protocol': 'http',
+        'ip': app.ip,
         'auth_token': new_auth_token,
     }))
     r.raise_for_status()
@@ -113,7 +123,8 @@ def test_external_proxy(request, io_loop):

     # check that the routes are correct
     routes = io_loop.run_sync(app.proxy.get_routes)
-    assert sorted(routes.keys()) == ['/', '/user/river']
+    assert sorted(routes.keys()) == ['/', user_path]


 def test_check_routes(app, io_loop):
     proxy = app.proxy
@@ -123,13 +134,24 @@ def test_check_routes(app, io_loop):
     r.raise_for_status()
     zoe = orm.User.find(app.db, 'zoe')
     assert zoe is not None
+    zoe = app.users[zoe]
     before = sorted(io_loop.run_sync(app.proxy.get_routes))
-    assert '/user/zoe' in before
-    io_loop.run_sync(app.proxy.check_routes)
+    assert zoe.proxy_path in before
+    io_loop.run_sync(lambda : app.proxy.check_routes(app.users))
     io_loop.run_sync(lambda : proxy.delete_user(zoe))
     during = sorted(io_loop.run_sync(app.proxy.get_routes))
-    assert '/user/zoe' not in during
-    io_loop.run_sync(app.proxy.check_routes)
+    assert zoe.proxy_path not in during
+    io_loop.run_sync(lambda : app.proxy.check_routes(app.users))
     after = sorted(io_loop.run_sync(app.proxy.get_routes))
-    assert '/user/zoe' in after
+    assert zoe.proxy_path in after
     assert before == after


+def test_patch_proxy_bad_req(app):
+    r = api_request(app, 'proxy', method='patch')
+    assert r.status_code == 400
+    r = api_request(app, 'proxy', method='patch', data='notjson')
+    assert r.status_code == 400
+    r = api_request(app, 'proxy', method='patch', data=json.dumps([]))
+    assert r.status_code == 400

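With `--host-routing` passed to the proxy, route keys include the per-user host as well as the path, which is why the test builds `user_path` from the subdomain host. The same computation on example values:

```python
# The route key the test expects under host-based routing (example values).
from urllib.parse import urlparse

subdomain_host = 'http://127.0.0.1.xip.io:8000'  # example value
name = 'river'
user_path = '/user/river'

domain = urlparse(subdomain_host).hostname
user_path = '/%s.%s' % (name, domain) + user_path
print(user_path)  # /river.127.0.0.1.xip.io/user/river
```
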
@@ -4,9 +4,14 @@
 # Distributed under the terms of the Modified BSD License.

 import logging
+import os
 import signal
 import sys
+import tempfile
 import time
+from unittest import mock
+
+from tornado import gen

 from .. import spawner as spawnermod
 from ..spawner import LocalProcessSpawner
@@ -39,13 +44,14 @@ def new_spawner(db, **kwargs):
     kwargs.setdefault('INTERRUPT_TIMEOUT', 1)
     kwargs.setdefault('TERM_TIMEOUT', 1)
     kwargs.setdefault('KILL_TIMEOUT', 1)
+    kwargs.setdefault('poll_interval', 1)
     return LocalProcessSpawner(db=db, **kwargs)


 def test_spawner(db, io_loop):
     spawner = new_spawner(db)
     io_loop.run_sync(spawner.start)
-    assert spawner.user.server.ip == 'localhost'
+    assert spawner.user.server.ip == '127.0.0.1'

     # wait for the process to get to the while True: loop
     time.sleep(1)
@@ -59,7 +65,7 @@ def test_spawner(db, io_loop):
 def test_single_user_spawner(db, io_loop):
     spawner = new_spawner(db, cmd=['jupyterhub-singleuser'])
     io_loop.run_sync(spawner.start)
-    assert spawner.user.server.ip == 'localhost'
+    assert spawner.user.server.ip == '127.0.0.1'
     # wait for http server to come up,
     # checking for early termination every 1s
     def wait():
@@ -110,3 +116,53 @@ def test_stop_spawner_stop_now(db, io_loop):
     status = io_loop.run_sync(spawner.poll)
     assert status == -signal.SIGTERM

+def test_spawner_poll(db, io_loop):
+    first_spawner = new_spawner(db)
+    user = first_spawner.user
+    io_loop.run_sync(first_spawner.start)
+    proc = first_spawner.proc
+    status = io_loop.run_sync(first_spawner.poll)
+    assert status is None
+    user.state = first_spawner.get_state()
+    assert 'pid' in user.state
+
+    # create a new Spawner, loading from state of previous
+    spawner = new_spawner(db, user=first_spawner.user)
+    spawner.start_polling()
+
+    # wait for the process to get to the while True: loop
+    io_loop.run_sync(lambda : gen.sleep(1))
+    status = io_loop.run_sync(spawner.poll)
+    assert status is None
+
+    # kill the process
+    proc.terminate()
+    for i in range(10):
+        if proc.poll() is None:
+            time.sleep(1)
+        else:
+            break
+    assert proc.poll() is not None
+
+    io_loop.run_sync(lambda : gen.sleep(2))
+    status = io_loop.run_sync(spawner.poll)
+    assert status is not None
+
+def test_setcwd():
+    cwd = os.getcwd()
+    with tempfile.TemporaryDirectory() as td:
+        td = os.path.realpath(os.path.abspath(td))
+        spawnermod._try_setcwd(td)
+        assert os.path.samefile(os.getcwd(), td)
+    os.chdir(cwd)
+    chdir = os.chdir
+    temp_root = os.path.realpath(os.path.abspath(tempfile.gettempdir()))
+    def raiser(path):
+        path = os.path.realpath(os.path.abspath(path))
+        if not path.startswith(temp_root):
+            raise OSError(path)
+        chdir(path)
+    with mock.patch('os.chdir', raiser):
+        spawnermod._try_setcwd(cwd)
+        assert os.getcwd().startswith(temp_root)
+    os.chdir(cwd)

@@ -2,7 +2,7 @@
 # Distributed under the terms of the Modified BSD License.

 from datetime import datetime, timedelta
-from urllib.parse import quote
+from urllib.parse import quote, urlparse

 from tornado import gen
 from tornado.log import app_log
@@ -38,6 +38,12 @@ class UserDict(dict):
     def __getitem__(self, key):
         if isinstance(key, User):
             key = key.id
+        elif isinstance(key, str):
+            orm_user = self.db.query(orm.User).filter(orm.User.name==key).first()
+            if orm_user is None:
+                raise KeyError("No such user: %s" % name)
+            else:
+                key = orm_user
         if isinstance(key, orm.User):
             # users[orm_user] returns User(orm_user)
             orm_user = key
@@ -139,6 +145,8 @@ class User(HasTraits):
     @property
     def running(self):
         """property for whether a user has a running server"""
+        if self.spawn_pending or self.stop_pending:
+            return False # server is not running if spawn or stop is still pending
         if self.server is None:
             return False
         return True
@@ -148,6 +156,41 @@ class User(HasTraits):
         """My name, escaped for use in URLs, cookies, etc."""
         return quote(self.name, safe='@')

+    @property
+    def proxy_path(self):
+        if self.settings.get('subdomain_host'):
+            return url_path_join('/' + self.domain, self.server.base_url)
+        else:
+            return self.server.base_url
+
+    @property
+    def domain(self):
+        """Get the domain for my server."""
+        # FIXME: escaped_name probably isn't escaped enough in general for a domain fragment
+        return self.escaped_name + '.' + self.settings['domain']
+
+    @property
+    def host(self):
+        """Get the *host* for my server (domain[:port])"""
+        # FIXME: escaped_name probably isn't escaped enough in general for a domain fragment
+        parsed = urlparse(self.settings['subdomain_host'])
+        h = '%s://%s.%s' % (parsed.scheme, self.escaped_name, parsed.netloc)
+        return h
+
+    @property
+    def url(self):
+        """My URL
+
+        Full name.domain/path if using subdomains, otherwise just my /base/url
+        """
+        if self.settings.get('subdomain_host'):
+            return '{host}{path}'.format(
+                host=self.host,
+                path=self.server.base_url,
+            )
+        else:
+            return self.server.base_url
+
     @gen.coroutine
     def spawn(self, options=None):
         """Start the user's spawner"""
@@ -206,6 +249,7 @@ class User(HasTraits):
         self.state = spawner.get_state()
         self.last_activity = datetime.utcnow()
         db.commit()
+        self.spawn_pending = False
         try:
             yield self.server.wait_up(http=True, timeout=spawner.http_timeout)
         except Exception as e:
@@ -232,7 +276,6 @@ class User(HasTraits):
             ), exc_info=True)
             # raise original TimeoutError
             raise e
-        self.spawn_pending = False
         return self

     @gen.coroutine

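The new `domain`, `host`, and `url` properties compose a per-user origin from the Hub-wide `subdomain_host` setting. A standalone sketch of the resulting shapes with stand-in values (no real `User` object; the host value is an example):

```python
# The URL shapes produced by the new User.domain/host/url properties,
# mirrored with plain values instead of a User instance.
from urllib.parse import urlparse, quote

subdomain_host = 'http://example.com:8000'  # example Hub-wide setting
name = 'river'
base_url = '/user/river'

escaped_name = quote(name, safe='@')
parsed = urlparse(subdomain_host)
host = '%s://%s.%s' % (parsed.scheme, escaped_name, parsed.netloc)

print(host)             # http://river.example.com:8000
print(host + base_url)  # http://river.example.com:8000/user/river
```
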
@@ -30,22 +30,32 @@ def random_port():
 ISO8601_ms = '%Y-%m-%dT%H:%M:%S.%fZ'
 ISO8601_s = '%Y-%m-%dT%H:%M:%SZ'

+def can_connect(ip, port):
+    """Check if we can connect to an ip:port
+
+    return True if we can connect, False otherwise.
+    """
+    try:
+        socket.create_connection((ip, port))
+    except socket.error as e:
+        if e.errno not in {errno.ECONNREFUSED, errno.ETIMEDOUT}:
+            app_log.error("Unexpected error connecting to %s:%i %s",
+                ip, port, e
+            )
+        return False
+    else:
+        return True
+
 @gen.coroutine
 def wait_for_server(ip, port, timeout=10):
     """wait for any server to show up at ip:port"""
     loop = ioloop.IOLoop.current()
     tic = loop.time()
     while loop.time() - tic < timeout:
-        try:
-            socket.create_connection((ip, port))
-        except socket.error as e:
-            if e.errno != errno.ECONNREFUSED:
-                app_log.error("Unexpected error waiting for %s:%i %s",
-                    ip, port, e
-                )
-            yield gen.sleep(0.1)
-        else:
+        if can_connect(ip, port):
             return
+        else:
+            yield gen.sleep(0.1)
     raise TimeoutError("Server at {ip}:{port} didn't respond in {timeout} seconds".format(
         **locals()
     ))
@@ -195,35 +205,3 @@ def url_path_join(*pieces):

     return result

-
-def localhost():
-    """Return localhost or 127.0.0.1"""
-    if hasattr(localhost, '_localhost'):
-        return localhost._localhost
-    binder = connector = None
-    try:
-        binder = socket.socket()
-        binder.bind(('localhost', 0))
-        binder.listen(1)
-        port = binder.getsockname()[1]
-        def accept():
-            try:
-                conn, addr = binder.accept()
-            except ConnectionAbortedError:
-                pass
-            else:
-                conn.close()
-        t = Thread(target=accept)
-        t.start()
-        connector = socket.create_connection(('localhost', port), timeout=10)
-        t.join(timeout=10)
-    except (socket.error, socket.gaierror) as e:
-        warnings.warn("localhost doesn't appear to work, using 127.0.0.1\n%s" % e, RuntimeWarning)
-        localhost._localhost = '127.0.0.1'
-    else:
-        localhost._localhost = 'localhost'
-    finally:
-        if binder:
-            binder.close()
-        if connector:
-            connector.close()
-    return localhost._localhost

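`wait_for_server` now delegates its probe to the new `can_connect` helper and sleeps between attempts; the hand-rolled `localhost()` detection is gone in favour of plain `127.0.0.1`. A minimal synchronous sketch of the same polling pattern (blocking `time.sleep` instead of tornado's coroutine sleep):

```python
# Synchronous sketch of the wait_for_server polling pattern built around a
# can_connect-style probe. Not the library code, just the same idea.
import errno
import socket
import time

def can_connect(ip, port):
    """Return True if a TCP connection to ip:port succeeds."""
    try:
        socket.create_connection((ip, port)).close()
    except socket.error as e:
        if e.errno not in {errno.ECONNREFUSED, errno.ETIMEDOUT}:
            print("Unexpected error connecting to %s:%i: %s" % (ip, port, e))
        return False
    return True

def wait_for_server_sync(ip, port, timeout=10):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if can_connect(ip, port):
            return
        time.sleep(0.1)
    raise TimeoutError("Server at %s:%i didn't respond in %is" % (ip, port, timeout))
```
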
@@ -5,8 +5,8 @@

 version_info = (
     0,
-    4,
-    1,
+    5,
+    0,
 )

 __version__ = '.'.join(map(str, version_info))

@@ -17,34 +17,27 @@ from jinja2 import ChoiceLoader, FunctionLoader
 from tornado import ioloop
 from tornado.web import HTTPError

-from IPython.utils.traitlets import (
+try:
+    import notebook
+except ImportError:
+    raise ImportError("JupyterHub single-user server requires notebook >= 4.0")
+
+from traitlets import (
+    Bool,
     Integer,
     Unicode,
     CUnicode,
 )

-try:
-    import notebook
-    # 4.x
-except ImportError:
-    from IPython.html.notebookapp import NotebookApp, aliases as notebook_aliases
-    from IPython.html.auth.login import LoginHandler
-    from IPython.html.auth.logout import LogoutHandler
-
-    from IPython.html.utils import url_path_join
-
-    from distutils.version import LooseVersion as V
-
-    import IPython
-    if V(IPython.__version__) < V('3.0'):
-        raise ImportError("JupyterHub Requires IPython >= 3.0, found %s" % IPython.__version__)
-else:
-    from notebook.notebookapp import NotebookApp, aliases as notebook_aliases
-    from notebook.auth.login import LoginHandler
-    from notebook.auth.logout import LogoutHandler
-
-    from notebook.utils import url_path_join
+from notebook.notebookapp import (
+    NotebookApp,
+    aliases as notebook_aliases,
+    flags as notebook_flags,
+)
+from notebook.auth.login import LoginHandler
+from notebook.auth.logout import LogoutHandler
+
+from notebook.utils import url_path_join


 # Define two methods to attach to AuthenticatedHandler,
@@ -116,7 +109,9 @@ class JupyterHubLoginHandler(LoginHandler):

 class JupyterHubLogoutHandler(LogoutHandler):
     def get(self):
-        self.redirect(url_path_join(self.settings['hub_prefix'], 'logout'))
+        self.redirect(
+            self.settings['hub_host'] +
+            url_path_join(self.settings['hub_prefix'], 'logout'))


 # register new hub related command-line aliases
@@ -125,9 +120,18 @@ aliases.update({
     'user' : 'SingleUserNotebookApp.user',
     'cookie-name': 'SingleUserNotebookApp.cookie_name',
     'hub-prefix': 'SingleUserNotebookApp.hub_prefix',
+    'hub-host': 'SingleUserNotebookApp.hub_host',
     'hub-api-url': 'SingleUserNotebookApp.hub_api_url',
     'base-url': 'SingleUserNotebookApp.base_url',
 })
+flags = dict(notebook_flags)
+flags.update({
+    'disable-user-config': ({
+        'SingleUserNotebookApp': {
+            'disable_user_config': True
+        }
+    }, "Disable user-controlled configuration of the notebook server.")
+})

 page_template = """
 {% extends "templates/page.html" %}
@@ -141,8 +145,21 @@ page_template = """
 >
 Control Panel</a>
 {% endblock %}
+{% block logo %}
+<img src='{{logo_url}}' alt='Jupyter Notebook'/>
+{% endblock logo %}
 """

+def _exclude_home(path_list):
+    """Filter out any entries in a path list that are in my home directory.
+
+    Used to disable per-user configuration.
+    """
+    home = os.path.expanduser('~')
+    for p in path_list:
+        if not p.startswith(home):
+            yield p
+
 class SingleUserNotebookApp(NotebookApp):
     """A Subclass of the regular NotebookApp that is aware of the parent multiuser context."""
     user = CUnicode(config=True)
@@ -150,12 +167,23 @@ class SingleUserNotebookApp(NotebookApp):
         self.log.name = new
     cookie_name = Unicode(config=True)
     hub_prefix = Unicode(config=True)
+    hub_host = Unicode(config=True)
     hub_api_url = Unicode(config=True)
     aliases = aliases
+    flags = flags
     open_browser = False
     trust_xheaders = True
     login_handler_class = JupyterHubLoginHandler
     logout_handler_class = JupyterHubLogoutHandler
+    port_retries = 0 # disable port-retries, since the Spawner will tell us what port to use
+
+    disable_user_config = Bool(False, config=True,
+        help="""Disable user configuration of single-user server.
+
+        Prevents user-writable files that normally configure the single-user server
+        from being loaded, ensuring admins have full control of configuration.
+        """
+    )

     cookie_cache_lifetime = Integer(
         config=True,
@@ -183,6 +211,36 @@ class SingleUserNotebookApp(NotebookApp):
         self.log.debug("Clearing cookie cache")
         self.tornado_settings['cookie_cache'].clear()

+    def migrate_config(self):
+        if self.disable_user_config:
+            # disable config-migration when user config is disabled
+            return
+        else:
+            super(SingleUserNotebookApp, self).migrate_config()
+
+    @property
+    def config_file_paths(self):
+        path = super(SingleUserNotebookApp, self).config_file_paths
+
+        if self.disable_user_config:
+            # filter out user-writable config dirs if user config is disabled
+            path = list(_exclude_home(path))
+        return path
+
+    @property
+    def nbextensions_path(self):
+        path = super(SingleUserNotebookApp, self).nbextensions_path
+
+        if self.disable_user_config:
+            path = list(_exclude_home(path))
+        return path
+
+    def _static_custom_path_default(self):
+        path = super(SingleUserNotebookApp, self)._static_custom_path_default()
+        if self.disable_user_config:
+            path = list(_exclude_home(path))
+        return path
+
     def start(self):
         # Start a PeriodicCallback to clear cached cookies. This forces us to
         # revalidate our user with the Hub at least every
@@ -202,20 +260,22 @@ class SingleUserNotebookApp(NotebookApp):
         s['user'] = self.user
         s['hub_api_key'] = env.pop('JPY_API_TOKEN')
         s['hub_prefix'] = self.hub_prefix
+        s['hub_host'] = self.hub_host
         s['cookie_name'] = self.cookie_name
-        s['login_url'] = self.hub_prefix
+        s['login_url'] = self.hub_host + self.hub_prefix
         s['hub_api_url'] = self.hub_api_url
-        s['csp_report_uri'] = url_path_join(self.hub_prefix, 'security/csp-report')
+        s['csp_report_uri'] = self.hub_host + url_path_join(self.hub_prefix, 'security/csp-report')

         super(SingleUserNotebookApp, self).init_webapp()
         self.patch_templates()

     def patch_templates(self):
         """Patch page templates to add Hub-related buttons"""
+        self.jinja_template_vars['logo_url'] = self.hub_host + url_path_join(self.hub_prefix, 'logo')
         env = self.web_app.settings['jinja2_env']

         env.globals['hub_control_panel_url'] = \
-            url_path_join(self.hub_prefix, 'home')
+            self.hub_host + url_path_join(self.hub_prefix, 'home')

         # patch jinja env loading to modify page template
         def get_page(name):

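When `--disable-user-config` is set, `_exclude_home` strips user-writable directories out of the config, nbextensions, and custom-static search paths. A small sketch of the filter on made-up paths:

```python
# The _exclude_home filter on example paths: anything under the current
# user's home directory is dropped from the search path.
import os

def _exclude_home(path_list):
    home = os.path.expanduser('~')
    for p in path_list:
        if not p.startswith(home):
            yield p

paths = [
    '/etc/jupyter',                    # system-wide: kept
    os.path.expanduser('~/.jupyter'),  # user-writable: dropped
    '/usr/local/etc/jupyter',          # system-wide: kept
]
print(list(_exclude_home(paths)))
```
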
@@ -82,7 +82,7 @@

 <div id="header" class="navbar navbar-static-top">
   <div class="container">
-    <span id="jupyterhub-logo" class="pull-left"><a href="{{base_url}}"><img src='{{static_url("images/jupyter.png") }}' alt='JupyterHub' class='jpy-logo' title='Home'/></a></span>
+    <span id="jupyterhub-logo" class="pull-left"><a href="{{base_url}}"><img src='{{base_url}}logo' alt='JupyterHub' class='jpy-logo' title='Home'/></a></span>

     {% block login_widget %}