Compare commits

...

98 Commits
0.4.0 ... 0.5.0

Author SHA1 Message Date
Min RK
fc6435825c release 0.5.0 2016-03-08 08:57:33 +01:00
Min RK
b3ab48eb68 Merge pull request #463 from minrk/moar-coverage
Increase some test coverage
2016-03-07 17:13:20 +01:00
Carol Willing
a212151c09 Merge pull request #461 from minrk/0.5
0.5 changelog
2016-03-07 08:07:19 -08:00
Min RK
67ccfc7eb7 increase some test coverage 2016-03-07 16:13:57 +01:00
Min RK
9af103c673 fixes for handling failed chdir in spawners 2016-03-07 15:12:30 +01:00
Min RK
82643adfb6 stop_pending also counts as not running 2016-03-07 14:27:40 +01:00
Min RK
74df94d15a 0.5 changelog 2016-03-07 13:54:40 +01:00
Min RK
da1b9bdd80 Merge pull request #460 from yuvipanda/mysql-fix
Add lengths to all Unicode() columns
2016-03-07 10:36:17 +01:00
Min RK
18675ef6df Merge pull request #453 from minrk/timeout-in-is-up
use the same connection check everywhere
2016-03-07 10:35:12 +01:00
YuviPanda
bf9dea5522 Add lengths to all Unicode() columns
- Otherwise does not work with MySQL
- Change JSONDict to be TEXT (Unbounded) rather than VARCHAR.
  This makes most sense, since you can't index these anyway.
- The 'ip' field in Server is set to 255, since that is the
  max allowed length of DNS entries.
- Most of the remaining Unicode columns get generously high limits
  that most people should never run into (famous last words).
2016-03-06 18:26:25 -08:00
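A minimal sketch of the kind of change this commit describes, using SQLAlchemy Core; the table and column names below are illustrative, not the actual JupyterHub schema:
```python
from sqlalchemy import Column, Integer, MetaData, Table, Text, Unicode

metadata = MetaData()

example_servers = Table(
    'example_servers', metadata,
    Column('id', Integer, primary_key=True),
    # MySQL needs an explicit length for VARCHAR-backed Unicode columns
    Column('ip', Unicode(255), default=''),        # 255 covers the longest DNS name
    Column('base_url', Unicode(255), default='/'),
    # unbounded TEXT for JSON blobs that will never be indexed or compared
    Column('state', Text, default='{}'),
)
```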
Min RK
62e30c1d79 Merge pull request #457 from shreddd/default_url
Enable default_url to pass in to notebook server
2016-03-06 10:33:52 +01:00
shreddd
1316196542 Update spawner.py
typo
2016-03-05 12:24:39 -08:00
Shreyas Cholia
1a377bd03a comment on default_url being used with notebook_dir 2016-03-05 12:16:10 -08:00
Shreyas Cholia
66a99ce881 Add support for default_url 2016-03-05 12:05:58 -08:00
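The new option is set from `jupyterhub_config.py`; a hedged example (the paths are placeholders, and `%U` expands to the username as described in the spawner diff below):
```python
# jupyterhub_config.py -- illustrative values only
c.Spawner.notebook_dir = '/'             # expose the filesystem root to the notebook server
c.Spawner.default_url = '/tree/home/%U'  # but land each user in their own home directory
```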
shreddd
481debcb80 Merge pull request #1 from jupyter/master
sync master
2016-03-05 12:04:09 -08:00
Carol Willing
03c25b5cac Merge pull request #452 from minrk/redundant-use-subdomain
remove redundant use_subdomains
2016-03-05 11:52:43 -08:00
Carol Willing
26c060d2c5 Merge pull request #456 from willingc/readme-clarify
Add minor clarification to README
2016-03-05 10:57:13 -08:00
Carol Willing
7ff42f9b55 Add @betatim's suggested wording 2016-03-05 10:43:45 -08:00
Carol Willing
a35d8a6262 Add minor clarification 2016-03-05 10:14:44 -08:00
Carol Willing
8f39e1f8f9 Merge pull request #455 from betatim/readme-fix
README uses two different names for docker container
2016-03-05 10:08:34 -08:00
Tim Head
ff19b799c4 container -> cont for consistency 2016-03-05 09:19:15 +01:00
Kyle Kelley
e547949aee Merge pull request #433 from minrk/disable-user-config
allow disabling user configuration of single-user servers
2016-03-04 09:57:45 -06:00
Min RK
31be00b49f failure to connect may be a timeout 2016-03-04 16:28:57 +01:00
Min RK
4533d96002 use the same connection check everywhere
avoids inconsistencies in error handling
2016-03-04 16:28:57 +01:00
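A minimal sketch of a shared connection check like the one this commit introduces, based on the `can_connect(ip, port)` usage visible in the orm.py diff further down; an approximation, not the exact upstream helper:
```python
import errno
import socket

def can_connect(ip, port):
    """Return True if a TCP connection to ip:port succeeds.

    Connection refused (or an unreachable network) means "not up yet";
    any other socket error is unexpected and re-raised.
    """
    try:
        socket.create_connection((ip, port)).close()
    except socket.error as e:
        if e.errno in {errno.ECONNREFUSED, errno.ENETUNREACH}:
            return False
        raise
    else:
        return True
```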
Min RK
7f89f1a2a0 expose disable_user_config as Spawner.disable_user_config 2016-03-04 14:41:40 +01:00
Min RK
aed29e1db8 Simplify filter to exclude config in the home directory 2016-03-04 11:43:45 +01:00
Min RK
49bee25820 allow disabling user configuration of single-user servers 2016-03-04 11:43:45 +01:00
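Turning the new option on is a one-line config change; for example:
```python
# jupyterhub_config.py
# Prevent config files in users' home directories from affecting their single-user servers
c.Spawner.disable_user_config = True
```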
Min RK
838c8eb057 Merge pull request #448 from daradib/redirect
Redirect requests to logged in user
2016-03-04 11:15:56 +01:00
Min RK
be5860822d remove redundant use_subdomains
non-empty subdomain_host is enough
2016-03-04 11:11:41 +01:00
Dara Adib
5a10d304c9 Redirect user to login page when not logged in 2016-03-02 16:55:33 -08:00
Dara Adib
fdd3746f54 Add test for user redirect 2016-03-02 16:18:02 -08:00
Dara Adib
4d55a48a79 Redirect requests to logged in user
If a user, alice, requests /user/bob/notebooks/mynotebook.ipynb,
redirect her to /user/alice/notebooks/mynotebook.ipynb.
Currently, such requests get stuck in a redirect loop: the request is
redirected to the login page with a `next` parameter that, when followed,
is redirected again.

When notebook_dir is consistent across users, this will allow
users to share notebook URLs. Fixes #424.
2016-03-02 16:15:50 -08:00
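A self-contained sketch of the redirect rule described above; the helper name and signature are invented for illustration, while the real logic lives in `UserSpawnHandler`, shown in the handlers diff below:
```python
from urllib.parse import quote


def redirect_target(current_user_name, requested_path, base_url='/'):
    """Send a logged-in user to the requested path under *their own* server."""
    return '{}user/{}/{}'.format(base_url, quote(current_user_name),
                                 requested_path.lstrip('/'))


# alice requesting bob's notebook is redirected to her own copy of that path:
assert redirect_target('alice', 'notebooks/mynotebook.ipynb') == \
    '/user/alice/notebooks/mynotebook.ipynb'
```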
Min RK
b2ece48239 reverse arguments in check_routes 2016-03-01 19:42:55 +01:00
Kyle Kelley
6375ba30b7 Merge pull request #445 from minrk/check-routes-pending
Don't add users with spawn_pending to the proxy
2016-03-01 09:19:42 -06:00
Min RK
f565f8ac53 Don't add users with spawn_pending to the proxy
check_routes checks for missing routes for running users.
This is meant for when the proxy has been relaunched outside the Hub.

If spawners are slow to start, check_routes can fire in the middle of spawning,
adding the user's server (which has no defined location yet) to the proxy before it's up.
If the spawn then fails, the route remains indefinitely (it should never have been added in the first place), and the user sees a 503 error until their server is launched manually again.

Checking `spawn_pending` in user.running prevents this.
2016-03-01 15:18:51 +01:00
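Combined with the `stop_pending` commit above, the effective notion of "running" works roughly like the property below; a sketch of the idea, not the actual `User` class:
```python
class UserSketch:
    """Illustration of the 'running' check that check_routes relies on."""

    def __init__(self):
        self.spawner = None         # set once a Spawner object exists
        self.server = None          # set once the server has a location
        self.spawn_pending = False  # True while Spawner.start is in progress
        self.stop_pending = False   # True while the server is being stopped

    @property
    def running(self):
        # Servers that are still starting or stopping must not be routed to.
        if self.spawn_pending or self.stop_pending:
            return False
        return self.spawner is not None and self.server is not None
```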
Kyle Kelley
5ec05822f1 Merge pull request #436 from minrk/subdomains
allow running single-user servers on subdomains
2016-02-28 09:49:45 -06:00
Min RK
335b47d7c1 include protocol in subdomain_host
makes everything easier, and tests are passing with and without subdomains (yay!)
2016-02-28 11:12:41 +01:00
Min RK
f922561003 Tests are passing with subdomains 2016-02-26 17:32:55 +01:00
Min RK
79df83f0d3 Allow getting users by name 2016-02-26 17:32:55 +01:00
Min RK
29416463ff proxy needs user dict, which has proxy path
this won't be needed if/when I make a schema change, where domain is included in the Server table.
2016-02-26 17:32:55 +01:00
Min RK
dd2e1ef758 turn off subdomains by default 2016-02-26 17:32:55 +01:00
Min RK
a9b8542ec7 pass hub's host to single-user servers via hub_host 2016-02-26 17:32:54 +01:00
Min RK
a4ae2ec2d8 consolidate cookie setting in _set_user_cookie 2016-02-26 17:32:54 +01:00
Min RK
b54bfad8c2 [WIP]: allow running single-user servers on subdomains
relies on CHP's host-based routing (a feature I didn't add!)

requires wildcard DNS and wildcard SSL for a proper setup

still lots to work out and clean up in terms of cookies and where to use host, domain, and path, but it works locally.
2016-02-26 17:32:54 +01:00
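Enabling this mode comes down to one setting; a hedged configuration sketch (the hostname is a placeholder, and wildcard DNS plus a wildcard SSL certificate are assumed):
```python
# jupyterhub_config.py
# Run each user's server on <username>.hub.example.com instead of a URL prefix.
# Requires *.hub.example.com to resolve to the Hub, and a wildcard SSL certificate.
c.JupyterHub.subdomain_host = 'https://hub.example.com'
```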
Min RK
724bf7c4ce Merge pull request #441 from jupyter/revert-440-master
Revert "Do not consider `@` character url-safe"
2016-02-26 09:06:46 +01:00
Kyle Kelley
fccc954fb4 Merge pull request #442 from minrk/never-poll-before-start-is-done
avoid calling Spawner.poll during Spawner.start
2016-02-25 08:20:35 -06:00
Kyle Kelley
74385a6906 Merge pull request #443 from minrk/catch-options-from-form
catch exceptions in options_from_form
2016-02-25 08:19:16 -06:00
Min RK
dd66fe63c0 catch exceptions in options_from_form
Allows form validation to be implemented in options_from_form, as well as start.
2016-02-25 12:02:23 +01:00
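With this change, a custom Spawner can reject bad form input simply by raising; a hedged sketch (the form field and spawner class are invented for illustration):
```python
from jupyterhub.spawner import LocalProcessSpawner


class ValidatingSpawner(LocalProcessSpawner):
    """Illustrative only: raising in options_from_form rejects the submission."""

    def options_from_form(self, formdata):
        mem = formdata.get('memory', [''])[0]
        if not mem.isdigit():
            # the exception text is rendered back to the user on the spawn form
            raise ValueError("memory must be a whole number of MB, got %r" % mem)
        return {'memory': int(mem)}
```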
Min RK
e74934cb17 avoid calling Spawner.poll during Spawner.start
moves `spawn_pending` flag to only around start, not the HTTP wait.

Some Spawners may not know how to poll until start has finished (DockerSpawner).
Let's not require that they do.
2016-02-25 10:13:51 +01:00
Min RK
450281a90a Revert "Do not consider @ character url-safe" 2016-02-25 09:04:25 +01:00
Kyle Kelley
6e7fc0574e Merge pull request #440 from ResearchComputing/master
Do not consider `@` character url-safe
2016-02-24 23:58:45 -06:00
Jonathon Anderson
fc49aac02b Do not consider `@` character url-safe
Usernames that have an `@`-separated domain component
break JupyterHub: the server expects to see query
strings that contain an `@`, but browsers and other
clients send `%40`.
2016-02-24 16:48:23 -07:00
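The mismatch is easy to reproduce with the standard library; `@` survives only when it is explicitly treated as safe, while browsers send the percent-encoded form:
```python
from urllib.parse import quote

name = 'user@example.com'
print(quote(name, safe='@'))  # user@example.com   -- '@' treated as url-safe
print(quote(name, safe=''))   # user%40example.com -- what browsers and clients send
```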
Kyle Kelley
097d883905 Merge pull request #435 from minrk/debug-no-server
add debug logging for adding users with no running server
2016-02-20 06:04:12 -08:00
Min RK
cb55118f70 add debug logging for adding users with no running server
It has been reported that check_routes attempts to add users without a running server.

So something is wrong, either in sqlalchemy or my understanding of what it does (likely the latter),
because a filter for users with a non-None server is returning at least one result whose server is None.
2016-02-20 14:22:50 +01:00
Carol Willing
2a3c87945e Merge pull request #434 from rgbkrk/ssl
Don't let the default include `--no-ssl`.
2016-02-18 16:48:06 -08:00
Kyle Kelley
2b2aacedc6 Don't let the default include --no-ssl. 2016-02-18 16:27:53 -08:00
Kyle Kelley
8ebec52827 Merge pull request #431 from ObiWahn/master
Update README.md
2016-02-18 16:25:56 -08:00
Jan Christoph Uhde
1642cc30c8 fix: run vs exec and split sentence 2016-02-19 00:13:02 +01:00
Kyle Kelley
1645d8f0c0 Merge pull request #432 from minrk/no-port-retries
disable port_retries in single-user server
2016-02-18 06:40:51 -08:00
Min RK
8d390819a1 disable port_retries in single-user server
since Spawners won't notice if the server starts on a different port than the one it was asked to use
2016-02-18 09:03:45 +01:00
Jan Christoph Uhde
c7dd18bb03 Update README.md 2016-02-16 22:58:27 +01:00
Min RK
84b7de4d21 set x bit on jupyterhub-singleuser 2016-02-15 21:50:55 +01:00
Carol Willing
161df53143 Merge pull request #426 from takluyver/docs-intro
Add overview to landing page
2016-02-13 11:12:35 -08:00
Thomas Kluyver
1cfd6cf12e Fix grammaros 2016-02-13 18:18:23 +00:00
Thomas Kluyver
d40dcc35fb Reword intro 2016-02-13 16:44:41 +00:00
Thomas Kluyver
a570e95602 Add my overview to intro
Closes gh-425
2016-02-13 15:29:08 +00:00
Thomas Kluyver
e4e43521ee Close code block 2016-02-13 15:28:37 +00:00
Min RK
1b2c21a99c Merge pull request #423 from minrk/custom-logo
allow overriding logo
2016-02-11 15:03:02 +01:00
Min RK
e28eda6386 exercise some static file handlers in tests 2016-02-09 15:38:44 +01:00
Min RK
39c171cce7 allow overriding logo
by specifying JupyterHub.logo_file

also ensures single-user server always has the same logo image as the Hub
2016-02-09 15:38:34 +01:00
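Overriding the logo is then a single setting (the path is a placeholder):
```python
# jupyterhub_config.py
c.JupyterHub.logo_file = '/srv/jupyterhub/my-logo.png'  # replaces the Jupyter logo in the Hub and single-user banners
```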
Min RK
c81cefd768 Merge pull request #372 from minrk/require-notebook-4
drop support for single-user server from IPython 3.x
2016-02-09 14:42:12 +01:00
Min RK
325f137265 Merge pull request #421 from Fokko/add-docker-label
Added label to dockerfile for referencing
2016-02-09 14:41:48 +01:00
Fokko Driesprong
1ae795df18 Changed domain of the label to .org 2016-02-09 14:16:50 +01:00
Fokko Driesprong
2aacd5e28b Added label to dockerfile for referencing 2016-02-08 17:16:20 +01:00
Kyle Kelley
6e1425e2c0 Merge pull request #417 from minrk/require-confirm-insecure
require confirmation for JupyterHub to run without SSL
2016-02-05 19:27:37 -06:00
Carol Willing
010db6ce72 Merge pull request #416 from willingc/doc-warn
Add more prominent message for https
2016-02-04 14:21:52 -08:00
Min RK
ce8d782220 no-ssl in changelog 2016-02-04 23:00:54 +01:00
Min RK
90c2b23fc0 require confirmation for JupyterHub to run without SSL
ensures folks deploying JupyterHub on HTTP have been told what's up.
2016-02-04 23:00:54 +01:00
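The confirmation can be given on the command line with `--no-ssl` or persistently in config; for example:
```python
# jupyterhub_config.py -- only when SSL is terminated in another layer (e.g. nginx)
c.JupyterHub.confirm_no_ssl = True  # equivalent to passing --no-ssl
```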
Carol Willing
32685aeac1 Add more prominent message for https 2016-02-04 13:42:13 -08:00
Min RK
01c5608104 update version requirements in README 2016-02-04 22:41:18 +01:00
Min RK
a35f6298f0 drop support for single-user server from IPython 3.x 2016-02-04 22:40:44 +01:00
Min RK
8955d6aed4 Merge pull request #411 from minrk/one-two-seven
use 127.0.0.1 instead of localhost
2016-02-04 20:37:09 +01:00
Min RK
cafbf8b990 back to dev 2016-02-03 21:05:48 +01:00
Min RK
7837a9cf68 release 0.4.1 2016-02-03 21:04:32 +01:00
Min RK
65a019e05b Merge pull request #413 from minrk/login_url
Restore /login handler
2016-02-03 21:00:03 +01:00
Min RK
f2014c5687 note that login/logout should always be registered 2016-02-03 20:54:01 +01:00
Min RK
109c315336 changelog for 0.4.1 2016-02-03 16:55:25 +01:00
Min RK
941fc7e627 restore /login page
erroneously removed in 0.4
2016-02-03 16:52:43 +01:00
Min RK
f626d2f6e5 use 127.0.0.1 instead of localhost
localhost can cause some issues on badly behaved or misconfigured systems,
and 127 seems simpler.
2016-02-03 10:30:09 +01:00
Min RK
80215f6b3c Merge pull request #407 from willingc/doc-proxy
Add doc details for #406
2016-02-02 09:06:35 +01:00
Carol Willing
84916062f0 Edit per @minrk and added troubleshooting 2016-02-01 14:17:14 -08:00
Carol Willing
641154bf06 Add doc details for #406 2016-02-01 11:47:08 -08:00
Min RK
14b0dbde0e Merge pull request #405 from willingc/doc-link
Update documentation link to source code
2016-02-01 19:58:08 +01:00
Carol Willing
cd85766441 Update link to source code 2016-02-01 08:42:49 -08:00
Min RK
6c072bdb3d nonempty long_description
avoids dumping README.md garbage onto PyPI
2016-02-01 11:05:58 +01:00
Min RK
35f080458e Upload with twine 2016-02-01 10:41:51 +01:00
Min RK
feac4f6bc4 Changelog for 0.4 2016-02-01 10:41:51 +01:00
Min RK
1bbabbb989 back to dev 2016-02-01 10:37:46 +01:00
33 changed files with 880 additions and 325 deletions

View File

@@ -15,3 +15,7 @@ script:
- py.test --cov jupyterhub jupyterhub/tests -v
after_success:
- codecov
matrix:
include:
- python: 3.5
env: JUPYTERHUB_TEST_SUBDOMAIN_HOST=http://127.0.0.1.xip.io:8000

View File

@@ -46,5 +46,7 @@ WORKDIR /srv/jupyterhub/
EXPOSE 8000
LABEL org.jupyter.service="jupyterhub"
ONBUILD ADD jupyterhub_config.py /srv/jupyterhub/jupyterhub_config.py
CMD ["jupyterhub", "-f", "/srv/jupyterhub/jupyterhub_config.py"]

View File

@@ -25,7 +25,7 @@ Basic principles:
## Dependencies
JupyterHub requires [IPython](https://ipython.org/install.html) >= 3.0 (current master) and [Python](https://www.python.org/downloads/) >= 3.3.
JupyterHub itself requires [Python](https://www.python.org/downloads/) 3.3. To run the single-user servers (which may be on the same system as the Hub or not), [Jupyter Notebook](https://jupyter.readthedocs.org/en/latest/install.html) ≥ 4 is required.
Install [nodejs/npm](https://www.npmjs.com/), which is available from your
package manager. For example, install on Linux (Debian/Ubuntu) using:
@@ -50,14 +50,15 @@ Notes on the `pip` command used in the installation directions below:
## Installation
JupyterHub can be installed with pip:
JupyterHub can be installed with pip, and the proxy with npm:
npm install -g configurable-http-proxy
pip3 install jupyterhub
If you plan to run notebook servers locally, you may also need to install the
Jupyter ~~IPython~~ notebook:
pip3 install notebook
pip3 install --upgrade notebook
### Development install
@@ -116,6 +117,22 @@ Some examples, meant as illustration and testing of this concept:
- Using GitHub OAuth instead of PAM with [OAuthenticator](https://github.com/jupyter/oauthenticator)
- Spawning single-user servers with Docker, using the [DockerSpawner](https://github.com/jupyter/dockerspawner)
### Docker
There is a ready to go [docker image](https://hub.docker.com/r/jupyter/jupyterhub/).
It can be started with the following command:
docker run -d --name jupyter.cont [-v /home/jupyter-home:/home] jupyter/jupyterhub jupyterhub
This command will create a container named `jupyter.cont` that you can stop and resume with `docker stop/start`.
It will listen on all interfaces at port 8000, so this is perfect for testing Docker on your desktop or laptop.
If you want to run Docker on a computer that has a public IP then you should (as in MUST) secure it with SSL by
adding SSL options to your Docker configuration or by using an SSL-enabled proxy. The `-v/--volume` option will
allow you to store data outside the Docker image (on the host system) so it will be persistent, even when you start
a new image. The command `docker exec -it jupyter.cont bash` will spawn a root shell in your running Docker
container. You can use it to create system users in the container. These accounts will be used for authentication
in JupyterHub's default configuration. In order to run without SSL, you'll need to set `--no-ssl` explicitly.
# Getting help
We encourage you to ask questions on the mailing list:

View File

@@ -77,6 +77,7 @@ For simple mappings, a configurable dict `Authenticator.username_map` is used to
c.Authenticator.username_map = {
'service-name': 'localname'
}
```
### Validating usernames

View File

@@ -2,7 +2,36 @@
See `git log` for a more detailed summary.
## 0.3.0
## 0.5
- Single-user server must be run with Jupyter Notebook ≥ 4.0
- Require `--no-ssl` confirmation to allow the Hub to be run without SSL (e.g. behind SSL termination in nginx)
- Add lengths to text fields for MySQL support
- Add `Spawner.disable_user_config` for preventing user-owned configuration from modifying single-user servers.
- Fixes for MySQL support.
- Add ability to run each user's server on its own subdomain. Requires wildcard DNS and wildcard SSL to be feasible. Enable subdomains by setting `JupyterHub.subdomain_host = 'https://jupyterhub.domain.tld[:port]'`.
- Use `127.0.0.1` for local communication instead of `localhost`, avoiding issues with DNS on some systems.
- Fix race that could add users to proxy prematurely if spawning is slow.
## 0.4
### 0.4.1
Fix removal of `/login` page in 0.4.0, breaking some OAuth providers.
### 0.4.0
- Add `Spawner.user_options_form` for specifying an HTML form to present to users,
allowing users to influence the spawning of their own servers.
- Add `Authenticator.pre_spawn_start` and `Authenticator.post_spawn_stop` hooks,
so that Authenticators can do setup or teardown (e.g. passing credentials to Spawner,
mounting data sources, etc.).
These methods are typically used with custom Authenticator+Spawner pairs.
- 0.4 will be the last JupyterHub release where single-user servers running IPython 3 are supported instead of Notebook ≥ 4.0.
## 0.3
- No longer make the user starting the Hub an admin
- start PAM sessions on login
@@ -10,13 +39,13 @@ See `git log` for a more detailed summary.
allowing deeper interaction between Spawner/Authenticator pairs.
- login redirect fixes
## 0.2.0
## 0.2
- Based on standalone traitlets instead of IPython.utils.traitlets
- multiple users in admin panel
- Fixes for usernames that require escaping
## 0.1.0
## 0.1
First preview release

View File

@@ -14,14 +14,17 @@ See [the readme](https://github.com/jupyter/jupyterhub/blob/master/README.md) fo
JupyterHub is a set of processes that together provide a multiuser Jupyter Notebook server.
There are three main categories of processes run by the `jupyterhub` command line program:
- *Single User Server*: a dedicated, single-user, Jupyter Notebook is started for each user on the system
when they log in. The object that starts these processes is called a *Spawner*.
- *Proxy*: the public facing part of the server that uses a dynamic proxy to route HTTP requests
to the *Hub* and *Single User Servers*.
- *Hub*: manages user accounts and authentication and coordinates *Single Users Servers* using a *Spawner*.
- **Single User Server**: a dedicated, single-user, Jupyter Notebook is started for each user on the system
when they log in. The object that starts these processes is called a Spawner.
- **Proxy**: the public facing part of the server that uses a dynamic proxy to route HTTP requests
to the Hub and Single User Servers.
- **Hub**: manages user accounts and authentication and coordinates Single User Servers using a Spawner.
## JupyterHub's default behavior
**IMPORTANT:** In its default configuration, JupyterHub runs without SSL encryption (HTTPS).
**You should not run JupyterHub without SSL encryption on a public network.**
See [Security documentation](#Security) for how to configure JupyterHub to use SSL.
To start JupyterHub in its default configuration, type the following at the command line:
@@ -34,29 +37,25 @@ Any user on the system with a password will be allowed to start a single-user no
The default Spawner starts servers locally as each user, one dedicated server per user.
These servers listen on localhost, and start in the given user's home directory.
By default, the *Proxy* listens on all public interfaces on port 8000.
Thus you can reach JupyterHub through:
By default, the **Proxy** listens on all public interfaces on port 8000.
Thus you can reach JupyterHub through either:
http://localhost:8000
or any other public IP or domain pointing to your system.
In their default configuration, the other services, the *Hub* and *Single-User Servers*,
In their default configuration, the other services, the **Hub** and **Single-User Servers**,
all communicate with each other on localhost only.
**NOTE:** In its default configuration, JupyterHub runs without SSL encryption (HTTPS).
You should not run JupyterHub without SSL encryption on a public network.
See [below](#Security) for how to configure JupyterHub to use SSL.
By default, starting JupyterHub will write two files to disk in the current working directory:
- `jupyterhub.sqlite` is the sqlite database containing all of the state of the *Hub*.
This file allows the *Hub* to remember what users are running and where,
- `jupyterhub.sqlite` is the sqlite database containing all of the state of the **Hub**.
This file allows the **Hub** to remember what users are running and where,
as well as other information enabling you to restart parts of JupyterHub separately.
- `jupyterhub_cookie_secret` is the encryption key used for securing cookies.
This file needs to persist in order for restarting the Hub server to avoid invalidating cookies.
Conversely, deleting this file and restarting the server effectively invalidates all login cookies.
The cookie secret file is discussed [below](#Security).
The cookie secret file is discussed in the [Cookie Secret documentation](#Cookie secret).
The location of these files can be specified via configuration, discussed below.
@@ -96,32 +95,37 @@ on the config system Jupyter uses.
## Networking
In most situations you will want to change the main IP address and port of the Proxy.
This address determines where JupyterHub is available to your users.
The default is all network interfaces (`''`) on port 8000.
### Configuring the Proxy's IP address and port
The Proxy's main IP address setting determines where JupyterHub is available to users.
By default, JupyterHub is configured to be available on all network interfaces
(`''`) on port 8000. **Note**: Use of `'*'` is discouraged for IP configuration;
instead, use of `'0.0.0.0'` is preferred.
This can be done with the following command line arguments:
Changing the IP address and port can be done with the following command line
arguments:
jupyterhub --ip=192.168.1.2 --port=443
Or you can put the following lines in a configuration file:
Or by placing the following lines in a configuration file:
```python
c.JupyterHub.ip = '192.168.1.2'
c.JupyterHub.port = 443
```
Port 443 is used in these examples as it is the default port for SSL/HTTPS.
Port 443 is used as an example since 443 is the default port for SSL/HTTPS.
Configuring only the main IP and port of JupyterHub should be sufficient for most deployments of JupyterHub.
However, for more customized scenarios,
you can configure the following additional networking details.
However, more customized scenarios may need additional networking details to
be configured.
### Configuring the Proxy's REST API communication IP address and port (optional)
The Hub service talks to the proxy via a REST API on a secondary port,
whose network interface and port can be configured separately.
By default, this REST API listens on port 8081 of localhost only.
If you want to run the Proxy separate from the Hub,
you may need to configure this IP and port with:
If running the Proxy separate from the Hub,
configure the REST API communication IP address and port with:
```python
# ideally a private network address
@@ -129,11 +133,12 @@ c.JupyterHub.proxy_api_ip = '10.0.1.4'
c.JupyterHub.proxy_api_port = 5432
```
### Configuring the Hub if Spawners or Proxy are remote or isolated in containers
The Hub service also listens only on localhost (port 8080) by default.
The Hub needs to be accessible from both the proxy and all Spawners.
When spawning local servers localhost is fine,
but if *either* the Proxy or (more likely) the Spawners will be remote or isolated in containers,
the Hub must listen on an IP that is accessible.
When spawning local servers, an IP address setting of localhost is fine.
If *either* the Proxy *or* (more likely) the Spawners will be remote or
isolated in containers, the Hub must listen on an IP that is accessible.
```python
c.JupyterHub.hub_ip = '10.0.1.4'
@@ -142,6 +147,9 @@ c.JupyterHub.hub_port = 54321
## Security
**IMPORTANT:** In its default configuration, JupyterHub runs without SSL encryption (HTTPS).
**You should not run JupyterHub without SSL encryption on a public network.**
Security is the most important aspect of configuring Jupyter. There are three main aspects of the
security configuration:
@@ -193,7 +201,8 @@ openssl rand -hex 1024 > /srv/jupyterhub/cookie_secret
In most deployments of JupyterHub, you should point this to a secure location on the file
system, such as `/srv/jupyterhub/cookie_secret`. If the cookie secret file doesn't exist when
the Hub starts, a new cookie secret is generated and stored in the file.
the Hub starts, a new cookie secret is generated and stored in the file. The recommended
permissions for the cookie secret file are 600 (owner-only read/write).
If you would like to avoid the need for files, the value can be loaded in the Hub process from
the `JPY_COOKIE_SECRET` environment variable:
@@ -397,9 +406,9 @@ jupyterhub -f /path/to/aboveconfig.py
# Further reading
- TODO: troubleshooting
- [Custom Authenticators](./authenticators.html)
- [Custom Spawners](./spawners.html)
- [Troubleshooting](./troubleshooting.html)
[oauth-setup]: https://github.com/jupyter/oauthenticator#setup

View File

@@ -5,7 +5,7 @@ JupyterHub is a multi-user server that manages and proxies multiple instances of
There are three basic processes involved:
- multi-user Hub (Python/Tornado)
- configurable http proxy (nodejs)
- [configurable http proxy](https://github.com/jupyter/configurable-http-proxy) (node-http-proxy)
- multiple single-user IPython notebook servers (Python/IPython/Tornado)
The proxy is the only process that listens on a public interface.

View File

@@ -1,22 +1,34 @@
.. JupyterHub documentation master file, created by
sphinx-quickstart on Mon Jan 4 16:31:09 2016.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
JupyterHub
==========
.. note:: This is the official documentation for JupyterHub. This project is
under active development.
JupyterHub is a server that gives multiple users access to Jupyter notebooks,
running an independent Jupyter notebook server for each user.
JupyterHub is a multi-user server that manages and proxies multiple instances
of the single-user Jupyter notebook server.
To use JupyterHub, you need a Unix server (typically Linux) running
somewhere that is accessible to your team on the network. The JupyterHub server
can be on an internal network at your organisation, or it can run on the public
internet (in which case, take care with `security <getting-started.html#security>`__).
Users access JupyterHub in a web browser, by going to the IP address or
domain name of the server.
Three actors:
Different :doc:`authenticators <authenticators>` control access
to JupyterHub. The default one (pam) uses the user accounts on the server where
JupyterHub is running. If you use this, you will need to create a user account
on the system for each user on your team. Using other authenticators, you can
allow users to sign in with e.g. a Github account, or with any single-sign-on
system your organisation has.
* multi-user Hub (tornado process)
* configurable http proxy (node-http-proxy)
* multiple single-user IPython notebook servers (Python/IPython/tornado)
Next, :doc:`spawners <spawners>` control how JupyterHub starts
the individual notebook server for each user. The default spawner will
start a notebook server on the same machine running under their system username.
The other main option is to start each server in a separate container, often
using Docker.
JupyterHub runs as three separate parts:
* The multi-user Hub (Python & Tornado)
* A `configurable http proxy <https://github.com/jupyter/configurable-http-proxy>`_ (NodeJS)
* Multiple single-user Jupyter notebook servers (Python & Tornado)
Basic principles:
@@ -29,7 +41,7 @@ Basic principles:
Contents:
.. toctree::
:maxdepth: 1
:maxdepth: 2
:caption: User Documentation
getting-started
@@ -41,6 +53,7 @@ Contents:
authenticators
spawners
troubleshooting
.. toctree::
:maxdepth: 1

View File

@@ -157,4 +157,4 @@ When `Spawner.spawn` is called, this dictionary is accessible as `self.user_opti
[Spawner]: ../jupyterhub/spawner.py
[Spawner]: https://github.com/jupyter/jupyterhub/blob/master/jupyterhub/spawner.py

View File

@@ -0,0 +1,11 @@
# Troubleshooting
This document is under active development.
## Networking
If the JupyterHub proxy fails to start:
- check if the JupyterHub IP configuration setting is
``c.JupyterHub.ip = '*'``; if it is, try ``c.JupyterHub.ip = ''``
- Try starting with ``jupyterhub --ip=0.0.0.0``

View File

@@ -86,7 +86,7 @@ class APIHandler(BaseHandler):
model = {
'name': user.name,
'admin': user.admin,
'server': user.server.base_url if user.running else None,
'server': user.url if user.running else None,
'pending': None,
'last_activity': user.last_activity.isoformat(),
}

View File

@@ -28,7 +28,7 @@ class ProxyAPIHandler(APIHandler):
@gen.coroutine
def post(self):
"""POST checks the proxy to ensure"""
yield self.proxy.check_routes()
yield self.proxy.check_routes(self.users)
@admin_only
@@ -59,7 +59,7 @@ class ProxyAPIHandler(APIHandler):
self.proxy.auth_token = model['auth_token']
self.db.commit()
self.log.info("Updated proxy at %s", server.bind_url)
yield self.proxy.check_routes()
yield self.proxy.check_routes(self.users)

View File

@@ -16,6 +16,7 @@ from datetime import datetime
from distutils.version import LooseVersion as V
from getpass import getuser
from subprocess import Popen
from urllib.parse import urlparse
if sys.version_info[:2] < (3,3):
raise ValueError("Python < 3.3 not supported: %s" % sys.version)
@@ -42,7 +43,7 @@ here = os.path.dirname(__file__)
import jupyterhub
from . import handlers, apihandlers
from .handlers.static import CacheControlStaticFilesHandler
from .handlers.static import CacheControlStaticFilesHandler, LogoHandler
from . import orm
from .user import User, UserDict
@@ -50,7 +51,7 @@ from ._data import DATA_FILES_PATH
from .log import CoroutineLogFormatter, log_request
from .traitlets import URLPrefix, Command
from .utils import (
url_path_join, localhost,
url_path_join,
ISO8601_ms, ISO8601_s,
)
# classes for config
@@ -87,6 +88,9 @@ flags = {
'no-db': ({'JupyterHub': {'db_url': 'sqlite:///:memory:'}},
"disable persisting state database to disk"
),
'no-ssl': ({'JupyterHub': {'confirm_no_ssl': True}},
"Allow JupyterHub to run without SSL (SSL termination should be happening elsewhere)."
),
}
SECRET_BYTES = 2048 # the number of bytes to use when generating new secrets
@@ -209,6 +213,11 @@ class JupyterHub(Application):
def _template_paths_default(self):
return [os.path.join(self.data_files_path, 'templates')]
confirm_no_ssl = Bool(False, config=True,
help="""Confirm that JupyterHub should be run without SSL.
This is **NOT RECOMMENDED** unless SSL termination is being handled by another layer.
"""
)
ssl_key = Unicode('', config=True,
help="""Path to SSL key file for the public facing interface of the proxy
@@ -222,14 +231,39 @@ class JupyterHub(Application):
"""
)
ip = Unicode('', config=True,
help="The public facing ip of the proxy"
help="The public facing ip of the whole application (the proxy)"
)
subdomain_host = Unicode('', config=True,
help="""Run single-user servers on subdomains of this host.
This should be the full https://hub.domain.tld[:port]
Provides additional cross-site protections for javascript served by single-user servers.
Requires <username>.hub.domain.tld to resolve to the same host as hub.domain.tld.
In general, this is most easily achieved with wildcard DNS.
When using SSL (i.e. always) this also requires a wildcard SSL certificate.
""")
def _subdomain_host_changed(self, name, old, new):
if new and '://' not in new:
# host should include '://'
# if not specified, assume https: You have to be really explicit about HTTP!
self.subdomain_host = 'https://' + new
port = Integer(8000, config=True,
help="The public facing port of the proxy"
)
base_url = URLPrefix('/', config=True,
help="The base URL of the entire application"
)
logo_file = Unicode('', config=True,
help="Specify path to a logo image to override the Jupyter logo in the banner."
)
def _logo_file_default(self):
return os.path.join(self.data_files_path, 'static', 'images', 'jupyter.png')
jinja_environment_options = Dict(config=True,
help="Supply extra arguments that will be passed to Jinja environment."
@@ -260,7 +294,7 @@ class JupyterHub(Application):
token = orm.new_token()
return token
proxy_api_ip = Unicode(localhost(), config=True,
proxy_api_ip = Unicode('127.0.0.1', config=True,
help="The ip for the proxy API handlers"
)
proxy_api_port = Integer(config=True,
@@ -272,10 +306,9 @@ class JupyterHub(Application):
hub_port = Integer(8081, config=True,
help="The port for this process"
)
hub_ip = Unicode(localhost(), config=True,
hub_ip = Unicode('127.0.0.1', config=True,
help="The ip for this process"
)
hub_prefix = URLPrefix('/hub/', config=True,
help="The prefix for the hub server. Must not be '/'"
)
@@ -475,6 +508,8 @@ class JupyterHub(Application):
# set default handlers
h.extend(handlers.default_handlers)
h.extend(apihandlers.default_handlers)
h.append((r'/logo', LogoHandler, {'path': self.logo_file}))
self.handlers = self.add_url_prefix(self.hub_prefix, h)
# some extra handlers, outside hub_prefix
self.handlers.extend([
@@ -559,11 +594,15 @@ class JupyterHub(Application):
q = self.db.query(orm.Hub)
assert q.count() <= 1
self._local.hub = q.first()
if self.subdomain_host and self._local.hub:
self._local.hub.host = self.subdomain_host
return self._local.hub
@hub.setter
def hub(self, hub):
self._local.hub = hub
if hub and self.subdomain_host:
hub.host = self.subdomain_host
@property
def proxy(self):
@@ -616,6 +655,10 @@ class JupyterHub(Application):
server.ip = self.hub_ip
server.port = self.hub_port
server.base_url = self.hub_prefix
if self.subdomain_host:
if not self.subdomain_host:
raise ValueError("Must specify subdomain_host when using subdomains."
" This should be the public domain[:port] of the Hub.")
self.db.commit()
@@ -794,12 +837,26 @@ class JupyterHub(Application):
'--api-port', str(self.proxy.api_server.port),
'--default-target', self.hub.server.host,
]
if self.subdomain_host:
cmd.append('--host-routing')
if self.debug_proxy:
cmd.extend(['--log-level', 'debug'])
if self.ssl_key:
cmd.extend(['--ssl-key', self.ssl_key])
if self.ssl_cert:
cmd.extend(['--ssl-cert', self.ssl_cert])
# Require SSL to be used or `--no-ssl` to confirm no SSL on
if ' --ssl' not in ' '.join(cmd):
if self.confirm_no_ssl:
self.log.warning("Running JupyterHub without SSL."
" There better be SSL termination happening somewhere else...")
else:
self.log.error(
"Refusing to run JuptyterHub without SSL."
" If you are terminating SSL in another layer,"
" pass --no-ssl to tell JupyterHub to allow the proxy to listen on HTTP."
)
self.exit(1)
self.log.info("Starting proxy @ %s", self.proxy.public_server.bind_url)
self.log.debug("Proxy cmd: %s", cmd)
try:
@@ -840,7 +897,7 @@ class JupyterHub(Application):
)
yield self.start_proxy()
self.log.info("Setting up routes on new proxy")
yield self.proxy.add_all_users()
yield self.proxy.add_all_users(self.users)
self.log.info("New proxy back up, and good to go")
def init_tornado_settings(self):
@@ -866,6 +923,8 @@ class JupyterHub(Application):
else:
version_hash=datetime.now().strftime("%Y%m%d%H%M%S"),
subdomain_host = self.subdomain_host
domain = urlparse(subdomain_host).hostname
settings = dict(
log_function=log_request,
config=self.config,
@@ -888,6 +947,8 @@ class JupyterHub(Application):
template_path=self.template_paths,
jinja2_env=jinja_env,
version_hash=version_hash,
subdomain_host=subdomain_host,
domain=domain,
)
# allow configured settings to have priority
settings.update(self.tornado_settings)
@@ -1024,7 +1085,7 @@ class JupyterHub(Application):
user.last_activity = max(user.last_activity, dt)
self.db.commit()
yield self.proxy.check_routes(routes)
yield self.proxy.check_routes(self.users, routes)
@gen.coroutine
def start(self):
@@ -1059,7 +1120,7 @@ class JupyterHub(Application):
self.exit(1)
return
loop.add_callback(self.proxy.add_all_users)
loop.add_callback(self.proxy.add_all_users, self.users)
if self.proxy_process:
# only check / restart the proxy if we started it in the first place.

View File

@@ -51,6 +51,14 @@ class BaseHandler(RequestHandler):
def version_hash(self):
return self.settings.get('version_hash', '')
@property
def subdomain_host(self):
return self.settings.get('subdomain_host', '')
@property
def domain(self):
return self.settings['domain']
@property
def db(self):
return self.settings['db']
@@ -199,42 +207,44 @@ class BaseHandler(RequestHandler):
user = self.get_current_user()
else:
user = self.find_user(name)
kwargs = {}
if self.subdomain_host:
kwargs['domain'] = self.domain
if user and user.server:
self.clear_cookie(user.server.cookie_name, path=user.server.base_url)
self.clear_cookie(self.hub.server.cookie_name, path=self.hub.server.base_url)
self.clear_cookie(user.server.cookie_name, path=user.server.base_url, **kwargs)
self.clear_cookie(self.hub.server.cookie_name, path=self.hub.server.base_url, **kwargs)
def _set_user_cookie(self, user, server):
# tornado <4.2 have a bug that consider secure==True as soon as
# 'secure' kwarg is passed to set_secure_cookie
if self.request.protocol == 'https':
kwargs = {'secure': True}
else:
kwargs = {}
if self.subdomain_host:
kwargs['domain'] = self.domain
self.log.debug("Setting cookie for %s: %s, %s", user.name, server.cookie_name, kwargs)
self.set_secure_cookie(
server.cookie_name,
user.cookie_id,
path=server.base_url,
**kwargs
)
def set_server_cookie(self, user):
"""set the login cookie for the single-user server"""
# tornado <4.2 have a bug that consider secure==True as soon as
# 'secure' kwarg is passed to set_secure_cookie
if self.request.protocol == 'https':
kwargs = {'secure':True}
else:
kwargs = {}
self.set_secure_cookie(
user.server.cookie_name,
user.cookie_id,
path=user.server.base_url,
**kwargs
)
self._set_user_cookie(user, user.server)
def set_hub_cookie(self, user):
"""set the login cookie for the Hub"""
# tornado <4.2 have a bug that consider secure==True as soon as
# 'secure' kwarg is passed to set_secure_cookie
if self.request.protocol == 'https':
kwargs = {'secure':True}
else:
kwargs = {}
self.set_secure_cookie(
self.hub.server.cookie_name,
user.cookie_id,
path=self.hub.server.base_url,
**kwargs
)
self._set_user_cookie(user, self.hub.server)
def set_login_cookie(self, user):
"""Set login cookies for the Hub and single-user server."""
if self.subdomain_host and not self.request.host.startswith(self.domain):
self.log.warning(
"Possibly setting cookie on wrong domain: %s != %s",
self.request.host, self.domain)
# create and set a new cookie token for the single-user server
if user.server:
self.set_server_cookie(user)
@@ -296,16 +306,23 @@ class BaseHandler(RequestHandler):
yield gen.with_timeout(timedelta(seconds=self.slow_spawn_timeout), f)
except gen.TimeoutError:
if user.spawn_pending:
status = yield user.spawner.poll()
if status is None:
# hit timeout, but spawn is still pending
# still in Spawner.start, which is taking a long time
# we shouldn't poll while spawn_pending is True
self.log.warn("User %s server is slow to start", user.name)
# schedule finish for when the user finishes spawning
IOLoop.current().add_future(f, finish_user_spawn)
else:
raise web.HTTPError(500, "Spawner failed to start [status=%s]" % status)
# start has finished, but the server hasn't come up
# check if the server died while we were waiting
status = yield user.spawner.poll()
if status is None:
# hit timeout, but server's running. Hope that it'll show up soon enough,
# though it's possible that it started at the wrong URL
self.log.warn("User %s server is slow to become responsive", user.name)
# schedule finish for when the user finishes spawning
IOLoop.current().add_future(f, finish_user_spawn)
else:
raise
raise web.HTTPError(500, "Spawner failed to start [status=%s]" % status)
else:
yield finish_user_spawn()
@@ -437,15 +454,19 @@ class PrefixRedirectHandler(BaseHandler):
class UserSpawnHandler(BaseHandler):
"""Requests to /user/name handled by the Hub
should result in spawning the single-user server and
being redirected to the original.
"""Redirect requests to /user/name/* handled by the Hub.
If logged in, spawn a single-user server and redirect request.
If a user, alice, requests /user/bob/notebooks/mynotebook.ipynb,
redirect her to /user/alice/notebooks/mynotebook.ipynb, which should
in turn call this function.
"""
@gen.coroutine
def get(self, name):
def get(self, name, user_path):
current_user = self.get_current_user()
if current_user and current_user.name == name:
# logged in, spawn the server
# logged in as correct user, spawn the server
if current_user.spawner:
if current_user.spawn_pending:
# spawn has started, but not finished
@@ -465,15 +486,22 @@ class UserSpawnHandler(BaseHandler):
self.set_login_cookie(current_user)
without_prefix = self.request.uri[len(self.hub.server.base_url):]
target = url_path_join(self.base_url, without_prefix)
if self.subdomain_host:
target = current_user.host + target
self.redirect(target)
elif current_user:
# logged in as a different user, redirect
target = url_path_join(self.base_url, 'user', current_user.name,
user_path or '')
self.redirect(target)
else:
# not logged in to the right user,
# clear any cookies and reload (will redirect to login)
# not logged in, clear any cookies and reload
self.clear_login_cookie()
self.redirect(url_concat(
self.settings['login_url'],
{'next': self.request.uri,
}))
{'next': self.request.uri},
))
class CSPReportHandler(BaseHandler):
'''Accepts a content security policy violation report'''
@@ -484,6 +512,6 @@ class CSPReportHandler(BaseHandler):
self.request.body.decode('utf8', 'replace'))
default_handlers = [
(r'/user/([^/]+)/?.*', UserSpawnHandler),
(r'/user/([^/]+)(/.*)?', UserSpawnHandler),
(r'/security/csp-report', CSPReportHandler),
]

View File

@@ -86,7 +86,10 @@ class LoginHandler(BaseHandler):
self.finish(html)
# Only logout is a default handler.
# /login renders the login page or the "Login with..." link,
# so it should always be registered.
# /logout clears cookies.
default_handlers = [
(r"/login", LoginHandler),
(r"/logout", LogoutHandler),
]

View File

@@ -25,7 +25,7 @@ class RootHandler(BaseHandler):
user = self.get_current_user()
if user:
if user.running:
url = user.server.base_url
url = user.url
self.log.debug("User is running: %s", url)
else:
url = url_path_join(self.hub.server.base_url, 'home')
@@ -67,7 +67,7 @@ class SpawnHandler(BaseHandler):
"""GET renders form for spawning with user-specified options"""
user = self.get_current_user()
if user.running:
url = user.server.base_url
url = user.url
self.log.debug("User is running: %s", url)
self.redirect(url)
return
@@ -84,7 +84,7 @@ class SpawnHandler(BaseHandler):
"""POST spawns with user-specified options"""
user = self.get_current_user()
if user.running:
url = user.server.base_url
url = user.url
self.log.warning("User is already running: %s", url)
self.redirect(url)
return
@@ -93,15 +93,15 @@ class SpawnHandler(BaseHandler):
form_options[key] = [ bs.decode('utf8') for bs in byte_list ]
for key, byte_list in self.request.files.items():
form_options["%s_file"%key] = byte_list
options = user.spawner.options_from_form(form_options)
try:
options = user.spawner.options_from_form(form_options)
yield self.spawn_single_user(user, options=options)
except Exception as e:
self.log.error("Failed to spawn single-user server with form", exc_info=True)
self.finish(self._render_form(str(e)))
return
self.set_login_cookie(user)
url = user.server.base_url
url = user.url
self.redirect(url)
class AdminHandler(BaseHandler):

View File

@@ -1,6 +1,7 @@
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import os
from tornado.web import StaticFileHandler
class CacheControlStaticFilesHandler(StaticFileHandler):
@@ -15,3 +16,13 @@ class CacheControlStaticFilesHandler(StaticFileHandler):
if "v" not in self.request.arguments:
self.add_header("Cache-Control", "no-cache")
class LogoHandler(StaticFileHandler):
"""A singular handler for serving the logo."""
def get(self):
return super().get('')
@classmethod
def get_absolute_path(cls, root, path):
"""We only serve one file, ignore relative path"""
return os.path.abspath(root)

View File

@@ -4,15 +4,13 @@
# Distributed under the terms of the Modified BSD License.
from datetime import datetime
import errno
import json
import socket
from tornado import gen
from tornado.log import app_log
from tornado.httpclient import HTTPRequest, AsyncHTTPClient
from sqlalchemy.types import TypeDecorator, VARCHAR
from sqlalchemy.types import TypeDecorator, TEXT
from sqlalchemy import (
inspect,
Column, Integer, ForeignKey, Unicode, Boolean,
@@ -26,7 +24,7 @@ from sqlalchemy import create_engine
from .utils import (
random_port, url_path_join, wait_for_server, wait_for_http_server,
new_token, hash_token, compare_token, localhost,
new_token, hash_token, compare_token, can_connect,
)
@@ -39,7 +37,7 @@ class JSONDict(TypeDecorator):
"""
impl = VARCHAR
impl = TEXT
def process_bind_param(self, value, dialect):
if value is not None:
@@ -64,11 +62,11 @@ class Server(Base):
"""
__tablename__ = 'servers'
id = Column(Integer, primary_key=True)
proto = Column(Unicode, default='http')
ip = Column(Unicode, default='')
proto = Column(Unicode(15), default='http')
ip = Column(Unicode(255), default='') # could also be a DNS name
port = Column(Integer, default=random_port)
base_url = Column(Unicode, default='/')
cookie_name = Column(Unicode, default='cookie')
base_url = Column(Unicode(255), default='/')
cookie_name = Column(Unicode(255), default='cookie')
def __repr__(self):
return "<Server(%s:%s)>" % (self.ip, self.port)
@@ -78,7 +76,7 @@ class Server(Base):
ip = self.ip
if ip in {'', '0.0.0.0'}:
# when listening on all interfaces, connect to localhost
ip = localhost()
ip = '127.0.0.1'
return "{proto}://{ip}:{port}".format(
proto=self.proto,
ip=ip,
@@ -100,7 +98,7 @@ class Server(Base):
since it can be non-connectable value, such as '', meaning all interfaces.
"""
if self.ip in {'', '0.0.0.0'}:
return self.url.replace('localhost', self.ip or '*', 1)
return self.url.replace('127.0.0.1', self.ip or '*', 1)
return self.url
@gen.coroutine
@@ -109,29 +107,11 @@ class Server(Base):
if http:
yield wait_for_http_server(self.url, timeout=timeout)
else:
yield wait_for_server(self.ip or localhost(), self.port, timeout=timeout)
yield wait_for_server(self.ip or '127.0.0.1', self.port, timeout=timeout)
def is_up(self):
"""Is the server accepting connections?"""
try:
socket.create_connection((self.ip or localhost(), self.port))
except socket.error as e:
if e.errno == errno.ENETUNREACH:
try:
socket.create_connection((self.ip or '127.0.0.1', self.port))
except socket.error as e:
if e.errno == errno.ECONNREFUSED:
return False
else:
raise
else:
return True
elif e.errno == errno.ECONNREFUSED:
return False
else:
raise
else:
return True
return can_connect(self.ip or '127.0.0.1', self.port)
class Proxy(Base):
@@ -176,10 +156,14 @@ class Proxy(Base):
def add_user(self, user, client=None):
"""Add a user's server to the proxy table."""
self.log.info("Adding user %s to proxy %s => %s",
user.name, user.server.base_url, user.server.host,
user.name, user.proxy_path, user.server.host,
)
yield self.api_request(user.server.base_url,
if user.spawn_pending:
raise RuntimeError(
"User %s's spawn is pending, shouldn't be added to the proxy yet!" % user.name)
yield self.api_request(user.proxy_path,
method='POST',
body=dict(
target=user.server.host,
@@ -192,26 +176,11 @@ class Proxy(Base):
def delete_user(self, user, client=None):
"""Remove a user's server to the proxy table."""
self.log.info("Removing user %s from proxy", user.name)
yield self.api_request(user.server.base_url,
yield self.api_request(user.proxy_path,
method='DELETE',
client=client,
)
@gen.coroutine
def add_all_users(self):
"""Update the proxy table from the database.
Used when loading up a new proxy.
"""
db = inspect(self).session
futures = []
for user in db.query(User):
if (user.server):
futures.append(self.add_user(user))
# wait after submitting them all
for f in futures:
yield f
@gen.coroutine
def get_routes(self, client=None):
"""Fetch the proxy's routes"""
@@ -219,17 +188,42 @@ class Proxy(Base):
return json.loads(resp.body.decode('utf8', 'replace'))
@gen.coroutine
def check_routes(self, routes=None):
"""Check that all users are properly"""
def add_all_users(self, user_dict):
"""Update the proxy table from the database.
Used when loading up a new proxy.
"""
db = inspect(self).session
futures = []
for orm_user in db.query(User):
user = user_dict[orm_user]
if user.running:
futures.append(self.add_user(user))
# wait after submitting them all
for f in futures:
yield f
@gen.coroutine
def check_routes(self, user_dict, routes=None):
"""Check that all users are properly routed on the proxy"""
if not routes:
routes = yield self.get_routes()
have_routes = { r['user'] for r in routes.values() if 'user' in r }
futures = []
db = inspect(self).session
for user in db.query(User).filter(User.server != None):
for orm_user in db.query(User).filter(User.server != None):
user = user_dict[orm_user]
if not user.running:
# Don't add users to the proxy that haven't finished starting
continue
if user.server is None:
# This should never be True, but it seems to happen on rare occasions.
# catch filter bug, either in sqlalchemy or my understanding of its behavior
self.log.error("User %s has no server, but wasn't filtered out.", user)
continue
if user.name not in have_routes:
self.log.warn("Adding missing route for %s", user.name)
self.log.warning("Adding missing route for %s (%s)", user.name, user.server)
futures.append(self.add_user(user))
for f in futures:
yield f
@@ -248,6 +242,7 @@ class Hub(Base):
id = Column(Integer, primary_key=True)
_server_id = Column(Integer, ForeignKey('servers.id'))
server = relationship(Server, primaryjoin=_server_id == Server.id)
host = ''
@property
def api_url(self):
@@ -281,7 +276,7 @@ class User(Base):
"""
__tablename__ = 'users'
id = Column(Integer, primary_key=True, autoincrement=True)
name = Column(Unicode)
name = Column(Unicode(1023))
# should we allow multiple servers per user?
_server_id = Column(Integer, ForeignKey('servers.id'))
server = relationship(Server, primaryjoin=_server_id == Server.id)
@@ -289,7 +284,7 @@ class User(Base):
last_activity = Column(DateTime, default=datetime.utcnow)
api_tokens = relationship("APIToken", backref="user")
cookie_id = Column(Unicode, default=new_token)
cookie_id = Column(Unicode(1023), default=new_token)
state = Column(JSONDict)
other_user_cookies = set([])
@@ -336,8 +331,8 @@ class APIToken(Base):
return Column(Integer, ForeignKey('users.id'))
id = Column(Integer, primary_key=True)
hashed = Column(Unicode)
prefix = Column(Unicode)
hashed = Column(Unicode(1023))
prefix = Column(Unicode(1023))
prefix_length = 4
algorithm = "sha512"
rounds = 16384

View File

@@ -11,7 +11,7 @@ import signal
import sys
import grp
from subprocess import Popen
from tempfile import TemporaryDirectory
from tempfile import mkdtemp
from tornado import gen
from tornado.ioloop import IOLoop, PeriodicCallback
@@ -22,7 +22,7 @@ from traitlets import (
)
from .traitlets import Command
from .utils import random_port, localhost
from .utils import random_port
class Spawner(LoggingConfigurable):
"""Base class for spawning single-user notebook servers.
@@ -41,7 +41,7 @@ class Spawner(LoggingConfigurable):
hub = Any()
authenticator = Any()
api_token = Unicode()
ip = Unicode(localhost(), config=True,
ip = Unicode('127.0.0.1', config=True,
help="The IP address (or hostname) the single-user server should listen on"
)
start_timeout = Integer(60, config=True,
@@ -140,6 +140,25 @@ class Spawner(LoggingConfigurable):
"""
)
default_url = Unicode('', config=True,
help="""The default URL for the single-user server.
Can be used in conjunction with --notebook-dir=/ to enable
full filesystem traversal, while preserving user's homedir as
landing page for notebook
`%U` will be expanded to the user's username
"""
)
disable_user_config = Bool(False, config=True,
help="""Disable per-user configuration of single-user servers.
This prevents any config in users' $HOME directories
from having an effect on their server.
"""
)
def __init__(self, **kwargs):
super(Spawner, self).__init__(**kwargs)
if self.user.state:
@@ -199,6 +218,7 @@ class Spawner(LoggingConfigurable):
'--port=%i' % self.user.server.port,
'--cookie-name=%s' % self.user.server.cookie_name,
'--base-url=%s' % self.user.server.base_url,
'--hub-host=%s' % self.hub.host,
'--hub-prefix=%s' % self.hub.server.base_url,
'--hub-api-url=%s' % self.hub.api_url,
]
@@ -207,8 +227,14 @@ class Spawner(LoggingConfigurable):
if self.notebook_dir:
self.notebook_dir = self.notebook_dir.replace("%U",self.user.name)
args.append('--notebook-dir=%s' % self.notebook_dir)
if self.default_url:
self.default_url = self.default_url.replace("%U",self.user.name)
args.append('--NotebookApp.default_url=%s' % self.default_url)
if self.debug:
args.append('--debug')
if self.disable_user_config:
args.append('--disable-user-config')
args.extend(self.args)
return args
@@ -302,12 +328,13 @@ def _try_setcwd(path):
try:
os.chdir(path)
except OSError as e:
exc = e # break exception instance out of except scope
print("Couldn't set CWD to %s (%s)" % (path, e), file=sys.stderr)
path, _ = os.path.split(path)
else:
return
print("Couldn't set CWD at all (%s), using temp dir" % e, file=sys.stderr)
td = TemporaryDirectory().name
print("Couldn't set CWD at all (%s), using temp dir" % exc, file=sys.stderr)
td = mkdtemp()
os.chdir(td)

View File

@@ -1,7 +1,7 @@
"""mock utilities for testing"""
import os
import sys
from datetime import timedelta
from tempfile import NamedTemporaryFile
import threading
@@ -13,11 +13,11 @@ from tornado import gen
from tornado.concurrent import Future
from tornado.ioloop import IOLoop
from ..spawner import LocalProcessSpawner
from ..app import JupyterHub
from ..auth import PAMAuthenticator
from .. import orm
from ..utils import localhost
from ..spawner import LocalProcessSpawner
from ..utils import url_path_join
from pamela import PAMError
@@ -109,9 +109,15 @@ class MockHub(JupyterHub):
"""Hub with various mock bits"""
db_file = None
confirm_no_ssl = True
last_activity_interval = 2
def _subdomain_host_default(self):
return os.environ.get('JUPYTERHUB_TEST_SUBDOMAIN_HOST', '')
def _ip_default(self):
return localhost()
return '127.0.0.1'
def _authenticator_class_default(self):
return MockPAMAuthenticator
@@ -124,7 +130,8 @@ class MockHub(JupyterHub):
def start(self, argv=None):
self.db_file = NamedTemporaryFile()
self.db_url = 'sqlite:///' + self.db_file.name
self.pid_file = NamedTemporaryFile(delete=False).name
self.db_url = self.db_file.name
evt = threading.Event()
@@ -161,13 +168,33 @@ class MockHub(JupyterHub):
self.db_file.close()
def login_user(self, name):
r = requests.post(self.proxy.public_server.url + 'hub/login',
base_url = public_url(self)
r = requests.post(base_url + 'hub/login',
data={
'username': name,
'password': name,
},
allow_redirects=False,
)
r.raise_for_status()
assert r.cookies
return r.cookies
def public_host(app):
if app.subdomain_host:
return app.subdomain_host
else:
return app.proxy.public_server.host
def public_url(app):
return public_host(app) + app.proxy.public_server.base_url
def user_url(user, app):
if app.subdomain_host:
host = user.host
else:
host = public_host(app)
return host + user.server.base_url

View File

@@ -2,9 +2,8 @@
import json
import time
from datetime import timedelta
from queue import Queue
from urllib.parse import urlparse
from urllib.parse import urlparse, quote
import requests
@@ -14,6 +13,7 @@ from .. import orm
from ..user import User
from ..utils import url_path_join as ujoin
from . import mocking
from .mocking import public_url, user_url
def check_db_locks(func):
@@ -105,7 +105,7 @@ def test_auth_api(app):
def test_referer_check(app, io_loop):
url = app.hub.server.url
url = ujoin(public_url(app), app.hub.server.base_url)
host = urlparse(url).netloc
user = find_user(app.db, 'admin')
if user is None:
@@ -352,15 +352,19 @@ def test_spawn(app, io_loop):
assert status is None
assert user.server.base_url == '/user/%s' % name
r = requests.get(ujoin(app.proxy.public_server.url, user.server.base_url))
url = user_url(user, app)
print(url)
r = requests.get(url)
assert r.status_code == 200
assert r.text == user.server.base_url
r = requests.get(ujoin(app.proxy.public_server.url, user.server.base_url, 'args'))
r = requests.get(ujoin(url, 'args'))
assert r.status_code == 200
argv = r.json()
for expected in ['--user=%s' % name, '--base-url=%s' % user.server.base_url]:
assert expected in argv
if app.subdomain_host:
assert '--hub-host=%s' % app.subdomain_host in argv
r = api_request(app, 'users', name, 'server', method='delete')
assert r.status_code == 204
@@ -379,6 +383,7 @@ def test_slow_spawn(app, io_loop):
name = 'zoe'
user = add_user(db, app=app, name=name)
r = api_request(app, 'users', name, 'server', method='post')
app.tornado_settings['spawner_class'] = mocking.MockSpawner
r.raise_for_status()
assert r.status_code == 202
app_user = get_app_user(app, name)
@@ -428,6 +433,7 @@ def test_never_spawn(app, io_loop):
name = 'badger'
user = add_user(db, app=app, name=name)
r = api_request(app, 'users', name, 'server', method='post')
app.tornado_settings['spawner_class'] = mocking.MockSpawner
app_user = get_app_user(app, name)
assert app_user.spawner is not None
assert app_user.spawn_pending
@@ -450,6 +456,55 @@ def test_get_proxy(app, io_loop):
assert list(reply.keys()) == ['/']
def test_cookie(app):
db = app.db
name = 'patience'
user = add_user(db, app=app, name=name)
r = api_request(app, 'users', name, 'server', method='post')
assert r.status_code == 201
assert 'pid' in user.state
app_user = get_app_user(app, name)
cookies = app.login_user(name)
# cookie jar gives '"cookie-value"', we want 'cookie-value'
cookie = cookies[user.server.cookie_name][1:-1]
r = api_request(app, 'authorizations/cookie', user.server.cookie_name, "nothintoseehere")
assert r.status_code == 404
r = api_request(app, 'authorizations/cookie', user.server.cookie_name, quote(cookie, safe=''))
r.raise_for_status()
reply = r.json()
assert reply['name'] == name
# deprecated cookie in body:
r = api_request(app, 'authorizations/cookie', user.server.cookie_name, data=cookie)
r.raise_for_status()
reply = r.json()
assert reply['name'] == name
def test_token(app):
name = 'book'
user = add_user(app.db, app=app, name=name)
token = user.new_api_token()
r = api_request(app, 'authorizations/token', token)
r.raise_for_status()
user_model = r.json()
assert user_model['name'] == name
r = api_request(app, 'authorizations/token', 'notauthorized')
assert r.status_code == 404
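Both tests exercise the Hub's authorization endpoints; a hedged sketch of hitting the token endpoint directly with requests (the address and tokens are placeholders, not values from this diff):
import requests

hub_api = 'http://127.0.0.1:8081/hub/api'     # placeholder Hub API address
admin_token = 'REPLACE_WITH_ADMIN_API_TOKEN'  # placeholder credential

r = requests.get(
    hub_api + '/authorizations/token/' + 'some-user-api-token',
    headers={'Authorization': 'token %s' % admin_token},
)
r.raise_for_status()
print(r.json()['name'])   # owning user's name, as asserted in test_token above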
def test_options(app):
r = api_request(app, 'users', method='options')
r.raise_for_status()
assert 'Access-Control-Allow-Headers' in r.headers
def test_bad_json_body(app):
r = api_request(app, 'users', method='post', data='notjson')
assert r.status_code == 400
def test_shutdown(app):
r = api_request(app, 'shutdown', method='post', data=json.dumps({
'servers': True,

View File

@@ -3,8 +3,9 @@
import os
import re
import sys
from subprocess import check_output
from subprocess import check_output, Popen, PIPE
from tempfile import NamedTemporaryFile, TemporaryDirectory
from .mocking import MockHub
def test_help_all():
out = check_output([sys.executable, '-m', 'jupyterhub', '--help-all']).decode('utf8', 'replace')
@@ -23,10 +24,23 @@ def test_token_app():
def test_generate_config():
with NamedTemporaryFile(prefix='jupyterhub_config', suffix='.py') as tf:
cfg_file = tf.name
with open(cfg_file, 'w') as f:
f.write("c.A = 5")
p = Popen([sys.executable, '-m', 'jupyterhub',
'--generate-config', '-f', cfg_file],
stdout=PIPE, stdin=PIPE)
out, _ = p.communicate(b'n')
out = out.decode('utf8', 'replace')
assert os.path.exists(cfg_file)
with open(cfg_file) as f:
cfg_text = f.read()
assert cfg_text == 'c.A = 5'
out = check_output([sys.executable, '-m', 'jupyterhub',
'--generate-config', '-f', cfg_file]
).decode('utf8', 'replace')
p = Popen([sys.executable, '-m', 'jupyterhub',
'--generate-config', '-f', cfg_file],
stdout=PIPE, stdin=PIPE)
out, _ = p.communicate(b'x\ny')
out = out.decode('utf8', 'replace')
assert os.path.exists(cfg_file)
with open(cfg_file) as f:
cfg_text = f.read()

View File

@@ -20,7 +20,7 @@ def test_server(db):
assert server.proto == 'http'
assert isinstance(server.port, int)
assert isinstance(server.cookie_name, str)
assert server.host == 'http://localhost:%i' % server.port
assert server.host == 'http://127.0.0.1:%i' % server.port
assert server.url == server.host + '/'
assert server.bind_url == 'http://*:%i/' % server.port
server.ip = '127.0.0.1'

View File

@@ -1,6 +1,6 @@
"""Tests for HTML pages"""
from urllib.parse import urlparse
from urllib.parse import urlencode, urlparse
import requests
@@ -8,12 +8,11 @@ from ..utils import url_path_join as ujoin
from .. import orm
import mock
from .mocking import FormSpawner
from .mocking import FormSpawner, public_url, public_host, user_url
from .test_api import api_request
def get_page(path, app, **kw):
base_url = ujoin(app.proxy.public_server.host, app.hub.server.base_url)
base_url = ujoin(public_url(app), app.hub.server.base_url)
print(base_url)
return requests.get(ujoin(base_url, path), **kw)
@@ -22,15 +21,17 @@ def test_root_no_auth(app, io_loop):
routes = io_loop.run_sync(app.proxy.get_routes)
print(routes)
print(app.hub.server)
r = requests.get(app.proxy.public_server.host)
url = public_url(app)
print(url)
r = requests.get(url)
r.raise_for_status()
assert r.url == ujoin(app.proxy.public_server.host, app.hub.server.base_url, 'login')
assert r.url == ujoin(url, app.hub.server.base_url, 'login')
def test_root_auth(app):
cookies = app.login_user('river')
r = requests.get(app.proxy.public_server.host, cookies=cookies)
r = requests.get(public_url(app), cookies=cookies)
r.raise_for_status()
assert r.url == ujoin(app.proxy.public_server.host, '/user/river')
assert r.url == user_url(app.users['river'], app)
def test_home_no_auth(app):
r = get_page('home', app, allow_redirects=False)
@@ -62,6 +63,7 @@ def test_admin(app):
r.raise_for_status()
assert r.url.endswith('/admin')
def test_spawn_redirect(app, io_loop):
name = 'wash'
cookies = app.login_user(name)
@@ -100,7 +102,7 @@ def test_spawn_page(app):
def test_spawn_form(app, io_loop):
with mock.patch.dict(app.users.settings, {'spawner_class': FormSpawner}):
base_url = ujoin(app.proxy.public_server.host, app.hub.server.base_url)
base_url = ujoin(public_url(app), app.hub.server.base_url)
cookies = app.login_user('jones')
orm_u = orm.User.find(app.db, 'jones')
u = app.users[orm_u]
@@ -121,7 +123,7 @@ def test_spawn_form(app, io_loop):
def test_spawn_form_with_file(app, io_loop):
with mock.patch.dict(app.users.settings, {'spawner_class': FormSpawner}):
base_url = ujoin(app.proxy.public_server.host, app.hub.server.base_url)
base_url = ujoin(public_url(app), app.hub.server.base_url)
cookies = app.login_user('jones')
orm_u = orm.User.find(app.db, 'jones')
u = app.users[orm_u]
@@ -147,3 +149,88 @@ def test_spawn_form_with_file(app, io_loop):
'content_type': 'application/unknown'},
}
def test_user_redirect(app):
name = 'wash'
cookies = app.login_user(name)
r = get_page('/user/baduser', app, cookies=cookies)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
assert path == '/user/%s' % name
r = get_page('/user/baduser/test.ipynb', app, cookies=cookies)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
assert path == '/user/%s/test.ipynb' % name
r = get_page('/user/baduser/test.ipynb', app)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
assert path == '/hub/login'
query = urlparse(r.url).query
assert query == urlencode({'next': '/hub/user/baduser/test.ipynb'})
def test_login_fail(app):
name = 'wash'
base_url = public_url(app)
r = requests.post(base_url + 'hub/login',
data={
'username': name,
'password': 'wrong',
},
allow_redirects=False,
)
assert not r.cookies
def test_login_redirect(app, io_loop):
cookies = app.login_user('river')
user = app.users['river']
# no next_url, server running
io_loop.run_sync(user.spawn)
r = get_page('login', app, cookies=cookies, allow_redirects=False)
r.raise_for_status()
assert r.status_code == 302
assert '/user/river' in r.headers['Location']
# no next_url, server not running
io_loop.run_sync(user.stop)
r = get_page('login', app, cookies=cookies, allow_redirects=False)
r.raise_for_status()
assert r.status_code == 302
assert '/hub/' in r.headers['Location']
# next URL given, use it
r = get_page('login?next=/hub/admin', app, cookies=cookies, allow_redirects=False)
r.raise_for_status()
assert r.status_code == 302
assert r.headers['Location'].endswith('/hub/admin')
def test_logout(app):
name = 'wash'
cookies = app.login_user(name)
r = requests.get(public_host(app) + app.tornado_settings['logout_url'], cookies=cookies)
r.raise_for_status()
login_url = public_host(app) + app.tornado_settings['login_url']
assert r.url == login_url
assert r.cookies == {}
def test_static_files(app):
base_url = ujoin(public_url(app), app.hub.server.base_url)
print(base_url)
r = requests.get(ujoin(base_url, 'logo'))
r.raise_for_status()
assert r.headers['content-type'] == 'image/png'
r = requests.get(ujoin(base_url, 'static', 'images', 'jupyter.png'))
r.raise_for_status()
assert r.headers['content-type'] == 'image/png'
r = requests.get(ujoin(base_url, 'static', 'css', 'style.min.css'))
r.raise_for_status()
assert r.headers['content-type'] == 'text/css'

View File

@@ -4,6 +4,7 @@ import json
import os
from queue import Queue
from subprocess import Popen
from urllib.parse import urlparse
from .. import orm
from .mocking import MockHub
@@ -34,6 +35,8 @@ def test_external_proxy(request, io_loop):
'--api-port', str(proxy_port),
'--default-target', 'http://%s:%i' % (app.hub_ip, app.hub_port),
]
if app.subdomain_host:
cmd.append('--host-routing')
proxy = Popen(cmd, env=env)
def _cleanup_proxy():
if proxy.poll() is None:
@@ -60,7 +63,11 @@ def test_external_proxy(request, io_loop):
r.raise_for_status()
routes = io_loop.run_sync(app.proxy.get_routes)
assert sorted(routes.keys()) == ['/', '/user/river']
user_path = '/user/river'
if app.subdomain_host:
domain = urlparse(app.subdomain_host).hostname
user_path = '/%s.%s' % (name, domain) + user_path
assert sorted(routes.keys()) == ['/', user_path]
# teardown the proxy and start a new one in the same place
proxy.terminate()
@@ -76,7 +83,7 @@ def test_external_proxy(request, io_loop):
# check that the routes are correct
routes = io_loop.run_sync(app.proxy.get_routes)
assert sorted(routes.keys()) == ['/', '/user/river']
assert sorted(routes.keys()) == ['/', user_path]
# teardown the proxy again, and start a new one with different auth and port
proxy.terminate()
@@ -90,13 +97,16 @@ def test_external_proxy(request, io_loop):
'--api-port', str(proxy_port),
'--default-target', 'http://%s:%i' % (app.hub_ip, app.hub_port),
]
if app.subdomain_host:
cmd.append('--host-routing')
proxy = Popen(cmd, env=env)
wait_for_proxy()
# tell the hub where the new proxy is
r = api_request(app, 'proxy', method='patch', data=json.dumps({
'port': proxy_port,
'protocol': 'http',
'ip': app.ip,
'auth_token': new_auth_token,
}))
r.raise_for_status()
@@ -113,7 +123,8 @@ def test_external_proxy(request, io_loop):
# check that the routes are correct
routes = io_loop.run_sync(app.proxy.get_routes)
assert sorted(routes.keys()) == ['/', '/user/river']
assert sorted(routes.keys()) == ['/', user_path]
def test_check_routes(app, io_loop):
proxy = app.proxy
@@ -123,13 +134,24 @@ def test_check_routes(app, io_loop):
r.raise_for_status()
zoe = orm.User.find(app.db, 'zoe')
assert zoe is not None
zoe = app.users[zoe]
before = sorted(io_loop.run_sync(app.proxy.get_routes))
assert '/user/zoe' in before
io_loop.run_sync(app.proxy.check_routes)
assert zoe.proxy_path in before
io_loop.run_sync(lambda : app.proxy.check_routes(app.users))
io_loop.run_sync(lambda : proxy.delete_user(zoe))
during = sorted(io_loop.run_sync(app.proxy.get_routes))
assert '/user/zoe' not in during
io_loop.run_sync(app.proxy.check_routes)
assert zoe.proxy_path not in during
io_loop.run_sync(lambda : app.proxy.check_routes(app.users))
after = sorted(io_loop.run_sync(app.proxy.get_routes))
assert '/user/zoe' in after
assert zoe.proxy_path in after
assert before == after
def test_patch_proxy_bad_req(app):
r = api_request(app, 'proxy', method='patch')
assert r.status_code == 400
r = api_request(app, 'proxy', method='patch', data='notjson')
assert r.status_code == 400
r = api_request(app, 'proxy', method='patch', data=json.dumps([]))
assert r.status_code == 400
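For context, the PATCH /hub/api/proxy request that test_external_proxy sends (and that test_patch_proxy_bad_req probes with bad bodies) looks roughly like this; the address and tokens are placeholders:
import json
import requests

r = requests.patch(
    'http://127.0.0.1:8081/hub/api/proxy',          # placeholder Hub API URL
    headers={'Authorization': 'token REPLACE_ME'},   # placeholder admin token
    data=json.dumps({
        'ip': '127.0.0.1',               # where the new proxy's API listens
        'port': 8001,                    # its API port
        'protocol': 'http',
        'auth_token': 'new-proxy-auth-token',
    }),
)
r.raise_for_status()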

View File

@@ -4,9 +4,14 @@
# Distributed under the terms of the Modified BSD License.
import logging
import os
import signal
import sys
import tempfile
import time
from unittest import mock
from tornado import gen
from .. import spawner as spawnermod
from ..spawner import LocalProcessSpawner
@@ -39,13 +44,14 @@ def new_spawner(db, **kwargs):
kwargs.setdefault('INTERRUPT_TIMEOUT', 1)
kwargs.setdefault('TERM_TIMEOUT', 1)
kwargs.setdefault('KILL_TIMEOUT', 1)
kwargs.setdefault('poll_interval', 1)
return LocalProcessSpawner(db=db, **kwargs)
def test_spawner(db, io_loop):
spawner = new_spawner(db)
io_loop.run_sync(spawner.start)
assert spawner.user.server.ip == 'localhost'
assert spawner.user.server.ip == '127.0.0.1'
# wait for the process to get to the while True: loop
time.sleep(1)
@@ -59,7 +65,7 @@ def test_spawner(db, io_loop):
def test_single_user_spawner(db, io_loop):
spawner = new_spawner(db, cmd=['jupyterhub-singleuser'])
io_loop.run_sync(spawner.start)
assert spawner.user.server.ip == 'localhost'
assert spawner.user.server.ip == '127.0.0.1'
# wait for http server to come up,
# checking for early termination every 1s
def wait():
@@ -110,3 +116,53 @@ def test_stop_spawner_stop_now(db, io_loop):
status = io_loop.run_sync(spawner.poll)
assert status == -signal.SIGTERM
def test_spawner_poll(db, io_loop):
first_spawner = new_spawner(db)
user = first_spawner.user
io_loop.run_sync(first_spawner.start)
proc = first_spawner.proc
status = io_loop.run_sync(first_spawner.poll)
assert status is None
user.state = first_spawner.get_state()
assert 'pid' in user.state
# create a new Spawner, loading from state of previous
spawner = new_spawner(db, user=first_spawner.user)
spawner.start_polling()
# wait for the process to get to the while True: loop
io_loop.run_sync(lambda : gen.sleep(1))
status = io_loop.run_sync(spawner.poll)
assert status is None
# kill the process
proc.terminate()
for i in range(10):
if proc.poll() is None:
time.sleep(1)
else:
break
assert proc.poll() is not None
io_loop.run_sync(lambda : gen.sleep(2))
status = io_loop.run_sync(spawner.poll)
assert status is not None
def test_setcwd():
cwd = os.getcwd()
with tempfile.TemporaryDirectory() as td:
td = os.path.realpath(os.path.abspath(td))
spawnermod._try_setcwd(td)
assert os.path.samefile(os.getcwd(), td)
os.chdir(cwd)
chdir = os.chdir
temp_root = os.path.realpath(os.path.abspath(tempfile.gettempdir()))
def raiser(path):
path = os.path.realpath(os.path.abspath(path))
if not path.startswith(temp_root):
raise OSError(path)
chdir(path)
with mock.patch('os.chdir', raiser):
spawnermod._try_setcwd(cwd)
assert os.getcwd().startswith(temp_root)
os.chdir(cwd)

View File

@@ -2,7 +2,7 @@
# Distributed under the terms of the Modified BSD License.
from datetime import datetime, timedelta
from urllib.parse import quote
from urllib.parse import quote, urlparse
from tornado import gen
from tornado.log import app_log
@@ -38,6 +38,12 @@ class UserDict(dict):
def __getitem__(self, key):
if isinstance(key, User):
key = key.id
elif isinstance(key, str):
orm_user = self.db.query(orm.User).filter(orm.User.name==key).first()
if orm_user is None:
raise KeyError("No such user: %s" % name)
else:
key = orm_user
if isinstance(key, orm.User):
# users[orm_user] returns User(orm_user)
orm_user = key
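With the new str branch, the users dict accepts names as well as ids and ORM rows; a minimal sketch (app is assumed to be a running JupyterHub instance and 'river' an existing user):
user = app.users['river']    # looks up orm.User by name, wraps it in a User
same = app.users[user.id]    # integer ids and orm.User objects still work
print(user.name, same.name)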
@@ -139,6 +145,8 @@ class User(HasTraits):
@property
def running(self):
"""property for whether a user has a running server"""
if self.spawn_pending or self.stop_pending:
return False # server is not running if spawn or stop is still pending
if self.server is None:
return False
return True
@@ -148,6 +156,41 @@ class User(HasTraits):
"""My name, escaped for use in URLs, cookies, etc."""
return quote(self.name, safe='@')
@property
def proxy_path(self):
if self.settings.get('subdomain_host'):
return url_path_join('/' + self.domain, self.server.base_url)
else:
return self.server.base_url
@property
def domain(self):
"""Get the domain for my server."""
# FIXME: escaped_name probably isn't escaped enough in general for a domain fragment
return self.escaped_name + '.' + self.settings['domain']
@property
def host(self):
"""Get the *host* for my server (domain[:port])"""
# FIXME: escaped_name probably isn't escaped enough in general for a domain fragment
parsed = urlparse(self.settings['subdomain_host'])
h = '%s://%s.%s' % (parsed.scheme, self.escaped_name, parsed.netloc)
return h
@property
def url(self):
"""My URL
Full name.domain/path if using subdomains, otherwise just my /base/url
"""
if self.settings.get('subdomain_host'):
return '{host}{path}'.format(
host=self.host,
path=self.server.base_url,
)
else:
return self.server.base_url
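A worked example of the new properties, assuming c.JupyterHub.subdomain_host = 'https://hub.example.com' (a placeholder domain) and a user named river whose server.base_url is '/user/river':
# user.domain     -> 'river.hub.example.com'
# user.host       -> 'https://river.hub.example.com'
# user.url        -> 'https://river.hub.example.com/user/river'
# user.proxy_path -> '/river.hub.example.com/user/river'
# Without subdomain_host, url and proxy_path are just server.base_url ('/user/river').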
@gen.coroutine
def spawn(self, options=None):
"""Start the user's spawner"""
@@ -206,6 +249,7 @@ class User(HasTraits):
self.state = spawner.get_state()
self.last_activity = datetime.utcnow()
db.commit()
self.spawn_pending = False
try:
yield self.server.wait_up(http=True, timeout=spawner.http_timeout)
except Exception as e:
@@ -232,7 +276,6 @@ class User(HasTraits):
), exc_info=True)
# raise original TimeoutError
raise e
self.spawn_pending = False
return self
@gen.coroutine

View File

@@ -30,22 +30,32 @@ def random_port():
ISO8601_ms = '%Y-%m-%dT%H:%M:%S.%fZ'
ISO8601_s = '%Y-%m-%dT%H:%M:%SZ'
def can_connect(ip, port):
"""Check if we can connect to an ip:port
return True if we can connect, False otherwise.
"""
try:
socket.create_connection((ip, port))
except socket.error as e:
if e.errno not in {errno.ECONNREFUSED, errno.ETIMEDOUT}:
app_log.error("Unexpected error connecting to %s:%i %s",
ip, port, e
)
return False
else:
return True
@gen.coroutine
def wait_for_server(ip, port, timeout=10):
"""wait for any server to show up at ip:port"""
loop = ioloop.IOLoop.current()
tic = loop.time()
while loop.time() - tic < timeout:
try:
socket.create_connection((ip, port))
except socket.error as e:
if e.errno != errno.ECONNREFUSED:
app_log.error("Unexpected error waiting for %s:%i %s",
ip, port, e
)
yield gen.sleep(0.1)
else:
if can_connect(ip, port):
return
else:
yield gen.sleep(0.1)
raise TimeoutError("Server at {ip}:{port} didn't respond in {timeout} seconds".format(
**locals()
))
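A minimal sketch of the shared connection check used from a coroutine (host and port are placeholders; wait_for_server raises TimeoutError if nothing comes up within the timeout):
from tornado import gen, ioloop
from jupyterhub.utils import can_connect, wait_for_server

@gen.coroutine
def ensure_up():
    if not can_connect('127.0.0.1', 8888):                    # one-shot probe
        yield wait_for_server('127.0.0.1', 8888, timeout=10)  # poll until reachable

ioloop.IOLoop.current().run_sync(ensure_up)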
@@ -195,35 +205,3 @@ def url_path_join(*pieces):
return result
def localhost():
"""Return localhost or 127.0.0.1"""
if hasattr(localhost, '_localhost'):
return localhost._localhost
binder = connector = None
try:
binder = socket.socket()
binder.bind(('localhost', 0))
binder.listen(1)
port = binder.getsockname()[1]
def accept():
try:
conn, addr = binder.accept()
except ConnectionAbortedError:
pass
else:
conn.close()
t = Thread(target=accept)
t.start()
connector = socket.create_connection(('localhost', port), timeout=10)
t.join(timeout=10)
except (socket.error, socket.gaierror) as e:
warnings.warn("localhost doesn't appear to work, using 127.0.0.1\n%s" % e, RuntimeWarning)
localhost._localhost = '127.0.0.1'
else:
localhost._localhost = 'localhost'
finally:
if binder:
binder.close()
if connector:
connector.close()
return localhost._localhost

View File

@@ -5,7 +5,7 @@
version_info = (
0,
4,
5,
0,
)

View File

@@ -17,30 +17,23 @@ from jinja2 import ChoiceLoader, FunctionLoader
from tornado import ioloop
from tornado.web import HTTPError
try:
import notebook
except ImportError:
raise ImportError("JupyterHub single-user server requires notebook >= 4.0")
from IPython.utils.traitlets import (
from traitlets import (
Bool,
Integer,
Unicode,
CUnicode,
)
try:
import notebook
# 4.x
except ImportError:
from IPython.html.notebookapp import NotebookApp, aliases as notebook_aliases
from IPython.html.auth.login import LoginHandler
from IPython.html.auth.logout import LogoutHandler
from IPython.html.utils import url_path_join
from distutils.version import LooseVersion as V
import IPython
if V(IPython.__version__) < V('3.0'):
raise ImportError("JupyterHub Requires IPython >= 3.0, found %s" % IPython.__version__)
else:
from notebook.notebookapp import NotebookApp, aliases as notebook_aliases
from notebook.notebookapp import (
NotebookApp,
aliases as notebook_aliases,
flags as notebook_flags,
)
from notebook.auth.login import LoginHandler
from notebook.auth.logout import LogoutHandler
@@ -116,7 +109,9 @@ class JupyterHubLoginHandler(LoginHandler):
class JupyterHubLogoutHandler(LogoutHandler):
def get(self):
self.redirect(url_path_join(self.settings['hub_prefix'], 'logout'))
self.redirect(
self.settings['hub_host'] +
url_path_join(self.settings['hub_prefix'], 'logout'))
# register new hub related command-line aliases
@@ -125,9 +120,18 @@ aliases.update({
'user' : 'SingleUserNotebookApp.user',
'cookie-name': 'SingleUserNotebookApp.cookie_name',
'hub-prefix': 'SingleUserNotebookApp.hub_prefix',
'hub-host': 'SingleUserNotebookApp.hub_host',
'hub-api-url': 'SingleUserNotebookApp.hub_api_url',
'base-url': 'SingleUserNotebookApp.base_url',
})
flags = dict(notebook_flags)
flags.update({
'disable-user-config': ({
'SingleUserNotebookApp': {
'disable_user_config': True
}
}, "Disable user-controlled configuration of the notebook server.")
})
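The flag pairs with the Spawner.disable_user_config trait added in this release; a sketch of turning it on from jupyterhub_config.py, which makes the spawner pass --disable-user-config to every single-user server:
# jupyterhub_config.py -- illustrative sketch only
c = get_config()
# Skip config files, nbextensions, and custom static files in users' home
# directories, so admins retain full control of single-user configuration.
c.Spawner.disable_user_config = True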
page_template = """
{% extends "templates/page.html" %}
@@ -141,8 +145,21 @@ page_template = """
>
Control Panel</a>
{% endblock %}
{% block logo %}
<img src='{{logo_url}}' alt='Jupyter Notebook'/>
{% endblock logo %}
"""
def _exclude_home(path_list):
"""Filter out any entries in a path list that are in my home directory.
Used to disable per-user configuration.
"""
home = os.path.expanduser('~')
for p in path_list:
if not p.startswith(home):
yield p
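For example (hypothetical paths, assuming the server runs with HOME=/home/river):
from jupyterhub.singleuser import _exclude_home

paths = ['/home/river/.jupyter', '/usr/local/etc/jupyter', '/etc/jupyter']
# Only the system-wide entries survive; per-user directories are dropped,
# which is how config_file_paths and nbextensions_path below are filtered.
print(list(_exclude_home(paths)))   # ['/usr/local/etc/jupyter', '/etc/jupyter']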
class SingleUserNotebookApp(NotebookApp):
"""A Subclass of the regular NotebookApp that is aware of the parent multiuser context."""
user = CUnicode(config=True)
@@ -150,12 +167,23 @@ class SingleUserNotebookApp(NotebookApp):
self.log.name = new
cookie_name = Unicode(config=True)
hub_prefix = Unicode(config=True)
hub_host = Unicode(config=True)
hub_api_url = Unicode(config=True)
aliases = aliases
flags = flags
open_browser = False
trust_xheaders = True
login_handler_class = JupyterHubLoginHandler
logout_handler_class = JupyterHubLogoutHandler
port_retries = 0 # disable port-retries, since the Spawner will tell us what port to use
disable_user_config = Bool(False, config=True,
help="""Disable user configuration of single-user server.
Prevents user-writable files that normally configure the single-user server
from being loaded, ensuring admins have full control of configuration.
"""
)
cookie_cache_lifetime = Integer(
config=True,
@@ -183,6 +211,36 @@ class SingleUserNotebookApp(NotebookApp):
self.log.debug("Clearing cookie cache")
self.tornado_settings['cookie_cache'].clear()
def migrate_config(self):
if self.disable_user_config:
# disable config-migration when user config is disabled
return
else:
super(SingleUserNotebookApp, self).migrate_config()
@property
def config_file_paths(self):
path = super(SingleUserNotebookApp, self).config_file_paths
if self.disable_user_config:
# filter out user-writable config dirs if user config is disabled
path = list(_exclude_home(path))
return path
@property
def nbextensions_path(self):
path = super(SingleUserNotebookApp, self).nbextensions_path
if self.disable_user_config:
path = list(_exclude_home(path))
return path
def _static_custom_path_default(self):
path = super(SingleUserNotebookApp, self)._static_custom_path_default()
if self.disable_user_config:
path = list(_exclude_home(path))
return path
def start(self):
# Start a PeriodicCallback to clear cached cookies. This forces us to
# revalidate our user with the Hub at least every
@@ -202,20 +260,22 @@ class SingleUserNotebookApp(NotebookApp):
s['user'] = self.user
s['hub_api_key'] = env.pop('JPY_API_TOKEN')
s['hub_prefix'] = self.hub_prefix
s['hub_host'] = self.hub_host
s['cookie_name'] = self.cookie_name
s['login_url'] = self.hub_prefix
s['login_url'] = self.hub_host + self.hub_prefix
s['hub_api_url'] = self.hub_api_url
s['csp_report_uri'] = url_path_join(self.hub_prefix, 'security/csp-report')
s['csp_report_uri'] = self.hub_host + url_path_join(self.hub_prefix, 'security/csp-report')
super(SingleUserNotebookApp, self).init_webapp()
self.patch_templates()
def patch_templates(self):
"""Patch page templates to add Hub-related buttons"""
self.jinja_template_vars['logo_url'] = self.hub_host + url_path_join(self.hub_prefix, 'logo')
env = self.web_app.settings['jinja2_env']
env.globals['hub_control_panel_url'] = \
url_path_join(self.hub_prefix, 'home')
self.hub_host + url_path_join(self.hub_prefix, 'home')
# patch jinja env loading to modify page template
def get_page(name):

View File

@@ -83,8 +83,8 @@ setup_args = dict(
# this will be overridden when bower is run anyway
data_files = get_data_files() or ['dummy'],
version = ns['__version__'],
description = """JupyterHub: A multi-user server for Jupyter notebooks""",
long_description = "",
description = "JupyterHub: A multi-user server for Jupyter notebooks",
long_description = "See https://jupyterhub.readthedocs.org for more info.",
author = "Jupyter Development Team",
author_email = "jupyter@googlegroups.com",
url = "http://jupyter.org",

View File

@@ -82,7 +82,7 @@
<div id="header" class="navbar navbar-static-top">
<div class="container">
<span id="jupyterhub-logo" class="pull-left"><a href="{{base_url}}"><img src='{{static_url("images/jupyter.png") }}' alt='JupyterHub' class='jpy-logo' title='Home'/></a></span>
<span id="jupyterhub-logo" class="pull-left"><a href="{{base_url}}"><img src='{{base_url}}logo' alt='JupyterHub' class='jpy-logo' title='Home'/></a></span>
{% block login_widget %}

View File

@@ -165,15 +165,13 @@ def make_env(*packages):
return py
def build_sdist(py, upload=False):
def build_sdist(py):
"""Build sdists
Returns the path to the tarball
"""
with cd(repo_root):
cmd = [py, 'setup.py', 'sdist', '--formats=zip,gztar']
if upload:
cmd.append('upload')
run(cmd)
return glob.glob(pjoin(repo_root, 'dist', '*.tar.gz'))[0]
@@ -184,7 +182,12 @@ def sdist(vs, upload=False):
clone_repo()
tag(vs, push=upload)
py = make_env()
tarball = build_sdist(py, upload=upload)
tarball = build_sdist(py)
if upload:
with cd(repo_root):
install(py, 'twine')
run([py, '-m', 'twine', 'upload', 'dist/*'])
untag(vs, push=upload)
return untar(tarball)
@@ -214,14 +217,10 @@ def untar(tarball):
return glob.glob(pjoin(sdist_root, '*'))[0]
def bdist(upload=False):
def bdist():
"""build a wheel, optionally uploading it"""
py = make_env('wheel')
cmd = [py, 'setup.py', 'bdist_wheel']
if upload:
cmd.append('upload')
run(cmd)
run([py, 'setup.py', 'bdist_wheel'])
@task
@@ -233,7 +232,10 @@ def release(vs, upload=False):
shutil.rmtree(env_root)
path = sdist(vs, upload=upload)
print("Working in %r" % path)
with cd(path):
bdist(upload=upload)
bdist()
if upload:
py = make_env('twine')
run([py, '-m', 'twine', 'upload', 'dist/*'])