Merge branch 'master' into jupyter_2105

Min RK
2018-10-19 09:31:45 +02:00
committed by GitHub
40 changed files with 1056 additions and 89 deletions

View File

@@ -2,7 +2,9 @@
mock
beautifulsoup4
codecov
coverage<5 # pin coverage to < 5 due to coveragepy#716
cryptography
html5lib # needed for beautifulsoup
pytest-cov
pytest-tornado
pytest>=3.3

View File

@@ -100,6 +100,11 @@ easy to do with RStudio too.
- https://dsa.missouri.edu/faq/
### Paderborn University
- [Data Science (DICE) group](https://dice.cs.uni-paderborn.de/)
- [nbgraderutils](https://github.com/dice-group/nbgraderutils): Use JupyterHub + nbgrader + iJava kernel for online Java exercises. Used in lecture Statistical Natural Language Processing.
### University of Rochester CIRC
- [JupyterHub Userguide](https://info.circ.rochester.edu/Web_Applications/JupyterHub.html) - Slurm, beehive

View File

@@ -124,7 +124,7 @@ hex-encoded string. You can set it this way:
.. code-block:: bash
export JPY_COOKIE_SECRET=`openssl rand -hex 32`
export JPY_COOKIE_SECRET=$(openssl rand -hex 32)
For security reasons, this environment variable should only be visible to the
Hub. If you set it dynamically as above, all users will be logged out each time
@@ -173,7 +173,7 @@ using the ``CONFIGPROXY_AUTH_TOKEN`` environment variable:
.. code-block:: bash
export CONFIGPROXY_AUTH_TOKEN='openssl rand -hex 32'
export CONFIGPROXY_AUTH_TOKEN=$(openssl rand -hex 32)
This environment variable needs to be visible to the Hub and Proxy.
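
If you prefer not to manage these secrets through environment variables, both
values can also be set in ``jupyterhub_config.py``; a minimal sketch (the path
and token below are placeholders) follows:

.. code-block:: python

    # persist the cookie secret to a file so users stay logged in across Hub restarts
    c.JupyterHub.cookie_secret_file = '/srv/jupyterhub/jupyterhub_cookie_secret'

    # token shared between the Hub and the configurable-http-proxy
    c.ConfigurableHTTPProxy.auth_token = '<output of: openssl rand -hex 32>'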

View File

@@ -172,7 +172,7 @@ c.JupyterHub.authenticator_class = 'mypackage:MyAuthenticator'
```
previously required.
Additionally, configurable attributes for your spawner will
Additionally, configurable attributes for your authenticator will
appear in jupyterhub help output and auto-generated configuration files
via `jupyterhub --generate-config`.

View File

@@ -45,15 +45,12 @@ If your proxy should be launched when the Hub starts, you must define how
to start and stop your proxy:
```python
from tornado import gen
class MyProxy(Proxy):
...
@gen.coroutine
def start(self):
async def start(self):
"""Start the proxy"""
@gen.coroutine
def stop(self):
async def stop(self):
"""Stop the proxy"""
```
@@ -62,6 +59,18 @@ These methods **may** be coroutines.
`c.Proxy.should_start` is a configurable flag that determines whether the
Hub should call these methods when the Hub itself starts and stops.
## Encryption
When using `internal_ssl` to encrypt traffic behind the proxy, at minimum,
your `Proxy` will need client ssl certificates which the `Hub` must be made
aware of. These can be generated with the command `jupyterhub --generate-certs`
which will write them to the `internal_certs_location` in folders named
`proxy_api` and `proxy_client`. Alternatively, they can be provided to the
hub via the `jupyterhub_config.py` file as a `dict` of named paths in the
`external_ssl_authorities` option. The hub will include all certificates
provided in that `dict` in the trust bundle utilized by all internal
components.
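As a sketch, an externally managed proxy that brings its own client certificate
authority could hand it to the Hub like this in `jupyterhub_config.py` (the paths
are placeholders; `proxy-client-ca` is one of the named internal authorities that
can be overridden):
```python
c.JupyterHub.internal_ssl = True

# Override the proxy-client CA with externally managed files.
# At least a cert must be provided for each authority.
c.JupyterHub.external_ssl_authorities = {
    'proxy-client-ca': {
        'key': '/etc/jupyterhub/ssl/proxy-client-ca.key',
        'cert': '/etc/jupyterhub/ssl/proxy-client-ca.crt',
    },
}
```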
### Purely external proxies
Probably most custom proxies will be externally managed,
@@ -100,8 +109,7 @@ Python wrapper may have to handle storing the `data` piece itself, e.g in a
simple file or database.
```python
@gen.coroutine
def add_route(self, routespec, target, data):
async def add_route(self, routespec, target, data):
"""Proxy `routespec` to `target`.
Store `data` associated with the routespec
@@ -112,7 +120,7 @@ def add_route(self, routespec, target, data):
Adding a route for a user looks like this:
```python
proxy.add_route('/user/pgeorgiou/', 'http://127.0.0.1:1227',
await proxy.add_route('/user/pgeorgiou/', 'http://127.0.0.1:1227',
{'user': 'pgeorgiou'})
```
@@ -122,8 +130,7 @@ proxy.add_route('/user/pgeorgiou/', 'http://127.0.0.1:1227',
`delete_route` should still succeed, but a warning may be issued.
```python
@gen.coroutine
def delete_route(self, routespec):
async def delete_route(self, routespec):
"""Delete the route"""
```
@@ -135,8 +142,7 @@ routes. The return value for this function should be a dictionary, keyed by
`add_route` (`routespec`, `target`, `data`)
```python
@gen.coroutine
def get_all_routes(self):
async def get_all_routes(self):
"""Return all routes, keyed by routespec"""
```
@@ -179,3 +185,38 @@ tracked, and services such as cull-idle will not work.
Now that `notebook-5.0` tracks activity internally, we can retrieve activity
information from the single-user servers instead, removing the need to track
activity in the proxy. But this is not yet implemented in JupyterHub 0.8.0.
### Registering custom Proxies via entry points
As of JupyterHub 1.0, custom proxy implementations can register themselves via
the `jupyterhub.proxies` entry point metadata.
To do this, in your `setup.py` add:
```python
setup(
...
entry_points={
'jupyterhub.proxies': [
'mything = mypackage:MyProxy',
],
},
)
```
If you have added this metadata to your package,
users can select your proxy with the configuration:
```python
c.JupyterHub.proxy_class = 'mything'
```
instead of the full
```python
c.JupyterHub.proxy_class = 'mypackage:MyProxy'
```
previously required.
Additionally, configurable attributes for your proxy will
appear in jupyterhub help output and auto-generated configuration files
via `jupyterhub --generate-config`.
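As a brief sketch of that last point, any traitlets on your proxy class tagged
with `config=True` become configurable this way (`api_endpoint` here is a made-up
attribute for illustration):
```python
from traitlets import Unicode
from jupyterhub.proxy import Proxy


class MyProxy(Proxy):
    # Appears in `jupyterhub --generate-config` output as
    # c.MyProxy.api_endpoint = ...
    api_endpoint = Unicode(
        'http://127.0.0.1:9000',
        config=True,
        help="Hypothetical URL where this proxy's own API listens",
    )
```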

View File

@@ -260,3 +260,30 @@ in the single-user notebook server when a guarantee is being provided.
**The spawner's underlying system or cluster is responsible for enforcing these
limits and providing these guarantees.** If these values are set to `None`, no
limits or guarantees are provided, and no environment values are set.
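As a rough sketch, a deployment that wants to advertise both limits and
guarantees might set the corresponding Spawner traits in `jupyterhub_config.py`
(the values here are arbitrary examples):
```python
# Limits: the most a single-user server may consume
c.Spawner.mem_limit = '2G'
c.Spawner.cpu_limit = 2.0

# Guarantees: the least a single-user server should always receive
c.Spawner.mem_guarantee = '512M'
c.Spawner.cpu_guarantee = 0.5
```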
### Encryption
Communication between the `Proxy`, `Hub`, and `Notebook` can be secured by
turning on `internal_ssl` in `jupyterhub_config.py`. For a custom spawner to
utilize these certs, there are two methods of interest on the base `Spawner`
class: `.create_certs` and `.move_certs`.
The first method, `.create_certs`, will sign a key-cert pair using an internally
trusted authority for notebooks. During this process, `.create_certs` can
apply `ip` and `dns` name information to the cert via an `alt_names` `kwarg`.
This is used for certificate authentication (verification). Without proper
verification, the `Notebook` will be unable to communicate with the `Hub` and
vice versa when `internal_ssl` is enabled. For example, given a deployment
using `DockerSpawner`, which starts containers with `ip`s from the `docker`
subnet pool, the spawner would need to choose a container `ip` prior to
starting and pass it to `.create_certs`. In general, though, this method will
not need to be changed and the default `ip`/`dns` (localhost) info will
suffice.
When `.create_certs` is run, it will create the certs in a default, central
location specified by `c.JupyterHub.internal_certs_location`. For `Spawners`
that need access to these certs elsewhere (e.g. on another host altogether),
the `.move_certs` method can be overridden to move the certs appropriately.
Again, using `DockerSpawner` as an example, this would entail moving certs
to a directory that will get mounted into the container this spawner starts.
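A minimal sketch of such an override (assuming a hypothetical spawner that
stages certs into a per-user directory which is later mounted or copied to the
target host) might look like:
```python
import os
import shutil

from jupyterhub.spawner import Spawner


class StagedCertsSpawner(Spawner):
    """Hypothetical spawner that stages internal-ssl certs per user."""

    async def move_certs(self, paths):
        # paths holds 'keyfile', 'certfile', and 'cafile' written by
        # .create_certs() under internal_certs_location
        dest = os.path.join('/srv/jupyterhub/user-certs', self.user.name)
        os.makedirs(dest, mode=0o700, exist_ok=True)
        staged = {}
        for name in ('keyfile', 'certfile', 'cafile'):
            target = os.path.join(dest, os.path.basename(paths[name]))
            shutil.copy(paths[name], target)
            staged[name] = target
        # return the paths the notebook server will actually read
        return staged
```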

View File

@@ -99,6 +99,23 @@ single-user server, and not the environment(s) in which the user's kernel(s)
may run. Installing additional packages in the kernel environment does not
pose additional risk to the web application's security.
### Encrypt internal connections with SSL/TLS
By default, all communication on the server, between the proxy, hub, and
single-user notebooks is performed unencrypted. Setting the `internal_ssl` flag in
`jupyterhub_config.py` secures the aforementioned routes. Turning this
feature on does require that the enabled `Spawner` can use the certificates
generated by the `Hub` (the default `LocalProcessSpawner` can, for instance).
It is also important to note that this encryption **does not** (yet) cover the
`zmq tcp` sockets between the Notebook client and kernel. While users cannot
submit arbitrary commands to another user's kernel, they can bind to these
sockets and listen. When serving untrusted users, this eavesdropping can be
mitigated by setting `KernelManager.transport` to `ipc`. This applies standard
Unix permissions to the communication sockets, thereby restricting
communication to the socket owner. The `internal_ssl` option will eventually
extend to securing the `tcp` sockets as well.
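A minimal sketch of the relevant settings (assuming the single-user servers read
their notebook configuration from `jupyter_notebook_config.py`) is:
```python
# jupyterhub_config.py: encrypt proxy <-> Hub <-> single-user traffic
c.JupyterHub.internal_ssl = True
# where the Hub writes the generated CAs and signed certificates
c.JupyterHub.internal_certs_location = 'internal-ssl'

# jupyter_notebook_config.py (in the single-user environment): use
# Unix-permission-protected ipc sockets between notebook server and kernels
c.KernelManager.transport = 'ipc'
```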
## Security audits
We recommend that you do periodic reviews of your deployment's security. It's

View File

@@ -118,7 +118,7 @@ Here's an example on what you could do in your shell script. See also
# - The first parameter for the Bootstrap Script is the USER.
USER=$1
if ["$USER" == ""]; then
if [ "$USER" == "" ]; then
exit 1
fi
# ----------------------------------------------------------------------------

View File

@@ -6,7 +6,7 @@
# - The first parameter for the Bootstrap Script is the USER.
USER=$1
if ["$USER" == ""]; then
if [ "$USER" == "" ]; then
exit 1
fi
# ----------------------------------------------------------------------------

View File

@@ -36,6 +36,6 @@ Generate an API token and store it in the `JUPYTERHUB_API_TOKEN` environment
variable. Run `cull_idle_servers.py` manually.
```bash
export JUPYTERHUB_API_TOKEN=`jupyterhub token`
export JUPYTERHUB_API_TOKEN=$(jupyterhub token)
python3 cull_idle_servers.py [--timeout=900] [--url=http://127.0.0.1:8081/hub/api]
```

View File

@@ -22,7 +22,7 @@ You can run this as a service managed by JupyterHub with this in your config::
Or run it manually by generating an API token and storing it in `JUPYTERHUB_API_TOKEN`:
export JUPYTERHUB_API_TOKEN=`jupyterhub token`
export JUPYTERHUB_API_TOKEN=$(jupyterhub token)
python3 cull_idle_servers.py [--timeout=900] [--url=http://127.0.0.1:8081/hub/api]
This script uses the same ``--timeout`` and ``--max-age`` values for

View File

@@ -18,7 +18,7 @@ implementations in other web servers or languages.
1. generate an API token:
export JUPYTERHUB_API_TOKEN=`openssl rand -hex 32`
export JUPYTERHUB_API_TOKEN=$(openssl rand -hex 32)
2. launch a version of the whoami service.
For `whoami-oauth`:

View File

@@ -1,4 +1,4 @@
export CONFIGPROXY_AUTH_TOKEN=`openssl rand -hex 32`
export CONFIGPROXY_AUTH_TOKEN=$(openssl rand -hex 32)
# start JupyterHub
jupyterhub --ip=127.0.0.1

View File

@@ -16,11 +16,11 @@ from operator import itemgetter
import os
import re
import signal
import socket
import sys
from textwrap import dedent
from urllib.parse import unquote, urlparse, urlunparse
if sys.version_info[:2] < (3, 3):
raise ValueError("Python < 3.3 not supported: %s" % sys.version)
@@ -63,6 +63,7 @@ from .utils import (
maybe_future,
url_path_join,
print_stacks, print_ps_info,
make_ssl_context,
)
# classes for config
from .auth import Authenticator, PAMAuthenticator
@@ -102,6 +103,8 @@ flags = {
"set log level to logging.DEBUG (maximize logging output)"),
'generate-config': ({'JupyterHub': {'generate_config': True}},
"generate default config file"),
'generate-certs': ({'JupyterHub': {'generate_certs': True}},
"generate certificates used for internal ssl"),
'no-db': ({'JupyterHub': {'db_url': 'sqlite:///:memory:'}},
"disable persisting state database to disk"
),
@@ -258,6 +261,9 @@ class JupyterHub(Application):
generate_config = Bool(False,
help="Generate default config file",
).tag(config=True)
generate_certs = Bool(False,
help="Generate certs used for internal ssl",
).tag(config=True)
answer_yes = Bool(False,
help="Answer yes to any questions (e.g. confirm overwrite)"
).tag(config=True)
@@ -318,6 +324,101 @@ class JupyterHub(Application):
When setting this, you should also set ssl_key
"""
).tag(config=True)
internal_ssl = Bool(False,
help="""Enable SSL for all internal communication
This enables end-to-end encryption between all JupyterHub components.
JupyterHub will automatically create the necessary certificate
authority and sign notebook certificates as they're created.
"""
).tag(config=True)
internal_certs_location = Unicode('internal-ssl',
help="""The location to store certificates automatically created by
JupyterHub.
Use with internal_ssl
"""
).tag(config=True)
recreate_internal_certs = Bool(False,
help="""Recreate all certificates used within JupyterHub on restart.
Note: enabling this feature requires restarting all notebook servers.
Use with internal_ssl
"""
).tag(config=True)
external_ssl_authorities = Dict(
help="""Dict authority:dict(files). Specify the key, cert, and/or
ca file for an authority. This is useful for externally managed
proxies that wish to use internal_ssl.
The files dict has this format (you must specify at least a cert)::
{
'key': '/path/to/key.key',
'cert': '/path/to/cert.crt',
'ca': '/path/to/ca.crt'
}
The authorities you can override: 'hub-ca', 'notebooks-ca',
'proxy-api-ca', 'proxy-client-ca', and 'services-ca'.
Use with internal_ssl
"""
).tag(config=True)
internal_ssl_authorities = Dict(
default_value={
'hub-ca': None,
'notebooks-ca': None,
'proxy-api-ca': None,
'proxy-client-ca': None,
'services-ca': None,
},
help="""Dict authority:dict(files). When creating the various
CAs needed for internal_ssl, these are the names that will be used
for each authority.
Use with internal_ssl
"""
)
internal_ssl_components_trust = Dict(
help="""Dict component:list(components). This dict specifies the
relationships of components secured by internal_ssl.
"""
)
internal_trust_bundles = Dict(
help="""Dict component:path. These are the paths to the trust bundles
that each component should have. They will be set during
`init_internal_ssl`.
Use with internal_ssl
"""
)
internal_ssl_key = Unicode(
help="""The key to be used for internal ssl"""
)
internal_ssl_cert = Unicode(
help="""The cert to be used for internal ssl"""
)
internal_ssl_ca = Unicode(
help="""The certificate authority to be used for internal ssl"""
)
internal_proxy_certs = Dict(
help=""" Dict component:dict(cert files). This dict contains the certs
generated for both the proxy API and proxy client.
"""
)
trusted_alt_names = List(Unicode(),
help="""Names to include in the subject alternative name.
These names will be used for server name verification. This is useful
if JupyterHub is being run behind a reverse proxy or services using ssl
are on different hosts.
Use with internal_ssl
"""
).tag(config=True)
ip = Unicode('',
help="""The public facing ip of the whole JupyterHub application
(specifically referred to as the proxy).
@@ -421,9 +522,19 @@ class JupyterHub(Application):
help="Supply extra arguments that will be passed to Jinja environment."
).tag(config=True)
proxy_class = Type(ConfigurableHTTPProxy, Proxy,
help="""Select the Proxy API implementation."""
).tag(config=True)
proxy_class = EntryPointType(
default_value=ConfigurableHTTPProxy,
klass=Proxy,
entry_point_group="jupyterhub.proxies",
help="""The class to use for configuring the JupyterHub proxy.
Should be a subclass of :class:`jupyterhub.proxy.Proxy`.
.. versionchanged:: 1.0
proxies may be registered via entry points,
e.g. `c.JupyterHub.proxy_class = 'traefik'`
"""
).tag(config=True)
proxy_cmd = Command([], config=True,
help="DEPRECATED since version 0.8. Use ConfigurableHTTPProxy.command",
@@ -1086,6 +1197,105 @@ class JupyterHub(Application):
# store the loaded trait value
self.cookie_secret = secret
def init_internal_ssl(self):
"""Create the certs needed to turn on internal SSL."""
if self.internal_ssl:
from certipy import Certipy, CertNotFoundError
certipy = Certipy(store_dir=self.internal_certs_location,
remove_existing=self.recreate_internal_certs)
# Here we define how trust should be laid out per each component
self.internal_ssl_components_trust = {
'hub-ca': list(self.internal_ssl_authorities.keys()),
'proxy-api-ca': ['hub-ca', 'services-ca', 'notebooks-ca'],
'proxy-client-ca': ['hub-ca', 'notebooks-ca'],
'notebooks-ca': ['hub-ca', 'proxy-client-ca'],
'services-ca': ['hub-ca', 'proxy-api-ca'],
}
hub_name = 'hub-ca'
# If any external CAs were specified in external_ssl_authorities
# add records of them to Certipy's store.
self.internal_ssl_authorities.update(self.external_ssl_authorities)
for authority, files in self.internal_ssl_authorities.items():
if files:
self.log.info("Adding CA for %s", authority)
certipy.store.add_record(
authority, is_ca=True, files=files)
self.internal_trust_bundles = certipy.trust_from_graph(
self.internal_ssl_components_trust)
default_alt_names = ["IP:127.0.0.1", "DNS:localhost"]
if self.subdomain_host:
default_alt_names.append("DNS:%s" % urlparse(self.subdomain_host).hostname)
# The signed certs used by hub-internal components
try:
internal_key_pair = certipy.store.get_record("hub-internal")
except CertNotFoundError:
alt_names = list(default_alt_names)
# In the event the hub needs to be accessed externally, add
# the fqdn and (optionally) rev_proxy to the set of alt_names.
alt_names += (["DNS:" + socket.getfqdn()]
+ self.trusted_alt_names)
self.log.info(
"Adding CA for %s: %s",
"hub-internal",
";".join(alt_names),
)
internal_key_pair = certipy.create_signed_pair(
"hub-internal",
hub_name,
alt_names=alt_names,
)
else:
self.log.info("Using existing hub-internal CA")
# Create the proxy certs
proxy_api = 'proxy-api'
proxy_client = 'proxy-client'
for component in [proxy_api, proxy_client]:
ca_name = component + '-ca'
alt_names = default_alt_names + self.trusted_alt_names
try:
record = certipy.store.get_record(component)
except CertNotFoundError:
self.log.info(
"Generating signed pair for %s: %s",
component,
';'.join(alt_names),
)
record = certipy.create_signed_pair(
component,
ca_name,
alt_names=alt_names,
)
else:
self.log.info("Using existing %s CA", component)
self.internal_proxy_certs[component] = {
"keyfile": record['files']['key'],
"certfile": record['files']['cert'],
"cafile": record['files']['cert'],
}
self.internal_ssl_key = internal_key_pair['files']['key']
self.internal_ssl_cert = internal_key_pair['files']['cert']
self.internal_ssl_ca = self.internal_trust_bundles[hub_name]
# Configure the AsyncHTTPClient. This will affect anything using
# AsyncHTTPClient.
ssl_context = make_ssl_context(
self.internal_ssl_key,
self.internal_ssl_cert,
cafile=self.internal_ssl_ca,
)
AsyncHTTPClient.configure(
None, defaults={"ssl_options" : ssl_context}
)
def init_db(self):
"""Create the database connection"""
@@ -1120,6 +1330,9 @@ class JupyterHub(Application):
hub_args = dict(
base_url=self.hub_prefix,
public_host=self.subdomain_host,
certfile=self.internal_ssl_cert,
keyfile=self.internal_ssl_key,
cafile=self.internal_ssl_ca,
)
if self.hub_bind_url:
# ensure hub_prefix is set on bind_url
@@ -1161,6 +1374,8 @@ class JupyterHub(Application):
._replace(path=self.hub_prefix)
)
self.hub.connect_url = self.hub_connect_url
if self.internal_ssl:
self.hub.proto = 'https'
async def init_users(self):
"""Load users into and from the database"""
@@ -1368,6 +1583,7 @@ class JupyterHub(Application):
orm_service.admin = spec.get('admin', False)
self.db.commit()
service = Service(parent=self,
app=self,
base_url=self.base_url,
db=self.db, orm=orm_service,
domain=domain, host=host,
@@ -1624,7 +1840,15 @@ class JupyterHub(Application):
concurrent_spawn_limit=self.concurrent_spawn_limit,
spawn_throttle_retry_range=self.spawn_throttle_retry_range,
active_server_limit=self.active_server_limit,
authenticate_prometheus=self.authenticate_prometheus
authenticate_prometheus=self.authenticate_prometheus,
internal_ssl=self.internal_ssl,
internal_certs_location=self.internal_certs_location,
internal_authorities=self.internal_ssl_authorities,
internal_trust_bundles=self.internal_trust_bundles,
internal_ssl_key=self.internal_ssl_key,
internal_ssl_cert=self.internal_ssl_cert,
internal_ssl_ca=self.internal_ssl_ca,
trusted_alt_names=self.trusted_alt_names,
)
# allow configured settings to have priority
settings.update(self.tornado_settings)
@@ -1655,7 +1879,7 @@ class JupyterHub(Application):
@catch_config_error
async def initialize(self, *args, **kwargs):
super().initialize(*args, **kwargs)
if self.generate_config or self.subapp:
if self.generate_config or self.generate_certs or self.subapp:
return
self.load_config_file(self.config_file)
self.init_logging()
@@ -1698,6 +1922,7 @@ class JupyterHub(Application):
self.init_pycurl()
self.init_secrets()
self.init_internal_ssl()
self.init_db()
self.init_hub()
self.init_proxy()
@@ -1856,8 +2081,28 @@ class JupyterHub(Application):
loop.stop()
return
if self.generate_certs:
self.load_config_file(self.config_file)
if not self.internal_ssl:
self.log.warn("You'll need to enable `internal_ssl` "
"in the `jupyterhub_config` file to use "
"these certs.")
self.internal_ssl = True
self.init_internal_ssl()
self.log.info("Certificates written to directory `{}`".format(
self.internal_certs_location))
loop.stop()
return
ssl_context = make_ssl_context(
self.internal_ssl_key,
self.internal_ssl_cert,
cafile=self.internal_ssl_ca,
check_hostname=False
)
# start the webserver
self.http_server = tornado.httpserver.HTTPServer(self.tornado_application, xheaders=True)
self.http_server = tornado.httpserver.HTTPServer(self.tornado_application, ssl_options=ssl_context, xheaders=True)
bind_url = urlparse(self.hub.bind_url)
try:
if bind_url.scheme.startswith('unix+'):
@@ -1907,7 +2152,12 @@ class JupyterHub(Application):
tries = 10 if service.managed else 1
for i in range(tries):
try:
await Server.from_orm(service.orm.server).wait_up(http=True, timeout=1)
ssl_context = make_ssl_context(
self.internal_ssl_key,
self.internal_ssl_cert,
cafile=self.internal_ssl_ca
)
await Server.from_orm(service.orm.server).wait_up(http=True, timeout=1, ssl_context=ssl_context)
except TimeoutError:
if service.managed:
status = await service.spawner.poll()

View File

@@ -33,7 +33,7 @@ from ..metrics import (
SERVER_SPAWN_DURATION_SECONDS, ServerSpawnStatus,
PROXY_ADD_DURATION_SECONDS, ProxyAddStatus,
SERVER_POLL_DURATION_SECONDS, ServerPollStatus,
RUNNING_SERVERS
RUNNING_SERVERS, SERVER_STOP_DURATION_SECONDS, ServerStopStatus
)
# pattern for the authentication token header
@@ -893,18 +893,25 @@ class BaseHandler(RequestHandler):
2. stop the server
3. notice that it stopped
"""
tic = IOLoop.current().time()
tic = time.perf_counter()
try:
await self.proxy.delete_user(user, server_name)
await user.stop(server_name)
toc = time.perf_counter()
self.log.info("User %s server took %.3f seconds to stop", user.name, toc - tic)
self.statsd.timing('spawner.stop', (toc - tic) * 1000)
RUNNING_SERVERS.dec()
SERVER_STOP_DURATION_SECONDS.labels(
status=ServerStopStatus.success
).observe(toc - tic)
except:
SERVER_STOP_DURATION_SECONDS.labels(
status=ServerStopStatus.failure
).observe(time.perf_counter() - tic)
finally:
spawner._stop_future = None
spawner._stop_pending = False
toc = IOLoop.current().time()
self.log.info("User %s server took %.3f seconds to stop", user.name, toc - tic)
self.statsd.timing('spawner.stop', (toc - tic) * 1000)
RUNNING_SERVERS.dec()
future = spawner._stop_future = asyncio.ensure_future(stop())

View File

@@ -12,7 +12,7 @@ from tornado import web, gen
from tornado.httputil import url_concat
from .. import orm
from ..utils import admin_only, url_path_join
from ..utils import admin_only, url_path_join, maybe_future
from .base import BaseHandler
@@ -107,6 +107,8 @@ class SpawnHandler(BaseHandler):
self.redirect(url)
return
if user.spawner.options_form:
# Add handler to spawner here so you can access query params in form rendering.
user.spawner.handler = self
form = await self._render_form(for_user=user)
self.finish(form)
else:
@@ -147,7 +149,7 @@ class SpawnHandler(BaseHandler):
for key, byte_list in self.request.files.items():
form_options["%s_file"%key] = byte_list
try:
options = user.spawner.options_from_form(form_options)
options = await maybe_future(user.spawner.options_from_form(form_options))
await self.spawn_single_user(user, options=options)
except Exception as e:
self.log.error("Failed to spawn single-user server with form", exc_info=True)

View File

@@ -42,6 +42,18 @@ RUNNING_SERVERS = Gauge(
RUNNING_SERVERS.set(0)
TOTAL_USERS = Gauge(
'total_users',
'total number of users'
)
TOTAL_USERS.set(0)
CHECK_ROUTES_DURATION_SECONDS = Histogram(
'check_routes_duration_seconds',
'Time taken to validate all routes in proxy'
)
class ServerSpawnStatus(Enum):
"""
Possible values for 'status' label of SERVER_SPAWN_DURATION_SECONDS
@@ -100,11 +112,29 @@ class ServerPollStatus(Enum):
return cls.running
return cls.stopped
for s in ServerPollStatus:
SERVER_POLL_DURATION_SECONDS.labels(status=s)
SERVER_STOP_DURATION_SECONDS = Histogram(
'server_stop_seconds',
'time taken for server stopping operation',
['status'],
)
class ServerStopStatus(Enum):
"""
Possible values for 'status' label of SERVER_STOP_DURATION_SECONDS
"""
success = 'success'
failure = 'failure'
def __str__(self):
return self.value
for s in ServerPollStatus:
SERVER_POLL_DURATION_SECONDS.labels(status=s)
for s in ServerStopStatus:
SERVER_STOP_DURATION_SECONDS.labels(status=s)
def prometheus_log_method(handler):

View File

@@ -15,7 +15,7 @@ from .traitlets import URLPrefix
from . import orm
from .utils import (
url_path_join, can_connect, wait_for_server,
wait_for_http_server, random_port,
wait_for_http_server, random_port, make_ssl_context,
)
class Server(HasTraits):
@@ -35,6 +35,9 @@ class Server(HasTraits):
cookie_name = Unicode('')
connect_url = Unicode('')
bind_url = Unicode('')
certfile = Unicode()
keyfile = Unicode()
cafile = Unicode()
@default('bind_url')
def bind_url_default(self):
@@ -157,10 +160,15 @@ class Server(HasTraits):
uri=self.base_url,
)
def wait_up(self, timeout=10, http=False):
def wait_up(self, timeout=10, http=False, ssl_context=None):
"""Wait for this server to come up"""
if http:
return wait_for_http_server(self.url, timeout=timeout)
ssl_context = ssl_context or make_ssl_context(
self.keyfile, self.certfile, cafile=self.cafile)
return wait_for_http_server(
self.url, timeout=timeout, ssl_context=ssl_context)
else:
return wait_for_server(self._connect_ip, self._connect_port, timeout=timeout)

View File

@@ -39,9 +39,11 @@ from traitlets import (
from jupyterhub.traitlets import Command
from traitlets.config import LoggingConfigurable
from .metrics import CHECK_ROUTES_DURATION_SECONDS
from .objects import Server
from . import utils
from .utils import url_path_join
from .utils import url_path_join, make_ssl_context
def _one_at_a_time(method):
@@ -292,6 +294,7 @@ class Proxy(LoggingConfigurable):
@_one_at_a_time
async def check_routes(self, user_dict, service_dict, routes=None):
"""Check that all users are properly routed on the proxy."""
start = time.perf_counter() #timer starts here when user is created
if not routes:
self.log.debug("Fetching routes to check")
routes = await self.get_all_routes()
@@ -364,6 +367,8 @@ class Proxy(LoggingConfigurable):
futures.append(self.delete_route(routespec))
await gen.multi(futures)
stop = time.perf_counter() #timer stops here when user is deleted
CHECK_ROUTES_DURATION_SECONDS.observe(stop - start) #histogram metric
def add_hub_route(self, hub):
"""Add the default route for the Hub"""
@@ -432,9 +437,22 @@ class ConfigurableHTTPProxy(Proxy):
token = utils.new_token()
return token
api_url = Unicode('http://127.0.0.1:8001', config=True,
api_url = Unicode(config=True,
help="""The ip (or hostname) of the proxy's API endpoint"""
)
@default('api_url')
def _api_url_default(self):
url = '127.0.0.1:8001'
proto = 'http'
if self.app.internal_ssl:
proto = 'https'
return "{proto}://{url}".format(
proto=proto,
url=url,
)
command = Command('configurable-http-proxy', config=True,
help="""The command to start the proxy"""
)
@@ -554,6 +572,28 @@ class ConfigurableHTTPProxy(Proxy):
cmd.extend(['--ssl-key', self.ssl_key])
if self.ssl_cert:
cmd.extend(['--ssl-cert', self.ssl_cert])
if self.app.internal_ssl:
proxy_api = 'proxy-api'
proxy_client = 'proxy-client'
api_key = self.app.internal_proxy_certs[proxy_api]['keyfile']
api_cert = self.app.internal_proxy_certs[proxy_api]['certfile']
api_ca = self.app.internal_trust_bundles[proxy_api + '-ca']
client_key = self.app.internal_proxy_certs[proxy_client]['keyfile']
client_cert = self.app.internal_proxy_certs[proxy_client]['certfile']
client_ca = self.app.internal_trust_bundles[proxy_client + '-ca']
cmd.extend(['--api-ssl-key', api_key])
cmd.extend(['--api-ssl-cert', api_cert])
cmd.extend(['--api-ssl-ca', api_ca])
cmd.extend(['--api-ssl-request-cert'])
cmd.extend(['--api-ssl-reject-unauthorized'])
cmd.extend(['--client-ssl-key', client_key])
cmd.extend(['--client-ssl-cert', client_cert])
cmd.extend(['--client-ssl-ca', client_ca])
cmd.extend(['--client-ssl-request-cert'])
cmd.extend(['--client-ssl-reject-unauthorized'])
if self.app.statsd_host:
cmd.extend([
'--statsd-host', self.app.statsd_host,

View File

@@ -196,6 +196,27 @@ class HubAuth(SingletonConfigurable):
def _default_login_url(self):
return self.hub_host + url_path_join(self.hub_prefix, 'login')
keyfile = Unicode('',
help="""The ssl key to use for requests
Use with certfile
"""
).tag(config=True)
certfile = Unicode('',
help="""The ssl cert to use for requests
Use with keyfile
"""
).tag(config=True)
client_ca = Unicode('',
help="""The ssl certificate authority to use to verify requests
Use with keyfile and certfile
"""
).tag(config=True)
cookie_name = Unicode('jupyterhub-services',
help="""The name of the cookie I should be looking for"""
).tag(config=True)
@@ -277,6 +298,10 @@ class HubAuth(SingletonConfigurable):
allow_404 = kwargs.pop('allow_404', False)
headers = kwargs.setdefault('headers', {})
headers.setdefault('Authorization', 'token %s' % self.api_token)
if "cert" not in kwargs and self.certfile and self.keyfile:
kwargs["cert"] = (self.certfile, self.keyfile)
if self.client_ca:
kwargs["verify"] = self.client_ca
try:
r = requests.request(method, url, **kwargs)
except requests.ConnectionError as e:

View File

@@ -224,6 +224,7 @@ class Service(LoggingConfigurable):
domain = Unicode()
host = Unicode()
hub = Any()
app = Any()
proc = Any()
# handles on globals:
@@ -331,6 +332,9 @@ class Service(LoggingConfigurable):
server=self.orm.server,
host=self.host,
),
internal_ssl=self.app.internal_ssl,
internal_certs_location=self.app.internal_certs_location,
internal_trust_bundles=self.app.internal_trust_bundles,
)
self.spawner.start()
self.proc = self.spawner.proc

View File

@@ -43,7 +43,7 @@ from notebook.base.handlers import IPythonHandler
from ._version import __version__, _check_version
from .log import log_request
from .services.auth import HubOAuth, HubOAuthenticated, HubOAuthCallbackHandler
from .utils import url_path_join
from .utils import url_path_join, make_ssl_context
# Authenticate requests with the Hub
@@ -245,6 +245,18 @@ class SingleUserNotebookApp(NotebookApp):
hub_prefix = Unicode('/hub/').tag(config=True)
@default('keyfile')
def _keyfile_default(self):
return os.environ.get('JUPYTERHUB_SSL_KEYFILE') or ''
@default('certfile')
def _certfile_default(self):
return os.environ.get('JUPYTERHUB_SSL_CERTFILE') or ''
@default('client_ca')
def _client_ca_default(self):
return os.environ.get('JUPYTERHUB_SSL_CLIENT_CA') or ''
@default('hub_prefix')
def _hub_prefix_default(self):
base_url = os.environ.get('JUPYTERHUB_BASE_URL') or '/'
@@ -379,6 +391,13 @@ class SingleUserNotebookApp(NotebookApp):
- exit if I can't connect at all
- check version and warn on sufficient mismatch
"""
ssl_context = make_ssl_context(
self.keyfile,
self.certfile,
cafile=self.client_ca,
)
AsyncHTTPClient.configure(None, defaults={"ssl_options" : ssl_context})
client = AsyncHTTPClient()
RETRIES = 5
for i in range(1, RETRIES+1):
@@ -424,6 +443,9 @@ class SingleUserNotebookApp(NotebookApp):
api_url=self.hub_api_url,
hub_prefix=self.hub_prefix,
base_url=self.base_url,
keyfile=self.keyfile,
certfile=self.certfile,
client_ca=self.client_ca,
)
# smoke check
if not self.hub_auth.oauth_client_id:

View File

@@ -14,6 +14,7 @@ import shutil
import signal
import sys
import warnings
import pwd
from subprocess import Popen
from tempfile import mkdtemp
@@ -27,7 +28,7 @@ from tornado.ioloop import PeriodicCallback
from traitlets.config import LoggingConfigurable
from traitlets import (
Any, Bool, Dict, Instance, Integer, Float, List, Unicode, Union,
observe, validate,
default, observe, validate,
)
from .objects import Server
@@ -162,6 +163,12 @@ class Spawner(LoggingConfigurable):
if self.orm_spawner:
return self.orm_spawner.name
return ''
hub = Any()
authenticator = Any()
internal_ssl = Bool(False)
internal_trust_bundles = Dict()
internal_certs_location = Unicode('')
cert_paths = Dict()
admin_access = Bool(False)
api_token = Unicode()
oauth_client_id = Unicode()
@@ -644,6 +651,11 @@ class Spawner(LoggingConfigurable):
if self.cpu_guarantee:
env['CPU_GUARANTEE'] = str(self.cpu_guarantee)
if self.cert_paths:
env['JUPYTERHUB_SSL_KEYFILE'] = self.cert_paths['keyfile']
env['JUPYTERHUB_SSL_CERTFILE'] = self.cert_paths['certfile']
env['JUPYTERHUB_SSL_CLIENT_CA'] = self.cert_paths['cafile']
return env
def template_namespace(self):
@@ -684,6 +696,115 @@ class Spawner(LoggingConfigurable):
"""
return s.format(**self.template_namespace())
trusted_alt_names = List(Unicode())
ssl_alt_names = List(
Unicode(),
config=True,
help="""List of SSL alt names
May be set in config if all spawners should have the same value(s),
or set at runtime by Spawner that know their names.
"""
)
@default('ssl_alt_names')
def _default_ssl_alt_names(self):
# by default, use trusted_alt_names
# inherited from global app
return list(self.trusted_alt_names)
ssl_alt_names_include_local = Bool(
True,
config=True,
help="""Whether to include DNS:localhost, IP:127.0.0.1 in alt names""",
)
async def create_certs(self):
"""Create and set ownership for the certs to be used for internal ssl
Keyword Arguments:
alt_names (list): a list of alternative names to identify the
server by, see:
https://en.wikipedia.org/wiki/Subject_Alternative_Name
override: override the default_names with the provided alt_names
Returns:
dict: Path to cert files and CA
This method creates certs for use with the singleuser notebook. It
enables SSL and ensures that the notebook can perform bi-directional
SSL auth with the hub (verification based on CA).
If the singleuser host has a name or ip other than localhost,
an appropriate alternative name(s) must be passed for ssl verification
by the hub to work. For example, for Jupyter hosts with an IP of
10.10.10.10 or DNS name of jupyter.example.com, this would be:
alt_names=["IP:10.10.10.10"]
alt_names=["DNS:jupyter.example.com"]
respectively. The list can contain both the IP and DNS names to refer
to the host by either IP or DNS name (note the `default_names` below).
"""
from certipy import Certipy
default_names = ["DNS:localhost", "IP:127.0.0.1"]
alt_names = []
alt_names.extend(self.ssl_alt_names)
if self.ssl_alt_names_include_local:
alt_names = default_names + alt_names
self.log.info("Creating certs for %s: %s",
self._log_name,
';'.join(alt_names),
)
common_name = self.user.name or 'service'
certipy = Certipy(store_dir=self.internal_certs_location)
notebook_component = 'notebooks-ca'
notebook_key_pair = certipy.create_signed_pair(
'user-' + common_name,
notebook_component,
alt_names=alt_names,
overwrite=True
)
paths = {
"keyfile": notebook_key_pair['files']['key'],
"certfile": notebook_key_pair['files']['cert'],
"cafile": self.internal_trust_bundles[notebook_component],
}
return paths
async def move_certs(self, paths):
"""Takes certificate paths and makes them available to the notebook server
Arguments:
paths (dict): a list of paths for key, cert, and CA.
These paths will be resolvable and readable by the Hub process,
but not necessarily by the notebook server.
Returns:
dict: a list (potentially altered) of paths for key, cert,
and CA.
These paths should be resolvable and readable
by the notebook server to be launched.
`.move_certs` is called after certs for the singleuser notebook have
been created by create_certs.
By default, certs are created in a standard, central location defined
by `internal_certs_location`. For a local, single-host deployment of
JupyterHub, this should suffice. If, however, singleuser notebooks
are spawned on other hosts, `.move_certs` should be overridden to move
these files appropriately. This could mean using `scp` to copy them
to another host, moving them to a volume mounted in a docker container,
or exporting them as a secret in kubernetes.
"""
return paths
def get_args(self):
"""Return the arguments to be passed after self.cmd
@@ -927,7 +1048,6 @@ def set_user_setuid(username, chdir=True):
home directory.
"""
import grp
import pwd
user = pwd.getpwnam(username)
uid = user.pw_uid
gid = user.pw_gid
@@ -1088,6 +1208,48 @@ class LocalProcessSpawner(Spawner):
env = self.user_env(env)
return env
async def move_certs(self, paths):
"""Takes cert paths, moves and sets ownership for them
Arguments:
paths (dict): a list of paths for key, cert, and CA
Returns:
dict: a list (potentially altered) of paths for key, cert,
and CA
Stage certificates into a private home directory
and make them readable by the user.
"""
key = paths['keyfile']
cert = paths['certfile']
ca = paths['cafile']
user = pwd.getpwnam(self.user.name)
uid = user.pw_uid
gid = user.pw_gid
home = user.pw_dir
# Create dir for user's certs wherever we're starting
out_dir = "{home}/.jupyterhub/jupyterhub-certs".format(home=home)
shutil.rmtree(out_dir, ignore_errors=True)
os.makedirs(out_dir, 0o700, exist_ok=True)
# Move certs to users dir
shutil.move(paths['keyfile'], out_dir)
shutil.move(paths['certfile'], out_dir)
shutil.copy(paths['cafile'], out_dir)
key = os.path.join(out_dir, os.path.basename(paths['keyfile']))
cert = os.path.join(out_dir, os.path.basename(paths['certfile']))
ca = os.path.join(out_dir, os.path.basename(paths['cafile']))
# Set cert ownership to user
for f in [out_dir, key, cert, ca]:
shutil.chown(f, user=uid, group=gid)
return {"keyfile": key, "certfile": cert, "cafile": ca}
async def start(self):
"""Start the single-user server."""
self.port = random_port()

View File

@@ -27,14 +27,18 @@ Fixtures to add functionality or spawning behavior
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import asyncio
from getpass import getuser
import logging
import os
from pytest import fixture, raises
import sys
from subprocess import TimeoutExpired
from unittest import mock
from pytest import fixture, raises
from tornado import ioloop, gen
from tornado.httpclient import HTTPError
from unittest import mock
from tornado.platform.asyncio import AsyncIOMainLoop
from .. import orm
from .. import crypto
@@ -42,6 +46,7 @@ from ..utils import random_port
from . import mocking
from .mocking import MockHub
from .utils import ssl_setup
from .test_services import mockservice_cmd
import jupyterhub.services.service
@@ -51,24 +56,41 @@ _db = None
@fixture(scope='module')
def app(request, io_loop):
def ssl_tmpdir(tmpdir_factory):
return tmpdir_factory.mktemp('ssl')
@fixture(scope='module')
def app(request, io_loop, ssl_tmpdir):
"""Mock a jupyterhub app for testing"""
mocked_app = MockHub.instance(log_level=logging.DEBUG)
mocked_app = None
ssl_enabled = getattr(request.module, "ssl_enabled", False)
kwargs = dict(log_level=logging.DEBUG)
if ssl_enabled:
kwargs.update(
dict(
internal_ssl=True,
internal_certs_location=str(ssl_tmpdir),
)
)
@gen.coroutine
def make_app():
yield mocked_app.initialize([])
yield mocked_app.start()
mocked_app = MockHub.instance(**kwargs)
io_loop.run_sync(make_app)
async def make_app():
await mocked_app.initialize([])
await mocked_app.start()
def fin():
# disconnect logging during cleanup because pytest closes captured FDs prematurely
mocked_app.log.handlers = []
MockHub.clear_instance()
mocked_app.stop()
try:
mocked_app.stop()
except Exception as e:
print("Error stopping Hub: %s" % e, file=sys.stderr)
request.addfinalizer(fin)
io_loop.run_sync(make_app)
return mocked_app
@@ -106,8 +128,13 @@ def db():
@fixture(scope='module')
def io_loop(request):
"""Same as pytest-tornado.io_loop, but re-scoped to module-level"""
ioloop.IOLoop.configure(AsyncIOMainLoop)
aio_loop = asyncio.new_event_loop()
asyncio.set_event_loop(aio_loop)
io_loop = ioloop.IOLoop()
io_loop.make_current()
assert asyncio.get_event_loop() is aio_loop
assert io_loop.asyncio_loop is aio_loop
def _close():
io_loop.clear_current()
@@ -162,7 +189,10 @@ def _mockservice(request, app, url=False):
'admin': True,
}
if url:
spec['url'] = 'http://127.0.0.1:%i' % random_port()
if app.internal_ssl:
spec['url'] = 'https://127.0.0.1:%i' % random_port()
else:
spec['url'] = 'http://127.0.0.1:%i' % random_port()
io_loop = app.io_loop

View File

@@ -40,7 +40,7 @@ from tornado import gen
from tornado.concurrent import Future
from tornado.ioloop import IOLoop
from traitlets import default
from traitlets import Bool, default
from ..app import JupyterHub
from ..auth import PAMAuthenticator
@@ -49,7 +49,7 @@ from ..objects import Server
from ..spawner import LocalProcessSpawner
from ..singleuser import SingleUserNotebookApp
from ..utils import random_port, url_path_join
from .utils import async_requests
from .utils import async_requests, ssl_setup
from pamela import PAMError
@@ -95,6 +95,10 @@ class MockSpawner(LocalProcessSpawner):
def _cmd_default(self):
return [sys.executable, '-m', 'jupyterhub.tests.mocksu']
def move_certs(self, paths):
"""Return the paths unmodified"""
return paths
use_this_api_token = None
def start(self):
if self.use_this_api_token:
@@ -217,6 +221,14 @@ class MockHub(JupyterHub):
db_file = None
last_activity_interval = 2
log_datefmt = '%M:%S'
external_certs = None
log_level = 10
def __init__(self, *args, **kwargs):
if 'internal_certs_location' in kwargs:
cert_location = kwargs['internal_certs_location']
kwargs['external_certs'] = ssl_setup(cert_location, 'hub-ca')
super().__init__(*args, **kwargs)
@default('subdomain_host')
def _subdomain_host_default(self):
@@ -228,7 +240,7 @@ class MockHub(JupyterHub):
port = urlparse(self.subdomain_host).port
else:
port = random_port()
return 'http://127.0.0.1:%i/@/space%%20word/' % port
return 'http://127.0.0.1:%i/@/space%%20word/' % (port,)
@default('ip')
def _ip_default(self):
@@ -270,6 +282,18 @@ class MockHub(JupyterHub):
self.db.expire(service)
return super().init_services()
test_clean_db = Bool(True)
def init_db(self):
"""Ensure we start with a clean user list"""
super().init_db()
if self.test_clean_db:
for user in self.db.query(orm.User):
self.db.delete(user)
for group in self.db.query(orm.Group):
self.db.delete(group)
self.db.commit()
@gen.coroutine
def initialize(self, argv=None):
self.pid_file = NamedTemporaryFile(delete=False).name
@@ -310,12 +334,16 @@ class MockHub(JupyterHub):
def login_user(self, name):
"""Login a user by name, returning her cookies."""
base_url = public_url(self)
external_ca = None
if self.internal_ssl:
external_ca = self.external_certs['files']['ca']
r = yield async_requests.post(base_url + 'hub/login',
data={
'username': name,
'password': name,
},
allow_redirects=False,
verify=external_ca,
)
r.raise_for_status()
assert r.cookies
@@ -373,6 +401,7 @@ class StubSingleUserSpawner(MockSpawner):
evt = threading.Event()
print(args, env)
def _run():
asyncio.set_event_loop(asyncio.new_event_loop())
io_loop = IOLoop()
io_loop.make_current()
io_loop.add_callback(lambda : evt.set())

View File

@@ -23,6 +23,7 @@ import requests
from tornado import web, httpserver, ioloop
from jupyterhub.services.auth import HubAuthenticated, HubOAuthenticated, HubOAuthCallbackHandler
from jupyterhub.utils import make_ssl_context
class EchoHandler(web.RequestHandler):
@@ -85,7 +86,19 @@ def main():
(r'.*', EchoHandler),
], cookie_secret=os.urandom(32))
server = httpserver.HTTPServer(app)
ssl_context = None
key = os.environ.get('JUPYTERHUB_SSL_KEYFILE') or ''
cert = os.environ.get('JUPYTERHUB_SSL_CERTFILE') or ''
ca = os.environ.get('JUPYTERHUB_SSL_CLIENT_CA') or ''
if key and cert and ca:
ssl_context = make_ssl_context(
key,
cert,
cafile = ca,
check_hostname = False)
server = httpserver.HTTPServer(app, ssl_options=ssl_context)
server.listen(url.port, url.hostname)
try:
ioloop.IOLoop.instance().start()

View File

@@ -14,9 +14,11 @@ Handlers and their purpose include:
import argparse
import json
import sys
import os
from tornado import web, httpserver, ioloop
from .mockservice import EnvHandler
from ..utils import make_ssl_context
class EchoHandler(web.RequestHandler):
def get(self):
@@ -33,8 +35,21 @@ def main(args):
(r'.*/env', EnvHandler),
(r'.*', EchoHandler),
])
ssl_context = None
key = os.environ.get('JUPYTERHUB_SSL_KEYFILE') or ''
cert = os.environ.get('JUPYTERHUB_SSL_CERTFILE') or ''
ca = os.environ.get('JUPYTERHUB_SSL_CLIENT_CA') or ''
server = httpserver.HTTPServer(app)
if key and cert and ca:
ssl_context = make_ssl_context(
key,
cert,
cafile = ca,
check_hostname = False
)
server = httpserver.HTTPServer(app, ssl_options=ssl_context)
server.listen(args.port)
try:
ioloop.IOLoop.instance().start()

View File

@@ -91,11 +91,17 @@ def api_request(app, *api_path, **kwargs):
headers = kwargs.setdefault('headers', {})
if 'Authorization' not in headers and not kwargs.pop('noauth', False):
headers.update(auth_header(app.db, 'admin'))
# make a copy to avoid modifying arg in-place
kwargs['headers'] = h = {}
h.update(headers)
h.update(auth_header(app.db, 'admin'))
url = ujoin(base_url, 'api', *api_path)
method = kwargs.pop('method', 'get')
f = getattr(async_requests, method)
if app.internal_ssl:
kwargs['cert'] = (app.internal_ssl_cert, app.internal_ssl_key)
kwargs["verify"] = app.internal_ssl_ca
resp = yield f(url, **kwargs)
assert "frame-ancestors 'self'" in resp.headers['Content-Security-Policy']
assert ujoin(app.hub.base_url, "security/csp-report") in resp.headers['Content-Security-Policy']
@@ -575,15 +581,19 @@ def test_spawn(app):
assert spawner.server.base_url == ujoin(app.base_url, 'user/%s' % name) + '/'
url = public_url(app, user)
r = yield async_requests.get(url)
kwargs = {}
if app.internal_ssl:
kwargs['cert'] = (app.internal_ssl_cert, app.internal_ssl_key)
kwargs["verify"] = app.internal_ssl_ca
r = yield async_requests.get(url, **kwargs)
assert r.status_code == 200
assert r.text == spawner.server.base_url
r = yield async_requests.get(ujoin(url, 'args'))
r = yield async_requests.get(ujoin(url, 'args'), **kwargs)
assert r.status_code == 200
argv = r.json()
assert '--port' in ' '.join(argv)
r = yield async_requests.get(ujoin(url, 'env'))
r = yield async_requests.get(ujoin(url, 'env'), **kwargs)
env = r.json()
for expected in ['JUPYTERHUB_USER', 'JUPYTERHUB_BASE_URL', 'JUPYTERHUB_API_TOKEN']:
assert expected in env
@@ -620,8 +630,11 @@ def test_spawn_handler(app):
# verify that request params got passed down
# implemented in MockSpawner
kwargs = {}
if app.external_certs:
kwargs['verify'] = app.external_certs['files']['ca']
url = public_url(app, user)
r = yield async_requests.get(ujoin(url, 'env'))
r = yield async_requests.get(ujoin(url, 'env'), **kwargs)
env = r.json()
assert 'HANDLER_ARGS' in env
assert env['HANDLER_ARGS'] == 'foo=bar'
@@ -1614,7 +1627,11 @@ def test_get_service(app, mockservice_url):
def test_root_api(app):
base_url = app.hub.url
url = ujoin(base_url, 'api')
r = yield async_requests.get(url)
kwargs = {}
if app.internal_ssl:
kwargs['cert'] = (app.internal_ssl_cert, app.internal_ssl_key)
kwargs["verify"] = app.internal_ssl_ca
r = yield async_requests.get(url, **kwargs)
r.raise_for_status()
expected = {
'version': jupyterhub.__version__

View File

@@ -64,7 +64,7 @@ def test_generate_config():
@pytest.mark.gen_test
def test_init_tokens():
def test_init_tokens(request):
with TemporaryDirectory() as td:
db_file = os.path.join(td, 'jupyterhub.sqlite')
tokens = {
@@ -72,7 +72,11 @@ def test_init_tokens():
'also-super-secret': 'gordon',
'boagasdfasdf': 'chell',
}
app = MockHub(db_url=db_file, api_tokens=tokens)
kwargs = {'db_url': db_file, 'api_tokens': tokens}
ssl_enabled = getattr(request.module, "ssl_enabled", False)
if ssl_enabled:
kwargs['internal_certs_location'] = td
app = MockHub(**kwargs)
yield app.initialize([])
db = app.db
for token, username in tokens.items():
@@ -82,7 +86,7 @@ def test_init_tokens():
assert user.name == username
# simulate second startup, reloading same tokens:
app = MockHub(db_url=db_file, api_tokens=tokens)
app = MockHub(**kwargs)
yield app.initialize([])
db = app.db
for token, username in tokens.items():
@@ -93,27 +97,35 @@ def test_init_tokens():
# don't allow failed token insertion to create users:
tokens['short'] = 'gman'
app = MockHub(db_url=db_file, api_tokens=tokens)
app = MockHub(**kwargs)
with pytest.raises(ValueError):
yield app.initialize([])
assert orm.User.find(app.db, 'gman') is None
def test_write_cookie_secret(tmpdir):
def test_write_cookie_secret(tmpdir, request):
secret_path = str(tmpdir.join('cookie_secret'))
hub = MockHub(cookie_secret_file=secret_path)
kwargs = {'cookie_secret_file': secret_path}
ssl_enabled = getattr(request.module, "ssl_enabled", False)
if ssl_enabled:
kwargs['internal_certs_location'] = str(tmpdir)
hub = MockHub(**kwargs)
hub.init_secrets()
assert os.path.exists(secret_path)
assert os.stat(secret_path).st_mode & 0o600
assert not os.stat(secret_path).st_mode & 0o177
def test_cookie_secret_permissions(tmpdir):
def test_cookie_secret_permissions(tmpdir, request):
secret_file = tmpdir.join('cookie_secret')
secret_path = str(secret_file)
secret = os.urandom(COOKIE_SECRET_BYTES)
secret_file.write(binascii.b2a_hex(secret))
hub = MockHub(cookie_secret_file=secret_path)
kwargs = {'cookie_secret_file': secret_path}
ssl_enabled = getattr(request.module, "ssl_enabled", False)
if ssl_enabled:
kwargs['internal_certs_location'] = str(tmpdir)
hub = MockHub(**kwargs)
# raise with public secret file
os.chmod(secret_path, 0o664)
@@ -126,18 +138,26 @@ def test_cookie_secret_permissions(tmpdir):
assert hub.cookie_secret == secret
def test_cookie_secret_content(tmpdir):
def test_cookie_secret_content(tmpdir, request):
secret_file = tmpdir.join('cookie_secret')
secret_file.write('not base 64: uñiço∂e')
secret_path = str(secret_file)
os.chmod(secret_path, 0o660)
hub = MockHub(cookie_secret_file=secret_path)
kwargs = {'cookie_secret_file': secret_path}
ssl_enabled = getattr(request.module, "ssl_enabled", False)
if ssl_enabled:
kwargs['internal_certs_location'] = str(tmpdir)
hub = MockHub(**kwargs)
with pytest.raises(SystemExit):
hub.init_secrets()
def test_cookie_secret_env(tmpdir):
hub = MockHub(cookie_secret_file=str(tmpdir.join('cookie_secret')))
def test_cookie_secret_env(tmpdir, request):
kwargs = {'cookie_secret_file': str(tmpdir.join('cookie_secret'))}
ssl_enabled = getattr(request.module, "ssl_enabled", False)
if ssl_enabled:
kwargs['internal_certs_location'] = str(tmpdir)
hub = MockHub(**kwargs)
with patch.dict(os.environ, {'JPY_COOKIE_SECRET': 'not hex'}):
with pytest.raises(ValueError):
@@ -150,12 +170,16 @@ def test_cookie_secret_env(tmpdir):
@pytest.mark.gen_test
def test_load_groups():
def test_load_groups(tmpdir, request):
to_load = {
'blue': ['cyclops', 'rogue', 'wolverine'],
'gold': ['storm', 'jean-grey', 'colossus'],
}
hub = MockHub(load_groups=to_load)
kwargs = {'load_groups': to_load}
ssl_enabled = getattr(request.module, "ssl_enabled", False)
if ssl_enabled:
kwargs['internal_certs_location'] = str(tmpdir)
hub = MockHub(**kwargs)
hub.init_db()
yield hub.init_users()
yield hub.init_groups()
@@ -178,7 +202,11 @@ def test_resume_spawners(tmpdir, request):
request.addfinalizer(p.stop)
@gen.coroutine
def new_hub():
app = MockHub()
kwargs = {}
ssl_enabled = getattr(request.module, "ssl_enabled", False)
if ssl_enabled:
kwargs['internal_certs_location'] = str(tmpdir)
app = MockHub(test_clean_db=False, **kwargs)
app.config.ConfigurableHTTPProxy.should_start = False
app.config.ConfigurableHTTPProxy.auth_token = 'unused'
yield app.initialize([])

View File

@@ -0,0 +1,7 @@
"""Tests for the SSL enabled REST API."""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
from jupyterhub.tests.test_api import *
ssl_enabled = True

View File

@@ -0,0 +1,10 @@
"""Test the JupyterHub entry point with internal ssl"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import sys
import jupyterhub.tests.mocking
from jupyterhub.tests.test_app import *
ssl_enabled = True

View File

@@ -0,0 +1,77 @@
"""Tests for jupyterhub internal_ssl connections"""
import time
from subprocess import check_output
import sys
from urllib.parse import urlparse
import pytest
import jupyterhub
from tornado import gen
from unittest import mock
from requests.exceptions import SSLError
from .utils import async_requests
from .test_api import add_user
ssl_enabled = True
@gen.coroutine
def wait_for_spawner(spawner, timeout=10):
"""Wait for an http server to show up
polling at shorter intervals for early termination
"""
deadline = time.monotonic() + timeout
def wait():
return spawner.server.wait_up(timeout=1, http=True)
while time.monotonic() < deadline:
status = yield spawner.poll()
assert status is None
try:
yield wait()
except TimeoutError:
continue
else:
break
yield wait()
@pytest.mark.gen_test
def test_connection_hub_wrong_certs(app):
"""Connecting to the internal hub url fails without correct certs"""
with pytest.raises(SSLError):
kwargs = {'verify': False}
r = yield async_requests.get(app.hub.url, **kwargs)
r.raise_for_status()
@pytest.mark.gen_test
def test_connection_proxy_api_wrong_certs(app):
"""Connecting to the proxy api fails without correct certs"""
with pytest.raises(SSLError):
kwargs = {'verify': False}
r = yield async_requests.get(app.proxy.api_url, **kwargs)
r.raise_for_status()
@pytest.mark.gen_test
def test_connection_notebook_wrong_certs(app):
"""Connecting to a notebook fails without correct certs"""
with mock.patch.dict(
app.config.LocalProcessSpawner,
{'cmd': [sys.executable, '-m', 'jupyterhub.tests.mocksu']}
):
user = add_user(app.db, app, name='foo')
yield user.spawn()
yield wait_for_spawner(user.spawner)
spawner = user.spawner
status = yield spawner.poll()
assert status is None
with pytest.raises(SSLError):
kwargs = {'verify': False}
r = yield async_requests.get(spawner.server.url, **kwargs)
r.raise_for_status()

View File

@@ -0,0 +1,7 @@
"""Tests for process spawning with internal_ssl"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
from jupyterhub.tests.test_spawner import *
ssl_enabled = True

View File

@@ -47,6 +47,17 @@ def test_server(db):
assert server.host == 'http://hub:%i' % server.port
assert server.url == server.host + '/'
server.connect_url = 'http://hub-url:%i/connect' % server.port
assert server.host == 'http://hub-url:%i' % server.port
server.bind_url = 'http://127.0.0.1/'
assert server.port == 80
check_connect_url = objects.Server(connect_url='http://127.0.0.1:80')
assert check_connect_url.connect_url == 'http://127.0.0.1:80/'
check_connect_url = objects.Server(connect_url='http://127.0.0.1:80/')
assert check_connect_url.connect_url == 'http://127.0.0.1:80/'
def test_user(db):
orm_user = orm.User(name='kaylee')

View File

@@ -253,6 +253,7 @@ def test_shell_cmd(db, tmpdir, request):
r.raise_for_status()
env = r.json()
assert env['TESTVAR'] == 'foo'
yield s.stop()
def test_inherit_overwrite():
@@ -401,8 +402,13 @@ def test_spawner_routing(app, name):
yield user.spawn()
yield wait_for_spawner(user.spawner)
yield app.proxy.add_user(user)
kwargs = {'allow_redirects': False}
if app.internal_ssl:
kwargs['cert'] = (app.internal_ssl_cert, app.internal_ssl_key)
kwargs["verify"] = app.internal_ssl_ca
url = url_path_join(public_url(app, user), "test/url")
r = yield async_requests.get(url, allow_redirects=False)
r = yield async_requests.get(url, **kwargs)
r.raise_for_status()
assert r.url == url
assert r.text == urlparse(url).path
yield user.stop()

View File

@@ -1,6 +1,8 @@
from concurrent.futures import ThreadPoolExecutor
import requests
from certipy import Certipy
class _AsyncRequests:
"""Wrapper around requests to return a Future from request methods
@@ -13,10 +15,24 @@ class _AsyncRequests:
requests_method = getattr(requests, name)
return lambda *args, **kwargs: self.executor.submit(requests_method, *args, **kwargs)
# async_requests.get = requests.get returning a Future, etc.
async_requests = _AsyncRequests()
class AsyncSession(requests.Session):
"""requests.Session object that runs in the background thread"""
def request(self, *args, **kwargs):
return async_requests.executor.submit(super().request, *args, **kwargs)
def ssl_setup(cert_dir, authority_name):
# Set up the external certs with the same authority as the internal
# one so that certificate trust works regardless of chosen endpoint.
certipy = Certipy(store_dir=cert_dir)
alt_names = ['DNS:localhost', 'IP:127.0.0.1']
internal_authority = certipy.create_ca(authority_name, overwrite=True)
external_certs = certipy.create_signed_pair('external', authority_name,
overwrite=True,
alt_names=alt_names)
return external_certs

View File

@@ -10,13 +10,14 @@ from sqlalchemy import inspect
from tornado import gen
from tornado.log import app_log
from .utils import maybe_future, url_path_join
from .utils import maybe_future, url_path_join, make_ssl_context
from . import orm
from ._version import _check_version, __version__
from .objects import Server
from .spawner import LocalProcessSpawner
from .crypto import encrypt, decrypt, CryptKeeper, EncryptionUnavailable, InvalidToken
from .metrics import TOTAL_USERS
class UserDict(dict):
"""Like defaultdict, but for users
@@ -39,6 +40,7 @@ class UserDict(dict):
"""Add a user to the UserDict"""
if orm_user.id not in self:
self[orm_user.id] = self.from_orm(orm_user)
TOTAL_USERS.inc()
return self[orm_user.id]
def __contains__(self, key):
@@ -93,6 +95,7 @@ class UserDict(dict):
self.db.delete(user)
self.db.commit()
# delete from dict after commit
TOTAL_USERS.dec()
del self[user_id]
def count_active_users(self):
@@ -226,6 +229,12 @@ class User:
client_id = 'jupyterhub-user-%s' % quote(self.name)
if server_name:
client_id = '%s-%s' % (client_id, quote(server_name))
trusted_alt_names = []
trusted_alt_names.extend(self.settings.get('trusted_alt_names', []))
if self.settings.get('subdomain_host'):
trusted_alt_names.append('DNS:' + self.domain)
spawn_kwargs = dict(
user=self,
orm_spawner=orm_spawner,
@@ -236,7 +245,19 @@ class User:
db=self.db,
oauth_client_id=client_id,
cookie_options = self.settings.get('cookie_options', {}),
trusted_alt_names=trusted_alt_names,
)
if self.settings.get('internal_ssl'):
ssl_kwargs = dict(
internal_ssl=self.settings.get('internal_ssl'),
internal_trust_bundles=self.settings.get(
'internal_trust_bundles'),
internal_certs_location=self.settings.get(
'internal_certs_location'),
)
spawn_kwargs.update(ssl_kwargs)
# update with kwargs. Mainly for testing.
spawn_kwargs.update(kwargs)
spawner = spawner_class(**spawn_kwargs)
@@ -428,6 +449,11 @@ class User:
try:
# run optional preparation work to bootstrap the notebook
await maybe_future(spawner.run_pre_spawn_hook())
if self.settings.get('internal_ssl'):
self.log.debug("Creating internal SSL certs for %s", spawner._log_name)
hub_paths = await maybe_future(spawner.create_certs())
spawner.cert_paths = await maybe_future(spawner.move_certs(hub_paths))
self.log.debug("Calling Spawner.start for %s", spawner._log_name)
f = maybe_future(spawner.start())
# commit any changes in spawner.start (always commit db changes before yield)
db.commit()
@@ -439,7 +465,8 @@ class User:
pass
else:
# >= 0.7 returns (ip, port)
url = 'http://%s:%i' % url
proto = 'https' if self.settings['internal_ssl'] else 'http'
url = '%s://%s:%i' % ((proto,) + url)
urlinfo = urlparse(url)
server.proto = urlinfo.scheme
server.ip = urlinfo.hostname
@@ -523,8 +550,15 @@ class User:
spawner.orm_spawner.state = spawner.get_state()
db.commit()
spawner._waiting_for_response = True
key = self.settings.get('internal_ssl_key')
cert = self.settings.get('internal_ssl_cert')
ca = self.settings.get('internal_ssl_ca')
ssl_context = make_ssl_context(key, cert, cafile=ca)
try:
resp = await server.wait_up(http=True, timeout=spawner.http_timeout)
resp = await server.wait_up(
http=True,
timeout=spawner.http_timeout,
ssl_context=ssl_context)
except Exception as e:
if isinstance(e, TimeoutError):
self.log.warning(

View File

@@ -16,6 +16,7 @@ import os
import socket
import sys
import threading
import ssl
import uuid
import warnings
@@ -70,6 +71,21 @@ def can_connect(ip, port):
else:
return True
def make_ssl_context(
keyfile, certfile, cafile=None,
verify=True, check_hostname=True):
"""Setup context for starting an https server or making requests over ssl.
"""
if not keyfile or not certfile:
return None
purpose = ssl.Purpose.SERVER_AUTH if verify else ssl.Purpose.CLIENT_AUTH
ssl_context = ssl.create_default_context(purpose, cafile=cafile)
ssl_context.load_cert_chain(certfile, keyfile)
ssl_context.check_hostname = check_hostname
return ssl_context
async def exponential_backoff(
pass_func,
fail_message,
@@ -166,12 +182,16 @@ async def wait_for_server(ip, port, timeout=10):
)
async def wait_for_http_server(url, timeout=10):
async def wait_for_http_server(url, timeout=10, ssl_context=None):
"""Wait for an HTTP Server to respond at url.
Any non-5XX response code will do, even 404.
"""
loop = ioloop.IOLoop.current()
tic = loop.time()
client = AsyncHTTPClient()
if ssl_context:
client.ssl_options = ssl_context
async def is_reachable():
try:
r = await client.fetch(url, follow_redirects=False)

View File

@@ -10,3 +10,4 @@ python-dateutil
SQLAlchemy>=1.1
requests
prometheus_client>=0.0.21
certipy>=0.1.2

View File

@@ -112,6 +112,10 @@ setup_args = dict(
'pam = jupyterhub.auth:PAMAuthenticator',
'dummy = jupyterhub.auth:DummyAuthenticator',
],
'jupyterhub.proxies': [
'default = jupyterhub.proxy:ConfigurableHTTPProxy',
'configurable-http-proxy = jupyterhub.proxy:ConfigurableHTTPProxy',
],
'jupyterhub.spawners': [
'default = jupyterhub.spawner:LocalProcessSpawner',
'localprocess = jupyterhub.spawner:LocalProcessSpawner',