Mirror of https://github.com/jupyterhub/jupyterhub.git
Synced 2025-10-07 18:14:10 +00:00

Compare commits: 38 commits
Commits in this comparison:

- b1111363fd
- 6c99b807c2
- 8d650f594e
- 04a0a3a2e5
- 9cebfd6367
- 587cd70221
- e94f5e043a
- 5456fb6356
- fb75b9a392
- 90d341e6f7
- a0354de3c1
- 2e4e1ce82f
- 06f646099f
- 3360817cb6
- e042ad0b4a
- 246f9f9044
- bc08f4de34
- 12904ecc32
- 601d371796
- 30d9e09390
- 7850a5d478
- f5a3b1bc5a
- b2fe8e5691
- 9d4c410996
- dcae92ce4a
- 29957b8cd8
- 6299e0368c
- c862b6062d
- 146587ffff
- 077d8dec9a
- af8d6086fc
- 18f8661d73
- bd70f66c70
- ac213fc4b5
- db33549173
- e985e2b84c
- 1d9abf7528
- 897f5f62d5

@@ -1,5 +1,6 @@
-r requirements.txt
mock
beautifulsoup4
codecov
cryptography
pytest-cov

@@ -3,7 +3,7 @@ swagger: '2.0'
info:
  title: JupyterHub
  description: The REST API for JupyterHub
-  version: 0.9.0dev
+  version: 0.9.4
  license:
    name: BSD-3-Clause
schemes:

@@ -9,6 +9,30 @@ command line for details.

## 0.9

### [0.9.4] 2018-09-24

JupyterHub 0.9.4 is a small bugfix release.

- Fixes an issue that required all running user servers to be restarted
  when performing an upgrade from 0.8 to 0.9.
- Fixes content-type for API endpoints back to `application/json`.
  It was `text/html` in 0.9.0-0.9.3.

### [0.9.3] 2018-09-12

JupyterHub 0.9.3 contains small bugfixes and improvements

- Fix token page and model handling of `expires_at`.
  This field was missing from the REST API model for tokens
  and could cause the token page to not render
- Add keep-alive to progress event stream to avoid proxies dropping
  the connection due to inactivity
- Documentation and example improvements
- Disable quit button when using notebook 5.6
- Prototype new feature (may change prior to 1.0):
  pass requesting Handler to Spawners during start,
  accessible as `self.handler`

### [0.9.2] 2018-08-10

JupyterHub 0.9.2 contains small bugfixes and improvements.
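
The 0.9.3 entry above mentions the prototype `self.handler` attribute on Spawners. As a rough, hedged sketch (not part of this diff), a custom Spawner could read query arguments from the requesting handler like this; the class name and environment variable are illustrative only:

```python
from jupyterhub.spawner import LocalProcessSpawner


class QueryAwareSpawner(LocalProcessSpawner):
    """Hypothetical Spawner that inspects the request that triggered the spawn."""

    def get_env(self):
        env = super().get_env()
        # self.handler is only set while the spawn request is being processed;
        # it is cleared again once start() finishes (see the User.spawn changes below).
        if self.handler is not None:
            env['SPAWN_QUERY'] = self.handler.request.query
        return env
```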

@@ -118,7 +142,7 @@ and tornado < 5.0.
- Added "Start All" button to admin page for launching all user servers at once.
- Services have an `info` field which is a dictionary.
  This is accessible via the REST API.
-- `JupyterHub.extra_handlers` allows defining additonal tornado RequestHandlers attached to the Hub.
+- `JupyterHub.extra_handlers` allows defining additional tornado RequestHandlers attached to the Hub.
- API tokens may now expire.
  Expiry is available in the REST model as `expires_at`,
  and settable when creating API tokens by specifying `expires_in`.
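
As a hedged illustration of the `expires_in` option noted above (not part of this diff), a new API token with an expiry could be requested roughly like this, assuming the `/users/:name/tokens` endpoint and placeholder values for the Hub URL and the requesting token:

```python
import requests

hub_api = 'http://127.0.0.1:8081/hub/api'  # assumed Hub API base URL
r = requests.post(
    f'{hub_api}/users/admin/tokens',
    headers={'Authorization': 'token <your-api-token>'},  # placeholder credential
    json={'note': 'short-lived token', 'expires_in': 3600},  # expiry in seconds
)
r.raise_for_status()
reply = r.json()
# expires_at is reported in the token model as an ISO8601 timestamp (or null)
print(reply['expires_at'])
```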

@@ -402,7 +426,9 @@ Fix removal of `/login` page in 0.4.0, breaking some OAuth providers.
First preview release


-[Unreleased]: https://github.com/jupyterhub/jupyterhub/compare/0.9.2...HEAD
+[Unreleased]: https://github.com/jupyterhub/jupyterhub/compare/0.9.4...HEAD
+[0.9.4]: https://github.com/jupyterhub/jupyterhub/compare/0.9.3...0.9.4
+[0.9.3]: https://github.com/jupyterhub/jupyterhub/compare/0.9.2...0.9.3
[0.9.2]: https://github.com/jupyterhub/jupyterhub/compare/0.9.1...0.9.2
[0.9.1]: https://github.com/jupyterhub/jupyterhub/compare/0.9.0...0.9.1
[0.9.0]: https://github.com/jupyterhub/jupyterhub/compare/0.8.1...0.9.0

@@ -45,7 +45,7 @@ is important that these files be put in a secure location on your server, where
they are not readable by regular users.

If you are using a **chain certificate**, see also chained certificate for SSL
-in the JupyterHub `troubleshooting FAQ <troubleshooting>`_.
+in the JupyterHub `Troubleshooting FAQ <../troubleshooting.html>`_.

Using letsencrypt
~~~~~~~~~~~~~~~~~

@@ -70,7 +70,7 @@ Cmnd_Alias JUPYTER_CMD = /usr/local/bin/sudospawner
rhea ALL=(JUPYTER_USERS) NOPASSWD:JUPYTER_CMD
```

-It might be useful to modifiy `secure_path` to add commands in path.
+It might be useful to modify `secure_path` to add commands in path.

As an alternative to adding every user to the `/etc/sudoers` file, you can
use a group in the last line above, instead of `JUPYTER_USERS`:

@@ -125,7 +125,7 @@ sure are available, I can install their specs system-wide (in /usr/local) with:
There are two broad categories of user environments that depend on what
Spawner you choose:

-- Multi-user hosts (shared sytem)
+- Multi-user hosts (shared system)
- Container-based

How you configure user environments for each category can differ a bit

@@ -196,7 +196,7 @@ allocate. Attempting to use more memory than this limit will cause errors. The
single-user notebook server can discover its own memory limit by looking at
the environment variable `MEM_LIMIT`, which is specified in absolute bytes.

-`c.Spawner.mem_guarantee`: Sometimes, a **guarantee** of a *minumum amount of
+`c.Spawner.mem_guarantee`: Sometimes, a **guarantee** of a *minimum amount of
memory* is desirable. In this case, you can set `c.Spawner.mem_guarantee` to
to provide a guarantee that at minimum this much memory will always be
available for the single-user notebook server to use. The environment variable
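
As a rough sketch of the memory settings discussed in this documentation hunk (the values are examples, not part of the diff), a `jupyterhub_config.py` might contain:

```python
# jupyterhub_config.py: `c` is provided by JupyterHub when loading this file.
# Whether the limit/guarantee is actually enforced depends on the Spawner in use.
c.Spawner.mem_limit = '2G'        # exposed to the single-user server as MEM_LIMIT (in bytes)
c.Spawner.mem_guarantee = '512M'  # minimum amount of memory reserved for the server

# Inside the single-user environment, the limit can be read back, e.g.:
#   import os
#   mem_limit_bytes = int(os.environ['MEM_LIMIT'])
```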

@@ -75,7 +75,7 @@ the top of all pages. The more specific variables
`announcement_login`, `announcement_spawn`, `announcement_home`, and
`announcement_logout` are more specific and only show on their
respective pages (overriding the global `announcement` variable).
-Note that changing these varables require a restart, unlike direct
+Note that changing these variables require a restart, unlike direct
template extension.

You can get the same effect by extending templates, which allows you
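
A hedged sketch of the announcement variables described in this hunk (values are placeholders, not from the diff), set via template variables in `jupyterhub_config.py`:

```python
# jupyterhub_config.py: `c` is provided by JupyterHub when loading this file.
# Changing these values requires restarting the Hub.
c.JupyterHub.template_vars = {
    'announcement': 'Scheduled maintenance this Saturday 02:00-04:00 UTC',  # all pages
    'announcement_login': 'Sign in with your institutional account',        # login page only
}
```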

@@ -166,7 +166,7 @@ startup
statsd
stdin
stdout
-stoppped
+stopped
subclasses
subcommand
subdomain

@@ -1,9 +1,14 @@
# Example for a Spawner.pre_spawn_hook
# create a directory for the user before the spawner starts

"""
Example for a Spawner.pre_spawn_hook
create a directory for the user before the spawner starts
"""
# pylint: disable=import-error
import os
import shutil
from jupyter_client.localinterfaces import public_ips

def create_dir_hook(spawner):
    """ Create directory """
    username = spawner.user.name # get the username
    volume_path = os.path.join('/volumes/jupyterhub', username)
    if not os.path.exists(volume_path):

@@ -12,23 +17,24 @@ def create_dir_hook(spawner):
        # ...

def clean_dir_hook(spawner):
    """ Delete directory """
    username = spawner.user.name # get the username
    temp_path = os.path.join('/volumes/jupyterhub', username, 'temp')
    if os.path.exists(temp_path) and os.path.isdir(temp_path):
        shutil.rmtree(temp_path)

# attach the hook functions to the spawner
# pylint: disable=undefined-variable
c.Spawner.pre_spawn_hook = create_dir_hook
c.Spawner.post_stop_hook = clean_dir_hook

# Use the DockerSpawner to serve your users' notebooks
c.JupyterHub.spawner_class = 'dockerspawner.DockerSpawner'
from jupyter_client.localinterfaces import public_ips
c.JupyterHub.hub_ip = public_ips()[0]
c.DockerSpawner.hub_ip_connect = public_ips()[0]
c.DockerSpawner.container_ip = "0.0.0.0"

# You can now mount the volume to the docker container as we've
# made sure the directory exists
# pylint: disable=bad-whitespace
c.DockerSpawner.volumes = { '/volumes/jupyterhub/{username}/': '/home/jovyan/work' }

@@ -11,12 +11,16 @@ function get_hub_version() {
    hub_xyz=$(cat hub_version)
    split=( ${hub_xyz//./ } )
    hub_xy="${split[0]}.${split[1]}"
+    # add .dev on hub_xy so it's 1.0.dev
+    if [[ ! -z "${split[3]}" ]]; then
+        hub_xy="${hub_xy}.${split[3]}"
+    fi
}


get_hub_version

-# when building master, push 0.9.0 as well
+# when building master, push 0.9.0.dev as well
docker tag $DOCKER_REPO:$DOCKER_TAG $DOCKER_REPO:$hub_xyz
docker push $DOCKER_REPO:$hub_xyz
docker tag $ONBUILD:$DOCKER_TAG $ONBUILD:$hub_xyz

@@ -6,7 +6,7 @@
version_info = (
    0,
    9,
-    2,
+    4,
    "", # release (b1, rc1, or "" for final or dev)
    # "dev", # dev or nothing
)

@@ -2,6 +2,7 @@
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.

+from datetime import datetime
import json

from http.client import responses

@@ -13,12 +14,25 @@ from .. import orm
from ..handlers import BaseHandler
from ..utils import isoformat, url_path_join


class APIHandler(BaseHandler):
    """Base class for API endpoints

    Differences from page handlers:

    - JSON responses and errors
    - strict referer checking for Cookie-authenticated requests
    - strict content-security-policy
    - methods for REST API models
    """

    @property
    def content_security_policy(self):
        return '; '.join([super().content_security_policy, "default-src 'none'"])

    def get_content_type(self):
        return 'application/json'

    def check_referer(self):
        """Check Origin for cross-site API requests.

@@ -156,6 +170,7 @@ class APIHandler(BaseHandler):
            'kind': kind,
            'created': isoformat(token.created),
            'last_activity': isoformat(token.last_activity),
+            'expires_at': isoformat(expires_at),
        }
        model.update(extra)
        return model

@@ -253,3 +268,13 @@ class APIHandler(BaseHandler):

    def options(self, *args, **kwargs):
        self.finish()


+class API404(APIHandler):
+    """404 for API requests
+
+    Ensures JSON 404 errors for malformed URLs
+    """
+    async def prepare(self):
+        await super().prepare()
+        raise web.HTTPError(404)

@@ -428,6 +428,9 @@ class UserAdminAccessAPIHandler(APIHandler):

class SpawnProgressAPIHandler(APIHandler):
    """EventStream handler for pending spawns"""

+    keepalive_interval = 8

    def get_content_type(self):
        return 'text/event-stream'

@@ -440,6 +443,23 @@ class SpawnProgressAPIHandler(APIHandler):
            # raise Finish to halt the handler
            raise web.Finish()

+    _finished = False
+    def on_finish(self):
+        self._finished = True
+
+    async def keepalive(self):
+        """Write empty lines periodically
+
+        to avoid being closed by intermediate proxies
+        when there's a large gap between events.
+        """
+        while not self._finished:
+            try:
+                self.write("\n\n")
+            except (StreamClosedError, RuntimeError):
+                return
+            await asyncio.sleep(self.keepalive_interval)
+
    @admin_or_self
    async def get(self, username, server_name=''):
        self.set_header('Cache-Control', 'no-cache')

@@ -453,6 +473,9 @@ class SpawnProgressAPIHandler(APIHandler):
            # user has no such server
            raise web.HTTPError(404)
        spawner = user.spawners[server_name]
+
+        # start sending keepalive to avoid proxies closing the connection
+        asyncio.ensure_future(self.keepalive())
        # cases:
        # - spawner already started and ready
        # - spawner not running at all

@@ -973,6 +973,8 @@ class JupyterHub(Application):
        h.extend(self.extra_handlers)

        h.append((r'/logo', LogoHandler, {'path': self.logo_file}))
+        h.append((r'/api/(.*)', apihandlers.base.API404))
+
        self.handlers = self.add_url_prefix(self.hub_prefix, h)
        # some extra handlers, outside hub_prefix
        self.handlers.extend([

@@ -1506,6 +1508,10 @@ class JupyterHub(Application):
        for user in self.users.values():
            for spawner in user.spawners.values():
                oauth_client_ids.add(spawner.oauth_client_id)
+                # avoid deleting clients created by 0.8
+                # 0.9 uses `jupyterhub-user-...` for the client id, while
+                # 0.8 uses just `user-...`
+                oauth_client_ids.add(spawner.oauth_client_id.split('-', 1)[1])

        client_store = self.oauth_provider.client_authenticator.client_store
        for i, oauth_client in enumerate(self.db.query(orm.OAuthClient)):

@@ -602,7 +602,7 @@ class BaseHandler(RequestHandler):

        self.log.debug("Initiating spawn for %s", user_server_name)

-        spawn_future = user.spawn(server_name, options)
+        spawn_future = user.spawn(server_name, options, handler=self)

        self.log.debug("%i%s concurrent spawns",
            spawn_pending_count,

@@ -111,7 +111,11 @@ class SpawnHandler(BaseHandler):
        if user.spawner._spawn_future and user.spawner._spawn_future.done():
            user.spawner._spawn_future = None
        # not running, no form. Trigger spawn by redirecting to /user/:name
-        self.redirect(user.url)
+        url = user.url
+        if self.request.query:
+            # add query params
+            url += '?' + self.request.query
+        self.redirect(url)

    @web.authenticated
    async def post(self, for_user=None):

@@ -243,9 +247,11 @@ class TokenPageHandler(BaseHandler):
            api_tokens.append(token)

        # group oauth client tokens by client id
+        # AccessTokens have expires_at as an integer timestamp
+        now_timestamp = now.timestamp()
        oauth_tokens = defaultdict(list)
        for token in user.oauth_tokens:
-            if token.expires_at and token.expires_at < now:
+            if token.expires_at and token.expires_at < now_timestamp:
                self.log.warning("Deleting expired token")
                self.db.delete(token)
                self.db.commit()
@@ -746,7 +746,7 @@ def new_session_factory(url="sqlite:///:memory:",
|
||||
Base.metadata.create_all(engine)
|
||||
|
||||
# We set expire_on_commit=False, since we don't actually need
|
||||
# SQLAlchemy to expire objects after commiting - we don't expect
|
||||
# SQLAlchemy to expire objects after committing - we don't expect
|
||||
# concurrent runs of the hub talking to the same db. Turning
|
||||
# this off gives us a major performance boost
|
||||
session_factory = sessionmaker(bind=engine,
|
||||
|

@@ -487,9 +487,14 @@ class ConfigurableHTTPProxy(Proxy):

        # if we got here, CHP is still running
        self.log.warning("Proxy still running at pid=%s", pid)
-        for i, sig in enumerate([signal.SIGTERM] * 2 + [signal.SIGKILL]):
+        if os.name != 'nt':
+            sig_list = [signal.SIGTERM] * 2 + [signal.SIGKILL]
+        for i in range(3):
            try:
-                os.kill(pid, signal.SIGTERM)
+                if os.name == 'nt':
+                    self._terminate_win(pid)
+                else:
+                    os.kill(pid,sig_list[i])
            except ProcessLookupError:
                break
            time.sleep(1)

@@ -600,18 +605,21 @@ class ConfigurableHTTPProxy(Proxy):
        self._check_running_callback = pc
        pc.start()

+    def _terminate_win(self, pid):
+        # On Windows we spawned a shell on Popen, so we need to
+        # terminate all child processes as well
+        import psutil
+
+        parent = psutil.Process(pid)
+        children = parent.children(recursive=True)
+        for child in children:
+            child.kill()
+        psutil.wait_procs(children, timeout=5)
+
    def _terminate(self):
        """Terminate our process"""
        if os.name == 'nt':
-            # On Windows we spawned a shell on Popen, so we need to
-            # terminate all child processes as well
-            import psutil
-
-            parent = psutil.Process(self.proxy_process.pid)
-            children = parent.children(recursive=True)
-            for child in children:
-                child.kill()
-            psutil.wait_procs(children, timeout=5)
+            self._terminate_win(self.proxy_process.pid)
        else:
            self.proxy_process.terminate()

@@ -298,6 +298,7 @@ class SingleUserNotebookApp(NotebookApp):
    # disble some single-user configurables
    token = ''
    open_browser = False
+    quit_button = False
    trust_xheaders = True
    login_handler_class = JupyterHubLoginHandler
    logout_handler_class = JupyterHubLogoutHandler

@@ -161,6 +161,7 @@ class Spawner(LoggingConfigurable):
    admin_access = Bool(False)
    api_token = Unicode()
    oauth_client_id = Unicode()
+    handler = Any()

    will_resume = Bool(False,
        help="""Whether the Spawner will resume on next start

@@ -74,18 +74,20 @@ def mock_open_session(username, service, encoding):

class MockSpawner(LocalProcessSpawner):
    """Base mock spawner

    - disables user-switching that we need root permissions to do
    - spawns `jupyterhub.tests.mocksu` instead of a full single-user server
    """
    def make_preexec_fn(self, *a, **kw):
        # skip the setuid stuff
        return

    def _set_user_changed(self, name, old, new):
        pass

    def user_env(self, env):
+        if self.handler:
+            env['HANDLER_ARGS'] = self.handler.request.query
        return env

    @default('cmd')

@@ -27,13 +27,13 @@ class ArgsHandler(web.RequestHandler):
        self.write(json.dumps(sys.argv))

def main(args):

    app = web.Application([
        (r'.*/args', ArgsHandler),
        (r'.*/env', EnvHandler),
        (r'.*', EchoHandler),
    ])

    server = httpserver.HTTPServer(app)
    server.listen(args.port)
    try:

@@ -45,4 +45,4 @@ if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--port', type=int)
    args, extra = parser.parse_known_args()
-    main(args)
+    main(args)

@@ -100,6 +100,8 @@ def api_request(app, *api_path, **kwargs):
    assert "frame-ancestors 'self'" in resp.headers['Content-Security-Policy']
    assert ujoin(app.hub.base_url, "security/csp-report") in resp.headers['Content-Security-Policy']
    assert 'http' not in resp.headers['Content-Security-Policy']
+    if not kwargs.get('stream', False) and resp.content:
+        assert resp.headers.get('content-type') == 'application/json'
    return resp

@@ -604,6 +606,32 @@ def test_spawn(app):
    assert app.users.count_active_users()['pending'] == 0


+@mark.gen_test
+def test_spawn_handler(app):
+    """Test that the requesting Handler is passed to Spawner.handler"""
+    db = app.db
+    name = 'salmon'
+    user = add_user(db, app=app, name=name)
+    app_user = app.users[name]
+
+    # spawn via API with ?foo=bar
+    r = yield api_request(app, 'users', name, 'server', method='post', params={'foo': 'bar'})
+    r.raise_for_status()
+
+    # verify that request params got passed down
+    # implemented in MockSpawner
+    url = public_url(app, user)
+    r = yield async_requests.get(ujoin(url, 'env'))
+    env = r.json()
+    assert 'HANDLER_ARGS' in env
+    assert env['HANDLER_ARGS'] == 'foo=bar'
+    # make user spawner.handler doesn't persist after spawn finishes
+    assert app_user.spawner.handler is None
+
+    r = yield api_request(app, 'users', name, 'server', method='delete')
+    r.raise_for_status()
+
+
@mark.slow
@mark.gen_test
def test_slow_spawn(app, no_patience, slow_spawn):

@@ -720,6 +748,8 @@ def test_progress(request, app, no_patience, slow_spawn):
    r = yield api_request(app, 'users', name, 'server/progress', stream=True)
    r.raise_for_status()
    request.addfinalizer(r.close)
+    assert r.headers['content-type'] == 'text/event-stream'
+
    ex = async_requests.executor
    line_iter = iter(r.iter_lines(decode_unicode=True))
    evt = yield ex.submit(next_event, line_iter)

@@ -781,6 +811,7 @@ def test_progress_ready(request, app):
    r = yield api_request(app, 'users', name, 'server/progress', stream=True)
    r.raise_for_status()
    request.addfinalizer(r.close)
+    assert r.headers['content-type'] == 'text/event-stream'
    ex = async_requests.executor
    line_iter = iter(r.iter_lines(decode_unicode=True))
    evt = yield ex.submit(next_event, line_iter)

@@ -800,6 +831,7 @@ def test_progress_bad(request, app, no_patience, bad_spawn):
    r = yield api_request(app, 'users', name, 'server/progress', stream=True)
    r.raise_for_status()
    request.addfinalizer(r.close)
+    assert r.headers['content-type'] == 'text/event-stream'
    ex = async_requests.executor
    line_iter = iter(r.iter_lines(decode_unicode=True))
    evt = yield ex.submit(next_event, line_iter)

@@ -821,6 +853,7 @@ def test_progress_bad_slow(request, app, no_patience, slow_bad_spawn):
    r = yield api_request(app, 'users', name, 'server/progress', stream=True)
    r.raise_for_status()
    request.addfinalizer(r.close)
+    assert r.headers['content-type'] == 'text/event-stream'
    ex = async_requests.executor
    line_iter = iter(r.iter_lines(decode_unicode=True))
    evt = yield ex.submit(next_event, line_iter)

@@ -1188,14 +1221,19 @@ def test_token_as_user_deprecated(app, as_user, for_user, status):


@mark.gen_test
-@mark.parametrize("headers, status, note", [
-    ({}, 200, 'test note'),
-    ({}, 200, ''),
-    ({'Authorization': 'token bad'}, 403, ''),
+@mark.parametrize("headers, status, note, expires_in", [
+    ({}, 200, 'test note', None),
+    ({}, 200, '', 100),
+    ({'Authorization': 'token bad'}, 403, '', None),
])
-def test_get_new_token(app, headers, status, note):
+def test_get_new_token(app, headers, status, note, expires_in):
+    options = {}
    if note:
-        body = json.dumps({'note': note})
+        options['note'] = note
+    if expires_in:
+        options['expires_in'] = expires_in
+    if options:
+        body = json.dumps(options)
    else:
        body = ''
    # request a new token

@@ -1213,6 +1251,10 @@ def test_get_new_token(app, headers, status, note):
    assert reply['user'] == 'admin'
    assert reply['created']
    assert 'last_activity' in reply
+    if expires_in:
+        assert isinstance(reply['expires_at'], str)
+    else:
+        assert reply['expires_at'] is None
    if note:
        assert reply['note'] == note
    else:

@@ -1,7 +1,9 @@
"""Tests for HTML pages"""

+import sys
+from urllib.parse import urlencode, urlparse

from bs4 import BeautifulSoup
from tornado import gen
from tornado.httputil import url_concat

@@ -168,6 +170,31 @@ def test_spawn_redirect(app):
    assert path == ujoin(app.base_url, '/user/%s/' % name)


+@pytest.mark.gen_test
+def test_spawn_handler_access(app):
+    name = 'winston'
+    cookies = yield app.login_user(name)
+    u = app.users[orm.User.find(app.db, name)]
+
+    status = yield u.spawner.poll()
+    assert status is not None
+
+    # spawn server via browser link with ?arg=value
+    r = yield get_page('spawn', app, cookies=cookies, params={'arg': 'value'})
+    r.raise_for_status()
+
+    # verify that request params got passed down
+    # implemented in MockSpawner
+    r = yield async_requests.get(ujoin(public_url(app, u), 'env'))
+    env = r.json()
+    assert 'HANDLER_ARGS' in env
+    assert env['HANDLER_ARGS'] == 'arg=value'
+
+    # stop server
+    r = yield api_request(app, 'users', name, 'server', method='delete')
+    r.raise_for_status()
+
+
@pytest.mark.gen_test
def test_spawn_admin_access(app, admin_access):
    """GET /user/:name as admin with admin-access spawns user's server"""

@@ -573,6 +600,51 @@ def test_announcements(app, announcements):
    assert_announcement("logout", r.text)


+@pytest.mark.gen_test
+def test_token_page(app):
+    name = "cake"
+    cookies = yield app.login_user(name)
+    r = yield get_page("token", app, cookies=cookies)
+    r.raise_for_status()
+    assert urlparse(r.url).path.endswith('/hub/token')
+    def extract_body(r):
+        soup = BeautifulSoup(r.text, "html5lib")
+        import re
+        # trim empty lines
+        return re.sub(r"(\n\s*)+", "\n", soup.body.find(class_="container").text)
+    body = extract_body(r)
+    assert "Request new API token" in body, body
+    # no tokens yet, no lists
+    assert "API Tokens" not in body, body
+    assert "Authorized Applications" not in body, body
+
+    # request an API token
+    user = app.users[name]
+    token = user.new_api_token(expires_in=60, note="my-test-token")
+    app.db.commit()
+
+    r = yield get_page("token", app, cookies=cookies)
+    r.raise_for_status()
+    body = extract_body(r)
+    assert "API Tokens" in body, body
+    assert "my-test-token" in body, body
+    # no oauth tokens yet, shouldn't have that section
+    assert "Authorized Applications" not in body, body
+
+    # spawn the user to trigger oauth, etc.
+    # request an oauth token
+    user.spawner.cmd = [sys.executable, '-m', 'jupyterhub.singleuser']
+    r = yield get_page("spawn", app, cookies=cookies)
+    r.raise_for_status()
+
+    r = yield get_page("token", app, cookies=cookies)
+    r.raise_for_status()
+    body = extract_body(r)
+    assert "API Tokens" in body, body
+    assert "Server at %s" % user.base_url in body, body
+    assert "Authorized Applications" in body, body
+
+
@pytest.mark.gen_test
def test_server_not_running_api_request(app):
    cookies = yield app.login_user("bees")

@@ -331,7 +331,7 @@ class User:
        url_parts.extend(['server/progress'])
        return url_path_join(*url_parts)

-    async def spawn(self, server_name='', options=None):
+    async def spawn(self, server_name='', options=None, handler=None):
        """Start the user's spawner

        depending from the value of JupyterHub.allow_named_servers

@@ -361,6 +361,9 @@ class User:
        spawner.server = server = Server(orm_server=orm_server)
        assert spawner.orm_spawner.server is orm_server

+        # pass requesting handler to the spawner
+        # e.g. for processing GET params
+        spawner.handler = handler
        # Passing user_options to the spawner
        spawner.user_options = options or {}
        # we are starting a new server, make sure it doesn't restore state

@@ -484,6 +487,9 @@ class User:
            # raise original exception
            spawner._start_pending = False
            raise e
+        finally:
+            # clear reference to handler after start finishes
+            spawner.handler = None
        spawner.start_polling()

        # store state

@@ -552,11 +558,25 @@ class User:
            # remove server entry from db
            spawner.server = None
            if not spawner.will_resume:
-                # find and remove the API token if the spawner isn't
+                # find and remove the API token and oauth client if the spawner isn't
                # going to re-use it next time
                orm_token = orm.APIToken.find(self.db, api_token)
                if orm_token:
                    self.db.delete(orm_token)
+                # remove oauth client as well
+                # handle upgrades from 0.8, where client id will be `user-USERNAME`,
+                # not just `jupyterhub-user-USERNAME`
+                client_ids = (
+                    spawner.oauth_client_id,
+                    spawner.oauth_client_id.split('-', 1)[1],
+                )
+                for oauth_client in (
+                    self.db
+                    .query(orm.OAuthClient)
+                    .filter(orm.OAuthClient.identifier.in_(client_ids))
+                ):
+                    self.log.debug("Deleting oauth client %s", oauth_client.identifier)
+                    self.db.delete(oauth_client)
                self.db.commit()
        finally:
            spawner.orm_spawner.started = None

@@ -4,3 +4,8 @@ conda:
  file: docs/environment.yml
python:
  version: 3
+formats:
+  - htmlzip
+  - epub
+  # pdf disabled due to bug in sphinx 1.8 + recommonmark
+  # - pdf

@@ -1,7 +1,7 @@
#!/bin/bash
set -ex

-stable=0.8
+stable=0.9

for V in master $stable; do
    docker build --build-arg JUPYTERHUB_VERSION=$V -t $DOCKER_REPO:$V .

@@ -1,6 +1,7 @@
#!/bin/bash
set -ex

-stable=0.8
+stable=0.9
for V in master $stable; do
    docker push $DOCKER_REPO:$V
done

@@ -12,6 +13,10 @@ function get_hub_version() {
    hub_xyz=$(cat hub_version)
    split=( ${hub_xyz//./ } )
    hub_xy="${split[0]}.${split[1]}"
+    # add .dev on hub_xy so it's 1.0.dev
+    if [[ ! -z "${split[3]}" ]]; then
+        hub_xy="${hub_xy}.${split[3]}"
+    fi
}
# tag e.g. 0.8.1 with 0.8
get_hub_version $stable

@@ -22,3 +27,5 @@ docker push $DOCKER_REPO:$hub_xyz
get_hub_version master
docker tag $DOCKER_REPO:master $DOCKER_REPO:$hub_xy
docker push $DOCKER_REPO:$hub_xy
+docker tag $DOCKER_REPO:master $DOCKER_REPO:$hub_xyz
+docker push $DOCKER_REPO:$hub_xyz