Merge branch 'master' into auth_data_sharing
@@ -34,7 +34,7 @@ before_install:
fi
install:
- pip install --upgrade pip
- pip install --pre -r dev-requirements.txt .
- pip install --upgrade --pre -r dev-requirements.txt .
- pip freeze

# running tests
@@ -48,7 +48,8 @@ script:
# build docs
if [[ "$TEST" == "docs" ]]; then
pushd docs
pip install -r requirements.txt
pip install --upgrade -r requirements.txt
pip install --upgrade alabaster_jupyterhub
make html
popd
fi

@@ -1,5 +1,5 @@

# Contributing

See the [contribution guide](https://jupyterhub.readthedocs.io/en/latest/index.html#contributor) section
See the [contribution guide](https://jupyterhub.readthedocs.io/en/latest/index.html#contributing) section
in the JupyterHub documentation.

@@ -12,11 +12,12 @@
|
||||
|
||||
[](https://pypi.python.org/pypi/jupyterhub)
|
||||
[](https://jupyterhub.readthedocs.org/en/latest/?badge=latest)
|
||||
[](https://jupyterhub.readthedocs.io/en/0.7.2/?badge=0.7.2)
|
||||
[](https://travis-ci.org/jupyterhub/jupyterhub)
|
||||
[](https://circleci.com/gh/jupyterhub/jupyterhub)
|
||||
[](https://codecov.io/github/jupyterhub/jupyterhub?branch=master)
|
||||
[](https://groups.google.com/forum/#!forum/jupyter)
|
||||
[](https://github.com/jupyterhub/jupyterhub/issues)
|
||||
[](https://discourse.jupyter.org/c/jupyterhub)
|
||||
[](https://gitter.im/jupyterhub/jupyterhub)
|
||||
|
||||
With [JupyterHub](https://jupyterhub.readthedocs.io) you can create a
|
||||
**multi-user Hub** which spawns, manages, and proxies multiple instances of the
|
||||
@@ -204,6 +205,9 @@ and the [`CONTRIBUTING.md`](CONTRIBUTING.md). The `CONTRIBUTING.md` file
explains how to set up a development installation, how to run the test suite,
and how to contribute to documentation.

For a high-level view of the vision and next directions of the project, see the
[JupyterHub community roadmap](docs/source/contributing/roadmap.md).

### A note about platform support

JupyterHub is supported on Linux/Unix based systems.

@@ -6,7 +6,7 @@ coverage
|
||||
cryptography
|
||||
html5lib # needed for beautifulsoup
|
||||
pytest-cov
|
||||
pytest-tornado
|
||||
pytest-asyncio
|
||||
pytest>=3.3
|
||||
notebook
|
||||
requests-mock
|
||||
|
@@ -21,3 +21,4 @@ dependencies:
|
||||
- prometheus_client
|
||||
- attrs>=17.4.0
|
||||
- sphinx-copybutton
|
||||
- alabaster_jupyterhub
|
||||
|
@@ -4,3 +4,4 @@
|
||||
sphinx>=1.7
|
||||
recommonmark==0.4.0
|
||||
sphinx-copybutton
|
||||
alabaster_jupyterhub
|
||||
|
@@ -89,7 +89,7 @@ paths:
|
||||
post:
|
||||
summary: Create multiple users
|
||||
parameters:
|
||||
- name: data
|
||||
- name: body
|
||||
in: body
|
||||
required: true
|
||||
schema:
|
||||
@@ -147,7 +147,7 @@ paths:
|
||||
in: path
|
||||
required: true
|
||||
type: string
|
||||
- name: data
|
||||
- name: body
|
||||
in: body
|
||||
required: true
|
||||
description: Updated user info. At least one key to be updated (name or admin) is required.
|
||||
@@ -176,6 +176,60 @@ paths:
|
||||
responses:
|
||||
'204':
|
||||
description: The user has been deleted
|
||||
/users/{name}/activity:
|
||||
post:
|
||||
summary:
|
||||
Notify Hub of activity for a given user.
|
||||
description:
|
||||
Notify the Hub of activity by the user,
|
||||
e.g. accessing a service or (more likely)
|
||||
actively using a server.
|
||||
parameters:
|
||||
- name: name
|
||||
description: username
|
||||
in: path
|
||||
required: true
|
||||
type: string
|
||||
- body:
|
||||
in: body
|
||||
schema:
|
||||
type: object
|
||||
properties:
|
||||
last_activity:
|
||||
type: string
|
||||
format: date-time
|
||||
description: |
|
||||
Timestamp of last-seen activity for this user.
|
||||
Only needed if this is not activity associated
|
||||
with using a given server.
|
||||
required: false
|
||||
servers:
|
||||
description: |
|
||||
Register activity for specific servers by name.
|
||||
The keys of this dict are the names of servers.
|
||||
The default server has an empty name ('').
|
||||
required: false
|
||||
type: object
|
||||
properties:
|
||||
'<server name>':
|
||||
description: |
|
||||
Activity for a single server.
|
||||
type: object
|
||||
properties:
|
||||
last_activity:
|
||||
required: true
|
||||
type: string
|
||||
format: date-time
|
||||
description: |
|
||||
Timestamp of last-seen activity on this server.
|
||||
example:
|
||||
last_activity: '2019-02-06T12:54:14Z'
|
||||
servers:
|
||||
'':
|
||||
last_activity: '2019-02-06T12:54:14Z'
|
||||
gpu:
|
||||
last_activity: '2019-02-06T12:54:14Z'
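For reference, a minimal client-side sketch of calling this new activity endpoint. The Hub address, username, and token are illustrative assumptions, not part of this diff:

```python
# Sketch: report last-seen activity for user "ada" to the Hub REST API.
import json
from urllib.request import Request, urlopen

API_URL = "http://127.0.0.1:8081/hub/api"  # assumption: default Hub API address
API_TOKEN = "..."                          # assumption: a token authorized for this user

payload = {
    "last_activity": "2019-02-06T12:54:14Z",
    "servers": {"": {"last_activity": "2019-02-06T12:54:14Z"}},
}
req = Request(
    API_URL + "/users/ada/activity",
    data=json.dumps(payload).encode("utf8"),
    headers={
        "Authorization": "token {}".format(API_TOKEN),
        "Content-Type": "application/json",
    },
    method="POST",
)
urlopen(req)  # a 400 response indicates a malformed body (see the schema above)
```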
|
||||
|
||||
/users/{name}/server:
|
||||
post:
|
||||
summary: Start a user's single-user notebook server
|
||||
@@ -185,6 +239,15 @@ paths:
|
||||
in: path
|
||||
required: true
|
||||
type: string
|
||||
- options:
|
||||
description: |
|
||||
Spawn options can be passed as a JSON body
|
||||
when spawning via the API instead of spawn form.
|
||||
The structure of the options
|
||||
will depend on the Spawner's configuration.
|
||||
in: body
|
||||
required: false
|
||||
type: object
|
||||
responses:
|
||||
'201':
|
||||
description: The user's notebook server has started
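A matching sketch for the spawn-with-options request described above; the `profile` option is purely illustrative, since valid option names depend on the configured Spawner:

```python
# Sketch: start the default server for user "ada", passing spawner options as JSON.
import json
from urllib.request import Request, urlopen

req = Request(
    "http://127.0.0.1:8081/hub/api/users/ada/server",     # assumption: default Hub API address
    data=json.dumps({"profile": "gpu"}).encode("utf8"),   # assumption: example option
    headers={"Authorization": "token {}".format("...")},  # assumption: a valid API token
    method="POST",
)
urlopen(req)  # 201 is returned once the server has started
```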
|
||||
@@ -217,13 +280,15 @@ paths:
|
||||
in: path
|
||||
required: true
|
||||
type: string
|
||||
- name: remove
|
||||
- options:
|
||||
description: |
|
||||
Whether to fully remove the server, rather than just stop it.
|
||||
Removing a server deletes things like the state of the stopped server.
|
||||
Spawn options can be passed as a JSON body
|
||||
when spawning via the API instead of spawn form.
|
||||
The structure of the options
|
||||
will depend on the Spawner's configuration.
|
||||
in: body
|
||||
required: false
|
||||
type: boolean
|
||||
type: object
|
||||
responses:
|
||||
'201':
|
||||
description: The user's notebook named-server has started
|
||||
@@ -242,6 +307,13 @@ paths:
|
||||
in: path
|
||||
required: true
|
||||
type: string
|
||||
- name: remove
|
||||
description: |
|
||||
Whether to fully remove the server, rather than just stop it.
|
||||
Removing a server deletes things like the state of the stopped server.
|
||||
in: body
|
||||
required: false
|
||||
type: boolean
|
||||
responses:
|
||||
'204':
|
||||
description: The user's notebook named-server has stopped
|
||||
@@ -352,7 +424,7 @@ paths:
|
||||
in: path
|
||||
required: true
|
||||
type: string
|
||||
- name: data
|
||||
- name: body
|
||||
in: body
|
||||
required: true
|
||||
description: The users to add to the group
|
||||
@@ -377,7 +449,7 @@ paths:
|
||||
in: path
|
||||
required: true
|
||||
type: string
|
||||
- name: data
|
||||
- name: body
|
||||
in: body
|
||||
required: true
|
||||
description: The users to remove from the group
|
||||
@@ -435,7 +507,7 @@ paths:
|
||||
summary: Notify the Hub about a new proxy
|
||||
description: Notifies the Hub of a new proxy to use.
|
||||
parameters:
|
||||
- name: data
|
||||
- name: body
|
||||
in: body
|
||||
required: true
|
||||
description: Any values that have changed for the new proxy. All keys are optional.
|
||||
|
@@ -67,7 +67,9 @@ source_suffix = ['.rst', '.md']
|
||||
# -- Options for HTML output ----------------------------------------------
|
||||
|
||||
# The theme to use for HTML and HTML Help pages.
|
||||
html_theme = 'alabaster'
|
||||
import alabaster_jupyterhub
|
||||
html_theme = 'alabaster_jupyterhub'
|
||||
html_theme_path = [alabaster_jupyterhub.get_html_theme_path()]
|
||||
|
||||
html_logo = '_static/images/logo/logo.png'
|
||||
html_favicon = '_static/images/logo/favicon.ico'
|
||||
|
docs/source/contributing/roadmap.md (new file, 98 lines added)
@@ -0,0 +1,98 @@
# The JupyterHub roadmap

This roadmap collects "next steps" for JupyterHub. It is about creating a
shared understanding of the project's vision and direction amongst
the community of users, contributors, and maintainers.
The goal is to communicate priorities and upcoming release plans.
It is not aimed at limiting contributions to what is listed here.


## Using the roadmap

### Sharing Feedback on the Roadmap

All of the community is encouraged to provide feedback as well as share new
ideas with the community. Please do so by submitting an issue. If you want to
have an informal conversation first, use one of the other communication channels.
After submitting the issue, others from the community will probably
respond with questions or comments to help clarify the issue. The
maintainers will help identify a good next step for the issue.

### What do we mean by "next step"?

When submitting an issue, think about which "next step" category best describes
your issue:

* **now**: a concrete, actionable step that is ready for someone to start work on.
  These might be items that link to an existing issue, or something more abstract like
  "decrease typos and dead links in the documentation".
* **soon**: a less concrete step that is going to happen soon;
  discussions around the topic are coming close to an end, at which point it can
  move into the "now" category.
* **later**: abstract ideas or tasks that need a lot of discussion or
  experimentation to shape the idea so that it can be executed. This can also
  contain concrete, actionable steps that have been postponed on purpose
  (steps that could be in "now", but where the decision was taken to work on
  them later).

### Reviewing and Updating the Roadmap

The roadmap will get updated as time passes (next review by 1st December) based
on discussions and ideas captured as issues.
This list is not meant to be exhaustive; it should only represent
the "top of the stack" of ideas. It should
not function as a wish list, a collection of feature requests, or a todo list.
For those, please create a
[new issue](https://github.com/jupyterhub/jupyterhub/issues/new).

The roadmap should give the reader an idea of what is happening next, what needs
input and discussion before it can happen, and what has been postponed.

## The roadmap proper

### Project vision

JupyterHub is a dependable tool used by humans that reduces the complexity of
creating the environment in which a piece of software can be executed.

### Now

These "Now" items are considered active areas of focus for the project:

* HubShare - a sharing service for use with JupyterHub.
  * Users should be able to:
    - Push a project to other users.
    - Get a checkout of a project from other users.
    - Push updates to a published project.
    - Pull updates from a published project.
    - Manage conflicts/merges by simply picking a version (ours/theirs).
    - Get a checkout of a project from the internet. These steps are completely different from saving notebooks/files.
    - Have directories that are managed by git completely separately from their other files.
    - Look at pushed content that they have access to without an explicit pull.
    - Define and manage teams of users.
      - Adding or removing a user to/from a team gives or removes their access to all projects that team has access to.
    - Build other services, such as static HTML publishing and dashboarding, on top of these things.


### Soon

These "Soon" items are under discussion. Once an item reaches the point of an
actionable plan, the item will be moved to the "Now" section. Typically,
these will be moved at a future review of the roadmap.

* resource monitoring and management:
  - (prometheus?) API for resource monitoring
  - tracking activity on single-user servers instead of the proxy
  - notes and activity tracking per API token
  - UI for managing named servers


### Later

The "Later" items are things that are at the back of the project's mind. At this
time there is no active plan for an item. The project would like to find the
resources and time to discuss these ideas.

- real-time collaboration
  - Enter into real-time collaboration mode for a project that starts a shared execution context.
  - Once the single-user notebook package supports real-time collaboration,
    implement a sharing mechanism integrated into the Hub.
@@ -85,8 +85,13 @@ easy to do with RStudio too.
|
||||
|
||||
- https://datascience.business.illinois.edu
|
||||
|
||||
### IllustrisTNG Simulation Project
|
||||
|
||||
- [JupyterHub/Lab-based analysis platform, part of the TNG public data release](http://www.tng-project.org/data/)
|
||||
|
||||
### MIT and Lincoln Labs
|
||||
|
||||
- https://supercloud.mit.edu/
|
||||
|
||||
### Michigan State University
|
||||
|
||||
@@ -146,7 +151,6 @@ easy to do with RStudio too.
|
||||
|
||||
[Everware](https://github.com/everware) Reproducible and reusable science powered by JupyterHub and Docker. Like nbviewer, but executable. CERN, Geneva. [website](http://everware.xyz/)
|
||||
|
||||
|
||||
### Microsoft Azure
|
||||
|
||||
- https://docs.microsoft.com/en-us/azure/machine-learning/machine-learning-data-science-linux-dsvm-intro
|
||||
@@ -156,9 +160,7 @@ easy to do with RStudio too.
|
||||
- https://getcarina.com/blog/learning-how-to-whale/
|
||||
- http://carolynvanslyck.com/talk/carina/jupyterhub/#/
|
||||
|
||||
### jcloud.io
|
||||
- JupyterHub server open to the public
|
||||
- https://jcloud.io
|
||||
|
||||
## Miscellaneous
|
||||
|
||||
- https://medium.com/@ybarraud/setting-up-jupyterhub-with-sudospawner-and-anaconda-844628c0dbee#.rm3yt87e1
|
||||
|
@@ -30,7 +30,7 @@ For convenient administration of the Hub, its users, and services,
|
||||
JupyterHub also provides a `REST API`_.
|
||||
|
||||
The JupyterHub team and Project Jupyter value our community, and JupyterHub
|
||||
follows the Jupyter [Community Guides](https://jupyter.readthedocs.io/en/latest/community/content-community.html).
|
||||
follows the Jupyter `Community Guides <https://jupyter.readthedocs.io/en/latest/community/content-community.html>`_.
|
||||
|
||||
Contents
|
||||
========
|
||||
@@ -115,6 +115,7 @@ helps keep our community welcoming to as many people as possible.
|
||||
contributing/setup
|
||||
contributing/docs
|
||||
contributing/tests
|
||||
contributing/roadmap
|
||||
|
||||
Upgrading JupyterHub
|
||||
--------------------
|
||||
|
@@ -190,3 +190,24 @@ Listen 443
|
||||
</Location>
|
||||
</VirtualHost>
|
||||
```
|
||||
|
||||
|
||||
If you need to run JupyterHub under a prefix such as /jhub/, use the configuration below:
- JupyterHub running locally at http://127.0.0.1:8000/jhub/ (or another location)

httpd.conf amendments:
```bash
RewriteRule /jhub/(.*) ws://127.0.0.1:8000/jhub/$1 [P,L]
RewriteRule /jhub/(.*) http://127.0.0.1:8000/jhub/$1 [P,L]

ProxyPass /jhub/ http://127.0.0.1:8000/jhub/
ProxyPassReverse /jhub/ http://127.0.0.1:8000/jhub/
```

jupyterhub_config.py amendments:
```python
# The public facing URL of the whole JupyterHub application.
# This is the address on which the proxy will bind. Sets protocol, ip, and base_url.
c.JupyterHub.bind_url = 'http://127.0.0.1:8000/jhub/'
```
|
||||
|
||||
|
@@ -25,19 +25,19 @@ supplement the material in the block. The
|
||||
make extensive use of blocks, which allows you to customize parts of the
|
||||
interface easily.
|
||||
|
||||
In general, a child template can extend a base template, `base.html`, by beginning with:
|
||||
In general, a child template can extend a base template, `page.html`, by beginning with:
|
||||
|
||||
```html
|
||||
{% extends "base.html" %}
|
||||
{% extends "page.html" %}
|
||||
```
|
||||
|
||||
This works, unless you are trying to extend the default template for the same
|
||||
file name. Starting in version 0.9, you may refer to the base file with a
|
||||
`templates/` prefix. Thus, if you are writing a custom `base.html`, start the
|
||||
`templates/` prefix. Thus, if you are writing a custom `page.html`, start the
|
||||
file with this block:
|
||||
|
||||
```html
|
||||
{% extends "templates/base.html" %}
|
||||
{% extends "templates/page.html" %}
|
||||
```
|
||||
|
||||
By defining `block`s with the same name as in the base template, child templates
|
||||
|
@@ -4,16 +4,17 @@
|
||||
# Distributed under the terms of the Modified BSD License.
|
||||
|
||||
import asyncio
|
||||
from datetime import datetime
|
||||
from datetime import datetime, timedelta, timezone
|
||||
import json
|
||||
|
||||
from async_generator import aclosing
|
||||
from dateutil.parser import parse as parse_date
|
||||
from tornado import web
|
||||
from tornado.iostream import StreamClosedError
|
||||
|
||||
from .. import orm
|
||||
from ..user import User
|
||||
from ..utils import admin_only, iterate_until, maybe_future, url_path_join
|
||||
from ..utils import admin_only, isoformat, iterate_until, maybe_future, url_path_join
|
||||
from .base import APIHandler
|
||||
|
||||
|
||||
@@ -242,7 +243,10 @@ class UserTokenListAPIHandler(APIHandler):
|
||||
# defer to Authenticator for identifying the user
|
||||
# can be username+password or an upstream auth token
|
||||
try:
|
||||
name = await self.authenticator.authenticate(self, body.get('auth'))
|
||||
name = await self.authenticate(body.get('auth'))
|
||||
if isinstance(name, dict):
|
||||
# not a simple string so it has to be a dict
|
||||
name = name.get('name')
|
||||
except web.HTTPError as e:
|
||||
# turn any authentication error into 403
|
||||
raise web.HTTPError(403)
|
||||
@@ -347,8 +351,19 @@ class UserServerAPIHandler(APIHandler):
|
||||
@admin_or_self
|
||||
async def post(self, name, server_name=''):
|
||||
user = self.find_user(name)
|
||||
if server_name and not self.allow_named_servers:
|
||||
if server_name:
|
||||
if not self.allow_named_servers:
|
||||
raise web.HTTPError(400, "Named servers are not enabled.")
|
||||
if self.named_server_limit_per_user > 0 and server_name not in user.orm_spawners:
|
||||
named_spawners = list(user.all_spawners(include_default=False))
|
||||
if self.named_server_limit_per_user <= len(named_spawners):
|
||||
raise web.HTTPError(
|
||||
400,
|
||||
"User {} already has the maximum of {} named servers."
|
||||
" One must be deleted before a new server can be created".format(
|
||||
name,
|
||||
self.named_server_limit_per_user
|
||||
))
|
||||
spawner = user.spawners[server_name]
|
||||
pending = spawner.pending
|
||||
if pending == 'spawn':
|
||||
@@ -573,6 +588,139 @@ class SpawnProgressAPIHandler(APIHandler):
|
||||
await self.send_event(failed_event)
|
||||
|
||||
|
||||
def _parse_timestamp(timestamp):
|
||||
"""Parse and return a utc timestamp
|
||||
|
||||
- raise HTTPError(400) on parse error
|
||||
- handle and strip tz info for internal consistency
|
||||
(we use naïve utc timestamps everywhere)
|
||||
"""
|
||||
try:
|
||||
dt = parse_date(timestamp)
|
||||
except Exception:
|
||||
raise web.HTTPError(400, "Not a valid timestamp: %r", timestamp)
|
||||
if dt.tzinfo:
|
||||
# strip timezone info to naïve UTC datetime
|
||||
dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
|
||||
|
||||
now = datetime.utcnow()
|
||||
if (dt - now) > timedelta(minutes=59):
|
||||
raise web.HTTPError(
|
||||
400,
|
||||
"Rejecting activity from more than an hour in the future: {}".format(
|
||||
isoformat(dt)
|
||||
)
|
||||
)
|
||||
return dt
|
||||
|
||||
|
||||
class ActivityAPIHandler(APIHandler):
|
||||
|
||||
def _validate_servers(self, user, servers):
|
||||
"""Validate servers dict argument
|
||||
|
||||
- types are correct
|
||||
- each server exists
|
||||
- last_activity fields are parsed into datetime objects
|
||||
"""
|
||||
msg = "servers must be a dict of the form {server_name: {last_activity: timestamp}}"
|
||||
if not isinstance(servers, dict):
|
||||
raise web.HTTPError(400, msg)
|
||||
|
||||
spawners = user.orm_spawners
|
||||
for server_name, server_info in servers.items():
|
||||
if server_name not in spawners:
|
||||
raise web.HTTPError(
|
||||
400,
|
||||
"No such server '{}' for user {}".format(
|
||||
server_name,
|
||||
user.name,
|
||||
)
|
||||
)
|
||||
# check that each per-server field is a dict
|
||||
if not isinstance(server_info, dict):
|
||||
raise web.HTTPError(400, msg)
|
||||
# check that last_activity is defined for each per-server dict
|
||||
if 'last_activity' not in server_info:
|
||||
raise web.HTTPError(400, msg)
|
||||
# parse last_activity timestamps
|
||||
# _parse_timestamp above is responsible for raising errors
|
||||
server_info['last_activity'] = _parse_timestamp(server_info['last_activity'])
|
||||
return servers
|
||||
|
||||
@admin_or_self
|
||||
def post(self, username):
|
||||
user = self.find_user(username)
|
||||
if user is None:
|
||||
# no such user
|
||||
raise web.HTTPError(404, "No such user: %r", username)
|
||||
|
||||
body = self.get_json_body()
|
||||
if not isinstance(body, dict):
|
||||
raise web.HTTPError(400, "body must be a json dict")
|
||||
|
||||
last_activity_timestamp = body.get('last_activity')
|
||||
servers = body.get('servers')
|
||||
if not last_activity_timestamp and not servers:
|
||||
raise web.HTTPError(
|
||||
400,
|
||||
"body must contain at least one of `last_activity` or `servers`"
|
||||
)
|
||||
|
||||
if servers:
|
||||
# validate server args
|
||||
servers = self._validate_servers(user, servers)
|
||||
# at this point we know that the servers dict
|
||||
# is valid and contains only servers that exist
|
||||
# and last_activity is defined and a valid datetime object
|
||||
|
||||
# update user.last_activity if specified
|
||||
if last_activity_timestamp:
|
||||
last_activity = _parse_timestamp(last_activity_timestamp)
|
||||
if (
|
||||
(not user.last_activity)
|
||||
or last_activity > user.last_activity
|
||||
):
|
||||
self.log.debug("Activity for user %s: %s",
|
||||
user.name,
|
||||
isoformat(last_activity),
|
||||
)
|
||||
user.last_activity = last_activity
|
||||
else:
|
||||
self.log.debug(
|
||||
"Not updating activity for %s: %s < %s",
|
||||
user,
|
||||
isoformat(last_activity),
|
||||
isoformat(user.last_activity),
|
||||
)
|
||||
|
||||
if servers:
|
||||
for server_name, server_info in servers.items():
|
||||
last_activity = server_info['last_activity']
|
||||
spawner = user.orm_spawners[server_name]
|
||||
|
||||
if (
|
||||
(not spawner.last_activity)
|
||||
or last_activity > spawner.last_activity
|
||||
):
|
||||
self.log.debug("Activity on server %s/%s: %s",
|
||||
user.name,
|
||||
server_name,
|
||||
isoformat(last_activity),
|
||||
)
|
||||
spawner.last_activity = last_activity
|
||||
else:
|
||||
self.log.debug(
|
||||
"Not updating server activity on %s/%s: %s < %s",
|
||||
user.name,
|
||||
server_name,
|
||||
isoformat(last_activity),
|
||||
isoformat(spawner.last_activity),
|
||||
)
|
||||
|
||||
self.db.commit()
|
||||
|
||||
|
||||
default_handlers = [
|
||||
(r"/api/user", SelfAPIHandler),
|
||||
(r"/api/users", UserListAPIHandler),
|
||||
@@ -583,5 +731,6 @@ default_handlers = [
|
||||
(r"/api/users/([^/]+)/tokens/([^/]*)", UserTokenAPIHandler),
|
||||
(r"/api/users/([^/]+)/servers/([^/]*)", UserServerAPIHandler),
|
||||
(r"/api/users/([^/]+)/servers/([^/]*)/progress", SpawnProgressAPIHandler),
|
||||
(r"/api/users/([^/]+)/activity", ActivityAPIHandler),
|
||||
(r"/api/users/([^/]+)/admin-access", UserAdminAccessAPIHandler),
|
||||
]
|
||||
|
@@ -803,6 +803,16 @@ class JupyterHub(Application):
|
||||
help="Allow named single-user servers per user"
|
||||
).tag(config=True)
|
||||
|
||||
named_server_limit_per_user = Integer(0,
|
||||
help="""
|
||||
Maximum number of concurrent named servers that can be created by a user at a time.
|
||||
|
||||
Setting this can limit the total resources a user can consume.
|
||||
|
||||
If set to 0, no limit is enforced.
|
||||
"""
|
||||
).tag(config=True)
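As a usage sketch (trait names confirmed by this diff, values illustrative), the new limit pairs with named servers in `jupyterhub_config.py`:

```python
# jupyterhub_config.py -- illustrative values
c.JupyterHub.allow_named_servers = True
c.JupyterHub.named_server_limit_per_user = 3  # 0 (the default) means no limit
```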
|
||||
|
||||
# class for spawning single-user servers
|
||||
spawner_class = EntryPointType(
|
||||
default_value=LocalProcessSpawner,
|
||||
@@ -1020,6 +1030,11 @@ class JupyterHub(Application):
|
||||
|
||||
statsd = Any(allow_none=False, help="The statsd client, if any. A mock will be used if we aren't using statsd")
|
||||
|
||||
shutdown_on_logout = Bool(
|
||||
False,
|
||||
help="""Shuts down all user servers on logout"""
|
||||
).tag(config=True)
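A corresponding one-line configuration sketch for the new logout behaviour:

```python
# jupyterhub_config.py -- stop all of a user's servers when they log out
c.JupyterHub.shutdown_on_logout = True
```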
|
||||
|
||||
@default('statsd')
|
||||
def _statsd(self):
|
||||
if self.statsd_host:
|
||||
@@ -1687,6 +1702,40 @@ class JupyterHub(Application):
|
||||
spawner._log_name)
|
||||
status = -1
|
||||
|
||||
if status is None:
|
||||
# poll claims it's running.
|
||||
# Check if it's really there
|
||||
url_in_db = spawner.server.url
|
||||
url = await spawner.get_url()
|
||||
if url != url_in_db:
|
||||
self.log.warning(
|
||||
"%s had invalid url %s. Updating to %s",
|
||||
spawner._log_name, url_in_db, url,
|
||||
)
|
||||
urlinfo = urlparse(url)
|
||||
spawner.server.protocol = urlinfo.scheme
|
||||
spawner.server.ip = urlinfo.hostname
|
||||
if urlinfo.port:
|
||||
spawner.server.port = urlinfo.port
|
||||
elif urlinfo.scheme == 'http':
|
||||
spawner.server.port = 80
|
||||
elif urlinfo.scheme == 'https':
|
||||
spawner.server.port = 443
|
||||
self.db.commit()
|
||||
|
||||
self.log.debug(
|
||||
"Verifying that %s is running at %s",
|
||||
spawner._log_name, url,
|
||||
)
|
||||
try:
|
||||
await user._wait_up(spawner)
|
||||
except TimeoutError:
|
||||
self.log.error(
|
||||
"%s does not appear to be running at %s, shutting it down.",
|
||||
spawner._log_name, url,
|
||||
)
|
||||
status = -1
|
||||
|
||||
if status is None:
|
||||
self.log.info("%s still running", user.name)
|
||||
spawner.add_poll_callback(user_stopped, user, name)
|
||||
@@ -1836,6 +1885,7 @@ class JupyterHub(Application):
|
||||
domain=self.domain,
|
||||
statsd=self.statsd,
|
||||
allow_named_servers=self.allow_named_servers,
|
||||
named_server_limit_per_user=self.named_server_limit_per_user,
|
||||
oauth_provider=self.oauth_provider,
|
||||
concurrent_spawn_limit=self.concurrent_spawn_limit,
|
||||
spawn_throttle_retry_range=self.spawn_throttle_retry_range,
|
||||
@@ -1849,6 +1899,7 @@ class JupyterHub(Application):
|
||||
internal_ssl_cert=self.internal_ssl_cert,
|
||||
internal_ssl_ca=self.internal_ssl_ca,
|
||||
trusted_alt_names=self.trusted_alt_names,
|
||||
shutdown_on_logout=self.shutdown_on_logout
|
||||
)
|
||||
# allow configured settings to have priority
|
||||
settings.update(self.tornado_settings)
|
||||
|
@@ -19,40 +19,15 @@ except Exception as e:
|
||||
_pamela_error = e
|
||||
|
||||
from tornado.concurrent import run_on_executor
|
||||
from tornado import gen
|
||||
|
||||
from traitlets.config import LoggingConfigurable
|
||||
from traitlets import Bool, Set, Unicode, Dict, Any, default, observe
|
||||
from traitlets import Bool, Integer, Set, Unicode, Dict, Any, default, observe
|
||||
|
||||
from .handlers.login import LoginHandler
|
||||
from .utils import maybe_future, url_path_join
|
||||
from .traitlets import Command
|
||||
|
||||
|
||||
def getgrnam(name):
|
||||
"""Wrapper function to protect against `grp` not being available
|
||||
on Windows
|
||||
"""
|
||||
import grp
|
||||
return grp.getgrnam(name)
|
||||
|
||||
|
||||
def getpwnam(name):
|
||||
"""Wrapper function to protect against `pwd` not being available
|
||||
on Windows
|
||||
"""
|
||||
import pwd
|
||||
return pwd.getpwnam(name)
|
||||
|
||||
|
||||
def getgrouplist(name, group):
|
||||
"""Wrapper function to protect against `os.getgrouplist` not being available
|
||||
on Windows
|
||||
"""
|
||||
import os
|
||||
return os.getgrouplist(name, group)
|
||||
|
||||
|
||||
class Authenticator(LoggingConfigurable):
|
||||
"""Base class for implementing an authentication provider for JupyterHub"""
|
||||
|
||||
@@ -77,6 +52,35 @@ class Authenticator(LoggingConfigurable):
|
||||
""",
|
||||
)
|
||||
|
||||
auth_refresh_age = Integer(
|
||||
300,
|
||||
config=True,
|
||||
help="""The max age (in seconds) of authentication info
|
||||
before forcing a refresh of user auth info.
|
||||
|
||||
Refreshing auth info allows, e.g. requesting/re-validating auth tokens.
|
||||
|
||||
See :meth:`.refresh_user` for what happens when user auth info is refreshed
|
||||
(nothing by default).
|
||||
"""
|
||||
)
|
||||
|
||||
refresh_pre_spawn = Bool(
|
||||
False,
|
||||
config=True,
|
||||
help="""Force refresh of auth prior to spawn.
|
||||
|
||||
This forces :meth:`.refresh_user` to be called prior to launching
|
||||
a server, to ensure that auth state is up-to-date.
|
||||
|
||||
This can be important when e.g. auth tokens that may have expired
|
||||
are passed to the spawner via environment variables from auth_state.
|
||||
|
||||
If refresh_user cannot refresh the user auth data,
|
||||
launch will fail until the user logs in again.
|
||||
"""
|
||||
)
|
||||
|
||||
admin_users = Set(
|
||||
help="""
|
||||
Set of users that will have admin rights on this JupyterHub.
|
||||
@@ -329,7 +333,7 @@ class Authenticator(LoggingConfigurable):
|
||||
self.log.warning("User %r not in whitelist.", username)
|
||||
return
|
||||
|
||||
async def refresh_user(self, user):
|
||||
async def refresh_user(self, user, handler=None):
|
||||
"""Refresh auth data for a given user
|
||||
|
||||
Allows refreshing or invalidating auth data.
|
||||
@@ -341,6 +345,7 @@ class Authenticator(LoggingConfigurable):
|
||||
|
||||
Args:
|
||||
user (User): the user to refresh
|
||||
handler (tornado.web.RequestHandler or None): the current request handler
|
||||
Returns:
|
||||
auth_data (bool or dict):
|
||||
Return **True** if auth data for the user is up-to-date
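To illustrate the new signature, a minimal sketch of a custom Authenticator; the `expires_at` field stored in auth_state is an assumption for the example, not a JupyterHub convention:

```python
import time
from jupyterhub.auth import Authenticator

class ExampleRefreshingAuthenticator(Authenticator):
    """Sketch: decide whether stored auth info is still fresh."""

    async def refresh_user(self, user, handler=None):
        auth_state = await user.get_auth_state()
        if not auth_state:
            # nothing stored to refresh
            return True
        if auth_state.get("expires_at", 0) > time.time():  # assumed field
            # token still valid: auth info is up-to-date
            return True
        # a real implementation would renew the token with its provider here;
        # returning False forces the user to log in again
        return False
```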
|
||||
@@ -594,7 +599,7 @@ class LocalAuthenticator(Authenticator):
|
||||
return False
|
||||
for grnam in self.group_whitelist:
|
||||
try:
|
||||
group = getgrnam(grnam)
|
||||
group = self._getgrnam(grnam)
|
||||
except KeyError:
|
||||
self.log.error('No such group: [%s]' % grnam)
|
||||
continue
|
||||
@@ -622,11 +627,33 @@ class LocalAuthenticator(Authenticator):
|
||||
await maybe_future(super().add_user(user))
|
||||
|
||||
@staticmethod
|
||||
def system_user_exists(user):
|
||||
"""Check if the user exists on the system"""
|
||||
def _getgrnam(name):
|
||||
"""Wrapper function to protect against `grp` not being available
|
||||
on Windows
|
||||
"""
|
||||
import grp
|
||||
return grp.getgrnam(name)
|
||||
|
||||
@staticmethod
|
||||
def _getpwnam(name):
|
||||
"""Wrapper function to protect against `pwd` not being available
|
||||
on Windows
|
||||
"""
|
||||
import pwd
|
||||
return pwd.getpwnam(name)
|
||||
|
||||
@staticmethod
|
||||
def _getgrouplist(name, group):
|
||||
"""Wrapper function to protect against `os._getgrouplist` not being available
|
||||
on Windows
|
||||
"""
|
||||
import os
|
||||
return os.getgrouplist(name, group)
|
||||
|
||||
def system_user_exists(self, user):
|
||||
"""Check if the user exists on the system"""
|
||||
try:
|
||||
pwd.getpwnam(user.name)
|
||||
self._getpwnam(user.name)
|
||||
except KeyError:
|
||||
return False
|
||||
else:
|
||||
@@ -727,8 +754,8 @@ class PAMAuthenticator(LocalAuthenticator):
|
||||
# fail to authenticate and raise instead of soft-failing and not changing admin status
|
||||
# (returning None instead of just the username) as this indicates some sort of system failure
|
||||
|
||||
admin_group_gids = {getgrnam(x).gr_gid for x in self.admin_groups}
|
||||
user_group_gids = {x.gr_gid for x in getgrouplist(username, getpwnam(username).pw_gid)}
|
||||
admin_group_gids = {self._getgrnam(x).gr_gid for x in self.admin_groups}
|
||||
user_group_gids = set(self._getgrouplist(username, self._getpwnam(username).pw_gid))
|
||||
admin_status = len(admin_group_gids & user_group_gids) != 0
|
||||
|
||||
except Exception as e:
|
||||
|
@@ -28,6 +28,7 @@ from .. import __version__
|
||||
from .. import orm
|
||||
from ..objects import Server
|
||||
from ..spawner import LocalProcessSpawner
|
||||
from ..user import User
|
||||
from ..utils import maybe_future, url_path_join
|
||||
from ..metrics import (
|
||||
SERVER_SPAWN_DURATION_SECONDS, ServerSpawnStatus,
|
||||
@@ -103,6 +104,10 @@ class BaseHandler(RequestHandler):
|
||||
def allow_named_servers(self):
|
||||
return self.settings.get('allow_named_servers', False)
|
||||
|
||||
@property
|
||||
def named_server_limit_per_user(self):
|
||||
return self.settings.get('named_server_limit_per_user', 0)
|
||||
|
||||
@property
|
||||
def domain(self):
|
||||
return self.settings['domain']
|
||||
@@ -236,7 +241,7 @@ class BaseHandler(RequestHandler):
|
||||
self.db.commit()
|
||||
return self._user_from_orm(orm_token.user)
|
||||
|
||||
async def refresh_user_auth(self, user, force=False):
|
||||
async def refresh_auth(self, user, force=False):
|
||||
"""Refresh user authentication info
|
||||
|
||||
Calls `authenticator.refresh_user(user)`
|
||||
@@ -250,7 +255,12 @@ class BaseHandler(RequestHandler):
|
||||
user (User): the user having been refreshed,
|
||||
or None if the user must login again to refresh auth info.
|
||||
"""
|
||||
if not force: # TODO: and it's sufficiently recent
|
||||
refresh_age = self.authenticator.auth_refresh_age
|
||||
if not refresh_age:
|
||||
return user
|
||||
now = time.monotonic()
|
||||
if not force and user._auth_refreshed and (now - user._auth_refreshed < refresh_age):
|
||||
# auth up-to-date
|
||||
return user
|
||||
|
||||
# refresh a user at most once per request
|
||||
@@ -262,7 +272,7 @@ class BaseHandler(RequestHandler):
|
||||
self._refreshed_users.add(user.name)
|
||||
|
||||
self.log.debug("Refreshing auth for %s", user.name)
|
||||
auth_info = await self.authenticator.refresh_user(user)
|
||||
auth_info = await self.authenticator.refresh_user(user, self)
|
||||
|
||||
if not auth_info:
|
||||
self.log.warning(
|
||||
@@ -271,6 +281,8 @@ class BaseHandler(RequestHandler):
|
||||
)
|
||||
return
|
||||
|
||||
user._auth_refreshed = now
|
||||
|
||||
if auth_info == True:
|
||||
# refresh_user confirmed that it's up-to-date,
|
||||
# nothing to refresh
|
||||
@@ -298,6 +310,10 @@ class BaseHandler(RequestHandler):
|
||||
now = datetime.utcnow()
|
||||
orm_token.last_activity = now
|
||||
if orm_token.user:
|
||||
# FIXME: scopes should give us better control than this
|
||||
# don't consider API requests originating from a server
|
||||
# to be activity from the user
|
||||
if not orm_token.note.startswith("Server at "):
|
||||
orm_token.user.last_activity = now
|
||||
self.db.commit()
|
||||
|
||||
@@ -351,8 +367,8 @@ class BaseHandler(RequestHandler):
|
||||
user = self.get_current_user_token()
|
||||
if user is None:
|
||||
user = self.get_current_user_cookie()
|
||||
if user:
|
||||
user = await self.refresh_user_auth(user)
|
||||
if user and isinstance(user, User):
|
||||
user = await self.refresh_auth(user)
|
||||
self._jupyterhub_user = user
|
||||
except Exception:
|
||||
# don't let errors here raise more than once
|
||||
@@ -606,6 +622,7 @@ class BaseHandler(RequestHandler):
|
||||
self.statsd.incr('login.success')
|
||||
self.statsd.timing('login.authenticate.success', auth_timer.ms)
|
||||
self.log.info("User logged in: %s", user.name)
|
||||
user._auth_refreshed = time.monotonic()
|
||||
return user
|
||||
else:
|
||||
self.statsd.incr('login.failure')
|
||||
@@ -639,6 +656,11 @@ class BaseHandler(RequestHandler):
|
||||
|
||||
async def spawn_single_user(self, user, server_name='', options=None):
|
||||
# in case of error, include 'try again from /hub/home' message
|
||||
if self.authenticator.refresh_pre_spawn:
|
||||
auth_user = await self.refresh_auth(user, force=True)
|
||||
if auth_user is None:
|
||||
raise web.HTTPError(403, "auth has expired for %s, login again", user.name)
|
||||
|
||||
spawn_start_time = time.perf_counter()
|
||||
self.extra_error_html = self.spawn_home_error
|
||||
|
||||
@@ -1050,6 +1072,10 @@ class PrefixRedirectHandler(BaseHandler):
|
||||
path = self.request.uri[len(self.base_url):]
|
||||
else:
|
||||
path = self.request.path
|
||||
if not path:
|
||||
# default / -> /hub/ redirect
|
||||
# avoiding extra hop through /hub
|
||||
path = '/'
|
||||
self.redirect(url_path_join(
|
||||
self.hub.base_url, path,
|
||||
), permanent=False)
|
||||
@@ -1098,7 +1124,7 @@ class UserSpawnHandler(BaseHandler):
|
||||
# otherwise redirect users to their own server
|
||||
should_spawn = (current_user and current_user.name == user_name)
|
||||
|
||||
if "api" in user_path.split("/") and not user.active:
|
||||
if "api" in user_path.split("/") and user and not user.active:
|
||||
# API request for not-running server (e.g. notebook UI left open)
|
||||
# Avoid triggering a spawn.
|
||||
self._fail_api_request(user)
|
||||
@@ -1187,7 +1213,7 @@ class UserSpawnHandler(BaseHandler):
|
||||
status = 0
|
||||
# server is not running, trigger spawn
|
||||
if status is not None:
|
||||
if spawner.options_form:
|
||||
if await spawner.get_options_form():
|
||||
url_parts = [self.hub.base_url, 'spawn']
|
||||
if current_user.name != user.name:
|
||||
# spawning on behalf of another user
|
||||
|
@@ -3,19 +3,41 @@
|
||||
# Copyright (c) Jupyter Development Team.
|
||||
# Distributed under the terms of the Modified BSD License.
|
||||
|
||||
import asyncio
|
||||
|
||||
from tornado.escape import url_escape
|
||||
from tornado import gen
|
||||
from tornado.httputil import url_concat
|
||||
from tornado import web
|
||||
|
||||
from .base import BaseHandler
|
||||
from ..utils import maybe_future
|
||||
|
||||
|
||||
class LogoutHandler(BaseHandler):
|
||||
"""Log a user out by clearing their login cookie."""
|
||||
def get(self):
|
||||
|
||||
@property
|
||||
def shutdown_on_logout(self):
|
||||
return self.settings.get('shutdown_on_logout', False)
|
||||
|
||||
async def get(self):
|
||||
user = self.current_user
|
||||
if user:
|
||||
if self.shutdown_on_logout:
|
||||
active_servers = [
|
||||
name
|
||||
for (name, spawner) in user.spawners.items()
|
||||
if spawner.active and not spawner.pending
|
||||
]
|
||||
if active_servers:
|
||||
self.log.info("Shutting down %s's servers", user.name)
|
||||
futures = []
|
||||
for server_name in active_servers:
|
||||
futures.append(
|
||||
maybe_future(self.stop_single_user(user, server_name))
|
||||
)
|
||||
await asyncio.gather(*futures)
|
||||
|
||||
self.log.info("User logged out: %s", user.name)
|
||||
self.clear_login_cookie()
|
||||
self.statsd.incr('logout')
|
||||
|
@@ -58,6 +58,7 @@ class HomeHandler(BaseHandler):
|
||||
user=user,
|
||||
url=url,
|
||||
allow_named_servers=self.allow_named_servers,
|
||||
named_server_limit_per_user=self.named_server_limit_per_user,
|
||||
url_path_join=url_path_join,
|
||||
# can't use user.spawners because the stop method of User pops named servers from user.spawners when they're stopped
|
||||
spawners = user.orm_user._orm_spawners,
|
||||
@@ -73,11 +74,7 @@ class SpawnHandler(BaseHandler):
|
||||
|
||||
Only enabled when Spawner.options_form is defined.
|
||||
"""
|
||||
async def _render_form(self, message='', for_user=None):
|
||||
# Note that 'user' is the authenticated user making the request and
|
||||
# 'for_user' is the user whose server is being spawned.
|
||||
user = for_user or self.get_current_user()
|
||||
spawner_options_form = await user.spawner.get_options_form()
|
||||
def _render_form(self, for_user, spawner_options_form, message=''):
|
||||
return self.render_template('spawn.html',
|
||||
for_user=for_user,
|
||||
spawner_options_form=spawner_options_form,
|
||||
@@ -106,10 +103,12 @@ class SpawnHandler(BaseHandler):
|
||||
self.log.debug("User is running: %s", url)
|
||||
self.redirect(url)
|
||||
return
|
||||
if user.spawner.options_form:
|
||||
|
||||
spawner_options_form = await user.spawner.get_options_form()
|
||||
if spawner_options_form:
|
||||
# Add handler to spawner here so you can access query params in form rendering.
|
||||
user.spawner.handler = self
|
||||
form = await self._render_form(for_user=user)
|
||||
form = self._render_form(for_user=user, spawner_options_form=spawner_options_form)
|
||||
self.finish(form)
|
||||
else:
|
||||
# Explicit spawn request: clear _spawn_future
|
||||
@@ -153,7 +152,8 @@ class SpawnHandler(BaseHandler):
|
||||
await self.spawn_single_user(user, options=options)
|
||||
except Exception as e:
|
||||
self.log.error("Failed to spawn single-user server with form", exc_info=True)
|
||||
form = await self._render_form(message=str(e), for_user=user)
|
||||
spawner_options_form = await user.spawner.get_options_form()
|
||||
form = self._render_form(for_user=user, spawner_options_form=spawner_options_form, message=str(e))
|
||||
self.finish(form)
|
||||
return
|
||||
if current_user is user:
|
||||
@@ -230,6 +230,7 @@ class AdminHandler(BaseHandler):
|
||||
running=running,
|
||||
sort={s:o for s,o in zip(sorts, orders)},
|
||||
allow_named_servers=self.allow_named_servers,
|
||||
named_server_limit_per_user=self.named_server_limit_per_user,
|
||||
)
|
||||
self.finish(html)
|
||||
|
||||
|
@@ -12,14 +12,13 @@ import alembic.command
|
||||
from alembic.script import ScriptDirectory
|
||||
from tornado.log import app_log
|
||||
|
||||
from sqlalchemy.types import TypeDecorator, TEXT, LargeBinary
|
||||
from sqlalchemy.types import TypeDecorator, Text, LargeBinary
|
||||
from sqlalchemy import (
|
||||
create_engine, event, exc, inspect, or_, select,
|
||||
Column, Integer, ForeignKey, Unicode, Boolean,
|
||||
DateTime, Enum, Table,
|
||||
)
|
||||
from sqlalchemy.ext.declarative import declarative_base
|
||||
from sqlalchemy.interfaces import PoolListener
|
||||
from sqlalchemy.orm import (
|
||||
Session,
|
||||
interfaces, object_session, relationship, sessionmaker,
|
||||
@@ -46,7 +45,7 @@ class JSONDict(TypeDecorator):
|
||||
|
||||
"""
|
||||
|
||||
impl = TEXT
|
||||
impl = Text
|
||||
|
||||
def process_bind_param(self, value, dialect):
|
||||
if value is not None:
|
||||
@@ -559,10 +558,13 @@ class DatabaseSchemaMismatch(Exception):
|
||||
"""
|
||||
|
||||
|
||||
class ForeignKeysListener(PoolListener):
|
||||
"""Enable foreign keys on sqlite"""
|
||||
def connect(self, dbapi_con, con_record):
|
||||
dbapi_con.execute('pragma foreign_keys=ON')
|
||||
def register_foreign_keys(engine):
|
||||
"""register PRAGMA foreign_keys=on on connection"""
|
||||
@event.listens_for(engine, "connect")
|
||||
def connect(dbapi_con, con_record):
|
||||
cursor = dbapi_con.cursor()
|
||||
cursor.execute("PRAGMA foreign_keys=ON")
|
||||
cursor.close()
|
||||
|
||||
|
||||
def _expire_relationship(target, relationship_prop):
|
||||
@@ -735,8 +737,6 @@ def new_session_factory(url="sqlite:///:memory:",
|
||||
"""Create a new session at url"""
|
||||
if url.startswith('sqlite'):
|
||||
kwargs.setdefault('connect_args', {'check_same_thread': False})
|
||||
listeners = kwargs.setdefault('listeners', [])
|
||||
listeners.append(ForeignKeysListener())
|
||||
|
||||
elif url.startswith('mysql'):
|
||||
kwargs.setdefault('pool_recycle', 60)
|
||||
@@ -747,6 +747,9 @@ def new_session_factory(url="sqlite:///:memory:",
|
||||
kwargs.setdefault('poolclass', StaticPool)
|
||||
|
||||
engine = create_engine(url, **kwargs)
|
||||
if url.startswith('sqlite'):
|
||||
register_foreign_keys(engine)
|
||||
|
||||
# enable pessimistic disconnect handling
|
||||
register_ping_connection(engine)
|
||||
|
||||
|
@@ -99,6 +99,11 @@ class _ExpiringDict(dict):
|
||||
except KeyError:
|
||||
return default
|
||||
|
||||
def clear(self):
|
||||
"""Clear the cache"""
|
||||
self.values.clear()
|
||||
self.timestamps.clear()
|
||||
|
||||
|
||||
class HubAuth(SingletonConfigurable):
|
||||
"""A class for authenticating with JupyterHub
|
||||
@@ -277,11 +282,10 @@ class HubAuth(SingletonConfigurable):
|
||||
if cache_key is None:
|
||||
raise ValueError("cache_key is required when using cache")
|
||||
# check for a cached reply, so we don't check with the Hub if we don't have to
|
||||
cached = self.cache.get(cache_key)
|
||||
if cached is not None:
|
||||
return cached
|
||||
else:
|
||||
app_log.debug("Cache miss: %s" % cache_key)
|
||||
try:
|
||||
return self.cache[cache_key]
|
||||
except KeyError:
|
||||
app_log.debug("HubAuth cache miss: %s", cache_key)
|
||||
|
||||
data = self._api_request('GET', url, allow_404=True)
|
||||
if data is None:
|
||||
@@ -848,8 +852,8 @@ class HubAuthenticated(object):
|
||||
self._hub_auth_user_cache = None
|
||||
raise
|
||||
|
||||
# store ?token=... tokens passed via url in a cookie for future requests
|
||||
url_token = self.get_argument('token', '')
|
||||
# store tokens passed via url or header in a cookie for future requests
|
||||
url_token = self.hub_auth.get_token(self)
|
||||
if (
|
||||
user_model
|
||||
and url_token
|
||||
|
@@ -107,6 +107,8 @@ class _ServiceSpawner(LocalProcessSpawner):
|
||||
def start(self):
|
||||
"""Start the process"""
|
||||
env = self.get_env()
|
||||
# no activity url for services
|
||||
env.pop('JUPYTERHUB_ACTIVITY_URL', None)
|
||||
if os.name == 'nt':
|
||||
env['SYSTEMROOT'] = os.environ['SYSTEMROOT']
|
||||
cmd = self.cmd
|
||||
@@ -258,7 +260,6 @@ class Service(LoggingConfigurable):
|
||||
def _default_redirect_uri(self):
|
||||
if self.server is None:
|
||||
return ''
|
||||
print(self.domain, self.host, self.server)
|
||||
return self.host + url_path_join(self.prefix, 'oauth_callback')
|
||||
|
||||
@property
|
||||
|
@@ -4,13 +4,17 @@
|
||||
# Copyright (c) Jupyter Development Team.
|
||||
# Distributed under the terms of the Modified BSD License.
|
||||
|
||||
import asyncio
|
||||
from datetime import datetime, timezone
|
||||
import json
|
||||
import os
|
||||
import random
|
||||
from textwrap import dedent
|
||||
from urllib.parse import urlparse
|
||||
|
||||
from jinja2 import ChoiceLoader, FunctionLoader
|
||||
|
||||
from tornado.httpclient import AsyncHTTPClient
|
||||
from tornado.httpclient import AsyncHTTPClient, HTTPRequest
|
||||
from tornado import gen
|
||||
from tornado import ioloop
|
||||
from tornado.web import HTTPError, RequestHandler
|
||||
@@ -21,8 +25,10 @@ except ImportError:
|
||||
raise ImportError("JupyterHub single-user server requires notebook >= 4.0")
|
||||
|
||||
from traitlets import (
|
||||
Any,
|
||||
Bool,
|
||||
Bytes,
|
||||
Integer,
|
||||
Unicode,
|
||||
CUnicode,
|
||||
default,
|
||||
@@ -43,7 +49,7 @@ from notebook.base.handlers import IPythonHandler
|
||||
from ._version import __version__, _check_version
|
||||
from .log import log_request
|
||||
from .services.auth import HubOAuth, HubOAuthenticated, HubOAuthCallbackHandler
|
||||
from .utils import url_path_join, make_ssl_context
|
||||
from .utils import isoformat, url_path_join, make_ssl_context, exponential_backoff
|
||||
|
||||
|
||||
# Authenticate requests with the Hub
|
||||
@@ -385,20 +391,32 @@ class SingleUserNotebookApp(NotebookApp):
|
||||
path = list(_exclude_home(path))
|
||||
return path
|
||||
|
||||
# create dynamic default http client,
|
||||
# configured with any relevant ssl config
|
||||
hub_http_client = Any()
|
||||
@default('hub_http_client')
|
||||
def _default_client(self):
|
||||
ssl_context = make_ssl_context(
|
||||
self.keyfile,
|
||||
self.certfile,
|
||||
cafile=self.client_ca,
|
||||
)
|
||||
AsyncHTTPClient.configure(
|
||||
None,
|
||||
defaults={
|
||||
"ssl_options": ssl_context,
|
||||
},
|
||||
)
|
||||
return AsyncHTTPClient()
|
||||
|
||||
|
||||
async def check_hub_version(self):
|
||||
"""Test a connection to my Hub
|
||||
|
||||
- exit if I can't connect at all
|
||||
- check version and warn on sufficient mismatch
|
||||
"""
|
||||
ssl_context = make_ssl_context(
|
||||
self.keyfile,
|
||||
self.certfile,
|
||||
cafile=self.client_ca,
|
||||
)
|
||||
AsyncHTTPClient.configure(None, defaults={"ssl_options" : ssl_context})
|
||||
|
||||
client = AsyncHTTPClient()
|
||||
client = self.hub_http_client
|
||||
RETRIES = 5
|
||||
for i in range(1, RETRIES+1):
|
||||
try:
|
||||
@@ -415,6 +433,112 @@ class SingleUserNotebookApp(NotebookApp):
|
||||
hub_version = resp.headers.get('X-JupyterHub-Version')
|
||||
_check_version(hub_version, __version__, self.log)
|
||||
|
||||
server_name = Unicode()
|
||||
@default('server_name')
|
||||
def _server_name_default(self):
|
||||
return os.environ.get('JUPYTERHUB_SERVER_NAME', '')
|
||||
|
||||
hub_activity_url = Unicode(
|
||||
config=True,
|
||||
help="URL for sending JupyterHub activity updates",
|
||||
)
|
||||
@default('hub_activity_url')
|
||||
def _default_activity_url(self):
|
||||
return os.environ.get('JUPYTERHUB_ACTIVITY_URL', '')
|
||||
|
||||
hub_activity_interval = Integer(
|
||||
300,
|
||||
config=True,
|
||||
help="""
|
||||
Interval (in seconds) on which to update the Hub
|
||||
with our latest activity.
|
||||
"""
|
||||
)
|
||||
@default('hub_activity_interval')
|
||||
def _default_activity_interval(self):
|
||||
env_value = os.environ.get('JUPYTERHUB_ACTIVITY_INTERVAL')
|
||||
if env_value:
|
||||
return int(env_value)
|
||||
else:
|
||||
return 300
|
||||
|
||||
_last_activity_sent = Any(allow_none=True)
|
||||
|
||||
async def notify_activity(self):
|
||||
"""Notify jupyterhub of activity"""
|
||||
client = self.hub_http_client
|
||||
last_activity = self.web_app.last_activity()
|
||||
if not last_activity:
|
||||
self.log.debug("No activity to send to the Hub")
|
||||
return
|
||||
if last_activity:
|
||||
# protect against mixed timezone comparisons
|
||||
if not last_activity.tzinfo:
|
||||
# assume naive timestamps are utc
|
||||
self.log.warning("last activity is using naïve timestamps")
|
||||
last_activity = last_activity.replace(tzinfo=timezone.utc)
|
||||
|
||||
if (
|
||||
self._last_activity_sent
|
||||
and last_activity < self._last_activity_sent
|
||||
):
|
||||
self.log.debug("No activity since %s", self._last_activity_sent)
|
||||
return
|
||||
|
||||
last_activity_timestamp = isoformat(last_activity)
|
||||
|
||||
async def notify():
|
||||
self.log.debug("Notifying Hub of activity %s", last_activity_timestamp)
|
||||
req = HTTPRequest(
|
||||
url=self.hub_activity_url,
|
||||
method='POST',
|
||||
headers={
|
||||
"Authorization": "token {}".format(self.hub_auth.api_token),
|
||||
"Content-Type": "application/json",
|
||||
},
|
||||
body=json.dumps({
|
||||
'servers': {
|
||||
self.server_name: {
|
||||
'last_activity': last_activity_timestamp,
|
||||
},
|
||||
},
|
||||
'last_activity': last_activity_timestamp,
|
||||
})
|
||||
)
|
||||
try:
|
||||
await client.fetch(req)
|
||||
except Exception:
|
||||
self.log.exception("Error notifying Hub of activity")
|
||||
return False
|
||||
else:
|
||||
return True
|
||||
|
||||
await exponential_backoff(
|
||||
notify,
|
||||
fail_message="Failed to notify Hub of activity",
|
||||
start_wait=1,
|
||||
max_wait=15,
|
||||
timeout=60,
|
||||
)
|
||||
self._last_activity_sent = last_activity
|
||||
|
||||
async def keep_activity_updated(self):
|
||||
if not self.hub_activity_url or not self.hub_activity_interval:
|
||||
self.log.warning("Activity events disabled")
|
||||
return
|
||||
self.log.info("Updating Hub with activity every %s seconds",
|
||||
self.hub_activity_interval
|
||||
)
|
||||
while True:
|
||||
try:
|
||||
await self.notify_activity()
|
||||
except Exception as e:
|
||||
self.log.exception("Error notifying Hub of activity")
|
||||
# add 20% jitter to the interval to avoid alignment
|
||||
# of lots of requests from user servers
|
||||
t = self.hub_activity_interval * (1 + 0.2 * (random.random() - 0.5))
|
||||
await asyncio.sleep(t)
|
||||
|
||||
def initialize(self, argv=None):
|
||||
# disable trash by default
|
||||
# this can be re-enabled by config
|
||||
@@ -425,6 +549,7 @@ class SingleUserNotebookApp(NotebookApp):
|
||||
self.log.info("Starting jupyterhub-singleuser server version %s", __version__)
|
||||
# start by hitting Hub to check version
|
||||
ioloop.IOLoop.current().run_sync(self.check_hub_version)
|
||||
ioloop.IOLoop.current().add_callback(self.keep_activity_updated)
|
||||
super(SingleUserNotebookApp, self).start()
|
||||
|
||||
def init_hub_auth(self):
|
||||
|
@@ -5,6 +5,7 @@ Contains base Spawner class & default implementation
|
||||
# Copyright (c) Jupyter Development Team.
|
||||
# Distributed under the terms of the Modified BSD License.
|
||||
|
||||
import ast
|
||||
import asyncio
|
||||
import errno
|
||||
import json
|
||||
@@ -14,7 +15,6 @@ import shutil
|
||||
import signal
|
||||
import sys
|
||||
import warnings
|
||||
import pwd
|
||||
from subprocess import Popen
|
||||
from tempfile import mkdtemp
|
||||
|
||||
@@ -36,6 +36,23 @@ from .traitlets import Command, ByteSpecification, Callable
|
||||
from .utils import iterate_until, maybe_future, random_port, url_path_join, exponential_backoff
|
||||
|
||||
|
||||
def _quote_safe(s):
|
||||
"""pass a string that is safe on the command-line
|
||||
|
||||
traitlets may parse literals on the command-line, e.g. `--ip=123` will be the number 123 instead of the *string* 123.
|
||||
wrap valid literals in repr to ensure they are safe
|
||||
"""
|
||||
|
||||
try:
|
||||
val = ast.literal_eval(s)
|
||||
except Exception:
|
||||
# not valid, leave it alone
|
||||
return s
|
||||
else:
|
||||
# it's a valid literal, wrap it in repr (usually just quotes, but with proper escapes)
|
||||
# to avoid getting interpreted by traitlets
|
||||
return repr(s)
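A quick illustration of the behaviour described in the docstring:

```python
_quote_safe("123")         # -> "'123'"      (a valid literal, wrapped so traitlets keeps it a string)
_quote_safe("/home/user")  # -> "/home/user" (not a Python literal, returned unchanged)
```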
|
||||
|
||||
class Spawner(LoggingConfigurable):
|
||||
"""Base class for spawning single-user notebook servers.
|
||||
|
||||
@@ -636,7 +653,15 @@ class Spawner(LoggingConfigurable):
|
||||
|
||||
# Info previously passed on args
|
||||
env['JUPYTERHUB_USER'] = self.user.name
|
||||
env['JUPYTERHUB_SERVER_NAME'] = self.name
|
||||
env['JUPYTERHUB_API_URL'] = self.hub.api_url
|
||||
env['JUPYTERHUB_ACTIVITY_URL'] = url_path_join(
|
||||
self.hub.api_url,
|
||||
'users',
|
||||
# tolerate mocks defining only user.name
|
||||
getattr(self.user, 'escaped_name', self.user.name),
|
||||
'activity',
|
||||
)
|
||||
env['JUPYTERHUB_BASE_URL'] = self.hub.base_url[:-4]
|
||||
if self.server:
|
||||
env['JUPYTERHUB_SERVICE_PREFIX'] = self.server.base_url
|
||||
@@ -661,6 +686,22 @@ class Spawner(LoggingConfigurable):
|
||||
|
||||
return env
|
||||
|
||||
async def get_url(self):
|
||||
"""Get the URL to connect to the server
|
||||
|
||||
Sometimes JupyterHub may ask the Spawner for its url.
|
||||
This can occur e.g. when JupyterHub has restarted while a server was not finished starting,
|
||||
giving Spawners a chance to recover the URL where their server is running.
|
||||
|
||||
The default is to trust that JupyterHub has the right information.
|
||||
Only override this method in Spawners that know how to
|
||||
check the correct URL for the servers they start.
|
||||
|
||||
This will only be asked of Spawners that claim to be running
|
||||
(`poll()` returns `None`).
|
||||
"""
|
||||
return self.server.url
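A hypothetical override for a Spawner that can look up where its server actually ended up; the lookup call is an assumption for the sketch, not a real API:

```python
from jupyterhub.spawner import Spawner

class ExampleContainerSpawner(Spawner):
    async def get_url(self):
        # Ask the (hypothetical) container runtime for the real endpoint,
        # falling back to the URL JupyterHub already has on record.
        url = await self.lookup_container_url()  # hypothetical helper on this class
        return url or self.server.url
```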
|
||||
|
||||
def template_namespace(self):
|
||||
"""Return the template namespace for format-string formatting.
|
||||
|
||||
@@ -816,7 +857,7 @@ class Spawner(LoggingConfigurable):
|
||||
args = []
|
||||
|
||||
if self.ip:
|
||||
args.append('--ip="%s"' % self.ip)
|
||||
args.append('--ip=%s' % _quote_safe(self.ip))
|
||||
|
||||
if self.port:
|
||||
args.append('--port=%i' % self.port)
|
||||
@@ -826,10 +867,10 @@ class Spawner(LoggingConfigurable):
|
||||
|
||||
if self.notebook_dir:
|
||||
notebook_dir = self.format_string(self.notebook_dir)
|
||||
args.append('--notebook-dir="%s"' % notebook_dir)
|
||||
args.append('--notebook-dir=%s' % _quote_safe(notebook_dir))
|
||||
if self.default_url:
|
||||
default_url = self.format_string(self.default_url)
|
||||
args.append('--NotebookApp.default_url="%s"' % default_url)
|
||||
args.append('--NotebookApp.default_url=%s' % _quote_safe(default_url))
|
||||
|
||||
if self.debug:
|
||||
args.append('--debug')
|
||||
@@ -1051,6 +1092,7 @@ def set_user_setuid(username, chdir=True):
    home directory.
    """
    import grp
    import pwd
    user = pwd.getpwnam(username)
    uid = user.pw_uid
    gid = user.pw_gid
@@ -1224,6 +1266,7 @@ class LocalProcessSpawner(Spawner):
        Stage certificates into a private home directory
        and make them readable by the user.
        """
        import pwd
        key = paths['keyfile']
        cert = paths['certfile']
        ca = paths['cafile']
@@ -1383,3 +1426,52 @@ class LocalProcessSpawner(Spawner):
        if status is None:
            # it all failed, zombie process
            self.log.warning("Process %i never died", self.pid)


class SimpleLocalProcessSpawner(LocalProcessSpawner):
    """
    A version of LocalProcessSpawner that doesn't require users to exist on
    the system beforehand.

    Only use this for testing.

    Note: DO NOT USE THIS FOR PRODUCTION USE CASES! It is very insecure, and
    provides absolutely no isolation between different users!
    """

    home_dir_template = Unicode(
        '/tmp/{username}',
        config=True,
        help="""
        Template to expand to set the user home.
        {username} is expanded to the jupyterhub username.
        """
    )

    home_dir = Unicode(help="The home directory for the user")

    @default('home_dir')
    def _default_home_dir(self):
        return self.home_dir_template.format(
            username=self.user.name,
        )

    def make_preexec_fn(self, name):
        home = self.home_dir
        def preexec():
            try:
                os.makedirs(home, 0o755, exist_ok=True)
                os.chdir(home)
            except Exception as e:
                self.log.exception("Error in preexec for %s", name)
        return preexec

    def user_env(self, env):
        env['USER'] = self.user.name
        env['HOME'] = self.home_dir
        env['SHELL'] = '/bin/bash'
        return env

    def move_certs(self, paths):
        """No-op for installing certs"""
        return paths
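
For reference (not part of the diff), a minimal jupyterhub_config.py sketch that opts into this testing-only spawner; the dotted path assumes the class lives in jupyterhub.spawner as shown here:

# jupyterhub_config.py -- testing/demo only, per the warning in the class docstring.
c.JupyterHub.spawner_class = 'jupyterhub.spawner.SimpleLocalProcessSpawner'
# Optional override of where per-user home directories are created ({username} is expanded).
c.SimpleLocalProcessSpawner.home_dir_template = '/tmp/{username}'
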
@@ -29,6 +29,7 @@ Fixtures to add functionality or spawning behavior

import asyncio
from getpass import getuser
import inspect
import logging
import os
import sys
@@ -46,7 +47,7 @@ from ..utils import random_port

from . import mocking
from .mocking import MockHub
from .utils import ssl_setup
from .utils import ssl_setup, add_user
from .test_services import mockservice_cmd

import jupyterhub.services.service
@@ -55,6 +56,16 @@ import jupyterhub.services.service
_db = None


def pytest_collection_modifyitems(items):
    """add asyncio marker to all async tests"""
    for item in items:
        if inspect.iscoroutinefunction(item.obj):
            item.add_marker('asyncio')
        if hasattr(inspect, 'isasyncgenfunction'):
            # double-check that we aren't mixing yield and async def
            assert not inspect.isasyncgenfunction(item.obj)
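
With this hook in place, plain async def tests pick up the asyncio marker automatically; a trivial illustrative sketch (not from the diff):

# Editorial sketch, not part of the diff.
import asyncio

async def test_nothing_blocks():
    # no explicit @pytest.mark.asyncio needed; the collection hook above adds the marker
    await asyncio.sleep(0)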


@fixture(scope='module')
def ssl_tmpdir(tmpdir_factory):
    return tmpdir_factory.mktemp('ssl')
@@ -65,7 +76,7 @@ def app(request, io_loop, ssl_tmpdir):
    """Mock a jupyterhub app for testing"""
    mocked_app = None
    ssl_enabled = getattr(request.module, "ssl_enabled", False)
    kwargs = dict(log_level=logging.DEBUG)
    kwargs = dict()
    if ssl_enabled:
        kwargs.update(
            dict(
@@ -126,15 +137,21 @@ def db():


@fixture(scope='module')
def io_loop(request):
def event_loop(request):
    """Same as pytest-asyncio.event_loop, but re-scoped to module-level"""
    event_loop = asyncio.new_event_loop()
    asyncio.set_event_loop(event_loop)
    return event_loop


@fixture(scope='module')
def io_loop(event_loop, request):
    """Same as pytest-tornado.io_loop, but re-scoped to module-level"""
    ioloop.IOLoop.configure(AsyncIOMainLoop)
    aio_loop = asyncio.new_event_loop()
    asyncio.set_event_loop(aio_loop)
    io_loop = ioloop.IOLoop()
    io_loop = AsyncIOMainLoop()
    io_loop.make_current()
    assert asyncio.get_event_loop() is aio_loop
    assert io_loop.asyncio_loop is aio_loop
    assert asyncio.get_event_loop() is event_loop
    assert io_loop.asyncio_loop is event_loop

    def _close():
        io_loop.clear_current()
@@ -168,6 +185,43 @@ def cleanup_after(request, io_loop):
    app.db.commit()


_username_counter = 0


def new_username(prefix='testuser'):
    """Return a new unique username"""
    global _username_counter
    _username_counter += 1
    return '{}-{}'.format(prefix, _username_counter)


@fixture
def username():
    """allocate a temporary username

    unique each time the fixture is used
    """
    yield new_username()


@fixture
def user(app):
    """Fixture for creating a temporary user

    Each time the fixture is used, a new user is created
    """
    user = add_user(app.db, app, name=new_username())
    yield user


@fixture
def admin_user(app, username):
    """Fixture for creating a temporary admin user"""
    user = add_user(app.db, app, name=new_username('testadmin'), admin=True)
    yield user
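
Taken together, a test can simply request these fixtures to get a fresh, uniquely named user per test; an illustrative sketch (not from the diff, assuming the api_request helper imported from .utils elsewhere in this changeset):

# Editorial sketch, not part of the diff.
async def test_user_model(app, user):
    # `user` is a newly created user that exists only for this test
    r = await api_request(app, 'users', user.name)
    assert r.status_code == 200
    assert r.json()['name'] == user.name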


class MockServiceSpawner(jupyterhub.services.service._ServiceSpawner):
    """mock services for testing.

@@ -40,16 +40,16 @@ from tornado import gen
|
||||
from tornado.concurrent import Future
|
||||
from tornado.ioloop import IOLoop
|
||||
|
||||
from traitlets import Bool, default
|
||||
from traitlets import Bool, Dict, default
|
||||
|
||||
from ..app import JupyterHub
|
||||
from ..auth import PAMAuthenticator
|
||||
from .. import orm
|
||||
from ..objects import Server
|
||||
from ..spawner import LocalProcessSpawner
|
||||
from ..spawner import LocalProcessSpawner, SimpleLocalProcessSpawner
|
||||
from ..singleuser import SingleUserNotebookApp
|
||||
from ..utils import random_port, url_path_join
|
||||
from .utils import async_requests, ssl_setup
|
||||
from .utils import async_requests, ssl_setup, public_host, public_url
|
||||
|
||||
from pamela import PAMError
|
||||
|
||||
@@ -73,20 +73,14 @@ def mock_open_session(username, service, encoding):
|
||||
pass
|
||||
|
||||
|
||||
class MockSpawner(LocalProcessSpawner):
|
||||
class MockSpawner(SimpleLocalProcessSpawner):
|
||||
"""Base mock spawner
|
||||
|
||||
- disables user-switching that we need root permissions to do
|
||||
- spawns `jupyterhub.tests.mocksu` instead of a full single-user server
|
||||
"""
|
||||
def make_preexec_fn(self, *a, **kw):
|
||||
# skip the setuid stuff
|
||||
return
|
||||
|
||||
def _set_user_changed(self, name, old, new):
|
||||
pass
|
||||
|
||||
def user_env(self, env):
|
||||
env = super().user_env(env)
|
||||
if self.handler:
|
||||
env['HANDLER_ARGS'] = self.handler.request.query
|
||||
return env
|
||||
@@ -95,10 +89,6 @@ class MockSpawner(LocalProcessSpawner):
|
||||
def _cmd_default(self):
|
||||
return [sys.executable, '-m', 'jupyterhub.tests.mocksu']
|
||||
|
||||
def move_certs(self, paths):
|
||||
"""Return the paths unmodified"""
|
||||
return paths
|
||||
|
||||
use_this_api_token = None
|
||||
def start(self):
|
||||
if self.use_this_api_token:
|
||||
@@ -177,6 +167,11 @@ class FormSpawner(MockSpawner):
|
||||
options['hello'] = form_data['hello_file'][0]
|
||||
return options
|
||||
|
||||
class FalsyCallableFormSpawner(FormSpawner):
|
||||
"""A spawner that has a callable options form defined returning a falsy value"""
|
||||
|
||||
options_form = lambda a, b: ""
|
||||
|
||||
|
||||
class MockStructGroup:
|
||||
"""Mock grp.struct_group"""
|
||||
@@ -229,11 +224,17 @@ class MockPAMAuthenticator(PAMAuthenticator):
|
||||
class MockHub(JupyterHub):
|
||||
"""Hub with various mock bits"""
|
||||
|
||||
# disable some inherited traits with hardcoded values
|
||||
db_file = None
|
||||
last_activity_interval = 2
|
||||
log_datefmt = '%M:%S'
|
||||
external_certs = None
|
||||
log_level = 10
|
||||
|
||||
@default('log_level')
|
||||
def _default_log_level(self):
|
||||
return 10
|
||||
|
||||
# MockHub additional traits
|
||||
external_certs = Dict()
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
if 'internal_certs_location' in kwargs:
|
||||
@@ -361,31 +362,6 @@ class MockHub(JupyterHub):
|
||||
return r.cookies
|
||||
|
||||
|
||||
def public_host(app):
|
||||
"""Return the public *host* (no URL prefix) of the given JupyterHub instance."""
|
||||
if app.subdomain_host:
|
||||
return app.subdomain_host
|
||||
else:
|
||||
return Server.from_url(app.proxy.public_url).host
|
||||
|
||||
|
||||
def public_url(app, user_or_service=None, path=''):
|
||||
"""Return the full, public base URL (including prefix) of the given JupyterHub instance."""
|
||||
if user_or_service:
|
||||
if app.subdomain_host:
|
||||
host = user_or_service.host
|
||||
else:
|
||||
host = public_host(app)
|
||||
prefix = user_or_service.prefix
|
||||
else:
|
||||
host = public_host(app)
|
||||
prefix = Server.from_url(app.proxy.public_url).base_url
|
||||
if path:
|
||||
return host + url_path_join(prefix, path)
|
||||
else:
|
||||
return host + prefix
|
||||
|
||||
|
||||
# single-user-server mocking:
|
||||
|
||||
class MockSingleUserServer(SingleUserNotebookApp):
|
||||
File diff suppressed because it is too large
@@ -63,8 +63,7 @@ def test_generate_config():
|
||||
assert 'Authenticator.whitelist' in cfg_text
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_init_tokens(request):
|
||||
async def test_init_tokens(request):
|
||||
with TemporaryDirectory() as td:
|
||||
db_file = os.path.join(td, 'jupyterhub.sqlite')
|
||||
tokens = {
|
||||
@@ -77,7 +76,7 @@ def test_init_tokens(request):
|
||||
if ssl_enabled:
|
||||
kwargs['internal_certs_location'] = td
|
||||
app = MockHub(**kwargs)
|
||||
yield app.initialize([])
|
||||
await app.initialize([])
|
||||
db = app.db
|
||||
for token, username in tokens.items():
|
||||
api_token = orm.APIToken.find(db, token)
|
||||
@@ -87,7 +86,7 @@ def test_init_tokens(request):
|
||||
|
||||
# simulate second startup, reloading same tokens:
|
||||
app = MockHub(**kwargs)
|
||||
yield app.initialize([])
|
||||
await app.initialize([])
|
||||
db = app.db
|
||||
for token, username in tokens.items():
|
||||
api_token = orm.APIToken.find(db, token)
|
||||
@@ -99,7 +98,7 @@ def test_init_tokens(request):
|
||||
tokens['short'] = 'gman'
|
||||
app = MockHub(**kwargs)
|
||||
with pytest.raises(ValueError):
|
||||
yield app.initialize([])
|
||||
await app.initialize([])
|
||||
assert orm.User.find(app.db, 'gman') is None
|
||||
|
||||
|
||||
@@ -169,8 +168,7 @@ def test_cookie_secret_env(tmpdir, request):
|
||||
assert not os.path.exists(hub.cookie_secret_file)
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_load_groups(tmpdir, request):
|
||||
async def test_load_groups(tmpdir, request):
|
||||
to_load = {
|
||||
'blue': ['cyclops', 'rogue', 'wolverine'],
|
||||
'gold': ['storm', 'jean-grey', 'colossus'],
|
||||
@@ -181,8 +179,8 @@ def test_load_groups(tmpdir, request):
|
||||
kwargs['internal_certs_location'] = str(tmpdir)
|
||||
hub = MockHub(**kwargs)
|
||||
hub.init_db()
|
||||
yield hub.init_users()
|
||||
yield hub.init_groups()
|
||||
await hub.init_users()
|
||||
await hub.init_groups()
|
||||
db = hub.db
|
||||
blue = orm.Group.find(db, name='blue')
|
||||
assert blue is not None
|
||||
@@ -192,16 +190,15 @@ def test_load_groups(tmpdir, request):
|
||||
assert sorted([ u.name for u in gold.users ]) == sorted(to_load['gold'])
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_resume_spawners(tmpdir, request):
|
||||
async def test_resume_spawners(tmpdir, request):
|
||||
if not os.getenv('JUPYTERHUB_TEST_DB_URL'):
|
||||
p = patch.dict(os.environ, {
|
||||
'JUPYTERHUB_TEST_DB_URL': 'sqlite:///%s' % tmpdir.join('jupyterhub.sqlite'),
|
||||
})
|
||||
p.start()
|
||||
request.addfinalizer(p.stop)
|
||||
@gen.coroutine
|
||||
def new_hub():
|
||||
|
||||
async def new_hub():
|
||||
kwargs = {}
|
||||
ssl_enabled = getattr(request.module, "ssl_enabled", False)
|
||||
if ssl_enabled:
|
||||
@@ -209,26 +206,27 @@ def test_resume_spawners(tmpdir, request):
|
||||
app = MockHub(test_clean_db=False, **kwargs)
|
||||
app.config.ConfigurableHTTPProxy.should_start = False
|
||||
app.config.ConfigurableHTTPProxy.auth_token = 'unused'
|
||||
yield app.initialize([])
|
||||
await app.initialize([])
|
||||
return app
|
||||
app = yield new_hub()
|
||||
|
||||
app = await new_hub()
|
||||
db = app.db
|
||||
# spawn a user's server
|
||||
name = 'kurt'
|
||||
user = add_user(db, app, name=name)
|
||||
yield user.spawn()
|
||||
await user.spawn()
|
||||
proc = user.spawner.proc
|
||||
assert proc is not None
|
||||
|
||||
# stop the Hub without cleaning up servers
|
||||
app.cleanup_servers = False
|
||||
yield app.stop()
|
||||
app.stop()
|
||||
|
||||
# proc is still running
|
||||
assert proc.poll() is None
|
||||
|
||||
# resume Hub, should still be running
|
||||
app = yield new_hub()
|
||||
app = await new_hub()
|
||||
db = app.db
|
||||
user = app.users[name]
|
||||
assert user.running
|
||||
@@ -236,7 +234,7 @@ def test_resume_spawners(tmpdir, request):
|
||||
|
||||
# stop the Hub without cleaning up servers
|
||||
app.cleanup_servers = False
|
||||
yield app.stop()
|
||||
app.stop()
|
||||
|
||||
# stop the server while the Hub is down. BAMF!
|
||||
proc.terminate()
|
||||
@@ -244,7 +242,7 @@ def test_resume_spawners(tmpdir, request):
|
||||
assert proc.poll() is not None
|
||||
|
||||
# resume Hub, should be stopped
|
||||
app = yield new_hub()
|
||||
app = await new_hub()
|
||||
db = app.db
|
||||
user = app.users[name]
|
||||
assert not user.running
|
||||
|
@@ -12,49 +12,47 @@ from requests import HTTPError
|
||||
from jupyterhub import auth, crypto, orm
|
||||
|
||||
from .mocking import MockPAMAuthenticator, MockStructGroup, MockStructPasswd
|
||||
from .test_api import add_user
|
||||
from .utils import add_user
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_pam_auth():
|
||||
|
||||
async def test_pam_auth():
|
||||
authenticator = MockPAMAuthenticator()
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'match',
|
||||
'password': 'match',
|
||||
})
|
||||
assert authorized['name'] == 'match'
|
||||
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'match',
|
||||
'password': 'nomatch',
|
||||
})
|
||||
assert authorized is None
|
||||
|
||||
# Account check is on by default for increased security
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'notallowedmatch',
|
||||
'password': 'notallowedmatch',
|
||||
})
|
||||
assert authorized is None
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_pam_auth_account_check_disabled():
|
||||
async def test_pam_auth_account_check_disabled():
|
||||
authenticator = MockPAMAuthenticator(check_account=False)
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'allowedmatch',
|
||||
'password': 'allowedmatch',
|
||||
})
|
||||
assert authorized['name'] == 'allowedmatch'
|
||||
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'notallowedmatch',
|
||||
'password': 'notallowedmatch',
|
||||
})
|
||||
assert authorized['name'] == 'notallowedmatch'
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_pam_auth_admin_groups():
|
||||
async def test_pam_auth_admin_groups():
|
||||
jh_users = MockStructGroup('jh_users', ['group_admin', 'also_group_admin', 'override_admin', 'non_admin'], 1234)
|
||||
jh_admins = MockStructGroup('jh_admins', ['group_admin'], 5678)
|
||||
wheel = MockStructGroup('wheel', ['also_group_admin'], 9999)
|
||||
@@ -67,10 +65,10 @@ def test_pam_auth_admin_groups():
|
||||
system_users = [group_admin, also_group_admin, override_admin, non_admin]
|
||||
|
||||
user_group_map = {
|
||||
'group_admin': [jh_users, jh_admins],
|
||||
'also_group_admin': [jh_users, wheel],
|
||||
'override_admin': [jh_users],
|
||||
'non_admin': [jh_users]
|
||||
'group_admin': [jh_users.gr_gid, jh_admins.gr_gid],
|
||||
'also_group_admin': [jh_users.gr_gid, wheel.gr_gid],
|
||||
'override_admin': [jh_users.gr_gid],
|
||||
'non_admin': [jh_users.gr_gid]
|
||||
}
|
||||
|
||||
def getgrnam(name):
|
||||
@@ -86,103 +84,100 @@ def test_pam_auth_admin_groups():
|
||||
admin_users={'override_admin'})
|
||||
|
||||
# Check admin_group applies as expected
|
||||
with mock.patch.multiple(auth,
|
||||
getgrnam=getgrnam,
|
||||
getpwnam=getpwnam,
|
||||
getgrouplist=getgrouplist):
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
with mock.patch.multiple(authenticator,
|
||||
_getgrnam=getgrnam,
|
||||
_getpwnam=getpwnam,
|
||||
_getgrouplist=getgrouplist):
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'group_admin',
|
||||
'password': 'group_admin'
|
||||
})
|
||||
assert authorized['name'] == 'group_admin'
|
||||
assert authorized['admin'] == True
|
||||
assert authorized['admin'] is True
|
||||
|
||||
# Check multiple groups work, just in case.
|
||||
with mock.patch.multiple(auth,
|
||||
getgrnam=getgrnam,
|
||||
getpwnam=getpwnam,
|
||||
getgrouplist=getgrouplist):
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
with mock.patch.multiple(authenticator,
|
||||
_getgrnam=getgrnam,
|
||||
_getpwnam=getpwnam,
|
||||
_getgrouplist=getgrouplist):
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'also_group_admin',
|
||||
'password': 'also_group_admin'
|
||||
})
|
||||
assert authorized['name'] == 'also_group_admin'
|
||||
assert authorized['admin'] == True
|
||||
assert authorized['admin'] is True
|
||||
|
||||
# Check admin_users still applies correctly
|
||||
with mock.patch.multiple(auth,
|
||||
getgrnam=getgrnam,
|
||||
getpwnam=getpwnam,
|
||||
getgrouplist=getgrouplist):
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
with mock.patch.multiple(authenticator,
|
||||
_getgrnam=getgrnam,
|
||||
_getpwnam=getpwnam,
|
||||
_getgrouplist=getgrouplist):
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'override_admin',
|
||||
'password': 'override_admin'
|
||||
})
|
||||
assert authorized['name'] == 'override_admin'
|
||||
assert authorized['admin'] == True
|
||||
assert authorized['admin'] is True
|
||||
|
||||
# Check it doesn't admin everyone
|
||||
with mock.patch.multiple(auth,
|
||||
getgrnam=getgrnam,
|
||||
getpwnam=getpwnam,
|
||||
getgrouplist=getgrouplist):
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
with mock.patch.multiple(authenticator,
|
||||
_getgrnam=getgrnam,
|
||||
_getpwnam=getpwnam,
|
||||
_getgrouplist=getgrouplist):
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'non_admin',
|
||||
'password': 'non_admin'
|
||||
})
|
||||
assert authorized['name'] == 'non_admin'
|
||||
assert authorized['admin'] == False
|
||||
assert authorized['admin'] is False
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_pam_auth_whitelist():
|
||||
async def test_pam_auth_whitelist():
|
||||
authenticator = MockPAMAuthenticator(whitelist={'wash', 'kaylee'})
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'kaylee',
|
||||
'password': 'kaylee',
|
||||
})
|
||||
assert authorized['name'] == 'kaylee'
|
||||
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'wash',
|
||||
'password': 'nomatch',
|
||||
})
|
||||
assert authorized is None
|
||||
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'mal',
|
||||
'password': 'mal',
|
||||
})
|
||||
assert authorized is None
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_pam_auth_group_whitelist():
|
||||
async def test_pam_auth_group_whitelist():
|
||||
def getgrnam(name):
|
||||
return MockStructGroup('grp', ['kaylee'])
|
||||
|
||||
authenticator = MockPAMAuthenticator(group_whitelist={'group'})
|
||||
|
||||
with mock.patch.object(auth, 'getgrnam', getgrnam):
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
with mock.patch.object(authenticator, '_getgrnam', getgrnam):
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'kaylee',
|
||||
'password': 'kaylee',
|
||||
})
|
||||
assert authorized['name'] == 'kaylee'
|
||||
|
||||
with mock.patch.object(auth, 'getgrnam', getgrnam):
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
with mock.patch.object(authenticator, '_getgrnam', getgrnam):
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'mal',
|
||||
'password': 'mal',
|
||||
})
|
||||
assert authorized is None
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_pam_auth_blacklist():
|
||||
async def test_pam_auth_blacklist():
|
||||
# Null case compared to next case
|
||||
authenticator = MockPAMAuthenticator()
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'wash',
|
||||
'password': 'wash',
|
||||
})
|
||||
@@ -190,7 +185,7 @@ def test_pam_auth_blacklist():
|
||||
|
||||
# Blacklist basics
|
||||
authenticator = MockPAMAuthenticator(blacklist={'wash'})
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'wash',
|
||||
'password': 'wash',
|
||||
})
|
||||
@@ -198,7 +193,7 @@ def test_pam_auth_blacklist():
|
||||
|
||||
# User in both white and blacklists: default deny. Make error someday?
|
||||
authenticator = MockPAMAuthenticator(blacklist={'wash'}, whitelist={'wash', 'kaylee'})
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'wash',
|
||||
'password': 'wash',
|
||||
})
|
||||
@@ -206,7 +201,7 @@ def test_pam_auth_blacklist():
|
||||
|
||||
# User not in blacklist can log in
|
||||
authenticator = MockPAMAuthenticator(blacklist={'wash'}, whitelist={'wash', 'kaylee'})
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'kaylee',
|
||||
'password': 'kaylee',
|
||||
})
|
||||
@@ -214,7 +209,7 @@ def test_pam_auth_blacklist():
|
||||
|
||||
# User in whitelist, blacklist irrelevant
|
||||
authenticator = MockPAMAuthenticator(blacklist={'mal'}, whitelist={'wash', 'kaylee'})
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'wash',
|
||||
'password': 'wash',
|
||||
})
|
||||
@@ -222,7 +217,7 @@ def test_pam_auth_blacklist():
|
||||
|
||||
# User in neither list
|
||||
authenticator = MockPAMAuthenticator(blacklist={'mal'}, whitelist={'wash', 'kaylee'})
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'simon',
|
||||
'password': 'simon',
|
||||
})
|
||||
@@ -230,15 +225,14 @@ def test_pam_auth_blacklist():
|
||||
|
||||
# blacklist == {}
|
||||
authenticator = MockPAMAuthenticator(blacklist=set(), whitelist={'wash', 'kaylee'})
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'kaylee',
|
||||
'password': 'kaylee',
|
||||
})
|
||||
assert authorized['name'] == 'kaylee'
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_deprecated_signatures():
|
||||
async def test_deprecated_signatures():
|
||||
deprecated_authenticator = MockPAMAuthenticator()
|
||||
|
||||
def deprecated_xlist(username):
|
||||
@@ -247,7 +241,7 @@ def test_deprecated_signatures():
|
||||
with mock.patch.multiple(deprecated_authenticator,
|
||||
check_whitelist=deprecated_xlist,
|
||||
check_blacklist=deprecated_xlist):
|
||||
authorized = yield deprecated_authenticator.get_authenticated_user(None, {
|
||||
authorized = await deprecated_authenticator.get_authenticated_user(None, {
|
||||
'username': 'test',
|
||||
'password': 'test'
|
||||
})
|
||||
@@ -255,27 +249,24 @@ def test_deprecated_signatures():
|
||||
assert authorized is not None
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_pam_auth_no_such_group():
|
||||
async def test_pam_auth_no_such_group():
|
||||
authenticator = MockPAMAuthenticator(group_whitelist={'nosuchcrazygroup'})
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'kaylee',
|
||||
'password': 'kaylee',
|
||||
})
|
||||
assert authorized is None
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_wont_add_system_user():
|
||||
async def test_wont_add_system_user():
|
||||
user = orm.User(name='lioness4321')
|
||||
authenticator = auth.PAMAuthenticator(whitelist={'mal'})
|
||||
authenticator.create_system_users = False
|
||||
with pytest.raises(KeyError):
|
||||
yield authenticator.add_user(user)
|
||||
await authenticator.add_user(user)
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_cant_add_system_user():
|
||||
async def test_cant_add_system_user():
|
||||
user = orm.User(name='lioness4321')
|
||||
authenticator = auth.PAMAuthenticator(whitelist={'mal'})
|
||||
authenticator.add_user_cmd = ['jupyterhub-fake-command']
|
||||
@@ -297,12 +288,11 @@ def test_cant_add_system_user():
|
||||
|
||||
with mock.patch.object(auth, 'Popen', DummyPopen):
|
||||
with pytest.raises(RuntimeError) as exc:
|
||||
yield authenticator.add_user(user)
|
||||
await authenticator.add_user(user)
|
||||
assert str(exc.value) == 'Failed to create system user lioness4321: dummy error'
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_add_system_user():
|
||||
async def test_add_system_user():
|
||||
user = orm.User(name='lioness4321')
|
||||
authenticator = auth.PAMAuthenticator(whitelist={'mal'})
|
||||
authenticator.create_system_users = True
|
||||
@@ -318,17 +308,16 @@ def test_add_system_user():
|
||||
return
|
||||
|
||||
with mock.patch.object(auth, 'Popen', DummyPopen):
|
||||
yield authenticator.add_user(user)
|
||||
await authenticator.add_user(user)
|
||||
assert record['cmd'] == ['echo', '/home/lioness4321', 'lioness4321']
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_delete_user():
|
||||
async def test_delete_user():
|
||||
user = orm.User(name='zoe')
|
||||
a = MockPAMAuthenticator(whitelist={'mal'})
|
||||
|
||||
assert 'zoe' not in a.whitelist
|
||||
yield a.add_user(user)
|
||||
await a.add_user(user)
|
||||
assert 'zoe' in a.whitelist
|
||||
a.delete_user(user)
|
||||
assert 'zoe' not in a.whitelist
|
||||
@@ -348,47 +337,43 @@ def test_handlers(app):
|
||||
assert handlers[0][0] == '/login'
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_auth_state(app, auth_state_enabled):
|
||||
async def test_auth_state(app, auth_state_enabled):
|
||||
"""auth_state enabled and available"""
|
||||
name = 'kiwi'
|
||||
user = add_user(app.db, app, name=name)
|
||||
assert user.encrypted_auth_state is None
|
||||
cookies = yield app.login_user(name)
|
||||
auth_state = yield user.get_auth_state()
|
||||
cookies = await app.login_user(name)
|
||||
auth_state = await user.get_auth_state()
|
||||
assert auth_state == app.authenticator.auth_state
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_auth_admin_non_admin(app):
|
||||
async def test_auth_admin_non_admin(app):
|
||||
"""admin should be passed through for non-admin users"""
|
||||
name = 'kiwi'
|
||||
user = add_user(app.db, app, name=name, admin=False)
|
||||
assert user.admin is False
|
||||
cookies = yield app.login_user(name)
|
||||
cookies = await app.login_user(name)
|
||||
assert user.admin is False
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_auth_admin_is_admin(app):
|
||||
async def test_auth_admin_is_admin(app):
|
||||
"""admin should be passed through for admin users"""
|
||||
# Admin user defined in MockPAMAuthenticator.
|
||||
name = 'admin'
|
||||
user = add_user(app.db, app, name=name, admin=False)
|
||||
assert user.admin is False
|
||||
cookies = yield app.login_user(name)
|
||||
cookies = await app.login_user(name)
|
||||
assert user.admin is True
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_auth_admin_retained_if_unset(app):
|
||||
async def test_auth_admin_retained_if_unset(app):
|
||||
"""admin should be unchanged if authenticator doesn't return admin value"""
|
||||
name = 'kiwi'
|
||||
# Add user as admin.
|
||||
user = add_user(app.db, app, name=name, admin=True)
|
||||
assert user.admin is True
|
||||
# User should remain unchanged.
|
||||
cookies = yield app.login_user(name)
|
||||
cookies = await app.login_user(name)
|
||||
assert user.admin is True
|
||||
|
||||
|
||||
@@ -402,56 +387,53 @@ def auth_state_unavailable(auth_state_enabled):
|
||||
yield
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_auth_state_disabled(app, auth_state_unavailable):
|
||||
async def test_auth_state_disabled(app, auth_state_unavailable):
|
||||
name = 'driebus'
|
||||
user = add_user(app.db, app, name=name)
|
||||
assert user.encrypted_auth_state is None
|
||||
with pytest.raises(HTTPError):
|
||||
cookies = yield app.login_user(name)
|
||||
auth_state = yield user.get_auth_state()
|
||||
cookies = await app.login_user(name)
|
||||
auth_state = await user.get_auth_state()
|
||||
assert auth_state is None
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_normalize_names():
|
||||
async def test_normalize_names():
|
||||
a = MockPAMAuthenticator()
|
||||
authorized = yield a.get_authenticated_user(None, {
|
||||
authorized = await a.get_authenticated_user(None, {
|
||||
'username': 'ZOE',
|
||||
'password': 'ZOE',
|
||||
})
|
||||
assert authorized['name'] == 'zoe'
|
||||
|
||||
authorized = yield a.get_authenticated_user(None, {
|
||||
authorized = await a.get_authenticated_user(None, {
|
||||
'username': 'Glenn',
|
||||
'password': 'Glenn',
|
||||
})
|
||||
assert authorized['name'] == 'glenn'
|
||||
|
||||
authorized = yield a.get_authenticated_user(None, {
|
||||
authorized = await a.get_authenticated_user(None, {
|
||||
'username': 'hExi',
|
||||
'password': 'hExi',
|
||||
})
|
||||
assert authorized['name'] == 'hexi'
|
||||
|
||||
authorized = yield a.get_authenticated_user(None, {
|
||||
authorized = await a.get_authenticated_user(None, {
|
||||
'username': 'Test',
|
||||
'password': 'Test',
|
||||
})
|
||||
assert authorized['name'] == 'test'
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_username_map():
|
||||
async def test_username_map():
|
||||
a = MockPAMAuthenticator(username_map={'wash': 'alpha'})
|
||||
authorized = yield a.get_authenticated_user(None, {
|
||||
authorized = await a.get_authenticated_user(None, {
|
||||
'username': 'WASH',
|
||||
'password': 'WASH',
|
||||
})
|
||||
|
||||
assert authorized['name'] == 'alpha'
|
||||
|
||||
authorized = yield a.get_authenticated_user(None, {
|
||||
authorized = await a.get_authenticated_user(None, {
|
||||
'username': 'Inara',
|
||||
'password': 'Inara',
|
||||
})
|
||||
|
jupyterhub/tests/test_auth_expiry.py (new file, 179 lines)
@@ -0,0 +1,179 @@
"""
test authentication expiry

authentication can expire in a number of ways:

- needs refresh and can be refreshed
- doesn't need refresh
- needs refresh and cannot be refreshed without new login
"""
|
||||
|
||||
import asyncio
|
||||
from contextlib import contextmanager
|
||||
from unittest import mock
|
||||
from urllib.parse import parse_qs, urlparse
|
||||
|
||||
import pytest
|
||||
|
||||
from .utils import api_request, get_page
|
||||
|
||||
|
||||
async def refresh_expired(authenticator, user):
|
||||
return None
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def disable_refresh(app):
|
||||
"""Fixture disabling auth refresh"""
|
||||
with mock.patch.object(app.authenticator, 'refresh_user', refresh_expired):
|
||||
yield
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def refresh_pre_spawn(app):
|
||||
"""Fixture enabling auth refresh pre spawn"""
|
||||
app.authenticator.refresh_pre_spawn = True
|
||||
try:
|
||||
yield
|
||||
finally:
|
||||
app.authenticator.refresh_pre_spawn = False
|
||||
|
||||
|
||||
async def test_auth_refresh_at_login(app, user):
|
||||
# auth_refreshed starts unset:
|
||||
assert not user._auth_refreshed
|
||||
# login sets auth_refreshed timestamp
|
||||
await app.login_user(user.name)
|
||||
assert user._auth_refreshed
|
||||
user._auth_refreshed -= 10
|
||||
before = user._auth_refreshed
|
||||
# login again updates auth_refreshed timestamp
|
||||
# even when auth is fresh
|
||||
await app.login_user(user.name)
|
||||
assert user._auth_refreshed > before
|
||||
|
||||
|
||||
async def test_auth_refresh_page(app, user):
|
||||
cookies = await app.login_user(user.name)
|
||||
assert user._auth_refreshed
|
||||
user._auth_refreshed -= 10
|
||||
before = user._auth_refreshed
|
||||
|
||||
# get a page with auth not expired
|
||||
# doesn't trigger refresh
|
||||
r = await get_page('home', app, cookies=cookies)
|
||||
assert r.status_code == 200
|
||||
assert user._auth_refreshed == before
|
||||
|
||||
# get a page with stale auth, refreshes auth
|
||||
user._auth_refreshed -= app.authenticator.auth_refresh_age
|
||||
r = await get_page('home', app, cookies=cookies)
|
||||
assert r.status_code == 200
|
||||
assert user._auth_refreshed > before
|
||||
|
||||
|
||||
async def test_auth_expired_page(app, user, disable_refresh):
|
||||
cookies = await app.login_user(user.name)
|
||||
assert user._auth_refreshed
|
||||
user._auth_refreshed -= 10
|
||||
before = user._auth_refreshed
|
||||
|
||||
# auth is fresh, doesn't trigger expiry
|
||||
r = await get_page('home', app, cookies=cookies)
|
||||
assert user._auth_refreshed == before
|
||||
assert r.status_code == 200
|
||||
|
||||
# get a page with stale auth, triggers expiry
|
||||
user._auth_refreshed -= app.authenticator.auth_refresh_age
|
||||
before = user._auth_refreshed
|
||||
r = await get_page('home', app, cookies=cookies, allow_redirects=False)
|
||||
|
||||
# verify that we redirect to login with ?next=requested page
|
||||
assert r.status_code == 302
|
||||
redirect_url = urlparse(r.headers['Location'])
|
||||
assert redirect_url.path.endswith('/login')
|
||||
query = parse_qs(redirect_url.query)
|
||||
assert query['next']
|
||||
next_url = urlparse(query['next'][0])
|
||||
assert next_url.path == urlparse(r.url).path
|
||||
|
||||
# make sure refresh didn't get updated
|
||||
assert user._auth_refreshed == before
|
||||
|
||||
|
||||
async def test_auth_expired_api(app, user, disable_refresh):
|
||||
cookies = await app.login_user(user.name)
|
||||
assert user._auth_refreshed
|
||||
user._auth_refreshed -= 10
|
||||
before = user._auth_refreshed
|
||||
|
||||
# auth is fresh, doesn't trigger expiry
|
||||
r = await api_request(app, 'users/' + user.name, name=user.name)
|
||||
assert user._auth_refreshed == before
|
||||
assert r.status_code == 200
|
||||
|
||||
# get a page with stale auth, triggers expiry
|
||||
user._auth_refreshed -= app.authenticator.auth_refresh_age
|
||||
r = await api_request(app, 'users/' + user.name, name=user.name)
|
||||
# api requests can't do login redirects
|
||||
assert r.status_code == 403
|
||||
|
||||
|
||||
async def test_refresh_pre_spawn(app, user, refresh_pre_spawn):
|
||||
cookies = await app.login_user(user.name)
|
||||
assert user._auth_refreshed
|
||||
user._auth_refreshed -= 10
|
||||
before = user._auth_refreshed
|
||||
|
||||
# auth is fresh, but should be forced to refresh by spawn
|
||||
r = await api_request(
|
||||
app, 'users/{}/server'.format(user.name), method='post', name=user.name
|
||||
)
|
||||
assert 200 <= r.status_code < 300
|
||||
assert user._auth_refreshed > before
|
||||
|
||||
|
||||
async def test_refresh_pre_spawn_expired(app, user, refresh_pre_spawn, disable_refresh):
|
||||
cookies = await app.login_user(user.name)
|
||||
assert user._auth_refreshed
|
||||
user._auth_refreshed -= 10
|
||||
before = user._auth_refreshed
|
||||
|
||||
# auth is fresh, doesn't trigger expiry
|
||||
r = await api_request(
|
||||
app, 'users/{}/server'.format(user.name), method='post', name=user.name
|
||||
)
|
||||
assert r.status_code == 403
|
||||
assert user._auth_refreshed == before
|
||||
|
||||
|
||||
async def test_refresh_pre_spawn_admin_request(
|
||||
app, user, admin_user, refresh_pre_spawn
|
||||
):
|
||||
await app.login_user(user.name)
|
||||
await app.login_user(admin_user.name)
|
||||
user._auth_refreshed -= 10
|
||||
before = user._auth_refreshed
|
||||
|
||||
# admin request, auth is fresh. Should still refresh user auth.
|
||||
r = await api_request(
|
||||
app, 'users', user.name, 'server', method='post', name=admin_user.name
|
||||
)
|
||||
assert 200 <= r.status_code < 300
|
||||
assert user._auth_refreshed > before
|
||||
|
||||
|
||||
async def test_refresh_pre_spawn_expired_admin_request(
|
||||
app, user, admin_user, refresh_pre_spawn, disable_refresh
|
||||
):
|
||||
await app.login_user(user.name)
|
||||
await app.login_user(admin_user.name)
|
||||
user._auth_refreshed -= 10
|
||||
|
||||
# auth needs refresh but can't without a new login; spawn should fail
|
||||
user._auth_refreshed -= app.authenticator.auth_refresh_age
|
||||
r = await api_request(
|
||||
app, 'users', user.name, 'server', method='post', name=admin_user.name
|
||||
)
|
||||
# api requests can't do login redirects
|
||||
assert r.status_code == 403
|
@@ -11,6 +11,7 @@ keys = [('%i' % i).encode('ascii') * 32 for i in range(3)]
|
||||
hex_keys = [ b2a_hex(key).decode('ascii') for key in keys ]
|
||||
b64_keys = [ b2a_base64(key).decode('ascii').strip() for key in keys ]
|
||||
|
||||
|
||||
@pytest.mark.parametrize("key_env, keys", [
|
||||
(hex_keys[0], [keys[0]]),
|
||||
(';'.join([b64_keys[0], hex_keys[1]]), keys[:2]),
|
||||
@@ -52,30 +53,27 @@ def crypt_keeper():
|
||||
ck.keys = save_keys
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_roundtrip(crypt_keeper):
|
||||
async def test_roundtrip(crypt_keeper):
|
||||
data = {'key': 'value'}
|
||||
encrypted = yield encrypt(data)
|
||||
decrypted = yield decrypt(encrypted)
|
||||
encrypted = await encrypt(data)
|
||||
decrypted = await decrypt(encrypted)
|
||||
assert decrypted == data
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_missing_crypto(crypt_keeper):
|
||||
async def test_missing_crypto(crypt_keeper):
|
||||
with patch.object(crypto, 'cryptography', None):
|
||||
with pytest.raises(crypto.CryptographyUnavailable):
|
||||
yield encrypt({})
|
||||
await encrypt({})
|
||||
|
||||
with pytest.raises(crypto.CryptographyUnavailable):
|
||||
yield decrypt(b'whatever')
|
||||
await decrypt(b'whatever')
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_missing_keys(crypt_keeper):
|
||||
async def test_missing_keys(crypt_keeper):
|
||||
crypt_keeper.keys = []
|
||||
with pytest.raises(crypto.NoEncryptionKeys):
|
||||
yield encrypt({})
|
||||
await encrypt({})
|
||||
|
||||
with pytest.raises(crypto.NoEncryptionKeys):
|
||||
yield decrypt(b'whatever')
|
||||
await decrypt(b'whatever')
|
||||
|
||||
|
@@ -40,8 +40,7 @@ def generate_old_db(env_dir, hub_version, db_url):
|
||||
'0.8.1',
|
||||
],
|
||||
)
|
||||
@pytest.mark.gen_test
|
||||
def test_upgrade(tmpdir, hub_version):
|
||||
async def test_upgrade(tmpdir, hub_version):
|
||||
db_url = os.getenv('JUPYTERHUB_TEST_DB_URL')
|
||||
if db_url:
|
||||
db_url += '_upgrade_' + hub_version.replace('.', '')
|
||||
@@ -69,7 +68,7 @@ def test_upgrade(tmpdir, hub_version):
|
||||
assert len(sqlite_files) == 1
|
||||
|
||||
upgradeapp = UpgradeDB(config=cfg)
|
||||
yield upgradeapp.initialize([])
|
||||
upgradeapp.initialize([])
|
||||
upgradeapp.start()
|
||||
|
||||
# check that backup was created:
|
||||
|
@@ -7,48 +7,45 @@ import pytest
|
||||
|
||||
from jupyterhub.auth import DummyAuthenticator
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_dummy_auth_without_global_password():
|
||||
async def test_dummy_auth_without_global_password():
|
||||
authenticator = DummyAuthenticator()
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'test_user',
|
||||
'password': 'test_pass',
|
||||
})
|
||||
assert authorized['name'] == 'test_user'
|
||||
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'test_user',
|
||||
'password': '',
|
||||
})
|
||||
assert authorized['name'] == 'test_user'
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_dummy_auth_without_username():
|
||||
async def test_dummy_auth_without_username():
|
||||
authenticator = DummyAuthenticator()
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': '',
|
||||
'password': 'test_pass',
|
||||
})
|
||||
assert authorized is None
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_dummy_auth_with_global_password():
|
||||
async def test_dummy_auth_with_global_password():
|
||||
authenticator = DummyAuthenticator()
|
||||
authenticator.password = "test_password"
|
||||
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'test_user',
|
||||
'password': 'test_password',
|
||||
})
|
||||
assert authorized['name'] == 'test_user'
|
||||
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'test_user',
|
||||
'password': 'qwerty',
|
||||
})
|
||||
assert authorized is None
|
||||
|
||||
authorized = yield authenticator.get_authenticated_user(None, {
|
||||
authorized = await authenticator.get_authenticated_user(None, {
|
||||
'username': 'some_other_user',
|
||||
'password': 'test_password',
|
||||
})
|
||||
|
@@ -39,39 +39,36 @@ def wait_for_spawner(spawner, timeout=10):
|
||||
yield wait()
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_connection_hub_wrong_certs(app):
|
||||
async def test_connection_hub_wrong_certs(app):
|
||||
"""Connecting to the internal hub url fails without correct certs"""
|
||||
with pytest.raises(SSLError):
|
||||
kwargs = {'verify': False}
|
||||
r = yield async_requests.get(app.hub.url, **kwargs)
|
||||
r = await async_requests.get(app.hub.url, **kwargs)
|
||||
r.raise_for_status()
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_connection_proxy_api_wrong_certs(app):
|
||||
async def test_connection_proxy_api_wrong_certs(app):
|
||||
"""Connecting to the proxy api fails without correct certs"""
|
||||
with pytest.raises(SSLError):
|
||||
kwargs = {'verify': False}
|
||||
r = yield async_requests.get(app.proxy.api_url, **kwargs)
|
||||
r = await async_requests.get(app.proxy.api_url, **kwargs)
|
||||
r.raise_for_status()
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_connection_notebook_wrong_certs(app):
|
||||
async def test_connection_notebook_wrong_certs(app):
|
||||
"""Connecting to a notebook fails without correct certs"""
|
||||
with mock.patch.dict(
|
||||
app.config.LocalProcessSpawner,
|
||||
{'cmd': [sys.executable, '-m', 'jupyterhub.tests.mocksu']}
|
||||
):
|
||||
user = add_user(app.db, app, name='foo')
|
||||
yield user.spawn()
|
||||
yield wait_for_spawner(user.spawner)
|
||||
await user.spawn()
|
||||
await wait_for_spawner(user.spawner)
|
||||
spawner = user.spawner
|
||||
status = yield spawner.poll()
|
||||
status = await spawner.poll()
|
||||
assert status is None
|
||||
|
||||
with pytest.raises(SSLError):
|
||||
kwargs = {'verify': False}
|
||||
r = yield async_requests.get(spawner.server.url, **kwargs)
|
||||
r = await async_requests.get(spawner.server.url, **kwargs)
|
||||
r.raise_for_status()
|
||||
|
@@ -13,20 +13,19 @@ from .utils import async_requests
|
||||
@pytest.fixture
|
||||
def named_servers(app):
|
||||
with mock.patch.dict(app.tornado_settings,
|
||||
{'allow_named_servers': True}):
|
||||
{'allow_named_servers': True, 'named_server_limit_per_user': 2}):
|
||||
yield
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_default_server(app, named_servers):
|
||||
async def test_default_server(app, named_servers):
|
||||
"""Test the default /users/:user/server handler when named servers are enabled"""
|
||||
username = 'rosie'
|
||||
user = add_user(app.db, app, name=username)
|
||||
r = yield api_request(app, 'users', username, 'server', method='post')
|
||||
r = await api_request(app, 'users', username, 'server', method='post')
|
||||
assert r.status_code == 201
|
||||
assert r.text == ''
|
||||
|
||||
r = yield api_request(app, 'users', username)
|
||||
r = await api_request(app, 'users', username)
|
||||
r.raise_for_status()
|
||||
|
||||
user_model = normalize_user(r.json())
|
||||
@@ -51,11 +50,11 @@ def test_default_server(app, named_servers):
|
||||
|
||||
|
||||
# now stop the server
|
||||
r = yield api_request(app, 'users', username, 'server', method='delete')
|
||||
r = await api_request(app, 'users', username, 'server', method='delete')
|
||||
assert r.status_code == 204
|
||||
assert r.text == ''
|
||||
|
||||
r = yield api_request(app, 'users', username)
|
||||
r = await api_request(app, 'users', username)
|
||||
r.raise_for_status()
|
||||
|
||||
user_model = normalize_user(r.json())
|
||||
@@ -66,20 +65,19 @@ def test_default_server(app, named_servers):
|
||||
})
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_create_named_server(app, named_servers):
|
||||
async def test_create_named_server(app, named_servers):
|
||||
username = 'walnut'
|
||||
user = add_user(app.db, app, name=username)
|
||||
# assert user.allow_named_servers == True
|
||||
cookies = yield app.login_user(username)
|
||||
cookies = await app.login_user(username)
|
||||
servername = 'trevor'
|
||||
r = yield api_request(app, 'users', username, 'servers', servername, method='post')
|
||||
r = await api_request(app, 'users', username, 'servers', servername, method='post')
|
||||
r.raise_for_status()
|
||||
assert r.status_code == 201
|
||||
assert r.text == ''
|
||||
|
||||
url = url_path_join(public_url(app, user), servername, 'env')
|
||||
r = yield async_requests.get(url, cookies=cookies)
|
||||
r = await async_requests.get(url, cookies=cookies)
|
||||
r.raise_for_status()
|
||||
assert r.url == url
|
||||
env = r.json()
|
||||
@@ -87,7 +85,7 @@ def test_create_named_server(app, named_servers):
|
||||
assert prefix == user.spawners[servername].server.base_url
|
||||
assert prefix.endswith('/user/%s/%s/' % (username, servername))
|
||||
|
||||
r = yield api_request(app, 'users', username)
|
||||
r = await api_request(app, 'users', username)
|
||||
r.raise_for_status()
|
||||
|
||||
user_model = normalize_user(r.json())
|
||||
@@ -111,22 +109,21 @@ def test_create_named_server(app, named_servers):
|
||||
})
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_delete_named_server(app, named_servers):
|
||||
async def test_delete_named_server(app, named_servers):
|
||||
username = 'donaar'
|
||||
user = add_user(app.db, app, name=username)
|
||||
assert user.allow_named_servers
|
||||
cookies = app.login_user(username)
|
||||
servername = 'splugoth'
|
||||
r = yield api_request(app, 'users', username, 'servers', servername, method='post')
|
||||
r = await api_request(app, 'users', username, 'servers', servername, method='post')
|
||||
r.raise_for_status()
|
||||
assert r.status_code == 201
|
||||
|
||||
r = yield api_request(app, 'users', username, 'servers', servername, method='delete')
|
||||
r = await api_request(app, 'users', username, 'servers', servername, method='delete')
|
||||
r.raise_for_status()
|
||||
assert r.status_code == 204
|
||||
|
||||
r = yield api_request(app, 'users', username)
|
||||
r = await api_request(app, 'users', username)
|
||||
r.raise_for_status()
|
||||
|
||||
user_model = normalize_user(r.json())
|
||||
@@ -140,7 +137,7 @@ def test_delete_named_server(app, named_servers):
|
||||
# low-level record still exists
|
||||
assert servername in user.orm_spawners
|
||||
|
||||
r = yield api_request(
|
||||
r = await api_request(
|
||||
app, 'users', username, 'servers', servername,
|
||||
method='delete',
|
||||
data=json.dumps({'remove': True}),
|
||||
@@ -151,11 +148,56 @@ def test_delete_named_server(app, named_servers):
|
||||
assert servername not in user.orm_spawners
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_named_server_disabled(app):
|
||||
async def test_named_server_disabled(app):
|
||||
username = 'user'
|
||||
servername = 'okay'
|
||||
r = yield api_request(app, 'users', username, 'servers', servername, method='post')
|
||||
r = await api_request(app, 'users', username, 'servers', servername, method='post')
|
||||
assert r.status_code == 400
|
||||
r = yield api_request(app, 'users', username, 'servers', servername, method='delete')
|
||||
r = await api_request(app, 'users', username, 'servers', servername, method='delete')
|
||||
assert r.status_code == 400
|
||||
|
||||
|
||||
async def test_named_server_limit(app, named_servers):
|
||||
username = 'foo'
|
||||
user = add_user(app.db, app, name=username)
|
||||
cookies = await app.login_user(username)
|
||||
|
||||
# Create 1st named server
|
||||
servername1 = 'bar-1'
|
||||
r = await api_request(app, 'users', username, 'servers', servername1, method='post')
|
||||
r.raise_for_status()
|
||||
assert r.status_code == 201
|
||||
assert r.text == ''
|
||||
|
||||
# Create 2nd named server
|
||||
servername2 = 'bar-2'
|
||||
r = await api_request(app, 'users', username, 'servers', servername2, method='post')
|
||||
r.raise_for_status()
|
||||
assert r.status_code == 201
|
||||
assert r.text == ''
|
||||
|
||||
# Create 3rd named server
|
||||
servername3 = 'bar-3'
|
||||
r = await api_request(app, 'users', username, 'servers', servername3, method='post')
|
||||
assert r.status_code == 400
|
||||
assert r.json() == {"status": 400, "message": "User foo already has the maximum of 2 named servers. One must be deleted before a new server can be created"}
|
||||
|
||||
# Create default server
|
||||
r = await api_request(app, 'users', username, 'server', method='post')
|
||||
assert r.status_code == 201
|
||||
assert r.text == ''
|
||||
|
||||
# Delete 1st named server
|
||||
r = await api_request(
|
||||
app, 'users', username, 'servers', servername1,
|
||||
method='delete',
|
||||
data=json.dumps({'remove': True}),
|
||||
)
|
||||
r.raise_for_status()
|
||||
assert r.status_code == 204
|
||||
|
||||
# Create 3rd named server again
|
||||
r = await api_request(app, 'users', username, 'servers', servername3, method='post')
|
||||
r.raise_for_status()
|
||||
assert r.status_code == 201
|
||||
assert r.text == ''
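
The 2-server cap exercised above comes from the named_server_limit_per_user value patched into tornado_settings by the named_servers fixture; a hedged sketch of the equivalent deployment configuration:

# jupyterhub_config.py (sketch, not part of the diff)
c.JupyterHub.allow_named_servers = True
c.JupyterHub.named_server_limit_per_user = 2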
|
||||
|
@@ -205,8 +205,7 @@ def test_token_find(db):
|
||||
assert found is None
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_spawn_fails(db):
|
||||
async def test_spawn_fails(db):
|
||||
orm_user = orm.User(name='aeofel')
|
||||
db.add(orm_user)
|
||||
db.commit()
|
||||
@@ -223,7 +222,7 @@ def test_spawn_fails(db):
|
||||
})
|
||||
|
||||
with pytest.raises(RuntimeError) as exc:
|
||||
yield user.spawn()
|
||||
await user.spawn()
|
||||
assert user.spawners[''].server is None
|
||||
assert not user.running
|
||||
|
||||
@@ -246,8 +245,7 @@ def test_groups(db):
|
||||
assert group.users == []
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_auth_state(db):
|
||||
async def test_auth_state(db):
|
||||
orm_user = orm.User(name='eve')
|
||||
db.add(orm_user)
|
||||
db.commit()
|
||||
@@ -262,51 +260,51 @@ def test_auth_state(db):
|
||||
state = {'key': 'value'}
|
||||
ck.keys = []
|
||||
with pytest.raises(crypto.EncryptionUnavailable):
|
||||
yield user.save_auth_state(state)
|
||||
await user.save_auth_state(state)
|
||||
|
||||
assert user.encrypted_auth_state is None
|
||||
# saving/loading None doesn't require keys
|
||||
yield user.save_auth_state(None)
|
||||
current = yield user.get_auth_state()
|
||||
await user.save_auth_state(None)
|
||||
current = await user.get_auth_state()
|
||||
assert current is None
|
||||
|
||||
first_key = os.urandom(32)
|
||||
second_key = os.urandom(32)
|
||||
ck.keys = [first_key]
|
||||
yield user.save_auth_state(state)
|
||||
await user.save_auth_state(state)
|
||||
assert user.encrypted_auth_state is not None
|
||||
decrypted_state = yield user.get_auth_state()
|
||||
decrypted_state = await user.get_auth_state()
|
||||
assert decrypted_state == state
|
||||
|
||||
# can't read auth_state without keys
|
||||
ck.keys = []
|
||||
auth_state = yield user.get_auth_state()
|
||||
auth_state = await user.get_auth_state()
|
||||
assert auth_state is None
|
||||
|
||||
# key rotation works
|
||||
db.rollback()
|
||||
ck.keys = [second_key, first_key]
|
||||
decrypted_state = yield user.get_auth_state()
|
||||
decrypted_state = await user.get_auth_state()
|
||||
assert decrypted_state == state
|
||||
|
||||
new_state = {'key': 'newvalue'}
|
||||
yield user.save_auth_state(new_state)
|
||||
await user.save_auth_state(new_state)
|
||||
db.commit()
|
||||
|
||||
ck.keys = [first_key]
|
||||
db.rollback()
|
||||
# can't read anymore with new-key after encrypting with second-key
|
||||
decrypted_state = yield user.get_auth_state()
|
||||
decrypted_state = await user.get_auth_state()
|
||||
assert decrypted_state is None
|
||||
|
||||
yield user.save_auth_state(new_state)
|
||||
decrypted_state = yield user.get_auth_state()
|
||||
await user.save_auth_state(new_state)
|
||||
decrypted_state = await user.get_auth_state()
|
||||
assert decrypted_state == new_state
|
||||
|
||||
ck.keys = []
|
||||
db.rollback()
|
||||
|
||||
decrypted_state = yield user.get_auth_state()
|
||||
decrypted_state = await user.get_auth_state()
|
||||
assert decrypted_state is None
|
||||
|
||||
|
||||
|
@@ -1,6 +1,8 @@
"""Tests for HTML pages"""

import asyncio
import sys
from unittest import mock
from urllib.parse import urlencode, urlparse

from bs4 import BeautifulSoup
@@ -12,108 +14,94 @@ from ..utils import url_path_join as ujoin
from .. import orm
from ..auth import Authenticator

import mock
import pytest

from .mocking import FormSpawner, public_url, public_host
from .test_api import api_request, add_user
from .utils import async_requests
from .mocking import FormSpawner, FalsyCallableFormSpawner
from .utils import (
async_requests,
api_request,
add_user,
get_page,
public_url,
public_host,
)


def get_page(path, app, hub=True, **kw):
if hub:
prefix = app.hub.base_url
else:
prefix = app.base_url
base_url = ujoin(public_host(app), prefix)
return async_requests.get(ujoin(base_url, path), **kw)


@pytest.mark.gen_test
def test_root_no_auth(app):
async def test_root_no_auth(app):
url = ujoin(public_host(app), app.hub.base_url)
r = yield async_requests.get(url)
r = await async_requests.get(url)
r.raise_for_status()
assert r.url == ujoin(url, 'login')


@pytest.mark.gen_test
def test_root_auth(app):
cookies = yield app.login_user('river')
r = yield async_requests.get(public_url(app), cookies=cookies)
async def test_root_auth(app):
cookies = await app.login_user('river')
r = await async_requests.get(public_url(app), cookies=cookies)
r.raise_for_status()
assert r.url.startswith(public_url(app, app.users['river']))


@pytest.mark.gen_test
def test_root_redirect(app):
async def test_root_redirect(app):
name = 'wash'
cookies = yield app.login_user(name)
cookies = await app.login_user(name)
next_url = ujoin(app.base_url, 'user/other/test.ipynb')
url = '/?' + urlencode({'next': next_url})
r = yield get_page(url, app, cookies=cookies)
r = await get_page(url, app, cookies=cookies)
r.raise_for_status()
path = urlparse(r.url).path
assert path == ujoin(app.base_url, 'user/%s/test.ipynb' % name)


@pytest.mark.gen_test
def test_root_default_url_noauth(app):
async def test_root_default_url_noauth(app):
with mock.patch.dict(app.tornado_settings,
{'default_url': '/foo/bar'}):
r = yield get_page('/', app, allow_redirects=False)
r = await get_page('/', app, allow_redirects=False)
r.raise_for_status()
url = r.headers.get('Location', '')
path = urlparse(url).path
assert path == '/foo/bar'


@pytest.mark.gen_test
def test_root_default_url_auth(app):
async def test_root_default_url_auth(app):
name = 'wash'
cookies = yield app.login_user(name)
cookies = await app.login_user(name)
with mock.patch.dict(app.tornado_settings,
{'default_url': '/foo/bar'}):
r = yield get_page('/', app, cookies=cookies, allow_redirects=False)
r = await get_page('/', app, cookies=cookies, allow_redirects=False)
r.raise_for_status()
url = r.headers.get('Location', '')
path = urlparse(url).path
assert path == '/foo/bar'


@pytest.mark.gen_test
def test_home_no_auth(app):
r = yield get_page('home', app, allow_redirects=False)
async def test_home_no_auth(app):
r = await get_page('home', app, allow_redirects=False)
r.raise_for_status()
assert r.status_code == 302
assert '/hub/login' in r.headers['Location']


@pytest.mark.gen_test
def test_home_auth(app):
cookies = yield app.login_user('river')
r = yield get_page('home', app, cookies=cookies)
async def test_home_auth(app):
cookies = await app.login_user('river')
r = await get_page('home', app, cookies=cookies)
r.raise_for_status()
assert r.url.endswith('home')


@pytest.mark.gen_test
def test_admin_no_auth(app):
r = yield get_page('admin', app)
async def test_admin_no_auth(app):
r = await get_page('admin', app)
assert r.status_code == 403


@pytest.mark.gen_test
def test_admin_not_admin(app):
cookies = yield app.login_user('wash')
r = yield get_page('admin', app, cookies=cookies)
async def test_admin_not_admin(app):
cookies = await app.login_user('wash')
r = await get_page('admin', app, cookies=cookies)
assert r.status_code == 403


@pytest.mark.gen_test
def test_admin(app):
cookies = yield app.login_user('admin')
r = yield get_page('admin', app, cookies=cookies, allow_redirects=False)
async def test_admin(app):
cookies = await app.login_user('admin')
r = await get_page('admin', app, cookies=cookies, allow_redirects=False)
r.raise_for_status()
assert r.url.endswith('/admin')

@@ -124,127 +112,127 @@ def test_admin(app):
'admin',
'name',
])
@pytest.mark.gen_test
def test_admin_sort(app, sort):
cookies = yield app.login_user('admin')
r = yield get_page('admin?sort=%s' % sort, app, cookies=cookies)
async def test_admin_sort(app, sort):
cookies = await app.login_user('admin')
r = await get_page('admin?sort=%s' % sort, app, cookies=cookies)
r.raise_for_status()
assert r.status_code == 200


@pytest.mark.gen_test
def test_spawn_redirect(app):
async def test_spawn_redirect(app):
name = 'wash'
cookies = yield app.login_user(name)
cookies = await app.login_user(name)
u = app.users[orm.User.find(app.db, name)]

status = yield u.spawner.poll()
status = await u.spawner.poll()
assert status is not None

# test spawn page when no server is running
r = yield get_page('spawn', app, cookies=cookies)
r = await get_page('spawn', app, cookies=cookies)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
assert path == ujoin(app.base_url, 'user/%s/' % name)

# should have started server
status = yield u.spawner.poll()
status = await u.spawner.poll()
assert status is None

# test spawn page when server is already running (just redirect)
r = yield get_page('spawn', app, cookies=cookies)
r = await get_page('spawn', app, cookies=cookies)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
assert path == ujoin(app.base_url, '/user/%s/' % name)

# stop server to ensure /user/name is handled by the Hub
r = yield api_request(app, 'users', name, 'server', method='delete', cookies=cookies)
r = await api_request(app, 'users', name, 'server', method='delete', cookies=cookies)
r.raise_for_status()

# test handing of trailing slash on `/user/name`
r = yield get_page('user/' + name, app, hub=False, cookies=cookies)
r = await get_page('user/' + name, app, hub=False, cookies=cookies)
r.raise_for_status()
path = urlparse(r.url).path
assert path == ujoin(app.base_url, '/user/%s/' % name)


@pytest.mark.gen_test
def test_spawn_handler_access(app):
async def test_spawn_handler_access(app):
name = 'winston'
cookies = yield app.login_user(name)
cookies = await app.login_user(name)
u = app.users[orm.User.find(app.db, name)]

status = yield u.spawner.poll()
status = await u.spawner.poll()
assert status is not None

# spawn server via browser link with ?arg=value
r = yield get_page('spawn', app, cookies=cookies, params={'arg': 'value'})
r = await get_page('spawn', app, cookies=cookies, params={'arg': 'value'})
r.raise_for_status()

# verify that request params got passed down
# implemented in MockSpawner
r = yield async_requests.get(ujoin(public_url(app, u), 'env'))
r = await async_requests.get(ujoin(public_url(app, u), 'env'))
env = r.json()
assert 'HANDLER_ARGS' in env
assert env['HANDLER_ARGS'] == 'arg=value'

# stop server
r = yield api_request(app, 'users', name, 'server', method='delete')
r = await api_request(app, 'users', name, 'server', method='delete')
r.raise_for_status()


@pytest.mark.gen_test
def test_spawn_admin_access(app, admin_access):
async def test_spawn_admin_access(app, admin_access):
"""GET /user/:name as admin with admin-access spawns user's server"""
cookies = yield app.login_user('admin')
cookies = await app.login_user('admin')
name = 'mariel'
user = add_user(app.db, app=app, name=name)
app.db.commit()
r = yield get_page('user/' + name, app, cookies=cookies)
r = await get_page('user/' + name, app, cookies=cookies)
r.raise_for_status()
assert (r.url.split('?')[0] + '/').startswith(public_url(app, user))
r = yield get_page('user/{}/env'.format(name), app, hub=False, cookies=cookies)
r = await get_page('user/{}/env'.format(name), app, hub=False, cookies=cookies)
r.raise_for_status()
env = r.json()
assert env['JUPYTERHUB_USER'] == name


@pytest.mark.gen_test
def test_spawn_page(app):
async def test_spawn_page(app):
with mock.patch.dict(app.users.settings, {'spawner_class': FormSpawner}):
cookies = yield app.login_user('jones')
r = yield get_page('spawn', app, cookies=cookies)
cookies = await app.login_user('jones')
r = await get_page('spawn', app, cookies=cookies)
assert r.url.endswith('/spawn')
assert FormSpawner.options_form in r.text

r = yield get_page('spawn?next=foo', app, cookies=cookies)
r = await get_page('spawn?next=foo', app, cookies=cookies)
assert r.url.endswith('/spawn?next=foo')
assert FormSpawner.options_form in r.text


@pytest.mark.gen_test
def test_spawn_page_admin(app, admin_access):
async def test_spawn_page_falsy_callable(app):
with mock.patch.dict(app.users.settings, {'spawner_class': FalsyCallableFormSpawner}):
cookies = await app.login_user('erik')
r = await get_page('spawn', app, cookies=cookies)
assert 'user/erik' in r.url


async def test_spawn_page_admin(app, admin_access):
with mock.patch.dict(app.users.settings, {'spawner_class': FormSpawner}):
cookies = yield app.login_user('admin')
cookies = await app.login_user('admin')
u = add_user(app.db, app=app, name='melanie')
r = yield get_page('spawn/' + u.name, app, cookies=cookies)
r = await get_page('spawn/' + u.name, app, cookies=cookies)
assert r.url.endswith('/spawn/' + u.name)
assert FormSpawner.options_form in r.text
assert "Spawning server for {}".format(u.name) in r.text


@pytest.mark.gen_test
def test_spawn_form(app):
async def test_spawn_form(app):
with mock.patch.dict(app.users.settings, {'spawner_class': FormSpawner}):
base_url = ujoin(public_host(app), app.hub.base_url)
cookies = yield app.login_user('jones')
cookies = await app.login_user('jones')
orm_u = orm.User.find(app.db, 'jones')
u = app.users[orm_u]
yield u.stop()
await u.stop()
next_url = ujoin(app.base_url, 'user/jones/tree')
r = yield async_requests.post(
r = await async_requests.post(
url_concat(ujoin(base_url, 'spawn'), {'next': next_url}),
cookies=cookies,
data={'bounds': ['-1', '1'], 'energy': '511keV'},
@@ -258,15 +246,14 @@ def test_spawn_form(app):
}


@pytest.mark.gen_test
def test_spawn_form_admin_access(app, admin_access):
async def test_spawn_form_admin_access(app, admin_access):
with mock.patch.dict(app.tornado_settings, {'spawner_class': FormSpawner}):
base_url = ujoin(public_host(app), app.hub.base_url)
cookies = yield app.login_user('admin')
cookies = await app.login_user('admin')
u = add_user(app.db, app=app, name='martha')
next_url = ujoin(app.base_url, 'user', u.name, 'tree')

r = yield async_requests.post(
r = await async_requests.post(
url_concat(ujoin(base_url, 'spawn', u.name), {'next': next_url}),
cookies=cookies,
data={'bounds': ['-3', '3'], 'energy': '938MeV'},
@@ -281,16 +268,15 @@ def test_spawn_form_admin_access(app, admin_access):
}


@pytest.mark.gen_test
def test_spawn_form_with_file(app):
async def test_spawn_form_with_file(app):
with mock.patch.dict(app.tornado_settings, {'spawner_class': FormSpawner}):
base_url = ujoin(public_host(app), app.hub.base_url)
cookies = yield app.login_user('jones')
cookies = await app.login_user('jones')
orm_u = orm.User.find(app.db, 'jones')
u = app.users[orm_u]
yield u.stop()
await u.stop()

r = yield async_requests.post(ujoin(base_url, 'spawn'),
r = await async_requests.post(ujoin(base_url, 'spawn'),
cookies=cookies,
data={
'bounds': ['-1', '1'],
@@ -309,12 +295,11 @@ def test_spawn_form_with_file(app):
}


@pytest.mark.gen_test
def test_user_redirect(app):
async def test_user_redirect(app):
name = 'wash'
cookies = yield app.login_user(name)
cookies = await app.login_user(name)

r = yield get_page('/user-redirect/tree/top/', app)
r = await get_page('/user-redirect/tree/top/', app)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
@@ -324,32 +309,31 @@ def test_user_redirect(app):
'next': ujoin(app.hub.base_url, '/user-redirect/tree/top/')
})

r = yield get_page('/user-redirect/notebooks/test.ipynb', app, cookies=cookies)
r = await get_page('/user-redirect/notebooks/test.ipynb', app, cookies=cookies)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
assert path == ujoin(app.base_url, '/user/%s/notebooks/test.ipynb' % name)


@pytest.mark.gen_test
def test_user_redirect_deprecated(app):
async def test_user_redirect_deprecated(app):
"""redirecting from /user/someonelse/ URLs (deprecated)"""
name = 'wash'
cookies = yield app.login_user(name)
cookies = await app.login_user(name)

r = yield get_page('/user/baduser', app, cookies=cookies, hub=False)
r = await get_page('/user/baduser', app, cookies=cookies, hub=False)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
assert path == ujoin(app.base_url, '/user/%s/' % name)

r = yield get_page('/user/baduser/test.ipynb', app, cookies=cookies, hub=False)
r = await get_page('/user/baduser/test.ipynb', app, cookies=cookies, hub=False)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
assert path == ujoin(app.base_url, '/user/%s/test.ipynb' % name)

r = yield get_page('/user/baduser/test.ipynb', app, hub=False)
r = await get_page('/user/baduser/test.ipynb', app, hub=False)
r.raise_for_status()
print(urlparse(r.url))
path = urlparse(r.url).path
@@ -360,11 +344,10 @@ def test_user_redirect_deprecated(app):
})


@pytest.mark.gen_test
def test_login_fail(app):
async def test_login_fail(app):
name = 'wash'
base_url = public_url(app)
r = yield async_requests.post(base_url + 'hub/login',
r = await async_requests.post(base_url + 'hub/login',
data={
'username': name,
'password': 'wrong',
@@ -374,8 +357,7 @@ def test_login_fail(app):
assert not r.cookies


@pytest.mark.gen_test
def test_login_strip(app):
async def test_login_strip(app):
"""Test that login form doesn't strip whitespace from passwords"""
form_data = {
'username': 'spiff',
@@ -388,7 +370,7 @@ def test_login_strip(app):
called_with.append(data)

with mock.patch.object(app.authenticator, 'authenticate', mock_authenticate):
yield async_requests.post(base_url + 'hub/login',
await async_requests.post(base_url + 'hub/login',
data=form_data,
allow_redirects=False,
)
@@ -414,9 +396,8 @@ def test_login_strip(app):
(False, '//other.domain', ''),
]
)
@pytest.mark.gen_test
def test_login_redirect(app, running, next_url, location):
cookies = yield app.login_user('river')
async def test_login_redirect(app, running, next_url, location):
cookies = await app.login_user('river')
user = app.users['river']
if location:
location = ujoin(app.base_url, location)
@@ -432,18 +413,17 @@ def test_login_redirect(app, running, next_url, location):

if running and not user.active:
# ensure running
yield user.spawn()
await user.spawn()
elif user.active and not running:
# ensure not running
yield user.stop()
r = yield get_page(url, app, cookies=cookies, allow_redirects=False)
await user.stop()
r = await get_page(url, app, cookies=cookies, allow_redirects=False)
r.raise_for_status()
assert r.status_code == 302
assert location == r.headers['Location']


@pytest.mark.gen_test
def test_auto_login(app, request):
async def test_auto_login(app, request):
class DummyLoginHandler(BaseHandler):
def get(self):
self.write('ok!')
@@ -452,7 +432,7 @@ def test_auto_login(app, request):
(ujoin(app.hub.base_url, 'dummy'), DummyLoginHandler),
])
# no auto_login: end up at /hub/login
r = yield async_requests.get(base_url)
r = await async_requests.get(base_url)
assert r.url == public_url(app, path='hub/login')
# enable auto_login: redirect from /hub/login to /hub/dummy
authenticator = Authenticator(auto_login=True)
@@ -461,78 +441,118 @@ def test_auto_login(app, request):
with mock.patch.dict(app.tornado_settings, {
'authenticator': authenticator,
}):
r = yield async_requests.get(base_url)
r = await async_requests.get(base_url)
assert r.url == public_url(app, path='hub/dummy')

@pytest.mark.gen_test
def test_auto_login_logout(app):

async def test_auto_login_logout(app):
name = 'burnham'
cookies = yield app.login_user(name)
cookies = await app.login_user(name)

with mock.patch.dict(app.tornado_settings, {
'authenticator': Authenticator(auto_login=True),
}):
r = yield async_requests.get(public_host(app) + app.tornado_settings['logout_url'], cookies=cookies)
r = await async_requests.get(public_host(app) + app.tornado_settings['logout_url'], cookies=cookies)
r.raise_for_status()
logout_url = public_host(app) + app.tornado_settings['logout_url']
assert r.url == logout_url
assert r.cookies == {}

@pytest.mark.gen_test
def test_logout(app):

async def test_logout(app):
name = 'wash'
cookies = yield app.login_user(name)
r = yield async_requests.get(public_host(app) + app.tornado_settings['logout_url'], cookies=cookies)
cookies = await app.login_user(name)
r = await async_requests.get(public_host(app) + app.tornado_settings['logout_url'], cookies=cookies)
r.raise_for_status()
login_url = public_host(app) + app.tornado_settings['login_url']
assert r.url == login_url
assert r.cookies == {}


@pytest.mark.gen_test
def test_login_no_whitelist_adds_user(app):
@pytest.mark.parametrize('shutdown_on_logout', [True, False])
async def test_shutdown_on_logout(app, shutdown_on_logout):
name = 'shutitdown'
cookies = await app.login_user(name)
user = app.users[name]

# start the user's server
await user.spawn()
spawner = user.spawner

# wait for any pending state to resolve
for i in range(50):
if not spawner.pending:
break
await asyncio.sleep(0.1)
else:
assert False, "Spawner still pending"
assert spawner.active

# logout
with mock.patch.dict(app.tornado_settings, {
'shutdown_on_logout': shutdown_on_logout,
}):
r = await async_requests.get(
public_host(app) + app.tornado_settings['logout_url'],
cookies=cookies,
)
r.raise_for_status()

login_url = public_host(app) + app.tornado_settings['login_url']
assert r.url == login_url
assert r.cookies == {}

# wait for any pending state to resolve
for i in range(50):
if not spawner.pending:
break
await asyncio.sleep(0.1)
else:
assert False, "Spawner still pending"

assert spawner.ready == (not shutdown_on_logout)


async def test_login_no_whitelist_adds_user(app):
auth = app.authenticator
mock_add_user = mock.Mock()
with mock.patch.object(auth, 'add_user', mock_add_user):
cookies = yield app.login_user('jubal')
cookies = await app.login_user('jubal')

user = app.users['jubal']
assert mock_add_user.mock_calls == [mock.call(user)]


@pytest.mark.gen_test
def test_static_files(app):
async def test_static_files(app):
base_url = ujoin(public_host(app), app.hub.base_url)
r = yield async_requests.get(ujoin(base_url, 'logo'))
r = await async_requests.get(ujoin(base_url, 'logo'))
r.raise_for_status()
assert r.headers['content-type'] == 'image/png'
r = yield async_requests.get(ujoin(base_url, 'static', 'images', 'jupyter.png'))
r = await async_requests.get(ujoin(base_url, 'static', 'images', 'jupyter.png'))
r.raise_for_status()
assert r.headers['content-type'] == 'image/png'
r = yield async_requests.get(ujoin(base_url, 'static', 'css', 'style.min.css'))
r = await async_requests.get(ujoin(base_url, 'static', 'css', 'style.min.css'))
r.raise_for_status()
assert r.headers['content-type'] == 'text/css'


@pytest.mark.gen_test
def test_token_auth(app):
cookies = yield app.login_user('token')
r = yield get_page('token', app, cookies=cookies)
async def test_token_auth(app):
cookies = await app.login_user('token')
r = await get_page('token', app, cookies=cookies)
r.raise_for_status()
assert r.status_code == 200


@pytest.mark.gen_test
def test_oauth_token_page(app):
async def test_oauth_token_page(app):
name = 'token'
cookies = yield app.login_user(name)
cookies = await app.login_user(name)
user = app.users[orm.User.find(app.db, name)]
client = orm.OAuthClient(identifier='token')
app.db.add(client)
oauth_token = orm.OAuthAccessToken(client=client, user=user, grant_type=orm.GrantType.authorization_code)
app.db.add(oauth_token)
app.db.commit()
r = yield get_page('token', app, cookies=cookies)
r = await get_page('token', app, cookies=cookies)
r.raise_for_status()
assert r.status_code == 200

@@ -541,14 +561,11 @@ def test_oauth_token_page(app):
503,
404,
])

@pytest.mark.gen_test
def test_proxy_error(app, error_status):
r = yield get_page('/error/%i' % error_status, app)
async def test_proxy_error(app, error_status):
r = await get_page('/error/%i' % error_status, app)
assert r.status_code == 200


@pytest.mark.gen_test
@pytest.mark.parametrize(
"announcements",
[
@@ -558,7 +575,7 @@ def test_proxy_error(app, error_status):
"login,logout",
]
)
def test_announcements(app, announcements):
async def test_announcements(app, announcements):
"""Test announcements on various pages"""
# Default announcement - same on all pages
ann01 = "ANNOUNCE01"
@@ -574,26 +591,26 @@ def test_announcements(app, announcements):
else:
assert ann01 in text

cookies = yield app.login_user("jones")
cookies = await app.login_user("jones")

with mock.patch.dict(
app.tornado_settings,
{"template_vars": template_vars, "spawner_class": FormSpawner},
):
r = yield get_page("login", app)
r = await get_page("login", app)
r.raise_for_status()
assert_announcement("login", r.text)
r = yield get_page("spawn", app, cookies=cookies)
r = await get_page("spawn", app, cookies=cookies)
r.raise_for_status()
assert_announcement("spawn", r.text)
r = yield get_page("home", app, cookies=cookies) # hub/home
r = await get_page("home", app, cookies=cookies) # hub/home
r.raise_for_status()
assert_announcement("home", r.text)
# need auto_login=True to get logout page
auto_login = app.authenticator.auto_login
app.authenticator.auto_login = True
try:
r = yield get_page("logout", app, cookies=cookies)
r = await get_page("logout", app, cookies=cookies)
finally:
app.authenticator.auto_login = auto_login
r.raise_for_status()
@@ -608,18 +625,16 @@ def test_announcements(app, announcements):
"redirect_uri=ok&client_id=nosuchthing",
]
)
@pytest.mark.gen_test
def test_bad_oauth_get(app, params):
cookies = yield app.login_user("authorizer")
r = yield get_page("hub/api/oauth2/authorize?" + params, app, hub=False, cookies=cookies)
async def test_bad_oauth_get(app, params):
cookies = await app.login_user("authorizer")
r = await get_page("hub/api/oauth2/authorize?" + params, app, hub=False, cookies=cookies)
assert r.status_code == 400


@pytest.mark.gen_test
def test_token_page(app):
async def test_token_page(app):
name = "cake"
cookies = yield app.login_user(name)
r = yield get_page("token", app, cookies=cookies)
cookies = await app.login_user(name)
r = await get_page("token", app, cookies=cookies)
r.raise_for_status()
assert urlparse(r.url).path.endswith('/hub/token')
def extract_body(r):
@@ -638,7 +653,7 @@ def test_token_page(app):
token = user.new_api_token(expires_in=60, note="my-test-token")
app.db.commit()

r = yield get_page("token", app, cookies=cookies)
r = await get_page("token", app, cookies=cookies)
r.raise_for_status()
body = extract_body(r)
assert "API Tokens" in body, body
@@ -649,10 +664,10 @@ def test_token_page(app):
# spawn the user to trigger oauth, etc.
# request an oauth token
user.spawner.cmd = [sys.executable, '-m', 'jupyterhub.singleuser']
r = yield get_page("spawn", app, cookies=cookies)
r = await get_page("spawn", app, cookies=cookies)
r.raise_for_status()

r = yield get_page("token", app, cookies=cookies)
r = await get_page("token", app, cookies=cookies)
r.raise_for_status()
body = extract_body(r)
assert "API Tokens" in body, body
@@ -660,31 +675,27 @@ def test_token_page(app):
assert "Authorized Applications" in body, body


@pytest.mark.gen_test
def test_server_not_running_api_request(app):
cookies = yield app.login_user("bees")
r = yield get_page("user/bees/api/status", app, hub=False, cookies=cookies)
async def test_server_not_running_api_request(app):
cookies = await app.login_user("bees")
r = await get_page("user/bees/api/status", app, hub=False, cookies=cookies)
assert r.status_code == 404
assert r.headers["content-type"] == "application/json"
assert r.json() == {"message": "bees is not running"}


@pytest.mark.gen_test
def test_metrics_no_auth(app):
r = yield get_page("metrics", app)
async def test_metrics_no_auth(app):
r = await get_page("metrics", app)
assert r.status_code == 403


@pytest.mark.gen_test
def test_metrics_auth(app):
cookies = yield app.login_user('river')
async def test_metrics_auth(app):
cookies = await app.login_user('river')
metrics_url = ujoin(public_host(app), app.hub.base_url, 'metrics')
r = yield get_page("metrics", app, cookies=cookies)
r = await get_page("metrics", app, cookies=cookies)
assert r.status_code == 200
assert r.url == metrics_url


@pytest.mark.gen_test
def test_health_check_request(app):
r = yield get_page('health', app)
async def test_health_check_request(app):
r = await get_page('health', app)
assert r.status_code == 200

@@ -16,6 +16,7 @@ from .mocking import MockHub
from .test_api import api_request, add_user
from ..utils import wait_for_http_server, url_path_join as ujoin


@pytest.fixture
def disable_check_routes(app):
# disable periodic check_routes while we are testing
@@ -25,9 +26,8 @@ def disable_check_routes(app):
finally:
app.last_activity_callback.start()

@pytest.mark.gen_test
def test_external_proxy(request):

async def test_external_proxy(request):
auth_token = 'secret!'
proxy_ip = '127.0.0.1'
proxy_port = 54321
@@ -71,23 +71,24 @@ def test_external_proxy(request):

def wait_for_proxy():
return wait_for_http_server('http://%s:%i' % (proxy_ip, proxy_port))
yield wait_for_proxy()
await wait_for_proxy()

yield app.initialize([])
yield app.start()
await app.initialize([])
await app.start()
assert app.proxy.proxy_process is None

# test if api service has a root route '/'
routes = yield app.proxy.get_all_routes()
routes = await app.proxy.get_all_routes()
assert list(routes.keys()) == [app.hub.routespec]

# add user to the db and start a single user server
name = 'river'
add_user(app.db, app, name=name)
r = yield api_request(app, 'users', name, 'server', method='post')
r = await api_request(app, 'users', name, 'server', method='post',
bypass_proxy=True)
r.raise_for_status()

routes = yield app.proxy.get_all_routes()
routes = await app.proxy.get_all_routes()
# sets the desired path result
user_path = ujoin(app.base_url, 'user/river') + '/'
print(app.base_url, user_path)
@@ -101,18 +102,18 @@ def test_external_proxy(request):
proxy.terminate()
proxy.wait(timeout=10)
proxy = Popen(cmd, env=env)
yield wait_for_proxy()
await wait_for_proxy()

routes = yield app.proxy.get_all_routes()
routes = await app.proxy.get_all_routes()

assert list(routes.keys()) == []

# poke the server to update the proxy
r = yield api_request(app, 'proxy', method='post')
r = await api_request(app, 'proxy', method='post', bypass_proxy=True)
r.raise_for_status()

# check that the routes are correct
routes = yield app.proxy.get_all_routes()
routes = await app.proxy.get_all_routes()
assert sorted(routes.keys()) == [app.hub.routespec, user_spec]

# teardown the proxy, and start a new one with different auth and port
@@ -131,25 +132,30 @@ def test_external_proxy(request):
if app.subdomain_host:
cmd.append('--host-routing')
proxy = Popen(cmd, env=env)
yield wait_for_proxy()
await wait_for_proxy()

# tell the hub where the new proxy is
new_api_url = 'http://{}:{}'.format(proxy_ip, proxy_port)
r = yield api_request(app, 'proxy', method='patch', data=json.dumps({
r = await api_request(
app,
'proxy',
method='patch',
data=json.dumps({
'api_url': new_api_url,
'auth_token': new_auth_token,
}))
}),
bypass_proxy=True,
)
r.raise_for_status()
assert app.proxy.api_url == new_api_url

assert app.proxy.auth_token == new_auth_token

# check that the routes are correct
routes = yield app.proxy.get_all_routes()
routes = await app.proxy.get_all_routes()
assert sorted(routes.keys()) == [app.hub.routespec, user_spec]


@pytest.mark.gen_test
@pytest.mark.parametrize("username", [
'zoe',
'50fia',
@@ -157,27 +163,27 @@ def test_external_proxy(request):
'~TestJH',
'has@',
])
def test_check_routes(app, username, disable_check_routes):
async def test_check_routes(app, username, disable_check_routes):
proxy = app.proxy
test_user = add_user(app.db, app, name=username)
r = yield api_request(app, 'users/%s/server' % username, method='post')
r = await api_request(app, 'users/%s/server' % username, method='post')
r.raise_for_status()

# check a valid route exists for user
routes = yield app.proxy.get_all_routes()
routes = await app.proxy.get_all_routes()
before = sorted(routes)
assert test_user.proxy_spec in before

# check if a route is removed when user deleted
yield app.proxy.check_routes(app.users, app._service_map)
yield proxy.delete_user(test_user)
routes = yield app.proxy.get_all_routes()
await app.proxy.check_routes(app.users, app._service_map)
await proxy.delete_user(test_user)
routes = await app.proxy.get_all_routes()
during = sorted(routes)
assert test_user.proxy_spec not in during

# check if a route exists for user
yield app.proxy.check_routes(app.users, app._service_map)
routes = yield app.proxy.get_all_routes()
await app.proxy.check_routes(app.users, app._service_map)
routes = await app.proxy.get_all_routes()
after = sorted(routes)
assert test_user.proxy_spec in after

@@ -185,7 +191,6 @@ def test_check_routes(app, username, disable_check_routes):
assert before == after


@pytest.mark.gen_test
@pytest.mark.parametrize("routespec", [
'/has%20space/foo/',
'/missing-trailing/slash',
@@ -194,7 +199,7 @@ def test_check_routes(app, username, disable_check_routes):
'host.name/path/',
'other.host/path/no/slash',
])
def test_add_get_delete(app, routespec, disable_check_routes):
async def test_add_get_delete(app, routespec, disable_check_routes):
arg = routespec
if not routespec.endswith('/'):
routespec = routespec + '/'
@@ -213,26 +218,25 @@ def test_add_get_delete(app, routespec, disable_check_routes):
proxy = app.proxy
target = 'https://localhost:1234'
with context():
yield proxy.add_route(arg, target, {})
routes = yield proxy.get_all_routes()
await proxy.add_route(arg, target, {})
routes = await proxy.get_all_routes()
if not expect_value_error:
assert routespec in routes.keys()
with context():
route = yield proxy.get_route(arg)
route = await proxy.get_route(arg)
assert route == {
'target': target,
'routespec': routespec,
'data': route.get('data'),
}
with context():
yield proxy.delete_route(arg)
await proxy.delete_route(arg)
with context():
route = yield proxy.get_route(arg)
route = await proxy.get_route(arg)
assert route is None


@pytest.mark.gen_test
@pytest.mark.parametrize("test_data", [None, 'notjson', json.dumps([])])
def test_proxy_patch_bad_request_data(app, test_data):
r = yield api_request(app, 'proxy', method='patch', data=test_data)
async def test_proxy_patch_bad_request_data(app, test_data):
r = await api_request(app, 'proxy', method='patch', data=test_data)
assert r.status_code == 400

@@ -1,5 +1,6 @@
"""Tests for services"""

import asyncio
from binascii import hexlify
from contextlib import contextmanager
import os
@@ -8,13 +9,14 @@ import sys
from threading import Event
import time

from async_generator import asynccontextmanager, async_generator, yield_
import pytest
import requests
from tornado import gen
from tornado.ioloop import IOLoop

from .mocking import public_url
from ..utils import url_path_join, wait_for_http_server, random_port
from ..utils import url_path_join, wait_for_http_server, random_port, maybe_future
from .utils import async_requests

mockservice_path = os.path.dirname(os.path.abspath(__file__))
@@ -22,8 +24,9 @@ mockservice_py = os.path.join(mockservice_path, 'mockservice.py')
mockservice_cmd = [sys.executable, mockservice_py]


@contextmanager
def external_service(app, name='mockservice'):
@asynccontextmanager
@async_generator
async def external_service(app, name='mockservice'):
env = {
'JUPYTERHUB_API_TOKEN': hexlify(os.urandom(5)),
'JUPYTERHUB_SERVICE_NAME': name,
@@ -31,17 +34,14 @@ def external_service(app, name='mockservice'):
'JUPYTERHUB_SERVICE_URL': 'http://127.0.0.1:%i' % random_port(),
}
proc = Popen(mockservice_cmd, env=env)
IOLoop().run_sync(
lambda: wait_for_http_server(env['JUPYTERHUB_SERVICE_URL'])
)
try:
yield env
await wait_for_http_server(env['JUPYTERHUB_SERVICE_URL'])
await yield_(env)
finally:
proc.terminate()


@pytest.mark.gen_test
def test_managed_service(mockservice):
async def test_managed_service(mockservice):
service = mockservice
proc = service.proc
assert isinstance(proc.pid, object)
@@ -58,19 +58,18 @@ def test_managed_service(mockservice):
if service.proc is not proc:
break
else:
yield gen.sleep(0.2)
await asyncio.sleep(0.2)

assert service.proc.pid != first_pid
assert service.proc.poll() is None


@pytest.mark.gen_test
def test_proxy_service(app, mockservice_url):
async def test_proxy_service(app, mockservice_url):
service = mockservice_url
name = service.name
yield app.proxy.get_all_routes()
await app.proxy.get_all_routes()
url = public_url(app, service) + '/foo'
r = yield async_requests.get(url, allow_redirects=False)
r = await async_requests.get(url, allow_redirects=False)
path = '/services/{}/foo'.format(name)
r.raise_for_status()

@@ -78,23 +77,22 @@ def test_proxy_service(app, mockservice_url):
assert r.text.endswith(path)


@pytest.mark.gen_test
def test_external_service(app):
async def test_external_service(app):
name = 'external'
with external_service(app, name=name) as env:
async with external_service(app, name=name) as env:
app.services = [{
'name': name,
'admin': True,
'url': env['JUPYTERHUB_SERVICE_URL'],
'api_token': env['JUPYTERHUB_API_TOKEN'],
}]
yield app.init_services()
yield app.init_api_tokens()
yield app.proxy.add_all_services(app._service_map)
await maybe_future(app.init_services())
await app.init_api_tokens()
await app.proxy.add_all_services(app._service_map)

service = app._service_map[name]
url = public_url(app, service) + '/api/users'
r = yield async_requests.get(url, allow_redirects=False)
r = await async_requests.get(url, allow_redirects=False)
r.raise_for_status()
assert r.status_code == 200
resp = r.json()

@@ -170,6 +170,10 @@ def test_hub_authenticated(request):
assert r.status_code == 302
assert auth.login_url in r.headers['Location']

# clear the cache because we are going to request
# the same url again with a different result
auth.cache.clear()

# upstream 403
m.get(bad_url, status_code=403)
r = requests.get('http://127.0.0.1:%i' % port,
@@ -223,11 +227,10 @@ def test_hub_authenticated(request):
assert r.status_code == 403


@pytest.mark.gen_test
def test_hubauth_cookie(app, mockservice_url):
async def test_hubauth_cookie(app, mockservice_url):
"""Test HubAuthenticated service with user cookies"""
cookies = yield app.login_user('badger')
r = yield async_requests.get(public_url(app, mockservice_url) + '/whoami/', cookies=cookies)
cookies = await app.login_user('badger')
r = await async_requests.get(public_url(app, mockservice_url) + '/whoami/', cookies=cookies)
r.raise_for_status()
print(r.text)
reply = r.json()
@@ -238,15 +241,14 @@ def test_hubauth_cookie(app, mockservice_url):
}


@pytest.mark.gen_test
def test_hubauth_token(app, mockservice_url):
async def test_hubauth_token(app, mockservice_url):
"""Test HubAuthenticated service with user API tokens"""
u = add_user(app.db, name='river')
token = u.new_api_token()
app.db.commit()

# token in Authorization header
r = yield async_requests.get(public_url(app, mockservice_url) + '/whoami/',
r = await async_requests.get(public_url(app, mockservice_url) + '/whoami/',
headers={
'Authorization': 'token %s' % token,
})
@@ -258,7 +260,7 @@ def test_hubauth_token(app, mockservice_url):
}

# token in ?token parameter
r = yield async_requests.get(public_url(app, mockservice_url) + '/whoami/?token=%s' % token)
r = await async_requests.get(public_url(app, mockservice_url) + '/whoami/?token=%s' % token)
r.raise_for_status()
reply = r.json()
sub_reply = { key: reply.get(key, 'missing') for key in ['name', 'admin']}
@@ -267,7 +269,7 @@ def test_hubauth_token(app, mockservice_url):
'admin': False,
}

r = yield async_requests.get(public_url(app, mockservice_url) + '/whoami/?token=no-such-token',
r = await async_requests.get(public_url(app, mockservice_url) + '/whoami/?token=no-such-token',
allow_redirects=False,
)
assert r.status_code == 302
@@ -277,17 +279,16 @@ def test_hubauth_token(app, mockservice_url):
assert path.endswith('/hub/login')


@pytest.mark.gen_test
def test_hubauth_service_token(app, mockservice_url):
async def test_hubauth_service_token(app, mockservice_url):
"""Test HubAuthenticated service with service API tokens"""

token = hexlify(os.urandom(5)).decode('utf8')
name = 'test-api-service'
app.service_tokens[token] = name
yield app.init_api_tokens()
await app.init_api_tokens()

# token in Authorization header
r = yield async_requests.get(public_url(app, mockservice_url) + '/whoami/',
r = await async_requests.get(public_url(app, mockservice_url) + '/whoami/',
headers={
'Authorization': 'token %s' % token,
})
@@ -301,7 +302,7 @@ def test_hubauth_service_token(app, mockservice_url):
assert not r.cookies

# token in ?token parameter
r = yield async_requests.get(public_url(app, mockservice_url) + '/whoami/?token=%s' % token)
r = await async_requests.get(public_url(app, mockservice_url) + '/whoami/?token=%s' % token)
r.raise_for_status()
reply = r.json()
assert reply == {
@@ -310,7 +311,7 @@ def test_hubauth_service_token(app, mockservice_url):
'admin': False,
}

r = yield async_requests.get(public_url(app, mockservice_url) + '/whoami/?token=no-such-token',
r = await async_requests.get(public_url(app, mockservice_url) + '/whoami/?token=no-such-token',
allow_redirects=False,
)
assert r.status_code == 302
@@ -320,16 +321,15 @@ def test_hubauth_service_token(app, mockservice_url):
assert path.endswith('/hub/login')


@pytest.mark.gen_test
def test_oauth_service(app, mockservice_url):
async def test_oauth_service(app, mockservice_url):
service = mockservice_url
url = url_path_join(public_url(app, mockservice_url) + 'owhoami/?arg=x')
# first request is only going to login and get us to the oauth form page
s = AsyncSession()
name = 'link'
s.cookies = yield app.login_user(name)
s.cookies = await app.login_user(name)

r = yield s.get(url)
r = await s.get(url)
r.raise_for_status()
# we should be looking at the oauth confirmation page
assert urlparse(r.url).path == app.base_url + 'hub/api/oauth2/authorize'
@@ -337,7 +337,7 @@ def test_oauth_service(app, mockservice_url):
assert set(r.history[0].cookies.keys()) == {'service-%s-oauth-state' % service.name}

# submit the oauth form to complete authorization
r = yield s.post(r.url, data={'scopes': ['identify']}, headers={'Referer': r.url})
r = await s.post(r.url, data={'scopes': ['identify']}, headers={'Referer': r.url})
r.raise_for_status()
assert r.url == url
# verify oauth cookie is set
@@ -346,7 +346,7 @@ def test_oauth_service(app, mockservice_url):
assert 'service-%s-oauth-state' % service.name not in set(s.cookies.keys())

# second request should be authenticated, which means no redirects
r = yield s.get(url, allow_redirects=False)
r = await s.get(url, allow_redirects=False)
r.raise_for_status()
assert r.status_code == 200
reply = r.json()
@@ -359,7 +359,7 @@ def test_oauth_service(app, mockservice_url):
# token-authenticated request to HubOAuth
token = app.users[name].new_api_token()
# token in ?token parameter
r = yield async_requests.get(url_concat(url, {'token': token}))
r = await async_requests.get(url_concat(url, {'token': token}))
r.raise_for_status()
reply = r.json()
assert reply['name'] == name
@@ -367,7 +367,7 @@ def test_oauth_service(app, mockservice_url):
# verify that ?token= requests set a cookie
assert len(r.cookies) != 0
# ensure cookie works in future requests
r = yield async_requests.get(
r = await async_requests.get(
url,
cookies=r.cookies,
allow_redirects=False,
@@ -378,17 +378,16 @@ def test_oauth_service(app, mockservice_url):
assert reply['name'] == name


@pytest.mark.gen_test
def test_oauth_cookie_collision(app, mockservice_url):
async def test_oauth_cookie_collision(app, mockservice_url):
service = mockservice_url
url = url_path_join(public_url(app, mockservice_url), 'owhoami/')
print(url)
s = AsyncSession()
name = 'mypha'
s.cookies = yield app.login_user(name)
s.cookies = await app.login_user(name)
state_cookie_name = 'service-%s-oauth-state' % service.name
service_cookie_name = 'service-%s' % service.name
oauth_1 = yield s.get(url)
oauth_1 = await s.get(url)
print(oauth_1.headers)
print(oauth_1.cookies, oauth_1.url, url)
assert state_cookie_name in s.cookies
@@ -398,7 +397,7 @@ def test_oauth_cookie_collision(app, mockservice_url):
state_1 = s.cookies[state_cookie_name]

# start second oauth login before finishing the first
oauth_2 = yield s.get(url)
oauth_2 = await s.get(url)
state_cookies = [ c for c in s.cookies.keys() if c.startswith(state_cookie_name) ]
assert len(state_cookies) == 2
# get the random-suffix cookie name
@@ -408,7 +407,7 @@ def test_oauth_cookie_collision(app, mockservice_url):

# finish oauth 2
# submit the oauth form to complete authorization
r = yield s.post(
r = await s.post(
oauth_2.url,
data={'scopes': ['identify']},
headers={'Referer': oauth_2.url},
@@ -422,7 +421,7 @@ def test_oauth_cookie_collision(app, mockservice_url):
service_cookie_2 = s.cookies[service_cookie_name]

# finish oauth 1
r = yield s.post(
r = await s.post(
oauth_1.url,
data={'scopes': ['identify']},
headers={'Referer': oauth_1.url},
@@ -441,8 +440,7 @@ def test_oauth_cookie_collision(app, mockservice_url):
assert state_cookies == []


@pytest.mark.gen_test
def test_oauth_logout(app, mockservice_url):
async def test_oauth_logout(app, mockservice_url):
"""Verify that logout via the Hub triggers logout for oauth services

1. clears session id cookie
@@ -467,18 +465,18 @@ def test_oauth_logout(app, mockservice_url):
# ensure we start empty
assert auth_tokens() == []

s.cookies = yield app.login_user(name)
s.cookies = await app.login_user(name)
assert 'jupyterhub-session-id' in s.cookies
r = yield s.get(url)
r = await s.get(url)
r.raise_for_status()
assert urlparse(r.url).path.endswith('oauth2/authorize')
# submit the oauth form to complete authorization
r = yield s.post(r.url, data={'scopes': ['identify']}, headers={'Referer': r.url})
r = await s.post(r.url, data={'scopes': ['identify']}, headers={'Referer': r.url})
r.raise_for_status()
assert r.url == url

# second request should be authenticated
r = yield s.get(url, allow_redirects=False)
r = await s.get(url, allow_redirects=False)
r.raise_for_status()
assert r.status_code == 200
reply = r.json()
@@ -497,13 +495,13 @@ def test_oauth_logout(app, mockservice_url):
assert len(auth_tokens()) == 1

# hit hub logout URL
r = yield s.get(public_url(app, path='hub/logout'))
r = await s.get(public_url(app, path='hub/logout'))
r.raise_for_status()
# verify that all cookies other than the service cookie are cleared
assert list(s.cookies.keys()) == [service_cookie_name]
# verify that clearing session id invalidates service cookie
# i.e. redirect back to login page
r = yield s.get(url)
r = await s.get(url)
r.raise_for_status()
assert r.url.split('?')[0] == public_url(app, path='hub/login')

@@ -520,7 +518,7 @@ def test_oauth_logout(app, mockservice_url):
# check that we got the old session id back
assert session_id == s.cookies['jupyterhub-session-id']

r = yield s.get(url, allow_redirects=False)
r = await s.get(url, allow_redirects=False)
r.raise_for_status()
assert r.status_code == 200
reply = r.json()

@@ -13,42 +13,41 @@ from ..utils import url_path_join
|
||||
from .utils import async_requests, AsyncSession
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_singleuser_auth(app):
|
||||
async def test_singleuser_auth(app):
|
||||
# use StubSingleUserSpawner to launch a single-user app in a thread
|
||||
app.spawner_class = StubSingleUserSpawner
|
||||
app.tornado_settings['spawner_class'] = StubSingleUserSpawner
|
||||
|
||||
# login, start the server
|
||||
cookies = yield app.login_user('nandy')
|
||||
cookies = await app.login_user('nandy')
|
||||
user = app.users['nandy']
|
||||
if not user.running:
|
||||
yield user.spawn()
|
||||
await user.spawn()
|
||||
url = public_url(app, user)
|
||||
|
||||
# no cookies, redirects to login page
|
||||
r = yield async_requests.get(url)
|
||||
r = await async_requests.get(url)
|
||||
r.raise_for_status()
|
||||
assert '/hub/login' in r.url
|
||||
|
||||
# with cookies, login successful
|
||||
r = yield async_requests.get(url, cookies=cookies)
|
||||
r = await async_requests.get(url, cookies=cookies)
|
||||
r.raise_for_status()
|
||||
assert urlparse(r.url).path.rstrip('/').endswith('/user/nandy/tree')
|
||||
assert r.status_code == 200
|
||||
|
||||
# logout
|
||||
r = yield async_requests.get(url_path_join(url, 'logout'), cookies=cookies)
|
||||
r = await async_requests.get(url_path_join(url, 'logout'), cookies=cookies)
|
||||
assert len(r.cookies) == 0
|
||||
|
||||
# accessing another user's server hits the oauth confirmation page
|
||||
cookies = yield app.login_user('burgess')
|
||||
cookies = await app.login_user('burgess')
|
||||
s = AsyncSession()
|
||||
s.cookies = cookies
|
||||
r = yield s.get(url)
|
||||
r = await s.get(url)
|
||||
assert urlparse(r.url).path.endswith('/oauth2/authorize')
|
||||
# submit the oauth form to complete authorization
|
||||
r = yield s.post(
|
||||
r = await s.post(
|
||||
r.url,
|
||||
data={'scopes': ['identify']},
|
||||
headers={'Referer': r.url},
|
||||
@@ -59,28 +58,27 @@ def test_singleuser_auth(app):
|
||||
assert 'burgess' in r.text
|
||||
|
||||
|
||||
@pytest.mark.gen_test
|
||||
def test_disable_user_config(app):
|
||||
async def test_disable_user_config(app):
|
||||
# use StubSingleUserSpawner to launch a single-user app in a thread
|
||||
app.spawner_class = StubSingleUserSpawner
|
||||
app.tornado_settings['spawner_class'] = StubSingleUserSpawner
|
||||
# login, start the server
|
||||
cookies = yield app.login_user('nandy')
|
||||
cookies = await app.login_user('nandy')
|
||||
user = app.users['nandy']
|
||||
# stop spawner, if running:
|
||||
if user.running:
|
||||
print("stopping")
|
||||
yield user.stop()
|
||||
await user.stop()
|
||||
# start with new config:
|
||||
user.spawner.debug = True
|
||||
user.spawner.disable_user_config = True
|
||||
yield user.spawn()
|
||||
yield app.proxy.add_user(user)
|
||||
await user.spawn()
|
||||
await app.proxy.add_user(user)
|
||||
|
||||
url = public_url(app, user)
|
||||
|
||||
# with cookies, login successful
|
||||
r = yield async_requests.get(url, cookies=cookies)
|
||||
r = await async_requests.get(url, cookies=cookies)
|
||||
r.raise_for_status()
|
||||
assert r.url.rstrip('/').endswith('/user/nandy/tree')
|
||||
assert r.status_code == 200
|
||||
@@ -90,6 +88,7 @@ def test_help_output():
|
||||
out = check_output([sys.executable, '-m', 'jupyterhub.singleuser', '--help-all']).decode('utf8', 'replace')
|
||||
assert 'JupyterHub' in out
|
||||
|
||||
|
||||
def test_version():
|
||||
out = check_output([sys.executable, '-m', 'jupyterhub.singleuser', '--version']).decode('utf8', 'replace')
|
||||
assert jupyterhub.__version__ in out
|
||||
|
@@ -60,10 +60,9 @@ def new_spawner(db, **kwargs):
return user._new_spawner('', spawner_class=LocalProcessSpawner, **kwargs)


@pytest.mark.gen_test
def test_spawner(db, request):
async def test_spawner(db, request):
spawner = new_spawner(db)
ip, port = yield spawner.start()
ip, port = await spawner.start()
assert ip == '127.0.0.1'
assert isinstance(port, int)
assert port > 0
@@ -72,15 +71,14 @@ def test_spawner(db, request):
# wait for the process to get to the while True: loop
time.sleep(1)

status = yield spawner.poll()
status = await spawner.poll()
assert status is None
yield spawner.stop()
status = yield spawner.poll()
await spawner.stop()
status = await spawner.poll()
assert status == 1


@gen.coroutine
def wait_for_spawner(spawner, timeout=10):
async def wait_for_spawner(spawner, timeout=10):
"""Wait for an http server to show up

polling at shorter intervals for early termination
@@ -89,72 +87,68 @@ def wait_for_spawner(spawner, timeout=10):
def wait():
return spawner.server.wait_up(timeout=1, http=True)
while time.monotonic() < deadline:
status = yield spawner.poll()
status = await spawner.poll()
assert status is None
try:
yield wait()
await wait()
except TimeoutError:
continue
else:
break
yield wait()
await wait()


@pytest.mark.gen_test
def test_single_user_spawner(app, request):
async def test_single_user_spawner(app, request):
user = next(iter(app.users.values()), None)
spawner = user.spawner
spawner.cmd = ['jupyterhub-singleuser']
yield user.spawn()
await user.spawn()
assert spawner.server.ip == '127.0.0.1'
assert spawner.server.port > 0
yield wait_for_spawner(spawner)
status = yield spawner.poll()
await wait_for_spawner(spawner)
status = await spawner.poll()
assert status is None
yield spawner.stop()
status = yield spawner.poll()
await spawner.stop()
status = await spawner.poll()
assert status == 0


@pytest.mark.gen_test
def test_stop_spawner_sigint_fails(db):
async def test_stop_spawner_sigint_fails(db):
spawner = new_spawner(db, cmd=[sys.executable, '-c', _uninterruptible])
yield spawner.start()
await spawner.start()

# wait for the process to get to the while True: loop
yield gen.sleep(1)
await gen.sleep(1)

status = yield spawner.poll()
status = await spawner.poll()
assert status is None

yield spawner.stop()
status = yield spawner.poll()
await spawner.stop()
status = await spawner.poll()
assert status == -signal.SIGTERM


@pytest.mark.gen_test
def test_stop_spawner_stop_now(db):
async def test_stop_spawner_stop_now(db):
spawner = new_spawner(db)
yield spawner.start()
await spawner.start()

# wait for the process to get to the while True: loop
yield gen.sleep(1)
await gen.sleep(1)

status = yield spawner.poll()
status = await spawner.poll()
assert status is None

yield spawner.stop(now=True)
status = yield spawner.poll()
await spawner.stop(now=True)
status = await spawner.poll()
assert status == -signal.SIGTERM


@pytest.mark.gen_test
def test_spawner_poll(db):
async def test_spawner_poll(db):
first_spawner = new_spawner(db)
user = first_spawner.user
yield first_spawner.start()
await first_spawner.start()
proc = first_spawner.proc
status = yield first_spawner.poll()
status = await first_spawner.poll()
assert status is None
if user.state is None:
user.state = {}
@@ -166,21 +160,21 @@ def test_spawner_poll(db):
spawner.start_polling()

# wait for the process to get to the while True: loop
yield gen.sleep(1)
status = yield spawner.poll()
await gen.sleep(1)
status = await spawner.poll()
assert status is None

# kill the process
proc.terminate()
for i in range(10):
if proc.poll() is None:
yield gen.sleep(1)
await gen.sleep(1)
else:
break
assert proc.poll() is not None

yield gen.sleep(2)
status = yield spawner.poll()
await gen.sleep(2)
status = await spawner.poll()
assert status is not None


@@ -213,8 +207,7 @@ def test_string_formatting(db):
assert s.format_string(s.default_url) == '/base/%s' % name


@pytest.mark.gen_test
def test_popen_kwargs(db):
async def test_popen_kwargs(db):
mock_proc = mock.Mock(spec=Popen)
def mock_popen(*args, **kwargs):
mock_proc.args = args
@@ -224,14 +217,13 @@ def test_popen_kwargs(db):

s = new_spawner(db, popen_kwargs={'shell': True}, cmd='jupyterhub-singleuser')
with mock.patch.object(spawnermod, 'Popen', mock_popen):
yield s.start()
await s.start()

assert mock_proc.kwargs['shell'] == True
assert mock_proc.args[0][:1] == (['jupyterhub-singleuser'])


@pytest.mark.gen_test
def test_shell_cmd(db, tmpdir, request):
async def test_shell_cmd(db, tmpdir, request):
f = tmpdir.join('bashrc')
f.write('export TESTVAR=foo\n')
s = new_spawner(db,
@@ -243,17 +235,17 @@ def test_shell_cmd(db, tmpdir, request):
db.commit()
s.server = Server.from_orm(server)
db.commit()
(ip, port) = yield s.start()
(ip, port) = await s.start()
request.addfinalizer(s.stop)
s.server.ip = ip
s.server.port = port
db.commit()
yield wait_for_spawner(s)
r = yield async_requests.get('http://%s:%i/env' % (ip, port))
await wait_for_spawner(s)
r = await async_requests.get('http://%s:%i/env' % (ip, port))
r.raise_for_status()
env = r.json()
assert env['TESTVAR'] == 'foo'
yield s.stop()
await s.stop()


def test_inherit_overwrite():
@@ -277,8 +269,7 @@ def test_inherit_ok():
pass


@pytest.mark.gen_test
def test_spawner_reuse_api_token(db, app):
async def test_spawner_reuse_api_token(db, app):
# setup: user with no tokens, whose spawner has set the .will_resume flag
user = add_user(app.db, app, name='snoopy')
spawner = user.spawner
@@ -286,26 +277,25 @@ def test_spawner_reuse_api_token(db, app):
# will_resume triggers reuse of tokens
spawner.will_resume = True
# first start: gets a new API token
yield user.spawn()
await user.spawn()
api_token = spawner.api_token
found = orm.APIToken.find(app.db, api_token)
assert found
assert found.user.name == user.name
assert user.api_tokens == [found]
yield user.stop()
await user.stop()
# stop now deletes unused spawners.
# put back the mock spawner!
user.spawners[''] = spawner
# second start: should reuse the token
yield user.spawn()
await user.spawn()
# verify re-use of API token
assert spawner.api_token == api_token
# verify that a new token was not created
assert user.api_tokens == [found]


@pytest.mark.gen_test
def test_spawner_insert_api_token(app):
async def test_spawner_insert_api_token(app):
"""Token provided by spawner is not in the db

Insert token into db as a user-provided token.
@@ -322,17 +312,16 @@ def test_spawner_insert_api_token(app):
# The spawner's provided API token would already be in the db
# unless there is a bug somewhere else (in the Spawner),
# but handle it anyway.
yield user.spawn()
await user.spawn()
assert spawner.api_token == api_token
found = orm.APIToken.find(app.db, api_token)
assert found
assert found.user.name == user.name
assert user.api_tokens == [found]
yield user.stop()
await user.stop()


@pytest.mark.gen_test
def test_spawner_bad_api_token(app):
async def test_spawner_bad_api_token(app):
"""Tokens are revoked when a Spawner gets another user's token"""
# we need two users for this one
user = add_user(app.db, app, name='antimone')
@@ -349,13 +338,12 @@ def test_spawner_bad_api_token(app):
# starting a user's server with another user's token
# should revoke it
with pytest.raises(ValueError):
yield user.spawn()
await user.spawn()
assert orm.APIToken.find(app.db, other_token) is None
assert other_user.api_tokens == []


@pytest.mark.gen_test
def test_spawner_delete_server(app):
async def test_spawner_delete_server(app):
"""Test deleting spawner.server

This can occur during app startup if their server has been deleted.
@@ -393,22 +381,21 @@ def test_spawner_delete_server(app):
"has%40x",
]
)
@pytest.mark.gen_test
def test_spawner_routing(app, name):
async def test_spawner_routing(app, name):
"""Test routing of names with special characters"""
db = app.db
with mock.patch.dict(app.config.LocalProcessSpawner, {'cmd': [sys.executable, '-m', 'jupyterhub.tests.mocksu']}):
user = add_user(app.db, app, name=name)
yield user.spawn()
yield wait_for_spawner(user.spawner)
yield app.proxy.add_user(user)
await user.spawn()
await wait_for_spawner(user.spawner)
await app.proxy.add_user(user)
kwargs = {'allow_redirects': False}
if app.internal_ssl:
kwargs['cert'] = (app.internal_ssl_cert, app.internal_ssl_key)
kwargs["verify"] = app.internal_ssl_ca
url = url_path_join(public_url(app, user), "test/url")
r = yield async_requests.get(url, **kwargs)
r = await async_requests.get(url, **kwargs)
r.raise_for_status()
assert r.url == url
assert r.text == urlparse(url).path
yield user.stop()
await user.stop()
@@ -26,7 +26,6 @@ def schedule_future(io_loop, *, delay, result=None):
return f


@pytest.mark.gen_test
@pytest.mark.parametrize("deadline, n, delay, expected", [
(0, 3, 1, []),
(0, 3, 0, [0, 1, 2]),
@@ -43,7 +42,6 @@ async def test_iterate_until(io_loop, deadline, n, delay, expected):
assert yielded == expected


@pytest.mark.gen_test
async def test_iterate_until_ready_after_deadline(io_loop):
f = schedule_future(io_loop, delay=0)
@@ -1,19 +1,32 @@
import asyncio
from concurrent.futures import ThreadPoolExecutor
import requests

from certipy import Certipy
import requests

from jupyterhub import orm
from jupyterhub.objects import Server
from jupyterhub.utils import url_path_join as ujoin


class _AsyncRequests:
"""Wrapper around requests to return a Future from request methods

A single thread is allocated to avoid blocking the IOLoop thread.
"""

def __init__(self):
self.executor = ThreadPoolExecutor(1)
real_submit = self.executor.submit
self.executor.submit = lambda *args, **kwargs: asyncio.wrap_future(
real_submit(*args, **kwargs)
)

def __getattr__(self, name):
requests_method = getattr(requests, name)
return lambda *args, **kwargs: self.executor.submit(requests_method, *args, **kwargs)
return lambda *args, **kwargs: self.executor.submit(
requests_method, *args, **kwargs
)


# async_requests.get = requests.get returning a Future, etc.
@@ -22,6 +35,7 @@ async_requests = _AsyncRequests()

class AsyncSession(requests.Session):
"""requests.Session object that runs in the background thread"""

def request(self, *args, **kwargs):
return async_requests.executor.submit(super().request, *args, **kwargs)

@@ -30,9 +44,150 @@ def ssl_setup(cert_dir, authority_name):
# Set up the external certs with the same authority as the internal
# one so that certificate trust works regardless of chosen endpoint.
certipy = Certipy(store_dir=cert_dir)
alt_names = ['DNS:localhost', 'IP:127.0.0.1']
alt_names = ["DNS:localhost", "IP:127.0.0.1"]
internal_authority = certipy.create_ca(authority_name, overwrite=True)
external_certs = certipy.create_signed_pair('external', authority_name,
overwrite=True,
alt_names=alt_names)
external_certs = certipy.create_signed_pair(
"external", authority_name, overwrite=True, alt_names=alt_names
)
return external_certs


def check_db_locks(func):
"""Decorator that verifies no locks are held on database upon exit.

This decorator for test functions verifies no locks are held on the
application's database upon exit by creating and dropping a dummy table.

The decorator relies on an instance of JupyterHubApp being the first
argument to the decorated function.

Example
-------

@check_db_locks
def api_request(app, *api_path, **kwargs):

"""
def new_func(app, *args, **kwargs):
retval = func(app, *args, **kwargs)

temp_session = app.session_factory()
temp_session.execute('CREATE TABLE dummy (foo INT)')
temp_session.execute('DROP TABLE dummy')
temp_session.close()

return retval

return new_func


def find_user(db, name, app=None):
"""Find user in database."""
orm_user = db.query(orm.User).filter(orm.User.name == name).first()
if app is None:
return orm_user
else:
return app.users[orm_user.id]


def add_user(db, app=None, **kwargs):
"""Add a user to the database."""
orm_user = find_user(db, name=kwargs.get('name'))
if orm_user is None:
orm_user = orm.User(**kwargs)
db.add(orm_user)
else:
for attr, value in kwargs.items():
setattr(orm_user, attr, value)
db.commit()
if app:
return app.users[orm_user.id]
else:
return orm_user


def auth_header(db, name):
"""Return header with user's API authorization token."""
user = find_user(db, name)
if user is None:
user = add_user(db, name=name)
token = user.new_api_token()
return {'Authorization': 'token %s' % token}


@check_db_locks
async def api_request(app, *api_path, method='get',
noauth=False, bypass_proxy=False,
**kwargs):
"""Make an API request"""
if bypass_proxy:
# make a direct request to the hub,
# skipping the proxy
base_url = app.hub.url
else:
base_url = public_url(app, path='hub')
headers = kwargs.setdefault('headers', {})

if (
'Authorization' not in headers
and not noauth
and 'cookies' not in kwargs
):
# make a copy to avoid modifying arg in-place
kwargs['headers'] = h = {}
h.update(headers)
h.update(auth_header(app.db, kwargs.pop('name', 'admin')))

if 'cookies' in kwargs:
# for cookie-authenticated requests,
# set Referer so it looks like the request originated
# from a Hub-served page
headers.setdefault('Referer', ujoin(base_url, 'test'))

url = ujoin(base_url, 'api', *api_path)
f = getattr(async_requests, method)
if app.internal_ssl:
kwargs['cert'] = (app.internal_ssl_cert, app.internal_ssl_key)
kwargs["verify"] = app.internal_ssl_ca
resp = await f(url, **kwargs)
assert "frame-ancestors 'self'" in resp.headers['Content-Security-Policy']
assert ujoin(app.hub.base_url, "security/csp-report") in resp.headers['Content-Security-Policy']
assert 'http' not in resp.headers['Content-Security-Policy']
if not kwargs.get('stream', False) and resp.content:
assert resp.headers.get('content-type') == 'application/json'
return resp


def get_page(path, app, hub=True, **kw):
if hub:
prefix = app.hub.base_url
else:
prefix = app.base_url
base_url = ujoin(public_host(app), prefix)
return async_requests.get(ujoin(base_url, path), **kw)


def public_host(app):
"""Return the public *host* (no URL prefix) of the given JupyterHub instance."""
if app.subdomain_host:
return app.subdomain_host
else:
return Server.from_url(app.proxy.public_url).host


def public_url(app, user_or_service=None, path=''):
"""Return the full, public base URL (including prefix) of the given JupyterHub instance."""
if user_or_service:
if app.subdomain_host:
host = user_or_service.host
else:
host = public_host(app)
prefix = user_or_service.prefix
else:
host = public_host(app)
prefix = Server.from_url(app.proxy.public_url).base_url
if path:
return host + ujoin(prefix, path)
else:
return host + prefix
@@ -8,7 +8,9 @@ import warnings

from sqlalchemy import inspect
from tornado import gen
from tornado.httputil import urlencode
from tornado.log import app_log
from tornado import web

from .utils import maybe_future, url_path_join, make_ssl_context

@@ -136,6 +138,7 @@ class User:
orm_user = None
log = app_log
settings = None
_auth_refreshed = None

def __init__(self, orm_user, settings=None, db=None):
self.db = db or inspect(orm_user).session
@@ -380,6 +383,59 @@ class User:
url_parts.extend(['server/progress'])
return url_path_join(*url_parts)

async def refresh_auth(self, handler):
"""Refresh authentication if needed

Checks authentication expiry and refresh it if needed.
See Spawner.

If the auth is expired and cannot be refreshed
without forcing a new login, a few things can happen:

1. if this is a normal user spawn,
the user should be redirected to login
and back to spawn after login.
2. if this is a spawn via API or other user,
spawn will fail until the user logs in again.

Args:
handler (RequestHandler):
The handler for the request triggering the spawn.
May be None
"""
authenticator = self.authenticator
if authenticator is None or not authenticator.refresh_pre_spawn:
# nothing to do
return

# refresh auth
auth_user = await handler.refresh_auth(self, force=True)

if auth_user:
# auth refreshed, all done
return

# if we got to here, auth is expired and couldn't be refreshed
self.log.error(
"Auth expired for %s; cannot spawn until they login again",
self.name,
)
# auth expired, cannot spawn without a fresh login
# it's the current user *and* spawn via GET, trigger login redirect
if handler.request.method == 'GET' and handler.current_user is self:
self.log.info("Redirecting %s to login to refresh auth", self.name)
url = self.get_login_url()
next_url = self.request.uri
sep = '&' if '?' in url else '?'
url += sep + urlencode(dict(next=next_url))
self.redirect(url)
raise web.Finish()
else:
# spawn via POST or on behalf of another user.
# nothing we can do here but fail
raise web.HTTPError(400, "{}'s authentication has expired".format(self.name))


async def spawn(self, server_name='', options=None, handler=None):
"""Start the user's spawner

@@ -395,6 +451,9 @@ class User:
"""
db = self.db

if handler:
await self.refresh_auth(handler)

base_url = url_path_join(self.base_url, server_name) + '/'

orm_server = orm.Server(
@@ -436,7 +495,7 @@ class User:

# trigger pre-spawn hook on authenticator
authenticator = self.authenticator
if (authenticator):
if authenticator:
await maybe_future(authenticator.pre_spawn_start(self, spawner))

spawner._start_pending = True
@@ -531,7 +590,7 @@ class User:
self.settings['statsd'].incr('spawner.failure.error')
e.reason = 'error'
try:
await self.stop()
await self.stop(spawner.name)
except Exception:
self.log.error("Failed to cleanup {user}'s server that failed to start".format(
user=self.name,
@@ -550,6 +609,15 @@ class User:
spawner.orm_spawner.state = spawner.get_state()
db.commit()
spawner._waiting_for_response = True
await self._wait_up(spawner)

async def _wait_up(self, spawner):
"""Wait for a server to finish starting.

Shuts the server down if it doesn't respond within
spawner.http_timeout.
"""
server = spawner.server
key = self.settings.get('internal_ssl_key')
cert = self.settings.get('internal_ssl_cert')
ca = self.settings.get('internal_ssl_ca')
@@ -578,7 +646,7 @@ class User:
))
self.settings['statsd'].incr('spawner.failure.http_error')
try:
await self.stop()
await self.stop(spawner.name)
except Exception:
self.log.error("Failed to cleanup {user}'s server that failed to start".format(
user=self.name,
@@ -594,7 +662,7 @@ class User:
finally:
spawner._waiting_for_response = False
spawner._start_pending = False
return self
return spawner

async def stop(self, server_name=''):
"""Stop the user's spawner
@@ -606,6 +674,9 @@ class User:
spawner._start_pending = False
spawner.stop_polling()
spawner._stop_pending = True

self.log.debug("Stopping %s", spawner._log_name)

try:
api_token = spawner.api_token
status = await spawner.poll()
@@ -637,6 +708,7 @@ class User:
self.log.debug("Deleting oauth client %s", oauth_client.identifier)
self.db.delete(oauth_client)
self.db.commit()
self.log.debug("Finished stopping %s", spawner._log_name)
finally:
spawner.orm_spawner.started = None
self.db.commit()
@@ -437,6 +437,9 @@ def print_stacks(file=sys.stderr):
if (
last_frame[0].endswith('threading.py')
and last_frame[-1] == 'waiter.acquire()'
) or (
last_frame[0].endswith('thread.py')
and last_frame[-1].endswith('work_queue.get(block=True)')
):
# thread is waiting on a condition
# call it idle rather than showing the uninteresting stack
@@ -541,3 +544,9 @@ async def iterate_until(deadline_future, generator):
else:
# neither is done, this shouldn't happen
continue


def utcnow():
"""Return timezone-aware utcnow"""
return datetime.now(timezone.utc)
@@ -20,7 +20,7 @@
"prettier": "^1.14.2"
},
"dependencies": {
"bootstrap": "^3.3.7",
"bootstrap": "^3.4.0",
"font-awesome": "^4.7.0",
"jquery": "^3.2.1",
"moment": "^2.19.3",
@@ -1,5 +1,8 @@
[pytest]
minversion = 3.3
# pytest 3.10 has broken minversion checks,
# so we have to disable this until pytest 3.11
# minversion = 3.3

python_files = test_*.py
markers =
gen_test: marks an async tornado test
@@ -5,7 +5,7 @@ traitlets>=4.3.2
tornado>=5.0
jinja2
pamela
oauthlib>=2.0
oauthlib>=2.0,<3
python-dateutil
SQLAlchemy>=1.1
requests
@@ -1,4 +0,0 @@
#!/usr/bin/env python3

from jupyterhub.app import main
main()
@@ -1,6 +0,0 @@
#!/usr/bin/env python3

from jupyterhub.singleuser import main

if __name__ == '__main__':
main()
setup.py
@@ -89,7 +89,6 @@ with open('README.md', encoding="utf8") as f:

setup_args = dict(
name = 'jupyterhub',
scripts = glob(pjoin('scripts', '*')),
packages = packages,
# dummy, so that install_data doesn't get skipped
# this will be overridden when bower is run anyway
@@ -119,7 +118,12 @@ setup_args = dict(
'jupyterhub.spawners': [
'default = jupyterhub.spawner:LocalProcessSpawner',
'localprocess = jupyterhub.spawner:LocalProcessSpawner',
'simple = jupyterhub.spawner:SimpleLocalProcessSpawner',
],
'console_scripts': [
'jupyterhub = jupyterhub.app:main',
'jupyterhub-singleuser = jupyterhub.singleuser:main',
]
},
classifiers = [
'Intended Audience :: Developers',
@@ -111,8 +111,9 @@ require(["jquery", "moment", "jhapi"], function($, moment, JHAPI) {
});
});

$("#new-server-btn").click(function() {
var serverName = $("#new-server-name").val();
$(".new-server-btn").click(function() {
var row = getRow($(this));
var serverName = row.find(".new-server-name").val();
api.start_named_server(user, serverName, {
success: function(reply) {
// reload after creating the server
@@ -26,10 +26,12 @@

<p>
In addition to your default server,
you may have additional servers with names.
you may have additional {% if named_server_limit_per_user > 0 %}{{ named_server_limit_per_user }} {% endif %}server(s) with names.
This allows you to have more than one server running at the same time.
</p>

{% set named_spawners = user.all_spawners(include_default=False)|list %}

<table class="server-table table table-striped">
<thead>
<tr>
@@ -42,13 +44,13 @@
<tbody>
<tr class="home-server-row add-server-row">
<td colspan="4">
<input id="new-server-name" placeholder="Name your server">
<a role="button" id="new-server-btn" class="add-server btn btn-xs btn-primary">
<input class="new-server-name" placeholder="Name your server">
<a role="button" class="new-server-btn" class="add-server btn btn-xs btn-primary">
Add New Server
</a>
</td>
</tr>
{% for spawner in user.all_spawners(include_default=False) %}
{% for spawner in named_spawners %}
<tr class="home-server-row" data-server-name="{{ spawner.name }}">
{# name #}
<td>{{ spawner.name }}</td>
@@ -96,7 +96,11 @@
<nav class="navbar navbar-default">
<div class="container-fluid">
<div class="navbar-header">
<span id="jupyterhub-logo" class="pull-left"><a href="{{logo_url or base_url}}"><img src='{{base_url}}logo' alt='JupyterHub' class='jpy-logo' title='Home'/></a></span>
{% block logo %}
<span id="jupyterhub-logo" class="pull-left">
<a href="{{logo_url or base_url}}"><img src='{{base_url}}logo' alt='JupyterHub' class='jpy-logo' title='Home'/></a>
</span>
{% endblock %}
<button type="button" class="navbar-toggle collapsed" data-toggle="collapse" data-target="#thenavbar" aria-expanded="false">
<span class="sr-only">Toggle navigation</span>
<span class="icon-bar"></span>
@@ -1 +0,0 @@
from .simplespawner import SimpleSpawner
@@ -1,43 +0,0 @@
import os
from traitlets import Unicode

from jupyterhub.spawner import LocalProcessSpawner


class SimpleLocalProcessSpawner(LocalProcessSpawner):
"""
A version of LocalProcessSpawner that doesn't require users to exist on
the system beforehand.

Note: DO NOT USE THIS FOR PRODUCTION USE CASES! It is very insecure, and
provides absolutely no isolation between different users!
"""

home_path_template = Unicode(
'/tmp/{userid}',
config=True,
help='Template to expand to set the user home. {userid} and {username} are expanded'
)

@property
def home_path(self):
return self.home_path_template.format(
userid=self.user.id,
username=self.user.name
)

def make_preexec_fn(self, name):
home = self.home_path
def preexec():
try:
os.makedirs(home, 0o755, exist_ok=True)
os.chdir(home)
except e:
print(e)
return preexec

def user_env(self, env):
env['USER'] = self.user.name
env['HOME'] = self.home_path
env['SHELL'] = '/bin/bash'
return env