Compare commits


20 Commits
1.3.0 ... 1.2.2

Author SHA1 Message Date
Min RK
69bb34b943 release 1.2.2 2020-11-27 14:44:42 +01:00
Min RK
728fbc68e0 Merge pull request #3285 from meeseeksmachine/auto-backport-of-pr-3284-on-1.2.x
Backport PR #3284 on branch 1.2.x (Changelog for 1.2.2)
2020-11-27 14:41:45 +01:00
Min RK
0dad9a3f39 Backport PR #3284: Changelog for 1.2.2 2020-11-27 13:41:32 +00:00
Min RK
41f291c0c9 Merge pull request #3282 from meeseeksmachine/auto-backport-of-pr-3257-on-1.2.x
Backport PR #3257 on branch 1.2.x (Update services-basics.md to use jupyterhub_idle_culler)
2020-11-27 10:05:01 +01:00
Min RK
9a5b11d5e1 Merge pull request #3283 from meeseeksmachine/auto-backport-of-pr-3250-on-1.2.x
Backport PR #3250 on branch 1.2.x (remove push-branch conditions for CI)
2020-11-27 10:04:11 +01:00
Erik Sundell
b47159b31e Backport PR #3250: remove push-branch conditions for CI 2020-11-27 09:03:54 +00:00
Erik Sundell
bbe377b70a Backport PR #3257: Update services-basics.md to use jupyterhub_idle_culler 2020-11-27 08:59:11 +00:00
Min RK
374a3a7b36 Merge pull request #3273 from meeseeksmachine/auto-backport-of-pr-3237-on-1.2.x
Backport PR #3237 on branch 1.2.x ([proxy.py] Improve robustness when detecting and closing existing proxy processes)
2020-11-26 10:01:46 +01:00
Min RK
32c493e5ab Merge pull request #3272 from meeseeksmachine/auto-backport-of-pr-3252-on-1.2.x
Backport PR #3252 on branch 1.2.x (Standardize "Sign in" capitalization on the login page)
2020-11-20 10:34:41 +01:00
Min RK
edfd363758 Merge pull request #3271 from meeseeksmachine/auto-backport-of-pr-3265-on-1.2.x
Backport PR #3265 on branch 1.2.x (Fix RootHandler when default_url is a callable)
2020-11-20 10:34:31 +01:00
Min RK
d72a5ca3e4 Merge pull request #3274 from meeseeksmachine/auto-backport-of-pr-3255-on-1.2.x
Backport PR #3255 on branch 1.2.x (Environment marker on pamela)
2020-11-20 10:34:22 +01:00
Min RK
3a6309a570 Backport PR #3255: Environment marker on pamela 2020-11-20 09:17:45 +00:00
Min RK
588407200f Backport PR #3237: [proxy.py] Improve robustness when detecting and closing existing proxy processes 2020-11-20 09:17:08 +00:00
Min RK
5cc36a6809 Backport PR #3252: Standardize "Sign in" capitalization on the login page 2020-11-20 09:16:57 +00:00
Min RK
5733eb76c2 Backport PR #3265: Fix RootHandler when default_url is a callable 2020-11-20 09:15:46 +00:00
Min RK
d9719e3538 Merge pull request #3269 from meeseeksmachine/auto-backport-of-pr-3261-on-1.2.x
Backport PR #3261 on branch 1.2.x (Only preserve params when ?next= is unspecified)
2020-11-20 10:10:43 +01:00
Min RK
7c91fbea93 Merge pull request #3270 from meeseeksmachine/auto-backport-of-pr-3246-on-1.2.x
Backport PR #3246 on branch 1.2.x (Migrate from travis to GitHub actions)
2020-11-20 10:10:30 +01:00
Min RK
5076745085 back to dev 2020-11-20 09:54:14 +01:00
Min RK
39eea2f053 Backport PR #3246: Migrate from travis to GitHub actions 2020-11-20 08:51:59 +00:00
Min RK
998f5d7b6c Backport PR #3261: Only preserve params when ?next= is unspecified 2020-11-20 08:48:02 +00:00
52 changed files with 348 additions and 654 deletions

View File

@@ -77,10 +77,6 @@ jobs:
# Tests everything when the user instances are started with
# jupyter_server instead of notebook.
#
# ssl:
# Tests everything using internal SSL connections instead of
# unencrypted HTTP
#
# main_dependencies:
# Tests everything when we use the latest available dependencies
# from: ipython/traitlets.
@@ -89,14 +85,10 @@ jobs:
# GitHub UI when the workflow runs, we avoid using true/false as
# values by instead duplicating the name to signal true.
include:
- python: "3.6"
oldest_dependencies: oldest_dependencies
- python: "3.6"
subdomain: subdomain
- python: "3.7"
db: mysql
- python: "3.7"
ssl: ssl
- python: "3.8"
db: postgres
- python: "3.8"
@@ -117,9 +109,6 @@ jobs:
echo "MYSQL_HOST=127.0.0.1" >> $GITHUB_ENV
echo "JUPYTERHUB_TEST_DB_URL=mysql+mysqlconnector://root@127.0.0.1:3306/jupyterhub" >> $GITHUB_ENV
fi
if [ "${{ matrix.ssl }}" == "ssl" ]; then
echo "SSL_ENABLED=1" >> $GITHUB_ENV
fi
if [ "${{ matrix.db }}" == "postgres" ]; then
echo "PGHOST=127.0.0.1" >> $GITHUB_ENV
echo "PGUSER=test_user" >> $GITHUB_ENV
@@ -154,14 +143,6 @@ jobs:
pip install --upgrade pip
pip install --upgrade . -r dev-requirements.txt
if [ "${{ matrix.oldest_dependencies }}" != "" ]; then
# take any dependencies in requirements.txt such as tornado>=5.0
# and transform them to tornado==5.0 so we can run tests with
# the earliest-supported versions
cat requirements.txt | grep '>=' | sed -e 's@>=@==@g' > oldest-requirements.txt
pip install -r oldest-requirements.txt
fi
if [ "${{ matrix.main_dependencies }}" != "" ]; then
pip install git+https://github.com/ipython/traitlets#egg=traitlets --force
fi
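
The grep/sed step above pins each `>=` requirement to its exact minimum version. As a sketch, a rough Python equivalent (assuming a `requirements.txt` in the working directory; file names mirror the workflow):

```python
# Pin ">=" requirements to their minimum versions, mirroring the
# `grep '>=' | sed -e 's@>=@==@g'` step above (e.g. tornado>=5.0 -> tornado==5.0).
with open("requirements.txt") as src:
    pins = [line.replace(">=", "==") for line in src if ">=" in line]

with open("oldest-requirements.txt", "w") as dst:
    dst.writelines(pins)
```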

.gitignore vendored
View File

@@ -28,4 +28,3 @@ htmlcov
.pytest_cache
pip-wheel-metadata
docs/source/reference/metrics.rst
oldest-requirements.txt

View File

@@ -4,7 +4,7 @@ repos:
hooks:
- id: reorder-python-imports
- repo: https://github.com/psf/black
rev: 20.8b1
rev: 19.10b0
hooks:
- id: black
- repo: https://github.com/pre-commit/pre-commit-hooks

View File

@@ -1,7 +1,7 @@
# Contributing to JupyterHub
Welcome! As a [Jupyter](https://jupyter.org) project,
you can follow the [Jupyter contributor guide](https://jupyter.readthedocs.io/en/latest/contributing/content-contributor.html).
you can follow the [Jupyter contributor guide](https://jupyter.readthedocs.io/en/latest/contributor/content-contributor.html).
Make sure to also follow [Project Jupyter's Code of Conduct](https://github.com/jupyter/governance/blob/master/conduct/code_of_conduct.md)
for a friendly and welcoming collaborative environment.

View File

@@ -79,21 +79,6 @@ paths:
/users:
get:
summary: List users
parameters:
- name: state
in: query
required: false
type: string
enum: ["inactive", "active", "ready"]
description: |
Return only users who have servers in the given state.
If unspecified, return all users.
active: all users with any active servers (ready OR pending)
ready: all users who have any ready servers (running, not pending)
inactive: all users who have *no* active servers (complement of active)
Added in JupyterHub 1.3
responses:
'200':
description: The Hub's user list

View File

@@ -7,62 +7,6 @@ command line for details.
## [Unreleased]
### 1.3
JupyterHub 1.3 is a small feature release. Highlights include:
- Require Python >=3.6 (jupyterhub 1.2 is the last release to support 3.5)
- Add a `?state=` filter for getting user list, allowing much quicker responses
when retrieving a small fraction of users.
`state` can be `active`, `inactive`, or `ready`.
- prometheus metrics now include a `jupyterhub_` prefix,
so deployments may need to update their grafana charts to match.
- page templates can now be [async](https://jinja.palletsprojects.com/en/2.11.x/api/#async-support)!
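
As an illustration of the new `?state=` filter, a minimal sketch against the REST API (the URL and token below are placeholders, not values from this release):

```python
import requests

# Placeholders: your Hub's API URL and an admin API token.
api_url = "http://127.0.0.1:8081/hub/api"
token = "<admin-token>"

# Return only users with ready servers instead of the full user list.
r = requests.get(
    api_url + "/users",
    params={"state": "ready"},  # one of: active, inactive, ready
    headers={"Authorization": "token " + token},
)
r.raise_for_status()
print([u["name"] for u in r.json()])
```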
### [1.3.0]
([full changelog](https://github.com/jupyterhub/jupyterhub/compare/1.2.1...1.3.0))
#### Enhancements made
* allow services to call /api/user to identify themselves [#3293](https://github.com/jupyterhub/jupyterhub/pull/3293) ([@minrk](https://github.com/minrk))
* Add optional user agreement to login screen [#3264](https://github.com/jupyterhub/jupyterhub/pull/3264) ([@tlvu](https://github.com/tlvu))
* [Metrics] Add prefix to prometheus metrics to group all jupyterhub metrics [#3243](https://github.com/jupyterhub/jupyterhub/pull/3243) ([@agp8x](https://github.com/agp8x))
* Allow options_from_form to be configurable [#3225](https://github.com/jupyterhub/jupyterhub/pull/3225) ([@cbanek](https://github.com/cbanek))
* add ?state= filter for GET /users [#3177](https://github.com/jupyterhub/jupyterhub/pull/3177) ([@minrk](https://github.com/minrk))
* Enable async support in jinja2 templates [#3176](https://github.com/jupyterhub/jupyterhub/pull/3176) ([@yuvipanda](https://github.com/yuvipanda))
#### Bugs fixed
* fix increasing pagination limits [#3294](https://github.com/jupyterhub/jupyterhub/pull/3294) ([@minrk](https://github.com/minrk))
* fix and test TOTAL_USERS count [#3289](https://github.com/jupyterhub/jupyterhub/pull/3289) ([@minrk](https://github.com/minrk))
* Fix asyncio deprecation asyncio.Task.all_tasks [#3298](https://github.com/jupyterhub/jupyterhub/pull/3298) ([@coffeebenzene](https://github.com/coffeebenzene))
#### Maintenance and upkeep improvements
* bump oldest-required prometheus-client [#3292](https://github.com/jupyterhub/jupyterhub/pull/3292) ([@minrk](https://github.com/minrk))
* bump black pre-commit hook to 20.8 [#3287](https://github.com/jupyterhub/jupyterhub/pull/3287) ([@minrk](https://github.com/minrk))
* Test internal_ssl separately [#3266](https://github.com/jupyterhub/jupyterhub/pull/3266) ([@0mar](https://github.com/0mar))
* wait for pending spawns in spawn_form_admin_access [#3253](https://github.com/jupyterhub/jupyterhub/pull/3253) ([@minrk](https://github.com/minrk))
* Assume py36 and remove @gen.coroutine etc. [#3242](https://github.com/jupyterhub/jupyterhub/pull/3242) ([@consideRatio](https://github.com/consideRatio))
#### Documentation improvements
* Fix curl in jupyter announcements [#3286](https://github.com/jupyterhub/jupyterhub/pull/3286) ([@Sangarshanan](https://github.com/Sangarshanan))
* CONTRIBUTING: Fix contributor guide URL [#3281](https://github.com/jupyterhub/jupyterhub/pull/3281) ([@olifre](https://github.com/olifre))
* Update services.md [#3267](https://github.com/jupyterhub/jupyterhub/pull/3267) ([@slemonide](https://github.com/slemonide))
* [Docs] Fix https reverse proxy redirect issues [#3244](https://github.com/jupyterhub/jupyterhub/pull/3244) ([@mhwasil](https://github.com/mhwasil))
* Fixed idle-culler references. [#3300](https://github.com/jupyterhub/jupyterhub/pull/3300) ([@mxjeff](https://github.com/mxjeff))
* Remove the extra parenthesis in service.md [#3303](https://github.com/jupyterhub/jupyterhub/pull/3303) ([@Sangarshanan](https://github.com/Sangarshanan))
#### Contributors to this release
([GitHub contributors page for this release](https://github.com/jupyterhub/jupyterhub/graphs/contributors?from=2020-10-30&to=2020-12-11&type=c))
[@0mar](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3A0mar+updated%3A2020-10-30..2020-12-11&type=Issues) | [@agp8x](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aagp8x+updated%3A2020-10-30..2020-12-11&type=Issues) | [@alexweav](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aalexweav+updated%3A2020-10-30..2020-12-11&type=Issues) | [@belfhi](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Abelfhi+updated%3A2020-10-30..2020-12-11&type=Issues) | [@betatim](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Abetatim+updated%3A2020-10-30..2020-12-11&type=Issues) | [@cbanek](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Acbanek+updated%3A2020-10-30..2020-12-11&type=Issues) | [@cmd-ntrf](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Acmd-ntrf+updated%3A2020-10-30..2020-12-11&type=Issues) | [@coffeebenzene](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Acoffeebenzene+updated%3A2020-10-30..2020-12-11&type=Issues) | [@consideRatio](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AconsideRatio+updated%3A2020-10-30..2020-12-11&type=Issues) | [@danlester](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Adanlester+updated%3A2020-10-30..2020-12-11&type=Issues) | [@fcollonval](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Afcollonval+updated%3A2020-10-30..2020-12-11&type=Issues) | [@GeorgianaElena](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AGeorgianaElena+updated%3A2020-10-30..2020-12-11&type=Issues) | [@ianabc](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aianabc+updated%3A2020-10-30..2020-12-11&type=Issues) | [@IvanaH8](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AIvanaH8+updated%3A2020-10-30..2020-12-11&type=Issues) | [@manics](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amanics+updated%3A2020-10-30..2020-12-11&type=Issues) | [@meeseeksmachine](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ameeseeksmachine+updated%3A2020-10-30..2020-12-11&type=Issues) | [@mhwasil](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amhwasil+updated%3A2020-10-30..2020-12-11&type=Issues) | [@minrk](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aminrk+updated%3A2020-10-30..2020-12-11&type=Issues) | [@mriedem](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amriedem+updated%3A2020-10-30..2020-12-11&type=Issues) | [@mxjeff](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amxjeff+updated%3A2020-10-30..2020-12-11&type=Issues) | [@olifre](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aolifre+updated%3A2020-10-30..2020-12-11&type=Issues) | [@rcthomas](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Arcthomas+updated%3A2020-10-30..2020-12-11&type=Issues) | [@rgbkrk](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Argbkrk+updated%3A2020-10-30..2020-12-11&type=Issues) | [@rkdarst](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Arkdarst+updated%3A2020-10-30..2020-12-11&type=Issues) | [@Sangarshanan](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3ASangarshanan+updated%3A2020-10-30..2020-12-11&type=Issues) | 
[@slemonide](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aslemonide+updated%3A2020-10-30..2020-12-11&type=Issues) | [@support](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Asupport+updated%3A2020-10-30..2020-12-11&type=Issues) | [@tlvu](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Atlvu+updated%3A2020-10-30..2020-12-11&type=Issues) | [@welcome](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Awelcome+updated%3A2020-10-30..2020-12-11&type=Issues) | [@yuvipanda](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ayuvipanda+updated%3A2020-10-30..2020-12-11&type=Issues)
## 1.2
### [1.2.2] 2020-11-27
@@ -967,9 +911,7 @@ Fix removal of `/login` page in 0.4.0, breaking some OAuth providers.
First preview release
[Unreleased]: https://github.com/jupyterhub/jupyterhub/compare/1.3.0...HEAD
[1.3.0]: https://github.com/jupyterhub/jupyterhub/compare/1.2.1...1.3.0
[1.2.2]: https://github.com/jupyterhub/jupyterhub/compare/1.2.1...1.2.2
[Unreleased]: https://github.com/jupyterhub/jupyterhub/compare/1.2.1...HEAD
[1.2.1]: https://github.com/jupyterhub/jupyterhub/compare/1.2.0...1.2.1
[1.2.0]: https://github.com/jupyterhub/jupyterhub/compare/1.1.0...1.2.0
[1.1.0]: https://github.com/jupyterhub/jupyterhub/compare/1.0.0...1.1.0

View File

@@ -235,9 +235,10 @@ to Spawner environment:
```python
class MyAuthenticator(Authenticator):
async def authenticate(self, handler, data=None):
username = await identify_user(handler, data)
upstream_token = await token_for_user(username)
@gen.coroutine
def authenticate(self, handler, data=None):
username = yield identify_user(handler, data)
upstream_token = yield token_for_user(username)
return {
'name': username,
'auth_state': {
@@ -245,9 +246,10 @@ class MyAuthenticator(Authenticator):
},
}
async def pre_spawn_start(self, user, spawner):
@gen.coroutine
def pre_spawn_start(self, user, spawner):
"""Pass upstream_token to spawner via environment variable"""
auth_state = await user.get_auth_state()
auth_state = yield user.get_auth_state()
if not auth_state:
# auth_state not enabled
return

View File

@@ -86,7 +86,6 @@ server {
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $connection_upgrade;
proxy_set_header X-Scheme $scheme;
proxy_buffering off;
}

View File

@@ -50,7 +50,7 @@ A Service may have the following properties:
If a service is also to be managed by the Hub, it has a few extra options:
- `command: (str/Popen list)` - Command for JupyterHub to spawn the service.
- `command: (str/Popen list`) - Command for JupyterHub to spawn the service.
- Only use this if the service should be a subprocess.
- If command is not specified, the Service is assumed to be managed
externally.
@@ -91,9 +91,9 @@ This example would be configured as follows in `jupyterhub_config.py`:
```python
c.JupyterHub.services = [
{
'name': 'idle-culler',
'name': 'cull-idle',
'admin': True,
'command': [sys.executable, '-m', 'jupyterhub_idle_culler', '--timeout=3600']
'command': [sys.executable, '/path/to/cull-idle.py', '--timeout']
}
]
```
@@ -123,14 +123,15 @@ For the previous 'cull idle' Service example, these environment variables
would be passed to the Service when the Hub starts the 'cull idle' Service:
```bash
JUPYTERHUB_SERVICE_NAME: 'idle-culler'
JUPYTERHUB_SERVICE_NAME: 'cull-idle'
JUPYTERHUB_API_TOKEN: API token assigned to the service
JUPYTERHUB_API_URL: http://127.0.0.1:8080/hub/api
JUPYTERHUB_BASE_URL: https://mydomain[:port]
JUPYTERHUB_SERVICE_PREFIX: /services/idle-culler/
JUPYTERHUB_SERVICE_PREFIX: /services/cull-idle/
```
See the GitHub repo for additional information about the [jupyterhub_idle_culler][].
See the JupyterHub GitHub repo for additional information about the
[`cull-idle` example](https://github.com/jupyterhub/jupyterhub/tree/master/examples/cull-idle).
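
For illustration, a minimal sketch of how a managed service can use the injected variables above to call the Hub API (hypothetical code, not the actual idle-culler implementation):

```python
import os
import requests

# Both variables are injected by the Hub when it starts a managed service.
api_url = os.environ["JUPYTERHUB_API_URL"]
token = os.environ["JUPYTERHUB_API_TOKEN"]

# e.g. list users, as an idle-culling service would before checking activity
# (requires the service to be configured with admin rights, as above)
r = requests.get(api_url + "/users", headers={"Authorization": "token " + token})
r.raise_for_status()
print(len(r.json()), "users")
```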
## Externally-Managed Services
@@ -339,7 +340,7 @@ and taking note of the following process:
```python
r = requests.get(
'/'.join(["http://127.0.0.1:8081/hub/api",
'/'.join((["http://127.0.0.1:8081/hub/api",
"authorizations/cookie/jupyterhub-services",
quote(encrypted_cookie, safe=''),
]),
@@ -375,4 +376,3 @@ section on securing the notebook viewer.
[HubAuth.user_for_token]: ../api/services.auth.html#jupyterhub.services.auth.HubAuth.user_for_token
[HubAuthenticated]: ../api/services.auth.html#jupyterhub.services.auth.HubAuthenticated
[nbviewer example]: https://github.com/jupyter/nbviewer#securing-the-notebook-viewer
[jupyterhub_idle_culler]: https://github.com/jupyterhub/jupyterhub-idle-culler

View File

@@ -27,7 +27,7 @@ that environment variable is set or `/` if it is not.
Admin users can set the announcement text with an API token:
$ curl -X POST -H "Authorization: token <token>" \
-d '{"announcement":"JupyterHub will be upgraded on August 14!"}' \
-d "{'announcement':'JupyterHub will be upgraded on August 14!'}" \
https://.../services/announcement
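
A Python equivalent of the curl call above, as a sketch with placeholder URL and token. Note that only the double-quoted body (the 1.3.0 side, from PR #3286 "Fix curl in jupyter announcements") is valid JSON:

```python
import requests

# Placeholders: the announcement service URL and an admin API token.
url = "https://hub.example.com/services/announcement"
token = "<admin-token>"

# `json=` serializes with double quotes, i.e. valid JSON; the single-quoted
# 1.2.2 payload above is not valid JSON, so the service fails to parse it.
r = requests.post(
    url,
    headers={"Authorization": "token " + token},
    json={"announcement": "JupyterHub will be upgraded on August 14!"},
)
r.raise_for_status()
```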
Anyone can read the announcement:

View File

@@ -4,8 +4,8 @@
version_info = (
1,
3,
0,
2,
2,
"", # release (b1, rc1, or "" for final or dev)
# "dev", # dev or nothing for beta/rc/stable releases
)

View File

@@ -253,7 +253,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
# Render oauth 'Authorize application...' page
auth_state = await self.current_user.get_auth_state()
self.write(
await self.render_template(
self.render_template(
"oauth.html",
auth_state=auth_state,
scopes=scopes,

View File

@@ -16,9 +16,9 @@ class ShutdownAPIHandler(APIHandler):
@admin_only
def post(self):
"""POST /api/shutdown triggers a clean shutdown
POST (JSON) parameters:
- servers: specify whether single-user servers should be terminated
- proxy: specify whether the proxy should be terminated
"""
@@ -57,7 +57,7 @@ class RootAPIHandler(APIHandler):
"""GET /api/ returns info about the Hub and its API.
It is not an authenticated endpoint.
For now, it just returns the version of JupyterHub itself.
"""
data = {'version': __version__}
@@ -70,7 +70,7 @@ class InfoAPIHandler(APIHandler):
"""GET /api/info returns detailed info about the Hub and its API.
It is not an authenticated endpoint.
For now, it just returns the version of JupyterHub itself.
"""

View File

@@ -9,7 +9,6 @@ from datetime import timezone
from async_generator import aclosing
from dateutil.parser import parse as parse_date
from sqlalchemy import func
from tornado import web
from tornado.iostream import StreamClosedError
@@ -36,69 +35,15 @@ class SelfAPIHandler(APIHandler):
user = self.get_current_user_oauth_token()
if user is None:
raise web.HTTPError(403)
if isinstance(user, orm.Service):
model = self.service_model(user)
else:
model = self.user_model(user)
self.write(json.dumps(model))
self.write(json.dumps(self.user_model(user)))
class UserListAPIHandler(APIHandler):
def _user_has_ready_spawner(self, orm_user):
"""Return True if a user has *any* ready spawners
Used for filtering from active -> ready
"""
user = self.users[orm_user]
return any(spawner.ready for spawner in user.spawners.values())
@admin_only
def get(self):
state_filter = self.get_argument("state", None)
# post_filter
post_filter = None
if state_filter in {"active", "ready"}:
# only get users with active servers
# an 'active' Spawner has a server record in the database
# which means Spawner.server != None
# it may still be in a pending start/stop state.
# join filters out users with no Spawners
query = (
self.db.query(orm.User)
# join filters out any Users with no Spawners
.join(orm.Spawner)
# this implicitly gets Users with *any* active server
.filter(orm.Spawner.server != None)
)
if state_filter == "ready":
# have to post-process query results because active vs ready
# can only be distinguished with in-memory Spawner properties
post_filter = self._user_has_ready_spawner
elif state_filter == "inactive":
# only get users with *no* active servers
# as opposed to users with *any inactive servers*
# this is the complement to the above query.
# how expensive is this with lots of servers?
query = (
self.db.query(orm.User)
.outerjoin(orm.Spawner)
.outerjoin(orm.Server)
.group_by(orm.User.id)
.having(func.count(orm.Server.id) == 0)
)
elif state_filter:
raise web.HTTPError(400, "Unrecognized state filter: %r" % state_filter)
else:
# no filter, return all users
query = self.db.query(orm.User)
data = [
self.user_model(u, include_servers=True, include_state=True)
for u in query
if (post_filter is None or post_filter(u))
for u in self.db.query(orm.User)
]
self.write(json.dumps(data))
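
The `inactive` branch above keeps users with no Spawner/Server rows via an outer join, then filters on a zero server count. A self-contained sketch of that query shape (illustrative tables, not JupyterHub's actual schema; assumes SQLAlchemy 1.4+):

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine, func
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String)

class Server(Base):
    __tablename__ = "servers"
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey("users.id"))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# "inactive" = users with *no* server rows at all:
# LEFT OUTER JOIN keeps server-less users; HAVING count == 0 keeps only them.
inactive = (
    session.query(User)
    .outerjoin(Server)
    .group_by(User.id)
    .having(func.count(Server.id) == 0)
    .all()
)
```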

View File

@@ -29,14 +29,6 @@ from urllib.parse import urlunparse
if sys.version_info[:2] < (3, 3):
raise ValueError("Python < 3.3 not supported: %s" % sys.version)
# For compatibility with python versions 3.6 or earlier.
# asyncio.Task.all_tasks() is fully moved to asyncio.all_tasks() starting with 3.9. Also applies to current_task.
try:
asyncio_all_tasks = asyncio.all_tasks
asyncio_current_task = asyncio.current_task
except AttributeError as e:
asyncio_all_tasks = asyncio.Task.all_tasks
asyncio_current_task = asyncio.Task.current_task
from dateutil.parser import parse as parse_date
from jinja2 import Environment, FileSystemLoader, PrefixLoader, ChoiceLoader
@@ -2128,7 +2120,7 @@ class JupyterHub(Application):
self.log.debug(
"Awaiting checks for %i possibly-running spawners", len(check_futures)
)
await asyncio.gather(*check_futures)
await gen.multi(check_futures)
db.commit()
# only perform this query if we are going to log it
@@ -2195,7 +2187,7 @@ class JupyterHub(Application):
def init_tornado_settings(self):
"""Set up the tornado settings dict."""
base_url = self.hub.base_url
jinja_options = dict(autoescape=True, enable_async=True)
jinja_options = dict(autoescape=True)
jinja_options.update(self.jinja_environment_options)
base_path = self._template_paths_default()[0]
if base_path not in self.template_paths:
@@ -2207,14 +2199,6 @@ class JupyterHub(Application):
]
)
jinja_env = Environment(loader=loader, **jinja_options)
# We need a sync jinja environment too, for the times we *must* use sync
# code - particularly in RequestHandler.write_error. Since *that*
# is called from inside the asyncio event loop, we can't actually just
# schedule it on the loop - without starting another thread with its
# own loop, which seems not worth the trouble. Instead, we create another
# environment, exactly like this one, but sync
del jinja_options['enable_async']
jinja_env_sync = Environment(loader=loader, **jinja_options)
login_url = url_path_join(base_url, 'login')
logout_url = self.authenticator.logout_url(base_url)
@@ -2261,7 +2245,6 @@ class JupyterHub(Application):
template_path=self.template_paths,
template_vars=self.template_vars,
jinja2_env=jinja_env,
jinja2_env_sync=jinja_env_sync,
version_hash=version_hash,
subdomain_host=self.subdomain_host,
domain=self.domain,
@@ -2825,7 +2808,9 @@ class JupyterHub(Application):
async def shutdown_cancel_tasks(self, sig):
"""Cancel all other tasks of the event loop and initiate cleanup"""
self.log.critical("Received signal %s, initiating shutdown...", sig.name)
tasks = [t for t in asyncio_all_tasks() if t is not asyncio_current_task()]
tasks = [
t for t in asyncio.Task.all_tasks() if t is not asyncio.Task.current_task()
]
if tasks:
self.log.debug("Cancelling pending tasks")
@@ -2838,7 +2823,7 @@ class JupyterHub(Application):
except StopAsyncIteration as e:
self.log.error("Caught StopAsyncIteration Exception", exc_info=True)
tasks = [t for t in asyncio_all_tasks()]
tasks = [t for t in asyncio.Task.all_tasks()]
for t in tasks:
self.log.debug("Task status: %s", t)
await self.cleanup()

View File

@@ -101,10 +101,7 @@ class Authenticator(LoggingConfigurable):
"""
).tag(config=True)
whitelist = Set(
help="Deprecated, use `Authenticator.allowed_users`",
config=True,
)
whitelist = Set(help="Deprecated, use `Authenticator.allowed_users`", config=True,)
allowed_users = Set(
help="""
@@ -718,9 +715,7 @@ for _old_name, _new_name, _version in [
("check_blacklist", "check_blocked_users", "1.2"),
]:
setattr(
Authenticator,
_old_name,
_deprecated_method(_old_name, _new_name, _version),
Authenticator, _old_name, _deprecated_method(_old_name, _new_name, _version),
)
@@ -783,9 +778,7 @@ class LocalAuthenticator(Authenticator):
"""
).tag(config=True)
group_whitelist = Set(
help="""DEPRECATED: use allowed_groups""",
).tag(config=True)
group_whitelist = Set(help="""DEPRECATED: use allowed_groups""",).tag(config=True)
allowed_groups = Set(
help="""

View File

@@ -40,7 +40,6 @@ from ..metrics import SERVER_STOP_DURATION_SECONDS
from ..metrics import ServerPollStatus
from ..metrics import ServerSpawnStatus
from ..metrics import ServerStopStatus
from ..metrics import TOTAL_USERS
from ..objects import Server
from ..spawner import LocalProcessSpawner
from ..user import User
@@ -454,7 +453,6 @@ class BaseHandler(RequestHandler):
# not found, create and register user
u = orm.User(name=username)
self.db.add(u)
TOTAL_USERS.inc()
self.db.commit()
user = self._user_from_orm(u)
return user
@@ -491,7 +489,7 @@ class BaseHandler(RequestHandler):
self.clear_cookie(
'jupyterhub-services',
path=url_path_join(self.base_url, 'services'),
**kwargs,
**kwargs
)
# Reset _jupyterhub_user
self._jupyterhub_user = None
@@ -1167,36 +1165,16 @@ class BaseHandler(RequestHandler):
"<a href='{home}'>home page</a>.".format(home=home)
)
def get_template(self, name, sync=False):
"""
Return the jinja template object for a given name
def get_template(self, name):
"""Return the jinja template object for a given name"""
return self.settings['jinja2_env'].get_template(name)
If sync is True, we return a Template that is compiled without async support.
Only those can be used in synchronous code.
If sync is False, we return a Template that is compiled with async support
"""
if sync:
key = 'jinja2_env_sync'
else:
key = 'jinja2_env'
return self.settings[key].get_template(name)
def render_template(self, name, sync=False, **ns):
"""
Render jinja2 template
If sync is set to True, we return an awaitable
If sync is set to False, we render the template & return a string
"""
def render_template(self, name, **ns):
template_ns = {}
template_ns.update(self.template_namespace)
template_ns.update(ns)
template = self.get_template(name, sync)
if sync:
return template.render(**template_ns)
else:
return template.render_async(**template_ns)
template = self.get_template(name)
return template.render(**template_ns)
@property
def template_namespace(self):
@@ -1271,19 +1249,17 @@ class BaseHandler(RequestHandler):
# Content-Length must be recalculated.
self.clear_header('Content-Length')
# render_template is async, but write_error can't be!
# so we run it sync here, instead of making a sync version of render_template
# render the template
try:
html = self.render_template('%s.html' % status_code, sync=True, **ns)
html = self.render_template('%s.html' % status_code, **ns)
except TemplateNotFound:
self.log.debug("No template for %d", status_code)
try:
html = self.render_template('error.html', sync=True, **ns)
html = self.render_template('error.html', **ns)
except:
# In this case, any side effect must be avoided.
ns['no_spawner_check'] = True
html = self.render_template('error.html', sync=True, **ns)
html = self.render_template('error.html', **ns)
self.write(html)
@@ -1487,14 +1463,10 @@ class UserUrlHandler(BaseHandler):
# if request is expecting JSON, assume it's an API request and fail with 503
# because it won't like the redirect to the pending page
if (
get_accepted_mimetype(
self.request.headers.get('Accept', ''),
choices=['application/json', 'text/html'],
)
== 'application/json'
or 'api' in user_path.split('/')
):
if get_accepted_mimetype(
self.request.headers.get('Accept', ''),
choices=['application/json', 'text/html'],
) == 'application/json' or 'api' in user_path.split('/'):
self._fail_api_request(user_name, server_name)
return
@@ -1520,7 +1492,7 @@ class UserUrlHandler(BaseHandler):
self.set_status(503)
auth_state = await user.get_auth_state()
html = await self.render_template(
html = self.render_template(
"not_running.html",
user=user,
server_name=server_name,
@@ -1575,7 +1547,7 @@ class UserUrlHandler(BaseHandler):
if redirects:
self.log.warning("Redirect loop detected on %s", self.request.uri)
# add capped exponential backoff where cap is 10s
await asyncio.sleep(min(1 * (2 ** redirects), 10))
await gen.sleep(min(1 * (2 ** redirects), 10))
# rewrite target url with new `redirects` query value
url_parts = urlparse(target)
query_parts = parse_qs(url_parts.query)

View File

@@ -72,14 +72,14 @@ class LogoutHandler(BaseHandler):
Override this function to set a custom logout page.
"""
if self.authenticator.auto_login:
html = await self.render_template('logout.html')
html = self.render_template('logout.html')
self.finish(html)
else:
self.redirect(self.settings['login_url'], permanent=False)
async def get(self):
"""Log the user out, call the custom action, forward the user
to the logout page
to the logout page
"""
await self.default_handle_logout()
await self.handle_logout()
@@ -132,7 +132,7 @@ class LoginHandler(BaseHandler):
self.redirect(auto_login_url)
return
username = self.get_argument('username', default='')
self.finish(await self._render(username=username))
self.finish(self._render(username=username))
async def post(self):
# parse the arguments dict
@@ -149,7 +149,7 @@ class LoginHandler(BaseHandler):
self._jupyterhub_user = user
self.redirect(self.get_next_url(user))
else:
html = await self._render(
html = self._render(
login_error='Invalid username or password', username=data['username']
)
self.finish(html)

View File

@@ -71,7 +71,7 @@ class HomeHandler(BaseHandler):
url = url_path_join(self.hub.base_url, 'spawn', user.escaped_name)
auth_state = await user.get_auth_state()
html = await self.render_template(
html = self.render_template(
'home.html',
auth_state=auth_state,
user=user,
@@ -98,7 +98,7 @@ class SpawnHandler(BaseHandler):
async def _render_form(self, for_user, spawner_options_form, message=''):
auth_state = await for_user.get_auth_state()
return await self.render_template(
return self.render_template(
'spawn.html',
for_user=for_user,
auth_state=auth_state,
@@ -382,7 +382,7 @@ class SpawnPendingHandler(BaseHandler):
self.hub.base_url, "spawn", user.escaped_name, server_name
)
self.set_status(500)
html = await self.render_template(
html = self.render_template(
"not_running.html",
user=user,
auth_state=auth_state,
@@ -406,7 +406,7 @@ class SpawnPendingHandler(BaseHandler):
page = "stop_pending.html"
else:
page = "spawn_pending.html"
html = await self.render_template(
html = self.render_template(
page,
user=user,
spawner=spawner,
@@ -433,7 +433,7 @@ class SpawnPendingHandler(BaseHandler):
spawn_url = url_path_join(
self.hub.base_url, "spawn", user.escaped_name, server_name
)
html = await self.render_template(
html = self.render_template(
"not_running.html",
user=user,
auth_state=auth_state,
@@ -457,8 +457,7 @@ class AdminHandler(BaseHandler):
@web.authenticated
@admin_only
async def get(self):
pagination = Pagination(url=self.request.uri, config=self.config)
page, per_page, offset = pagination.get_page_args(self)
page, per_page, offset = Pagination(config=self.config).get_page_args(self)
available = {'name', 'admin', 'running', 'last_activity'}
default_sort = ['admin', 'name']
@@ -514,10 +513,17 @@ class AdminHandler(BaseHandler):
for u in users:
running.extend(s for s in u.spawners.values() if s.active)
pagination.total = self.db.query(orm.User.id).count()
total = self.db.query(orm.User.id).count()
pagination = Pagination(
url=self.request.uri,
total=total,
page=page,
per_page=per_page,
config=self.config,
)
auth_state = await self.current_user.get_auth_state()
html = await self.render_template(
html = self.render_template(
'admin.html',
current_user=self.current_user,
auth_state=auth_state,
@@ -607,7 +613,7 @@ class TokenPageHandler(BaseHandler):
oauth_clients = sorted(oauth_clients, key=sort_key, reverse=True)
auth_state = await self.current_user.get_auth_state()
html = await self.render_template(
html = self.render_template(
'token.html',
api_tokens=api_tokens,
oauth_clients=oauth_clients,
@@ -619,7 +625,7 @@ class TokenPageHandler(BaseHandler):
class ProxyErrorHandler(BaseHandler):
"""Handler for rendering proxy error pages"""
async def get(self, status_code_s):
def get(self, status_code_s):
status_code = int(status_code_s)
status_message = responses.get(status_code, 'Unknown HTTP Error')
# build template namespace
@@ -643,10 +649,10 @@ class ProxyErrorHandler(BaseHandler):
self.set_header('Content-Type', 'text/html')
# render the template
try:
html = await self.render_template('%s.html' % status_code, **ns)
html = self.render_template('%s.html' % status_code, **ns)
except TemplateNotFound:
self.log.debug("No template for %d", status_code)
html = await self.render_template('error.html', **ns)
html = self.render_template('error.html', **ns)
self.write(html)

View File

@@ -7,7 +7,7 @@ from tornado.web import StaticFileHandler
class CacheControlStaticFilesHandler(StaticFileHandler):
"""StaticFileHandler subclass that sets Cache-Control: no-cache without `?v=`
rather than relying on default browser cache behavior.
"""

View File

@@ -3,9 +3,9 @@ Prometheus metrics exported by JupyterHub
Read https://prometheus.io/docs/practices/naming/ for naming
conventions for metrics & labels. We generally prefer naming them
`jupyterhub_<noun>_<verb>_<type_suffix>`. So a histogram that's tracking
`<noun>_<verb>_<type_suffix>`. So a histogram that's tracking
the duration (in seconds) of servers spawning would be called
jupyterhub_server_spawn_duration_seconds.
SERVER_SPAWN_DURATION_SECONDS.
We also create an Enum for each 'status' type label in every metric
we collect. This is to make sure that the metrics exist regardless
@@ -14,10 +14,6 @@ create them, the metric spawn_duration_seconds{status="failure"}
will not actually exist until the first failure. This makes dashboarding
and alerting difficult, so we explicitly list statuses and create
them manually here.
.. versionchanged:: 1.3
added ``jupyterhub_`` prefix to metric names.
"""
from enum import Enum
@@ -25,13 +21,13 @@ from prometheus_client import Gauge
from prometheus_client import Histogram
REQUEST_DURATION_SECONDS = Histogram(
'jupyterhub_request_duration_seconds',
'request_duration_seconds',
'request duration for all HTTP requests',
['method', 'handler', 'code'],
)
SERVER_SPAWN_DURATION_SECONDS = Histogram(
'jupyterhub_server_spawn_duration_seconds',
'server_spawn_duration_seconds',
'time taken for server spawning operation',
['status'],
# Use custom bucket sizes, since the default bucket ranges
@@ -40,27 +36,25 @@ SERVER_SPAWN_DURATION_SECONDS = Histogram(
)
RUNNING_SERVERS = Gauge(
'jupyterhub_running_servers', 'the number of user servers currently running'
'running_servers', 'the number of user servers currently running'
)
TOTAL_USERS = Gauge('jupyterhub_total_users', 'total number of users')
TOTAL_USERS = Gauge('total_users', 'total number of users')
CHECK_ROUTES_DURATION_SECONDS = Histogram(
'jupyterhub_check_routes_duration_seconds',
'Time taken to validate all routes in proxy',
'check_routes_duration_seconds', 'Time taken to validate all routes in proxy'
)
HUB_STARTUP_DURATION_SECONDS = Histogram(
'jupyterhub_hub_startup_duration_seconds', 'Time taken for Hub to start'
'hub_startup_duration_seconds', 'Time taken for Hub to start'
)
INIT_SPAWNERS_DURATION_SECONDS = Histogram(
'jupyterhub_init_spawners_duration_seconds', 'Time taken for spawners to initialize'
'init_spawners_duration_seconds', 'Time taken for spawners to initialize'
)
PROXY_POLL_DURATION_SECONDS = Histogram(
'jupyterhub_proxy_poll_duration_seconds',
'duration for polling all routes from proxy',
'proxy_poll_duration_seconds', 'duration for polling all routes from proxy'
)
@@ -85,9 +79,7 @@ for s in ServerSpawnStatus:
PROXY_ADD_DURATION_SECONDS = Histogram(
'jupyterhub_proxy_add_duration_seconds',
'duration for adding user routes to proxy',
['status'],
'proxy_add_duration_seconds', 'duration for adding user routes to proxy', ['status']
)
@@ -108,7 +100,7 @@ for s in ProxyAddStatus:
SERVER_POLL_DURATION_SECONDS = Histogram(
'jupyterhub_server_poll_duration_seconds',
'server_poll_duration_seconds',
'time taken to poll if server is running',
['status'],
)
@@ -135,9 +127,7 @@ for s in ServerPollStatus:
SERVER_STOP_DURATION_SECONDS = Histogram(
'jupyterhub_server_stop_seconds',
'time taken for server stopping operation',
['status'],
'server_stop_seconds', 'time taken for server stopping operation', ['status']
)
@@ -158,7 +148,7 @@ for s in ServerStopStatus:
PROXY_DELETE_DURATION_SECONDS = Histogram(
'jupyterhub_proxy_delete_duration_seconds',
'proxy_delete_duration_seconds',
'duration for deleting user routes from proxy',
['status'],
)
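
To see why the rename is breaking for existing dashboards, a standalone sketch using `prometheus_client` directly: the first constructor argument is the sample name Prometheus scrapes, so Grafana queries must move from `total_users` to `jupyterhub_total_users`.

```python
from prometheus_client import CollectorRegistry, Gauge, generate_latest

registry = CollectorRegistry()
# 1.3.0 registers the metric under the jupyterhub_ prefix...
total_users = Gauge('jupyterhub_total_users', 'total number of users', registry=registry)
total_users.set(42)

# ...so the scrape output now contains "jupyterhub_total_users 42.0"
# where 1.2.x exposed "total_users 42.0".
print(generate_latest(registry).decode())
```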

View File

@@ -256,7 +256,7 @@ class JupyterHubRequestValidator(RequestValidator):
self.db.commit()
def get_authorization_code_scopes(self, client_id, code, redirect_uri, request):
"""Extracts scopes from saved authorization code.
""" Extracts scopes from saved authorization code.
The scopes returned by this method is used to route token requests
based on scopes passed to Authorization Code requests.
With that the token endpoint knows when to include OpenIDConnect

View File

@@ -1,6 +1,7 @@
"""Basic class to manage pagination utils."""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
from traitlets import Bool
from traitlets import default
from traitlets import Integer
from traitlets import observe
@@ -80,13 +81,13 @@ class Pagination(Configurable):
try:
self.per_page = int(per_page)
except Exception:
self.per_page = self.default_per_page
self.per_page = self._default_per_page
try:
self.page = int(page)
if self.page < 1:
self.page = 1
except Exception:
except:
self.page = 1
return self.page, self.per_page, self.per_page * (self.page - 1)
@@ -106,14 +107,14 @@ class Pagination(Configurable):
def calculate_pages_window(self):
"""Calculates the set of pages to render later in links() method.
It returns the list of pages to render via links for the pagination
By default, as we've observed in other applications, we're going to render
only a finite and predefined number of pages, avoiding visual fatigue related
to a long list of pages. By default, we render 7 pages plus some inactive links with the characters '...'
to point out that there are other pages that aren't explicitly rendered.
The primary way of work is to provide current webpage and 5 next pages, the last 2 ones
(in case the current page + 5 does not overflow the total length of pages) and the first one for reference.
"""
It returns the list of pages to render via links for the pagination
By default, as we've observed in other applications, we're going to render
only a finite and predefined number of pages, avoiding visual fatigue related
to a long list of pages. By default, we render 7 pages plus some inactive links with the characters '...'
to point out that there are other pages that aren't explicitly rendered.
The primary way of work is to provide current webpage and 5 next pages, the last 2 ones
(in case the current page + 5 does not overflow the total length of pages) and the first one for reference.
"""
before_page = 2
after_page = 2
@@ -157,9 +158,9 @@ class Pagination(Configurable):
@property
def links(self):
"""Get the links for the pagination.
Getting the input from calculate_pages_window(), generates the HTML code
for the pages to render, plus the arrows to go onwards and backwards (if needed).
"""
Getting the input from calculate_pages_window(), generates the HTML code
for the pages to render, plus the arrows to go onwards and backwards (if needed).
"""
if self.total_pages == 1:
return []

View File

@@ -25,6 +25,7 @@ from functools import wraps
from subprocess import Popen
from urllib.parse import quote
from tornado import gen
from tornado.httpclient import AsyncHTTPClient
from tornado.httpclient import HTTPError
from tornado.httpclient import HTTPRequest
@@ -291,7 +292,7 @@ class Proxy(LoggingConfigurable):
if service.server:
futures.append(self.add_service(service))
# wait after submitting them all
await asyncio.gather(*futures)
await gen.multi(futures)
async def add_all_users(self, user_dict):
"""Update the proxy table from the database.
@@ -304,7 +305,7 @@ class Proxy(LoggingConfigurable):
if spawner.ready:
futures.append(self.add_user(user, name))
# wait after submitting them all
await asyncio.gather(*futures)
await gen.multi(futures)
@_one_at_a_time
async def check_routes(self, user_dict, service_dict, routes=None):
@@ -390,7 +391,7 @@ class Proxy(LoggingConfigurable):
self.log.warning("Deleting stale route %s", routespec)
futures.append(self.delete_route(routespec))
await asyncio.gather(*futures)
await gen.multi(futures)
stop = time.perf_counter() # timer stops here when user is deleted
CHECK_ROUTES_DURATION_SECONDS.observe(stop - start) # histogram metric
@@ -586,34 +587,6 @@ class ConfigurableHTTPProxy(Proxy):
self.log.debug("PID file %s already removed", self.pid_file)
pass
def _get_ssl_options(self):
"""List of cmd proxy options to use internal SSL"""
cmd = []
proxy_api = 'proxy-api'
proxy_client = 'proxy-client'
api_key = self.app.internal_proxy_certs[proxy_api][
'keyfile'
] # Check content in next test and just patch manually or in the config of the file
api_cert = self.app.internal_proxy_certs[proxy_api]['certfile']
api_ca = self.app.internal_trust_bundles[proxy_api + '-ca']
client_key = self.app.internal_proxy_certs[proxy_client]['keyfile']
client_cert = self.app.internal_proxy_certs[proxy_client]['certfile']
client_ca = self.app.internal_trust_bundles[proxy_client + '-ca']
cmd.extend(['--api-ssl-key', api_key])
cmd.extend(['--api-ssl-cert', api_cert])
cmd.extend(['--api-ssl-ca', api_ca])
cmd.extend(['--api-ssl-request-cert'])
cmd.extend(['--api-ssl-reject-unauthorized'])
cmd.extend(['--client-ssl-key', client_key])
cmd.extend(['--client-ssl-cert', client_cert])
cmd.extend(['--client-ssl-ca', client_ca])
cmd.extend(['--client-ssl-request-cert'])
cmd.extend(['--client-ssl-reject-unauthorized'])
return cmd
async def start(self):
"""Start the proxy process"""
# check if there is a previous instance still around
@@ -645,7 +618,27 @@ class ConfigurableHTTPProxy(Proxy):
if self.ssl_cert:
cmd.extend(['--ssl-cert', self.ssl_cert])
if self.app.internal_ssl:
cmd.extend(self._get_ssl_options())
proxy_api = 'proxy-api'
proxy_client = 'proxy-client'
api_key = self.app.internal_proxy_certs[proxy_api]['keyfile']
api_cert = self.app.internal_proxy_certs[proxy_api]['certfile']
api_ca = self.app.internal_trust_bundles[proxy_api + '-ca']
client_key = self.app.internal_proxy_certs[proxy_client]['keyfile']
client_cert = self.app.internal_proxy_certs[proxy_client]['certfile']
client_ca = self.app.internal_trust_bundles[proxy_client + '-ca']
cmd.extend(['--api-ssl-key', api_key])
cmd.extend(['--api-ssl-cert', api_cert])
cmd.extend(['--api-ssl-ca', api_ca])
cmd.extend(['--api-ssl-request-cert'])
cmd.extend(['--api-ssl-reject-unauthorized'])
cmd.extend(['--client-ssl-key', client_key])
cmd.extend(['--client-ssl-cert', client_cert])
cmd.extend(['--client-ssl-ca', client_ca])
cmd.extend(['--client-ssl-request-cert'])
cmd.extend(['--client-ssl-reject-unauthorized'])
if self.app.statsd_host:
cmd.extend(
[

View File

@@ -23,6 +23,7 @@ from urllib.parse import quote
from urllib.parse import urlencode
import requests
from tornado.gen import coroutine
from tornado.httputil import url_concat
from tornado.log import app_log
from tornado.web import HTTPError
@@ -287,7 +288,7 @@ class HubAuth(SingletonConfigurable):
def _check_hub_authorization(self, url, cache_key=None, use_cache=True):
"""Identify a user with the Hub
Args:
url (str): The API URL to check the Hub for authorization
(e.g. http://127.0.0.1:8081/hub/api/authorizations/token/abc-def)
@@ -603,10 +604,10 @@ class HubOAuth(HubAuth):
def token_for_code(self, code):
"""Get token for OAuth temporary code
This is the last step of OAuth login.
Should be called in OAuth Callback handler.
Args:
code (str): oauth code for finishing OAuth login
Returns:
@@ -949,7 +950,8 @@ class HubOAuthCallbackHandler(HubOAuthenticated, RequestHandler):
.. versionadded: 0.8
"""
async def get(self):
@coroutine
def get(self):
error = self.get_argument("error", False)
if error:
msg = self.get_argument("error_description", error)

View File

@@ -20,6 +20,7 @@ from urllib.parse import urlparse
from jinja2 import ChoiceLoader
from jinja2 import FunctionLoader
from tornado import gen
from tornado import ioloop
from tornado.httpclient import AsyncHTTPClient
from tornado.httpclient import HTTPRequest
@@ -433,7 +434,7 @@ class SingleUserNotebookAppMixin(Configurable):
i,
RETRIES,
)
await asyncio.sleep(min(2 ** i, 16))
await gen.sleep(min(2 ** i, 16))
else:
break
else:

View File

@@ -16,7 +16,8 @@ from tempfile import mkdtemp
if os.name == 'nt':
import psutil
from async_generator import aclosing
from async_generator import async_generator
from async_generator import yield_
from sqlalchemy import inspect
from tornado.ioloop import PeriodicCallback
from traitlets import Any
@@ -353,9 +354,8 @@ class Spawner(LoggingConfigurable):
return options_form
options_from_form = Callable(
help="""
Interpret HTTP form data
def options_from_form(self, form_data):
"""Interpret HTTP form data
Form data will always arrive as a dict of lists of strings.
Override this function to understand single-values, numbers, etc.
@@ -379,14 +379,7 @@ class Spawner(LoggingConfigurable):
(with additional support for bytes in case of uploaded file data),
and any non-bytes non-jsonable values will be replaced with None
if the user_options are re-used.
""",
).tag(config=True)
@default("options_from_form")
def _options_from_form(self):
return self._default_options_from_form
def _default_options_from_form(self, form_data):
"""
return form_data
def options_from_query(self, query_data):
@@ -1031,6 +1024,7 @@ class Spawner(LoggingConfigurable):
def _progress_url(self):
return self.user.progress_url(self.name)
@async_generator
async def _generate_progress(self):
"""Private wrapper of progress generator
@@ -1042,17 +1036,21 @@ class Spawner(LoggingConfigurable):
)
return
yield {"progress": 0, "message": "Server requested"}
await yield_({"progress": 0, "message": "Server requested"})
from async_generator import aclosing
async with aclosing(self.progress()) as progress:
async for event in progress:
yield event
await yield_(event)
@async_generator
async def progress(self):
"""Async generator for progress events
Must be an async generator
For Python 3.5-compatibility, use the async_generator package
Should yield messages of the form:
::
@@ -1069,7 +1067,7 @@ class Spawner(LoggingConfigurable):
.. versionadded:: 0.9
"""
yield {"progress": 50, "message": "Spawning server..."}
await yield_({"progress": 50, "message": "Spawning server..."})
async def start(self):
"""Start the single-user server
@@ -1080,7 +1078,9 @@ class Spawner(LoggingConfigurable):
.. versionchanged:: 0.7
Return ip, port instead of setting on self.user.server directly.
"""
raise NotImplementedError("Override in subclass. Must be a coroutine.")
raise NotImplementedError(
"Override in subclass. Must be a Tornado gen.coroutine."
)
async def stop(self, now=False):
"""Stop the single-user server
@@ -1093,7 +1093,9 @@ class Spawner(LoggingConfigurable):
Must be a coroutine.
"""
raise NotImplementedError("Override in subclass. Must be a coroutine.")
raise NotImplementedError(
"Override in subclass. Must be a Tornado gen.coroutine."
)
async def poll(self):
"""Check if the single-user process is running
@@ -1119,7 +1121,9 @@ class Spawner(LoggingConfigurable):
process has not yet completed.
"""
raise NotImplementedError("Override in subclass. Must be a coroutine.")
raise NotImplementedError(
"Override in subclass. Must be a Tornado gen.coroutine."
)
def add_poll_callback(self, callback, *args, **kwargs):
"""Add a callback to fire when the single-user server stops"""

View File

@@ -36,6 +36,7 @@ from unittest import mock
from pytest import fixture
from pytest import raises
from tornado import gen
from tornado import ioloop
from tornado.httpclient import HTTPError
from tornado.platform.asyncio import AsyncIOMainLoop
@@ -55,16 +56,13 @@ _db = None
def pytest_collection_modifyitems(items):
"""This function is automatically run by pytest passing all collected test
functions.
We use it to add asyncio marker to all async tests and assert we don't use
test functions that are async generators which wouldn't make sense.
"""
"""add asyncio marker to all async tests"""
for item in items:
if inspect.iscoroutinefunction(item.obj):
item.add_marker('asyncio')
assert not inspect.isasyncgenfunction(item.obj)
if hasattr(inspect, 'isasyncgenfunction'):
# double-check that we aren't mixing yield and async def
assert not inspect.isasyncgenfunction(item.obj)
@fixture(scope='module')
@@ -76,9 +74,7 @@ def ssl_tmpdir(tmpdir_factory):
def app(request, io_loop, ssl_tmpdir):
"""Mock a jupyterhub app for testing"""
mocked_app = None
ssl_enabled = getattr(
request.module, 'ssl_enabled', os.environ.get('SSL_ENABLED', False)
)
ssl_enabled = getattr(request.module, "ssl_enabled", False)
kwargs = dict()
if ssl_enabled:
kwargs.update(dict(internal_ssl=True, internal_certs_location=str(ssl_tmpdir)))
@@ -218,7 +214,7 @@ def admin_user(app, username):
class MockServiceSpawner(jupyterhub.services.service._ServiceSpawner):
"""mock services for testing.
Shorter intervals, etc.
Shorter intervals, etc.
"""
poll_interval = 1
@@ -248,14 +244,17 @@ def _mockservice(request, app, url=False):
assert name in app._service_map
service = app._service_map[name]
async def start():
@gen.coroutine
def start():
# wait for proxy to be updated before starting the service
await app.proxy.add_all_services(app._service_map)
yield app.proxy.add_all_services(app._service_map)
service.start()
io_loop.run_sync(start)
def cleanup():
import asyncio
asyncio.get_event_loop().run_until_complete(service.stop())
app.services[:] = []
app._service_map.clear()

View File

@@ -36,12 +36,13 @@ from unittest import mock
from urllib.parse import urlparse
from pamela import PAMError
from tornado import gen
from tornado.concurrent import Future
from tornado.ioloop import IOLoop
from traitlets import Bool
from traitlets import default
from traitlets import Dict
from .. import metrics
from .. import orm
from ..app import JupyterHub
from ..auth import PAMAuthenticator
@@ -109,17 +110,19 @@ class SlowSpawner(MockSpawner):
delay = 2
_start_future = None
async def start(self):
(ip, port) = await super().start()
@gen.coroutine
def start(self):
(ip, port) = yield super().start()
if self._start_future is not None:
await self._start_future
yield self._start_future
else:
await asyncio.sleep(self.delay)
yield gen.sleep(self.delay)
return ip, port
async def stop(self):
await asyncio.sleep(self.delay)
await super().stop()
@gen.coroutine
def stop(self):
yield gen.sleep(self.delay)
yield super().stop()
class NeverSpawner(MockSpawner):
@@ -131,12 +134,14 @@ class NeverSpawner(MockSpawner):
def start(self):
"""Return a Future that will never finish"""
return asyncio.Future()
return Future()
async def stop(self):
@gen.coroutine
def stop(self):
pass
async def poll(self):
@gen.coroutine
def poll(self):
return 0
@@ -210,7 +215,8 @@ class MockPAMAuthenticator(PAMAuthenticator):
# skip the add-system-user bit
return not user.name.startswith('dne')
async def authenticate(self, *args, **kwargs):
@gen.coroutine
def authenticate(self, *args, **kwargs):
with mock.patch.multiple(
'pamela',
authenticate=mock_authenticate,
@@ -218,7 +224,7 @@ class MockPAMAuthenticator(PAMAuthenticator):
close_session=mock_open_session,
check_account=mock_check_account,
):
username = await super(MockPAMAuthenticator, self).authenticate(
username = yield super(MockPAMAuthenticator, self).authenticate(
*args, **kwargs
)
if username is None:
@@ -314,13 +320,14 @@ class MockHub(JupyterHub):
self.db.delete(group)
self.db.commit()
async def initialize(self, argv=None):
@gen.coroutine
def initialize(self, argv=None):
self.pid_file = NamedTemporaryFile(delete=False).name
self.db_file = NamedTemporaryFile()
self.db_url = os.getenv('JUPYTERHUB_TEST_DB_URL') or self.db_file.name
if 'mysql' in self.db_url:
self.db_kwargs['connect_args'] = {'auth_plugin': 'mysql_native_password'}
await super().initialize([])
yield super().initialize([])
# add an initial user
user = self.db.query(orm.User).filter(orm.User.name == 'user').first()
@@ -328,7 +335,6 @@ class MockHub(JupyterHub):
user = orm.User(name='user')
self.db.add(user)
self.db.commit()
metrics.TOTAL_USERS.inc()
def stop(self):
super().stop()
@@ -352,13 +358,14 @@ class MockHub(JupyterHub):
self.cleanup = lambda: None
self.db_file.close()
async def login_user(self, name):
@gen.coroutine
def login_user(self, name):
"""Login a user by name, returning her cookies."""
base_url = public_url(self)
external_ca = None
if self.internal_ssl:
external_ca = self.external_certs['files']['ca']
r = await async_requests.post(
r = yield async_requests.post(
base_url + 'hub/login',
data={'username': name, 'password': name},
allow_redirects=False,
@@ -400,7 +407,8 @@ class StubSingleUserSpawner(MockSpawner):
_thread = None
async def start(self):
@gen.coroutine
def start(self):
ip = self.ip = '127.0.0.1'
port = self.port = random_port()
env = self.get_env()
@@ -427,12 +435,14 @@ class StubSingleUserSpawner(MockSpawner):
assert ready
return (ip, port)
async def stop(self):
@gen.coroutine
def stop(self):
self._app.stop()
self._thread.join(timeout=30)
assert not self._thread.is_alive()
async def poll(self):
@gen.coroutine
def poll(self):
if self._thread is None:
return 0
if self._thread.is_alive():

View File

@@ -60,7 +60,7 @@ class APIHandler(web.RequestHandler):
class WhoAmIHandler(HubAuthenticated, web.RequestHandler):
"""Reply with the name of the user who made the request.
Uses "deprecated" cookie login
"""
@@ -71,7 +71,7 @@ class WhoAmIHandler(HubAuthenticated, web.RequestHandler):
class OWhoAmIHandler(HubOAuthenticated, web.RequestHandler):
"""Reply with the name of the user who made the request.
Uses OAuth login flow
"""

View File

@@ -1,20 +1,22 @@
"""Tests for the REST API."""
import asyncio
import json
import re
import sys
import uuid
from concurrent.futures import Future
from datetime import datetime
from datetime import timedelta
from unittest import mock
from urllib.parse import quote
from urllib.parse import urlparse
from async_generator import async_generator
from async_generator import yield_
from pytest import mark
from tornado import gen
import jupyterhub
from .. import orm
from ..objects import Server
from ..utils import url_path_join as ujoin
from ..utils import utcnow
from .mocking import public_host
@@ -25,6 +27,7 @@ from .utils import async_requests
from .utils import auth_header
from .utils import find_user
# --------------------
# Authentication tests
# --------------------
@@ -179,71 +182,6 @@ async def test_get_users(app):
assert r.status_code == 403
@mark.user
@mark.parametrize(
"state",
("inactive", "active", "ready", "invalid"),
)
async def test_get_users_state_filter(app, state):
db = app.db
# has_one_active: one active, one inactive, zero ready
has_one_active = add_user(db, app=app, name='has_one_active')
# has_two_active: two active, ready servers
has_two_active = add_user(db, app=app, name='has_two_active')
# has_two_inactive: two spawners, neither active
has_two_inactive = add_user(db, app=app, name='has_two_inactive')
# has_zero: no Spawners registered at all
has_zero = add_user(db, app=app, name='has_zero')
test_usernames = set(
("has_one_active", "has_two_active", "has_two_inactive", "has_zero")
)
user_states = {
"inactive": ["has_two_inactive", "has_zero"],
"ready": ["has_two_active"],
"active": ["has_one_active", "has_two_active"],
"invalid": [],
}
expected = user_states[state]
def add_spawner(user, name='', active=True, ready=True):
"""Add a spawner in a requested state
If active, should turn up in an active query
If active and ready, should turn up in a ready query
If not active, should turn up in an inactive query
"""
spawner = user.spawners[name]
db.commit()
if active:
orm_server = orm.Server()
db.add(orm_server)
db.commit()
spawner.server = Server(orm_server=orm_server)
db.commit()
if not ready:
spawner._spawn_pending = True
return spawner
for name in ("", "secondary"):
add_spawner(has_two_active, name, active=True)
add_spawner(has_two_inactive, name, active=False)
add_spawner(has_one_active, active=True, ready=False)
add_spawner(has_one_active, "inactive", active=False)
r = await api_request(app, 'users?state={}'.format(state))
if state == "invalid":
assert r.status_code == 400
return
assert r.status_code == 200
usernames = sorted(u["name"] for u in r.json() if u["name"] in test_usernames)
assert usernames == expected
@mark.user
async def test_get_self(app):
db = app.db
@@ -277,17 +215,6 @@ async def test_get_self(app):
assert r.status_code == 403
async def test_get_self_service(app, mockservice):
r = await api_request(
app, "user", headers={"Authorization": f"token {mockservice.api_token}"}
)
r.raise_for_status()
service_info = r.json()
assert service_info['kind'] == 'service'
assert service_info['name'] == mockservice.name
@mark.user
async def test_add_user(app):
db = app.db
@@ -686,7 +613,7 @@ async def test_slow_spawn(app, no_patience, slow_spawn):
async def wait_spawn():
while not app_user.running:
await asyncio.sleep(0.1)
await gen.sleep(0.1)
await wait_spawn()
assert not app_user.spawner._spawn_pending
@@ -695,7 +622,7 @@ async def test_slow_spawn(app, no_patience, slow_spawn):
async def wait_stop():
while app_user.spawner._stop_pending:
await asyncio.sleep(0.1)
await gen.sleep(0.1)
r = await api_request(app, 'users', name, 'server', method='delete')
r.raise_for_status()
@@ -729,7 +656,7 @@ async def test_never_spawn(app, no_patience, never_spawn):
assert app.users.count_active_users()['pending'] == 1
while app_user.spawner.pending:
await asyncio.sleep(0.1)
await gen.sleep(0.1)
print(app_user.spawner.pending)
assert not app_user.spawner._spawn_pending
@@ -755,7 +682,7 @@ async def test_slow_bad_spawn(app, no_patience, slow_bad_spawn):
r = await api_request(app, 'users', name, 'server', method='post')
r.raise_for_status()
while user.spawner.pending:
await asyncio.sleep(0.1)
await gen.sleep(0.1)
# spawn failed
assert not user.running
assert app.users.count_active_users()['pending'] == 0
@@ -891,12 +818,32 @@ async def test_progress_bad_slow(request, app, no_patience, slow_bad_spawn):
}
@async_generator
async def progress_forever():
"""progress function that yields messages forever"""
for i in range(1, 10):
yield {'progress': i, 'message': 'Stage %s' % i}
await yield_({'progress': i, 'message': 'Stage %s' % i})
# wait a long time before the next event
await asyncio.sleep(10)
await gen.sleep(10)
if sys.version_info >= (3, 6):
# additional progress_forever defined as native
# async generator
# to test for issues with async_generator wrappers
exec(
"""
async def progress_forever_native():
for i in range(1, 10):
yield {
'progress': i,
'message': 'Stage %s' % i,
}
# wait a long time before the next event
await gen.sleep(10)
""",
globals(),
)
async def test_spawn_progress_cutoff(request, app, no_patience, slow_spawn):
@@ -907,7 +854,11 @@ async def test_spawn_progress_cutoff(request, app, no_patience, slow_spawn):
db = app.db
name = 'geddy'
app_user = add_user(db, app=app, name=name)
app_user.spawner.progress = progress_forever
if sys.version_info >= (3, 6):
# Python >= 3.6, try native async generator
app_user.spawner.progress = globals()['progress_forever_native']
else:
app_user.spawner.progress = progress_forever
app_user.spawner.delay = 1
r = await api_request(app, 'users', name, 'server', method='post')
@@ -934,8 +885,8 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
# start two pending spawns
names = ['ykka', 'hjarka']
users = [add_user(db, app=app, name=name) for name in names]
users[0].spawner._start_future = asyncio.Future()
users[1].spawner._start_future = asyncio.Future()
users[0].spawner._start_future = Future()
users[1].spawner._start_future = Future()
for name in names:
await api_request(app, 'users', name, 'server', method='post')
assert app.users.count_active_users()['pending'] == 2
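The `_start_future` swap above (from `asyncio.Future()` back to `concurrent.futures.Future()`) keeps the same test trick: the mock spawner blocks on a future the test controls, so spawns stay pending until the test releases them. A rough sketch of that gating pattern in plain asyncio (the names here are illustrative):

import asyncio

async def start(gate):
    # stand-in for a Spawner.start that blocks on _start_future
    await gate
    return "started"

async def main():
    loop = asyncio.get_event_loop()
    gate = loop.create_future()
    task = asyncio.ensure_future(start(gate))
    await asyncio.sleep(0.1)
    assert not task.done()      # the spawn is still pending
    gate.set_result(None)       # the test releases the gate
    assert (await task) == "started"

asyncio.get_event_loop().run_until_complete(main())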
@@ -943,7 +894,7 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
# ykka and hjarka's spawns are both pending. Essun should fail with 429
name = 'essun'
user = add_user(db, app=app, name=name)
user.spawner._start_future = asyncio.Future()
user.spawner._start_future = Future()
r = await api_request(app, 'users', name, 'server', method='post')
assert r.status_code == 429
@@ -951,7 +902,7 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
users[0].spawner._start_future.set_result(None)
# wait for ykka to finish
while not users[0].running:
await asyncio.sleep(0.1)
await gen.sleep(0.1)
assert app.users.count_active_users()['pending'] == 1
r = await api_request(app, 'users', name, 'server', method='post')
@@ -962,7 +913,7 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
for user in users[1:]:
user.spawner._start_future.set_result(None)
while not all(u.running for u in users):
await asyncio.sleep(0.1)
await gen.sleep(0.1)
# everybody's running, pending count should be back to 0
assert app.users.count_active_users()['pending'] == 0
@@ -971,7 +922,7 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
r = await api_request(app, 'users', u.name, 'server', method='delete')
r.raise_for_status()
while any(u.spawner.active for u in users):
await asyncio.sleep(0.1)
await gen.sleep(0.1)
@mark.slow
@@ -1049,7 +1000,7 @@ async def test_start_stop_race(app, no_patience, slow_spawn):
r = await api_request(app, 'users', user.name, 'server', method='delete')
assert r.status_code == 400
while not spawner.ready:
await asyncio.sleep(0.1)
await gen.sleep(0.1)
spawner.delay = 3
# stop the spawner
@@ -1057,7 +1008,7 @@ async def test_start_stop_race(app, no_patience, slow_spawn):
assert r.status_code == 202
assert spawner.pending == 'stop'
# make sure we get past deleting from the proxy
await asyncio.sleep(1)
await gen.sleep(1)
# additional stops while stopping shouldn't trigger a new stop
with mock.patch.object(spawner, 'stop') as m:
r = await api_request(app, 'users', user.name, 'server', method='delete')
@@ -1069,7 +1020,7 @@ async def test_start_stop_race(app, no_patience, slow_spawn):
assert r.status_code == 400
while spawner.active:
await asyncio.sleep(0.1)
await gen.sleep(0.1)
# start after stop is okay
r = await api_request(app, 'users', user.name, 'server', method='post')
assert r.status_code == 202
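The repeated `asyncio.sleep(0.1)` -> `gen.sleep(0.1)` reversions in this file all serve one idiom: poll a condition in a loop with a short await between checks so the event loop can make progress. A generic sketch of that polling loop (the helper name and timeout are illustrative):

import asyncio

async def wait_for_condition(condition, interval=0.1, timeout=5):
    """Poll ``condition`` until it is true, as the loops above do inline."""
    loop = asyncio.get_event_loop()
    deadline = loop.time() + timeout
    while not condition():
        if loop.time() > deadline:
            raise TimeoutError("condition never became true")
        await asyncio.sleep(interval)

flag = []
loop = asyncio.get_event_loop()
loop.call_later(0.3, flag.append, True)   # condition flips after 0.3s
loop.run_until_complete(wait_for_condition(lambda: flag))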


@@ -0,0 +1,6 @@
"""Tests for the SSL enabled REST API."""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
from jupyterhub.tests.test_api import *
ssl_enabled = True


@@ -0,0 +1,7 @@
"""Test the JupyterHub entry point with internal ssl"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import jupyterhub.tests.mocking
from jupyterhub.tests.test_app import *
ssl_enabled = True


@@ -20,7 +20,8 @@ ssl_enabled = True
SSL_ERROR = (SSLError, ConnectionError)
async def wait_for_spawner(spawner, timeout=10):
@gen.coroutine
def wait_for_spawner(spawner, timeout=10):
"""Wait for an http server to show up
polling at shorter intervals for early termination
@@ -31,15 +32,15 @@ async def wait_for_spawner(spawner, timeout=10):
return spawner.server.wait_up(timeout=1, http=True)
while time.monotonic() < deadline:
status = await spawner.poll()
status = yield spawner.poll()
assert status is None
try:
await wait()
yield wait()
except TimeoutError:
continue
else:
break
await wait()
yield wait()
async def test_connection_hub_wrong_certs(app):


@@ -0,0 +1,6 @@
"""Tests for process spawning with internal_ssl"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
from jupyterhub.tests.test_spawner import *
ssl_enabled = True
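This module, like the two similar ones above, is the whole internal-SSL test matrix: it star-imports an existing test module and flips a module-level `ssl_enabled` flag, so the same suite runs a second time over encrypted connections. A sketch of how a fixture might consume such a flag (the fixture body is an assumption for illustration, not JupyterHub's actual conftest):

import pytest

class FakeHub:
    def __init__(self, internal_ssl=False):
        self.internal_ssl = internal_ssl

@pytest.fixture
def app(request):
    # read the module-level flag set by the ssl test modules
    ssl = getattr(request.module, "ssl_enabled", False)
    return FakeHub(internal_ssl=ssl)

ssl_enabled = True

def test_flag(app):
    assert app.internal_ssl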


@@ -1,34 +0,0 @@
import json
from .utils import add_user
from .utils import api_request
from jupyterhub import metrics
from jupyterhub import orm
async def test_total_users(app):
num_users = app.db.query(orm.User).count()
sample = metrics.TOTAL_USERS.collect()[0].samples[0]
assert sample.value == num_users
await api_request(
app, "/users", method="post", data=json.dumps({"usernames": ["incrementor"]})
)
sample = metrics.TOTAL_USERS.collect()[0].samples[0]
assert sample.value == num_users + 1
# GET /users used to double-count
await api_request(app, "/users")
# populate the Users cache dict if any are missing:
for user in app.db.query(orm.User):
_ = app.users[user.id]
sample = metrics.TOTAL_USERS.collect()[0].samples[0]
assert sample.value == num_users + 1
await api_request(app, "/users/incrementor", method="delete")
sample = metrics.TOTAL_USERS.collect()[0].samples[0]
assert sample.value == num_users
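The file deleted above tested the `TOTAL_USERS` gauge through the Prometheus client's sampling API, including the fix for the old double-counting on `GET /users`. A self-contained sketch of that sampling pattern (using a local gauge rather than the one in `jupyterhub.metrics`):

from prometheus_client import Gauge

DEMO_USERS = Gauge('demo_total_users', 'total number of users')

DEMO_USERS.inc()
# collect() returns metric families; each sample carries the current value
sample = DEMO_USERS.collect()[0].samples[0]
assert sample.value == 1

DEMO_USERS.dec()
assert DEMO_USERS.collect()[0].samples[0].value == 0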


@@ -222,7 +222,8 @@ async def test_spawn_fails(db):
db.commit()
class BadSpawner(MockSpawner):
async def start(self):
@gen.coroutine
def start(self):
raise RuntimeError("Split the party")
user = User(


@@ -8,6 +8,7 @@ from urllib.parse import urlparse
import pytest
from bs4 import BeautifulSoup
from tornado import gen
from tornado.escape import url_escape
from tornado.httputil import url_concat
@@ -265,8 +266,7 @@ async def test_spawn_with_query_arguments(app):
next_url = ujoin(app.base_url, 'user/jones/tree')
r = await async_requests.get(
url_concat(
ujoin(base_url, 'spawn'),
{'next': next_url, 'energy': '510keV'},
ujoin(base_url, 'spawn'), {'next': next_url, 'energy': '510keV'},
),
cookies=cookies,
)
@@ -332,12 +332,6 @@ async def test_spawn_form_admin_access(app, admin_access):
data={'bounds': ['-3', '3'], 'energy': '938MeV'},
)
r.raise_for_status()
while '/spawn-pending/' in r.url:
await asyncio.sleep(0.1)
r = await async_requests.get(r.url, cookies=cookies)
r.raise_for_status()
assert r.history
assert r.url.startswith(public_url(app, u))
assert u.spawner.user_options == {
@@ -592,7 +586,8 @@ async def test_login_strip(app):
base_url = public_url(app)
called_with = []
async def mock_authenticate(handler, data):
@gen.coroutine
def mock_authenticate(handler, data):
called_with.append(data)
with mock.patch.object(app.authenticator, 'authenticate', mock_authenticate):
@@ -1030,7 +1025,8 @@ async def test_pre_spawn_start_exc_no_form(app):
exc = "pre_spawn_start error"
# throw exception from pre_spawn_start
async def mock_pre_spawn_start(user, spawner):
@gen.coroutine
def mock_pre_spawn_start(user, spawner):
raise Exception(exc)
with mock.patch.object(app.authenticator, 'pre_spawn_start', mock_pre_spawn_start):
@@ -1045,7 +1041,8 @@ async def test_pre_spawn_start_exc_options_form(app):
exc = "pre_spawn_start error"
# throw exception from pre_spawn_start
async def mock_pre_spawn_start(user, spawner):
@gen.coroutine
def mock_pre_spawn_start(user, spawner):
raise Exception(exc)
with mock.patch.dict(


@@ -6,20 +6,11 @@ from traitlets.config import Config
from jupyterhub.pagination import Pagination
@mark.parametrize(
"per_page, max_per_page, expected",
[
(20, 10, 10),
(1000, 1000, 1000),
],
)
def test_per_page_bounds(per_page, max_per_page, expected):
def test_per_page_bounds():
cfg = Config()
cfg.Pagination.max_per_page = max_per_page
p = Pagination(config=cfg)
p.per_page = per_page
p.total = 99999
assert p.per_page == expected
cfg.Pagination.max_per_page = 10
p = Pagination(config=cfg, per_page=20, total=100)
assert p.per_page == 10
with raises(Exception):
p.per_page = 0
@@ -48,5 +39,7 @@ def test_per_page_bounds(per_page, max_per_page, expected):
],
)
def test_window(page, per_page, total, expected):
cfg = Config()
cfg.Pagination
pagination = Pagination(page=page, per_page=per_page, total=total)
assert pagination.calculate_pages_window() == expected
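Both versions of `test_per_page_bounds` assert the same contract: `per_page` is clamped to `max_per_page`, and values below 1 are rejected. A toy traitlets sketch of such clamping (an illustration of the behavior under test, not the real `Pagination` implementation):

from traitlets import Integer, validate
from traitlets.config import Configurable

class ClampedPagination(Configurable):
    max_per_page = Integer(10, config=True)
    per_page = Integer(5, config=True)

    @validate('per_page')
    def _clamp_per_page(self, proposal):
        if proposal.value < 1:
            raise ValueError("per_page must be >= 1")
        return min(proposal.value, self.max_per_page)

p = ClampedPagination()
p.per_page = 20
assert p.per_page == 10   # clamped to max_per_page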


@@ -9,12 +9,12 @@ from urllib.parse import urlparse
import pytest
from traitlets.config import Config
from .. import orm
from ..utils import url_path_join as ujoin
from ..utils import wait_for_http_server
from .mocking import MockHub
from .test_api import add_user
from .test_api import api_request
from .utils import skip_if_ssl
@pytest.fixture
@@ -27,7 +27,6 @@ def disable_check_routes(app):
app.last_activity_callback.start()
@skip_if_ssl
async def test_external_proxy(request):
auth_token = 'secret!'
proxy_ip = '127.0.0.1'


@@ -6,7 +6,9 @@ from binascii import hexlify
from contextlib import contextmanager
from subprocess import Popen
from async_generator import async_generator
from async_generator import asynccontextmanager
from async_generator import yield_
from tornado.ioloop import IOLoop
from ..utils import maybe_future
@@ -15,7 +17,6 @@ from ..utils import url_path_join
from ..utils import wait_for_http_server
from .mocking import public_url
from .utils import async_requests
from .utils import skip_if_ssl
mockservice_path = os.path.dirname(os.path.abspath(__file__))
mockservice_py = os.path.join(mockservice_path, 'mockservice.py')
@@ -23,6 +24,7 @@ mockservice_cmd = [sys.executable, mockservice_py]
@asynccontextmanager
@async_generator
async def external_service(app, name='mockservice'):
env = {
'JUPYTERHUB_API_TOKEN': hexlify(os.urandom(5)),
@@ -33,7 +35,7 @@ async def external_service(app, name='mockservice'):
proc = Popen(mockservice_cmd, env=env)
try:
await wait_for_http_server(env['JUPYTERHUB_SERVICE_URL'])
yield env
await yield_(env)
finally:
proc.terminate()
@@ -61,7 +63,6 @@ async def test_managed_service(mockservice):
assert service.proc.poll() is None
@skip_if_ssl
async def test_proxy_service(app, mockservice_url):
service = mockservice_url
name = service.name
@@ -75,7 +76,6 @@ async def test_proxy_service(app, mockservice_url):
assert r.text.endswith(path)
@skip_if_ssl
async def test_external_service(app):
name = 'external'
async with external_service(app, name=name) as env:
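`external_service` above is rewritten with the `async_generator` package so the async context manager still works on Python 3.5, where `yield` inside `async def` is a syntax error. A minimal sketch of that decorator stack (a toy resource, not the mock service itself):

import asyncio
from async_generator import asynccontextmanager, async_generator, yield_

@asynccontextmanager
@async_generator
async def managed(value):
    try:
        await yield_(value)   # the ``async with`` body runs here
    finally:
        pass                  # teardown (e.g. proc.terminate()) goes here

async def main():
    async with managed(42) as v:
        assert v == 42

asyncio.get_event_loop().run_until_complete(main())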


@@ -35,7 +35,6 @@ from .utils import AsyncSession
# mock for sending monotonic counter way into the future
monotonic_future = mock.patch('time.monotonic', lambda: sys.maxsize)
ssl_enabled = False
def test_expiring_dict():


@@ -1,7 +1,6 @@
"""Tests for process spawning"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import asyncio
import logging
import os
import signal
@@ -13,6 +12,7 @@ from unittest import mock
from urllib.parse import urlparse
import pytest
from tornado import gen
from .. import orm
from .. import spawner as spawnermod
@@ -123,7 +123,7 @@ async def test_stop_spawner_sigint_fails(db):
await spawner.start()
# wait for the process to get to the while True: loop
await asyncio.sleep(1)
await gen.sleep(1)
status = await spawner.poll()
assert status is None
@@ -138,7 +138,7 @@ async def test_stop_spawner_stop_now(db):
await spawner.start()
# wait for the process to get to the while True: loop
await asyncio.sleep(1)
await gen.sleep(1)
status = await spawner.poll()
assert status is None
@@ -165,7 +165,7 @@ async def test_spawner_poll(db):
spawner.start_polling()
# wait for the process to get to the while True: loop
await asyncio.sleep(1)
await gen.sleep(1)
status = await spawner.poll()
assert status is None
@@ -173,12 +173,12 @@ async def test_spawner_poll(db):
proc.terminate()
for i in range(10):
if proc.poll() is None:
await asyncio.sleep(1)
await gen.sleep(1)
else:
break
assert proc.poll() is not None
await asyncio.sleep(2)
await gen.sleep(2)
status = await spawner.poll()
assert status is not None
@@ -258,7 +258,8 @@ async def test_shell_cmd(db, tmpdir, request):
def test_inherit_overwrite():
"""On 3.6+ we check things are overwritten at import time"""
"""On 3.6+ we check things are overwritten at import time
"""
if sys.version_info >= (3, 6):
with pytest.raises(NotImplementedError):


@@ -1,22 +1,21 @@
"""Tests for utilities"""
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor
import pytest
from async_generator import aclosing
from tornado import gen
from tornado.concurrent import run_on_executor
from async_generator import async_generator
from async_generator import yield_
from ..utils import iterate_until
@async_generator
async def yield_n(n, delay=0.01):
"""Yield n items with a delay between each"""
for i in range(n):
if delay:
await asyncio.sleep(delay)
yield i
await yield_(i)
def schedule_future(io_loop, *, delay, result=None):
@@ -51,40 +50,13 @@ async def test_iterate_until(io_loop, deadline, n, delay, expected):
async def test_iterate_until_ready_after_deadline(io_loop):
f = schedule_future(io_loop, delay=0)
@async_generator
async def gen():
for i in range(5):
yield i
await yield_(i)
yielded = []
async with aclosing(iterate_until(f, gen())) as items:
async for item in items:
yielded.append(item)
assert yielded == list(range(5))
@gen.coroutine
def tornado_coroutine():
yield gen.sleep(0.05)
return "ok"
class TornadoCompat:
def __init__(self):
self.executor = ThreadPoolExecutor(1)
@run_on_executor
def on_executor(self):
time.sleep(0.05)
return "executor"
@gen.coroutine
def tornado_coroutine(self):
yield gen.sleep(0.05)
return "gen.coroutine"
async def test_tornado_coroutines():
t = TornadoCompat()
# verify that tornado gen and executor methods return awaitables
assert (await t.on_executor()) == "executor"
assert (await t.tornado_coroutine()) == "gen.coroutine"
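The reverted `yield_n` shows the same backport in generator form: `await yield_(i)` under `@async_generator` replaces a native `yield i`. The `aclosing` import visible above deserves a sketch of its own, since it guarantees generator cleanup even when iteration stops early (a toy generator using the native Python >= 3.6 syntax):

import asyncio
from async_generator import aclosing

async def numbers():
    try:
        for i in range(5):
            yield i
    finally:
        print("cleaned up")   # runs even on early exit

async def main():
    async with aclosing(numbers()) as agen:
        async for n in agen:
            if n == 1:
                break          # early exit still triggers the finally

asyncio.get_event_loop().run_until_complete(main())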


@@ -1,12 +1,9 @@
import asyncio
import os
from concurrent.futures import ThreadPoolExecutor
import pytest
import requests
from certipy import Certipy
from jupyterhub import metrics
from jupyterhub import orm
from jupyterhub.objects import Server
from jupyterhub.utils import url_path_join as ujoin
@@ -55,12 +52,6 @@ def ssl_setup(cert_dir, authority_name):
return external_certs
"""Skip tests that don't work under internal-ssl when testing under internal-ssl"""
skip_if_ssl = pytest.mark.skipif(
os.environ.get('SSL_ENABLED', False), reason="Does not use internal SSL"
)
def check_db_locks(func):
"""Decorator that verifies no locks are held on database upon exit.
@@ -106,7 +97,6 @@ def add_user(db, app=None, **kwargs):
if orm_user is None:
orm_user = orm.User(**kwargs)
db.add(orm_user)
metrics.TOTAL_USERS.inc()
else:
for attr, value in kwargs.items():
setattr(orm_user, attr, value)
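The removed `skip_if_ssl` marker is a standard pytest conditional skip: when the suite runs with internal SSL (signaled via the `SSL_ENABLED` environment variable, per the hunk above), tests that only exercise plain HTTP are skipped rather than failed. Usage is a one-line decorator:

import os
import pytest

skip_if_ssl = pytest.mark.skipif(
    os.environ.get('SSL_ENABLED', False), reason="Does not use internal SSL"
)

@skip_if_ssl
def test_plain_http_only():
    assert True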


@@ -69,6 +69,7 @@ class UserDict(dict):
"""Add a user to the UserDict"""
if orm_user.id not in self:
self[orm_user.id] = self.from_orm(orm_user)
TOTAL_USERS.inc()
return self[orm_user.id]
def __contains__(self, key):


@@ -21,6 +21,10 @@ from hmac import compare_digest
from operator import itemgetter
from async_generator import aclosing
from async_generator import async_generator
from async_generator import asynccontextmanager
from async_generator import yield_
from tornado import gen
from tornado import ioloop
from tornado import web
from tornado.httpclient import AsyncHTTPClient
@@ -28,15 +32,6 @@ from tornado.httpclient import HTTPError
from tornado.log import app_log
from tornado.platform.asyncio import to_asyncio_future
# For compatibility with python versions 3.6 or earlier.
# asyncio.Task.all_tasks() is fully moved to asyncio.all_tasks() starting with 3.9. Also applies to current_task.
try:
asyncio_all_tasks = asyncio.all_tasks
asyncio_current_task = asyncio.current_task
except AttributeError as e:
asyncio_all_tasks = asyncio.Task.all_tasks
asyncio_current_task = asyncio.Task.current_task
def random_port():
"""Get a single random port."""
@@ -84,7 +79,8 @@ def can_connect(ip, port):
def make_ssl_context(keyfile, certfile, cafile=None, verify=True, check_hostname=True):
"""Setup context for starting an https server or making requests over ssl."""
"""Setup context for starting an https server or making requests over ssl.
"""
if not keyfile or not certfile:
return None
purpose = ssl.Purpose.SERVER_AUTH if verify else ssl.Purpose.CLIENT_AUTH
@@ -104,7 +100,7 @@ async def exponential_backoff(
timeout=10,
timeout_tolerance=0.1,
*args,
**kwargs,
**kwargs
):
"""
Exponentially backoff until `pass_func` is true.
@@ -179,7 +175,7 @@ async def exponential_backoff(
dt = min(max_wait, remaining, random.uniform(0, start_wait * scale))
if dt < max_wait:
scale *= scale_factor
await asyncio.sleep(dt)
await gen.sleep(dt)
raise TimeoutError(fail_message)
@@ -483,7 +479,7 @@ def print_stacks(file=sys.stderr):
# also show asyncio tasks, if any
# this will increase over time as we transition from tornado
# coroutines to native `async def`
tasks = asyncio_all_tasks()
tasks = asyncio.Task.all_tasks()
if tasks:
print("AsyncIO tasks: %i" % len(tasks))
for task in tasks:
@@ -511,12 +507,28 @@ def maybe_future(obj):
return asyncio.wrap_future(obj)
else:
# could also check for tornado.concurrent.Future
# but with tornado >= 5.1 tornado.Future is asyncio.Future
# but with tornado >= 5 tornado.Future is asyncio.Future
f = asyncio.Future()
f.set_result(obj)
return f
@asynccontextmanager
@async_generator
async def not_aclosing(coro):
"""An empty context manager for Python < 3.5.2
which lacks the `aclose` method on async iterators
"""
await yield_(await coro)
if sys.version_info < (3, 5, 2):
# Python 3.5.1 is missing the aclose method on async iterators,
# so we can't close them
aclosing = not_aclosing
@async_generator
async def iterate_until(deadline_future, generator):
"""An async generator that yields items from a generator
until a deadline future resolves
@@ -541,7 +553,7 @@ async def iterate_until(deadline_future, generator):
)
if item_future.done():
try:
yield item_future.result()
await yield_(item_future.result())
except (StopAsyncIteration, asyncio.CancelledError):
break
elif deadline_future.done():
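`maybe_future`, partially visible above, is what lets the two coroutine styles in this diff coexist: plain values, native coroutines, and `concurrent.futures.Future` instances all come back as awaitables. A quick usage sketch against the real helper:

import asyncio
from jupyterhub.utils import maybe_future

async def main():
    # a plain value becomes an already-resolved future
    assert (await maybe_future(42)) == 42

    async def coro():
        return "coro"
    # a coroutine passes through as an awaitable
    assert (await maybe_future(coro())) == "coro"

asyncio.get_event_loop().run_until_complete(main())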


@@ -1,7 +1,2 @@
[tool.black]
skip-string-normalization = true
target_version = [
"py36",
"py37",
"py38",
]


@@ -1,15 +1,15 @@
alembic
async_generator>=1.9
async_generator>=1.8
certipy>=0.1.2
entrypoints
jinja2>=2.11.0
jinja2
jupyter_telemetry>=0.1.0
oauthlib>=3.0
pamela; sys_platform != 'win32'
prometheus_client>=0.4.0
prometheus_client>=0.0.21
psutil>=5.6.5; sys_platform == 'win32'
python-dateutil
requests
SQLAlchemy>=1.1
tornado>=5.1
tornado>=5.0
traitlets>=4.3.2
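The `pamela; sys_platform != 'win32'` line is the "Environment marker on pamela" backport: a PEP 508 marker that makes pip skip the PAM dependency on Windows, where `psutil` is required instead. Markers can also be evaluated programmatically, for example with the `packaging` library:

from packaging.markers import Marker

marker = Marker("sys_platform != 'win32'")
print(marker.evaluate())   # True on Linux/macOS, False on Windows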


@@ -17,8 +17,8 @@ from setuptools.command.bdist_egg import bdist_egg
v = sys.version_info
if v[:2] < (3, 6):
error = "ERROR: JupyterHub requires Python version 3.6 or above."
if v[:2] < (3, 5):
error = "ERROR: JupyterHub requires Python version 3.5 or above."
print(error, file=sys.stderr)
sys.exit(1)
@@ -94,7 +94,7 @@ setup_args = dict(
license="BSD",
platforms="Linux, Mac OS X",
keywords=['Interactive', 'Interpreter', 'Shell', 'Web'],
python_requires=">=3.6",
python_requires=">=3.5",
entry_points={
'jupyterhub.authenticators': [
'default = jupyterhub.auth:PAMAuthenticator',


@@ -68,18 +68,6 @@
<i class="fa fa-spinner"></i>
</div>
</div>
{% block login_terms %}
{% if login_term_url %}
<div id="login_terms" class="login_terms">
<input type="checkbox" id="login_terms_checkbox" name="login_terms_checkbox" required />
{% block login_terms_text %} {# allow overriding the text #}
By logging into the platform you accept the <a href="{{ login_term_url }}">terms and conditions</a>.
{% endblock login_terms_text %}
</div>
{% endif %}
{% endblock login_terms %}
</div>
</form>
{% endif %}


@@ -142,7 +142,7 @@ def untag(vs, push=False):
def make_env(*packages):
"""Make a virtualenv
Assumes `which python` has the `virtualenv` package
"""
if not os.path.exists(env_root):
@@ -167,7 +167,7 @@ def make_env(*packages):
def build_sdist(py):
"""Build sdists
Returns the path to the tarball
"""
with cd(repo_root):