Compare commits

...

160 Commits

Author SHA1 Message Date
Min RK
a91197635a release 1.3.0 2020-12-11 12:07:55 +01:00
Min RK
88706d4c27 final changelog edits for 1.3.0 2020-12-11 12:07:06 +01:00
Min RK
29fac11bfe Merge pull request #3295 from minrk/changelog-1.3
begin changelog for 1.3
2020-12-11 12:02:15 +01:00
Erik Sundell
947ef67184 Merge pull request #3303 from Sangarshanan/patch-1
Remove the extra parenthesis in service.md
2020-12-11 09:39:28 +01:00
sangarshanan
8ede924956 Remove extra parenthesis 2020-12-11 13:15:13 +05:30
sangarshanan
55c2d3648e Add the missing parenthesis in service.md 2020-12-11 01:53:35 +05:30
Min RK
2cf8e48fb5 start changelog for 1.3
I noticed that our jinja async feature is new in 2.9, and matured in 2.11, so explicitly require that
2020-12-09 14:31:10 +01:00
Min RK
ae77038a64 Merge pull request #3293 from minrk/services-whoami
allow services to call /api/user to identify themselves
2020-12-09 13:25:46 +01:00
Min RK
ffed8f67a0 Merge pull request #3294 from minrk/paginate-per-page
fix increasing pagination limits
2020-12-08 10:03:51 +01:00
Erik Sundell
1efd7da6ee Merge pull request #3300 from mxjeff/fixed-doc-services
Fixed idle-culler references.
2020-12-04 11:46:04 +01:00
Geoffroy Youri Berret
6e161d0140 Fixed idle-culler references.
Merge request #3257 fixed #3256 only on getting-started/services-basics.md
There is still a reference to jupyterhub example cull-idle in reference/services.md
2020-12-04 09:28:02 +01:00
Min RK
5f4144cc98 Merge pull request #3298 from coffeebenzene/master
Fix asyncio deprecation asyncio.Task.all_tasks
2020-12-03 11:16:46 +01:00
coffeebenzene
f866bbcf45 Use variable instead of monkey patching asyncio 2020-12-02 19:50:49 +00:00
coffeebenzene
ed6231d3aa Fix asyncio deprecation asyncio.Task.all_tasks 2020-12-02 17:57:28 +00:00
Min RK
9d38259ad7 fix increasing pagination limits
setting per_page in constructor resolves before max_per_page limit is updated from config,
preventing max_per_page from being increased beyond the default limit

we already loaded these values anyway in the first instance,
so remove the redundant Pagination object
2020-12-02 12:52:42 +01:00
Min RK
4b254fe5ed Merge pull request #3243 from agp8x/master
[Metrics] Add prefix to prometheus metrics to group all jupyterhub metrics
2020-12-02 12:22:32 +01:00
Min RK
f8040209b0 allow services to call /api/user to identify themselves 2020-12-02 12:21:25 +01:00
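With this change, a service can ask the Hub who it is using its own API token. A minimal sketch under the standard service environment variables (the host/port is a placeholder for a default local Hub):

```python
# A service identifying itself via the Hub REST API (sketch; host is a placeholder)
import os
import requests

r = requests.get(
    "http://127.0.0.1:8081/hub/api/user",
    headers={"Authorization": f"token {os.environ['JUPYTERHUB_API_TOKEN']}"},
)
r.raise_for_status()
print(r.json())  # returns a service model when the token belongs to a service
```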
Min RK
e59ee33a6e note versionchanged in metrics module docstring 2020-12-02 11:36:13 +01:00
Min RK
ff15ced3ce Merge pull request #3225 from cbanek/configurable_options_from_form
Allow options_from_form to be configurable
2020-12-02 11:32:24 +01:00
Min RK
75acd6a67b Merge pull request #3264 from tlvu/add-user-agreement-to-login-screen
Add optional user agreement to login screen
2020-12-02 11:31:23 +01:00
Min RK
73ac6207af Merge pull request #3244 from mhwasil/fix-https-redirect-issues
[Docs] Fix https reverse proxy redirect issues
2020-12-02 11:30:09 +01:00
Min RK
e435fe66a5 Merge pull request #3292 from minrk/oldest-metrics
bump oldest-required prometheus-client
2020-12-02 11:27:27 +01:00
Min RK
d7569d6f8e bump oldest-required prometheus-client
oldest-dependency tests caught an error with our base required version
2020-12-02 11:20:30 +01:00
Min RK
ba6c2cf854 Merge pull request #3266 from 0mar/reduce_ssl_testing
Test internal_ssl separately
2020-12-02 10:59:39 +01:00
0mar
970b25d017 Added docstrings 2020-12-01 10:49:10 +01:00
0mar
671ef0d5ef Moved ssl options to proxy 2020-12-01 10:30:44 +01:00
Erik Sundell
77220d6662 Merge pull request #3289 from minrk/user-count
fix and test TOTAL_USERS count
2020-11-30 15:21:48 +01:00
Min RK
7e469f911d fix and test TOTAL_USERS count
Don't assume UserDict contains all users

That assumption led to double-counting when a user in the db was loaded into the dict cache
2020-11-30 13:27:52 +01:00
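The shape of the bug, as an illustrative sketch (names are conventional, not the actual implementation): the in-memory UserDict is a *cache* of database users, so any count that adds the cache size to a database count tallies cached users twice.

```python
# Sketch only: db_session is a SQLAlchemy session, User the user model,
# user_cache the in-memory UserDict-style cache of already-loaded users.
def count_total_users(db_session, User, user_cache):
    # wrong: a user present in both the db and the cache is counted twice
    #   return len(user_cache) + db_session.query(User).count()
    # right: the database alone is authoritative
    return db_session.query(User).count()
```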
Erik Sundell
18393ec6b4 Merge pull request #3287 from minrk/bump-black
bump black pre-commit hook to 20.8
2020-11-30 10:26:55 +01:00
Min RK
28fdbeb0c0 update black pre-commit hook
specify minimum target_version as py36

results in some churn
2020-11-30 10:13:10 +01:00
Tim Head
5664e4d318 Merge pull request #3286 from Sangarshanan/patch-1
Fix curl in jupyter announcements
2020-11-30 07:47:27 +01:00
sangarshanan
24c83e721f Fix curl in jupyter announcements
Running the curl as-is returns a 500 with ```json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)```. Converting the payload to proper JSON fixes this.
2020-11-28 17:50:44 +05:30
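The failure is plain JSON syntax: single-quoted property names are not valid JSON, which is exactly the error quoted above.

```python
# Demonstrating the decode error the commit message quotes
import json

good = '{"announcement": "JupyterHub will be upgraded on August 14!"}'
bad = "{'announcement': 'JupyterHub will be upgraded on August 14!'}"

print(json.loads(good))  # parses fine
try:
    json.loads(bad)
except json.JSONDecodeError as err:
    print(err)  # Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
```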
0mar
cc73ab711e Disabled ssl testing 2020-11-27 17:50:47 +01:00
0mar
2cfe4474ac Submitting reason for skiptest 2020-11-27 17:26:44 +01:00
0mar
74766e4786 Resolving merge conflicts 2020-11-27 17:18:40 +01:00
0mar
ed461ff4a7 Merge branch 'tmp' into reduce_ssl_testing
# Conflicts:
#	jupyterhub/tests/test_proxy.py
2020-11-27 17:05:26 +01:00
0mar
184d87ff2a Skip SSL-free tests if not on SSL matrix 2020-11-27 17:00:09 +01:00
Min RK
06ed7dc0cf Merge pull request #3284 from minrk/12-cl
Changelog for 1.2.2
2020-11-27 14:41:08 +01:00
Min RK
a0b229431c Update docs/source/changelog.md
Co-authored-by: Erik Sundell <erik.i.sundell@gmail.com>
2020-11-27 14:40:59 +01:00
0mar
2a06c8a94c WIP: Attempt to access SSL parameters, failing due to self-signed certificate error 2020-11-27 13:26:32 +01:00
Min RK
91159d08d3 Changelog for 1.2.2 2020-11-27 10:09:54 +01:00
Erik Sundell
06a83f146b Merge pull request #3281 from olifre/patch-1
CONTRIBUTING: Fix contributor guide URL
2020-11-27 09:53:41 +01:00
Oliver Freyermuth
7b66d1656b CONTRIBUTING: Fix contributor guide URL
The link has been changed.
2020-11-27 09:39:29 +01:00
0mar
40176a667f Attempt to patch proxy, unsuccessful 2020-11-26 12:22:43 +01:00
Omar Richardson
e02345a4e8 WIP: Moved ssl options to new method 2020-11-26 09:24:44 +01:00
Long Vu
1408e9f5f4 Merge remote-tracking branch 'origin/master' into add-user-agreement-to-login-screen 2020-11-25 10:31:38 -05:00
Long Vu
b66d204d69 login page: no javascript needed for the optional accept terms and conditions feature
Bonus: the user gets a pop-up notification to check the checkbox.

Tested on Mozilla Firefox
(https://user-images.githubusercontent.com/11966697/100246404-18115e00-2f07-11eb-9061-d35434ace3aa.gif)
and Google Chrome.

Feedback from @minrk.
2020-11-25 10:30:22 -05:00
Omar Richardson
164447717f Fix formulation 2020-11-20 15:30:23 +01:00
Omar Richardson
0472ef0533 Central internal_ssl switch 2020-11-20 15:27:50 +01:00
Erik Sundell
202efae6d8 Merge pull request #3177 from minrk/user-state-filter
add ?state= filter for GET /users
2020-11-20 11:06:15 +01:00
Min RK
2e043241fb Merge pull request #3261 from minrk/next-append-query
Only preserve params when ?next= is unspecified
2020-11-20 09:47:20 +01:00
Min RK
fa61f06fed Merge pull request #3237 from alexweav/cleanup-leftover-proxy
[proxy.py] Improve robustness when detecting and closing existing proxy processes
2020-11-20 09:45:53 +01:00
Min RK
8b19413fa1 Merge pull request #3242 from consideRatio/pr/py36-async-await
Assume py36 and remove @gen.coroutine etc.
2020-11-20 09:31:43 +01:00
Min RK
7c2e7692b0 Merge pull request #3265 from ideonate/master
Fix RootHandler when default_url is a callable
2020-11-20 09:14:46 +01:00
Tim Head
ce11959b1a Merge pull request #3267 from slemonide/patch-1
Update services.md
2020-11-19 14:07:56 +01:00
fyrzbavqr
097974d57d Update services.md
Fix small typo
2020-11-19 04:14:54 -08:00
Omar Richardson
09ff03ca4f Superfluous import statement 2020-11-19 13:10:48 +01:00
Omar Richardson
313f050c42 Reduced ssl to run for active tests only 2020-11-19 12:58:38 +01:00
Omar Richardson
4862831f71 Trying with different configuration 2020-11-19 12:08:10 +01:00
Omar Richardson
c46beb976a Moving ssl tests to testing matrix 2020-11-19 11:59:03 +01:00
Long Vu
11a85d1dc5 login page: allow full override of the optional accept terms and conditions feature
The text was already overridable but the endblock was at the wrong
location.

Now the javascript can also be overridden.
2020-11-18 14:25:49 -05:00
Dan Lester
67c4a86376 Fix RootHandler when default_url is a callable 2020-11-18 12:55:44 +00:00
Long Vu
e00ef1aef1 Merge remote-tracking branch 'origin/master' into add-user-agreement-to-login-screen 2020-11-17 17:27:30 -05:00
Long Vu
fb5f98f2fa login page: add optional feature to accept terms and conditions in order to login
The feature is disabled by default.

If enabled (by setting `login_term_url`), the user will have to check the
checkbox to accept the terms and conditions in order to log in.
2020-11-17 17:24:38 -05:00
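A sketch of enabling it in `jupyterhub_config.py`; the exact config path is an assumption based on the trait name in the commit message, and the URL is a placeholder:

```python
# jupyterhub_config.py (sketch; trait path assumed from the commit message)
c.JupyterHub.login_term_url = "https://example.com/terms-and-conditions"
```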
Alex Weaver
82a1ba8402 Import psutil and perform cmdline check on Windows only 2020-11-17 13:02:35 -06:00
Alex Weaver
7f53ad52fb Assume that permission errors when getting process metadata indicate a non-running proxy 2020-11-17 12:55:34 -06:00
agp8x
73cdd687e9 fix formatting 2020-11-17 15:36:30 +01:00
agp8x
af09bc547a change metric prefix to jupyterhub 2020-11-17 15:29:37 +01:00
Min RK
3ddc796068 verify that tornado gen.coroutine and run_on_executor are awaitable
- our APIs require that methods return 'awaitables'
- make sure that the older ways to create tornado 'yieldables' still produce 'awaitables'
2020-11-17 12:38:42 +01:00
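A minimal sketch of the guarantee being verified (independent of the actual test code): a legacy `@gen.coroutine` must still yield something `await` accepts.

```python
# Older tornado 'yieldables' must remain awaitable by async/await callers
import asyncio
from tornado import gen

@gen.coroutine
def legacy_coroutine():
    yield gen.sleep(0)  # old-style yield of a tornado Future
    return "ok"

async def main():
    result = await legacy_coroutine()  # @gen.coroutine returns an awaitable Future
    assert result == "ok"

asyncio.run(main())
```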
Min RK
3c071467bb require tornado 5.1, async_generator 1.9
- maybe_future relies on changes in 5.1, not in 5.0
- async_generator.asynccontextmanager is new in 1.9
2020-11-17 12:23:39 +01:00
Min RK
0c43feee1b run tests with oldest-supported versions
to catch any cases where we make assumptions about more recent versions than we claim to support
2020-11-17 12:22:46 +01:00
Min RK
5bcbc8b328 Merge pull request #3252 from cmd-ntrf/signin
Standardize "Sign in" capitalization on the login page
2020-11-17 11:59:26 +01:00
Min RK
87e4f458fb only preserve params when ?next= is not specified 2020-11-17 11:58:28 +01:00
Min RK
808e8711e1 Merge pull request #3176 from yuvipanda/async_template
Enable async support in jinja2 templates
2020-11-17 11:46:23 +01:00
YuviPanda
19935254a7 Fix pre-commit errors 2020-11-17 15:58:38 +05:30
YuviPanda
a499940309 Remove extraneous coroutine creation
You can 'pass through' coroutines like this without
yield.
2020-11-17 15:41:40 +05:30
YuviPanda
74544009ca Remove extraneous print statement
Was a debugging aid
2020-11-17 15:41:22 +05:30
YuviPanda
665f9fa693 Drop Python 3.5 support
See https://github.com/jupyterhub/jupyterhub/pull/3176#issuecomment-694315759

For Travis, I push the version cascade down one step.
Should preserve our test coverage while conserving test
duration
2020-11-17 15:39:55 +05:30
YuviPanda
24b555185a Revert "Run templates synchronously for Python 3.5"
This reverts commit f1155d6c2afbcbd875c7addc88784313c77da8e9.

Instead, let's stop supporting 3.5!
2020-11-17 15:39:26 +05:30
YuviPanda
24f4b7b6b6 Run templates synchronously for Python 3.5
jinja2's async support requires Python 3.6+. That should
be an implementation detail - so we render it in the main
thread (current behavior) but pretend we did not
2020-11-17 15:39:26 +05:30
YuviPanda
217dffa845 Fix typo in format string 2020-11-17 15:39:26 +05:30
YuviPanda
a7b796fa57 Autoformat with black 2020-11-17 15:39:21 +05:30
YuviPanda
6c5fb5fe97 F-strings are Python 3.6, not 3.5 2020-11-17 15:38:29 +05:30
Yuvi Panda
20ea322e25 Fix typo
Co-authored-by: Tim Head <betatim@gmail.com>
2020-11-17 15:38:29 +05:30
YuviPanda
4f9664cfe2 Provide sync versions of render_template too
write_error is a synchronous method called by an async
method from inside the event loop. This means we can't just
schedule an async render_templates in the same loop and wait
for it - that would deadlock.

jinja2 compiles your code differently based on whether you
enable async support or not. Templates compiled with async
support can't be used in cases like ours, where we already
have an event loop running and are calling a sync function. So
we maintain two almost identical jinja2 environments
2020-11-17 15:38:29 +05:30
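A minimal sketch of the two-environment approach described above (loader path and function names are illustrative, not the actual implementation):

```python
from jinja2 import Environment, FileSystemLoader

loader = FileSystemLoader(["templates"])
async_env = Environment(loader=loader, autoescape=True, enable_async=True)
sync_env = Environment(loader=loader, autoescape=True)

async def render_template(name, **ns):
    # for handlers running inside the event loop that can await
    return await async_env.get_template(name).render_async(**ns)

def render_template_sync(name, **ns):
    # for write_error: called synchronously from inside the loop, so it must
    # not wait on a coroutine scheduled on that same loop (that would deadlock)
    return sync_env.get_template(name).render(**ns)
```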
YuviPanda
be211a48ef Enable async jinja2 template rendering
Follows https://jinja.palletsprojects.com/en/2.11.x/api/#async-support

- This blocks the main thread fewer times
- We can use async methods inside templates too
2020-11-17 15:38:29 +05:30
Min RK
553ee26312 preserve url params in ?next from root page 2020-11-17 10:45:11 +01:00
Erik Sundell
7e6111448a Merge pull request #3253 from minrk/wait-admin-form
wait for pending spawns in spawn_form_admin_access
2020-11-16 02:39:11 +01:00
Erik Sundell
ccc0294f2e Merge pull request #3257 from manics/jupyterhub_idle_culler
Update services-basics.md to ues jupyterhub_idle_culler
2020-11-14 17:37:17 +01:00
Simon Li
3232ad61aa Update services-basics.md to use jupyterhub_idle_culler
Closes https://github.com/jupyterhub/jupyterhub/issues/3256
2020-11-14 15:59:56 +00:00
Min RK
202a5bf9a5 Merge pull request #3255 from fcollonval/patch-1
Environment marker on pamela
2020-11-13 10:28:28 +01:00
Frédéric Collonval
47136f6a3c Environment marker on pamela 2020-11-13 09:57:20 +01:00
Min RK
5d3161c6ef wait for pending spawns in spawn_form_admin_access
copy logic from test_spawn_admin_access
2020-11-12 10:16:48 +01:00
Félix-Antoine Fortin
9da4aa236e Standardize Sign in capitalization on the login page 2020-11-11 13:01:14 -05:00
Erik Sundell
d581cf54cb Retain an assertion and update comments 2020-11-11 15:40:54 +01:00
Erik Sundell
fca2528332 Retain explicit pytest mark asyncio on our coroutines 2020-11-11 14:47:41 +01:00
Erik Sundell
5edd246474 Replace @async_generator/yield_ with async/yield 2020-11-11 14:47:29 +01:00
Erik Sundell
77ed2faf31 Replace gen.multi(futures) with asyncio.gather(*futures) 2020-11-11 14:47:24 +01:00
Erik Sundell
4a17441e5a Replace gen.sleep with asyncio.sleep 2020-11-11 14:40:59 +01:00
Erik Sundell
e1166ec834 Replace @gen.coroutine/yield with async/await 2020-11-11 14:36:56 +01:00
Erik Sundell
2a1d341586 Merge pull request #3250 from minrk/test-condition
remove push-branch conditions for CI
2020-11-11 12:21:52 +01:00
Min RK
55a59a2e43 remove push-branch conditions for CI
testing other branches is useful, and there's little cost to removing the conditions:

- we don't run PRs from our repo, so test runs aren't duplicated on the repo
- testing on a fork without opening a PR is still useful (I use this often)
- if we push a branch, it should probably be tested (e.g. backport branch), and filters make this extra work
- the cost of running a few extra tests is low, especially given actions' current quotas and parallelism
2020-11-11 09:12:58 +01:00
Min RK
e019a33509 Merge pull request #3246 from consideRatio/pr/migrate-to-gh-actions-from-travis
Migrate from travis to GitHub actions
2020-11-11 09:06:58 +01:00
Erik Sundell
737dcf65eb Fix mysql/postgresql auth and comment struggles 2020-11-10 19:20:47 +01:00
Erik Sundell
9deaeb1fa9 Final variable name update 2020-11-10 16:19:22 +01:00
Erik Sundell
bcfc2c1b0d Cleanup use of database related environment variables 2020-11-10 16:16:28 +01:00
Erik Sundell
f71bacc998 Apply suggestions from code review
Co-authored-by: Min RK <benjaminrk@gmail.com>
2020-11-10 15:39:46 +01:00
Erik Sundell
ff14b1aa71 CI: use --maxfail=2 2020-11-10 11:14:59 +01:00
Erik Sundell
ebbbdcb2b1 Refactor ci/docker-db and ci/init-db 2020-11-10 11:14:40 +01:00
Erik Sundell
d0fca9e56b Reword comment 2020-11-10 10:03:53 +01:00
Erik Sundell
517737aa0b Add notes about not needing "set -e" etc. 2020-11-10 02:17:44 +01:00
Erik Sundell
5dadd34a87 Help GitHub UI present the job parameterization + inline comments 2020-11-10 02:17:40 +01:00
Erik Sundell
df134fefd0 Refactor pre-commit to its own job 2020-11-10 01:17:30 +01:00
Erik Sundell
47cec97e63 Let pytest fail on first error 2020-11-10 01:16:12 +01:00
Erik Sundell
0b8b87d7d0 Remove debugging trigger 2020-11-09 07:43:42 +01:00
Erik Sundell
3bf1d72905 Test in Ubuntu 20.04 2020-11-09 07:42:45 +01:00
Erik Sundell
8cdd449cca Unpin mysql-connector-python and resolve errors 2020-11-09 07:42:12 +01:00
Erik Sundell
6fc3c19763 For CI readability, exit on first failure 2020-11-09 07:41:05 +01:00
Erik Sundell
265dc07c78 Remove .travis.yml, add GitHub workflow 2020-11-09 07:40:15 +01:00
Erik Sundell
1ae039ddef Remove py3.7+ breaking test variation (has~x)
The jupyterhub/tests/test_spawner.py::test_spawner_routing[has~x] test
failed in py37+ but not in py36; I think this is due to a change in
Python's socket library.

This is a stacktrace from Python/3.7.9/x64/lib/python3.7/site-packages/urllib3/util/connection.py:61

```
>       for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E       socket.gaierror: [Errno -2] Name or service not known
```

Here is relevant documentation about socket.getaddrinfo.

https://docs.python.org/3.7/library/socket.html#socket.getaddrinfo
2020-11-09 07:32:11 +01:00
Erik Sundell
378d34b213 Don't ignore outer env vars 2020-11-09 07:31:16 +01:00
Mohammad Wasil
9657430cac Fix reverse proxy redirect from https 2020-11-04 17:59:28 +01:00
Mohammad Wasil
6271535f46 Merge pull request #1 from jupyterhub/master
Merge from jupyterhub/jupyterhub master
2020-11-04 17:02:28 +01:00
agp8x
2bef5ba981 Add prefix to prometheus metrics to group all jupyter metrics (see #1585) 2020-11-04 13:54:31 +01:00
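For reference, a sketch of what the prefix change means when defining a metric with prometheus_client (the metric name here is illustrative, not one of JupyterHub's actual metrics):

```python
from prometheus_client import Histogram

REQUEST_DURATION_SECONDS = Histogram(
    "request_duration_seconds",
    "request duration for all HTTP requests",
    namespace="jupyterhub",  # exposed as jupyterhub_request_duration_seconds
)
```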
Alex Weaver
efb1f3c824 Run precommit hooks, fix formatting issue 2020-10-30 12:35:01 -05:00
Alex Weaver
53050a5836 Merge branch 'master' of https://github.com/jupyterhub/jupyterhub into cleanup-leftover-proxy 2020-10-30 12:14:08 -05:00
Alex Weaver
6428ad9f0b Check proxy cmd before shutting down, cleaner shutdown on Windows 2020-10-30 12:13:50 -05:00
Min RK
9068ff2239 back to dev 2020-10-30 13:22:14 +01:00
Min RK
fc6cd33ce0 release 1.2.1 2020-10-30 13:20:43 +01:00
Erik Sundell
b0b8e2d058 Merge pull request #3235 from minrk/changelog-1.2.1
Changelog for 1.2.1
2020-10-30 13:19:52 +01:00
Erik Sundell
6bfa402bfa Apply suggestions from code review 2020-10-30 13:19:18 +01:00
Min RK
b51a0bba92 Changelog for 1.2.1 2020-10-30 13:15:19 +01:00
Erik Sundell
2d3f962a1d Merge pull request #3234 from gesiscss/master
Make external JupyterHub services' oauth_no_confirm configuration work as intended
2020-10-30 13:07:39 +01:00
Kenan Erdogan
625242136a fix checking if oauth confirm is needed 2020-10-30 10:39:02 +01:00
Min RK
f92560fed0 back to dev 2020-10-29 14:06:20 +01:00
Min RK
8249ef69f0 release jupyterhub 1.2.0 2020-10-29 14:03:34 +01:00
Min RK
c63605425f Merge pull request #3233 from minrk/1.2.0-final
latest changelog since 1.2.0b1
2020-10-29 14:03:01 +01:00
Min RK
5b57900c0b 1.2.0 heading in changelog
Co-authored-by: Erik Sundell <erik.i.sundell@gmail.com>
2020-10-29 14:02:35 +01:00
Erik Sundell
d0afdabd4c order changelog entries systematically 2020-10-29 13:13:02 +01:00
Min RK
618746fa00 latest changelog since 1.2.0b1 2020-10-29 13:02:04 +01:00
Min RK
e7bc6c2ba9 Merge pull request #3229 from minrk/configurable-pagination
make pagination configurable
2020-10-29 10:53:29 +01:00
Min RK
e9f86cd602 make pagination configurable
add some unittests for pagination

reorganize pagination a bit to make it easier to configure
2020-10-29 09:24:34 +01:00
Erik Sundell
6e8517f795 Merge pull request #3232 from consideRatio/pr/travis-badge
Update travis-ci badge in README.md
2020-10-28 23:01:04 +01:00
Erik Sundell
5fa540bea1 Update travis-ci badge in README.md 2020-10-28 22:59:44 +01:00
Min RK
99f597887c Merge pull request #3223 from consideRatio/pr/proxy-api_request-retries
Make api_request to CHP's REST API more reliable
2020-10-28 15:21:23 +01:00
Erik Sundell
352526c36a Merge pull request #3226 from xlotlu/patch-1
Fix typo in documentation
2020-10-28 08:09:11 +01:00
Ionuț Ciocîrlan
cbbed04eed fix typo 2020-10-28 03:00:31 +02:00
Christine Banek
b2e7b474ff Allow options_from_form to be configurable 2020-10-27 12:11:48 -07:00
Erik Sundell
b2756fb18c Retry on >=500 errors on hub to proxy REST API requests 2020-10-27 16:53:53 +01:00
Erik Sundell
37b88029e4 Revert improved logging attempt 2020-10-27 16:28:56 +01:00
Erik Sundell
4b7413184e Adjust hub to proxy REST API requests' timeouts 2020-10-27 16:23:40 +01:00
Min RK
41ef0da180 Merge pull request #3219 from elgalu/patch-3
Fix #2284 must be sent from authorization page
2020-10-27 15:41:05 +01:00
Erik Sundell
a4a8b3fa2c Fix scope mistake 2020-10-27 13:38:34 +01:00
Erik Sundell
02e5984f34 Let API requests to CHP retry on 429,500,503,504 as well 2020-10-27 12:52:14 +01:00
Erik Sundell
b91c5a489c Rely on HTTPError over pycurl assumed CurlError 2020-10-26 20:39:20 +01:00
Erik Sundell
c47c3b2f9e Make api_request to CHP's REST API more reliable 2020-10-25 02:35:36 +01:00
Min RK
eaa1353dcd typos in use of partition 2020-10-23 14:16:46 +02:00
Leo Gallucci
b9a3b0a66a Fix #2284 must be sent from authorization page / Update jupyterhub/apihandlers/auth.py
Co-authored-by: Min RK <benjaminrk@gmail.com>
2020-10-22 11:36:15 +02:00
Leo Gallucci
929b805fae Fix #2284 must be sent from authorization page
Fix #2284 Authorization form must be sent from authorization page
2020-10-21 17:57:14 +02:00
Min RK
30b8bc3664 add ?state= filter for GET /users
allows selecting users based on the 'ready', 'active', or 'inactive' states of their servers

- ready: users who have any servers in the 'ready' state
- active: users who have any servers in the 'active' state (i.e. ready OR pending)
- inactive: users who have *no* servers in the 'active' state (inactive + active = all users, no overlap)

Does not change the user model, so a user with *any* ready servers will still return all their servers
2020-09-17 12:31:16 +02:00
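Usage sketch of the new filter (host is a placeholder for a default local Hub; requires an admin token):

```python
import os
import requests

r = requests.get(
    "http://127.0.0.1:8081/hub/api/users",
    params={"state": "active"},  # or "ready" / "inactive"
    headers={"Authorization": f"token {os.environ['JUPYTERHUB_API_TOKEN']}"},
)
r.raise_for_status()
print([user["name"] for user in r.json()])
```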
58 changed files with 1329 additions and 638 deletions

225
.github/workflows/test.yml vendored Normal file
View File

@@ -0,0 +1,225 @@
# This is a GitHub workflow defining a set of jobs with a set of steps.
# ref: https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-syntax-for-github-actions
#
name: Run tests

# Trigger the workflow on all PRs, but on pushes only for tags or commits to
# the main/master branch, to avoid double-triggering PRs developed in a
# dedicated branch of a GitHub fork.
on:
  pull_request:
  push:

defaults:
  run:
    # Declare bash to be used by default in this workflow's "run" steps.
    #
    # NOTE: bash will by default run with:
    #   --noprofile: Ignore ~/.profile etc.
    #   --norc:      Ignore ~/.bashrc etc.
    #   -e:          Exit directly on errors
    #   -o pipefail: Don't mask errors from a command piped into another command
    shell: bash

env:
  # UTF-8 content may be interpreted as ascii and cause errors without this.
  LANG: C.UTF-8

jobs:
  # Run "pre-commit run --all-files"
  pre-commit:
    runs-on: ubuntu-20.04
    timeout-minutes: 2
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: 3.8
      # ref: https://github.com/pre-commit/action
      - uses: pre-commit/action@v2.0.0
      - name: Help message if pre-commit fails
        if: ${{ failure() }}
        run: |
          echo "You can install pre-commit hooks to automatically run formatting"
          echo "on each commit with:"
          echo "    pre-commit install"
          echo "or you can run by hand on staged files with"
          echo "    pre-commit run"
          echo "or after-the-fact on already committed files with"
          echo "    pre-commit run --all-files"

  # Run "pytest jupyterhub/tests" in various configurations
  pytest:
    runs-on: ubuntu-20.04
    timeout-minutes: 10

    strategy:
      # Keep running even if one variation of the job fails
      fail-fast: false
      matrix:
        # We run this job multiple times with the different parameterizations
        # specified below; these parameters have no meaning on their own and
        # gain meaning from how the job's steps use them.
        #
        # subdomain:
        #   Tests everything when JupyterHub is configured to add routes for
        #   users with dedicated subdomains like user1.jupyter.example.com
        #   rather than jupyter.example.com/user/user1.
        #
        # db: [mysql/postgres]
        #   Tests everything when JupyterHub works against a dedicated mysql or
        #   postgresql server.
        #
        # jupyter_server:
        #   Tests everything when the user instances are started with
        #   jupyter_server instead of notebook.
        #
        # ssl:
        #   Tests everything using internal SSL connections instead of
        #   unencrypted HTTP
        #
        # main_dependencies:
        #   Tests everything when we use the latest available dependencies
        #   from: ipython/traitlets.
        #
        # NOTE: Since only the values of these parameters are presented in the
        #       GitHub UI when the workflow runs, we avoid using true/false as
        #       values and instead duplicate the name to signal true.
        include:
          - python: "3.6"
            oldest_dependencies: oldest_dependencies
          - python: "3.6"
            subdomain: subdomain
          - python: "3.7"
            db: mysql
          - python: "3.7"
            ssl: ssl
          - python: "3.8"
            db: postgres
          - python: "3.8"
            jupyter_server: jupyter_server
          - python: "3.9"
            main_dependencies: main_dependencies

    steps:
      # NOTE: In GitHub workflows, environment variables are set by writing
      #       assignment statements to a file. They will be set in the
      #       following steps as if we had used `export MY_ENV=my-value`.
      - name: Configure environment variables
        run: |
          if [ "${{ matrix.subdomain }}" != "" ]; then
              echo "JUPYTERHUB_TEST_SUBDOMAIN_HOST=http://localhost.jovyan.org:8000" >> $GITHUB_ENV
          fi
          if [ "${{ matrix.db }}" == "mysql" ]; then
              echo "MYSQL_HOST=127.0.0.1" >> $GITHUB_ENV
              echo "JUPYTERHUB_TEST_DB_URL=mysql+mysqlconnector://root@127.0.0.1:3306/jupyterhub" >> $GITHUB_ENV
          fi
          if [ "${{ matrix.ssl }}" == "ssl" ]; then
              echo "SSL_ENABLED=1" >> $GITHUB_ENV
          fi
          if [ "${{ matrix.db }}" == "postgres" ]; then
              echo "PGHOST=127.0.0.1" >> $GITHUB_ENV
              echo "PGUSER=test_user" >> $GITHUB_ENV
              echo "PGPASSWORD=hub[test/:?" >> $GITHUB_ENV
              echo "JUPYTERHUB_TEST_DB_URL=postgresql://test_user:hub%5Btest%2F%3A%3F@127.0.0.1:5432/jupyterhub" >> $GITHUB_ENV
          fi
          if [ "${{ matrix.jupyter_server }}" != "" ]; then
              echo "JUPYTERHUB_SINGLEUSER_APP=jupyterhub.tests.mockserverapp.MockServerApp" >> $GITHUB_ENV
          fi

      - uses: actions/checkout@v2
      # NOTE: actions/setup-node@v1 makes use of a cache within the GitHub base
      #       environment and sets up in a fraction of a second.
      - name: Install Node v14
        uses: actions/setup-node@v1
        with:
          node-version: "14"
      - name: Install Node dependencies
        run: |
          npm install
          npm install -g configurable-http-proxy
          npm list

      # NOTE: actions/setup-python@v2 makes use of a cache within the GitHub
      #       base environment and sets up in a fraction of a second.
      - name: Install Python ${{ matrix.python }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python }}
      - name: Install Python dependencies
        run: |
          pip install --upgrade pip
          pip install --upgrade . -r dev-requirements.txt

          if [ "${{ matrix.oldest_dependencies }}" != "" ]; then
              # take any dependencies in requirements.txt such as tornado>=5.0
              # and transform them to tornado==5.0 so we can run tests with
              # the earliest-supported versions
              cat requirements.txt | grep '>=' | sed -e 's@>=@==@g' > oldest-requirements.txt
              pip install -r oldest-requirements.txt
          fi
          if [ "${{ matrix.main_dependencies }}" != "" ]; then
              pip install git+https://github.com/ipython/traitlets#egg=traitlets --force
          fi
          if [ "${{ matrix.jupyter_server }}" != "" ]; then
              pip uninstall notebook --yes
              pip install jupyter_server
          fi
          if [ "${{ matrix.db }}" == "mysql" ]; then
              pip install mysql-connector-python
          fi
          if [ "${{ matrix.db }}" == "postgres" ]; then
              pip install psycopg2-binary
          fi
          pip freeze

      # NOTE: If you need to debug this DB setup step, consider the following.
      #
      # 1. mysql/postgresql are database servers we start as docker containers,
      #    and we use clients named mysql/psql.
      #
      # 2. When we start a database server we need to pass environment
      #    variables explicitly as part of the `docker run` command. These
      #    environment variables are named differently from the similarly named
      #    environment variables used by the clients.
      #
      #    - mysql server ref: https://hub.docker.com/_/mysql/
      #    - mysql client ref: https://dev.mysql.com/doc/refman/5.7/en/environment-variables.html
      #    - postgres server ref: https://hub.docker.com/_/postgres/
      #    - psql client ref: https://www.postgresql.org/docs/9.5/libpq-envars.html
      #
      # 3. When the clients connect, they should use 127.0.0.1 rather than the
      #    default way of connecting, which otherwise leads to errors like the
      #    one below for both mysql and postgresql, unless we set
      #    MYSQL_HOST/PGHOST to 127.0.0.1.
      #
      #    - ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
      #
      - name: Start a database server (${{ matrix.db }})
        if: ${{ matrix.db }}
        run: |
          if [ "${{ matrix.db }}" == "mysql" ]; then
              sudo apt-get update
              sudo apt-get install -y mysql-client
              DB=mysql bash ci/docker-db.sh
              DB=mysql bash ci/init-db.sh
          fi
          if [ "${{ matrix.db }}" == "postgres" ]; then
              sudo apt-get update
              sudo apt-get install -y postgresql-client
              DB=postgres bash ci/docker-db.sh
              DB=postgres bash ci/init-db.sh
          fi

      - name: Run pytest
        # FIXME: --color=yes explicitly set because:
        #        https://github.com/actions/runner/issues/241
        run: |
          pytest -v --maxfail=2 --color=yes --cov=jupyterhub jupyterhub/tests
      - name: Submit codecov report
        run: |
          codecov

1
.gitignore vendored
View File

@@ -28,3 +28,4 @@ htmlcov
 .pytest_cache
 pip-wheel-metadata
 docs/source/reference/metrics.rst
+oldest-requirements.txt

View File

@@ -4,7 +4,7 @@ repos:
   hooks:
   - id: reorder-python-imports
 - repo: https://github.com/psf/black
-  rev: 19.10b0
+  rev: 20.8b1
   hooks:
   - id: black
 - repo: https://github.com/pre-commit/pre-commit-hooks

View File

@@ -1,120 +0,0 @@
dist: bionic
language: python
cache:
  - pip
env:
  global:
    - MYSQL_HOST=127.0.0.1
    - MYSQL_TCP_PORT=13306

# request additional services for the jobs to access
services:
  - postgresql
  - docker

# install dependencies for running pytest (but not linting)
before_install:
  - set -e
  - nvm install 6; nvm use 6
  - npm install
  - npm install -g configurable-http-proxy
  - |
    # setup database
    if [[ $JUPYTERHUB_TEST_DB_URL == mysql* ]]; then
      unset MYSQL_UNIX_PORT
      DB=mysql bash ci/docker-db.sh
      DB=mysql bash ci/init-db.sh
      # FIXME: mysql-connector-python 8.0.16 incorrectly decodes bytes to str
      # ref: https://bugs.mysql.com/bug.php?id=94944
      pip install 'mysql-connector-python==8.0.11'
    elif [[ $JUPYTERHUB_TEST_DB_URL == postgresql* ]]; then
      psql -c "CREATE USER $PGUSER WITH PASSWORD '$PGPASSWORD';" -U postgres
      DB=postgres bash ci/init-db.sh
      pip install psycopg2-binary
    fi

# install general dependencies
install:
  - pip install --upgrade pip
  - pip install --upgrade --pre -r dev-requirements.txt .
  - |
    if [[ "$MASTER_DEPENDENCIES" == "True" ]]; then
      pip install git+https://github.com/ipython/traitlets#egg=traitlets --force
    fi
  - |
    if [[ "$TEST" == "jupyter_server" ]]; then
      pip uninstall notebook --yes
      pip install jupyter_server
    fi
  - pip freeze

# run tests
script:
  - pytest -v --maxfail=2 --cov=jupyterhub jupyterhub/tests

# collect test coverage information
after_success:
  - codecov

# list the jobs
jobs:
  include:
    - name: autoformatting check
      python: 3.6
      # NOTE: It does not suffice to override to: null, [], or [""]. Travis
      #       will fall back to the default if we do.
      before_install: echo "Do nothing before install."
      script:
        - pre-commit run --all-files
      after_success: echo "Do nothing after success."
      after_failure:
        - |
          echo "You can install pre-commit hooks to automatically run formatting"
          echo "on each commit with:"
          echo "    pre-commit install"
          echo "or you can run by hand on staged files with"
          echo "    pre-commit run"
          echo "or after-the-fact on already committed files with"
          echo "    pre-commit run --all-files"
    # When we run pytest, we want to run it with python>=3.5 as well as with
    # various configurations. We increment the python version at the same time
    # as we test new configurations in order to reduce the number of test jobs.
    - name: python:3.5 + dist:xenial
      python: 3.5
      dist: xenial
    - name: python:3.6 + subdomain
      python: 3.6
      env: JUPYTERHUB_TEST_SUBDOMAIN_HOST=http://localhost.jovyan.org:8000
    - name: python:3.7 + mysql
      python: 3.7
      env:
        - JUPYTERHUB_TEST_DB_URL=mysql+mysqlconnector://root@127.0.0.1:$MYSQL_TCP_PORT/jupyterhub
    - name: python:3.8 + postgresql
      python: 3.8
      env:
        - PGUSER=jupyterhub
        - PGPASSWORD=hub[test/:?
        # The password in url below is url-encoded with: urllib.parse.quote($PGPASSWORD, safe='')
        - JUPYTERHUB_TEST_DB_URL=postgresql://jupyterhub:hub%5Btest%2F%3A%3F@127.0.0.1/jupyterhub
    - name: python:3.8 + master dependencies
      python: 3.8
      env:
        - PGUSER=jupyterhub
        - PGPASSWORD=hub[test/:?
        # The password in url below is url-encoded with: urllib.parse.quote($PGPASSWORD, safe='')
        - JUPYTERHUB_TEST_DB_URL=postgresql://jupyterhub:hub%5Btest%2F%3A%3F@127.0.0.1/jupyterhub
        - MASTER_DEPENDENCIES=True
    - name: python:3.8 + jupyter_server
      python: 3.8
      env:
        - TEST=jupyter_server
        - JUPYTERHUB_SINGLEUSER_APP=jupyterhub.tests.mockserverapp.MockServerApp
    - name: python:nightly
      python: nightly
  allow_failures:
    - name: python:nightly
  # https://github.com/jupyterhub/jupyterhub/issues/3141
  # The latest traitlets is close to release so it should not fail
  # - name: python:3.8 + master dependencies
  fast_finish: true

View File

@@ -1,7 +1,7 @@
 # Contributing to JupyterHub
 Welcome! As a [Jupyter](https://jupyter.org) project,
-you can follow the [Jupyter contributor guide](https://jupyter.readthedocs.io/en/latest/contributor/content-contributor.html).
+you can follow the [Jupyter contributor guide](https://jupyter.readthedocs.io/en/latest/contributing/content-contributor.html).
 Make sure to also follow [Project Jupyter's Code of Conduct](https://github.com/jupyter/governance/blob/master/conduct/code_of_conduct.md)
 for a friendly and welcoming collaborative environment.

View File

@@ -13,7 +13,7 @@
 [![Latest PyPI version](https://img.shields.io/pypi/v/jupyterhub?logo=pypi)](https://pypi.python.org/pypi/jupyterhub)
 [![Latest conda-forge version](https://img.shields.io/conda/vn/conda-forge/jupyterhub?logo=conda-forge)](https://www.npmjs.com/package/jupyterhub)
 [![Documentation build status](https://img.shields.io/readthedocs/jupyterhub?logo=read-the-docs)](https://jupyterhub.readthedocs.org/en/latest/)
-[![TravisCI build status](https://img.shields.io/travis/jupyterhub/jupyterhub/master?logo=travis)](https://travis-ci.org/jupyterhub/jupyterhub)
+[![TravisCI build status](https://img.shields.io/travis/com/jupyterhub/jupyterhub?logo=travis)](https://travis-ci.com/jupyterhub/jupyterhub)
 [![DockerHub build status](https://img.shields.io/docker/build/jupyterhub/jupyterhub?logo=docker&label=build)](https://hub.docker.com/r/jupyterhub/jupyterhub/tags)
 [![CircleCI build status](https://img.shields.io/circleci/build/github/jupyterhub/jupyterhub?logo=circleci)](https://circleci.com/gh/jupyterhub/jupyterhub)<!-- CircleCI Token: b5b65862eb2617b9a8d39e79340b0a6b816da8cc -->
 [![Test coverage of code](https://codecov.io/gh/jupyterhub/jupyterhub/branch/master/graph/badge.svg)](https://codecov.io/gh/jupyterhub/jupyterhub)

View File

@@ -1,36 +1,55 @@
 #!/usr/bin/env bash
-# source this file to setup postgres and mysql
-# for local testing (as similar as possible to docker)
+# The goal of this script is to start a database server as a docker container.
+#
+# Required environment variables:
+# - DB: The database server to start, either "postgres" or "mysql".
+#
+# - PGUSER/PGPASSWORD: For the creation of a postgresql user with associated
+#   password.

 set -eu

-export MYSQL_HOST=127.0.0.1
-export MYSQL_TCP_PORT=${MYSQL_TCP_PORT:-13306}
-export PGHOST=127.0.0.1
-NAME="hub-test-$DB"
-DOCKER_RUN="docker run -d --name $NAME"
-
-docker rm -f "$NAME" 2>/dev/null || true
+# Stop and remove any existing database container
+DOCKER_CONTAINER="hub-test-$DB"
+docker rm -f "$DOCKER_CONTAINER" 2>/dev/null || true

-case "$DB" in
-"mysql")
-    RUN_ARGS="-e MYSQL_ALLOW_EMPTY_PASSWORD=1 -p $MYSQL_TCP_PORT:3306 mysql:5.7"
-    CHECK="mysql --host $MYSQL_HOST --port $MYSQL_TCP_PORT --user root -e \q"
-    ;;
-"postgres")
-    RUN_ARGS="-p 5432:5432 postgres:9.5"
-    CHECK="psql --user postgres -c \q"
-    ;;
-*)
+# Prepare environment variables to startup and await readiness of either a mysql
+# or postgresql server.
+if [[ "$DB" == "mysql" ]]; then
+    # Environment variables can influence both the mysql server in the docker
+    # container and the mysql client.
+    #
+    # ref server: https://hub.docker.com/_/mysql/
+    # ref client: https://dev.mysql.com/doc/refman/5.7/en/setting-environment-variables.html
+    #
+    DOCKER_RUN_ARGS="-p 3306:3306 --env MYSQL_ALLOW_EMPTY_PASSWORD=1 mysql:5.7"
+    READINESS_CHECK="mysql --user root --execute \q"
+elif [[ "$DB" == "postgres" ]]; then
+    # Environment variables can influence both the postgresql server in the
+    # docker container and the postgresql client (psql).
+    #
+    # ref server: https://hub.docker.com/_/postgres/
+    # ref client: https://www.postgresql.org/docs/9.5/libpq-envars.html
+    #
+    # POSTGRES_USER / POSTGRES_PASSWORD will create a user on startup of the
+    # postgres server, but PGUSER and PGPASSWORD are the environment variables
+    # used by the postgresql client psql, so we configure the user based on how
+    # we want to connect.
+    #
+    DOCKER_RUN_ARGS="-p 5432:5432 --env "POSTGRES_USER=${PGUSER}" --env "POSTGRES_PASSWORD=${PGPASSWORD}" postgres:9.5"
+    READINESS_CHECK="psql --command \q"
+else
     echo '$DB must be mysql or postgres'
     exit 1
-esac
+fi

-$DOCKER_RUN $RUN_ARGS
+# Start the database server
+docker run --detach --name "$DOCKER_CONTAINER" $DOCKER_RUN_ARGS

+# Wait for the database server to start
 echo -n "waiting for $DB "
 for i in {1..60}; do
-    if $CHECK; then
+    if $READINESS_CHECK; then
         echo 'done'
         break
     else
@@ -38,22 +57,4 @@ for i in {1..60}; do
         sleep 1
     fi
 done
-$CHECK
-
-case "$DB" in
-"mysql")
-    ;;
-"postgres")
-    # create the user
-    psql --user postgres -c "CREATE USER $PGUSER WITH PASSWORD '$PGPASSWORD';"
-    ;;
-*)
-esac
-
-echo -e "
-Set these environment variables:
-    export MYSQL_HOST=127.0.0.1
-    export MYSQL_TCP_PORT=$MYSQL_TCP_PORT
-    export PGHOST=127.0.0.1
-"
+$READINESS_CHECK

View File

@@ -1,27 +1,26 @@
 #!/usr/bin/env bash
-# initialize jupyterhub databases for testing
+# The goal of this script is to initialize a running database server with clean
+# databases for use during tests.
+#
+# Required environment variables:
+# - DB: The database server to start, either "postgres" or "mysql".

 set -eu

-MYSQL="mysql --user root --host $MYSQL_HOST --port $MYSQL_TCP_PORT -e "
-PSQL="psql --user postgres -c "
-
-case "$DB" in
-"mysql")
-    EXTRA_CREATE='CHARACTER SET utf8 COLLATE utf8_general_ci'
-    SQL="$MYSQL"
-    ;;
-"postgres")
-    SQL="$PSQL"
-    ;;
-*)
+# Prepare env vars SQL_CLIENT and EXTRA_CREATE_DATABASE_ARGS
+if [[ "$DB" == "mysql" ]]; then
+    SQL_CLIENT="mysql --user root --execute "
+    EXTRA_CREATE_DATABASE_ARGS='CHARACTER SET utf8 COLLATE utf8_general_ci'
+elif [[ "$DB" == "postgres" ]]; then
+    SQL_CLIENT="psql --command "
+else
     echo '$DB must be mysql or postgres'
     exit 1
-esac
+fi

+# Configure a set of databases in the database server for upgrade tests
 set -x
 for SUFFIX in '' _upgrade_072 _upgrade_081 _upgrade_094; do
-    $SQL "DROP DATABASE jupyterhub${SUFFIX};" 2>/dev/null || true
-    $SQL "CREATE DATABASE jupyterhub${SUFFIX} ${EXTRA_CREATE:-};"
+    $SQL_CLIENT "DROP DATABASE jupyterhub${SUFFIX};" 2>/dev/null || true
+    $SQL_CLIENT "CREATE DATABASE jupyterhub${SUFFIX} ${EXTRA_CREATE_DATABASE_ARGS:-};"
 done

View File

@@ -79,6 +79,21 @@ paths:
   /users:
     get:
       summary: List users
+      parameters:
+        - name: state
+          in: query
+          required: false
+          type: string
+          enum: ["inactive", "active", "ready"]
+          description: |
+            Return only users who have servers in the given state.
+            If unspecified, return all users.
+
+            active: all users with any active servers (ready OR pending)
+            ready: all users who have any ready servers (running, not pending)
+            inactive: all users who have *no* active servers (complement of active)
+
+            Added in JupyterHub 1.3
       responses:
         '200':
           description: The Hub's user list

File diff suppressed because one or more lines are too long

View File

@@ -21,7 +21,7 @@ Here is a quick breakdown of these three tools:
 * **The Jupyter Notebook** is a document specification (the `.ipynb`) file that interweaves
   narrative text with code cells and their outputs. It is also a graphical interface
   that allows users to edit these documents. There are also several other graphical interfaces
-  that allow users to edit the `.ipynb` format (nteract, Jupyer Lab, Google Colab, Kaggle, etc).
+  that allow users to edit the `.ipynb` format (nteract, Jupyter Lab, Google Colab, Kaggle, etc).
 * **JupyterLab** is a flexible and extendible user interface for interactive computing. It
   has several extensions that are tailored for using Jupyter Notebooks, as well as extensions
   for other parts of the data science stack.

View File

@@ -5,7 +5,7 @@ that interacts with the Hub's REST API. A Service may perform a specific
 or action or task. For example, shutting down individuals' single user
 notebook servers that have been idle for some time is a good example of
 a task that could be automated by a Service. Let's look at how the
-[cull_idle_servers][] script can be used as a Service.
+[jupyterhub_idle_culler][] script can be used as a Service.

 ## Real-world example to cull idle servers

@@ -15,11 +15,11 @@ document will:
 - explain some basic information about API tokens
 - clarify that API tokens can be used to authenticate to
   single-user servers as of [version 0.8.0](../changelog)
-- show how the [cull_idle_servers][] script can be:
+- show how the [jupyterhub_idle_culler][] script can be:
   - used in a Hub-managed service
   - run as a standalone script

-Both examples for `cull_idle_servers` will communicate tasks to the
+Both examples for `jupyterhub_idle_culler` will communicate tasks to the
 Hub via the REST API.

 ## API Token basics

@@ -78,17 +78,23 @@ single-user servers, and only cookies can be used for authentication.
 0.8 supports using JupyterHub API tokens to authenticate to single-user
 servers.

-## Configure `cull-idle` to run as a Hub-Managed Service
+## Configure the idle culler to run as a Hub-Managed Service
+
+Install the idle culler:
+
+```
+pip install jupyterhub-idle-culler
+```

 In `jupyterhub_config.py`, add the following dictionary for the
-`cull-idle` Service to the `c.JupyterHub.services` list:
+`idle-culler` Service to the `c.JupyterHub.services` list:

 ```python
 c.JupyterHub.services = [
     {
-        'name': 'cull-idle',
+        'name': 'idle-culler',
         'admin': True,
-        'command': [sys.executable, 'cull_idle_servers.py', '--timeout=3600'],
+        'command': [sys.executable, '-m', 'jupyterhub_idle_culler', '--timeout=3600'],
     }
 ]
 ```

@@ -101,21 +107,21 @@ where:

 ## Run `cull-idle` manually as a standalone script

-Now you can run your script, i.e. `cull_idle_servers`, by providing it
+Now you can run your script by providing it
 the API token and it will authenticate through the REST API to
 interact with it.

-This will run `cull-idle` manually. `cull-idle` can be run as a standalone
+This will run the idle culler service manually. It can be run as a standalone
 script anywhere with access to the Hub, and will periodically check for idle
 servers and shut them down via the Hub's REST API. In order to shutdown the
 servers, the token given to cull-idle must have admin privileges.

 Generate an API token and store it in the `JUPYTERHUB_API_TOKEN` environment
-variable. Run `cull_idle_servers.py` manually.
+variable. Run `jupyterhub_idle_culler` manually.

 ```bash
 export JUPYTERHUB_API_TOKEN='token'
-python3 cull_idle_servers.py [--timeout=900] [--url=http://127.0.0.1:8081/hub/api]
+python -m jupyterhub_idle_culler [--timeout=900] [--url=http://127.0.0.1:8081/hub/api]
 ```

-[cull_idle_servers]: https://github.com/jupyterhub/jupyterhub/blob/master/examples/cull-idle/cull_idle_servers.py
+[jupyterhub_idle_culler]: https://github.com/jupyterhub/jupyterhub-idle-culler

View File

@@ -235,10 +235,9 @@ to Spawner environment:
 ```python
 class MyAuthenticator(Authenticator):
-    @gen.coroutine
-    def authenticate(self, handler, data=None):
-        username = yield identify_user(handler, data)
-        upstream_token = yield token_for_user(username)
+    async def authenticate(self, handler, data=None):
+        username = await identify_user(handler, data)
+        upstream_token = await token_for_user(username)
         return {
             'name': username,
             'auth_state': {
@@ -246,10 +245,9 @@ class MyAuthenticator(Authenticator):
             },
         }

-    @gen.coroutine
-    def pre_spawn_start(self, user, spawner):
+    async def pre_spawn_start(self, user, spawner):
         """Pass upstream_token to spawner via environment variable"""
-        auth_state = yield user.get_auth_state()
+        auth_state = await user.get_auth_state()
         if not auth_state:
             # auth_state not enabled
             return

View File

@@ -86,6 +86,7 @@ server {
     proxy_http_version 1.1;
     proxy_set_header Upgrade $http_upgrade;
     proxy_set_header Connection $connection_upgrade;
+    proxy_set_header X-Scheme $scheme;

     proxy_buffering off;
 }

View File

@@ -50,7 +50,7 @@ A Service may have the following properties:
 If a service is also to be managed by the Hub, it has a few extra options:

-- `command: (str/Popen list`) - Command for JupyterHub to spawn the service.
+- `command: (str/Popen list)` - Command for JupyterHub to spawn the service.
   - Only use this if the service should be a subprocess.
   - If command is not specified, the Service is assumed to be managed
     externally.
@@ -91,9 +91,9 @@ This example would be configured as follows in `jupyterhub_config.py`:
 ```python
 c.JupyterHub.services = [
     {
-        'name': 'cull-idle',
+        'name': 'idle-culler',
         'admin': True,
-        'command': [sys.executable, '/path/to/cull-idle.py', '--timeout']
+        'command': [sys.executable, '-m', 'jupyterhub_idle_culler', '--timeout=3600']
     }
 ]
 ```
@@ -123,15 +123,14 @@ For the previous 'cull idle' Service example, these environment variables
 would be passed to the Service when the Hub starts the 'cull idle' Service:

 ```bash
-JUPYTERHUB_SERVICE_NAME: 'cull-idle'
+JUPYTERHUB_SERVICE_NAME: 'idle-culler'
 JUPYTERHUB_API_TOKEN: API token assigned to the service
 JUPYTERHUB_API_URL: http://127.0.0.1:8080/hub/api
 JUPYTERHUB_BASE_URL: https://mydomain[:port]
-JUPYTERHUB_SERVICE_PREFIX: /services/cull-idle/
+JUPYTERHUB_SERVICE_PREFIX: /services/idle-culler/
 ```

-See the JupyterHub GitHub repo for additional information about the
-[`cull-idle` example](https://github.com/jupyterhub/jupyterhub/tree/master/examples/cull-idle).
+See the GitHub repo for additional information about the [jupyterhub_idle_culler][].

 ## Externally-Managed Services
@@ -340,7 +339,7 @@ and taking note of the following process:
 ```python
 r = requests.get(
-    '/'.join((["http://127.0.0.1:8081/hub/api",
+    '/'.join(["http://127.0.0.1:8081/hub/api",
         "authorizations/cookie/jupyterhub-services",
         quote(encrypted_cookie, safe=''),
     ]),
@@ -376,3 +375,4 @@ section on securing the notebook viewer.
 [HubAuth.user_for_token]: ../api/services.auth.html#jupyterhub.services.auth.HubAuth.user_for_token
 [HubAuthenticated]: ../api/services.auth.html#jupyterhub.services.auth.HubAuthenticated
 [nbviewer example]: https://github.com/jupyter/nbviewer#securing-the-notebook-viewer
+[jupyterhub_idle_culler]: https://github.com/jupyterhub/jupyterhub-idle-culler

View File

@@ -27,7 +27,7 @@ that environment variable is set or `/` if it is not.
 Admin users can set the announcement text with an API token:

     $ curl -X POST -H "Authorization: token <token>" \
-      -d "{'announcement':'JupyterHub will be upgraded on August 14!'}" \
+      -d '{"announcement":"JupyterHub will be upgraded on August 14!"}' \
       https://.../services/announcement

 Anyone can read the announcement:

View File

@@ -4,9 +4,9 @@
 version_info = (
     1,
-    2,
+    3,
     0,
-    "b1",  # release (b1, rc1, or "" for final or dev)
+    "",  # release (b1, rc1, or "" for final or dev)
     # "dev",  # dev or nothing for beta/rc/stable releases
 )

View File

@@ -215,7 +215,8 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
             # it's the user's own server
             oauth_client.identifier in own_oauth_client_ids
             # or it's in the global no-confirm list
-            or oauth_client.identifier in self.settings.get('oauth_no_confirm', set())
+            or oauth_client.identifier
+            in self.settings.get('oauth_no_confirm_list', set())
         ):
             return False
         # default: require confirmation
@@ -252,7 +253,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
         # Render oauth 'Authorize application...' page
         auth_state = await self.current_user.get_auth_state()
         self.write(
-            self.render_template(
+            await self.render_template(
                 "oauth.html",
                 auth_state=auth_state,
                 scopes=scopes,
@@ -274,9 +275,26 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
         uri, http_method, body, headers = self.extract_oauth_params()
         referer = self.request.headers.get('Referer', 'no referer')
         full_url = self.request.full_url()
-        if referer != full_url:
+        # trim protocol, which cannot be trusted with multiple layers of proxies anyway
+        # Referer is set by browser, but full_url can be modified by proxy layers to appear as http
+        # when it is actually https
+        referer_proto, _, stripped_referer = referer.partition("://")
+        referer_proto = referer_proto.lower()
+        req_proto, _, stripped_full_url = full_url.partition("://")
+        req_proto = req_proto.lower()
+        if referer_proto != req_proto:
+            self.log.warning("Protocol mismatch: %s != %s", referer, full_url)
+            if req_proto == "https":
+                # insecure origin to secure target is not allowed
+                raise web.HTTPError(
+                    403, "Not allowing authorization form submitted from insecure page"
+                )
+        if stripped_referer != stripped_full_url:
             # OAuth post must be made to the URL it came from
-            self.log.error("OAuth POST from %s != %s", referer, full_url)
+            self.log.error("Original OAuth POST from %s != %s", referer, full_url)
+            self.log.error(
+                "Stripped OAuth POST from %s != %s", stripped_referer, stripped_full_url
+            )
             raise web.HTTPError(
                 403, "Authorization form must be sent from authorization page"
             )

View File

@@ -9,6 +9,7 @@ from datetime import timezone
from async_generator import aclosing from async_generator import aclosing
from dateutil.parser import parse as parse_date from dateutil.parser import parse as parse_date
from sqlalchemy import func
from tornado import web from tornado import web
from tornado.iostream import StreamClosedError from tornado.iostream import StreamClosedError
@@ -35,15 +36,69 @@ class SelfAPIHandler(APIHandler):
user = self.get_current_user_oauth_token() user = self.get_current_user_oauth_token()
if user is None: if user is None:
raise web.HTTPError(403) raise web.HTTPError(403)
self.write(json.dumps(self.user_model(user))) if isinstance(user, orm.Service):
model = self.service_model(user)
else:
model = self.user_model(user)
self.write(json.dumps(model))
class UserListAPIHandler(APIHandler):
+    def _user_has_ready_spawner(self, orm_user):
+        """Return True if a user has *any* ready spawners
+
+        Used for filtering from active -> ready
+        """
+        user = self.users[orm_user]
+        return any(spawner.ready for spawner in user.spawners.values())
+
     @admin_only
     def get(self):
+        state_filter = self.get_argument("state", None)
+
+        # post_filter
+        post_filter = None
+
+        if state_filter in {"active", "ready"}:
+            # only get users with active servers
+            # an 'active' Spawner has a server record in the database
+            # which means Spawner.server != None
+            # it may still be in a pending start/stop state.
+            # join filters out users with no Spawners
+            query = (
+                self.db.query(orm.User)
+                # join filters out any Users with no Spawners
+                .join(orm.Spawner)
+                # this implicitly gets Users with *any* active server
+                .filter(orm.Spawner.server != None)
+            )
+            if state_filter == "ready":
+                # have to post-process query results because active vs ready
+                # can only be distinguished with in-memory Spawner properties
+                post_filter = self._user_has_ready_spawner
+        elif state_filter == "inactive":
+            # only get users with *no* active servers
+            # as opposed to users with *any inactive servers*
+            # this is the complement to the above query.
+            # how expensive is this with lots of servers?
+            query = (
+                self.db.query(orm.User)
+                .outerjoin(orm.Spawner)
+                .outerjoin(orm.Server)
+                .group_by(orm.User.id)
+                .having(func.count(orm.Server.id) == 0)
+            )
+        elif state_filter:
+            raise web.HTTPError(400, "Unrecognized state filter: %r" % state_filter)
+        else:
+            # no filter, return all users
+            query = self.db.query(orm.User)
+
         data = [
             self.user_model(u, include_servers=True, include_state=True)
-            for u in self.db.query(orm.User)
+            for u in query
+            if (post_filter is None or post_filter(u))
         ]
         self.write(json.dumps(data))
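
A usage sketch for the new ?state= filter (the URL and token below are placeholders, not values from this diff):

    import requests

    api_url = "http://127.0.0.1:8081/hub/api"
    headers = {"Authorization": "token ADMIN_TOKEN"}

    for state in ("active", "ready", "inactive"):
        r = requests.get(api_url + "/users", params={"state": state}, headers=headers)
        r.raise_for_status()
        print(state, [u["name"] for u in r.json()])
    # any other value, e.g. state=banana, yields 400 "Unrecognized state filter"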

View File

@@ -29,6 +29,14 @@ from urllib.parse import urlunparse
 if sys.version_info[:2] < (3, 3):
     raise ValueError("Python < 3.3 not supported: %s" % sys.version)
 
+# For compatibility with python versions 3.6 or earlier.
+# asyncio.Task.all_tasks() is fully moved to asyncio.all_tasks() starting with 3.9.
+# Also applies to current_task.
+try:
+    asyncio_all_tasks = asyncio.all_tasks
+    asyncio_current_task = asyncio.current_task
+except AttributeError as e:
+    asyncio_all_tasks = asyncio.Task.all_tasks
+    asyncio_current_task = asyncio.Task.current_task
+
 from dateutil.parser import parse as parse_date
 from jinja2 import Environment, FileSystemLoader, PrefixLoader, ChoiceLoader
@@ -77,6 +85,7 @@ from .user import UserDict
 from .oauth.provider import make_provider
 from ._data import DATA_FILES_PATH
 from .log import CoroutineLogFormatter, log_request
+from .pagination import Pagination
 from .proxy import Proxy, ConfigurableHTTPProxy
 from .traitlets import URLPrefix, Command, EntryPointType, Callable
 from .utils import (
@@ -279,7 +288,7 @@ class JupyterHub(Application):
 @default('classes')
 def _load_classes(self):
-    classes = [Spawner, Authenticator, CryptKeeper]
+    classes = [Spawner, Authenticator, CryptKeeper, Pagination]
     for name, trait in self.traits(config=True).items():
         # load entry point groups into configurable class list
         # so that they show up in config files, etc.
@@ -2119,7 +2128,7 @@ class JupyterHub(Application):
 self.log.debug(
     "Awaiting checks for %i possibly-running spawners", len(check_futures)
 )
-await gen.multi(check_futures)
+await asyncio.gather(*check_futures)
 db.commit()
 
 # only perform this query if we are going to log it
@@ -2186,7 +2195,7 @@ class JupyterHub(Application):
 def init_tornado_settings(self):
     """Set up the tornado settings dict."""
     base_url = self.hub.base_url
-    jinja_options = dict(autoescape=True)
+    jinja_options = dict(autoescape=True, enable_async=True)
     jinja_options.update(self.jinja_environment_options)
     base_path = self._template_paths_default()[0]
     if base_path not in self.template_paths:
@@ -2198,6 +2207,14 @@ class JupyterHub(Application):
         ]
     )
 jinja_env = Environment(loader=loader, **jinja_options)
+# We need a sync jinja environment too, for the times we *must* use sync
+# code - particularly in RequestHandler.write_error. Since *that*
+# is called from inside the asyncio event loop, we can't actually just
+# schedule it on the loop - without starting another thread with its
+# own loop, which seems not worth the trouble. Instead, we create another
+# environment, exactly like this one, but sync
+del jinja_options['enable_async']
+jinja_env_sync = Environment(loader=loader, **jinja_options)
 
 login_url = url_path_join(base_url, 'login')
 logout_url = self.authenticator.logout_url(base_url)
@@ -2244,6 +2261,7 @@ class JupyterHub(Application):
 template_path=self.template_paths,
 template_vars=self.template_vars,
 jinja2_env=jinja_env,
+jinja2_env_sync=jinja_env_sync,
 version_hash=version_hash,
 subdomain_host=self.subdomain_host,
 domain=self.domain,
@@ -2807,9 +2825,7 @@ class JupyterHub(Application):
 async def shutdown_cancel_tasks(self, sig):
     """Cancel all other tasks of the event loop and initiate cleanup"""
     self.log.critical("Received signal %s, initiating shutdown...", sig.name)
-    tasks = [
-        t for t in asyncio.Task.all_tasks() if t is not asyncio.Task.current_task()
-    ]
+    tasks = [t for t in asyncio_all_tasks() if t is not asyncio_current_task()]
     if tasks:
         self.log.debug("Cancelling pending tasks")
@@ -2822,7 +2838,7 @@ class JupyterHub(Application):
 except StopAsyncIteration as e:
     self.log.error("Caught StopAsyncIteration Exception", exc_info=True)
 
-tasks = [t for t in asyncio.Task.all_tasks()]
+tasks = [t for t in asyncio_all_tasks()]
 for t in tasks:
     self.log.debug("Task status: %s", t)
 await self.cleanup()
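
For reference, the aliases introduced above behave like this on any supported interpreter; a standalone sketch (not Hub code):

    import asyncio

    try:
        # Python 3.7+: module-level functions (the Task classmethods go away in 3.9)
        asyncio_all_tasks = asyncio.all_tasks
        asyncio_current_task = asyncio.current_task
    except AttributeError:
        asyncio_all_tasks = asyncio.Task.all_tasks
        asyncio_current_task = asyncio.Task.current_task

    async def main():
        others = [t for t in asyncio_all_tasks() if t is not asyncio_current_task()]
        print("%d other task(s) running" % len(others))

    asyncio.run(main())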

View File

@@ -101,7 +101,10 @@ class Authenticator(LoggingConfigurable):
""" """
).tag(config=True) ).tag(config=True)
whitelist = Set(help="Deprecated, use `Authenticator.allowed_users`", config=True,) whitelist = Set(
help="Deprecated, use `Authenticator.allowed_users`",
config=True,
)
 allowed_users = Set(
     help="""
@@ -715,7 +718,9 @@ for _old_name, _new_name, _version in [
("check_blacklist", "check_blocked_users", "1.2"), ("check_blacklist", "check_blocked_users", "1.2"),
]: ]:
setattr( setattr(
Authenticator, _old_name, _deprecated_method(_old_name, _new_name, _version), Authenticator,
_old_name,
_deprecated_method(_old_name, _new_name, _version),
) )
@@ -778,7 +783,9 @@ class LocalAuthenticator(Authenticator):
""" """
).tag(config=True) ).tag(config=True)
group_whitelist = Set(help="""DEPRECATED: use allowed_groups""",).tag(config=True) group_whitelist = Set(
help="""DEPRECATED: use allowed_groups""",
).tag(config=True)
allowed_groups = Set( allowed_groups = Set(
help=""" help="""

View File

@@ -40,6 +40,7 @@ from ..metrics import SERVER_STOP_DURATION_SECONDS
 from ..metrics import ServerPollStatus
 from ..metrics import ServerSpawnStatus
 from ..metrics import ServerStopStatus
+from ..metrics import TOTAL_USERS
 from ..objects import Server
 from ..spawner import LocalProcessSpawner
 from ..user import User
@@ -453,6 +454,7 @@ class BaseHandler(RequestHandler):
 # not found, create and register user
 u = orm.User(name=username)
 self.db.add(u)
+TOTAL_USERS.inc()
 self.db.commit()
 user = self._user_from_orm(u)
 return user
@@ -489,7 +491,7 @@ class BaseHandler(RequestHandler):
 self.clear_cookie(
     'jupyterhub-services',
     path=url_path_join(self.base_url, 'services'),
-    **kwargs
+    **kwargs,
 )
 # Reset _jupyterhub_user
 self._jupyterhub_user = None
@@ -634,6 +636,12 @@ class BaseHandler(RequestHandler):
     next_url,
 )
 
+# this is where we know if next_url is coming from ?next= param or we are using a default url
+if next_url:
+    next_url_from_param = True
+else:
+    next_url_from_param = False
+
 if not next_url:
     # custom default URL, usually passed because user landed on that page but was not logged in
     if default:
@@ -659,6 +667,9 @@ class BaseHandler(RequestHandler):
 else:
     next_url = url_path_join(self.hub.base_url, 'home')
 
-next_url = self.append_query_parameters(next_url, exclude=['next'])
+if not next_url_from_param:
+    # when a request is made with ?next=..., assume all the params have already been encoded
+    # otherwise, preserve params from the current request across the redirect
+    next_url = self.append_query_parameters(next_url, exclude=['next'])
 return next_url
@@ -1156,16 +1167,36 @@ class BaseHandler(RequestHandler):
"<a href='{home}'>home page</a>.".format(home=home) "<a href='{home}'>home page</a>.".format(home=home)
) )
def get_template(self, name): def get_template(self, name, sync=False):
"""Return the jinja template object for a given name""" """
return self.settings['jinja2_env'].get_template(name) Return the jinja template object for a given name
def render_template(self, name, **ns): If sync is True, we return a Template that is compiled without async support.
Only those can be used in synchronous code.
If sync is False, we return a Template that is compiled with async support
"""
if sync:
key = 'jinja2_env_sync'
else:
key = 'jinja2_env'
return self.settings[key].get_template(name)
def render_template(self, name, sync=False, **ns):
"""
Render jinja2 template
If sync is set to True, we return an awaitable
If sync is set to False, we render the template & return a string
"""
template_ns = {} template_ns = {}
template_ns.update(self.template_namespace) template_ns.update(self.template_namespace)
template_ns.update(ns) template_ns.update(ns)
template = self.get_template(name) template = self.get_template(name, sync)
if sync:
return template.render(**template_ns) return template.render(**template_ns)
else:
return template.render_async(**template_ns)
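
The async/sync split can be reproduced outside the Hub; a minimal sketch (template name and contents invented):

    import asyncio
    from jinja2 import DictLoader, Environment

    loader = DictLoader({"hello.html": "Hello {{ name }}!"})
    async_env = Environment(loader=loader, enable_async=True)
    sync_env = Environment(loader=loader)  # identical options, minus enable_async

    # the async environment must be awaited on the event loop...
    print(asyncio.run(async_env.get_template("hello.html").render_async(name="a")))
    # ...while the sync one can render where awaiting is impossible, e.g. write_error
    print(sync_env.get_template("hello.html").render(name="b"))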
 @property
 def template_namespace(self):
@@ -1240,17 +1271,19 @@ class BaseHandler(RequestHandler):
 # Content-Length must be recalculated.
 self.clear_header('Content-Length')
 
-# render the template
+# render_template is async, but write_error can't be!
+# so we run it sync here, instead of making a sync version of render_template
 try:
-    html = self.render_template('%s.html' % status_code, **ns)
+    html = self.render_template('%s.html' % status_code, sync=True, **ns)
 except TemplateNotFound:
     self.log.debug("No template for %d", status_code)
     try:
-        html = self.render_template('error.html', **ns)
+        html = self.render_template('error.html', sync=True, **ns)
     except:
         # In this case, any side effect must be avoided.
         ns['no_spawner_check'] = True
-        html = self.render_template('error.html', **ns)
+        html = self.render_template('error.html', sync=True, **ns)
 
 self.write(html)
@@ -1454,10 +1487,14 @@ class UserUrlHandler(BaseHandler):
 # if request is expecting JSON, assume it's an API request and fail with 503
 # because it won't like the redirect to the pending page
-if get_accepted_mimetype(
-    self.request.headers.get('Accept', ''),
-    choices=['application/json', 'text/html'],
-) == 'application/json' or 'api' in user_path.split('/'):
+if (
+    get_accepted_mimetype(
+        self.request.headers.get('Accept', ''),
+        choices=['application/json', 'text/html'],
+    )
+    == 'application/json'
+    or 'api' in user_path.split('/')
+):
     self._fail_api_request(user_name, server_name)
     return
@@ -1483,7 +1520,7 @@ class UserUrlHandler(BaseHandler):
 self.set_status(503)
 auth_state = await user.get_auth_state()
-html = self.render_template(
+html = await self.render_template(
     "not_running.html",
     user=user,
     server_name=server_name,
@@ -1538,7 +1575,7 @@ class UserUrlHandler(BaseHandler):
 if redirects:
     self.log.warning("Redirect loop detected on %s", self.request.uri)
     # add capped exponential backoff where cap is 10s
-    await gen.sleep(min(1 * (2 ** redirects), 10))
+    await asyncio.sleep(min(1 * (2 ** redirects), 10))
     # rewrite target url with new `redirects` query value
     url_parts = urlparse(target)
     query_parts = parse_qs(url_parts.query)

View File

@@ -72,7 +72,7 @@ class LogoutHandler(BaseHandler):
 Override this function to set a custom logout page.
 """
 if self.authenticator.auto_login:
-    html = self.render_template('logout.html')
+    html = await self.render_template('logout.html')
     self.finish(html)
 else:
     self.redirect(self.settings['login_url'], permanent=False)
@@ -132,7 +132,7 @@ class LoginHandler(BaseHandler):
     self.redirect(auto_login_url)
     return
 username = self.get_argument('username', default='')
-self.finish(self._render(username=username))
+self.finish(await self._render(username=username))
 
 async def post(self):
     # parse the arguments dict
@@ -149,7 +149,7 @@ class LoginHandler(BaseHandler):
     self._jupyterhub_user = user
     self.redirect(self.get_next_url(user))
 else:
-    html = self._render(
+    html = await self._render(
         login_error='Invalid username or password', username=data['username']
     )
     self.finish(html)

View File

@@ -40,11 +40,15 @@ class RootHandler(BaseHandler):
 def get(self):
     user = self.current_user
     if self.default_url:
-        url = self.default_url
+        # As set in jupyterhub_config.py
+        if callable(self.default_url):
+            url = self.default_url(self)
+        else:
+            url = self.default_url
     elif user:
         url = self.get_next_url(user)
     else:
-        url = self.settings['login_url']
+        url = url_concat(self.settings["login_url"], dict(next=self.request.uri))
     self.redirect(url)
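
Since default_url may now be callable, it can route users dynamically; a hypothetical jupyterhub_config.py sketch (the routing rule is invented):

    def default_url(handler):
        # send admins to the admin panel, everyone else home
        user = handler.current_user
        if user and user.admin:
            return "/hub/admin"
        return "/hub/home"

    c.JupyterHub.default_url = default_url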
@@ -67,7 +71,7 @@ class HomeHandler(BaseHandler):
 url = url_path_join(self.hub.base_url, 'spawn', user.escaped_name)
 
 auth_state = await user.get_auth_state()
-html = self.render_template(
+html = await self.render_template(
     'home.html',
     auth_state=auth_state,
     user=user,
@@ -94,7 +98,7 @@ class SpawnHandler(BaseHandler):
 async def _render_form(self, for_user, spawner_options_form, message=''):
     auth_state = await for_user.get_auth_state()
-    return self.render_template(
+    return await self.render_template(
         'spawn.html',
         for_user=for_user,
         auth_state=auth_state,
@@ -378,7 +382,7 @@ class SpawnPendingHandler(BaseHandler):
     self.hub.base_url, "spawn", user.escaped_name, server_name
 )
 self.set_status(500)
-html = self.render_template(
+html = await self.render_template(
     "not_running.html",
     user=user,
     auth_state=auth_state,
@@ -402,7 +406,7 @@ class SpawnPendingHandler(BaseHandler):
page = "stop_pending.html" page = "stop_pending.html"
else: else:
page = "spawn_pending.html" page = "spawn_pending.html"
html = self.render_template( html = await self.render_template(
page, page,
user=user, user=user,
spawner=spawner, spawner=spawner,
@@ -429,7 +433,7 @@ class SpawnPendingHandler(BaseHandler):
 spawn_url = url_path_join(
     self.hub.base_url, "spawn", user.escaped_name, server_name
 )
-html = self.render_template(
+html = await self.render_template(
     "not_running.html",
     user=user,
     auth_state=auth_state,
@@ -453,7 +457,8 @@ class AdminHandler(BaseHandler):
 @web.authenticated
 @admin_only
 async def get(self):
-    page, per_page, offset = Pagination.get_page_args(self)
+    pagination = Pagination(url=self.request.uri, config=self.config)
+    page, per_page, offset = pagination.get_page_args(self)
 
     available = {'name', 'admin', 'running', 'last_activity'}
     default_sort = ['admin', 'name']
@@ -509,13 +514,10 @@ class AdminHandler(BaseHandler):
 for u in users:
     running.extend(s for s in u.spawners.values() if s.active)
 
-total = self.db.query(orm.User.id).count()
-pagination = Pagination(
-    url=self.request.uri, total=total, page=page, per_page=per_page,
-)
+pagination.total = self.db.query(orm.User.id).count()
 
 auth_state = await self.current_user.get_auth_state()
-html = self.render_template(
+html = await self.render_template(
     'admin.html',
     current_user=self.current_user,
     auth_state=auth_state,
@@ -605,7 +607,7 @@ class TokenPageHandler(BaseHandler):
 oauth_clients = sorted(oauth_clients, key=sort_key, reverse=True)
 
 auth_state = await self.current_user.get_auth_state()
-html = self.render_template(
+html = await self.render_template(
     'token.html',
     api_tokens=api_tokens,
     oauth_clients=oauth_clients,
@@ -617,7 +619,7 @@ class TokenPageHandler(BaseHandler):
 class ProxyErrorHandler(BaseHandler):
     """Handler for rendering proxy error pages"""
 
-    def get(self, status_code_s):
+    async def get(self, status_code_s):
         status_code = int(status_code_s)
         status_message = responses.get(status_code, 'Unknown HTTP Error')
         # build template namespace
@@ -641,10 +643,10 @@ class ProxyErrorHandler(BaseHandler):
 self.set_header('Content-Type', 'text/html')
 # render the template
 try:
-    html = self.render_template('%s.html' % status_code, **ns)
+    html = await self.render_template('%s.html' % status_code, **ns)
 except TemplateNotFound:
     self.log.debug("No template for %d", status_code)
-    html = self.render_template('error.html', **ns)
+    html = await self.render_template('error.html', **ns)
 
 self.write(html)

View File

@@ -3,9 +3,9 @@ Prometheus metrics exported by JupyterHub
 Read https://prometheus.io/docs/practices/naming/ for naming
 conventions for metrics & labels. We generally prefer naming them
-`<noun>_<verb>_<type_suffix>`. So a histogram that's tracking
+`jupyterhub_<noun>_<verb>_<type_suffix>`. So a histogram that's tracking
 the duration (in seconds) of servers spawning would be called
-SERVER_SPAWN_DURATION_SECONDS.
+jupyterhub_server_spawn_duration_seconds.
 
 We also create an Enum for each 'status' type label in every metric
 we collect. This is to make sure that the metrics exist regardless
@@ -14,6 +14,10 @@ create them, the metric spawn_duration_seconds{status="failure"}
 will not actually exist until the first failure. This makes dashboarding
 and alerting difficult, so we explicitly list statuses and create
 them manually here.
+
+.. versionchanged:: 1.3
+
+    added ``jupyterhub_`` prefix to metric names.
 """
from enum import Enum
@@ -21,13 +25,13 @@ from prometheus_client import Gauge
 from prometheus_client import Histogram
 
 REQUEST_DURATION_SECONDS = Histogram(
-    'request_duration_seconds',
+    'jupyterhub_request_duration_seconds',
     'request duration for all HTTP requests',
     ['method', 'handler', 'code'],
 )
 
 SERVER_SPAWN_DURATION_SECONDS = Histogram(
-    'server_spawn_duration_seconds',
+    'jupyterhub_server_spawn_duration_seconds',
     'time taken for server spawning operation',
     ['status'],
     # Use custom bucket sizes, since the default bucket ranges
@@ -36,25 +40,27 @@ SERVER_SPAWN_DURATION_SECONDS = Histogram(
 )
 
 RUNNING_SERVERS = Gauge(
-    'running_servers', 'the number of user servers currently running'
+    'jupyterhub_running_servers', 'the number of user servers currently running'
 )
 
-TOTAL_USERS = Gauge('total_users', 'total number of users')
+TOTAL_USERS = Gauge('jupyterhub_total_users', 'total number of users')
 
 CHECK_ROUTES_DURATION_SECONDS = Histogram(
-    'check_routes_duration_seconds', 'Time taken to validate all routes in proxy'
+    'jupyterhub_check_routes_duration_seconds',
+    'Time taken to validate all routes in proxy',
 )
 
 HUB_STARTUP_DURATION_SECONDS = Histogram(
-    'hub_startup_duration_seconds', 'Time taken for Hub to start'
+    'jupyterhub_hub_startup_duration_seconds', 'Time taken for Hub to start'
 )
 
 INIT_SPAWNERS_DURATION_SECONDS = Histogram(
-    'init_spawners_duration_seconds', 'Time taken for spawners to initialize'
+    'jupyterhub_init_spawners_duration_seconds', 'Time taken for spawners to initialize'
 )
 
 PROXY_POLL_DURATION_SECONDS = Histogram(
-    'proxy_poll_duration_seconds', 'duration for polling all routes from proxy'
+    'jupyterhub_proxy_poll_duration_seconds',
+    'duration for polling all routes from proxy',
 )
@@ -79,7 +85,9 @@ for s in ServerSpawnStatus:
 PROXY_ADD_DURATION_SECONDS = Histogram(
-    'proxy_add_duration_seconds', 'duration for adding user routes to proxy', ['status']
+    'jupyterhub_proxy_add_duration_seconds',
+    'duration for adding user routes to proxy',
+    ['status'],
 )
@@ -100,7 +108,7 @@ for s in ProxyAddStatus:
 SERVER_POLL_DURATION_SECONDS = Histogram(
-    'server_poll_duration_seconds',
+    'jupyterhub_server_poll_duration_seconds',
     'time taken to poll if server is running',
     ['status'],
 )
@@ -127,7 +135,9 @@ for s in ServerPollStatus:
 SERVER_STOP_DURATION_SECONDS = Histogram(
-    'server_stop_seconds', 'time taken for server stopping operation', ['status']
+    'jupyterhub_server_stop_seconds',
+    'time taken for server stopping operation',
+    ['status'],
 )
@@ -148,7 +158,7 @@ for s in ServerStopStatus:
 PROXY_DELETE_DURATION_SECONDS = Histogram(
-    'proxy_delete_duration_seconds',
+    'jupyterhub_proxy_delete_duration_seconds',
     'duration for deleting user routes from proxy',
     ['status'],
 )
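
All hub metrics now share one namespace, so dashboards can select them with a single prefix match (existing queries must be updated to the new names). A standalone prometheus_client sketch of the convention (the demo metric is invented, not one JupyterHub exports):

    from prometheus_client import Histogram, generate_latest

    DEMO_DURATION_SECONDS = Histogram(
        'jupyterhub_demo_duration_seconds',  # jupyterhub_<noun>_<verb>_<type_suffix>
        'demo histogram following the prefixed naming convention',
        ['status'],
    )
    DEMO_DURATION_SECONDS.labels(status='success').observe(1.5)
    print(generate_latest().decode())  # exposition text; every name starts with jupyterhub_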

View File

@@ -1,69 +1,93 @@
"""Basic class to manage pagination utils.""" """Basic class to manage pagination utils."""
# Copyright (c) Jupyter Development Team. # Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License. # Distributed under the terms of the Modified BSD License.
from traitlets import default
from traitlets import Integer
from traitlets import observe
from traitlets import Unicode
from traitlets import validate
from traitlets.config import Configurable
class Pagination: class Pagination(Configurable):
_page_name = 'page' # configurable options
_per_page_name = 'per_page' default_per_page = Integer(
_default_page = 1 100,
_default_per_page = 100 config=True,
_max_per_page = 250 help="Default number of entries per page for paginated results.",
)
def __init__(self, *args, **kwargs): max_per_page = Integer(
"""Potential parameters. 250,
**url**: URL in request config=True,
**page**: current page in use help="Maximum number of entries per page for paginated results.",
**per_page**: number of records to display in the page. By default 100 )
**total**: total records considered while paginating
"""
self.page = kwargs.get(self._page_name, 1)
if self.per_page > self._max_per_page: # state variables
self.per_page = self._max_per_page url = Unicode("")
page = Integer(1)
per_page = Integer(1, min=1)
self.total = int(kwargs.get('total', 0)) @default("per_page")
self.url = kwargs.get('url') or self.get_url() def _default_per_page(self):
self.init_values() return self.default_per_page
def init_values(self): @validate("per_page")
self._cached = {} def _limit_per_page(self, proposal):
self.skip = (self.page - 1) * self.per_page if self.max_per_page and proposal.value > self.max_per_page:
pages = divmod(self.total, self.per_page) return self.max_per_page
self.total_pages = pages[0] + 1 if pages[1] else pages[0] if proposal.value <= 1:
return 1
return proposal.value
self.has_prev = self.page > 1 @observe("max_per_page")
self.has_next = self.page < self.total_pages def _apply_max(self, change):
if change.new:
self.per_page = min(change.new, self.per_page)
total = Integer(0)
total_pages = Integer(0)
@default("total_pages")
def _calculate_total_pages(self):
total_pages = self.total // self.per_page
if self.total % self.per_page:
# there's a remainder, add 1
total_pages += 1
return total_pages
@observe("per_page", "total")
def _update_total_pages(self, change):
"""Update total_pages when per_page or total is changed"""
self.total_pages = self._calculate_total_pages()
separator = Unicode("...")
@classmethod
def get_page_args(self, handler): def get_page_args(self, handler):
""" """
This method gets the arguments used in the webpage to configurate the pagination This method gets the arguments used in the webpage to configurate the pagination
In case of no arguments, it uses the default values from this class In case of no arguments, it uses the default values from this class
It returns: Returns:
- self.page: The page requested for paginating or the default value (1) - page: The page requested for paginating or the default value (1)
- self.per_page: The number of items to return in this page. By default 100 and no more than 250 - per_page: The number of items to return in this page. No more than max_per_page
- self.per_page * (self.page - 1): The offset to consider when managing pagination via the ORM - offset: The offset to consider when managing pagination via the ORM
""" """
self.page = handler.get_argument(self._page_name, self._default_page) page = handler.get_argument("page", 1)
self.per_page = handler.get_argument( per_page = handler.get_argument("per_page", self.default_per_page)
self._per_page_name, self._default_per_page
)
try: try:
self.per_page = int(self.per_page) self.per_page = int(per_page)
if self.per_page > self._max_per_page: except Exception:
self.per_page = self._max_per_page self.per_page = self.default_per_page
except:
self.per_page = self._default_per_page
try: try:
self.page = int(self.page) self.page = int(page)
if self.page < 1: if self.page < 1:
self.page = self._default_page self.page = 1
except: except Exception:
self.page = self._default_page self.page = 1
return self.page, self.per_page, self.per_page * (self.page - 1) return self.page, self.per_page, self.per_page * (self.page - 1)
@@ -91,38 +115,44 @@ class Pagination:
     (in case the current page + 5 does not overflow the total length of pages) and the first one for reference.
     """
 
-    self.separator_character = '...'
-    default_pages_to_render = 7
-    after_page = 5
-    before_end = 2
+    before_page = 2
+    after_page = 2
+    window_size = before_page + after_page + 1
 
-    # Add 1 to self.total_pages since our default page is 1 and not 0
-    total_pages = self.total_pages + 1
+    # our starting page is 1, not 0
+    last_page = self.total_pages
 
     pages = []
 
-    if total_pages > default_pages_to_render:
-        if self.page > 1:
-            pages.extend([1, '...'])
+    # will default window + start, end fit without truncation?
+    if self.total_pages > window_size + 2:
+        if self.page - before_page > 1:
+            # before_page will not reach page 1
+            pages.append(1)
+        if self.page - before_page > 2:
+            # before_page will not reach page 2, need separator
+            pages.append(self.separator)
 
-        if total_pages < self.page + after_page:
-            pages.extend(list(range(self.page, total_pages)))
+        pages.extend(range(max(1, self.page - before_page), self.page))
+        # we now have up to but not including self.page
+
+        if self.page + after_page + 1 >= last_page:
+            # after_page gets us to the end
+            pages.extend(range(self.page, last_page + 1))
         else:
-            if total_pages >= self.page + after_page + before_end:
-                pages.extend(list(range(self.page, self.page + after_page)))
-                pages.append('...')
-                pages.extend(list(range(total_pages - before_end, total_pages)))
-            else:
-                pages.extend(list(range(self.page, self.page + after_page)))
-                if self.page + after_page < total_pages:
-                    # show only last page when the after_page window left space to show it
-                    pages.append('...')
-                    pages.extend(list(range(total_pages - 1, total_pages)))
+            # add full after_page entries
+            pages.extend(range(self.page, self.page + after_page + 1))
+            # add separator *if* this doesn't get to last page - 1
+            if self.page + after_page < last_page - 1:
+                pages.append(self.separator)
+            pages.append(last_page)
 
         return pages
 
     else:
-        return list(range(1, total_pages))
+        # everything will fit, nothing to think about
+        # always return at least one page
+        return list(range(1, last_page + 1)) or [1]
 @property
 def links(self):
@@ -155,9 +185,11 @@ class Pagination:
             page=page
         )
     )
-elif page == self.separator_character:
+elif page == self.separator:
     links.append(
-        '<li class="disabled"><span> <span aria-hidden="true">...</span></span></li>'
+        '<li class="disabled"><span> <span aria-hidden="true">{separator}</span></span></li>'.format(
+            separator=self.separator
+        )
     )
 else:
     links.append(
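
Because Pagination is now a Configurable, its limits come from config rather than class constants; a jupyterhub_config.py sketch (the values are arbitrary):

    c.Pagination.default_per_page = 200
    c.Pagination.max_per_page = 1000  # raising past the old default now works

For a feel of the windowing logic: with before_page = after_page = 2, page 10 of 20 renders as [1, '...', 8, 9, 10, 11, 12, '...', 20].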

View File

@@ -25,7 +25,6 @@ from functools import wraps
 from subprocess import Popen
 from urllib.parse import quote
 
-from tornado import gen
 from tornado.httpclient import AsyncHTTPClient
 from tornado.httpclient import HTTPError
 from tornado.httpclient import HTTPRequest
@@ -43,6 +42,7 @@ from . import utils
 from .metrics import CHECK_ROUTES_DURATION_SECONDS
 from .metrics import PROXY_POLL_DURATION_SECONDS
 from .objects import Server
+from .utils import exponential_backoff
 from .utils import make_ssl_context
 from .utils import url_path_join
 from jupyterhub.traitlets import Command
@@ -291,7 +291,7 @@ class Proxy(LoggingConfigurable):
 if service.server:
     futures.append(self.add_service(service))
 # wait after submitting them all
-await gen.multi(futures)
+await asyncio.gather(*futures)
 
 async def add_all_users(self, user_dict):
     """Update the proxy table from the database.
@@ -304,7 +304,7 @@ class Proxy(LoggingConfigurable):
 if spawner.ready:
     futures.append(self.add_user(user, name))
 # wait after submitting them all
-await gen.multi(futures)
+await asyncio.gather(*futures)
 
 @_one_at_a_time
 async def check_routes(self, user_dict, service_dict, routes=None):
@@ -390,7 +390,7 @@ class Proxy(LoggingConfigurable):
self.log.warning("Deleting stale route %s", routespec) self.log.warning("Deleting stale route %s", routespec)
futures.append(self.delete_route(routespec)) futures.append(self.delete_route(routespec))
await gen.multi(futures) await asyncio.gather(*futures)
stop = time.perf_counter() # timer stops here when user is deleted stop = time.perf_counter() # timer stops here when user is deleted
CHECK_ROUTES_DURATION_SECONDS.observe(stop - start) # histogram metric CHECK_ROUTES_DURATION_SECONDS.observe(stop - start) # histogram metric
@@ -496,6 +496,19 @@ class ConfigurableHTTPProxy(Proxy):
 if not psutil.pid_exists(pid):
     raise ProcessLookupError
 
+try:
+    process = psutil.Process(pid)
+    if self.command and self.command[0]:
+        process_cmd = process.cmdline()
+        if process_cmd and not any(
+            self.command[0] in clause for clause in process_cmd
+        ):
+            raise ProcessLookupError
+except (psutil.AccessDenied, psutil.NoSuchProcess):
+    # If there is a process at the proxy's PID but we don't have permissions to see it,
+    # then it is unlikely to actually be the proxy.
+    raise ProcessLookupError
 else:
     os.kill(pid, 0)
@@ -573,6 +586,34 @@ class ConfigurableHTTPProxy(Proxy):
self.log.debug("PID file %s already removed", self.pid_file) self.log.debug("PID file %s already removed", self.pid_file)
pass pass
def _get_ssl_options(self):
"""List of cmd proxy options to use internal SSL"""
cmd = []
proxy_api = 'proxy-api'
proxy_client = 'proxy-client'
api_key = self.app.internal_proxy_certs[proxy_api][
'keyfile'
] # Check content in next test and just patch manulaly or in the config of the file
api_cert = self.app.internal_proxy_certs[proxy_api]['certfile']
api_ca = self.app.internal_trust_bundles[proxy_api + '-ca']
client_key = self.app.internal_proxy_certs[proxy_client]['keyfile']
client_cert = self.app.internal_proxy_certs[proxy_client]['certfile']
client_ca = self.app.internal_trust_bundles[proxy_client + '-ca']
cmd.extend(['--api-ssl-key', api_key])
cmd.extend(['--api-ssl-cert', api_cert])
cmd.extend(['--api-ssl-ca', api_ca])
cmd.extend(['--api-ssl-request-cert'])
cmd.extend(['--api-ssl-reject-unauthorized'])
cmd.extend(['--client-ssl-key', client_key])
cmd.extend(['--client-ssl-cert', client_cert])
cmd.extend(['--client-ssl-ca', client_ca])
cmd.extend(['--client-ssl-request-cert'])
cmd.extend(['--client-ssl-reject-unauthorized'])
return cmd
 async def start(self):
     """Start the proxy process"""
     # check if there is a previous instance still around
@@ -604,27 +645,7 @@ class ConfigurableHTTPProxy(Proxy):
 if self.ssl_cert:
     cmd.extend(['--ssl-cert', self.ssl_cert])
 if self.app.internal_ssl:
-    proxy_api = 'proxy-api'
-    proxy_client = 'proxy-client'
-    api_key = self.app.internal_proxy_certs[proxy_api]['keyfile']
-    api_cert = self.app.internal_proxy_certs[proxy_api]['certfile']
-    api_ca = self.app.internal_trust_bundles[proxy_api + '-ca']
-    client_key = self.app.internal_proxy_certs[proxy_client]['keyfile']
-    client_cert = self.app.internal_proxy_certs[proxy_client]['certfile']
-    client_ca = self.app.internal_trust_bundles[proxy_client + '-ca']
-    cmd.extend(['--api-ssl-key', api_key])
-    cmd.extend(['--api-ssl-cert', api_cert])
-    cmd.extend(['--api-ssl-ca', api_ca])
-    cmd.extend(['--api-ssl-request-cert'])
-    cmd.extend(['--api-ssl-reject-unauthorized'])
-    cmd.extend(['--client-ssl-key', client_key])
-    cmd.extend(['--client-ssl-cert', client_cert])
-    cmd.extend(['--client-ssl-ca', client_ca])
-    cmd.extend(['--client-ssl-request-cert'])
-    cmd.extend(['--client-ssl-reject-unauthorized'])
+    cmd.extend(self._get_ssl_options())
 if self.app.statsd_host:
     cmd.extend(
         [
@@ -691,8 +712,17 @@ class ConfigurableHTTPProxy(Proxy):
 parent = psutil.Process(pid)
 children = parent.children(recursive=True)
 for child in children:
-    child.kill()
-psutil.wait_procs(children, timeout=5)
+    child.terminate()
+gone, alive = psutil.wait_procs(children, timeout=5)
+for p in alive:
+    p.kill()
+# Clear the shell, too, if it still exists.
+try:
+    parent.terminate()
+    parent.wait(timeout=5)
+    parent.kill()
+except psutil.NoSuchProcess:
+    pass
 
 def _terminate(self):
     """Terminate our process"""
@@ -768,9 +798,35 @@ class ConfigurableHTTPProxy(Proxy):
     method=method,
     headers={'Authorization': 'token {}'.format(self.auth_token)},
     body=body,
+    connect_timeout=3,  # default: 20s
+    request_timeout=10,  # default: 20s
 )
 
-async with self.semaphore:
-    result = await client.fetch(req)
+async def _wait_for_api_request():
+    try:
+        async with self.semaphore:
+            return await client.fetch(req)
+    except HTTPError as e:
+        # Retry on potentially transient errors in CHP, typically
+        # numbered 500 and up. Note that CHP isn't able to emit 429
+        # errors.
+        if e.code >= 500:
+            self.log.warning(
+                "api_request to the proxy failed with status code {}, retrying...".format(
+                    e.code
+                )
+            )
+            return False  # a falsy return value makes exponential_backoff retry
+        else:
+            self.log.error("api_request to proxy failed: {0}".format(e))
+            # An unhandled error here will help the hub invoke cleanup logic
+            raise
+
+result = await exponential_backoff(
+    _wait_for_api_request,
+    'Repeated api_request to proxy path "{}" failed.'.format(path),
+    timeout=30,
+)
 
 return result
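
The retry loop relies on jupyterhub.utils.exponential_backoff, which re-awaits its callable (with growing, jittered waits) while it returns a falsy value and raises TimeoutError once the timeout elapses. A self-contained usage sketch (the flaky() helper is invented):

    import asyncio
    from jupyterhub.utils import exponential_backoff

    attempts = 0

    async def flaky():
        global attempts
        attempts += 1
        return "ok" if attempts >= 3 else False  # falsy -> retry

    result = asyncio.get_event_loop().run_until_complete(
        exponential_backoff(flaky, "flaky() never succeeded", timeout=30)
    )
    print(result, "after", attempts, "attempts")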
async def add_route(self, routespec, target, data):

View File

@@ -23,7 +23,6 @@ from urllib.parse import quote
 from urllib.parse import urlencode
 
 import requests
-from tornado.gen import coroutine
 from tornado.httputil import url_concat
 from tornado.log import app_log
 from tornado.web import HTTPError
@@ -950,8 +949,7 @@ class HubOAuthCallbackHandler(HubOAuthenticated, RequestHandler):
 .. versionadded: 0.8
 """
 
-@coroutine
-def get(self):
+async def get(self):
     error = self.get_argument("error", False)
     if error:
         msg = self.get_argument("error_description", error)

View File

@@ -20,7 +20,6 @@ from urllib.parse import urlparse
 from jinja2 import ChoiceLoader
 from jinja2 import FunctionLoader
 
-from tornado import gen
 from tornado import ioloop
 from tornado.httpclient import AsyncHTTPClient
 from tornado.httpclient import HTTPRequest
@@ -434,7 +433,7 @@ class SingleUserNotebookAppMixin(Configurable):
         i,
         RETRIES,
     )
-    await gen.sleep(min(2 ** i, 16))
+    await asyncio.sleep(min(2 ** i, 16))
 else:
     break
 else:

View File

@@ -16,8 +16,7 @@ from tempfile import mkdtemp
 if os.name == 'nt':
     import psutil
 
-from async_generator import async_generator
-from async_generator import yield_
+from async_generator import aclosing
 from sqlalchemy import inspect
 from tornado.ioloop import PeriodicCallback
 from traitlets import Any
@@ -354,8 +353,9 @@ class Spawner(LoggingConfigurable):
     return options_form
 
-def options_from_form(self, form_data):
-    """Interpret HTTP form data
+options_from_form = Callable(
+    help="""
+    Interpret HTTP form data
 
     Form data will always arrive as a dict of lists of strings.
     Override this function to understand single-values, numbers, etc.
@@ -379,7 +379,14 @@ class Spawner(LoggingConfigurable):
     (with additional support for bytes in case of uploaded file data),
     and any non-bytes non-jsonable values will be replaced with None
     if the user_options are re-used.
-    """
+    """,
+).tag(config=True)
+
+@default("options_from_form")
+def _options_from_form(self):
+    return self._default_options_from_form
+
+def _default_options_from_form(self, form_data):
     return form_data
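
With options_from_form configurable, a deployment can attach a parser without subclassing the Spawner; a hypothetical jupyterhub_config.py sketch (the form field names are invented):

    def parse_options(form_data):
        # form values always arrive as lists of strings
        options = {}
        options['cpu_limit'] = float(form_data.get('cpu_limit', ['1'])[0])
        options['image'] = form_data.get('image', ['default'])[0]
        return options

    c.Spawner.options_from_form = parse_options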
def options_from_query(self, query_data): def options_from_query(self, query_data):
@@ -1024,7 +1031,6 @@ class Spawner(LoggingConfigurable):
 def _progress_url(self):
     return self.user.progress_url(self.name)
 
-@async_generator
 async def _generate_progress(self):
     """Private wrapper of progress generator
@@ -1036,21 +1042,17 @@ class Spawner(LoggingConfigurable):
     )
     return
 
-await yield_({"progress": 0, "message": "Server requested"})
-from async_generator import aclosing
+yield {"progress": 0, "message": "Server requested"}
 
 async with aclosing(self.progress()) as progress:
     async for event in progress:
-        await yield_(event)
+        yield event
 
-@async_generator
 async def progress(self):
     """Async generator for progress events
 
     Must be an async generator
-    For Python 3.5-compatibility, use the async_generator package
 
     Should yield messages of the form:
 
     ::
@@ -1067,7 +1069,7 @@ class Spawner(LoggingConfigurable):
 .. versionadded:: 0.9
 """
-await yield_({"progress": 50, "message": "Spawning server..."})
+yield {"progress": 50, "message": "Spawning server..."}
 async def start(self):
     """Start the single-user server
@@ -1078,9 +1080,7 @@ class Spawner(LoggingConfigurable):
     .. versionchanged:: 0.7
         Return ip, port instead of setting on self.user.server directly.
     """
-    raise NotImplementedError(
-        "Override in subclass. Must be a Tornado gen.coroutine."
-    )
+    raise NotImplementedError("Override in subclass. Must be a coroutine.")
 
 async def stop(self, now=False):
     """Stop the single-user server
@@ -1093,9 +1093,7 @@ class Spawner(LoggingConfigurable):
     Must be a coroutine.
     """
-    raise NotImplementedError(
-        "Override in subclass. Must be a Tornado gen.coroutine."
-    )
+    raise NotImplementedError("Override in subclass. Must be a coroutine.")
 
 async def poll(self):
     """Check if the single-user process is running
@@ -1121,9 +1119,7 @@ class Spawner(LoggingConfigurable):
     process has not yet completed.
     """
-    raise NotImplementedError(
-        "Override in subclass. Must be a Tornado gen.coroutine."
-    )
+    raise NotImplementedError("Override in subclass. Must be a coroutine.")
 
 def add_poll_callback(self, callback, *args, **kwargs):
     """Add a callback to fire when the single-user server stops"""

View File

@@ -36,7 +36,6 @@ from unittest import mock
 from pytest import fixture
 from pytest import raises
 
-from tornado import gen
 from tornado import ioloop
 from tornado.httpclient import HTTPError
 from tornado.platform.asyncio import AsyncIOMainLoop
@@ -56,12 +55,15 @@ _db = None
 def pytest_collection_modifyitems(items):
-    """add asyncio marker to all async tests"""
+    """This function is automatically run by pytest passing all collected test
+    functions.
+
+    We use it to add asyncio marker to all async tests and assert we don't use
+    test functions that are async generators which wouldn't make sense.
+    """
     for item in items:
         if inspect.iscoroutinefunction(item.obj):
             item.add_marker('asyncio')
-        if hasattr(inspect, 'isasyncgenfunction'):
-            # double-check that we aren't mixing yield and async def
-            assert not inspect.isasyncgenfunction(item.obj)
+        assert not inspect.isasyncgenfunction(item.obj)
@@ -74,7 +76,9 @@ def ssl_tmpdir(tmpdir_factory):
 def app(request, io_loop, ssl_tmpdir):
     """Mock a jupyterhub app for testing"""
     mocked_app = None
-    ssl_enabled = getattr(request.module, "ssl_enabled", False)
+    ssl_enabled = getattr(
+        request.module, 'ssl_enabled', os.environ.get('SSL_ENABLED', False)
+    )
     kwargs = dict()
     if ssl_enabled:
         kwargs.update(dict(internal_ssl=True, internal_certs_location=str(ssl_tmpdir)))
@@ -244,17 +248,14 @@ def _mockservice(request, app, url=False):
 assert name in app._service_map
 service = app._service_map[name]
 
-@gen.coroutine
-def start():
+async def start():
     # wait for proxy to be updated before starting the service
-    yield app.proxy.add_all_services(app._service_map)
+    await app.proxy.add_all_services(app._service_map)
     service.start()
 
 io_loop.run_sync(start)
 
 def cleanup():
-    import asyncio
-
     asyncio.get_event_loop().run_until_complete(service.stop())
     app.services[:] = []
     app._service_map.clear()

View File

@@ -36,13 +36,12 @@ from unittest import mock
 from urllib.parse import urlparse
 
 from pamela import PAMError
-from tornado import gen
-from tornado.concurrent import Future
 from tornado.ioloop import IOLoop
 from traitlets import Bool
 from traitlets import default
 from traitlets import Dict
 
+from .. import metrics
 from .. import orm
 from ..app import JupyterHub
 from ..auth import PAMAuthenticator
@@ -110,19 +109,17 @@ class SlowSpawner(MockSpawner):
 delay = 2
 _start_future = None
 
-@gen.coroutine
-def start(self):
-    (ip, port) = yield super().start()
+async def start(self):
+    (ip, port) = await super().start()
     if self._start_future is not None:
-        yield self._start_future
+        await self._start_future
     else:
-        yield gen.sleep(self.delay)
+        await asyncio.sleep(self.delay)
     return ip, port
 
-@gen.coroutine
-def stop(self):
-    yield gen.sleep(self.delay)
-    yield super().stop()
+async def stop(self):
+    await asyncio.sleep(self.delay)
+    await super().stop()
class NeverSpawner(MockSpawner):
@@ -134,14 +131,12 @@ class NeverSpawner(MockSpawner):
 def start(self):
     """Return a Future that will never finish"""
-    return Future()
+    return asyncio.Future()
 
-@gen.coroutine
-def stop(self):
+async def stop(self):
     pass
 
-@gen.coroutine
-def poll(self):
+async def poll(self):
     return 0
@@ -215,8 +210,7 @@ class MockPAMAuthenticator(PAMAuthenticator):
 # skip the add-system-user bit
 return not user.name.startswith('dne')
 
-@gen.coroutine
-def authenticate(self, *args, **kwargs):
+async def authenticate(self, *args, **kwargs):
     with mock.patch.multiple(
         'pamela',
         authenticate=mock_authenticate,
@@ -224,7 +218,7 @@ class MockPAMAuthenticator(PAMAuthenticator):
     close_session=mock_open_session,
     check_account=mock_check_account,
 ):
-    username = yield super(MockPAMAuthenticator, self).authenticate(
+    username = await super(MockPAMAuthenticator, self).authenticate(
         *args, **kwargs
     )
 if username is None:
@@ -320,14 +314,13 @@ class MockHub(JupyterHub):
     self.db.delete(group)
 self.db.commit()
 
-@gen.coroutine
-def initialize(self, argv=None):
+async def initialize(self, argv=None):
     self.pid_file = NamedTemporaryFile(delete=False).name
     self.db_file = NamedTemporaryFile()
     self.db_url = os.getenv('JUPYTERHUB_TEST_DB_URL') or self.db_file.name
     if 'mysql' in self.db_url:
         self.db_kwargs['connect_args'] = {'auth_plugin': 'mysql_native_password'}
-    yield super().initialize([])
+    await super().initialize([])
 
     # add an initial user
     user = self.db.query(orm.User).filter(orm.User.name == 'user').first()
@@ -335,6 +328,7 @@ class MockHub(JupyterHub):
     user = orm.User(name='user')
     self.db.add(user)
     self.db.commit()
+    metrics.TOTAL_USERS.inc()
 
 def stop(self):
     super().stop()
@@ -358,14 +352,13 @@ class MockHub(JupyterHub):
 self.cleanup = lambda: None
 self.db_file.close()
 
-@gen.coroutine
-def login_user(self, name):
+async def login_user(self, name):
     """Login a user by name, returning her cookies."""
     base_url = public_url(self)
     external_ca = None
     if self.internal_ssl:
         external_ca = self.external_certs['files']['ca']
-    r = yield async_requests.post(
+    r = await async_requests.post(
         base_url + 'hub/login',
         data={'username': name, 'password': name},
         allow_redirects=False,
@@ -407,8 +400,7 @@ class StubSingleUserSpawner(MockSpawner):
 _thread = None
 
-@gen.coroutine
-def start(self):
+async def start(self):
     ip = self.ip = '127.0.0.1'
     port = self.port = random_port()
     env = self.get_env()
@@ -435,14 +427,12 @@ class StubSingleUserSpawner(MockSpawner):
     assert ready
     return (ip, port)
 
-@gen.coroutine
-def stop(self):
+async def stop(self):
     self._app.stop()
     self._thread.join(timeout=30)
     assert not self._thread.is_alive()
 
-@gen.coroutine
-def poll(self):
+async def poll(self):
     if self._thread is None:
         return 0
     if self._thread.is_alive():

View File

@@ -1,22 +1,20 @@
"""Tests for the REST API.""" """Tests for the REST API."""
import asyncio
import json import json
import re import re
import sys import sys
import uuid import uuid
from concurrent.futures import Future
from datetime import datetime from datetime import datetime
from datetime import timedelta from datetime import timedelta
from unittest import mock from unittest import mock
from urllib.parse import quote from urllib.parse import quote
from urllib.parse import urlparse from urllib.parse import urlparse
from async_generator import async_generator
from async_generator import yield_
from pytest import mark from pytest import mark
from tornado import gen
import jupyterhub import jupyterhub
from .. import orm from .. import orm
from ..objects import Server
from ..utils import url_path_join as ujoin from ..utils import url_path_join as ujoin
from ..utils import utcnow from ..utils import utcnow
from .mocking import public_host from .mocking import public_host
@@ -27,7 +25,6 @@ from .utils import async_requests
 from .utils import auth_header
 from .utils import find_user
 
 # --------------------
 # Authentication tests
 # --------------------
@@ -182,6 +179,71 @@ async def test_get_users(app):
    assert r.status_code == 403
@mark.user
@mark.parametrize(
"state",
("inactive", "active", "ready", "invalid"),
)
async def test_get_users_state_filter(app, state):
db = app.db
# has_one_active: one active, one inactive, zero ready
has_one_active = add_user(db, app=app, name='has_one_active')
# has_two_active: two active, ready servers
has_two_active = add_user(db, app=app, name='has_two_active')
# has_two_inactive: two spawners, neither active
has_two_inactive = add_user(db, app=app, name='has_two_inactive')
# has_zero: no Spawners registered at all
has_zero = add_user(db, app=app, name='has_zero')
test_usernames = set(
("has_one_active", "has_two_active", "has_two_inactive", "has_zero")
)
user_states = {
"inactive": ["has_two_inactive", "has_zero"],
"ready": ["has_two_active"],
"active": ["has_one_active", "has_two_active"],
"invalid": [],
}
expected = user_states[state]
def add_spawner(user, name='', active=True, ready=True):
"""Add a spawner in a requested state
If active, should turn up in an active query
If active and ready, should turn up in a ready query
If not active, should turn up in an inactive query
"""
spawner = user.spawners[name]
db.commit()
if active:
orm_server = orm.Server()
db.add(orm_server)
db.commit()
spawner.server = Server(orm_server=orm_server)
db.commit()
if not ready:
spawner._spawn_pending = True
return spawner
for name in ("", "secondary"):
add_spawner(has_two_active, name, active=True)
add_spawner(has_two_inactive, name, active=False)
add_spawner(has_one_active, active=True, ready=False)
add_spawner(has_one_active, "inactive", active=False)
r = await api_request(app, 'users?state={}'.format(state))
if state == "invalid":
assert r.status_code == 400
return
assert r.status_code == 200
usernames = sorted(u["name"] for u in r.json() if u["name"] in test_usernames)
assert usernames == expected
@mark.user @mark.user
async def test_get_self(app): async def test_get_self(app):
db = app.db db = app.db
@@ -215,6 +277,17 @@ async def test_get_self(app):
     assert r.status_code == 403


+async def test_get_self_service(app, mockservice):
+    r = await api_request(
+        app, "user", headers={"Authorization": f"token {mockservice.api_token}"}
+    )
+    r.raise_for_status()
+    service_info = r.json()
+
+    assert service_info['kind'] == 'service'
+    assert service_info['name'] == mockservice.name
+
+
 @mark.user
 async def test_add_user(app):
     db = app.db
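The new `test_get_self_service` above exercises the change that lets services identify themselves via the Hub API. A sketch of the same call from a managed service's point of view (the environment variables are the ones JupyterHub sets for managed services; `requests` is just one possible client):

    import os
    import requests

    api_url = os.environ["JUPYTERHUB_API_URL"]    # e.g. http://127.0.0.1:8081/hub/api
    token = os.environ["JUPYTERHUB_API_TOKEN"]

    r = requests.get(
        api_url + "/user",
        headers={"Authorization": f"token {token}"},
    )
    r.raise_for_status()
    info = r.json()
    # for a service token: kind == 'service', name == the service's name
    print(info["kind"], info["name"])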
@@ -613,7 +686,7 @@ async def test_slow_spawn(app, no_patience, slow_spawn):

     async def wait_spawn():
         while not app_user.running:
-            await gen.sleep(0.1)
+            await asyncio.sleep(0.1)

     await wait_spawn()
     assert not app_user.spawner._spawn_pending
@@ -622,7 +695,7 @@ async def test_slow_spawn(app, no_patience, slow_spawn):

     async def wait_stop():
         while app_user.spawner._stop_pending:
-            await gen.sleep(0.1)
+            await asyncio.sleep(0.1)

     r = await api_request(app, 'users', name, 'server', method='delete')
     r.raise_for_status()
@@ -656,7 +729,7 @@ async def test_never_spawn(app, no_patience, never_spawn):
     assert app.users.count_active_users()['pending'] == 1

     while app_user.spawner.pending:
-        await gen.sleep(0.1)
+        await asyncio.sleep(0.1)
     print(app_user.spawner.pending)

     assert not app_user.spawner._spawn_pending
@@ -682,7 +755,7 @@ async def test_slow_bad_spawn(app, no_patience, slow_bad_spawn):
     r = await api_request(app, 'users', name, 'server', method='post')
     r.raise_for_status()
     while user.spawner.pending:
-        await gen.sleep(0.1)
+        await asyncio.sleep(0.1)
     # spawn failed
     assert not user.running
     assert app.users.count_active_users()['pending'] == 0
@@ -818,32 +891,12 @@ async def test_progress_bad_slow(request, app, no_patience, slow_bad_spawn):
     }


-@async_generator
 async def progress_forever():
     """progress function that yields messages forever"""
     for i in range(1, 10):
-        await yield_({'progress': i, 'message': 'Stage %s' % i})
+        yield {'progress': i, 'message': 'Stage %s' % i}
         # wait a long time before the next event
-        await gen.sleep(10)
-
-
-if sys.version_info >= (3, 6):
-    # additional progress_forever defined as native
-    # async generator
-    # to test for issues with async_generator wrappers
-    exec(
-        """
-async def progress_forever_native():
-    for i in range(1, 10):
-        yield {
-            'progress': i,
-            'message': 'Stage %s' % i,
-        }
-        # wait a long time before the next event
-        await gen.sleep(10)
-""",
-        globals(),
-    )
+        await asyncio.sleep(10)


 async def test_spawn_progress_cutoff(request, app, no_patience, slow_spawn):
@@ -854,10 +907,6 @@ async def test_spawn_progress_cutoff(request, app, no_patience, slow_spawn):
     db = app.db
     name = 'geddy'
     app_user = add_user(db, app=app, name=name)
-    if sys.version_info >= (3, 6):
-        # Python >= 3.6, try native async generator
-        app_user.spawner.progress = globals()['progress_forever_native']
-    else:
-        app_user.spawner.progress = progress_forever
+    app_user.spawner.progress = progress_forever
     app_user.spawner.delay = 1
@@ -885,8 +934,8 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
     # start two pending spawns
     names = ['ykka', 'hjarka']
     users = [add_user(db, app=app, name=name) for name in names]
-    users[0].spawner._start_future = Future()
-    users[1].spawner._start_future = Future()
+    users[0].spawner._start_future = asyncio.Future()
+    users[1].spawner._start_future = asyncio.Future()
     for name in names:
         await api_request(app, 'users', name, 'server', method='post')
     assert app.users.count_active_users()['pending'] == 2
@@ -894,7 +943,7 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
     # ykka and hjarka's spawns are both pending. Essun should fail with 429
     name = 'essun'
     user = add_user(db, app=app, name=name)
-    user.spawner._start_future = Future()
+    user.spawner._start_future = asyncio.Future()
     r = await api_request(app, 'users', name, 'server', method='post')
     assert r.status_code == 429
@@ -902,7 +951,7 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
     users[0].spawner._start_future.set_result(None)
     # wait for ykka to finish
     while not users[0].running:
-        await gen.sleep(0.1)
+        await asyncio.sleep(0.1)

     assert app.users.count_active_users()['pending'] == 1
     r = await api_request(app, 'users', name, 'server', method='post')
@@ -913,7 +962,7 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
     for user in users[1:]:
         user.spawner._start_future.set_result(None)
     while not all(u.running for u in users):
-        await gen.sleep(0.1)
+        await asyncio.sleep(0.1)

     # everybody's running, pending count should be back to 0
     assert app.users.count_active_users()['pending'] == 0
@@ -922,7 +971,7 @@ async def test_spawn_limit(app, no_patience, slow_spawn, request):
         r = await api_request(app, 'users', u.name, 'server', method='delete')
         r.raise_for_status()
     while any(u.spawner.active for u in users):
-        await gen.sleep(0.1)
+        await asyncio.sleep(0.1)


 @mark.slow
@@ -1000,7 +1049,7 @@ async def test_start_stop_race(app, no_patience, slow_spawn):
     r = await api_request(app, 'users', user.name, 'server', method='delete')
     assert r.status_code == 400
     while not spawner.ready:
-        await gen.sleep(0.1)
+        await asyncio.sleep(0.1)

     spawner.delay = 3
     # stop the spawner
@@ -1008,7 +1057,7 @@ async def test_start_stop_race(app, no_patience, slow_spawn):
     assert r.status_code == 202
     assert spawner.pending == 'stop'
     # make sure we get past deleting from the proxy
-    await gen.sleep(1)
+    await asyncio.sleep(1)
     # additional stops while stopping shouldn't trigger a new stop
     with mock.patch.object(spawner, 'stop') as m:
         r = await api_request(app, 'users', user.name, 'server', method='delete')
@@ -1020,7 +1069,7 @@ async def test_start_stop_race(app, no_patience, slow_spawn):
     assert r.status_code == 400
     while spawner.active:
-        await gen.sleep(0.1)
+        await asyncio.sleep(0.1)
     # start after stop is okay
     r = await api_request(app, 'users', user.name, 'server', method='post')
     assert r.status_code == 202


@@ -1,6 +0,0 @@
-"""Tests for the SSL enabled REST API."""
-# Copyright (c) Jupyter Development Team.
-# Distributed under the terms of the Modified BSD License.
-from jupyterhub.tests.test_api import *
-
-ssl_enabled = True


@@ -1,7 +0,0 @@
-"""Test the JupyterHub entry point with internal ssl"""
-# Copyright (c) Jupyter Development Team.
-# Distributed under the terms of the Modified BSD License.
-import jupyterhub.tests.mocking
-from jupyterhub.tests.test_app import *
-
-ssl_enabled = True


@@ -20,8 +20,7 @@ ssl_enabled = True
 SSL_ERROR = (SSLError, ConnectionError)


-@gen.coroutine
-def wait_for_spawner(spawner, timeout=10):
+async def wait_for_spawner(spawner, timeout=10):
     """Wait for an http server to show up

     polling at shorter intervals for early termination
@@ -32,15 +31,15 @@ def wait_for_spawner(spawner, timeout=10):
         return spawner.server.wait_up(timeout=1, http=True)

     while time.monotonic() < deadline:
-        status = yield spawner.poll()
+        status = await spawner.poll()
         assert status is None
         try:
-            yield wait()
+            await wait()
         except TimeoutError:
             continue
         else:
             break
-    yield wait()
+    await wait()


 async def test_connection_hub_wrong_certs(app):


@@ -1,6 +0,0 @@
-"""Tests for process spawning with internal_ssl"""
-# Copyright (c) Jupyter Development Team.
-# Distributed under the terms of the Modified BSD License.
-from jupyterhub.tests.test_spawner import *
-
-ssl_enabled = True


@@ -0,0 +1,34 @@
+import json
+
+from .utils import add_user
+from .utils import api_request
+from jupyterhub import metrics
+from jupyterhub import orm
+
+
+async def test_total_users(app):
+    num_users = app.db.query(orm.User).count()
+    sample = metrics.TOTAL_USERS.collect()[0].samples[0]
+    assert sample.value == num_users
+
+    await api_request(
+        app, "/users", method="post", data=json.dumps({"usernames": ["incrementor"]})
+    )
+
+    sample = metrics.TOTAL_USERS.collect()[0].samples[0]
+    assert sample.value == num_users + 1
+
+    # GET /users used to double-count
+    await api_request(app, "/users")
+
+    # populate the Users cache dict if any are missing:
+    for user in app.db.query(orm.User):
+        _ = app.users[user.id]
+
+    sample = metrics.TOTAL_USERS.collect()[0].samples[0]
+    assert sample.value == num_users + 1
+
+    await api_request(app, "/users/incrementor", method="delete")
+
+    sample = metrics.TOTAL_USERS.collect()[0].samples[0]
+    assert sample.value == num_users
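The test reads the gauge the way prometheus_client exposes it: `collect()` yields metric families whose `samples` carry current values. For orientation, `TOTAL_USERS` is a prometheus `Gauge`; a sketch along these lines, not the exact definition in `jupyterhub.metrics`:

    from prometheus_client import Gauge

    TOTAL_USERS = Gauge(
        'total_users',
        'total number of users',
        namespace='jupyterhub',  # exported as jupyterhub_total_users
    )

    TOTAL_USERS.inc()  # on user creation
    TOTAL_USERS.dec()  # on user deletion
    value = TOTAL_USERS.collect()[0].samples[0].value  # what the test asserts on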


@@ -222,8 +222,7 @@ async def test_spawn_fails(db):
     db.commit()

     class BadSpawner(MockSpawner):
-        @gen.coroutine
-        def start(self):
+        async def start(self):
             raise RuntimeError("Split the party")

     user = User(


@@ -8,7 +8,6 @@ from urllib.parse import urlparse

 import pytest
 from bs4 import BeautifulSoup
-from tornado import gen
 from tornado.escape import url_escape
 from tornado.httputil import url_concat
@@ -31,7 +30,7 @@ async def test_root_no_auth(app):
     url = ujoin(public_host(app), app.hub.base_url)
     r = await async_requests.get(url)
     r.raise_for_status()
-    assert r.url == ujoin(url, 'login')
+    assert r.url == url_concat(ujoin(url, 'login'), dict(next=app.hub.base_url))


 async def test_root_auth(app):
@@ -266,7 +265,8 @@ async def test_spawn_with_query_arguments(app):
     next_url = ujoin(app.base_url, 'user/jones/tree')
     r = await async_requests.get(
         url_concat(
-            ujoin(base_url, 'spawn'), {'next': next_url, 'energy': '510keV'},
+            ujoin(base_url, 'spawn'),
+            {'next': next_url, 'energy': '510keV'},
         ),
         cookies=cookies,
     )
@@ -332,6 +332,12 @@ async def test_spawn_form_admin_access(app, admin_access):
         data={'bounds': ['-3', '3'], 'energy': '938MeV'},
     )
     r.raise_for_status()
+    while '/spawn-pending/' in r.url:
+        await asyncio.sleep(0.1)
+        r = await async_requests.get(r.url, cookies=cookies)
+        r.raise_for_status()
+
     assert r.history
     assert r.url.startswith(public_url(app, u))
     assert u.spawner.user_options == {
@@ -586,8 +592,7 @@ async def test_login_strip(app):
     base_url = public_url(app)
     called_with = []

-    @gen.coroutine
-    def mock_authenticate(handler, data):
+    async def mock_authenticate(handler, data):
         called_with.append(data)

     with mock.patch.object(app.authenticator, 'authenticate', mock_authenticate):
@@ -616,9 +621,16 @@ async def test_login_strip(app):
         (False, '//other.domain', '', None),
         (False, '///other.domain/triple', '', None),
         (False, '\\\\other.domain/backslashes', '', None),
-        # params are handled correctly
-        (True, '/hub/admin', 'hub/admin?left=1&right=2', [('left', 1), ('right', 2)]),
-        (False, '/hub/admin', 'hub/admin?left=1&right=2', [('left', 1), ('right', 2)]),
+        # params are handled correctly (ignored if ?next= specified)
+        (
+            True,
+            '/hub/admin?left=1&right=2',
+            'hub/admin?left=1&right=2',
+            {"left": "abc"},
+        ),
+        (False, '/hub/admin', 'hub/admin', [('left', 1), ('right', 2)]),
+        (True, '', '', {"keep": "yes"}),
+        (False, '', '', {"keep": "yes"}),
     ],
 )
 async def test_login_redirect(app, running, next_url, location, params):
@@ -627,10 +639,15 @@ async def test_login_redirect(app, running, next_url, location, params):
     if location:
         location = ujoin(app.base_url, location)
     elif running:
+        # location not specified,
         location = user.url
+        if params:
+            location = url_concat(location, params)
     else:
         # use default url
         location = ujoin(app.base_url, 'hub/spawn')
+        if params:
+            location = url_concat(location, params)

     url = 'login'
     if params:
@@ -649,7 +666,73 @@ async def test_login_redirect(app, running, next_url, location, params):
     r = await get_page(url, app, cookies=cookies, allow_redirects=False)
     r.raise_for_status()
     assert r.status_code == 302
-    assert location == r.headers['Location']
+    assert r.headers["Location"] == location
+
+
+@pytest.mark.parametrize(
+    'location, next, extra_params',
+    [
+        (
+            "{base_url}hub/spawn?a=5",
+            None,
+            {"a": "5"},
+        ),  # no ?next= given, preserve params
+        ("/x", "/x", {"a": "5"}),  # ?next= given, params ignored
+        (
+            "/x?b=10",
+            "/x?b=10",
+            {"a": "5"},
+        ),  # ?next= given with params, additional params ignored
+    ],
+)
+async def test_next_url(app, user, location, next, extra_params):
+    params = {}
+    if extra_params:
+        params.update(extra_params)
+    if next:
+        params["next"] = next
+    url = url_concat("/", params)
+    cookies = await app.login_user("monster")
+
+    # location can be a string template
+    location = location.format(base_url=app.base_url)
+
+    r = await get_page(url, app, cookies=cookies, allow_redirects=False)
+    r.raise_for_status()
+    assert r.status_code == 302
+    assert r.headers["Location"] == location
+
+
+async def test_next_url_params_sequence(app, user):
+    """Test each step of / -> login -> spawn
+
+    and whether they preserve url params
+    """
+    params = {"xyz": "5"}
+    # first request: root page, with params, not logged in
+    r = await get_page("/?xyz=5", app, allow_redirects=False)
+    r.raise_for_status()
+    location = r.headers["Location"]
+
+    # next page: login
+    cookies = await app.login_user(user.name)
+    assert location == url_concat(
+        ujoin(app.base_url, "/hub/login"), {"next": ujoin(app.base_url, "/hub/?xyz=5")}
+    )
+    r = await async_requests.get(
+        public_host(app) + location, cookies=cookies, allow_redirects=False
+    )
+    r.raise_for_status()
+    location = r.headers["Location"]
+
+    # after login, redirect back
+    assert location == ujoin(app.base_url, "/hub/?xyz=5")
+    r = await async_requests.get(
+        public_host(app) + location, cookies=cookies, allow_redirects=False
+    )
+    r.raise_for_status()
+    location = r.headers["Location"]
+    assert location == ujoin(app.base_url, "/hub/spawn?xyz=5")


 async def test_auto_login(app, request):
@@ -663,14 +746,18 @@ async def test_auto_login(app, request):
     )
     # no auto_login: end up at /hub/login
     r = await async_requests.get(base_url)
-    assert r.url == public_url(app, path='hub/login')
+    assert r.url == url_concat(
+        public_url(app, path="hub/login"), {"next": app.hub.base_url}
+    )
     # enable auto_login: redirect from /hub/login to /hub/dummy
     authenticator = Authenticator(auto_login=True)
     authenticator.login_url = lambda base_url: ujoin(base_url, 'dummy')

     with mock.patch.dict(app.tornado_settings, {'authenticator': authenticator}):
         r = await async_requests.get(base_url)
-        assert r.url == public_url(app, path='hub/dummy')
+        assert r.url == url_concat(
+            public_url(app, path="hub/dummy"), {"next": app.hub.base_url}
+        )


 async def test_auto_login_logout(app):
@@ -943,8 +1030,7 @@ async def test_pre_spawn_start_exc_no_form(app):
     exc = "pre_spawn_start error"

     # throw exception from pre_spawn_start
-    @gen.coroutine
-    def mock_pre_spawn_start(user, spawner):
+    async def mock_pre_spawn_start(user, spawner):
         raise Exception(exc)

     with mock.patch.object(app.authenticator, 'pre_spawn_start', mock_pre_spawn_start):
@@ -959,8 +1045,7 @@ async def test_pre_spawn_start_exc_options_form(app):
     exc = "pre_spawn_start error"

     # throw exception from pre_spawn_start
-    @gen.coroutine
-    def mock_pre_spawn_start(user, spawner):
+    async def mock_pre_spawn_start(user, spawner):
         raise Exception(exc)

     with mock.patch.dict(


@@ -0,0 +1,52 @@
+"""tests for pagination"""
+from pytest import mark
+from pytest import raises
+from traitlets.config import Config
+
+from jupyterhub.pagination import Pagination
+
+
+@mark.parametrize(
+    "per_page, max_per_page, expected",
+    [
+        (20, 10, 10),
+        (1000, 1000, 1000),
+    ],
+)
+def test_per_page_bounds(per_page, max_per_page, expected):
+    cfg = Config()
+    cfg.Pagination.max_per_page = max_per_page
+    p = Pagination(config=cfg)
+    p.per_page = per_page
+    p.total = 99999
+    assert p.per_page == expected
+    with raises(Exception):
+        p.per_page = 0
+
+
+@mark.parametrize(
+    "page, per_page, total, expected",
+    [
+        (1, 10, 99, [1, 2, 3, "...", 10]),
+        (2, 10, 99, [1, 2, 3, 4, "...", 10]),
+        (3, 10, 99, [1, 2, 3, 4, 5, "...", 10]),
+        (4, 10, 99, [1, 2, 3, 4, 5, 6, "...", 10]),
+        (5, 10, 99, [1, "...", 3, 4, 5, 6, 7, "...", 10]),
+        (6, 10, 99, [1, "...", 4, 5, 6, 7, 8, "...", 10]),
+        (7, 10, 99, [1, "...", 5, 6, 7, 8, 9, 10]),
+        (8, 10, 99, [1, "...", 6, 7, 8, 9, 10]),
+        (9, 10, 99, [1, "...", 7, 8, 9, 10]),
+        (1, 20, 99, [1, 2, 3, 4, 5]),
+        (1, 10, 0, [1]),
+        (1, 10, 1, [1]),
+        (1, 10, 10, [1]),
+        (1, 10, 11, [1, 2]),
+        (1, 10, 50, [1, 2, 3, 4, 5]),
+        (1, 10, 60, [1, 2, 3, 4, 5, 6]),
+        (1, 10, 70, [1, 2, 3, 4, 5, 6, 7]),
+        (1, 10, 80, [1, 2, 3, "...", 8]),
+    ],
+)
+def test_window(page, per_page, total, expected):
+    pagination = Pagination(page=page, per_page=per_page, total=total)
+    assert pagination.calculate_pages_window() == expected
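The first two `test_per_page_bounds` cases encode the fix: `per_page` is clamped to `max_per_page`, so the cap must be raised via config before a larger `per_page` can take effect. In a deployment that is one line of configuration (the value here is illustrative):

    # jupyterhub_config.py
    c.Pagination.max_per_page = 1000  # raise the cap before requesting bigger pages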


@@ -9,12 +9,12 @@ from urllib.parse import urlparse

 import pytest
 from traitlets.config import Config

-from .. import orm
 from ..utils import url_path_join as ujoin
 from ..utils import wait_for_http_server
 from .mocking import MockHub
 from .test_api import add_user
 from .test_api import api_request
+from .utils import skip_if_ssl


 @pytest.fixture
@@ -27,6 +27,7 @@ def disable_check_routes(app):
         app.last_activity_callback.start()


+@skip_if_ssl
 async def test_external_proxy(request):
     auth_token = 'secret!'
     proxy_ip = '127.0.0.1'


@@ -6,9 +6,7 @@ from binascii import hexlify
 from contextlib import contextmanager
 from subprocess import Popen

-from async_generator import async_generator
 from async_generator import asynccontextmanager
-from async_generator import yield_
 from tornado.ioloop import IOLoop

 from ..utils import maybe_future
@@ -17,6 +15,7 @@ from ..utils import url_path_join
 from ..utils import wait_for_http_server
 from .mocking import public_url
 from .utils import async_requests
+from .utils import skip_if_ssl

 mockservice_path = os.path.dirname(os.path.abspath(__file__))
 mockservice_py = os.path.join(mockservice_path, 'mockservice.py')
@@ -24,7 +23,6 @@ mockservice_cmd = [sys.executable, mockservice_py]

 @asynccontextmanager
-@async_generator
 async def external_service(app, name='mockservice'):
     env = {
         'JUPYTERHUB_API_TOKEN': hexlify(os.urandom(5)),
@@ -35,7 +33,7 @@ async def external_service(app, name='mockservice'):
     proc = Popen(mockservice_cmd, env=env)
     try:
         await wait_for_http_server(env['JUPYTERHUB_SERVICE_URL'])
-        await yield_(env)
+        yield env
     finally:
         proc.terminate()
@@ -63,6 +61,7 @@ async def test_managed_service(mockservice):
     assert service.proc.poll() is None


+@skip_if_ssl
 async def test_proxy_service(app, mockservice_url):
     service = mockservice_url
     name = service.name
@@ -76,6 +75,7 @@ async def test_proxy_service(app, mockservice_url):
     assert r.text.endswith(path)


+@skip_if_ssl
 async def test_external_service(app):
     name = 'external'
     async with external_service(app, name=name) as env:
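The dropped `@async_generator`/`yield_` layer in `external_service` works because `asynccontextmanager` from the async_generator package accepts a native async generator in recent versions, which is presumably why the requirements below bump to `async_generator>=1.9`. The resulting shape, sketched with a hypothetical resource:

    from async_generator import asynccontextmanager

    @asynccontextmanager
    async def managed_resource():
        resource = open_resource()  # hypothetical setup
        try:
            yield resource          # plain yield, no yield_() wrapper
        finally:
            resource.close()        # cleanup always runs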


@@ -35,6 +35,7 @@ from .utils import AsyncSession

 # mock for sending monotonic counter way into the future
 monotonic_future = mock.patch('time.monotonic', lambda: sys.maxsize)
+ssl_enabled = False


 def test_expiring_dict():


@@ -1,6 +1,7 @@
 """Tests for process spawning"""
 # Copyright (c) Jupyter Development Team.
 # Distributed under the terms of the Modified BSD License.
+import asyncio
 import logging
 import os
 import signal
@@ -12,7 +13,6 @@ from unittest import mock
 from urllib.parse import urlparse

 import pytest
-from tornado import gen

 from .. import orm
 from .. import spawner as spawnermod
@@ -123,7 +123,7 @@ async def test_stop_spawner_sigint_fails(db):
     await spawner.start()

     # wait for the process to get to the while True: loop
-    await gen.sleep(1)
+    await asyncio.sleep(1)

     status = await spawner.poll()
     assert status is None
@@ -138,7 +138,7 @@ async def test_stop_spawner_stop_now(db):
     await spawner.start()

     # wait for the process to get to the while True: loop
-    await gen.sleep(1)
+    await asyncio.sleep(1)

     status = await spawner.poll()
     assert status is None
@@ -165,7 +165,7 @@ async def test_spawner_poll(db):
     spawner.start_polling()

     # wait for the process to get to the while True: loop
-    await gen.sleep(1)
+    await asyncio.sleep(1)
     status = await spawner.poll()
     assert status is None
@@ -173,12 +173,12 @@ async def test_spawner_poll(db):
     proc.terminate()
     for i in range(10):
         if proc.poll() is None:
-            await gen.sleep(1)
+            await asyncio.sleep(1)
         else:
             break
     assert proc.poll() is not None

-    await gen.sleep(2)
+    await asyncio.sleep(2)
     status = await spawner.poll()
     assert status is not None
@@ -258,8 +258,7 @@ async def test_shell_cmd(db, tmpdir, request):

 def test_inherit_overwrite():
-    """On 3.6+ we check things are overwritten at import time
-    """
+    """On 3.6+ we check things are overwritten at import time"""
     if sys.version_info >= (3, 6):
         with pytest.raises(NotImplementedError):


@@ -1,21 +1,22 @@
 """Tests for utilities"""
 import asyncio
+import time
+from concurrent.futures import ThreadPoolExecutor

 import pytest
 from async_generator import aclosing
-from async_generator import async_generator
-from async_generator import yield_
+from tornado import gen
+from tornado.concurrent import run_on_executor

 from ..utils import iterate_until


-@async_generator
 async def yield_n(n, delay=0.01):
     """Yield n items with a delay between each"""
     for i in range(n):
         if delay:
             await asyncio.sleep(delay)
-        await yield_(i)
+        yield i


 def schedule_future(io_loop, *, delay, result=None):
@@ -50,13 +51,40 @@ async def test_iterate_until(io_loop, deadline, n, delay, expected):

 async def test_iterate_until_ready_after_deadline(io_loop):
     f = schedule_future(io_loop, delay=0)

-    @async_generator
     async def gen():
         for i in range(5):
-            await yield_(i)
+            yield i

     yielded = []
     async with aclosing(iterate_until(f, gen())) as items:
         async for item in items:
             yielded.append(item)
     assert yielded == list(range(5))
+
+
+@gen.coroutine
+def tornado_coroutine():
+    yield gen.sleep(0.05)
+    return "ok"
+
+
+class TornadoCompat:
+    def __init__(self):
+        self.executor = ThreadPoolExecutor(1)
+
+    @run_on_executor
+    def on_executor(self):
+        time.sleep(0.05)
+        return "executor"
+
+    @gen.coroutine
+    def tornado_coroutine(self):
+        yield gen.sleep(0.05)
+        return "gen.coroutine"
+
+
+async def test_tornado_coroutines():
+    t = TornadoCompat()
+    # verify that tornado gen and executor methods return awaitables
+    assert (await t.on_executor()) == "executor"
+    assert (await t.tornado_coroutine()) == "gen.coroutine"


@@ -1,9 +1,12 @@
 import asyncio
+import os
 from concurrent.futures import ThreadPoolExecutor

+import pytest
 import requests
 from certipy import Certipy

+from jupyterhub import metrics
 from jupyterhub import orm
 from jupyterhub.objects import Server
 from jupyterhub.utils import url_path_join as ujoin
@@ -52,6 +55,12 @@ def ssl_setup(cert_dir, authority_name):
     return external_certs


+"""Skip tests that don't work under internal-ssl when testing under internal-ssl"""
+skip_if_ssl = pytest.mark.skipif(
+    os.environ.get('SSL_ENABLED', False), reason="Does not use internal SSL"
+)
+
+
 def check_db_locks(func):
     """Decorator that verifies no locks are held on database upon exit.
@@ -97,6 +106,7 @@ def add_user(db, app=None, **kwargs):
     if orm_user is None:
         orm_user = orm.User(**kwargs)
         db.add(orm_user)
+        metrics.TOTAL_USERS.inc()
     else:
         for attr, value in kwargs.items():
             setattr(orm_user, attr, value)


@@ -69,7 +69,6 @@ class UserDict(dict):
         """Add a user to the UserDict"""
         if orm_user.id not in self:
             self[orm_user.id] = self.from_orm(orm_user)
-            TOTAL_USERS.inc()
         return self[orm_user.id]

     def __contains__(self, key):


@@ -21,10 +21,6 @@ from hmac import compare_digest
 from operator import itemgetter

 from async_generator import aclosing
-from async_generator import async_generator
-from async_generator import asynccontextmanager
-from async_generator import yield_
-from tornado import gen
 from tornado import ioloop
 from tornado import web
 from tornado.httpclient import AsyncHTTPClient
@@ -32,6 +28,15 @@ from tornado.httpclient import HTTPError
 from tornado.log import app_log
 from tornado.platform.asyncio import to_asyncio_future

+# For compatibility with python versions 3.6 or earlier.
+# asyncio.Task.all_tasks() is fully moved to asyncio.all_tasks() starting with 3.9. Also applies to current_task.
+try:
+    asyncio_all_tasks = asyncio.all_tasks
+    asyncio_current_task = asyncio.current_task
+except AttributeError as e:
+    asyncio_all_tasks = asyncio.Task.all_tasks
+    asyncio_current_task = asyncio.Task.current_task
+

 def random_port():
     """Get a single random port."""
@@ -79,8 +84,7 @@ def can_connect(ip, port):

 def make_ssl_context(keyfile, certfile, cafile=None, verify=True, check_hostname=True):
-    """Setup context for starting an https server or making requests over ssl.
-    """
+    """Setup context for starting an https server or making requests over ssl."""
     if not keyfile or not certfile:
         return None
     purpose = ssl.Purpose.SERVER_AUTH if verify else ssl.Purpose.CLIENT_AUTH
@@ -100,7 +104,7 @@ async def exponential_backoff(
     timeout=10,
     timeout_tolerance=0.1,
     *args,
-    **kwargs
+    **kwargs,
 ):
     """
     Exponentially backoff until `pass_func` is true.
@@ -175,7 +179,7 @@ async def exponential_backoff(
         dt = min(max_wait, remaining, random.uniform(0, start_wait * scale))
         if dt < max_wait:
             scale *= scale_factor
-        await gen.sleep(dt)
+        await asyncio.sleep(dt)
     raise TimeoutError(fail_message)
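For context, `exponential_backoff` polls `pass_func` with randomized, growing waits until it returns something truthy, raising `TimeoutError(fail_message)` after `timeout` seconds; the only change here is sleeping via asyncio. A usage sketch (the readiness check is hypothetical):

    from jupyterhub.utils import exponential_backoff

    async def wait_until_up(url):
        await exponential_backoff(
            lambda: can_connect_to(url),  # hypothetical truthy-when-ready check
            "Server at %s never came up" % url,
            start_wait=0.2,  # initial wait; grows by scale_factor with jitter
            timeout=10,      # give up and raise TimeoutError after 10s
        )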
@@ -479,7 +483,7 @@ def print_stacks(file=sys.stderr):
     # also show asyncio tasks, if any
     # this will increase over time as we transition from tornado
     # coroutines to native `async def`
-    tasks = asyncio.Task.all_tasks()
+    tasks = asyncio_all_tasks()
     if tasks:
         print("AsyncIO tasks: %i" % len(tasks))
         for task in tasks:
@@ -507,28 +511,12 @@ def maybe_future(obj):
         return asyncio.wrap_future(obj)
     else:
         # could also check for tornado.concurrent.Future
-        # but with tornado >= 5 tornado.Future is asyncio.Future
+        # but with tornado >= 5.1 tornado.Future is asyncio.Future
         f = asyncio.Future()
         f.set_result(obj)
         return f


-@asynccontextmanager
-@async_generator
-async def not_aclosing(coro):
-    """An empty context manager for Python < 3.5.2
-    which lacks the `aclose` method on async iterators
-    """
-    await yield_(await coro)
-
-
-if sys.version_info < (3, 5, 2):
-    # Python 3.5.1 is missing the aclose method on async iterators,
-    # so we can't close them
-    aclosing = not_aclosing
-
-
-@async_generator
 async def iterate_until(deadline_future, generator):
     """An async generator that yields items from a generator

     until a deadline future resolves
@@ -553,7 +541,7 @@ async def iterate_until(deadline_future, generator):
         )
         if item_future.done():
             try:
-                await yield_(item_future.result())
+                yield item_future.result()
             except (StopAsyncIteration, asyncio.CancelledError):
                 break
         elif deadline_future.done():


@@ -1,2 +1,7 @@
 [tool.black]
 skip-string-normalization = true
+target_version = [
+    "py36",
+    "py37",
+    "py38",
+]


@@ -1,15 +1,15 @@
 alembic
-async_generator>=1.8
+async_generator>=1.9
 certipy>=0.1.2
 entrypoints
-jinja2
+jinja2>=2.11.0
 jupyter_telemetry>=0.1.0
 oauthlib>=3.0
-pamela
-prometheus_client>=0.0.21
+pamela; sys_platform != 'win32'
+prometheus_client>=0.4.0
 psutil>=5.6.5; sys_platform == 'win32'
 python-dateutil
 requests
 SQLAlchemy>=1.1
-tornado>=5.0
+tornado>=5.1
 traitlets>=4.3.2


@@ -17,8 +17,8 @@ from setuptools.command.bdist_egg import bdist_egg

 v = sys.version_info
-if v[:2] < (3, 5):
-    error = "ERROR: JupyterHub requires Python version 3.5 or above."
+if v[:2] < (3, 6):
+    error = "ERROR: JupyterHub requires Python version 3.6 or above."
     print(error, file=sys.stderr)
     sys.exit(1)
@@ -94,7 +94,7 @@ setup_args = dict(
     license="BSD",
     platforms="Linux, Mac OS X",
     keywords=['Interactive', 'Interpreter', 'Shell', 'Web'],
-    python_requires=">=3.5",
+    python_requires=">=3.6",
     entry_points={
         'jupyterhub.authenticators': [
             'default = jupyterhub.auth:PAMAuthenticator',


@@ -61,13 +61,25 @@
           id="login_submit"
           type="submit"
           class='btn btn-jupyter'
-          value='Sign In'
+          value='Sign in'
           tabindex="3"
         />
         <div class="feedback-widget hidden">
           <i class="fa fa-spinner"></i>
         </div>
       </div>
+      {% block login_terms %}
+      {% if login_term_url %}
+      <div id="login_terms" class="login_terms">
+        <input type="checkbox" id="login_terms_checkbox" name="login_terms_checkbox" required />
+        {% block login_terms_text %} {# allow overriding the text #}
+        By logging into the platform you accept the <a href="{{ login_term_url }}">terms and conditions</a>.
+        {% endblock login_terms_text %}
+      </div>
+      {% endif %}
+      {% endblock login_terms %}
     </div>
   </form>
 {% endif %}
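`login_term_url` is unset by default, so the new checkbox only renders when a deployment provides it. The usual way in is `JupyterHub.template_vars`; a sketch of the config side (the URL is a placeholder):

    # jupyterhub_config.py
    c.JupyterHub.template_vars = {
        'login_term_url': 'https://example.org/terms',
    }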