Merge master into rbac
.github/workflows/release.yml (new file, +61)
@@ -0,0 +1,61 @@
+# Build releases and (on tags) publish to PyPI
+name: Release
+
+# always build releases (to make sure wheel-building works)
+# but only publish to PyPI on tags
+on:
+  push:
+  pull_request:
+
+jobs:
+  build-release:
+    runs-on: ubuntu-20.04
+    steps:
+      - uses: actions/checkout@v2
+      - uses: actions/setup-python@v2
+        with:
+          python-version: 3.8
+
+      - uses: actions/setup-node@v1
+        with:
+          node-version: "14"
+
+      - name: install build package
+        run: |
+          pip install --upgrade pip
+          pip install build
+          pip freeze
+
+      - name: build release
+        run: |
+          python -m build --sdist --wheel .
+          ls -l dist
+
+      - name: verify wheel
+        run: |
+          cd dist
+          pip install ./*.whl
+          # verify data-files are installed where they are found
+          cat <<EOF | python
+          import os
+          from jupyterhub._data import DATA_FILES_PATH
+          print(f"DATA_FILES_PATH={DATA_FILES_PATH}")
+          assert os.path.exists(DATA_FILES_PATH), DATA_FILES_PATH
+          for subpath in (
+              "templates/page.html",
+              "static/css/style.min.css",
+              "static/components/jquery/dist/jquery.js",
+          ):
+              path = os.path.join(DATA_FILES_PATH, subpath)
+              assert os.path.exists(path), path
+          print("OK")
+          EOF
+
+      - name: Publish to PyPI
+        if: startsWith(github.ref, 'refs/tags/')
+        env:
+          TWINE_USERNAME: __token__
+          TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
+        run: |
+          pip install twine
+          twine upload --skip-existing dist/*
.github/workflows/test.yml (+3)
@@ -1,7 +1,7 @@
 # This is a GitHub workflow defining a set of jobs with a set of steps.
 # ref: https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-syntax-for-github-actions
 #
-name: Run tests
+name: Test
 
 # Trigger the workflow's on all PRs but only on pushed tags or commits to
 # main/master branch to avoid PRs developed in a GitHub fork's dedicated branch
@@ -9,6 +9,7 @@ name: Run tests
 on:
   pull_request:
   push:
+  workflow_dispatch:
 
 defaults:
   run:
@@ -13,7 +13,7 @@
 [](https://pypi.python.org/pypi/jupyterhub)
 [](https://www.npmjs.com/package/jupyterhub)
 [](https://jupyterhub.readthedocs.org/en/latest/)
-[](https://travis-ci.com/jupyterhub/jupyterhub)
+[](https://github.com/jupyterhub/jupyterhub/actions)
 [](https://hub.docker.com/r/jupyterhub/jupyterhub/tags)
 [](https://circleci.com/gh/jupyterhub/jupyterhub)<!-- CircleCI Token: b5b65862eb2617b9a8d39e79340b0a6b816da8cc -->
 [](https://codecov.io/gh/jupyterhub/jupyterhub)
@@ -7,6 +7,62 @@ command line for details.
 
 ## [Unreleased]
 
+### 1.3
+
+JupyterHub 1.3 is a small feature release. Highlights include:
+
+- Require Python >=3.6 (jupyterhub 1.2 is the last release to support 3.5)
+- Add a `?state=` filter for getting user list, allowing much quicker responses
+  when retrieving a small fraction of users.
+  `state` can be `active`, `inactive`, or `ready`.
+- prometheus metrics now include a `jupyterhub_` prefix,
+  so deployments may need to update their grafana charts to match.
+- page templates can now be [async](https://jinja.palletsprojects.com/en/2.11.x/api/#async-support)!
+
+### [1.3.0]
+
+([full changelog](https://github.com/jupyterhub/jupyterhub/compare/1.2.1...1.3.0))
+
+#### Enhancements made
+
+* allow services to call /api/user to identify themselves [#3293](https://github.com/jupyterhub/jupyterhub/pull/3293) ([@minrk](https://github.com/minrk))
+* Add optional user agreement to login screen [#3264](https://github.com/jupyterhub/jupyterhub/pull/3264) ([@tlvu](https://github.com/tlvu))
+* [Metrics] Add prefix to prometheus metrics to group all jupyterhub metrics [#3243](https://github.com/jupyterhub/jupyterhub/pull/3243) ([@agp8x](https://github.com/agp8x))
+* Allow options_from_form to be configurable [#3225](https://github.com/jupyterhub/jupyterhub/pull/3225) ([@cbanek](https://github.com/cbanek))
+* add ?state= filter for GET /users [#3177](https://github.com/jupyterhub/jupyterhub/pull/3177) ([@minrk](https://github.com/minrk))
+* Enable async support in jinja2 templates [#3176](https://github.com/jupyterhub/jupyterhub/pull/3176) ([@yuvipanda](https://github.com/yuvipanda))
+
+#### Bugs fixed
+
+* fix increasing pagination limits [#3294](https://github.com/jupyterhub/jupyterhub/pull/3294) ([@minrk](https://github.com/minrk))
+* fix and test TOTAL_USERS count [#3289](https://github.com/jupyterhub/jupyterhub/pull/3289) ([@minrk](https://github.com/minrk))
+* Fix asyncio deprecation asyncio.Task.all_tasks [#3298](https://github.com/jupyterhub/jupyterhub/pull/3298) ([@coffeebenzene](https://github.com/coffeebenzene))
+
+#### Maintenance and upkeep improvements
+
+* bump oldest-required prometheus-client [#3292](https://github.com/jupyterhub/jupyterhub/pull/3292) ([@minrk](https://github.com/minrk))
+* bump black pre-commit hook to 20.8 [#3287](https://github.com/jupyterhub/jupyterhub/pull/3287) ([@minrk](https://github.com/minrk))
+* Test internal_ssl separately [#3266](https://github.com/jupyterhub/jupyterhub/pull/3266) ([@0mar](https://github.com/0mar))
+* wait for pending spawns in spawn_form_admin_access [#3253](https://github.com/jupyterhub/jupyterhub/pull/3253) ([@minrk](https://github.com/minrk))
+* Assume py36 and remove @gen.coroutine etc. [#3242](https://github.com/jupyterhub/jupyterhub/pull/3242) ([@consideRatio](https://github.com/consideRatio))
+
+#### Documentation improvements
+
+* Fix curl in jupyter announcements [#3286](https://github.com/jupyterhub/jupyterhub/pull/3286) ([@Sangarshanan](https://github.com/Sangarshanan))
+* CONTRIBUTING: Fix contributor guide URL [#3281](https://github.com/jupyterhub/jupyterhub/pull/3281) ([@olifre](https://github.com/olifre))
+* Update services.md [#3267](https://github.com/jupyterhub/jupyterhub/pull/3267) ([@slemonide](https://github.com/slemonide))
+* [Docs] Fix https reverse proxy redirect issues [#3244](https://github.com/jupyterhub/jupyterhub/pull/3244) ([@mhwasil](https://github.com/mhwasil))
+* Fixed idle-culler references. [#3300](https://github.com/jupyterhub/jupyterhub/pull/3300) ([@mxjeff](https://github.com/mxjeff))
+* Remove the extra parenthesis in service.md [#3303](https://github.com/jupyterhub/jupyterhub/pull/3303) ([@Sangarshanan](https://github.com/Sangarshanan))
+
+#### Contributors to this release
+
+([GitHub contributors page for this release](https://github.com/jupyterhub/jupyterhub/graphs/contributors?from=2020-10-30&to=2020-12-11&type=c))
+
+[@0mar](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3A0mar+updated%3A2020-10-30..2020-12-11&type=Issues) | [@agp8x](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aagp8x+updated%3A2020-10-30..2020-12-11&type=Issues) | [@alexweav](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aalexweav+updated%3A2020-10-30..2020-12-11&type=Issues) | [@belfhi](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Abelfhi+updated%3A2020-10-30..2020-12-11&type=Issues) | [@betatim](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Abetatim+updated%3A2020-10-30..2020-12-11&type=Issues) | [@cbanek](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Acbanek+updated%3A2020-10-30..2020-12-11&type=Issues) | [@cmd-ntrf](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Acmd-ntrf+updated%3A2020-10-30..2020-12-11&type=Issues) | [@coffeebenzene](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Acoffeebenzene+updated%3A2020-10-30..2020-12-11&type=Issues) | [@consideRatio](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AconsideRatio+updated%3A2020-10-30..2020-12-11&type=Issues) | [@danlester](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Adanlester+updated%3A2020-10-30..2020-12-11&type=Issues) | [@fcollonval](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Afcollonval+updated%3A2020-10-30..2020-12-11&type=Issues) | [@GeorgianaElena](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AGeorgianaElena+updated%3A2020-10-30..2020-12-11&type=Issues) | [@ianabc](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aianabc+updated%3A2020-10-30..2020-12-11&type=Issues) | [@IvanaH8](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AIvanaH8+updated%3A2020-10-30..2020-12-11&type=Issues) | [@manics](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amanics+updated%3A2020-10-30..2020-12-11&type=Issues) | [@meeseeksmachine](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ameeseeksmachine+updated%3A2020-10-30..2020-12-11&type=Issues) | [@mhwasil](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amhwasil+updated%3A2020-10-30..2020-12-11&type=Issues) | [@minrk](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aminrk+updated%3A2020-10-30..2020-12-11&type=Issues) | [@mriedem](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amriedem+updated%3A2020-10-30..2020-12-11&type=Issues) | [@mxjeff](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amxjeff+updated%3A2020-10-30..2020-12-11&type=Issues) | [@olifre](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aolifre+updated%3A2020-10-30..2020-12-11&type=Issues) | [@rcthomas](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Arcthomas+updated%3A2020-10-30..2020-12-11&type=Issues) | [@rgbkrk](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Argbkrk+updated%3A2020-10-30..2020-12-11&type=Issues) | [@rkdarst](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Arkdarst+updated%3A2020-10-30..2020-12-11&type=Issues) | [@Sangarshanan](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3ASangarshanan+updated%3A2020-10-30..2020-12-11&type=Issues) | [@slemonide](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aslemonide+updated%3A2020-10-30..2020-12-11&type=Issues) | [@support](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Asupport+updated%3A2020-10-30..2020-12-11&type=Issues) | [@tlvu](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Atlvu+updated%3A2020-10-30..2020-12-11&type=Issues) | [@welcome](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Awelcome+updated%3A2020-10-30..2020-12-11&type=Issues) | [@yuvipanda](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ayuvipanda+updated%3A2020-10-30..2020-12-11&type=Issues)
+
 
 ## 1.2
 
 ### [1.2.2] 2020-11-27
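For illustration only (an editor's sketch, not part of the diff): the `?state=` filter described in the highlights above is an ordinary query parameter on the users list endpoint. The hub URL and token below are placeholders.

```python
# Sketch: list only users with active servers via the new ?state= filter.
import requests

api_url = "http://127.0.0.1:8081/hub/api"   # placeholder hub API URL
token = "<admin-api-token>"                 # placeholder admin token

r = requests.get(
    f"{api_url}/users",
    params={"state": "active"},  # also accepts "inactive" or "ready"
    headers={"Authorization": f"token {token}"},
)
r.raise_for_status()
print([user["name"] for user in r.json()])
```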
@@ -911,7 +967,9 @@ Fix removal of `/login` page in 0.4.0, breaking some OAuth providers.
 First preview release
 
 
-[Unreleased]: https://github.com/jupyterhub/jupyterhub/compare/1.2.1...HEAD
+[Unreleased]: https://github.com/jupyterhub/jupyterhub/compare/1.3.0...HEAD
+[1.3.0]: https://github.com/jupyterhub/jupyterhub/compare/1.2.1...1.3.0
+[1.2.2]: https://github.com/jupyterhub/jupyterhub/compare/1.2.1...1.2.2
 [1.2.1]: https://github.com/jupyterhub/jupyterhub/compare/1.2.0...1.2.1
 [1.2.0]: https://github.com/jupyterhub/jupyterhub/compare/1.1.0...1.2.0
 [1.1.0]: https://github.com/jupyterhub/jupyterhub/compare/1.0.0...1.1.0
@@ -1,10 +1,7 @@
 Eventlogging and Telemetry
 ==========================
 
-JupyterHub can be configured to record structured events from a running server using Jupyter's `Telemetry System`_. The types of events that JupyterHub emits are defined by `JSON schemas`_ listed below_
-
-emitted as JSON data, defined and validated by the JSON schemas listed below.
-
+JupyterHub can be configured to record structured events from a running server using Jupyter's `Telemetry System`_. The types of events that JupyterHub emits are defined by `JSON schemas`_ listed at the bottom of this page_.
 
 .. _logging: https://docs.python.org/3/library/logging.html
 .. _`Telemetry System`: https://github.com/jupyter/telemetry
@@ -38,13 +35,12 @@ Here's a basic example:
 The output is a file, ``"event.log"``, with events recorded as JSON data.
 
 
-.. _below:
+.. _page:
 
 Event schemas
 -------------
 
 .. toctree::
    :maxdepth: 2
 
    server-actions.rst
@@ -86,6 +86,7 @@ server {
         proxy_http_version 1.1;
         proxy_set_header Upgrade $http_upgrade;
         proxy_set_header Connection $connection_upgrade;
+        proxy_set_header X-Scheme $scheme;
 
         proxy_buffering off;
     }
@@ -179,3 +179,13 @@ The number of named servers per user can be limited by setting
 ```python
 c.JupyterHub.named_server_limit_per_user = 5
 ```
+
+## Switching to Jupyter Server
+
+[Jupyter Server](https://jupyter-server.readthedocs.io/en/latest/) is a new Tornado Server backend for Jupyter web applications (e.g. JupyterLab 3.0 uses this package as its default backend).
+
+By default, the single-user notebook server uses the (old) `NotebookApp` from the [notebook](https://github.com/jupyter/notebook) package. You can switch to using Jupyter Server's `ServerApp` backend (this will likely become the default in future releases) by setting the `JUPYTERHUB_SINGLEUSER_APP` environment variable to:
+
+```bash
+export JUPYTERHUB_SINGLEUSER_APP='jupyter_server.serverapp.ServerApp'
+```
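As an aside (an editor's sketch, not part of the diff): the same switch can be applied from `jupyterhub_config.py` by injecting the variable into every spawner's environment, assuming the `jupyter_server` package is installed in the single-user environment.

```python
# Sketch: set the env variable for all spawned servers from jupyterhub_config.py.
c.Spawner.environment = {
    "JUPYTERHUB_SINGLEUSER_APP": "jupyter_server.serverapp.ServerApp",
}
```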
@@ -91,9 +91,9 @@ This example would be configured as follows in `jupyterhub_config.py`:
 ```python
 c.JupyterHub.services = [
     {
-        'name': 'cull-idle',
+        'name': 'idle-culler',
         'admin': True,
-        'command': [sys.executable, '/path/to/cull-idle.py', '--timeout']
+        'command': [sys.executable, '-m', 'jupyterhub_idle_culler', '--timeout=3600']
     }
 ]
 ```
@@ -123,15 +123,14 @@ For the previous 'cull idle' Service example, these environment variables
 would be passed to the Service when the Hub starts the 'cull idle' Service:
 
 ```bash
-JUPYTERHUB_SERVICE_NAME: 'cull-idle'
+JUPYTERHUB_SERVICE_NAME: 'idle-culler'
 JUPYTERHUB_API_TOKEN: API token assigned to the service
 JUPYTERHUB_API_URL: http://127.0.0.1:8080/hub/api
 JUPYTERHUB_BASE_URL: https://mydomain[:port]
-JUPYTERHUB_SERVICE_PREFIX: /services/cull-idle/
+JUPYTERHUB_SERVICE_PREFIX: /services/idle-culler/
 ```
 
-See the JupyterHub GitHub repo for additional information about the
-[`cull-idle` example](https://github.com/jupyterhub/jupyterhub/tree/master/examples/cull-idle).
+See the GitHub repo for additional information about the [jupyterhub_idle_culler][].
 
 ## Externally-Managed Services
 
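For context (an editor's sketch, not part of the diff): a hub-managed service typically reads these variables at startup. The minimal whoami-style handler below assumes the `HubAuthenticated` mixin referenced later in this document and a port chosen purely for illustration.

```python
# Sketch of a minimal hub-managed service using the environment above.
import json
import os

from tornado import ioloop, web
from jupyterhub.services.auth import HubAuthenticated


class WhoAmIHandler(HubAuthenticated, web.RequestHandler):
    @web.authenticated
    def get(self):
        # the authenticated hub user (or service) model
        self.write(json.dumps(self.get_current_user()))


prefix = os.environ["JUPYTERHUB_SERVICE_PREFIX"]  # e.g. /services/idle-culler/
app = web.Application([(prefix + "whoami", WhoAmIHandler)])
app.listen(10101)  # port chosen for illustration
ioloop.IOLoop.current().start()
```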
@@ -340,7 +339,7 @@ and taking note of the following process:
 
 ```python
 r = requests.get(
-    '/'.join((["http://127.0.0.1:8081/hub/api",
+    '/'.join(["http://127.0.0.1:8081/hub/api",
         "authorizations/cookie/jupyterhub-services",
         quote(encrypted_cookie, safe=''),
     ]),
@@ -376,3 +375,4 @@ section on securing the notebook viewer.
 [HubAuth.user_for_token]: ../api/services.auth.html#jupyterhub.services.auth.HubAuth.user_for_token
 [HubAuthenticated]: ../api/services.auth.html#jupyterhub.services.auth.HubAuthenticated
 [nbviewer example]: https://github.com/jupyter/nbviewer#securing-the-notebook-viewer
+[jupyterhub_idle_culler]: https://github.com/jupyterhub/jupyterhub-idle-culler
@@ -4,7 +4,7 @@
 
 version_info = (
     1,
-    3,
+    4,
     0,
     "", # release (b1, rc1, or "" for final or dev)
     "dev", # dev or nothing for beta/rc/stable releases
@@ -37,7 +37,11 @@ class SelfAPIHandler(APIHandler):
         user = self.get_current_user_oauth_token()
         if user is None:
             raise web.HTTPError(403)
-        self.write(json.dumps(self.user_model(user)))
+        if isinstance(user, orm.Service):
+            model = self.service_model(user)
+        else:
+            model = self.user_model(user)
+        self.write(json.dumps(model))
 
 
 class UserListAPIHandler(APIHandler):
@@ -240,6 +244,13 @@ class UserAPIHandler(APIHandler):
         )
 
         await maybe_future(self.authenticator.delete_user(user))
+
+        # allow the spawner to cleanup any persistent resources associated with the user
+        try:
+            await user.spawner.delete_forever()
+        except Exception as e:
+            self.log.error("Error cleaning up persistent resources: %s" % e)
+
         # remove from registry
         self.users.delete(user)
 
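To illustrate the `SelfAPIHandler` change above (an editor's sketch, not part of the diff): a service can now identify itself against the same endpoint users call, using its own API token from the environment the hub provides.

```python
# Sketch: a service asking the hub "who am I?" with its own token.
import os
import requests

api_url = os.environ.get("JUPYTERHUB_API_URL", "http://127.0.0.1:8081/hub/api")
r = requests.get(
    api_url + "/user",
    headers={"Authorization": f"token {os.environ['JUPYTERHUB_API_TOKEN']}"},
)
r.raise_for_status()
print(r.json()["kind"])  # "service" when authenticated with a service token
```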
@@ -29,6 +29,14 @@ from urllib.parse import urlunparse
 if sys.version_info[:2] < (3, 3):
     raise ValueError("Python < 3.3 not supported: %s" % sys.version)
 
+# For compatibility with python versions 3.6 or earlier.
+# asyncio.Task.all_tasks() is fully moved to asyncio.all_tasks() starting with 3.9. Also applies to current_task.
+try:
+    asyncio_all_tasks = asyncio.all_tasks
+    asyncio_current_task = asyncio.current_task
+except AttributeError as e:
+    asyncio_all_tasks = asyncio.Task.all_tasks
+    asyncio_current_task = asyncio.Task.current_task
+
 from dateutil.parser import parse as parse_date
 from jinja2 import Environment, FileSystemLoader, PrefixLoader, ChoiceLoader
@@ -392,7 +400,8 @@ class JupyterHub(Application):
         300, help="Interval (in seconds) at which to update last-activity timestamps."
     ).tag(config=True)
     proxy_check_interval = Integer(
-        30, help="Interval (in seconds) at which to check if the proxy is running."
+        5,
+        help="DEPRECATED since version 0.8: Use ConfigurableHTTPProxy.check_running_interval",
     ).tag(config=True)
     service_check_interval = Integer(
         60,
@@ -706,6 +715,7 @@ class JupyterHub(Application):
     ).tag(config=True)
 
     _proxy_config_map = {
+        'proxy_check_interval': 'check_running_interval',
         'proxy_cmd': 'command',
         'debug_proxy': 'debug',
         'proxy_auth_token': 'auth_token',
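In practice (an editor's illustration, not part of the diff), the new `_proxy_config_map` entry forwards the deprecated hub-level option to the proxy class, so new configuration should target `ConfigurableHTTPProxy` directly.

```python
# Sketch: preferred spelling in jupyterhub_config.py after this change.
c.ConfigurableHTTPProxy.check_running_interval = 30
# the old c.JupyterHub.proxy_check_interval still works, but is deprecated
```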
@@ -864,15 +874,30 @@ class JupyterHub(Application):
         to reduce the cost of checking authentication tokens.
         """,
     ).tag(config=True)
-    cookie_secret = Bytes(
+    cookie_secret = Union(
+        [Bytes(), Unicode()],
         help="""The cookie secret to use to encrypt cookies.
 
         Loaded from the JPY_COOKIE_SECRET env variable by default.
 
         Should be exactly 256 bits (32 bytes).
-        """
+        """,
     ).tag(config=True, env='JPY_COOKIE_SECRET')
 
+    @validate('cookie_secret')
+    def _validate_secret_key(self, proposal):
+        """Coerces strings with even number of hexadecimal characters to bytes."""
+        r = proposal['value']
+        if isinstance(r, str):
+            try:
+                return bytes.fromhex(r)
+            except ValueError:
+                raise ValueError(
+                    "cookie_secret set as a string must contain an even amount of hexadecimal characters."
+                )
+        else:
+            return r
+
     @observe('cookie_secret')
     def _cookie_secret_check(self, change):
         secret = change.new
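Illustration (not part of the diff) of what the new validator accepts: a hex string in config is coerced to the equivalent bytes, mirroring the test added further down.

```python
# Sketch: both settings end up as the same bytes value.
c.JupyterHub.cookie_secret = "abc123"                 # coerced via bytes.fromhex
c.JupyterHub.cookie_secret = bytes.fromhex("abc123")  # b'\xab\xc1#'
```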
@@ -2884,9 +2909,7 @@ class JupyterHub(Application):
     async def shutdown_cancel_tasks(self, sig):
         """Cancel all other tasks of the event loop and initiate cleanup"""
         self.log.critical("Received signal %s, initiating shutdown...", sig.name)
-        tasks = [
-            t for t in asyncio.Task.all_tasks() if t is not asyncio.Task.current_task()
-        ]
+        tasks = [t for t in asyncio_all_tasks() if t is not asyncio_current_task()]
 
         if tasks:
             self.log.debug("Cancelling pending tasks")
@@ -2899,7 +2922,7 @@ class JupyterHub(Application):
         except StopAsyncIteration as e:
             self.log.error("Caught StopAsyncIteration Exception", exc_info=True)
 
-        tasks = [t for t in asyncio.Task.all_tasks()]
+        tasks = [t for t in asyncio_all_tasks()]
         for t in tasks:
             self.log.debug("Task status: %s", t)
         await self.cleanup()
@@ -498,6 +498,11 @@ class BaseHandler(RequestHandler):
             path=url_path_join(self.base_url, 'services'),
             **kwargs,
         )
+        # clear tornado cookie
+        self.clear_cookie(
+            '_xsrf',
+            **self.settings.get('xsrf_cookie_kwargs', {}),
+        )
         # Reset _jupyterhub_user
         self._jupyterhub_user = None
 
@@ -1192,8 +1197,8 @@ class BaseHandler(RequestHandler):
         """
         Render jinja2 template
 
-        If sync is set to True, we return an awaitable
-        If sync is set to False, we render the template & return a string
+        If sync is set to True, we render the template & return a string
+        If sync is set to False, we return an awaitable
         """
         template_ns = {}
         template_ns.update(self.template_namespace)
@@ -459,7 +459,8 @@ class AdminHandler(BaseHandler):
     @needs_scope('admin:users')
     @needs_scope('admin:users:servers')
     async def get(self):
-        page, per_page, offset = Pagination(config=self.config).get_page_args(self)
+        pagination = Pagination(url=self.request.uri, config=self.config)
+        page, per_page, offset = pagination.get_page_args(self)
 
         available = {'name', 'admin', 'running', 'last_activity'}
         default_sort = ['admin', 'name']
@@ -502,27 +503,24 @@ class AdminHandler(BaseHandler):
         # get User.col.desc() order objects
         ordered = [getattr(c, o)() for c, o in zip(cols, orders)]
 
+        query = self.db.query(orm.User).outerjoin(orm.Spawner).distinct(orm.User.id)
+        subquery = query.subquery("users")
         users = (
             self.db.query(orm.User)
+            .select_entity_from(subquery)
             .outerjoin(orm.Spawner)
             .order_by(*ordered)
             .limit(per_page)
             .offset(offset)
         )
 
         users = [self._user_from_orm(u) for u in users]
 
         running = []
         for u in users:
             running.extend(s for s in u.spawners.values() if s.active)
 
-        total = self.db.query(orm.User.id).count()
-        pagination = Pagination(
-            url=self.request.uri,
-            total=total,
-            page=page,
-            per_page=per_page,
-            config=self.config,
-        )
+        pagination.total = query.count()
 
         auth_state = await self.current_user.get_auth_state()
         html = await self.render_template(
@@ -2,7 +2,9 @@
 # Copyright (c) Jupyter Development Team.
 # Distributed under the terms of the Modified BSD License.
 import json
+import logging
 import traceback
+from functools import partial
 from http.cookies import SimpleCookie
 from urllib.parse import urlparse
 from urllib.parse import urlunparse
@@ -132,19 +134,25 @@ def log_request(handler):
         status < 300 and isinstance(handler, (StaticFileHandler, HealthCheckHandler))
     ):
         # static-file success and 304 Found are debug-level
-        log_method = access_log.debug
+        log_level = logging.DEBUG
     elif status < 400:
-        log_method = access_log.info
+        log_level = logging.INFO
     elif status < 500:
-        log_method = access_log.warning
+        log_level = logging.WARNING
     else:
-        log_method = access_log.error
+        log_level = logging.ERROR
 
     uri = _scrub_uri(request.uri)
     headers = _scrub_headers(request.headers)
 
     request_time = 1000.0 * handler.request.request_time()
+
+    # always log slow responses (longer than 1s) at least info-level
+    if request_time >= 1000 and log_level < logging.INFO:
+        log_level = logging.INFO
+
+    log_method = partial(access_log.log, log_level)
+
     try:
         user = handler.current_user
     except (HTTPError, RuntimeError):
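Illustration (an editor's sketch, not part of the diff): `logging.Logger.log` takes the level as its first positional argument, so `functools.partial` pins the level chosen above while keeping the old `log_method(msg, *args)` call sites unchanged.

```python
# Sketch of the idiom used above.
import logging
from functools import partial

access_log = logging.getLogger("tornado.access")
log_method = partial(access_log.log, logging.INFO)
log_method("200 GET /hub/api/users (user@host) 12.34ms")  # logged at INFO level
```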
@@ -3,9 +3,9 @@ Prometheus metrics exported by JupyterHub
 
 Read https://prometheus.io/docs/practices/naming/ for naming
 conventions for metrics & labels. We generally prefer naming them
-`<noun>_<verb>_<type_suffix>`. So a histogram that's tracking
+`jupyterhub_<noun>_<verb>_<type_suffix>`. So a histogram that's tracking
 the duration (in seconds) of servers spawning would be called
-SERVER_SPAWN_DURATION_SECONDS.
+jupyterhub_server_spawn_duration_seconds.
 
 We also create an Enum for each 'status' type label in every metric
 we collect. This is to make sure that the metrics exist regardless
@@ -14,6 +14,10 @@ create them, the metric spawn_duration_seconds{status="failure"}
 will not actually exist until the first failure. This makes dashboarding
 and alerting difficult, so we explicitly list statuses and create
 them manually here.
+
+.. versionchanged:: 1.3
+
+    added ``jupyterhub_`` prefix to metric names.
 """
 from enum import Enum
 
@@ -21,13 +25,13 @@ from prometheus_client import Gauge
 from prometheus_client import Histogram
 
 REQUEST_DURATION_SECONDS = Histogram(
-    'request_duration_seconds',
+    'jupyterhub_request_duration_seconds',
     'request duration for all HTTP requests',
     ['method', 'handler', 'code'],
 )
 
 SERVER_SPAWN_DURATION_SECONDS = Histogram(
-    'server_spawn_duration_seconds',
+    'jupyterhub_server_spawn_duration_seconds',
     'time taken for server spawning operation',
     ['status'],
     # Use custom bucket sizes, since the default bucket ranges
@@ -36,25 +40,27 @@ SERVER_SPAWN_DURATION_SECONDS = Histogram(
 )
 
 RUNNING_SERVERS = Gauge(
-    'running_servers', 'the number of user servers currently running'
+    'jupyterhub_running_servers', 'the number of user servers currently running'
 )
 
-TOTAL_USERS = Gauge('total_users', 'total number of users')
+TOTAL_USERS = Gauge('jupyterhub_total_users', 'total number of users')
 
 CHECK_ROUTES_DURATION_SECONDS = Histogram(
-    'check_routes_duration_seconds', 'Time taken to validate all routes in proxy'
+    'jupyterhub_check_routes_duration_seconds',
+    'Time taken to validate all routes in proxy',
 )
 
 HUB_STARTUP_DURATION_SECONDS = Histogram(
-    'hub_startup_duration_seconds', 'Time taken for Hub to start'
+    'jupyterhub_hub_startup_duration_seconds', 'Time taken for Hub to start'
 )
 
 INIT_SPAWNERS_DURATION_SECONDS = Histogram(
-    'init_spawners_duration_seconds', 'Time taken for spawners to initialize'
+    'jupyterhub_init_spawners_duration_seconds', 'Time taken for spawners to initialize'
 )
 
 PROXY_POLL_DURATION_SECONDS = Histogram(
-    'proxy_poll_duration_seconds', 'duration for polling all routes from proxy'
+    'jupyterhub_proxy_poll_duration_seconds',
+    'duration for polling all routes from proxy',
 )
 
 
@@ -79,7 +85,9 @@ for s in ServerSpawnStatus:
 
 
 PROXY_ADD_DURATION_SECONDS = Histogram(
-    'proxy_add_duration_seconds', 'duration for adding user routes to proxy', ['status']
+    'jupyterhub_proxy_add_duration_seconds',
+    'duration for adding user routes to proxy',
+    ['status'],
 )
 
 
@@ -100,7 +108,7 @@ for s in ProxyAddStatus:
 
 
 SERVER_POLL_DURATION_SECONDS = Histogram(
-    'server_poll_duration_seconds',
+    'jupyterhub_server_poll_duration_seconds',
     'time taken to poll if server is running',
     ['status'],
 )
@@ -127,7 +135,9 @@ for s in ServerPollStatus:
 
 
 SERVER_STOP_DURATION_SECONDS = Histogram(
-    'server_stop_seconds', 'time taken for server stopping operation', ['status']
+    'jupyterhub_server_stop_seconds',
+    'time taken for server stopping operation',
+    ['status'],
 )
 
 
@@ -148,7 +158,7 @@ for s in ServerStopStatus:
 
 
 PROXY_DELETE_DURATION_SECONDS = Histogram(
-    'proxy_delete_duration_seconds',
+    'jupyterhub_proxy_delete_duration_seconds',
     'duration for deleting user routes from proxy',
     ['status'],
 )
@@ -400,7 +400,7 @@ class Expiring:
     which should be unix timestamp or datetime object
     """
 
-    now = utcnow # funciton, must return float timestamp or datetime
+    now = utcnow # function, must return float timestamp or datetime
     expires_at = None # must be defined
 
     @property
@@ -1,7 +1,6 @@
 """Basic class to manage pagination utils."""
 # Copyright (c) Jupyter Development Team.
 # Distributed under the terms of the Modified BSD License.
-from traitlets import Bool
 from traitlets import default
 from traitlets import Integer
 from traitlets import observe
@@ -81,13 +80,13 @@ class Pagination(Configurable):
         try:
             self.per_page = int(per_page)
         except Exception:
-            self.per_page = self._default_per_page
+            self.per_page = self.default_per_page
 
         try:
             self.page = int(page)
             if self.page < 1:
                 self.page = 1
-        except:
+        except Exception:
             self.page = 1
 
         return self.page, self.per_page, self.per_page * (self.page - 1)
@@ -450,7 +450,11 @@ class ConfigurableHTTPProxy(Proxy):
         Loaded from the CONFIGPROXY_AUTH_TOKEN env variable by default.
         """
     ).tag(config=True)
-    check_running_interval = Integer(5, config=True)
+    check_running_interval = Integer(
+        5,
+        help="Interval (in seconds) at which to check if the proxy is running.",
+        config=True,
+    )
 
     @default('auth_token')
     def _auth_token_default(self):
@@ -353,8 +353,9 @@ class Spawner(LoggingConfigurable):
 
         return options_form
 
-    def options_from_form(self, form_data):
-        """Interpret HTTP form data
+    options_from_form = Callable(
+        help="""
+        Interpret HTTP form data
 
         Form data will always arrive as a dict of lists of strings.
         Override this function to understand single-values, numbers, etc.
@@ -378,7 +379,14 @@ class Spawner(LoggingConfigurable):
         (with additional support for bytes in case of uploaded file data),
         and any non-bytes non-jsonable values will be replaced with None
         if the user_options are re-used.
-        """
+        """,
+    ).tag(config=True)
+
+    @default("options_from_form")
+    def _options_from_form(self):
+        return self._default_options_from_form
+
+    def _default_options_from_form(self, form_data):
         return form_data
 
     def options_from_query(self, query_data):
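Illustration (an editor's sketch, not part of the diff) of what making `options_from_form` a configurable `Callable` enables: a deployment can supply a parser from `jupyterhub_config.py` instead of subclassing the Spawner. The call signature assumed here is that a configured callable, like the default implementation, receives only the form data dict.

```python
# Sketch: a custom form parser configured without subclassing the Spawner.
def parse_spawn_form(form_data):
    # form data arrives as a dict of lists of strings
    return {"image": form_data.get("image", ["default-image"])[0]}

c.Spawner.options_from_form = parse_spawn_form
```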
@@ -95,6 +95,16 @@ class MockSpawner(SimpleLocalProcessSpawner):
     def _cmd_default(self):
         return [sys.executable, '-m', 'jupyterhub.tests.mocksu']
 
+    async def delete_forever(self):
+        """Called when a user is deleted.
+
+        This can do things like request removal of resources such as persistent storage.
+        Only called on stopped spawners, and is likely the last action ever taken for the user.
+
+        Will only be called once on the user's default Spawner.
+        """
+        pass
+
     use_this_api_token = None
 
     def start(self):
@@ -296,6 +296,17 @@ async def test_get_self(app):
     assert r.status_code == 403
 
 
+async def test_get_self_service(app, mockservice):
+    r = await api_request(
+        app, "user", headers={"Authorization": f"token {mockservice.api_token}"}
+    )
+    r.raise_for_status()
+    service_info = r.json()
+
+    assert service_info['kind'] == 'service'
+    assert service_info['name'] == mockservice.name
+
+
 @mark.user
 @mark.role
 async def test_add_user(app):
@@ -199,6 +199,18 @@ def test_cookie_secret_env(tmpdir, request):
     assert not os.path.exists(hub.cookie_secret_file)
 
 
+def test_cookie_secret_string_():
+    cfg = Config()
+
+    cfg.JupyterHub.cookie_secret = "not hex"
+    with pytest.raises(ValueError):
+        JupyterHub(config=cfg)
+
+    cfg.JupyterHub.cookie_secret = "abc123"
+    app = JupyterHub(config=cfg)
+    assert app.cookie_secret == binascii.a2b_hex('abc123')
+
+
 async def test_load_groups(tmpdir, request):
     to_load = {
         'blue': ['cyclops', 'rogue', 'wolverine'],
@@ -6,11 +6,20 @@ from traitlets.config import Config
 from jupyterhub.pagination import Pagination
 
 
-def test_per_page_bounds():
+@mark.parametrize(
+    "per_page, max_per_page, expected",
+    [
+        (20, 10, 10),
+        (1000, 1000, 1000),
+    ],
+)
+def test_per_page_bounds(per_page, max_per_page, expected):
     cfg = Config()
-    cfg.Pagination.max_per_page = 10
-    p = Pagination(config=cfg, per_page=20, total=100)
-    assert p.per_page == 10
+    cfg.Pagination.max_per_page = max_per_page
+    p = Pagination(config=cfg)
+    p.per_page = per_page
+    p.total = 99999
+    assert p.per_page == expected
     with raises(Exception):
         p.per_page = 0
@@ -39,7 +48,5 @@ def test_per_page_bounds():
     ],
 )
 def test_window(page, per_page, total, expected):
-    cfg = Config()
-    cfg.Pagination
     pagination = Pagination(page=page, per_page=per_page, total=total)
     assert pagination.calculate_pages_window() == expected
@@ -124,7 +124,7 @@ def auth_header(db, name):
     """Return header with user's API authorization token."""
     user = find_user(db, name)
     if user is None:
-        user = add_user(db, name=name)
+        raise KeyError(f"No such user: {name}")
     token = user.new_api_token()
     return {'Authorization': 'token %s' % token}
 
@@ -28,6 +28,15 @@ from tornado.httpclient import HTTPError
 from tornado.log import app_log
 from tornado.platform.asyncio import to_asyncio_future
 
+# For compatibility with python versions 3.6 or earlier.
+# asyncio.Task.all_tasks() is fully moved to asyncio.all_tasks() starting with 3.9. Also applies to current_task.
+try:
+    asyncio_all_tasks = asyncio.all_tasks
+    asyncio_current_task = asyncio.current_task
+except AttributeError as e:
+    asyncio_all_tasks = asyncio.Task.all_tasks
+    asyncio_current_task = asyncio.Task.current_task
+
 
 def random_port():
     """Get a single random port."""
@@ -480,7 +489,7 @@ def print_stacks(file=sys.stderr):
     # also show asyncio tasks, if any
     # this will increase over time as we transition from tornado
     # coroutines to native `async def`
-    tasks = asyncio.Task.all_tasks()
+    tasks = asyncio_all_tasks()
     if tasks:
         print("AsyncIO tasks: %i" % len(tasks))
         for task in tasks:
@@ -1,8 +1,8 @@
-alembic
+alembic>=1.4
 async_generator>=1.9
 certipy>=0.1.2
 entrypoints
-jinja2
+jinja2>=2.11.0
 jupyter_telemetry>=0.1.0
 oauthlib>=3.0
 pamela; sys_platform != 'win32'
@@ -68,6 +68,18 @@
           <i class="fa fa-spinner"></i>
         </div>
       </div>
+
+      {% block login_terms %}
+      {% if login_term_url %}
+      <div id="login_terms" class="login_terms">
+        <input type="checkbox" id="login_terms_checkbox" name="login_terms_checkbox" required />
+        {% block login_terms_text %} {# allow overriding the text #}
+        By logging into the platform you accept the <a href="{{ login_term_url }}">terms and conditions</a>.
+        {% endblock login_terms_text %}
+      </div>
+      {% endif %}
+      {% endblock login_terms %}
+
     </div>
   </form>
 {% endif %}
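Illustration (an editor's sketch, not part of the diff): the `login_term_url` variable used by this template block would typically be supplied through the hub's template variables; the URL below is a placeholder.

```python
# Sketch: expose login_term_url to the login template from jupyterhub_config.py.
c.JupyterHub.template_vars = {
    "login_term_url": "https://example.org/terms-of-service",
}
```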