Compare commits


20 Commits
3.0.0 ... 2.3.0

Author SHA1 Message Date
Min RK
a3c93088a8 Bump to 2.3.0 2022-05-06 16:05:34 +02:00
Min RK
834229622d Merge pull request #3887 from minrk/2.3-backports
2.3 backports
2022-05-06 16:05:10 +02:00
Min RK
44a1ea42de One more in the changelog 2022-05-06 15:56:13 +02:00
Simon Li
3879a96b67 Backport PR #3886: Cleanup everything on API shutdown
`app.stop` triggers full cleanup and stopping of the event loop

closes #3881

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-06 15:55:00 +02:00
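As an aside, a minimal generic sketch of the pattern described in this message (a hypothetical `MiniApp`, not JupyterHub's actual implementation): a `stop()` entry point that awaits full cleanup and then lets the event loop exit.

```python
import asyncio


class MiniApp:
    """Hypothetical app: stop() runs full cleanup, then lets the loop exit."""

    async def cleanup(self):
        # Release resources here (proxy, spawners, db connections, ...).
        await asyncio.sleep(0)
        print("cleanup done")

    async def stop(self):
        await self.cleanup()
        self._stopped.set()  # unblocks run(), so the event loop can exit

    async def run(self):
        self._stopped = asyncio.Event()
        # Simulate a shutdown API request arriving shortly after startup.
        loop = asyncio.get_running_loop()
        loop.call_later(0.1, lambda: asyncio.ensure_future(self.stop()))
        await self._stopped.wait()


asyncio.run(MiniApp().run())
```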
Min RK
d40627d397 changelog for 2.3 2022-05-05 13:24:00 +02:00
Min RK
057cdbc9e9 pre-commit autoupdate 2022-05-05 13:23:52 +02:00
Min RK
75390d2e46 Backport PR #3882: Use log.exception when logging exceptions
This provides the stack trace in the log file, incredibly
useful when debugging

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:28 +02:00
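For context, a minimal standalone sketch using the standard `logging` module (not JupyterHub's own logger) of what `log.exception` adds over `log.error`:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("example")

try:
    1 / 0
except ZeroDivisionError:
    # log.error records only the message...
    log.error("division failed")
    # ...while log.exception also records the full traceback,
    # which is what makes the log file useful when debugging.
    log.exception("division failed")
```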
Min RK
f5e4846cfa Backport PR #3874: Missing f prefix on f-strings fix
Fixes #3873

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:27 +02:00
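The class of bug being fixed, illustrated with a trivial example (not the actual JupyterHub code):

```python
name = "alice"

# Missing f prefix: the braces are kept literally instead of interpolated.
print("Hello, {name}!")   # -> Hello, {name}!

# With the f prefix, the expression is evaluated as intended.
print(f"Hello, {name}!")  # -> Hello, alice!
```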
Georgiana Elena
3dc115a829 Backport PR #3876: don't confuse :// in next_url query params for a redirect hostname
closes #3014

These query params should be url-encoded (https://github.com/jupyterhub/nbgitpuller/issues/118), but we still shouldn't be making the wrong assumptions about when a hostname is specified

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:25 +02:00
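A rough illustration of the failure mode with a hypothetical `next_url` value (not the actual handler code): a plain substring test for `://` mistakes a URL inside a query parameter for a redirect hostname, while parsing the URL shows it is relative.

```python
from urllib.parse import urlparse

# A relative next_url whose query string happens to contain "://".
next_url = "/hub/user-redirect/git-pull?repo=https://github.com/org/repo"

# Naive check: wrongly concludes the redirect target names another host.
print("://" in next_url)          # True  -> false positive

# Parsing shows the URL itself carries no hostname at all.
print(urlparse(next_url).netloc)  # ''    -> no redirect hostname specified
```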
Min RK
af4ddbfc58 Backport PR #3867: ci: update black configuration
Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:24 +02:00
Min RK
50a4d1e34d Backport PR #3863: [Bug Fix] Search bar disabled on admin dashboard
I originally had `defaultValue` here and I changed it not realizing this would break/disable the input.

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:23 +02:00
Erik Sundell
86a238334c Backport PR #3862: Fix typo in [rest api] link in README.md
Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:22 +02:00
Simon Li
dacb9d1668 Backport PR #3859: Do not store Spawner.ip/port on spawner.server during get_env
we shouldn't mutate db state when getting the environment.

IIRC, this was part of an attempt to get the url via `self.server.bind_url` that didn't end up getting used in #3381. So this doesn't really have any positive effects, but it _can_ have negative effects if `get_env` is called in unusual circumstances (jupyterhub/batchspawner#236)

closes jupyterhub/batchspawner#236

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:21 +02:00
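A generic sketch of the principle, using hypothetical stand-in classes rather than the real `Spawner`: compute values into the returned environment dict instead of writing them onto persisted state as a side effect.

```python
class FakeServer:
    """Stands in for a persisted (database-backed) server record."""

    ip = ""
    port = 0


class FakeSpawner:
    """Hypothetical spawner-like object."""

    ip = "127.0.0.1"
    port = 8888

    def __init__(self):
        self.server = FakeServer()

    def get_env(self):
        env = {}
        # Build values locally; do NOT assign self.server.ip/port here,
        # because computing the environment should not mutate db state.
        env["EXAMPLE_SERVICE_URL"] = f"http://{self.ip}:{self.port}"
        return env


print(FakeSpawner().get_env())
```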
Min RK
95cc170383 Backport PR #3853: Fix xsrf_cookie_kwargs ValueError
Fixes

`ValueError: too many values to unpack (expected 2)`

Related to code added as a fix for https://github.com/jupyterhub/jupyterhub/issues/3821

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:20 +02:00
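For reference, one generic way this exact error can arise (an illustration only, not necessarily the code path fixed here): unpacking two names from items that don't contain exactly two values, e.g. iterating a dict of cookie kwargs without `.items()`.

```python
xsrf_cookie_kwargs = {"path": "/hub/", "expires_days": 30}

# Iterating the dict directly yields keys (strings), so two-name
# unpacking of the 4-character key "path" fails:
try:
    for key, value in xsrf_cookie_kwargs:
        print(key, value)
except ValueError as e:
    print(e)  # too many values to unpack (expected 2)

# Correct: iterate over key/value pairs explicitly.
for key, value in xsrf_cookie_kwargs.items():
    print(key, value)
```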
Erik Sundell
437a9d150f Backport PR #3849: The word used is duplicated in upgrade.md
This PR updates the doc to remove the duplicated word `used`.

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:19 +02:00
Erik Sundell
c9616d6f11 Backport PR #3843: Some typos in docs
- fix some references to old 'all' name which was renamed 'inherit'
- fix a heading level in changelog that sphinx warns about

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:18 +02:00
Min RK
61aed70c4d Backport PR #3841: adopt pytest-asyncio asyncio_mode='auto'
removes need for our own implementation of the same behavior in conftest

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:17 +02:00
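What `asyncio_mode = "auto"` buys in practice (a minimal sketch; the file and test names are made up): async test functions are collected and awaited without per-test `@pytest.mark.asyncio` markers or custom conftest machinery.

```python
# test_example.py -- assumes pytest-asyncio >= 0.17 is installed and
# asyncio_mode = "auto" is set in the pytest configuration.
import asyncio


async def test_sleep_returns_none():
    # No @pytest.mark.asyncio marker needed in auto mode; the async test
    # is detected and awaited automatically.
    assert await asyncio.sleep(0) is None
```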
Erik Sundell
9abb573d47 Backport PR #3839: Document version mismatch log message
Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:16 +02:00
Erik Sundell
b074304834 Backport PR #3835: remove lingering reference to distutils
traitlets, like most Jupyter projects (and Python itself), has a `.version_info` tuple to avoid needing to parse versions

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:15 +02:00
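The pattern being adopted, in a small sketch (not the exact JupyterHub code): compare the `.version_info` tuple that traitlets exposes instead of parsing version strings with the deprecated `distutils.version`.

```python
import traitlets

# Old approach (deprecated):
#   from distutils.version import LooseVersion
#   if LooseVersion(traitlets.__version__) >= LooseVersion("5.0"): ...

# Preferred: compare the ready-made tuple, no parsing needed.
if traitlets.version_info >= (5, 0):
    print("traitlets 5.x or newer")
```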
Min RK
201e7ca3d8 Backport PR #3834: Admin Dashboard - Collapsible Details View
I made this PR to see if this feature would be useful for other people. Right now, you can't see all of a user's or server's details in the admin page, so I added a collapsible view that lets you see the entire server and user objects. I'm open to ideas about how the information is displayed. Will add more tests if this feature is accepted.

![improved-collapse](https://user-images.githubusercontent.com/737367/158468531-1efc28e6-a229-4383-b5f9-b301898d929f.gif)

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:14 +02:00
156 changed files with 4383 additions and 6371 deletions

View File

@@ -1,15 +0,0 @@
# dependabot.yml reference: https://docs.github.com/en/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file
#
# Notes:
# - Status and logs from dependabot are provided at
# https://github.com/jupyterhub/jupyterhub/network/updates.
#
version: 2
updates:
# Maintain dependencies in our GitHub Workflows
- package-ecosystem: github-actions
directory: "/"
schedule:
interval: weekly
time: "05:00"
timezone: "Etc/UTC"

View File

@@ -32,18 +32,17 @@ jobs:
build-release:
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: "3.9"
python-version: 3.8
- uses: actions/setup-node@v3
- uses: actions/setup-node@v1
with:
node-version: "14"
- name: install build requirements
- name: install build package
run: |
npm install -g yarn
pip install --upgrade pip
pip install build
pip freeze
@@ -53,17 +52,28 @@ jobs:
python -m build --sdist --wheel .
ls -l dist
- name: verify sdist
- name: verify wheel
run: |
./ci/check_sdist.py dist/jupyterhub-*.tar.gz
- name: verify data-files are installed where they are found
run: |
pip install dist/*.whl
./ci/check_installed_data.py
cd dist
pip install ./*.whl
# verify data-files are installed where they are found
cat <<EOF | python
import os
from jupyterhub._data import DATA_FILES_PATH
print(f"DATA_FILES_PATH={DATA_FILES_PATH}")
assert os.path.exists(DATA_FILES_PATH), DATA_FILES_PATH
for subpath in (
"templates/page.html",
"static/css/style.min.css",
"static/components/jquery/dist/jquery.js",
):
path = os.path.join(DATA_FILES_PATH, subpath)
assert os.path.exists(path), path
print("OK")
EOF
# ref: https://github.com/actions/upload-artifact#readme
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v2
with:
name: jupyterhub-${{ github.sha }}
path: "dist/*"
@@ -98,16 +108,16 @@ jobs:
echo "REGISTRY=localhost:5000/" >> $GITHUB_ENV
fi
- uses: actions/checkout@v3
- uses: actions/checkout@v2
# Setup docker to build for multiple platforms, see:
# https://github.com/docker/build-push-action/tree/v2.4.0#usage
# https://github.com/docker/build-push-action/blob/v2.4.0/docs/advanced/multi-platform.md
- name: Set up QEMU (for docker buildx)
uses: docker/setup-qemu-action@8b122486cedac8393e77aa9734c3528886e4a1a8 # associated tag: v1.0.2
uses: docker/setup-qemu-action@25f0500ff22e406f7191a2a8ba8cda16901ca018 # associated tag: v1.0.2
- name: Set up Docker Buildx (for multi-arch builds)
uses: docker/setup-buildx-action@dc7b9719a96d48369863986a06765841d7ea23f6 # associated tag: v1.1.2
uses: docker/setup-buildx-action@2a4b53665e15ce7d7049afb11ff1f70ff1610609 # associated tag: v1.1.2
with:
# Allows pushing to registry on localhost:5000
driver-opts: network=host
@@ -145,7 +155,7 @@ jobs:
branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub
uses: docker/build-push-action@c84f38281176d4c9cdb1626ffafcd6b3911b5d94
uses: docker/build-push-action@e1b7f96249f2e4c8e4ac1519b9608c0d48944a1f
with:
context: .
platforms: linux/amd64,linux/arm64
@@ -166,7 +176,7 @@ jobs:
branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub-onbuild
uses: docker/build-push-action@c84f38281176d4c9cdb1626ffafcd6b3911b5d94
uses: docker/build-push-action@e1b7f96249f2e4c8e4ac1519b9608c0d48944a1f
with:
build-args: |
BASE_IMAGE=${{ fromJson(steps.jupyterhubtags.outputs.tags)[0] }}
@@ -187,7 +197,7 @@ jobs:
branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub-demo
uses: docker/build-push-action@c84f38281176d4c9cdb1626ffafcd6b3911b5d94
uses: docker/build-push-action@e1b7f96249f2e4c8e4ac1519b9608c0d48944a1f
with:
build-args: |
BASE_IMAGE=${{ fromJson(steps.onbuildtags.outputs.tags)[0] }}
@@ -211,7 +221,7 @@ jobs:
branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub/singleuser
uses: docker/build-push-action@c84f38281176d4c9cdb1626ffafcd6b3911b5d94
uses: docker/build-push-action@e1b7f96249f2e4c8e4ac1519b9608c0d48944a1f
with:
build-args: |
JUPYTERHUB_VERSION=${{ github.ref_type == 'tag' && github.ref_name || format('git:{0}', github.sha) }}

View File

@@ -15,13 +15,15 @@ on:
- "docs/**"
- "jupyterhub/_version.py"
- "jupyterhub/scopes.py"
- ".github/workflows/test-docs.yml"
- ".github/workflows/*"
- "!.github/workflows/test-docs.yml"
push:
paths:
- "docs/**"
- "jupyterhub/_version.py"
- "jupyterhub/scopes.py"
- ".github/workflows/test-docs.yml"
- ".github/workflows/*"
- "!.github/workflows/test-docs.yml"
branches-ignore:
- "dependabot/**"
- "pre-commit-ci-update-config"
@@ -38,18 +40,18 @@ jobs:
validate-rest-api-definition:
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v2
- name: Validate REST API definition
uses: char0n/swagger-editor-validate@v1.3.1
uses: char0n/swagger-editor-validate@182d1a5d26ff5c2f4f452c43bd55e2c7d8064003
with:
definition-file: docs/source/_static/rest-api.yml
test-docs:
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: "3.9"

View File

@@ -19,9 +19,6 @@ on:
- "**"
workflow_dispatch:
permissions:
contents: read
jobs:
# The ./jsx folder contains React based source code files that are to compile
# to share/jupyterhub/static/js/admin-react.js. The ./jsx folder includes
@@ -32,8 +29,8 @@ jobs:
timeout-minutes: 5
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
- uses: actions/checkout@v2
- uses: actions/setup-node@v1
with:
node-version: "14"
@@ -50,3 +47,62 @@ jobs:
run: |
cd jsx
yarn test
# The ./jsx folder contains React based source files that are to compile to
# share/jupyterhub/static/js/admin-react.js. This job makes sure that whatever
# we have in jsx/src matches the compiled asset that we package and
# distribute.
#
# This job's purpose is to make sure we don't forget to compile changes and to
# verify nobody sneaks in a change in the hard to review compiled asset.
#
# NOTE: In the future we may want to stop version controlling the compiled
# artifact and instead generate it whenever we package JupyterHub. If we
# do this, we are required to setup node and compile the source code
# more often, at the same time we could avoid having this check be made.
#
compile-jsx-admin-react:
runs-on: ubuntu-20.04
timeout-minutes: 5
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v1
with:
node-version: "14"
- name: Install yarn
run: |
npm install -g yarn
- name: yarn
run: |
cd jsx
yarn
- name: yarn build
run: |
cd jsx
yarn build
- name: yarn place
run: |
cd jsx
yarn place
- name: Verify compiled jsx/src matches version controlled artifact
run: |
if [[ `git status --porcelain=v1` ]]; then
echo "The source code in ./jsx compiles to something different than found in ./share/jupyterhub/static/js/admin-react.js!"
echo
echo "Please re-compile the source code in ./jsx with the following commands:"
echo
echo "yarn"
echo "yarn build"
echo "yarn place"
echo
echo "See ./jsx/README.md for more details."
exit 1
else
echo "Compilation of jsx/src to share/jupyterhub/static/js/admin-react.js didn't lead to changes."
fi

View File

@@ -30,9 +30,6 @@ env:
LANG: C.UTF-8
PYTEST_ADDOPTS: "--verbose --color=yes"
permissions:
contents: read
jobs:
# Run "pytest jupyterhub/tests" in various configurations
pytest:
@@ -56,9 +53,9 @@ jobs:
# Tests everything when JupyterHub works against a dedicated mysql or
# postgresql server.
#
# legacy_notebook:
# nbclassic:
# Tests everything when the user instances are started with
# the legacy notebook server instead of jupyter_server.
# notebook instead of jupyter_server.
#
# ssl:
# Tests everything using internal SSL connections instead of
@@ -71,24 +68,21 @@ jobs:
# NOTE: Since only the value of these parameters are presented in the
# GitHub UI when the workflow run, we avoid using true/false as
# values by instead duplicating the name to signal true.
# Python versions available at:
# https://github.com/actions/python-versions/blob/HEAD/versions-manifest.json
include:
- python: "3.7"
- python: "3.6"
oldest_dependencies: oldest_dependencies
legacy_notebook: legacy_notebook
- python: "3.8"
legacy_notebook: legacy_notebook
- python: "3.9"
db: mysql
- python: "3.10"
db: postgres
- python: "3.10"
nbclassic: nbclassic
- python: "3.6"
subdomain: subdomain
- python: "3.10"
- python: "3.7"
db: mysql
- python: "3.7"
ssl: ssl
- python: "3.11.0-rc.1"
- python: "3.10"
- python: "3.8"
db: postgres
- python: "3.8"
nbclassic: nbclassic
- python: "3.9"
main_dependencies: main_dependencies
steps:
@@ -116,38 +110,28 @@ jobs:
if [ "${{ matrix.jupyter_server }}" != "" ]; then
echo "JUPYTERHUB_SINGLEUSER_APP=jupyterhub.tests.mockserverapp.MockServerApp" >> $GITHUB_ENV
fi
- uses: actions/checkout@v3
# NOTE: actions/setup-node@v3 make use of a cache within the GitHub base
- uses: actions/checkout@v2
# NOTE: actions/setup-node@v1 make use of a cache within the GitHub base
# environment and setup in a fraction of a second.
- name: Install Node v14
uses: actions/setup-node@v3
uses: actions/setup-node@v1
with:
node-version: "14"
- name: Install Javascript dependencies
- name: Install Node dependencies
run: |
npm install
npm install -g configurable-http-proxy yarn
npm install -g configurable-http-proxy
npm list
# NOTE: actions/setup-python@v4 make use of a cache within the GitHub base
# NOTE: actions/setup-python@v2 make use of a cache within the GitHub base
# environment and setup in a fraction of a second.
- name: Install Python ${{ matrix.python }}
uses: actions/setup-python@v4
uses: actions/setup-python@v2
with:
python-version: "${{ matrix.python }}"
python-version: ${{ matrix.python }}
- name: Install Python dependencies
run: |
pip install --upgrade pip
if [[ "${{ matrix.python }}" == "3.11"* ]]; then
# greenlet is not actually required,
# but is an install dependency of sqlalchemy.
# It does not yet install on 3.11
# see: see https://github.com/gevent/gevent/issues/1867
pip install ./ci/mock-greenlet
fi
pip install --upgrade . -r dev-requirements.txt
if [ "${{ matrix.oldest_dependencies }}" != "" ]; then
@@ -161,9 +145,9 @@ jobs:
if [ "${{ matrix.main_dependencies }}" != "" ]; then
pip install git+https://github.com/ipython/traitlets#egg=traitlets --force
fi
if [ "${{ matrix.legacy_notebook }}" != "" ]; then
if [ "${{ matrix.nbclassic }}" != "" ]; then
pip uninstall jupyter_server --yes
pip install 'notebook<7'
pip install notebook
fi
if [ "${{ matrix.db }}" == "mysql" ]; then
pip install mysql-connector-python
@@ -227,7 +211,7 @@ jobs:
timeout-minutes: 20
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v2
- name: build images
run: |

.gitignore vendored
View File

@@ -10,7 +10,6 @@ docs/build
docs/source/_static/rest-api
docs/source/rbac/scope-table.md
.ipynb_checkpoints
jsx/build/
# ignore config file at the top-level of the repo
# but not sub-dirs
/jupyterhub_config.py
@@ -20,7 +19,6 @@ package-lock.json
share/jupyterhub/static/components
share/jupyterhub/static/css/style.min.css
share/jupyterhub/static/css/style.min.css.map
share/jupyterhub/static/js/admin-react.js*
*.egg-info
MANIFEST
.coverage

View File

@@ -11,33 +11,33 @@
repos:
# Autoformat: Python code, syntax patterns are modernized
- repo: https://github.com/asottile/pyupgrade
rev: v2.37.3
rev: v2.32.1
hooks:
- id: pyupgrade
args:
- --py36-plus
# Autoformat: Python code
- repo: https://github.com/pycqa/isort
rev: 5.10.1
- repo: https://github.com/asottile/reorder_python_imports
rev: v3.1.0
hooks:
- id: isort
- id: reorder-python-imports
# Autoformat: Python code
- repo: https://github.com/psf/black
rev: 22.6.0
rev: 22.3.0
hooks:
- id: black
# Autoformat: markdown, yaml, javascript (see the file .prettierignore)
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v2.7.1
rev: v2.6.2
hooks:
- id: prettier
# Autoformat and linting, misc. details
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0
rev: v4.2.0
hooks:
- id: end-of-file-fixer
exclude: share/jupyterhub/static/js/admin-react.js
@@ -47,6 +47,6 @@ repos:
# Linting: Python code (see the file .flake8)
- repo: https://github.com/PyCQA/flake8
rev: "5.0.4"
rev: "4.0.1"
hooks:
- id: flake8

View File

@@ -6,9 +6,134 @@ you can follow the [Jupyter contributor guide](https://jupyter.readthedocs.io/en
Make sure to also follow [Project Jupyter's Code of Conduct](https://github.com/jupyter/governance/blob/HEAD/conduct/code_of_conduct.md)
for a friendly and welcoming collaborative environment.
Please see our documentation on
## Setting up a development environment
- [Setting up a development install](https://jupyterhub.readthedocs.io/en/latest/contributing/setup.html)
- [Testing JupyterHub and linting code](https://jupyterhub.readthedocs.io/en/latest/contributing/tests.html)
<!--
https://jupyterhub.readthedocs.io/en/stable/contributing/setup.html
contains a lot of the same information. Should we merge the docs and
just have this page link to that one?
-->
If you need some help, feel free to ask on [Gitter](https://gitter.im/jupyterhub/jupyterhub) or [Discourse](https://discourse.jupyter.org/).
JupyterHub requires Python >= 3.5 and nodejs.
As a Python project, a development install of JupyterHub follows standard practices for the basics (steps 1-2).
1. clone the repo
```bash
git clone https://github.com/jupyterhub/jupyterhub
```
2. do a development install with pip
```bash
cd jupyterhub
python3 -m pip install --editable .
```
3. install the development requirements,
which include things like testing tools
```bash
python3 -m pip install -r dev-requirements.txt
```
4. install configurable-http-proxy with npm:
```bash
npm install -g configurable-http-proxy
```
5. set up pre-commit hooks for automatic code formatting, etc.
```bash
pre-commit install
```
You can also invoke the pre-commit hook manually at any time with
```bash
pre-commit run
```
## Contributing
JupyterHub has adopted automatic code formatting so you shouldn't
need to worry too much about your code style.
As long as your code is valid,
the pre-commit hook should take care of how it should look.
You can invoke the pre-commit hook by hand at any time with:
```bash
pre-commit run
```
which should run any autoformatting on your code
and tell you about any errors it couldn't fix automatically.
You may also install [black integration](https://github.com/psf/black#editor-integration)
into your text editor to format code automatically.
If you have already committed files before setting up the pre-commit
hook with `pre-commit install`, you can fix everything up using
`pre-commit run --all-files`. You need to make the fixing commit
yourself after that.
## Testing
It's a good idea to write tests to exercise any new features,
or that trigger any bugs that you have fixed to catch regressions.
You can run the tests with:
```bash
pytest -v
```
in the repo directory. If you want to just run certain tests,
check out the [pytest docs](https://pytest.readthedocs.io/en/latest/usage.html)
for how pytest can be called.
For instance, to test only spawner-related things in the REST API:
```bash
pytest -v -k spawn jupyterhub/tests/test_api.py
```
The tests live in `jupyterhub/tests` and are organized roughly into:
1. `test_api.py` tests the REST API
2. `test_pages.py` tests loading the HTML pages
and other collections of tests for different components.
When writing a new test, there should usually be a test of
similar functionality already written and related tests should
be added nearby.
The fixtures live in `jupyterhub/tests/conftest.py`. There are
fixtures that can be used for JupyterHub components, such as:
- `app`: an instance of JupyterHub with mocked parts
- `auth_state_enabled`: enables persisting auth_state (like authentication tokens)
- `db`: a sqlite in-memory DB session
- `io_loop`: a Tornado event loop
- `event_loop`: a new asyncio event loop
- `user`: creates a new temporary user
- `admin_user`: creates a new temporary admin user
- single user servers
- `cleanup_after`: allows cleanup of single user servers between tests
- mocked service
- `MockServiceSpawner`: a spawner that mocks services for testing with a short poll interval
- `mockservice`: mocked service with no external service url
- `mockservice_url`: mocked service with a url to test external services
And fixtures to add functionality or spawning behavior:
- `admin_access`: grants admin access
- `no_patience`: sets slow-spawning timeouts to zero
- `slow_spawn`: enables the SlowSpawner (a spawner that takes a few seconds to start)
- `never_spawn`: enables the NeverSpawner (a spawner that will never start)
- `bad_spawn`: enables the BadSpawner (a spawner that fails immediately)
- `slow_bad_spawn`: enables the SlowBadSpawner (a spawner that fails after a short delay)
To read more about fixtures check out the
[pytest docs](https://docs.pytest.org/en/latest/fixture.html)
for how to use the existing fixtures, and how to create new ones.
When in doubt, feel free to [ask](https://gitter.im/jupyterhub/jupyterhub).

View File

@@ -21,7 +21,7 @@
# your jupyterhub_config.py will be added automatically
# from your docker directory.
ARG BASE_IMAGE=ubuntu:22.04
ARG BASE_IMAGE=ubuntu:focal-20200729
FROM $BASE_IMAGE AS builder
USER root
@@ -37,7 +37,6 @@ RUN apt-get update \
python3-pycurl \
nodejs \
npm \
yarn \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -8,7 +8,6 @@ include *requirements.txt
include Dockerfile
graft onbuild
graft jsx
graft jupyterhub
graft scripts
graft share
@@ -19,10 +18,6 @@ graft ci
graft docs
prune docs/node_modules
# Intermediate javascript files
prune jsx/node_modules
prune jsx/build
# prune some large unused files from components
prune share/jupyterhub/static/components/bootstrap/dist/css
exclude share/jupyterhub/static/components/bootstrap/dist/fonts/*.svg

View File

@@ -1,20 +0,0 @@
#!/usr/bin/env python
# Check that installed package contains everything we expect
import os
from jupyterhub._data import DATA_FILES_PATH
print("Checking jupyterhub._data")
print(f"DATA_FILES_PATH={DATA_FILES_PATH}")
assert os.path.exists(DATA_FILES_PATH), DATA_FILES_PATH
for subpath in (
"templates/page.html",
"static/css/style.min.css",
"static/components/jquery/dist/jquery.js",
"static/js/admin-react.js",
):
path = os.path.join(DATA_FILES_PATH, subpath)
assert os.path.exists(path), path
print("OK")

View File

@@ -1,28 +0,0 @@
#!/usr/bin/env python
# Check that sdist contains everything we expect
import sys
import tarfile
from tarfile import TarFile
expected_files = [
"docs/requirements.txt",
"jsx/package.json",
"package.json",
"README.md",
]
assert len(sys.argv) == 2, "Expected one file"
print(f"Checking {sys.argv[1]}")
tar = tarfile.open(name=sys.argv[1], mode="r:gz")
try:
# Remove leading jupyterhub-VERSION/
filelist = {f.partition('/')[2] for f in tar.getnames()}
finally:
tar.close()
for e in expected_files:
assert e in filelist, f"{e} not found"
print("OK")

View File

@@ -20,7 +20,7 @@ fi
# Configure a set of databases in the database server for upgrade tests
set -x
for SUFFIX in '' _upgrade_100 _upgrade_122 _upgrade_130 _upgrade_150 _upgrade_211; do
for SUFFIX in '' _upgrade_100 _upgrade_122 _upgrade_130; do
$SQL_CLIENT "DROP DATABASE jupyterhub${SUFFIX};" 2>/dev/null || true
$SQL_CLIENT "CREATE DATABASE jupyterhub${SUFFIX} ${EXTRA_CREATE_DATABASE_ARGS:-};"
done

View File

@@ -1,3 +0,0 @@
__version__ = "22.0.0.dev0"
raise ImportError("Don't actually have greenlet")

View File

@@ -1,13 +0,0 @@
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "greenlet"
description = 'Mock greenlet to allow install on 3.11'
requires-python = ">=3.7"
dynamic = ["version"]
[tool.hatch.version]
path = "greenlet.py"

View File

@@ -9,13 +9,10 @@ cryptography
html5lib # needed for beautifulsoup
jupyterlab >=3
mock
# nbclassic provides the '/tree/' handler, which we use in tests
# it is a transitive dependency via jupyterlab,
# but depend on it directly
nbclassic
pre-commit
pytest>=3.3
pytest-asyncio>=0.17
pytest-asyncio; python_version < "3.7"
pytest-asyncio>=0.17; python_version >= "3.7"
pytest-cov
requests-mock
tbump

View File

@@ -6,7 +6,7 @@ info:
description: The REST API for JupyterHub
license:
name: BSD-3-Clause
version: 3.0.0
version: 2.3.0
servers:
- url: /hub/api
security:
@@ -139,16 +139,6 @@ paths:
If unspecified, use api_page_default_limit.
schema:
type: number
- name: include_stopped_servers
in: query
description: |
Include stopped servers in user model(s).
Added in JupyterHub 3.0.
Allows retrieval of information about stopped servers,
such as activity and state fields.
schema:
type: boolean
allowEmptyValue: true
responses:
200:
description: The Hub's user list
@@ -570,19 +560,7 @@ paths:
description: A note attached to the token for future bookkeeping
roles:
type: array
description: |
A list of role names from which to derive scopes.
This is a shortcut for assigning collections of scopes;
Tokens do not retain role assignment.
(Changed in 3.0: roles are immediately resolved to scopes
instead of stored on roles.)
items:
type: string
scopes:
type: array
description: |
A list of scopes that the token should have.
(new in JupyterHub 3.0).
description: A list of role names that the token should have
items:
type: string
required: false
@@ -1170,11 +1148,7 @@ components:
format: date-time
servers:
type: array
description: |
The servers for this user.
By default: only includes _active_ servers.
Changed in 3.0: if `?include_stopped_servers` parameter is specified,
stopped servers will be included as well.
description: The active servers for this user.
items:
$ref: "#/components/schemas/Server"
auth_state:
@@ -1196,15 +1170,6 @@ components:
description: |
Whether the server is ready for traffic.
Will always be false when any transition is pending.
stopped:
type: boolean
description: |
Whether the server is stopped. Added in JupyterHub 3.0,
and only useful when using the `?include_stopped_servers`
request parameter.
Now that stopped servers may be included (since JupyterHub 3.0),
this is the simplest way to select stopped servers.
Always equivalent to `not (ready or pending)`.
pending:
type: string
description: |
@@ -1349,16 +1314,7 @@ components:
description: The service that owns the token (undefined of owned by a user)
roles:
type: array
description:
Deprecated in JupyterHub 3, always an empty list. Tokens have
'scopes' starting from JupyterHub 3.
items:
type: string
scopes:
type: array
description:
List of scopes this token has been assigned. New in JupyterHub
3. In JupyterHub 2.x, tokens were assigned 'roles' insead of scopes.
description: The names of roles this token has
items:
type: string
note:
@@ -1414,9 +1370,6 @@ components:
inherit:
Everything that the token-owning entity can access _(metascope
for tokens)_
admin-ui:
Access the admin page. Permission to take actions via the admin
page granted separately.
admin:users:
Read, write, create and delete users and their authentication
state, not including their servers or tokens.

File diff suppressed because one or more lines are too long

View File

@@ -48,7 +48,7 @@ version = '%i.%i' % jupyterhub.version_info[:2]
# The full version, including alpha/beta/rc tags.
release = jupyterhub.__version__
language = "en"
language = None
exclude_patterns = []
pygments_style = 'sphinx'
todo_include_todos = False
@@ -56,14 +56,12 @@ todo_include_todos = False
# Set the default role so we can use `foo` instead of ``foo``
default_role = 'literal'
from contextlib import redirect_stdout
from io import StringIO
from docutils import nodes
from sphinx.directives.other import SphinxDirective
# -- Config -------------------------------------------------------------
from jupyterhub.app import JupyterHub
from docutils import nodes
from sphinx.directives.other import SphinxDirective
from contextlib import redirect_stdout
from io import StringIO
# create a temp instance of JupyterHub just to get the output of the generate-config
# and help --all commands.

View File

@@ -16,7 +16,7 @@ Install Python
--------------
JupyterHub is written in the `Python <https://python.org>`_ programming language, and
requires you have at least version 3.6 installed locally. If you haven't
requires you have at least version 3.5 installed locally. If you haven't
installed Python before, the recommended way to install it is to use
`miniconda <https://conda.io/miniconda.html>`_. Remember to get the Python 3 version,
and **not** the Python 2 version!
@@ -24,10 +24,11 @@ and **not** the Python 2 version!
Install nodejs
--------------
`NodeJS 12+ <https://nodejs.org/en/>`_ is required for building some JavaScript components.
``configurable-http-proxy``, the default proxy implementation for JupyterHub, is written in Javascript.
If you have not installed nodejs before, we recommend installing it in the ``miniconda`` environment you set up for Python.
You can do so with ``conda install nodejs``.
``configurable-http-proxy``, the default proxy implementation for
JupyterHub, is written in Javascript to run on `NodeJS
<https://nodejs.org/en/>`_. If you have not installed nodejs before, we
recommend installing it in the ``miniconda`` environment you set up for
Python. You can do so with ``conda install nodejs``.
Install git
-----------
@@ -45,7 +46,7 @@ their effects quickly. You need to do a developer install to make that
happen.
.. note:: This guide does not attempt to dictate *how* development
environments should be isolated since that is a personal preference and can
environements should be isolated since that is a personal preference and can
be achieved in many ways, for example `tox`, `conda`, `docker`, etc. See this
`forum thread <https://discourse.jupyter.org/t/thoughts-on-using-tox/3497>`_ for
a more detailed discussion.
@@ -65,7 +66,7 @@ happen.
python -V
This should return a version number greater than or equal to 3.6.
This should return a version number greater than or equal to 3.5.
.. code:: bash
@@ -73,11 +74,12 @@ happen.
This should return a version number greater than or equal to 5.0.
3. Install ``configurable-http-proxy`` (required to run and test the default JupyterHub configuration) and ``yarn`` (required to build some components):
3. Install ``configurable-http-proxy``. This is required to run
JupyterHub.
.. code:: bash
npm install -g configurable-http-proxy yarn
npm install -g configurable-http-proxy
If you get an error that says ``Error: EACCES: permission denied``,
you might need to prefix the command with ``sudo``. If you do not
@@ -85,17 +87,11 @@ happen.
.. code:: bash
npm install configurable-http-proxy yarn
npm install configurable-http-proxy
export PATH=$PATH:$(pwd)/node_modules/.bin
The second line needs to be run every time you open a new terminal.
If you are using conda you can instead run:
.. code:: bash
conda install configurable-http-proxy yarn
4. Install the python packages required for JupyterHub development.
.. code:: bash
@@ -190,4 +186,3 @@ development updates, with:
python3 setup.py js # fetch updated client-side js
python3 setup.py css # recompile CSS from LESS sources
python3 setup.py jsx # build React admin app

View File

@@ -1,8 +1,8 @@
.. _contributing/tests:
===================================
Testing JupyterHub and linting code
===================================
==================
Testing JupyterHub
==================
Unit test help validate that JupyterHub works the way we think it does,
and continues to do so when changes occur. They also help communicate
@@ -57,50 +57,6 @@ Running the tests
pytest -v jupyterhub/tests/test_api.py::test_shutdown
See the `pytest usage documentation <https://pytest.readthedocs.io/en/latest/usage.html>`_ for more details.
Test organisation
=================
The tests live in ``jupyterhub/tests`` and are organized roughly into:
#. ``test_api.py`` tests the REST API
#. ``test_pages.py`` tests loading the HTML pages
and other collections of tests for different components.
When writing a new test, there should usually be a test of
similar functionality already written and related tests should
be added nearby.
The fixtures live in ``jupyterhub/tests/conftest.py``. There are
fixtures that can be used for JupyterHub components, such as:
- ``app``: an instance of JupyterHub with mocked parts
- ``auth_state_enabled``: enables persisting auth_state (like authentication tokens)
- ``db``: a sqlite in-memory DB session
- ``io_loop```: a Tornado event loop
- ``event_loop``: a new asyncio event loop
- ``user``: creates a new temporary user
- ``admin_user``: creates a new temporary admin user
- single user servers
- ``cleanup_after``: allows cleanup of single user servers between tests
- mocked service
- ``MockServiceSpawner``: a spawner that mocks services for testing with a short poll interval
- ``mockservice```: mocked service with no external service url
- ``mockservice_url``: mocked service with a url to test external services
And fixtures to add functionality or spawning behavior:
- ``admin_access``: grants admin access
- ``no_patience```: sets slow-spawning timeouts to zero
- ``slow_spawn``: enables the SlowSpawner (a spawner that takes a few seconds to start)
- ``never_spawn``: enables the NeverSpawner (a spawner that will never start)
- ``bad_spawn``: enables the BadSpawner (a spawner that fails immediately)
- ``slow_bad_spawn``: enables the SlowBadSpawner (a spawner that fails after a short delay)
See the `pytest fixtures documentation <https://pytest.readthedocs.io/en/latest/fixture.html>`_
for how to use the existing fixtures, and how to create new ones.
Troubleshooting Test Failures
=============================
@@ -110,27 +66,3 @@ All the tests are failing
Make sure you have completed all the steps in :ref:`contributing/setup` successfully, and
can launch ``jupyterhub`` from the terminal.
Code formatting and linting
===========================
JupyterHub has adopted automatic code formatting and linting.
As long as your code is valid, the pre-commit hook should take care of how it should look.
You can invoke the pre-commit hook by hand at any time with:
.. code:: bash
pre-commit run
which should run any autoformatting on your code and tell you about any errors it couldn't fix automatically.
You may also install `black integration <https://github.com/psf/black#editor-integration>`_
into your text editor to format code automatically.
If you have already committed files before running pre-commit you can fix everything using:
.. code:: bash
pre-commit run --all-files
And committing the changes.

View File

@@ -183,6 +183,12 @@ itself, ``jupyterhub_config.py``, as a binary string:
c.JupyterHub.cookie_secret = bytes.fromhex('64 CHAR HEX STRING')
.. important::
If the cookie secret value changes for the Hub, all single-user notebook
servers must also be restarted.
.. _cookies:
Cookies used by JupyterHub authentication

Binary file not shown (image, 1.1 MiB).

View File

@@ -1,5 +1,3 @@
(RBAC)=
# JupyterHub RBAC
Role Based Access Control (RBAC) in JupyterHub serves to provide fine grained control of access to Jupyterhub's API resources.

View File

@@ -27,6 +27,7 @@ Roles can be assigned to the following entities:
- Users
- Services
- Groups
- Tokens
An entity can have zero, one, or multiple roles, and there are no restrictions on which roles can be assigned to which entity. Roles can be added to or removed from entities at any time.
@@ -40,7 +41,7 @@ Services do not have a default role. Services without roles have no access to th
A group does not require any role, and has no roles by default. If a user is a member of a group, they automatically inherit any of the group's permissions (see {ref}`resolving-roles-scopes-target` for more details). This is useful for assigning a set of common permissions to several users.
**Tokens** \
A token's permissions are evaluated based on their owning entity. Since a token is always issued for a user or service, it can never have more permissions than its owner. If no specific scopes are requested for a new token, the token is assigned the scopes of the `token` role.
A token's permissions are evaluated based on their owning entity. Since a token is always issued for a user or service, it can never have more permissions than its owner. If no specific role is requested for a new token, the token is assigned the `token` role.
(define-role-target)=

View File

@@ -72,31 +72,13 @@ Requested resources are filtered based on the filter of the corresponding scope.
In case a user resource is being accessed, any scopes with _group_ filters will be expanded to filters for each _user_ in those groups.
(self-referencing-filters)=
### Self-referencing filters
There are some 'shortcut' filters,
which can be applied to all scopes,
that filter based on the entities associated with the request.
### `!user` filter
The `!user` filter is a special horizontal filter that strictly refers to the **"owner only"** scopes, where _owner_ is a user entity. The filter resolves internally into `!user=<ownerusername>` ensuring that only the owner's resources may be accessed through the associated scopes.
For example, the `server` role assigned by default to server tokens contains `access:servers!user` and `users:activity!user` scopes. This allows the token to access and post activity of only the servers owned by the token owner.
:::{versionadded} 3.0
`!service` and `!server` filters.
:::
In addition to `!user`, _tokens_ may have filters `!service`
or `!server`, which expand similarly to `!service=servicename`
and `!server=servername`.
This only applies to tokens issued via the OAuth flow.
In these cases, the name is the _issuing_ entity (a service or single-user server),
so that access can be restricted to the issuing service,
e.g. `access:servers!server` would grant access only to the server that requested the token.
These filters can be applied to any scope.
The filter can be applied to any scope.
(vertical-filtering-target)=
@@ -132,170 +114,11 @@ There are four exceptions to the general {ref}`scope conventions <scope-conventi
```
:::{versionadded} 3.0
The `admin-ui` scope is added to explicitly grant access to the admin page,
rather than combining `admin:users` and `admin:servers` permissions.
This means a deployment can enable the admin page with only a subset of functionality enabled.
Note that this means actions to take _via_ the admin UI
and access _to_ the admin UI are separated.
For example, it generally doesn't make sense to grant
`admin-ui` without at least `list:users` for at least some subset of users.
For example:
```python
c.JupyterHub.load_roles = [
{
"name": "instructor-data8",
"scopes": [
# access to the admin page
"admin-ui",
# list users in the class group
"list:users!group=students-data8",
# start/stop servers for users in the class
"admin:servers!group=students-data8",
# access servers for users in the class
"access:servers!group=students-data8",
],
"group": ["instructors-data8"],
}
]
```
will grant instructors in the data8 course permission to:
1. view the admin UI
2. see students in the class (but not all users)
3. start/stop/access servers for users in the class
4. but _not_ permission to administer the users themselves (e.g. change their permissions, etc.)
:::
```{Caution}
Note that only the {ref}`horizontal filtering <horizontal-filtering-target>` can be added to scopes to customize them. \
Metascopes `self` and `all`, `<resource>`, `<resource>:<subresource>`, `read:<resource>`, `admin:<resource>`, and `access:<resource>` scopes are predefined and cannot be changed otherwise.
```
(custom-scopes)=
### Custom scopes
:::{versionadded} 3.0
:::
JupyterHub 3.0 introduces support for custom scopes.
Services that use JupyterHub for authentication and want to implement their own granular access may define additional _custom_ scopes and assign them to users with JupyterHub roles.
% Note: keep in sync with pattern/description in jupyterhub/scopes.py
Custom scope names must start with `custom:`
and contain only lowercase ascii letters, numbers, hyphen, underscore, colon, and asterisk (`-_:*`).
The part after `custom:` must start with a letter or number.
Scopes may not end with a hyphen or colon.
The only strict requirement is that a custom scope definition must have a `description`.
It _may_ also have `subscopes` if you are defining multiple scopes that have a natural hierarchy,
For example:
```python
c.JupyterHub.custom_scopes = {
"custom:myservice:read": {
"description": "read-only access to myservice",
},
"custom:myservice:write": {
"description": "write access to myservice",
# write permission implies read permission
"subscopes": [
"custom:myservice:read",
],
},
}
c.JupyterHub.load_roles = [
# graders have read-only access to the service
{
"name": "service-user",
"groups": ["graders"],
"scopes": [
"custom:myservice:read",
"access:service!service=myservice",
],
},
# instructors have read and write access to the service
{
"name": "service-admin",
"groups": ["instructors"],
"scopes": [
"custom:myservice:write",
"access:service!service=myservice",
],
},
]
```
In the above configuration, two scopes are defined:
- `custom:myservice:read` grants read-only access to the service, and
- `custom:myservice:write` grants write access to the service
- write access _implies_ read access via the `subscope`
These custom scopes are assigned to two groups via `roles`:
- users in the group `graders` are granted read access to the service
- users in the group `instructors` are
- both are granted _access_ to the service via `access:service!service=myservice`
When the service completes OAuth, it will retrieve the user model from `/hub/api/user`.
This model includes a `scopes` field which is a list of authorized scope for the request,
which can be used.
```python
def require_scope(scope):
"""decorator to require a scope to perform an action"""
def wrapper(func):
@functools.wraps(func)
def wrapped_func(request):
user = fetch_hub_api_user(request.token)
if scope not in user["scopes"]:
raise HTTP403(f"Requires scope {scope}")
else:
return func()
return wrapper
@require_scope("custom:myservice:read")
async def read_something(request):
...
@require_scope("custom:myservice:write")
async def write_something(request):
...
```
If you use {class}`~.HubOAuthenticated`, this check is performed automatically
against the `.hub_scopes` attribute of each Handler
(the default is populated from `$JUPYTERHUB_OAUTH_ACCESS_SCOPES` and usually `access:services!service=myservice`).
:::{versionchanged} 3.0
The JUPYTERHUB_OAUTH_SCOPES environment variable is deprecated and renamed to JUPYTERHUB_OAUTH_ACCESS_SCOPES,
to avoid ambiguity with JUPYTERHUB_OAUTH_CLIENT_ALLOWED_SCOPES
:::
```python
from tornado import web
from jupyterhub.services.auth import HubOAuthenticated
class MyHandler(HubOAuthenticated, BaseHandler):
hub_scopes = ["custom:myservice:read"]
@web.authenticated
def get(self):
...
```
Existing scope filters (`!user=`, etc.) may be applied to custom scopes.
Custom scope _filters_ are NOT supported.
### Scopes and APIs
The scopes are also listed in the [](../reference/rest-api.rst) documentation. Each API endpoint has a list of scopes which can be used to access the API; if no scopes are listed, the API is not authenticated and can be accessed without any permissions (i.e., no scopes).

View File

@@ -7,11 +7,11 @@ Roles and scopes utilities can be found in `roles.py` and `scopes.py` modules. S
```{admonition} **Scope variable nomenclature**
:class: tip
- _scopes_ \
List of scopes that may contain abbreviations (used in role definitions). E.g., `["users:activity!user", "self"]`.
List of scopes with abbreviations (used in role definitions). E.g., `["users:activity!user"]`.
- _expanded scopes_ \
Set of fully expanded scopes without abbreviations (i.e., resolved metascopes, filters, and subscopes). E.g., `{"users:activity!user=charlie", "read:users:activity!user=charlie"}`.
Set of expanded scopes without abbreviations (i.e., resolved metascopes, filters and subscopes). E.g., `{"users:activity!user=charlie", "read:users:activity!user=charlie"}`.
- _parsed scopes_ \
Dictionary represenation of expanded scopes. E.g., `{"users:activity": {"user": ["charlie"]}, "read:users:activity": {"users": ["charlie"]}}`.
Dictionary JSON like format of expanded scopes. E.g., `{"users:activity": {"user": ["charlie"]}, "read:users:activity": {"users": ["charlie"]}}`.
- _intersection_ \
Set of expanded scopes as intersection of 2 expanded scope sets.
- _identify scopes_ \
@@ -22,47 +22,27 @@ Roles and scopes utilities can be found in `roles.py` and `scopes.py` modules. S
## Resolving roles and scopes
**Resolving roles** refers to determining which roles a user, service, or group has, extracting the list of scopes from each role and combining them into a single set of scopes.
**Resolving roles** refers to determining which roles a user, service, token, or group has, extracting the list of scopes from each role and combining them into a single set of scopes.
**Resolving scopes** involves expanding scopes into all their possible subscopes (_expanded scopes_), parsing them into format used for access evaluation (_parsed scopes_) and, if applicable, comparing two sets of scopes (_intersection_). All procedures take into account the scope hierarchy, {ref}`vertical <vertical-filtering-target>` and {ref}`horizontal filtering <horizontal-filtering-target>`, limiting or elevated permissions (`read:<resource>` or `admin:<resource>`, respectively), and metascopes.
Roles and scopes are resolved on several occasions, for example when requesting an API token with specific scopes or making an API request. The following sections provide more details.
Roles and scopes are resolved on several occasions, for example when requesting an API token with specific roles or making an API request. The following sections provide more details.
(requesting-api-token-target)=
### Requesting API token with specific scopes
### Requesting API token with specific roles
:::{versionchanged} 3.0
API tokens have _scopes_ instead of roles,
so that their permissions cannot be updated.
API tokens grant access to JupyterHub's APIs. The RBAC framework allows for requesting tokens with specific existing roles. To date, it is only possible to add roles to a token through the _POST /users/:name/tokens_ API where the roles can be specified in the token parameters body (see [](../reference/rest-api.rst)).
You may still request roles for a token,
but those roles will be evaluated to the corresponding _scopes_ immediately.
RBAC adds several steps into the token issue flow.
Prior to 3.0, tokens stored _roles_,
which meant their scopes were resolved on each request.
:::
If no roles are requested, the token is issued with the default `token` role (providing the requester is allowed to create the token).
API tokens grant access to JupyterHub's APIs. The RBAC framework allows for requesting tokens with specific permissions.
If the token is requested with any roles, the permissions of requesting entity are checked against the requested permissions to ensure the token would not grant its owner additional privileges.
RBAC is involved in several stages of the OAuth token flow.
If, due to modifications of roles or entities, at API request time a token has any scopes that its owner does not, those scopes are removed. The API request is resolved without additional errors using the scopes _intersection_, but the Hub logs a warning (see {ref}`Figure 2 <api-request-chart>`).
When requesting a token via the tokens API (`/users/:name/tokens`), or the token page (`/hub/token`),
if no scopes are requested, the token is issued with the permissions stored on the default `token` role
(providing the requester is allowed to create the token).
OAuth tokens are also requested via OAuth flow
If the token is requested with any scopes, the permissions of requesting entity are checked against the requested permissions to ensure the token would not grant its owner additional privileges.
If, due to modifications of permissions of the token or token owner,
at API request time a token has any scopes that its owner does not,
those scopes are removed.
The API request is resolved without additional errors using the scope _intersection_;
the Hub logs a warning in this case (see {ref}`Figure 2 <api-request-chart>`).
Resolving a token's scope (yellow box in {ref}`Figure 1 <token-request-chart>`) corresponds to resolving all the token's owner roles (including the roles associated with their groups) and the token's own scopes into a set of scopes. The two sets are compared (Resolve the scopes box in orange in {ref}`Figure 1 <token-request-chart>`), taking into account the scope hierarchy.
If the token's scopes are a subset of the token owner's scopes, the token is issued with the requested scopes; if not, JupyterHub will raise an error.
Resolving a token's roles (yellow box in {ref}`Figure 1 <token-request-chart>`) corresponds to resolving all the token's owner roles (including the roles associated with their groups) and the token's requested roles into a set of scopes. The two sets are compared (Resolve the scopes box in orange in {ref}`Figure 1 <token-request-chart>`), taking into account the scope hierarchy but, solely for role assignment, omitting any {ref}`horizontal filter <horizontal-filtering-target>` comparison. If the token's scopes are a subset of the token owner's scopes, the token is issued with the requested roles; if not, JupyterHub will raise an error.
{ref}`Figure 1 <token-request-chart>` below illustrates the steps involved. The orange rectangles highlight where in the process the roles and scopes are resolved.
@@ -75,9 +55,9 @@ Figure 1. Resolving roles and scopes during API token request
### Making an API request
With the RBAC framework, each authenticated JupyterHub API request is guarded by a scope decorator that specifies which scopes are required to gain the access to the API.
With the RBAC framework each authenticated JupyterHub API request is guarded by a scope decorator that specifies which scopes are required to gain the access to the API.
When an API request is performed, the requesting API token's scopes are again intersected with its owner's (yellow box in {ref}`Figure 2 <api-request-chart>`) to ensure the token does not grant more permissions than its owner has at the request time (e.g., due to changing/losing roles).
When an API request is performed, the requesting API token's roles are again resolved (yellow box in {ref}`Figure 2 <api-request-chart>`) to ensure the token does not grant more permissions than its owner has at the request time (e.g., due to changing/losing roles).
If the owner's roles do not include some scopes of the token's scopes, only the _intersection_ of the token's and owner's scopes will be used. For example, using a token with scope `users` whose owner's role scope is `read:users:name` will result in only the `read:users:name` scope being passed on. In the case of no _intersection_, an empty set of scopes will be used.
The passed scopes are compared to the scopes required to access the API as follows:

View File

@@ -231,8 +231,8 @@ In case of the need to run the jupyterhub under /jhub/ or other location please
httpd.conf amendments:
```bash
RewriteRule /jhub/(.*) ws://127.0.0.1:8000/jhub/$1 [P,L]
RewriteRule /jhub/(.*) http://127.0.0.1:8000/jhub/$1 [P,L]
RewriteRule /jhub/(.*) ws://127.0.0.1:8000/jhub/$1 [NE,P,L]
RewriteRule /jhub/(.*) http://127.0.0.1:8000/jhub/$1 [NE,P,L]
ProxyPass /jhub/ http://127.0.0.1:8000/jhub/
ProxyPassReverse /jhub/ http://127.0.0.1:8000/jhub/

View File

@@ -35,17 +35,6 @@ A Service may have the following properties:
the service will be added to the proxy at `/services/:name`
- `api_token: str (default - None)` - For Externally-Managed Services you need to specify
an API token to perform API requests to the Hub
- `display: bool (default - True)` - When set to true, display a link to the
service's URL under the 'Services' dropdown in user's hub home page.
- `oauth_no_confirm: bool (default - False)` - When set to true,
skip the OAuth confirmation page when users access this service.
By default, when users authenticate with a service using JupyterHub,
they are prompted to confirm that they want to grant that service
access to their credentials.
Skipping the confirmation page is useful for admin-managed services that are considered part of the Hub
and shouldn't need extra prompts for login.
If a service is also to be managed by the Hub, it has a few extra options:
@@ -125,10 +114,7 @@ JUPYTERHUB_BASE_URL: Base URL of the Hub (https://mydomain[:port]/)
JUPYTERHUB_SERVICE_PREFIX: URL path prefix of this service (/services/:service-name/)
JUPYTERHUB_SERVICE_URL: Local URL where the service is expected to be listening.
Only for proxied web services.
JUPYTERHUB_OAUTH_SCOPES: JSON-serialized list of scopes to use for allowing access to the service
(deprecated in 3.0, use JUPYTERHUB_OAUTH_ACCESS_SCOPES).
JUPYTERHUB_OAUTH_ACCESS_SCOPES: JSON-serialized list of scopes to use for allowing access to the service (new in 3.0).
JUPYTERHUB_OAUTH_CLIENT_ALLOWED_SCOPES: JSON-serialized list of scopes that can be requested by the oauth client on behalf of users (new in 3.0).
JUPYTERHUB_OAUTH_SCOPES: JSON-serialized list of scopes to use for allowing access to the service.
```
For the previous 'cull idle' Service example, these environment variables
@@ -388,7 +374,7 @@ The `scopes` field can be used to manage access.
Note: a user will have access to a service to complete oauth access to the service for the first time.
Individual permissions may be revoked at any later point without revoking the token,
in which case the `scopes` field in this model should be checked on each access.
The default required scopes for access are available from `hub_auth.oauth_scopes` or `$JUPYTERHUB_OAUTH_ACCESS_SCOPES`.
The default required scopes for access are available from `hub_auth.oauth_scopes` or `$JUPYTERHUB_OAUTH_SCOPES`.
An example of using an Externally-Managed Service and authentication is
in [nbviewer README][nbviewer example] section on securing the notebook viewer,

View File

@@ -308,9 +308,6 @@ The process environment is returned by `Spawner.get_env`, which specifies the fo
This is also the OAuth client secret.
- JUPYTERHUB_CLIENT_ID - the OAuth client ID for authenticating visitors.
- JUPYTERHUB_OAUTH_CALLBACK_URL - the callback URL to use in oauth, typically `/user/:name/oauth_callback`
- JUPYTERHUB_OAUTH_ACCESS_SCOPES - the scopes required to access the server (called JUPYTERHUB_OAUTH_SCOPES prior to 3.0)
- JUPYTERHUB_OAUTH_CLIENT_ALLOWED_SCOPES - the scopes the service is allowed to request.
If no scopes are requested explicitly, these scopes will be requested.
Optional environment variables, depending on configuration:

View File

@@ -371,7 +371,7 @@ a JupyterHub deployment. The commands are:
- System and deployment information
```bash
jupyter troubleshoot
jupyter troubleshooting
```
- Kernel information

View File

@@ -7,9 +7,8 @@ to enable testing without administrative privileges.
c = get_config() # noqa
c.Application.log_level = 'DEBUG'
import os
from oauthenticator.azuread import AzureAdOAuthenticator
import os
c.JupyterHub.authenticator_class = AzureAdOAuthenticator

View File

@@ -1,132 +0,0 @@
import os
from functools import wraps
from html import escape
from urllib.parse import urlparse
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from tornado.web import Application, RequestHandler, authenticated
from jupyterhub.services.auth import HubOAuthCallbackHandler, HubOAuthenticated
from jupyterhub.utils import url_path_join
SCOPE_PREFIX = "custom:grades"
READ_SCOPE = f"{SCOPE_PREFIX}:read"
WRITE_SCOPE = f"{SCOPE_PREFIX}:write"
def require_scope(scopes):
"""Decorator to require scopes
For use if multiple methods on one Handler
may want different scopes,
so class-level .hub_scopes is insufficient
(e.g. read for GET, write for POST).
"""
if isinstance(scopes, str):
scopes = [scopes]
def wrap(method):
"""The actual decorator"""
@wraps(method)
@authenticated
def wrapped(self, *args, **kwargs):
self.hub_scopes = scopes
return method(self, *args, **kwargs)
return wrapped
return wrap
class MyGradesHandler(HubOAuthenticated, RequestHandler):
# no hub_scopes, anyone with access to this service
# will be able to visit this URL
@authenticated
def get(self):
self.write("<h1>My grade</h1>")
name = self.current_user["name"]
grades = self.settings["grades"]
self.write(f"<p>My name is: {escape(name)}</p>")
if name in grades:
self.write(f"<p>My grade is: {escape(str(grades[name]))}</p>")
else:
self.write("<p>No grade entered</p>")
if READ_SCOPE in self.current_user["scopes"]:
self.write('<a href="grades/">enter grades</a>')
class GradesHandler(HubOAuthenticated, RequestHandler):
# default scope for this Handler: read-only
hub_scopes = [READ_SCOPE]
def _render(self):
grades = self.settings["grades"]
self.write("<h1>All grades</h1>")
self.write("<table>")
self.write("<tr><th>Student</th><th>Grade</th></tr>")
for student, grade in grades.items():
qstudent = escape(student)
qgrade = escape(str(grade))
self.write(
f"""
<tr>
<td class="student">{qstudent}</td>
<td class="grade">{qgrade}</td>
</tr>
"""
)
if WRITE_SCOPE in self.current_user["scopes"]:
self.write("Enter grade:")
self.write(
"""
<form action=. method=POST>
<input name=student placeholder=student></input>
<input type=number name=grade placeholder=grade></input>
<input type="submit" value="Submit">
"""
)
@require_scope([READ_SCOPE])
async def get(self):
self._render()
# POST requires WRITE_SCOPE instead of READ_SCOPE
@require_scope([WRITE_SCOPE])
async def post(self):
name = self.get_argument("student")
grade = self.get_argument("grade")
self.settings["grades"][name] = grade
self._render()
def main():
base_url = os.environ['JUPYTERHUB_SERVICE_PREFIX']
app = Application(
[
(base_url, MyGradesHandler),
(url_path_join(base_url, 'grades/'), GradesHandler),
(
url_path_join(base_url, 'oauth_callback'),
HubOAuthCallbackHandler,
),
],
cookie_secret=os.urandom(32),
grades={"student": 53},
)
http_server = HTTPServer(app)
url = urlparse(os.environ['JUPYTERHUB_SERVICE_URL'])
http_server.listen(url.port, url.hostname)
try:
IOLoop.current().start()
except KeyboardInterrupt:
pass
if __name__ == '__main__':
main()


@@ -1,52 +0,0 @@
import sys
c = get_config() # noqa
c.JupyterHub.services = [
{
'name': 'grades',
'url': 'http://127.0.0.1:10101',
'command': [sys.executable, './grades.py'],
'oauth_client_allowed_scopes': [
'custom:grades:write',
'custom:grades:read',
],
},
]
c.JupyterHub.custom_scopes = {
"custom:grades:read": {
"description": "read-access to all grades",
},
"custom:grades:write": {
"description": "Enter new grades",
"subscopes": ["custom:grades:read"],
},
}
c.JupyterHub.load_roles = [
{
"name": "user",
# grant all users access to services
"scopes": ["access:services", "self"],
},
{
"name": "grader",
# grant graders access to write grades
"scopes": ["custom:grades:write"],
"users": ["grader"],
},
{
"name": "instructor",
# grant instructors access to read, but not write grades
"scopes": ["custom:grades:read"],
"users": ["instructor"],
},
]
c.JupyterHub.allowed_users = {"instructor", "grader", "student"}
# dummy spawner and authenticator for testing, don't actually use these!
c.JupyterHub.authenticator_class = 'dummy'
c.JupyterHub.spawner_class = 'simple'
c.JupyterHub.ip = '127.0.0.1' # let's just run on localhost while dummy auth is enabled
c.JupyterHub.log_level = 10


@@ -5,10 +5,13 @@ so all URLs and requests necessary for OAuth with JupyterHub should be in one pl
"""
import json
import os
from urllib.parse import urlencode, urlparse
from urllib.parse import urlencode
from urllib.parse import urlparse
from tornado import log, web
from tornado.httpclient import AsyncHTTPClient, HTTPRequest
from tornado import log
from tornado import web
from tornado.httpclient import AsyncHTTPClient
from tornado.httpclient import HTTPRequest
from tornado.httputil import url_concat
from tornado.ioloop import IOLoop


@@ -2,8 +2,6 @@
# 1. start/stop servers, and
# 2. access the server API
c = get_config() # noqa
c.JupyterHub.load_roles = [
{
"name": "launcher",


@@ -16,6 +16,7 @@ import time
import requests
log = logging.getLogger(__name__)


@@ -3,7 +3,9 @@ import datetime
import json
import os
from tornado import escape, ioloop, web
from tornado import escape
from tornado import ioloop
from tornado import web
from jupyterhub.services.auth import HubAuthenticated


@@ -1,5 +1,8 @@
from datetime import datetime
from typing import Any, Dict, List, Optional
from typing import Any
from typing import Dict
from typing import List
from typing import Optional
from pydantic import BaseModel


@@ -1,7 +1,9 @@
import json
import os
from fastapi import HTTPException, Security, status
from fastapi import HTTPException
from fastapi import Security
from fastapi import status
from fastapi.security import OAuth2AuthorizationCodeBearer
from fastapi.security.api_key import APIKeyQuery


@@ -1,9 +1,14 @@
import os
from fastapi import APIRouter, Depends, Form, Request
from fastapi import APIRouter
from fastapi import Depends
from fastapi import Form
from fastapi import Request
from .client import get_client
from .models import AuthorizationError, HubApiError, User
from .models import AuthorizationError
from .models import HubApiError
from .models import User
from .security import get_current_user
# APIRouter prefix cannot end in /


@@ -7,10 +7,16 @@ import os
import secrets
from functools import wraps
from flask import Flask, Response, make_response, redirect, request, session
from flask import Flask
from flask import make_response
from flask import redirect
from flask import request
from flask import Response
from flask import session
from jupyterhub.services.auth import HubOAuth
prefix = os.environ.get('JUPYTERHUB_SERVICE_PREFIX', '/')
auth = HubOAuth(api_token=os.environ['JUPYTERHUB_API_TOKEN'], cache_max_age=60)


@@ -26,7 +26,7 @@ After logging in with any username and password, you should see a JSON dump of y
```
What is contained in the model will depend on the permissions
requested in the `oauth_client_allowed_scopes` configuration of the `whoami-oauth` service.
requested in the `oauth_roles` configuration of the `whoami-oauth` service.
The default is the minimum required for identification and access to the service,
which will provide the username and current scopes.
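For illustration only, the minimal model might look roughly like this; the field values are made up and the exact fields depend on the JupyterHub version and the scopes granted:

```python
# Hypothetical example of the minimal whoami-oauth user model:
# identification plus the scopes held by the token.
user_model = {
    "kind": "user",
    "name": "jovyan",
    "admin": False,
    "scopes": [
        "access:services!service=whoami-oauth",
        "read:users:name!user=jovyan",
    ],
}
```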


@@ -14,11 +14,11 @@ c.JupyterHub.services = [
# only requesting access to the service,
# and identification by name,
# nothing more.
# Specifying 'oauth_client_allowed_scopes' as a list of scopes
# Specifying 'oauth_roles' as a list of role names
# allows requesting more information about users,
# or the ability to take actions on users' behalf, as required.
# the 'inherit' scope means the full permissions of the owner
# 'oauth_client_allowed_scopes': ['inherit'],
# The default 'token' role has the full permissions of its owner:
# 'oauth_roles': ['token'],
},
]


@@ -10,9 +10,12 @@ from urllib.parse import urlparse
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from tornado.web import Application, RequestHandler, authenticated
from tornado.web import Application
from tornado.web import authenticated
from tornado.web import RequestHandler
from jupyterhub.services.auth import HubOAuthCallbackHandler, HubOAuthenticated
from jupyterhub.services.auth import HubOAuthCallbackHandler
from jupyterhub.services.auth import HubOAuthenticated
from jupyterhub.utils import url_path_join


@@ -10,7 +10,9 @@ from urllib.parse import urlparse
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from tornado.web import Application, RequestHandler, authenticated
from tornado.web import Application
from tornado.web import authenticated
from tornado.web import RequestHandler
from jupyterhub.services.auth import HubAuthenticated


@@ -0,0 +1,56 @@
/*
object-assign
(c) Sindre Sorhus
@license MIT
*/
/*!
Copyright (c) 2018 Jed Watson.
Licensed under the MIT License (MIT), see
http://jedwatson.github.io/classnames
*/
/** @license React v0.20.2
* scheduler.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
/** @license React v16.13.1
* react-is.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
/** @license React v17.0.2
* react-dom.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
/** @license React v17.0.2
* react-jsx-runtime.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
/** @license React v17.0.2
* react.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/

jsx/build/index.html (new file, 6 lines)

@@ -0,0 +1,6 @@
<!DOCTYPE html>
<head></head>
<body>
<div id="admin-react-hook"></div>
<script src="admin-react.js"></script>
</body>


@@ -8,7 +8,7 @@
"scripts": {
"build": "yarn && webpack",
"hot": "webpack && webpack-dev-server",
"place": "cp build/admin-react.js* ../share/jupyterhub/static/js/",
"place": "cp -r build/admin-react.js ../share/jupyterhub/static/js/admin-react.js",
"test": "jest --verbose",
"snap": "jest --updateSnapshot",
"lint": "eslint --ext .jsx --ext .js src/",
@@ -28,7 +28,17 @@
}
},
"dependencies": {
"@babel/core": "^7.12.3",
"@babel/preset-env": "^7.12.11",
"@babel/preset-react": "^7.12.10",
"@testing-library/jest-dom": "^5.15.1",
"@testing-library/react": "^12.1.2",
"@testing-library/user-event": "^13.5.0",
"babel-loader": "^8.2.1",
"bootstrap": "^4.5.3",
"css-loader": "^5.0.1",
"eslint-plugin-unused-imports": "^1.1.1",
"file-loader": "^6.2.0",
"history": "^5.0.0",
"lodash.debounce": "^4.0.8",
"prop-types": "^15.7.2",
@@ -41,35 +51,24 @@
"react-redux": "^7.2.2",
"react-router": "^5.2.0",
"react-router-dom": "^5.2.0",
"recompose": "npm:react-recompose@^0.31.2",
"recompose": "^0.30.0",
"redux": "^4.0.5",
"regenerator-runtime": "^0.13.9"
"regenerator-runtime": "^0.13.9",
"style-loader": "^2.0.0",
"webpack": "^5.6.0",
"webpack-cli": "^3.3.4",
"webpack-dev-server": "^3.11.0"
},
"devDependencies": {
"@babel/core": "^7.12.3",
"@babel/preset-env": "^7.12.11",
"@babel/preset-react": "^7.12.10",
"@testing-library/jest-dom": "^5.15.1",
"@testing-library/react": "^12.1.2",
"@testing-library/user-event": "^13.5.0",
"@webpack-cli/serve": "^1.7.0",
"@wojtekmaj/enzyme-adapter-react-17": "^0.6.5",
"babel-jest": "^26.6.3",
"babel-loader": "^8.2.1",
"css-loader": "^5.0.1",
"enzyme": "^3.11.0",
"eslint": "^7.18.0",
"eslint-plugin-prettier": "^3.3.1",
"eslint-plugin-react": "^7.22.0",
"eslint-plugin-unused-imports": "^1.1.1",
"file-loader": "^6.2.0",
"identity-obj-proxy": "^3.0.0",
"jest": "^26.6.3",
"prettier": "^2.2.1",
"sinon": "^13.0.1",
"style-loader": "^2.0.0",
"webpack": "^5.6.0",
"webpack-cli": "^4.10.0",
"webpack-dev-server": "^4.9.3"
"sinon": "^13.0.1"
}
}


@@ -1,9 +1,10 @@
import React from "react";
import React, { useEffect } from "react";
import ReactDOM from "react-dom";
import { Provider } from "react-redux";
import { createStore } from "redux";
import { compose } from "recompose";
import { initialState, reducers } from "./Store";
import { jhapiRequest } from "./util/jhapiUtil";
import withAPI from "./util/withAPI";
import { HashRouter, Switch, Route } from "react-router-dom";
@@ -19,6 +20,23 @@ import "./style/root.css";
const store = createStore(reducers, initialState);
const App = () => {
useEffect(() => {
let { limit, user_page, groups_page } = initialState;
jhapiRequest(`/users?offset=${user_page * limit}&limit=${limit}`, "GET")
.then((data) => data.json())
.then((data) =>
store.dispatch({ type: "USER_PAGE", value: { data: data, page: 0 } })
)
.catch((err) => console.log(err));
jhapiRequest(`/groups?offset=${groups_page * limit}&limit=${limit}`, "GET")
.then((data) => data.json())
.then((data) =>
store.dispatch({ type: "GROUPS_PAGE", value: { data: data, page: 0 } })
)
.catch((err) => console.log(err));
});
return (
<div className="resets">
<Provider store={store}>


@@ -1,48 +1,23 @@
export const initialState = {
user_data: undefined,
user_page: { offset: 0, limit: window.api_page_limit || 100 },
user_page: 0,
name_filter: "",
groups_data: undefined,
groups_page: { offset: 0, limit: window.api_page_limit || 100 },
limit: window.api_page_limit || 100,
groups_page: 0,
limit: window.api_page_limit,
};
export const reducers = (state = initialState, action) => {
switch (action.type) {
// Updates the client user model data and stores the page
case "USER_OFFSET":
return Object.assign({}, state, {
user_page: Object.assign({}, state.user_page, {
offset: action.value.offset,
}),
});
case "USER_NAME_FILTER":
// set offset to 0 if name filter changed,
// otherwise leave it alone
const newOffset =
action.value.name_filter !== state.name_filter ? 0 : state.name_filter;
return Object.assign({}, state, {
user_page: Object.assign({}, state.user_page, {
offset: newOffset,
}),
name_filter: action.value.name_filter,
});
case "USER_PAGE":
return Object.assign({}, state, {
user_page: action.value.page,
user_data: action.value.data,
name_filter: action.value.name_filter || "",
});
// Updates the client group user model data and stores the page
case "GROUPS_OFFSET":
return Object.assign({}, state, {
groups_page: Object.assign({}, state.groups_page, {
offset: action.value.offset,
}),
});
// Updates the client group model data and stores the page
case "GROUPS_PAGE":
return Object.assign({}, state, {
groups_page: action.value.page,


@@ -60,10 +60,7 @@ const AddUser = (props) => {
placeholder="usernames separated by line"
data-testid="user-textarea"
onBlur={(e) => {
let split_users = e.target.value
.split("\n")
.map((u) => u.trim())
.filter((u) => u.length > 0);
let split_users = e.target.value.split("\n");
setUsers(split_users);
}}
></textarea>
@@ -91,7 +88,17 @@ const AddUser = (props) => {
data-testid="submit"
className="btn btn-primary"
onClick={() => {
addUsers(users, admin)
let filtered_users = users.filter(
(e) =>
e.length > 2 &&
/[!@#$%^&*(),.?":{}|<>]/g.test(e) == false
);
if (filtered_users.length < users.length) {
setUsers(filtered_users);
failRegexEvent();
}
addUsers(filtered_users, admin)
.then((data) =>
data.status < 300
? updateUsers(0, limit)


@@ -70,12 +70,12 @@ test("Removes users when they fail Regex", async () => {
let textarea = screen.getByTestId("user-textarea");
let submit = screen.getByTestId("submit");
fireEvent.blur(textarea, { target: { value: "foo \n bar\na@b.co\n \n\n" } });
fireEvent.blur(textarea, { target: { value: "foo\nbar\n!!*&*" } });
await act(async () => {
fireEvent.click(submit);
});
expect(callbackSpy).toHaveBeenCalledWith(["foo", "bar", "a@b.co"], false);
expect(callbackSpy).toHaveBeenCalledWith(["foo", "bar"], false);
});
test("Correctly submits admin", async () => {


@@ -59,7 +59,7 @@ const CreateGroup = (props) => {
value={groupName}
placeholder="group name..."
onChange={(e) => {
setGroupName(e.target.value.trim());
setGroupName(e.target.value);
}}
></input>
</div>


@@ -125,12 +125,38 @@ const EditUser = (props) => {
if (updatedUsername == "" && admin == has_admin) {
noChangeEvent();
return;
} else if (updatedUsername != "") {
if (
updatedUsername.length > 2 &&
/[!@#$%^&*(),.?":{}|<>]/g.test(updatedUsername) == false
) {
editUser(
username,
updatedUsername != "" ? updatedUsername : username,
admin
)
.then((data) => {
data.status < 300
? updateUsers(0, limit)
.then((data) => dispatchPageChange(data, 0))
.then(() => history.push("/"))
.catch(() =>
setErrorAlert(
`Could not update users list.`
)
)
: setErrorAlert(`Failed to edit user.`);
})
.catch(() => {
setErrorAlert(`Failed to edit user.`);
});
} else {
setErrorAlert(
`Failed to edit user. Make sure the username does not contain special characters.`
);
}
} else {
editUser(
username,
updatedUsername != "" ? updatedUsername : username,
admin
)
editUser(username, username, admin)
.then((data) => {
data.status < 300
? updateUsers(0, limit)


@@ -122,6 +122,7 @@ const GroupEdit = (props) => {
: setErrorAlert(`Failed to edit group.`);
})
.catch(() => {
console.log("outer");
setErrorAlert(`Failed to edit group.`);
});
}}


@@ -1,4 +1,4 @@
import React, { useEffect, useState } from "react";
import React from "react";
import { useSelector, useDispatch } from "react-redux";
import PropTypes from "prop-types";
@@ -6,26 +6,23 @@ import { Link } from "react-router-dom";
import PaginationFooter from "../PaginationFooter/PaginationFooter";
const Groups = (props) => {
var groups_data = useSelector((state) => state.groups_data),
var user_data = useSelector((state) => state.user_data),
groups_data = useSelector((state) => state.groups_data),
groups_page = useSelector((state) => state.groups_page),
dispatch = useDispatch();
limit = useSelector((state) => state.limit),
dispatch = useDispatch(),
page = parseInt(new URLSearchParams(props.location.search).get("page"));
var offset = groups_page ? groups_page.offset : 0;
const setOffset = (offset) => {
dispatch({
type: "GROUPS_OFFSET",
value: {
offset: offset,
},
});
};
var limit = groups_page ? groups_page.limit : window.api_page_limit;
var total = groups_page ? groups_page.total : undefined;
page = isNaN(page) ? 0 : page;
var slice = [page * limit, limit];
var { updateGroups, history } = props;
const dispatchPageUpdate = (data, page) => {
if (!groups_data || !user_data) {
return <div data-testid="no-show"></div>;
}
const dispatchPageChange = (data, page) => {
dispatch({
type: "GROUPS_PAGE",
value: {
@@ -35,14 +32,10 @@ const Groups = (props) => {
});
};
useEffect(() => {
updateGroups(offset, limit).then((data) =>
dispatchPageUpdate(data.items, data._pagination)
);
}, [offset, limit]);
if (!groups_data || !groups_page) {
return <div data-testid="no-show"></div>;
if (groups_page != page) {
updateGroups(...slice).then((data) => {
dispatchPageChange(data, page);
});
}
return (
@@ -66,6 +59,7 @@ const Groups = (props) => {
pathname: "/group-edit",
state: {
group_data: e,
user_data: user_data,
},
}}
>
@@ -80,12 +74,11 @@ const Groups = (props) => {
)}
</ul>
<PaginationFooter
offset={offset}
endpoint="/groups"
page={page}
limit={limit}
visible={groups_data.length}
total={total}
next={() => setOffset(offset + limit)}
prev={() => setOffset(offset >= limit ? offset - limit : 0)}
numOffset={slice[0]}
numElements={groups_data.length}
/>
</div>
<div className="panel-footer">
@@ -109,6 +102,8 @@ const Groups = (props) => {
};
Groups.propTypes = {
user_data: PropTypes.array,
groups_data: PropTypes.array,
updateUsers: PropTypes.func,
updateGroups: PropTypes.func,
history: PropTypes.shape({


@@ -1,72 +1,53 @@
import React from "react";
import "@testing-library/jest-dom";
import { act } from "react-dom/test-utils";
import { render, screen, fireEvent } from "@testing-library/react";
import { render, screen } from "@testing-library/react";
import { Provider, useDispatch, useSelector } from "react-redux";
import { createStore } from "redux";
import { HashRouter } from "react-router-dom";
// eslint-disable-next-line
import regeneratorRuntime from "regenerator-runtime";
import { initialState, reducers } from "../../Store";
import Groups from "./Groups";
jest.mock("react-redux", () => ({
...jest.requireActual("react-redux"),
useSelector: jest.fn(),
useDispatch: jest.fn(),
}));
var mockAsync = () =>
jest.fn().mockImplementation(() => Promise.resolve({ key: "value" }));
var groupsJsx = (callbackSpy) => (
<Provider store={createStore(mockReducers, mockAppState())}>
<Provider store={createStore(() => {}, {})}>
<HashRouter>
<Groups location={{ search: "0" }} updateGroups={callbackSpy} />
</HashRouter>
</Provider>
);
var mockReducers = jest.fn((state, action) => {
if (action.type === "GROUPS_PAGE" && !action.value.data) {
// no-op from mock, don't update state
return state;
}
state = reducers(state, action);
// mocked useSelector seems to cause a problem
// this should get the right state back?
// not sure
// useSelector.mockImplementation((callback) => callback(state);
return state;
var mockAppState = () => ({
user_data: JSON.parse(
'[{"kind":"user","name":"foo","admin":true,"groups":[],"server":"/user/foo/","pending":null,"created":"2020-12-07T18:46:27.112695Z","last_activity":"2020-12-07T21:00:33.336354Z","servers":{"":{"name":"","last_activity":"2020-12-07T20:58:02.437408Z","started":"2020-12-07T20:58:01.508266Z","pending":null,"ready":true,"state":{"pid":28085},"url":"/user/foo/","user_options":{},"progress_url":"/hub/api/users/foo/server/progress"}}},{"kind":"user","name":"bar","admin":false,"groups":[],"server":null,"pending":null,"created":"2020-12-07T18:46:27.115528Z","last_activity":"2020-12-07T20:43:51.013613Z","servers":{}}]'
),
groups_data: JSON.parse(
'[{"kind":"group","name":"testgroup","users":[]}, {"kind":"group","name":"testgroup2","users":["foo", "bar"]}]'
),
limit: 10,
});
var mockAppState = () =>
Object.assign({}, initialState, {
groups_data: [
{ kind: "group", name: "testgroup", users: [] },
{ kind: "group", name: "testgroup2", users: ["foo", "bar"] },
],
groups_page: {
offset: 0,
limit: 2,
total: 4,
next: {
offset: 2,
limit: 2,
url: "http://localhost:8000/hub/api/groups?offset=2&limit=2",
},
},
});
beforeEach(() => {
useSelector.mockImplementation((callback) => {
return callback(mockAppState());
});
useDispatch.mockImplementation(() => {
return () => {};
});
});
afterEach(() => {
useSelector.mockClear();
mockReducers.mockClear();
});
test("Renders", async () => {
@@ -107,30 +88,3 @@ test("Renders nothing if required data is not available", async () => {
let noShow = screen.getByTestId("no-show");
expect(noShow).toBeVisible();
});
test("Interacting with PaginationFooter causes state update and refresh via useEffect call", async () => {
let callbackSpy = mockAsync();
await act(async () => {
render(groupsJsx(callbackSpy));
});
expect(callbackSpy).toBeCalledWith(0, 2);
var lastState =
mockReducers.mock.results[mockReducers.mock.results.length - 1].value;
expect(lastState.groups_page.offset).toEqual(0);
expect(lastState.groups_page.limit).toEqual(2);
let next = screen.getByTestId("paginate-next");
fireEvent.click(next);
lastState =
mockReducers.mock.results[mockReducers.mock.results.length - 1].value;
expect(lastState.groups_page.offset).toEqual(2);
expect(lastState.groups_page.limit).toEqual(2);
// FIXME: mocked useSelector, state seem to prevent updateGroups from being called
// making the test environment not representative
// expect(callbackSpy).toHaveBeenCalledWith(2, 2);
});


@@ -1,40 +1,33 @@
import React from "react";
import { Link } from "react-router-dom";
import PropTypes from "prop-types";
import "./pagination-footer.css";
const PaginationFooter = (props) => {
let { offset, limit, visible, total, next, prev } = props;
let { endpoint, page, limit, numOffset, numElements } = props;
return (
<div className="pagination-footer">
<p>
Displaying {offset}-{offset + visible}
Displaying {numOffset}-{numOffset + numElements}
<br></br>
<br></br>
{offset >= 1 ? (
{page >= 1 ? (
<button className="btn btn-sm btn-light spaced">
<span
className="active-pagination"
data-testid="paginate-prev"
onClick={prev}
>
Previous
</span>
<Link to={`${endpoint}?page=${page - 1}`}>
<span className="active-pagination">Previous</span>
</Link>
</button>
) : (
<button className="btn btn-sm btn-light spaced">
<span className="inactive-pagination">Previous</span>
</button>
)}
{offset + visible < total ? (
{numElements >= limit ? (
<button className="btn btn-sm btn-light spaced">
<span
className="active-pagination"
data-testid="paginate-next"
onClick={next}
>
Next
</span>
<Link to={`${endpoint}?page=${page + 1}`}>
<span className="active-pagination">Next</span>
</Link>
</button>
) : (
<button className="btn btn-sm btn-light spaced">


@@ -1,6 +1,6 @@
import React, { useEffect, useState } from "react";
import React, { useState } from "react";
import regeneratorRuntime from "regenerator-runtime";
import { useSelector, useDispatch } from "react-redux";
import { debounce } from "lodash";
import PropTypes from "prop-types";
import {
@@ -30,7 +30,7 @@ const AccessServerButton = ({ url }) => (
);
const ServerDashboard = (props) => {
let base_url = window.base_url || "/";
let base_url = window.base_url;
// sort methods
var usernameDesc = (e) => e.sort((a, b) => (a.name > b.name ? 1 : -1)),
usernameAsc = (e) => e.sort((a, b) => (a.name < b.name ? 1 : -1)),
@@ -50,15 +50,16 @@ const ServerDashboard = (props) => {
var [errorAlert, setErrorAlert] = useState(null);
var [sortMethod, setSortMethod] = useState(null);
var [disabledButtons, setDisabledButtons] = useState({});
var [collapseStates, setCollapseStates] = useState({});
const [collapseStates, setCollapseStates] = useState({});
var user_data = useSelector((state) => state.user_data),
user_page = useSelector((state) => state.user_page),
name_filter = useSelector((state) => state.name_filter);
limit = useSelector((state) => state.limit),
name_filter = useSelector((state) => state.name_filter),
page = parseInt(new URLSearchParams(props.location.search).get("page"));
var offset = user_page ? user_page.offset : 0;
var limit = user_page ? user_page.limit : window.api_page_limit;
var total = user_page ? user_page.total : undefined;
page = isNaN(page) ? 0 : page;
var slice = [page * limit, limit, name_filter];
const dispatch = useDispatch();
@@ -72,48 +73,33 @@ const ServerDashboard = (props) => {
history,
} = props;
const dispatchPageUpdate = (data, page) => {
var dispatchPageUpdate = (data, page, name_filter) => {
dispatch({
type: "USER_PAGE",
value: {
data: data,
page: page,
},
});
};
const setOffset = (newOffset) => {
dispatch({
type: "USER_OFFSET",
value: {
offset: newOffset,
},
});
};
const setNameFilter = (name_filter) => {
dispatch({
type: "USER_NAME_FILTER",
value: {
name_filter: name_filter,
},
});
};
useEffect(() => {
updateUsers(offset, limit, name_filter)
.then((data) => dispatchPageUpdate(data.items, data._pagination))
.catch((err) => setErrorAlert("Failed to update user list."));
}, [offset, limit, name_filter]);
if (!user_data || !user_page) {
if (!user_data) {
return <div data-testid="no-show"></div>;
}
var slice = [offset, limit, name_filter];
if (page != user_page) {
updateUsers(...slice).then((data) =>
dispatchPageUpdate(data, page, name_filter)
);
}
var debounce = require("lodash.debounce");
const handleSearch = debounce(async (event) => {
setNameFilter(event.target.value);
// setNameFilter(event.target.value);
updateUsers(page * limit, limit, event.target.value).then((data) =>
dispatchPageUpdate(data, page, name_filter)
);
}, 300);
if (sortMethod != null) {
@@ -133,11 +119,7 @@ const ServerDashboard = (props) => {
if (res.status < 300) {
updateUsers(...slice)
.then((data) => {
dispatchPageUpdate(
data.items,
data._pagination,
name_filter
);
dispatchPageUpdate(data, page, name_filter);
})
.catch(() => {
setIsDisabled(false);
@@ -173,11 +155,7 @@ const ServerDashboard = (props) => {
if (res.status < 300) {
updateUsers(...slice)
.then((data) => {
dispatchPageUpdate(
data.items,
data._pagination,
name_filter
);
dispatchPageUpdate(data, page, name_filter);
})
.catch(() => {
setErrorAlert(`Failed to update users list.`);
@@ -222,44 +200,6 @@ const ServerDashboard = (props) => {
);
};
const ServerRowTable = ({ data }) => {
const sortedData = Object.keys(data)
.sort()
.reduce(function (result, key) {
let value = data[key];
switch (key) {
case "last_activity":
case "created":
case "started":
// format timestamps
value = value ? timeSince(value) : value;
break;
}
if (Array.isArray(value)) {
// cast arrays (e.g. roles, groups) to string
value = value.sort().join(", ");
}
result[key] = value;
return result;
}, {});
return (
<ReactObjectTableViewer
className="table-striped table-bordered"
style={{
padding: "3px 6px",
margin: "auto",
}}
keyStyle={{
padding: "4px",
}}
valueStyle={{
padding: "4px",
}}
data={sortedData}
/>
);
};
const serverRow = (user, server) => {
const { servers, ...userNoServers } = user;
const serverNameDash = server.name ? `-${server.name}` : "";
@@ -292,7 +232,11 @@ const ServerDashboard = (props) => {
<td data-testid="user-row-admin">{user.admin ? "admin" : ""}</td>
<td data-testid="user-row-server">
<p className="text-secondary">{server.name}</p>
{server.name ? (
<p className="text-secondary">{server.name}</p>
) : (
<p style={{ color: "lightgrey" }}>[MAIN]</p>
)}
</td>
<td data-testid="user-row-last-activity">
{server.last_activity ? timeSince(server.last_activity) : "Never"}
@@ -314,7 +258,7 @@ const ServerDashboard = (props) => {
/>
<a
href={`${base_url}spawn/${user.name}${
server.name ? "/" + server.name : ""
server.name && "/" + server.name
}`}
>
<button
@@ -342,11 +286,37 @@ const ServerDashboard = (props) => {
>
<Card style={{ width: "100%", padding: 3, margin: "0 auto" }}>
<Card.Title>User</Card.Title>
<ServerRowTable data={userNoServers} />
<ReactObjectTableViewer
className="table-striped table-bordered admin-table-head"
style={{
padding: "3px 6px",
margin: "auto",
}}
keyStyle={{
padding: "4px",
}}
valueStyle={{
padding: "4px",
}}
data={userNoServers}
/>
</Card>
<Card style={{ width: "100%", padding: 3, margin: "0 auto" }}>
<Card.Title>Server</Card.Title>
<ServerRowTable data={server} />
<ReactObjectTableViewer
className="table-striped table-bordered admin-table-head"
style={{
padding: "3px 6px",
margin: "auto",
}}
keyStyle={{
padding: "4px",
}}
valueStyle={{
padding: "4px",
}}
data={server}
/>
</Card>
</CardGroup>
</Collapse>
@@ -479,11 +449,7 @@ const ServerDashboard = (props) => {
.then((res) => {
updateUsers(...slice)
.then((data) => {
dispatchPageUpdate(
data.items,
data._pagination,
name_filter
);
dispatchPageUpdate(data, page, name_filter);
})
.catch(() =>
setErrorAlert(`Failed to update users list.`)
@@ -519,11 +485,7 @@ const ServerDashboard = (props) => {
.then((res) => {
updateUsers(...slice)
.then((data) => {
dispatchPageUpdate(
data.items,
data._pagination,
name_filter
);
dispatchPageUpdate(data, page, name_filter);
})
.catch(() =>
setErrorAlert(`Failed to update users list.`)
@@ -551,12 +513,11 @@ const ServerDashboard = (props) => {
</tbody>
</table>
<PaginationFooter
offset={offset}
endpoint="/"
page={page}
limit={limit}
visible={user_data.length}
total={total}
next={() => setOffset(offset + limit)}
prev={() => setOffset(offset - limit)}
numOffset={slice[0]}
numElements={user_data.length}
/>
<br></br>
</div>


@@ -10,7 +10,6 @@ import { createStore } from "redux";
import regeneratorRuntime from "regenerator-runtime";
import ServerDashboard from "./ServerDashboard";
import { initialState, reducers } from "../../Store";
import * as sinon from "sinon";
let clock;
@@ -21,7 +20,7 @@ jest.mock("react-redux", () => ({
}));
var serverDashboardJsx = (spy) => (
<Provider store={createStore(mockReducers, mockAppState())}>
<Provider store={createStore(() => {}, {})}>
<HashRouter>
<Switch>
<ServerDashboard
@@ -43,67 +42,10 @@ var mockAsync = (data) =>
var mockAsyncRejection = () =>
jest.fn().mockImplementation(() => Promise.reject());
var mockAppState = () =>
Object.assign({}, initialState, {
user_data: [
{
kind: "user",
name: "foo",
admin: true,
groups: [],
server: "/user/foo/",
pending: null,
created: "2020-12-07T18:46:27.112695Z",
last_activity: "2020-12-07T21:00:33.336354Z",
servers: {
"": {
name: "",
last_activity: "2020-12-07T20:58:02.437408Z",
started: "2020-12-07T20:58:01.508266Z",
pending: null,
ready: true,
state: { pid: 28085 },
url: "/user/foo/",
user_options: {},
progress_url: "/hub/api/users/foo/server/progress",
},
},
},
{
kind: "user",
name: "bar",
admin: false,
groups: [],
server: null,
pending: null,
created: "2020-12-07T18:46:27.115528Z",
last_activity: "2020-12-07T20:43:51.013613Z",
servers: {},
},
],
user_page: {
offset: 0,
limit: 2,
total: 4,
next: {
offset: 2,
limit: 2,
url: "http://localhost:8000/hub/api/groups?offset=2&limit=2",
},
},
});
var mockReducers = jest.fn((state, action) => {
if (action.type === "USER_PAGE" && !action.value.data) {
// no-op from mock, don't update state
return state;
}
state = reducers(state, action);
// mocked useSelector seems to cause a problem
// this should get the right state back?
// not sure
// useSelector.mockImplementation((callback) => callback(state);
return state;
var mockAppState = () => ({
user_data: JSON.parse(
'[{"kind":"user","name":"foo","admin":true,"groups":[],"server":"/user/foo/","pending":null,"created":"2020-12-07T18:46:27.112695Z","last_activity":"2020-12-07T21:00:33.336354Z","servers":{"":{"name":"","last_activity":"2020-12-07T20:58:02.437408Z","started":"2020-12-07T20:58:01.508266Z","pending":null,"ready":true,"state":{"pid":28085},"url":"/user/foo/","user_options":{},"progress_url":"/hub/api/users/foo/server/progress"}}},{"kind":"user","name":"bar","admin":false,"groups":[],"server":null,"pending":null,"created":"2020-12-07T18:46:27.115528Z","last_activity":"2020-12-07T20:43:51.013613Z","servers":{}}]'
),
});
beforeEach(() => {
@@ -115,7 +57,6 @@ beforeEach(() => {
afterEach(() => {
useSelector.mockClear();
mockReducers.mockClear();
clock.restore();
});
@@ -157,18 +98,6 @@ test("Renders correctly the status of a single-user server", async () => {
expect(stop).toBeVisible();
});
test("Renders spawn page link", async () => {
let callbackSpy = mockAsync();
await act(async () => {
render(serverDashboardJsx(callbackSpy));
});
let link = screen.getByText("Spawn Page").closest("a");
let url = new URL(link.href);
expect(url.pathname).toEqual("/spawn/bar");
});
test("Invokes the startServer event on button click", async () => {
let callbackSpy = mockAsync();
@@ -557,22 +486,11 @@ test("Shows a UI error dialogue when stop user server returns an improper status
test("Search for user calls updateUsers with name filter", async () => {
let spy = mockAsync();
let mockUpdateUsers = jest.fn((offset, limit, name_filter) => {
return Promise.resolve({
items: [],
_pagination: {
offset: offset,
limit: limit,
total: offset + limit * 2,
next: {
offset: offset + limit,
limit: limit,
},
},
});
return Promise.resolve([]);
});
await act(async () => {
render(
<Provider store={createStore(mockReducers, mockAppState())}>
<Provider store={createStore(() => {}, {})}>
<HashRouter>
<Switch>
<ServerDashboard
@@ -591,56 +509,15 @@ test("Search for user calls updateUsers with name filter", async () => {
let search = screen.getByLabelText("user-search");
expect(mockUpdateUsers.mock.calls).toHaveLength(1);
userEvent.type(search, "a");
expect(search.value).toEqual("a");
clock.tick(400);
expect(mockReducers.mock.calls).toHaveLength(3);
var lastState =
mockReducers.mock.results[mockReducers.mock.results.length - 1].value;
expect(lastState.name_filter).toEqual("a");
// TODO: this should
expect(mockUpdateUsers.mock.calls).toHaveLength(1);
expect(mockUpdateUsers.mock.calls[1][2]).toEqual("a");
expect(mockUpdateUsers.mock.calls).toHaveLength(2);
userEvent.type(search, "b");
expect(search.value).toEqual("ab");
clock.tick(400);
expect(mockReducers.mock.calls).toHaveLength(4);
lastState =
mockReducers.mock.results[mockReducers.mock.results.length - 1].value;
expect(lastState.name_filter).toEqual("ab");
expect(lastState.user_page.offset).toEqual(0);
});
test("Interacting with PaginationFooter causes state update and refresh via useEffect call", async () => {
let callbackSpy = mockAsync();
await act(async () => {
render(serverDashboardJsx(callbackSpy));
});
expect(callbackSpy).toBeCalledWith(0, 2, "");
expect(mockReducers.mock.results).toHaveLength(2);
lastState =
mockReducers.mock.results[mockReducers.mock.results.length - 1].value;
console.log(lastState);
expect(lastState.user_page.offset).toEqual(0);
expect(lastState.user_page.limit).toEqual(2);
let next = screen.getByTestId("paginate-next");
fireEvent.click(next);
clock.tick(400);
expect(mockReducers.mock.results).toHaveLength(3);
var lastState =
mockReducers.mock.results[mockReducers.mock.results.length - 1].value;
expect(lastState.user_page.offset).toEqual(2);
expect(lastState.user_page.limit).toEqual(2);
// FIXME: should call updateUsers, does in reality.
// tests don't reflect reality due to mocked state/useSelector
// unclear how to fix this.
// expect(callbackSpy.mock.calls).toHaveLength(2);
// expect(callbackSpy).toHaveBeenCalledWith(2, 2, "");
expect(mockUpdateUsers.mock.calls[2][2]).toEqual("ab");
expect(mockUpdateUsers.mock.calls).toHaveLength(3);
});


@@ -1,12 +1,11 @@
export const jhapiRequest = (endpoint, method, data) => {
let base_url = window.base_url || "/",
let base_url = window.base_url,
api_url = `${base_url}hub/api`;
return fetch(api_url + endpoint, {
method: method,
json: true,
headers: {
"Content-Type": "application/json",
Accept: "application/jupyterhub-pagination+json",
},
body: data ? JSON.stringify(data) : null,
});


@@ -4,9 +4,7 @@ import { jhapiRequest } from "./jhapiUtil";
const withAPI = withProps(() => ({
updateUsers: (offset, limit, name_filter) =>
jhapiRequest(
`/users?include_stopped_servers&offset=${offset}&limit=${limit}&name_filter=${
name_filter || ""
}`,
`/users?offset=${offset}&limit=${limit}&name_filter=${name_filter || ""}`,
"GET"
).then((data) => data.json()),
updateGroups: (offset, limit) =>


@@ -1,5 +1,6 @@
const webpack = require("webpack");
const path = require("path");
const express = require("express");
module.exports = {
entry: path.resolve(__dirname, "src", "App.jsx"),
@@ -33,19 +34,16 @@ module.exports = {
},
plugins: [new webpack.HotModuleReplacementPlugin()],
devServer: {
static: {
directory: path.resolve(__dirname, "build"),
},
contentBase: path.resolve(__dirname, "build"),
port: 9000,
onBeforeSetupMiddleware: (devServer) => {
const app = devServer.app;
before: (app, server) => {
var user_data = JSON.parse(
'[{"kind":"user","name":"foo","admin":true,"groups":[],"server":"/user/foo/","pending":null,"created":"2020-12-07T18:46:27.112695Z","last_activity":"2020-12-07T21:00:33.336354Z","servers":{"":{"name":"","last_activity":"2020-12-07T20:58:02.437408Z","started":"2020-12-07T20:58:01.508266Z","pending":null,"ready":true,"state":{"pid":28085},"url":"/user/foo/","user_options":{},"progress_url":"/hub/api/users/foo/server/progress"}}},{"kind":"user","name":"bar","admin":false,"groups":[],"server":null,"pending":null,"created":"2020-12-07T18:46:27.115528Z","last_activity":"2020-12-07T20:43:51.013613Z","servers":{}}]'
);
var group_data = JSON.parse(
'[{"kind":"group","name":"testgroup","users":[]}, {"kind":"group","name":"testgroup2","users":["foo", "bar"]}]'
);
app.use(express.json());
// get user_data
app.get("/hub/api/users", (req, res) => {

File diff suppressed because it is too large.


@@ -1 +1,2 @@
from ._version import __version__, version_info
from ._version import __version__
from ._version import version_info


@@ -4,7 +4,7 @@
def get_data_files():
"""Walk up until we find share/jupyterhub"""
import sys
from os.path import abspath, dirname, exists, join, split
from os.path import join, abspath, dirname, exists, split
path = abspath(dirname(__file__))
starting_points = [path]


@@ -1,154 +0,0 @@
"""Utilities for memoization
Note: a memoized function should always return an _immutable_
result to avoid later modifications polluting cached results.
"""
from collections import OrderedDict
from functools import wraps
class DoNotCache:
"""Wrapper to return a result without caching it.
In a function decorated with `@lru_cache_key`:
return DoNotCache(result)
is equivalent to:
return result # but don't cache it!
"""
def __init__(self, result):
self.result = result
class LRUCache:
"""A simple Least-Recently-Used (LRU) cache with a max size"""
def __init__(self, maxsize=1024):
self._cache = OrderedDict()
self.maxsize = maxsize
def __contains__(self, key):
return key in self._cache
def get(self, key, default=None):
"""Get an item from the cache"""
if key in self._cache:
# cache hit, bump to front of the queue for LRU
result = self._cache[key]
self._cache.move_to_end(key)
return result
return default
def set(self, key, value):
"""Store an entry in the cache
Purges oldest entry if cache is full
"""
self._cache[key] = value
# cache is full, purge oldest entry
if len(self._cache) > self.maxsize:
self._cache.popitem(last=False)
__getitem__ = get
__setitem__ = set
def lru_cache_key(key_func, maxsize=1024):
"""Like functools.lru_cache, but takes a custom key function,
as seen in sorted(key=func).
Useful for non-hashable arguments which have a known hashable equivalent (e.g. sets, lists),
or mutable objects where only immutable fields might be used
(e.g. User, where only username affects output).
For safety: Cached results should always be immutable,
such as using `frozenset` instead of mutable `set`.
Example:
@lru_cache_key(lambda user: user.name)
def func_user(user):
# output only varies by name
Args:
key (callable):
Should have the same signature as the decorated function.
Returns a hashable key to use in the cache
maxsize (int):
The maximum size of the cache.
"""
def cache_func(func):
cache = LRUCache(maxsize=maxsize)
# the actual decorated function:
@wraps(func)
def cached(*args, **kwargs):
cache_key = key_func(*args, **kwargs)
if cache_key in cache:
# cache hit
return cache[cache_key]
else:
# cache miss, call function and cache result
result = func(*args, **kwargs)
if isinstance(result, DoNotCache):
# DoNotCache prevents caching
result = result.result
else:
cache[cache_key] = result
return result
return cached
return cache_func
class FrozenDict(dict):
"""A frozen dictionary subclass
Immutable and hashable, so it can be used as a cache key
Values will be frozen with `.freeze(value)`
and must be hashable after freezing.
Not rigorous, but enough for our purposes.
"""
_hash = None
def __init__(self, d):
dict_set = dict.__setitem__
for key, value in d.items():
dict.__setitem__(self, key, self._freeze(value))
def _freeze(self, item):
"""Make values of a dict hashable
- list, set -> frozenset
- dict -> recursive _FrozenDict
- anything else: assumed hashable
"""
if isinstance(item, FrozenDict):
return item
elif isinstance(item, list):
return tuple(self._freeze(e) for e in item)
elif isinstance(item, set):
return frozenset(item)
elif isinstance(item, dict):
return FrozenDict(item)
else:
# any other type is assumed hashable
return item
def __setitem__(self, key):
raise RuntimeError(f"Cannot modify frozen {type(self).__name__}")
def update(self, other):
raise RuntimeError(f"Cannot modify frozen {type(self).__name__}")
def __hash__(self):
"""Cache hash because we are immutable"""
if self._hash is None:
self._hash = hash(tuple((key, value) for key, value in self.items()))
return self._hash
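For reference, a brief usage sketch of the `lru_cache_key` decorator and `DoNotCache` wrapper defined above; the import path is an assumption, and the module only exists on the 3.0 side of this comparison:

```python
# Usage sketch for the memoization helpers above (import path assumed).
from jupyterhub._memoize import DoNotCache, lru_cache_key


@lru_cache_key(lambda username, refresh=False: (username, refresh))
def greeting(username, refresh=False):
    """Cache per (username, refresh); returns an immutable string."""
    if refresh:
        # computed every call: DoNotCache keeps the result out of the cache
        return DoNotCache(f"fresh hello, {username}")
    return f"hello, {username}"


print(greeting("ada"))                # computed, then cached
print(greeting("ada"))                # served from the LRU cache
print(greeting("ada", refresh=True))  # recomputed, never cached
```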


@@ -2,7 +2,7 @@
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
# version_info updated by running `tbump`
version_info = (3, 0, 0, "", "")
version_info = (2, 3, 0, "", "")
# pep 440 version: no dot before beta/rc, but before .dev
# 0.1.0rc1


@@ -3,16 +3,17 @@ import sys
from logging.config import fileConfig
from alembic import context
from sqlalchemy import engine_from_config, pool
from sqlalchemy import engine_from_config
from sqlalchemy import pool
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
if 'jupyterhub' in sys.modules:
from traitlets.config import MultipleInstanceError
from jupyterhub.app import JupyterHub
app = None
@@ -41,16 +42,6 @@ target_metadata = orm.Base.metadata
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
# pass these to context.configure(**config_opts)
common_config_opts = dict(
# target_metadata for autogenerate
target_metadata=target_metadata,
# transaction per migration to ensure
# each migration is 'complete' before running the next one
# (e.g. dropped tables)
transaction_per_migration=True,
)
def run_migrations_offline():
"""Run migrations in 'offline' mode.
@@ -65,15 +56,14 @@ def run_migrations_offline():
"""
connectable = config.attributes.get('connection', None)
config_opts = {}
config_opts.update(common_config_opts)
config_opts["literal_binds"] = True
if connectable is None:
config_opts["url"] = config.get_main_option("sqlalchemy.url")
url = config.get_main_option("sqlalchemy.url")
context.configure(url=url, target_metadata=target_metadata, literal_binds=True)
else:
config_opts["connection"] = connectable
context.configure(**config_opts)
context.configure(
connection=connectable, target_metadata=target_metadata, literal_binds=True
)
with context.begin_transaction():
context.run_migrations()
@@ -87,8 +77,6 @@ def run_migrations_online():
"""
connectable = config.attributes.get('connection', None)
config_opts = {}
config_opts.update(common_config_opts)
if connectable is None:
connectable = engine_from_config(
@@ -98,10 +86,7 @@ def run_migrations_online():
)
with connectable.connect() as connection:
context.configure(
connection=connection,
**common_config_opts,
)
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()


@@ -11,8 +11,8 @@ down_revision = None
branch_labels = None
depends_on = None
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
def upgrade():


@@ -15,8 +15,8 @@ import logging
logger = logging.getLogger('alembic')
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
tables = ('oauth_access_tokens', 'oauth_codes')


@@ -22,9 +22,8 @@ import logging
logger = logging.getLogger('alembic')
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
from jupyterhub.orm import JSONDict


@@ -11,9 +11,8 @@ down_revision = '896818069c98'
branch_labels = None
depends_on = None
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
from jupyterhub.orm import JSONDict


@@ -11,10 +11,10 @@ down_revision = '1cebaf56856c'
branch_labels = None
depends_on = None
import logging
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
import logging
logger = logging.getLogger('alembic')


@@ -1,115 +0,0 @@
"""api_token_scopes
Revision ID: 651f5419b74d
Revises: 833da8570507
Create Date: 2022-02-28 12:42:55.149046
"""
# revision identifiers, used by Alembic.
revision = '651f5419b74d'
down_revision = '833da8570507'
branch_labels = None
depends_on = None
import sqlalchemy as sa
from alembic import op
from sqlalchemy import Column, ForeignKey, Table
from sqlalchemy.orm import relationship
from sqlalchemy.orm.session import Session
from jupyterhub import orm, roles, scopes
def upgrade():
c = op.get_bind()
tables = sa.inspect(c.engine).get_table_names()
# oauth codes are short lived, no need to upgrade them
if 'oauth_code_role_map' in tables:
op.drop_table('oauth_code_role_map')
if 'oauth_codes' in tables:
op.add_column('oauth_codes', sa.Column('scopes', orm.JSONList(), nullable=True))
if 'api_tokens' in tables:
# may not be present,
# e.g. upgrade from 1.x, token table dropped
# in which case no migration to do
# define new scopes column on API tokens
op.add_column('api_tokens', sa.Column('scopes', orm.JSONList(), nullable=True))
if 'api_token_role_map' in tables:
# redefine the to-be-removed api_token->role relationship
# so we can run a query on it for the migration
token_role_map = Table(
"api_token_role_map",
orm.Base.metadata,
Column(
'api_token_id',
ForeignKey('api_tokens.id', ondelete='CASCADE'),
primary_key=True,
),
Column(
'role_id',
ForeignKey('roles.id', ondelete='CASCADE'),
primary_key=True,
),
extend_existing=True,
)
orm.APIToken.roles = relationship('Role', secondary='api_token_role_map')
# tokens have roles, evaluate to scopes
db = Session(bind=c)
for token in db.query(orm.APIToken):
token.scopes = list(roles.roles_to_scopes(token.roles))
db.commit()
# drop token-role relationship
op.drop_table('api_token_role_map')
if 'oauth_clients' in tables:
# define new scopes column on API tokens
op.add_column(
'oauth_clients', sa.Column('allowed_scopes', orm.JSONList(), nullable=True)
)
if 'oauth_client_role_map' in tables:
# redefine the to-be-removed api_token->role relationship
# so we can run a query on it for the migration
client_role_map = Table(
"oauth_client_role_map",
orm.Base.metadata,
Column(
'oauth_client_id',
ForeignKey('oauth_clients.id', ondelete='CASCADE'),
primary_key=True,
),
Column(
'role_id',
ForeignKey('roles.id', ondelete='CASCADE'),
primary_key=True,
),
extend_existing=True,
)
orm.OAuthClient.allowed_roles = relationship(
'Role', secondary='oauth_client_role_map'
)
# oauth clients have allowed_roles, evaluate to allowed_scopes
db = Session(bind=c)
for oauth_client in db.query(orm.OAuthClient):
allowed_scopes = set(roles.roles_to_scopes(oauth_client.allowed_roles))
allowed_scopes.update(scopes.access_scopes(oauth_client))
oauth_client.allowed_scopes = sorted(allowed_scopes)
db.commit()
# drop token-role relationship
op.drop_table('oauth_client_role_map')
def downgrade():
# cannot map permissions from scopes back to roles
# drop whole api token table (revokes all tokens), which will be recreated on hub start
op.drop_table('api_tokens')
op.drop_table('oauth_clients')
op.drop_table('oauth_codes')


@@ -12,11 +12,12 @@ down_revision = '4dc2d5a8c53c'
branch_labels = None
depends_on = None
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
from jupyterhub import orm
naming_convention = orm.meta.naming_convention


@@ -11,8 +11,8 @@ down_revision = 'd68c98b66cd4'
branch_labels = None
depends_on = None
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
def upgrade():


@@ -12,10 +12,10 @@ branch_labels = None
depends_on = None
from datetime import datetime
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
from datetime import datetime
def upgrade():


@@ -11,8 +11,8 @@ down_revision = 'eeb276e51423'
branch_labels = None
depends_on = None
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
def upgrade():


@@ -11,8 +11,8 @@ down_revision = '99a28a4418e1'
branch_labels = None
depends_on = None
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
def upgrade():


@@ -12,9 +12,8 @@ down_revision = '19c0846f6344'
branch_labels = None
depends_on = None
import sqlalchemy as sa
from alembic import op
import sqlalchemy as sa
from jupyterhub.orm import JSONDict


@@ -1,4 +1,9 @@
from . import auth, groups, hub, proxy, services, users
from . import auth
from . import groups
from . import hub
from . import proxy
from . import services
from . import users
from .base import *
default_handlers = []


@@ -1,16 +1,25 @@
"""Authorization handlers"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import itertools
import json
from datetime import datetime
from urllib.parse import parse_qsl, quote, urlencode, urlparse, urlunparse
from urllib.parse import parse_qsl
from urllib.parse import quote
from urllib.parse import urlencode
from urllib.parse import urlparse
from urllib.parse import urlunparse
from oauthlib import oauth2
from tornado import web
from .. import orm, roles, scopes
from ..utils import get_browser_protocol, token_authenticated
from .base import APIHandler, BaseHandler
from .. import orm
from .. import roles
from .. import scopes
from ..utils import get_browser_protocol
from ..utils import token_authenticated
from .base import APIHandler
from .base import BaseHandler
class TokenAPIHandler(APIHandler):
@@ -29,7 +38,7 @@ class TokenAPIHandler(APIHandler):
if owner:
# having a token means we should be able to read the owner's model
# (this is the only thing this handler is for)
self.expanded_scopes |= scopes.identify_scopes(owner)
self.expanded_scopes.update(scopes.identify_scopes(owner))
self.parsed_scopes = scopes.parse_scopes(self.expanded_scopes)
# record activity whenever we see a token
@@ -171,7 +180,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
raise
self.send_oauth_response(headers, body, status)
def needs_oauth_confirm(self, user, oauth_client, requested_scopes):
def needs_oauth_confirm(self, user, oauth_client, roles):
"""Return whether the given oauth client needs to prompt for access for the given user
Checks list for oauth clients that don't need confirmation
@@ -202,20 +211,20 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
user_id=user.id,
client_id=oauth_client.identifier,
)
authorized_scopes = set()
authorized_roles = set()
for token in existing_tokens:
authorized_scopes.update(token.scopes)
authorized_roles.update({role.name for role in token.roles})
if authorized_scopes:
if set(requested_scopes).issubset(authorized_scopes):
if authorized_roles:
if set(roles).issubset(authorized_roles):
self.log.debug(
f"User {user.name} has already authorized {oauth_client.identifier} for scopes {requested_scopes}"
f"User {user.name} has already authorized {oauth_client.identifier} for roles {roles}"
)
return False
else:
self.log.debug(
f"User {user.name} has authorized {oauth_client.identifier}"
f" for scopes {authorized_scopes}, confirming additional scopes {requested_scopes}"
f" for roles {authorized_roles}, confirming additional roles {roles}"
)
# default: require confirmation
return True
@@ -242,7 +251,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
uri, http_method, body, headers = self.extract_oauth_params()
try:
(
requested_scopes,
role_names,
credentials,
) = self.oauth_provider.validate_authorization_request(
uri, http_method, body, headers
@@ -274,51 +283,26 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
raise web.HTTPError(
403, f"You do not have permission to access {client.description}"
)
# subset 'raw scopes' to those held by authenticating user
requested_scopes = set(requested_scopes)
user = self.current_user
# raw, _not_ expanded scopes
user_scopes = roles.roles_to_scopes(roles.get_roles_for(user.orm_user))
# these are some scopes the user may not have
# in 'raw' form, but definitely have at this point
# make sure they are here, because we are computing the
# 'raw' scope intersection,
# rather than the expanded_scope intersection
required_scopes = {*scopes.identify_scopes(), *scopes.access_scopes(client)}
user_scopes |= {"inherit", *required_scopes}
allowed_scopes = requested_scopes.intersection(user_scopes)
excluded_scopes = requested_scopes.difference(user_scopes)
# TODO: compute lower-level intersection of remaining _expanded_ scopes
# (e.g. user has admin:users, requesting read:users!group=x)
if excluded_scopes:
self.log.warning(
f"Service {client.description} requested scopes {','.join(requested_scopes)}"
f" for user {self.current_user.name},"
f" granting only {','.join(allowed_scopes) or '[]'}."
)
if not self.needs_oauth_confirm(self.current_user, client, allowed_scopes):
if not self.needs_oauth_confirm(self.current_user, client, role_names):
self.log.debug(
"Skipping oauth confirmation for %s accessing %s",
self.current_user,
client.description,
)
# this is the pre-1.0 behavior for all oauth
self._complete_login(uri, headers, allowed_scopes, credentials)
self._complete_login(uri, headers, role_names, credentials)
return
# discard 'required' scopes from description
# no need to describe the ability to access itself
scopes_to_describe = allowed_scopes.difference(required_scopes)
if not scopes_to_describe:
# TODO: describe all scopes?
# Not right now, because the no-scope default 'identify' text
# is clearer than what we produce for those scopes individually
# resolve roles to scopes for authorization page
raw_scopes = set()
if role_names:
role_objects = (
self.db.query(orm.Role).filter(orm.Role.name.in_(role_names)).all()
)
raw_scopes = set(
itertools.chain(*(role.scopes for role in role_objects))
)
if not raw_scopes:
scope_descriptions = [
{
"scope": None,
@@ -328,8 +312,8 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
"filter": "",
}
]
elif 'inherit' in scopes_to_describe:
allowed_scopes = scopes_to_describe = ['inherit']
elif 'inherit' in raw_scopes:
raw_scopes = ['inherit']
scope_descriptions = [
{
"scope": "inherit",
@@ -341,7 +325,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
]
else:
scope_descriptions = scopes.describe_raw_scopes(
scopes_to_describe,
raw_scopes,
username=self.current_user.name,
)
# Render oauth 'Authorize application...' page
@@ -350,7 +334,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
await self.render_template(
"oauth.html",
auth_state=auth_state,
allowed_scopes=allowed_scopes,
role_names=role_names,
scope_descriptions=scope_descriptions,
oauth_client=client,
)
@@ -397,10 +381,6 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
# The scopes the user actually authorized, i.e. checkboxes
# that were selected.
scopes = self.get_arguments('scopes')
if scopes == []:
# avoid triggering default scopes (provider selects default scopes when scopes is falsy)
# when an explicit empty list is authorized
scopes = ["identify"]
# credentials we need in the validator
credentials = self.add_credentials()

View File

@@ -4,15 +4,19 @@
import json
from functools import lru_cache
from http.client import responses
from urllib.parse import parse_qs, urlencode, urlparse, urlunparse
from urllib.parse import parse_qs
from urllib.parse import urlencode
from urllib.parse import urlparse
from urllib.parse import urlunparse
from sqlalchemy.exc import SQLAlchemyError
from tornado import web
from .. import orm
from ..handlers import BaseHandler
from ..scopes import get_scopes_for
from ..utils import get_browser_protocol, isoformat, url_escape_path, url_path_join
from ..utils import get_browser_protocol
from ..utils import isoformat
from ..utils import url_path_join
PAGINATION_MEDIA_TYPE = "application/jupyterhub-pagination+json"
@@ -187,44 +191,22 @@ class APIHandler(BaseHandler):
json.dumps({'status': status_code, 'message': message or status_message})
)
def server_model(self, spawner, *, user=None):
def server_model(self, spawner):
"""Get the JSON model for a Spawner
Assume server permission already granted
"""
if isinstance(spawner, orm.Spawner):
# if an orm.Spawner is passed,
# create a model for a stopped Spawner
# not all info is available without the higher-level Spawner wrapper
orm_spawner = spawner
pending = None
ready = False
stopped = True
user = user
if user is None:
raise RuntimeError("Must specify User with orm.Spawner")
state = orm_spawner.state
else:
orm_spawner = spawner.orm_spawner
pending = spawner.pending
ready = spawner.ready
user = spawner.user
stopped = not spawner.active
state = spawner.get_state()
Assume server permission already granted"""
model = {
'name': orm_spawner.name,
'last_activity': isoformat(orm_spawner.last_activity),
'started': isoformat(orm_spawner.started),
'pending': pending,
'ready': ready,
'stopped': stopped,
'url': url_path_join(user.url, url_escape_path(spawner.name), '/'),
'name': spawner.name,
'last_activity': isoformat(spawner.orm_spawner.last_activity),
'started': isoformat(spawner.orm_spawner.started),
'pending': spawner.pending,
'ready': spawner.ready,
'url': url_path_join(spawner.user.url, spawner.name, '/'),
'user_options': spawner.user_options,
'progress_url': user.progress_url(spawner.name),
'progress_url': spawner._progress_url,
}
scope_filter = self.get_scope_filter('admin:server_state')
if scope_filter(spawner, kind='server'):
model['state'] = state
model['state'] = spawner.get_state()
return model
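
For orientation, here is a hedged sketch of the JSON shape the 3.0-era `server_model` above produces; all field values are illustrative, and the 2.3 backport drops the `stopped` key along with the `orm.Spawner` branch used for stopped servers.

```python
# Illustrative 3.0-style server model (values are made up); 2.3 builds the
# same dict from a live Spawner only and has no 'stopped' field.
example_server_model = {
    "name": "gpu",
    "last_activity": "2022-05-06T14:05:34Z",
    "started": None,        # None once the server has stopped
    "pending": None,
    "ready": False,
    "stopped": True,        # 3.0 only, set for orm.Spawner records
    "url": "/user/min/gpu/",
    "user_options": {},
    "progress_url": "/hub/api/users/min/servers/gpu/progress",
    # a "state" entry is added only when the caller holds admin:server_state
}
```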
def token_model(self, token):
@@ -242,9 +224,7 @@ class APIHandler(BaseHandler):
owner_key: owner,
'id': token.api_id,
'kind': 'api_token',
# deprecated field, but leave it present.
'roles': [],
'scopes': list(get_scopes_for(token)),
'roles': [r.name for r in token.roles],
'created': isoformat(token.created),
'last_activity': isoformat(token.last_activity),
'expires_at': isoformat(token.expires_at),
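
The token model changes the same way: 3.0 reports a token's permissions as resolved `scopes` and leaves the deprecated `roles` field present but empty, whereas 2.3 lists the token's role names. A small illustrative fragment (values are made up):

```python
# Illustrative fragments showing only the differing fields
token_model_3_0 = {"kind": "api_token", "roles": [], "scopes": ["read:users!user=min"]}
token_model_2_3 = {"kind": "api_token", "roles": ["token"]}
```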
@@ -270,22 +250,10 @@ class APIHandler(BaseHandler):
keys.update(allowed_keys)
return model
_include_stopped_servers = None
@property
def include_stopped_servers(self):
"""Whether stopped servers should be included in user models"""
if self._include_stopped_servers is None:
self._include_stopped_servers = self.get_argument(
"include_stopped_servers", "0"
).lower() not in {"0", "false"}
return self._include_stopped_servers
def user_model(self, user):
"""Get the JSON model for a User object"""
if isinstance(user, orm.User):
user = self.users[user.id]
include_stopped_servers = self.include_stopped_servers
model = {
'kind': 'user',
'name': user.name,
@@ -325,29 +293,18 @@ class APIHandler(BaseHandler):
if '' in user.spawners and 'pending' in allowed_keys:
model['pending'] = user.spawners[''].pending
servers = {}
servers = model['servers'] = {}
scope_filter = self.get_scope_filter('read:servers')
for name, spawner in user.spawners.items():
# include 'active' servers, not just ready
# (this includes pending events)
if (spawner.active or include_stopped_servers) and scope_filter(
spawner, kind='server'
):
if spawner.active and scope_filter(spawner, kind='server'):
servers[name] = self.server_model(spawner)
if include_stopped_servers:
# add any stopped servers in the db
seen = set(servers.keys())
for name, orm_spawner in user.orm_spawners.items():
if name not in seen and scope_filter(orm_spawner, kind='server'):
servers[name] = self.server_model(orm_spawner, user=user)
if "servers" in allowed_keys or servers:
if not servers and 'servers' not in allowed_keys:
# omit servers if no access
# leave present and empty
# if request has access to read servers in general
model["servers"] = servers
model.pop('servers')
return model
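
The `include_stopped_servers` property removed by the backport is driven by a plain query argument, so a REST client on 3.0 can opt in per request. A hedged client-side sketch (hub URL, username, and token are placeholders):

```python
# Hypothetical REST call against a 3.0 hub; the handler above treats any value
# other than "0"/"false" as true. URL and token are placeholders.
import requests

resp = requests.get(
    "http://127.0.0.1:8081/hub/api/users/min",
    params={"include_stopped_servers": "1"},
    headers={"Authorization": "token <api-token>"},
)
servers = resp.json().get("servers", {})  # on 3.0 this may include stopped servers
```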
def group_model(self, group):

View File

@@ -6,7 +6,8 @@ import json
from tornado import web
from .. import orm
from ..scopes import Scope, needs_scope
from ..scopes import needs_scope
from ..scopes import Scope
from .base import APIHandler

View File

@@ -26,7 +26,7 @@ class ProxyAPIHandler(APIHandler):
else:
routes = {}
end = offset + limit
for i, key in enumerate(sorted(all_routes.keys())):
for i, key in sorted(all_routes.keys()):
if i < offset:
continue
elif i >= end:

View File

@@ -6,7 +6,8 @@ Currently GET-only, no actions can be taken to modify services.
# Distributed under the terms of the Modified BSD License.
import json
from ..scopes import Scope, needs_scope
from ..scopes import needs_scope
from ..scopes import Scope
from .base import APIHandler

View File

@@ -3,25 +3,26 @@
# Distributed under the terms of the Modified BSD License.
import asyncio
import json
from datetime import datetime, timedelta, timezone
from datetime import datetime
from datetime import timedelta
from datetime import timezone
from async_generator import aclosing
from dateutil.parser import parse as parse_date
from sqlalchemy import func, or_
from sqlalchemy import func
from sqlalchemy import or_
from tornado import web
from tornado.iostream import StreamClosedError
from .. import orm, scopes
from .. import orm
from .. import scopes
from ..roles import assign_default_roles
from ..scopes import needs_scope
from ..user import User
from ..utils import (
isoformat,
iterate_until,
maybe_future,
url_escape_path,
url_path_join,
)
from ..utils import isoformat
from ..utils import iterate_until
from ..utils import maybe_future
from ..utils import url_path_join
from .base import APIHandler
@@ -50,7 +51,7 @@ class SelfAPIHandler(APIHandler):
for scope in identify_scopes:
if scope not in self.expanded_scopes:
_added_scopes.add(scope)
self.expanded_scopes |= {scope}
self.expanded_scopes.add(scope)
if _added_scopes:
# re-parse with new scopes
self.parsed_scopes = scopes.parse_scopes(self.expanded_scopes)
@@ -405,18 +406,21 @@ class UserTokenListAPIHandler(APIHandler):
if requester is not user:
note += f" by {kind} {requester.name}"
token_roles = body.get("roles")
token_scopes = body.get("scopes")
token_roles = body.get('roles')
try:
api_token = user.new_api_token(
note=note,
expires_in=body.get('expires_in', None),
roles=token_roles,
scopes=token_scopes,
)
except ValueError as e:
raise web.HTTPError(400, str(e))
except KeyError:
raise web.HTTPError(404, "Requested roles %r not found" % token_roles)
except ValueError:
raise web.HTTPError(
403,
"Requested roles %r cannot have higher permissions than the token owner"
% token_roles,
)
if requester is not user:
self.log.info(
"%s %s requested API token for %s",
@@ -691,7 +695,7 @@ class SpawnProgressAPIHandler(APIHandler):
# - spawner not running at all
# - spawner failed
# - spawner pending start (what we expect)
url = url_path_join(user.url, url_escape_path(server_name), '/')
url = url_path_join(user.url, server_name, '/')
ready_event = {
'progress': 100,
'ready': True,

View File

@@ -11,85 +11,106 @@ import re
import secrets
import signal
import socket
import ssl
import sys
import time
from concurrent.futures import ThreadPoolExecutor
from datetime import datetime, timedelta, timezone
from datetime import datetime
from datetime import timedelta
from datetime import timezone
from functools import partial
from getpass import getuser
from glob import glob
from itertools import chain
from operator import itemgetter
from textwrap import dedent
from urllib.parse import unquote, urlparse, urlunparse
from urllib.parse import unquote
from urllib.parse import urlparse
from urllib.parse import urlunparse
if sys.version_info[:2] < (3, 3):
raise ValueError("Python < 3.3 not supported: %s" % sys.version)
import tornado.httpserver
import tornado.options
# For compatibility with python versions 3.6 or earlier.
# asyncio.Task.all_tasks() is fully moved to asyncio.all_tasks() starting with 3.9. Also applies to current_task.
try:
asyncio_all_tasks = asyncio.all_tasks
asyncio_current_task = asyncio.current_task
except AttributeError as e:
asyncio_all_tasks = asyncio.Task.all_tasks
asyncio_current_task = asyncio.Task.current_task
from dateutil.parser import parse as parse_date
from jinja2 import ChoiceLoader, Environment, FileSystemLoader, PrefixLoader
from jupyter_telemetry.eventlog import EventLog
from jinja2 import Environment, FileSystemLoader, PrefixLoader, ChoiceLoader
from sqlalchemy.exc import OperationalError, SQLAlchemyError
from tornado import gen, web
from tornado.httpclient import AsyncHTTPClient
import tornado.httpserver
from tornado.ioloop import IOLoop, PeriodicCallback
from tornado.log import access_log, app_log, gen_log
from tornado.log import app_log, access_log, gen_log
import tornado.options
from tornado import gen, web
from traitlets import (
Any,
Bool,
Bytes,
Dict,
Float,
Instance,
Integer,
List,
Set,
Tuple,
Unicode,
Integer,
Dict,
TraitError,
List,
Bool,
Any,
Tuple,
Type,
Set,
Instance,
Bytes,
Float,
Union,
default,
observe,
default,
validate,
)
from traitlets.config import Application, Configurable, catch_config_error
from jupyter_telemetry.eventlog import EventLog
here = os.path.dirname(__file__)
import jupyterhub
from . import handlers, apihandlers
from .handlers.static import CacheControlStaticFilesHandler, LogoHandler
from .services.service import Service
from . import apihandlers, crypto, dbutil, handlers, orm, roles, scopes
from . import crypto
from . import dbutil, orm
from . import roles
from .user import UserDict
from .oauth.provider import make_provider
from ._data import DATA_FILES_PATH
from .log import CoroutineLogFormatter, log_request
from .proxy import Proxy, ConfigurableHTTPProxy
from .traitlets import URLPrefix, Command, EntryPointType, Callable
from .utils import (
AnyTimeoutError,
catch_db_error,
maybe_future,
url_path_join,
print_stacks,
print_ps_info,
make_ssl_context,
)
from .metrics import HUB_STARTUP_DURATION_SECONDS
from .metrics import INIT_SPAWNERS_DURATION_SECONDS
from .metrics import RUNNING_SERVERS
from .metrics import TOTAL_USERS
# classes for config
from .auth import Authenticator, PAMAuthenticator
from .crypto import CryptKeeper
from .spawner import Spawner, LocalProcessSpawner
from .objects import Hub, Server
# For faking stats
from .emptyclass import EmptyClass
from .handlers.static import CacheControlStaticFilesHandler, LogoHandler
from .log import CoroutineLogFormatter, log_request
from .metrics import (
HUB_STARTUP_DURATION_SECONDS,
INIT_SPAWNERS_DURATION_SECONDS,
RUNNING_SERVERS,
TOTAL_USERS,
)
from .oauth.provider import make_provider
from .objects import Hub, Server
from .proxy import ConfigurableHTTPProxy, Proxy
from .services.service import Service
from .spawner import LocalProcessSpawner, Spawner
from .traitlets import Callable, Command, EntryPointType, URLPrefix
from .user import UserDict
from .utils import (
AnyTimeoutError,
catch_db_error,
make_ssl_context,
maybe_future,
print_ps_info,
print_stacks,
url_path_join,
)
common_aliases = {
'log-level': 'Application.log_level',
@@ -332,29 +353,6 @@ class JupyterHub(Application):
""",
).tag(config=True)
custom_scopes = Dict(
key_trait=Unicode(),
value_trait=Dict(
key_trait=Unicode(),
),
help="""Custom scopes to define.
For use when defining custom roles,
to grant users granular permissions
All custom scopes must have a description,
and must start with the prefix `custom:`.
For example::
custom_scopes = {
"custom:jupyter_server:read": {
"description": "read-only access to a single-user server",
},
}
""",
).tag(config=True)
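
The `custom_scopes` trait deleted here is a 3.0-only feature, and its help text already carries the canonical example. A slightly fuller, hedged `jupyterhub_config.py` sketch (scope, role, and user names are illustrative), showing one way such a scope could then be granted through a role:

```python
# 3.0 only: define a custom scope (must start with "custom:" and carry a description)
c.JupyterHub.custom_scopes = {
    "custom:myservice:read": {
        "description": "read-only access to my service",
    },
}

# grant it to users by listing it in a role
c.JupyterHub.load_roles = [
    {
        "name": "myservice-reader",
        "scopes": ["custom:myservice:read"],
        "users": ["min"],
    },
]
```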
config_file = Unicode('jupyterhub_config.py', help="The config file to load").tag(
config=True
)
@@ -703,14 +701,11 @@ class JupyterHub(Application):
""",
).tag(config=True)
@validate("subdomain_host")
def _validate_subdomain_host(self, proposal):
new = proposal.value
def _subdomain_host_changed(self, name, old, new):
if new and '://' not in new:
# host should include '://'
# if not specified, assume https: You have to be really explicit about HTTP!
new = 'https://' + new
return new
self.subdomain_host = 'https://' + new
domain = Unicode(help="domain name, e.g. 'example.com' (excludes protocol, port)")
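
Both versions of the `subdomain_host` normalization have the same observable effect; only the mechanism differs (3.0 returns the normalized value from a validator, 2.3 reassigns the trait from an observer). A one-line config sketch:

```python
# A bare hostname gains an explicit https:// prefix; plain HTTP must be spelled out.
c.JupyterHub.subdomain_host = "hub.example.com"           # stored as "https://hub.example.com"
# c.JupyterHub.subdomain_host = "http://hub.example.com"  # kept as-is
```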
@@ -1124,7 +1119,7 @@ class JupyterHub(Application):
@default('authenticator')
def _authenticator_default(self):
return self.authenticator_class(parent=self, _deprecated_db_session=self.db)
return self.authenticator_class(parent=self, db=self.db)
implicit_spawn_seconds = Float(
0,
@@ -1312,14 +1307,11 @@ class JupyterHub(Application):
admin_access = Bool(
False,
help="""DEPRECATED since version 2.0.0.
help="""Grant admin users permission to access single-user servers.
The default admin role has full permissions, use custom RBAC scopes instead to
create restricted administrator roles.
https://jupyterhub.readthedocs.io/en/stable/rbac/index.html
Users should be properly informed if this is enabled.
""",
).tag(config=True)
admin_users = Set(
help="""DEPRECATED since version 0.7.2, use Authenticator.admin_users instead."""
).tag(config=True)
@@ -1697,9 +1689,7 @@ class JupyterHub(Application):
for authority, files in self.internal_ssl_authorities.items():
if files:
self.log.info("Adding CA for %s", authority)
certipy.store.add_record(
authority, is_ca=True, files=files, overwrite=True
)
certipy.store.add_record(authority, is_ca=True, files=files)
self.internal_trust_bundles = certipy.trust_from_graph(
self.internal_ssl_components_trust
@@ -2028,10 +2018,7 @@ class JupyterHub(Application):
db.commit()
async def init_role_creation(self):
"""Load default and user-defined roles and scopes into the database"""
if self.custom_scopes:
self.log.info(f"Defining {len(self.custom_scopes)} custom scopes.")
scopes.define_custom_scopes(self.custom_scopes)
"""Load default and predefined roles into the database"""
self.log.debug('Loading roles into database')
default_roles = roles.get_default_roles()
config_role_names = [r['name'] for r in self.load_roles]
@@ -2068,6 +2055,23 @@ class JupyterHub(Application):
# make sure we load any default roles not overridden
init_roles = list(default_roles_dict.values()) + init_roles
if roles_with_new_permissions:
unauthorized_oauth_tokens = (
self.db.query(orm.APIToken)
.filter(
orm.APIToken.roles.any(
orm.Role.name.in_(roles_with_new_permissions)
)
)
.filter(orm.APIToken.client_id != 'jupyterhub')
)
for token in unauthorized_oauth_tokens:
self.log.warning(
"Deleting OAuth token %s; one of its roles obtained new permissions that were not authorized by user"
% token
)
self.db.delete(token)
self.db.commit()
init_role_names = [r['name'] for r in init_roles]
if (
@@ -2203,6 +2207,9 @@ class JupyterHub(Application):
for kind in kinds:
roles.check_for_default_roles(db, kind)
# check tokens for default roles
roles.check_for_default_roles(db, bearer='tokens')
async def _add_tokens(self, token_dict, kind):
"""Add tokens for users or services to the database"""
if kind == 'user':
@@ -2367,34 +2374,21 @@ class JupyterHub(Application):
service.orm.server = None
if service.oauth_available:
allowed_scopes = set()
if service.oauth_client_allowed_scopes:
allowed_scopes.update(service.oauth_client_allowed_scopes)
allowed_roles = []
if service.oauth_roles:
if not allowed_scopes:
# DEPRECATED? It's still convenient and valid,
# e.g. 'admin'
allowed_roles = list(
self.db.query(orm.Role).filter(
orm.Role.name.in_(service.oauth_roles)
)
)
allowed_scopes.update(roles.roles_to_scopes(allowed_roles))
else:
self.log.warning(
f"Ignoring oauth_roles for {service.name}: {service.oauth_roles},"
f" using oauth_client_allowed_scopes={allowed_scopes}."
allowed_roles = list(
self.db.query(orm.Role).filter(
orm.Role.name.in_(service.oauth_roles)
)
)
oauth_client = self.oauth_provider.add_client(
client_id=service.oauth_client_id,
client_secret=service.api_token,
redirect_uri=service.oauth_redirect_uri,
allowed_roles=allowed_roles,
description="JupyterHub service %s" % service.name,
)
service.orm.oauth_client = oauth_client
# add access-scopes, derived from OAuthClient itself
allowed_scopes.update(scopes.access_scopes(oauth_client))
oauth_client.allowed_scopes = sorted(allowed_scopes)
else:
if service.oauth_client:
self.db.delete(service.oauth_client)
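
On the configuration side, the hunk above corresponds to the two spellings a service entry can use: 3.0 prefers explicit `oauth_client_allowed_scopes`, while 2.3 resolves `oauth_roles` to role objects. A hedged service-definition sketch (name, token, and scope values are placeholders):

```python
c.JupyterHub.services = [
    {
        "name": "my-service",
        "api_token": "<generated-token>",
        "oauth_client_allowed_scopes": ["read:users!user"],  # JupyterHub >= 3.0 spelling
        # "oauth_roles": ["user"],                           # JupyterHub 2.x spelling
    },
]
```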
@@ -3064,7 +3058,7 @@ class JupyterHub(Application):
self.internal_ssl_key,
self.internal_ssl_cert,
cafile=self.internal_ssl_ca,
purpose=ssl.Purpose.CLIENT_AUTH,
check_hostname=False,
)
# start the webserver
@@ -3241,7 +3235,11 @@ class JupyterHub(Application):
self._atexit_ran = True
self._init_asyncio_patch()
# run the cleanup step (in a new loop, because the interrupted one is unclean)
asyncio.run(self.cleanup())
asyncio.set_event_loop(asyncio.new_event_loop())
IOLoop.clear_current()
loop = IOLoop()
loop.make_current()
loop.run_sync(self.cleanup)
async def shutdown_cancel_tasks(self, sig=None):
"""Cancel all other tasks of the event loop and initiate cleanup"""
@@ -3252,7 +3250,7 @@ class JupyterHub(Application):
await self.cleanup()
tasks = [t for t in asyncio.all_tasks() if t is not asyncio.current_task()]
tasks = [t for t in asyncio_all_tasks() if t is not asyncio_current_task()]
if tasks:
self.log.debug("Cancelling pending tasks")
@@ -3265,7 +3263,7 @@ class JupyterHub(Application):
except StopAsyncIteration as e:
self.log.error("Caught StopAsyncIteration Exception", exc_info=True)
tasks = [t for t in asyncio.all_tasks()]
tasks = [t for t in asyncio_all_tasks()]
for t in tasks:
self.log.debug("Task status: %s", t)
asyncio.get_event_loop().stop()
@@ -3301,19 +3299,16 @@ class JupyterHub(Application):
def launch_instance(cls, argv=None):
self = cls.instance()
self._init_asyncio_patch()
loop = IOLoop(make_current=False)
try:
loop.run_sync(self.launch_instance_async, argv)
except Exception:
loop.close()
raise
loop = IOLoop.current()
task = asyncio.ensure_future(self.launch_instance_async(argv))
try:
loop.start()
except KeyboardInterrupt:
print("\nInterrupted")
finally:
if task.done():
# re-raise exceptions in launch_instance_async
task.result()
loop.stop()
loop.close()

View File

@@ -9,8 +9,9 @@ import warnings
from concurrent.futures import ThreadPoolExecutor
from functools import partial
from shutil import which
from subprocess import PIPE, STDOUT, Popen
from textwrap import dedent
from subprocess import PIPE
from subprocess import Popen
from subprocess import STDOUT
try:
import pamela
@@ -19,12 +20,13 @@ except Exception as e:
_pamela_error = e
from tornado.concurrent import run_on_executor
from traitlets import Any, Bool, Dict, Integer, Set, Unicode, default, observe
from traitlets.config import LoggingConfigurable
from traitlets import Bool, Integer, Set, Unicode, Dict, Any, default, observe
from .handlers.login import LoginHandler
from .traitlets import Command
from .utils import maybe_future, url_path_join
from .traitlets import Command
class Authenticator(LoggingConfigurable):
@@ -32,23 +34,6 @@ class Authenticator(LoggingConfigurable):
db = Any()
@default("db")
def _deprecated_db(self):
self.log.warning(
dedent(
"""
The shared database session at Authenticator.db is deprecated, and will be removed.
Please manage your own database and connections.
Contact JupyterHub at https://github.com/jupyterhub/jupyterhub/issues/3700
if you have questions or ideas about direct database needs for your Authenticator.
"""
),
)
return self._deprecated_db_session
_deprecated_db_session = Any()
enable_auth_state = Bool(
False,
config=True,
@@ -256,9 +241,6 @@ class Authenticator(LoggingConfigurable):
if not username:
# empty usernames are not allowed
return False
if username != username.strip():
# starting/ending with space is not allowed
return False
if not self.username_regex:
return True
return bool(self.username_regex.match(username))
@@ -835,7 +817,7 @@ class LocalAuthenticator(Authenticator):
raise ValueError("I don't know how to create users on OS X")
elif which('pw'):
# Probably BSD
return ['pw', 'useradd', '-m', '-n']
return ['pw', 'useradd', '-m']
else:
# This appears to be the Linux non-interactive adduser command:
return ['adduser', '-q', '--gecos', '""', '--disabled-password']

View File

@@ -4,12 +4,18 @@ import os
from binascii import a2b_hex
from concurrent.futures import ThreadPoolExecutor
from traitlets import Any, Integer, List, default, observe, validate
from traitlets.config import Config, SingletonConfigurable
from traitlets import Any
from traitlets import default
from traitlets import Integer
from traitlets import List
from traitlets import observe
from traitlets import validate
from traitlets.config import Config
from traitlets.config import SingletonConfigurable
try:
import cryptography
from cryptography.fernet import Fernet, InvalidToken, MultiFernet
from cryptography.fernet import Fernet, MultiFernet, InvalidToken
except ImportError:
cryptography = None

View File

@@ -1,4 +1,7 @@
from . import base, login, metrics, pages
from . import base
from . import login
from . import metrics
from . import pages
from .base import *
from .login import *

View File

@@ -9,44 +9,50 @@ import random
import re
import time
import uuid
from datetime import datetime, timedelta
from datetime import datetime
from datetime import timedelta
from http.client import responses
from urllib.parse import parse_qs, parse_qsl, urlencode, urlparse, urlunparse
from urllib.parse import parse_qs
from urllib.parse import parse_qsl
from urllib.parse import urlencode
from urllib.parse import urlparse
from urllib.parse import urlunparse
from jinja2 import TemplateNotFound
from sqlalchemy.exc import SQLAlchemyError
from tornado import gen, web
from tornado.httputil import HTTPHeaders, url_concat
from tornado import gen
from tornado import web
from tornado.httputil import HTTPHeaders
from tornado.httputil import url_concat
from tornado.ioloop import IOLoop
from tornado.log import app_log
from tornado.web import RequestHandler, addslash
from tornado.web import addslash
from tornado.web import RequestHandler
from .. import __version__, orm, roles, scopes
from ..metrics import (
PROXY_ADD_DURATION_SECONDS,
PROXY_DELETE_DURATION_SECONDS,
RUNNING_SERVERS,
SERVER_POLL_DURATION_SECONDS,
SERVER_SPAWN_DURATION_SECONDS,
SERVER_STOP_DURATION_SECONDS,
TOTAL_USERS,
ProxyDeleteStatus,
ServerPollStatus,
ServerSpawnStatus,
ServerStopStatus,
)
from .. import __version__
from .. import orm
from .. import roles
from .. import scopes
from ..metrics import PROXY_ADD_DURATION_SECONDS
from ..metrics import PROXY_DELETE_DURATION_SECONDS
from ..metrics import ProxyDeleteStatus
from ..metrics import RUNNING_SERVERS
from ..metrics import SERVER_POLL_DURATION_SECONDS
from ..metrics import SERVER_SPAWN_DURATION_SECONDS
from ..metrics import SERVER_STOP_DURATION_SECONDS
from ..metrics import ServerPollStatus
from ..metrics import ServerSpawnStatus
from ..metrics import ServerStopStatus
from ..metrics import TOTAL_USERS
from ..objects import Server
from ..scopes import needs_scope
from ..spawner import LocalProcessSpawner
from ..user import User
from ..utils import (
AnyTimeoutError,
get_accepted_mimetype,
get_browser_protocol,
maybe_future,
url_escape_path,
url_path_join,
)
from ..utils import AnyTimeoutError
from ..utils import get_accepted_mimetype
from ..utils import get_browser_protocol
from ..utils import maybe_future
from ..utils import url_path_join
# pattern for the authentication token header
auth_header_pat = re.compile(r'^(?:token|bearer)\s+([^\s]+)$', flags=re.IGNORECASE)
@@ -856,12 +862,6 @@ class BaseHandler(RequestHandler):
user_server_name = user.name
if server_name:
if '/' in server_name:
error_message = (
f"Invalid server_name (may not contain '/'): {server_name}"
)
self.log.error(error_message)
raise web.HTTPError(400, error_message)
user_server_name = f'{user.name}:{server_name}'
if server_name in user.spawners and user.spawners[server_name].pending:
@@ -1520,7 +1520,6 @@ class UserUrlHandler(BaseHandler):
server_name = ''
else:
server_name = ''
escaped_server_name = url_escape_path(server_name)
spawner = user.spawners[server_name]
if spawner.ready:
@@ -1539,10 +1538,7 @@ class UserUrlHandler(BaseHandler):
pending_url = url_concat(
url_path_join(
self.hub.base_url,
'spawn-pending',
user.escaped_name,
escaped_server_name,
self.hub.base_url, 'spawn-pending', user.escaped_name, server_name
),
{'next': self.request.uri},
)
@@ -1556,9 +1552,7 @@ class UserUrlHandler(BaseHandler):
# page *in* the server is not found, we return a 424 instead of a 404.
# We allow retaining the old behavior to support older JupyterLab versions
spawn_url = url_concat(
url_path_join(
self.hub.base_url, "spawn", user.escaped_name, escaped_server_name
),
url_path_join(self.hub.base_url, "spawn", user.escaped_name, server_name),
{"next": self.request.uri},
)
self.set_status(

View File

@@ -145,9 +145,7 @@ class LoginHandler(BaseHandler):
# parse the arguments dict
data = {}
for arg in self.request.arguments:
# strip username, but not other fields like passwords,
# which should be allowed to start or end with space
data[arg] = self.get_argument(arg, strip=arg == "username")
data[arg] = self.get_argument(arg, strip=False)
auth_timer = self.statsd.timer('login.authenticate').start()
user = await self.login_user(data)
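
The 3.0 side of this hunk strips whitespace only from the username field, so passwords that legitimately begin or end with a space keep working. A minimal sketch of that rule with a hypothetical helper:

```python
# Hypothetical helper mirroring the 3.0 stripping rule above:
# only "username" is whitespace-stripped; other fields are left intact.
def parse_login_form(arguments):
    return {
        name: value.strip() if name == "username" else value
        for name, value in arguments.items()
    }

parse_login_form({"username": " min ", "password": " hunter2 "})
# -> {"username": "min", "password": " hunter2 "}
```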

View File

@@ -1,5 +1,7 @@
"""Handlers for serving prometheus metrics"""
from prometheus_client import CONTENT_TYPE_LATEST, REGISTRY, generate_latest
from prometheus_client import CONTENT_TYPE_LATEST
from prometheus_client import generate_latest
from prometheus_client import REGISTRY
from ..utils import metrics_authentication
from .base import BaseHandler

View File

@@ -12,9 +12,11 @@ from tornado import web
from tornado.httputil import url_concat
from .. import __version__
from ..metrics import SERVER_POLL_DURATION_SECONDS, ServerPollStatus
from ..metrics import SERVER_POLL_DURATION_SECONDS
from ..metrics import ServerPollStatus
from ..scopes import needs_scope
from ..utils import maybe_future, url_escape_path, url_path_join
from ..utils import maybe_future
from ..utils import url_path_join
from .base import BaseHandler
@@ -268,6 +270,15 @@ class SpawnHandler(BaseHandler):
)
self.finish(form)
return
if current_user is user:
self.set_login_cookie(user)
next_url = self.get_next_url(
user,
default=url_path_join(
self.hub.base_url, "spawn-pending", user.escaped_name, server_name
),
)
self.redirect(next_url)
def _get_pending_url(self, user, server_name):
# resolve `?next=...`, falling back on the spawn-pending url
@@ -275,10 +286,7 @@ class SpawnHandler(BaseHandler):
# which may get handled by the default server if they aren't ready yet
pending_url = url_path_join(
self.hub.base_url,
"spawn-pending",
user.escaped_name,
url_escape_path(server_name),
self.hub.base_url, "spawn-pending", user.escaped_name, server_name
)
pending_url = self.append_query_parameters(pending_url, exclude=['next'])
@@ -347,7 +355,6 @@ class SpawnPendingHandler(BaseHandler):
if server_name and server_name not in user.spawners:
raise web.HTTPError(404, f"{user.name} has no such server {server_name}")
escaped_server_name = url_escape_path(server_name)
spawner = user.spawners[server_name]
if spawner.ready:
@@ -370,7 +377,7 @@ class SpawnPendingHandler(BaseHandler):
exc = spawner._spawn_future.exception()
self.log.error("Previous spawn for %s failed: %s", spawner._log_name, exc)
spawn_url = url_path_join(
self.hub.base_url, "spawn", user.escaped_name, escaped_server_name
self.hub.base_url, "spawn", user.escaped_name, server_name
)
self.set_status(500)
html = await self.render_template(
@@ -423,7 +430,7 @@ class SpawnPendingHandler(BaseHandler):
# serving the expected page
if status is not None:
spawn_url = url_path_join(
self.hub.base_url, "spawn", user.escaped_name, escaped_server_name
self.hub.base_url, "spawn", user.escaped_name, server_name
)
html = await self.render_template(
"not_running.html",
@@ -449,14 +456,15 @@ class AdminHandler(BaseHandler):
@web.authenticated
# stacked decorators: all scopes must be present
# note: keep in sync with admin link condition in page.html
@needs_scope('admin-ui')
@needs_scope('admin:users')
@needs_scope('admin:servers')
async def get(self):
auth_state = await self.current_user.get_auth_state()
html = await self.render_template(
'admin.html',
current_user=self.current_user,
auth_state=auth_state,
admin_access=True,
admin_access=self.settings.get('admin_access', False),
allow_named_servers=self.allow_named_servers,
named_server_limit_per_user=self.named_server_limit_per_user,
server_version=f'{__version__} {self.version_hash}',

View File

@@ -6,10 +6,13 @@ import logging
import traceback
from functools import partial
from http.cookies import SimpleCookie
from urllib.parse import urlparse, urlunparse
from urllib.parse import urlparse
from urllib.parse import urlunparse
from tornado.log import LogFormatter, access_log
from tornado.web import HTTPError, StaticFileHandler
from tornado.log import access_log
from tornado.log import LogFormatter
from tornado.web import HTTPError
from tornado.web import StaticFileHandler
from .handlers.pages import HealthCheckHandler
from .metrics import prometheus_log_method

View File

@@ -21,7 +21,8 @@ them manually here.
"""
from enum import Enum
from prometheus_client import Gauge, Histogram
from prometheus_client import Gauge
from prometheus_client import Histogram
REQUEST_DURATION_SECONDS = Histogram(
'jupyterhub_request_duration_seconds',

View File

@@ -3,14 +3,15 @@
implements https://oauthlib.readthedocs.io/en/latest/oauth2/server.html
"""
from oauthlib import uri_validate
from oauthlib.oauth2 import RequestValidator, WebApplicationServer
from oauthlib.oauth2.rfc6749.grant_types import authorization_code, base
from oauthlib.oauth2 import RequestValidator
from oauthlib.oauth2 import WebApplicationServer
from oauthlib.oauth2.rfc6749.grant_types import authorization_code
from oauthlib.oauth2.rfc6749.grant_types import base
from tornado.log import app_log
from .. import orm
from ..roles import roles_to_scopes
from ..scopes import _check_scopes_exist, access_scopes, identify_scopes
from ..utils import compare_token, hash_token
from ..utils import compare_token
from ..utils import hash_token
# patch absolute-uri check
# because we want to allow relative uri oauth
@@ -151,12 +152,7 @@ class JupyterHubRequestValidator(RequestValidator):
)
if orm_client is None:
raise ValueError("No such client: %s" % client_id)
scopes = set(orm_client.allowed_scopes)
if 'inherit' not in scopes:
# add identify-user scope
# and access-service scope
scopes |= identify_scopes() | access_scopes(orm_client)
return scopes
return [role.name for role in orm_client.allowed_roles]
def get_original_scopes(self, refresh_token, request, *args, **kwargs):
"""Get the list of scopes associated with the refresh token.
@@ -256,7 +252,7 @@ class JupyterHubRequestValidator(RequestValidator):
code=code['code'],
# oauth has 5 minutes to complete
expires_at=int(orm.OAuthCode.now() + 300),
scopes=list(request.scopes),
roles=request._jupyterhub_roles,
user=request.user.orm_user,
redirect_uri=orm_client.redirect_uri,
session_id=request.session_id,
@@ -351,9 +347,9 @@ class JupyterHubRequestValidator(RequestValidator):
# APIToken.new commits the token to the db
orm.APIToken.new(
oauth_client=client,
client_id=client.identifier,
expires_in=token['expires_in'],
scopes=request.scopes,
roles=[rolename for rolename in request.scopes],
token=token['access_token'],
session_id=request.session_id,
user=request.user,
@@ -455,7 +451,7 @@ class JupyterHubRequestValidator(RequestValidator):
return False
request.user = orm_code.user
request.session_id = orm_code.session_id
request.scopes = orm_code.scopes
request.scopes = [role.name for role in orm_code.roles]
return True
def validate_grant_type(
@@ -541,7 +537,7 @@ class JupyterHubRequestValidator(RequestValidator):
def validate_scopes(self, client_id, scopes, client, request, *args, **kwargs):
"""Ensure the client is authorized access to requested scopes.
:param client_id: Unicode client identifier
:param scopes: List of 'raw' scopes (defined by you)
:param scopes: List of scopes (defined by you)
:param client: Client object set by you, see authenticate_client.
:param request: The HTTP Request (oauthlib.common.Request)
:rtype: True or False
@@ -551,70 +547,35 @@ class JupyterHubRequestValidator(RequestValidator):
- Resource Owner Password Credentials Grant
- Client Credentials Grant
"""
orm_client = (
self.db.query(orm.OAuthClient).filter_by(identifier=client_id).one_or_none()
)
if orm_client is None:
app_log.warning("No such oauth client %s", client_id)
return False
requested_scopes = set(scopes)
client_allowed_roles = {role.name: role for role in orm_client.allowed_roles}
# explicitly allow 'identify', which was the only allowed scope previously
# requesting 'identify' gets no actual permissions other than self-identification
if "identify" in requested_scopes:
app_log.warning(
f"Ignoring deprecated 'identify' scope, requested by {client_id}"
)
requested_scopes.discard("identify")
# TODO: handle roles->scopes transition
# In 2.x, `?scopes=` only accepted _role_ names,
# but in 3.0 we accept and prefer scopes.
# For backward-compatibility, we still accept both.
# Should roles be deprecated here, or kept as a convenience?
try:
_check_scopes_exist(requested_scopes)
except KeyError as e:
# scopes don't exist, maybe they are role names
requested_roles = list(
self.db.query(orm.Role).filter(orm.Role.name.in_(requested_scopes))
)
if len(requested_roles) != len(requested_scopes):
# did not find roles
app_log.warning(f"No such scopes: {requested_scopes}")
return False
app_log.info(
f"OAuth client {client_id} requesting roles: {requested_scopes}"
)
requested_scopes = roles_to_scopes(requested_roles)
client_allowed_scopes = set(orm_client.allowed_scopes)
# always grant reading the token-owner's name
# and accessing the service itself
required_scopes = {*identify_scopes(), *access_scopes(orm_client)}
requested_scopes.update(required_scopes)
client_allowed_scopes.update(required_scopes)
# TODO: handle expanded_scopes intersection here?
# e.g. client allowed to request admin:users,
# but requests admin:users!name=x will not be allowed
# This can probably be dealt with in config by listing expected requests
as explicitly allowed
disallowed_scopes = requested_scopes.difference(client_allowed_scopes)
if disallowed_scopes:
client_allowed_roles.setdefault('identify', None)
roles = []
requested_roles = set(scopes)
disallowed_roles = requested_roles.difference(client_allowed_roles)
if disallowed_roles:
app_log.error(
f"Scope(s) not allowed for {client_id}: {', '.join(disallowed_scopes)}"
f"Role(s) not allowed for {client_id}: {','.join(disallowed_roles)}"
)
return False
# store resolved scopes on request
# store resolved roles on request
app_log.debug(
f"Allowing request for scope(s) for {client_id}: {','.join(requested_scopes) or '[]'}"
f"Allowing request for role(s) for {client_id}: {','.join(requested_roles) or '[]'}"
)
request.scopes = requested_scopes
# these will be stored on the OAuthCode object
request._jupyterhub_roles = [
client_allowed_roles[name]
for name in requested_roles
if client_allowed_roles[name] is not None
]
return True
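
The 3.0 version of `validate_scopes` keeps backward compatibility by treating the requested names as scopes first and only falling back to role names when they do not all exist as scopes. A condensed, hedged sketch of that fallback (the helper name and data shapes are illustrative, not the provider's real API):

```python
# Condensed sketch of the roles->scopes fallback described above.
# `existing_scopes` is the set of known scope names; `roles_by_name` maps
# role name -> list of scopes. Both are illustrative stand-ins.
def resolve_requested(names, existing_scopes, roles_by_name):
    requested = set(names)
    if requested <= existing_scopes:
        return requested  # plain scope request (preferred 3.0 path)
    matched = [roles_by_name[n] for n in requested if n in roles_by_name]
    if len(matched) != len(requested):
        raise ValueError(f"No such scopes: {requested}")
    # legacy 2.x-style request: expand the role names to their scopes
    return {scope for role_scopes in matched for scope in role_scopes}
```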
@@ -624,12 +585,7 @@ class JupyterHubOAuthServer(WebApplicationServer):
super().__init__(validator, *args, **kwargs)
def add_client(
self,
client_id,
client_secret,
redirect_uri,
allowed_scopes=None,
description='',
self, client_id, client_secret, redirect_uri, allowed_roles=None, description=''
):
"""Add a client
@@ -651,12 +607,12 @@ class JupyterHubOAuthServer(WebApplicationServer):
app_log.info(f'Creating oauth client {client_id}')
else:
app_log.info(f'Updating oauth client {client_id}')
if allowed_scopes == None:
allowed_scopes = []
if allowed_roles == None:
allowed_roles = []
orm_client.secret = hash_token(client_secret) if client_secret else ""
orm_client.redirect_uri = redirect_uri
orm_client.description = description or client_id
orm_client.allowed_scopes = list(allowed_scopes)
orm_client.allowed_roles = allowed_roles
self.db.commit()
return orm_client

Some files were not shown because too many files have changed in this diff.