Compare commits

..

20 Commits

Author SHA1 Message Date
Min RK
a3c93088a8 Bump to 2.3.0 2022-05-06 16:05:34 +02:00
Min RK
834229622d Merge pull request #3887 from minrk/2.3-backports
2.3 backports
2022-05-06 16:05:10 +02:00
Min RK
44a1ea42de One more in the changelog 2022-05-06 15:56:13 +02:00
Simon Li
3879a96b67 Backport PR #3886: Cleanup everything on API shutdown
`app.stop` triggers full cleanup and stopping of the event loop

closes #3881

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-06 15:55:00 +02:00
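Below is a schematic sketch of the pattern the message above describes, not JupyterHub's actual handler or `app.stop` implementation: a shutdown request triggers full application cleanup and only then stops the event loop. The `MiniApp` class and its methods are illustrative stand-ins.

```python
import asyncio


class MiniApp:
    """Toy stand-in for an application object whose stop() cleans up first."""

    async def cleanup(self):
        # Placeholder for real cleanup: proxies, spawners, db sessions, ...
        print("cleaning up")

    def stop(self):
        loop = asyncio.get_running_loop()

        async def _shutdown():
            await self.cleanup()  # full cleanup first
            loop.stop()           # then halt the event loop

        loop.create_task(_shutdown())


if __name__ == "__main__":
    app = MiniApp()
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    # e.g. triggered by a shutdown API request in a real application
    loop.call_later(0.1, app.stop)
    loop.run_forever()
```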
Min RK
d40627d397 changelog for 2.3 2022-05-05 13:24:00 +02:00
Min RK
057cdbc9e9 pre-commit autoupdate 2022-05-05 13:23:52 +02:00
Min RK
75390d2e46 Backport PR #3882: Use log.exception when logging exceptions
This provides the stack trace in the log file, which is incredibly useful when debugging.

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:28 +02:00
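As a minimal illustration of the change described above (standard library `logging` only, not JupyterHub's own logger setup): `log.exception()` records the message together with the current traceback, whereas `log.error()` records the message alone.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("example")

try:
    1 / 0
except ZeroDivisionError:
    log.error("division failed")      # message only, no stack trace

try:
    1 / 0
except ZeroDivisionError:
    log.exception("division failed")  # message plus the full traceback
```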
Min RK
f5e4846cfa Backport PR #3874: Missing f prefix on f-strings fix
Fixes #3873

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:27 +02:00
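The class of bug fixed here, shown with made-up strings rather than the actual lines from the PR: without the `f` prefix the placeholders are printed literally instead of being interpolated.

```python
name = "jupyter"

print("Hello {name}")   # missing f prefix -> prints: Hello {name}
print(f"Hello {name}")  # intended behavior -> prints: Hello jupyter
```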
Georgiana Elena
3dc115a829 Backport PR #3876: don't confuse :// in next_url query params for a redirect hostname
closes #3014

These query params should be URL-encoded (https://github.com/jupyterhub/nbgitpuller/issues/118), but we still shouldn't make wrong assumptions about when a hostname is specified

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:25 +02:00
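A hedged sketch of the situation, using a made-up repository URL: a `next` query parameter may itself contain `://`, so the presence of that substring alone does not mean the redirect target names a hostname. Clients should URL-encode the parameter, and the check should rely on actual URL parsing.

```python
from urllib.parse import quote, urlparse

# Clients should URL-encode query params embedded in next_url:
inner = "/user/someone/?repo=https://github.com/org/repo"
login_url = "/hub/login?next=" + quote(inner, safe="")
print(login_url)

# Deciding whether a redirect target is absolute should use parsing,
# not a search for "://" anywhere in the string:
print(bool(urlparse(inner).netloc))  # False: relative path, no hostname
```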
Min RK
af4ddbfc58 Backport PR #3867: ci: update black configuration
Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:24 +02:00
Min RK
50a4d1e34d Backport PR #3863: [Bug Fix] Search bar disabled on admin dashboard
I originally had `defaultValue` here and I changed it not realizing this would break/disable the input.

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:23 +02:00
Erik Sundell
86a238334c Backport PR #3862: Fix typo in [rest api] link in README.md
Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:22 +02:00
Simon Li
dacb9d1668 Backport PR #3859: Do not store Spawner.ip/port on spawner.server during get_env
we shouldn't mutate db state when getting the environment.

IIRC, this was part of an attempt to get the url via `self.server.bind_url` that didn't end up getting used in #3381. So this doesn't really have any positive effects, but it _can_ have negative effects if `get_env` is called in unusual circumstances (jupyterhub/batchspawner#236)

closes jupyterhub/batchspawner#236

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:21 +02:00
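A schematic illustration of the principle in the message above, with simplified stand-in classes rather than JupyterHub's real `Spawner`/`Server` objects: values needed for the environment are derived locally inside `get_env()` instead of being written onto persisted state as a side effect.

```python
class Spawner:
    """Simplified stand-in; the real Spawner persists self.server in the db."""

    ip = "127.0.0.1"
    port = 8888

    def __init__(self):
        self.server = None  # persisted (ORM-backed) in the real implementation

    def get_env(self):
        env = {}
        # Avoid: self.server = Server(ip=self.ip, port=self.port)
        # (that would mutate db state just by asking for the environment).
        # Instead, derive what the environment needs locally:
        env["EXAMPLE_SERVER_URL"] = f"http://{self.ip}:{self.port}"
        return env


print(Spawner().get_env())
```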
Min RK
95cc170383 Backport PR #3853: Fix xsrf_cookie_kwargs ValueError
Fixes

`ValueError: too many values to unpack (expected 2)`

Related to code added as a fix for https://github.com/jupyterhub/jupyterhub/issues/3821

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:20 +02:00
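A generic reproduction of the quoted error, not JupyterHub's actual `xsrf_cookie_kwargs` code: iterating a dict without `.items()` yields keys, and unpacking a multi-character key into two names raises exactly this `ValueError`.

```python
cookie_kwargs = {"httponly": True, "secure": True, "path": "/hub/"}

try:
    for key, value in cookie_kwargs:  # bug: should be cookie_kwargs.items()
        print(key, value)
except ValueError as e:
    print(e)  # too many values to unpack (expected 2)
```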
Erik Sundell
437a9d150f Backport PR #3849: The word used is duplicated in upgrade.md
This PR updates the doc to remove the duplicated word `used`.

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:19 +02:00
Erik Sundell
c9616d6f11 Backport PR #3843: Some typos in docs
- fix some references to the old 'all' name, which was renamed to 'inherit'
- fix a heading level in the changelog that sphinx warns about

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:18 +02:00
Min RK
61aed70c4d Backport PR #3841: adopt pytest-asyncio asyncio_mode='auto'
removes need for our own implementation of the same behavior in conftest

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:17 +02:00
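For illustration (a made-up test, not one from the JupyterHub suite): with `asyncio_mode = "auto"` set in the pytest configuration, pytest-asyncio collects and awaits plain `async def` tests without requiring an explicit `@pytest.mark.asyncio` decorator or a project-specific conftest shim.

```python
import asyncio


async def test_sleep_returns_none():
    # No decorator needed when asyncio_mode is "auto".
    assert await asyncio.sleep(0) is None
```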
Erik Sundell
9abb573d47 Backport PR #3839: Document version mismatch log message
Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:16 +02:00
Erik Sundell
b074304834 Backport PR #3835: remove lingering reference to distutils
traitlets, like most Jupyter projects (and Python itself), has a `.version_info` tuple to avoid needing to parse versions

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:15 +02:00
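The pattern referred to above, as a small sketch: compare the version tuple directly instead of parsing version strings with the deprecated distutils machinery.

```python
import traitlets

# traitlets exposes a version_info tuple, so no string parsing is needed:
if traitlets.version_info >= (5, 0):
    print("traitlets 5 or newer:", traitlets.__version__)
else:
    print("older traitlets:", traitlets.__version__)
```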
Min RK
201e7ca3d8 Backport PR #3834: Admin Dashboard - Collapsible Details View
I made this PR to see if this feature would be useful for other people. Right now, you can't see all of a user's or server's details in the admin page, so I added a collapsible view that lets you see the entire server and user objects. I'm open to ideas about how the information is displayed. Will add more tests if this feature is accepted.

![improved-collapse](https://user-images.githubusercontent.com/737367/158468531-1efc28e6-a229-4383-b5f9-b301898d929f.gif)

Signed-off-by: Min RK <benjaminrk@gmail.com>
2022-05-05 13:15:14 +02:00
145 changed files with 4225 additions and 5977 deletions

View File

@@ -1,15 +0,0 @@
# dependabot.yml reference: https://docs.github.com/en/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file
#
# Notes:
# - Status and logs from dependabot are provided at
# https://github.com/jupyterhub/jupyterhub/network/updates.
#
version: 2
updates:
# Maintain dependencies in our GitHub Workflows
- package-ecosystem: github-actions
directory: "/"
schedule:
interval: weekly
time: "05:00"
timezone: "Etc/UTC"

View File

@@ -32,18 +32,17 @@ jobs:
build-release: build-release:
runs-on: ubuntu-20.04 runs-on: ubuntu-20.04
steps: steps:
- uses: actions/checkout@v3 - uses: actions/checkout@v2
- uses: actions/setup-python@v4 - uses: actions/setup-python@v2
with: with:
python-version: "3.9" python-version: 3.8
- uses: actions/setup-node@v3 - uses: actions/setup-node@v1
with: with:
node-version: "14" node-version: "14"
- name: install build requirements - name: install build package
run: | run: |
npm install -g yarn
pip install --upgrade pip pip install --upgrade pip
pip install build pip install build
pip freeze pip freeze
@@ -53,17 +52,28 @@ jobs:
python -m build --sdist --wheel . python -m build --sdist --wheel .
ls -l dist ls -l dist
- name: verify sdist - name: verify wheel
run: | run: |
./ci/check_sdist.py dist/jupyterhub-*.tar.gz cd dist
pip install ./*.whl
- name: verify data-files are installed where they are found # verify data-files are installed where they are found
run: | cat <<EOF | python
pip install dist/*.whl import os
./ci/check_installed_data.py from jupyterhub._data import DATA_FILES_PATH
print(f"DATA_FILES_PATH={DATA_FILES_PATH}")
assert os.path.exists(DATA_FILES_PATH), DATA_FILES_PATH
for subpath in (
"templates/page.html",
"static/css/style.min.css",
"static/components/jquery/dist/jquery.js",
):
path = os.path.join(DATA_FILES_PATH, subpath)
assert os.path.exists(path), path
print("OK")
EOF
# ref: https://github.com/actions/upload-artifact#readme # ref: https://github.com/actions/upload-artifact#readme
- uses: actions/upload-artifact@v3 - uses: actions/upload-artifact@v2
with: with:
name: jupyterhub-${{ github.sha }} name: jupyterhub-${{ github.sha }}
path: "dist/*" path: "dist/*"
@@ -98,16 +108,16 @@ jobs:
echo "REGISTRY=localhost:5000/" >> $GITHUB_ENV echo "REGISTRY=localhost:5000/" >> $GITHUB_ENV
fi fi
- uses: actions/checkout@v3 - uses: actions/checkout@v2
# Setup docker to build for multiple platforms, see: # Setup docker to build for multiple platforms, see:
# https://github.com/docker/build-push-action/tree/v2.4.0#usage # https://github.com/docker/build-push-action/tree/v2.4.0#usage
# https://github.com/docker/build-push-action/blob/v2.4.0/docs/advanced/multi-platform.md # https://github.com/docker/build-push-action/blob/v2.4.0/docs/advanced/multi-platform.md
- name: Set up QEMU (for docker buildx) - name: Set up QEMU (for docker buildx)
uses: docker/setup-qemu-action@8b122486cedac8393e77aa9734c3528886e4a1a8 # associated tag: v1.0.2 uses: docker/setup-qemu-action@25f0500ff22e406f7191a2a8ba8cda16901ca018 # associated tag: v1.0.2
- name: Set up Docker Buildx (for multi-arch builds) - name: Set up Docker Buildx (for multi-arch builds)
uses: docker/setup-buildx-action@dc7b9719a96d48369863986a06765841d7ea23f6 # associated tag: v1.1.2 uses: docker/setup-buildx-action@2a4b53665e15ce7d7049afb11ff1f70ff1610609 # associated tag: v1.1.2
with: with:
# Allows pushing to registry on localhost:5000 # Allows pushing to registry on localhost:5000
driver-opts: network=host driver-opts: network=host
@@ -145,7 +155,7 @@ jobs:
branchRegex: ^\w[\w-.]*$ branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub - name: Build and push jupyterhub
uses: docker/build-push-action@1cb9d22b932e4832bb29793b7777ec860fc1cde0 uses: docker/build-push-action@e1b7f96249f2e4c8e4ac1519b9608c0d48944a1f
with: with:
context: . context: .
platforms: linux/amd64,linux/arm64 platforms: linux/amd64,linux/arm64
@@ -166,7 +176,7 @@ jobs:
branchRegex: ^\w[\w-.]*$ branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub-onbuild - name: Build and push jupyterhub-onbuild
uses: docker/build-push-action@1cb9d22b932e4832bb29793b7777ec860fc1cde0 uses: docker/build-push-action@e1b7f96249f2e4c8e4ac1519b9608c0d48944a1f
with: with:
build-args: | build-args: |
BASE_IMAGE=${{ fromJson(steps.jupyterhubtags.outputs.tags)[0] }} BASE_IMAGE=${{ fromJson(steps.jupyterhubtags.outputs.tags)[0] }}
@@ -187,7 +197,7 @@ jobs:
branchRegex: ^\w[\w-.]*$ branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub-demo - name: Build and push jupyterhub-demo
uses: docker/build-push-action@1cb9d22b932e4832bb29793b7777ec860fc1cde0 uses: docker/build-push-action@e1b7f96249f2e4c8e4ac1519b9608c0d48944a1f
with: with:
build-args: | build-args: |
BASE_IMAGE=${{ fromJson(steps.onbuildtags.outputs.tags)[0] }} BASE_IMAGE=${{ fromJson(steps.onbuildtags.outputs.tags)[0] }}
@@ -211,7 +221,7 @@ jobs:
branchRegex: ^\w[\w-.]*$ branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub/singleuser - name: Build and push jupyterhub/singleuser
uses: docker/build-push-action@1cb9d22b932e4832bb29793b7777ec860fc1cde0 uses: docker/build-push-action@e1b7f96249f2e4c8e4ac1519b9608c0d48944a1f
with: with:
build-args: | build-args: |
JUPYTERHUB_VERSION=${{ github.ref_type == 'tag' && github.ref_name || format('git:{0}', github.sha) }} JUPYTERHUB_VERSION=${{ github.ref_type == 'tag' && github.ref_name || format('git:{0}', github.sha) }}

View File

@@ -15,13 +15,15 @@ on:
- "docs/**" - "docs/**"
- "jupyterhub/_version.py" - "jupyterhub/_version.py"
- "jupyterhub/scopes.py" - "jupyterhub/scopes.py"
- ".github/workflows/test-docs.yml" - ".github/workflows/*"
- "!.github/workflows/test-docs.yml"
push: push:
paths: paths:
- "docs/**" - "docs/**"
- "jupyterhub/_version.py" - "jupyterhub/_version.py"
- "jupyterhub/scopes.py" - "jupyterhub/scopes.py"
- ".github/workflows/test-docs.yml" - ".github/workflows/*"
- "!.github/workflows/test-docs.yml"
branches-ignore: branches-ignore:
- "dependabot/**" - "dependabot/**"
- "pre-commit-ci-update-config" - "pre-commit-ci-update-config"
@@ -38,18 +40,18 @@ jobs:
validate-rest-api-definition: validate-rest-api-definition:
runs-on: ubuntu-20.04 runs-on: ubuntu-20.04
steps: steps:
- uses: actions/checkout@v3 - uses: actions/checkout@v2
- name: Validate REST API definition - name: Validate REST API definition
uses: char0n/swagger-editor-validate@v1.3.1 uses: char0n/swagger-editor-validate@182d1a5d26ff5c2f4f452c43bd55e2c7d8064003
with: with:
definition-file: docs/source/_static/rest-api.yml definition-file: docs/source/_static/rest-api.yml
test-docs: test-docs:
runs-on: ubuntu-20.04 runs-on: ubuntu-20.04
steps: steps:
- uses: actions/checkout@v3 - uses: actions/checkout@v2
- uses: actions/setup-python@v4 - uses: actions/setup-python@v2
with: with:
python-version: "3.9" python-version: "3.9"

View File

@@ -19,9 +19,6 @@ on:
- "**" - "**"
workflow_dispatch: workflow_dispatch:
permissions:
contents: read
jobs: jobs:
# The ./jsx folder contains React based source code files that are to compile # The ./jsx folder contains React based source code files that are to compile
# to share/jupyterhub/static/js/admin-react.js. The ./jsx folder includes # to share/jupyterhub/static/js/admin-react.js. The ./jsx folder includes
@@ -32,8 +29,8 @@ jobs:
timeout-minutes: 5 timeout-minutes: 5
steps: steps:
- uses: actions/checkout@v3 - uses: actions/checkout@v2
- uses: actions/setup-node@v3 - uses: actions/setup-node@v1
with: with:
node-version: "14" node-version: "14"
@@ -50,3 +47,62 @@ jobs:
run: | run: |
cd jsx cd jsx
yarn test yarn test
# The ./jsx folder contains React based source files that are to compile to
# share/jupyterhub/static/js/admin-react.js. This job makes sure that whatever
# we have in jsx/src matches the compiled asset that we package and
# distribute.
#
# This job's purpose is to make sure we don't forget to compile changes and to
# verify nobody sneaks in a change in the hard to review compiled asset.
#
# NOTE: In the future we may want to stop version controlling the compiled
# artifact and instead generate it whenever we package JupyterHub. If we
# do this, we are required to setup node and compile the source code
# more often, at the same time we could avoid having this check be made.
#
compile-jsx-admin-react:
runs-on: ubuntu-20.04
timeout-minutes: 5
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v1
with:
node-version: "14"
- name: Install yarn
run: |
npm install -g yarn
- name: yarn
run: |
cd jsx
yarn
- name: yarn build
run: |
cd jsx
yarn build
- name: yarn place
run: |
cd jsx
yarn place
- name: Verify compiled jsx/src matches version controlled artifact
run: |
if [[ `git status --porcelain=v1` ]]; then
echo "The source code in ./jsx compiles to something different than found in ./share/jupyterhub/static/js/admin-react.js!"
echo
echo "Please re-compile the source code in ./jsx with the following commands:"
echo
echo "yarn"
echo "yarn build"
echo "yarn place"
echo
echo "See ./jsx/README.md for more details."
exit 1
else
echo "Compilation of jsx/src to share/jupyterhub/static/js/admin-react.js didn't lead to changes."
fi

View File

@@ -30,9 +30,6 @@ env:
LANG: C.UTF-8 LANG: C.UTF-8
PYTEST_ADDOPTS: "--verbose --color=yes" PYTEST_ADDOPTS: "--verbose --color=yes"
permissions:
contents: read
jobs: jobs:
# Run "pytest jupyterhub/tests" in various configurations # Run "pytest jupyterhub/tests" in various configurations
pytest: pytest:
@@ -56,9 +53,9 @@ jobs:
# Tests everything when JupyterHub works against a dedicated mysql or # Tests everything when JupyterHub works against a dedicated mysql or
# postgresql server. # postgresql server.
# #
# legacy_notebook: # nbclassic:
# Tests everything when the user instances are started with # Tests everything when the user instances are started with
# the legacy notebook server instead of jupyter_server. # notebook instead of jupyter_server.
# #
# ssl: # ssl:
# Tests everything using internal SSL connections instead of # Tests everything using internal SSL connections instead of
@@ -72,24 +69,20 @@ jobs:
# GitHub UI when the workflow run, we avoid using true/false as # GitHub UI when the workflow run, we avoid using true/false as
# values by instead duplicating the name to signal true. # values by instead duplicating the name to signal true.
include: include:
- python: "3.7" - python: "3.6"
oldest_dependencies: oldest_dependencies oldest_dependencies: oldest_dependencies
legacy_notebook: legacy_notebook nbclassic: nbclassic
- python: "3.8" - python: "3.6"
legacy_notebook: legacy_notebook
- python: "3.9"
db: mysql
- python: "3.10"
db: postgres
- python: "3.10"
subdomain: subdomain subdomain: subdomain
- python: "3.10" - python: "3.7"
db: mysql
- python: "3.7"
ssl: ssl ssl: ssl
# can't test 3.11.0-beta.4 until a greenlet release - python: "3.8"
# greenlet is a dependency of sqlalchemy on linux db: postgres
# see https://github.com/gevent/gevent/issues/1867 - python: "3.8"
# - python: "3.11.0-beta.4" nbclassic: nbclassic
- python: "3.10" - python: "3.9"
main_dependencies: main_dependencies main_dependencies: main_dependencies
steps: steps:
@@ -117,25 +110,25 @@ jobs:
if [ "${{ matrix.jupyter_server }}" != "" ]; then if [ "${{ matrix.jupyter_server }}" != "" ]; then
echo "JUPYTERHUB_SINGLEUSER_APP=jupyterhub.tests.mockserverapp.MockServerApp" >> $GITHUB_ENV echo "JUPYTERHUB_SINGLEUSER_APP=jupyterhub.tests.mockserverapp.MockServerApp" >> $GITHUB_ENV
fi fi
- uses: actions/checkout@v3 - uses: actions/checkout@v2
# NOTE: actions/setup-node@v3 make use of a cache within the GitHub base # NOTE: actions/setup-node@v1 make use of a cache within the GitHub base
# environment and setup in a fraction of a second. # environment and setup in a fraction of a second.
- name: Install Node v14 - name: Install Node v14
uses: actions/setup-node@v3 uses: actions/setup-node@v1
with: with:
node-version: "14" node-version: "14"
- name: Install Javascript dependencies - name: Install Node dependencies
run: | run: |
npm install npm install
npm install -g configurable-http-proxy yarn npm install -g configurable-http-proxy
npm list npm list
# NOTE: actions/setup-python@v4 make use of a cache within the GitHub base # NOTE: actions/setup-python@v2 make use of a cache within the GitHub base
# environment and setup in a fraction of a second. # environment and setup in a fraction of a second.
- name: Install Python ${{ matrix.python }} - name: Install Python ${{ matrix.python }}
uses: actions/setup-python@v4 uses: actions/setup-python@v2
with: with:
python-version: "${{ matrix.python }}" python-version: ${{ matrix.python }}
- name: Install Python dependencies - name: Install Python dependencies
run: | run: |
pip install --upgrade pip pip install --upgrade pip
@@ -152,9 +145,9 @@ jobs:
if [ "${{ matrix.main_dependencies }}" != "" ]; then if [ "${{ matrix.main_dependencies }}" != "" ]; then
pip install git+https://github.com/ipython/traitlets#egg=traitlets --force pip install git+https://github.com/ipython/traitlets#egg=traitlets --force
fi fi
if [ "${{ matrix.legacy_notebook }}" != "" ]; then if [ "${{ matrix.nbclassic }}" != "" ]; then
pip uninstall jupyter_server --yes pip uninstall jupyter_server --yes
pip install 'notebook<7' pip install notebook
fi fi
if [ "${{ matrix.db }}" == "mysql" ]; then if [ "${{ matrix.db }}" == "mysql" ]; then
pip install mysql-connector-python pip install mysql-connector-python
@@ -218,7 +211,7 @@ jobs:
timeout-minutes: 20 timeout-minutes: 20
steps: steps:
- uses: actions/checkout@v3 - uses: actions/checkout@v2
- name: build images - name: build images
run: | run: |

.gitignore (vendored), 2 lines changed
View File

@@ -10,7 +10,6 @@ docs/build
docs/source/_static/rest-api docs/source/_static/rest-api
docs/source/rbac/scope-table.md docs/source/rbac/scope-table.md
.ipynb_checkpoints .ipynb_checkpoints
jsx/build/
# ignore config file at the top-level of the repo # ignore config file at the top-level of the repo
# but not sub-dirs # but not sub-dirs
/jupyterhub_config.py /jupyterhub_config.py
@@ -20,7 +19,6 @@ package-lock.json
share/jupyterhub/static/components share/jupyterhub/static/components
share/jupyterhub/static/css/style.min.css share/jupyterhub/static/css/style.min.css
share/jupyterhub/static/css/style.min.css.map share/jupyterhub/static/css/style.min.css.map
share/jupyterhub/static/js/admin-react.js*
*.egg-info *.egg-info
MANIFEST MANIFEST
.coverage .coverage

View File

@@ -11,33 +11,33 @@
repos: repos:
# Autoformat: Python code, syntax patterns are modernized # Autoformat: Python code, syntax patterns are modernized
- repo: https://github.com/asottile/pyupgrade - repo: https://github.com/asottile/pyupgrade
rev: v2.37.3 rev: v2.32.1
hooks: hooks:
- id: pyupgrade - id: pyupgrade
args: args:
- --py36-plus - --py36-plus
# Autoformat: Python code # Autoformat: Python code
- repo: https://github.com/pycqa/isort - repo: https://github.com/asottile/reorder_python_imports
rev: 5.10.1 rev: v3.1.0
hooks: hooks:
- id: isort - id: reorder-python-imports
# Autoformat: Python code # Autoformat: Python code
- repo: https://github.com/psf/black - repo: https://github.com/psf/black
rev: 22.6.0 rev: 22.3.0
hooks: hooks:
- id: black - id: black
# Autoformat: markdown, yaml, javascript (see the file .prettierignore) # Autoformat: markdown, yaml, javascript (see the file .prettierignore)
- repo: https://github.com/pre-commit/mirrors-prettier - repo: https://github.com/pre-commit/mirrors-prettier
rev: v2.7.1 rev: v2.6.2
hooks: hooks:
- id: prettier - id: prettier
# Autoformat and linting, misc. details # Autoformat and linting, misc. details
- repo: https://github.com/pre-commit/pre-commit-hooks - repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0 rev: v4.2.0
hooks: hooks:
- id: end-of-file-fixer - id: end-of-file-fixer
exclude: share/jupyterhub/static/js/admin-react.js exclude: share/jupyterhub/static/js/admin-react.js
@@ -47,6 +47,6 @@ repos:
# Linting: Python code (see the file .flake8) # Linting: Python code (see the file .flake8)
- repo: https://github.com/PyCQA/flake8 - repo: https://github.com/PyCQA/flake8
rev: "5.0.2" rev: "4.0.1"
hooks: hooks:
- id: flake8 - id: flake8

View File

@@ -6,9 +6,134 @@ you can follow the [Jupyter contributor guide](https://jupyter.readthedocs.io/en
Make sure to also follow [Project Jupyter's Code of Conduct](https://github.com/jupyter/governance/blob/HEAD/conduct/code_of_conduct.md) Make sure to also follow [Project Jupyter's Code of Conduct](https://github.com/jupyter/governance/blob/HEAD/conduct/code_of_conduct.md)
for a friendly and welcoming collaborative environment. for a friendly and welcoming collaborative environment.
Please see our documentation on ## Setting up a development environment
- [Setting up a development install](https://jupyterhub.readthedocs.io/en/latest/contributing/setup.html) <!--
- [Testing JupyterHub and linting code](https://jupyterhub.readthedocs.io/en/latest/contributing/tests.html) https://jupyterhub.readthedocs.io/en/stable/contributing/setup.html
contains a lot of the same information. Should we merge the docs and
just have this page link to that one?
-->
If you need some help, feel free to ask on [Gitter](https://gitter.im/jupyterhub/jupyterhub) or [Discourse](https://discourse.jupyter.org/). JupyterHub requires Python >= 3.5 and nodejs.
As a Python project, a development install of JupyterHub follows standard practices for the basics (steps 1-2).
1. clone the repo
```bash
git clone https://github.com/jupyterhub/jupyterhub
```
2. do a development install with pip
```bash
cd jupyterhub
python3 -m pip install --editable .
```
3. install the development requirements,
which include things like testing tools
```bash
python3 -m pip install -r dev-requirements.txt
```
4. install configurable-http-proxy with npm:
```bash
npm install -g configurable-http-proxy
```
5. set up pre-commit hooks for automatic code formatting, etc.
```bash
pre-commit install
```
You can also invoke the pre-commit hook manually at any time with
```bash
pre-commit run
```
## Contributing
JupyterHub has adopted automatic code formatting so you shouldn't
need to worry too much about your code style.
As long as your code is valid,
the pre-commit hook should take care of how it should look.
You can invoke the pre-commit hook by hand at any time with:
```bash
pre-commit run
```
which should run any autoformatting on your code
and tell you about any errors it couldn't fix automatically.
You may also install [black integration](https://github.com/psf/black#editor-integration)
into your text editor to format code automatically.
If you have already committed files before setting up the pre-commit
hook with `pre-commit install`, you can fix everything up using
`pre-commit run --all-files`. You need to make the fixing commit
yourself after that.
## Testing
It's a good idea to write tests to exercise any new features,
or that trigger any bugs that you have fixed to catch regressions.
You can run the tests with:
```bash
pytest -v
```
in the repo directory. If you want to just run certain tests,
check out the [pytest docs](https://pytest.readthedocs.io/en/latest/usage.html)
for how pytest can be called.
For instance, to test only spawner-related things in the REST API:
```bash
pytest -v -k spawn jupyterhub/tests/test_api.py
```
The tests live in `jupyterhub/tests` and are organized roughly into:
1. `test_api.py` tests the REST API
2. `test_pages.py` tests loading the HTML pages
and other collections of tests for different components.
When writing a new test, there should usually be a test of
similar functionality already written and related tests should
be added nearby.
The fixtures live in `jupyterhub/tests/conftest.py`. There are
fixtures that can be used for JupyterHub components, such as:
- `app`: an instance of JupyterHub with mocked parts
- `auth_state_enabled`: enables persisting auth_state (like authentication tokens)
- `db`: a sqlite in-memory DB session
- `io_loop`: a Tornado event loop
- `event_loop`: a new asyncio event loop
- `user`: creates a new temporary user
- `admin_user`: creates a new temporary admin user
- single user servers
- `cleanup_after`: allows cleanup of single user servers between tests
- mocked service
- `MockServiceSpawner`: a spawner that mocks services for testing with a short poll interval
- `mockservice`: mocked service with no external service url
- `mockservice_url`: mocked service with a url to test external services
And fixtures to add functionality or spawning behavior:
- `admin_access`: grants admin access
- `no_patience`: sets slow-spawning timeouts to zero
- `slow_spawn`: enables the SlowSpawner (a spawner that takes a few seconds to start)
- `never_spawn`: enables the NeverSpawner (a spawner that will never start)
- `bad_spawn`: enables the BadSpawner (a spawner that fails immediately)
- `slow_bad_spawn`: enables the SlowBadSpawner (a spawner that fails after a short delay)
To read more about fixtures check out the
[pytest docs](https://docs.pytest.org/en/latest/fixture.html)
for how to use the existing fixtures, and how to create new ones.
When in doubt, feel free to [ask](https://gitter.im/jupyterhub/jupyterhub).

View File

@@ -21,7 +21,7 @@
# your jupyterhub_config.py will be added automatically # your jupyterhub_config.py will be added automatically
# from your docker directory. # from your docker directory.
ARG BASE_IMAGE=ubuntu:22.04 ARG BASE_IMAGE=ubuntu:focal-20200729
FROM $BASE_IMAGE AS builder FROM $BASE_IMAGE AS builder
USER root USER root
@@ -37,7 +37,6 @@ RUN apt-get update \
python3-pycurl \ python3-pycurl \
nodejs \ nodejs \
npm \ npm \
yarn \
&& apt-get clean \ && apt-get clean \
&& rm -rf /var/lib/apt/lists/* && rm -rf /var/lib/apt/lists/*

View File

@@ -8,7 +8,6 @@ include *requirements.txt
include Dockerfile include Dockerfile
graft onbuild graft onbuild
graft jsx
graft jupyterhub graft jupyterhub
graft scripts graft scripts
graft share graft share
@@ -19,10 +18,6 @@ graft ci
graft docs graft docs
prune docs/node_modules prune docs/node_modules
# Intermediate javascript files
prune jsx/node_modules
prune jsx/build
# prune some large unused files from components # prune some large unused files from components
prune share/jupyterhub/static/components/bootstrap/dist/css prune share/jupyterhub/static/components/bootstrap/dist/css
exclude share/jupyterhub/static/components/bootstrap/dist/fonts/*.svg exclude share/jupyterhub/static/components/bootstrap/dist/fonts/*.svg

View File

@@ -1,20 +0,0 @@
#!/usr/bin/env python
# Check that installed package contains everything we expect
import os
from jupyterhub._data import DATA_FILES_PATH
print("Checking jupyterhub._data")
print(f"DATA_FILES_PATH={DATA_FILES_PATH}")
assert os.path.exists(DATA_FILES_PATH), DATA_FILES_PATH
for subpath in (
"templates/page.html",
"static/css/style.min.css",
"static/components/jquery/dist/jquery.js",
"static/js/admin-react.js",
):
path = os.path.join(DATA_FILES_PATH, subpath)
assert os.path.exists(path), path
print("OK")

View File

@@ -1,28 +0,0 @@
#!/usr/bin/env python
# Check that sdist contains everything we expect
import sys
import tarfile
from tarfile import TarFile
expected_files = [
"docs/requirements.txt",
"jsx/package.json",
"package.json",
"README.md",
]
assert len(sys.argv) == 2, "Expected one file"
print(f"Checking {sys.argv[1]}")
tar = tarfile.open(name=sys.argv[1], mode="r:gz")
try:
# Remove leading jupyterhub-VERSION/
filelist = {f.partition('/')[2] for f in tar.getnames()}
finally:
tar.close()
for e in expected_files:
assert e in filelist, f"{e} not found"
print("OK")

View File

@@ -20,7 +20,7 @@ fi
# Configure a set of databases in the database server for upgrade tests # Configure a set of databases in the database server for upgrade tests
set -x set -x
for SUFFIX in '' _upgrade_100 _upgrade_122 _upgrade_130 _upgrade_150 _upgrade_211; do for SUFFIX in '' _upgrade_100 _upgrade_122 _upgrade_130; do
$SQL_CLIENT "DROP DATABASE jupyterhub${SUFFIX};" 2>/dev/null || true $SQL_CLIENT "DROP DATABASE jupyterhub${SUFFIX};" 2>/dev/null || true
$SQL_CLIENT "CREATE DATABASE jupyterhub${SUFFIX} ${EXTRA_CREATE_DATABASE_ARGS:-};" $SQL_CLIENT "CREATE DATABASE jupyterhub${SUFFIX} ${EXTRA_CREATE_DATABASE_ARGS:-};"
done done

View File

@@ -9,13 +9,10 @@ cryptography
html5lib # needed for beautifulsoup html5lib # needed for beautifulsoup
jupyterlab >=3 jupyterlab >=3
mock mock
# nbclassic provides the '/tree/' handler, which we use in tests
# it is a transitive dependency via jupyterlab,
# but depend on it directly
nbclassic
pre-commit pre-commit
pytest>=3.3 pytest>=3.3
pytest-asyncio>=0.17 pytest-asyncio; python_version < "3.7"
pytest-asyncio>=0.17; python_version >= "3.7"
pytest-cov pytest-cov
requests-mock requests-mock
tbump tbump

View File

@@ -6,7 +6,7 @@ info:
description: The REST API for JupyterHub description: The REST API for JupyterHub
license: license:
name: BSD-3-Clause name: BSD-3-Clause
version: 3.0.0b1 version: 2.3.0
servers: servers:
- url: /hub/api - url: /hub/api
security: security:
@@ -139,16 +139,6 @@ paths:
If unspecified, use api_page_default_limit. If unspecified, use api_page_default_limit.
schema: schema:
type: number type: number
- name: include_stopped_servers
in: query
description: |
Include stopped servers in user model(s).
Added in JupyterHub 3.0.
Allows retrieval of information about stopped servers,
such as activity and state fields.
schema:
type: boolean
allowEmptyValue: true
responses: responses:
200: 200:
description: The Hub's user list description: The Hub's user list
@@ -570,19 +560,7 @@ paths:
description: A note attached to the token for future bookkeeping description: A note attached to the token for future bookkeeping
roles: roles:
type: array type: array
description: | description: A list of role names that the token should have
A list of role names from which to derive scopes.
This is a shortcut for assigning collections of scopes;
Tokens do not retain role assignment.
(Changed in 3.0: roles are immediately resolved to scopes
instead of stored on roles.)
items:
type: string
scopes:
type: array
description: |
A list of scopes that the token should have.
(new in JupyterHub 3.0).
items: items:
type: string type: string
required: false required: false
@@ -1170,11 +1148,7 @@ components:
format: date-time format: date-time
servers: servers:
type: array type: array
description: | description: The active servers for this user.
The servers for this user.
By default: only includes _active_ servers.
Changed in 3.0: if `?include_stopped_servers` parameter is specified,
stopped servers will be included as well.
items: items:
$ref: "#/components/schemas/Server" $ref: "#/components/schemas/Server"
auth_state: auth_state:
@@ -1196,15 +1170,6 @@ components:
description: | description: |
Whether the server is ready for traffic. Whether the server is ready for traffic.
Will always be false when any transition is pending. Will always be false when any transition is pending.
stopped:
type: boolean
description: |
Whether the server is stopped. Added in JupyterHub 3.0,
and only useful when using the `?include_stopped_servers`
request parameter.
Now that stopped servers may be included (since JupyterHub 3.0),
this is the simplest way to select stopped servers.
Always equivalent to `not (ready or pending)`.
pending: pending:
type: string type: string
description: | description: |
@@ -1349,16 +1314,7 @@ components:
description: The service that owns the token (undefined of owned by a user) description: The service that owns the token (undefined of owned by a user)
roles: roles:
type: array type: array
description: description: The names of roles this token has
Deprecated in JupyterHub 3, always an empty list. Tokens have
'scopes' starting from JupyterHub 3.
items:
type: string
scopes:
type: array
description:
List of scopes this token has been assigned. New in JupyterHub
3. In JupyterHub 2.x, tokens were assigned 'roles' insead of scopes.
items: items:
type: string type: string
note: note:
@@ -1414,9 +1370,6 @@ components:
inherit: inherit:
Everything that the token-owning entity can access _(metascope Everything that the token-owning entity can access _(metascope
for tokens)_ for tokens)_
admin-ui:
Access the admin page. Permission to take actions via the admin
page granted separately.
admin:users: admin:users:
Read, write, create and delete users and their authentication Read, write, create and delete users and their authentication
state, not including their servers or tokens. state, not including their servers or tokens.

File diff suppressed because one or more lines are too long

View File

@@ -48,7 +48,7 @@ version = '%i.%i' % jupyterhub.version_info[:2]
# The full version, including alpha/beta/rc tags. # The full version, including alpha/beta/rc tags.
release = jupyterhub.__version__ release = jupyterhub.__version__
language = "en" language = None
exclude_patterns = [] exclude_patterns = []
pygments_style = 'sphinx' pygments_style = 'sphinx'
todo_include_todos = False todo_include_todos = False
@@ -56,14 +56,12 @@ todo_include_todos = False
# Set the default role so we can use `foo` instead of ``foo`` # Set the default role so we can use `foo` instead of ``foo``
default_role = 'literal' default_role = 'literal'
from contextlib import redirect_stdout
from io import StringIO
from docutils import nodes
from sphinx.directives.other import SphinxDirective
# -- Config ------------------------------------------------------------- # -- Config -------------------------------------------------------------
from jupyterhub.app import JupyterHub from jupyterhub.app import JupyterHub
from docutils import nodes
from sphinx.directives.other import SphinxDirective
from contextlib import redirect_stdout
from io import StringIO
# create a temp instance of JupyterHub just to get the output of the generate-config # create a temp instance of JupyterHub just to get the output of the generate-config
# and help --all commands. # and help --all commands.

View File

@@ -16,7 +16,7 @@ Install Python
-------------- --------------
JupyterHub is written in the `Python <https://python.org>`_ programming language, and JupyterHub is written in the `Python <https://python.org>`_ programming language, and
requires you have at least version 3.6 installed locally. If you haven't requires you have at least version 3.5 installed locally. If you haven't
installed Python before, the recommended way to install it is to use installed Python before, the recommended way to install it is to use
`miniconda <https://conda.io/miniconda.html>`_. Remember to get the Python 3 version, `miniconda <https://conda.io/miniconda.html>`_. Remember to get the Python 3 version,
and **not** the Python 2 version! and **not** the Python 2 version!
@@ -24,10 +24,11 @@ and **not** the Python 2 version!
Install nodejs Install nodejs
-------------- --------------
`NodeJS 12+ <https://nodejs.org/en/>`_ is required for building some JavaScript components. ``configurable-http-proxy``, the default proxy implementation for
``configurable-http-proxy``, the default proxy implementation for JupyterHub, is written in Javascript. JupyterHub, is written in Javascript to run on `NodeJS
If you have not installed nodejs before, we recommend installing it in the ``miniconda`` environment you set up for Python. <https://nodejs.org/en/>`_. If you have not installed nodejs before, we
You can do so with ``conda install nodejs``. recommend installing it in the ``miniconda`` environment you set up for
Python. You can do so with ``conda install nodejs``.
Install git Install git
----------- -----------
@@ -45,7 +46,7 @@ their effects quickly. You need to do a developer install to make that
happen. happen.
.. note:: This guide does not attempt to dictate *how* development .. note:: This guide does not attempt to dictate *how* development
environments should be isolated since that is a personal preference and can environements should be isolated since that is a personal preference and can
be achieved in many ways, for example `tox`, `conda`, `docker`, etc. See this be achieved in many ways, for example `tox`, `conda`, `docker`, etc. See this
`forum thread <https://discourse.jupyter.org/t/thoughts-on-using-tox/3497>`_ for `forum thread <https://discourse.jupyter.org/t/thoughts-on-using-tox/3497>`_ for
a more detailed discussion. a more detailed discussion.
@@ -65,7 +66,7 @@ happen.
python -V python -V
This should return a version number greater than or equal to 3.6. This should return a version number greater than or equal to 3.5.
.. code:: bash .. code:: bash
@@ -73,11 +74,12 @@ happen.
This should return a version number greater than or equal to 5.0. This should return a version number greater than or equal to 5.0.
3. Install ``configurable-http-proxy`` (required to run and test the default JupyterHub configuration) and ``yarn`` (required to build some components): 3. Install ``configurable-http-proxy``. This is required to run
JupyterHub.
.. code:: bash .. code:: bash
npm install -g configurable-http-proxy yarn npm install -g configurable-http-proxy
If you get an error that says ``Error: EACCES: permission denied``, If you get an error that says ``Error: EACCES: permission denied``,
you might need to prefix the command with ``sudo``. If you do not you might need to prefix the command with ``sudo``. If you do not
@@ -85,17 +87,11 @@ happen.
.. code:: bash .. code:: bash
npm install configurable-http-proxy yarn npm install configurable-http-proxy
export PATH=$PATH:$(pwd)/node_modules/.bin export PATH=$PATH:$(pwd)/node_modules/.bin
The second line needs to be run every time you open a new terminal. The second line needs to be run every time you open a new terminal.
If you are using conda you can instead run:
.. code:: bash
conda install configurable-http-proxy yarn
4. Install the python packages required for JupyterHub development. 4. Install the python packages required for JupyterHub development.
.. code:: bash .. code:: bash
@@ -190,4 +186,3 @@ development updates, with:
python3 setup.py js # fetch updated client-side js python3 setup.py js # fetch updated client-side js
python3 setup.py css # recompile CSS from LESS sources python3 setup.py css # recompile CSS from LESS sources
python3 setup.py jsx # build React admin app

View File

@@ -1,8 +1,8 @@
.. _contributing/tests: .. _contributing/tests:
=================================== ==================
Testing JupyterHub and linting code Testing JupyterHub
=================================== ==================
Unit test help validate that JupyterHub works the way we think it does, Unit test help validate that JupyterHub works the way we think it does,
and continues to do so when changes occur. They also help communicate and continues to do so when changes occur. They also help communicate
@@ -57,50 +57,6 @@ Running the tests
pytest -v jupyterhub/tests/test_api.py::test_shutdown pytest -v jupyterhub/tests/test_api.py::test_shutdown
See the `pytest usage documentation <https://pytest.readthedocs.io/en/latest/usage.html>`_ for more details.
Test organisation
=================
The tests live in ``jupyterhub/tests`` and are organized roughly into:
#. ``test_api.py`` tests the REST API
#. ``test_pages.py`` tests loading the HTML pages
and other collections of tests for different components.
When writing a new test, there should usually be a test of
similar functionality already written and related tests should
be added nearby.
The fixtures live in ``jupyterhub/tests/conftest.py``. There are
fixtures that can be used for JupyterHub components, such as:
- ``app``: an instance of JupyterHub with mocked parts
- ``auth_state_enabled``: enables persisting auth_state (like authentication tokens)
- ``db``: a sqlite in-memory DB session
- ``io_loop```: a Tornado event loop
- ``event_loop``: a new asyncio event loop
- ``user``: creates a new temporary user
- ``admin_user``: creates a new temporary admin user
- single user servers
- ``cleanup_after``: allows cleanup of single user servers between tests
- mocked service
- ``MockServiceSpawner``: a spawner that mocks services for testing with a short poll interval
- ``mockservice```: mocked service with no external service url
- ``mockservice_url``: mocked service with a url to test external services
And fixtures to add functionality or spawning behavior:
- ``admin_access``: grants admin access
- ``no_patience```: sets slow-spawning timeouts to zero
- ``slow_spawn``: enables the SlowSpawner (a spawner that takes a few seconds to start)
- ``never_spawn``: enables the NeverSpawner (a spawner that will never start)
- ``bad_spawn``: enables the BadSpawner (a spawner that fails immediately)
- ``slow_bad_spawn``: enables the SlowBadSpawner (a spawner that fails after a short delay)
See the `pytest fixtures documentation <https://pytest.readthedocs.io/en/latest/fixture.html>`_
for how to use the existing fixtures, and how to create new ones.
Troubleshooting Test Failures Troubleshooting Test Failures
============================= =============================
@@ -110,27 +66,3 @@ All the tests are failing
Make sure you have completed all the steps in :ref:`contributing/setup` successfully, and Make sure you have completed all the steps in :ref:`contributing/setup` successfully, and
can launch ``jupyterhub`` from the terminal. can launch ``jupyterhub`` from the terminal.
Code formatting and linting
===========================
JupyterHub has adopted automatic code formatting and linting.
As long as your code is valid, the pre-commit hook should take care of how it should look.
You can invoke the pre-commit hook by hand at any time with:
.. code:: bash
pre-commit run
which should run any autoformatting on your code and tell you about any errors it couldn't fix automatically.
You may also install `black integration <https://github.com/psf/black#editor-integration>`_
into your text editor to format code automatically.
If you have already committed files before running pre-commit you can fix everything using:
.. code:: bash
pre-commit run --all-files
And committing the changes.

View File

@@ -183,6 +183,12 @@ itself, ``jupyterhub_config.py``, as a binary string:
c.JupyterHub.cookie_secret = bytes.fromhex('64 CHAR HEX STRING') c.JupyterHub.cookie_secret = bytes.fromhex('64 CHAR HEX STRING')
.. important::
If the cookie secret value changes for the Hub, all single-user notebook
servers must also be restarted.
.. _cookies: .. _cookies:
Cookies used by JupyterHub authentication Cookies used by JupyterHub authentication

Binary image file (1.1 MiB); preview not shown.

View File

@@ -1,5 +1,3 @@
(RBAC)=
# JupyterHub RBAC # JupyterHub RBAC
Role Based Access Control (RBAC) in JupyterHub serves to provide fine grained control of access to Jupyterhub's API resources. Role Based Access Control (RBAC) in JupyterHub serves to provide fine grained control of access to Jupyterhub's API resources.

View File

@@ -27,6 +27,7 @@ Roles can be assigned to the following entities:
- Users - Users
- Services - Services
- Groups - Groups
- Tokens
An entity can have zero, one, or multiple roles, and there are no restrictions on which roles can be assigned to which entity. Roles can be added to or removed from entities at any time. An entity can have zero, one, or multiple roles, and there are no restrictions on which roles can be assigned to which entity. Roles can be added to or removed from entities at any time.
@@ -40,7 +41,7 @@ Services do not have a default role. Services without roles have no access to th
A group does not require any role, and has no roles by default. If a user is a member of a group, they automatically inherit any of the group's permissions (see {ref}`resolving-roles-scopes-target` for more details). This is useful for assigning a set of common permissions to several users. A group does not require any role, and has no roles by default. If a user is a member of a group, they automatically inherit any of the group's permissions (see {ref}`resolving-roles-scopes-target` for more details). This is useful for assigning a set of common permissions to several users.
**Tokens** \ **Tokens** \
A token's permissions are evaluated based on their owning entity. Since a token is always issued for a user or service, it can never have more permissions than its owner. If no specific scopes are requested for a new token, the token is assigned the scopes of the `token` role. A token's permissions are evaluated based on their owning entity. Since a token is always issued for a user or service, it can never have more permissions than its owner. If no specific role is requested for a new token, the token is assigned the `token` role.
(define-role-target)= (define-role-target)=

View File

@@ -72,31 +72,13 @@ Requested resources are filtered based on the filter of the corresponding scope.
In case a user resource is being accessed, any scopes with _group_ filters will be expanded to filters for each _user_ in those groups. In case a user resource is being accessed, any scopes with _group_ filters will be expanded to filters for each _user_ in those groups.
(self-referencing-filters)= ### `!user` filter
### Self-referencing filters
There are some 'shortcut' filters,
which can be applied to all scopes,
that filter based on the entities associated with the request.
The `!user` filter is a special horizontal filter that strictly refers to the **"owner only"** scopes, where _owner_ is a user entity. The filter resolves internally into `!user=<ownerusername>` ensuring that only the owner's resources may be accessed through the associated scopes. The `!user` filter is a special horizontal filter that strictly refers to the **"owner only"** scopes, where _owner_ is a user entity. The filter resolves internally into `!user=<ownerusername>` ensuring that only the owner's resources may be accessed through the associated scopes.
For example, the `server` role assigned by default to server tokens contains `access:servers!user` and `users:activity!user` scopes. This allows the token to access and post activity of only the servers owned by the token owner. For example, the `server` role assigned by default to server tokens contains `access:servers!user` and `users:activity!user` scopes. This allows the token to access and post activity of only the servers owned by the token owner.
:::{versionadded} 3.0 The filter can be applied to any scope.
`!service` and `!server` filters.
:::
In addition to `!user`, _tokens_ may have filters `!service`
or `!server`, which expand similarly to `!service=servicename`
and `!server=servername`.
This only applies to tokens issued via the OAuth flow.
In these cases, the name is the _issuing_ entity (a service or single-user server),
so that access can be restricted to the issuing service,
e.g. `access:servers!server` would grant access only to the server that requested the token.
These filters can be applied to any scope.
(vertical-filtering-target)= (vertical-filtering-target)=
@@ -132,170 +114,11 @@ There are four exceptions to the general {ref}`scope conventions <scope-conventi
``` ```
:::{versionadded} 3.0
The `admin-ui` scope is added to explicitly grant access to the admin page,
rather than combining `admin:users` and `admin:servers` permissions.
This means a deployment can enable the admin page with only a subset of functionality enabled.
Note that this means actions to take _via_ the admin UI
and access _to_ the admin UI are separated.
For example, it generally doesn't make sense to grant
`admin-ui` without at least `list:users` for at least some subset of users.
For example:
```python
c.JupyterHub.load_roles = [
{
"name": "instructor-data8",
"scopes": [
# access to the admin page
"admin-ui",
# list users in the class group
"list:users!group=students-data8",
# start/stop servers for users in the class
"admin:servers!group=students-data8",
# access servers for users in the class
"access:servers!group=students-data8",
],
"group": ["instructors-data8"],
}
]
```
will grant instructors in the data8 course permission to:
1. view the admin UI
2. see students in the class (but not all users)
3. start/stop/access servers for users in the class
4. but _not_ permission to administer the users themselves (e.g. change their permissions, etc.)
:::
```{Caution} ```{Caution}
Note that only the {ref}`horizontal filtering <horizontal-filtering-target>` can be added to scopes to customize them. \ Note that only the {ref}`horizontal filtering <horizontal-filtering-target>` can be added to scopes to customize them. \
Metascopes `self` and `all`, `<resource>`, `<resource>:<subresource>`, `read:<resource>`, `admin:<resource>`, and `access:<resource>` scopes are predefined and cannot be changed otherwise. Metascopes `self` and `all`, `<resource>`, `<resource>:<subresource>`, `read:<resource>`, `admin:<resource>`, and `access:<resource>` scopes are predefined and cannot be changed otherwise.
``` ```
(custom-scopes)=
### Custom scopes
:::{versionadded} 3.0
:::
JupyterHub 3.0 introduces support for custom scopes.
Services that use JupyterHub for authentication and want to implement their own granular access may define additional _custom_ scopes and assign them to users with JupyterHub roles.
% Note: keep in sync with pattern/description in jupyterhub/scopes.py
Custom scope names must start with `custom:`
and contain only lowercase ascii letters, numbers, hyphen, underscore, colon, and asterisk (`-_:*`).
The part after `custom:` must start with a letter or number.
Scopes may not end with a hyphen or colon.
The only strict requirement is that a custom scope definition must have a `description`.
It _may_ also have `subscopes` if you are defining multiple scopes that have a natural hierarchy,
For example:
```python
c.JupyterHub.custom_scopes = {
"custom:myservice:read": {
"description": "read-only access to myservice",
},
"custom:myservice:write": {
"description": "write access to myservice",
# write permission implies read permission
"subscopes": [
"custom:myservice:read",
],
},
}
c.JupyterHub.load_roles = [
# graders have read-only access to the service
{
"name": "service-user",
"groups": ["graders"],
"scopes": [
"custom:myservice:read",
"access:service!service=myservice",
],
},
# instructors have read and write access to the service
{
"name": "service-admin",
"groups": ["instructors"],
"scopes": [
"custom:myservice:write",
"access:service!service=myservice",
],
},
]
```
In the above configuration, two scopes are defined:
- `custom:myservice:read` grants read-only access to the service, and
- `custom:myservice:write` grants write access to the service
- write access _implies_ read access via the `subscope`
These custom scopes are assigned to two groups via `roles`:
- users in the group `graders` are granted read access to the service
- users in the group `instructors` are
- both are granted _access_ to the service via `access:service!service=myservice`
When the service completes OAuth, it will retrieve the user model from `/hub/api/user`.
This model includes a `scopes` field which is a list of authorized scope for the request,
which can be used.
```python
def require_scope(scope):
"""decorator to require a scope to perform an action"""
def wrapper(func):
@functools.wraps(func)
def wrapped_func(request):
user = fetch_hub_api_user(request.token)
if scope not in user["scopes"]:
raise HTTP403(f"Requires scope {scope}")
else:
return func()
return wrapper
@require_scope("custom:myservice:read")
async def read_something(request):
...
@require_scope("custom:myservice:write")
async def write_something(request):
...
```
If you use {class}`~.HubOAuthenticated`, this check is performed automatically
against the `.hub_scopes` attribute of each Handler
(the default is populated from `$JUPYTERHUB_OAUTH_ACCESS_SCOPES` and usually `access:services!service=myservice`).
:::{versionchanged} 3.0
The JUPYTERHUB_OAUTH_SCOPES environment variable is deprecated and renamed to JUPYTERHUB_OAUTH_ACCESS_SCOPES,
to avoid ambiguity with JUPYTERHUB_OAUTH_CLIENT_ALLOWED_SCOPES
:::
```python
from tornado import web
from jupyterhub.services.auth import HubOAuthenticated
class MyHandler(HubOAuthenticated, BaseHandler):
hub_scopes = ["custom:myservice:read"]
@web.authenticated
def get(self):
...
```
Existing scope filters (`!user=`, etc.) may be applied to custom scopes.
Custom scope _filters_ are NOT supported.
### Scopes and APIs ### Scopes and APIs
The scopes are also listed in the [](../reference/rest-api.rst) documentation. Each API endpoint has a list of scopes which can be used to access the API; if no scopes are listed, the API is not authenticated and can be accessed without any permissions (i.e., no scopes). The scopes are also listed in the [](../reference/rest-api.rst) documentation. Each API endpoint has a list of scopes which can be used to access the API; if no scopes are listed, the API is not authenticated and can be accessed without any permissions (i.e., no scopes).

View File

@@ -7,11 +7,11 @@ Roles and scopes utilities can be found in `roles.py` and `scopes.py` modules. S
```{admonition} **Scope variable nomenclature** ```{admonition} **Scope variable nomenclature**
:class: tip :class: tip
- _scopes_ \ - _scopes_ \
List of scopes that may contain abbreviations (used in role definitions). E.g., `["users:activity!user", "self"]`. List of scopes with abbreviations (used in role definitions). E.g., `["users:activity!user"]`.
- _expanded scopes_ \ - _expanded scopes_ \
Set of fully expanded scopes without abbreviations (i.e., resolved metascopes, filters, and subscopes). E.g., `{"users:activity!user=charlie", "read:users:activity!user=charlie"}`. Set of expanded scopes without abbreviations (i.e., resolved metascopes, filters and subscopes). E.g., `{"users:activity!user=charlie", "read:users:activity!user=charlie"}`.
- _parsed scopes_ \ - _parsed scopes_ \
Dictionary represenation of expanded scopes. E.g., `{"users:activity": {"user": ["charlie"]}, "read:users:activity": {"users": ["charlie"]}}`. Dictionary JSON like format of expanded scopes. E.g., `{"users:activity": {"user": ["charlie"]}, "read:users:activity": {"users": ["charlie"]}}`.
- _intersection_ \ - _intersection_ \
Set of expanded scopes as intersection of 2 expanded scope sets. Set of expanded scopes as intersection of 2 expanded scope sets.
- _identify scopes_ \ - _identify scopes_ \
@@ -22,47 +22,27 @@ Roles and scopes utilities can be found in `roles.py` and `scopes.py` modules. S
## Resolving roles and scopes ## Resolving roles and scopes
**Resolving roles** refers to determining which roles a user, service, or group has, extracting the list of scopes from each role and combining them into a single set of scopes. **Resolving roles** refers to determining which roles a user, service, token, or group has, extracting the list of scopes from each role and combining them into a single set of scopes.
**Resolving scopes** involves expanding scopes into all their possible subscopes (_expanded scopes_), parsing them into format used for access evaluation (_parsed scopes_) and, if applicable, comparing two sets of scopes (_intersection_). All procedures take into account the scope hierarchy, {ref}`vertical <vertical-filtering-target>` and {ref}`horizontal filtering <horizontal-filtering-target>`, limiting or elevated permissions (`read:<resource>` or `admin:<resource>`, respectively), and metascopes. **Resolving scopes** involves expanding scopes into all their possible subscopes (_expanded scopes_), parsing them into format used for access evaluation (_parsed scopes_) and, if applicable, comparing two sets of scopes (_intersection_). All procedures take into account the scope hierarchy, {ref}`vertical <vertical-filtering-target>` and {ref}`horizontal filtering <horizontal-filtering-target>`, limiting or elevated permissions (`read:<resource>` or `admin:<resource>`, respectively), and metascopes.
Roles and scopes are resolved on several occasions, for example when requesting an API token with specific scopes or making an API request. The following sections provide more details. Roles and scopes are resolved on several occasions, for example when requesting an API token with specific roles or making an API request. The following sections provide more details.
(requesting-api-token-target)= (requesting-api-token-target)=
### Requesting API token with specific scopes ### Requesting API token with specific roles
(3.0 side of the diff)

:::{versionchanged} 3.0
API tokens have _scopes_ instead of roles,
so that their permissions cannot be updated.
You may still request roles for a token,
but those roles will be evaluated to the corresponding _scopes_ immediately.
Prior to 3.0, tokens stored _roles_,
which meant their scopes were resolved on each request.
:::
API tokens grant access to JupyterHub's APIs. The RBAC framework allows for requesting tokens with specific permissions.
RBAC is involved in several stages of the OAuth token flow.
When requesting a token via the tokens API (`/users/:name/tokens`), or the token page (`/hub/token`),
if no scopes are requested, the token is issued with the permissions stored on the default `token` role
(providing the requester is allowed to create the token).
OAuth tokens are also requested via OAuth flow
If the token is requested with any scopes, the permissions of requesting entity are checked against the requested permissions to ensure the token would not grant its owner additional privileges.
If, due to modifications of permissions of the token or token owner,
at API request time a token has any scopes that its owner does not,
those scopes are removed.
The API request is resolved without additional errors using the scope _intersection_;
the Hub logs a warning in this case (see {ref}`Figure 2 <api-request-chart>`).
Resolving a token's scope (yellow box in {ref}`Figure 1 <token-request-chart>`) corresponds to resolving all the token's owner roles (including the roles associated with their groups) and the token's own scopes into a set of scopes. The two sets are compared (Resolve the scopes box in orange in {ref}`Figure 1 <token-request-chart>`), taking into account the scope hierarchy.
If the token's scopes are a subset of the token owner's scopes, the token is issued with the requested scopes; if not, JupyterHub will raise an error.

(2.3 side of the diff)

API tokens grant access to JupyterHub's APIs. The RBAC framework allows for requesting tokens with specific existing roles. To date, it is only possible to add roles to a token through the _POST /users/:name/tokens_ API where the roles can be specified in the token parameters body (see [](../reference/rest-api.rst)).
RBAC adds several steps into the token issue flow.
If no roles are requested, the token is issued with the default `token` role (providing the requester is allowed to create the token).
If the token is requested with any roles, the permissions of requesting entity are checked against the requested permissions to ensure the token would not grant its owner additional privileges.
If, due to modifications of roles or entities, at API request time a token has any scopes that its owner does not, those scopes are removed. The API request is resolved without additional errors using the scopes _intersection_, but the Hub logs a warning (see {ref}`Figure 2 <api-request-chart>`).
Resolving a token's roles (yellow box in {ref}`Figure 1 <token-request-chart>`) corresponds to resolving all the token's owner roles (including the roles associated with their groups) and the token's requested roles into a set of scopes. The two sets are compared (Resolve the scopes box in orange in {ref}`Figure 1 <token-request-chart>`), taking into account the scope hierarchy but, solely for role assignment, omitting any {ref}`horizontal filter <horizontal-filtering-target>` comparison. If the token's scopes are a subset of the token owner's scopes, the token is issued with the requested roles; if not, JupyterHub will raise an error.
{ref}`Figure 1 <token-request-chart>` below illustrates the steps involved. The orange rectangles highlight where in the process the roles and scopes are resolved.
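As a concrete illustration of the token request flow described above, here is a hedged sketch of calling the tokens endpoint with the `requests` library; the Hub URL and tokens are placeholders, and whether the request body carries `roles` (pre-3.0) or `scopes` (3.0) depends on the JupyterHub version:

```python
# Hedged sketch: request a new API token with restricted permissions
# via POST /users/:name/tokens (all values below are placeholders).
import requests

hub_api = "http://127.0.0.1:8081/hub/api"   # assumed Hub API base URL
api_token = "existing-token-with-enough-permissions"

resp = requests.post(
    f"{hub_api}/users/charlie/tokens",
    headers={"Authorization": f"token {api_token}"},
    # pre-3.0: request roles; in 3.0 a "scopes" list is used instead
    json={"roles": ["token"], "note": "example token request"},
)
resp.raise_for_status()
new_token = resp.json()["token"]
```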
@@ -75,9 +55,9 @@ Figure 1. Resolving roles and scopes during API token request
### Making an API request
With the RBAC framework, each authenticated JupyterHub API request is guarded by a scope decorator that specifies which scopes are required to gain the access to the API. With the RBAC framework each authenticated JupyterHub API request is guarded by a scope decorator that specifies which scopes are required to gain the access to the API.
When an API request is performed, the requesting API token's scopes are again intersected with its owner's (yellow box in {ref}`Figure 2 <api-request-chart>`) to ensure the token does not grant more permissions than its owner has at the request time (e.g., due to changing/losing roles). When an API request is performed, the requesting API token's roles are again resolved (yellow box in {ref}`Figure 2 <api-request-chart>`) to ensure the token does not grant more permissions than its owner has at the request time (e.g., due to changing/losing roles).
If the owner's roles do not include some scopes of the token's scopes, only the _intersection_ of the token's and owner's scopes will be used. For example, using a token with scope `users` whose owner's role scope is `read:users:name` will result in only the `read:users:name` scope being passed on. In the case of no _intersection_, an empty set of scopes will be used.
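The `users` vs `read:users:name` example above can be sketched with a toy subscope map; real resolution also accounts for filters and metascopes, so this is only an illustration:

```python
# Toy illustration of the scope intersection described above.
# The subscope map is deliberately tiny and incomplete.
SUBSCOPES = {
    "users": {"users", "read:users", "read:users:name"},
    "read:users:name": {"read:users:name"},
}


def expand(scope_set):
    """Expand each scope into itself plus its known subscopes."""
    expanded = set()
    for scope in scope_set:
        expanded |= SUBSCOPES.get(scope, {scope})
    return expanded


token_scopes = expand({"users"})            # token was issued with 'users'
owner_scopes = expand({"read:users:name"})  # owner has since lost permissions

# only the intersection is passed on to the API handler
print(token_scopes & owner_scopes)  # {'read:users:name'}
```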
The passed scopes are compared to the scopes required to access the API as follows:


@@ -231,8 +231,8 @@ In case of the need to run the jupyterhub under /jhub/ or other location please
httpd.conf amendments:
```bash
RewriteRule /jhub/(.*) ws://127.0.0.1:8000/jhub/$1 [P,L] RewriteRule /jhub/(.*) ws://127.0.0.1:8000/jhub/$1 [NE,P,L]
RewriteRule /jhub/(.*) http://127.0.0.1:8000/jhub/$1 [P,L] RewriteRule /jhub/(.*) http://127.0.0.1:8000/jhub/$1 [NE,P,L]
ProxyPass /jhub/ http://127.0.0.1:8000/jhub/
ProxyPassReverse /jhub/ http://127.0.0.1:8000/jhub/


@@ -35,8 +35,6 @@ A Service may have the following properties:
the service will be added to the proxy at `/services/:name`
- `api_token: str (default - None)` - For Externally-Managed Services you need to specify
an API token to perform API requests to the Hub
- `display: bool (default - True)` - When set to true, display a link to the
service's URL under the 'Services' dropdown in user's hub home page.
If a service is also to be managed by the Hub, it has a few extra options:
@@ -116,10 +114,7 @@ JUPYTERHUB_BASE_URL: Base URL of the Hub (https://mydomain[:port]/)
JUPYTERHUB_SERVICE_PREFIX: URL path prefix of this service (/services/:service-name/)
JUPYTERHUB_SERVICE_URL: Local URL where the service is expected to be listening.
Only for proxied web services.
JUPYTERHUB_OAUTH_SCOPES: JSON-serialized list of scopes to use for allowing access to the service JUPYTERHUB_OAUTH_SCOPES: JSON-serialized list of scopes to use for allowing access to the service.
(deprecated in 3.0, use JUPYTERHUB_OAUTH_ACCESS_SCOPES).
JUPYTERHUB_OAUTH_ACCESS_SCOPES: JSON-serialized list of scopes to use for allowing access to the service (new in 3.0).
JUPYTERHUB_OAUTH_CLIENT_ALLOWED_SCOPES: JSON-serialized list of scopes that can be requested by the oauth client on behalf of users (new in 3.0).
```
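As a hedged sketch (not part of the documented example), a managed service might read the variables listed above at startup roughly like this, falling back to the pre-3.0 `JUPYTERHUB_OAUTH_SCOPES` name when the 3.0 `JUPYTERHUB_OAUTH_ACCESS_SCOPES` variable is absent:

```python
# Minimal sketch of a service reading its JupyterHub-provided environment.
import json
import os

prefix = os.environ["JUPYTERHUB_SERVICE_PREFIX"]        # e.g. /services/:service-name/
service_url = os.environ.get("JUPYTERHUB_SERVICE_URL")  # only set for proxied web services

# 3.0 renamed JUPYTERHUB_OAUTH_SCOPES to JUPYTERHUB_OAUTH_ACCESS_SCOPES,
# so accept either name here.
access_scopes = json.loads(
    os.environ.get("JUPYTERHUB_OAUTH_ACCESS_SCOPES")
    or os.environ.get("JUPYTERHUB_OAUTH_SCOPES", "[]")
)
print(f"Serving under {prefix} with access scopes {access_scopes}")
```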
For the previous 'cull idle' Service example, these environment variables
@@ -379,7 +374,7 @@ The `scopes` field can be used to manage access.
Note: a user will have access to a service to complete oauth access to the service for the first time.
Individual permissions may be revoked at any later point without revoking the token,
in which case the `scopes` field in this model should be checked on each access.
The default required scopes for access are available from `hub_auth.oauth_scopes` or `$JUPYTERHUB_OAUTH_ACCESS_SCOPES`. The default required scopes for access are available from `hub_auth.oauth_scopes` or `$JUPYTERHUB_OAUTH_SCOPES`.
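A hedged sketch of that per-access check: compare the scopes reported in the user model against the required access scopes from the environment. The helper below is hypothetical, and it assumes that holding any one of the required scopes is sufficient; real services typically rely on `HubOAuth` from `jupyterhub.services.auth` instead:

```python
# Hypothetical helper for re-checking access on every request.
import json
import os

REQUIRED_SCOPES = set(
    json.loads(
        os.environ.get("JUPYTERHUB_OAUTH_ACCESS_SCOPES")
        or os.environ.get("JUPYTERHUB_OAUTH_SCOPES", "[]")
    )
)


def has_access(user_model):
    """Return True if the user model still carries at least one required scope."""
    current_scopes = set(user_model.get("scopes", []))
    return bool(current_scopes & REQUIRED_SCOPES)
```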
An example of using an Externally-Managed Service and authentication is
in [nbviewer README][nbviewer example] section on securing the notebook viewer,


@@ -308,9 +308,6 @@ The process environment is returned by `Spawner.get_env`, which specifies the fo
This is also the OAuth client secret.
- JUPYTERHUB_CLIENT_ID - the OAuth client ID for authenticating visitors.
- JUPYTERHUB_OAUTH_CALLBACK_URL - the callback URL to use in oauth, typically `/user/:name/oauth_callback`
- JUPYTERHUB_OAUTH_ACCESS_SCOPES - the scopes required to access the server (called JUPYTERHUB_OAUTH_SCOPES prior to 3.0)
- JUPYTERHUB_OAUTH_CLIENT_ALLOWED_SCOPES - the scopes the service is allowed to request.
If no scopes are requested explicitly, these scopes will be requested.
Optional environment variables, depending on configuration:


@@ -371,7 +371,7 @@ a JupyterHub deployment. The commands are:
- System and deployment information
```bash
jupyter troubleshoot jupyter troubleshooting
```
- Kernel information


@@ -7,9 +7,8 @@ to enable testing without administrative privileges.
c = get_config() # noqa c = get_config() # noqa
c.Application.log_level = 'DEBUG' c.Application.log_level = 'DEBUG'
import os
from oauthenticator.azuread import AzureAdOAuthenticator from oauthenticator.azuread import AzureAdOAuthenticator
import os
c.JupyterHub.authenticator_class = AzureAdOAuthenticator c.JupyterHub.authenticator_class = AzureAdOAuthenticator


@@ -1,132 +0,0 @@
import os
from functools import wraps
from html import escape
from urllib.parse import urlparse
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from tornado.web import Application, RequestHandler, authenticated
from jupyterhub.services.auth import HubOAuthCallbackHandler, HubOAuthenticated
from jupyterhub.utils import url_path_join
SCOPE_PREFIX = "custom:grades"
READ_SCOPE = f"{SCOPE_PREFIX}:read"
WRITE_SCOPE = f"{SCOPE_PREFIX}:write"
def require_scope(scopes):
"""Decorator to require scopes
For use if multiple methods on one Handler
may want different scopes,
so class-level .hub_scopes is insufficient
(e.g. read for GET, write for POST).
"""
if isinstance(scopes, str):
scopes = [scopes]
def wrap(method):
"""The actual decorator"""
@wraps(method)
@authenticated
def wrapped(self, *args, **kwargs):
self.hub_scopes = scopes
return method(self, *args, **kwargs)
return wrapped
return wrap
class MyGradesHandler(HubOAuthenticated, RequestHandler):
# no hub_scopes, anyone with access to this service
# will be able to visit this URL
@authenticated
def get(self):
self.write("<h1>My grade</h1>")
name = self.current_user["name"]
grades = self.settings["grades"]
self.write(f"<p>My name is: {escape(name)}</p>")
if name in grades:
self.write(f"<p>My grade is: {escape(str(grades[name]))}</p>")
else:
self.write("<p>No grade entered</p>")
if READ_SCOPE in self.current_user["scopes"]:
self.write('<a href="grades/">enter grades</a>')
class GradesHandler(HubOAuthenticated, RequestHandler):
# default scope for this Handler: read-only
hub_scopes = [READ_SCOPE]
def _render(self):
grades = self.settings["grades"]
self.write("<h1>All grades</h1>")
self.write("<table>")
self.write("<tr><th>Student</th><th>Grade</th></tr>")
for student, grade in grades.items():
qstudent = escape(student)
qgrade = escape(str(grade))
self.write(
f"""
<tr>
<td class="student">{qstudent}</td>
<td class="grade">{qgrade}</td>
</tr>
"""
)
if WRITE_SCOPE in self.current_user["scopes"]:
self.write("Enter grade:")
self.write(
"""
<form action=. method=POST>
<input name=student placeholder=student></input>
<input kind=number name=grade placeholder=grade></input>
<input type="submit" value="Submit">
"""
)
@require_scope([READ_SCOPE])
async def get(self):
self._render()
# POST requires WRITE_SCOPE instead of READ_SCOPE
@require_scope([WRITE_SCOPE])
async def post(self):
name = self.get_argument("student")
grade = self.get_argument("grade")
self.settings["grades"][name] = grade
self._render()
def main():
base_url = os.environ['JUPYTERHUB_SERVICE_PREFIX']
app = Application(
[
(base_url, MyGradesHandler),
(url_path_join(base_url, 'grades/'), GradesHandler),
(
url_path_join(base_url, 'oauth_callback'),
HubOAuthCallbackHandler,
),
],
cookie_secret=os.urandom(32),
grades={"student": 53},
)
http_server = HTTPServer(app)
url = urlparse(os.environ['JUPYTERHUB_SERVICE_URL'])
http_server.listen(url.port, url.hostname)
try:
IOLoop.current().start()
except KeyboardInterrupt:
pass
if __name__ == '__main__':
main()


@@ -1,52 +0,0 @@
import sys
c = get_config() # noqa
c.JupyterHub.services = [
{
'name': 'grades',
'url': 'http://127.0.0.1:10101',
'command': [sys.executable, './grades.py'],
'oauth_client_allowed_scopes': [
'custom:grades:write',
'custom:grades:read',
],
},
]
c.JupyterHub.custom_scopes = {
"custom:grades:read": {
"description": "read-access to all grades",
},
"custom:grades:write": {
"description": "Enter new grades",
"subscopes": ["custom:grades:read"],
},
}
c.JupyterHub.load_roles = [
{
"name": "user",
# grant all users access to services
"scopes": ["access:services", "self"],
},
{
"name": "grader",
# grant graders access to write grades
"scopes": ["custom:grades:write"],
"users": ["grader"],
},
{
"name": "instructor",
# grant instructors access to read, but not write grades
"scopes": ["custom:grades:read"],
"users": ["instructor"],
},
]
c.JupyterHub.allowed_users = {"instructor", "grader", "student"}
# dummy spawner and authenticator for testing, don't actually use these!
c.JupyterHub.authenticator_class = 'dummy'
c.JupyterHub.spawner_class = 'simple'
c.JupyterHub.ip = '127.0.0.1' # let's just run on localhost while dummy auth is enabled
c.JupyterHub.log_level = 10


@@ -5,10 +5,13 @@ so all URLs and requests necessary for OAuth with JupyterHub should be in one pl
""" """
import json import json
import os import os
from urllib.parse import urlencode, urlparse from urllib.parse import urlencode
from urllib.parse import urlparse
from tornado import log, web from tornado import log
from tornado.httpclient import AsyncHTTPClient, HTTPRequest from tornado import web
from tornado.httpclient import AsyncHTTPClient
from tornado.httpclient import HTTPRequest
from tornado.httputil import url_concat from tornado.httputil import url_concat
from tornado.ioloop import IOLoop from tornado.ioloop import IOLoop


@@ -2,8 +2,6 @@
# 1. start/stop servers, and # 1. start/stop servers, and
# 2. access the server API # 2. access the server API
c = get_config() # noqa
c.JupyterHub.load_roles = [ c.JupyterHub.load_roles = [
{ {
"name": "launcher", "name": "launcher",


@@ -16,6 +16,7 @@ import time
import requests import requests
log = logging.getLogger(__name__) log = logging.getLogger(__name__)


@@ -3,7 +3,9 @@ import datetime
import json import json
import os import os
from tornado import escape, ioloop, web from tornado import escape
from tornado import ioloop
from tornado import web
from jupyterhub.services.auth import HubAuthenticated from jupyterhub.services.auth import HubAuthenticated


@@ -1,5 +1,8 @@
from datetime import datetime from datetime import datetime
from typing import Any, Dict, List, Optional from typing import Any
from typing import Dict
from typing import List
from typing import Optional
from pydantic import BaseModel from pydantic import BaseModel


@@ -1,7 +1,9 @@
import json import json
import os import os
from fastapi import HTTPException, Security, status from fastapi import HTTPException
from fastapi import Security
from fastapi import status
from fastapi.security import OAuth2AuthorizationCodeBearer from fastapi.security import OAuth2AuthorizationCodeBearer
from fastapi.security.api_key import APIKeyQuery from fastapi.security.api_key import APIKeyQuery


@@ -1,9 +1,14 @@
import os import os
from fastapi import APIRouter, Depends, Form, Request from fastapi import APIRouter
from fastapi import Depends
from fastapi import Form
from fastapi import Request
from .client import get_client from .client import get_client
from .models import AuthorizationError, HubApiError, User from .models import AuthorizationError
from .models import HubApiError
from .models import User
from .security import get_current_user from .security import get_current_user
# APIRouter prefix cannot end in / # APIRouter prefix cannot end in /


@@ -7,10 +7,16 @@ import os
import secrets import secrets
from functools import wraps from functools import wraps
from flask import Flask, Response, make_response, redirect, request, session from flask import Flask
from flask import make_response
from flask import redirect
from flask import request
from flask import Response
from flask import session
from jupyterhub.services.auth import HubOAuth from jupyterhub.services.auth import HubOAuth
prefix = os.environ.get('JUPYTERHUB_SERVICE_PREFIX', '/') prefix = os.environ.get('JUPYTERHUB_SERVICE_PREFIX', '/')
auth = HubOAuth(api_token=os.environ['JUPYTERHUB_API_TOKEN'], cache_max_age=60) auth = HubOAuth(api_token=os.environ['JUPYTERHUB_API_TOKEN'], cache_max_age=60)


@@ -26,7 +26,7 @@ After logging in with any username and password, you should see a JSON dump of y
```
What is contained in the model will depend on the permissions
requested in the `oauth_client_allowed_scopes` configuration of the service `whoami-oauth` service. requested in the `oauth_roles` configuration of the service `whoami-oauth` service.
The default is the minimum required for identification and access to the service,
which will provide the username and current scopes.


@@ -14,11 +14,11 @@ c.JupyterHub.services = [
# only requesting access to the service,
# and identification by name,
# nothing more.
# Specifying 'oauth_client_allowed_scopes' as a list of scopes # Specifying 'oauth_roles' as a list of role names
# allows requesting more information about users,
# or the ability to take actions on users' behalf, as required.
# the 'inherit' scope means the full permissions of the owner # The default 'token' role has the full permissions of its owner:
# 'oauth_client_allowed_scopes': ['inherit'], # 'oauth_roles': ['token'],
}, },
] ]
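For reference, here is a hypothetical `jupyterhub_config.py` fragment contrasting the two configuration styles discussed in this hunk. Only the `oauth_roles` and `oauth_client_allowed_scopes` keys and their example values are taken from the surrounding diff; the service name, URL, and command are placeholders:

```python
# Hypothetical service entry; uncomment the line matching your JupyterHub version.
c = get_config()  # noqa

c.JupyterHub.services = [
    {
        'name': 'whoami-oauth',
        'url': 'http://127.0.0.1:10101',            # placeholder
        'command': ['python3', 'whoami-oauth.py'],  # placeholder
        # JupyterHub 2.x: request permissions as a list of role names
        # 'oauth_roles': ['token'],
        # JupyterHub 3.0: request permissions as a list of scopes
        # 'oauth_client_allowed_scopes': ['inherit'],
    },
]
```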


@@ -10,9 +10,12 @@ from urllib.parse import urlparse
from tornado.httpserver import HTTPServer from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop from tornado.ioloop import IOLoop
from tornado.web import Application, RequestHandler, authenticated from tornado.web import Application
from tornado.web import authenticated
from tornado.web import RequestHandler
from jupyterhub.services.auth import HubOAuthCallbackHandler, HubOAuthenticated from jupyterhub.services.auth import HubOAuthCallbackHandler
from jupyterhub.services.auth import HubOAuthenticated
from jupyterhub.utils import url_path_join from jupyterhub.utils import url_path_join


@@ -10,7 +10,9 @@ from urllib.parse import urlparse
from tornado.httpserver import HTTPServer from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop from tornado.ioloop import IOLoop
from tornado.web import Application, RequestHandler, authenticated from tornado.web import Application
from tornado.web import authenticated
from tornado.web import RequestHandler
from jupyterhub.services.auth import HubAuthenticated from jupyterhub.services.auth import HubAuthenticated


@@ -0,0 +1,56 @@
/*
object-assign
(c) Sindre Sorhus
@license MIT
*/
/*!
Copyright (c) 2018 Jed Watson.
Licensed under the MIT License (MIT), see
http://jedwatson.github.io/classnames
*/
/** @license React v0.20.2
* scheduler.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
/** @license React v16.13.1
* react-is.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
/** @license React v17.0.2
* react-dom.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
/** @license React v17.0.2
* react-jsx-runtime.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/
/** @license React v17.0.2
* react.production.min.js
*
* Copyright (c) Facebook, Inc. and its affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*/

jsx/build/index.html

@@ -0,0 +1,6 @@
<!DOCTYPE html>
<head></head>
<body>
<div id="admin-react-hook"></div>
<script src="admin-react.js"></script>
</body>


@@ -8,7 +8,7 @@
"scripts": { "scripts": {
"build": "yarn && webpack", "build": "yarn && webpack",
"hot": "webpack && webpack-dev-server", "hot": "webpack && webpack-dev-server",
"place": "cp build/admin-react.js* ../share/jupyterhub/static/js/", "place": "cp -r build/admin-react.js ../share/jupyterhub/static/js/admin-react.js",
"test": "jest --verbose", "test": "jest --verbose",
"snap": "jest --updateSnapshot", "snap": "jest --updateSnapshot",
"lint": "eslint --ext .jsx --ext .js src/", "lint": "eslint --ext .jsx --ext .js src/",
@@ -28,7 +28,17 @@
} }
}, },
"dependencies": { "dependencies": {
"@babel/core": "^7.12.3",
"@babel/preset-env": "^7.12.11",
"@babel/preset-react": "^7.12.10",
"@testing-library/jest-dom": "^5.15.1",
"@testing-library/react": "^12.1.2",
"@testing-library/user-event": "^13.5.0",
"babel-loader": "^8.2.1",
"bootstrap": "^4.5.3", "bootstrap": "^4.5.3",
"css-loader": "^5.0.1",
"eslint-plugin-unused-imports": "^1.1.1",
"file-loader": "^6.2.0",
"history": "^5.0.0", "history": "^5.0.0",
"lodash.debounce": "^4.0.8", "lodash.debounce": "^4.0.8",
"prop-types": "^15.7.2", "prop-types": "^15.7.2",
@@ -41,35 +51,24 @@
"react-redux": "^7.2.2", "react-redux": "^7.2.2",
"react-router": "^5.2.0", "react-router": "^5.2.0",
"react-router-dom": "^5.2.0", "react-router-dom": "^5.2.0",
"recompose": "npm:react-recompose@^0.31.2", "recompose": "^0.30.0",
"redux": "^4.0.5", "redux": "^4.0.5",
"regenerator-runtime": "^0.13.9" "regenerator-runtime": "^0.13.9",
"style-loader": "^2.0.0",
"webpack": "^5.6.0",
"webpack-cli": "^3.3.4",
"webpack-dev-server": "^3.11.0"
}, },
"devDependencies": { "devDependencies": {
"@babel/core": "^7.12.3",
"@babel/preset-env": "^7.12.11",
"@babel/preset-react": "^7.12.10",
"@testing-library/jest-dom": "^5.15.1",
"@testing-library/react": "^12.1.2",
"@testing-library/user-event": "^13.5.0",
"@webpack-cli/serve": "^1.7.0",
"@wojtekmaj/enzyme-adapter-react-17": "^0.6.5", "@wojtekmaj/enzyme-adapter-react-17": "^0.6.5",
"babel-jest": "^26.6.3", "babel-jest": "^26.6.3",
"babel-loader": "^8.2.1",
"css-loader": "^5.0.1",
"enzyme": "^3.11.0", "enzyme": "^3.11.0",
"eslint": "^7.18.0", "eslint": "^7.18.0",
"eslint-plugin-prettier": "^3.3.1", "eslint-plugin-prettier": "^3.3.1",
"eslint-plugin-react": "^7.22.0", "eslint-plugin-react": "^7.22.0",
"eslint-plugin-unused-imports": "^1.1.1",
"file-loader": "^6.2.0",
"identity-obj-proxy": "^3.0.0", "identity-obj-proxy": "^3.0.0",
"jest": "^26.6.3", "jest": "^26.6.3",
"prettier": "^2.2.1", "prettier": "^2.2.1",
"sinon": "^13.0.1", "sinon": "^13.0.1"
"style-loader": "^2.0.0",
"webpack": "^5.6.0",
"webpack-cli": "^4.10.0",
"webpack-dev-server": "^4.9.3"
} }
} }


@@ -22,16 +22,15 @@ const store = createStore(reducers, initialState);
const App = () => { const App = () => {
useEffect(() => { useEffect(() => {
let { limit, user_page, groups_page } = initialState; let { limit, user_page, groups_page } = initialState;
let api = withAPI()().props; jhapiRequest(`/users?offset=${user_page * limit}&limit=${limit}`, "GET")
api .then((data) => data.json())
.updateUsers(user_page * limit, limit)
.then((data) => .then((data) =>
store.dispatch({ type: "USER_PAGE", value: { data: data, page: 0 } }) store.dispatch({ type: "USER_PAGE", value: { data: data, page: 0 } })
) )
.catch((err) => console.log(err)); .catch((err) => console.log(err));
api jhapiRequest(`/groups?offset=${groups_page * limit}&limit=${limit}`, "GET")
.updateGroups(groups_page * limit, limit) .then((data) => data.json())
.then((data) => .then((data) =>
store.dispatch({ type: "GROUPS_PAGE", value: { data: data, page: 0 } }) store.dispatch({ type: "GROUPS_PAGE", value: { data: data, page: 0 } })
) )


@@ -60,10 +60,7 @@ const AddUser = (props) => {
placeholder="usernames separated by line" placeholder="usernames separated by line"
data-testid="user-textarea" data-testid="user-textarea"
onBlur={(e) => { onBlur={(e) => {
let split_users = e.target.value let split_users = e.target.value.split("\n");
.split("\n")
.map((u) => u.trim())
.filter((u) => u.length > 0);
setUsers(split_users); setUsers(split_users);
}} }}
></textarea> ></textarea>
@@ -91,7 +88,17 @@ const AddUser = (props) => {
data-testid="submit" data-testid="submit"
className="btn btn-primary" className="btn btn-primary"
onClick={() => { onClick={() => {
addUsers(users, admin) let filtered_users = users.filter(
(e) =>
e.length > 2 &&
/[!@#$%^&*(),.?":{}|<>]/g.test(e) == false
);
if (filtered_users.length < users.length) {
setUsers(filtered_users);
failRegexEvent();
}
addUsers(filtered_users, admin)
.then((data) => .then((data) =>
data.status < 300 data.status < 300
? updateUsers(0, limit) ? updateUsers(0, limit)


@@ -70,12 +70,12 @@ test("Removes users when they fail Regex", async () => {
let textarea = screen.getByTestId("user-textarea"); let textarea = screen.getByTestId("user-textarea");
let submit = screen.getByTestId("submit"); let submit = screen.getByTestId("submit");
fireEvent.blur(textarea, { target: { value: "foo \n bar\na@b.co\n \n\n" } }); fireEvent.blur(textarea, { target: { value: "foo\nbar\n!!*&*" } });
await act(async () => { await act(async () => {
fireEvent.click(submit); fireEvent.click(submit);
}); });
expect(callbackSpy).toHaveBeenCalledWith(["foo", "bar", "a@b.co"], false); expect(callbackSpy).toHaveBeenCalledWith(["foo", "bar"], false);
}); });
test("Correctly submits admin", async () => { test("Correctly submits admin", async () => {


@@ -59,7 +59,7 @@ const CreateGroup = (props) => {
value={groupName} value={groupName}
placeholder="group name..." placeholder="group name..."
onChange={(e) => { onChange={(e) => {
setGroupName(e.target.value.trim()); setGroupName(e.target.value);
}} }}
></input> ></input>
</div> </div>


@@ -30,7 +30,7 @@ const AccessServerButton = ({ url }) => (
); );
const ServerDashboard = (props) => { const ServerDashboard = (props) => {
let base_url = window.base_url || "/"; let base_url = window.base_url;
// sort methods // sort methods
var usernameDesc = (e) => e.sort((a, b) => (a.name > b.name ? 1 : -1)), var usernameDesc = (e) => e.sort((a, b) => (a.name > b.name ? 1 : -1)),
usernameAsc = (e) => e.sort((a, b) => (a.name < b.name ? 1 : -1)), usernameAsc = (e) => e.sort((a, b) => (a.name < b.name ? 1 : -1)),
@@ -200,44 +200,6 @@ const ServerDashboard = (props) => {
); );
}; };
const ServerRowTable = ({ data }) => {
const sortedData = Object.keys(data)
.sort()
.reduce(function (result, key) {
let value = data[key];
switch (key) {
case "last_activity":
case "created":
case "started":
// format timestamps
value = value ? timeSince(value) : value;
break;
}
if (Array.isArray(value)) {
// cast arrays (e.g. roles, groups) to string
value = value.sort().join(", ");
}
result[key] = value;
return result;
}, {});
return (
<ReactObjectTableViewer
className="table-striped table-bordered"
style={{
padding: "3px 6px",
margin: "auto",
}}
keyStyle={{
padding: "4px",
}}
valueStyle={{
padding: "4px",
}}
data={sortedData}
/>
);
};
const serverRow = (user, server) => { const serverRow = (user, server) => {
const { servers, ...userNoServers } = user; const { servers, ...userNoServers } = user;
const serverNameDash = server.name ? `-${server.name}` : ""; const serverNameDash = server.name ? `-${server.name}` : "";
@@ -270,7 +232,11 @@ const ServerDashboard = (props) => {
<td data-testid="user-row-admin">{user.admin ? "admin" : ""}</td> <td data-testid="user-row-admin">{user.admin ? "admin" : ""}</td>
<td data-testid="user-row-server"> <td data-testid="user-row-server">
<p className="text-secondary">{server.name}</p> {server.name ? (
<p className="text-secondary">{server.name}</p>
) : (
<p style={{ color: "lightgrey" }}>[MAIN]</p>
)}
</td> </td>
<td data-testid="user-row-last-activity"> <td data-testid="user-row-last-activity">
{server.last_activity ? timeSince(server.last_activity) : "Never"} {server.last_activity ? timeSince(server.last_activity) : "Never"}
@@ -292,7 +258,7 @@ const ServerDashboard = (props) => {
/> />
<a <a
href={`${base_url}spawn/${user.name}${ href={`${base_url}spawn/${user.name}${
server.name ? "/" + server.name : "" server.name && "/" + server.name
}`} }`}
> >
<button <button
@@ -320,11 +286,37 @@ const ServerDashboard = (props) => {
> >
<Card style={{ width: "100%", padding: 3, margin: "0 auto" }}> <Card style={{ width: "100%", padding: 3, margin: "0 auto" }}>
<Card.Title>User</Card.Title> <Card.Title>User</Card.Title>
<ServerRowTable data={userNoServers} /> <ReactObjectTableViewer
className="table-striped table-bordered admin-table-head"
style={{
padding: "3px 6px",
margin: "auto",
}}
keyStyle={{
padding: "4px",
}}
valueStyle={{
padding: "4px",
}}
data={userNoServers}
/>
</Card> </Card>
<Card style={{ width: "100%", padding: 3, margin: "0 auto" }}> <Card style={{ width: "100%", padding: 3, margin: "0 auto" }}>
<Card.Title>Server</Card.Title> <Card.Title>Server</Card.Title>
<ServerRowTable data={server} /> <ReactObjectTableViewer
className="table-striped table-bordered admin-table-head"
style={{
padding: "3px 6px",
margin: "auto",
}}
keyStyle={{
padding: "4px",
}}
valueStyle={{
padding: "4px",
}}
data={server}
/>
</Card> </Card>
</CardGroup> </CardGroup>
</Collapse> </Collapse>


@@ -98,18 +98,6 @@ test("Renders correctly the status of a single-user server", async () => {
expect(stop).toBeVisible(); expect(stop).toBeVisible();
}); });
test("Renders spawn page link", async () => {
let callbackSpy = mockAsync();
await act(async () => {
render(serverDashboardJsx(callbackSpy));
});
let link = screen.getByText("Spawn Page").closest("a");
let url = new URL(link.href);
expect(url.pathname).toEqual("/spawn/bar");
});
test("Invokes the startServer event on button click", async () => { test("Invokes the startServer event on button click", async () => {
let callbackSpy = mockAsync(); let callbackSpy = mockAsync();


@@ -1,5 +1,5 @@
export const jhapiRequest = (endpoint, method, data) => { export const jhapiRequest = (endpoint, method, data) => {
let base_url = window.base_url || "/", let base_url = window.base_url,
api_url = `${base_url}hub/api`; api_url = `${base_url}hub/api`;
return fetch(api_url + endpoint, { return fetch(api_url + endpoint, {
method: method, method: method,


@@ -4,9 +4,7 @@ import { jhapiRequest } from "./jhapiUtil";
const withAPI = withProps(() => ({ const withAPI = withProps(() => ({
updateUsers: (offset, limit, name_filter) => updateUsers: (offset, limit, name_filter) =>
jhapiRequest( jhapiRequest(
`/users?include_stopped_servers&offset=${offset}&limit=${limit}&name_filter=${ `/users?offset=${offset}&limit=${limit}&name_filter=${name_filter || ""}`,
name_filter || ""
}`,
"GET" "GET"
).then((data) => data.json()), ).then((data) => data.json()),
updateGroups: (offset, limit) => updateGroups: (offset, limit) =>


@@ -1,5 +1,6 @@
const webpack = require("webpack"); const webpack = require("webpack");
const path = require("path"); const path = require("path");
const express = require("express");
module.exports = { module.exports = {
entry: path.resolve(__dirname, "src", "App.jsx"), entry: path.resolve(__dirname, "src", "App.jsx"),
@@ -33,19 +34,16 @@ module.exports = {
}, },
plugins: [new webpack.HotModuleReplacementPlugin()], plugins: [new webpack.HotModuleReplacementPlugin()],
devServer: { devServer: {
static: { contentBase: path.resolve(__dirname, "build"),
directory: path.resolve(__dirname, "build"),
},
port: 9000, port: 9000,
onBeforeSetupMiddleware: (devServer) => { before: (app, server) => {
const app = devServer.app;
var user_data = JSON.parse( var user_data = JSON.parse(
'[{"kind":"user","name":"foo","admin":true,"groups":[],"server":"/user/foo/","pending":null,"created":"2020-12-07T18:46:27.112695Z","last_activity":"2020-12-07T21:00:33.336354Z","servers":{"":{"name":"","last_activity":"2020-12-07T20:58:02.437408Z","started":"2020-12-07T20:58:01.508266Z","pending":null,"ready":true,"state":{"pid":28085},"url":"/user/foo/","user_options":{},"progress_url":"/hub/api/users/foo/server/progress"}}},{"kind":"user","name":"bar","admin":false,"groups":[],"server":null,"pending":null,"created":"2020-12-07T18:46:27.115528Z","last_activity":"2020-12-07T20:43:51.013613Z","servers":{}}]' '[{"kind":"user","name":"foo","admin":true,"groups":[],"server":"/user/foo/","pending":null,"created":"2020-12-07T18:46:27.112695Z","last_activity":"2020-12-07T21:00:33.336354Z","servers":{"":{"name":"","last_activity":"2020-12-07T20:58:02.437408Z","started":"2020-12-07T20:58:01.508266Z","pending":null,"ready":true,"state":{"pid":28085},"url":"/user/foo/","user_options":{},"progress_url":"/hub/api/users/foo/server/progress"}}},{"kind":"user","name":"bar","admin":false,"groups":[],"server":null,"pending":null,"created":"2020-12-07T18:46:27.115528Z","last_activity":"2020-12-07T20:43:51.013613Z","servers":{}}]'
); );
var group_data = JSON.parse( var group_data = JSON.parse(
'[{"kind":"group","name":"testgroup","users":[]}, {"kind":"group","name":"testgroup2","users":["foo", "bar"]}]' '[{"kind":"group","name":"testgroup","users":[]}, {"kind":"group","name":"testgroup2","users":["foo", "bar"]}]'
); );
app.use(express.json());
// get user_data // get user_data
app.get("/hub/api/users", (req, res) => { app.get("/hub/api/users", (req, res) => {

File diff suppressed because it is too large


@@ -1 +1,2 @@
from ._version import __version__, version_info from ._version import __version__
from ._version import version_info


@@ -4,7 +4,7 @@
def get_data_files(): def get_data_files():
"""Walk up until we find share/jupyterhub""" """Walk up until we find share/jupyterhub"""
import sys import sys
from os.path import abspath, dirname, exists, join, split from os.path import join, abspath, dirname, exists, split
path = abspath(dirname(__file__)) path = abspath(dirname(__file__))
starting_points = [path] starting_points = [path]


@@ -1,154 +0,0 @@
"""Utilities for memoization
Note: a memoized function should always return an _immutable_
result to avoid later modifications polluting cached results.
"""
from collections import OrderedDict
from functools import wraps
class DoNotCache:
"""Wrapper to return a result without caching it.
In a function decorated with `@lru_cache_key`:
return DoNotCache(result)
is equivalent to:
return result # but don't cache it!
"""
def __init__(self, result):
self.result = result
class LRUCache:
"""A simple Least-Recently-Used (LRU) cache with a max size"""
def __init__(self, maxsize=1024):
self._cache = OrderedDict()
self.maxsize = maxsize
def __contains__(self, key):
return key in self._cache
def get(self, key, default=None):
"""Get an item from the cache"""
if key in self._cache:
# cache hit, bump to front of the queue for LRU
result = self._cache[key]
self._cache.move_to_end(key)
return result
return default
def set(self, key, value):
"""Store an entry in the cache
Purges oldest entry if cache is full
"""
self._cache[key] = value
# cache is full, purge oldest entry
if len(self._cache) > self.maxsize:
self._cache.popitem(last=False)
__getitem__ = get
__setitem__ = set
def lru_cache_key(key_func, maxsize=1024):
"""Like functools.lru_cache, but takes a custom key function,
as seen in sorted(key=func).
Useful for non-hashable arguments which have a known hashable equivalent (e.g. sets, lists),
or mutable objects where only immutable fields might be used
(e.g. User, where only username affects output).
For safety: Cached results should always be immutable,
such as using `frozenset` instead of mutable `set`.
Example:
@lru_cache_key(lambda user: user.name)
def func_user(user):
# output only varies by name
Args:
key (callable):
Should have the same signature as the decorated function.
Returns a hashable key to use in the cache
maxsize (int):
The maximum size of the cache.
"""
def cache_func(func):
cache = LRUCache(maxsize=maxsize)
# the actual decorated function:
@wraps(func)
def cached(*args, **kwargs):
cache_key = key_func(*args, **kwargs)
if cache_key in cache:
# cache hit
return cache[cache_key]
else:
# cache miss, call function and cache result
result = func(*args, **kwargs)
if isinstance(result, DoNotCache):
# DoNotCache prevents caching
result = result.result
else:
cache[cache_key] = result
return result
return cached
return cache_func
class FrozenDict(dict):
"""A frozen dictionary subclass
Immutable and hashable, so it can be used as a cache key
Values will be frozen with `.freeze(value)`
and must be hashable after freezing.
Not rigorous, but enough for our purposes.
"""
_hash = None
def __init__(self, d):
dict_set = dict.__setitem__
for key, value in d.items():
dict.__setitem__(self, key, self._freeze(value))
def _freeze(self, item):
"""Make values of a dict hashable
- list, set -> frozenset
- dict -> recursive _FrozenDict
- anything else: assumed hashable
"""
if isinstance(item, FrozenDict):
return item
elif isinstance(item, list):
return tuple(self._freeze(e) for e in item)
elif isinstance(item, set):
return frozenset(item)
elif isinstance(item, dict):
return FrozenDict(item)
else:
# any other type is assumed hashable
return item
def __setitem__(self, key):
raise RuntimeError("Cannot modify frozen {type(self).__name__}")
def update(self, other):
raise RuntimeError("Cannot modify frozen {type(self).__name__}")
def __hash__(self):
"""Cache hash because we are immutable"""
if self._hash is None:
self._hash = hash(tuple((key, value) for key, value in self.items()))
return self._hash
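A brief usage sketch for the removed helpers above (assuming `lru_cache_key` and `DoNotCache` from that module are in scope): the key function decides what counts as "the same call", and wrapping a result in `DoNotCache` returns it without storing it.

```python
# Illustrative use of the memoization helpers defined above.
@lru_cache_key(lambda username, debug=False: (username, debug))
def expensive_lookup(username, debug=False):
    result = f"record-for-{username}"
    if debug:
        # returned to the caller, but never stored in the cache
        return DoNotCache(result)
    return result


expensive_lookup("charlie")              # computed, then cached
expensive_lookup("charlie")              # served from the LRU cache
expensive_lookup("charlie", debug=True)  # computed each time, never cached
```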


@@ -2,7 +2,7 @@
# Copyright (c) Jupyter Development Team. # Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License. # Distributed under the terms of the Modified BSD License.
# version_info updated by running `tbump` # version_info updated by running `tbump`
version_info = (3, 0, 0, "b1", "") version_info = (2, 3, 0, "", "")
# pep 440 version: no dot before beta/rc, but before .dev # pep 440 version: no dot before beta/rc, but before .dev
# 0.1.0rc1 # 0.1.0rc1


@@ -3,16 +3,17 @@ import sys
from logging.config import fileConfig from logging.config import fileConfig
from alembic import context from alembic import context
from sqlalchemy import engine_from_config, pool from sqlalchemy import engine_from_config
from sqlalchemy import pool
# this is the Alembic Config object, which provides # this is the Alembic Config object, which provides
# access to the values within the .ini file in use. # access to the values within the .ini file in use.
config = context.config config = context.config
# Interpret the config file for Python logging. # Interpret the config file for Python logging.
# This line sets up loggers basically. # This line sets up loggers basically.
if 'jupyterhub' in sys.modules: if 'jupyterhub' in sys.modules:
from traitlets.config import MultipleInstanceError from traitlets.config import MultipleInstanceError
from jupyterhub.app import JupyterHub from jupyterhub.app import JupyterHub
app = None app = None
@@ -41,16 +42,6 @@ target_metadata = orm.Base.metadata
# my_important_option = config.get_main_option("my_important_option") # my_important_option = config.get_main_option("my_important_option")
# ... etc. # ... etc.
# pass these to context.configure(**config_opts)
common_config_opts = dict(
# target_metadata for autogenerate
target_metadata=target_metadata,
# transaction per migration to ensure
# each migration is 'complete' before running the next one
# (e.g. dropped tables)
transaction_per_migration=True,
)
def run_migrations_offline(): def run_migrations_offline():
"""Run migrations in 'offline' mode. """Run migrations in 'offline' mode.
@@ -65,15 +56,14 @@ def run_migrations_offline():
""" """
connectable = config.attributes.get('connection', None) connectable = config.attributes.get('connection', None)
config_opts = {}
config_opts.update(common_config_opts)
config_opts["literal_binds"] = True
if connectable is None: if connectable is None:
config_opts["url"] = config.get_main_option("sqlalchemy.url") url = config.get_main_option("sqlalchemy.url")
context.configure(url=url, target_metadata=target_metadata, literal_binds=True)
else: else:
config_opts["connection"] = connectable context.configure(
context.configure(**config_opts) connection=connectable, target_metadata=target_metadata, literal_binds=True
)
with context.begin_transaction(): with context.begin_transaction():
context.run_migrations() context.run_migrations()
@@ -87,8 +77,6 @@ def run_migrations_online():
""" """
connectable = config.attributes.get('connection', None) connectable = config.attributes.get('connection', None)
config_opts = {}
config_opts.update(common_config_opts)
if connectable is None: if connectable is None:
connectable = engine_from_config( connectable = engine_from_config(
@@ -98,10 +86,7 @@ def run_migrations_online():
) )
with connectable.connect() as connection: with connectable.connect() as connection:
context.configure( context.configure(connection=connection, target_metadata=target_metadata)
connection=connection,
**common_config_opts,
)
with context.begin_transaction(): with context.begin_transaction():
context.run_migrations() context.run_migrations()


@@ -11,8 +11,8 @@ down_revision = None
branch_labels = None branch_labels = None
depends_on = None depends_on = None
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
def upgrade(): def upgrade():


@@ -15,8 +15,8 @@ import logging
logger = logging.getLogger('alembic') logger = logging.getLogger('alembic')
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
tables = ('oauth_access_tokens', 'oauth_codes') tables = ('oauth_access_tokens', 'oauth_codes')


@@ -22,9 +22,8 @@ import logging
logger = logging.getLogger('alembic') logger = logging.getLogger('alembic')
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
from jupyterhub.orm import JSONDict from jupyterhub.orm import JSONDict


@@ -11,9 +11,8 @@ down_revision = '896818069c98'
branch_labels = None branch_labels = None
depends_on = None depends_on = None
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
from jupyterhub.orm import JSONDict from jupyterhub.orm import JSONDict


@@ -11,10 +11,10 @@ down_revision = '1cebaf56856c'
branch_labels = None branch_labels = None
depends_on = None depends_on = None
import logging
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
import logging
logger = logging.getLogger('alembic') logger = logging.getLogger('alembic')


@@ -1,115 +0,0 @@
"""api_token_scopes
Revision ID: 651f5419b74d
Revises: 833da8570507
Create Date: 2022-02-28 12:42:55.149046
"""
# revision identifiers, used by Alembic.
revision = '651f5419b74d'
down_revision = '833da8570507'
branch_labels = None
depends_on = None
import sqlalchemy as sa
from alembic import op
from sqlalchemy import Column, ForeignKey, Table
from sqlalchemy.orm import relationship
from sqlalchemy.orm.session import Session
from jupyterhub import orm, roles, scopes
def upgrade():
c = op.get_bind()
tables = sa.inspect(c.engine).get_table_names()
# oauth codes are short lived, no need to upgrade them
if 'oauth_code_role_map' in tables:
op.drop_table('oauth_code_role_map')
if 'oauth_codes' in tables:
op.add_column('oauth_codes', sa.Column('scopes', orm.JSONList(), nullable=True))
if 'api_tokens' in tables:
# may not be present,
# e.g. upgrade from 1.x, token table dropped
# in which case no migration to do
# define new scopes column on API tokens
op.add_column('api_tokens', sa.Column('scopes', orm.JSONList(), nullable=True))
if 'api_token_role_map' in tables:
# redefine the to-be-removed api_token->role relationship
# so we can run a query on it for the migration
token_role_map = Table(
"api_token_role_map",
orm.Base.metadata,
Column(
'api_token_id',
ForeignKey('api_tokens.id', ondelete='CASCADE'),
primary_key=True,
),
Column(
'role_id',
ForeignKey('roles.id', ondelete='CASCADE'),
primary_key=True,
),
extend_existing=True,
)
orm.APIToken.roles = relationship('Role', secondary='api_token_role_map')
# tokens have roles, evaluate to scopes
db = Session(bind=c)
for token in db.query(orm.APIToken):
token.scopes = list(roles.roles_to_scopes(token.roles))
db.commit()
# drop token-role relationship
op.drop_table('api_token_role_map')
if 'oauth_clients' in tables:
# define new scopes column on API tokens
op.add_column(
'oauth_clients', sa.Column('allowed_scopes', orm.JSONList(), nullable=True)
)
if 'oauth_client_role_map' in tables:
# redefine the to-be-removed api_token->role relationship
# so we can run a query on it for the migration
client_role_map = Table(
"oauth_client_role_map",
orm.Base.metadata,
Column(
'oauth_client_id',
ForeignKey('oauth_clients.id', ondelete='CASCADE'),
primary_key=True,
),
Column(
'role_id',
ForeignKey('roles.id', ondelete='CASCADE'),
primary_key=True,
),
extend_existing=True,
)
orm.OAuthClient.allowed_roles = relationship(
'Role', secondary='oauth_client_role_map'
)
# oauth clients have allowed_roles, evaluate to allowed_scopes
db = Session(bind=c)
for oauth_client in db.query(orm.OAuthClient):
allowed_scopes = set(roles.roles_to_scopes(oauth_client.allowed_roles))
allowed_scopes.update(scopes.access_scopes(oauth_client))
oauth_client.allowed_scopes = sorted(allowed_scopes)
db.commit()
# drop token-role relationship
op.drop_table('oauth_client_role_map')
def downgrade():
# cannot map permissions from scopes back to roles
# drop whole api token table (revokes all tokens), which will be recreated on hub start
op.drop_table('api_tokens')
op.drop_table('oauth_clients')
op.drop_table('oauth_codes')


@@ -12,11 +12,12 @@ down_revision = '4dc2d5a8c53c'
branch_labels = None branch_labels = None
depends_on = None depends_on = None
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
from jupyterhub import orm from jupyterhub import orm
naming_convention = orm.meta.naming_convention naming_convention = orm.meta.naming_convention


@@ -11,8 +11,8 @@ down_revision = 'd68c98b66cd4'
branch_labels = None branch_labels = None
depends_on = None depends_on = None
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
def upgrade(): def upgrade():


@@ -12,10 +12,10 @@ branch_labels = None
depends_on = None depends_on = None
from datetime import datetime
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
from datetime import datetime
def upgrade(): def upgrade():


@@ -11,8 +11,8 @@ down_revision = 'eeb276e51423'
branch_labels = None branch_labels = None
depends_on = None depends_on = None
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
def upgrade(): def upgrade():


@@ -11,8 +11,8 @@ down_revision = '99a28a4418e1'
branch_labels = None branch_labels = None
depends_on = None depends_on = None
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
def upgrade(): def upgrade():


@@ -12,9 +12,8 @@ down_revision = '19c0846f6344'
branch_labels = None branch_labels = None
depends_on = None depends_on = None
import sqlalchemy as sa
from alembic import op from alembic import op
import sqlalchemy as sa
from jupyterhub.orm import JSONDict from jupyterhub.orm import JSONDict


@@ -1,4 +1,9 @@
from . import auth, groups, hub, proxy, services, users from . import auth
from . import groups
from . import hub
from . import proxy
from . import services
from . import users
from .base import * from .base import *
default_handlers = [] default_handlers = []


@@ -1,16 +1,25 @@
"""Authorization handlers""" """Authorization handlers"""
# Copyright (c) Jupyter Development Team. # Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License. # Distributed under the terms of the Modified BSD License.
import itertools
import json import json
from datetime import datetime from datetime import datetime
from urllib.parse import parse_qsl, quote, urlencode, urlparse, urlunparse from urllib.parse import parse_qsl
from urllib.parse import quote
from urllib.parse import urlencode
from urllib.parse import urlparse
from urllib.parse import urlunparse
from oauthlib import oauth2 from oauthlib import oauth2
from tornado import web from tornado import web
from .. import orm, roles, scopes from .. import orm
from ..utils import get_browser_protocol, token_authenticated from .. import roles
from .base import APIHandler, BaseHandler from .. import scopes
from ..utils import get_browser_protocol
from ..utils import token_authenticated
from .base import APIHandler
from .base import BaseHandler
class TokenAPIHandler(APIHandler): class TokenAPIHandler(APIHandler):
@@ -29,7 +38,7 @@ class TokenAPIHandler(APIHandler):
if owner: if owner:
# having a token means we should be able to read the owner's model # having a token means we should be able to read the owner's model
# (this is the only thing this handler is for) # (this is the only thing this handler is for)
self.expanded_scopes |= scopes.identify_scopes(owner) self.expanded_scopes.update(scopes.identify_scopes(owner))
self.parsed_scopes = scopes.parse_scopes(self.expanded_scopes) self.parsed_scopes = scopes.parse_scopes(self.expanded_scopes)
# record activity whenever we see a token # record activity whenever we see a token
@@ -171,7 +180,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
raise raise
self.send_oauth_response(headers, body, status) self.send_oauth_response(headers, body, status)
def needs_oauth_confirm(self, user, oauth_client, requested_scopes): def needs_oauth_confirm(self, user, oauth_client, roles):
"""Return whether the given oauth client needs to prompt for access for the given user """Return whether the given oauth client needs to prompt for access for the given user
Checks list for oauth clients that don't need confirmation Checks list for oauth clients that don't need confirmation
@@ -202,20 +211,20 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
user_id=user.id, user_id=user.id,
client_id=oauth_client.identifier, client_id=oauth_client.identifier,
) )
authorized_scopes = set() authorized_roles = set()
for token in existing_tokens: for token in existing_tokens:
authorized_scopes.update(token.scopes) authorized_roles.update({role.name for role in token.roles})
if authorized_scopes: if authorized_roles:
if set(requested_scopes).issubset(authorized_scopes): if set(roles).issubset(authorized_roles):
self.log.debug( self.log.debug(
f"User {user.name} has already authorized {oauth_client.identifier} for scopes {requested_scopes}" f"User {user.name} has already authorized {oauth_client.identifier} for roles {roles}"
) )
return False return False
else: else:
self.log.debug( self.log.debug(
f"User {user.name} has authorized {oauth_client.identifier}" f"User {user.name} has authorized {oauth_client.identifier}"
f" for scopes {authorized_scopes}, confirming additional scopes {requested_scopes}" f" for roles {authorized_roles}, confirming additonal roles {roles}"
) )
# default: require confirmation # default: require confirmation
return True return True
@@ -242,7 +251,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
uri, http_method, body, headers = self.extract_oauth_params() uri, http_method, body, headers = self.extract_oauth_params()
try: try:
( (
requested_scopes, role_names,
credentials, credentials,
) = self.oauth_provider.validate_authorization_request( ) = self.oauth_provider.validate_authorization_request(
uri, http_method, body, headers uri, http_method, body, headers
@@ -274,51 +283,26 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
raise web.HTTPError( raise web.HTTPError(
403, f"You do not have permission to access {client.description}" 403, f"You do not have permission to access {client.description}"
) )
if not self.needs_oauth_confirm(self.current_user, client, role_names):
# subset 'raw scopes' to those held by authenticating user
requested_scopes = set(requested_scopes)
user = self.current_user
# raw, _not_ expanded scopes
user_scopes = roles.roles_to_scopes(roles.get_roles_for(user.orm_user))
# these are some scopes the user may not have
# in 'raw' form, but definitely have at this point
# make sure they are here, because we are computing the
# 'raw' scope intersection,
# rather than the expanded_scope intersection
required_scopes = {*scopes.identify_scopes(), *scopes.access_scopes(client)}
user_scopes |= {"inherit", *required_scopes}
allowed_scopes = requested_scopes.intersection(user_scopes)
excluded_scopes = requested_scopes.difference(user_scopes)
# TODO: compute lower-level intersection of remaining _expanded_ scopes
# (e.g. user has admin:users, requesting read:users!group=x)
if excluded_scopes:
self.log.warning(
f"Service {client.description} requested scopes {','.join(requested_scopes)}"
f" for user {self.current_user.name},"
f" granting only {','.join(allowed_scopes) or '[]'}."
)
if not self.needs_oauth_confirm(self.current_user, client, allowed_scopes):
self.log.debug( self.log.debug(
"Skipping oauth confirmation for %s accessing %s", "Skipping oauth confirmation for %s accessing %s",
self.current_user, self.current_user,
client.description, client.description,
) )
# this is the pre-1.0 behavior for all oauth # this is the pre-1.0 behavior for all oauth
self._complete_login(uri, headers, allowed_scopes, credentials) self._complete_login(uri, headers, role_names, credentials)
return return
# discard 'required' scopes from description # resolve roles to scopes for authorization page
# no need to describe the ability to access itself raw_scopes = set()
scopes_to_describe = allowed_scopes.difference(required_scopes) if role_names:
role_objects = (
if not scopes_to_describe: self.db.query(orm.Role).filter(orm.Role.name.in_(role_names)).all()
# TODO: describe all scopes? )
# Not right now, because the no-scope default 'identify' text raw_scopes = set(
# is clearer than what we produce for those scopes individually itertools.chain(*(role.scopes for role in role_objects))
)
if not raw_scopes:
scope_descriptions = [ scope_descriptions = [
{ {
"scope": None, "scope": None,
@@ -328,8 +312,8 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
"filter": "", "filter": "",
} }
] ]
elif 'inherit' in scopes_to_describe: elif 'inherit' in raw_scopes:
allowed_scopes = scopes_to_describe = ['inherit'] raw_scopes = ['inherit']
scope_descriptions = [ scope_descriptions = [
{ {
"scope": "inherit", "scope": "inherit",
@@ -341,7 +325,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
] ]
else: else:
scope_descriptions = scopes.describe_raw_scopes( scope_descriptions = scopes.describe_raw_scopes(
scopes_to_describe, raw_scopes,
username=self.current_user.name, username=self.current_user.name,
) )
# Render oauth 'Authorize application...' page # Render oauth 'Authorize application...' page
@@ -350,7 +334,7 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
await self.render_template( await self.render_template(
"oauth.html", "oauth.html",
auth_state=auth_state, auth_state=auth_state,
allowed_scopes=allowed_scopes, role_names=role_names,
scope_descriptions=scope_descriptions, scope_descriptions=scope_descriptions,
oauth_client=client, oauth_client=client,
) )
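
Editor's note: the scope-based variant above (left column) narrows the requested scopes to those the authenticating user actually holds. The snippet below is a minimal, self-contained sketch of that set arithmetic, not JupyterHub source; all scope strings are stand-in values.

# Minimal sketch of the raw-scope narrowing in the left column: requested
# scopes are intersected with the user's scopes after force-including the
# identify/access scopes every token receives anyway. Values are made up.
def narrow_requested_scopes(requested, user_scopes, required_scopes):
    user_scopes = set(user_scopes) | {"inherit"} | set(required_scopes)
    allowed = set(requested) & user_scopes
    excluded = set(requested) - user_scopes
    return allowed, excluded

allowed, excluded = narrow_requested_scopes(
    requested={"read:users", "admin:users"},
    user_scopes={"read:users"},
    required_scopes={"access:services!service=my-service"},
)
print(sorted(allowed))   # ['access:services!service=my-service', 'read:users']
print(sorted(excluded))  # ['admin:users']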
@@ -397,10 +381,6 @@ class OAuthAuthorizeHandler(OAuthHandler, BaseHandler):
# The scopes the user actually authorized, i.e. checkboxes # The scopes the user actually authorized, i.e. checkboxes
# that were selected. # that were selected.
scopes = self.get_arguments('scopes') scopes = self.get_arguments('scopes')
if scopes == []:
# avoid triggering default scopes (provider selects default scopes when scopes is falsy)
# when an explicit empty list is authorized
scopes = ["identify"]
# credentials we need in the validator # credentials we need in the validator
credentials = self.add_credentials() credentials = self.add_credentials()


@@ -4,15 +4,19 @@
import json import json
from functools import lru_cache from functools import lru_cache
from http.client import responses from http.client import responses
from urllib.parse import parse_qs, urlencode, urlparse, urlunparse from urllib.parse import parse_qs
from urllib.parse import urlencode
from urllib.parse import urlparse
from urllib.parse import urlunparse
from sqlalchemy.exc import SQLAlchemyError from sqlalchemy.exc import SQLAlchemyError
from tornado import web from tornado import web
from .. import orm from .. import orm
from ..handlers import BaseHandler from ..handlers import BaseHandler
from ..scopes import get_scopes_for from ..utils import get_browser_protocol
from ..utils import get_browser_protocol, isoformat, url_escape_path, url_path_join from ..utils import isoformat
from ..utils import url_path_join
PAGINATION_MEDIA_TYPE = "application/jupyterhub-pagination+json" PAGINATION_MEDIA_TYPE = "application/jupyterhub-pagination+json"
@@ -187,44 +191,22 @@ class APIHandler(BaseHandler):
json.dumps({'status': status_code, 'message': message or status_message}) json.dumps({'status': status_code, 'message': message or status_message})
) )
def server_model(self, spawner, *, user=None): def server_model(self, spawner):
"""Get the JSON model for a Spawner """Get the JSON model for a Spawner
Assume server permission already granted Assume server permission already granted"""
"""
if isinstance(spawner, orm.Spawner):
# if an orm.Spawner is passed,
# create a model for a stopped Spawner
# not all info is available without the higher-level Spawner wrapper
orm_spawner = spawner
pending = None
ready = False
stopped = True
user = user
if user is None:
raise RuntimeError("Must specify User with orm.Spawner")
state = orm_spawner.state
else:
orm_spawner = spawner.orm_spawner
pending = spawner.pending
ready = spawner.ready
user = spawner.user
stopped = not spawner.active
state = spawner.get_state()
model = { model = {
'name': orm_spawner.name, 'name': spawner.name,
'last_activity': isoformat(orm_spawner.last_activity), 'last_activity': isoformat(spawner.orm_spawner.last_activity),
'started': isoformat(orm_spawner.started), 'started': isoformat(spawner.orm_spawner.started),
'pending': pending, 'pending': spawner.pending,
'ready': ready, 'ready': spawner.ready,
'stopped': stopped, 'url': url_path_join(spawner.user.url, spawner.name, '/'),
'url': url_path_join(user.url, url_escape_path(spawner.name), '/'),
'user_options': spawner.user_options, 'user_options': spawner.user_options,
'progress_url': user.progress_url(spawner.name), 'progress_url': spawner._progress_url,
} }
scope_filter = self.get_scope_filter('admin:server_state') scope_filter = self.get_scope_filter('admin:server_state')
if scope_filter(spawner, kind='server'): if scope_filter(spawner, kind='server'):
model['state'] = state model['state'] = spawner.get_state()
return model return model
def token_model(self, token): def token_model(self, token):
@@ -242,9 +224,7 @@ class APIHandler(BaseHandler):
owner_key: owner, owner_key: owner,
'id': token.api_id, 'id': token.api_id,
'kind': 'api_token', 'kind': 'api_token',
# deprecated field, but leave it present. 'roles': [r.name for r in token.roles],
'roles': [],
'scopes': list(get_scopes_for(token)),
'created': isoformat(token.created), 'created': isoformat(token.created),
'last_activity': isoformat(token.last_activity), 'last_activity': isoformat(token.last_activity),
'expires_at': isoformat(token.expires_at), 'expires_at': isoformat(token.expires_at),
@@ -270,22 +250,10 @@ class APIHandler(BaseHandler):
keys.update(allowed_keys) keys.update(allowed_keys)
return model return model
_include_stopped_servers = None
@property
def include_stopped_servers(self):
"""Whether stopped servers should be included in user models"""
if self._include_stopped_servers is None:
self._include_stopped_servers = self.get_argument(
"include_stopped_servers", "0"
).lower() not in {"0", "false"}
return self._include_stopped_servers
def user_model(self, user): def user_model(self, user):
"""Get the JSON model for a User object""" """Get the JSON model for a User object"""
if isinstance(user, orm.User): if isinstance(user, orm.User):
user = self.users[user.id] user = self.users[user.id]
include_stopped_servers = self.include_stopped_servers
model = { model = {
'kind': 'user', 'kind': 'user',
'name': user.name, 'name': user.name,
@@ -325,29 +293,18 @@ class APIHandler(BaseHandler):
if '' in user.spawners and 'pending' in allowed_keys: if '' in user.spawners and 'pending' in allowed_keys:
model['pending'] = user.spawners[''].pending model['pending'] = user.spawners[''].pending
servers = {} servers = model['servers'] = {}
scope_filter = self.get_scope_filter('read:servers') scope_filter = self.get_scope_filter('read:servers')
for name, spawner in user.spawners.items(): for name, spawner in user.spawners.items():
# include 'active' servers, not just ready # include 'active' servers, not just ready
# (this includes pending events) # (this includes pending events)
if (spawner.active or include_stopped_servers) and scope_filter( if spawner.active and scope_filter(spawner, kind='server'):
spawner, kind='server'
):
servers[name] = self.server_model(spawner) servers[name] = self.server_model(spawner)
if not servers and 'servers' not in allowed_keys:
if include_stopped_servers:
# add any stopped servers in the db
seen = set(servers.keys())
for name, orm_spawner in user.orm_spawners.items():
if name not in seen and scope_filter(orm_spawner, kind='server'):
servers[name] = self.server_model(orm_spawner, user=user)
if "servers" in allowed_keys or servers:
# omit servers if no access # omit servers if no access
# leave present and empty # leave present and empty
# if request has access to read servers in general # if request has access to read servers in general
model["servers"] = servers model.pop('servers')
return model return model
def group_model(self, group): def group_model(self, group):
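
Editor's note: taken together, the server_model/user_model changes on the left-column side let clients request models for stopped servers. A hedged sketch of the corresponding REST call follows; the hub URL, username, and token are placeholders.

# Hedged sketch: list a user's servers, including stopped ones, via the
# include_stopped_servers query parameter present only in the left-column
# variant. hub_url, the username, and the token value are placeholders.
import requests

hub_url = "http://127.0.0.1:8081/hub/api"
headers = {"Authorization": "token REPLACE_ME"}

r = requests.get(
    f"{hub_url}/users/alice",
    params={"include_stopped_servers": "1"},
    headers=headers,
)
r.raise_for_status()
for name, server in r.json().get("servers", {}).items():
    # the 'stopped' field is only emitted by the left-column server_model
    state = "stopped" if server.get("stopped") else "active"
    print(f"{name or '(default)'}: {state}")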


@@ -6,7 +6,8 @@ import json
from tornado import web from tornado import web
from .. import orm from .. import orm
from ..scopes import Scope, needs_scope from ..scopes import needs_scope
from ..scopes import Scope
from .base import APIHandler from .base import APIHandler


@@ -26,7 +26,7 @@ class ProxyAPIHandler(APIHandler):
else: else:
routes = {} routes = {}
end = offset + limit end = offset + limit
for i, key in enumerate(sorted(all_routes.keys())): for i, key in sorted(all_routes.keys()):
if i < offset: if i < offset:
continue continue
elif i >= end: elif i >= end:
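
Editor's note: the one-line change above restores the index that the right-column loop drops; iterating sorted(all_routes.keys()) alone yields plain strings, which cannot be unpacked into (i, key). A standalone sketch of the corrected offset/limit pagination, with a made-up routes dict:

# Standalone sketch mirroring the corrected left-column loop.
all_routes = {"/c": "svc-c", "/a": "svc-a", "/b": "svc-b"}
offset, limit = 1, 1
end = offset + limit
routes = {}
for i, key in enumerate(sorted(all_routes.keys())):
    if i < offset:
        continue
    elif i >= end:
        break
    routes[key] = all_routes[key]
print(routes)  # {'/b': 'svc-b'}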


@@ -6,7 +6,8 @@ Currently GET-only, no actions can be taken to modify services.
# Distributed under the terms of the Modified BSD License. # Distributed under the terms of the Modified BSD License.
import json import json
from ..scopes import Scope, needs_scope from ..scopes import needs_scope
from ..scopes import Scope
from .base import APIHandler from .base import APIHandler


@@ -3,25 +3,26 @@
# Distributed under the terms of the Modified BSD License. # Distributed under the terms of the Modified BSD License.
import asyncio import asyncio
import json import json
from datetime import datetime, timedelta, timezone from datetime import datetime
from datetime import timedelta
from datetime import timezone
from async_generator import aclosing from async_generator import aclosing
from dateutil.parser import parse as parse_date from dateutil.parser import parse as parse_date
from sqlalchemy import func, or_ from sqlalchemy import func
from sqlalchemy import or_
from tornado import web from tornado import web
from tornado.iostream import StreamClosedError from tornado.iostream import StreamClosedError
from .. import orm, scopes from .. import orm
from .. import scopes
from ..roles import assign_default_roles from ..roles import assign_default_roles
from ..scopes import needs_scope from ..scopes import needs_scope
from ..user import User from ..user import User
from ..utils import ( from ..utils import isoformat
isoformat, from ..utils import iterate_until
iterate_until, from ..utils import maybe_future
maybe_future, from ..utils import url_path_join
url_escape_path,
url_path_join,
)
from .base import APIHandler from .base import APIHandler
@@ -50,7 +51,7 @@ class SelfAPIHandler(APIHandler):
for scope in identify_scopes: for scope in identify_scopes:
if scope not in self.expanded_scopes: if scope not in self.expanded_scopes:
_added_scopes.add(scope) _added_scopes.add(scope)
self.expanded_scopes |= {scope} self.expanded_scopes.add(scope)
if _added_scopes: if _added_scopes:
# re-parse with new scopes # re-parse with new scopes
self.parsed_scopes = scopes.parse_scopes(self.expanded_scopes) self.parsed_scopes = scopes.parse_scopes(self.expanded_scopes)
@@ -405,18 +406,21 @@ class UserTokenListAPIHandler(APIHandler):
if requester is not user: if requester is not user:
note += f" by {kind} {requester.name}" note += f" by {kind} {requester.name}"
token_roles = body.get("roles") token_roles = body.get('roles')
token_scopes = body.get("scopes")
try: try:
api_token = user.new_api_token( api_token = user.new_api_token(
note=note, note=note,
expires_in=body.get('expires_in', None), expires_in=body.get('expires_in', None),
roles=token_roles, roles=token_roles,
scopes=token_scopes,
) )
except ValueError as e: except KeyError:
raise web.HTTPError(400, str(e)) raise web.HTTPError(404, "Requested roles %r not found" % token_roles)
except ValueError:
raise web.HTTPError(
403,
"Requested roles %r cannot have higher permissions than the token owner"
% token_roles,
)
if requester is not user: if requester is not user:
self.log.info( self.log.info(
"%s %s requested API token for %s", "%s %s requested API token for %s",
@@ -691,7 +695,7 @@ class SpawnProgressAPIHandler(APIHandler):
# - spawner not running at all # - spawner not running at all
# - spawner failed # - spawner failed
# - spawner pending start (what we expect) # - spawner pending start (what we expect)
url = url_path_join(user.url, url_escape_path(server_name), '/') url = url_path_join(user.url, server_name, '/')
ready_event = { ready_event = {
'progress': 100, 'progress': 100,
'ready': True, 'ready': True,


@@ -11,85 +11,106 @@ import re
import secrets import secrets
import signal import signal
import socket import socket
import ssl
import sys import sys
import time import time
from concurrent.futures import ThreadPoolExecutor from concurrent.futures import ThreadPoolExecutor
from datetime import datetime, timedelta, timezone from datetime import datetime
from datetime import timedelta
from datetime import timezone
from functools import partial
from getpass import getuser from getpass import getuser
from glob import glob
from itertools import chain
from operator import itemgetter from operator import itemgetter
from textwrap import dedent from textwrap import dedent
from urllib.parse import unquote, urlparse, urlunparse from urllib.parse import unquote
from urllib.parse import urlparse
from urllib.parse import urlunparse
if sys.version_info[:2] < (3, 3): if sys.version_info[:2] < (3, 3):
raise ValueError("Python < 3.3 not supported: %s" % sys.version) raise ValueError("Python < 3.3 not supported: %s" % sys.version)
import tornado.httpserver # For compatibility with python versions 3.6 or earlier.
import tornado.options # asyncio.Task.all_tasks() is fully moved to asyncio.all_tasks() starting with 3.9. Also applies to current_task.
try:
asyncio_all_tasks = asyncio.all_tasks
asyncio_current_task = asyncio.current_task
except AttributeError as e:
asyncio_all_tasks = asyncio.Task.all_tasks
asyncio_current_task = asyncio.Task.current_task
from dateutil.parser import parse as parse_date from dateutil.parser import parse as parse_date
from jinja2 import ChoiceLoader, Environment, FileSystemLoader, PrefixLoader from jinja2 import Environment, FileSystemLoader, PrefixLoader, ChoiceLoader
from jupyter_telemetry.eventlog import EventLog
from sqlalchemy.exc import OperationalError, SQLAlchemyError from sqlalchemy.exc import OperationalError, SQLAlchemyError
from tornado import gen, web
from tornado.httpclient import AsyncHTTPClient from tornado.httpclient import AsyncHTTPClient
import tornado.httpserver
from tornado.ioloop import IOLoop, PeriodicCallback from tornado.ioloop import IOLoop, PeriodicCallback
from tornado.log import access_log, app_log, gen_log from tornado.log import app_log, access_log, gen_log
import tornado.options
from tornado import gen, web
from traitlets import ( from traitlets import (
Any,
Bool,
Bytes,
Dict,
Float,
Instance,
Integer,
List,
Set,
Tuple,
Unicode, Unicode,
Integer,
Dict,
TraitError,
List,
Bool,
Any,
Tuple,
Type,
Set,
Instance,
Bytes,
Float,
Union, Union,
default,
observe, observe,
default,
validate, validate,
) )
from traitlets.config import Application, Configurable, catch_config_error from traitlets.config import Application, Configurable, catch_config_error
from jupyter_telemetry.eventlog import EventLog
here = os.path.dirname(__file__) here = os.path.dirname(__file__)
import jupyterhub import jupyterhub
from . import handlers, apihandlers
from .handlers.static import CacheControlStaticFilesHandler, LogoHandler
from .services.service import Service
from . import apihandlers, crypto, dbutil, handlers, orm, roles, scopes from . import crypto
from . import dbutil, orm
from . import roles
from .user import UserDict
from .oauth.provider import make_provider
from ._data import DATA_FILES_PATH from ._data import DATA_FILES_PATH
from .log import CoroutineLogFormatter, log_request
from .proxy import Proxy, ConfigurableHTTPProxy
from .traitlets import URLPrefix, Command, EntryPointType, Callable
from .utils import (
AnyTimeoutError,
catch_db_error,
maybe_future,
url_path_join,
print_stacks,
print_ps_info,
make_ssl_context,
)
from .metrics import HUB_STARTUP_DURATION_SECONDS
from .metrics import INIT_SPAWNERS_DURATION_SECONDS
from .metrics import RUNNING_SERVERS
from .metrics import TOTAL_USERS
# classes for config # classes for config
from .auth import Authenticator, PAMAuthenticator from .auth import Authenticator, PAMAuthenticator
from .crypto import CryptKeeper from .crypto import CryptKeeper
from .spawner import Spawner, LocalProcessSpawner
from .objects import Hub, Server
# For faking stats # For faking stats
from .emptyclass import EmptyClass from .emptyclass import EmptyClass
from .handlers.static import CacheControlStaticFilesHandler, LogoHandler
from .log import CoroutineLogFormatter, log_request
from .metrics import (
HUB_STARTUP_DURATION_SECONDS,
INIT_SPAWNERS_DURATION_SECONDS,
RUNNING_SERVERS,
TOTAL_USERS,
)
from .oauth.provider import make_provider
from .objects import Hub, Server
from .proxy import ConfigurableHTTPProxy, Proxy
from .services.service import Service
from .spawner import LocalProcessSpawner, Spawner
from .traitlets import Callable, Command, EntryPointType, URLPrefix
from .user import UserDict
from .utils import (
AnyTimeoutError,
catch_db_error,
make_ssl_context,
maybe_future,
print_ps_info,
print_stacks,
url_path_join,
)
common_aliases = { common_aliases = {
'log-level': 'Application.log_level', 'log-level': 'Application.log_level',
@@ -332,29 +353,6 @@ class JupyterHub(Application):
""", """,
).tag(config=True) ).tag(config=True)
custom_scopes = Dict(
key_trait=Unicode(),
value_trait=Dict(
key_trait=Unicode(),
),
help="""Custom scopes to define.
For use when defining custom roles,
to grant users granular permissions
All custom scopes must have a description,
and must start with the prefix `custom:`.
For example::
custom_scopes = {
"custom:jupyter_server:read": {
"description": "read-only access to a single-user server",
},
}
""",
).tag(config=True)
config_file = Unicode('jupyterhub_config.py', help="The config file to load").tag( config_file = Unicode('jupyterhub_config.py', help="The config file to load").tag(
config=True config=True
) )
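
Editor's note: the custom_scopes trait above exists only in the left-column variant (the role-based variant on the right does not define it). A hedged jupyterhub_config.py sketch based on the docstring example, with made-up scope, role, and user names:

# Hedged jupyterhub_config.py sketch: define a custom scope and grant it
# through a role. Names are made up; custom_scopes is only available in the
# left-column variant.
c.JupyterHub.custom_scopes = {
    "custom:jupyter_server:read": {
        "description": "read-only access to a single-user server",
    },
}
c.JupyterHub.load_roles = [
    {
        "name": "server-reader",
        "scopes": ["custom:jupyter_server:read"],
        "users": ["alice"],
    },
]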
@@ -703,14 +701,11 @@ class JupyterHub(Application):
""", """,
).tag(config=True) ).tag(config=True)
@validate("subdomain_host") def _subdomain_host_changed(self, name, old, new):
def _validate_subdomain_host(self, proposal):
new = proposal.value
if new and '://' not in new: if new and '://' not in new:
# host should include '://' # host should include '://'
# if not specified, assume https: You have to be really explicit about HTTP! # if not specified, assume https: You have to be really explicit about HTTP!
new = 'https://' + new self.subdomain_host = 'https://' + new
return new
domain = Unicode(help="domain name, e.g. 'example.com' (excludes protocol, port)") domain = Unicode(help="domain name, e.g. 'example.com' (excludes protocol, port)")
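
Editor's note: the subdomain_host change above swaps an observer-style method that reassigns the trait for a @validate handler that returns the normalized value. A generic traitlets sketch of that pattern, using a made-up class:

# Generic traitlets sketch (made-up class) of the @validate pattern used by
# the left-column variant: the handler returns the normalized value rather
# than mutating the trait from inside an observer.
from traitlets import HasTraits, Unicode, validate


class HostConfig(HasTraits):
    subdomain_host = Unicode("")

    @validate("subdomain_host")
    def _validate_subdomain_host(self, proposal):
        new = proposal.value
        if new and "://" not in new:
            # assume https unless a protocol is given explicitly
            new = "https://" + new
        return new


cfg = HostConfig(subdomain_host="hub.example.com")
print(cfg.subdomain_host)  # https://hub.example.com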
@@ -1124,7 +1119,7 @@ class JupyterHub(Application):
@default('authenticator') @default('authenticator')
def _authenticator_default(self): def _authenticator_default(self):
return self.authenticator_class(parent=self, _deprecated_db_session=self.db) return self.authenticator_class(parent=self, db=self.db)
implicit_spawn_seconds = Float( implicit_spawn_seconds = Float(
0, 0,
@@ -1312,14 +1307,11 @@ class JupyterHub(Application):
admin_access = Bool( admin_access = Bool(
False, False,
help="""DEPRECATED since version 2.0.0. help="""Grant admin users permission to access single-user servers.
The default admin role has full permissions, use custom RBAC scopes instead to Users should be properly informed if this is enabled.
create restricted administrator roles.
https://jupyterhub.readthedocs.io/en/stable/rbac/index.html
""", """,
).tag(config=True) ).tag(config=True)
admin_users = Set( admin_users = Set(
help="""DEPRECATED since version 0.7.2, use Authenticator.admin_users instead.""" help="""DEPRECATED since version 0.7.2, use Authenticator.admin_users instead."""
).tag(config=True) ).tag(config=True)
@@ -1697,9 +1689,7 @@ class JupyterHub(Application):
for authority, files in self.internal_ssl_authorities.items(): for authority, files in self.internal_ssl_authorities.items():
if files: if files:
self.log.info("Adding CA for %s", authority) self.log.info("Adding CA for %s", authority)
certipy.store.add_record( certipy.store.add_record(authority, is_ca=True, files=files)
authority, is_ca=True, files=files, overwrite=True
)
self.internal_trust_bundles = certipy.trust_from_graph( self.internal_trust_bundles = certipy.trust_from_graph(
self.internal_ssl_components_trust self.internal_ssl_components_trust
@@ -2028,10 +2018,7 @@ class JupyterHub(Application):
db.commit() db.commit()
async def init_role_creation(self): async def init_role_creation(self):
"""Load default and user-defined roles and scopes into the database""" """Load default and predefined roles into the database"""
if self.custom_scopes:
self.log.info(f"Defining {len(self.custom_scopes)} custom scopes.")
scopes.define_custom_scopes(self.custom_scopes)
self.log.debug('Loading roles into database') self.log.debug('Loading roles into database')
default_roles = roles.get_default_roles() default_roles = roles.get_default_roles()
config_role_names = [r['name'] for r in self.load_roles] config_role_names = [r['name'] for r in self.load_roles]
@@ -2068,6 +2055,23 @@ class JupyterHub(Application):
# make sure we load any default roles not overridden # make sure we load any default roles not overridden
init_roles = list(default_roles_dict.values()) + init_roles init_roles = list(default_roles_dict.values()) + init_roles
if roles_with_new_permissions:
unauthorized_oauth_tokens = (
self.db.query(orm.APIToken)
.filter(
orm.APIToken.roles.any(
orm.Role.name.in_(roles_with_new_permissions)
)
)
.filter(orm.APIToken.client_id != 'jupyterhub')
)
for token in unauthorized_oauth_tokens:
self.log.warning(
"Deleting OAuth token %s; one of its roles obtained new permissions that were not authorized by user"
% token
)
self.db.delete(token)
self.db.commit()
init_role_names = [r['name'] for r in init_roles] init_role_names = [r['name'] for r in init_roles]
if ( if (
@@ -2203,6 +2207,9 @@ class JupyterHub(Application):
for kind in kinds: for kind in kinds:
roles.check_for_default_roles(db, kind) roles.check_for_default_roles(db, kind)
# check tokens for default roles
roles.check_for_default_roles(db, bearer='tokens')
async def _add_tokens(self, token_dict, kind): async def _add_tokens(self, token_dict, kind):
"""Add tokens for users or services to the database""" """Add tokens for users or services to the database"""
if kind == 'user': if kind == 'user':
@@ -2367,34 +2374,21 @@ class JupyterHub(Application):
service.orm.server = None service.orm.server = None
if service.oauth_available: if service.oauth_available:
allowed_scopes = set() allowed_roles = []
if service.oauth_client_allowed_scopes:
allowed_scopes.update(service.oauth_client_allowed_scopes)
if service.oauth_roles: if service.oauth_roles:
if not allowed_scopes: allowed_roles = list(
# DEPRECATED? It's still convenient and valid, self.db.query(orm.Role).filter(
# e.g. 'admin' orm.Role.name.in_(service.oauth_roles)
allowed_roles = list(
self.db.query(orm.Role).filter(
orm.Role.name.in_(service.oauth_roles)
)
)
allowed_scopes.update(roles.roles_to_scopes(allowed_roles))
else:
self.log.warning(
f"Ignoring oauth_roles for {service.name}: {service.oauth_roles},"
f" using oauth_client_allowed_scopes={allowed_scopes}."
) )
)
oauth_client = self.oauth_provider.add_client( oauth_client = self.oauth_provider.add_client(
client_id=service.oauth_client_id, client_id=service.oauth_client_id,
client_secret=service.api_token, client_secret=service.api_token,
redirect_uri=service.oauth_redirect_uri, redirect_uri=service.oauth_redirect_uri,
allowed_roles=allowed_roles,
description="JupyterHub service %s" % service.name, description="JupyterHub service %s" % service.name,
) )
service.orm.oauth_client = oauth_client service.orm.oauth_client = oauth_client
# add access-scopes, derived from OAuthClient itself
allowed_scopes.update(scopes.access_scopes(oauth_client))
oauth_client.allowed_scopes = sorted(allowed_scopes)
else: else:
if service.oauth_client: if service.oauth_client:
self.db.delete(service.oauth_client) self.db.delete(service.oauth_client)
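
Editor's note: the service wiring above reads Service.oauth_client_allowed_scopes in the left-column variant and resolves Service.oauth_roles in the right-column one. A hedged jupyterhub_config.py sketch showing both forms for an external service; the service name, token, and redirect URI are placeholders.

# Hedged jupyterhub_config.py sketch of the two service configurations.
c.JupyterHub.services = [
    {
        "name": "my-service",
        "api_token": "REPLACE_ME",
        "oauth_redirect_uri": "https://service.example.com/oauth_callback",
        # scope-based form read by the left-column code:
        "oauth_client_allowed_scopes": ["read:users!user"],
        # role-based form resolved by the right-column code:
        # "oauth_roles": ["user"],
    },
]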
@@ -3064,7 +3058,7 @@ class JupyterHub(Application):
self.internal_ssl_key, self.internal_ssl_key,
self.internal_ssl_cert, self.internal_ssl_cert,
cafile=self.internal_ssl_ca, cafile=self.internal_ssl_ca,
purpose=ssl.Purpose.CLIENT_AUTH, check_hostname=False,
) )
# start the webserver # start the webserver
@@ -3241,7 +3235,11 @@ class JupyterHub(Application):
self._atexit_ran = True self._atexit_ran = True
self._init_asyncio_patch() self._init_asyncio_patch()
# run the cleanup step (in a new loop, because the interrupted one is unclean) # run the cleanup step (in a new loop, because the interrupted one is unclean)
asyncio.run(self.cleanup()) asyncio.set_event_loop(asyncio.new_event_loop())
IOLoop.clear_current()
loop = IOLoop()
loop.make_current()
loop.run_sync(self.cleanup)
async def shutdown_cancel_tasks(self, sig=None): async def shutdown_cancel_tasks(self, sig=None):
"""Cancel all other tasks of the event loop and initiate cleanup""" """Cancel all other tasks of the event loop and initiate cleanup"""
@@ -3252,7 +3250,7 @@ class JupyterHub(Application):
await self.cleanup() await self.cleanup()
tasks = [t for t in asyncio.all_tasks() if t is not asyncio.current_task()] tasks = [t for t in asyncio_all_tasks() if t is not asyncio_current_task()]
if tasks: if tasks:
self.log.debug("Cancelling pending tasks") self.log.debug("Cancelling pending tasks")
@@ -3265,7 +3263,7 @@ class JupyterHub(Application):
except StopAsyncIteration as e: except StopAsyncIteration as e:
self.log.error("Caught StopAsyncIteration Exception", exc_info=True) self.log.error("Caught StopAsyncIteration Exception", exc_info=True)
tasks = [t for t in asyncio.all_tasks()] tasks = [t for t in asyncio_all_tasks()]
for t in tasks: for t in tasks:
self.log.debug("Task status: %s", t) self.log.debug("Task status: %s", t)
asyncio.get_event_loop().stop() asyncio.get_event_loop().stop()
@@ -3301,19 +3299,16 @@ class JupyterHub(Application):
def launch_instance(cls, argv=None): def launch_instance(cls, argv=None):
self = cls.instance() self = cls.instance()
self._init_asyncio_patch() self._init_asyncio_patch()
loop = IOLoop(make_current=False) loop = IOLoop.current()
task = asyncio.ensure_future(self.launch_instance_async(argv))
try:
loop.run_sync(self.launch_instance_async, argv)
except Exception:
loop.close()
raise
try: try:
loop.start() loop.start()
except KeyboardInterrupt: except KeyboardInterrupt:
print("\nInterrupted") print("\nInterrupted")
finally: finally:
if task.done():
# re-raise exceptions in launch_instance_async
task.result()
loop.stop() loop.stop()
loop.close() loop.close()


@@ -9,8 +9,9 @@ import warnings
from concurrent.futures import ThreadPoolExecutor from concurrent.futures import ThreadPoolExecutor
from functools import partial from functools import partial
from shutil import which from shutil import which
from subprocess import PIPE, STDOUT, Popen from subprocess import PIPE
from textwrap import dedent from subprocess import Popen
from subprocess import STDOUT
try: try:
import pamela import pamela
@@ -19,12 +20,13 @@ except Exception as e:
_pamela_error = e _pamela_error = e
from tornado.concurrent import run_on_executor from tornado.concurrent import run_on_executor
from traitlets import Any, Bool, Dict, Integer, Set, Unicode, default, observe
from traitlets.config import LoggingConfigurable from traitlets.config import LoggingConfigurable
from traitlets import Bool, Integer, Set, Unicode, Dict, Any, default, observe
from .handlers.login import LoginHandler from .handlers.login import LoginHandler
from .traitlets import Command
from .utils import maybe_future, url_path_join from .utils import maybe_future, url_path_join
from .traitlets import Command
class Authenticator(LoggingConfigurable): class Authenticator(LoggingConfigurable):
@@ -32,23 +34,6 @@ class Authenticator(LoggingConfigurable):
db = Any() db = Any()
@default("db")
def _deprecated_db(self):
self.log.warning(
dedent(
"""
The shared database session at Authenticator.db is deprecated, and will be removed.
Please manage your own database and connections.
Contact JupyterHub at https://github.com/jupyterhub/jupyterhub/issues/3700
if you have questions or ideas about direct database needs for your Authenticator.
"""
),
)
return self._deprecated_db_session
_deprecated_db_session = Any()
enable_auth_state = Bool( enable_auth_state = Bool(
False, False,
config=True, config=True,
@@ -832,7 +817,7 @@ class LocalAuthenticator(Authenticator):
raise ValueError("I don't know how to create users on OS X") raise ValueError("I don't know how to create users on OS X")
elif which('pw'): elif which('pw'):
# Probably BSD # Probably BSD
return ['pw', 'useradd', '-m', '-n'] return ['pw', 'useradd', '-m']
else: else:
# This appears to be the Linux non-interactive adduser command: # This appears to be the Linux non-interactive adduser command:
return ['adduser', '-q', '--gecos', '""', '--disabled-password'] return ['adduser', '-q', '--gecos', '""', '--disabled-password']


@@ -4,12 +4,18 @@ import os
from binascii import a2b_hex from binascii import a2b_hex
from concurrent.futures import ThreadPoolExecutor from concurrent.futures import ThreadPoolExecutor
from traitlets import Any, Integer, List, default, observe, validate from traitlets import Any
from traitlets.config import Config, SingletonConfigurable from traitlets import default
from traitlets import Integer
from traitlets import List
from traitlets import observe
from traitlets import validate
from traitlets.config import Config
from traitlets.config import SingletonConfigurable
try: try:
import cryptography import cryptography
from cryptography.fernet import Fernet, InvalidToken, MultiFernet from cryptography.fernet import Fernet, MultiFernet, InvalidToken
except ImportError: except ImportError:
cryptography = None cryptography = None


@@ -1,4 +1,7 @@
from . import base, login, metrics, pages from . import base
from . import login
from . import metrics
from . import pages
from .base import * from .base import *
from .login import * from .login import *


@@ -9,44 +9,50 @@ import random
import re import re
import time import time
import uuid import uuid
from datetime import datetime, timedelta from datetime import datetime
from datetime import timedelta
from http.client import responses from http.client import responses
from urllib.parse import parse_qs, parse_qsl, urlencode, urlparse, urlunparse from urllib.parse import parse_qs
from urllib.parse import parse_qsl
from urllib.parse import urlencode
from urllib.parse import urlparse
from urllib.parse import urlunparse
from jinja2 import TemplateNotFound from jinja2 import TemplateNotFound
from sqlalchemy.exc import SQLAlchemyError from sqlalchemy.exc import SQLAlchemyError
from tornado import gen, web from tornado import gen
from tornado.httputil import HTTPHeaders, url_concat from tornado import web
from tornado.httputil import HTTPHeaders
from tornado.httputil import url_concat
from tornado.ioloop import IOLoop from tornado.ioloop import IOLoop
from tornado.log import app_log from tornado.log import app_log
from tornado.web import RequestHandler, addslash from tornado.web import addslash
from tornado.web import RequestHandler
from .. import __version__, orm, roles, scopes from .. import __version__
from ..metrics import ( from .. import orm
PROXY_ADD_DURATION_SECONDS, from .. import roles
PROXY_DELETE_DURATION_SECONDS, from .. import scopes
RUNNING_SERVERS, from ..metrics import PROXY_ADD_DURATION_SECONDS
SERVER_POLL_DURATION_SECONDS, from ..metrics import PROXY_DELETE_DURATION_SECONDS
SERVER_SPAWN_DURATION_SECONDS, from ..metrics import ProxyDeleteStatus
SERVER_STOP_DURATION_SECONDS, from ..metrics import RUNNING_SERVERS
TOTAL_USERS, from ..metrics import SERVER_POLL_DURATION_SECONDS
ProxyDeleteStatus, from ..metrics import SERVER_SPAWN_DURATION_SECONDS
ServerPollStatus, from ..metrics import SERVER_STOP_DURATION_SECONDS
ServerSpawnStatus, from ..metrics import ServerPollStatus
ServerStopStatus, from ..metrics import ServerSpawnStatus
) from ..metrics import ServerStopStatus
from ..metrics import TOTAL_USERS
from ..objects import Server from ..objects import Server
from ..scopes import needs_scope from ..scopes import needs_scope
from ..spawner import LocalProcessSpawner from ..spawner import LocalProcessSpawner
from ..user import User from ..user import User
from ..utils import ( from ..utils import AnyTimeoutError
AnyTimeoutError, from ..utils import get_accepted_mimetype
get_accepted_mimetype, from ..utils import get_browser_protocol
get_browser_protocol, from ..utils import maybe_future
maybe_future, from ..utils import url_path_join
url_escape_path,
url_path_join,
)
# pattern for the authentication token header # pattern for the authentication token header
auth_header_pat = re.compile(r'^(?:token|bearer)\s+([^\s]+)$', flags=re.IGNORECASE) auth_header_pat = re.compile(r'^(?:token|bearer)\s+([^\s]+)$', flags=re.IGNORECASE)
@@ -856,12 +862,6 @@ class BaseHandler(RequestHandler):
user_server_name = user.name user_server_name = user.name
if server_name: if server_name:
if '/' in server_name:
error_message = (
f"Invalid server_name (may not contain '/'): {server_name}"
)
self.log.error(error_message)
raise web.HTTPError(400, error_message)
user_server_name = f'{user.name}:{server_name}' user_server_name = f'{user.name}:{server_name}'
if server_name in user.spawners and user.spawners[server_name].pending: if server_name in user.spawners and user.spawners[server_name].pending:
@@ -1520,7 +1520,6 @@ class UserUrlHandler(BaseHandler):
server_name = '' server_name = ''
else: else:
server_name = '' server_name = ''
escaped_server_name = url_escape_path(server_name)
spawner = user.spawners[server_name] spawner = user.spawners[server_name]
if spawner.ready: if spawner.ready:
@@ -1539,10 +1538,7 @@ class UserUrlHandler(BaseHandler):
pending_url = url_concat( pending_url = url_concat(
url_path_join( url_path_join(
self.hub.base_url, self.hub.base_url, 'spawn-pending', user.escaped_name, server_name
'spawn-pending',
user.escaped_name,
escaped_server_name,
), ),
{'next': self.request.uri}, {'next': self.request.uri},
) )
@@ -1556,9 +1552,7 @@ class UserUrlHandler(BaseHandler):
# page *in* the server is not found, we return a 424 instead of a 404. # page *in* the server is not found, we return a 424 instead of a 404.
# We allow retaining the old behavior to support older JupyterLab versions # We allow retaining the old behavior to support older JupyterLab versions
spawn_url = url_concat( spawn_url = url_concat(
url_path_join( url_path_join(self.hub.base_url, "spawn", user.escaped_name, server_name),
self.hub.base_url, "spawn", user.escaped_name, escaped_server_name
),
{"next": self.request.uri}, {"next": self.request.uri},
) )
self.set_status( self.set_status(


@@ -1,5 +1,7 @@
"""Handlers for serving prometheus metrics""" """Handlers for serving prometheus metrics"""
from prometheus_client import CONTENT_TYPE_LATEST, REGISTRY, generate_latest from prometheus_client import CONTENT_TYPE_LATEST
from prometheus_client import generate_latest
from prometheus_client import REGISTRY
from ..utils import metrics_authentication from ..utils import metrics_authentication
from .base import BaseHandler from .base import BaseHandler


@@ -12,9 +12,11 @@ from tornado import web
from tornado.httputil import url_concat from tornado.httputil import url_concat
from .. import __version__ from .. import __version__
from ..metrics import SERVER_POLL_DURATION_SECONDS, ServerPollStatus from ..metrics import SERVER_POLL_DURATION_SECONDS
from ..metrics import ServerPollStatus
from ..scopes import needs_scope from ..scopes import needs_scope
from ..utils import maybe_future, url_escape_path, url_path_join from ..utils import maybe_future
from ..utils import url_path_join
from .base import BaseHandler from .base import BaseHandler
@@ -268,6 +270,15 @@ class SpawnHandler(BaseHandler):
) )
self.finish(form) self.finish(form)
return return
if current_user is user:
self.set_login_cookie(user)
next_url = self.get_next_url(
user,
default=url_path_join(
self.hub.base_url, "spawn-pending", user.escaped_name, server_name
),
)
self.redirect(next_url)
def _get_pending_url(self, user, server_name): def _get_pending_url(self, user, server_name):
# resolve `?next=...`, falling back on the spawn-pending url # resolve `?next=...`, falling back on the spawn-pending url
@@ -275,10 +286,7 @@ class SpawnHandler(BaseHandler):
# which may get handled by the default server if they aren't ready yet # which may get handled by the default server if they aren't ready yet
pending_url = url_path_join( pending_url = url_path_join(
self.hub.base_url, self.hub.base_url, "spawn-pending", user.escaped_name, server_name
"spawn-pending",
user.escaped_name,
url_escape_path(server_name),
) )
pending_url = self.append_query_parameters(pending_url, exclude=['next']) pending_url = self.append_query_parameters(pending_url, exclude=['next'])
@@ -347,7 +355,6 @@ class SpawnPendingHandler(BaseHandler):
if server_name and server_name not in user.spawners: if server_name and server_name not in user.spawners:
raise web.HTTPError(404, f"{user.name} has no such server {server_name}") raise web.HTTPError(404, f"{user.name} has no such server {server_name}")
escaped_server_name = url_escape_path(server_name)
spawner = user.spawners[server_name] spawner = user.spawners[server_name]
if spawner.ready: if spawner.ready:
@@ -370,7 +377,7 @@ class SpawnPendingHandler(BaseHandler):
exc = spawner._spawn_future.exception() exc = spawner._spawn_future.exception()
self.log.error("Previous spawn for %s failed: %s", spawner._log_name, exc) self.log.error("Previous spawn for %s failed: %s", spawner._log_name, exc)
spawn_url = url_path_join( spawn_url = url_path_join(
self.hub.base_url, "spawn", user.escaped_name, escaped_server_name self.hub.base_url, "spawn", user.escaped_name, server_name
) )
self.set_status(500) self.set_status(500)
html = await self.render_template( html = await self.render_template(
@@ -423,7 +430,7 @@ class SpawnPendingHandler(BaseHandler):
# serving the expected page # serving the expected page
if status is not None: if status is not None:
spawn_url = url_path_join( spawn_url = url_path_join(
self.hub.base_url, "spawn", user.escaped_name, escaped_server_name self.hub.base_url, "spawn", user.escaped_name, server_name
) )
html = await self.render_template( html = await self.render_template(
"not_running.html", "not_running.html",
@@ -449,14 +456,15 @@ class AdminHandler(BaseHandler):
@web.authenticated @web.authenticated
# stacked decorators: all scopes must be present # stacked decorators: all scopes must be present
# note: keep in sync with admin link condition in page.html # note: keep in sync with admin link condition in page.html
@needs_scope('admin-ui') @needs_scope('admin:users')
@needs_scope('admin:servers')
async def get(self): async def get(self):
auth_state = await self.current_user.get_auth_state() auth_state = await self.current_user.get_auth_state()
html = await self.render_template( html = await self.render_template(
'admin.html', 'admin.html',
current_user=self.current_user, current_user=self.current_user,
auth_state=auth_state, auth_state=auth_state,
admin_access=True, admin_access=self.settings.get('admin_access', False),
allow_named_servers=self.allow_named_servers, allow_named_servers=self.allow_named_servers,
named_server_limit_per_user=self.named_server_limit_per_user, named_server_limit_per_user=self.named_server_limit_per_user,
server_version=f'{__version__} {self.version_hash}', server_version=f'{__version__} {self.version_hash}',
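
Editor's note: the admin page above is gated on a single 'admin-ui' scope in the left-column handler, instead of the admin:users plus admin:servers pair stacked on the right. A hedged config sketch granting only that page; the role and user names are made up.

# Hedged jupyterhub_config.py sketch: a role granting access to the admin
# page via the dedicated 'admin-ui' scope used by the left-column handler.
c.JupyterHub.load_roles = [
    {
        "name": "admin-page-only",
        "scopes": ["admin-ui"],
        "users": ["ops"],
    },
]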


@@ -6,10 +6,13 @@ import logging
import traceback import traceback
from functools import partial from functools import partial
from http.cookies import SimpleCookie from http.cookies import SimpleCookie
from urllib.parse import urlparse, urlunparse from urllib.parse import urlparse
from urllib.parse import urlunparse
from tornado.log import LogFormatter, access_log from tornado.log import access_log
from tornado.web import HTTPError, StaticFileHandler from tornado.log import LogFormatter
from tornado.web import HTTPError
from tornado.web import StaticFileHandler
from .handlers.pages import HealthCheckHandler from .handlers.pages import HealthCheckHandler
from .metrics import prometheus_log_method from .metrics import prometheus_log_method


@@ -21,7 +21,8 @@ them manually here.
""" """
from enum import Enum from enum import Enum
from prometheus_client import Gauge, Histogram from prometheus_client import Gauge
from prometheus_client import Histogram
REQUEST_DURATION_SECONDS = Histogram( REQUEST_DURATION_SECONDS = Histogram(
'jupyterhub_request_duration_seconds', 'jupyterhub_request_duration_seconds',


@@ -3,14 +3,15 @@
implements https://oauthlib.readthedocs.io/en/latest/oauth2/server.html implements https://oauthlib.readthedocs.io/en/latest/oauth2/server.html
""" """
from oauthlib import uri_validate from oauthlib import uri_validate
from oauthlib.oauth2 import RequestValidator, WebApplicationServer from oauthlib.oauth2 import RequestValidator
from oauthlib.oauth2.rfc6749.grant_types import authorization_code, base from oauthlib.oauth2 import WebApplicationServer
from oauthlib.oauth2.rfc6749.grant_types import authorization_code
from oauthlib.oauth2.rfc6749.grant_types import base
from tornado.log import app_log from tornado.log import app_log
from .. import orm from .. import orm
from ..roles import roles_to_scopes from ..utils import compare_token
from ..scopes import _check_scopes_exist, access_scopes, identify_scopes from ..utils import hash_token
from ..utils import compare_token, hash_token
# patch absolute-uri check # patch absolute-uri check
# because we want to allow relative uri oauth # because we want to allow relative uri oauth
@@ -151,12 +152,7 @@ class JupyterHubRequestValidator(RequestValidator):
) )
if orm_client is None: if orm_client is None:
raise ValueError("No such client: %s" % client_id) raise ValueError("No such client: %s" % client_id)
scopes = set(orm_client.allowed_scopes) return [role.name for role in orm_client.allowed_roles]
if 'inherit' not in scopes:
# add identify-user scope
# and access-service scope
scopes |= identify_scopes() | access_scopes(orm_client)
return scopes
def get_original_scopes(self, refresh_token, request, *args, **kwargs): def get_original_scopes(self, refresh_token, request, *args, **kwargs):
"""Get the list of scopes associated with the refresh token. """Get the list of scopes associated with the refresh token.
@@ -256,7 +252,7 @@ class JupyterHubRequestValidator(RequestValidator):
code=code['code'], code=code['code'],
# oauth has 5 minutes to complete # oauth has 5 minutes to complete
expires_at=int(orm.OAuthCode.now() + 300), expires_at=int(orm.OAuthCode.now() + 300),
scopes=list(request.scopes), roles=request._jupyterhub_roles,
user=request.user.orm_user, user=request.user.orm_user,
redirect_uri=orm_client.redirect_uri, redirect_uri=orm_client.redirect_uri,
session_id=request.session_id, session_id=request.session_id,
@@ -351,9 +347,9 @@ class JupyterHubRequestValidator(RequestValidator):
# APIToken.new commits the token to the db # APIToken.new commits the token to the db
orm.APIToken.new( orm.APIToken.new(
oauth_client=client, client_id=client.identifier,
expires_in=token['expires_in'], expires_in=token['expires_in'],
scopes=request.scopes, roles=[rolename for rolename in request.scopes],
token=token['access_token'], token=token['access_token'],
session_id=request.session_id, session_id=request.session_id,
user=request.user, user=request.user,
@@ -455,7 +451,7 @@ class JupyterHubRequestValidator(RequestValidator):
return False return False
request.user = orm_code.user request.user = orm_code.user
request.session_id = orm_code.session_id request.session_id = orm_code.session_id
request.scopes = orm_code.scopes request.scopes = [role.name for role in orm_code.roles]
return True return True
def validate_grant_type( def validate_grant_type(
@@ -541,7 +537,7 @@ class JupyterHubRequestValidator(RequestValidator):
def validate_scopes(self, client_id, scopes, client, request, *args, **kwargs): def validate_scopes(self, client_id, scopes, client, request, *args, **kwargs):
"""Ensure the client is authorized access to requested scopes. """Ensure the client is authorized access to requested scopes.
:param client_id: Unicode client identifier :param client_id: Unicode client identifier
:param scopes: List of 'raw' scopes (defined by you) :param scopes: List of scopes (defined by you)
:param client: Client object set by you, see authenticate_client. :param client: Client object set by you, see authenticate_client.
:param request: The HTTP Request (oauthlib.common.Request) :param request: The HTTP Request (oauthlib.common.Request)
:rtype: True or False :rtype: True or False
@@ -551,70 +547,35 @@ class JupyterHubRequestValidator(RequestValidator):
- Resource Owner Password Credentials Grant - Resource Owner Password Credentials Grant
- Client Credentials Grant - Client Credentials Grant
""" """
orm_client = ( orm_client = (
self.db.query(orm.OAuthClient).filter_by(identifier=client_id).one_or_none() self.db.query(orm.OAuthClient).filter_by(identifier=client_id).one_or_none()
) )
if orm_client is None: if orm_client is None:
app_log.warning("No such oauth client %s", client_id) app_log.warning("No such oauth client %s", client_id)
return False return False
client_allowed_roles = {role.name: role for role in orm_client.allowed_roles}
requested_scopes = set(scopes)
# explicitly allow 'identify', which was the only allowed scope previously # explicitly allow 'identify', which was the only allowed scope previously
# requesting 'identify' gets no actual permissions other than self-identification # requesting 'identify' gets no actual permissions other than self-identification
if "identify" in requested_scopes: client_allowed_roles.setdefault('identify', None)
app_log.warning( roles = []
f"Ignoring deprecated 'identify' scope, requested by {client_id}" requested_roles = set(scopes)
) disallowed_roles = requested_roles.difference(client_allowed_roles)
requested_scopes.discard("identify") if disallowed_roles:
# TODO: handle roles->scopes transition
# In 2.x, `?scopes=` only accepted _role_ names,
# but in 3.0 we accept and prefer scopes.
# For backward-compatibility, we still accept both.
# Should roles be deprecated here, or kept as a convenience?
try:
_check_scopes_exist(requested_scopes)
except KeyError as e:
# scopes don't exist, maybe they are role names
requested_roles = list(
self.db.query(orm.Role).filter(orm.Role.name.in_(requested_scopes))
)
if len(requested_roles) != len(requested_scopes):
# did not find roles
app_log.warning(f"No such scopes: {requested_scopes}")
return False
app_log.info(
f"OAuth client {client_id} requesting roles: {requested_scopes}"
)
requested_scopes = roles_to_scopes(requested_roles)
client_allowed_scopes = set(orm_client.allowed_scopes)
# always grant reading the token-owner's name
# and accessing the service itself
required_scopes = {*identify_scopes(), *access_scopes(orm_client)}
requested_scopes.update(required_scopes)
client_allowed_scopes.update(required_scopes)
# TODO: handle expanded_scopes intersection here?
# e.g. client allowed to request admin:users,
# but requests admin:users!name=x will not be allowed
# This can probably be dealt with in config by listing expected requests
# as explicitly allowed
disallowed_scopes = requested_scopes.difference(client_allowed_scopes)
if disallowed_scopes:
app_log.error( app_log.error(
f"Scope(s) not allowed for {client_id}: {', '.join(disallowed_scopes)}" f"Role(s) not allowed for {client_id}: {','.join(disallowed_roles)}"
) )
return False return False
# store resolved scopes on request # store resolved roles on request
app_log.debug( app_log.debug(
f"Allowing request for scope(s) for {client_id}: {','.join(requested_scopes) or '[]'}" f"Allowing request for role(s) for {client_id}: {','.join(requested_roles) or '[]'}"
) )
request.scopes = requested_scopes # these will be stored on the OAuthCode object
request._jupyterhub_roles = [
client_allowed_roles[name]
for name in requested_roles
if client_allowed_roles[name] is not None
]
return True return True
@@ -624,12 +585,7 @@ class JupyterHubOAuthServer(WebApplicationServer):
super().__init__(validator, *args, **kwargs) super().__init__(validator, *args, **kwargs)
def add_client( def add_client(
self, self, client_id, client_secret, redirect_uri, allowed_roles=None, description=''
client_id,
client_secret,
redirect_uri,
allowed_scopes=None,
description='',
): ):
"""Add a client """Add a client
@@ -651,12 +607,12 @@ class JupyterHubOAuthServer(WebApplicationServer):
app_log.info(f'Creating oauth client {client_id}') app_log.info(f'Creating oauth client {client_id}')
else: else:
app_log.info(f'Updating oauth client {client_id}') app_log.info(f'Updating oauth client {client_id}')
if allowed_scopes == None: if allowed_roles == None:
allowed_scopes = [] allowed_roles = []
orm_client.secret = hash_token(client_secret) if client_secret else "" orm_client.secret = hash_token(client_secret) if client_secret else ""
orm_client.redirect_uri = redirect_uri orm_client.redirect_uri = redirect_uri
orm_client.description = description or client_id orm_client.description = description or client_id
orm_client.allowed_scopes = list(allowed_scopes) orm_client.allowed_roles = allowed_roles
self.db.commit() self.db.commit()
return orm_client return orm_client


@@ -3,20 +3,25 @@
# Distributed under the terms of the Modified BSD License. # Distributed under the terms of the Modified BSD License.
import socket import socket
import warnings import warnings
from urllib.parse import urlparse, urlunparse from urllib.parse import urlparse
from urllib.parse import urlunparse
from traitlets import HasTraits, Instance, Integer, Unicode, default, observe, validate from traitlets import default
from traitlets import HasTraits
from traitlets import Instance
from traitlets import Integer
from traitlets import observe
from traitlets import Unicode
from traitlets import validate
from . import orm from . import orm
from .traitlets import URLPrefix from .traitlets import URLPrefix
from .utils import ( from .utils import can_connect
can_connect, from .utils import make_ssl_context
make_ssl_context, from .utils import random_port
random_port, from .utils import url_path_join
url_path_join, from .utils import wait_for_http_server
wait_for_http_server, from .utils import wait_for_server
wait_for_server,
)
class Server(HasTraits): class Server(HasTraits):


@@ -3,43 +3,51 @@
# Distributed under the terms of the Modified BSD License. # Distributed under the terms of the Modified BSD License.
import enum import enum
import json import json
from base64 import decodebytes, encodebytes import warnings
from datetime import datetime, timedelta from base64 import decodebytes
from base64 import encodebytes
from datetime import datetime
from datetime import timedelta
import alembic.command import alembic.command
import alembic.config import alembic.config
from alembic.script import ScriptDirectory from alembic.script import ScriptDirectory
from sqlalchemy import ( from sqlalchemy import Boolean
Boolean, from sqlalchemy import Column
Column, from sqlalchemy import create_engine
DateTime, from sqlalchemy import DateTime
ForeignKey, from sqlalchemy import event
Integer, from sqlalchemy import exc
MetaData, from sqlalchemy import ForeignKey
Table, from sqlalchemy import inspect
Unicode, from sqlalchemy import Integer
create_engine, from sqlalchemy import MetaData
event, from sqlalchemy import or_
exc, from sqlalchemy import select
inspect, from sqlalchemy import Table
or_, from sqlalchemy import Unicode
select,
)
from sqlalchemy.ext.declarative import declarative_base from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import ( from sqlalchemy.orm import backref
Session, from sqlalchemy.orm import interfaces
backref, from sqlalchemy.orm import object_session
interfaces, from sqlalchemy.orm import relationship
object_session, from sqlalchemy.orm import Session
relationship, from sqlalchemy.orm import sessionmaker
sessionmaker,
)
from sqlalchemy.pool import StaticPool from sqlalchemy.pool import StaticPool
from sqlalchemy.sql.expression import bindparam from sqlalchemy.sql.expression import bindparam
from sqlalchemy.types import LargeBinary, Text, TypeDecorator from sqlalchemy.types import LargeBinary
from sqlalchemy.types import Text
from sqlalchemy.types import TypeDecorator
from tornado.log import app_log from tornado.log import app_log
from .utils import compare_token, hash_token, new_token, random_port from .roles import assign_default_roles
from .roles import create_role
from .roles import get_default_roles
from .roles import update_roles
from .utils import compare_token
from .utils import hash_token
from .utils import new_token
from .utils import random_port
# top-level variable for easier mocking in tests # top-level variable for easier mocking in tests
utcnow = datetime.utcnow utcnow = datetime.utcnow
@@ -102,9 +110,7 @@ class JSONList(JSONDict):
return value return value
def process_result_value(self, value, dialect): def process_result_value(self, value, dialect):
if value is None: if value is not None:
return []
else:
value = json.loads(value) value = json.loads(value)
return value return value
@@ -151,6 +157,9 @@ for has_role in (
'user', 'user',
'group', 'group',
'service', 'service',
'api_token',
'oauth_client',
'oauth_code',
): ):
role_map = Table( role_map = Table(
f'{has_role}_role_map', f'{has_role}_role_map',
@@ -176,9 +185,10 @@ class Role(Base):
id = Column(Integer, primary_key=True, autoincrement=True) id = Column(Integer, primary_key=True, autoincrement=True)
name = Column(Unicode(255), unique=True) name = Column(Unicode(255), unique=True)
description = Column(Unicode(1023)) description = Column(Unicode(1023))
scopes = Column(JSONList, default=[]) scopes = Column(JSONList)
users = relationship('User', secondary='user_role_map', backref='roles') users = relationship('User', secondary='user_role_map', backref='roles')
services = relationship('Service', secondary='service_role_map', backref='roles') services = relationship('Service', secondary='service_role_map', backref='roles')
tokens = relationship('APIToken', secondary='api_token_role_map', backref='roles')
groups = relationship('Group', secondary='group_role_map', backref='roles') groups = relationship('Group', secondary='group_role_map', backref='roles')
def __repr__(self): def __repr__(self):
@@ -526,7 +536,9 @@ class Hashed(Expiring):
prefix = token[: cls.prefix_length] prefix = token[: cls.prefix_length]
# since we can't filter on hashed values, filter on prefix # since we can't filter on hashed values, filter on prefix
# so we aren't comparing with all tokens # so we aren't comparing with all tokens
prefix_match = db.query(cls).filter_by(prefix=prefix) prefix_match = db.query(cls).filter(
bindparam('prefix', prefix).startswith(cls.prefix)
)
prefix_match = prefix_match.filter( prefix_match = prefix_match.filter(
or_(cls.expires_at == None, cls.expires_at >= cls.now()) or_(cls.expires_at == None, cls.expires_at >= cls.now())
) )
@@ -585,10 +597,6 @@ class APIToken(Hashed, Base):
def api_id(self): def api_id(self):
return 'a%i' % self.id return 'a%i' % self.id
@property
def owner(self):
return self.user or self.service
# added in 2.0 # added in 2.0
client_id = Column( client_id = Column(
Unicode(255), Unicode(255),
@@ -616,7 +624,6 @@ class APIToken(Hashed, Base):
expires_at = Column(DateTime, default=None, nullable=True) expires_at = Column(DateTime, default=None, nullable=True)
last_activity = Column(DateTime) last_activity = Column(DateTime)
note = Column(Unicode(1023)) note = Column(Unicode(1023))
scopes = Column(JSONList, default=[])
def __repr__(self): def __repr__(self):
if self.user is not None: if self.user is not None:
@@ -669,17 +676,14 @@ class APIToken(Hashed, Base):
     def new(
         cls,
         token=None,
-        *,
         user=None,
         service=None,
         roles=None,
-        scopes=None,
         note='',
         generated=True,
         session_id=None,
         expires_in=None,
-        client_id=None,
-        oauth_client=None,
+        client_id='jupyterhub',
         return_orm=False,
     ):
         """Generate a new API token for a user or service"""
@@ -693,54 +697,6 @@ class APIToken(Hashed, Base):
             generated = True
         else:
             cls.check_token(db, token)
-
-        # avoid circular import
-        from .roles import roles_to_scopes
-
-        if scopes is not None and roles is not None:
-            raise ValueError(
-                "Can only assign one of scopes or roles when creating tokens."
-            )
-        elif scopes is None and roles is None:
-            # this is the default branch
-            # use the default 'token' role to specify default permissions for API tokens
-            default_token_role = Role.find(db, 'token')
-            if not default_token_role:
-                scopes = ["inherit"]
-            else:
-                scopes = roles_to_scopes([default_token_role])
-        elif roles is not None:
-            # evaluate roles to scopes immediately
-            # TODO: should this be deprecated, or not?
-            # warnings.warn(
-            #     "Setting roles on tokens is deprecated in JupyterHub 3.0. Use scopes.",
-            #     DeprecationWarning,
-            #     stacklevel=3,
-            # )
-            orm_roles = []
-            for rolename in roles:
-                role = Role.find(db, name=rolename)
-                if role is None:
-                    raise ValueError(f"No such role: {rolename}")
-                orm_roles.append(role)
-            scopes = roles_to_scopes(orm_roles)
-
-        if oauth_client is None:
-            # lookup oauth client by identifier
-            if client_id is None:
-                # default: global 'jupyterhub' client
-                client_id = "jupyterhub"
-            oauth_client = db.query(OAuthClient).filter_by(identifier=client_id).one()
-        if client_id is None:
-            client_id = oauth_client.identifier
-
-        # avoid circular import
-        from .scopes import _check_scopes_exist, _check_token_scopes
-
-        _check_scopes_exist(scopes, who_for="token")
-        _check_token_scopes(scopes, owner=user or service, oauth_client=oauth_client)
-
         # two stages to ensure orm_token.generated has been set
         # before token setter is called
         orm_token = cls(
@@ -748,7 +704,6 @@ class APIToken(Hashed, Base):
             note=note or '',
             client_id=client_id,
             session_id=session_id,
-            scopes=list(scopes),
         )
         orm_token.token = token
         if user:
@@ -761,19 +716,21 @@ class APIToken(Hashed, Base):
             orm_token.expires_at = cls.now() + timedelta(seconds=expires_in)

         db.add(orm_token)
+        if not Role.find(db, 'token'):
+            raise RuntimeError("Default token role has not been created")
+        try:
+            if roles is not None:
+                update_roles(db, entity=orm_token, roles=roles)
+            else:
+                assign_default_roles(db, entity=orm_token)
+        except Exception:
+            db.delete(orm_token)
+            db.commit()
+            raise
         db.commit()
         return token

-    def update_scopes(self, new_scopes):
-        """Set new scopes, checking that they are allowed"""
-        from .scopes import _check_scopes_exist, _check_token_scopes
-
-        _check_scopes_exist(new_scopes, who_for="token")
-        _check_token_scopes(
-            new_scopes, owner=self.owner, oauth_client=self.oauth_client
-        )
-        self.scopes = new_scopes
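The removed `update_scopes` helper defers to `_check_token_scopes`, whose core idea is that a token may never carry scopes its owner does not hold. A standalone sketch of that check using plain set operations (the helper name and simplified signature here are illustrative, not the actual implementation, which expands roles and filters first):

```python
def check_requested_scopes(requested, owner_scopes):
    """Reject token scopes that exceed the owner's scopes (illustrative sketch).

    Both arguments are assumed to already be fully expanded scope strings.
    """
    requested = set(requested)
    if requested == {"inherit"}:
        # 'inherit' means "whatever the owner can do", nothing to check
        return
    excess = requested - set(owner_scopes)
    if excess:
        raise ValueError(f"Not assigning requested scopes {','.join(sorted(excess))}")


# example: owner can only read users, token asks for admin:users as well
owner = {"read:users", "read:users:name"}
try:
    check_requested_scopes({"read:users", "admin:users"}, owner)
except ValueError as e:
    print(e)  # Not assigning requested scopes admin:users
```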
 class OAuthCode(Expiring, Base):
     __tablename__ = 'oauth_codes'

@@ -789,7 +746,7 @@ class OAuthCode(Expiring, Base):
     # state = Column(Unicode(1023))
     user_id = Column(Integer, ForeignKey('users.id', ondelete='CASCADE'))

-    scopes = Column(JSONList, default=[])
+    roles = relationship('Role', secondary='oauth_code_role_map')

     @staticmethod
     def now():

@@ -827,9 +784,9 @@
     )
     codes = relationship(OAuthCode, backref='client', cascade='all, delete-orphan')

-    # these are the scopes an oauth client is allowed to request
-    # *not* the scopes of the client itself
-    allowed_scopes = Column(JSONList, default=[])
+    # these are the roles an oauth client is allowed to request
+    # *not* the roles of the client itself
+    allowed_roles = relationship('Role', secondary='oauth_client_role_map')

     def __repr__(self):
         return f"<{self.__class__.__name__}(identifier={self.identifier!r})>"


@@ -23,32 +23,31 @@ import signal
 import time
 from functools import wraps
 from subprocess import Popen
-from urllib.parse import quote, urlparse
+from urllib.parse import quote
 from weakref import WeakKeyDictionary

-from tornado.httpclient import AsyncHTTPClient, HTTPError, HTTPRequest
+from tornado.httpclient import AsyncHTTPClient
+from tornado.httpclient import HTTPError
+from tornado.httpclient import HTTPRequest
 from tornado.ioloop import PeriodicCallback
-from traitlets import (
-    Any,
-    Bool,
-    CaselessStrEnum,
-    Dict,
-    Instance,
-    Integer,
-    TraitError,
-    Unicode,
-    default,
-    observe,
-    validate,
-)
+from traitlets import Any
+from traitlets import Bool
+from traitlets import default
+from traitlets import Dict
+from traitlets import Instance
+from traitlets import Integer
+from traitlets import observe
+from traitlets import Unicode
 from traitlets.config import LoggingConfigurable

-from jupyterhub.traitlets import Command
-
 from . import utils
-from .metrics import CHECK_ROUTES_DURATION_SECONDS, PROXY_POLL_DURATION_SECONDS
+from .metrics import CHECK_ROUTES_DURATION_SECONDS
+from .metrics import PROXY_POLL_DURATION_SECONDS
 from .objects import Server
-from .utils import AnyTimeoutError, exponential_backoff, url_escape_path, url_path_join
+from .utils import AnyTimeoutError
+from .utils import exponential_backoff
+from .utils import url_path_join
+from jupyterhub.traitlets import Command
 def _one_at_a_time(method):

@@ -123,8 +122,7 @@ class Proxy(LoggingConfigurable):
     )

     extra_routes = Dict(
-        key_trait=Unicode(),
-        value_trait=Unicode(),
+        {},
         config=True,
         help="""
         Additional routes to be maintained in the proxy.

@@ -143,51 +141,6 @@
         """,
     )
@validate("extra_routes")
def _validate_extra_routes(self, proposal):
extra_routes = {}
# check routespecs for leading/trailing slashes
for routespec, target in proposal.value.items():
if not isinstance(routespec, str):
raise TraitError(
f"Proxy.extra_routes keys must be str, got {routespec!r}"
)
if not isinstance(target, str):
raise TraitError(
f"Proxy.extra_routes values must be str, got {target!r}"
)
if not routespec.endswith("/"):
# trailing / is unambiguous, so we can add it
self.log.warning(
f"Adding missing trailing '/' to c.Proxy.extra_routes {routespec} -> {routespec}/"
)
routespec += "/"
if self.app.subdomain_host:
# subdomain routing must _not_ start with /
if routespec.startswith("/"):
raise ValueError(
f"Proxy.extra_routes missing host component in {routespec} (must not have leading '/') when using `JupyterHub.subdomain_host = {self.app.subdomain_host!r}`"
)
else:
# no subdomains, must start with /
# this is ambiguous with host routing, so raise instead of warn
if not routespec.startswith("/"):
raise ValueError(
f"Proxy.extra_routes routespec {routespec} missing leading '/'."
)
# validate target URL?
target_url = urlparse(target.lower())
if target_url.scheme not in {"http", "https"} or not target_url.netloc:
raise ValueError(
f"Proxy.extra_routes target {routespec}={target!r} doesn't look like a URL (should have http[s]://...)"
)
extra_routes[routespec] = target
return extra_routes
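The removed validator normalizes and sanity-checks each `extra_routes` entry. A self-contained sketch of the same checks for a single entry (the function name is a placeholder; the real validation lives in the traitlets validator above):

```python
from urllib.parse import urlparse


def normalize_extra_route(routespec, target, subdomain_host=""):
    """Validate one extra_routes entry along the lines of the validator above (sketch)."""
    if not routespec.endswith("/"):
        # a trailing slash is unambiguous, so it can simply be added
        routespec += "/"
    if subdomain_host:
        # with subdomain routing, routespecs carry a host and must not start with '/'
        if routespec.startswith("/"):
            raise ValueError(f"{routespec} must not start with '/' when subdomains are used")
    elif not routespec.startswith("/"):
        raise ValueError(f"{routespec} missing leading '/'")
    target_url = urlparse(target.lower())
    if target_url.scheme not in {"http", "https"} or not target_url.netloc:
        raise ValueError(f"{target!r} doesn't look like an http(s) URL")
    return routespec, target


# e.g. a jupyterhub_config.py entry like
#   c.Proxy.extra_routes = {"/services/my-app/": "http://127.0.0.1:9000"}
print(normalize_extra_route("/services/my-app", "http://127.0.0.1:9000"))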
def start(self): def start(self):
"""Start the proxy. """Start the proxy.
@@ -353,9 +306,7 @@ class Proxy(LoggingConfigurable):
"""Remove a user's server from the proxy table.""" """Remove a user's server from the proxy table."""
routespec = user.proxy_spec routespec = user.proxy_spec
if server_name: if server_name:
routespec = url_path_join( routespec = url_path_join(user.proxy_spec, server_name, '/')
user.proxy_spec, url_escape_path(server_name), '/'
)
self.log.info("Removing user %s from proxy (%s)", user.name, routespec) self.log.info("Removing user %s from proxy (%s)", user.name, routespec)
await self.delete_route(routespec) await self.delete_route(routespec)
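One side of this hunk escapes the server name before building the routespec, so that unusual characters in a named-server name cannot corrupt the proxy path. A rough standalone sketch of that idea (an approximation of `url_escape_path`/`url_path_join`, not the actual helpers):

```python
from urllib.parse import quote


def server_routespec(proxy_spec, server_name=""):
    """Build a proxy routespec for a named server (illustrative sketch)."""
    parts = [proxy_spec]
    if server_name:
        # escape the server name so spaces, '/' etc. stay inside one path segment
        parts.append(quote(server_name, safe=""))
    # join with single slashes and keep leading/trailing slashes, like url_path_join(..., '/')
    return "/" + "/".join(p.strip("/") for p in parts if p) + "/"


print(server_routespec("/user/rollo/", "my server"))  # /user/rollo/my%20server/
```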
@@ -524,21 +475,7 @@ class ConfigurableHTTPProxy(Proxy):
def _concurrency_changed(self, change): def _concurrency_changed(self, change):
self.semaphore = asyncio.BoundedSemaphore(change.new) self.semaphore = asyncio.BoundedSemaphore(change.new)
# https://github.com/jupyterhub/configurable-http-proxy/blob/4.5.1/bin/configurable-http-proxy#L92
log_level = CaselessStrEnum(
["debug", "info", "warn", "error"],
"info",
help="Proxy log level",
config=True,
)
debug = Bool(False, help="Add debug-level logging to the Proxy.", config=True) debug = Bool(False, help="Add debug-level logging to the Proxy.", config=True)
@observe('debug')
def _debug_changed(self, change):
if change.new:
self.log_level = "debug"
auth_token = Unicode( auth_token = Unicode(
help="""The Proxy auth token help="""The Proxy auth token
@@ -734,11 +671,11 @@ class ConfigurableHTTPProxy(Proxy):
str(api_server.port), str(api_server.port),
'--error-target', '--error-target',
url_path_join(self.hub.url, 'error'), url_path_join(self.hub.url, 'error'),
'--log-level',
self.log_level,
] ]
if self.app.subdomain_host: if self.app.subdomain_host:
cmd.append('--host-routing') cmd.append('--host-routing')
if self.debug:
cmd.extend(['--log-level', 'debug'])
if self.ssl_key: if self.ssl_key:
cmd.extend(['--ssl-key', self.ssl_key]) cmd.extend(['--ssl-key', self.ssl_key])
if self.ssl_cert: if self.ssl_cert:


@@ -3,11 +3,13 @@
 # Distributed under the terms of the Modified BSD License.
 import re
 from functools import wraps
+from itertools import chain

 from sqlalchemy import func
 from tornado.log import app_log

-from . import orm, scopes
+from . import orm
+from . import scopes
 def get_default_roles():

@@ -31,7 +33,6 @@ def get_default_roles():
             'name': 'admin',
             'description': 'Elevated privileges (can do anything)',
             'scopes': [
-                'admin-ui',
                 'admin:users',
                 'admin:servers',
                 'tokens',
@@ -64,51 +65,180 @@ def get_default_roles():
return default_roles return default_roles
def get_roles_for(orm_object): def expand_self_scope(name):
"""Get roles for a given User/Group/etc. """
Users have a metascope 'self' that should be expanded to standard user privileges.
At the moment that is a user-filtered version (optional read) access to
users
users:name
users:groups
users:activity
tokens
servers
access:servers
Arguments:
name (str): user name
Returns:
expanded scopes (set): set of expanded scopes covering standard user privileges
"""
scope_list = [
'read:users',
'read:users:name',
'read:users:groups',
'users:activity',
'read:users:activity',
'servers',
'delete:servers',
'read:servers',
'tokens',
'read:tokens',
'access:servers',
]
return {f"{scope}!user={name}" for scope in scope_list}
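As a quick usage illustration of the function above (copied verbatim so it runs standalone), the 'self' metascope for a user named `rollo` expands to user-filtered scopes:

```python
def expand_self_scope(name):
    """Standalone copy of the expansion shown above, for illustration."""
    scope_list = [
        'read:users',
        'read:users:name',
        'read:users:groups',
        'users:activity',
        'read:users:activity',
        'servers',
        'delete:servers',
        'read:servers',
        'tokens',
        'read:tokens',
        'access:servers',
    ]
    return {f"{scope}!user={name}" for scope in scope_list}


scopes = expand_self_scope("rollo")
print(sorted(scopes)[:3])
# ['access:servers!user=rollo', 'delete:servers!user=rollo', 'read:servers!user=rollo']
```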
def horizontal_filter(func):
"""Decorator to account for horizontal filtering in scope syntax"""
def expand_server_filter(hor_filter):
resource, mark, value = hor_filter.partition('=')
if resource == 'server':
user, mark, server = value.partition('/')
return f'read:users:name!user={user}'
def ignore(scopename):
# temporarily remove horizontal filtering if present
scopename, mark, hor_filter = scopename.partition('!')
expanded_scope = func(scopename)
# add the filter back
full_expanded_scope = {scope + mark + hor_filter for scope in expanded_scope}
server_filter = expand_server_filter(hor_filter)
if server_filter:
full_expanded_scope.add(server_filter)
return full_expanded_scope
return ignore
@horizontal_filter
def _expand_scope(scopename):
"""Returns a set of all subscopes
Arguments:
scopename (str): name of the scope to expand
Returns:
expanded scope (set): set of all scope's subscopes including the scope itself
"""
expanded_scope = []
def _add_subscopes(scopename):
expanded_scope.append(scopename)
if scopes.scope_definitions[scopename].get('subscopes'):
for subscope in scopes.scope_definitions[scopename].get('subscopes'):
_add_subscopes(subscope)
_add_subscopes(scopename)
return set(expanded_scope)
def expand_roles_to_scopes(orm_object):
"""Get the scopes listed in the roles of the User/Service/Group/Token
If User, take into account the user's groups roles as well If User, take into account the user's groups roles as well
Arguments: Arguments:
orm_object: orm.User, orm.Service, orm.Group orm_object: orm.User, orm.Service, orm.Group or orm.APIToken
Any role-having entity
Returns: Returns:
roles (list): list of orm.Role objects assigned to the object. expanded scopes (set): set of all expanded scopes for the orm object
""" """
if not isinstance(orm_object, orm.Base): if not isinstance(orm_object, orm.Base):
raise TypeError(f"Only orm objects allowed, got {orm_object}") raise TypeError(f"Only orm objects allowed, got {orm_object}")
roles = [] pass_roles = []
roles.extend(orm_object.roles) pass_roles.extend(orm_object.roles)
if isinstance(orm_object, orm.User): if isinstance(orm_object, orm.User):
for group in orm_object.groups: for group in orm_object.groups:
roles.extend(group.roles) pass_roles.extend(group.roles)
return roles
expanded_scopes = _get_subscopes(*pass_roles, owner=orm_object)
return expanded_scopes
def roles_to_scopes(roles): def _get_subscopes(*roles, owner=None):
"""Returns set of raw (not expanded) scopes for a collection of roles""" """Returns a set of all available subscopes for a specified role or list of roles
raw_scopes = set()
for role in roles:
raw_scopes.update(role.scopes)
return raw_scopes
def roles_to_expanded_scopes(roles, owner):
"""Returns a set of fully expanded scopes for a specified role or list of roles
Arguments: Arguments:
roles (list(orm.Role): orm.Role objects to expand roles (obj): orm.Roles
owner (obj): orm.User or orm.Service which holds the role(s) owner (obj, optional): orm.User or orm.Service as owner of orm.APIToken
Used for expanding filters and metascopes such as !user.
Returns: Returns:
expanded scopes (set): set of all expanded scopes for the role(s) expanded scopes (set): set of all expanded scopes for the role(s)
""" """
return scopes.expand_scopes(roles_to_scopes(roles), owner=owner) scopes = set()
for role in roles:
scopes.update(role.scopes)
expanded_scopes = set(chain.from_iterable(list(map(_expand_scope, scopes))))
# transform !user filter to !user=ownername
for scope in expanded_scopes.copy():
base_scope, _, filter = scope.partition('!')
if filter == 'user':
expanded_scopes.remove(scope)
if isinstance(owner, orm.APIToken):
token_owner = owner.user
if token_owner is None:
token_owner = owner.service
name = token_owner.name
else:
name = owner.name
trans_scope = f'{base_scope}!user={name}'
expanded_scopes.add(trans_scope)
if 'self' in expanded_scopes:
expanded_scopes.remove('self')
if owner and isinstance(owner, orm.User):
expanded_scopes |= expand_self_scope(owner.name)
return expanded_scopes
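The tail of the function above rewrites the bare `!user` filter into `!user=<owner name>`. A small standalone sketch of just that translation step (the helper name is illustrative):

```python
def resolve_user_filter(expanded_scopes, owner_name):
    """Translate bare '!user' filters into '!user=<owner>' (illustrative sketch)."""
    resolved = set(expanded_scopes)
    for scope in expanded_scopes:
        base_scope, _, filter_ = scope.partition('!')
        if filter_ == 'user':
            resolved.remove(scope)
            resolved.add(f'{base_scope}!user={owner_name}')
    return resolved


print(sorted(resolve_user_filter({'read:servers!user', 'read:users'}, 'rollo')))
# ['read:servers!user=rollo', 'read:users']
```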
def _check_scopes(*args, rolename=None):
"""Check if provided scopes exist
Arguments:
scope (str): name of the scope to check
or
scopes (list): list of scopes to check
Raises KeyError if scope does not exist
"""
allowed_scopes = set(scopes.scope_definitions.keys())
allowed_filters = ['!user=', '!service=', '!group=', '!server=', '!user']
if rolename:
log_role = f"for role {rolename}"
else:
log_role = ""
for scope in args:
scopename, _, filter_ = scope.partition('!')
if scopename not in allowed_scopes:
if scopename == "all":
raise KeyError("Draft scope 'all' is now called 'inherit'")
raise KeyError(f"Scope '{scope}' {log_role} does not exist")
if filter_:
full_filter = f"!{filter_}"
if not any(f in scope for f in allowed_filters):
raise KeyError(
f"Scope filter '{full_filter}' in scope '{scope}' {log_role} does not exist"
)
_role_name_pattern = re.compile(r'^[a-z][a-z0-9\-_~\.]{1,253}[a-z0-9]$') _role_name_pattern = re.compile(r'^[a-z][a-z0-9\-_~\.]{1,253}[a-z0-9]$')
@@ -158,10 +288,7 @@ def create_role(db, role_dict):
# check if the provided scopes exist # check if the provided scopes exist
if scopes: if scopes:
# avoid circular import _check_scopes(*scopes, rolename=role_dict['name'])
from .scopes import _check_scopes_exist
_check_scopes_exist(scopes, who_for=f"role {role_dict['name']}")
if role is None: if role is None:
if not scopes: if not scopes:
@@ -235,13 +362,13 @@ def grant_role(db, entity, role):
if role not in entity.roles: if role not in entity.roles:
entity.roles.append(role) entity.roles.append(role)
db.commit()
app_log.info( app_log.info(
'Adding role %s for %s: %s', 'Adding role %s for %s: %s',
role.name, role.name,
type(entity).__name__, type(entity).__name__,
entity_repr, entity_repr,
) )
db.commit()
@_existing_only @_existing_only
@@ -262,6 +389,45 @@ def strip_role(db, entity, role):
) )
def _token_allowed_role(db, token, role):
"""Checks if requested role for token does not grant the token
higher permissions than the token's owner has
Returns:
True if requested permissions are within the owner's permissions, False otherwise
"""
owner = token.user
if owner is None:
owner = token.service
if owner is None:
raise ValueError(f"Owner not found for {token}")
if role in owner.roles:
# shortcut: token is assigned an exact role the owner has
return True
expanded_scopes = _get_subscopes(role, owner=owner)
implicit_permissions = {'inherit', 'read:inherit'}
explicit_scopes = expanded_scopes - implicit_permissions
# find the owner's scopes
expanded_owner_scopes = expand_roles_to_scopes(owner)
allowed_scopes = scopes._intersect_expanded_scopes(
explicit_scopes, expanded_owner_scopes, db
)
disallowed_scopes = explicit_scopes.difference(allowed_scopes)
if not disallowed_scopes:
# no scopes requested outside owner's own scopes
return True
else:
app_log.warning(
f"Token requesting role {role.name} with scopes not held by owner {owner.name}: {disallowed_scopes}"
)
return False
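The decision in `_token_allowed_role` reduces to: the requested scopes, minus the implicit `inherit` scopes, must all be held by the owner. A simplified set-based sketch (the real code expands the role and the owner's roles first and intersects with filter and group awareness):

```python
def token_role_allowed(requested_scopes, owner_scopes):
    """Decide whether a token may take a role, per the logic above (sketch).

    Both arguments are assumed to be already-expanded scope sets.
    """
    implicit = {'inherit', 'read:inherit'}
    explicit = set(requested_scopes) - implicit
    disallowed = explicit - set(owner_scopes)
    return not disallowed


owner_scopes = {'read:users!user=rollo', 'access:servers!user=rollo'}
print(token_role_allowed({'inherit', 'read:users!user=rollo'}, owner_scopes))  # True
print(token_role_allowed({'admin:users'}, owner_scopes))  # False
```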
def assign_default_roles(db, entity): def assign_default_roles(db, entity):
"""Assigns default role(s) to an entity: """Assigns default role(s) to an entity:
@@ -274,28 +440,54 @@ def assign_default_roles(db, entity):
if isinstance(entity, orm.Group): if isinstance(entity, orm.Group):
return return
if isinstance(entity, orm.APIToken):
app_log.debug('Assigning default role to token')
default_token_role = orm.Role.find(db, 'token')
if not entity.roles and (entity.user or entity.service) is not None:
default_token_role.tokens.append(entity)
app_log.info('Added role %s to token %s', default_token_role.name, entity)
db.commit()
# users and services all have 'user' role by default # users and services all have 'user' role by default
# and optionally 'admin' as well # and optionally 'admin' as well
kind = type(entity).__name__
app_log.debug(f'Assigning default role to {kind} {entity.name}')
if entity.admin:
grant_role(db, entity=entity, rolename="admin")
else: else:
admin_role = orm.Role.find(db, 'admin') kind = type(entity).__name__
if admin_role in entity.roles: app_log.debug(f'Assigning default role to {kind} {entity.name}')
strip_role(db, entity=entity, rolename="admin") if entity.admin:
if kind == "User": grant_role(db, entity=entity, rolename="admin")
grant_role(db, entity=entity, rolename="user") else:
admin_role = orm.Role.find(db, 'admin')
if admin_role in entity.roles:
strip_role(db, entity=entity, rolename="admin")
if kind == "User":
grant_role(db, entity=entity, rolename="user")
def update_roles(db, entity, roles): def update_roles(db, entity, roles):
"""Add roles to an entity (token, user, etc.) """Add roles to an entity (token, user, etc.)
Calls `grant_role` for each role. If it is an API token, check role permissions against token owner
prior to assignment to avoid permission expansion.
Otherwise, it just calls `grant_role` for each role.
""" """
for rolename in roles: for rolename in roles:
grant_role(db, entity=entity, rolename=rolename) if isinstance(entity, orm.APIToken):
role = orm.Role.find(db, rolename)
if role:
app_log.debug(
'Checking token permissions against requested role %s', rolename
)
if _token_allowed_role(db, entity, role):
role.tokens.append(entity)
app_log.info('Adding role %s to token: %s', role.name, entity)
else:
raise ValueError(
f'Requested token role {rolename} for {entity} has more permissions than the token owner'
)
else:
raise KeyError(f'Role {rolename} does not exist')
else:
grant_role(db, entity=entity, rolename=rolename)
def check_for_default_roles(db, bearer): def check_for_default_roles(db, bearer):


@@ -1,34 +1,26 @@
 """
 General scope definitions and utilities

-Scope functions generally return _immutable_ collections,
-such as `frozenset` to avoid mutating cached values.
-If needed, mutable copies can be made, e.g. `set(frozen_scopes)`
-
 Scope variable nomenclature
 ---------------------------
-scopes or 'raw' scopes: collection of scopes that may contain abbreviations (e.g., in role definition)
-expanded scopes: set of expanded scopes without abbreviations (i.e., resolved metascopes, filters, and subscopes)
-parsed scopes: dictionary format of expanded scopes (`read:users!user=name` -> `{'read:users': {user: [name]}`)
+scopes: list of scopes with abbreviations (e.g., in role definition)
+expanded scopes: set of expanded scopes without abbreviations (i.e., resolved metascopes, filters and subscopes)
+parsed scopes: dictionary JSON like format of expanded scopes
 intersection : set of expanded scopes as intersection of 2 expanded scope sets
 identify scopes: set of expanded scopes needed for identify (whoami) endpoints
-reduced scopes: expanded scopes that have been reduced
 """
 import functools
 import inspect
-import re
 import warnings
 from enum import Enum
 from functools import lru_cache
-from itertools import chain
-from textwrap import indent

 import sqlalchemy as sa
 from tornado import web
 from tornado.log import app_log

-from . import orm, roles
-from ._memoize import DoNotCache, FrozenDict, lru_cache_key
+from . import orm
+from . import roles

 """when modifying the scope definitions, make sure that `docs/source/rbac/generate-scope-table.py` is run
 so that changes are reflected in the documentation and REST API description."""
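To make the nomenclature in the module docstring concrete, here is one scope written in each of the three forms; the expansion shown is an illustration of shape, not an exhaustive listing of subscopes:

```python
# 'raw' scope, as written in a role definition -- may be a metascope,
# a filtered scope, or a scope that implies subscopes
raw_scopes = ['read:users!user=rollo']

# expanded scopes: metascopes/subscopes/filters resolved into plain strings
expanded_scopes = {
    'read:users!user=rollo',
    'read:users:name!user=rollo',
    'read:users:groups!user=rollo',
    'read:users:activity!user=rollo',
}

# parsed scopes: the same information keyed by base scope and filter
parsed_scopes = {
    'read:users': {'user': {'rollo'}},
    'read:users:name': {'user': {'rollo'}},
    'read:users:groups': {'user': {'rollo'}},
    'read:users:activity': {'user': {'rollo'}},
}
```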
@@ -42,10 +34,6 @@ scope_definitions = {
'description': 'Anything you have access to', 'description': 'Anything you have access to',
'doc_description': 'Everything that the token-owning entity can access _(metascope for tokens)_', 'doc_description': 'Everything that the token-owning entity can access _(metascope for tokens)_',
}, },
'admin-ui': {
'description': 'Access the admin page.',
'doc_description': 'Access the admin page. Permission to take actions via the admin page granted separately.',
},
'admin:users': { 'admin:users': {
'description': 'Read, write, create and delete users and their authentication state, not including their servers or tokens.', 'description': 'Read, write, create and delete users and their authentication state, not including their servers or tokens.',
'subscopes': ['admin:auth_state', 'users', 'read:roles:users', 'delete:users'], 'subscopes': ['admin:auth_state', 'users', 'read:roles:users', 'delete:users'],
@@ -153,12 +141,6 @@ class Scope(Enum):
ALL = True ALL = True
def _intersection_cache_key(scopes_a, scopes_b, db=None):
"""Cache key function for scope intersections"""
return (frozenset(scopes_a), frozenset(scopes_b))
@lru_cache_key(_intersection_cache_key)
def _intersect_expanded_scopes(scopes_a, scopes_b, db=None): def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
"""Intersect two sets of scopes by comparing their permissions """Intersect two sets of scopes by comparing their permissions
@@ -174,16 +156,11 @@ def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
(i.e. users!group=x & users!user=y will be empty, even if user y is in group x.) (i.e. users!group=x & users!user=y will be empty, even if user y is in group x.)
""" """
empty_set = frozenset() empty_set = frozenset()
scopes_a = frozenset(scopes_a)
scopes_b = frozenset(scopes_b)
# cached lookups for group membership of users and servers # cached lookups for group membership of users and servers
@lru_cache() @lru_cache()
def groups_for_user(username): def groups_for_user(username):
"""Get set of group names for a given username""" """Get set of group names for a given username"""
# if we need a group lookup, the result is not cacheable
nonlocal needs_db
needs_db = True
user = db.query(orm.User).filter_by(name=username).first() user = db.query(orm.User).filter_by(name=username).first()
if user is None: if user is None:
return empty_set return empty_set
@@ -199,11 +176,6 @@ def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
parsed_scopes_a = parse_scopes(scopes_a) parsed_scopes_a = parse_scopes(scopes_a)
parsed_scopes_b = parse_scopes(scopes_b) parsed_scopes_b = parse_scopes(scopes_b)
# track whether we need a db lookup (for groups)
# because we can't cache the intersection if we do
# if there are no group filters, this is cacheable
needs_db = False
common_bases = parsed_scopes_a.keys() & parsed_scopes_b.keys() common_bases = parsed_scopes_a.keys() & parsed_scopes_b.keys()
common_filters = {} common_filters = {}
@@ -245,7 +217,6 @@ def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
UserWarning, UserWarning,
) )
warned = True warned = True
needs_db = True
common_filters[base] = { common_filters[base] = {
entity: filters_a[entity] & filters_b[entity] entity: filters_a[entity] & filters_b[entity]
@@ -271,7 +242,6 @@ def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
# resolve group/server hierarchy if db available # resolve group/server hierarchy if db available
servers = servers.difference(common_servers) servers = servers.difference(common_servers)
if db is not None and servers and 'group' in b: if db is not None and servers and 'group' in b:
needs_db = True
for server in servers: for server in servers:
server_groups = groups_for_server(server) server_groups = groups_for_server(server)
if server_groups & b['group']: if server_groups & b['group']:
@@ -299,12 +269,7 @@ def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
if common_users and "user" not in common_filters[base]: if common_users and "user" not in common_filters[base]:
common_filters[base]["user"] = common_users common_filters[base]["user"] = common_users
intersection = unparse_scopes(common_filters) return unparse_scopes(common_filters)
if needs_db:
# return intersection, but don't cache it if it needed db lookups
return DoNotCache(intersection)
return intersection
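`_intersect_expanded_scopes` works base-scope by base-scope, narrowing to the filters both sides share. A much-simplified standalone sketch covering only the plain/filtered case (the real function also resolves group and server membership through the db, which is what the caching changes above are about):

```python
from collections import defaultdict


def parse(scopes):
    """Minimal parse: {'base': set of 'entity=name' filters, or None for unfiltered}."""
    parsed = defaultdict(set)
    for scope in scopes:
        base, _, filt = scope.partition('!')
        if not filt:
            parsed[base] = None  # an unfiltered scope covers everything
        elif parsed[base] is not None:
            parsed[base].add(filt)
    return parsed


def intersect(scopes_a, scopes_b):
    """Simplified intersection: common bases, narrowed to common filters (sketch)."""
    a, b = parse(scopes_a), parse(scopes_b)
    result = set()
    for base in a.keys() & b.keys():
        if a[base] is None and b[base] is None:
            result.add(base)
        elif a[base] is None:
            result |= {f"{base}!{f}" for f in b[base]}
        elif b[base] is None:
            result |= {f"{base}!{f}" for f in a[base]}
        else:
            result |= {f"{base}!{f}" for f in a[base] & b[base]}
    return result


print(intersect({'read:users'}, {'read:users!user=rollo', 'admin:users'}))
# {'read:users!user=rollo'}
```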
def get_scopes_for(orm_object): def get_scopes_for(orm_object):
@@ -332,36 +297,37 @@ def get_scopes_for(orm_object):
f"Only allow orm objects or User wrappers, got {orm_object}" f"Only allow orm objects or User wrappers, got {orm_object}"
) )
owner = None
if isinstance(orm_object, orm.APIToken): if isinstance(orm_object, orm.APIToken):
app_log.debug(f"Authenticated with token {orm_object}")
owner = orm_object.user or orm_object.service owner = orm_object.user or orm_object.service
owner_roles = roles.get_roles_for(owner) token_scopes = roles.expand_roles_to_scopes(orm_object)
owner_scopes = roles.roles_to_expanded_scopes(owner_roles, owner)
token_scopes = set(orm_object.scopes)
if 'inherit' in token_scopes:
# token_scopes includes 'inherit',
# so we know the intersection is exactly the owner's scopes
# only thing we miss by short-circuiting here: warning about excluded extra scopes
return owner_scopes
token_scopes = set(
expand_scopes(
token_scopes,
owner=owner,
oauth_client=orm_object.oauth_client,
)
)
if orm_object.client_id != "jupyterhub": if orm_object.client_id != "jupyterhub":
# oauth tokens can be used to access the service issuing the token, # oauth tokens can be used to access the service issuing the token,
# assuming the owner itself still has permission to do so # assuming the owner itself still has permission to do so
token_scopes.update(access_scopes(orm_object.oauth_client)) spawner = orm_object.oauth_client.spawner
if spawner:
token_scopes.add(
f"access:servers!server={spawner.user.name}/{spawner.name}"
)
else:
service = orm_object.oauth_client.service
if service:
token_scopes.add(f"access:services!service={service.name}")
else:
app_log.warning(
f"Token {orm_object} has no associated service or spawner!"
)
# reduce to collapse multiple filters on the same scope owner_scopes = roles.expand_roles_to_scopes(owner)
# to avoid spurious logs about discarded scopes
token_scopes.update(identify_scopes(owner)) if token_scopes == {'inherit'}:
token_scopes = reduce_scopes(token_scopes) # token_scopes is only 'inherit', return scopes inherited from owner as-is
# short-circuit common case where we don't need to compute an intersection
return owner_scopes
if 'inherit' in token_scopes:
token_scopes.remove('inherit')
token_scopes |= owner_scopes
intersection = _intersect_expanded_scopes( intersection = _intersect_expanded_scopes(
token_scopes, token_scopes,
@@ -373,202 +339,15 @@ def get_scopes_for(orm_object):
# Not taking symmetric difference here because token owner can naturally have more scopes than token # Not taking symmetric difference here because token owner can naturally have more scopes than token
if discarded_token_scopes: if discarded_token_scopes:
app_log.warning( app_log.warning(
f"discarding scopes [{discarded_token_scopes}]," "discarding scopes [%s], not present in owner roles"
f" not present in roles of owner {owner}" % ", ".join(discarded_token_scopes)
)
app_log.debug(
"Owner %s has scopes: %s\nToken has scopes: %s",
owner,
owner_scopes,
token_scopes,
) )
expanded_scopes = intersection expanded_scopes = intersection
# always include identify scopes
expanded_scopes
else: else:
expanded_scopes = roles.roles_to_expanded_scopes( expanded_scopes = roles.expand_roles_to_scopes(orm_object)
roles.get_roles_for(orm_object),
owner=orm_object,
)
if isinstance(orm_object, (orm.User, orm.Service)):
owner = orm_object
return expanded_scopes return expanded_scopes
@lru_cache()
def _expand_self_scope(username):
"""
Users have a metascope 'self' that should be expanded to standard user privileges.
At the moment that is a user-filtered version (optional read) access to
users
users:name
users:groups
users:activity
tokens
servers
access:servers
Arguments:
username (str): user name
Returns:
expanded scopes (set): set of expanded scopes covering standard user privileges
"""
scope_list = [
'read:users',
'read:users:name',
'read:users:groups',
'users:activity',
'read:users:activity',
'servers',
'delete:servers',
'read:servers',
'tokens',
'read:tokens',
'access:servers',
]
# return immutable frozenset because the result is cached
return frozenset(f"{scope}!user={username}" for scope in scope_list)
@lru_cache(maxsize=65535)
def _expand_scope(scope):
"""Returns a scope and all all subscopes
Arguments:
scope (str): the scope to expand
Returns:
expanded scope (set): set of all scope's subscopes including the scope itself
"""
# remove filter, save for later
scope_name, sep, filter_ = scope.partition('!')
# expand scope and subscopes
expanded_scope_names = set()
def _add_subscopes(scope_name):
expanded_scope_names.add(scope_name)
if scope_definitions[scope_name].get('subscopes'):
for subscope in scope_definitions[scope_name].get('subscopes'):
_add_subscopes(subscope)
_add_subscopes(scope_name)
# reapply !filter
if filter_:
expanded_scopes = {
f"{scope_name}!{filter_}" for scope_name in expanded_scope_names
}
# special handling of server filter
# any read access via server filter includes permission to read the user's name
resource, _, value = filter_.partition('=')
if resource == 'server' and any(
scope_name.startswith("read:") for scope_name in expanded_scope_names
):
username, _, server = value.partition('/')
expanded_scopes.add(f'read:users:name!user={username}')
else:
expanded_scopes = expanded_scope_names
# return immutable frozenset because the result is cached
return frozenset(expanded_scopes)
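The removed `_expand_scope` walks the `subscopes` lists in the scope definition table. A standalone sketch of that recursion against a toy definitions table (the table contents here are placeholders, not the full `scope_definitions` dict):

```python
# Toy scope-definition table (shape matches scope_definitions; the real table is larger)
toy_definitions = {
    'read:users': {'subscopes': ['read:users:name', 'read:users:groups']},
    'read:users:name': {},
    'read:users:groups': {},
}


def expand_scope(scope, definitions=toy_definitions):
    """Return a scope plus all of its (recursive) subscopes, re-applying any filter."""
    scope_name, _, filter_ = scope.partition('!')
    expanded = set()

    def add_subscopes(name):
        expanded.add(name)
        for subscope in definitions[name].get('subscopes', []):
            add_subscopes(subscope)

    add_subscopes(scope_name)
    if filter_:
        expanded = {f"{name}!{filter_}" for name in expanded}
    return expanded


print(sorted(expand_scope('read:users!user=rollo')))
# ['read:users!user=rollo', 'read:users:groups!user=rollo', 'read:users:name!user=rollo']
```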
def _expand_scopes_key(scopes, owner=None, oauth_client=None):
"""Cache key function for expand_scopes
scopes is usually a mutable list or set,
which can be hashed as a frozenset
For the owner, we only care about what kind they are,
and their name.
"""
# freeze scopes for hash
frozen_scopes = frozenset(scopes)
if owner is None:
owner_key = None
else:
# owner key is the type and name
owner_key = (type(owner).__name__, owner.name)
if oauth_client is None:
oauth_client_key = None
else:
oauth_client_key = oauth_client.identifier
return (frozen_scopes, owner_key, oauth_client_key)
@lru_cache_key(_expand_scopes_key)
def expand_scopes(scopes, owner=None, oauth_client=None):
"""Returns a set of fully expanded scopes for a collection of raw scopes
Arguments:
scopes (collection(str)): collection of raw scopes
owner (obj, optional): orm.User or orm.Service as owner of orm.APIToken
Used for expansion of metascopes such as `self`
and owner-based filters such as `!user`
oauth_client (obj, optional): orm.OAuthClient
The issuing OAuth client of an API token.
Returns:
expanded scopes (set): set of all expanded scopes, with filters applied for the owner
"""
expanded_scopes = set(chain.from_iterable(map(_expand_scope, scopes)))
filter_replacements = {
"user": None,
"service": None,
"server": None,
}
user_name = None
if isinstance(owner, orm.User):
user_name = owner.name
filter_replacements["user"] = f"user={user_name}"
elif isinstance(owner, orm.Service):
filter_replacements["service"] = f"service={owner.name}"
if oauth_client is not None:
if oauth_client.service is not None:
filter_replacements["service"] = f"service={oauth_client.service.name}"
elif oauth_client.spawner is not None:
spawner = oauth_client.spawner
filter_replacements["server"] = f"server={spawner.user.name}/{spawner.name}"
for scope in expanded_scopes.copy():
base_scope, _, filter = scope.partition('!')
if filter in filter_replacements:
# translate !user into !user={username}
# and !service into !service={servicename}
# and !server into !server={username}/{servername}
expanded_scopes.remove(scope)
expanded_filter = filter_replacements[filter]
if expanded_filter:
# translate
expanded_scopes.add(f'{base_scope}!{expanded_filter}')
else:
warnings.warn(
f"Not expanding !{filter} filter without target {filter} in {scope}",
stacklevel=3,
)
if 'self' in expanded_scopes:
expanded_scopes.remove('self')
if user_name:
expanded_scopes |= _expand_self_scope(user_name)
else:
warnings.warn(
f"Not expanding 'self' scope for owner {owner} which is not a User",
stacklevel=3,
)
# reduce to discard overlapping scopes
# return immutable frozenset because the result is cached
return frozenset(reduce_scopes(expanded_scopes))
def _needs_scope_expansion(filter_, filter_value, sub_scope): def _needs_scope_expansion(filter_, filter_value, sub_scope):
""" """
Check if there is a requirements to expand the `group` scope to individual `user` scopes. Check if there is a requirements to expand the `group` scope to individual `user` scopes.
@@ -633,77 +412,6 @@ def _check_scope_access(api_handler, req_scope, **kwargs):
raise web.HTTPError(404, "No access to resources or resources not found") raise web.HTTPError(404, "No access to resources or resources not found")
def _check_scopes_exist(scopes, who_for=None):
"""Check if provided scopes exist
Arguments:
scopes (list): list of scopes to check
Raises KeyError if scope does not exist
"""
allowed_scopes = set(scope_definitions.keys())
filter_prefixes = ('!user=', '!service=', '!group=', '!server=')
exact_filters = {"!user", "!service", "!server"}
if who_for:
log_for = f"for {who_for}"
else:
log_for = ""
for scope in scopes:
scopename, _, filter_ = scope.partition('!')
if scopename not in allowed_scopes:
if scopename == "all":
raise KeyError("Draft scope 'all' is now called 'inherit'")
raise KeyError(f"Scope '{scope}' {log_for} does not exist")
if filter_:
full_filter = f"!{filter_}"
if full_filter not in exact_filters and not full_filter.startswith(
filter_prefixes
):
raise KeyError(
f"Scope filter {filter_} '{full_filter}' in scope '{scope}' {log_for} does not exist"
)
def _check_token_scopes(scopes, owner, oauth_client):
"""Check that scopes to be assigned to a token
are in fact
Arguments:
scopes: raw or expanded scopes
owner: orm.User or orm.Service
raises:
ValueError: if requested scopes exceed owner's assigned scopes
"""
scopes = set(scopes)
if scopes.issubset({"inherit"}):
# nothing to check for simple 'inherit' scopes
return
scopes.discard("inherit")
# common short circuit
token_scopes = expand_scopes(scopes, owner=owner, oauth_client=oauth_client)
if not token_scopes:
return
owner_scopes = get_scopes_for(owner)
intersection = _intersect_expanded_scopes(
token_scopes,
owner_scopes,
db=sa.inspect(owner).session,
)
excess_scopes = token_scopes - intersection
if excess_scopes:
raise ValueError(
f"Not assigning requested scopes {','.join(excess_scopes)} not held by {owner.__class__.__name__} {owner.name}"
)
@lru_cache_key(frozenset)
def parse_scopes(scope_list): def parse_scopes(scope_list):
""" """
Parses scopes and filters in something akin to JSON style Parses scopes and filters in something akin to JSON style
@@ -739,11 +447,9 @@ def parse_scopes(scope_list):
parsed_scopes[base_scope][key] = {value} parsed_scopes[base_scope][key] = {value}
else: else:
parsed_scopes[base_scope][key].add(value) parsed_scopes[base_scope][key].add(value)
# return immutable FrozenDict because the result is cached return parsed_scopes
return FrozenDict(parsed_scopes)
@lru_cache_key(FrozenDict)
def unparse_scopes(parsed_scopes): def unparse_scopes(parsed_scopes):
"""Turn a parsed_scopes dictionary back into a expanded scopes set""" """Turn a parsed_scopes dictionary back into a expanded scopes set"""
expanded_scopes = set() expanded_scopes = set()
@@ -754,18 +460,7 @@ def unparse_scopes(parsed_scopes):
for entity, names_list in filters.items(): for entity, names_list in filters.items():
for name in names_list: for name in names_list:
expanded_scopes.add(f'{base}!{entity}={name}') expanded_scopes.add(f'{base}!{entity}={name}')
# return immutable frozenset because the result is cached return expanded_scopes
return frozenset(expanded_scopes)
@lru_cache_key(frozenset)
def reduce_scopes(expanded_scopes):
"""Reduce expanded scopes to minimal set
Eliminates overlapping scopes, such as access:services and access:services!service=x
"""
# unparse_scopes already returns a frozenset
return unparse_scopes(parse_scopes(expanded_scopes))
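Reducing is a parse/unparse round trip whose visible effect is that an unfiltered scope swallows its filtered variants. A simplified standalone sketch of that effect (the real `reduce_scopes` goes through `parse_scopes`/`unparse_scopes`):

```python
def reduce_scopes(expanded_scopes):
    """Drop filtered scopes already covered by an unfiltered one (simplified sketch)."""
    unfiltered = {s for s in expanded_scopes if '!' not in s}
    return {
        s for s in expanded_scopes
        if '!' not in s or s.partition('!')[0] not in unfiltered
    }


print(sorted(reduce_scopes({'access:services', 'access:services!service=x', 'read:users!user=y'})))
# ['access:services', 'read:users!user=y']
```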
def needs_scope(*scopes): def needs_scope(*scopes):
@@ -818,69 +513,23 @@ def needs_scope(*scopes):
return scope_decorator return scope_decorator
def _identify_key(obj=None): def identify_scopes(obj):
if obj is None:
return None
else:
return (type(obj).__name__, obj.name)
@lru_cache_key(_identify_key)
def identify_scopes(obj=None):
"""Return 'identify' scopes for an orm object """Return 'identify' scopes for an orm object
Arguments: Arguments:
obj (optional): orm.User or orm.Service obj: orm.User or orm.Service
If not specified, 'raw' scopes for identifying the current user are returned,
which may need to be expanded, later.
Returns: Returns:
identify scopes (set): set of scopes needed for 'identify' endpoints identify scopes (set): set of scopes needed for 'identify' endpoints
""" """
if obj is None: if isinstance(obj, orm.User):
return frozenset(f"read:users:{field}!user" for field in {"name", "groups"}) return {f"read:users:{field}!user={obj.name}" for field in {"name", "groups"}}
elif isinstance(obj, orm.User):
return frozenset(
f"read:users:{field}!user={obj.name}" for field in {"name", "groups"}
)
elif isinstance(obj, orm.Service): elif isinstance(obj, orm.Service):
return frozenset( return {f"read:services:{field}!service={obj.name}" for field in {"name"}}
f"read:services:{field}!service={obj.name}" for field in {"name"}
)
else: else:
raise TypeError(f"Expected orm.User or orm.Service, got {obj!r}") raise TypeError(f"Expected orm.User or orm.Service, got {obj!r}")
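For illustration, the identify scopes produced by the function above have this shape (the user and service names are placeholders):

```python
# for a user named 'rollo': name and group membership, filtered to that user
user_identify_scopes = {
    'read:users:name!user=rollo',
    'read:users:groups!user=rollo',
}

# for a service named 'grader': just its own name
service_identify_scopes = {
    'read:services:name!service=grader',
}
```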
@lru_cache_key(lambda oauth_client: oauth_client.identifier)
def access_scopes(oauth_client):
"""Return scope(s) required to access an oauth client"""
scopes = set()
if oauth_client.identifier == "jupyterhub":
return frozenset()
spawner = oauth_client.spawner
if spawner:
scopes.add(f"access:servers!server={spawner.user.name}/{spawner.name}")
else:
service = oauth_client.service
if service:
scopes.add(f"access:services!service={service.name}")
else:
app_log.warning(
f"OAuth client {oauth_client} has no associated service or spawner!"
)
return frozenset(scopes)
def _check_scope_key(sub_scope, orm_resource, kind):
"""Cache key function for check_scope_filter"""
if kind == 'server':
resource_key = (orm_resource.user.name, orm_resource.name)
else:
resource_key = orm_resource.name
return (sub_scope, resource_key, kind)
@lru_cache_key(_check_scope_key)
def check_scope_filter(sub_scope, orm_resource, kind): def check_scope_filter(sub_scope, orm_resource, kind):
"""Return whether a sub_scope filter applies to a given resource. """Return whether a sub_scope filter applies to a given resource.
@@ -909,8 +558,8 @@ def check_scope_filter(sub_scope, orm_resource, kind):
if kind == 'user' and 'group' in sub_scope: if kind == 'user' and 'group' in sub_scope:
group_names = {group.name for group in orm_resource.groups} group_names = {group.name for group in orm_resource.groups}
user_in_group = bool(group_names & set(sub_scope['group'])) user_in_group = bool(group_names & set(sub_scope['group']))
# cannot cache if we needed to lookup groups in db if user_in_group:
return DoNotCache(user_in_group) return True
return False return False
@@ -949,7 +598,6 @@ def describe_parsed_scopes(parsed_scopes, username=None):
return descriptions return descriptions
@lru_cache_key(lambda raw_scopes, username=None: (frozenset(raw_scopes), username))
def describe_raw_scopes(raw_scopes, username=None): def describe_raw_scopes(raw_scopes, username=None):
"""Return list of descriptions of raw scopes """Return list of descriptions of raw scopes
@@ -980,122 +628,4 @@ def describe_raw_scopes(raw_scopes, username=None):
"filter": filter_text, "filter": filter_text,
} }
) )
# make sure we return immutable from a cached function return descriptions
return tuple(descriptions)
# regex for custom scope
# for-humans description below
# note: scope description duplicated in docs/source/rbac/scopes.md
# update docs when making changes here
_custom_scope_pattern = re.compile(r"^custom:[a-z0-9][a-z0-9_\-\*:]+[a-z0-9_\*]$")
# custom scope pattern description
# used in docstring below and error message when scopes don't match _custom_scope_pattern
_custom_scope_description = """
Custom scopes must start with `custom:`
and contain only lowercase ascii letters, numbers, hyphen, underscore, colon, and asterisk (-_:*).
The part after `custom:` must start with a letter or number.
Scopes may not end with a hyphen or colon.
"""
def define_custom_scopes(scopes):
"""Define custom scopes
Adds custom scopes to the scope_definitions dict.
Scopes must start with `custom:`.
It is recommended to name custom scopes with a pattern like::
custom:$your-project:$action:$resource
e.g.::
custom:jupyter_server:read:contents
That makes them easy to parse and avoids collisions across projects.
`scopes` must have at least one scope definition,
and each scope definition must have a `description`,
which will be displayed on the oauth authorization page,
and _may_ have a `subscopes` list of other scopes if having one scope
should imply having other, more specific scopes.
Args:
scopes: dict
A dictionary of scope definitions.
The keys are the scopes,
while the values are dictionaries with at least a `description` field,
and optional `subscopes` field.
%s
Examples::
define_custom_scopes(
{
"custom:jupyter_server:read:contents": {
"description": "read-only access to files in a Jupyter server",
},
"custom:jupyter_server:read": {
"description": "read-only access to a Jupyter server",
"subscopes": [
"custom:jupyter_server:read:contents",
"custom:jupyter_server:read:kernels",
"...",
},
}
)
""" % indent(
_custom_scope_description, " " * 8
)
for scope, scope_definition in scopes.items():
if scope in scope_definitions and scope_definitions[scope] != scope_definition:
raise ValueError(
f"Cannot redefine scope {scope}={scope_definition}. Already have {scope}={scope_definitions[scope]}"
)
if not _custom_scope_pattern.match(scope):
# note: keep this description in sync with docstring above
raise ValueError(
f"Invalid scope name: {scope!r}.\n{_custom_scope_description}"
" and contain only lowercase ascii letters, numbers, hyphen, underscore, colon, and asterisk."
" The part after `custom:` must start with a letter or number."
" Scopes may not end with a hyphen or colon."
)
if "description" not in scope_definition:
raise ValueError(
f"scope {scope}={scope_definition} missing key 'description'"
)
if "subscopes" in scope_definition:
subscopes = scope_definition["subscopes"]
if not isinstance(subscopes, list) or not all(
isinstance(s, str) for s in subscopes
):
raise ValueError(
f"subscopes must be a list of scope strings, got {subscopes!r}"
)
for subscope in subscopes:
if subscope not in scopes:
if subscope in scope_definitions:
raise ValueError(
f"non-custom subscope {subscope} in {scope}={scope_definition} is not allowed."
f" Custom scopes may only have custom subscopes."
f" Roles should be used to assign multiple scopes together."
)
raise ValueError(
f"subscope {subscope} in {scope}={scope_definition} not found. All scopes must be defined."
)
extra_keys = set(scope_definition.keys()).difference(
["description", "subscopes"]
)
if extra_keys:
warnings.warn(
f"Ignoring unrecognized key(s) {', '.join(extra_keys)!r} in {scope}={scope_definition}",
UserWarning,
stacklevel=2,
)
app_log.info(f"Defining custom scope {scope}")
# deferred evaluation for debug-logging
app_log.debug("Defining custom scope %s=%s", scope, scope_definition)
scope_definitions[scope] = scope_definition


@@ -23,7 +23,6 @@ If you are using OAuth, you will also need to register an oauth callback handler
A tornado implementation is provided in :class:`HubOAuthCallbackHandler`. A tornado implementation is provided in :class:`HubOAuthCallbackHandler`.
""" """
import asyncio
import base64 import base64
import hashlib import hashlib
import json import json
@@ -35,30 +34,27 @@ import string
import time import time
import uuid import uuid
import warnings import warnings
from functools import partial
from http import HTTPStatus
from unittest import mock from unittest import mock
from urllib.parse import urlencode from urllib.parse import urlencode
from tornado.httpclient import AsyncHTTPClient, HTTPRequest import requests
from tornado.httputil import url_concat from tornado.httputil import url_concat
from tornado.log import app_log from tornado.log import app_log
from tornado.web import HTTPError, RequestHandler from tornado.web import HTTPError
from traitlets import ( from tornado.web import RequestHandler
Any, from traitlets import default
Dict, from traitlets import Dict
Instance, from traitlets import Instance
Integer, from traitlets import Integer
Set, from traitlets import observe
Unicode, from traitlets import Set
default, from traitlets import Unicode
observe, from traitlets import validate
validate,
)
from traitlets.config import SingletonConfigurable from traitlets.config import SingletonConfigurable
from ..scopes import _intersect_expanded_scopes from ..scopes import _intersect_expanded_scopes
from ..utils import get_browser_protocol, url_path_join from ..utils import get_browser_protocol
from ..utils import url_path_join
 def check_scopes(required_scopes, scopes):

@@ -346,76 +342,25 @@ class HubAuth(SingletonConfigurable):
     def _default_cache(self):
         return _ExpiringDict(self.cache_max_age)

-    @property
-    def oauth_scopes(self):
-        warnings.warn(
-            "HubAuth.oauth_scopes is deprecated in JupyterHub 3.0. Use .access_scopes"
-        )
-        return self.access_scopes
-
-    access_scopes = Set(
+    oauth_scopes = Set(
         Unicode(),
         help="""OAuth scopes to use for allowing access.

-        Get from $JUPYTERHUB_OAUTH_ACCESS_SCOPES by default.
+        Get from $JUPYTERHUB_OAUTH_SCOPES by default.
         """,
     ).tag(config=True)

-    @default('access_scopes')
+    @default('oauth_scopes')
     def _default_scopes(self):
-        env_scopes = os.getenv('JUPYTERHUB_OAUTH_ACCESS_SCOPES')
-        if not env_scopes:
-            # deprecated name (since 3.0)
-            env_scopes = os.getenv('JUPYTERHUB_OAUTH_SCOPES')
+        env_scopes = os.getenv('JUPYTERHUB_OAUTH_SCOPES')
         if env_scopes:
             return set(json.loads(env_scopes))
-        # scopes not specified, use service name if defined
         service_name = os.getenv("JUPYTERHUB_SERVICE_NAME")
         if service_name:
             return {f'access:services!service={service_name}'}
         return set()
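Either way, a service process receives its scopes from the environment as a JSON-encoded list; only the variable name differs between the two sides of the hunk. A small standalone sketch of reading them (the example value is a placeholder):

```python
import json
import os

# What the Hub exports for a managed service, e.g.
#   JUPYTERHUB_OAUTH_SCOPES='["access:services!service=grader"]'
os.environ.setdefault(
    "JUPYTERHUB_OAUTH_SCOPES", '["access:services!service=grader"]'
)

env_scopes = os.getenv("JUPYTERHUB_OAUTH_SCOPES")
scopes = set(json.loads(env_scopes)) if env_scopes else set()
print(scopes)  # {'access:services!service=grader'}
```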
_pool = Any(help="Thread pool for running async methods in the background") def _check_hub_authorization(self, url, api_token, cache_key=None, use_cache=True):
@default("_pool")
def _new_pool(self):
# start a single ThreadPool in the background
from concurrent.futures import ThreadPoolExecutor
pool = ThreadPoolExecutor(1)
# create an event loop in the thread
pool.submit(self._setup_asyncio_thread).result()
return pool
def _setup_asyncio_thread(self):
"""Create asyncio loop
To be called from the background thread,
so that any thread-local state is setup correctly
"""
self._thread_loop = asyncio.new_event_loop()
def _synchronize(self, async_f, *args, **kwargs):
"""Call an async method in our background thread"""
future = self._pool.submit(
lambda: self._thread_loop.run_until_complete(async_f(*args, **kwargs))
)
return future.result()
def _call_coroutine(self, sync, async_f, *args, **kwargs):
"""Call an async coroutine function, either blocking or returning an awaitable
if not sync: calls function directly, returning awaitable
else: Block on a call in our background thread, return actual result
"""
if not sync:
return async_f(*args, **kwargs)
else:
return self._synchronize(async_f, *args, **kwargs)
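The removed helpers run coroutines on a private event loop inside a single background thread, so that synchronous callers can still block for a result. A self-contained sketch of that pattern (class and function names are illustrative):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor


class BackgroundLoop:
    """Run coroutines on a private event loop in one worker thread (sketch)."""

    def __init__(self):
        self._pool = ThreadPoolExecutor(1)
        # create the loop *inside* the worker thread so thread-local state is set up there
        self._pool.submit(self._setup).result()

    def _setup(self):
        self._loop = asyncio.new_event_loop()

    def run_sync(self, async_f, *args, **kwargs):
        """Block the calling thread until the coroutine finishes."""
        future = self._pool.submit(
            lambda: self._loop.run_until_complete(async_f(*args, **kwargs))
        )
        return future.result()


async def fetch_user(name):
    await asyncio.sleep(0.01)  # stand-in for an HTTP call to the Hub API
    return {"name": name}


bg = BackgroundLoop()
print(bg.run_sync(fetch_user, "rollo"))  # {'name': 'rollo'}
```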
async def _check_hub_authorization(
self, url, api_token, cache_key=None, use_cache=True
):
"""Identify a user with the Hub """Identify a user with the Hub
Args: Args:
@@ -438,7 +383,7 @@ class HubAuth(SingletonConfigurable):
except KeyError: except KeyError:
app_log.debug("HubAuth cache miss: %s", cache_key) app_log.debug("HubAuth cache miss: %s", cache_key)
data = await self._api_request( data = self._api_request(
'GET', 'GET',
url, url,
headers={"Authorization": "token " + api_token}, headers={"Authorization": "token " + api_token},
@@ -453,26 +398,18 @@ class HubAuth(SingletonConfigurable):
self.cache[cache_key] = data self.cache[cache_key] = data
return data return data
async def _api_request(self, method, url, **kwargs): def _api_request(self, method, url, **kwargs):
"""Make an API request""" """Make an API request"""
allow_403 = kwargs.pop('allow_403', False) allow_403 = kwargs.pop('allow_403', False)
headers = kwargs.setdefault('headers', {}) headers = kwargs.setdefault('headers', {})
headers.setdefault('Authorization', f'token {self.api_token}') headers.setdefault('Authorization', 'token %s' % self.api_token)
# translate requests args to tornado's if "cert" not in kwargs and self.certfile and self.keyfile:
if self.certfile: kwargs["cert"] = (self.certfile, self.keyfile)
kwargs["client_cert"] = self.certfile if self.client_ca:
if self.keyfile: kwargs["verify"] = self.client_ca
kwargs["client_key"] = self.keyfile
if self.client_ca:
kwargs["ca_certs"] = self.client_ca
req = HTTPRequest(
url,
method=method,
**kwargs,
)
try: try:
r = await AsyncHTTPClient().fetch(req, raise_error=False) r = requests.request(method, url, **kwargs)
except Exception as e: except requests.ConnectionError as e:
app_log.error("Error connecting to %s: %s", self.api_url, e) app_log.error("Error connecting to %s: %s", self.api_url, e)
msg = "Failed to connect to Hub API at %r." % self.api_url msg = "Failed to connect to Hub API at %r." % self.api_url
msg += ( msg += (
@@ -487,46 +424,35 @@ class HubAuth(SingletonConfigurable):
raise HTTPError(500, msg) raise HTTPError(500, msg)
data = None data = None
try: if r.status_code == 403 and allow_403:
status = HTTPStatus(r.code)
except ValueError:
app_log.error(
f"Unknown error checking authorization with JupyterHub: {r.code}"
)
app_log.error(r.body.decode("utf8", "replace"))
response_text = r.body.decode("utf8", "replace")
if status.value == 403 and allow_403:
pass pass
elif status.value == 403: elif r.status_code == 403:
app_log.error( app_log.error(
"I don't have permission to check authorization with JupyterHub, my auth token may have expired: [%i] %s", "I don't have permission to check authorization with JupyterHub, my auth token may have expired: [%i] %s",
status.value, r.status_code,
status.description, r.reason,
) )
app_log.error(response_text) app_log.error(r.text)
raise HTTPError( raise HTTPError(
500, "Permission failure checking authorization, I may need a new token" 500, "Permission failure checking authorization, I may need a new token"
) )
elif status.value >= 500: elif r.status_code >= 500:
app_log.error( app_log.error(
"Upstream failure verifying auth token: [%i] %s", "Upstream failure verifying auth token: [%i] %s",
status.value, r.status_code,
status.description, r.reason,
) )
app_log.error(response_text) app_log.error(r.text)
raise HTTPError(502, "Failed to check authorization (upstream problem)") raise HTTPError(502, "Failed to check authorization (upstream problem)")
elif status.value >= 400: elif r.status_code >= 400:
app_log.warning( app_log.warning(
"Failed to check authorization: [%i] %s", "Failed to check authorization: [%i] %s", r.status_code, r.reason
status.value,
status.description,
) )
app_log.warning(response_text) app_log.warning(r.text)
msg = "Failed to check authorization" msg = "Failed to check authorization"
# pass on error from oauth failure # pass on error from oauth failure
try: try:
response = json.loads(response_text) response = r.json()
# prefer more specific 'error_description', fallback to 'error' # prefer more specific 'error_description', fallback to 'error'
description = response.get( description = response.get(
"error_description", response.get("error", "Unknown error") "error_description", response.get("error", "Unknown error")
@@ -537,7 +463,7 @@ class HubAuth(SingletonConfigurable):
msg += ": " + description msg += ": " + description
raise HTTPError(500, msg) raise HTTPError(500, msg)
else: else:
data = json.loads(response_text) data = r.json()
return data return data
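One side of this hunk issues the Hub API request with tornado's `AsyncHTTPClient` instead of `requests`. A minimal standalone sketch of a token-authenticated GET in that style (URL and token are placeholders, and the status handling is collapsed to a single check):

```python
import asyncio
import json

from tornado.httpclient import AsyncHTTPClient, HTTPRequest


async def api_request(url, token):
    """Minimal async API request sketch in the tornado style shown above."""
    req = HTTPRequest(
        url,
        method="GET",
        headers={"Authorization": f"token {token}"},
    )
    # raise_error=False: inspect the status code ourselves instead of raising
    resp = await AsyncHTTPClient().fetch(req, raise_error=False)
    body = resp.body.decode("utf8", "replace")
    if resp.code >= 400:
        raise RuntimeError(f"API request failed: [{resp.code}] {body}")
    return json.loads(body)


# example usage (placeholder URL/token):
# data = asyncio.run(api_request("http://127.0.0.1:8081/hub/api/user", "abc123"))
```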
@@ -547,25 +473,19 @@ class HubAuth(SingletonConfigurable):
             "Identifying users by shared cookie is removed in JupyterHub 2.0. Use OAuth tokens."
         )
-    def user_for_token(self, token, use_cache=True, session_id='', *, sync=True):
+    def user_for_token(self, token, use_cache=True, session_id=''):
         """Ask the Hub to identify the user for a given token.
-        .. versionadded:: 2.4
-            async support via `sync` argument.
         Args:
             token (str): the token
             use_cache (bool): Specify use_cache=False to skip cached cookie values (default: True)
-            sync (bool): whether to block for the result or return an awaitable
         Returns:
             user_model (dict): The user model, if a user is identified, None if authentication fails.
                 The 'name' field contains the user's name.
         """
-        return self._call_coroutine(
-            sync,
-            self._check_hub_authorization,
+        return self._check_hub_authorization(
             url=url_path_join(
                 self.api_url,
                 "user",
@@ -610,7 +530,7 @@ class HubAuth(SingletonConfigurable):
         """Base class doesn't store tokens in cookies"""
         return None
-    async def _get_user_cookie(self, handler):
+    def _get_user_cookie(self, handler):
         """Get the user model from a cookie"""
         # overridden in HubOAuth to store the access token after oauth
         return None
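
Either side of this diff is driven the same way from a service; a minimal sketch of the synchronous user_for_token call shown on the right, with a placeholder token value (HubAuth reads JUPYTERHUB_API_TOKEN and JUPYTERHUB_API_URL from the environment by default):

    from jupyterhub.services.auth import HubAuth

    auth = HubAuth()
    user_model = auth.user_for_token("request-token-goes-here")  # placeholder token
    if user_model is None:
        print("token not recognized or expired")
    else:
        print("request is from", user_model["name"])
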
@@ -622,26 +542,20 @@ class HubAuth(SingletonConfigurable):
         """
         return handler.get_cookie('jupyterhub-session-id', '')
-    def get_user(self, handler, *, sync=True):
+    def get_user(self, handler):
         """Get the Hub user for a given tornado handler.
         Checks cookie with the Hub to identify the current user.
-        .. versionadded:: 2.4
-            async support via `sync` argument.
         Args:
             handler (tornado.web.RequestHandler): the current request handler
-            sync (bool): whether to block for the result or return an awaitable
         Returns:
             user_model (dict): The user model, if a user is identified, None if authentication fails.
                 The 'name' field contains the user's name.
         """
-        return self._call_coroutine(sync, self._get_user, handler)
-
-    async def _get_user(self, handler):
         # only allow this to be called once per handler
         # avoids issues if an error is raised,
         # since this may be called again when trying to render the error page
@@ -656,15 +570,13 @@ class HubAuth(SingletonConfigurable):
         # is token-authenticated (CORS-related)
         token = self.get_token(handler, in_cookie=False)
         if token:
-            user_model = await self.user_for_token(
-                token, session_id=session_id, sync=False
-            )
+            user_model = self.user_for_token(token, session_id=session_id)
             if user_model:
                 handler._token_authenticated = True
         # no token, check cookie
         if user_model is None:
-            user_model = await self._get_user_cookie(handler)
+            user_model = self._get_user_cookie(handler)
         # cache result
         handler._cached_hub_user = user_model
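
Handlers rarely call get_user directly; they mix in HubAuthenticated, whose get_current_user runs the token-then-cookie flow above. A minimal sketch (the handler name is illustrative):

    from jupyterhub.services.auth import HubAuthenticated
    from tornado import web

    class WhoAmIHandler(HubAuthenticated, web.RequestHandler):
        @web.authenticated  # invokes HubAuth.get_user via get_current_user
        def get(self):
            user_model = self.get_current_user()
            self.write(user_model)  # tornado serializes the dict as JSON
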
@@ -724,13 +636,11 @@ class HubOAuth(HubAuth):
             token = token.decode('ascii', 'replace')
         return token
-    async def _get_user_cookie(self, handler):
+    def _get_user_cookie(self, handler):
         token = self._get_token_cookie(handler)
         session_id = self.get_session_id(handler)
         if token:
-            user_model = await self.user_for_token(
-                token, session_id=session_id, sync=False
-            )
+            user_model = self.user_for_token(token, session_id=session_id)
             if user_model is None:
                 app_log.warning("Token stored in cookie may have expired")
                 handler.clear_cookie(self.cookie_name)
@@ -785,7 +695,7 @@ class HubOAuth(HubAuth):
     def _token_url(self):
         return url_path_join(self.api_url, 'oauth2/token')
-    def token_for_code(self, code, *, sync=True):
+    def token_for_code(self, code):
         """Get token for OAuth temporary code
         This is the last step of OAuth login.
@@ -796,9 +706,6 @@ class HubOAuth(HubAuth):
         Returns:
             token (str): JupyterHub API Token
         """
-        return self._call_coroutine(sync, self._token_for_code, code)
-
-    async def _token_for_code(self, code):
         # GitHub specifies a POST request yet requires URL parameters
         params = dict(
             client_id=self.oauth_client_id,
@@ -808,10 +715,10 @@ class HubOAuth(HubAuth):
             redirect_uri=self.oauth_redirect_uri,
         )
-        token_reply = await self._api_request(
+        token_reply = self._api_request(
             'POST',
             self.oauth_token_url,
-            body=urlencode(params).encode('utf8'),
+            data=urlencode(params).encode('utf8'),
             headers={'Content-Type': 'application/x-www-form-urlencoded'},
         )
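
The exchange above is a standard OAuth2 authorization-code grant against the Hub's oauth2/token endpoint; roughly equivalent standalone, with placeholder URLs and credentials:

    import requests

    params = {
        "client_id": "service-my-service",          # placeholder oauth client id
        "client_secret": "the-service-api-token",   # placeholder secret
        "grant_type": "authorization_code",
        "code": "code-from-the-oauth-callback",
        "redirect_uri": "http://127.0.0.1:10101/services/my-service/oauth_callback",
    }
    r = requests.post(
        "http://127.0.0.1:8081/hub/api/oauth2/token",
        data=params,  # form-encoded, as with urlencode(params) above
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    r.raise_for_status()
    access_token = r.json()["access_token"]
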
@@ -965,8 +872,8 @@ class HubAuthenticated:
     - .hub_auth: A HubAuth instance
     - .hub_scopes: A set of JupyterHub 2.0 OAuth scopes to allow.
-      Default comes from .hub_auth.oauth_access_scopes,
-      which in turn is set by $JUPYTERHUB_OAUTH_ACCESS_SCOPES
+      Default comes from .hub_auth.oauth_scopes,
+      which in turn is set by $JUPYTERHUB_OAUTH_SCOPES
       Default values include:
       - 'access:services', 'access:services!service={service_name}' for services
       - 'access:servers', 'access:servers!user={user}',
@@ -1005,8 +912,8 @@ class HubAuthenticated:
     @property
     def hub_scopes(self):
-        """Set of allowed scopes (use hub_auth.access_scopes by default)"""
-        return self.hub_auth.access_scopes or None
+        """Set of allowed scopes (use hub_auth.oauth_scopes by default)"""
+        return self.hub_auth.oauth_scopes or None
     @property
     def allow_all(self):
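
A handler can also pin .hub_scopes itself instead of relying on the environment-provided default; a sketch with an illustrative service name and scope:

    from jupyterhub.services.auth import HubOAuthenticated
    from tornado import web

    class ProtectedHandler(HubOAuthenticated, web.RequestHandler):
        # only callers granted access to this service are allowed through
        hub_scopes = {"access:services!service=my-service"}

        @web.authenticated
        def get(self):
            self.write({"user": self.current_user["name"]})
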
@@ -1216,12 +1123,10 @@ class HubOAuthCallbackHandler(HubOAuthenticated, RequestHandler):
             app_log.warning("oauth state %r != %r", arg_state, cookie_state)
             raise HTTPError(403, "oauth state does not match. Try logging in again.")
         next_url = self.hub_auth.get_next_url(cookie_state)
-        token = await self.hub_auth.token_for_code(code, sync=False)
+        # TODO: make async (in a Thread?)
+        token = self.hub_auth.token_for_code(code)
         session_id = self.hub_auth.get_session_id(self)
-        user_model = await self.hub_auth.user_for_token(
-            token, session_id=session_id, sync=False
-        )
+        user_model = self.hub_auth.user_for_token(token, session_id=session_id)
         if user_model is None:
             raise HTTPError(500, "oauth callback failed to identify a user")
         app_log.info("Logged-in user %s", user_model)
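
Wiring the callback handler into a standalone tornado service typically looks like the sketch below; the port and extra routes are illustrative, and JUPYTERHUB_SERVICE_PREFIX is set by the Hub when it launches the service:

    import os

    from jupyterhub.services.auth import HubOAuthCallbackHandler
    from tornado import ioloop, web

    prefix = os.environ.get("JUPYTERHUB_SERVICE_PREFIX", "/")
    app = web.Application(
        [
            (prefix + "oauth_callback", HubOAuthCallbackHandler),
            # ... the service's own HubOAuthenticated handlers go here
        ],
        cookie_secret=os.urandom(32),  # required for the state/token cookies
    )
    app.listen(10101)
    ioloop.IOLoop.current().start()
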


@@ -45,22 +45,21 @@ import pipes
 import shutil
 from subprocess import Popen
-from traitlets import (
-    Any,
-    Bool,
-    Dict,
-    HasTraits,
-    Instance,
-    List,
-    Unicode,
-    default,
-    validate,
-)
+from traitlets import Any
+from traitlets import Bool
+from traitlets import default
+from traitlets import Dict
+from traitlets import HasTraits
+from traitlets import Instance
+from traitlets import List
+from traitlets import Unicode
+from traitlets import validate
 from traitlets.config import LoggingConfigurable
 from .. import orm
 from ..objects import Server
-from ..spawner import LocalProcessSpawner, set_user_setuid
+from ..spawner import LocalProcessSpawner
+from ..spawner import set_user_setuid
 from ..traitlets import Command
 from ..utils import url_path_join
@@ -102,8 +101,8 @@ class _ServiceSpawner(LocalProcessSpawner):
     cmd = Command(minlen=0)
     _service_name = Unicode()
-    @default("oauth_access_scopes")
-    def _default_oauth_access_scopes(self):
+    @default("oauth_scopes")
+    def _default_oauth_scopes(self):
         return [
             "access:services",
             f"access:services!service={self._service_name}",
@@ -203,14 +202,7 @@ class Service(LoggingConfigurable):
     oauth_roles = List(
         help="""OAuth allowed roles.
-        DEPRECATED in 3.0: use oauth_client_allowed_scopes
-        """
-    ).tag(input=True)
-    oauth_client_allowed_scopes = List(
-        help="""OAuth allowed scopes.
-        This sets the maximum and default scopes
+        This sets the maximum and default roles
         assigned to oauth tokens issued for this service
         (i.e. tokens stored in browsers after authenticating with the server),
         defining what actions the service can take on behalf of logged-in users.
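
On the 2.3 side this trait is fed from the service entry in jupyterhub_config.py; a sketch with an illustrative service (c is the config object JupyterHub provides to the config file):

    c.JupyterHub.services = [
        {
            "name": "my-service",
            "command": ["python3", "-m", "my_service"],
            # roles whose scopes become the maximum/default for this service's oauth tokens
            "oauth_roles": ["user"],
        }
    ]
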


@@ -2,8 +2,10 @@
 Contains default notebook-app subclass and mixins
 """
-from .app import SingleUserNotebookApp, main
-from .mixins import HubAuthenticatedHandler, make_singleuser_app
+from .app import main
+from .app import SingleUserNotebookApp
+from .mixins import HubAuthenticatedHandler
+from .mixins import make_singleuser_app
 # backward-compatibility
 JupyterHubLoginHandler = SingleUserNotebookApp.login_handler_class


@@ -29,9 +29,9 @@ else:
         try:
             App = import_item(JUPYTERHUB_SINGLEUSER_APP)
         except ImportError as e:
-            continue
             if _import_error is None:
                 _import_error = e
+            continue
         else:
             break
     if App is None:
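
Both sides of this hunk implement the same idea: try each candidate application, remember the first ImportError, and re-raise it only if nothing imports. A standalone sketch of that selection loop (candidate list and names are illustrative):

    import os

    from traitlets.utils.importstring import import_item

    candidates = [
        os.environ.get("JUPYTERHUB_SINGLEUSER_APP"),  # explicit override, if set
        "jupyter_server.serverapp.ServerApp",
        "notebook.notebookapp.NotebookApp",
    ]

    App = None
    _import_error = None
    for name in filter(None, candidates):
        try:
            App = import_item(name)
        except ImportError as e:
            if _import_error is None:
                _import_error = e  # keep the first failure for the final error
            continue
        else:
            break
    if App is None:
        raise _import_error
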

Some files were not shown because too many files have changed in this diff