Compare commits

...

86 Commits
4.0.1 ... 4.1.4

Author SHA1 Message Date
Min RK
42191672ac Bump to 4.1.4 2024-03-30 09:58:35 +01:00
Min RK
669d8d7b65 Merge pull request #4764 from minrk/414
changelog for 4.1.4
2024-03-30 09:58:09 +01:00
Min RK
171026583c changelog for 4.1.4 2024-03-30 09:55:16 +01:00
Min RK
78a3dc5b01 Merge pull request #4759 from minrk/xsrf-no-navigate
avoid xsrf check on navigate GET requests
2024-03-30 09:53:09 +01:00
Min RK
21c37309a5 avoid xsrf check on navigate GET requests
services/auth prevents calling check_xsrf_cookie,
but if the Handler itself called it the newly strict check would still be applied

this ensures the check is actually allowed for navigate GET requests
2024-03-29 09:55:49 +01:00
Min RK
3d40be5890 Bump to 4.1.3 2024-03-26 10:07:04 +01:00
Min RK
ac72c60cb3 Merge pull request #4754 from minrk/413
changelog for 4.1.3
2024-03-26 10:06:38 +01:00
Min RK
92264696b1 changelog for 4.1.3 2024-03-26 09:44:07 +01:00
Min RK
f2b7b69c3e Merge pull request #4753 from minrk/server-xsrf-config
respect jupyter-server disable_check_xsrf setting
2024-03-26 09:42:54 +01:00
Min RK
e0f001271b respect jupyter-server disable_check_xsrf setting
allows global disable of xsrf checks in single-user servers
2024-03-26 08:55:15 +01:00
Min RK
d27e760677 Bump to 4.1.2 2024-03-25 20:47:05 +01:00
Min RK
3999556ed8 Merge pull request #4751 from minrk/cl-412
changelog for 4.1.2
2024-03-25 20:44:56 +01:00
Min RK
ff14797b9b changelog for 4.1.2 2024-03-25 20:39:47 +01:00
Min RK
f0cbec191e Merge pull request #4750 from minrk/fix-named-server
rework handling of multiple xsrf tokens
2024-03-25 20:36:15 +01:00
Min RK
87c2aebb5c avoid cycle on current_user in _set_xsrf_cookie
pass authenticated explicitly
2024-03-25 13:50:17 +01:00
Min RK
e0ea52af49 rework handling of multiple xsrf tokens
rather than attempting to clear multiple tokens (too complicated, breaks named servers)
look for and accept first valid token

have to do our own cookie parsing because existing cookie implementations only return a single value for each key
and default to selecting the _least_ likely to be correct, according to RFCs.

set updated xsrf cookie on login to avoid needing two requests to get the right cookie
2024-03-25 12:32:15 +01:00
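The cookie-parsing point in the commit message above can be sketched as follows. This is a hypothetical helper, not JupyterHub's actual code: standard cookie parsers keep only one value per key, so a request carrying several `_xsrf` cookies (e.g. set on different paths) would lose all but one of them.

```python
def get_cookie_values(cookie_header, name):
    """Return every value set for `name` in a raw Cookie header, in order.

    Standard parsers such as http.cookies.SimpleCookie return a single
    value per key, so to "look for and accept the first valid token"
    a server has to parse the header itself.
    """
    values = []
    for part in cookie_header.split(";"):
        key, sep, value = part.strip().partition("=")
        if sep and key == name:
            values.append(value)
    return values
```

A server would then iterate over the returned values and use the first one that validates, instead of trying to clear duplicates.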
Min RK
b7b2558ab7 Bump to 4.1.1 2024-03-23 17:16:49 +01:00
Min RK
a77e57290e Merge pull request #4746 from minrk/411
changelog for 4.1.1
2024-03-23 17:16:34 +01:00
Min RK
506f931f15 changelog for 4.1.1 2024-03-23 16:03:30 +01:00
Min RK
b094381f79 Merge pull request #4745 from minrk/inject-xsrf
allow subclasses to override xsrf check
2024-03-23 16:01:00 +01:00
Min RK
e6e85eebc1 allow subclasses to override xsrf check
need to inject our override into the base class,
rather than at the instance level,
to avoid clobbering any overrides in extensions like jupyter-server-proxy
2024-03-23 00:00:04 +01:00
Min RK
984a67932f Bump to 4.1.0 2024-03-20 13:38:27 +01:00
Min RK
133dda26cc sync publish workflow with main 2024-03-20 13:26:14 +01:00
Min RK
e797e31ef9 fix check links for unpublished advisories 2024-03-20 13:23:54 +01:00
Min RK
e2798a088f Merge pull request from GHSA-7r3h-4ph8-w38g
4.1.0
2024-03-20 13:19:29 +01:00
Min RK
3fa60e6849 4.1.0 2024-03-19 13:33:35 +01:00
Min RK
aeeabbee07 Merge pull request #4735 from meeseeksmachine/auto-backport-of-pr-4628-on-4.x
Backport PR #4628 on branch 4.x (Include LDAP groups in local spawner gids)
2024-03-19 12:48:49 +01:00
Min RK
999c58f584 Backport PR #4628: Include LDAP groups in local spawner gids 2024-03-19 09:34:16 +00:00
Min RK
513c61321f Merge pull request #4734 from meeseeksmachine/auto-backport-of-pr-4733-on-4.x
Backport PR #4733 on branch 4.x (Catch ValueError while waiting for server to be reachable)
2024-03-19 10:03:27 +01:00
Min RK
715c8599b3 Backport PR #4733: Catch ValueError while waiting for server to be reachable 2024-03-19 08:38:04 +00:00
Min RK
63e118f144 Merge pull request #4714 from meeseeksmachine/auto-backport-of-pr-4561-on-4.x
Backport PR #4561 on branch 4.x (Improve debugging when waiting for servers)
2024-02-29 15:13:16 +01:00
Erik Sundell
05e569cb42 Backport PR #4561: Improve debugging when waiting for servers 2024-02-28 13:49:35 +00:00
Min RK
d4c7d9748a Merge pull request #4707 from meeseeksmachine/auto-backport-of-pr-4563-on-4.x
Backport PR #4563 on branch 4.x (only set 'domain' field on session-id cookie)
2024-02-26 11:29:43 +01:00
Min RK
d8f404d25e Backport PR #4563: only set 'domain' field on session-id cookie 2024-02-13 14:08:01 +00:00
Min RK
4492b508a1 Merge pull request #4705 from minrk/backport-4679
Backport PR #4679 on branch 4.x (Unescape jinja username)
2024-02-12 15:41:50 +01:00
Min RK
6221f27c19 add missing init for jupyterhub.tests.browser 2024-02-12 14:24:11 +01:00
Min RK
77ae4401a1 Backport PR #4679 on branch 4.x (Unescape jinja username) 2024-02-12 14:03:17 +01:00
Min RK
df7ae422f6 Merge pull request #4687 from meeseeksmachine/auto-backport-of-pr-4560-on-4.x
Backport PR #4560 on branch 4.x (singleuser extension: persist token from ?token=... url in cookie)
2024-02-06 10:39:10 +01:00
Min RK
a0dd715bf7 Merge pull request #4689 from meeseeksmachine/auto-backport-of-pr-4542-on-4.x
Backport PR #4542 on branch 4.x (Fix include_stopped_servers in paginated next_url)
2024-02-06 10:16:59 +01:00
Min RK
bfccb9af73 Merge pull request #4698 from minrk/backport-quay
Backport quay.io publishing
2024-02-06 10:01:00 +01:00
Min RK
fd14165da3 Merge pull request #4692 from meeseeksmachine/auto-backport-of-pr-4677-on-4.x
Backport PR #4677 on branch 4.x (Improve validation, docs for token.expires_in)
2024-02-06 10:00:49 +01:00
Min RK
5778d8fa48 Merge pull request #4696 from minrk/backport-4632
Backport PR #4632: simplify, avoid errors in parsing accept headers
2024-02-06 10:00:40 +01:00
Min RK
cd51660eff IS_JUPYVERSE hasn't been backported
make it always False
2024-02-06 09:18:06 +01:00
Erik Sundell
6af20e79cf Backport PR #4560: singleuser extension: persist token from ?token=... url in cookie 2024-02-06 09:18:06 +01:00
Min RK
262557579f Merge pull request #4697 from minrk/backport-4630
Backport PR #4630: avoid setting unused oauth state cookies on API requests
2024-02-06 09:14:06 +01:00
Min RK
77a6d75d70 Bump to 4.1.0.dev 2024-02-06 09:13:05 +01:00
Min RK
6f3be4b697 Backport PR #4641: Publish to Docker Hub alongside Quay.io
Publish to Docker Hub alongside Quay.io

(cherry picked from commit 7613ba170f)
2024-02-06 09:09:14 +01:00
Min RK
d4bfbdfde2 Backport PR #4612: Move from dockerhub to quay.io
Move from dockerhub to quay.io

(cherry picked from commit 60802b2b76)
2024-02-06 09:08:07 +01:00
Min RK
10f507e83b Backport PR #4632: simplify, avoid errors in parsing accept headers 2024-02-06 09:02:39 +01:00
Min RK
0bbda9a45e Merge pull request #4691 from meeseeksmachine/auto-backport-of-pr-4570-on-4.x
Backport PR #4570 on branch 4.x (fix mutation of frozenset in scope intersection)
2024-02-06 09:00:38 +01:00
Min RK
c8bb3a3679 Merge pull request #4690 from meeseeksmachine/auto-backport-of-pr-4562-on-4.x
Backport PR #4562 on branch 4.x (Use `user.stop` to cleanup spawners that stopped while Hub was down)
2024-02-06 08:59:21 +01:00
Min RK
1a65858968 Merge pull request #4688 from meeseeksmachine/auto-backport-of-pr-4651-on-4.x
Backport PR #4651 on branch 4.x (avoid attempting to patch removed IPythonHandler with notebook v7)
2024-02-06 08:59:08 +01:00
Min RK
2aa28e1a1f Backport PR #4630: avoid setting unused oauth state cookies on API requests 2024-02-06 08:51:25 +01:00
Min RK
dbd90b1bfe Merge pull request #4695 from minrk/backport-pr-4617
Backport PR #4617: try to improve reliability of test_external_proxy
2024-02-06 08:49:34 +01:00
Min RK
7a3ff4028a Merge pull request #4694 from meeseeksmachine/auto-backport-of-pr-4618-on-4.x
Backport PR #4618 on branch 4.x (browser test: wait for token request to finish before reloading)
2024-02-06 08:42:06 +01:00
Erik Sundell
44518d00c2 Backport PR #4617: try to improve reliability of test_external_proxy 2024-02-06 08:35:57 +01:00
Erik Sundell
051848d1ef Backport PR #4618: browser test: wait for token request to finish before reloading 2024-02-06 07:24:50 +00:00
Erik Sundell
5e57e0141a Backport PR #4677: Improve validation, docs for token.expires_in 2024-02-05 14:30:06 +00:00
Min RK
6cfa789d6a Backport PR #4570: fix mutation of frozenset in scope intersection 2024-02-05 14:28:09 +00:00
Erik Sundell
55c3211ec2 Backport PR #4562: Use user.stop to cleanup spawners that stopped while Hub was down 2024-02-05 14:27:04 +00:00
Min RK
603ba309f5 Backport PR #4542: Fix include_stopped_servers in paginated next_url 2024-02-05 14:26:15 +00:00
Simon Li
6337b695bb Backport PR #4651: avoid attempting to patch removed IPythonHandler with notebook v7 2024-02-05 14:25:25 +00:00
Min RK
ee9e509ab5 Merge pull request #4685 from minrk/4.x
preparing 4.x branch
2024-01-31 11:01:51 +01:00
Min RK
f0e049226d BIDS Teaching video appears to be gone 2024-01-31 09:18:00 +01:00
Min RK
7ffb0b0719 skip linkcheck for linux.die.net
links work, but site seems to block linkcheck requests from CI with 403
2024-01-31 09:17:10 +01:00
YuviPanda
825e8aacea Remove links to okpy from docs
These were removed in
https://github.com/jupyterhub/oauthenticator/pull/691,
and now the link checker is not happy.
2024-01-31 09:16:24 +01:00
Min RK
55213f6f53 run pre-commit
black adds some blank lines
2024-01-30 14:32:25 +01:00
Min RK
32dfe70a01 update pre-commit 2024-01-30 14:30:51 +01:00
Min RK
9db326fb7a pin some test dependencies 2024-01-30 14:30:14 +01:00
Min RK
0e7689f277 Bump to 4.0.2 2023-08-10 11:27:56 +02:00
Min RK
b677655572 Merge pull request #4535 from meeseeksmachine/auto-backport-of-pr-4534-on-4.x
Backport PR #4534 on branch 4.x (Changelog for 4.0.2)
2023-08-10 11:26:50 +02:00
Min RK
9adc871448 Backport PR #4534: Changelog for 4.0.2 2023-08-10 09:19:16 +00:00
Min RK
29d6540333 Merge pull request #4533 from meeseeksmachine/auto-backport-of-pr-4489-on-4.x
Backport PR #4489 on branch 4.x (improve permission-denied errors for various cases)
2023-08-10 11:01:08 +02:00
Erik Sundell
5a4949faa5 Backport PR #4489: improve permission-denied errors for various cases 2023-08-10 08:13:44 +00:00
Min RK
f2ab23b376 Merge pull request #4531 from meeseeksmachine/auto-backport-of-pr-4475-on-4.x
Backport PR #4475 on branch 4.x (Allow setting custom log_function in tornado_settings in SingleUserServer)
2023-08-09 15:24:45 +02:00
Min RK
b61582420a Merge pull request #4532 from meeseeksmachine/auto-backport-of-pr-4522-on-4.x
Backport PR #4522 on branch 4.x (document how to use notebook v7 with jupyterhub)
2023-08-09 15:24:34 +02:00
Simon Li
f11ae34b73 Backport PR #4522: document how to use notebook v7 with jupyterhub 2023-08-09 11:12:50 +00:00
Min RK
e91ab50d1b Backport PR #4475: Allow setting custom log_function in tornado_settings in SingleUserServer 2023-08-09 11:03:55 +00:00
Min RK
4cb3a45ce4 Merge pull request #4529 from meeseeksmachine/auto-backport-of-pr-4523-on-4.x
Backport PR #4523 on branch 4.x (doc: update notebook config URL)
2023-08-09 12:40:11 +02:00
Min RK
4e8f9b4334 Merge pull request #4528 from meeseeksmachine/auto-backport-of-pr-4503-on-4.x
Backport PR #4503 on branch 4.x (set root_dir when using singleuser extension)
2023-08-09 12:31:33 +02:00
Min RK
6131f2dbaa Merge pull request #4530 from meeseeksmachine/auto-backport-of-pr-4491-on-4.x
Backport PR #4491 on branch 4.x (avoid counting failed requests to not-running servers as 'activity')
2023-08-09 12:31:07 +02:00
Min RK
a9dc588454 can't use f"{name=}" in Python 3.7 2023-08-09 11:56:03 +02:00
Erik Sundell
537b2eaff6 Backport PR #4491: avoid counting failed requests to not-running servers as 'activity' 2023-08-09 09:53:20 +00:00
Simon Li
7f8a981aed Backport PR #4523: doc: update notebook config URL 2023-08-09 09:52:41 +00:00
Erik Sundell
bc86e4c8f5 Backport PR #4503: set root_dir when using singleuser extension 2023-08-09 09:50:48 +00:00
Min RK
20f75c0018 Bump to 4.1.0.dev 2023-06-12 15:29:13 +02:00
109 changed files with 2184 additions and 413 deletions

View File

@@ -30,16 +30,16 @@ on:
jobs:
build-release:
runs-on: ubuntu-20.04
runs-on: ubuntu-22.04
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.9"
python-version: "3.11"
- uses: actions/setup-node@v3
- uses: actions/setup-node@v4
with:
node-version: "14"
node-version: "20"
- name: install build requirements
run: |
@@ -67,7 +67,7 @@ jobs:
docker run --rm -v $PWD/dist:/dist:ro docker.io/library/python:3.9-slim-bullseye bash -c 'pip install /dist/jupyterhub-*.tar.gz'
# ref: https://github.com/actions/upload-artifact#readme
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: jupyterhub-${{ github.sha }}
path: "dist/*"
@@ -83,7 +83,7 @@ jobs:
twine upload --skip-existing dist/*
publish-docker:
runs-on: ubuntu-20.04
runs-on: ubuntu-22.04
timeout-minutes: 20
services:
@@ -97,39 +97,35 @@ jobs:
- name: Should we push this image to a public registry?
run: |
if [ "${{ startsWith(github.ref, 'refs/tags/') || (github.ref == 'refs/heads/main') }}" = "true" ]; then
# Empty => Docker Hub
echo "REGISTRY=" >> $GITHUB_ENV
echo "REGISTRY=quay.io/" >> $GITHUB_ENV
else
echo "REGISTRY=localhost:5000/" >> $GITHUB_ENV
fi
- uses: actions/checkout@v3
- uses: actions/checkout@v4
# Setup docker to build for multiple platforms, see:
# https://github.com/docker/build-push-action/tree/v2.4.0#usage
# https://github.com/docker/build-push-action/blob/v2.4.0/docs/advanced/multi-platform.md
- name: Set up QEMU (for docker buildx)
uses: docker/setup-qemu-action@v2
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx (for multi-arch builds)
uses: docker/setup-buildx-action@v2
uses: docker/setup-buildx-action@v3
with:
# Allows pushing to registry on localhost:5000
driver-opts: network=host
- name: Setup push rights to Docker Hub
# This was setup by...
# 1. Creating a Docker Hub service account "jupyterhubbot"
# 2. Creating a access token for the service account specific to this
# repository: https://hub.docker.com/settings/security
# 3. Making the account part of the "bots" team, and granting that team
# permissions to push to the relevant images:
# https://hub.docker.com/orgs/jupyterhub/teams/bots/permissions
# 4. Registering the username and token as a secret for this repo:
# https://github.com/jupyterhub/jupyterhub/settings/secrets/actions
# 1. Creating a [Robot Account](https://quay.io/organization/jupyterhub?tab=robots) in the JupyterHub
#    Quay.io org
# 2. Giving it enough permissions to push to the jupyterhub and singleuser images
# 3. Putting the robot account's username and password in GitHub actions environment
if: env.REGISTRY != 'localhost:5000/'
run: |
docker login -u "${{ secrets.DOCKERHUB_USERNAME }}" -p "${{ secrets.DOCKERHUB_TOKEN }}"
docker login -u "${{ secrets.QUAY_USERNAME }}" -p "${{ secrets.QUAY_PASSWORD }}" "${{ env.REGISTRY }}"
docker login -u "${{ secrets.DOCKERHUB_USERNAME }}" -p "${{ secrets.DOCKERHUB_TOKEN }}" docker.io
# image: jupyterhub/jupyterhub
#
@@ -142,15 +138,17 @@ jobs:
# If GITHUB_TOKEN isn't available (e.g. in PRs) returns no tags [].
- name: Get list of jupyterhub tags
id: jupyterhubtags
uses: jupyterhub/action-major-minor-tag-calculator@v2
uses: jupyterhub/action-major-minor-tag-calculator@v3
with:
githubToken: ${{ secrets.GITHUB_TOKEN }}
prefix: "${{ env.REGISTRY }}jupyterhub/jupyterhub:"
prefix: >-
${{ env.REGISTRY }}jupyterhub/jupyterhub:
jupyterhub/jupyterhub:
defaultTag: "${{ env.REGISTRY }}jupyterhub/jupyterhub:noref"
branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub
uses: docker/build-push-action@3b5e8027fcad23fda98b2e3ac259d8d67585f671
uses: docker/build-push-action@v5
with:
context: .
platforms: linux/amd64,linux/arm64
@@ -163,15 +161,17 @@ jobs:
#
- name: Get list of jupyterhub-onbuild tags
id: onbuildtags
uses: jupyterhub/action-major-minor-tag-calculator@v2
uses: jupyterhub/action-major-minor-tag-calculator@v3
with:
githubToken: ${{ secrets.GITHUB_TOKEN }}
prefix: "${{ env.REGISTRY }}jupyterhub/jupyterhub-onbuild:"
prefix: >-
${{ env.REGISTRY }}jupyterhub/jupyterhub-onbuild:
jupyterhub/jupyterhub-onbuild:
defaultTag: "${{ env.REGISTRY }}jupyterhub/jupyterhub-onbuild:noref"
branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub-onbuild
uses: docker/build-push-action@3b5e8027fcad23fda98b2e3ac259d8d67585f671
uses: docker/build-push-action@v5
with:
build-args: |
BASE_IMAGE=${{ fromJson(steps.jupyterhubtags.outputs.tags)[0] }}
@@ -184,15 +184,17 @@ jobs:
#
- name: Get list of jupyterhub-demo tags
id: demotags
uses: jupyterhub/action-major-minor-tag-calculator@v2
uses: jupyterhub/action-major-minor-tag-calculator@v3
with:
githubToken: ${{ secrets.GITHUB_TOKEN }}
prefix: "${{ env.REGISTRY }}jupyterhub/jupyterhub-demo:"
prefix: >-
${{ env.REGISTRY }}jupyterhub/jupyterhub-demo:
jupyterhub/jupyterhub-demo:
defaultTag: "${{ env.REGISTRY }}jupyterhub/jupyterhub-demo:noref"
branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub-demo
uses: docker/build-push-action@3b5e8027fcad23fda98b2e3ac259d8d67585f671
uses: docker/build-push-action@v5
with:
build-args: |
BASE_IMAGE=${{ fromJson(steps.onbuildtags.outputs.tags)[0] }}
@@ -208,15 +210,17 @@ jobs:
#
- name: Get list of jupyterhub/singleuser tags
id: singleusertags
uses: jupyterhub/action-major-minor-tag-calculator@v2
uses: jupyterhub/action-major-minor-tag-calculator@v3
with:
githubToken: ${{ secrets.GITHUB_TOKEN }}
prefix: "${{ env.REGISTRY }}jupyterhub/singleuser:"
prefix: >-
${{ env.REGISTRY }}jupyterhub/singleuser:
jupyterhub/singleuser:
defaultTag: "${{ env.REGISTRY }}jupyterhub/singleuser:noref"
branchRegex: ^\w[\w-.]*$
- name: Build and push jupyterhub/singleuser
uses: docker/build-push-action@3b5e8027fcad23fda98b2e3ac259d8d67585f671
uses: docker/build-push-action@v5
with:
build-args: |
JUPYTERHUB_VERSION=${{ github.ref_type == 'tag' && github.ref_name || format('git:{0}', github.sha) }}

View File

@@ -100,6 +100,9 @@ jobs:
subset: singleuser
- python: "3.11"
browser: browser
- python: "3.11"
subdomain: subdomain
browser: browser
- python: "3.11"
main_dependencies: main_dependencies

View File

@@ -16,7 +16,7 @@ ci:
repos:
# Autoformat: Python code, syntax patterns are modernized
- repo: https://github.com/asottile/pyupgrade
rev: v3.4.0
rev: v3.15.0
hooks:
- id: pyupgrade
args:
@@ -24,7 +24,7 @@ repos:
# Autoformat: Python code
- repo: https://github.com/PyCQA/autoflake
rev: v2.1.1
rev: v2.2.1
hooks:
- id: autoflake
# args ref: https://github.com/PyCQA/autoflake#advanced-usage
@@ -33,25 +33,25 @@ repos:
# Autoformat: Python code
- repo: https://github.com/pycqa/isort
rev: 5.12.0
rev: 5.13.2
hooks:
- id: isort
# Autoformat: Python code
- repo: https://github.com/psf/black
rev: 23.3.0
rev: 24.1.1
hooks:
- id: black
# Autoformat: markdown, yaml, javascript (see the file .prettierignore)
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v3.0.0-alpha.9-for-vscode
rev: v4.0.0-alpha.8
hooks:
- id: prettier
# Autoformat and linting, misc. details
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.4.0
rev: v4.5.0
hooks:
- id: end-of-file-fixer
exclude: share/jupyterhub/static/js/admin-react.js
@@ -61,6 +61,6 @@ repos:
# Linting: Python code (see the file .flake8)
- repo: https://github.com/PyCQA/flake8
rev: "6.0.0"
rev: "7.0.0"
hooks:
- id: flake8

View File

@@ -6,7 +6,7 @@
#
# Option 1:
#
# FROM jupyterhub/jupyterhub:latest
# FROM quay.io/jupyterhub/jupyterhub:latest
#
# And put your configuration file jupyterhub_config.py in /srv/jupyterhub/jupyterhub_config.py.
#
@@ -14,10 +14,10 @@
#
# Or you can create your jupyterhub config and database on the host machine, and mount it with:
#
# docker run -v $PWD:/srv/jupyterhub -t jupyterhub/jupyterhub
# docker run -v $PWD:/srv/jupyterhub -t quay.io/jupyterhub/jupyterhub
#
# NOTE
# If you base on jupyterhub/jupyterhub-onbuild
# If you base on quay.io/jupyterhub/jupyterhub-onbuild
# your jupyterhub_config.py will be added automatically
# from your docker directory.

View File

@@ -14,7 +14,6 @@
[![Latest conda-forge version](https://img.shields.io/conda/vn/conda-forge/jupyterhub?logo=conda-forge)](https://anaconda.org/conda-forge/jupyterhub)
[![Documentation build status](https://img.shields.io/readthedocs/jupyterhub?logo=read-the-docs)](https://jupyterhub.readthedocs.org/en/latest/)
[![GitHub Workflow Status - Test](https://img.shields.io/github/workflow/status/jupyterhub/jupyterhub/Test?logo=github&label=tests)](https://github.com/jupyterhub/jupyterhub/actions)
[![DockerHub build status](https://img.shields.io/docker/build/jupyterhub/jupyterhub?logo=docker&label=build)](https://hub.docker.com/r/jupyterhub/jupyterhub/tags)
[![Test coverage of code](https://codecov.io/gh/jupyterhub/jupyterhub/branch/main/graph/badge.svg)](https://codecov.io/gh/jupyterhub/jupyterhub)
[![GitHub](https://img.shields.io/badge/issue_tracking-github-blue?logo=github)](https://github.com/jupyterhub/jupyterhub/issues)
[![Discourse](https://img.shields.io/badge/help_forum-discourse-blue?logo=discourse)](https://discourse.jupyter.org/c/jupyterhub)
@@ -160,10 +159,10 @@ To start the Hub on a specific url and port `10.0.1.2:443` with **https**:
## Docker
A starter [**docker image for JupyterHub**](https://hub.docker.com/r/jupyterhub/jupyterhub/)
A starter [**docker image for JupyterHub**](https://quay.io/repository/jupyterhub/jupyterhub)
gives a baseline deployment of JupyterHub using Docker.
**Important:** This `jupyterhub/jupyterhub` image contains only the Hub itself,
**Important:** This `quay.io/jupyterhub/jupyterhub` image contains only the Hub itself,
with no configuration. In general, one needs to make a derivative image, with
at least a `jupyterhub_config.py` setting up an Authenticator and/or a Spawner.
To run the single-user servers, which may be on the same system as the Hub or
@@ -171,7 +170,7 @@ not, Jupyter Notebook version 4 or greater must be installed.
The JupyterHub docker image can be started with the following command:
docker run -p 8000:8000 -d --name jupyterhub jupyterhub/jupyterhub jupyterhub
docker run -p 8000:8000 -d --name jupyterhub quay.io/jupyterhub/jupyterhub jupyterhub
This command will create a container named `jupyterhub` that you can
**stop and resume** with `docker stop/start`.

View File

@@ -3,7 +3,7 @@
# This should only be used for demo or testing and not as a base image to build on.
#
# It includes the notebook package and it uses the DummyAuthenticator and the SimpleLocalProcessSpawner.
ARG BASE_IMAGE=jupyterhub/jupyterhub-onbuild
ARG BASE_IMAGE=quay.io/jupyterhub/jupyterhub-onbuild
FROM ${BASE_IMAGE}
# Install the notebook package

View File

@@ -6,7 +6,7 @@ info:
description: The REST API for JupyterHub
license:
name: BSD-3-Clause
version: 4.0.1
version: 4.1.4
servers:
- url: /hub/api
security:
@@ -562,9 +562,10 @@ paths:
properties:
expires_in:
type: number
example: 3600
description:
lifetime (in seconds) after which the requested token
will expire.
will expire. Omit, or specify null or 0 for no expiration.
note:
type: string
description: A note attached to the token for future bookkeeping
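The `expires_in` field above is sent when requesting a new token from the JupyterHub REST API. A standard-library sketch of building that request (the hub URL, username, and API token are hypothetical placeholders; the request is constructed but not sent):

```python
import json
from urllib.request import Request

# hypothetical values: a Hub API URL and an API token with permission
# to create tokens for this user
hub_api = "http://127.0.0.1:8081/hub/api"
api_token = "abc123"

req = Request(
    f"{hub_api}/users/someuser/tokens",
    data=json.dumps({"expires_in": 3600, "note": "ci token"}).encode(),
    headers={
        "Authorization": f"token {api_token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would return the new token record as JSON
```

Per the spec change above, omitting `expires_in` (or passing `null` or `0`) requests a non-expiring token.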

View File

@@ -187,7 +187,9 @@ linkcheck_ignore = [
"https://github.com/jupyterhub/jupyterhub/compare/", # too many comparisons in changelog
r"https?://(localhost|127.0.0.1).*", # ignore localhost references in auto-links
r".*/rest-api.html#.*", # ignore javascript-resolved internal rest-api links
r"https://jupyter.chameleoncloud.org", # FIXME: ignore (presumably) short-term SSL issue
r"https://linux.die.net/.*", # linux.die.net seems to block requests from CI with 403 sometimes
# don't check links to unpublished advisories
r"https://github.com/jupyterhub/jupyterhub/security/advisories/.*",
]
linkcheck_anchors_ignore = [
"/#!",

View File

@@ -16,7 +16,8 @@ works.
JupyterHub is designed to be a _simple multi-user server for modestly sized
groups_ of **semi-trusted** users. While the design reflects serving
semi-trusted users, JupyterHub can also be suitable for serving **untrusted** users.
semi-trusted users, JupyterHub can also be suitable for serving **untrusted** users,
but **is not suitable for untrusted users** in its default configuration.
As a result, using JupyterHub with **untrusted** users means more work by the
administrator, since much care is required to secure a Hub, with extra caution on
@@ -52,33 +53,67 @@ ensure that:
their single-user server;
- the modification of the configuration of the notebook server
(the `~/.jupyter` or `JUPYTER_CONFIG_DIR` directory).
- unrestricted selection of the base environment (e.g. the image used in container-based Spawners)
If any additional services are run on the same domain as the Hub, the services
**must never** display user-authored HTML that is neither _sanitized_ nor _sandboxed_
(e.g. IFramed) to any user that lacks authentication as the author of a file.
to any user that lacks authentication as the author of a file.
### Sharing access to servers
Because sharing access to servers (via `access:servers` scopes or the sharing feature in JupyterHub 5) by definition means users can serve each other files, enabling sharing is not suitable for untrusted users without also enabling per-user domains.
JupyterHub does not enable any sharing by default.
## Mitigate security issues
The several approaches to mitigating security issues with configuration
options provided by JupyterHub include:
### Enable subdomains
### Enable user subdomains
JupyterHub provides the ability to run single-user servers on their own
subdomains. This means the cross-origin protections between servers has the
desired effect, and user servers and the Hub are protected from each other. A
user's single-user server will be at `username.jupyter.mydomain.com`. This also
requires all user subdomains to point to the same address, which is most easily
accomplished with wildcard DNS. Since this spreads the service across multiple
domains, you will need wildcard SSL as well. Unfortunately, for many
institutional domains, wildcard DNS and SSL are not available. **If you do plan
to serve untrusted users, enabling subdomains is highly encouraged**, as it
resolves the cross-site issues.
domains. This means the cross-origin protections between servers have the
desired effect, and user servers and the Hub are protected from each other.
**Subdomains are the only way to reliably isolate user servers from each other.**
To enable subdomains, set:
```python
c.JupyterHub.subdomain_host = "https://jupyter.example.org"
```
When subdomains are enabled, each user's single-user server will be at e.g. `https://username.jupyter.example.org`.
This also requires all user subdomains to point to the same address,
which is most easily accomplished with wildcard DNS, where a single A record points to your server and a wildcard CNAME record points to your A record:
```
A jupyter.example.org 192.168.1.123
CNAME *.jupyter.example.org jupyter.example.org
```
Since this spreads the service across multiple domains, you will likely need wildcard SSL as well,
matching `*.jupyter.example.org`.
Unfortunately, for many institutional domains, wildcard DNS and SSL may not be available.
We also **strongly encourage** serving JupyterHub and user content on a domain that is _not_ a subdomain of any sensitive content.
For reasoning, see [GitHub's discussion of moving user content to github.io from \*.github.com](https://github.blog/2013-04-09-yummy-cookies-across-domains/).
**If you do plan to serve untrusted users, enabling subdomains is highly encouraged**,
as it resolves many security issues that are otherwise difficult or impossible to avoid when JupyterHub is on a single domain.
:::{important}
JupyterHub makes no guarantees about protecting users from each other unless subdomains are enabled.
If you want to protect users from each other, you **_must_** enable per-user domains.
:::
### Disable user config
If subdomains are unavailable or undesirable, JupyterHub provides a
configuration option `Spawner.disable_user_config`, which can be set to prevent
configuration option `Spawner.disable_user_config = True`, which can be set to prevent
the user-owned configuration files from being loaded. After implementing this
option, `PATH`s and package installation are the other things that the
admin must enforce.
@@ -88,21 +123,24 @@ admin must enforce.
For most Spawners, `PATH` is not something users can influence, but it's important that
the Spawner should _not_ evaluate shell configuration files prior to launching the server.
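The option described above goes in `jupyterhub_config.py`; a minimal sketch:

```python
# jupyterhub_config.py
# Prevent single-user servers from loading user-owned configuration
# files (~/.jupyter or JUPYTER_CONFIG_DIR), as described above.
c = get_config()  # noqa: F821 -- provided by JupyterHub at load time
c.Spawner.disable_user_config = True
```

Remember that this only covers config files; `PATH` and package installation still need to be locked down separately.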
### Isolate packages using virtualenv
### Isolate packages in a read-only environment
Package isolation is most easily handled by running the single-user server in
a virtualenv with disabled system-site-packages. The user should not have
permission to install packages into this environment.
The user must not have permission to install packages into the environment where the single-user server runs.
On a shared system, package isolation is most easily handled by running the single-user server in
a root-owned virtualenv with disabled system-site-packages.
The user must not have permission to install packages into this environment.
The same principle extends to the images used by container-based deployments.
If users can select the images in which their servers run, they can disable all security for their own servers.
It is important to note that the control over the environment only affects the
single-user server, and not the environment(s) in which the user's kernel(s)
It is important to note that the control over the environment is only required for the
single-user server, and not the environment(s) in which the users' kernel(s)
may run. Installing additional packages in the kernel environment does not
pose additional risk to the web application's security.
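An administrator can sanity-check the isolation described above from inside the single-user environment. This is an illustrative check, not part of JupyterHub:

```python
import os
import sysconfig

def environment_is_user_writable():
    """True if the current user could install packages into this
    Python environment's site-packages directory."""
    site_packages = sysconfig.get_paths()["purelib"]
    return os.access(site_packages, os.W_OK)

# In a properly isolated deployment (e.g. a root-owned virtualenv with
# system-site-packages disabled), this should report False when run as
# an ordinary user.
print(environment_is_user_writable())
```

Note this only checks the server's environment; per the paragraph above, kernel environments do not need the same restriction.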
### Encrypt internal connections with SSL/TLS
By default, all communications on the server, between the proxy, hub, and single
-user notebooks are performed unencrypted. Setting the `internal_ssl` flag in
By default, all communications within JupyterHub—between the proxy, hub, and single
-user notebooks—are performed unencrypted. Setting the `internal_ssl` flag in
`jupyterhub_config.py` secures the aforementioned routes. Turning this
feature on does require that the enabled `Spawner` can use the certificates
generated by the `Hub` (the default `LocalProcessSpawner` can, for instance).
@@ -116,6 +154,104 @@ Unix permissions to the communication sockets thereby restricting
communication to the socket owner. The `internal_ssl` option will eventually
extend to securing the `tcp` sockets as well.
### Mitigating same-origin deployments
While per-user domains are **required** for robust protection of users from each other,
you can mitigate many (but not all) cross-user issues.
First, it is critical that users cannot modify their server environments, as described above.
Second, it is important that users do not have `access:servers` permission to any server other than their own.
If users can access each others' servers, additional security measures must be enabled, some of which come with distinct user-experience costs.
Without the [Same-Origin Policy] (SOP) protecting user servers from each other,
each user server is considered a trusted origin for requests to each other user server (and the Hub itself).
Servers _cannot_ meaningfully distinguish requests originating from other user servers,
because the SOP implies a great deal of trust, removing many of the restrictions applied to cross-origin requests.
That means pages served from each user server can:
1. arbitrarily modify the path in the Referer
2. make fully authorized requests with cookies
3. access full page contents served from the hub or other servers via popups
JupyterHub uses distinct xsrf tokens stored in cookies on each server path to attempt to limit requests across user servers.
This has limitations because not all requests are protected by these XSRF tokens,
and unless additional measures are taken, the XSRF tokens from other user prefixes may be retrieved.
[Same-Origin Policy]: https://developer.mozilla.org/en-US/docs/Web/Security/Same-origin_policy
For example:
- The `Content-Security-Policy` header must prohibit popups and iframes from the same origin.
  The following Content-Security-Policy rules are _insecure_ and readily enable users to access each other's servers:
- `frame-ancestors: 'self'`
- `frame-ancestors: '*'`
- `sandbox allow-popups`
- Ideally, pages should use the strictest `Content-Security-Policy: sandbox` available,
but this is not feasible in general for JupyterLab pages, which need at least `sandbox allow-same-origin allow-scripts` to work.
The default Content-Security-Policy for single-user servers is
```
frame-ancestors: 'none'
```
which prohibits iframe embedding, but not pop-ups.
A more secure Content-Security-Policy that has some costs to user experience is:
```
frame-ancestors: 'none'; sandbox allow-same-origin allow-scripts
```
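One way to apply this stricter policy to Jupyter Server-based single-user servers is via `tornado_settings` in the single-user configuration. This is a sketch, assuming the server's default handlers honor the `headers` setting:

```python
# jupyter_server_config.py (in the single-user environment)
# Stricter Content-Security-Policy, at some cost to user experience:
c.ServerApp.tornado_settings = {
    "headers": {
        "Content-Security-Policy": "frame-ancestors 'none'; sandbox allow-same-origin allow-scripts"
    }
}
```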
`allow-popups` is not disabled by default because disabling it breaks legitimate functionality, like "Open this in a new tab", and the "JupyterHub Control Panel" menu item.
To reiterate, the right way to avoid these issues is to enable per-user domains, where none of these concerns come up.
Note: even this level of protection requires that administrators maintain full control over the user server environment.
If users can modify their server environment, these methods are ineffective, as users can readily disable them.
### Cookie tossing
Cookie tossing is a technique where another server on a subdomain or peer subdomain can set a cookie
which will be read on another domain.
This is not relevant unless there are other user-controlled servers on a peer domain.
"Domain-locked" cookies avoid this issue, but have their own restrictions:
- JupyterHub must be served over HTTPS
- All secure cookies must be set on `/`, not on sub-paths, which means they are shared by all JupyterHub components in a single-domain deployment.
As a result, this option is only recommended when per-user subdomains are enabled,
to prevent sending all JupyterHub cookies to all user servers.
To enable domain-locked cookies, set:
```python
c.JupyterHub.cookie_host_prefix_enabled = True
```
```{versionadded} 4.1
```
### Forced-login
Links to Jupyter servers can include a `?token=...` query parameter.
JupyterHub prior to 5.0 accepts such a request and persists the token for future requests.
This is useful for enabling admins to create 'fully authenticated' links that bypass login.
However, it also means users can share their own links that will log other users into their own servers,
enabling them to serve each other notebooks and other arbitrary HTML, depending on server configuration.
```{versionadded} 4.1
Setting environment variable `JUPYTERHUB_ALLOW_TOKEN_IN_URL=0` in the single-user environment can opt out of accepting token auth in URL parameters.
```
```{versionadded} 5.0
Accepting tokens in URLs is disabled by default, and `JUPYTERHUB_ALLOW_TOKEN_IN_URL=1` environment variable must be set to _allow_ token auth in URL parameters.
```
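One way to set this environment variable for all spawned servers from the Hub side is through `Spawner.environment` in `jupyterhub_config.py`; this is a sketch, and any other mechanism that reaches the single-user environment works equally well:

```python
# jupyterhub_config.py
# Opt out of token-in-URL authentication for all spawned servers (JupyterHub 4.1):
c.Spawner.environment = {
    "JUPYTERHUB_ALLOW_TOKEN_IN_URL": "0",
}
```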
## Security audits
We recommend that you do periodic reviews of your deployment's security. It's


@@ -46,13 +46,13 @@ things like inspect other users' servers or modify the user list at runtime).
### JupyterHub Docker container is not accessible at localhost
Even though the command to start your Docker container exposes port 8000
-(`docker run -p 8000:8000 -d --name jupyterhub jupyterhub/jupyterhub jupyterhub`),
+(`docker run -p 8000:8000 -d --name jupyterhub quay.io/jupyterhub/jupyterhub jupyterhub`),
it is possible that the IP address itself is not accessible/visible. As a result,
when you try http://localhost:8000 in your browser, you are unable to connect
even though the container is running properly. One workaround is to explicitly
tell JupyterHub to start at `0.0.0.0` which is visible to everyone. Try this
command:
-`docker run -p 8000:8000 -d --name jupyterhub jupyterhub/jupyterhub jupyterhub --ip 0.0.0.0 --port 8000`
+`docker run -p 8000:8000 -d --name jupyterhub quay.io/jupyterhub/jupyterhub jupyterhub --ip 0.0.0.0 --port 8000`
### How can I kill ports from JupyterHub-managed services that have been orphaned?
@@ -347,12 +347,12 @@ In order to resolve this issue, there are two potential options.
### Where do I find Docker images and Dockerfiles related to JupyterHub?
-Docker images can be found at the [JupyterHub organization on DockerHub](https://hub.docker.com/u/jupyterhub/).
-The Docker image [jupyterhub/singleuser](https://hub.docker.com/r/jupyterhub/singleuser/)
+Docker images can be found at the [JupyterHub organization on Quay.io](https://quay.io/organization/jupyterhub).
+The Docker image [jupyterhub/singleuser](https://quay.io/repository/jupyterhub/singleuser)
provides an example single-user notebook server for use with DockerSpawner.
Additional single-user notebook server images can be found at the [Jupyter
-organization on DockerHub](https://hub.docker.com/r/jupyter/) and information
+organization on Quay.io](https://quay.io/organization/jupyter) and information
about each image at the [jupyter/docker-stacks repo](https://github.com/jupyter/docker-stacks).
### How can I view the logs for JupyterHub or the user's Notebook servers when using the DockerSpawner?


@@ -45,7 +45,7 @@ additional packages.
## Configuring Jupyter and IPython
-[Jupyter](https://jupyter-notebook.readthedocs.io/en/stable/config_overview.html)
+[Jupyter](https://jupyter-notebook.readthedocs.io/en/stable/configuring/config_overview.html)
and [IPython](https://ipython.readthedocs.io/en/stable/development/config.html)
have their own configuration systems.
@@ -212,13 +212,31 @@ By default, the single-user server launches JupyterLab,
which is based on [Jupyter Server][].
This is the default server when running JupyterHub ≥ 2.0.
-To switch to using the legacy Jupyter Notebook server, you can set the `JUPYTERHUB_SINGLEUSER_APP` environment variable
+To switch to using the legacy Jupyter Notebook server (notebook < 7.0), you can set the `JUPYTERHUB_SINGLEUSER_APP` environment variable
(in the single-user environment) to:
```bash
export JUPYTERHUB_SINGLEUSER_APP='notebook.notebookapp.NotebookApp'
```
:::{note}
```
JUPYTERHUB_SINGLEUSER_APP='notebook.notebookapp.NotebookApp'
```
is only valid for notebook < 7. notebook v7 is based on jupyter-server,
and the default jupyter-server application must be used.
Selecting the new notebook UI is no longer a matter of selecting the server app to launch,
but only the default URL for users to visit.
To use notebook v7 with JupyterHub, leave the default singleuser app config alone (or specify `JUPYTERHUB_SINGLEUSER_APP=jupyter-server`) and set the default _URL_ for user servers:
```python
c.Spawner.default_url = '/tree/'
```
:::
[jupyter server]: https://jupyter-server.readthedocs.io
[jupyter notebook]: https://jupyter-notebook.readthedocs.io


@@ -70,7 +70,7 @@ need to configure the options there.
## Docker image
You can use [jupyterhub configurable-http-proxy docker
-image](https://hub.docker.com/r/jupyterhub/configurable-http-proxy/)
+image](https://quay.io/repository/jupyterhub/configurable-http-proxy)
to run the proxy.
## See also


@@ -13,6 +13,7 @@ The files are:
This file is JupyterHub's REST API schema. Both a version and the RBAC
scopes descriptions are updated in it.
"""
import os
from collections import defaultdict
from pathlib import Path


@@ -30,7 +30,6 @@ popular services:
- Globus
- Google
- MediaWiki
-- Okpy
- OpenShift
A [generic implementation](https://github.com/jupyterhub/oauthenticator/blob/master/oauthenticator/generic.py), which you can use for OAuth authentication with any provider, is also available.


@@ -8,8 +8,160 @@ command line for details.
## [Unreleased]
## 4.1
### 4.1.4 - 2024-03-30
([full changelog](https://github.com/jupyterhub/jupyterhub/compare/4.1.3...4.1.4))
#### Bugs fixed
- avoid xsrf check on navigate GET requests [#4759](https://github.com/jupyterhub/jupyterhub/pull/4759) ([@minrk](https://github.com/minrk), [@consideRatio](https://github.com/consideRatio))
#### Contributors to this release
The following people contributed discussions, new ideas, code and documentation contributions, and review.
See [our definition of contributors](https://github-activity.readthedocs.io/en/latest/#how-does-this-tool-define-contributions-in-the-reports).
([GitHub contributors page for this release](https://github.com/jupyterhub/jupyterhub/graphs/contributors?from=2024-03-26&to=2024-03-30&type=c))
@consideRatio ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AconsideRatio+updated%3A2024-03-26..2024-03-30&type=Issues)) | @minrk ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aminrk+updated%3A2024-03-26..2024-03-30&type=Issues))
### 4.1.3 - 2024-03-26
([full changelog](https://github.com/jupyterhub/jupyterhub/compare/4.1.2...4.1.3))
#### Bugs fixed
- respect jupyter-server disable_check_xsrf setting [#4753](https://github.com/jupyterhub/jupyterhub/pull/4753) ([@minrk](https://github.com/minrk), [@consideRatio](https://github.com/consideRatio))
#### Contributors to this release
The following people contributed discussions, new ideas, code and documentation contributions, and review.
See [our definition of contributors](https://github-activity.readthedocs.io/en/latest/#how-does-this-tool-define-contributions-in-the-reports).
([GitHub contributors page for this release](https://github.com/jupyterhub/jupyterhub/graphs/contributors?from=2024-03-25&to=2024-03-26&type=c))
@consideRatio ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AconsideRatio+updated%3A2024-03-25..2024-03-26&type=Issues)) | @minrk ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aminrk+updated%3A2024-03-25..2024-03-26&type=Issues))
### 4.1.2 - 2024-03-25
4.1.2 fixes a regression in 4.1.0 affecting named servers.
([full changelog](https://github.com/jupyterhub/jupyterhub/compare/4.1.1...4.1.2))
#### Bugs fixed
- rework handling of multiple xsrf tokens [#4750](https://github.com/jupyterhub/jupyterhub/pull/4750) ([@minrk](https://github.com/minrk), [@consideRatio](https://github.com/consideRatio))
#### Contributors to this release
The following people contributed discussions, new ideas, code and documentation contributions, and review.
See [our definition of contributors](https://github-activity.readthedocs.io/en/latest/#how-does-this-tool-define-contributions-in-the-reports).
([GitHub contributors page for this release](https://github.com/jupyterhub/jupyterhub/graphs/contributors?from=2024-03-23&to=2024-03-25&type=c))
@consideRatio ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AconsideRatio+updated%3A2024-03-23..2024-03-25&type=Issues)) | @minrk ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aminrk+updated%3A2024-03-23..2024-03-25&type=Issues))
### 4.1.1 - 2024-03-23
4.1.1 fixes a compatibility regression in 4.1.0 for some extensions,
particularly jupyter-server-proxy.
([full changelog](https://github.com/jupyterhub/jupyterhub/compare/4.1.0...4.1.1))
#### Bugs fixed
- allow subclasses to override xsrf check [#4745](https://github.com/jupyterhub/jupyterhub/pull/4745) ([@minrk](https://github.com/minrk), [@consideRatio](https://github.com/consideRatio))
#### Contributors to this release
The following people contributed discussions, new ideas, code and documentation contributions, and review.
See [our definition of contributors](https://github-activity.readthedocs.io/en/latest/#how-does-this-tool-define-contributions-in-the-reports).
([GitHub contributors page for this release](https://github.com/jupyterhub/jupyterhub/graphs/contributors?from=2024-03-20&to=2024-03-23&type=c))
@consideRatio ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AconsideRatio+updated%3A2024-03-20..2024-03-23&type=Issues)) | @minrk ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aminrk+updated%3A2024-03-20..2024-03-23&type=Issues))
### 4.1.0 - 2024-03-20
JupyterHub 4.1 is a security release, fixing [CVE-2024-28233].
All JupyterHub deployments are encouraged to upgrade,
especially those with other user content hosted on peer domains of JupyterHub.
As always, JupyterHub deployments are especially encouraged to enable per-user domains if protecting users from each other is a concern.
For more information on securely deploying JupyterHub, see the [web security documentation](web-security).
[CVE-2024-28233]: https://github.com/jupyterhub/jupyterhub/security/advisories/GHSA-7r3h-4ph8-w38g
([full changelog](https://github.com/jupyterhub/jupyterhub/compare/4.0.2...4.1.0))
#### Enhancements made
- Backport PR #4628 on branch 4.x (Include LDAP groups in local spawner gids) [#4735](https://github.com/jupyterhub/jupyterhub/pull/4735) ([@minrk](https://github.com/minrk))
- Backport PR #4561 on branch 4.x (Improve debugging when waiting for servers) [#4714](https://github.com/jupyterhub/jupyterhub/pull/4714) ([@minrk](https://github.com/minrk))
- Backport PR #4563 on branch 4.x (only set 'domain' field on session-id cookie) [#4707](https://github.com/jupyterhub/jupyterhub/pull/4707) ([@minrk](https://github.com/minrk))
#### Bugs fixed
- Backport PR #4733 on branch 4.x (Catch ValueError while waiting for server to be reachable) [#4734](https://github.com/jupyterhub/jupyterhub/pull/4734) ([@minrk](https://github.com/minrk))
- Backport PR #4679 on branch 4.x (Unescape jinja username) [#4705](https://github.com/jupyterhub/jupyterhub/pull/4705) ([@minrk](https://github.com/minrk))
- Backport PR #4630: avoid setting unused oauth state cookies on API requests [#4697](https://github.com/jupyterhub/jupyterhub/pull/4697) ([@minrk](https://github.com/minrk))
- Backport PR #4632: simplify, avoid errors in parsing accept headers [#4696](https://github.com/jupyterhub/jupyterhub/pull/4696) ([@minrk](https://github.com/minrk))
- Backport PR #4677 on branch 4.x (Improve validation, docs for token.expires_in) [#4692](https://github.com/jupyterhub/jupyterhub/pull/4692) ([@minrk](https://github.com/minrk))
- Backport PR #4570 on branch 4.x (fix mutation of frozenset in scope intersection) [#4691](https://github.com/jupyterhub/jupyterhub/pull/4691) ([@minrk](https://github.com/minrk))
- Backport PR #4562 on branch 4.x (Use `user.stop` to cleanup spawners that stopped while Hub was down) [#4690](https://github.com/jupyterhub/jupyterhub/pull/4690) ([@minrk](https://github.com/minrk))
- Backport PR #4542 on branch 4.x (Fix include_stopped_servers in paginated next_url) [#4689](https://github.com/jupyterhub/jupyterhub/pull/4689) ([@minrk](https://github.com/minrk))
- Backport PR #4651 on branch 4.x (avoid attempting to patch removed IPythonHandler with notebook v7) [#4688](https://github.com/jupyterhub/jupyterhub/pull/4688) ([@minrk](https://github.com/minrk))
- Backport PR #4560 on branch 4.x (singleuser extension: persist token from ?token=... url in cookie) [#4687](https://github.com/jupyterhub/jupyterhub/pull/4687) ([@minrk](https://github.com/minrk))
#### Maintenance and upkeep improvements
- Backport quay.io publishing [#4698](https://github.com/jupyterhub/jupyterhub/pull/4698) ([@minrk](https://github.com/minrk))
- Backport PR #4617: try to improve reliability of test_external_proxy [#4695](https://github.com/jupyterhub/jupyterhub/pull/4695) ([@minrk](https://github.com/minrk))
- Backport PR #4618 on branch 4.x (browser test: wait for token request to finish before reloading) [#4694](https://github.com/jupyterhub/jupyterhub/pull/4694) ([@minrk](https://github.com/minrk))
- preparing 4.x branch [#4685](https://github.com/jupyterhub/jupyterhub/pull/4685) ([@minrk](https://github.com/minrk), [@consideRatio](https://github.com/consideRatio))
#### Contributors to this release
The following people contributed discussions, new ideas, code and documentation contributions, and review.
See [our definition of contributors](https://github-activity.readthedocs.io/en/latest/#how-does-this-tool-define-contributions-in-the-reports).
([GitHub contributors page for this release](https://github.com/jupyterhub/jupyterhub/graphs/contributors?from=2023-08-10&to=2024-03-19&type=c))
@Achele ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AAchele+updated%3A2023-08-10..2024-03-19&type=Issues)) | @akashthedeveloper ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aakashthedeveloper+updated%3A2023-08-10..2024-03-19&type=Issues)) | @balajialg ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Abalajialg+updated%3A2023-08-10..2024-03-19&type=Issues)) | @BhavyaT-135 ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3ABhavyaT-135+updated%3A2023-08-10..2024-03-19&type=Issues)) | @blink1073 ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ablink1073+updated%3A2023-08-10..2024-03-19&type=Issues)) | @consideRatio ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AconsideRatio+updated%3A2023-08-10..2024-03-19&type=Issues)) | @fcollonval ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Afcollonval+updated%3A2023-08-10..2024-03-19&type=Issues)) | @I-Am-D-B ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AI-Am-D-B+updated%3A2023-08-10..2024-03-19&type=Issues)) | @jakirkham ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ajakirkham+updated%3A2023-08-10..2024-03-19&type=Issues)) | @ktaletsk ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aktaletsk+updated%3A2023-08-10..2024-03-19&type=Issues)) | @kzgrzendek ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Akzgrzendek+updated%3A2023-08-10..2024-03-19&type=Issues)) | @lumberbot-app ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Alumberbot-app+updated%3A2023-08-10..2024-03-19&type=Issues)) | @manics ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amanics+updated%3A2023-08-10..2024-03-19&type=Issues)) | 
@mbiette ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ambiette+updated%3A2023-08-10..2024-03-19&type=Issues)) | @minrk ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aminrk+updated%3A2023-08-10..2024-03-19&type=Issues)) | @rcthomas ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Arcthomas+updated%3A2023-08-10..2024-03-19&type=Issues)) | @ryanlovett ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aryanlovett+updated%3A2023-08-10..2024-03-19&type=Issues)) | @sgaist ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Asgaist+updated%3A2023-08-10..2024-03-19&type=Issues)) | @shubham0473 ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ashubham0473+updated%3A2023-08-10..2024-03-19&type=Issues)) | @Temidayo32 ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3ATemidayo32+updated%3A2023-08-10..2024-03-19&type=Issues)) | @willingc ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Awillingc+updated%3A2023-08-10..2024-03-19&type=Issues)) | @yuvipanda ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ayuvipanda+updated%3A2023-08-10..2024-03-19&type=Issues))
## 4.0
### 4.0.2 - 2023-08-10
([full changelog](https://github.com/jupyterhub/jupyterhub/compare/4.0.1...4.0.2))
#### Enhancements made
- avoid counting failed requests to not-running servers as 'activity' [#4491](https://github.com/jupyterhub/jupyterhub/pull/4491) ([@minrk](https://github.com/minrk), [@consideRatio](https://github.com/consideRatio))
- improve permission-denied errors for various cases [#4489](https://github.com/jupyterhub/jupyterhub/pull/4489) ([@minrk](https://github.com/minrk), [@consideRatio](https://github.com/consideRatio))
#### Bugs fixed
- set root_dir when using singleuser extension [#4503](https://github.com/jupyterhub/jupyterhub/pull/4503) ([@minrk](https://github.com/minrk), [@consideRatio](https://github.com/consideRatio), [@manics](https://github.com/manics))
- Allow setting custom log_function in tornado_settings in SingleUserServer [#4475](https://github.com/jupyterhub/jupyterhub/pull/4475) ([@grios-stratio](https://github.com/grios-stratio), [@minrk](https://github.com/minrk))
#### Documentation improvements
- doc: update notebook config URL [#4523](https://github.com/jupyterhub/jupyterhub/pull/4523) ([@minrk](https://github.com/minrk), [@manics](https://github.com/manics))
- document how to use notebook v7 with jupyterhub [#4522](https://github.com/jupyterhub/jupyterhub/pull/4522) ([@minrk](https://github.com/minrk), [@manics](https://github.com/manics))
#### Contributors to this release
The following people contributed discussions, new ideas, code and documentation contributions, and review.
See [our definition of contributors](https://github-activity.readthedocs.io/en/latest/#how-does-this-tool-define-contributions-in-the-reports).
([GitHub contributors page for this release](https://github.com/jupyterhub/jupyterhub/graphs/contributors?from=2023-06-08&to=2023-08-10&type=c))
@agelosnm ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aagelosnm+updated%3A2023-06-08..2023-08-10&type=Issues)) | @consideRatio ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AconsideRatio+updated%3A2023-06-08..2023-08-10&type=Issues)) | @diocas ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Adiocas+updated%3A2023-06-08..2023-08-10&type=Issues)) | @grios-stratio ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Agrios-stratio+updated%3A2023-06-08..2023-08-10&type=Issues)) | @jhgoebbert ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ajhgoebbert+updated%3A2023-06-08..2023-08-10&type=Issues)) | @jtpio ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ajtpio+updated%3A2023-06-08..2023-08-10&type=Issues)) | @kosmonavtus ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Akosmonavtus+updated%3A2023-06-08..2023-08-10&type=Issues)) | @kreuzert ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Akreuzert+updated%3A2023-06-08..2023-08-10&type=Issues)) | @manics ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Amanics+updated%3A2023-06-08..2023-08-10&type=Issues)) | @martinRenou ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3AmartinRenou+updated%3A2023-06-08..2023-08-10&type=Issues)) | @minrk ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aminrk+updated%3A2023-06-08..2023-08-10&type=Issues)) | @opoplawski ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Aopoplawski+updated%3A2023-06-08..2023-08-10&type=Issues)) | @Ph0tonic ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3APh0tonic+updated%3A2023-06-08..2023-08-10&type=Issues)) | @sgaist 
([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Asgaist+updated%3A2023-06-08..2023-08-10&type=Issues)) | @trungleduc ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Atrungleduc+updated%3A2023-06-08..2023-08-10&type=Issues)) | @yuvipanda ([activity](https://github.com/search?q=repo%3Ajupyterhub%2Fjupyterhub+involves%3Ayuvipanda+updated%3A2023-06-08..2023-08-10&type=Issues))
### 4.0.1 - 2023-06-08
([full changelog](https://github.com/jupyterhub/jupyterhub/compare/4.0.0...4.0.1))

View File

@@ -16,8 +16,6 @@ Please submit pull requests to update information or to add new institutions or
- [BIDS - Berkeley Institute for Data Science](https://bids.berkeley.edu/)
- [Teaching with Jupyter notebooks and JupyterHub](https://bids.berkeley.edu/resources/videos/teaching-ipythonjupyter-notebooks-and-jupyterhub)
-- [Data 8](http://data8.org/)
-- [GitHub organization](https://github.com/data-8)


@@ -112,7 +112,6 @@ popular services:
- [Globus](https://oauthenticator.readthedocs.io/en/latest/reference/api/gen/oauthenticator.globus.html)
- [Google](https://oauthenticator.readthedocs.io/en/latest/reference/api/gen/oauthenticator.google.html)
- [MediaWiki](https://oauthenticator.readthedocs.io/en/latest/reference/api/gen/oauthenticator.mediawiki.html)
-- [Okpy](https://oauthenticator.readthedocs.io/en/latest/reference/api/gen/oauthenticator.okpy.html)
- [OpenShift](https://oauthenticator.readthedocs.io/en/latest/reference/api/gen/oauthenticator.openshift.html)
A [generic implementation](https://oauthenticator.readthedocs.io/en/latest/reference/api/gen/oauthenticator.generic.html), which you can use for OAuth authentication


@@ -1,9 +1,9 @@
# Install JupyterHub with Docker
-The JupyterHub [docker image](https://hub.docker.com/r/jupyterhub/jupyterhub/) is the fastest way to set up Jupyterhub in your local development environment.
+The JupyterHub [docker image](https://quay.io/repository/jupyterhub/jupyterhub) is the fastest way to set up Jupyterhub in your local development environment.
:::{note}
-This `jupyterhub/jupyterhub` docker image is only an image for running
+This `quay.io/jupyterhub/jupyterhub` docker image is only an image for running
the Hub service itself. It does not provide the other Jupyter components,
such as Notebook installation, which are needed by the single-user servers.
To run the single-user servers, which may be on the same system as the Hub or
@@ -24,7 +24,7 @@ You should have [Docker] installed on a Linux/Unix based system.
To pull the latest JupyterHub image and start the `jupyterhub` container, run this command in your terminal.
```
-docker run -d -p 8000:8000 --name jupyterhub jupyterhub/jupyterhub jupyterhub
+docker run -d -p 8000:8000 --name jupyterhub quay.io/jupyterhub/jupyterhub jupyterhub
```
This command exposes the Jupyter container on port 8000. Navigate to `http://localhost:8000` in a web browser to access the JupyterHub console.


@@ -2,6 +2,7 @@
Example for a Spawner.pre_spawn_hook
create a directory for the user before the spawner starts
"""
# pylint: disable=import-error
import os
import shutil


@@ -3,6 +3,7 @@
Implements OAuth handshake manually
so all URLs and requests necessary for OAuth with JupyterHub should be in one place
"""
import json
import os
from urllib.parse import urlencode, urlparse


@@ -1,9 +1,9 @@
-FROM jupyterhub/jupyterhub
+FROM quay.io/jupyterhub/jupyterhub
# Create test user (PAM auth) and install single-user Jupyter
RUN useradd testuser --create-home --shell /bin/bash
RUN echo 'testuser:passwd' | chpasswd
RUN pip install jupyter
COPY app ./app
COPY jupyterhub_config.py .


@@ -4,6 +4,7 @@ This example service serves `/services/whoami-oauth/`,
authenticated with the Hub,
showing the user their own info.
"""
import json
import os
from urllib.parse import urlparse


@@ -4,6 +4,7 @@ This serves `/services/whoami-api/`, authenticated with the Hub, showing the use
HubAuthenticated only supports token-based access.
"""
import json
import os
from urllib.parse import urlparse


@@ -1,6 +1,7 @@
"""
Example JupyterHub config allowing users to specify environment variables and notebook-server args
"""
import shlex
from jupyterhub.spawner import LocalProcessSpawner


@@ -3,6 +3,7 @@
Note: a memoized function should always return an _immutable_
result to avoid later modifications polluting cached results.
"""
from collections import OrderedDict
from functools import wraps


@@ -1,8 +1,9 @@
"""JupyterHub version info"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
# version_info updated by running `tbump`
-version_info = (4, 0, 1, "", "")
+version_info = (4, 1, 4, "", "")
# pep 440 version: no dot before beta/rc, but before .dev
# 0.1.0rc1

jupyterhub/_xsrf_utils.py Normal file

@@ -0,0 +1,247 @@
"""utilities for XSRF
Extends tornado's xsrf token checks with the following:
- only set xsrf cookie on navigation requests (cannot be fetched)
This utility file enables the consistent reuse of these functions
in both Hub and single-user code
"""
import base64
import hashlib
from http.cookies import SimpleCookie
from tornado import web
from tornado.log import app_log
def _get_signed_value_urlsafe(handler, name, b64_value):
"""Like get_signed_value (used in get_secure_cookie), but for urlsafe values
Decodes urlsafe_base64-encoded signed values
Returns None if any decoding failed
"""
if b64_value is None:
return None
if isinstance(b64_value, str):
try:
b64_value = b64_value.encode("ascii")
except UnicodeEncodeError:
app_log.warning("Invalid value %r", b64_value)
return None
# re-pad, since we stripped padding in _create_signed_value
remainder = len(b64_value) % 4
if remainder:
b64_value = b64_value + (b'=' * (4 - remainder))
try:
value = base64.urlsafe_b64decode(b64_value)
except ValueError:
app_log.warning("Invalid base64 value %r", b64_value)
return None
return web.decode_signed_value(
handler.application.settings["cookie_secret"],
name,
value,
max_age_days=31,
min_version=2,
)
def _create_signed_value_urlsafe(handler, name, value):
"""Like tornado's create_signed_value (used in set_secure_cookie), but returns urlsafe bytes"""
signed_value = handler.create_signed_value(name, value)
return base64.urlsafe_b64encode(signed_value).rstrip(b"=")
def _get_xsrf_token_cookie(handler):
"""
Get the _valid_ XSRF token and id from Cookie
Returns (xsrf_token, xsrf_id) found in Cookies header.
multiple xsrf cookies may be set on multiple paths;
RFC 6265 states that they should be in order of more specific path to less,
but ALSO states that servers should never rely on order.
Tornado (6.4) and stdlib (3.12) SimpleCookie explicitly use the _last_ value,
which means the cookie with the _least_ specific prefix will be used if more than one is present.
Because we sign values, we can get the first valid cookie and not worry about order too much.
This is simplified from tornado's HTTPRequest.cookies property
only looking for a single cookie.
"""
if "Cookie" not in handler.request.headers:
return (None, None)
for chunk in handler.request.headers["Cookie"].split(";"):
key = chunk.partition("=")[0].strip()
if key != "_xsrf":
# we are only looking for the _xsrf cookie
# ignore everything else
continue
# use stdlib parsing to handle quotes, validation, etc.
try:
xsrf_token = SimpleCookie(chunk)[key].value.encode("ascii")
except (ValueError, KeyError):
continue
xsrf_token_id = _get_signed_value_urlsafe(handler, "_xsrf", xsrf_token)
if xsrf_token_id:
# only return if we found a _valid_ xsrf cookie
# otherwise, keep looking
return (xsrf_token, xsrf_token_id)
# no valid token found
return (None, None)
def _set_xsrf_cookie(handler, xsrf_id, *, cookie_path="", authenticated=None):
"""Set xsrf token cookie"""
xsrf_token = _create_signed_value_urlsafe(handler, "_xsrf", xsrf_id)
xsrf_cookie_kwargs = {}
xsrf_cookie_kwargs.update(handler.settings.get('xsrf_cookie_kwargs', {}))
xsrf_cookie_kwargs.setdefault("path", cookie_path)
if authenticated is None:
try:
current_user = handler.current_user
except Exception:
authenticated = False
else:
authenticated = bool(current_user)
if not authenticated:
# limit anonymous xsrf cookies to one hour
xsrf_cookie_kwargs.pop("expires", None)
xsrf_cookie_kwargs.pop("expires_days", None)
xsrf_cookie_kwargs["max_age"] = 3600
app_log.info(
"Setting new xsrf cookie for %r %r",
xsrf_id,
xsrf_cookie_kwargs,
)
handler.set_cookie("_xsrf", xsrf_token, **xsrf_cookie_kwargs)
def get_xsrf_token(handler, cookie_path=""):
"""Override tornado's xsrf token to add further restrictions
- only set cookie for regular pages (not API requests)
- include login info in xsrf token
- verify signature
"""
# original: https://github.com/tornadoweb/tornado/blob/v6.4.0/tornado/web.py#L1455
if hasattr(handler, "_xsrf_token"):
return handler._xsrf_token
_set_cookie = False
# the raw cookie is the token
xsrf_token, xsrf_id_cookie = _get_xsrf_token_cookie(handler)
cookie_token = xsrf_token
# check the decoded, signed value for validity
xsrf_id = handler._xsrf_token_id
if xsrf_id_cookie != xsrf_id:
# this will usually happen on the first page request after login,
# which changes the inputs to the token id
if xsrf_id_cookie:
app_log.debug("xsrf id mismatch %r != %r", xsrf_id_cookie, xsrf_id)
# generate new value
xsrf_token = _create_signed_value_urlsafe(handler, "_xsrf", xsrf_id)
# only set cookie on regular navigation pages
# i.e. not API requests, etc.
# insecure URLs (public hostname/ip, no https)
# don't set Sec-Fetch-Mode.
# consequence of assuming 'navigate': setting a cookie unnecessarily
# consequence of assuming not 'navigate': xsrf never set, nothing works
_set_cookie = (
handler.request.headers.get("Sec-Fetch-Mode", "navigate") == "navigate"
)
if xsrf_id_cookie and not _set_cookie:
# if we aren't setting a cookie here but we got one,
# this means things probably aren't going to work
app_log.warning(
"Not accepting incorrect xsrf token id in cookie on %s",
handler.request.path,
)
if _set_cookie:
_set_xsrf_cookie(handler, xsrf_id, cookie_path=cookie_path)
handler._xsrf_token = xsrf_token
return xsrf_token
def _needs_check_xsrf(handler):
"""Does the given cookie-authenticated request need to check xsrf?"""
if getattr(handler, "_token_authenticated", False):
return False
fetch_mode = handler.request.headers.get("Sec-Fetch-Mode", "unspecified")
if fetch_mode in {"websocket", "no-cors"} or (
fetch_mode in {"navigate", "unspecified"}
and handler.request.method.lower() in {"get", "head", "options"}
):
# no xsrf check needed for regular page views or no-cors
# or websockets after allow_websocket_cookie_auth passes
if fetch_mode == "unspecified":
app_log.warning(
f"Skipping XSRF check for insecure request {handler.request.method} {handler.request.path}"
)
return False
else:
return True
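The decision table above can be mirrored as a plain function (a hypothetical standalone version; the real helper reads these values off the tornado handler):

```python
def needs_check_xsrf(method: str, fetch_mode: str = "unspecified",
                     token_authenticated: bool = False) -> bool:
    """Mirror of the skip/check logic in _needs_check_xsrf."""
    if token_authenticated:
        # token auth is not cookie auth; xsrf does not apply
        return False
    if fetch_mode in {"websocket", "no-cors"}:
        # websockets are vetted elsewhere; no-cors requests are exempt
        return False
    if fetch_mode in {"navigate", "unspecified"} and method.lower() in {
        "get", "head", "options",
    }:
        # regular page views with safe methods
        return False
    return True

# Safe navigations skip the check; state-changing or cross-origin
# fetches do not.
print(needs_check_xsrf("GET", "navigate"))   # -> False
print(needs_check_xsrf("POST", "navigate"))  # -> True
print(needs_check_xsrf("GET", "cors"))       # -> True
```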
def check_xsrf_cookie(handler):
"""Check that xsrf cookie matches xsrf token in request"""
# overrides tornado's implementation
# because we changed what a correct value should be in xsrf_token
if not _needs_check_xsrf(handler):
# don't require XSRF for regular page views
return
token = (
handler.get_argument("_xsrf", None)
or handler.request.headers.get("X-Xsrftoken")
or handler.request.headers.get("X-Csrftoken")
)
if not token:
raise web.HTTPError(
403, f"'_xsrf' argument missing from {handler.request.method}"
)
try:
token = token.encode("utf8")
except UnicodeEncodeError:
raise web.HTTPError(403, "'_xsrf' argument invalid")
if token != handler.xsrf_token:
raise web.HTTPError(
403, f"XSRF cookie does not match {handler.request.method.upper()} argument"
)
def _anonymous_xsrf_id(handler):
"""Generate an appropriate xsrf token id for an anonymous request
Currently uses hash of request ip and user-agent
These are typically used only for the initial login page,
so only need to be valid for a few seconds to a few minutes
(enough to submit a login form with MFA).
"""
hasher = hashlib.sha256()
hasher.update(handler.request.remote_ip.encode("ascii"))
hasher.update(
handler.request.headers.get("User-Agent", "").encode("utf8", "replace")
)
return base64.urlsafe_b64encode(hasher.digest()).decode("ascii")
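A standalone version of the same hash (the real helper reads `remote_ip` and the `User-Agent` header from the tornado request object):

```python
import base64
import hashlib

def anonymous_xsrf_id(remote_ip: str, user_agent: str) -> str:
    """sha256(ip + user-agent), urlsafe-base64 encoded."""
    hasher = hashlib.sha256()
    hasher.update(remote_ip.encode("ascii"))
    hasher.update(user_agent.encode("utf8", "replace"))
    return base64.urlsafe_b64encode(hasher.digest()).decode("ascii")

# Same browser + IP -> same id, so the anonymous token survives a page
# reload; a different IP or user-agent yields a different id, which is
# what makes the anonymous token non-portable.
a = anonymous_xsrf_id("10.0.0.1", "Mozilla/5.0")
b = anonymous_xsrf_id("10.0.0.1", "Mozilla/5.0")
c = anonymous_xsrf_id("10.0.0.2", "Mozilla/5.0")
print(a == b, a == c)  # -> True False
```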

View File

@@ -5,6 +5,7 @@ Revises: 833da8570507
Create Date: 2021-09-15 14:04:09.067024
"""
# revision identifiers, used by Alembic.
revision = '0eee8c825d24'
down_revision = '651f5419b74d'

View File

@@ -5,6 +5,7 @@ Revises:
Create Date: 2016-04-11 16:05:34.873288
"""
# revision identifiers, used by Alembic.
revision = '19c0846f6344'
down_revision = None

View File

@@ -5,6 +5,7 @@ Revises: 3ec6993fe20c
Create Date: 2017-12-07 14:43:51.500740
"""
# revision identifiers, used by Alembic.
revision = '1cebaf56856c'
down_revision = '3ec6993fe20c'

View File

@@ -12,6 +12,7 @@ Revises: af4cbdb2d13c
Create Date: 2017-07-28 16:44:40.413648
"""
# revision identifiers, used by Alembic.
revision = '3ec6993fe20c'
down_revision = 'af4cbdb2d13c'

View File

@@ -5,6 +5,7 @@ Revises: 896818069c98
Create Date: 2019-02-28 14:14:27.423927
"""
# revision identifiers, used by Alembic.
revision = '4dc2d5a8c53c'
down_revision = '896818069c98'

View File

@@ -5,6 +5,7 @@ Revises: 1cebaf56856c
Create Date: 2017-12-19 15:21:09.300513
"""
# revision identifiers, used by Alembic.
revision = '56cc5a70207e'
down_revision = '1cebaf56856c'

View File

@@ -5,6 +5,7 @@ Revises: 833da8570507
Create Date: 2022-02-28 12:42:55.149046
"""
# revision identifiers, used by Alembic.
revision = '651f5419b74d'
down_revision = '833da8570507'

View File

@@ -6,6 +6,7 @@ Revises: 4dc2d5a8c53c
Create Date: 2021-02-17 15:03:04.360368
"""
# revision identifiers, used by Alembic.
revision = '833da8570507'
down_revision = '4dc2d5a8c53c'

View File

@@ -5,6 +5,7 @@ Revises: d68c98b66cd4
Create Date: 2018-05-07 11:35:58.050542
"""
# revision identifiers, used by Alembic.
revision = '896818069c98'
down_revision = 'd68c98b66cd4'

View File

@@ -5,6 +5,7 @@ Revises: 56cc5a70207e
Create Date: 2018-03-21 14:27:17.466841
"""
# revision identifiers, used by Alembic.
revision = '99a28a4418e1'
down_revision = '56cc5a70207e'

View File

@@ -5,6 +5,7 @@ Revises: eeb276e51423
Create Date: 2016-07-28 16:16:38.245348
"""
# revision identifiers, used by Alembic.
revision = 'af4cbdb2d13c'
down_revision = 'eeb276e51423'

View File

@@ -5,6 +5,7 @@ Revises: 99a28a4418e1
Create Date: 2018-04-13 10:50:17.968636
"""
# revision identifiers, used by Alembic.
revision = 'd68c98b66cd4'
down_revision = '99a28a4418e1'

View File

@@ -6,6 +6,7 @@ Revision ID: eeb276e51423
Revises: 19c0846f6344
Create Date: 2016-04-11 16:06:49.239831
"""
# revision identifiers, used by Alembic.
revision = 'eeb276e51423'
down_revision = '19c0846f6344'

View File

@@ -1,4 +1,5 @@
"""Authorization handlers"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import json

View File

@@ -1,4 +1,5 @@
"""Base API handlers"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import json
@@ -75,20 +76,18 @@ class APIHandler(BaseHandler):
return True
async def prepare(self):
await super().prepare()
# tornado only checks xsrf on non-GET
# we also check xsrf on GETs to API endpoints
# make sure this runs after auth, which happens in super().prepare()
if self.request.method not in {"HEAD", "OPTIONS"} and self.settings.get(
"xsrf_cookies"
):
self.check_xsrf_cookie()
# we also check xsrf on GETs to API endpoints
_xsrf_safe_methods = {"HEAD", "OPTIONS"}
def check_xsrf_cookie(self):
if not hasattr(self, '_jupyterhub_user'):
# called too early to check if we're token-authenticated
return
if self._jupyterhub_user is None and 'Origin' not in self.request.headers:
# don't raise xsrf if auth failed
# don't apply this shortcut to actual cross-site requests, which have an 'Origin' header,
# which would reveal if there are credentials present
return
if getattr(self, '_token_authenticated', False):
# if token-authenticated, ignore XSRF
return
@@ -475,7 +474,7 @@ class APIHandler(BaseHandler):
if next_offset < total_count:
# if there's a next page
next_url_parsed = urlparse(self.request.full_url())
query = parse_qs(next_url_parsed.query)
query = parse_qs(next_url_parsed.query, keep_blank_values=True)
query['offset'] = [next_offset]
query['limit'] = [limit]
next_url_parsed = next_url_parsed._replace(

View File

@@ -1,4 +1,5 @@
"""Group handlers"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import json

View File

@@ -1,4 +1,5 @@
"""API handlers for administering the Hub itself"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import json

View File

@@ -1,4 +1,5 @@
"""Proxy handlers"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import json

View File

@@ -2,6 +2,7 @@
Currently GET-only, no actions can be taken to modify services.
"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import json

View File

@@ -1,4 +1,5 @@
"""User handlers"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import asyncio
@@ -445,7 +446,14 @@ class UserTokenListAPIHandler(APIHandler):
user_kind = 'user' if isinstance(user, User) else 'service'
self.log.info("%s %s requested new API token", user_kind.title(), user.name)
# retrieve the model
token_model = self.token_model(orm.APIToken.find(self.db, api_token))
orm_token = orm.APIToken.find(self.db, api_token)
if orm_token is None:
self.log.error(
"Failed to find token after creating it: %r. Maybe it expired already?",
body,
)
raise web.HTTPError(500, "Failed to create token")
token_model = self.token_model(orm_token)
token_model['token'] = api_token
self.write(json.dumps(token_model))
self.set_status(201)

View File

@@ -401,6 +401,25 @@ class JupyterHub(Application):
Useful for daemonizing JupyterHub.
""",
).tag(config=True)
cookie_host_prefix_enabled = Bool(
False,
help="""Enable `__Host-` prefix on authentication cookies.
The `__Host-` prefix on JupyterHub cookies provides further
protection against cookie tossing when untrusted servers
may control subdomains of your jupyterhub deployment.
_However_, it also requires that cookies be set on the path `/`,
which means they are shared by all JupyterHub components,
so a compromised server component will have access to _all_ JupyterHub-related
cookies of the visiting browser.
It is recommended to only combine `__Host-` cookies with per-user domains.
.. versionadded:: 4.1
""",
).tag(config=True)
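A sketch of enabling this in `jupyterhub_config.py` (the subdomain host shown is a hypothetical value; per-user domains are the recommended companion setting):

```python
# jupyterhub_config.py (sketch)
c.JupyterHub.cookie_host_prefix_enabled = True
# recommended only in combination with per-user (sub)domains:
c.JupyterHub.subdomain_host = "https://hub.example.com"  # hypothetical host
```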
cookie_max_age_days = Float(
14,
help="""Number of days for a login cookie to be valid.
@@ -1898,6 +1917,8 @@ class JupyterHub(Application):
hub_args['port'] = self.hub_port
self.hub = Hub(**hub_args)
if self.cookie_host_prefix_enabled:
self.hub.cookie_name = "__Host-" + self.hub.cookie_name
if not self.subdomain_host:
api_prefix = url_path_join(self.hub.base_url, "api/")
@@ -2582,9 +2603,13 @@ class JupyterHub(Application):
"%s appears to have stopped while the Hub was down",
spawner._log_name,
)
# remove server entry from db
db.delete(spawner.orm_spawner.server)
spawner.server = None
try:
await user.stop(name)
except Exception:
self.log.exception(
f"Failed to cleanup {spawner._log_name} which appeared to stop while the Hub was down.",
exc_info=True,
)
else:
self.log.debug("%s not running", spawner._log_name)
@@ -2756,6 +2781,7 @@ class JupyterHub(Application):
base_url=self.base_url,
default_url=self.default_url,
cookie_secret=self.cookie_secret,
cookie_host_prefix_enabled=self.cookie_host_prefix_enabled,
cookie_max_age_days=self.cookie_max_age_days,
redirect_to_server=self.redirect_to_server,
login_url=login_url,

View File

@@ -1,4 +1,5 @@
"""Base Authenticator class and the default PAM Authenticator"""
# Copyright (c) IPython Development Team.
# Distributed under the terms of the Modified BSD License.
import inspect

View File

@@ -1,4 +1,5 @@
"""Database utilities for JupyterHub"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
# Based on pgcontents.utils.migrate, used under the Apache license.

View File

@@ -1,4 +1,5 @@
"""HTTP Handlers for the hub server"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import asyncio
@@ -23,6 +24,12 @@ from tornado.log import app_log
from tornado.web import RequestHandler, addslash
from .. import __version__, orm, roles, scopes
from .._xsrf_utils import (
_anonymous_xsrf_id,
_set_xsrf_cookie,
check_xsrf_cookie,
get_xsrf_token,
)
from ..metrics import (
PROXY_ADD_DURATION_SECONDS,
PROXY_DELETE_DURATION_SECONDS,
@@ -98,7 +105,14 @@ class BaseHandler(RequestHandler):
self.log.error("Rolling back session due to database error")
self.db.rollback()
self._resolve_roles_and_scopes()
return await maybe_future(super().prepare())
await maybe_future(super().prepare())
# run xsrf check after prepare
# because our version takes auth info into account
if (
self.request.method not in self._xsrf_safe_methods
and self.application.settings.get("xsrf_cookies")
):
self.check_xsrf_cookie()
@property
def log(self):
@@ -199,9 +213,13 @@ class BaseHandler(RequestHandler):
"""The default Content-Security-Policy header
Can be overridden by defining Content-Security-Policy in settings['headers']
..versionchanged:: 4.1
Change default frame-ancestors from 'self' to 'none'
"""
return '; '.join(
["frame-ancestors 'self'", "report-uri " + self.csp_report_uri]
["frame-ancestors 'none'", "report-uri " + self.csp_report_uri]
)
def get_content_type(self):
@@ -211,7 +229,6 @@ class BaseHandler(RequestHandler):
"""
Set any headers passed as tornado_settings['headers'].
By default sets Content-Security-Policy of frame-ancestors 'self'.
Also responsible for setting content-type header
"""
# wrap in HTTPHeaders for case-insensitivity
@@ -233,15 +250,63 @@ class BaseHandler(RequestHandler):
# Login and cookie-related
# ---------------------------------------------------------------
_xsrf_safe_methods = {"GET", "HEAD", "OPTIONS"}
@property
def _xsrf_token_id(self):
"""Value to be signed/encrypted for xsrf token
include login info in xsrf token
this means xsrf tokens are tied to logged-in users,
and change after a user logs in.
While the user is not yet logged in,
an anonymous value is used, to prevent portability.
These anonymous values are short-lived.
"""
# cases:
# 1. logged in, session id (session_id:user_id)
# 2. logged in, no session id (anonymous_id:user_id)
# 3. not logged in, session id (session_id:anonymous_id)
# 4. no cookies at all, use single anonymous value (:anonymous_id)
session_id = self.get_session_cookie()
if self.current_user:
if isinstance(self.current_user, User):
user_id = self.current_user.cookie_id
else:
# this shouldn't happen, but may if e.g. a Service attempts to fetch a page,
# which usually won't work, but this method should not be the one that raises
user_id = ""
if not session_id:
# no session id, use non-portable anonymous id
session_id = _anonymous_xsrf_id(self)
else:
# not logged in yet, use non-portable anonymous id
user_id = _anonymous_xsrf_id(self)
xsrf_id = f"{session_id}:{user_id}".encode("utf8", "replace")
return xsrf_id
@property
def xsrf_token(self):
"""Override tornado's xsrf token with further restrictions
- only set cookie for regular pages
- include login info in xsrf token
- verify signature
"""
return get_xsrf_token(self, cookie_path=self.hub.base_url)
def check_xsrf_cookie(self):
try:
return super().check_xsrf_cookie()
except Exception as e:
# ensure _jupyterhub_user is defined on rejected requests
if not hasattr(self, "_jupyterhub_user"):
self._jupyterhub_user = None
self._resolve_roles_and_scopes()
raise
"""Check that xsrf cookie matches xsrf token in request"""
# overrides tornado's implementation
# because we changed what a correct value should be in xsrf_token
if not hasattr(self, "_jupyterhub_user"):
# run too early to check the value
# tornado runs this before 'prepare',
# but we run it again after auth resolves in 'prepare', so auth info is available
return None
return check_xsrf_cookie(self)
@property
def admin_users(self):
@@ -514,15 +579,30 @@ class BaseHandler(RequestHandler):
user = self._user_from_orm(u)
return user
def clear_cookie(self, cookie_name, **kwargs):
"""Clear a cookie
overrides RequestHandler to always handle __Host- prefix correctly
"""
if cookie_name.startswith("__Host-"):
kwargs["path"] = "/"
kwargs["secure"] = True
return super().clear_cookie(cookie_name, **kwargs)
def clear_login_cookie(self, name=None):
kwargs = {}
if self.subdomain_host:
kwargs['domain'] = self.domain
user = self.get_current_user_cookie()
session_id = self.get_session_cookie()
if session_id:
# clear session id
self.clear_cookie(SESSION_COOKIE_NAME, path=self.base_url, **kwargs)
session_cookie_kwargs = {}
session_cookie_kwargs.update(kwargs)
if self.subdomain_host:
session_cookie_kwargs['domain'] = self.domain
self.clear_cookie(
SESSION_COOKIE_NAME, path=self.base_url, **session_cookie_kwargs
)
if user:
# user is logged in, clear any tokens associated with the current session
@@ -571,12 +651,15 @@ class BaseHandler(RequestHandler):
kwargs = {'httponly': True}
if self.request.protocol == 'https':
kwargs['secure'] = True
if self.subdomain_host:
kwargs['domain'] = self.domain
kwargs.update(self.settings.get('cookie_options', {}))
kwargs.update(overrides)
if key.startswith("__Host-"):
# __Host- cookies must be secure and on /
kwargs["path"] = "/"
kwargs["secure"] = True
if encrypted:
set_cookie = self.set_secure_cookie
else:
@@ -606,9 +689,21 @@ class BaseHandler(RequestHandler):
Session id cookie is *not* encrypted,
so other services on this domain can read it.
"""
session_id = uuid.uuid4().hex
if not hasattr(self, "_session_id"):
self._session_id = uuid.uuid4().hex
session_id = self._session_id
# if using subdomains, set session cookie on the domain,
# which allows it to be shared by subdomains.
# if domain is unspecified, the cookie is _more_ restricted:
# limited to the exact domain that set it
kwargs = {}
if self.subdomain_host:
kwargs['domain'] = self.domain
self._set_cookie(
SESSION_COOKIE_NAME, session_id, encrypted=False, path=self.base_url
SESSION_COOKIE_NAME,
session_id,
encrypted=False,
path=self.base_url,
**kwargs,
)
return session_id
@@ -640,6 +735,13 @@ class BaseHandler(RequestHandler):
if not self.get_current_user_cookie():
self.set_hub_cookie(user)
# make sure xsrf cookie is updated
# this avoids needing a second request to set the right xsrf cookie
self._jupyterhub_user = user
_set_xsrf_cookie(
self, self._xsrf_token_id, cookie_path=self.hub.base_url, authenticated=True
)
def authenticate(self, data):
return maybe_future(self.authenticator.get_authenticated_user(self, data))
@@ -1431,6 +1533,12 @@ class UserUrlHandler(BaseHandler):
# accept token auth for API requests that are probably to non-running servers
_accept_token_auth = True
# don't consider these redirects 'activity'
# if the redirect is followed and the subsequent action taken,
# _that_ is activity
def _record_activity(self, obj, timestamp=None):
return False
def _fail_api_request(self, user_name='', server_name=''):
"""Fail an API request to a not-running server"""
self.log.warning(

View File

@@ -1,4 +1,5 @@
"""HTTP Handlers for the hub server"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import asyncio

View File

@@ -1,4 +1,5 @@
"""Handlers for serving prometheus metrics"""
from prometheus_client import CONTENT_TYPE_LATEST, REGISTRY, generate_latest
from ..utils import metrics_authentication

View File

@@ -1,4 +1,5 @@
"""Basic html-rendering handlers."""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import asyncio

View File

@@ -1,4 +1,5 @@
"""logging utilities"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import json

View File

@@ -19,6 +19,7 @@ them manually here.
added ``jupyterhub_`` prefix to metric names.
"""
from datetime import timedelta
from enum import Enum

View File

@@ -2,6 +2,7 @@
implements https://oauthlib.readthedocs.io/en/latest/oauth2/server.html
"""
from oauthlib import uri_validate
from oauthlib.oauth2 import RequestValidator, WebApplicationServer
from oauthlib.oauth2.rfc6749.grant_types import authorization_code, base

View File

@@ -1,4 +1,5 @@
"""Some general objects for use in JupyterHub"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import socket

View File

@@ -1,8 +1,10 @@
"""sqlalchemy ORM tools for the state of the constellation of processes"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import enum
import json
import numbers
from base64 import decodebytes, encodebytes
from datetime import datetime, timedelta
@@ -760,7 +762,18 @@ class APIToken(Hashed, Base):
else:
assert service.id is not None
orm_token.service = service
if expires_in is not None:
if expires_in:
if not isinstance(expires_in, numbers.Real):
raise TypeError(
f"expires_in must be a positive integer or null, not {expires_in!r}"
)
expires_in = int(expires_in)
# tokens must always expire in the future
if expires_in < 1:
raise ValueError(
f"expires_in must be a positive integer or null, not {expires_in!r}"
)
orm_token.expires_at = cls.now() + timedelta(seconds=expires_in)
db.commit()
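The added validation can be sketched as a standalone function (hypothetical name; the real code runs inside `APIToken.new` with the class's own clock):

```python
import numbers
from datetime import datetime, timedelta

def expiry_from(expires_in):
    """Mirror the expires_in validation: positive real number, or falsy
    (None/0) for a token that never expires."""
    if not expires_in:
        return None
    if not isinstance(expires_in, numbers.Real):
        raise TypeError(
            f"expires_in must be a positive integer or null, not {expires_in!r}"
        )
    expires_in = int(expires_in)
    # tokens must always expire in the future
    if expires_in < 1:
        raise ValueError(
            f"expires_in must be a positive integer or null, not {expires_in!r}"
        )
    return datetime.now() + timedelta(seconds=expires_in)

print(expiry_from(None))              # -> None
print(expiry_from(3600) is not None)  # -> True
```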

View File

@@ -14,6 +14,7 @@ Route Specification:
'host.tld/path/' for host-based routing or '/path/' for default routing.
- Route paths should be normalized to always start and end with '/'
"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import asyncio
@@ -48,7 +49,7 @@ from jupyterhub.traitlets import Command
from . import utils
from .metrics import CHECK_ROUTES_DURATION_SECONDS, PROXY_POLL_DURATION_SECONDS
from .objects import Server
from .utils import AnyTimeoutError, exponential_backoff, url_escape_path, url_path_join
from .utils import exponential_backoff, url_escape_path, url_path_join
def _one_at_a_time(method):
@@ -766,24 +767,67 @@ class ConfigurableHTTPProxy(Proxy):
self._write_pid_file()
def _check_process():
status = self.proxy_process.poll()
if status is not None:
with self.proxy_process:
e = RuntimeError("Proxy failed to start with exit code %i" % status)
raise e from None
async def wait_for_process():
"""Watch proxy process for early termination
Runs forever, checking every 0.5s if the process has exited
so we don't keep waiting for endpoints after the proxy has stopped.
Raises RuntimeError if/when the proxy process exits,
otherwise runs forever.
Should be cancelled when servers become ready.
"""
while True:
status = self.proxy_process.poll()
if status is not None:
with self.proxy_process:
e = RuntimeError(
f"Proxy failed to start with exit code {status}"
)
raise e from None
await asyncio.sleep(0.5)
# process_exited can only resolve with a RuntimeError when the process has exited,
# otherwise it must be cancelled.
process_exited = asyncio.ensure_future(wait_for_process())
# wait for both servers to be ready (or one server to fail)
server_futures = [
asyncio.ensure_future(server.wait_up(10))
for server in (public_server, api_server)
]
servers_ready = asyncio.gather(*server_futures)
# wait for process to crash or servers to be ready,
# whichever comes first
wait_timeout = 15
ready, pending = await asyncio.wait(
[
process_exited,
servers_ready,
],
return_when=asyncio.FIRST_COMPLETED,
timeout=wait_timeout,
)
for task in [servers_ready, process_exited] + server_futures:
# cancel any pending tasks
if not task.done():
task.cancel()
if not ready:
# timeouts passed to wait_up should prevent this,
# but weird things like DNS delays may result in
# wait_up taking a lot longer than it should
raise TimeoutError(
f"Waiting for proxy endpoints didn't complete in {wait_timeout}s"
)
if process_exited in ready:
# process exited, this will raise RuntimeError
await process_exited
else:
# if we got here, servers_ready is done
# await it to make sure exceptions are raised
await servers_ready
for server in (public_server, api_server):
for i in range(10):
_check_process()
try:
await server.wait_up(1)
except AnyTimeoutError:
continue
else:
break
await server.wait_up(1)
_check_process()
self.log.debug("Proxy started and appears to be up")
pc = PeriodicCallback(self.check_running, 1e3 * self.check_running_interval)
self._check_running_callback = pc
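The race above between `wait_for_process` and `servers_ready` is built on `asyncio.wait` with `FIRST_COMPLETED`; a minimal standalone sketch of the same pattern (`crasher` and `ready` are stand-in coroutines, not JupyterHub code):

```python
import asyncio

async def crasher():
    # stands in for wait_for_process(): raises if the subprocess dies
    await asyncio.sleep(0.2)
    raise RuntimeError("process exited")

async def ready():
    # stands in for servers_ready: resolves when endpoints answer
    await asyncio.sleep(0.05)
    return "up"

async def main():
    crash_task = asyncio.ensure_future(crasher())
    ready_task = asyncio.ensure_future(ready())
    # wait for crash or readiness, whichever comes first
    done, pending = await asyncio.wait(
        [crash_task, ready_task],
        return_when=asyncio.FIRST_COMPLETED,
        timeout=5,
    )
    for task in pending:
        # don't keep watching after the race is decided
        task.cancel()
    if crash_task in done:
        await crash_task  # re-raises the RuntimeError
    return await ready_task

print(asyncio.run(main()))  # -> up
```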

View File

@@ -1,4 +1,5 @@
"""Roles utils"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import re

View File

@@ -14,6 +14,7 @@ intersection : set of expanded scopes as intersection of 2 expanded scope sets
identify scopes: set of expanded scopes needed for identify (whoami) endpoints
reduced scopes: expanded scopes that have been reduced
"""
import functools
import inspect
import re
@@ -253,8 +254,12 @@ def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
}
# resolve hierarchies (group/user/server) in both directions
common_servers = common_filters[base].get("server", set())
common_users = common_filters[base].get("user", set())
common_servers = initial_common_servers = common_filters[base].get(
"server", frozenset()
)
common_users = initial_common_users = common_filters[base].get(
"user", frozenset()
)
for a, b in [(filters_a, filters_b), (filters_b, filters_a)]:
if 'server' in a and b.get('server') != a['server']:
@@ -266,7 +271,7 @@ def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
for server in servers:
username, _, servername = server.partition("/")
if username in b['user']:
common_servers.add(server)
common_servers = common_servers | {server}
# resolve group/server hierarchy if db available
servers = servers.difference(common_servers)
@@ -275,7 +280,7 @@ def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
for server in servers:
server_groups = groups_for_server(server)
if server_groups & b['group']:
common_servers.add(server)
common_servers = common_servers | {server}
# resolve group/user hierarchy if db available and user sets aren't identical
if (
@@ -289,14 +294,16 @@ def _intersect_expanded_scopes(scopes_a, scopes_b, db=None):
for username in users:
groups = groups_for_user(username)
if groups & b["group"]:
common_users.add(username)
common_users = common_users | {username}
# add server filter if there wasn't one before
if common_servers and "server" not in common_filters[base]:
# add server filter if it's non-empty
# and it changed
if common_servers and common_servers != initial_common_servers:
common_filters[base]["server"] = common_servers
# add user filter if it's non-empty and there wasn't one before
if common_users and "user" not in common_filters[base]:
# add user filter if it's non-empty
# and it changed
if common_users and common_users != initial_common_users:
common_filters[base]["user"] = common_users
intersection = unparse_scopes(common_filters)
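The hunk above switches from in-place `set.add` to rebinding with `|` because the initial filter sets may now be shared `frozenset`s; union returns a new set and leaves the original untouched:

```python
initial = frozenset({"alice/server1"})
common = initial

# frozenset has no .add(); union returns a new set, so `initial`
# (possibly shared with other scope filters) is never mutated
common = common | {"bob/server2"}

print(sorted(common))     # -> ['alice/server1', 'bob/server2']
print(sorted(initial))    # -> ['alice/server1']
print(common != initial)  # -> True
```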
@@ -845,6 +852,15 @@ def needs_scope(*scopes):
def scope_decorator(func):
@functools.wraps(func)
def _auth_func(self, *args, **kwargs):
if not self.current_user:
# not authenticated at all, fail with a more generic message
# this is the most likely permission error: missing or mis-specified credentials;
# don't indicate that they have insufficient permissions.
raise web.HTTPError(
403,
"Missing or invalid credentials.",
)
sig = inspect.signature(func)
bound_sig = sig.bind(self, *args, **kwargs)
bound_sig.apply_defaults()
@@ -853,6 +869,11 @@ def needs_scope(*scopes):
self.expanded_scopes = {}
self.parsed_scopes = {}
try:
end_point = self.request.path
except AttributeError:
end_point = self.__name__
s_kwargs = {}
for resource in {'user', 'server', 'group', 'service'}:
resource_name = resource + '_name'
@@ -860,14 +881,10 @@ def needs_scope(*scopes):
resource_value = bound_sig.arguments[resource_name]
s_kwargs[resource] = resource_value
for scope in scopes:
app_log.debug("Checking access via scope %s", scope)
app_log.debug("Checking access to %s via scope %s", end_point, scope)
has_access = _check_scope_access(self, scope, **s_kwargs)
if has_access:
return func(self, *args, **kwargs)
try:
end_point = self.request.path
except AttributeError:
end_point = self.__name__
app_log.warning(
"Not authorizing access to {}. Requires any of [{}], not derived from scopes [{}]".format(
end_point, ", ".join(scopes), ", ".join(self.expanded_scopes)

View File

@@ -23,6 +23,7 @@ If you are using OAuth, you will also need to register an oauth callback handler
A tornado implementation is provided in :class:`HubOAuthCallbackHandler`.
"""
import asyncio
import base64
import hashlib
@@ -35,6 +36,7 @@ import string
import time
import uuid
import warnings
from functools import partial
from http import HTTPStatus
from unittest import mock
from urllib.parse import urlencode
@@ -43,8 +45,10 @@ from tornado.httpclient import AsyncHTTPClient, HTTPRequest
from tornado.httputil import url_concat
from tornado.log import app_log
from tornado.web import HTTPError, RequestHandler
from tornado.websocket import WebSocketHandler
from traitlets import (
Any,
Bool,
Dict,
Instance,
Integer,
@@ -56,8 +60,15 @@ from traitlets import (
)
from traitlets.config import SingletonConfigurable
from .._xsrf_utils import (
_anonymous_xsrf_id,
_needs_check_xsrf,
_set_xsrf_cookie,
check_xsrf_cookie,
get_xsrf_token,
)
from ..scopes import _intersect_expanded_scopes
from ..utils import get_browser_protocol, url_path_join
from ..utils import _bool_env, get_browser_protocol, url_path_join
def check_scopes(required_scopes, scopes):
@@ -305,6 +316,46 @@ class HubAuth(SingletonConfigurable):
""",
).tag(config=True)
allow_token_in_url = Bool(
_bool_env("JUPYTERHUB_ALLOW_TOKEN_IN_URL", default=True),
help="""Allow requests to pages with ?token=... in the URL
This allows starting a user session by sharing a URL with credentials,
bypassing authentication with the Hub.
If False, tokens in URLs will be ignored by the server,
except on websocket requests.
Has no effect on websocket requests,
which can only reliably authenticate via token in the URL,
as recommended by browser Websocket implementations.
This will default to False in JupyterHub 5.
.. versionadded:: 4.1
.. versionchanged:: 5.0
default changed to False
""",
).tag(config=True)
allow_websocket_cookie_auth = Bool(
_bool_env("JUPYTERHUB_ALLOW_WEBSOCKET_COOKIE_AUTH", default=True),
help="""Allow websocket requests with only cookie for authentication
Cookie-authenticated websockets cannot be protected from other user servers unless per-user domains are used.
Disabling cookie auth on websockets protects user servers from each other,
but may break some user applications.
Per-user domains eliminate the need to lock this down.
JupyterLab 4.1.2 and Notebook 6.5.6, 7.1.0 will not work
because they rely on cookie authentication without
API or XSRF tokens.
.. versionadded:: 4.1
""",
).tag(config=True)
cookie_options = Dict(
help="""Additional options to pass when setting cookies.
@@ -323,6 +374,40 @@ class HubAuth(SingletonConfigurable):
else:
return {}
cookie_host_prefix_enabled = Bool(
False,
help="""Enable `__Host-` prefix on authentication cookies.
The `__Host-` prefix on JupyterHub cookies provides further
protection against cookie tossing when untrusted servers
may control subdomains of your jupyterhub deployment.
_However_, it also requires that cookies be set on the path `/`,
which means they are shared by all JupyterHub components,
so a compromised server component will have access to _all_ JupyterHub-related
cookies of the visiting browser.
It is recommended to only combine `__Host-` cookies with per-user domains.
Set via $JUPYTERHUB_COOKIE_HOST_PREFIX_ENABLED
""",
).tag(config=True)
@default("cookie_host_prefix_enabled")
def _default_cookie_host_prefix_enabled(self):
return _bool_env("JUPYTERHUB_COOKIE_HOST_PREFIX_ENABLED")
@property
def cookie_path(self):
"""
Path prefix on which to set cookies
self.base_url, but '/' when cookie_host_prefix_enabled is True
"""
if self.cookie_host_prefix_enabled:
return "/"
else:
return self.base_url
cookie_cache_max_age = Integer(help="DEPRECATED. Use cache_max_age")
@observe('cookie_cache_max_age')
@@ -585,6 +670,17 @@ class HubAuth(SingletonConfigurable):
auth_header_name = 'Authorization'
auth_header_pat = re.compile(r'(?:token|bearer)\s+(.+)', re.IGNORECASE)
def _get_token_url(self, handler):
"""Get the token from the URL
Always run for websockets,
otherwise run only if self.allow_token_in_url
"""
fetch_mode = handler.request.headers.get("Sec-Fetch-Mode", "unspecified")
if self.allow_token_in_url or fetch_mode == "websocket":
return handler.get_argument("token", "")
return ""
def get_token(self, handler, in_cookie=True):
"""Get the token authenticating a request
@@ -597,8 +693,7 @@ class HubAuth(SingletonConfigurable):
- in header: Authorization: token <token>
- in cookie (stored after oauth), if in_cookie is True
"""
user_token = handler.get_argument('token', '')
user_token = self._get_token_url(handler)
if not user_token:
# get it from Authorization header
m = self.auth_header_pat.match(
@@ -645,6 +740,14 @@ class HubAuth(SingletonConfigurable):
"""
return self._call_coroutine(sync, self._get_user, handler)
def _patch_xsrf(self, handler):
"""Overridden in HubOAuth
HubAuth base class doesn't handle xsrf,
which is only relevant for cookie-based auth
"""
return
async def _get_user(self, handler):
# only allow this to be called once per handler
# avoids issues if an error is raised,
@@ -652,6 +755,9 @@ class HubAuth(SingletonConfigurable):
if hasattr(handler, '_cached_hub_user'):
return handler._cached_hub_user
# patch XSRF checks, which will apply after user check
self._patch_xsrf(handler)
handler._cached_hub_user = user_model = None
session_id = self.get_session_id(handler)
@@ -680,6 +786,37 @@ class HubAuth(SingletonConfigurable):
"""Check whether the user has required scope(s)"""
return check_scopes(required_scopes, set(user["scopes"]))
def _persist_url_token_if_set(self, handler):
"""Persist ?token=... from URL in cookie if set
for use in future cookie-authenticated requests.
Allows initiating an authenticated session
via /user/name/?token=abc...,
otherwise only the initial request will be authenticated.
No-op if no token URL parameter is given.
"""
url_token = handler.get_argument('token', '')
if not url_token:
# no token to persist
return
# only do this if the token in the URL is the source of authentication
if not getattr(handler, '_token_authenticated', False):
return
if not hasattr(self, 'set_cookie'):
# only HubOAuth can persist cookies
return
fetch_mode = handler.request.headers.get("Sec-Fetch-Mode", "navigate")
if isinstance(handler, WebSocketHandler) or fetch_mode != "navigate":
# don't do this on websockets or non-navigate requests
return
self.log.info(
"Storing token from url in cookie for %s",
handler.request.remote_ip,
)
self.set_cookie(handler, url_token)
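The guards in `_persist_url_token_if_set` above can be summarized as one predicate. A sketch under the assumption that the handler state is passed in explicitly (the real method reads it from the handler and the `Sec-Fetch-Mode` header):

```python
def should_persist_url_token(
    url_token, token_authenticated, can_set_cookie, is_websocket, fetch_mode
):
    """Simplified restatement of _persist_url_token_if_set's guards.

    Persist ?token=... to a cookie only when the URL token was actually
    the source of authentication, the auth class can set cookies
    (HubOAuth only), and the request is a top-level browser navigation
    (not a websocket or a subresource fetch).
    """
    if not url_token:
        return False  # no token to persist
    if not token_authenticated:
        return False  # URL token was not what authenticated this request
    if not can_set_cookie:
        return False  # base HubAuth cannot persist cookies
    if is_websocket or fetch_mode != "navigate":
        return False  # only persist on navigate requests
    return True


assert should_persist_url_token("abc", True, True, False, "navigate")
assert not should_persist_url_token("abc", True, True, False, "cors")
assert not should_persist_url_token("abc", True, True, True, "navigate")
```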
class HubOAuth(HubAuth):
"""HubAuth using OAuth for login instead of cookies set by the Hub.
@@ -710,7 +847,10 @@ class HubOAuth(HubAuth):
because we don't want to use the same cookie name
across OAuth clients.
"""
return self.oauth_client_id
cookie_name = self.oauth_client_id
if self.cookie_host_prefix_enabled:
cookie_name = "__Host-" + cookie_name
return cookie_name
@property
def state_cookie_name(self):
@@ -722,22 +862,115 @@ class HubOAuth(HubAuth):
def _get_token_cookie(self, handler):
"""Base class doesn't store tokens in cookies"""
if hasattr(handler, "_hub_auth_token_cookie"):
return handler._hub_auth_token_cookie
fetch_mode = handler.request.headers.get("Sec-Fetch-Mode", "unset")
if fetch_mode == "websocket" and not self.allow_websocket_cookie_auth:
# disallow cookie auth on websockets
return None
token = handler.get_secure_cookie(self.cookie_name)
if token:
# decode cookie bytes
token = token.decode('ascii', 'replace')
return token
async def _get_user_cookie(self, handler):
def _get_xsrf_token_id(self, handler):
"""Get contents for xsrf token for a given Handler
This is the value to be encrypted & signed in the xsrf token
"""
token = self._get_token_cookie(handler)
session_id = self.get_session_id(handler)
if token:
token_hash = hashlib.sha256(token.encode("ascii", "replace")).hexdigest()
if not session_id:
session_id = _anonymous_xsrf_id(handler)
else:
token_hash = _anonymous_xsrf_id(handler)
return f"{session_id}:{token_hash}".encode("ascii", "replace")
def _patch_xsrf(self, handler):
"""Patch handler to inject JuptyerHub xsrf token behavior"""
if isinstance(handler, HubAuthenticated):
# doesn't need patch
return
# patch in our xsrf token handling
# overrides tornado and jupyter_server defaults,
# but not others.
# subclasses will still inherit our overridden behavior,
# but their overrides (if any) will take precedence over ours
# such as jupyter-server-proxy
for cls in handler.__class__.__mro__:
# search for the nearest parent class defined
# in one of the 'base' Handler-defining packages.
# In current implementations, this will
# generally be jupyter_server.base.handlers.JupyterHandler
# or tornado.web.RequestHandler,
# but doing it this way ensures consistent results
if (cls.__module__ or '').partition('.')[0] not in {
"jupyter_server",
"notebook",
"tornado",
}:
continue
# override check_xsrf_cookie where it's defined
if "check_xsrf_cookie" in cls.__dict__:
if "_get_xsrf_token_id" in cls.__dict__:
# already patched
return
cls._xsrf_token_id = property(self._get_xsrf_token_id)
cls.xsrf_token = property(
partial(get_xsrf_token, cookie_path=self.base_url)
)
cls.check_xsrf_cookie = lambda handler: self.check_xsrf_cookie(handler)
def check_xsrf_cookie(self, handler):
"""check_xsrf_cookie patch
Applies JupyterHub check_xsrf_cookie if not token authenticated
"""
if getattr(handler, '_token_authenticated', False) or handler.settings.get(
"disable_check_xsrf", False
):
return
check_xsrf_cookie(handler)
def _clear_cookie(self, handler, cookie_name, **kwargs):
"""Clear a cookie, handling __Host- prefix"""
# Set-Cookie is rejected without 'secure',
# this includes clearing cookies!
if cookie_name.startswith("__Host-"):
kwargs["path"] = "/"
kwargs["secure"] = True
return handler.clear_cookie(cookie_name, **kwargs)
async def _get_user_cookie(self, handler):
# check xsrf if needed
token = self._get_token_cookie(handler)
session_id = self.get_session_id(handler)
if token and _needs_check_xsrf(handler):
# call handler.check_xsrf_cookie instead of self.check_xsrf_cookie
# to allow subclass overrides
try:
handler.check_xsrf_cookie()
except HTTPError as e:
self.log.debug(
f"Not accepting cookie auth on {handler.request.method} {handler.request.path}: {e.log_message}"
)
# don't proceed with cookie auth unless xsrf is okay
# don't raise either, because that makes a mess
return None
if token:
user_model = await self.user_for_token(
token, session_id=session_id, sync=False
)
if user_model is None:
app_log.warning("Token stored in cookie may have expired")
handler.clear_cookie(self.cookie_name)
self._clear_cookie(handler, self.cookie_name, path=self.cookie_path)
return user_model
# HubOAuth API
@@ -883,7 +1116,7 @@ class HubOAuth(HubAuth):
cookie_name = self.state_cookie_name
b64_state = self.generate_state(next_url, **extra_state)
kwargs = {
'path': self.base_url,
'path': self.cookie_path,
'httponly': True,
# Expire oauth state cookie in ten minutes.
# Usually this will be cleared by completed login
@@ -891,8 +1124,12 @@ class HubOAuth(HubAuth):
# OAuth that doesn't complete shouldn't linger too long.
'max_age': 600,
}
if get_browser_protocol(handler.request) == 'https':
if (
get_browser_protocol(handler.request) == 'https'
or self.cookie_host_prefix_enabled
):
kwargs['secure'] = True
# load user cookie overrides
kwargs.update(self.cookie_options)
handler.set_secure_cookie(cookie_name, b64_state, **kwargs)
@@ -930,8 +1167,11 @@ class HubOAuth(HubAuth):
def set_cookie(self, handler, access_token):
"""Set a cookie recording OAuth result"""
kwargs = {'path': self.base_url, 'httponly': True}
if get_browser_protocol(handler.request) == 'https':
kwargs = {'path': self.cookie_path, 'httponly': True}
if (
get_browser_protocol(handler.request) == 'https'
or self.cookie_host_prefix_enabled
):
kwargs['secure'] = True
# load user cookie overrides
kwargs.update(self.cookie_options)
@@ -942,10 +1182,19 @@ class HubOAuth(HubAuth):
kwargs,
)
handler.set_secure_cookie(self.cookie_name, access_token, **kwargs)
# set updated xsrf token cookie,
# which changes after login
handler._hub_auth_token_cookie = access_token
_set_xsrf_cookie(
handler,
handler._xsrf_token_id,
cookie_path=self.base_url,
authenticated=True,
)
def clear_cookie(self, handler):
"""Clear the OAuth cookie"""
handler.clear_cookie(self.cookie_name, path=self.base_url)
self._clear_cookie(handler, self.cookie_name, path=self.cookie_path)
class UserNotAllowed(Exception):
@@ -1042,19 +1291,30 @@ class HubAuthenticated:
def hub_auth(self, auth):
self._hub_auth = auth
_hub_login_url = None
def get_login_url(self):
"""Return the Hub's login URL"""
login_url = self.hub_auth.login_url
if isinstance(self.hub_auth, HubOAuth):
# add state argument to OAuth url
state = self.hub_auth.set_state_cookie(self, next_url=self.request.uri)
login_url = url_concat(login_url, {'state': state})
if self._hub_login_url is not None:
# cached value, don't call this more than once per handler
return self._hub_login_url
# temporary override at setting level,
# to allow any subclass overrides of get_login_url to preserve their effect
# for example, APIHandler raises 403 to prevent redirects
with mock.patch.dict(self.application.settings, {"login_url": login_url}):
app_log.debug("Redirecting to login url: %s", login_url)
return super().get_login_url()
with mock.patch.dict(
self.application.settings, {"login_url": self.hub_auth.login_url}
):
login_url = super().get_login_url()
app_log.debug("Redirecting to login url: %s", login_url)
if isinstance(self.hub_auth, HubOAuth):
# add state argument to OAuth url
# must do this _after_ allowing get_login_url to raise
# so we don't set unused cookies
state = self.hub_auth.set_state_cookie(self, next_url=self.request.uri)
login_url = url_concat(login_url, {'state': state})
self._hub_login_url = login_url
return login_url
def check_hub_user(self, model):
"""Check whether Hub-authenticated user or service should be allowed.
@@ -1146,7 +1406,7 @@ class HubAuthenticated:
return
try:
self._hub_auth_user_cache = self.check_hub_user(user_model)
except UserNotAllowed as e:
except UserNotAllowed:
# cache None, in case get_user is called again while processing the error
self._hub_auth_user_cache = None
@@ -1165,20 +1425,28 @@ class HubAuthenticated:
self._hub_auth_user_cache = None
raise
# store ?token=... tokens passed via url in a cookie for future requests
url_token = self.get_argument('token', '')
if (
user_model
and url_token
and getattr(self, '_token_authenticated', False)
and hasattr(self.hub_auth, 'set_cookie')
):
# authenticated via `?token=`
# set a cookie for future requests
# hub_auth.set_cookie is only available on HubOAuth
self.hub_auth.set_cookie(self, url_token)
self.hub_auth._persist_url_token_if_set(self)
return self._hub_auth_user_cache
@property
def _xsrf_token_id(self):
if hasattr(self, "__xsrf_token_id"):
return self.__xsrf_token_id
if not isinstance(self.hub_auth, HubOAuth):
return ""
return self.hub_auth._get_xsrf_token_id(self)
@_xsrf_token_id.setter
def _xsrf_token_id(self, value):
self.__xsrf_token_id = value
@property
def xsrf_token(self):
return get_xsrf_token(self, cookie_path=self.hub_auth.base_url)
def check_xsrf_cookie(self):
return self.hub_auth.check_xsrf_cookie(self)
class HubOAuthenticated(HubAuthenticated):
"""Simple subclass of HubAuthenticated using OAuth instead of old shared cookies"""
@@ -1213,12 +1481,22 @@ class HubOAuthCallbackHandler(HubOAuthenticated, RequestHandler):
cookie_name = self.hub_auth.get_state_cookie_name(arg_state)
cookie_state = self.get_secure_cookie(cookie_name)
# clear cookie state now that we've consumed it
self.clear_cookie(cookie_name, path=self.hub_auth.base_url)
clear_kwargs = {}
if self.hub_auth.cookie_host_prefix_enabled:
# Set-Cookie is rejected without 'secure',
# this includes clearing cookies!
clear_kwargs["secure"] = True
self.hub_auth._clear_cookie(self, cookie_name, path=self.hub_auth.cookie_path)
if isinstance(cookie_state, bytes):
cookie_state = cookie_state.decode('ascii', 'replace')
# check that state matches
if arg_state != cookie_state:
app_log.warning("oauth state %r != %r", arg_state, cookie_state)
app_log.warning(
"oauth state argument %r != cookie %s=%r",
arg_state,
cookie_name,
cookie_state,
)
raise HTTPError(403, "oauth state does not match. Try logging in again.")
next_url = self.hub_auth.get_next_url(cookie_state)

View File

@@ -38,6 +38,7 @@ A hub-managed service with no URL::
}
"""
import asyncio
import copy
import os

View File

@@ -12,6 +12,7 @@ Application subclass can be controlled with environment variables:
- JUPYTERHUB_SINGLEUSER_EXTENSION=1 to opt-in to the extension (requires Jupyter Server 2)
- JUPYTERHUB_SINGLEUSER_APP=notebook (or jupyter-server) to opt-in
"""
import os
from .mixins import HubAuthenticatedHandler, make_singleuser_app

View File

@@ -6,8 +6,9 @@
.. versionchanged:: 2.0
Default app changed to launch `jupyter labhub`.
Use JUPYTERHUB_SINGLEUSER_APP=notebook.notebookapp.NotebookApp for the legacy 'classic' notebook server.
Use JUPYTERHUB_SINGLEUSER_APP='notebook' for the legacy 'classic' notebook server (requires notebook<7).
"""
import os
from traitlets import import_item
@@ -27,7 +28,25 @@ JUPYTERHUB_SINGLEUSER_APP = _app_shortcuts.get(
JUPYTERHUB_SINGLEUSER_APP.replace("_", "-"), JUPYTERHUB_SINGLEUSER_APP
)
if JUPYTERHUB_SINGLEUSER_APP:
if JUPYTERHUB_SINGLEUSER_APP in {"notebook", _app_shortcuts["notebook"]}:
# better error for notebook v7, which uses jupyter-server
# when the legacy notebook server is requested
try:
from notebook import __version__
except ImportError:
# will raise later
pass
else:
# check if this failed because of notebook v7
_notebook_major_version = int(__version__.split(".", 1)[0])
if _notebook_major_version >= 7:
raise ImportError(
f"JUPYTERHUB_SINGLEUSER_APP={JUPYTERHUB_SINGLEUSER_APP} is not valid with notebook>=7 (have notebook=={__version__}).\n"
f"Leave $JUPYTERHUB_SINGLEUSER_APP unspecified (or use the default JUPYTERHUB_SINGLEUSER_APP=jupyter-server), "
'and set `c.Spawner.default_url = "/tree"` to make notebook v7 the default UI.'
)
App = import_item(JUPYTERHUB_SINGLEUSER_APP)
else:
App = None
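The version gate above only needs the major version of the installed notebook package. A sketch of that check in isolation (the function name is illustrative, not part of the source):

```python
def legacy_notebook_unsupported(app_name, notebook_version):
    """Sketch of the gate above: JUPYTERHUB_SINGLEUSER_APP=notebook is
    only valid before notebook v7, which became a jupyter-server
    extension and no longer ships its own server app."""
    major = int(notebook_version.split(".", 1)[0])
    return app_name == "notebook" and major >= 7


assert legacy_notebook_unsupported("notebook", "7.1.0")
assert not legacy_notebook_unsupported("notebook", "6.5.6")
```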

View File

@@ -44,6 +44,7 @@ from jupyterhub._version import __version__, _check_version
from jupyterhub.log import log_request
from jupyterhub.services.auth import HubOAuth, HubOAuthCallbackHandler
from jupyterhub.utils import (
_bool_env,
exponential_backoff,
isoformat,
make_ssl_context,
@@ -55,17 +56,6 @@ from ._disable_user_config import _disable_user_config
SINGLEUSER_TEMPLATES_DIR = str(Path(__file__).parent.resolve().joinpath("templates"))
def _bool_env(key):
"""Cast an environment variable to bool
0, empty, or unset is False; All other values are True.
"""
if os.environ.get(key, "") in {"", "0"}:
return False
else:
return True
def _exclude_home(path_list):
"""Filter out any entries in a path list that are in my home directory.
@@ -127,25 +117,36 @@ class JupyterHubIdentityProvider(IdentityProvider):
# HubAuth gets most of its config from the environment
return HubOAuth(parent=self)
def _patch_xsrf(self, handler):
self.hub_auth._patch_xsrf(handler)
def _patch_get_login_url(self, handler):
original_get_login_url = handler.get_login_url
_hub_login_url = None
def get_login_url():
"""Return the Hub's login URL, to begin login redirect"""
login_url = self.hub_auth.login_url
# add state argument to OAuth url
state = self.hub_auth.set_state_cookie(
handler, next_url=handler.request.uri
)
login_url = url_concat(login_url, {'state': state})
# temporary override at setting level,
nonlocal _hub_login_url
if _hub_login_url is not None:
# cached value, don't call this more than once per handler
return _hub_login_url
# temporary override at settings level,
# to allow any subclass overrides of get_login_url to preserve their effect;
# for example, APIHandler raises 403 to prevent redirects
with mock.patch.dict(
handler.application.settings, {"login_url": login_url}
handler.application.settings, {"login_url": self.hub_auth.login_url}
):
self.log.debug("Redirecting to login url: %s", login_url)
return original_get_login_url()
login_url = original_get_login_url()
self.log.debug("Redirecting to login url: %s", login_url)
# add state argument to OAuth url
# must do this _after_ allowing get_login_url to raise
# so we don't set unused cookies
state = self.hub_auth.set_state_cookie(
handler, next_url=handler.request.uri
)
_hub_login_url = url_concat(login_url, {'state': state})
return _hub_login_url
handler.get_login_url = get_login_url
@@ -153,6 +154,7 @@ class JupyterHubIdentityProvider(IdentityProvider):
if hasattr(handler, "_jupyterhub_user"):
return handler._jupyterhub_user
self._patch_get_login_url(handler)
self._patch_xsrf(handler)
user = await self.hub_auth.get_user(handler, sync=False)
if user is None:
handler._jupyterhub_user = None
@@ -187,6 +189,7 @@ class JupyterHubIdentityProvider(IdentityProvider):
return None
handler._jupyterhub_user = JupyterHubUser(user)
self.hub_auth._persist_url_token_if_set(handler)
return handler._jupyterhub_user
def get_handlers(self):
@@ -483,6 +486,11 @@ class JupyterHubSingleUser(ExtensionApp):
cfg.answer_yes = True
self.config.FileContentsManager.delete_to_trash = False
# load Spawner.notebook_dir configuration, if given
root_dir = os.getenv("JUPYTERHUB_ROOT_DIR", None)
if root_dir:
cfg.root_dir = os.path.expanduser(root_dir)
# load http server config from environment
url = urlparse(os.environ['JUPYTERHUB_SERVICE_URL'])
if url.port:
@@ -599,9 +607,9 @@ class JupyterHubSingleUser(ExtensionApp):
jinja_template_vars['logo_url'] = self.hub_auth.hub_host + url_path_join(
self.hub_auth.hub_prefix, 'logo'
)
jinja_template_vars[
'hub_control_panel_url'
] = self.hub_auth.hub_host + url_path_join(self.hub_auth.hub_prefix, 'home')
jinja_template_vars['hub_control_panel_url'] = (
self.hub_auth.hub_host + url_path_join(self.hub_auth.hub_prefix, 'home')
)
_activity_task = None
@@ -614,10 +622,15 @@ class JupyterHubSingleUser(ExtensionApp):
super().initialize()
app = self.serverapp
app.web_app.settings[
"page_config_hook"
] = app.identity_provider.page_config_hook
app.web_app.settings["log_function"] = log_request
app.web_app.settings["page_config_hook"] = (
app.identity_provider.page_config_hook
)
# disable xsrf_cookie checks by Tornado, which run too early
# checks in Jupyter Server are unconditional
app.web_app.settings["xsrf_cookies"] = False
# if the user has configured a log function in the tornado settings, do not override it
if 'log_function' not in app.config.ServerApp.get('tornado_settings', {}):
app.web_app.settings["log_function"] = log_request
# add jupyterhub version header
headers = app.web_app.settings.setdefault("headers", {})
headers["X-JupyterHub-Version"] = __version__
@@ -625,6 +638,9 @@ class JupyterHubSingleUser(ExtensionApp):
# check jupyterhub version
app.io_loop.run_sync(self.check_hub_version)
# set default CSP to prevent iframe embedding across jupyterhub components
headers.setdefault("Content-Security-Policy", "frame-ancestors 'none'")
async def _start_activity():
self._activity_task = asyncio.ensure_future(self.keep_activity_updated())

View File

@@ -44,21 +44,15 @@ from traitlets.config import Configurable
from .._version import __version__, _check_version
from ..log import log_request
from ..services.auth import HubOAuth, HubOAuthCallbackHandler, HubOAuthenticated
from ..utils import exponential_backoff, isoformat, make_ssl_context, url_path_join
from ..utils import (
_bool_env,
exponential_backoff,
isoformat,
make_ssl_context,
url_path_join,
)
from ._disable_user_config import _disable_user_config, _exclude_home
def _bool_env(key):
"""Cast an environment variable to bool
0, empty, or unset is False; All other values are True.
"""
if os.environ.get(key, "") in {"", "0"}:
return False
else:
return True
# Authenticate requests with the Hub
@@ -669,7 +663,8 @@ class SingleUserNotebookAppMixin(Configurable):
# load the hub-related settings into the tornado settings dict
self.init_hub_auth()
s = self.tornado_settings
s['log_function'] = log_request
# if the user has configured a log function in the tornado settings, do not override it
s.setdefault('log_function', log_request)
s['user'] = self.user
s['group'] = self.group
s['hub_prefix'] = self.hub_prefix
@@ -681,10 +676,10 @@ class SingleUserNotebookAppMixin(Configurable):
)
headers = s.setdefault('headers', {})
headers['X-JupyterHub-Version'] = __version__
# set CSP header directly to workaround bugs in jupyter/notebook 5.0
# set default CSP to prevent iframe embedding across jupyterhub components
headers.setdefault(
'Content-Security-Policy',
';'.join(["frame-ancestors 'self'", "report-uri " + csp_report_uri]),
';'.join(["frame-ancestors 'none'", "report-uri " + csp_report_uri]),
)
super().init_webapp()
@@ -733,9 +728,9 @@ class SingleUserNotebookAppMixin(Configurable):
)
self.jinja_template_vars['hub_host'] = self.hub_host
self.jinja_template_vars['hub_prefix'] = self.hub_prefix
self.jinja_template_vars[
'hub_control_panel_url'
] = self.hub_host + url_path_join(self.hub_prefix, 'home')
self.jinja_template_vars['hub_control_panel_url'] = (
self.hub_host + url_path_join(self.hub_prefix, 'home')
)
settings = self.web_app.settings
# patch classic notebook jinja env
@@ -855,13 +850,21 @@ def _patch_app_base_handlers(app):
if BaseHandler is not None:
base_handlers.append(BaseHandler)
# patch juptyer_server and notebook handlers if they have been imported
# patch jupyter_server and notebook handlers if they have been imported
for base_handler_name in [
"jupyter_server.base.handlers.JupyterHandler",
"notebook.base.handlers.IPythonHandler",
]:
modname, _ = base_handler_name.rsplit(".", 1)
if modname in sys.modules:
root_mod = modname.partition(".")[0]
if root_mod == "notebook":
import notebook
if int(notebook.__version__.partition(".")[0]) >= 7:
# notebook 7 is a server extension,
# it doesn't have IPythonHandler anymore
continue
base_handlers.append(import_item(base_handler_name))
if not base_handlers:

View File

@@ -1,6 +1,7 @@
"""
Contains base Spawner class & default implementation
"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import ast
@@ -162,6 +163,7 @@ class Spawner(LoggingConfigurable):
hub = Any()
orm_spawner = Any()
cookie_options = Dict()
cookie_host_prefix_enabled = Bool()
db = Any()
@@ -274,8 +276,6 @@ class Spawner(LoggingConfigurable):
api_token = Unicode()
oauth_client_id = Unicode()
oauth_scopes = List(Unicode())
@property
def oauth_scopes(self):
warnings.warn(
@@ -971,6 +971,10 @@ class Spawner(LoggingConfigurable):
env['JUPYTERHUB_CLIENT_ID'] = self.oauth_client_id
if self.cookie_options:
env['JUPYTERHUB_COOKIE_OPTIONS'] = json.dumps(self.cookie_options)
env["JUPYTERHUB_COOKIE_HOST_PREFIX_ENABLED"] = str(
int(self.cookie_host_prefix_enabled)
)
env['JUPYTERHUB_HOST'] = self.hub.public_host
env['JUPYTERHUB_OAUTH_CALLBACK_URL'] = url_path_join(
self.user.url, url_escape_path(self.name), 'oauth_callback'
@@ -1455,14 +1459,13 @@ def set_user_setuid(username, chdir=True):
Returned preexec_fn will set uid/gid, and attempt to chdir to the target user's
home directory.
"""
import grp
import pwd
user = pwd.getpwnam(username)
uid = user.pw_uid
gid = user.pw_gid
home = user.pw_dir
gids = [g.gr_gid for g in grp.getgrall() if username in g.gr_mem]
gids = os.getgrouplist(username, gid)
def preexec():
"""Set uid/gid of current process

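The group lookup above swaps a scan of `grp.getgrall()` for `os.getgrouplist`, which is cheaper and, unlike the membership scan, also includes the primary gid that is passed in. A Unix-only sketch, demonstrated for the current user rather than a spawn target:

```python
import os
import pwd

# Resolve supplementary groups the way the patched set_user_setuid does.
me = pwd.getpwuid(os.getuid())
gids = os.getgrouplist(me.pw_name, me.pw_gid)
assert me.pw_gid in gids  # the passed primary gid is always included
```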
View File

@@ -1,6 +1,10 @@
from collections import namedtuple
import pytest
from playwright.async_api import async_playwright
from ..conftest import add_user, new_username
@pytest.fixture()
async def browser():
@@ -12,3 +16,13 @@ async def browser():
yield page
await context.clear_cookies()
await browser.close()
@pytest.fixture
def user_special_chars(app):
"""Fixture for creating a temporary user with special characters in the name"""
user = add_user(app.db, app, name=new_username("testuser<'&\">"))
yield namedtuple('UserSpecialChars', ['user', 'urlname'])(
user,
user.name.replace("<'&\">", "%3C%27%26%22%3E"),
)

View File

@@ -1,6 +1,8 @@
"""Tests for the Playwright Python"""
import asyncio
import json
import pprint
import re
from urllib.parse import parse_qs, urlparse
@@ -10,7 +12,8 @@ from tornado.escape import url_escape
from tornado.httputil import url_concat
from jupyterhub import orm, roles, scopes
from jupyterhub.tests.utils import public_host, public_url, ujoin
from jupyterhub.tests.test_named_servers import named_servers # noqa
from jupyterhub.tests.utils import async_requests, public_host, public_url, ujoin
from jupyterhub.utils import url_escape_path, url_path_join
pytestmark = pytest.mark.browser
@@ -36,11 +39,12 @@ async def test_open_login_page(app, browser):
await expect(form.locator('//h1')).to_have_text("Sign in")
async def test_submit_login_form(app, browser, user):
async def test_submit_login_form(app, browser, user_special_chars):
user = user_special_chars.user
login_url = url_path_join(public_host(app), app.hub.base_url, "login")
await browser.goto(login_url)
await login(browser, user.name, password=user.name)
expected_url = ujoin(public_url(app), f"/user/{user.name}/")
expected_url = public_url(app, user)
await expect(browser).to_have_url(expected_url)
@@ -52,7 +56,7 @@ async def test_submit_login_form(app, browser, user):
# will encode given parameters for an unauthenticated URL in the next url
# the next parameter will contain the app base URL (replaces BASE_URL in tests)
'spawn',
[('param', 'value')],
{'param': 'value'},
'/hub/login?next={{BASE_URL}}hub%2Fspawn%3Fparam%3Dvalue',
'/hub/login?next={{BASE_URL}}hub%2Fspawn%3Fparam%3Dvalue',
),
@@ -60,15 +64,15 @@ async def test_submit_login_form(app, browser, user):
# login?param=fromlogin&next=encoded(/hub/spawn?param=value)
# will drop parameters given to the login page, passing only the next url
'login',
[('param', 'fromlogin'), ('next', '/hub/spawn?param=value')],
'/hub/login?param=fromlogin&next=%2Fhub%2Fspawn%3Fparam%3Dvalue',
'/hub/login?next=%2Fhub%2Fspawn%3Fparam%3Dvalue',
{'param': 'fromlogin', 'next': '/hub/spawn?param=value'},
'/hub/login?param=fromlogin&next={{BASE_URL}}hub%2Fspawn%3Fparam%3Dvalue',
'/hub/login?next={{BASE_URL}}hub%2Fspawn%3Fparam%3Dvalue',
),
(
# login?param=value&anotherparam=anothervalue
# will drop parameters given to the login page, and use an empty next url
'login',
[('param', 'value'), ('anotherparam', 'anothervalue')],
{'param': 'value', 'anotherparam': 'anothervalue'},
'/hub/login?param=value&anotherparam=anothervalue',
'/hub/login?next=',
),
@@ -76,7 +80,7 @@ async def test_submit_login_form(app, browser, user):
# login
# simplest case, accessing the login URL, gives an empty next url
'login',
[],
{},
'/hub/login',
'/hub/login?next=',
),
@@ -89,10 +93,13 @@ async def test_open_url_login(
params,
redirected_url,
form_action,
user,
user_special_chars,
):
user = user_special_chars.user
login_url = url_path_join(public_host(app), app.hub.base_url, url)
await browser.goto(login_url)
if params.get("next"):
params["next"] = url_path_join(app.base_url, params["next"])
url_new = url_path_join(public_host(app), app.hub.base_url, url_concat(url, params))
print(url_new)
await browser.goto(url_new)
@@ -119,7 +126,9 @@ async def test_open_url_login(
await expect(browser).to_have_url(re.compile(pattern))
await expect(browser).not_to_have_url(re.compile(".*/user/.*"))
else:
await expect(browser).to_have_url(re.compile(".*/user/" + f"{user.name}/"))
await expect(browser).to_have_url(
re.compile(".*/user/" + f"{user_special_chars.urlname}/")
)
@pytest.mark.parametrize(
@@ -131,7 +140,7 @@ async def test_open_url_login(
("user", "password"),
],
)
async def test_login_with_invalid_credantials(app, browser, username, pass_w):
async def test_login_with_invalid_credentials(app, browser, username, pass_w):
login_url = url_path_join(public_host(app), app.hub.base_url, "login")
await browser.goto(login_url)
await login(browser, username, pass_w)
@@ -146,7 +155,8 @@ async def test_login_with_invalid_credantials(app, browser, username, pass_w):
# SPAWNING
async def open_spawn_pending(app, browser, user):
async def open_spawn_pending(app, browser, user_special_chars):
user = user_special_chars.user
url = url_path_join(
public_host(app),
url_concat(
@@ -157,18 +167,21 @@ async def open_spawn_pending(app, browser, user):
await browser.goto(url)
await login(browser, user.name, password=user.name)
url_spawn = url_path_join(
public_host(app), app.hub.base_url, '/spawn-pending/' + user.name
public_host(app),
app.hub.base_url,
'/spawn-pending/' + user_special_chars.urlname,
)
await browser.goto(url_spawn)
await expect(browser).to_have_url(url_spawn)
async def test_spawn_pending_server_not_started(
app, browser, no_patience, user, slow_spawn
app, browser, no_patience, user_special_chars, slow_spawn
):
user = user_special_chars.user
# first request, no spawn is pending
# spawn-pending shows button linking to spawn
await open_spawn_pending(app, browser, user)
await open_spawn_pending(app, browser, user_special_chars)
# on the page verify the button and expected information
expected_heading = "Server not running"
heading = browser.locator('//div[@class="text-center"]').get_by_role("heading")
@@ -180,16 +193,20 @@ async def test_spawn_pending_server_not_started(
await expect(launch_btn).to_have_id("start")
await expect(launch_btn).to_be_enabled()
await expect(launch_btn).to_have_count(1)
f_string = re.escape(f"/hub/spawn/{user.name}")
f_string = re.escape(f"/hub/spawn/{user_special_chars.urlname}")
await expect(launch_btn).to_have_attribute('href', re.compile('.*' + f_string))
async def test_spawn_pending_progress(app, browser, no_patience, user, slow_spawn):
async def test_spawn_pending_progress(
app, browser, no_patience, user_special_chars, slow_spawn
):
"""verify that the server process messages are showing up to the user
when the server is going to start up"""
user = user_special_chars.user
urlname = user_special_chars.urlname
# visit the spawn-pending page
await open_spawn_pending(app, browser, user)
await open_spawn_pending(app, browser, user_special_chars)
launch_btn = browser.locator("//div[@class='text-center']").get_by_role(
"button", name="Launch Server"
)
@@ -197,18 +214,18 @@ async def test_spawn_pending_progress(app, browser, no_patience, user, slow_spaw
# begin starting the server
async with browser.expect_navigation(
url=re.compile(".*/spawn-pending/" + f"{user.name}")
url=re.compile(".*/spawn-pending/" + f"{urlname}")
):
await launch_btn.click()
# wait for progress message to appear
progress = browser.locator("#progress-message")
progress_message = await progress.inner_text()
async with browser.expect_navigation(url=re.compile(".*/user/" + f"{user.name}/")):
async with browser.expect_navigation(url=re.compile(".*/user/" + f"{urlname}/")):
# wait for log messages to appear
expected_messages = [
"Server requested",
"Spawning server...",
f"Server ready at {app.base_url}user/{user.name}/",
f"Server ready at {app.base_url}user/{urlname}/",
]
while not user.spawner.ready:
logs_list = [
@@ -222,15 +239,16 @@ async def test_spawn_pending_progress(app, browser, no_patience, user, slow_spaw
if logs_list:
assert progress_message
assert logs_list == expected_messages[: len(logs_list)]
await expect(browser).to_have_url(re.compile(".*/user/" + f"{user.name}/"))
await expect(browser).to_have_url(re.compile(".*/user/" + f"{urlname}/"))
assert user.spawner.ready
async def test_spawn_pending_server_ready(app, browser, user):
async def test_spawn_pending_server_ready(app, browser, user_special_chars):
"""verify that after a successful launch server via the spawn-pending page
the user should see two buttons on the home page"""
await open_spawn_pending(app, browser, user)
user = user_special_chars.user
await open_spawn_pending(app, browser, user_special_chars)
launch_btn = browser.get_by_role("button", name="Launch Server")
await launch_btn.click()
await browser.wait_for_selector("button", state="detached")
@@ -261,9 +279,11 @@ async def open_home_page(app, browser, user):
await expect(browser).to_have_url(re.compile(".*/hub/home"))
async def test_start_button_server_not_started(app, browser, user):
"""verify that when server is not started one button is availeble,
async def test_start_button_server_not_started(app, browser, user_special_chars):
"""verify that when server is not started one button is available,
after starting 2 buttons are available"""
user = user_special_chars.user
urlname = user_special_chars.urlname
await open_home_page(app, browser, user)
# checking that only one button is presented
start_stop_btns = browser.locator('//div[@class="text-center"]').get_by_role(
@@ -273,9 +293,9 @@ async def test_start_button_server_not_started(app, browser, user):
await expect(start_stop_btns).to_be_enabled()
await expect(start_stop_btns).to_have_count(1)
await expect(start_stop_btns).to_have_text(expected_btn_name)
f_string = re.escape(f"/hub/spawn/{user.name}")
f_string = re.escape(f"/hub/spawn/{urlname}")
await expect(start_stop_btns).to_have_attribute('href', re.compile('.*' + f_string))
async with browser.expect_navigation(url=re.compile(".*/user/" + f"{user.name}/")):
async with browser.expect_navigation(url=re.compile(".*/user/" + f"{urlname}/")):
# Start server via clicking on the Start button
await start_stop_btns.click()
# return to Home page
@@ -289,7 +309,7 @@ async def test_start_button_server_not_started(app, browser, user):
await expect(start_stop_btn).to_be_enabled()
for start_stop_btn in await start_stop_btns.all()
]
f_string = re.escape(f"/user/{user.name}")
f_string = re.escape(f"/user/{urlname}")
await expect(start_stop_btns.nth(1)).to_have_attribute(
'href', re.compile('.*' + f_string)
)
@@ -297,16 +317,19 @@ async def test_start_button_server_not_started(app, browser, user):
await expect(start_stop_btns.nth(1)).to_have_id("start")
async def test_stop_button(app, browser, user):
"""verify that the stop button after stoping a server is not shown
async def test_stop_button(app, browser, user_special_chars):
"""verify that the stop button after stopping a server is not shown
the start button is displayed with new name"""
user = user_special_chars.user
await open_home_page(app, browser, user)
# checking that only one button is presented
start_stop_btns = browser.locator('//div[@class="text-center"]').get_by_role(
"button"
)
async with browser.expect_navigation(url=re.compile(".*/user/" + f"{user.name}/")):
async with browser.expect_navigation(
url=re.compile(".*/user/" + re.escape(user_special_chars.urlname) + "/")
):
# Start server via clicking on the Start button
await start_stop_btns.click()
assert user.spawner.ready
@@ -336,10 +359,10 @@ async def open_token_page(app, browser, user):
await expect(browser).to_have_url(re.compile(".*/hub/token"))
async def test_token_request_form_and_panel(app, browser, user):
async def test_token_request_form_and_panel(app, browser, user_special_chars):
"""verify elements of the request token form"""
await open_token_page(app, browser, user)
await open_token_page(app, browser, user_special_chars.user)
request_btn = browser.locator('//div[@class="text-center"]').get_by_role("button")
expected_btn_name = 'Request new API token'
# check if the request token button is enabled
@@ -380,7 +403,7 @@ async def test_token_request_form_and_panel(app, browser, user):
await expect(token_area_heading).to_have_text(expected_panel_token_heading)
token_result = browser.locator('#token-result')
await expect(token_result).not_to_be_empty()
await expect(token_area).to_be_visible()
await expect(token_result).to_be_visible()
# verify that "Your new API Token" panel is hidden after refresh the page
await browser.reload(wait_until="load")
await expect(token_area).to_be_hidden()
@@ -404,14 +427,18 @@ async def test_token_request_form_and_panel(app, browser, user):
("server_up", False),
],
)
async def test_request_token_expiration(app, browser, token_opt, note, user):
async def test_request_token_expiration(
app, browser, token_opt, note, user_special_chars
):
"""verify request token with the different options"""
user = user_special_chars.user
urlname = user_special_chars.urlname
if token_opt == "server_up":
# open the home page
await open_home_page(app, browser, user)
# start server via clicking on the Start button
async with browser.expect_navigation(url=f"**/user/{user.name}/"):
async with browser.expect_navigation(url=f"**/user/{urlname}/"):
await browser.locator("#start").click()
token_page = url_path_join(public_host(app), app.base_url, '/hub/token')
await browser.goto(token_page)
@@ -428,6 +455,11 @@ async def test_request_token_expiration(app, browser, token_opt, note, user):
"button"
)
await reqeust_btn.click()
# wait for token response to show up on the page
await browser.wait_for_load_state("load")
token_result = browser.locator("#token-result")
await expect(token_result).to_be_visible()
# reload the page
await browser.reload(wait_until="load")
# API Tokens table: verify that elements are displayed
api_token_table_area = browser.locator('//div[@class="row"]').nth(2)
@@ -439,7 +471,7 @@ async def test_request_token_expiration(app, browser, token_opt, note, user):
orm_token = user.api_tokens[-1]
if token_opt == "server_up":
expected_note = "Server at " + ujoin(app.base_url, f"/user/{user.name}/")
expected_note = "Server at " + ujoin(app.base_url, f"/user/{urlname}/")
elif note:
expected_note = note
else:
@@ -497,24 +529,33 @@ async def test_request_token_expiration(app, browser, token_opt, note, user):
("both"),
],
)
async def test_revoke_token(app, browser, token_type, user):
"""verify API Tokens table contant in case the server is started"""
async def test_revoke_token(app, browser, token_type, user_special_chars):
"""verify API Tokens table content in case the server is started"""
user = user_special_chars.user
# open the home page
await open_home_page(app, browser, user)
if token_type == "server_up" or token_type == "both":
# Start server via clicking on the Start button
async with browser.expect_navigation(url=f"**/user/{user.name}/"):
async with browser.expect_navigation(
url=f"**/user/{user_special_chars.urlname}/"
):
await browser.locator("#start").click()
# open the token page
next_url = url_path_join(public_host(app), app.base_url, '/hub/token')
await browser.goto(next_url)
await browser.wait_for_load_state("load")
await expect(browser).to_have_url(re.compile(".*/hub/token"))
if token_type == "both" or token_type == "request_by_user":
request_btn = browser.locator('//div[@class="text-center"]').get_by_role(
"button"
)
await request_btn.click()
# wait for token response to show up on the page
await browser.wait_for_load_state("load")
token_result = browser.locator("#token-result")
await expect(token_result).to_be_visible()
# reload the page
await browser.reload(wait_until="load")
revoke_btns = browser.get_by_role("button", name="revoke")
@@ -557,7 +598,8 @@ async def test_revoke_token(app, browser, token_type, user):
("", False),
],
)
async def test_menu_bar(app, browser, page, logged_in, user):
async def test_menu_bar(app, browser, page, logged_in, user_special_chars):
user = user_special_chars.user
url = url_path_join(
public_host(app),
url_concat(
@@ -599,7 +641,9 @@ async def test_menu_bar(app, browser, page, logged_in, user):
expected_url = f"hub/login?next={url_escape(app.base_url)}"
assert expected_url in browser.url
else:
await expect(browser).to_have_url(re.compile(f".*/user/{user.name}/"))
await expect(browser).to_have_url(
re.compile(f".*/user/{user_special_chars.urlname}/")
)
await browser.go_back()
await expect(browser).to_have_url(re.compile(".*" + page))
elif index == 3:
@@ -617,13 +661,14 @@ async def test_menu_bar(app, browser, page, logged_in, user):
"url",
[("/hub/home"), ("/hub/token"), ("/hub/spawn")],
)
async def test_user_logout(app, browser, url, user):
async def test_user_logout(app, browser, url, user_special_chars):
user = user_special_chars.user
if "/hub/home" in url:
await open_home_page(app, browser, user)
elif "/hub/token" in url:
await open_home_page(app, browser, user)
elif "/hub/spawn" in url:
await open_spawn_pending(app, browser, user)
await open_spawn_pending(app, browser, user_special_chars)
logout_btn = browser.get_by_role("button", name="Logout")
await expect(logout_btn).to_be_enabled()
await logout_btn.click()
@@ -637,7 +682,9 @@ async def test_user_logout(app, browser, url, user):
# verify that user can login after logout
await login(browser, user.name, password=user.name)
await expect(browser).to_have_url(re.compile(".*/user/" + f"{user.name}/"))
await expect(browser).to_have_url(
re.compile(".*/user/" + f"{user_special_chars.urlname}/")
)
# OAUTH confirmation page
@@ -688,12 +735,15 @@ async def test_oauth_page(
oauth_client.allowed_scopes = sorted(roles.roles_to_scopes([service_role]))
app.db.commit()
# open the service url in the browser
service_url = url_path_join(public_url(app, service) + 'owhoami/?arg=x')
service_url = url_path_join(public_url(app, service), 'owhoami/?arg=x')
await browser.goto(service_url)
expected_redirect_url = url_path_join(
app.base_url + f"services/{service.name}/oauth_callback"
)
if app.subdomain_host:
expected_redirect_url = url_path_join(
public_url(app, service), "oauth_callback"
)
else:
expected_redirect_url = url_path_join(service.prefix, "oauth_callback")
expected_client_id = f"service-{service.name}"
# decode the URL
@@ -749,9 +799,11 @@ async def test_oauth_page(
user_scopes or ['(no_scope)'], user.name
)
desc_list_expected = [
f"{sd['description']} Applies to {sd['filter']}."
if sd.get('filter')
else sd['description']
(
f"{sd['description']} Applies to {sd['filter']}."
if sd.get('filter')
else sd['description']
)
for sd in scope_descriptions
]
assert sorted(desc_list_form) == sorted(desc_list_expected)
@@ -1068,3 +1120,261 @@ async def test_start_stop_server_on_admin_page(
await expect(browser.get_by_role("button", name="Spawn Page")).to_have_count(
len(users_list)
)
@pytest.mark.parametrize(
"case",
[
"fresh",
"invalid",
"valid-prefix-invalid-root",
"valid-prefix-invalid-other-prefix",
],
)
async def test_login_xsrf_initial_cookies(app, browser, case, username):
"""Test that login works with various initial states for xsrf tokens
Page will be reloaded with correct values
"""
hub_root = public_host(app)
hub_url = url_path_join(public_host(app), app.hub.base_url)
hub_parent = hub_url.rstrip("/").rsplit("/", 1)[0] + "/"
login_url = url_path_join(
hub_url, url_concat("login", {"next": url_path_join(app.base_url, "/hub/home")})
)
# start with all cookies cleared
await browser.context.clear_cookies()
if case == "invalid":
await browser.context.add_cookies(
[{"name": "_xsrf", "value": "invalid-hub-prefix", "url": hub_url}]
)
elif case.startswith("valid-prefix"):
if "invalid-root" in case:
invalid_url = hub_root
else:
invalid_url = hub_parent
await browser.goto(login_url)
# first visit sets valid xsrf cookie
cookies = await browser.context.cookies()
assert len(cookies) == 1
# second visit is also made with invalid xsrf on `/`
# handling of this behavior is undefined in HTTP itself!
# _either_ the invalid cookie on / is ignored
# _or_ both will be cleared
# currently, this test assumes the observed behavior,
# which is that the invalid cookie on `/` has _higher_ priority
await browser.context.add_cookies(
[{"name": "_xsrf", "value": "invalid-root", "url": invalid_url}]
)
cookies = await browser.context.cookies()
assert len(cookies) == 2
# after visiting page, cookies get re-established
await browser.goto(login_url)
cookies = await browser.context.cookies()
print(cookies)
cookie = cookies[0]
assert cookie['name'] == '_xsrf'
assert cookie["path"] == app.hub.base_url
# next page visit, cookies don't change
await browser.goto(login_url)
cookies_2 = await browser.context.cookies()
assert cookies == cookies_2
# login is successful
await login(browser, username, username)
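The multiple-`_xsrf`-cookie situation exercised above is what motivated the 4.1.2 rework described in the changelog: standard cookie parsers return only a single value per name, so the Hub has to scan the raw `Cookie` header itself and accept the first valid token. A minimal stdlib-only sketch of that kind of parsing follows; the helper name and header values are illustrative, not JupyterHub's actual implementation:

```python
def all_cookie_values(cookie_header: str, name: str) -> list:
    """Collect *every* value for a repeated cookie name.

    http.cookies.SimpleCookie keeps only one entry per name,
    so duplicate names set on different paths require manual parsing.
    """
    values = []
    for part in cookie_header.split(";"):
        part = part.strip()
        if "=" not in part:
            continue
        key, _, value = part.partition("=")
        if key == name:
            values.append(value)
    return values


# a browser sends both cookies when "/" and "/hub/" each set "_xsrf"
header = "_xsrf=root-token; _xsrf=hub-token; jupyterhub-session-id=abc"
print(all_cookie_values(header, "_xsrf"))  # ['root-token', 'hub-token']
```

With all candidate values in hand, a server can try each in order and accept the first one that validates, instead of rejecting the request because the single reported value happened to be the stale one.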
def _cookie_dict(cookie_list):
"""Convert list of cookies to dict of the form
{ 'path': {'key': {cookie} } }
"""
cookie_dict = {}
for cookie in cookie_list:
path_cookies = cookie_dict.setdefault(cookie['path'], {})
path_cookies[cookie['name']] = cookie
return cookie_dict
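The `_cookie_dict` helper groups Playwright's flat cookie list by path, which is what lets the assertions in the next test look up each server's `_xsrf` token independently. A standalone sketch of its behavior, with made-up cookie values:

```python
def _cookie_dict(cookie_list):
    """Convert list of cookies to dict of the form
    { 'path': {'key': {cookie} } }
    """
    cookie_dict = {}
    for cookie in cookie_list:
        path_cookies = cookie_dict.setdefault(cookie['path'], {})
        path_cookies[cookie['name']] = cookie
    return cookie_dict


cookies = [
    {"name": "_xsrf", "value": "hub-token", "path": "/hub/"},
    {"name": "_xsrf", "value": "user-token", "path": "/user/alice/"},
    {"name": "jupyterhub-session-id", "value": "abc", "path": "/"},
]
by_path = _cookie_dict(cookies)
print(by_path["/hub/"]["_xsrf"]["value"])         # hub-token
print(by_path["/user/alice/"]["_xsrf"]["value"])  # user-token
```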
async def test_singleuser_xsrf(
app, browser, user, create_user_with_scopes, full_spawn, named_servers # noqa: F811
):
# full login process, checking XSRF handling
# start two servers
target_user = user
target_start = asyncio.ensure_future(target_user.spawn())
browser_user = create_user_with_scopes("self", "access:servers")
# login browser_user
login_url = url_path_join(public_host(app), app.hub.base_url, "login")
await browser.goto(login_url)
await login(browser, browser_user.name, browser_user.name)
# end up at single-user
await expect(browser).to_have_url(re.compile(rf".*/user/{browser_user.name}/.*"))
# wait for target user to start, too
await target_start
await app.proxy.add_user(target_user)
# visit target user, sets credentials for second server
await browser.goto(public_url(app, target_user))
await expect(browser).to_have_url(re.compile(r".*/oauth2/authorize"))
auth_button = browser.locator('//input[@type="submit"]')
await expect(auth_button).to_be_enabled()
await auth_button.click()
await expect(browser).to_have_url(re.compile(rf".*/user/{target_user.name}/.*"))
# at this point, we are on a page served by target_user,
# logged in as browser_user
# basic check that xsrf isolation works
cookies = await browser.context.cookies()
cookie_dict = _cookie_dict(cookies)
pprint.pprint(cookie_dict)
# we should have xsrf tokens for both singleuser servers and the hub
target_prefix = target_user.prefix
user_prefix = browser_user.prefix
hub_prefix = app.hub.base_url
assert target_prefix in cookie_dict
assert user_prefix in cookie_dict
assert hub_prefix in cookie_dict
target_xsrf = cookie_dict[target_prefix].get("_xsrf", {}).get("value")
assert target_xsrf
user_xsrf = cookie_dict[user_prefix].get("_xsrf", {}).get("value")
assert user_xsrf
hub_xsrf = cookie_dict[hub_prefix].get("_xsrf", {}).get("value")
assert hub_xsrf
assert hub_xsrf != target_xsrf
assert hub_xsrf != user_xsrf
assert target_xsrf != user_xsrf
# we are on a page served by target_user
# check that we can't access
async def fetch_user_page(path, params=None):
url = url_path_join(public_url(app, browser_user), path)
if params:
url = url_concat(url, params)
status = await browser.evaluate(
"""
async (user_url) => {
try {
response = await fetch(user_url);
} catch (e) {
return 'error';
}
return response.status;
}
""",
url,
)
return status
if app.subdomain_host:
expected_status = 'error'
else:
expected_status = 403
status = await fetch_user_page("/api/contents")
assert status == expected_status
status = await fetch_user_page("/api/contents", params={"_xsrf": target_xsrf})
assert status == expected_status
if not app.subdomain_host:
expected_status = 200
status = await fetch_user_page("/api/contents", params={"_xsrf": user_xsrf})
assert status == expected_status
# check that we can't iframe the other user's page
async def iframe(src):
return await browser.evaluate(
"""
async (src) => {
const frame = document.createElement("iframe");
frame.src = src;
return new Promise((resolve, reject) => {
frame.addEventListener("load", (event) => {
if (frame.contentDocument) {
resolve("got document!");
} else {
resolve("blocked")
}
});
setTimeout(() => {
// some browsers (firefox) never fire load event
// despite spec apparently stating it must always do so,
// even for rejected frames
resolve("timeout")
}, 3000)
document.body.appendChild(frame);
});
}
""",
src,
)
hub_iframe = await iframe(url_path_join(public_url(app), "hub/admin"))
assert hub_iframe in {"timeout", "blocked"}
user_iframe = await iframe(public_url(app, browser_user))
assert user_iframe in {"timeout", "blocked"}
# check that server page can still connect to its own kernels
token = target_user.new_api_token(scopes=["access:servers!user"])
async def test_kernel(kernels_url):
headers = {"Authorization": f"Bearer {token}"}
r = await async_requests.post(kernels_url, headers=headers)
r.raise_for_status()
kernel = r.json()
kernel_id = kernel["id"]
kernel_url = url_path_join(kernels_url, kernel_id)
kernel_ws_url = "ws" + url_path_join(kernel_url, "channels")[4:]
try:
result = await browser.evaluate(
"""
async (ws_url) => {
ws = new WebSocket(ws_url);
finished = await new Promise((resolve, reject) => {
ws.onerror = (err) => {
reject(err);
};
ws.onopen = () => {
resolve("ok");
};
});
return finished;
}
""",
kernel_ws_url,
)
finally:
r = await async_requests.delete(kernel_url, headers=headers)
r.raise_for_status()
assert result == "ok"
kernels_url = url_path_join(public_url(app, target_user), "/api/kernels")
await test_kernel(kernels_url)
# final check: make sure named servers work.
# first, visit spawn page to launch server,
# will issue cookies, etc.
server_name = "named"
url = url_path_join(
public_host(app),
url_path_join(app.base_url, f"hub/spawn/{browser_user.name}/{server_name}"),
)
await browser.goto(url)
await expect(browser).to_have_url(
re.compile(rf".*/user/{browser_user.name}/{server_name}/.*")
)
# from named server URL, make sure we can talk to a kernel
token = browser_user.new_api_token(scopes=["access:servers!user"])
# named-server URL
kernels_url = url_path_join(
public_url(app, browser_user), server_name, "api/kernels"
)
await test_kernel(kernels_url)
# go back to user's own page, test again
# make sure we didn't break anything
await browser.goto(public_url(app, browser_user))
await test_kernel(url_path_join(public_url(app, browser_user), "api/kernels"))

View File

@@ -23,6 +23,7 @@ Fixtures to add functionality or spawning behavior
- `slow_bad_spawn`
"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import asyncio
@@ -447,8 +448,6 @@ def create_user_with_scopes(app, create_temp_role):
return app.users[orm_user.id]
yield temp_user_creator
for user in temp_users:
app.users.delete(user)
@fixture

View File

@@ -30,6 +30,7 @@ class JupyterHubTestHandler(JupyterHandler):
info = {
"current_user": self.current_user,
"config": self.app.config,
"root_dir": self.contents_manager.root_dir,
"disable_user_config": getattr(self.app, "disable_user_config", None),
"settings": self.settings,
"config_file_paths": self.app.config_file_paths,

View File

@@ -26,6 +26,7 @@ Other components
- public_url
"""
import asyncio
import os
import sys
@@ -42,8 +43,8 @@ from .. import metrics, orm, roles
from ..app import JupyterHub
from ..auth import PAMAuthenticator
from ..spawner import SimpleLocalProcessSpawner
from ..utils import random_port, utcnow
from .utils import async_requests, public_url, ssl_setup
from ..utils import random_port, url_path_join, utcnow
from .utils import AsyncSession, public_url, ssl_setup
def mock_authenticate(username, password, service, encoding):
@@ -355,29 +356,32 @@ class MockHub(JupyterHub):
async def login_user(self, name):
"""Login a user by name, returning her cookies."""
base_url = public_url(self)
external_ca = None
s = AsyncSession()
if self.internal_ssl:
external_ca = self.external_certs['files']['ca']
s.verify = self.external_certs['files']['ca']
login_url = base_url + 'hub/login'
r = await async_requests.get(login_url)
r = await s.get(login_url)
r.raise_for_status()
xsrf = r.cookies['_xsrf']
r = await async_requests.post(
r = await s.post(
url_concat(login_url, {"_xsrf": xsrf}),
cookies=r.cookies,
data={'username': name, 'password': name},
allow_redirects=False,
verify=external_ca,
)
r.raise_for_status()
r.cookies["_xsrf"] = xsrf
assert sorted(r.cookies.keys()) == [
# make second request to get updated xsrf cookie
r2 = await s.get(
url_path_join(base_url, "hub/home"),
allow_redirects=False,
)
assert r2.status_code == 200
assert sorted(s.cookies.keys()) == [
'_xsrf',
'jupyterhub-hub-login',
'jupyterhub-session-id',
]
return r.cookies
return s.cookies
class InstrumentedSpawner(MockSpawner):

View File

@@ -1,4 +1,5 @@
"""Example JupyterServer app subclass"""
from jupyter_server.base.handlers import JupyterHandler
from jupyter_server.serverapp import ServerApp
from tornado import web

View File

@@ -12,6 +12,7 @@ Handlers and their purpose include:
- WhoAmIHandler: returns name of user making a request (deprecated cookie login)
- OWhoAmIHandler: returns name of user making a request (OAuth login)
"""
import json
import os
import pprint

View File

@@ -11,6 +11,7 @@ Handlers and their purpose include:
- ArgsHandler: allowing retrieval of `sys.argv`.
"""
import json
import os
import sys

View File

@@ -4,6 +4,7 @@ Run with old versions of jupyterhub to test upgrade/downgrade
used in test_db.py
"""
from datetime import datetime
from functools import partial

View File

@@ -1,4 +1,5 @@
"""Tests for the REST API."""
import asyncio
import json
import re
@@ -6,7 +7,7 @@ import sys
import uuid
from datetime import datetime, timedelta
from unittest import mock
from urllib.parse import quote, urlparse
from urllib.parse import parse_qs, quote, urlparse
from pytest import fixture, mark
from tornado.httputil import url_concat
@@ -95,7 +96,7 @@ async def test_post_content_type(app, content_type, status):
assert r.status_code == status
@mark.parametrize("xsrf_in_url", [True, False])
@mark.parametrize("xsrf_in_url", [True, False, "invalid"])
@mark.parametrize(
"method, path",
[
@@ -106,6 +107,13 @@ async def test_post_content_type(app, content_type, status):
async def test_xsrf_check(app, username, method, path, xsrf_in_url):
cookies = await app.login_user(username)
xsrf = cookies['_xsrf']
if xsrf_in_url == "invalid":
cookies.pop("_xsrf")
# a well-formed old-style tornado xsrf token is no longer accepted
xsrf = cookies['_xsrf'] = (
"2|7329b149|d837ced983e8aac7468bc7a61ce3d51a|1708610065"
)
url = path.format(username=username)
if xsrf_in_url:
url = f"{url}?_xsrf={xsrf}"
@@ -116,12 +124,47 @@ async def test_xsrf_check(app, username, method, path, xsrf_in_url):
noauth=True,
cookies=cookies,
)
if xsrf_in_url:
if xsrf_in_url is True:
assert r.status_code == 200
else:
assert r.status_code == 403
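The `xsrf_in_url` branch above appends the token as a `_xsrf` query argument, which tornado accepts as an alternative to the `X-XSRFToken` header because `get_argument` covers query arguments. Building such a URL needs nothing beyond the stdlib; the helper name and token value here are placeholders, not project code:

```python
from urllib.parse import urlencode


def with_xsrf(url: str, token: str) -> str:
    """Append an _xsrf query argument, respecting any existing query string."""
    sep = "&" if "?" in url else "?"
    return url + sep + urlencode({"_xsrf": token})


print(with_xsrf("/hub/api/users/alice/server", "token123"))
# /hub/api/users/alice/server?_xsrf=token123
```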
@mark.parametrize(
"auth, expected_message",
[
("", "Missing or invalid credentials"),
("cookie_no_xsrf", "'_xsrf' argument missing from GET"),
("cookie_xsrf_mismatch", "XSRF cookie does not match GET argument"),
("token_no_scope", "requires any of [list:users]"),
("cookie_no_scope", "requires any of [list:users]"),
],
)
async def test_permission_error_messages(app, user, auth, expected_message):
# 1. no credentials, should be 403 and not mention xsrf
url = public_url(app, path="hub/api/users")
kwargs = {}
kwargs["headers"] = headers = {}
kwargs["params"] = params = {}
if auth == "token_no_scope":
token = user.new_api_token()
headers["Authorization"] = f"Bearer {token}"
elif "cookie" in auth:
cookies = kwargs["cookies"] = await app.login_user(user.name)
if auth == "cookie_no_scope":
params["_xsrf"] = cookies["_xsrf"]
if auth == "cookie_xsrf_mismatch":
params["_xsrf"] = "somethingelse"
headers['Sec-Fetch-Mode'] = 'cors'
r = await async_requests.get(url, **kwargs)
assert r.status_code == 403
response = r.json()
message = response["message"]
assert expected_message in message
# --------------
# User API tests
# --------------
@@ -231,20 +274,22 @@ def max_page_limit(app):
@mark.user
@mark.role
@mark.parametrize(
"n, offset, limit, accepts_pagination, expected_count",
"n, offset, limit, accepts_pagination, expected_count, include_stopped_servers",
[
(10, None, None, False, 10),
(10, None, None, True, 10),
(10, 5, None, True, 5),
(10, 5, None, False, 5),
(10, 5, 1, True, 1),
(10, 10, 10, True, 0),
(10, None, None, False, 10, False),
(10, None, None, True, 10, False),
(10, 5, None, True, 5, False),
(10, 5, None, False, 5, False),
(10, None, 5, True, 5, True),
(10, 5, 1, True, 1, True),
(10, 10, 10, True, 0, False),
( # default page limit, pagination expected
30,
None,
None,
True,
'default',
False,
),
(
# default max page limit, pagination not expected
@@ -253,6 +298,7 @@ def max_page_limit(app):
None,
False,
'max',
False,
),
(
# limit exceeded
@@ -261,6 +307,7 @@ def max_page_limit(app):
500,
False,
'max',
False,
),
],
)
@@ -273,6 +320,7 @@ async def test_get_users_pagination(
expected_count,
default_page_limit,
max_page_limit,
include_stopped_servers,
):
db = app.db
@@ -299,6 +347,11 @@ async def test_get_users_pagination(
if limit:
params['limit'] = limit
url = url_concat(url, params)
if include_stopped_servers:
# assumes limit is set. There doesn't seem to be a way to set valueless query
# params using url_concat
url += "&include_stopped_servers"
headers = auth_header(db, 'admin')
if accepts_pagination:
headers['Accept'] = PAGINATION_MEDIA_TYPE
@@ -311,6 +364,11 @@ async def test_get_users_pagination(
"_pagination",
}
pagination = response["_pagination"]
if include_stopped_servers and pagination["next"]:
next_query = parse_qs(
urlparse(pagination["next"]["url"]).query, keep_blank_values=True
)
assert "include_stopped_servers" in next_query
users = response["items"]
else:
users = response
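The `keep_blank_values=True` argument in the hunk above matters: `include_stopped_servers` is a valueless flag, and without that argument `parse_qs` silently drops query fields that have no `=`. A quick stdlib illustration, using a made-up pagination link:

```python
from urllib.parse import parse_qs, urlparse

next_url = "/hub/api/users?offset=5&limit=1&include_stopped_servers"
query = urlparse(next_url).query

# default behavior drops the valueless flag entirely
assert "include_stopped_servers" not in parse_qs(query)

# keep_blank_values=True preserves it as an empty-string value
parsed = parse_qs(query, keep_blank_values=True)
print(parsed["include_stopped_servers"])  # ['']
print(parsed["offset"])                   # ['5']
```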

View File

@@ -1,4 +1,5 @@
"""Test the JupyterHub entry point"""
import asyncio
import binascii
import json

View File

@@ -1,4 +1,5 @@
"""Tests for PAM authentication"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import logging

View File

@@ -7,6 +7,7 @@ authentication can expire in a number of ways:
- doesn't need refresh
- needs refresh and cannot be refreshed without new login
"""
from unittest import mock
from urllib.parse import parse_qs, urlparse

View File

@@ -1,4 +1,5 @@
"""Tests for dummy authentication"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
from jupyterhub.auth import DummyAuthenticator

View File

@@ -5,6 +5,7 @@ To test a new schema or event, simply add it to the
You *shouldn't* need to write new tests.
"""
import io
import json
import logging

View File

@@ -1,4 +1,5 @@
"""Tests for jupyterhub internal_ssl connections"""
import sys
import time
from unittest import mock

View File

@@ -1,4 +1,5 @@
"""Tests for named servers"""
import asyncio
import json
import time

View File

@@ -1,4 +1,5 @@
"""Tests for basic object-wrappers"""
import socket
import pytest

View File

@@ -1,4 +1,5 @@
"""Tests for the ORM bits"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import os

View File

@@ -1,4 +1,5 @@
"""Tests for HTML pages"""
import asyncio
import sys
from unittest import mock
@@ -682,11 +683,10 @@ async def test_other_user_url(app, username, user, group, create_temp_role, has_
],
)
async def test_page_with_token(app, user, url, token_in):
cookies = await app.login_user(user.name)
token = user.new_api_token()
if token_in == "url":
url = url_concat(url, {"token": token})
headers = None
headers = {}
elif token_in == "header":
headers = {
"Authorization": f"token {token}",
@@ -731,14 +731,13 @@ async def test_login_strip(app, form_user, auth_user, form_password):
"""Test that login form strips space form usernames, but not passwords"""
form_data = {"username": form_user, "password": form_password}
expected_auth = {"username": auth_user, "password": form_password}
base_url = public_url(app)
called_with = []
async def mock_authenticate(handler, data):
called_with.append(data)
with mock.patch.object(app.authenticator, 'authenticate', mock_authenticate):
r = await async_requests.get(base_url + 'hub/login')
r = await get_page('login', app)
r.raise_for_status()
cookies = r.cookies
xsrf = cookies['_xsrf']
@@ -919,17 +918,19 @@ async def test_auto_login(app, request):
async def test_auto_login_logout(app):
name = 'burnham'
cookies = await app.login_user(name)
s = AsyncSession()
s.cookies = cookies
with mock.patch.dict(
app.tornado_settings, {'authenticator': Authenticator(auto_login=True)}
):
r = await async_requests.get(
r = await s.get(
public_host(app) + app.tornado_settings['logout_url'], cookies=cookies
)
r.raise_for_status()
logout_url = public_host(app) + app.tornado_settings['logout_url']
assert r.url == logout_url
assert r.cookies == {}
assert list(s.cookies.keys()) == ["_xsrf"]
# don't include logged-out user in page:
try:
idx = r.text.index(name)
@@ -943,19 +944,23 @@ async def test_auto_login_logout(app):
async def test_logout(app):
name = 'wash'
cookies = await app.login_user(name)
r = await async_requests.get(
public_host(app) + app.tornado_settings['logout_url'], cookies=cookies
s = AsyncSession()
s.cookies = cookies
r = await s.get(
public_host(app) + app.tornado_settings['logout_url'],
)
r.raise_for_status()
login_url = public_host(app) + app.tornado_settings['login_url']
assert r.url == login_url
assert r.cookies == {}
assert list(s.cookies.keys()) == ["_xsrf"]
@pytest.mark.parametrize('shutdown_on_logout', [True, False])
async def test_shutdown_on_logout(app, shutdown_on_logout):
name = 'shutitdown'
cookies = await app.login_user(name)
s = AsyncSession()
s.cookies = cookies
user = app.users[name]
# start the user's server
@@ -975,14 +980,14 @@ async def test_shutdown_on_logout(app, shutdown_on_logout):
with mock.patch.dict(
app.tornado_settings, {'shutdown_on_logout': shutdown_on_logout}
):
r = await async_requests.get(
r = await s.get(
public_host(app) + app.tornado_settings['logout_url'], cookies=cookies
)
r.raise_for_status()
login_url = public_host(app) + app.tornado_settings['login_url']
assert r.url == login_url
assert r.cookies == {}
assert list(s.cookies.keys()) == ["_xsrf"]
# wait for any pending state to resolve
for i in range(50):

View File

@@ -1,4 +1,5 @@
"""Test a proxy being started before the Hub"""
import json
import os
from contextlib import contextmanager
@@ -9,6 +10,7 @@ import pytest
from traitlets import TraitError
from traitlets.config import Config
from ..utils import random_port
from ..utils import url_path_join as ujoin
from ..utils import wait_for_http_server
from .mocking import MockHub
@@ -27,10 +29,11 @@ def disable_check_routes(app):
@skip_if_ssl
@pytest.mark.flaky(reruns=2)
async def test_external_proxy(request):
auth_token = 'secret!'
proxy_ip = '127.0.0.1'
proxy_port = 54321
proxy_port = random_port()
cfg = Config()
cfg.ConfigurableHTTPProxy.auth_token = auth_token
cfg.ConfigurableHTTPProxy.api_url = 'http://%s:%i' % (proxy_ip, proxy_port)
@@ -127,7 +130,7 @@ async def test_external_proxy(request):
proxy.wait(timeout=10)
new_auth_token = 'different!'
env['CONFIGPROXY_AUTH_TOKEN'] = new_auth_token
proxy_port = 55432
proxy_port = random_port()
cmd = [
'configurable-http-proxy',
'--ip',

View File

@@ -1,4 +1,5 @@
"""Test roles"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import json

View File

@@ -1,4 +1,5 @@
"""Test scopes for API handlers"""
from operator import itemgetter
from unittest import mock
@@ -912,6 +913,22 @@ def test_intersect_expanded_scopes(left, right, expected, should_warn, recwarn):
["read:users!user=uy"],
{"gx": ["ux"], "gy": ["uy"]},
),
(
# make sure the group > user > server hierarchy
# is managed
["read:servers!server=ux/server", "read:servers!group=gy"],
["read:servers!server=uy/server", "read:servers!user=ux"],
["read:servers!server=ux/server", "read:servers!server=uy/server"],
{"gx": ["ux"], "gy": ["uy"]},
),
(
# make sure the group > user hierarchy
# is managed
["read:servers!user=ux", "read:servers!group=gy"],
["read:servers!user=uy", "read:servers!group=gx"],
["read:servers!user=ux", "read:servers!user=uy"],
{"gx": ["ux"], "gy": ["uy"]},
),
],
)
def test_intersect_groups(request, db, left, right, expected, groups):

View File

@@ -1,4 +1,5 @@
"""Tests for services"""
import os
import sys
from binascii import hexlify

View File

@@ -1,4 +1,5 @@
"""Tests for service authentication"""
import copy
import os
import sys
@@ -384,7 +385,7 @@ async def test_oauth_service_roles(
# token-authenticated request to HubOAuth
token = app.users[name].new_api_token()
# token in ?token parameter
r = await async_requests.get(url_concat(url, {'token': token}))
r = await async_requests.get(url_concat(url, {'token': token}), headers=s.headers)
r.raise_for_status()
reply = r.json()
assert reply['name'] == name
@@ -392,7 +393,9 @@ async def test_oauth_service_roles(
# verify that ?token= requests set a cookie
assert len(r.cookies) != 0
# ensure cookie works in future requests
r = await async_requests.get(url, cookies=r.cookies, allow_redirects=False)
r = await async_requests.get(
url, cookies=r.cookies, allow_redirects=False, headers=s.headers
)
r.raise_for_status()
assert r.url == url
reply = r.json()
@@ -525,7 +528,7 @@ async def test_oauth_cookie_collision(app, mockservice_url, create_user_with_sco
print(url)
s = AsyncSession()
name = 'mypha'
user = create_user_with_scopes("access:services", name=name)
create_user_with_scopes("access:services", name=name)
s.cookies = await app.login_user(name)
state_cookie_name = 'service-%s-oauth-state' % service.name
service_cookie_name = 'service-%s' % service.name
@@ -548,10 +551,9 @@ async def test_oauth_cookie_collision(app, mockservice_url, create_user_with_sco
assert s.cookies[state_cookie_name] == state_1
# finish oauth 2
hub_xsrf = s.cookies.get("_xsrf", path=app.hub.base_url)
# submit the oauth form to complete authorization
r = await s.post(
oauth_2.url, data={'scopes': ['identify'], "_xsrf": s.cookies["_xsrf"]}
)
r = await s.post(oauth_2.url, data={'scopes': ['identify'], "_xsrf": hub_xsrf})
r.raise_for_status()
assert r.url == url
# after finishing, state cookie is cleared
@@ -561,9 +563,7 @@ async def test_oauth_cookie_collision(app, mockservice_url, create_user_with_sco
service_cookie_2 = s.cookies[service_cookie_name]
# finish oauth 1
r = await s.post(
oauth_1.url, data={'scopes': ['identify'], "_xsrf": s.cookies["_xsrf"]}
)
r = await s.post(oauth_1.url, data={'scopes': ['identify'], "_xsrf": hub_xsrf})
r.raise_for_status()
assert r.url == url
@@ -632,7 +632,7 @@ async def test_oauth_logout(app, mockservice_url, create_user_with_scopes):
r = await s.get(public_url(app, path='hub/logout'))
r.raise_for_status()
# verify that all cookies other than the service cookie are cleared
assert sorted(s.cookies.keys()) == ["_xsrf", service_cookie_name]
assert sorted(set(s.cookies.keys())) == ["_xsrf", service_cookie_name]
# verify that clearing session id invalidates service cookie
# i.e. redirect back to login page
r = await s.get(url)


@@ -1,7 +1,9 @@
"""Tests for jupyterhub.singleuser"""
import os
import sys
from contextlib import nullcontext
from pprint import pprint
from subprocess import CalledProcessError, check_output
from unittest import mock
from urllib.parse import urlencode, urlparse
@@ -16,6 +18,14 @@ from ..utils import url_path_join
from .mocking import public_url
from .utils import AsyncSession, async_requests, get_page
IS_JUPYVERSE = False # backport compatibility
@pytest.fixture(autouse=True)
def _jupyverse(app):
if IS_JUPYVERSE:
app.config.Spawner.default_url = "/lab"
@pytest.mark.parametrize(
"access_scopes, server_name, expect_success",
@@ -64,18 +74,20 @@ async def test_singleuser_auth(
spawner = user.spawners[server_name]
url = url_path_join(public_url(app, user), server_name)
s = AsyncSession()
# no cookies, redirects to login page
r = await async_requests.get(url)
r = await s.get(url)
r.raise_for_status()
assert '/hub/login' in r.url
# unauthenticated /api/ should 403, not redirect
api_url = url_path_join(url, "api/status")
r = await async_requests.get(api_url, allow_redirects=False)
r = await s.get(api_url, allow_redirects=False)
assert r.status_code == 403
# with cookies, login successful
r = await async_requests.get(url, cookies=cookies)
r = await s.get(url, cookies=cookies)
r.raise_for_status()
assert (
urlparse(r.url)
@@ -89,7 +101,7 @@ async def test_singleuser_auth(
assert r.status_code == 200
# logout
r = await async_requests.get(url_path_join(url, 'logout'), cookies=cookies)
r = await s.get(url_path_join(url, 'logout'))
assert len(r.cookies) == 0
# accessing another user's server hits the oauth confirmation page
@@ -135,6 +147,8 @@ async def test_singleuser_auth(
async def test_disable_user_config(request, app, tmpdir, full_spawn):
# login, start the server
cookies = await app.login_user('nandy')
s = AsyncSession()
s.cookies = cookies
user = app.users['nandy']
# stop spawner, if running:
if user.running:
@@ -159,21 +173,17 @@ async def test_disable_user_config(request, app, tmpdir, full_spawn):
url = public_url(app, user)
# with cookies, login successful
r = await async_requests.get(url, cookies=cookies)
r = await s.get(url)
r.raise_for_status()
assert r.url.rstrip('/').endswith(
url_path_join('/user/nandy', user.spawner.default_url or "/tree")
)
assert r.status_code == 200
r = await async_requests.get(
url_path_join(public_url(app, user), 'jupyterhub-test-info'), cookies=cookies
)
r = await s.get(url_path_join(public_url(app, user), 'jupyterhub-test-info'))
r.raise_for_status()
info = r.json()
import pprint
pprint.pprint(info)
pprint(info)
assert info['disable_user_config']
server_config = info['config']
settings = info['settings']
@@ -198,6 +208,79 @@ async def test_disable_user_config(request, app, tmpdir, full_spawn):
assert_not_in_home(path, key)
@pytest.mark.parametrize("extension", [True, False])
@pytest.mark.parametrize("notebook_dir", ["", "~", "~/sub", "ABS"])
async def test_notebook_dir(
request, app, tmpdir, user, full_spawn, extension, notebook_dir
):
if extension:
try:
import jupyter_server # noqa
except ImportError:
pytest.skip("needs jupyter-server 2")
else:
if jupyter_server.version_info < (2,):
pytest.skip("needs jupyter-server 2")
token = user.new_api_token(scopes=["access:servers!user"])
headers = {"Authorization": f"Bearer {token}"}
spawner = user.spawner
if extension:
user.spawner.environment["JUPYTERHUB_SINGLEUSER_EXTENSION"] = "1"
else:
user.spawner.environment["JUPYTERHUB_SINGLEUSER_EXTENSION"] = "0"
home_dir = tmpdir.join("home").mkdir()
sub_dir = home_dir.join("sub").mkdir()
with sub_dir.join("subfile.txt").open("w") as f:
f.write("txt\n")
abs_dir = tmpdir.join("abs").mkdir()
with abs_dir.join("absfile.txt").open("w") as f:
f.write("absfile\n")
if notebook_dir:
expected_root_dir = notebook_dir.replace("ABS", str(abs_dir)).replace(
"~", str(home_dir)
)
else:
expected_root_dir = str(home_dir)
spawner.notebook_dir = notebook_dir.replace("ABS", str(abs_dir))
# home_dir is defined on SimpleSpawner
user.spawner.home_dir = home = str(home_dir)
spawner.environment["HOME"] = home
await user.spawn()
await app.proxy.add_user(user)
url = public_url(app, user)
r = await async_requests.get(
url_path_join(public_url(app, user), 'jupyterhub-test-info'), headers=headers
)
r.raise_for_status()
info = r.json()
pprint(info)
assert info["root_dir"] == expected_root_dir
# secondary check: make sure it has the intended effect on root_dir
r = await async_requests.get(
url_path_join(public_url(app, user), 'api/contents/'), headers=headers
)
r.raise_for_status()
root_contents = sorted(item['name'] for item in r.json()['content'])
# check contents
if not notebook_dir or notebook_dir == "~":
# use any to avoid counting possible automatically created files in $HOME
assert 'sub' in root_contents
elif notebook_dir == "ABS":
assert 'absfile.txt' in root_contents
elif notebook_dir == "~/sub":
assert 'subfile.txt' in root_contents
else:
raise ValueError(f"No contents check for {notebook_dir}")
def test_help_output():
out = check_output(
[sys.executable, '-m', 'jupyterhub.singleuser', '--help-all']
@@ -288,3 +371,49 @@ async def test_nbclassic_control_panel(app, user, full_spawn):
else:
prefix = app.base_url
assert link["href"] == url_path_join(prefix, "hub/home")
@pytest.mark.skipif(
IS_JUPYVERSE, reason="jupyverse doesn't implement token authentication"
)
@pytest.mark.parametrize("accept_token_in_url", ["1", "0", ""])
async def test_token_url_cookie(app, user, full_spawn, accept_token_in_url):
if accept_token_in_url:
user.spawner.environment["JUPYTERHUB_ALLOW_TOKEN_IN_URL"] = accept_token_in_url
should_accept = accept_token_in_url != "0"
await user.spawn()
await app.proxy.add_user(user)
token = user.new_api_token(scopes=["access:servers!user"])
url = url_path_join(public_url(app, user), user.spawner.default_url or "/tree/")
# first request: auth with token in URL
s = AsyncSession()
r = await s.get(url + f"?token={token}", allow_redirects=False)
print(r.url, r.status_code)
if not should_accept:
assert r.status_code == 302
return
assert r.status_code == 200
assert s.cookies
# second request, use cookies set by first response,
# no token in URL
r = await s.get(url, allow_redirects=False)
assert r.status_code == 200
await user.stop()
async def test_api_403_no_cookie(app, user, full_spawn):
"""unused oauth cookies don't get set for failed requests to API handlers"""
await user.spawn()
await app.proxy.add_user(user)
url = url_path_join(public_url(app, user), "/api/contents/")
s = AsyncSession()
r = await s.get(url, allow_redirects=False)
# 403, not redirect
assert r.status_code == 403
# no state cookie set
assert not r.cookies
await user.stop()
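`test_token_url_cookie` above parametrizes `JUPYTERHUB_ALLOW_TOKEN_IN_URL` over `"1"`, `"0"`, and unset, expecting `should_accept = accept_token_in_url != "0"`. The gate it exercises can be sketched as a single predicate (an illustration of the tested behavior, not the single-user server's actual implementation):

```python
def allow_token_in_url(environ):
    """Only an explicit "0" in JUPYTERHUB_ALLOW_TOKEN_IN_URL disables
    token-in-URL authentication; unset or any other value leaves it
    enabled, matching the test's expectation.
    """
    return environ.get("JUPYTERHUB_ALLOW_TOKEN_IN_URL", "") != "0"
```

When the gate is closed, the test expects a 302 redirect rather than a successful token login; when open, the first `?token=` request also sets cookies so follow-up requests need no token.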


@@ -1,4 +1,5 @@
"""Tests for process spawning"""
# Copyright (c) Jupyter Development Team.
# Distributed under the terms of the Modified BSD License.
import asyncio


@@ -1,4 +1,5 @@
"""Tests for utilities"""
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor
@@ -122,3 +123,53 @@ def test_browser_protocol(x_scheme, x_forwarded_proto, forwarded, expected):
proto = utils.get_browser_protocol(request)
assert proto == expected
@pytest.mark.parametrize(
"accept_header, choices, expected",
[
(
"",
["application/json"],
None,
),
(
"text/html",
["application/json"],
None,
),
(
"nonsense",
["application/json"],
None,
),
(
"text/html, application/json",
["application/json"],
"application/json",
),
(
"text/html, application/json",
["application/json", "text/html"],
"text/html",
),
(
"text/html; q=0.8, application/json; q=0.9",
["application/json", "text/html"],
"application/json",
),
(
"text/html, application/json; q=0.9",
["application/json", "text/html"],
"text/html",
),
(
"text/html; q=notunderstood, application/json; q=0.9",
["application/json", "text/html"],
"text/html",
),
],
)
def test_get_accepted_mimetype(accept_header, choices, expected):
accepted = utils.get_accepted_mimetype(accept_header, choices=choices)
assert accepted == expected
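The parametrized cases above pin down the expected q-value handling: higher q wins, a missing or unparseable q counts as 1.0, header order breaks ties, and non-matching or empty headers yield `None`. A minimal standalone parser with that behavior might look like this (a sketch consistent with the test cases, not JupyterHub's `utils.get_accepted_mimetype` source):

```python
def get_accepted_mimetype(accept_header, choices):
    """Return the best matching mimetype from choices, or None."""
    candidates = []  # (-q, header order, mimetype); sorting picks highest q first
    for order, field in enumerate(accept_header.split(",")):
        parts = [p.strip() for p in field.split(";")]
        mimetype = parts[0]
        if mimetype not in choices:
            continue
        q = 1.0
        for param in parts[1:]:
            if param.startswith("q="):
                try:
                    q = float(param[2:])
                except ValueError:
                    q = 1.0  # unintelligible q defaults to 1.0
        candidates.append((-q, order, mimetype))
    if not candidates:
        return None
    candidates.sort()
    return candidates[0][2]
```

A real implementation would also handle wildcards like `*/*` and `text/*`; the cases above do not exercise those, so the sketch omits them.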


@@ -1,4 +1,5 @@
"""Test version checking"""
import logging
import pytest


@@ -42,6 +42,13 @@ async_requests = _AsyncRequests()
class AsyncSession(requests.Session):
"""requests.Session object that runs in the background thread"""
def __init__(self, **kwargs):
super().__init__(**kwargs)
# session requests are for cookie authentication
# and should look like regular page views,
# so set Sec-Fetch-Mode: navigate
self.headers.setdefault("Sec-Fetch-Mode", "navigate")
def request(self, *args, **kwargs):
return async_requests.executor.submit(super().request, *args, **kwargs)
@@ -157,6 +164,7 @@ async def api_request(
else:
base_url = public_url(app, path='hub')
headers = kwargs.setdefault('headers', {})
headers.setdefault("Sec-Fetch-Mode", "cors")
if 'Authorization' not in headers and not noauth and 'cookies' not in kwargs:
# make a copy to avoid modifying arg in-place
kwargs['headers'] = h = {}
@@ -176,7 +184,7 @@ async def api_request(
kwargs['cert'] = (app.internal_ssl_cert, app.internal_ssl_key)
kwargs["verify"] = app.internal_ssl_ca
resp = await f(url, **kwargs)
assert "frame-ancestors 'self'" in resp.headers['Content-Security-Policy']
assert "frame-ancestors 'none'" in resp.headers['Content-Security-Policy']
assert (
ujoin(app.hub.base_url, "security/csp-report")
in resp.headers['Content-Security-Policy']
@@ -197,6 +205,9 @@ def get_page(path, app, hub=True, **kw):
else:
prefix = app.base_url
base_url = ujoin(public_host(app), prefix)
# Sec-Fetch-Mode=navigate to look like a regular page view
headers = kw.setdefault("headers", {})
headers.setdefault("Sec-Fetch-Mode", "navigate")
return async_requests.get(ujoin(base_url, path), **kw)
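The `Sec-Fetch-Mode` defaults added above (`navigate` for `AsyncSession` page views, `cors` for `api_request` calls) let the tests emulate the fetch-metadata signal that the relaxed xsrf check keys on: safe-method page navigations may skip the check, while API-style requests still need a token. The rule being exercised can be sketched as (illustrative only, not the actual `check_xsrf_cookie` logic):

```python
def xsrf_check_required(method, headers):
    """GET/HEAD/OPTIONS requests that a browser marks as page
    navigations (Sec-Fetch-Mode: navigate) skip the xsrf check;
    everything else, including cors/API requests, still requires
    a valid xsrf token.
    """
    is_safe_method = method in {"GET", "HEAD", "OPTIONS"}
    if is_safe_method and headers.get("Sec-Fetch-Mode") == "navigate":
        return False
    return True
```

Requests without the header at all (e.g. non-browser clients) conservatively keep the check in this sketch, which is why the test helpers set the header explicitly rather than relying on defaults.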

Some files were not shown because too many files have changed in this diff.