Peter Parente
2b33e23bb7
Make RTD the doc source of truth
2018-05-20 16:15:06 -04:00
Jay Qi
7c6bf72ced
Added conda clean. Addresses #627
2018-05-05 12:22:43 -05:00
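A hedged sketch of the pattern this commit introduces (the package name is illustrative, not from the diff): running `conda clean` in the same `RUN` layer as the install, so the deleted cache files never persist in an earlier image layer.

```dockerfile
# Illustrative only: clean conda caches in the same layer that created
# them; a separate RUN would leave the caches baked into a prior layer.
RUN conda install --quiet --yes pandas && \
    conda clean --all -f -y
```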
Peter Parente
e2798680b0
Merge pull request #600 from taqtiqa-mark/patch-1
...
Keyserver pool
2018-04-15 22:21:03 -04:00
Peter Parente
22d38bad00
Fix pyarrow install
...
Need to install as NB_UID, not root, to keep all /opt/conda
folders writable by the default user
2018-04-07 23:16:33 -04:00
Mark Van de Vyver
0344fd393d
Keyserver used in Hashicorp's Nomad files
...
Keyserver used in Hashicorp's Nomad files
2018-04-04 21:40:08 +10:00
Min RK
8d9388cac5
Merge pull request #596 from clkao/start-notebook.d
...
Run additional scripts in /usr/local/bin/start-notebook.d/
2018-04-02 10:44:39 +02:00
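The merged feature above can be sketched as a small hook-runner: before launching the notebook server, execute every script found in a hook directory. This is a minimal sketch, not the actual start.sh implementation; `run_hooks` and `hookdir` are illustrative names.

```shell
#!/bin/sh
# Sketch (assumed from the commit title): run every executable script in
# a hook directory, e.g. /usr/local/bin/start-notebook.d/, before the
# notebook server starts.
run_hooks () {
  hookdir="$1"
  # Nothing to do if the hook directory doesn't exist.
  [ -d "$hookdir" ] || return 0
  for script in "$hookdir"/*; do
    # Only run entries that are executable.
    [ -x "$script" ] && "$script"
  done
  return 0
}

# Example: run_hooks /usr/local/bin/start-notebook.d
```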
Mark Van de Vyver
7b8fff73fc
Keyserver pool
...
Use hkp://pool.sks-keyservers.net instead of keyserver.ubuntu.com; see
https://sks-keyservers.net/overview-of-pools.php
2018-03-30 13:34:11 +11:00
Chia-liang Kao
632ce8c2d2
Run additional scripts in /usr/local/bin/start-notebook.d/
2018-03-29 16:26:20 +08:00
Stijn De Haes
7e7041953e
fixed tar --owner root to be correct
2018-03-26 09:05:16 +02:00
Stijn De Haes
a7eb9d6398
Make extracted files root id
2018-03-20 07:58:25 +01:00
Giovanni Lanzani
a58e6afb9b
Add pyarrow installation
...
This way we can profit from pandas UDF
2018-03-14 13:50:52 +01:00
Darek
ef402bfe71
Fixing checksum error
2018-03-01 22:43:16 +00:00
Darek
29e1b43aa5
Changed Spark mirror server and upgraded py4j library to 0.10.6
2018-03-01 21:22:56 +00:00
Darek
f2e8096825
Upgrading Spark to 2.3
2018-03-01 20:54:06 +00:00
Graham Dumpleton
770007bb10
Use NB_UID for USER statement in Dockerfile so deployment platform can verify that image doesn't run as root.
2018-02-15 11:11:32 +11:00
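The change above can be illustrated with a short Dockerfile fragment (illustrative, not the exact diff): declaring `USER` with a numeric UID rather than a user name lets deployment platforms such as OpenShift verify statically that the image does not run as root.

```dockerfile
# Illustrative: a numeric UID in the USER statement is machine-checkable;
# a user name like "jovyan" cannot be verified as non-root without
# inspecting /etc/passwd inside the image.
ARG NB_UID=1000
USER $NB_UID
```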
cglewis
d91d4a8c48
MAINTAINER is deprecated, using LABEL now
2017-10-31 20:17:06 -07:00
Grant Nestor
40a2791b73
Upgrade to notebook 5.2.0
2017-10-13 10:26:00 -07:00
Grant Nestor
01c9e6c66b
Upgrade to notebook 5.1
2017-09-26 11:03:38 -07:00
Min RK
b69f43e098
remove user-facing start-singleuser.sh docs
...
it’s handled internally
2017-08-26 09:54:58 -04:00
Peter Parente
a802e4b84d
[ci skip] Copy/paste is evil
2017-08-26 09:13:22 -04:00
Peter Parente
5131f0df81
[ci skip] Doc host mount permission requirements
2017-08-22 15:46:25 -04:00
Peter Parente
09c9a4fd9c
[ci skip] Add a jupyter lab example with ports
2017-08-22 14:28:23 -04:00
Min RK
32b3d2bec2
Remove Python 2
...
Only Python 3 from now on
2017-08-10 16:08:30 +02:00
Min RK
0351f0c3b7
base image on ubuntu:16.04
2017-08-09 11:07:29 +02:00
uodna
c740fbb1ca
Upgrade Spark to 2.2.0
2017-07-18 23:50:18 +09:00
Peter Parente
e1e3f24b5d
Docker Cloud build phase hooks
...
Must be located in the root of each docker build context
i.e., every stack folder
2017-07-09 15:59:19 -04:00
Pando85
e304b3a787
[Update] Update spark version in pyspark-notebook and all-spark-notebook README
2017-06-24 16:12:59 +02:00
Pando85
93683ee25f
[Update] Spark updated to 2.1.1
2017-06-24 14:12:21 +02:00
Stanislav Khotinok
ac9620e4a9
Added description of NB_GID to README.md files
2017-05-26 13:26:56 +02:00
Vishu Dontham
c49a3cfdd7
Changed the Mesos version to avoid a backward-compatibility issue
2017-04-30 02:58:53 +05:30
Peter Parente
02672dc083
Fix notebook version documented in READMEs
2017-04-22 13:23:27 -04:00
Natalino Busa
3bf772d32b
bumped spark version in the readme to 2.1.0
2017-04-02 00:48:55 +02:00
Rudolf Kral
1503e39c6f
Bump Py4J to 0.10.4
...
Spark 2.1.0 requires Py4J 0.10.4, not 0.10.3.
2017-03-15 17:51:03 +01:00
Peter Parente
87ca3de02a
Fix #345: Bump to Spark 2.1.0
2017-03-09 08:51:40 -05:00
Peter Parente
fde30f67fe
Fix #340: Failing openjdk install
2017-02-07 22:56:31 -06:00
Peter Parente
19fe00c08c
debian, miniconda, notebook version, option updates
...
* Upgrade to latest debian base image
* Upgrade to Notebook 4.3
* Upgrade to Miniconda 4.2.12
* Remove USE_HTTPS env var in favor of command line options for key and cert
* Add GEN_CERT env var for generating a self-signed certificate
* Remove PASSWORD env var in favor of the new Notebook 4.3 default token auth
or the more secure hashed password command line option
2016-12-31 23:27:54 -05:00
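The hashed-password option mentioned above can be sketched as follows (the image tag is an assumption; `--NotebookApp.password` is the Notebook 4.x traitlet the commit refers to): instead of a plaintext `PASSWORD` env var, pass a pre-hashed password on the command line.

```shell
# Illustrative invocation, not from the diff. The hash would be generated
# beforehand, e.g. with notebook.auth.passwd() in Python; the placeholder
# below must be replaced with a real sha1:<salt>:<hash> value.
docker run -p 8888:8888 jupyter/base-notebook \
  start-notebook.sh --NotebookApp.password='sha1:<salt>:<hash>'
```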
Peter Parente
f4d6934962
Add note about spark web ui port
2016-12-10 13:43:50 -05:00
Peter Parente
6b56894aa3
Doc updates for Spark 2
2016-12-10 13:33:36 -05:00
Peter Parente
52bce8f002
Use Apache Toree 0.2.0dev1 for Spark 2.0.2
2016-12-10 13:33:36 -05:00
Peter Parente
1339b518a9
Upgrade Spark to 2.0
2016-12-10 13:33:36 -05:00
Matt McCormick
3f86381329
Add badges with image size and layers
...
This uses the MicroBadger badge service. It allows prospective image users to
quickly get an idea of the size of an image from glancing at the README.
2016-11-13 23:31:58 -05:00
Adam Chainz
c28742a097
Convert readthedocs links for their .org -> .io migration for hosted projects
...
As per [their blog post of the 27th April](https://blog.readthedocs.com/securing-subdomains/) ‘Securing subdomains’:
> Starting today, Read the Docs will start hosting projects from subdomains on the domain readthedocs.io, instead of on readthedocs.org. This change addresses some security concerns around site cookies while hosting user generated data on the same domain as our dashboard.
Test Plan: Manually visited all the links I’ve modified.
2016-10-09 22:20:48 +01:00
Peter Parente
30d37c0186
Remove busted jhub internal link
2016-08-09 16:35:04 -04:00
Peter Parente
ed068a7d62
Rebase and update all READMEs
2016-08-09 16:32:10 -04:00
david cochran
37b50dc325
fix broken hyperlinks in READMEs
2016-08-08 21:59:31 -07:00
Peter Parente
47703bba50
[skip ci] Fix a doc copy-pasta mess
...
(c) Copyright IBM Corp. 2016
2016-07-25 12:57:06 -04:00
Peter Parente
3f3fa27d3e
Document how to set a hashed password
...
Already supported via the command line.
Update all the READMEs.
(c) Copyright IBM Corp. 2016
2016-07-17 22:30:24 -04:00
Min RK
e89f0b278b
inherit pyspark from scipy-notebook
...
removes duplicate installation of python 2 env and Python packages.
There are very few packages in scipy not in pyspark, so removing the duplicate seems valuable.
2016-06-29 09:45:12 +02:00
Sebastian Reuße
fdb247ec0c
Make use of available environment variables
...
base-notebook defines environment variables for the Conda install path and the
notebook user. However, in some instances, these locations were still hardcoded.
Let’s use the variables instead.
2016-06-28 11:31:24 +02:00
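The cleanup described above amounts to replacing hardcoded paths with the variables base-notebook already exports. A minimal sketch, assuming the `CONDA_DIR` and `NB_USER` variables the commit body names (the installed package is illustrative):

```dockerfile
# Illustrative: reference the env vars base-notebook defines instead of
# hardcoding /opt/conda or the "jovyan" user name.
USER $NB_USER
RUN $CONDA_DIR/bin/conda install --quiet --yes numpy
```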
John Kirkham
ac53e57ed9
pyspark-notebook: Use jq from conda-forge.
2016-06-09 19:26:45 -04:00