Jay Qi
7c6bf72ced
Added conda clean. Addresses #627
2018-05-05 12:22:43 -05:00
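For context, a conda cleanup step in a Dockerfile usually looks like the sketch below; the package name is illustrative and the exact flags may differ from what this commit actually added:

    # Clean conda caches in the same RUN layer as the install so the
    # removed tarballs and index caches do not bloat the image.
    RUN conda install --quiet --yes numpy && \
        conda clean --all -f -y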
Peter Parente
e2798680b0
Merge pull request #600 from taqtiqa-mark/patch-1
...
Keyserver pool
2018-04-15 22:21:03 -04:00
Peter Parente
22d38bad00
Fix pyarrow install
...
Need to install as NB_UID, not root, to keep all /opt/conda
folders writeable by the default user
2018-04-07 23:16:33 -04:00
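A sketch of the ordering described above, assuming the usual base-notebook variables; the apt and conda packages shown are illustrative, not the exact lines from this commit:

    # Do system installs as root, then drop back to the notebook user
    # before touching conda, so everything under /opt/conda stays
    # writeable by the default (jovyan) user.
    USER root
    RUN apt-get update && \
        apt-get install -y --no-install-recommends some-system-package && \
        rm -rf /var/lib/apt/lists/*
    USER $NB_UID
    RUN conda install --quiet --yes pyarrow && \
        conda clean --all -f -y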
Mark Van de Vyver
0344fd393d
Keyserver used in Hashicorp's Nomad files
2018-04-04 21:40:08 +10:00
Mark Van de Vyver
7b8fff73fc
Keyserver pool
...
Use hkp://pool.sks-keyservers.net instead of keyserver.ubuntu.com; see
https://sks-keyservers.net/overview-of-pools.php
2018-03-30 13:34:11 +11:00
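A hedged sketch of what switching to the pool looks like in a Dockerfile; the key ID below is a placeholder, not a key these images actually fetch:

    # Fetch the signing key from the SKS pool rather than a single
    # keyserver, so one unreachable host does not break the build.
    RUN apt-key adv --keyserver hkp://pool.sks-keyservers.net \
        --recv-keys 0xDEADBEEFDEADBEEF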
Stijn De Haes
7e7041953e
Fixed the tar --owner root flag to be correct
2018-03-26 09:05:16 +02:00
Stijn De Haes
a7eb9d6398
Make extracted files root id
2018-03-20 07:58:25 +01:00
Giovanni Lanzani
a58e6afb9b
Add pyarrow installation
...
This way we can benefit from pandas UDFs
2018-03-14 13:50:52 +01:00
Darek
ef402bfe71
Fixing checksum error
2018-03-01 22:43:16 +00:00
Darek
29e1b43aa5
Changed Spark mirror server and upgraded py4j library to 0.10.6
2018-03-01 21:22:56 +00:00
Darek
f2e8096825
Upgrading Spark to 2.3
2018-03-01 20:54:06 +00:00
Graham Dumpleton
770007bb10
Use NB_UID for USER statement in Dockerfile so deployment platform can verify that image doesn't run as root.
2018-02-15 11:11:32 +11:00
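A sketch of the change described above, under the assumption (true for these images) that NB_UID holds a numeric ID:

    # A user *name* cannot be proven non-root from image metadata, so
    # platforms such as OpenShift reject it; a numeric UID can.
    # Previously: USER jovyan
    USER $NB_UID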
cglewis
d91d4a8c48
MAINTAINER is deprecated, using LABEL now
2017-10-31 20:17:06 -07:00
Min RK
0351f0c3b7
base image on ubuntu:16.04
2017-08-09 11:07:29 +02:00
uodna
c740fbb1ca
Upgrade Spark to 2.2.0
2017-07-18 23:50:18 +09:00
Pando85
93683ee25f
[Update] Spark updated to 2.1.1
2017-06-24 14:12:21 +02:00
Vishu Dontham
c49a3cfdd7
Changed the Mesos version to avoid a backward compatibility issue
2017-04-30 02:58:53 +05:30
Rudolf Kral
1503e39c6f
Bump Py4J to 0.10.4
...
Spark 2.1.0 requires Py4J 0.10.4, not 0.10.3.
2017-03-15 17:51:03 +01:00
Peter Parente
87ca3de02a
Fix #345: Bump to Spark 2.1.0
2017-03-09 08:51:40 -05:00
Peter Parente
fde30f67fe
Fix #340: Failing openjdk install
2017-02-07 22:56:31 -06:00
Peter Parente
52bce8f002
Use Apache Toree 0.2.0dev1 for Spark 2.0.2
2016-12-10 13:33:36 -05:00
Peter Parente
1339b518a9
Upgrade Spark to 2.0
2016-12-10 13:33:36 -05:00
Min RK
e89f0b278b
inherit pyspark from scipy-notebook
...
Removes the duplicate installation of the Python 2 env and Python packages.
There are very few packages in scipy-notebook that are not in pyspark-notebook, so removing the duplication seems valuable.
2016-06-29 09:45:12 +02:00
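The inheritance change, roughly; the FROM line below is a sketch and omits any tag pinning the commit may have used:

    # Build on scipy-notebook so its Python environment and packages are
    # inherited instead of being installed a second time here.
    FROM jupyter/scipy-notebook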
Sebastian Reuße
fdb247ec0c
Make use of available environment variables
...
base-notebook defines environment variables for the Conda install path and the
notebook user. However, in some instances, these locations were still hardcoded.
Let’s use the variables instead.
2016-06-28 11:31:24 +02:00
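A before/after sketch of the idea, with an illustrative chown that is not necessarily a line from the real Dockerfile:

    # Before: hardcoded install path and user name
    #   RUN chown -R jovyan /opt/conda
    # After: reuse the variables base-notebook already defines
    RUN chown -R $NB_USER $CONDA_DIR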
John Kirkham
ac53e57ed9
pyspark-notebook: Use jq from conda-forge.
2016-06-09 19:26:45 -04:00
Peter Parente
e384a4512b
Add pip2, pip3 symlinks for images with python2
...
Fixes #123
(c) Copyright IBM Corp. 2016
2016-05-29 23:09:48 -04:00
Peter Parente
401575ede9
Merge pull request #204 from lbustelo/UpgradeToree0.1.0.dev5
...
Bump Spark to 1.6.1, Toree to 0.1.0.dev7
2016-05-11 00:34:51 -04:00
Peter Parente
eb754a1a23
Bump Spark to 1.6.1
...
(c) Copyright IBM Corp. 2016
2016-05-10 09:06:09 -05:00
Peter Parente
e14e387d6c
Require ipywidgets 5.1
...
5.0 had API compatibility problems
see ipython/ipywidgets#522
(c) Copyright IBM Corp. 2016
2016-04-29 07:40:48 -04:00
Peter Parente
39104f2432
Upgrade to notebook 4.2, ipywidgets 5.0
2016-04-26 08:55:18 -04:00
Peter Parente
42765a9617
Add numexpr wherever pandas is installed
...
Fixes #173
(c) Copyright IBM Corp. 2016
2016-04-10 14:41:58 -04:00
Peter Parente
f5da6216ad
Base all-spark-notebook on pyspark-notebook
...
To speed up builds, reduce accidental disparity, and ease maintenance
(c) Copyright IBM Corp. 2016
2016-03-18 08:16:39 -04:00
John Kirkham
630ceec257
Dockerfile: Clean apt-get lists.
2016-03-07 21:38:40 -05:00
John Kirkham
9f9ddd67a2
all-spark-notebook/Dockerfile: Clean after installing jq.
2016-03-06 23:58:35 -05:00
John Kirkham
a5cc245a7a
docker: Make conda installs quieter.
2016-03-06 21:13:57 -05:00
John Kirkham
ca3f00a6c8
pyspark-notebook/Dockerfile: Note to get Mesos for jessie.
2016-03-05 15:12:24 -05:00
John Kirkham
4711a6ce75
pyspark-notebook/Dockerfile: Clean out conda.
2016-03-04 18:29:31 -05:00
John Kirkham
2a51eb9139
pyspark-notebook/Dockerfile: Tell conda yes.
2016-03-04 17:31:47 -05:00
Peter Parente
3fd3850bdc
Bump python libs to match other images
...
(c) Copyright IBM Corp. 2016
2016-02-13 20:49:11 -05:00
Peter Parente
d25f33a5fb
Fix py4j version and path
...
(c) Copyright IBM Corp. 2016
2016-02-04 21:29:36 -05:00
Peter Parente
154c4d64d2
Bump Spark dependencies
...
* Use Spark 1.6.0
* Use Apache Toree (new name/home of ibm-et/spark-kernel)
* Add missing pub key for binary
* Add SHA check for spark package download
(c) Copyright IBM Corp. 2016
2016-02-03 13:29:21 -05:00
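A sketch of the SHA-checked download mentioned above; the URL, filename, and checksum are placeholders rather than the exact values this commit used:

    # Fail the build if the downloaded archive does not match the
    # expected checksum (note the two spaces required by sha256sum -c).
    RUN wget -q https://archive.apache.org/dist/spark/spark-1.6.0/spark-1.6.0-bin-hadoop2.6.tgz && \
        echo "<expected-sha256>  spark-1.6.0-bin-hadoop2.6.tgz" | sha256sum -c - && \
        tar -xzf spark-1.6.0-bin-hadoop2.6.tgz -C /usr/local && \
        rm spark-1.6.0-bin-hadoop2.6.tgz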
Peter Parente
232d6fc465
Use jq instead of sed
...
(c) Copyright IBM Corp. 2016
2016-02-02 10:47:57 -05:00
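The jq-based edit presumably looks something like the sketch below; the working directory and the field being set are illustrative, not taken from the commit:

    # Edit the kernelspec as structured JSON with jq instead of
    # regex-replacing it with sed (run from the kernelspec directory).
    RUN jq '.env.SPARK_HOME = "/usr/local/spark"' kernel.json > /tmp/kernel.json && \
        mv /tmp/kernel.json kernel.json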
Peter Parente
155fdea583
Set PYSPARK_HOME in python2 kernelspec
...
* Install python2 spec to CONDA_DIR instead of system prefix too
(c) Copyright IBM Corp. 2016
2016-02-02 09:03:30 -05:00
Peter Parente
9aae44ef66
Use updated LICENSE.md file
...
Also fix copyright headers on various files
2016-01-17 11:27:10 -05:00
Peter Parente
6d5cd67528
Make jovyan the default user for the docker cmd
...
* Switch to jovyan at the end of every Dockerfile
* Document --user root requirement for NB_UID and GRANT_SUDO flags
(c) Copyright IBM Corp. 2015
2015-12-27 21:05:48 -05:00
Justin Tyberg
e53b288e6e
Bump to pandas 0.17.
...
(c) Copyright IBM Corp. 2015
2015-12-04 17:47:12 -05:00
Kai Wang
817550198f
update for Spark 1.5.1
2015-10-21 15:22:57 +00:00
Peter Parente
678b64e9e4
Swap tini for supervisord
...
* Pass $@ args to start-notebook.sh
* Set tini as entrypoint, but keep start-notebook.sh as easily overridable CMD
* su to jovyan user within start-notebook.sh script
Contribution (c) Copyright IBM Corp. 2015
2015-09-22 20:51:18 -04:00
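A sketch of the entrypoint/CMD split described above, close to what these images ended up using:

    # tini runs as PID 1 to forward signals and reap zombies; the
    # notebook launcher stays an easily overridable CMD.
    ENTRYPOINT ["tini", "--"]
    CMD ["start-notebook.sh"]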
Peter Parente
69c5c35f7f
Run pyspark, all-spark, scipy, r conda installs as jovyan
...
… but install kernel specs globally to avoid problems when switching NB_UID
Contribution (c) Copyright IBM Corp. 2015
2015-09-11 00:36:40 -04:00
Peter Parente
701212dbf1
Explicitly set USER root in every Dockerfile
...
Contribution (c) Copyright IBM Corp. 2015
2015-08-29 21:50:04 -04:00