cglewis
d91d4a8c48
MAINTAINER is deprecated; using LABEL now
2017-10-31 20:17:06 -07:00
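For reference, the substitution this commit describes typically looks like the sketch below; the label key shown is the common convention, and the value is illustrative rather than taken from this commit:

    # Deprecated instruction:
    #   MAINTAINER Jane Doe <jane@example.com>
    # LABEL replacement (key/value illustrative):
    LABEL maintainer="Jane Doe <jane@example.com>"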
Min RK
0351f0c3b7
Base image on ubuntu:16.04
2017-08-09 11:07:29 +02:00
uodna
c740fbb1ca
Upgrade Spark to 2.2.0
2017-07-18 23:50:18 +09:00
Pando85
93683ee25f
[Update] Spark to 2.1.1
2017-06-24 14:12:21 +02:00
Vishu Dontham
c49a3cfdd7
Changed the Mesos version to avoid a backward-compatibility issue
2017-04-30 02:58:53 +05:30
Rudolf Kral
1503e39c6f
Bump Py4J to 0.10.4
...
Spark 2.1.0 requires Py4J 0.10.4, not 0.10.3.
2017-03-15 17:51:03 +01:00
Peter Parente
87ca3de02a
Fix #345: Bump to Spark 2.1.0
2017-03-09 08:51:40 -05:00
Peter Parente
fde30f67fe
Fix #340: Failing openjdk install
2017-02-07 22:56:31 -06:00
Peter Parente
52bce8f002
Use Apache Toree 0.2.0dev1 for Spark 2.0.2
2016-12-10 13:33:36 -05:00
Peter Parente
1339b518a9
Upgrade Spark to 2.0
2016-12-10 13:33:36 -05:00
Min RK
e89f0b278b
Inherit pyspark from scipy-notebook
...
removes duplicate installation of the Python 2 env and Python packages.
There are very few packages in scipy-notebook that are not in pyspark-notebook, so removing the duplication seems worthwhile.
2016-06-29 09:45:12 +02:00
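Basing one image on another keeps a single copy of the shared stack; a minimal sketch of the inheritance this commit describes (image tag illustrative):

    # Inherit the Python 2 env and scientific packages from the parent
    # instead of reinstalling them; only Spark-specific steps follow.
    FROM jupyter/scipy-notebook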
Sebastian Reuße
fdb247ec0c
Make use of available environment variables
...
base-notebook defines environment variables for the Conda install path and the
notebook user. However, in some instances, these locations were still hardcoded.
Let’s use the variables instead.
2016-06-28 11:31:24 +02:00
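A minimal sketch of the pattern, using the $CONDA_DIR and $NB_USER variables that base-notebook defines (the command itself is illustrative):

    # Before: hardcoded path and user, e.g. chown -R jovyan /opt/conda
    # After: reuse the variables the base image already exports
    RUN chown -R $NB_USER $CONDA_DIR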
John Kirkham
ac53e57ed9
pyspark-notebook: Use jq from conda-forge.
2016-06-09 19:26:45 -04:00
Peter Parente
e384a4512b
Add pip2, pip3 symlinks for images with python2
...
Fixes #123
(c) Copyright IBM Corp. 2016
2016-05-29 23:09:48 -04:00
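Such symlinks usually just alias the interpreter-specific pip binaries. A minimal sketch, with hypothetical env paths that depend on how the python2 env is laid out:

    # Hypothetical layout: default env is Python 3, python2 is a named env.
    RUN ln -s $CONDA_DIR/envs/python2/bin/pip $CONDA_DIR/bin/pip2 && \
        ln -s $CONDA_DIR/bin/pip $CONDA_DIR/bin/pip3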
Peter Parente
401575ede9
Merge pull request #204 from lbustelo/UpgradeToree0.1.0.dev5
...
Bump Spark to 1.6.1, Toree to 0.1.0.dev7
2016-05-11 00:34:51 -04:00
Peter Parente
eb754a1a23
Bump Spark to 1.6.1
...
(c) Copyright IBM Corp. 2016
2016-05-10 09:06:09 -05:00
Peter Parente
e14e387d6c
Require ipywidgets 5.1
...
5.0 had API compatibility problems
see ipython/ipywidgets#522
(c) Copyright IBM Corp. 2016
2016-04-29 07:40:48 -04:00
Peter Parente
39104f2432
Upgrade to notebook 4.2, ipywidgets 5.0
2016-04-26 08:55:18 -04:00
Peter Parente
42765a9617
Add numexpr wherever pandas is
...
Fixes #173
(c) Copyright IBM Corp. 2016
2016-04-10 14:41:58 -04:00
Peter Parente
f5da6216ad
Base all-spark-notebook on pyspark-notebook
...
To speed up builds, reduce accidental disparity, and ease maintenance
(c) Copyright IBM Corp. 2016
2016-03-18 08:16:39 -04:00
John Kirkham
630ceec257
Dockerfile: Clean apt-get lists.
2016-03-07 21:38:40 -05:00
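Cleaning the apt lists in the same layer as the install is the standard way to keep them out of the final image; a minimal sketch with a placeholder package name:

    RUN apt-get update && \
        apt-get install -y --no-install-recommends some-package && \
        apt-get clean && \
        rm -rf /var/lib/apt/lists/*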
John Kirkham
9f9ddd67a2
all-spark-notebook/Dockerfile: Clean after installing jq.
2016-03-06 23:58:35 -05:00
John Kirkham
a5cc245a7a
docker: Make conda installs quieter.
2016-03-06 21:13:57 -05:00
John Kirkham
ca3f00a6c8
pyspark-notebook/Dockerfile: Note to get Mesos for jessie.
2016-03-05 15:12:24 -05:00
John Kirkham
4711a6ce75
pyspark-notebook/Dockerfile: Clean out conda.
2016-03-04 18:29:31 -05:00
John Kirkham
2a51eb9139
pyspark-notebook/Dockerfile: Tell conda yes.
2016-03-04 17:31:47 -05:00
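The conda-related tweaks in the commits above (non-interactive installs, quieter output, cleaning caches) reduce to flags on the conda commands; a sketch with a placeholder package name:

    # --yes skips the confirmation prompt; --quiet trims progress output.
    RUN conda install --quiet --yes some-package && \
        conda clean --yes --all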
Peter Parente
3fd3850bdc
Bump python libs to match other images
...
(c) Copyright IBM Corp. 2016
2016-02-13 20:49:11 -05:00
Peter Parente
d25f33a5fb
Fix py4j version and path
...
(c) Copyright IBM Corp. 2016
2016-02-04 21:29:36 -05:00
Peter Parente
154c4d64d2
Bump Spark dependencies
...
* Use Spark 1.6.0
* Use Apache Toree (new name/home of ibm-et/spark-kernel)
* Add missing pub key for binary
* Add SHA check for spark package download
(c) Copyright IBM Corp. 2016
2016-02-03 13:29:21 -05:00
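A checksum-verified download typically follows the pattern sketched below; the URL and digest are placeholders, not values from this commit:

    # Placeholder URL and digest; sha256sum -c expects "<digest>  <file>".
    RUN wget -q https://archive.apache.org/dist/spark/spark-1.6.0/spark-1.6.0-bin-hadoop2.6.tgz && \
        echo "<expected-digest>  spark-1.6.0-bin-hadoop2.6.tgz" | sha256sum -c - && \
        tar xzf spark-1.6.0-bin-hadoop2.6.tgz -C /usr/local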
Peter Parente
232d6fc465
Use jq instead of sed
...
(c) Copyright IBM Corp. 2016
2016-02-02 10:47:57 -05:00
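jq edits the kernelspec JSON structurally instead of patching it with regexes, which is the point of this change; a minimal sketch with an illustrative file path and field:

    # Rewrite a kernel.json field without a fragile sed expression.
    RUN jq '.display_name = "Python 2 (Spark)"' kernel.json > /tmp/kernel.json && \
        mv /tmp/kernel.json kernel.json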
Peter Parente
155fdea583
Set PYSPARK_HOME in python2 kernelspec
...
* Also install the python2 spec to CONDA_DIR instead of the system prefix
(c) Copyright IBM Corp. 2016
2016-02-02 09:03:30 -05:00
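Kernelspecs can carry per-kernel environment variables in their "env" map. A minimal sketch of setting one with jq; the kernelspec path and the value assigned to PYSPARK_HOME are assumptions, not taken from this commit:

    # Assumed path and value; only the "env" mechanism itself is standard.
    RUN KERNEL=$CONDA_DIR/share/jupyter/kernels/python2/kernel.json && \
        jq --arg home "$CONDA_DIR/envs/python2" '.env.PYSPARK_HOME = $home' \
           "$KERNEL" > /tmp/kernel.json && \
        mv /tmp/kernel.json "$KERNEL"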
Peter Parente
9aae44ef66
Use updated LICENSE.md file
...
Also fix copyright headers on various files
2016-01-17 11:27:10 -05:00
Peter Parente
6d5cd67528
Make jovyan the default user for the Docker CMD
...
* Switch to jovyan at the end of every Dockerfile
* Document --user root requirement for NB_UID and GRANT_SUDO flags
(c) Copyright IBM Corp. 2015
2015-12-27 21:05:48 -05:00
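The bullets above reduce to ending each Dockerfile with a user switch; a minimal sketch:

    # Installs run as root earlier in the file; drop privileges for runtime.
    USER jovyan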
Justin Tyberg
e53b288e6e
Bump to pandas 0.17.
...
(c) Copyright IBM Corp. 2015
2015-12-04 17:47:12 -05:00
Kai Wang
817550198f
Update for Spark 1.5.1
2015-10-21 15:22:57 +00:00
Peter Parente
678b64e9e4
Swap tini for supervisord
...
* Pass $@ args to start-notebook.sh
* Set tini as entrypoint, but keep start-notebook.sh as easily overridable CMD
* su to jovyan user within start-notebook.sh script
Contribution (c) Copyright IBM Corp. 2015
2015-09-22 20:51:18 -04:00
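The entrypoint/CMD split described above keeps tini as PID 1 while leaving the command easy to override; a minimal sketch:

    # tini forwards signals and reaps zombies; CMD remains overridable.
    ENTRYPOINT ["tini", "--"]
    CMD ["start-notebook.sh"]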
Peter Parente
69c5c35f7f
Run pyspark, all-spark, scipy, r conda installs as jovyan
...
… but install kernel specs globally to avoid problems when switching NB_UID
Contribution (c) Copyright IBM Corp. 2015
2015-09-11 00:36:40 -04:00
Peter Parente
701212dbf1
Explicitly set USER root in every Dockerfile
...
Contribution (c) Copyright IBM Corp. 2015
2015-08-29 21:50:04 -04:00
Peter Parente
c4616560cf
Make subimages compatible with late user creation
...
* Always remain as root during install
* Put kernel specs in system path, not user home
* Create user work directory at startup
* Note this is in 4.0 and up images, not 3.2
Contribution (c) Copyright IBM Corp. 2015
2015-08-28 23:43:57 -04:00
Peter Parente
16db7cfe55
Bump master to Jupyter Notebook 4.0
...
* Merge 4.0.x branch
* Rejigger Dockerfiles to build from minimal-notebook:latest
2015-08-25 20:45:04 -04:00
Peter Parente
e1c0c75c9c
Upgrade to Jupyter Notebook
...
* Change minimal-notebook to install notebook=4.0*
* Change other Dockerfiles to point to the 4.0 Docker Hub tag (to be built)
* Change config and pem file paths for jupyter
* Install ipywidgets in all containers that have a Python stack
* Update all READMEs to describe v3.2 and v4.0 since Docker Hub only shows one README for all tags
Contribution (c) Copyright IBM Corp. 2015
2015-08-23 08:55:09 -04:00
Peter Parente
871c0edc16
Spark 1.4.1 and Mesos 0.22, for Python 2 and 3
...
Plus README explaining how to use pyspark in local mode and with a
Mesos cluster.
(c) Copyright IBM Corp. 2015
2015-08-12 12:10:09 -04:00