docker-stacks/all-spark-notebook/test/data/local_sparkR.ipynb
Romain c83024c950 Add spark notebook tests and change examples
* Tests added for all kernels
* Same examples as provided in the documentation (`specifics.md`)
* Used the same use case for all examples: sum of the first 100 whole numbers

Note: I haven't automatically tested `local_sparklyr.ipynb` because by default it creates the `metastore_db` directory and the `derby.log` file in the working directory. Since I mount that directory read-only (`RO`), the notebook fails to run. I'm struggling to set the location elsewhere... (a possible workaround is sketched below).
2020-05-29 06:54:46 +02:00
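
For reference, one possible workaround, untested here and based on the assumption that Derby honours `derby.system.home` for both `metastore_db` and `derby.log`, is to point Derby's system home and the Spark SQL warehouse at a writable path such as `/tmp` so nothing is written into the read-only mount:

```r
library(sparklyr)

# Sketch of a possible workaround (not the tested setup): keep Derby's
# working files (metastore_db, derby.log) and the Spark SQL warehouse
# out of the read-only working directory by pointing them at /tmp.
conf <- spark_config()
conf[["sparklyr.shell.driver-java-options"]] <- "-Dderby.system.home=/tmp/derby"
conf[["spark.sql.warehouse.dir"]] <- "/tmp/spark-warehouse"

sc <- spark_connect(master = "local", config = conf)
```

Whether `sparklyr.shell.driver-java-options` is the right hook for the driver JVM here is an assumption; `spark.driver.extraJavaOptions` may work as well.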

{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "library(SparkR)\n",
    "\n",
    "# Spark session & context\n",
    "sc <- sparkR.session(\"local\")\n",
    "\n",
    "# Sum of the first 100 whole numbers\n",
    "sdf <- createDataFrame(list(1:100))\n",
    "dapplyCollect(sdf,\n",
    " function(x) \n",
    " { x <- sum(x)}\n",
    " )\n",
    "# 5050"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "R",
   "language": "R",
   "name": "ir"
  },
  "language_info": {
   "codemirror_mode": "r",
   "file_extension": ".r",
   "mimetype": "text/x-r-source",
   "name": "R",
   "pygments_lexer": "r",
   "version": "3.6.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}