Setting up a Spark development environment with Docker on Windows

Docker Toolbox


https://www.docker.com/products/docker-toolbox
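On Windows, Docker Toolbox runs the Docker daemon inside a VirtualBox VM rather than natively. A minimal sketch of creating that VM-backed host and pointing the current shell at it; the machine name `default` is the Toolbox convention, not something from this article:

```shell
# Create a VirtualBox-backed Docker host (docker-machine ships with Toolbox)
docker-machine create --driver virtualbox default

# Export DOCKER_HOST and related variables into the current shell
eval "$(docker-machine env default)"

# Confirm the client can reach the daemon inside the VM
docker version
```

Once `docker version` shows both client and server, the containers below can be started from this shell.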

Spark

https://hub.docker.com/r/singularities/spark/~/dockerfile/
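A minimal sketch of bringing up a standalone cluster from this image; the container names, hostname, and `--link` wiring are illustrative, and the `start-spark` entrypoint commands mirror the ones used in the transcript below:

```shell
# Start a Spark master container (hostname chosen for the worker to find it)
docker run -d --name spark-master -h spark-master \
  singularities/spark start-spark master

# Start a worker and point it at the master by hostname
docker run -d --name spark-worker --link spark-master \
  singularities/spark start-spark worker spark-master

# Open an interactive shell inside the master container
docker exec -it spark-master /bin/bash
```

The commands in the rest of the article are run from a root shell inside such a container, which is why they are shown with a `#` prompt.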

Inside the container, start the HDFS namenode and create a home directory for root:

# start-hadoop-namenode

# hadoop fs -mkdir /user

# hadoop fs -mkdir /user/root/

# hadoop fs -put ./README.md /user/root

Then start Spark and open a shell, either local or against the master (here `a60b8c8f9653` is the master container's ID):

# start-spark

# start-spark worker [master]

# spark-shell

# spark-shell --master spark://a60b8c8f9653:7077

In spark-shell, the README can be read either from the local filesystem or from the copy uploaded to HDFS:

scala> val lines = sc.textFile("file:///usr/local/spark-2.1.0/README.md")

scala> val lines = sc.textFile("hdfs:///user/root/README.md")

lines: org.apache.spark.rdd.RDD[String] = file:///usr/local/spark-2.1.0/README.md MapPartitionsRDD[1] at textFile at <console>:24

scala> lines.count()

res0: Long = 104

scala> lines.saveAsTextFile("hdfs:///user/root/README2.md")  // save to HDFS
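Continuing the same spark-shell session, the classic word count over the `lines` RDD from above can be sketched as follows; the split pattern and the number of results printed are illustrative choices, not from the original article:

```scala
// Split each line on whitespace, drop empty tokens, and count occurrences
val counts = lines.flatMap(_.split("\\s+"))
  .filter(_.nonEmpty)
  .map(word => (word, 1))
  .reduceByKey(_ + _)

// Print a few (word, count) pairs; order is not deterministic
counts.take(5).foreach(println)
```

Like `count()` above, `take(5)` is an action: the `flatMap`/`map`/`reduceByKey` transformations are lazy and only execute when an action is called.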

