Spark not working on Windows 10
I'm trying to get Spark working on Windows 10. When I try to run spark-shell, I get this error:
'Spark\spark-2.0.0-bin-hadoop2.7\bin..\jars""\ is not recognized as an internal or external command, operable program or batch file.
Failed to find Spark jars directory. You need to build Spark before running this program.
I am using a pre-built Spark for Hadoop 2.7 or later. I have installed Java 8, Eclipse Neon, Python 2.7, and Scala 2.11, and I have winutils for Hadoop 2.7.1, but I still get this error.
When I downloaded Spark it came as a .tgz; when I extracted it there was another .tgz inside, so I extracted that too, and then I got all the bin folders and so on. I need to access spark-shell. Can anyone help?
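In case it helps, this is roughly what I run from a Command Prompt before starting the shell. The C:\Spark and C:\hadoop locations below are just examples of my layout, not required paths; substitute wherever you extracted Spark and placed winutils.exe:

    rem Example paths only; substitute your own install locations.
    rem winutils.exe is expected in %HADOOP_HOME%\bin.
    set SPARK_HOME=C:\Spark\spark-2.0.0-bin-hadoop2.7
    set HADOOP_HOME=C:\hadoop
    set PATH=%SPARK_HOME%\bin;%HADOOP_HOME%\bin;%PATH%
    spark-shell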
Solution I ended up using:
1) VirtualBox
2) Linux Mint
I got the same error while building Spark. It seems to appear when the install path contains spaces, so you can move the extracted folder to C:\ (a path with no spaces).
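For example, assuming the folder was extracted somewhere under your user profile (a hypothetical path, adjust to your own), from a Command Prompt:

    rem Hypothetical source path; use wherever you actually extracted the tgz.
    move "C:\Users\you\Downloads\spark-2.0.0-bin-hadoop2.7" C:\spark-2.0.0-bin-hadoop2.7
    rem Then start the shell from the new, space-free location.
    C:\spark-2.0.0-bin-hadoop2.7\bin\spark-shell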
See: http://techgobi.blogspot.in/2016/08/configure-spark-on-windows-some-error.html