
Spark fails to start slave nodes; the log files show:

Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
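This warning is usually harmless: Spark simply falls back to the built-in Java implementations when it cannot find Hadoop's native libraries. If you want to silence it, you can point Spark at the native libraries. A minimal sketch, assuming Hadoop is installed under `/opt/hadoop` (adjust the path to your installation):

```shell
# Add to conf/spark-env.sh on each node.
# HADOOP_HOME below is an assumed install location -- change it to yours.
export HADOOP_HOME=/opt/hadoop
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH
```

If the native libraries were built for a different platform (e.g. 32-bit libs on a 64-bit JVM), the warning will persist and can simply be ignored.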

https://wwwblogs/tijun/p/7562282.html

Consider explicitly setting the appropriate port for the service 'sparkWorker' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
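This message appears when the worker's default ports are already taken and Spark gives up after its default number of retries. As the message suggests, you can either pin the service to a known-free port or let Spark retry more ports. A hedged sketch of both options (the port numbers below are illustrative, not required values):

```shell
# Add to conf/spark-defaults.conf (values are examples only):
# Option 1: allow more retries when a port is occupied (default is 16)
spark.port.maxRetries   32
# Option 2: pin the SparkUI to a port you know is free
spark.ui.port           4050
```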

https://my.oschina/u/2329800/blog/1826179

Add `export SPARK_LOCAL_IP=127.0.0.1`

`SPARK_LOCAL_IP` can be set to the slave's own IP address or its hostname.
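Concretely, the variable goes into `conf/spark-env.sh` on each worker node. A sketch, where the IP and hostname below are placeholders for your own cluster:

```shell
# In conf/spark-env.sh on each worker node.
# Use loopback only for single-node testing:
export SPARK_LOCAL_IP=127.0.0.1
# On a real cluster, use the worker's own address or hostname instead, e.g.:
# export SPARK_LOCAL_IP=192.168.1.11   # example IP -- substitute your worker's
# export SPARK_LOCAL_IP=slave1         # a resolvable hostname also works
```

Note that `127.0.0.1` is only appropriate when master and worker run on the same machine; on a multi-node cluster each worker should bind to an address the master can reach.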
