Installing and Configuring a Spark 3.0.0 Cluster

[hadoop@master ~]$ sudo tar -zxf spark-3.0.0-bin-without-hadoop.tgz -C /usr/local
[hadoop@master ~]$ cd /usr/local
[hadoop@master /usr/local]$ sudo mv ./spark-3.0.0-bin-without-hadoop/ spark
[hadoop@master /usr/local]$ sudo chown -R hadoop: ./spark
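
A quick optional sanity check before moving on: confirm the ownership change took effect.

[hadoop@master /usr/local]$ ls -ld ./spark    # owner and group should both read hadoop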

Add the environment variables on all four nodes:

export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
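
These lines typically go in the hadoop user's ~/.bashrc on each node (an assumption; any login profile works). A minimal sketch of applying and verifying them:

echo 'export SPARK_HOME=/usr/local/spark' >> ~/.bashrc
echo 'export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin' >> ~/.bashrc
source ~/.bashrc      # reload the profile in the current shell
echo $SPARK_HOME      # should print /usr/local/spark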

Configure Spark

In the Spark directory, copy the environment template with cp ./conf/spark-env.sh.template ./conf/spark-env.sh, then append the following to it:

export SPARK_MASTER_HOST=192.168.168.11    # master bind address (SPARK_MASTER_IP is the deprecated pre-2.0 spelling)
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop    # where Spark reads the Hadoop configuration
export SPARK_LOCAL_DIRS=/usr/local/hadoop    # scratch directory for shuffle and spill data
export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)    # required by the "without-hadoop" build
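
The "without-hadoop" build ships no Hadoop jars of its own, so SPARK_DIST_CLASSPATH must expand to Hadoop's jar list or Spark will fail to start. A quick check that the command substitution works:

/usr/local/hadoop/bin/hadoop classpath    # should print a long colon-separated list of jar paths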

Then configure the worker nodes: copy the template with cp ./conf/slaves.template ./conf/slaves and change its contents to:

master
slave1
slave2
slave3
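
start-slaves.sh reaches each listed host over SSH, so every name must resolve and the master needs passwordless SSH to all of them (usually already set up for Hadoop). A quick verification loop, assuming the four hostnames above:

for h in master slave1 slave2 slave3; do ssh $h hostname; done    # each host should answer without a password prompt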

Hard-code JAVA_HOME by appending the following to the end of sbin/spark-config.sh:

export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_191
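
Hard-coding it here avoids the common "JAVA_HOME is not set" failure when the sbin scripts start daemons over SSH in a non-login shell. The JDK path is this cluster's; verify it on each node and adjust to your own:

/usr/lib/jvm/jdk1.8.0_191/bin/java -version    # should report version 1.8.0_191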

Copy the Spark directory to the other nodes

sudo scp -r /usr/local/spark/ slave1:/usr/local/
sudo scp -r /usr/local/spark/ slave2:/usr/local/
sudo scp -r /usr/local/spark/ slave3:/usr/local/
# then, on each slave, hand the directory back to the hadoop user:
sudo chown -R hadoop /usr/local/spark/
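
A sanity check that the copy landed and is owned by the hadoop user on every worker:

for h in slave1 slave2 slave3; do ssh $h 'ls -ld /usr/local/spark'; done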

...

Start the cluster

First start the Hadoop cluster: /usr/local/hadoop/sbin/start-all.sh
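
Once Hadoop is up, jps shows which daemons are running; the exact set depends on the cluster layout, but a typical master would show NameNode, SecondaryNameNode and ResourceManager:

jps    # run on each node to confirm the HDFS/YARN daemons started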

Then start the Spark cluster:
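The standard command for a standalone deployment is Spark's own start-all.sh; use the full path so it is not confused with Hadoop's script of the same name:

/usr/local/spark/sbin/start-all.sh
jps    # the master gains a Master process, each worker a Worker process

After startup, the standalone master's web UI listens on port 8080 by default, i.e. http://192.168.168.11:8080 for this cluster.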

