
What could be the reason for the following errors? I ran the script as the root user, which I believe has superuser permissions, yet it failed with the following error:

mkdir: cannot create directory ‘/var/log/hadoop’: Permission denied

(base) [root@localhost ~]# cat /tmp/hadoop-service-startup.log
STARTING NAMENODE
WARNING: HADOOP_NAMENODE_OPTS has been replaced by HDFS_NAMENODE_OPTS. Using value of HADOOP_NAMENODE_OPTS.
WARNING: /var/log/hadoop does not exist. Creating.
mkdir: cannot create directory ‘/var/log/hadoop’: Permission denied
ERROR: Unable to create /var/log/hadoop. Aborting.
COMPLETE
STARTING SECONDARY NAMENODE
WARNING: HADOOP_SECONDARYNAMENODE_OPTS has been replaced by HDFS_SECONDARYNAMENODE_OPTS. Using value of HADOOP_SECONDARYNAMENODE_OPTS.
WARNING: /var/log/hadoop does not exist. Creating.
mkdir: cannot create directory ‘/var/log/hadoop’: Permission denied
ERROR: Unable to create /var/log/hadoop. Aborting.
COMPLETE
STARTING DATANODE
WARNING: HADOOP_DATANODE_OPTS has been replaced by HDFS_DATANODE_OPTS. Using value of HADOOP_DATANODE_OPTS.
WARNING: /var/log/hadoop does not exist. Creating.
mkdir: cannot create directory ‘/var/log/hadoop’: Permission denied
ERROR: Unable to create /var/log/hadoop. Aborting.
COMPLETE
STARTED DAEMONS
4884 Jps
STARTING RESOURCEMANGER
WARNING: /var/log/hadoop does not exist. Creating.
mkdir: cannot create directory ‘/var/log/hadoop’: Permission denied
ERROR: Unable to create /var/log/hadoop. Aborting.
COMPLETE
STARTING NODEMANGER
WARNING: /var/log/hadoop does not exist. Creating.
mkdir: cannot create directory ‘/var/log/hadoop’: Permission denied
ERROR: Unable to create /var/log/hadoop. Aborting.
COMPLETE
STARTING HISTORYSERVER
WARNING: /var/log/hadoop does not exist. Creating.
mkdir: cannot create directory ‘/var/log/hadoop’: Permission denied
ERROR: Unable to create /var/log/hadoop. Aborting.
COMPLETE
STARTED DAEMONS
5012 Jps
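
For diagnosis, a line such as the following near the top of the script would show which user is actually executing the failing mkdir (just a sketch; whoami and id are standard utilities, and the log path is the one the script already writes to):

    echo "effective user: $(whoami) ($(id))" >> /tmp/hadoop-service-startup.log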
Nag
  • Can you create the directory manually? – binarysta Sep 05 '20 at 11:08
  • Yes, I am able to create the directory manually; it only happens when running the script. The script doesn't do anything apart from bootstrapping some services. – Nag Sep 05 '20 at 11:14
  • It's probably not running as the root user, or it is properly changing to some hadoop user. In any case, creating the directory with the correct ownership and permissions should probably resolve the issue. It seems reasonable to assume that this may be something that you're supposed to do before running whatever this script is. – Kusalananda Sep 05 '20 at 20:09
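
A minimal version of that pre-creation step, assuming the daemons switch to a dedicated user named hadoop (substitute whichever user the scripts actually run as), might look like:

    mkdir -p /var/log/hadoop
    chown hadoop:hadoop /var/log/hadoop
    chmod 755 /var/log/hadoop

Alternatively, HADOOP_LOG_DIR in hadoop-env.sh can usually be pointed at a directory that user already owns.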

0 Answers