Radoop Error: could not upload the necessary components to the directory of HDFS

kimusu2002 Member Posts: 5 Contributor I
edited November 2018 in Help
Hi,

I am having a problem with this Radoop error: could not upload the necessary components to the directory of HDFS. It says that Radoop can't upload into the directory '/tmp/radoop/27eca174758add21906d3b197af684e7/'.

So I changed the permissions of '/tmp/radoop/' and also of '/tmp/radoop/27eca174758add21906d3b197af684e7/' on the namenode in the VM, and then ran 'hadoop fs -ls /tmp/radoop'; the output showed that the permissions had been changed. So I went ahead and re-ran the process that contains the Radoop Nest, and the same error popped up again, and the permissions of the directory '/tmp/radoop' were automatically changed back to what they were before.
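
For reference, a rough sketch of the commands described above (the hash directory is the one from the error message, and 777 is just an example of a wide-open mode for testing):

    hadoop fs -chmod -R 777 /tmp/radoop/27eca174758add21906d3b197af684e7
    hadoop fs -ls /tmp/radoop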

Could someone give me some pointers please? :D

FYI, I was able to connect to the cluster and was also able to explore all the Hive tables.

Thanks heaps!

Answers

  • MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,386 RM Data Scientist
    Hi kimusu2002,

    In Radoop there are often various users involved. Sometimes Hive writes the table, not the user you specified. Are you sure that both your user and the Hive user have access to this directory? Can you try to give access rights to all users? (A sketch of what that could look like is at the end of this reply.)

    Since you are using Radoop, you most likely have a support contract. You can use our professional support at http://support.www.turtlecreekpls.com/.

    Cheers,

    Martin
    - Head of Data Science Services at RapidMiner -
    Dortmund, Germany
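
    A rough sketch of giving access rights to all users on the Radoop working directory, assuming a CDH-style setup where 'hdfs' is the HDFS superuser (adjust the user and the mode to your cluster):

        sudo -u hdfs hadoop fs -chmod -R a+rwx /tmp/radoop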
  • krishnas_m Member Posts: 2 Contributor I
    Hi Team,

    I get the same error, and I also get "[May 9, 2015 4:18:43 PM] SEVERE: Wrong FS: hdfs://localhost.localdomain:8020/user/radoop/.staging, expected: hdfs://192.168.93.133:8020
    [May 9, 2015 4:18:43 PM] SEVERE: MapReduce staging directory test failed. The Radoop client cannot write the staging directory. Please consult your Hadoop administrator."

    I am getting this error when testing the connection (see the sketch at the end of this post for one way to check the configured filesystem URI).

    I checked all the permissions, but it's not working.

    Infrastructure: I am using the CDH4 VM from Cloudera, and the user in the VM is 'cloudera'. I can see the folder /tmp/radoop being created with the group 'radoop', but I still have issues.

    It's a bit urgent; can anyone please help?

    Thanks in advance.

    Regards,
    Krishna.
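
    The "Wrong FS ... expected ..." message above seems to point at a mismatch between the NameNode address configured in the Radoop connection (192.168.93.133) and the filesystem URI the cluster itself reports (localhost.localdomain). A rough sketch of checking the configured value on the VM, assuming the usual CDH layout where the client configuration lives in /etc/hadoop/conf (the -A1 simply prints the <value> line that follows the matched <name> line):

        grep -A1 fs.defaultFS /etc/hadoop/conf/core-site.xml
        grep -A1 fs.default.name /etc/hadoop/conf/core-site.xml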
  • krishnas_m Member Posts: 2 Contributor I
    Hi All,

    I have got the above problem resolved; it was related to the connection. But I still get the same error as mentioned in the thread: "Error: could not upload the necessary components to the directory of HDFS". Can you please help me with this?

    This is a bit urgent.

    Regards,
    Krishna.
  • MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,386 RM Data Scientist
    Hi,

    I would recommend asking directly at support.www.turtlecreekpls.com. Since you are using Radoop, you should have a support contract.

    Best,
    Martin
    - Head of Data Science Services at RapidMiner -
    Dortmund, Germany
  • MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,386 RM Data Scientist
    Could you try to

    chmod 777 /tmp
    That might help (see the note at the end of this reply for the HDFS equivalent).
    - Head of Data Science Services at RapidMiner -
    Dortmund, Germany
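
    If the suggestion above refers to the /tmp directory in HDFS rather than on the local filesystem, a rough equivalent would be the HDFS shell command below, run as a user with HDFS superuser rights (on a CDH VM that is typically the 'hdfs' user; adjust to your setup):

        sudo -u hdfs hadoop fs -chmod 777 /tmp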