Error in using Hadoop MapReduce in Eclipse

When I executed a MapReduce program in Eclipse using Hadoop, I got the error below.
It must be some problem with a path, but I'm not able to figure it out.
Any ideas?

16:35:39 INFO mapred.JobClient: Task Id : attempt_201001151609_0001_m_000006_0, Status : FAILED
java.io.FileNotFoundException: File C:/tmp/hadoop-Shwe/mapred/local/taskTracker/jobcache/job_201001151609_0001/attempt_201001151609_0001_m_000006_0/work/tmp does not exist.
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
    at org.apache.hadoop.mapred.TaskRunner.setupWorkDir(TaskRunner.java:519)
    at org.apache.hadoop.mapred.Child.main(Child.java:155)
Answers

Considering the error message ([...]6_0/work/tmp does not exist), the first issues to check are:

Extract:

If you installed it correctly, you should be able to open the MapReduce perspective and the Map/Reduce view. They are under Window > Open Perspective and Window > Show View, respectively.

  • Click the blue elephant in the upper right corner of the Map/Reduce view. It'll bring up a configuration window.
  • Type in any name for the Location Name. I just called it "localhost cluster".
  • Set the port numbers for Map/Reduce Master and DFS Master. Look in your conf/hadoop-site.xml for "mapred.job.tracker" and "fs.default.name" respectively. If they are not in that file, they are probably in hadoop-default.xml.
  • Click the "Advanced Parameters" tab and set the "mapred.job.tracker" parameter. For some reason it doesn't automatically change when you change it in the "General" tab.
  • User name should be whoever owns the Hadoop installation. For me it is just my login; others create a separate "hadoop" user for Hadoop.
  • If you have done everything correctly, you should be able to click the triangles on the left to expand the hierarchy view.
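The two properties the steps above refer to look roughly like this in conf/hadoop-site.xml (on newer Hadoop layouts the same keys live in core-site.xml and mapred-site.xml). This is a sketch for a single-node setup; the host and port values are assumptions, so substitute whatever your cluster actually uses:

```xml
<!-- conf/hadoop-site.xml: sketch of a single-node configuration.
     fs.default.name is what the plugin's "DFS Master" must match;
     mapred.job.tracker is what "Map/Reduce Master" must match.
     localhost/8020/8021 are assumed values, not universal defaults. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>
```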

Check core-site.xml and hdfs-site.xml to see what addresses they specify. Both should be hdfs://localhost:[port] or file:///.

Typically, if you are using CDH 5 (the Cloudera QuickStart VM), they are 8021 and 8020 respectively, unless you have done additional configuration.
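To confirm those are really the ports your daemons are listening on, a quick TCP probe is enough. This is a minimal sketch assuming the CDH-style defaults mentioned above (8020 for the NameNode, 8021 for the JobTracker) and a cluster running on localhost:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe the assumed defaults: 8020 = NameNode (fs.default.name),
# 8021 = JobTracker (mapred.job.tracker). Adjust to your own config.
for port in (8020, 8021):
    print(port, "open" if port_open("localhost", port) else "closed")
```

If a port reports closed, the daemon is either not running or bound to a different port than the Eclipse plugin is configured to use.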




