[Solved-2 Solutions] Apache Pig permissions issue?
- I am attempting to get Apache Pig up and running on my Hadoop cluster, and am encountering a permissions problem. Pig itself launches and connects to the cluster just fine; from within the Pig shell, I can `ls` through and around the HDFS directories.
- However, when I actually try to load data and run Pig commands, I run into permissions-related errors.
- In this case, `all_annotated.txt` is a file in my HDFS home directory that I created and most definitely have permission to read; the same problem occurs no matter what file I try to load. The problem, as the error itself indicates, is that Pig is trying to write somewhere it is not allowed to.
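A minimal session that reproduces this kind of failure might look like the following (the file name comes from the question above; the `LOAD`/`DUMP` statements are standard Pig Latin, shown here only as an illustrative sketch):

```pig
-- From the Pig (grunt) shell: reading the file is permitted,
-- but DUMP triggers a MapReduce job that must write temporary
-- output, which is where the permissions error surfaces.
grunt> A = LOAD 'all_annotated.txt' USING PigStorage('\t');
grunt> DUMP A;
```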
Any ideas as to what might be going on?
The problem is probably your `pig.temp.dir` setting. It defaults to `/tmp` on HDFS, and Pig writes its temporary results there. If you don't have write permission to `/tmp`, Pig will complain. You can override it with `-Dpig.temp.dir`.
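For example, assuming your HDFS home directory is `/user/yourname` (an example path, substitute your own), you could point Pig's temporary output at a directory you own when launching it:

```shell
# Redirect Pig's intermediate results away from /tmp on HDFS
# to a directory the current user can write to (example path).
pig -Dpig.temp.dir=/user/yourname/tmp myscript.pig
```

The same property can also be set once in `conf/pig.properties` instead of on every invocation.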
Another possibility is that `hadoop.tmp.dir` is pointing at HDFS when it should be a directory on your local filesystem. Try setting that property to a local directory; you can run into the same error using regular MapReduce in Hadoop.
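If that is the cause, a sketch of the fix in `core-site.xml` would look like this (the path shown is the common default-style value, given here only as an example):

```xml
<!-- core-site.xml: hadoop.tmp.dir should be a local filesystem path,
     not an HDFS path. Example value shown. -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/tmp/hadoop-${user.name}</value>
</property>
```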