2016-04-30 07:06:12

Exception in thread "main" 
org.apache.hadoop.security.AccessControlException: Permission denied: 
user=publisher, access=EXECUTE, inode="/data/foo/configuration.xml":publisher:publisher:-rw-r--r--
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)


Always a nice error message to waste some time on: your HDFS user seems to have all the required permissions, the file exists, and still you get a Permission denied.
In this case, the EXECUTE permission seems to be missing, but that's surely bogus, since you only want to read a single file, not execute it.
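The EXECUTE bit makes more sense once you remember how path lookup works: like POSIX, HDFS requires EXECUTE (search) permission on every intermediate component of a path before it even gets to the file itself. A minimal sketch of such a traverse check (owner-only permissions, deliberately simplified; this is not the actual HDFS implementation):

```java
import java.util.Arrays;
import java.util.List;

public class TraverseCheckDemo {

    static class Inode {
        final String path;
        final boolean isDirectory;
        final String mode; // e.g. "rwxr-xr-x"

        Inode(String path, boolean isDirectory, String mode) {
            this.path = path;
            this.isDirectory = isDirectory;
            this.mode = mode;
        }
    }

    // Simplified traverse check: every component before the last must be a
    // directory carrying the EXECUTE bit (here only the owner's 'x' bit).
    static void checkTraverse(List<Inode> components) {
        for (int i = 0; i < components.size() - 1; i++) {
            Inode inode = components.get(i);
            boolean execute = inode.mode.charAt(2) == 'x';
            if (!inode.isDirectory || !execute) {
                throw new SecurityException(
                        "Permission denied: access=EXECUTE, inode=\"" + inode.path + "\"");
            }
        }
    }

    public static void main(String[] args) {
        // A path that treats configuration.xml as an intermediate directory:
        List<Inode> bogus = Arrays.asList(
                new Inode("/data", true, "rwxr-xr-x"),
                new Inode("/data/foo", true, "rwxr-xr-x"),
                new Inode("/data/foo/configuration.xml", false, "rw-r--r--"),
                new Inode("configuration.xml", false, "rw-r--r--"));
        try {
            checkTraverse(bogus);
        } catch (SecurityException e) {
            // The file's -rw-r--r-- mode can never satisfy the traverse check,
            // so we get the same EXECUTE denial as in the log above.
            System.out.println(e.getMessage());
        }
    }
}
```

A regular file fails this check no matter what its read permissions are, which is exactly why the message blames EXECUTE.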
Turns out that when a path component which is actually a file gets treated as a folder, HDFS checks its permissions during path traversal and denies access, because a regular file cannot be traversed like a directory. In this case, the code ended up looking for
/data/foo/configuration.xml/foo/configuration.xml because two pieces of code each appended part of the path to the final filename.
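The doubled path can come from something as innocent as this (a hypothetical `resolveConfig` helper, using `java.nio.file` for illustration instead of Hadoop's `Path`): one layer resolves the relative config path against the base directory, and a downstream layer resolves it again against what is by then already the full file path.

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class DoubleAppendDemo {

    // Hypothetical helper: resolves the config file relative to a base path.
    static Path resolveConfig(Path base) {
        return base.resolve("foo/configuration.xml");
    }

    public static void main(String[] args) {
        Path base = Paths.get("/data");
        Path once = resolveConfig(base);  // correct: /data/foo/configuration.xml
        // Bug: a second component resolves the relative part again, treating
        // the already-complete file path as if it were still the base directory.
        Path twice = resolveConfig(once); // /data/foo/configuration.xml/foo/configuration.xml
        System.out.println(once);
        System.out.println(twice);
    }
}
```

The lesson: when a Permission denied mentions EXECUTE on a plain file, print the fully resolved path first; chances are the path, not the permissions, is wrong.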