MCR problems with linux/hadoop
2 views (last 30 days)
Evan
on 15 Jul 2013
Answered: Rick Amos on 12 Nov 2014
I am trying to run a program called Phenoripper (image analysis software) that uses MCR inside a multi-node Hadoop setup (version 1.1.2) on Ubuntu 13.04. The Hadoop setup works perfectly with generic Python/Java MapReduce jobs; however, when I try to use Phenoripper in the setup I get this error:
boost::filesystem::create_directory:permission denied
I have asked the developers of Phenoripper about this, and they have assured me that their program does not create or write to temporary directories; since it works perfectly on a single computer, I believe MCR is responsible for this error. If I understand correctly, it is trying to create a temporary directory somewhere it doesn't have permission, likely on one of the slave nodes. Does anyone have any idea where this directory might be located or how to find it? If I can find it, my solution would be to create it permanently with appropriate permissions. Does this sound like a viable solution? Any ideas as to what on earth is going on and how to fix it would be much appreciated!
0 comments
Accepted Answer
Problem fixed. The most recent version of MCR as of this post has a major bug in which some process looks for a /homes/ directory containing the .matlab directory. Some programmer must have put an extra "s" in there, which cost days of pain. Creating that directory with 777 permissions and putting the .matlab folder in it fixed the problem. This is a major bug for any distributed computing task attempting to use MCR and should be fixed in the next release.
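As a rough sketch of that workaround (the /homes path and the 777 permissions come from the fix described above; the user name 'hduser' and the source location of the .matlab folder are assumptions for illustration), something like the following would be run on every node that executes MCR tasks:

    # Create the directory MCR mistakenly looks for and make it writable,
    # then copy the existing .matlab folder into it.
    # 'hduser' and the source path are assumptions; adjust for your cluster.
    sudo mkdir -p /homes
    sudo chmod 777 /homes
    cp -r /home/hduser/.matlab /homes/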
More Answers (1)
Rick Amos
on 12 Nov 2014
This behavior arises because Hadoop defines the 'HOME' environment variable of task processes to be '/homes' by default. MATLAB requires the 'HOME' environment variable to point to a valid writable directory. If you have control of the Hadoop configuration for the cluster, an alternative solution is to set one of the following two Hadoop configuration properties:
mapreduce.admin.user.home.dir (for Hadoop v1.X)
yarn.nodemanager.user-home-dir (for Hadoop v2.X)
For example, if the Hadoop cluster is running as 'hduser', one option is to set these properties to '/home/hduser'.
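As a sketch of what that change might look like (only the property names and the example value '/home/hduser' come from this answer; the file names and the need to restart the Hadoop daemons afterward are assumptions about a typical setup), the property would be added to the cluster configuration on each node:

    <!-- Hadoop 1.x: mapred-site.xml -->
    <property>
      <name>mapreduce.admin.user.home.dir</name>
      <value>/home/hduser</value>
    </property>

    <!-- Hadoop 2.x: yarn-site.xml -->
    <property>
      <name>yarn.nodemanager.user-home-dir</name>
      <value>/home/hduser</value>
    </property>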
0 comments