Does Spring for Apache Hadoop support running from a Windows client? I assume it does, since the MapReduce sample includes Windows-specific batch files to execute it.
When I build and run the sample on a Windows client, connecting to my cluster, it fails: first it reports that it cannot locate the native winutils binary, and although the job is then submitted, the application fails on the cluster. Here is the full client log:
11:40:41,919 INFO t.support.ClassPathXmlApplicationContext: 510 - Refreshing org.springframework.context.support.ClassPathXmlApplicationContext@659297ab: startup date [Tue Feb 11 11:40:41 EST 2014]; root of context hierarchy
11:40:42,176 INFO eans.factory.xml.XmlBeanDefinitionReader: 315 - Loading XML bean definitions from class path resource [META-INF/spring/application-context.xml]
11:40:42,895 INFO ort.PropertySourcesPlaceholderConfigurer: 172 - Loading properties file from class path resource [hadoop.properties]
11:40:42,922 INFO ctory.support.DefaultListableBeanFactory: 596 - Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@74ab6b5: defining beans [org.springframework.context.support.PropertySourcesPlaceholderConfigurer#0,hadoopConfiguration,wordcountJob,runner]; root of factory hierarchy
11:40:43,166 INFO he.hadoop.conf.Configuration.deprecation: 840 - fs.default.name is deprecated. Instead, use fs.defaultFS
11:40:44,706 ERROR org.apache.hadoop.util.Shell: 303 - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:293)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
    at org.apache.hadoop.conf.Configuration.getTrimmedStrings(Configuration.java:1546)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:519)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:453)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2433)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:166)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:351)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:287)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:466)
    at org.springframework.data.hadoop.mapreduce.JobFactoryBean.afterPropertiesSet(JobFactoryBean.java:208)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1547)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1485)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:524)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:461)
    at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:295)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:223)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:292)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:608)
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:932)
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:479)
    at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:197)
    at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:172)
    at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:158)
    at org.springframework.samples.hadoop.mapreduce.Wordcount.main(Wordcount.java:28)
11:40:45,142 INFO org.apache.hadoop.yarn.client.RMProxy: 56 - Connecting to ResourceManager at hd-dn-01.grcrtp.local/10.6.64.232:8050
11:40:45,245 INFO ramework.data.hadoop.mapreduce.JobRunner: 192 - Starting job [wordcountJob]
11:40:45,302 INFO org.apache.hadoop.yarn.client.RMProxy: 56 - Connecting to ResourceManager at hd-dn-01.grcrtp.local/10.6.64.232:8050
11:40:45,971 WARN org.apache.hadoop.mapreduce.JobSubmitter: 258 - No job jar file set. User classes may not be found. See Job or Job#setJar(String).
11:40:46,080 INFO doop.mapreduce.lib.input.FileInputFormat: 287 - Total input paths to process : 1
11:40:46,422 INFO org.apache.hadoop.mapreduce.JobSubmitter: 394 - number of splits:1
11:40:46,441 INFO he.hadoop.conf.Configuration.deprecation: 840 - user.name is deprecated. Instead, use mapreduce.job.user.name
11:40:46,442 INFO he.hadoop.conf.Configuration.deprecation: 840 - fs.default.name is deprecated. Instead, use fs.defaultFS
11:40:46,444 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class
11:40:46,444 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapred.used.genericoptionsparser is deprecated. Instead, use mapreduce.client.genericoptionsparser.used
11:40:46,450 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
11:40:46,450 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapred.job.name is deprecated. Instead, use mapreduce.job.name
11:40:46,450 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
11:40:46,451 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapreduce.reduce.class is deprecated. Instead, use mapreduce.job.reduce.class
11:40:46,451 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
11:40:46,452 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
11:40:46,452 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
11:40:46,454 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class
11:40:46,454 INFO he.hadoop.conf.Configuration.deprecation: 840 - mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
11:40:46,820 INFO org.apache.hadoop.mapreduce.JobSubmitter: 477 - Submitting tokens for job: job_1391711633872_0022
11:40:47,127 INFO org.apache.hadoop.mapred.YARNRunner: 368 - Job jar is not present. Not adding any jar to the list of resources.
11:40:47,225 INFO doop.yarn.client.api.impl.YarnClientImpl: 174 - Submitted application application_1391711633872_0022 to ResourceManager at hd-dn-01.grcrtp.local/10.6.64.232:8050
11:40:47,291 INFO org.apache.hadoop.mapreduce.Job:1272 - The url to track the job: http://http://hd-dn-01.grcrtp.local:8088/proxy/application_1391711633872_0022/
11:40:47,292 INFO org.apache.hadoop.mapreduce.Job:1317 - Running job: job_1391711633872_0022
11:40:50,330 INFO org.apache.hadoop.mapreduce.Job:1338 - Job job_1391711633872_0022 running in uber mode : false
11:40:50,332 INFO org.apache.hadoop.mapreduce.Job:1345 - map 0% reduce 0%
11:40:50,356 INFO org.apache.hadoop.mapreduce.Job:1358 - Job job_1391711633872_0022 failed with state FAILED due to: Application application_1391711633872_0022 failed 2 times due to AM Container for appattempt_1391711633872_0022_000002 exited with exitCode: 1 due to: Exception from container-launch:
org.apache.hadoop.util.Shell$ExitCodeException: /bin/bash: line 0: fg: no job control
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
.Failing this attempt.. Failing the application.
11:40:50,434 INFO org.apache.hadoop.mapreduce.Job:1363 - Counters: 0
11:40:50,470 INFO ramework.data.hadoop.mapreduce.JobRunner: 202 - Completed job [wordcountJob]
11:40:50,507 INFO org.apache.hadoop.yarn.client.RMProxy: 56 - Connecting to ResourceManager at hd-dn-01.grcrtp.local/10.6.64.232:8050
11:40:50,590 INFO ctory.support.DefaultListableBeanFactory: 444 - Destroying singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@74ab6b5: defining beans [org.springframework.context.support.PropertySourcesPlaceholderConfigurer#0,hadoopConfiguration,wordcountJob,runner]; root of factory hierarchy
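Regarding the first error: the "null\bin\winutils.exe" in the exception suggests Hadoop's Shell class is trying to resolve winutils.exe from the hadoop.home.dir system property (falling back to the HADOOP_HOME environment variable), and neither is set on my machine. A minimal sketch of setting it before the Spring context loads — the C:\hadoop path is just an assumption for illustration, it only needs to contain bin\winutils.exe:

```java
public class WinutilsSetup {
    public static void main(String[] args) {
        // Hadoop's Shell class looks up winutils.exe under
        // %hadoop.home.dir%\bin (or %HADOOP_HOME%\bin). The path below
        // is an assumed local install directory, not something from my setup.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        // ... then create the ClassPathXmlApplicationContext as the sample does.
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

I suspect this would only silence the client-side ERROR, though, since the job is submitted despite it.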
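Regarding the second error: "fg: no job control" from bash on the NodeManager looks like the container launch script being generated with Windows-style syntax because the job was submitted from a Windows client (this appears to match MAPREDUCE-4052). Newer Hadoop releases reportedly expose a client-side switch for cross-platform submission; whether it is available depends on the cluster version, so this is a guess rather than something I have verified:

```xml
<!-- Guess at a fix, not verified: tells the client to generate
     platform-neutral launch commands; requires a Hadoop version
     that supports cross-platform submission on both sides. -->
<property>
  <name>mapreduce.app-submission.cross-platform</name>
  <value>true</value>
</property>
```

Is submitting from a Windows client to a Linux cluster supported by Spring Hadoop out of the box, or do I need client-side settings like the above?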