cesnet-hadoop's People

Contributors

knackjax, valtri

cesnet-hadoop's Issues

java.lang.IllegalStateException: Variable substitution depth too large: 20

Hello!
I have 36 data directories on one server.
When I start the DataNode I get this error:

2018-06-22 15:11:47,015 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2018-06-22 15:11:47,157 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.IllegalStateException: Variable substitution depth too large: 20 file:///data/1//${user.name}/dfs/data,file:///data/2//${user.name}/dfs/data,file:///data/3//${user.name}/dfs/data,file:///data/4//${user.name}/dfs/data,file:///data/5//${user.name}/dfs/data,file:///data/6//${user.name}/dfs/data,file:///data/7//${user.name}/dfs/data,file:///data/8//${user.name}/dfs/data,file:///data/9//${user.name}/dfs/data,file:///data/10//${user.name}/dfs/data,file:///data/11//${user.name}/dfs/data,file:///data/12//${user.name}/dfs/data,file:///data/13//${user.name}/dfs/data,file:///data/14//${user.name}/dfs/data,file:///data/15//${user.name}/dfs/data,file:///data/16//${user.name}/dfs/data,file:///data/17//${user.name}/dfs/data,file:///data/18//${user.name}/dfs/data,file:///data/19//${user.name}/dfs/data,file:///data/20//${user.name}/dfs/data,file:///data/21//${user.name}/dfs/data,file:///data/22//${user.name}/dfs/data,file:///data/23//${user.name}/dfs/data,file:///data/24//${user.name}/dfs/data,file:///data/25//${user.name}/dfs/data,file:///data/26//${user.name}/dfs/data,file:///data/27//${user.name}/dfs/data,file:///data/28//${user.name}/dfs/data,file:///data/29//${user.name}/dfs/data,file:///data/30//${user.name}/dfs/data,file:///data/31//${user.name}/dfs/data,file:///data/32//${user.name}/dfs/data,file:///data/33//${user.name}/dfs/data,file:///data/34//${user.name}/dfs/data,file:///data/35//${user.name}/dfs/data,file:///data/36//${user.name}/dfs/data
        at org.apache.hadoop.conf.Configuration.substituteVars(Configuration.java:1058)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1078)
        at org.apache.hadoop.conf.Configuration.getTrimmedStringCollection(Configuration.java:1974)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.getStorageLocations(DataNode.java:2473)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2465)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2516)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2698)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2722)
2018-06-22 15:11:47,160 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2018-06-22 15:11:47,161 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:

To fix this error I replaced ${user.name} with the actual HDFS username.
For now, I just created a variable:

class hadoop::params {

  $hadoop_user_name = lookup('hadoop::params::hadoop_user_name', String, 'first', '${user.name}')

and used it in the code:

  $hdfs_datanode_suffix = "${::osfamily}-${::operatingsystem}" ? {
    /RedHat-Fedora/ => "/${hadoop_user_name}/dfs/datanode",
    /Debian|RedHat/ => "/${hadoop_user_name}/dfs/data",
  }

But the variable ${user.name} is widely used and I don't know the right way to fix it.
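In the affected Hadoop 2.x releases, Configuration.substituteVars makes at most 20 substitution passes over a property value before throwing, so a value containing 36 occurrences of ${user.name} trips the limit even though there is no recursion. One way to sidestep it is to render the directory list with a concrete user name, so Hadoop never has to substitute anything. A minimal sketch, assuming the module exposes a raw properties hash (the parameter name and the 'hdfs' user are assumptions; check the module's README):

```puppet
# Sketch: build dfs.datanode.data.dir with a literal user name instead of
# ${user.name}.  'hdfs' and the 'properties' parameter are assumptions.
# range() and join() come from puppetlabs-stdlib; .map needs the future parser.
$hdfs_user = 'hdfs'
$data_dirs = range(1, 36).map |$i| { "file:///data/${i}/${hdfs_user}/dfs/data" }

class { '::hadoop':
  properties => {
    'dfs.datanode.data.dir' => join($data_dirs, ','),
  },
}
```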

hdfs_port_namenode_http is incorrectly set if the HDFS package is not installed using cesnet-hadoop

Hello,

The following snippet from manifests/init.pp is not working correctly for me and, IMHO, is incorrect.

  case "${::hadoop::version}." {
    /^2(\.)?/: {
      $hdfs_port_namenode_http = '50070'
      $hdfs_port_namenode_https = '50470'
      $hdfs_port_namenode = '8020'
    }
    default: {
      $hdfs_port_namenode_http = '9870'
      $hdfs_port_namenode_https = '9871'
      # changed back from 9820 to 8020 in Hadoop 3.0.1
      $hdfs_port_namenode = '8020'
    }
  }

If HDFS has been deployed manually, without using the class itself, ::hadoop::version is undefined and thus defaults to the Hadoop 3.x ports, which in my case is incorrect.

Would you at least provide a way to override these ports via class parameters?
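A hedged sketch of what such an override could look like, using pick() from puppetlabs-stdlib so an explicitly set parameter wins over the version-derived default (the parameter names here are hypothetical, not part of the module today):

```puppet
class hadoop (
  # Hypothetical parameters: when set, they override the version-derived ports.
  $hdfs_port_namenode_http  = undef,
  $hdfs_port_namenode_https = undef,
  # ... existing parameters ...
) {
  case "${version}." {
    /^2(\.)?/: {
      $_default_http  = '50070'
      $_default_https = '50470'
    }
    default: {
      $_default_http  = '9870'
      $_default_https = '9871'
    }
  }
  # pick() returns its first non-undef argument (puppetlabs-stdlib)
  $_port_namenode_http  = pick($hdfs_port_namenode_http,  $_default_http)
  $_port_namenode_https = pick($hdfs_port_namenode_https, $_default_https)
}
```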

Best regards, Adam.

Failed to parse template hadoop/hadoop/core-site.xml.erb

Hello,
I'm trying to run Hadoop on a CentOS box and it fails at the very beginning:

puppet agent -t

Info: Retrieving pluginfacts
Info: Retrieving plugin
Info: Loading facts
Info: Loading facts
Error: Could not retrieve catalog from remote server: Error 400 on SERVER: Failed to parse template hadoop/hadoop/core-site.xml.erb:
Filepath: /etc/puppet/modules/hadoop/templates/hadoop/core-site.xml.erb
Line: 32
Detail: private method `select' called for nil:NilClass
at /etc/puppet/modules/hadoop/manifests/common/config.pp:11 on node hhvmsparkqat01.easycash.de
Warning: Not using cache on failed catalog
Error: Could not retrieve catalog; skipping run

Illegal fully qualified name with parser = future

Hello,

The latest version of the module seems to fail with the future parser:

Error: Could not retrieve catalog from remote server: Error 400 on SERVER: Evaluation Error: Error while evaluating a Function Call, Illegal fully qualified name in file /etc/puppet/environments/production/modules/zookeeper/manifests/config.pp at line 60:38 at /etc/puppet/environments/production/modules/zookeeper/manifests/init.pp:52:3 on node hdfs-master1.service.earthlab.lu

The same occurs at line 67.

Replacing it with content => "${_myid}" seems to fix the issue (for me), but it may have broken something else.
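For reference, the workaround amounts to forcing the value to a string before handing it to the file resource; the future parser is stricter about non-String content values. A sketch (the resource path and $_myid are inferred from the error context and may not match the module's code exactly):

```puppet
# Sketch: explicit string interpolation avoids the evaluation error under
# parser = future.  The path is hypothetical, based on zookeeper::config.
file { "${datadir}/myid":
  content => "${_myid}",
}
```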

Best regards, Adam.
