
Comments (8)

adrian-wang commented on July 19, 2024

We haven't tested HiBench on S3 before, but I think it should work.
From the log you provided, it seems your Hadoop is not configured properly for S3. You may want to refer to the Hadoop documentation on using S3 (the S3A connector) for help.
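For reference, a minimal core-site.xml sketch for S3A access could look like the following (the key values and endpoint are placeholders; the hadoop-aws module and the AWS SDK jars typically also need to be on the classpath):

<!-- core-site.xml: S3A credentials (placeholder values) -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
<!-- optional: only needed for non-default or region-specific endpoints -->
<property>
  <name>fs.s3a.endpoint</name>
  <value>s3.amazonaws.com</value>
</property>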


ZuluPro commented on July 19, 2024

Hello @adrian-wang

After configuring fs.default.name with s3a://, I get the same kind of error: org.apache.hadoop.mapred.YarnClientProtocolProvider due to error: Error in instantiating YarnClient

Do you think HiBench could officially support S3 benchmarks?


adrian-wang commented on July 19, 2024

HiBench only runs big data workloads on your cluster, so I think you should verify your S3 settings together with your YARN settings. Make sure they work before you begin the benchmark.


bgaborg commented on July 19, 2024

export DATA_HDFS=s3://

You should use s3a:// instead.
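For example, a sketch with a hypothetical bucket name and path:

# same DATA_HDFS setting, but using the s3a scheme; bucket name and path are placeholders
export DATA_HDFS=s3a://your-bucket/HiBench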


weinick commented on July 19, 2024

@bgaborg: I'm using s3a://, but I hit a similar error. Did you run it successfully before? Can you share your configuration if it works? Thanks.

19/06/12 20:02:25 INFO dfsioe.DfsioeConfig: Setting testRootDir to 's3a://hibenchdata/HiBench/Dfsioe/Input'
java.lang.IllegalArgumentException: Wrong FS: s3a://hibenchdata/HiBench/Dfsioe/Input/io_control, expected: hdfs://hacluster
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:665)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:218)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:839)


adrian-wang commented on July 19, 2024

@weinick I think you should configure fs.defaultFS in core-site.xml to s3a://....
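A sketch of that property, with a placeholder bucket name:

<!-- core-site.xml: use the S3A bucket as the default filesystem (placeholder bucket name) -->
<property>
  <name>fs.defaultFS</name>
  <value>s3a://your-bucket</value>
</property>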


weinick commented on July 19, 2024

@adrian-wang Hi Adrian, thanks for your reply. As you suggested, I tried updating fs.defaultFS in my YARN core-site.xml; it still doesn't work, but the error message changed. I have also added the S3 access key and secret key to this file, and I can run an S3-based wordcount job. I'm pretty sure the bucket name and the credentials are correct. Do you have any other ideas?

19/06/12 22:04:48 INFO mapreduce.Job: map 0% reduce 0%
19/06/12 22:04:48 INFO mapreduce.Job: Job job_1560180791815_0030 failed with state FAILED due to: Application application_1560180791815_0030 failed 2 times due to AM Container for appattempt_1560180791815_0030_000002 exited with exitCode: -1000
Failing this attempt.Diagnostics: doesBucketExist on hibenchdata: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: The requested metadata is not found at http://169.254.169.254/latest/meta-data/iam/security-credentials/: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: The requested metadata is not found at http://169.254.169.254/latest/meta-data/iam/security-credentials/
org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on hibenchdata: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: The requested metadata is not found at http://169.254.169.254/latest/meta-data/iam/security-credentials/: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: The requested metadata is not found at http://169.254.169.254/latest/meta-data/iam/security-credentials/


adrian-wang commented on July 19, 2024

@weinick Can you run hadoop fs -ls / with the same configuration? From your log it appears there's something wrong with your credential settings.
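For example, a quick check along these lines (the bucket name is the one from your log, used here as a placeholder):

# run on the node that submits the HiBench jobs, using the same core-site.xml
hadoop fs -ls /                    # the default filesystem should be reachable
hadoop fs -ls s3a://hibenchdata/   # the bucket should list if the S3A credentials are picked up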

