
mock-aws-s3's Introduction

Mock AWS S3 SDK

This is a very simple interface that mocks the AWS SDK for Node.js. The implementation is incomplete but most basic features are supported.

Available:

  • createBucket
  • deleteBucket
  • listObjects
  • listObjectsV2
  • deleteObjects
  • deleteObject
  • getObject
  • headObject
  • putObject
  • copyObject
  • getObjectTagging
  • putObjectTagging
  • upload
  • getSignedUrl

It uses a directory to mock a bucket and its content.
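For illustration, using the basePath, bucket and key from the examples below, a bucket maps to a directory under basePath and an object Key maps to a file path inside it:

/tmp/buckets/
└── example/              <- Bucket 'example'
    └── sea/
        └── animal.json   <- Key 'sea/animal.json'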

If you'd like to see some more features or you have some suggestions, feel free to use the issues or submit a pull request.

Release History

  • 2021-04-10   v4.0.2   Update dependencies, remove extra log and use proper path concat
  • 2020-01-30   v4.0.0   Fix promises and update packages with various contributions.
  • 2018-06-16   v3.0.0   Contributions from @benedict-wellard and @telenor-digital-asia adding support for promises and deleteBucket
  • 2017-08-11   v2.6.0   Contributions from @pamelafox and @fkleon adding support for listObjectsV2, tagging and more useful debug info returned
  • 2017-05-31   v2.5.1   Fix bug when statSync was called on non existing files, spotted by @AllanHodkinson
  • 2017-05-20   v2.5.0   Set LastModified on getObject by @stujo, support for custom metadata on get/head by @rgparkins and putObject returns some data on error by @pamelafox
  • 2017-02-02   v2.4.0   Account for no existing keys when getting an object by @derPuntigamer
  • 2016-06-03   v2.3.0   Add createBucket method and tests by @neilstuartcraig
  • 2016-05-25   v2.2.1   Add Size attribute by @aldafu
  • 2016-04-25   v2.2.0   Add MaxKey options in listObject by @hauboldj
  • 2016-01-18   v2.1.0   Fix markers on listObjects (by @wellsjo) and add send method (by @AllieRays and @IonicaBizau)
  • 2015-11-04   v2.0.0   Static basePath configuration, bound params (by @CJNE) and match upload API (by @kyleseely)
  • 2015-10-25   v1.1.0   Removed because of potential breaking change with bound params
  • 2015-09-24   v1.0.0   Breaking changes and awesome PR to fix API inconsistencies by @irothschild
  • 2015-08-27   v0.5.0   Refactor and default options by @whitingj
  • 2015-07-28   v0.4.0   Add headObject method by @mdlavin
  • 2015-07-21   v0.3.0   Add CommonPrefixes to listObjects by @jakepruitt
  • 2015-03-15   v0.2.7   Mock out AWS' config submodule by @necaris
  • 2015-03-13   v0.2.6   Partial match support and ContentLength by @mick
  • 2015-03-03   v0.2.5   Allow string and fix tests by @lbud
  • 2015-02-05   v0.2.4   Fix url encoding for copy by @ahageali
  • 2015-01-22   v0.2.3   Support for copyObject
  • 2014-02-02   v0.2.1   Support for deleteObject
  • 2014-01-08   v0.2.0   Support streams for getObject/putObject
  • 2013-10-24   v0.1.2   Fix isTruncated typo
  • 2013-10-09   v0.1.1   Add LastModified to listObject
  • 2013-08-09   v0.1.0   First release

Example

Instantiate

var AWSMock = require('mock-aws-s3');
AWSMock.config.basePath = '/tmp/buckets/' // Can configure a basePath for your local buckets
var s3 = AWSMock.S3({
	params: { Bucket: 'example' }
});

PutObject/ListObjects

s3.putObject({Key: 'sea/animal.json', Body: '{"is dog":false,"name":"otter","stringified object?":true}'}, function(err, data) {
	s3.listObjects({Prefix: 'sea'}, function (err, data) {
		console.log(data);
	});
});

CreateBucket

var params = { Bucket: 'example' };
s3.createBucket(params, function(err) {
    if(err) {
        console.error(err);
    }
});

DeleteBucket

var params = { Bucket: 'example' };
s3.deleteBucket(params, function(err) {
    if(err) {
        console.error(err);
    }
});
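Promises

Promise support was added in v3.0.0 and fixed in v4.0.0 (see the release history above). A minimal sketch of promise-style usage, assuming the same method names and parameters as the callback API:

var params = { Bucket: 'example' };
s3.createBucket(params).promise()
	.then(function () {
		return s3.putObject({ Key: 'hello.txt', Body: 'hello world' }).promise();
	})
	.then(function () {
		return s3.listObjects({ Prefix: '' }).promise();
	})
	.then(function (data) {
		console.log(data.Contents);
	})
	.catch(function (err) {
		console.error(err);
	});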

mock-aws-s3's People

Contributors

ahageali, aldafu, allierays, cjne, dependabot[bot], fkleon, gabegorelick, hauboldj, ionicabizau, jamesabc, kronick, macilath, mathieuloutre, mdlavin, michaelcereda, mick, necaris, neilstuartcraig, pamelafox, puinenveturi, rgparkins, stujo, tastefulelk, verdier, viral-sh, wellsjo, whitingj


mock-aws-s3's Issues

putObject should pass callback an object

From the S3 docs, the second parameter to the putObject callback should be "the de-serialized data returned from the request. Set to null if a request error occurs." But mock-aws-s3's version isn't passing anything.
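A hedged illustration of what callers expect; the exact response fields (e.g. ETag) are an assumption based on the official SDK's documented behaviour, not on this library:

s3.putObject({ Key: 'file.txt', Body: 'hello' }, function (err, data) {
	// data should be the de-serialized response object (e.g. { ETag: '"..."' }),
	// but mock-aws-s3 currently passes nothing as the second argument
	console.log(err, data);
});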

Testing for failed cases

Hi again
After doing the work for createBucket (the simple version), I noticed that at least some tests only cover the correct, direct use-case. I think it'd be prudent to verify correct handling of e.g. incorrect method arguments. I realise that you don't want to emulate the AWS API entirely, but I think some handling of incorrect args etc. would be very worthwhile.

Any thoughts?

Cheers

Latest AWS SDK supported?

I've just upgraded both aws-sdk and mock-aws-s3 to their latest version, but I am getting the following type error (using TypeScript):

 Type 'S3' is missing the following properties from type 'S3': 
deleteBucketIntelligentTieringConfiguration, deleteBucketOwnershipControls, getBucketIntelligentTieringConfiguration, getBucketOwnershipControls, and 4 more.

Version used:

mock-aws-s3@npm:4.0.2
aws-sdk@npm:2.995.0

putObject should return an object

mock-aws-s3/lib/mock.js

Lines 221 to 255 in 4b247d6

putObject: function (search, callback) {
	search = _.extend({}, this.defaultOptions, applyBasePath(search));
	var dest = search.Bucket + '/' + search.Key;
	if (typeof search.Body === 'string') {
		search.Body = new Buffer(search.Body);
	}
	if (search.Body instanceof Buffer) {
		fs.createFileSync(dest);
		fs.writeFile(dest, search.Body, function (err) {
			callback(err);
		});
	}
	else {
		fs.mkdirsSync(path.dirname(dest));
		var stream = fs.createWriteStream(dest);
		stream.on('finish', function () {
			callback(null, true);
		});
		search.Body.on('error', function (err) {
			callback(err);
		});
		stream.on('error', function (err) {
			callback(err);
		});
		search.Body.pipe(stream);
	}
},

@AllieRays and I coded a solution to return an object containing the send method, simulating the s3 module. 🍀

@AllieRays Don't forget to fork, commit, pull request that change, so after npm installing you will not lose it. 😂
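For context, a usage sketch of the request-object pattern being simulated; the shape mirrors the real SDK's request objects and is an assumption here, not necessarily the merged implementation:

var req = s3.putObject({ Key: 'sea/animal.json', Body: '{"name":"otter"}' });
req.send(function (err, data) {
	// send() executes the operation and invokes the callback, like the real s3 module
	if (err) return console.error(err);
	console.log(data);
});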

CreateBucket method

Hello

Firstly, thanks for the work you've all put in on this package - I just started using it on a small module I'm writing. I just need 2 methods for my tests, S3.putObject and S3.createBucket - obviously only the former is implemented right now.

I'm just wanting to ask if you'd consider me contributing a PR to add s3.createBucket. Having looked through the source a little, I have an idea how I'd add it to be compatible with the existing code. I guess the simplest way would be to just create the dir on disk and then, perhaps as a second phase, modify e.g. s3.putObject to verify the existence of a bucket before allowing a put (this would also help my use case, incidentally).

Cheers

S3 upload within cucumber.js does not save content

S3 file upload does not work correctly within cucumber.js. The file is created but it does not have any content.

See the attached test cases:

Run node test.js and show that it is working in the normal case.

Run npm test and look for the echoed "content" string within the cucumber message. This should be "content template" but appears as "content", hence no file content; confirm by opening the file within a text editor (C:/projects/test/files/test/test.txt).

test.zip

listObjects delimiter

Hi,

I discovered this library and it's really nice, but I'm encountering a problem with the listObjects and listObjectsV2 methods.

I don't get the same responses as the AWS SDK. I just want to list all directories and files at a specific path, so I specify Delimiter: "/" and Prefix with my path.

The problem is that the response contains all subdirectories and the files inside them.

The official AWS SDK returns only the directories and files directly under the given prefix:

root_folder/
├── file.txt
├── Folder1/
│   └── Subfolder1/
│       ├── File1.txt
│       └── File12.txt
└── Folder2/
    └── File2.txt

Options :

  • Bucket : "my_bucket"
  • Prefix : "root_folder"
  • Delimiter: "/"

The results should contain an array of CommonPrefixes ("Folder1", "Folder2") and the Key "file.txt", but it returns "Subfolder1", "File1", "File12" and "File2" too.
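For reference, a hedged sketch of the call and the response shape the real SDK would produce for the listing above (a trailing slash is added to the prefix so the delimiter groups at the next level; values are illustrative):

s3.listObjects({ Bucket: 'my_bucket', Prefix: 'root_folder/', Delimiter: '/' }, function (err, data) {
	// Expected (real SDK): only the immediate children of the prefix
	//   data.Contents       -> [ { Key: 'root_folder/file.txt', ... } ]
	//   data.CommonPrefixes -> [ { Prefix: 'root_folder/Folder1/' }, { Prefix: 'root_folder/Folder2/' } ]
	// Actual (mock-aws-s3): Subfolder1 and its files, and File2.txt, are returned as well
});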

copyObject is returning /tmp/buckets//<bucket_name>/key

I'm running v4.0.1

copyObject is returning /tmp/buckets//<bucket_name>/key

Code:
const copyObject = s3 => sourceBucket => outputBucket => (oldKey, newKey) => {
  const CopySource = `/${sourceBucket}/${oldKey}`;
  const params = {
    Bucket : outputBucket,
    CopySource,
    Key    : newKey
  };
  return s3.copyObject(params).promise();
};

test file:

const AWSMock = require('mock-aws-s3');

AWSMock.config.basePath = '/tmp/buckets';

// KNOWN_OUTPUT_BUCKET was not defined in the original snippet; the value below matches the output.
const KNOWN_OUTPUT_BUCKET = 'media-output-dev-use1-v1';

const FAKE_S3 = AWSMock.S3({
    params : {
      Bucket : KNOWN_OUTPUT_BUCKET
    }
  }),
  FAKE_OLD_KEY = 'oldkey',
  FAKE_NEW_KEY = 'newkey',
  FAKE_BODY = '123';

copyObject(FAKE_S3)(KNOWN_OUTPUT_BUCKET)(KNOWN_OUTPUT_BUCKET)(FAKE_OLD_KEY, FAKE_NEW_KEY)
  .then(res => {
    console.log('res', res);
  });


Output:

res {
  Bucket: '/tmp/buckets/media-output-dev-use1-v1',
  CopySource: '/tmp/buckets//media-output-dev-use1-v1/oldkey',
  Key: 'newkey'
}
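A hedged note on the likely cause: the double slash looks like plain string concatenation of the basePath with a CopySource that already starts with '/', which the v4.0.2 release note above ("use proper path concat") addresses. For illustration:

var path = require('path');
console.log('/tmp/buckets' + '/' + '/media-output-dev-use1-v1/oldkey');
// -> '/tmp/buckets//media-output-dev-use1-v1/oldkey'
console.log(path.join('/tmp/buckets', 'media-output-dev-use1-v1', 'oldkey'));
// -> '/tmp/buckets/media-output-dev-use1-v1/oldkey'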

Promises are not Working for createBucket

Running the following code:

var AWSMock = require('mock-aws-s3');
AWSMock.config.basePath = './tmp/buckets/' // Can configure a basePath for your local buckets
var s3 = AWSMock.S3({
    params: { Bucket: 'example' }
});

async function test() {
    await s3.createBucket({
        Bucket: "example"
    }).promise()
}

test()

throws this error:

/home/me/testing/node_modules/mock-aws-s3/lib/mock.js:376
				return callback(err);
				       ^

TypeError: callback is not a function
    at /home/me/testing/node_modules/mock-aws-s3/lib/mock.js:376:12
    at /home/me/testing/node_modules/mkdirp/index.js:38:26
    at FSReqWrap.oncomplete (fs.js:153:5)

getObject should answer correctly if object not found

getObject() on an entry which does not exist results in an "ENOENT" (file not found) exception, because the mocked S3 tries to open the object's file without checking its existence. This behavior does not mimic AWS S3, which does not throw an exception in this scenario but instead answers with an object like:

...
code: "NoSuchKey",
message: "The specified key does not exist.",
name: "NoSuchKey",
region: null,
statusCode: 404
...

getSignedUrl doesn't call callback for `putObject` operation

getObject works as expected, but putObject looks like it's still stubbed out without a fake implementation (one that would call the callback or work when promisified).

case 'putObject':

I think the solution would be just to add callback(null, url) on line 625? Would need to look more at the spec for getSignedUrl to make sure the dummy url would work for both.
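A hedged sketch of the suggested one-line change; the surrounding switch and the url variable are assumptions based on the description above, not the actual source:

case 'putObject':
	// ... build the dummy url the same way the getObject branch does ...
	if (callback) {
		return callback(null, url);
	}
	return url;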

Any chance of adding Promises?

The core aws-sdk dependency added promise support via a .promise() call on each operation. Ex:

  var s3 = new AWS.S3();
  s3.listBuckets(params).promise().then();

Any chance of that functionality being added to this library?

getObject LastModified

It looks like getObject doesn't set the LastModified time in the mocked response.

I think this is set by the AWS code. I'm new to S3, though.

no such file or directory, scandir...

I get this error when I try to list an empty bucket:

const s3 = AWSMock.S3({ Bucket: 'my-bucket' });
s3.listObjects({ Bucket: 'my-bucket', Prefix: 'new-one/' }, (err, data) => {
  console.log(data);
});

Error: ENOENT: no such file or directory, scandir '/tmp/buckets//my-bucket'
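A hedged workaround sketch: create the bucket first so the backing directory exists before listing (method names as in the README examples above):

s3.createBucket({ Bucket: 'my-bucket' }, (err) => {
	if (err) return console.error(err);
	s3.listObjects({ Bucket: 'my-bucket', Prefix: 'new-one/' }, (listErr, data) => {
		console.log(listErr || data);
	});
});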

getObject can throw uncaught ENOENT if object is concurrently deleted

Relevant code from getObject:

mock-aws-s3/lib/mock.js

Lines 312 to 314 in 6b5aa02

fs.readFile(path, function (err, data) {
	if (!err) {
		var stat = fs.statSync(path)

Ignoring the fact that this probably shouldn't be doing synchronous I/O (if synchronous I/O were intended, why isn't readFileSync being called instead of the async variant?), the issue is that the underlying file may be deleted between the readFile and statSync calls. This results in an uncaught Error: ENOENT: no such file or directory, instead of the expected NoSuchKey error being returned.
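One hedged way to close the race (a sketch, not the library's actual code) is to stat asynchronously and route ENOENT from either call through the same NoSuchKey mapping; handleError and buildResponse below are hypothetical helpers standing in for the existing error-mapping and response-building logic:

fs.readFile(filePath, function (err, data) {
	if (err) return handleError(err, callback);             // maps ENOENT -> NoSuchKey
	fs.stat(filePath, function (statErr, stat) {
		if (statErr) return handleError(statErr, callback); // file deleted between the two calls
		callback(null, buildResponse(data, stat));
	});
});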

Support for ContinuationToken?

I have noticed that listObjectsV2 does not return NextContinuationToken, which is required when iterating through larger sets of objects on S3.
The response looks like this:

{
      Contents: [
        {
          Key: 'file_0',
          ETag: '"cc6c6102174b3050bc3397c724f00f63"',
          LastModified: 2022-08-17T13:10:29.351Z,
          Size: 3
        },
        {
          Key: 'file_1',
          ETag: '"cc6c6102174b3050bc3397c724f00f63"',
          LastModified: 2022-08-17T13:10:29.351Z,
          Size: 3
        }
      ],
      CommonPrefixes: [ { Prefix: '/' } ],
      IsTruncated: true,
      NextContinuationToken: undefined,
      ContinuationToken: undefined,
      StartAfter: undefined
}

Note that while the tokens are missing, IsTruncated is still set to true, which would lead to an infinite loop in many applications.
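To illustrate why this bites, a typical pagination loop (a sketch, assuming the .promise() API) keys off IsTruncated, so IsTruncated: true combined with NextContinuationToken: undefined re-requests the same page forever:

async function listAllKeys(s3, params) {
	const keys = [];
	let token, truncated;
	do {
		const page = await s3.listObjectsV2(
			Object.assign({}, params, { ContinuationToken: token })
		).promise();
		page.Contents.forEach((obj) => keys.push(obj.Key));
		token = page.NextContinuationToken; // stays undefined with the mock
		truncated = page.IsTruncated;       // stays true -> infinite loop
	} while (truncated);
	return keys;
}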

getObject does not return Error objects for NoSuchKey

When an object doesn't exist, getObject returns an object with the requisite properties, but it's not an Error object. This makes debugging more difficult (since there's no stack property), breaks Nodejs conventions (callers typically expect the first argument to a callback to be an Error object), and deviates from the official AWS SDK (which returns Error objects).

mock-aws-s3/lib/mock.js

Lines 331 to 340 in 6b5aa02

if (err.code === 'ENOENT') {
	return callback({
		cfId: undefined,
		code: 'NoSuchKey',
		message: 'The specified key does not exist.',
		name: 'NoSuchKey',
		region: null,
		statusCode: 404
	}, search)
}

Every other method seems to correctly return Error objects. So this appears to be an outlier.
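A hedged sketch of how the same fields could be carried on a real Error instead (not the library's actual code):

var error = new Error('The specified key does not exist.');
error.code = 'NoSuchKey';
error.name = 'NoSuchKey';
error.statusCode = 404;
error.region = null;
error.cfId = undefined;
return callback(error, search);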

AssertionError when trying to remove a folder (using deleteBucket)

While using the deleteBucket function, I get the following error:

AssertionError [ERR_ASSERTION]: false == true
    at rmdir (/media/sylvain/Store/projects/devpipeline/services/AmbraIngestionServer/node_modules/fs-extra/node_modules/rimraf/rimraf.js:159:5)
    at /media/sylvain/Store/projects/devpipeline/services/AmbraIngestionServer/node_modules/fs-extra/node_modules/rimraf/rimraf.js:97:16
    at FSReqWrap.oncomplete (fs.js:135:15)

After some research on Google, it looks like it is caused by the rimraf package not being up to date.
rimraf comes in via the fs-extra package (also out of date) that mock-aws-s3 depends on.

I have confirmed the error no longer occurs after updating fs-extra from 0.6.4 to 7.0.1 but I don't know what impact updating fs-extra could have on other functions.
