dynalite's People

Contributors

betamoo, bwitt, dependabot[bot], evalphobia, gerst20051, ieiayaobb, mhart, mick, rclark, reconbot, rudolf, ryanblock, saebyn, vogonistic, willwhite

dynalite's Issues

NetworkingError: socket hang up

I am getting this error

{ [NetworkingError: socket hang up]
  message: 'socket hang up',
  code: 'NetworkingError',
  region: 'us-east-1',
  hostname: 'localhost',
  retryable: true,
  time: Fri Jan 15 2016 15:27:57 GMT+0530 (IST) }

while trying to create a table for testing. Can someone please point me in the right direction?

delete table return object differs from AWS

Hi Mhart and thanks for the good work,

Just so you know, the delete table response includes some extra info compared to AWS:

AWS:

{
    "TableDescription": {
        "TableName": "test",
        "TableStatus": "DELETING",
        "ProvisionedThroughput": {
            "NumberOfDecreasesToday": 0,
            "ReadCapacityUnits": 5,
            "WriteCapacityUnits": 5
        },
        "TableSizeBytes": 0,
        "ItemCount": 0,
        "TableArn": "arn:aws:dynamodb:us-west-1:886704940455:table/test"
    }
}

then dynalite (I removed the extra values and just kept the keys):

{
    "TableDescription": {
        "AttributeDefinitions": ********,
        "TableName": "test",
        "KeySchema": ********,
        "TableStatus": "DELETING",
        "CreationDateTime": ********,
        "ProvisionedThroughput": {
            "NumberOfDecreasesToday": 0,
            "ReadCapacityUnits": 5,
            "WriteCapacityUnits": 5
        },
        "TableSizeBytes": 0,
        "ItemCount": 0,
        "TableArn": "arn:aws:dynamodb:us-east-1:000000000000:table/test",
        "LocalSecondaryIndexes": ********
    }
}

Not a real problem-causing issue though 😄

Error: Cannot find module 'memdown'

Hello @mhart ,

I'm on Ubuntu 14.04 and following these steps:

curl -sL https://deb.nodesource.com/setup | sudo bash -
sudo apt-get -y install nodejs
sudo apt-get -y install build-essential
sudo npm install -g dynalite
sudo /usr/bin/dynalite --port 8080

And getting:

sarxos@sarxos-comp:~/workspace$ sudo /usr/bin/dynalite --port 8080

module.js:340
    throw err;
          ^
Error: Cannot find module 'memdown'
    at Function.Module._resolveFilename (module.js:338:15)
    at Function.Module._load (module.js:280:25)
    at Module.require (module.js:364:17)
    at require (module.js:380:17)
    at Object.<anonymous> (/usr/lib/node_modules/dynalite/db/index.js:4:15)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Module.require (module.js:364:17)

I've executed sudo npm install -g memdown as well, but it didn't help.

The npm version:

sarxos@sarxos-comp:~/workspace$ sudo npm -v
3.7.5

And node version:

sarxos@sarxos-comp:~/workspace$ node --version
v0.10.42

It would be great if you have any ideas on how this could be resolved.

Validate between and raise error

Hello @mhart,

Thank you for creating Dynalite!

While testing against DynamoDB and Dynalite, I noticed an inconsistency in validating the BETWEEN key condition expression values.

Steps to validate the issue:

Sample Table schema:

      TableName: 'sometable',
      KeySchema: [
        {AttributeName: 'hashKey', KeyType: 'HASH'},
        {AttributeName: 'rangeKey', KeyType: 'RANGE'}
      ],
      AttributeDefinitions: [
        {AttributeName: 'hashKey', AttributeType: 'S'},
        {AttributeName: 'rangeKey', AttributeType: 'N'}
      ]

Add two items with range keys 0 and 1 under the same hashKey "hello":

      TableName: 'sometable',
      Item: {
        hashKey: {S: 'hello'},
        rangeKey: {N: '0'}
      },
...
      Item: {
        hashKey: {S: 'hello'},
        rangeKey: {N: '1'}
      },

Query the table using BETWEEN range 0 and 1:

      TableName: 'sometable',
      KeyConditionExpression: 'hashKey = "hello" AND rangeKey BETWEEN 0 AND 1'

DynamoDB: Ok, returns [item with rangeKey 0, item with rangeKey 1]
Dynalite: Ok, returns [item with rangeKey 0, item with rangeKey 1]

Query the table using BETWEEN range 2 and 1:

      TableName: 'sometable',
      KeyConditionExpression: 'hashKey = "hello" AND rangeKey BETWEEN 2 AND 1'

DynamoDB: Error, ValidationException: Invalid KeyConditionExpression: The BETWEEN operator requires upper bound to be greater than or equal to lower bound; lower bound operand: AttributeValue: {N:2}, upper bound operand: AttributeValue: {N:1}
Dynalite: Ok, no error thrown

The fix for this issue is straightforward.

      if (comparisonOperator === 'BETWEEN') {
        var lowerBound = data.KeyConditions[attr].AttributeValueList[0];
        var upperBound = data.KeyConditions[attr].AttributeValueList[1];

        // at this point AttributeValueList for the BETWEEN op looks like: [ {N: '42'}, {N: '4.2'} ]
        if (Object.keys(lowerBound).join('') === 'N' && Object.keys(upperBound).join('') === 'N') {
          var lowerBoundValue = Number(lowerBound.N);
          var upperBoundValue = Number(upperBound.N);

          if (lowerBoundValue > upperBoundValue) {
            var errMsg = 'Invalid KeyConditionExpression: The BETWEEN operator requires upper bound to be greater ' +
              'than or equal to lower bound; lower bound operand: AttributeValue: ' + JSON.stringify(lowerBound) +
              ', upper bound operand: AttributeValue: ' + JSON.stringify(upperBound);
            return cb(db.validationError(errMsg));
          }

        }

      }
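For reference, here is a self-contained sketch of the same bound check, with the validationError helper stubbed out for illustration (in dynalite the real helper comes from its db module); this is a sketch of the idea, not the exact patch:

```javascript
// Standalone sketch of the BETWEEN bound check; validationError is stubbed
// here for illustration (dynalite's real helper lives in its db module).
function validationError(msg) {
  var err = new Error(msg)
  err.name = 'ValidationException'
  return err
}

// Returns an error if both bounds are numbers and lower > upper, else null.
function checkBetweenBounds(attributeValueList) {
  var lowerBound = attributeValueList[0]
  var upperBound = attributeValueList[1]
  if (Object.keys(lowerBound).join('') === 'N' && Object.keys(upperBound).join('') === 'N') {
    if (Number(lowerBound.N) > Number(upperBound.N)) {
      return validationError(
        'Invalid KeyConditionExpression: The BETWEEN operator requires upper bound ' +
        'to be greater than or equal to lower bound; lower bound operand: AttributeValue: ' +
        JSON.stringify(lowerBound) + ', upper bound operand: AttributeValue: ' +
        JSON.stringify(upperBound))
    }
  }
  return null
}

console.log(checkBetweenBounds([{N: '2'}, {N: '1'}]) instanceof Error) // true
console.log(checkBetweenBounds([{N: '0'}, {N: '1'}]))                  // null
```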

Adding the above check after this line https://github.com/mhart/dynalite/blob/master/actions/query.js#L61 fixes the issue. Could you please validate this patch, and possibly add it to dynalite? I'm happy to send a PR as well.

Thanks!

Either the KeyConditions or KeyConditionExpression parameter must be specified

Hello @mhart,

At the very beginning I would like to thank you for the awesome tool!

Now let's get to the problem description. I'm using this Docker image to start dynalite. The AWS SDK is the newest one available from Maven Central; to reproduce this case I'm using 1.10.56. The reason I'm using Docker is issue #51.

Setup stack:

docker run -d -p 8080:8080 vsouza/dynamo-local --port 8080

Execute the reproducer (code available at the end of this ticket).

Then, I'm getting this exception:

Exception in thread "main" com.amazonaws.AmazonServiceException: Either the KeyConditions or KeyConditionExpression parameter must be specified in the request. (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: 1VINPYAMXI2VWGE7BX1TLEUCKH3ONXHIGC2DTGQ4OXR3X5ZU6C1T)
    at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:1369)
    at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:913)
    at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:631)
    at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:400)
    at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:362)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:311)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:1822)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.query(AmazonDynamoDBClient.java:1459)
    at TestProblem.main(TestProblem.java:112)

From what I understand, dynalite claims I'm sending neither KeyConditions nor KeyConditionExpression, but my code to build the request is the following:

QueryRequest request = new QueryRequest()
    .withTableName(table)
    .withConsistentRead(false)
    .withIndexName("rid-index")
    .withKeyConditionExpression("rid = :rid") // <-- specified
    .withExpressionAttributeValues(ImmutableMap.of(
        ":rid", new AttributeValue().withN("123")))
    .withReturnConsumedCapacity(ReturnConsumedCapacity.TOTAL); 

And, as you can see above, the key condition expression is given as rid = :rid.

For now I work around this by not using a key condition expression at all and sticking with the legacy KeyConditions, which should not be used (according to the AWS documentation):

QueryRequest request = new QueryRequest()
    .withTableName(table)
    .withConsistentRead(false)
    .withIndexName("rid-index")
    .withKeyConditions(ImmutableMap.of( // <-- workaround
        "rid", new Condition()
            .withComparisonOperator(ComparisonOperator.EQ)
            .withAttributeValueList(new AttributeValue().withN("123"))))
    .withReturnConsumedCapacity(ReturnConsumedCapacity.TOTAL);

I did take a look into the dynalite JS code and found where this error is generated, but my experience with Node.js is not sufficient to debug this problem any deeper. Just FYI, both requests work well against real DynamoDB.

Full code I used to reproduce this case:

import java.util.Arrays;
import java.util.List;

import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
import com.amazonaws.services.dynamodbv2.document.DynamoDB;
import com.amazonaws.services.dynamodbv2.model.AttributeDefinition;
import com.amazonaws.services.dynamodbv2.model.AttributeValue;
import com.amazonaws.services.dynamodbv2.model.CreateTableRequest;
import com.amazonaws.services.dynamodbv2.model.GlobalSecondaryIndex;
import com.amazonaws.services.dynamodbv2.model.KeySchemaElement;
import com.amazonaws.services.dynamodbv2.model.KeyType;
import com.amazonaws.services.dynamodbv2.model.Projection;
import com.amazonaws.services.dynamodbv2.model.ProjectionType;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;
import com.amazonaws.services.dynamodbv2.model.QueryRequest;
import com.amazonaws.services.dynamodbv2.model.QueryResult;
import com.amazonaws.services.dynamodbv2.model.ReturnConsumedCapacity;
import com.google.common.collect.ImmutableMap;


public class TestProblem {

    public static void main(String[] args) {

        String access = "bubu";
        String secret = "something";

        ClientConfiguration clientConfiguration = new ClientConfiguration().withTcpKeepAlive(true);
        BasicAWSCredentials credentials = new BasicAWSCredentials(access, secret);
        AmazonDynamoDBClient client = new AmazonDynamoDBClient(credentials, clientConfiguration);

        // !!! change port to the one exposed by your docker host !!!

        client.setEndpoint("http://localhost:8080");

        String table = "test-message.3";

        // create table schema

        List<AttributeDefinition> attributes = Arrays.asList(
            new AttributeDefinition()
                .withAttributeName("mid")
                .withAttributeType("S"),
            new AttributeDefinition()
                .withAttributeName("rid")
                .withAttributeType("N"));

        List<KeySchemaElement> schema = Arrays.asList(
            new KeySchemaElement()
                .withAttributeName("mid")
                .withKeyType(KeyType.HASH));

        List<GlobalSecondaryIndex> globals = Arrays.asList(
            new GlobalSecondaryIndex()
                .withIndexName("rid-index")
                .withKeySchema(Arrays.asList(
                    new KeySchemaElement()
                        .withAttributeName("rid")
                        .withKeyType(KeyType.HASH)))
                .withProvisionedThroughput(new ProvisionedThroughput()
                    .withReadCapacityUnits(5L)
                    .withWriteCapacityUnits(5L))
                .withProjection(new Projection()
                    .withProjectionType(ProjectionType.ALL)));

        CreateTableRequest createRequest = new CreateTableRequest()
            .withTableName(table)
            .withAttributeDefinitions(attributes)
            .withKeySchema(schema)
            .withGlobalSecondaryIndexes(globals)
            .withProvisionedThroughput(new ProvisionedThroughput()
                .withReadCapacityUnits(5L)
                .withWriteCapacityUnits(5L));

        // create table

        try {
            new DynamoDB(client)
                .createTable(createRequest)
                .waitForActive();
        } catch (InterruptedException e) {
            throw new IllegalStateException(e);
        }

        // query table

        QueryRequest request = new QueryRequest()
            .withTableName(table)
            .withConsistentRead(false)
            .withIndexName("rid-index")
            .withKeyConditionExpression("rid = :rid")
            .withExpressionAttributeValues(ImmutableMap.of(
                ":rid", new AttributeValue().withN("123")))
            .withReturnConsumedCapacity(ReturnConsumedCapacity.TOTAL);

        // this cause the error

        QueryResult result = client.query(request);

        System.out.println(result);
    }
}

Update LevelDB/UP/DOWN

Taking a look at the current state of LevelUP/DOWN, it seems worth updating LevelDOWN, as LevelDB has been updated to 1.19 (after 2 years).

It seems LevelDOWN has upgraded its dependencies to async 2+, so perhaps it's not an immediate change.

Logging + a --verbose option

Great project! Tough to debug though. Would be awesome to have some logging and a --verbose flag to make the server a little more chatty.

Configurable max record size

While DynamoDB's documented limit is 400 KB, it is possible to request an increase. Could dynalite allow a configurable maximum record size to accommodate testing of systems where this limit has been increased?

query not using indexes

This is more of a performance issue: when querying on the id, it looks like dynalite still scans the entire db.

We have a table with around 200K rows, to do the following query:
table.query {id: type + '-' + id}, ['attr1', 'attr2', 'attr3'], cb
takes about 45 seconds.

Doing a scan and then filtering the results seems to actually run faster, 35-40 seconds, albeit using more memory.

On dynamo the query takes < 1 second and scan takes longer than on here.

I understand that leveldb doesn't have indexes, so I'm not sure this is even possible, but if the query uses just the key, could that be a plain leveldb key lookup?

Either way would be good to know, great library btw @mhart ๐Ÿ‘

Tests failing due to breaking change in abstract-leveldown

A breaking change in the new release of abstract-leveldown (an upstream dependency of memdown) is causing tests in this project to fail unexpectedly. I have reported the problem in Level/abstract-leveldown#61, but I am opening this issue to help anyone else who might encounter it in the meantime.

The test failures look like this:

/Users/willwhite/dynalite/node_modules/memdown/node_modules/abstract-leveldown/abstract-leveldown.js:13
    throw new Error('constructor requires a non empty location string')
          ^
Error: constructor requires a non empty location string
    at MemDOWN.AbstractLevelDOWN (/Users/willwhite/dynalite/node_modules/memdown/node_modules/abstract-leveldown/abstract-leveldown.js:13:11)
    at new MemDOWN (/Users/willwhite/dynalite/node_modules/memdown/memdown.js:124:21)
    at MemDOWN (/Users/willwhite/dynalite/node_modules/memdown/memdown.js:122:12)
    at LevelUP.open (/Users/willwhite/dynalite/node_modules/levelup/lib/levelup.js:115:18)
    at new LevelUP (/Users/willwhite/dynalite/node_modules/levelup/lib/levelup.js:87:8)
    at LevelUP (/Users/willwhite/dynalite/node_modules/levelup/lib/levelup.js:47:12)
    at Object.create (/Users/willwhite/dynalite/db/index.js:30:12)
    at dynalite (/Users/willwhite/dynalite/index.js:23:26)
    at Object.<anonymous> (/Users/willwhite/dynalite/test/helpers.js:49:22)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Module.require (module.js:364:17)
    at require (module.js:380:17)
    at Object.<anonymous> (/Users/willwhite/dynalite/test/batchGetItem.js:1:77)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Module.require (module.js:364:17)
    at require (module.js:380:17)
    at /Users/willwhite/dynalite/node_modules/mocha/lib/mocha.js:192:27
    at Array.forEach (native)
    at Mocha.loadFiles (/Users/willwhite/dynalite/node_modules/mocha/lib/mocha.js:189:14)
    at Mocha.run (/Users/willwhite/dynalite/node_modules/mocha/lib/mocha.js:422:31)
    at Object.<anonymous> (/Users/willwhite/dynalite/node_modules/mocha/bin/_mocha:398:16)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)
    at node.js:906:3

Can't delete big tables via either the CLI or JS

Hi mhart,

First of all, thanks for your effort on this project; it helps a lot when we do testing in a local env. But we also hit some issues.

As creating/deleting global secondary indexes is still on your to-do list, I need to delete the table and recreate it to update an index. But when I try to delete a big table (about 30+ GB) via the CLI, I get this error message even though I set the timeouts to 0:

aws dynamodb delete-table --table-name=table --cli-read-timeout=0 --cli-connect-timeout=0 --endpoint=http://192.168.3.187:8081

Connection was closed before we received a valid response from endpoint URL: "http://192.168.3.187:8081/".

I also tried via nodejs with this code:

var dynamodb = require('./env.js');
var params = { TableName: 'table' };
dynamodb.deleteTable(params, function(err, data) {
  if (err) console.log(err);  // an error occurred
  else console.log(data);     // successful response
});

It returns a similar error message.

Is there a way I can delete this table, even manually?

Can't connect to Dynalite using Boto

>>> conn = DynamoDBConnection(aws_access_key_id='x', aws_secret_access_key='y', host='0.0.0.0', port=8000, is_secure=False)
>>> conn.list_tables()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Python/2.7/site-packages/boto/dynamodb2/layer1.py", line 1090, in list_tables
    body=json.dumps(params))
  File "/Library/Python/2.7/site-packages/boto/dynamodb2/layer1.py", line 2731, in make_request
    retry_handler=self._retry_handler)
  File "/Library/Python/2.7/site-packages/boto/connection.py", line 953, in _mexe
    status = retry_handler(response, i, next_sleep)
  File "/Library/Python/2.7/site-packages/boto/dynamodb2/layer1.py", line 2774, in _retry_handler
    data)
boto.exception.JSONResponseError: JSONResponseError: 400 Bad Request
{u'message': u"Authorization header requires 'Signature' parameter. Authorization header requires 'SignedHeaders' parameter. Authorization=AWS4-HMAC-SHA256 Credential=x/20150320/168/192/aws4_request,SignedHeaders=host;x-amz-date;x-amz-target,Signature=252305f17ba2763e705b6818f68077cba9627a338574b4fd9d0635be79b67d5b", u'__type': u'com.amazon.coral.service#IncompleteSignatureException'}

If I remove the is_secure from the command, then dynalite hangs on the request.

Add support for `ConditionExpression`, `KeyConditionExpression`, `ExpressionAttributeNames` and `ExpressionAttributeValues` in `DeleteItem`, `UpdateItem` and `PutItem`, and `FilterExpression` in `Scan` and `Query`

This library supports all the old methods of using conditions (KeyConditions, etc.), but should be updated at some point to support all the new grammar.

This is a significant undertaking as it will require writing a parser for all of the syntax supported here: http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Expressions.SpecifyingConditions.html
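As a rough illustration of the scale of that undertaking, even a minimal tokenizer for the expression grammar (hypothetical, not dynalite code) already has to distinguish comparators, parentheses, expression attribute names (#n), expression attribute values (:v), and bare keywords, before any actual parsing begins:

```javascript
// Hypothetical minimal tokenizer for DynamoDB condition expressions,
// only to illustrate the kind of parsing work involved; the real grammar
// (functions, document paths, list/map dereferencing) is considerably larger.
function tokenize(expr) {
  var re = /\s*(<>|<=|>=|=|<|>|\(|\)|,|#[A-Za-z0-9_]+|:[A-Za-z0-9_]+|[A-Za-z_][A-Za-z0-9_.\[\]]*)/g
  var tokens = []
  var m
  while ((m = re.exec(expr)) !== null) tokens.push(m[1])
  return tokens
}

console.log(tokenize('hashKey = :h AND rangeKey BETWEEN :lo AND :hi'))
// [ 'hashKey', '=', ':h', 'AND', 'rangeKey', 'BETWEEN', ':lo', 'AND', ':hi' ]
```

A full implementation would then need a proper grammar on top of these tokens, including operator precedence and the reserved-word list.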

.close() doesn't seem to close levelup

I'm experiencing some unexpected behaviour. I expect that when I call .close() on a dynalite server, the underlying Levelup database is closed. However, it seems that is not the case.

Here's an example:

var dynalite = require('dynalite');

var server = dynalite({path: './tmp/dynalite', createTableMs: 50});

server.listen(4567,function(err){
  if(!err){
    // doThings();

    server.close(function(){
      // Expect server to be closed...
      var server2 = dynalite({path: './tmp/dynalite', createTableMs: 50});
      server2.listen(6789,function(err){
        // received an error "Uncaught OpenError: IO error: lock ./tmp/mockDynamo/LOCK: already held by process"
      });
    });
  }
});

0.5.1 broken dependency upgrade: --path mode fails

Heyo, just ran into an issue with version 0.5.1. Downgrading to 0.5.0 works just fine.

LevelUPError: Installed version of LevelDOWN (1.0.0) does not match required version (~0.10.0)

The optional leveldown dependency needs to be downgraded back to '0.10.*'; setting it to '1.*' yields this error when running dynalite with the --path option. LevelUP is not yet updated to support leveldown@1.*

Missing `LastEvaluatedKey` in query response

I believe I have a test case that proves the following scenario triggers some kind of dynalite bug:

  • generate 1000 items to put which are large enough that the entire result cannot be returned in a single query response.
  • use dynamo.batchWriteItem requests to put these items into dynalite in sets of 25
  • scan the table, assert that there are 1000 records in the table
  • query the table, response does not contain a LastEvaluatedKey

See https://gist.github.com/rclark/ff22c330f8d6791cb0bb for a script that plays out this scenario. The table schema and query details come from an application of mine that picked up on the problem; I'm not sure how much (if at all) those details play into the issue. I do know that running the same code against a live dynamodb table does not recreate the issue.

"ADD action is not supported for the type L"

Seems like dynalite does not support ADD on a List and throws:

Uncaught ValidationException: One or more parameter values were invalid: ADD action is not supported for the type L

payload used

putItem {
        "TableName": "test_hash_range",
        "Item": {
                "hash": {
                        "S": "test-update"
                },
                "range": {
                        "N": "1"
                },
                "old_array": {
                        "L": [
                                {
                                        "N": "1"
                                },
                                {
                                        "N": "2"
                                },
                                {
                                        "N": "3"
                                }
                        ]
                }
        },
        "ReturnConsumedCapacity": "TOTAL"
}
updateItem {
        "TableName": "test_hash_range",
        "Key": {
                "hash": {
                        "S": "test-update"
                },
                "range": {
                        "N": "1"
                }
        },
        "AttributeUpdates": {
                "old_array": {
                        "Action": "ADD",
                        "Value": {
                                "L": [
                                        {
                                                "N": "1"
                                        },
                                        {
                                                "S": "a"
                                        },
                                        {
                                                "NULL": true
                                        },
                                        {
                                                "M": {
                                                        "k1": {
                                                                "S": "v1"
                                                        },
                                                        "k2": {
                                                                "S": "v2"
                                                        },
                                                        "k3": {
                                                                "S": "v3"
                                                        }
                                                }
                                        },
                                        {
                                                "L": []
                                        }
                                ]
                        }
                }
        },
        "Expected": {
                "hash": {
                        "Exists": true,
                        "Value": {
                                "S": "test-update"
                        }
                },
                "range": {
                        "Exists": true,
                        "Value": {
                                "N": "1"
                        }
                }
        },
        "ReturnConsumedCapacity": "TOTAL",
        "ReturnValues": "ALL_NEW"
}

scan performance

scan is the only easy way to dump the DB. Any timing on when you might improve its performance? Or could you provide a JSON dump of the DB that can be imported again?

Useful for saving and restoring state during development.

GSI pagination fails with overlapping range values

Ran across this issue in a table like:

index     hash        range
primary   collection  id
gsi       collection  created

If I load 100 items into the same collection, with unique ids but identical creation times, then a set of paginated queries against the GSI, with Limit set < 100, does not return all 100 records. I get one page with a LastEvaluatedKey, but the request for a second page returns no items.

https://gist.github.com/rclark/c652191b855f3bba3a30 is a demonstration of the issue that uses https://github.com/rclark/dynamodb-test. It's just a wrapper around dynalite that makes it easy to toggle between a "live" test and a dynalite test. This test passes on a live table but fails against dynalite.

Cannot find module 'memdown'

Hi guys,

I've a provision script that installs dynalite and it stopped working.

> var dynalite = require('dynalite')
Error: Cannot find module 'memdown'
    at Function.Module._resolveFilename (module.js:338:15)
    at Function.Module._load (module.js:280:25)
    at Module.require (module.js:364:17)
    at require (module.js:380:17)
    at Object.<anonymous> (/vagrant/node_modules/dynalite/db/index.js:4:15)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Module.require (module.js:364:17)

Seems like something happened with the memdown dependency. I've tested globally and locally, and it always asks for memdown.

Any ideas what's happening?

Thanks

putItem doesn't work

Hi, I'm using dynalite.

However, its putItem method doesn't work.

Here is my code, please take a look:

var AWS = require('aws-sdk')
AWS.config.loadFromPath('config.json');

var dynamo = new AWS.DynamoDB({region: '******', endpoint: 'http://localhost:4567'})
dynamo.listTables(console.log.bind(console))

var params = {
  TableName: 'IotData',
  Item: {
    "id": {N: "1"},
    "detail": {S: "hogehoge"}
  }
}

dynamo.putItem(params, function(err, data) {
     if (err){
       console.log(JSON.stringify(err, null, 2))
     } else {
        console.log("ok")
        console.log(JSON.stringify(data, null, 2))
     }
  });

stdout is below.

null { TableNames: [ 'IotData' ] }
ok
{}

GUI

Are there any plans to have a GUI interface for this? I recently installed dynalite locally to use for development, but it's been a pain to update the tables, which I normally do through a GUI. I've taken a look at https://github.com/hij1nx/lev but that's pretty cumbersome as well, especially when trying to update JSON documents.

itemSize underestimates item size

From the Dynamo docs:

Individual items in an DynamoDB table can have any number of attributes, although there is a limit of 64 KB on the item size. An item size is the sum of lengths of its attribute names and values (binary and UTF-8 lengths).

But itemSize uses new Buffer(val, 'base64').length to calculate string lengths. It seems this should simply be new Buffer(val).length, skipping the base64 decoding.
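A quick sketch of the discrepancy, using the modern Buffer.from in place of the deprecated new Buffer; 'aGVsbG8=' is an arbitrary example value (the base64 encoding of 'hello'):

```javascript
// Decoding a string as base64 shrinks its measured size: 'aGVsbG8=' is
// 8 bytes of raw UTF-8 but only 5 bytes once base64-decoded, so measuring
// with the 'base64' decode underestimates the attribute's raw length.
var val = 'aGVsbG8='

console.log(Buffer.from(val, 'base64').length) // 5
console.log(Buffer.from(val).length)           // 8
```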

Performance

Hi,

Great project! Since you value performance, why don't you check out uberlevel or hyperlevel? (These are available from rvagg too: level-lmdb and level-hyper.) We've built quite a few bindings, but these are the fastest you will get (LMDB is a memory-mapped, leveldown-compatible database; HyperLevelDB is a supercharged fork of LevelDB). Btw, when you include these in your project you can drop *down + levelup, since they are already bundled in.

Cheers

describeTable does not return an ACTIVE status

I'm using dynalite, and when I create a table and then call describeTable, I always get a CREATING status, even 40 seconds (the max I tried) after createTable was issued.

putItem { ReturnValues: "UPDATED_OLD" } issue

Hi,

first of all, thank you for this npm module,

I'm using dynalite for "npm test", and I'm seeing differences between AWS and dynalite when using updateItem with "ADD" and ReturnValues: "UPDATED_OLD".

dynalite

putItem {
    "TableName": "test_hash_range",
    "Item": {
        "hash": {
            "S": "test-updated-old"
        },
        "range": {
            "N": "1"
        },
        "number": {
            "N": "10"
        }
    },
}
updateItem {
    "TableName": "test_hash_range",
    "Key": {
        "hash": {
            "S": "test-updated-old"
        },
        "range": {
            "N": "1"
        }
    },
    "AttributeUpdates": {
        "number": {
            "Action": "ADD",
            "Value": {
                "N": "20"
            }
        }
    },
    "Expected": {
        "hash": {
            "Exists": true,
            "Value": {
                "S": "test-updated-old"
            }
        },
        "range": {
            "Exists": true,
            "Value": {
                "N": "1"
            }
        }
    },
    "ReturnValues": "UPDATED_OLD"
}

returns

{ number: 30 }

AWS

putItem {
    "TableName": "test_hash_range",
    "Item": {
        "hash": {
            "S": "test-updated-old"
        },
        "range": {
            "N": "1"
        },
        "number": {
            "N": "10"
        }
    },
}
updateItem {
    "TableName": "test_hash_range",
    "Key": {
        "hash": {
            "S": "test-updated-old"
        },
        "range": {
            "N": "1"
        }
    },
    "AttributeUpdates": {
        "number": {
            "Action": "ADD",
            "Value": {
                "N": "20"
            }
        }
    },
    "Expected": {
        "hash": {
            "Exists": true,
            "Value": {
                "S": "test-updated-old"
            }
        },
        "range": {
            "Exists": true,
            "Value": {
                "N": "1"
            }
        }
    },
    "ReturnValues": "UPDATED_OLD"
}

returns

{ number: 10 }
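For reference, AWS's documented behavior for ReturnValues: "UPDATED_OLD" is to return the updated attributes as they appeared *before* the update, which matches the { number: 10 } result from AWS above. A self-contained sketch of that semantic (plain JS, no DynamoDB involved):

```javascript
// Simulate an ADD update with ReturnValues: "UPDATED_OLD".
// UPDATED_OLD should return the pre-update values of only the
// attributes the update touched.
function applyAdd(item, attributeUpdates) {
  var updatedOld = {}
  Object.keys(attributeUpdates).forEach(function (name) {
    updatedOld[name] = item[name] // capture the value before applying the update
    item[name] = (item[name] || 0) + attributeUpdates[name]
  })
  return updatedOld
}

var item = { hash: 'test-updated-old', range: 1, number: 10 }
var old = applyAdd(item, { number: 20 })

console.log(old)         // { number: 10 }  <- what AWS returns
console.log(item.number) // 30              <- the stored value after the update
```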

LevelUPError: Could not locate LevelDOWN

the following

mkdir dynalite-test
cd dynalite-test
npm init # enter `dynalite --path db` for `test command`
npm i dynalite
npm test

throws with this error:

/Users/jed/code/dynalite-test/node_modules/dynalite/node_modules/levelup/lib/util.js:108
    throw new LevelUPError(missingLevelDOWNError)
          ^
LevelUPError: Could not locate LevelDOWN, try `npm install leveldown`

perhaps you should be depending on level and not leveldown?

(also, i think the API would be more level-like if you require the user to pass their levelDB instance instead of the path for the constructor. what do you think?)

Query on LSI returns incorrect results

I create a table with columns A, B, C where A is hash key, B is range key, and there's also a local index with A as hash key and C as range key.
Then I insert some rows having all 3 columns (and a few more)
Doing a query on the local index, using A = 'a' and C <= 'c' is returning incorrect results (empty results in my case).
If I do C = 'c', then it returns correct results.
Doing the inequality comparison against Amazon's DynamoDB Local returns the expected results as well.
If it's useful, both A and B are strings, and C is a number.
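A plausible (unconfirmed) cause for this class of bug is comparing a numeric range key as a string: lexicographic order disagrees with numeric order, so an inequality condition can miss items that an equality condition still finds. A quick illustration:

```javascript
// Numbers compared as strings sort lexicographically, which breaks
// range conditions (like C <= some value) on a numeric index key.
var values = [2, 10, 9]

var asStrings = values.map(String).sort()                            // lexicographic
var asNumbers = values.slice().sort(function (a, b) { return a - b }) // numeric

console.log(asStrings) // [ '10', '2', '9' ]
console.log(asNumbers) // [ 2, 9, 10 ]

console.log('10' <= '9') // true  (string comparison)
console.log(10 <= 9)     // false (numeric comparison)
```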

It takes too long to create a table

When running dynalite with its default settings, creating a table should take about half a second. However, on my system it takes a full 20 seconds to complete.

Running dynalite 0.17.1

How to reproduce

dynalite &
sleep 5

aws dynamodb create-table --table-name MusicCollection --attribute-definitions AttributeName=Artist,AttributeType=S AttributeName=SongTitle,AttributeType=S --key-schema AttributeName=Artist,KeyType=HASH AttributeName=SongTitle,KeyType=RANGE --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5 --endpoint-url=http://localhost:4567

STARTTIME=$(date +%s)
aws dynamodb wait table-exists --table-name MusicCollection --endpoint-url=http://localhost:4567
ENDTIME=$(date +%s)

echo "It took $(($ENDTIME - $STARTTIME)) seconds to create the table"

Output

....
It took 21 seconds to create the table
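For comparison, the dynalite README documents a --createTableMs flag that controls how long tables stay in the CREATING state (default 500 ms); lowering it is a common test-suite speedup, though it shouldn't be needed to get the documented default behavior. A sketch of the invocation (port and value are just examples):

```shell
# Start dynalite with near-instant table creation
dynalite --port 4567 --createTableMs 50 &
```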

Unable to create GSI indices in 0.18.0

In 0.17.4 I was able to create a global secondary index with the following configuration:

Name: "ResourceId-SourceId-index"
Partition Key: "ResourceId"
Sort Key: "SourceId"
Projected Attributes: All
Read/Write capacity: 20/20

in 0.18.0 this operation fails with a 400 error and the following error message:

"One or more parameter values were invalid: Select type ALL_ATTRIBUTES is not supported for global secondary index undefined because its projection type is not ALL"

We are using the Haskell aws library, but I do not think that is part of the issue, as the operation worked fine in the prior version and also on Dynamo itself. Log output including the request body is as follows:

Debug: Host: "127.0.0.1"
Debug: Path: "/"
Debug: Query string: ""
Debug: Body: "{\"KeyConditions\":{\"EntityId\":{\"AttributeValueList\":[{\"S\":\"00000000-0000-0000-0000-000000000000\"}],\"ComparisonOperator\":\"EQ\"}},\"ConsistentRead\":false,\"ReturnConsumedCapacity\":\"NONE\",\"ScanIndexForward\":true,\"Select\":\"ALL_ATTRIBUTES\",\"TableName\":\"test-resolver\"}"
Debug: Response status: Status {statusCode = 400, statusMessage = "Bad Request"}
Debug: Response header 'x-amzn-RequestId': 'RURN4KIOMEXTZ5TPZITM1ZKMEIUBBWCPDDMFSP31RISGRMA9PYOS'
Debug: Response header 'x-amz-crc32': '81088140'
Debug: Response header 'Content-Type': 'application/x-amz-json-1.0'
Debug: Response header 'Content-Length': '233'
Debug: Response header 'Date': 'Thu, 14 Apr 2016 19:47:04 GMT'
Debug: Response header 'Connection': 'keep-alive'

Please let me know if I can provide any further assistance. Thanks!

Flag to manually trigger UnprocessedItems on batchWriteItem

When using dynalite to test libraries, there is no good way to confirm that the library being tested handles UnprocessedItems correctly without rolling your own request handler. It would be cool if there were a way to tell dynalite not to process items, either via a key or some other method, as long as that method was constant.

Is this something you'd be interested in adding to dynalite @mhart?

Here is an example of where some code I'm working with is mocking this itself.

https://github.com/mapbox/cardboard/blob/8333b5b4e4a657df0a16b1129a4eb91f36c03f3c/test/batch.test.js#L71-L79
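Until something lands in dynalite itself, one workaround is wrapping the client call so the first N batchWriteItem responses are rewritten to report everything as unprocessed, letting tests exercise retry logic. A self-contained sketch (the underlying call is stubbed; all names here are hypothetical):

```javascript
// Wrap a batchWriteItem-style function so its first `failures` calls
// report every requested item as unprocessed.
function withForcedUnprocessed(batchWriteItem, failures) {
  var remaining = failures
  return function (params, cb) {
    batchWriteItem(params, function (err, data) {
      if (err) return cb(err)
      if (remaining > 0) {
        remaining--
        return cb(null, { UnprocessedItems: params.RequestItems })
      }
      cb(null, data)
    })
  }
}

// Stubbed underlying call that always succeeds
function fakeBatchWriteItem(params, cb) { cb(null, { UnprocessedItems: {} }) }

var flaky = withForcedUnprocessed(fakeBatchWriteItem, 1)
var params = { RequestItems: { mytable: [{ PutRequest: { Item: { id: { S: '1' } } } }] } }

flaky(params, function (err, data) {
  console.log(Object.keys(data.UnprocessedItems)) // [ 'mytable' ] -- first call forced
})
flaky(params, function (err, data) {
  console.log(Object.keys(data.UnprocessedItems)) // []            -- second call passes through
})
```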

Add validation for size limits for index keys

To deal with the following error message:

One or more parameter values were invalid: Size limit exceeded for Index Key c Actual Size: 43398 bytes Max Size: 2048 bytes IndexName: index4
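The check itself is straightforward: measure the UTF-8 byte length of the index key value and compare it against the 2048-byte limit quoted in the error message (byte length, not character count). A hedged sketch of what such a validation might look like; the function name is hypothetical:

```javascript
// Validate an index key value against the 2048-byte limit quoted
// in the error message above.
function validateIndexKeySize(name, value, indexName) {
  var size = Buffer.byteLength(value, 'utf8')
  var max = 2048
  if (size > max) {
    return 'One or more parameter values were invalid: Size limit exceeded for Index Key ' +
      name + ' Actual Size: ' + size + ' bytes Max Size: ' + max + ' bytes IndexName: ' + indexName
  }
  return null
}

console.log(validateIndexKeySize('c', 'short value', 'index4')) // null (within the limit)
console.log(validateIndexKeySize('c', new Array(43399).join('x'), 'index4'))
// One or more parameter values were invalid: Size limit exceeded for Index Key c
// Actual Size: 43398 bytes Max Size: 2048 bytes IndexName: index4
```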

Query with KeyConditionExpression not working.

I was playing around with Dynalite over the holidays and found an issue with its handling of KeyConditionExpression in query requests. It appears that Dynalite doesn't validate KeyConditionExpression fully, or possibly at all.

I couldn't find any threads on others having a similar issue, is it just me?
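One concrete validation DynamoDB performs that dynalite could mirror: a KeyConditionExpression must include an equality test on the hash key. A rough sketch of that check (real parsing is more involved; this regex-based version is only illustrative):

```javascript
// Minimal sketch: does the expression contain "<hashKey> = :placeholder"?
function validatesHashKeyEquality(expression, hashKeyName) {
  var pattern = new RegExp('\\b' + hashKeyName + '\\s*=\\s*:')
  return pattern.test(expression)
}

console.log(validatesHashKeyEquality('id = :id AND sk <= :sk', 'id')) // true
console.log(validatesHashKeyEquality('sk <= :sk', 'id'))              // false
```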
