jsevellec / cassandra-unit

Utility tool to load data into Cassandra, helping you write good, isolated JUnit tests for your application

License: GNU Lesser General Public License v3.0

Java 97.19% Shell 1.53% Batchfile 1.28%

cassandra-unit's Introduction

WELCOME to CassandraUnit

Everything is in the wiki: https://github.com/jsevellec/cassandra-unit/wiki

What is it?

Like other *Unit projects, CassandraUnit is a Java test utility. It helps you build Java applications backed by an Apache Cassandra database. CassandraUnit is to Cassandra what DBUnit is to relational databases.

CassandraUnit helps you write isolated JUnit tests in a Test-Driven Development style.

Main features:

  • Start an embedded Cassandra.
  • Create structure (keyspace and Column Families) and load data from an XML, JSON or YAML DataSet.
  • Execute a CQL script.

Where to start:

You can start by reading the wiki: https://github.com/jsevellec/cassandra-unit/wiki

You can also browse the cassandra-unit-examples project: https://github.com/jsevellec/cassandra-unit-examples

Mailing list:

[email protected] (http://groups.google.com/group/cassandra-unit-users)

License:

This project is licensed under LGPL v3.0: http://www.gnu.org/licenses/lgpl-3.0-standalone.html

cassandra-unit's People

Contributors

augi, buzztaiki, davidrg13, ffissore, fuji-151a, gaetanlebrun, jsevellec, krasserm, lfn3, magott, marccarre, marcinszymaniuk, markuskull, martindow, mathijs81, matthiaswessel, mbaechler, mniehoff, obazoud, pime, rsertelon, scottbessler, slandelle, snicolai, stefanbirkner, the-alchemist, themillhousegroup, thomas-hilaire, trecloux, xiaodong-xie


cassandra-unit's Issues

Null column values not working properly

I was trying to use the new feature (7461282). Then I tried using the cu-loader:

./bin/cu-loader  -h localhost -p 9160 -f ~/workspace/cassandra-unit/src/test/resources/yaml/dataSetWithNullColumnValue.yaml 
Start Loading...
Exception in thread "main" org.cassandraunit.exception.CassandraUnitException: cannot parse "columnWithNullColumnValue" as hex bytes
    at org.cassandraunit.serializer.GenericTypeSerializer.toByteBuffer(GenericTypeSerializer.java:66)
    at org.cassandraunit.serializer.GenericTypeSerializer.toByteBuffer(GenericTypeSerializer.java:20)
    at me.prettyprint.cassandra.model.HColumnImpl.<init>(HColumnImpl.java:39)
    at me.prettyprint.hector.api.factory.HFactory.createColumn(HFactory.java:575)
    at org.cassandraunit.DataLoader.createHColumnList(DataLoader.java:167)
    at org.cassandraunit.DataLoader.loadStandardColumnFamilyData(DataLoader.java:150)
    at org.cassandraunit.DataLoader.loadColumnFamilyData(DataLoader.java:112)
    at org.cassandraunit.DataLoader.loadData(DataLoader.java:102)
    at org.cassandraunit.DataLoader.load(DataLoader.java:64)
    at org.cassandraunit.cli.CassandraUnitCommandLineLoader.load(CassandraUnitCommandLineLoader.java:79)
    at org.cassandraunit.cli.CassandraUnitCommandLineLoader.main(CassandraUnitCommandLineLoader.java:28)
Caused by: org.apache.commons.codec.DecoderException: Odd number of characters.
    at org.apache.commons.codec.binary.Hex.decodeHex(Hex.java:101)
    at org.cassandraunit.serializer.GenericTypeSerializer.toByteBuffer(GenericTypeSerializer.java:62)
    ... 10 more

Other YAML, JSON and XML test resources work. I don't know if I'm doing something wrong.
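The failure mode can be reproduced with plain Java. Below is a stdlib-only sketch (not cassandra-unit's actual serializer, and the HexCheck name is hypothetical) of why commons-codec's Hex.decodeHex rejects the input: a hex string needs an even number of hex characters, and "columnWithNullColumnValue" is 25 non-hex characters long.

```java
// Stdlib-only sketch of the check that fails inside GenericTypeSerializer:
// the column name is handed to a hex decoder, which rejects it.
public class HexCheck {

    // Returns true only if s could plausibly be decoded as hex bytes.
    static boolean isDecodableAsHex(String s) {
        if (s.length() % 2 != 0) {
            return false; // the "Odd number of characters." case from the stack trace
        }
        for (char c : s.toCharArray()) {
            if (Character.digit(c, 16) < 0) {
                return false; // non-hex character
            }
        }
        return true;
    }
}
```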

Support for reversed comparators

I'm trying to create a schema which contains columns with composite comparators and a reversed component.

My dataset.yaml looks like

- name: MyColumnFamily
  keyType: UUIDType
  comparatorType: CompositeType(TimeUUIDType(reversed=true),TimeUUIDType)

This doesn't seem possible, looking at ComparatorTypeHelper and the way it looks up a ParsedDataType.

Also, pull request #23 doesn't seem to handle this, from what I can see

http://thelastpickle.com/2011/10/03/Reverse-Comparators/

defaultColumnValueType possibly gets ignored when using hector CQL

We are trying to test our Cassandra + Hector code using cassandra-unit.
To do so, we made the following fixture: http://pastebin.com/76zjghQ9

We are able to retrieve the data inside the fixture with a Hector CQL call:

SELECT * FROM test;

However, when trying to update or insert a row with the following query

UPDATE test USING CONSISTENCY ALL SET 'method' = 'namedMethodValue', 'unnamedMethod' = 'unnamedMethodValue', field' = 'namedFieldValue', 'unnamedField' = 'unnamedFieldValue' WHERE KEY = 'KEY';

Hector throws the following error

me.prettyprint.hector.api.exceptions.HInvalidRequestException:
InvalidRequestException(why:cannot parse 'unnamedMethodValue' as hex
bytes)
at me.prettyprint.cassandra.service.ExceptionsTranslatorImpl.translate(ExceptionsTranslatorImpl.java:52)
at me.prettyprint.cassandra.model.CqlQuery$1.execute(CqlQuery.java:130)
at me.prettyprint.cassandra.model.CqlQuery$1.execute(CqlQuery.java:100)

The Hector CQL page suggests that it would mean that either the query or the validators are invalid, however both look correct to me.

Also,

If we create the columnfamily by doing the CQL query

CREATE COLUMNFAMILY " + "test" + " (" +
"KEY text PRIMARY KEY," +
"field text," +
"unnamedField text," +
"method text," +
"unnamedMethod text" +
")

it all works fine.

Any ideas?

Kind regards,
Maarten

Incompatible version of com.googlecode.concurrentlinkedhashmap.ConcurrentLinkedHashMap

I'm using cassandra-unit 1.1.1.2. When I try to start a cassandra server (using EmbeddedCassandraServerHelper.startEmbeddedCassandra()), I get this error.

2:50:45.785 [pool-8-thread-1] ERROR o.a.c.s.AbstractCassandraDaemon - Exception encountered during startup
java.lang.NoSuchMethodError: com.googlecode.concurrentlinkedhashmap.ConcurrentLinkedHashMap$Builder.maximumWeightedCapacity(I)Lcom/googlecode/concurrentlinkedhashmap/ConcurrentLinkedHashMap$Builder;
at org.apache.cassandra.cache.ConcurrentLinkedHashCache.create(ConcurrentLinkedHashCache.java:70) ~[cassandra-all-1.1.0.jar:1.1.0]
at org.apache.cassandra.cache.ConcurrentLinkedHashCache.create(ConcurrentLinkedHashCache.java:54) ~[cassandra-all-1.1.0.jar:1.1.0]
at org.apache.cassandra.service.CacheService.initKeyCache(CacheService.java:102) ~[cassandra-all-1.1.0.jar:1.1.0]
at org.apache.cassandra.service.CacheService.(CacheService.java:86) ~[cassandra-all-1.1.0.jar:1.1.0]
at org.apache.cassandra.service.CacheService.(CacheService.java:62) ~[cassandra-all-1.1.0.jar:1.1.0]
at org.apache.cassandra.service.AbstractCassandraDaemon.setup(AbstractCassandraDaemon.java:161) ~[cassandra-all-1.1.0.jar:1.1.0]
at org.apache.cassandra.s

I looked at the dependency graph: cassandra-all 1.1.0 depends on com.googlecode.concurrentlinkedhashmap 1.2, whereas cassandra-unit depends on 1.3. I suspect this version mismatch explains the error.

Plans to support "list, map, set" types?

Hello,

As you know, CQL3 has these types, and the correspondence between Java and CQL3 types is documented here:

http://www.datastax.com/doc-source/developer/java-driver/#reference/javaClass2Cql3Datatypes_r.html

Unfortunately, it seems these types are not yet supported in your datasets; see:

https://github.com/jsevellec/cassandra-unit/blob/master/cassandra-unit/src/main/java/org/cassandraunit/dataset/commons/ParsedDataType.java

That said, how can I write my tests so as to create a keyspace where a column has type list, map or set?

Thanks
A.

Dataset with keyspace using NetworkTopologyStrategy?

Has cassandra-unit been tested with datasets that have a keyspace that uses NetworkTopologyStrategy? There is nothing in the yaml definition to specify the datacenter along with the replication factor; does it assume that the replication factor is for the local data center?

EmbeddedCassandra : waiting 10 seconds may not be enough

On an old computer, when using an embedded Cassandra, waiting 10 seconds may not be enough before trying to load a CQL script.

Indeed, the EmbeddedCassandraServerHelper class (startEmbeddedCassandra method) contains a hard-coded wait: startupLatch.await(10, SECONDS).
However, when a CQLDataLoader is used just afterwards (manually or via Spring annotations), a connection error occurs.
Increasing this value to 30 seconds solves the issue.

Perhaps a retry policy could be applied.
What do you think about it?

Cheers.

Khanh
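The retry policy suggested in the issue above could be sketched like this. This is a hedged stdlib-only sketch, not part of cassandra-unit's API, and the StartupRetry name is hypothetical: instead of a single fixed wait, a startup-dependent action (such as running a CQLDataLoader) is retried until it succeeds or the attempts are exhausted.

```java
import java.util.concurrent.Callable;

// Generic retry helper: run an action that may fail while the embedded
// Cassandra is still booting, sleeping between attempts.
public class StartupRetry {

    static <T> T withRetry(Callable<T> action, int maxAttempts, long delayMillis)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.call();
            } catch (Exception e) {
                last = e; // e.g. a connection error while Cassandra is still starting
                Thread.sleep(delayMillis);
            }
        }
        throw last != null ? last : new IllegalArgumentException("maxAttempts < 1");
    }
}
```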

ability to either pass in yaml contents or pass a filesystem file to use as the yaml file

Our CI server runs into issues when multiple builds for different branches run simultaneously and end up trying to use the same port for the embedded Cassandra.

I can programmatically create or modify my yaml file but cannot come up with a way given the current implementation to get cassandra-unit to use it.

I don't mind adding this feature to EmbeddedCassandraServerHelper myself, but given that there are many ways to do it I was curious if you had a preference between:
a. Pass the yaml contents as a String
b. Pass the yaml contents as a Stream
c. Pass a String absolute filepath to an existing readable yaml file
d. Other
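For the port-clash part of this issue, one approach (a hedged sketch, not part of the current EmbeddedCassandraServerHelper API) is to ask the OS for a free port per build and write it into a generated cassandra.yaml along the lines of option (c):

```java
import java.io.IOException;
import java.net.ServerSocket;

// Asking the OS for an unused port avoids hard-coding one in the yaml file.
public class FreePort {

    static int findFreePort() throws IOException {
        // Binding to port 0 makes the OS pick any currently free port.
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        }
    }
}
```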

Spring extension

Would you be interested in a CassandraUnit extension for Spring?

Spec by example:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
@TestExecutionListeners({ CassandraUnitTestExecutionListener.class })
@CassandraDataSet(value = { "dataset1.cql", "dataset2.cql" })
public class ActionRequestsTest {
  @Test
  public void do_some_tests_here() {
  }
}

CassandraUnitTestExecutionListener would extend AbstractTestExecutionListener to start/stop an embedded Cassandra, and CassandraDataSet would load the datasets.
Yes, it is like Cassandra(CQL)Unit, but in the Spring world.

Tell me if you are interested and I will open a PR.

allow setting a different type for column names

I would like to describe a defaultComparatorType, so we could have something like this:

- name: columnFamily
  type: STANDARD
  keyType: CompositeType(UUIDType,LongType)
  defaultColumnValueType: LongType
  defaultComparatorType: LongType
  rows:
  - key: 371dd0e3-3bc6-4a88-9eeb-37f9a8eccf6b:101
    columns:
    - {name: 1, value: 3}
    - {name: float(1.5), value: 1029}
    - {name: 20, value: 23}

The default column name type is Long, but it should be possible to describe column names of other types.

Am I asking something stupid? Or using the wrong definitions?

Inserting byte keys

Hi,

It looks like I can't insert binary values as keys, like so:

bytes(369ff963196dc2e5fe174dad2c0c6e9149b1acd9) ...

The key will be "bytes(369ff963196dc2e5fe174dad2c0c6e9149b1acd9)". I've tried with and without keyType=BytesType, and with and without bytes() function, but no luck.

Maybe the same goes for column names, but I haven't tried it.

Tom

Old snappy-java dependency

I get this error trying to use cassandra-unit within my Java 7 app on Mac OS 10.7.5:

java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:317)
    at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:219)
    at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
    at org.apache.cassandra.io.compress.SnappyCompressor.create(SnappyCompressor.java:45)
    at org.apache.cassandra.io.compress.SnappyCompressor.isAvailable(SnappyCompressor.java:55)
    at org.apache.cassandra.io.compress.SnappyCompressor.<clinit>(SnappyCompressor.java:37)
    at org.apache.cassandra.config.CFMetaData.<clinit>(CFMetaData.java:82)
    at org.apache.cassandra.config.KSMetaData.systemKeyspace(KSMetaData.java:81)
    at org.apache.cassandra.config.DatabaseDescriptor.loadYaml(DatabaseDescriptor.java:476)
    at org.apache.cassandra.config.DatabaseDescriptor.<clinit>(DatabaseDescriptor.java:123)
    at org.cassandraunit.utils.EmbeddedCassandraServerHelper.mkdirs(EmbeddedCassandraServerHelper.java:227)
    at org.cassandraunit.utils.EmbeddedCassandraServerHelper.cleanupAndLeaveDirs(EmbeddedCassandraServerHelper.java:199)
    at org.cassandraunit.utils.EmbeddedCassandraServerHelper.startEmbeddedCassandra(EmbeddedCassandraServerHelper.java:95)
    at org.cassandraunit.utils.EmbeddedCassandraServerHelper.startEmbeddedCassandra(EmbeddedCassandraServerHelper.java:65)
    at org.cassandraunit.utils.EmbeddedCassandraServerHelper.startEmbeddedCassandra(EmbeddedCassandraServerHelper.java:49)
    at org.cassandraunit.utils.EmbeddedCassandraServerHelper.startEmbeddedCassandra(EmbeddedCassandraServerHelper.java:45)
    at org.cassandraunit.BaseCassandraUnit.before(BaseCassandraUnit.java:18)
    at org.cassandraunit.AbstractCassandraUnit4TestCase.before(AbstractCassandraUnit4TestCase.java:33)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:27)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:76)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
    at org.junit.runner.JUnitCore.run(JUnitCore.java:157)
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:77)
    at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:195)
    at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:63)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1860)
    at java.lang.Runtime.loadLibrary0(Runtime.java:845)
    at java.lang.System.loadLibrary(System.java:1084)
    at org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
    ... 47 more

Java version:

Antoninus:~$ java -version
java version "1.7.0_21"
Java(TM) SE Runtime Environment (build 1.7.0_21-b12)
Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)

This is a bug in snappy-java 1.0.4.1 that was fixed in 1.0.5. Indeed, adding a dependency on snappy-java 1.0.5 to my build.gradle fixes it:

testCompile('org.xerial.snappy:snappy-java:1.0.5')

Cassandra 1.2.0, the version pegged in cassandra-unit, uses snappy-java 1.0.4.1. Cassandra itself switched to snappy-java 1.0.5 in version 1.2.6.

Overwriting log4j.configuration is bad

If you are already using a log4j properties file, it gets overwritten by EmbeddedCassandraServerHelper. This is probably not the desired behaviour, as it makes it difficult to use cassandra-unit embedded anywhere.

ColumnsMetadata in JSON structure interprets the value of the "name" property incorrectly

I'm not sure if this is by design (I'm pretty new to Cassandra and Hector, so it could be), but if I supply a JSON dataset with a CF that has column metadata, the parser automatically interprets the value of name as the name of the index for that column. However, I'm not actually indexing the column; I'm just providing a validator for the column value. For example, the snippet below causes an error to be thrown because I have not defined an indexType property.

"columnsMetadata" : [{
    "name" : "birthdate",
    "validationClass" : "LongType"
}]

DataLoader doesn't work in 1.0.1.1

Hi,
I'm using cassandra-unit 1.0.1.1, and I'm getting a nasty exception when executing the code from the xml example:

EmbeddedCassandraServerHelper.startEmbeddedCassandra();
DataLoader dataLoader = new DataLoader("TestCluster", "localhost:9171");
dataLoader.load(new ClassPathXmlDataSet("simpleDataSet.xml"));

2011-11-05 13:53:37,897 [pool-2-thread-2] ERROR cassandra.thrift.Cassandra$Processor - Internal error processing system_add_keyspace
java.lang.NoSuchMethodError: org.apache.cassandra.thrift.CfDef.isSetRow_cache_keys_to_save()Z
at org.apache.cassandra.config.CFMetaData.applyImplicitDefaults(CFMetaData.java:619)
at org.apache.cassandra.config.CFMetaData.fromThrift(CFMetaData.java:637)
at org.apache.cassandra.thrift.CassandraServer.system_add_keyspace(CassandraServer.java:964)
at org.apache.cassandra.thrift.Cassandra$Processor$system_add_keyspace.process(Cassandra.java:3912)
at org.apache.cassandra.thrift.Cassandra$Processor.process(Cassandra.java:2889)
at org.apache.cassandra.thrift.CustomTThreadPoolServer$WorkerProcess.run(CustomTThreadPoolServer.java:187)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:680)

When I switch back to 0.8.0.2.3, it works fine. I'm relying on maven to resolve all dependencies.

Any help would be greatly appreciated.
Tom

cassandra-unit 2.0.2.0 implicitly depends on commons-lang >= 2.5, yet doesn't declare the dependency

I tried hooking up cassandra-unit 2.0.2.0 with cassandra-2.0.4 and hector-core-1.1-4.

In that constellation, hector-core happens to depend on commons-lang-2.4, which cassandra-unit tries to use, but fails:

java.lang.NoSuchMethodError: org.apache.commons.lang.StringUtils.startsWithAny(Ljava/lang/String;[Ljava/lang/String;)Z
        at org.cassandraunit.utils.TypeExtractor.containFunctions(TypeExtractor.java:41)

The startsWithAny method appeared in commons-lang-2.5.

cassandra-unit itself doesn't declare a dependency on commons-lang, but obviously uses it as witnessed by the NoSuchMethodError.

If I explicitly include commons-lang-2.6 in my project, I can run tests against an embedded Cassandra successfully.
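For reference, the behaviour cassandra-unit needs from commons-lang 2.5+ can be sketched in plain Java. StartsWithAny below is an illustrative stand-in, not the commons-lang source: startsWithAny(s, prefixes) is true if s starts with any of the given prefixes, and this is the method missing from commons-lang 2.4 that triggers the NoSuchMethodError.

```java
// Stdlib sketch of StringUtils.startsWithAny as it appeared in commons-lang 2.5.
public class StartsWithAny {

    static boolean startsWithAny(String s, String... prefixes) {
        if (s == null || prefixes == null) {
            return false;
        }
        for (String prefix : prefixes) {
            if (prefix != null && s.startsWith(prefix)) {
                return true; // matched one of the candidate prefixes
            }
        }
        return false;
    }
}
```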

Load multiple dataset

It would be useful to be able to load multiple datasets.
Currently, subclasses of CQLDataSet are limited to a single dataset.

Can't load data whose column value includes a nested JSON object

I use nosqlunit-cassandra, which uses cassandra-unit as its engine. When I load/parse a JSON data file whose column value includes a nested JSON object, such as:
{
  "name" : "monitoring_service",
  "replicationFactor" : 1,
  "strategy" : "org.apache.cassandra.locator.SimpleStrategy",
  "columnFamilies" : [{
    "name" : "at_workreq",
    "type" : "STANDARD",
    "keyType" : "TimeUUIDType",
    "comparatorType" : "UTF8Type",
    "defaultColumnValueType" : "UTF8Type",
    "rows" : [{
      "key" : "989C6E5C-2CC1-11CA-A044-08002B1BB4F5",
      "columns" : [{
        "name" : "WORKREQUEST",
        "value" : [{
          "name" : "device",
          "value" : "989C6E5C-2CC1-11CA-A044-08002B1BB4F6"
        }, {
          "name" : "timeuuid",
          "value" : "989C6E5C-2CC1-11CA-A044-08002B1BB4F5"
        }, {
          "name" : "workReqTargetList",
          "value" : [{
            "name" : "name",
            "value" : "989C6E5C-2CC1-11CA-A044-08002B1BB4F6"
          }, {
            "name" : "cfgGUID",
            "value" : "cfgGUID"
          }, {
            "name" : "fwGUID",
            "value" : "firmwareGUID"
          }, {
            "name" : "cpu",
            "value" : "90.0"
          }]
        }]
      }]
    }]
  }]
},

cassandra-unit throws this exception:

org.cassandraunit.dataset.ParseException: org.codehaus.jackson.map.JsonMappingException: Can not deserialize instance of java.lang.String out of START_ARRAY token
at [Source: java.io.BufferedInputStream@5a20f443; line: 22, column: 32](through reference chain: org.cassandraunit.dataset.commons.ParsedKeyspace["columnFamilies"]->org.cassandraunit.dataset.commons.ParsedColumnFamily["rows"]->org.cassandraunit.dataset.commons.ParsedRow["columns"]->org.cassandraunit.dataset.commons.ParsedColumn["value"])
at org.cassandraunit.dataset.json.AbstractJsonDataSet.getParsedKeyspace(AbstractJsonDataSet.java:36)
at org.cassandraunit.dataset.commons.AbstractCommonsParserDataSet.getKeyspace(AbstractCommonsParserDataSet.java:33)
at org.cassandraunit.DataLoader.load(DataLoader.java:58)
at org.cassandraunit.DataLoader.load(DataLoader.java:54)
at com.cisco.onplus.das.server.dao.monitoring.MonitoringNoSQLDaoRemoteCassandraTest.test(MonitoringNoSQLDaoRemoteCassandraTest.java:345)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:45)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:42)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at com.lordofthejars.nosqlunit.core.AbstractNoSqlTestRule$1.evaluate(AbstractNoSqlTestRule.java:72)
at org.junit.rules.RunRules.evaluate(RunRules.java:18)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:263)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:68)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:47)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:231)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:60)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:229)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:50)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:222)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
at org.junit.runners.ParentRunner.run(ParentRunner.java:300)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
Caused by: org.codehaus.jackson.map.JsonMappingException: Can not deserialize instance of java.lang.String out of START_ARRAY token
at [Source: java.io.BufferedInputStream@5a20f443; line: 22, column: 32](through reference chain: org.cassandraunit.dataset.commons.ParsedKeyspace["columnFamilies"]->org.cassandraunit.dataset.commons.ParsedColumnFamily["rows"]->org.cassandraunit.dataset.commons.ParsedRow["columns"]->org.cassandraunit.dataset.commons.ParsedColumn["value"])
at org.codehaus.jackson.map.JsonMappingException.from(JsonMappingException.java:163)
at org.codehaus.jackson.map.deser.StdDeserializationContext.mappingException(StdDeserializationContext.java:219)
at org.codehaus.jackson.map.deser.std.StringDeserializer.deserialize(StringDeserializer.java:44)
at org.codehaus.jackson.map.deser.std.StringDeserializer.deserialize(StringDeserializer.java:13)
at org.codehaus.jackson.map.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:299)
at org.codehaus.jackson.map.deser.SettableBeanProperty$MethodProperty.deserializeAndSet(SettableBeanProperty.java:414)
at org.codehaus.jackson.map.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:697)
at org.codehaus.jackson.map.deser.BeanDeserializer.deserialize(BeanDeserializer.java:580)
at org.codehaus.jackson.map.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
at org.codehaus.jackson.map.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:194)
at org.codehaus.jackson.map.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:30)
at org.codehaus.jackson.map.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:299)
at org.codehaus.jackson.map.deser.SettableBeanProperty$MethodProperty.deserializeAndSet(SettableBeanProperty.java:414)
at org.codehaus.jackson.map.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:697)
at org.codehaus.jackson.map.deser.BeanDeserializer.deserialize(BeanDeserializer.java:580)
at org.codehaus.jackson.map.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
at org.codehaus.jackson.map.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:194)
at org.codehaus.jackson.map.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:30)
at org.codehaus.jackson.map.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:299)
at org.codehaus.jackson.map.deser.SettableBeanProperty$MethodProperty.deserializeAndSet(SettableBeanProperty.java:414)
at org.codehaus.jackson.map.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:697)
at org.codehaus.jackson.map.deser.BeanDeserializer.deserialize(BeanDeserializer.java:580)
at org.codehaus.jackson.map.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
at org.codehaus.jackson.map.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:194)
at org.codehaus.jackson.map.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:30)
at org.codehaus.jackson.map.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:299)
at org.codehaus.jackson.map.deser.SettableBeanProperty$MethodProperty.deserializeAndSet(SettableBeanProperty.java:414)
at org.codehaus.jackson.map.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:697)
at org.codehaus.jackson.map.deser.BeanDeserializer.deserialize(BeanDeserializer.java:580)
at org.codehaus.jackson.map.ObjectMapper._readMapAndClose(ObjectMapper.java:2723)
at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1900)
at org.cassandraunit.dataset.json.AbstractJsonDataSet.getParsedKeyspace(AbstractJsonDataSet.java:32)
... 30 more

Many of our column families will contain JSON objects (even 3 layers nested) in their column values.
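Until nested values are supported by the dataset parser, one possible workaround (a sketch, assuming the column is validated as UTF8Type) is to serialize the nested object into an escaped JSON string and store it as a flat column value:

```json
"columns" : [{
    "name" : "WORKREQUEST",
    "value" : "{\"device\":\"989C6E5C-2CC1-11CA-A044-08002B1BB4F6\",\"cpu\":\"90.0\"}"
}]
```

The application then deserializes the string on read, at the cost of losing structure inside the fixture file.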

Column values shouldn't be required

Column values are currently required, but Cassandra can handle columns without a value, using just the name to map. The dataset.xsd file has this description:

<complexType name="Column">
  <sequence>
    <element name="name" type="string" minOccurs="1" maxOccurs="1" />
    <element name="value" type="string" minOccurs="1" maxOccurs="1" />
  </sequence>
</complexType>

It should be possible to describe a column without a value.
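A possible relaxation, sketched here rather than taken from the project, would mark the value element as optional in dataset.xsd:

```xml
<complexType name="Column">
  <sequence>
    <element name="name" type="string" minOccurs="1" maxOccurs="1" />
    <!-- minOccurs="0" would allow a name-only column, matching Cassandra -->
    <element name="value" type="string" minOccurs="0" maxOccurs="1" />
  </sequence>
</complexType>
```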

OOM - unable to create new native thread

Jeremy,

I am on the Spring-Data-Cassandra team and we are making good use of cassandra-unit. Nice work!

We have an issue that only occurs on Macs, and we have a large number of test cases. As you can imagine, all of our integration tests use cassandra-unit. :)

It seems almost like each test doesn't thoroughly clean up after itself before forking a new cassandra instance. We have tried with and without forking and still get the error. Thanks for the help.

Have you seen this error before in your testing?

If not, we can help reproduce it.

java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:713)
at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:949)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1371)
at org.jboss.netty.util.internal.DeadLockProofWorker.start(DeadLockProofWorker.java:38)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.openSelector(AbstractNioSelector.java:343)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.(AbstractNioSelector.java:95)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.(AbstractNioWorker.java:53)
at org.jboss.netty.channel.socket.nio.NioWorker.(NioWorker.java:45)
at org.jboss.netty.channel.socket.nio.NioWorkerPool.createWorker(NioWorkerPool.java:45)
at org.jboss.netty.channel.socket.nio.NioWorkerPool.createWorker(NioWorkerPool.java:28)
at org.jboss.netty.channel.socket.nio.AbstractNioWorkerPool.newWorker(AbstractNioWorkerPool.java:99)
at org.jboss.netty.channel.socket.nio.AbstractNioWorkerPool.init(AbstractNioWorkerPool.java:69)
at org.jboss.netty.channel.socket.nio.NioWorkerPool.(NioWorkerPool.java:39)
at org.jboss.netty.channel.socket.nio.NioWorkerPool.(NioWorkerPool.java:33)
at org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory.(NioClientSocketChannelFactory.java:151)
at org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory.(NioClientSocketChannelFactory.java:116)
at com.datastax.driver.core.Connection$Factory.(Connection.java:406)
at com.datastax.driver.core.Connection$Factory.(Connection.java:417)
at com.datastax.driver.core.Cluster$Manager.(Cluster.java:787)
at com.datastax.driver.core.Cluster$Manager.(Cluster.java:739)
at com.datastax.driver.core.Cluster.(Cluster.java:80)
at com.datastax.driver.core.Cluster.(Cluster.java:67)
at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:708)
at org.springframework.cassandra.test.integration.AbstractEmbeddedCassandraIntegrationTest.cluster(AbstractEmbeddedCassandraIntegrationTest.java:68)
at org.springframework.cassandra.test.integration.AbstractEmbeddedCassandraIntegrationTest.connect(AbstractEmbeddedCassandraIntegrationTest.java:73)
at org.springframework.cassandra.test.integration.AbstractEmbeddedCassandraIntegrationTest.(AbstractEmbeddedCassandraIntegrationTest.java:32)
at org.springframework.cassandra.test.integration.core.template.CassandraOperationsTest.(CassandraOperationsTest.java:68)

@Autowired not working

Hi,

after adding @TestExecutionListeners({ CassandraUnitTestExecutionListener.class })
to a JUnit test, @Autowired no longer works.

Thanks,
K.

Support for indexes?

Hi Jérémy,

It looks like there is currently no support for indexes on columns. Is there a way to make this work?

Thanks,
Tom

Please clarify license

I am unclear whether this project is GPL or LGPL:

  • The home page [1] states the license is LGPL 3.0 but links to GPL 3.0.
  • The pom.xml [2] states the license is LGPL 3.0 and links to LGPL 3.0.
  • The license.txt [3] contains GPL 3.0.
  • A Stack Overflow comment [4] states the license is GPL 3.0 (which may confuse potential users if this project is not GPL 3.0).

As a testing library, I'd assume LGPL is what was intended. However, could you please clarify so that the community can observe the intended license?

[1] https://github.com/jsevellec/cassandra-unit
[2] https://github.com/jsevellec/cassandra-unit/blob/master/pom.xml
[3] https://github.com/jsevellec/cassandra-unit/blob/master/LICENSE.txt
[4] http://stackoverflow.com/questions/6612104/junit-testing-cassandra-with-embedded-server#comment21266836_7758852

allow to set indexName into ColumnMetadata on a ColumnFamily

By default, when an indexType is set on a columnMetadata, the indexName is set from the columnMetadata name of the dataset.

The idea is to allow setting the indexName by adding an indexName attribute at the columnMetadata level of the dataset.
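
The proposed dataset syntax could look like this (JSON shown; the indexName attribute is the suggested addition, other names are illustrative):

```json
{
    "name" : "user",
    "columnsMetadata" : [{
        "name" : "email",
        "validationClass" : "UTF8Type",
        "indexType" : "KEYS",
        "indexName" : "user_email_idx"
    }]
}
```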

create embedded Cassandra server with its own cassandra.yaml file

The EmbeddedCassandraServerHelper class includes the startEmbeddedCassandra() method.

The "cassandra.yaml" file used by the embedded Cassandra server is the one bundled with the cassandra-unit distribution.

Would it be possible to have an alternate startEmbeddedCassandra() method enabling the embedded Cassandra server to use whatever "cassandra.yaml" file is given as a parameter?
Thanks.
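
The requested overload might look like this (a sketch of the proposed API, not an existing method signature):

```java
// Proposed: read the given file from the classpath instead of the bundled cassandra.yaml
EmbeddedCassandraServerHelper.startEmbeddedCassandra("alternate-cassandra.yaml");
```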

support for cassandra 2.0

Hi

I tried to use the Cassandra 2.0 libraries and it failed.
An IllegalAccessError occurred. Can you fix this?

java.lang.IllegalAccessError: tried to access method com.google.common.collect.MapMaker.makeComputingMap(Lcom/google/common/base/Function;)Ljava/util/concurrent/ConcurrentMap; from class org.apache.cassandra.service.StorageProxy
at org.apache.cassandra.service.StorageProxy.<clinit>(StorageProxy.java:84)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:186)
at org.apache.cassandra.service.StorageService.initServer(StorageService.java:432)
at org.apache.cassandra.service.StorageService.initServer(StorageService.java:411)
at org.apache.cassandra.service.CassandraDaemon.setup(CassandraDaemon.java:278)
at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:366)
at org.cassandraunit.utils.EmbeddedCassandraServerHelper$1.run(EmbeddedCassandraServerHelper.java:102)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)

My pom.xml is below:

    <dependency>
        <groupId>org.apache.cassandra</groupId>
        <artifactId>cassandra-thrift</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.cassandraunit</groupId>
        <artifactId>cassandra-unit-spring</artifactId>
        <version>1.2.0.1</version>
    </dependency>
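
This IllegalAccessError usually points to a Guava version conflict on the classpath: StorageProxy was compiled against a Guava release in which MapMaker.makeComputingMap was still accessible. One possible workaround is to align the Guava and cassandra-all versions used by the tests, e.g. via dependencyManagement (the versions below are illustrative and must be checked against the cassandra-all actually on the classpath):

```xml
<dependencyManagement>
    <dependencies>
        <!-- Illustrative: pin one Guava release that is compatible with the
             cassandra-all version actually used by the tests -->
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>14.0.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.cassandra</groupId>
            <artifactId>cassandra-all</artifactId>
            <version>2.0.1</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```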

'key name' in data set

Is there a way to specify 'key name' in the data set?

The class I'm testing has a CQL 3 query like this:
SELECT my_key, my_column1 FROM table1 WHERE my_key = 'key1';
In my Cassandra unit test, I'm getting:
com.datastax.driver.core.exceptions.InvalidQueryException: Undefined name my_key in selection clause
at com.datastax.driver.core.exceptions.InvalidQueryException.copy(InvalidQueryException.java:32)
...

It seems to be because I didn't specify a key name in my data set, but I couldn't figure out how to do that with CassandraUnit.

I'm using Cassandra 1.2.2, datastax Java Driver 1.0.0-beta2, and Cassandra unit 1.1.2.1.

Thanks.
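
With a thrift-style dataset the row key has no CQL name, so CQL 3 queries cannot reference it. One workaround is to create the schema and data through a CQL script instead, which gives the key column an explicit name (a sketch using the example's names):

```cql
CREATE TABLE table1 (
    my_key varchar PRIMARY KEY,
    my_column1 varchar
);
INSERT INTO table1 (my_key, my_column1) VALUES ('key1', 'value1');
```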

Multiline CQL dataset

The current implementation of the CQL dataset only supports one statement per line, not multi-line statements.

It would be useful to load CQL statement by statement for more pleasant reading, for example allowing the following CQL dataset to be loaded:

create table user (
    username varchar PRIMARY KEY
    , email varchar
    , password varchar
);

INSERT INTO space.user (username, email, password)
        VALUES ('user1', '[email protected]', 'secret');
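
Splitting on statement terminators rather than line breaks would make this possible. A minimal sketch of such a splitter (a hypothetical helper, not cassandra-unit's actual parser), which breaks a script on ';' while ignoring semicolons inside single-quoted literals:

```java
import java.util.ArrayList;
import java.util.List;

public class CqlScriptSplitter {
    // Splits a CQL script into statements on ';', ignoring semicolons
    // that appear inside single-quoted string literals.
    public static List<String> split(String script) {
        List<String> statements = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        boolean inQuote = false;
        for (char c : script.toCharArray()) {
            if (c == '\'') {
                inQuote = !inQuote; // toggle on quote boundaries
            }
            if (c == ';' && !inQuote) {
                String statement = current.toString().trim();
                if (!statement.isEmpty()) {
                    statements.add(statement);
                }
                current.setLength(0);
            } else {
                current.append(c);
            }
        }
        String tail = current.toString().trim();
        if (!tail.isEmpty()) {
            statements.add(tail); // a script may omit the final ';'
        }
        return statements;
    }
}
```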

CompositeType for column values

Support for CompositeType in 'column values' is required.

cassandra-unit currently supports composite types in "row key" and "column name". What I am looking for is composite type in 'column values'.

I think there should be an option to add "CompositeType" in "validationClass", and then a utility method, something like composite(), along the same lines as utf8(), to set values.

The following should be possible after CompositeType support is added.

subject CompositeType(UTF8Type,UTF8Type,UTF8Type) KEYS
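
Once supported, the dataset could express it like this (illustrative JSON; composite() is the proposed helper, mirroring utf8()):

```json
{
    "name" : "subject",
    "validationClass" : "CompositeType(UTF8Type,UTF8Type,UTF8Type)",
    "value" : "composite(part1,part2,part3)"
}
```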

Unable to escape parenthesis in a value

Hi,

I would like to initialize, from a JSON file, a value containing parentheses, like:

{
    "name" : "backgroundColor",
    "value" : "utf8(rgb(0,0,0))"
}

but only 'rgb(0,0,0' was stored in Cassandra. I tried to escape them:
"value" : "utf8(rgb(0,0,0))"
or "value" : "utf8('rgb(0,0,0)')"
but I was unable to make it work.

Thanks
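
On the parsing side, the truncation suggests the value parser stops at the first ')'. A balanced-parenthesis scan avoids that; a minimal sketch (a hypothetical helper, not cassandra-unit's actual code):

```java
public class FunctionValueParser {
    // Extracts the argument of a function-style value such as "utf8(rgb(0,0,0))"
    // by matching balanced parentheses instead of stopping at the first ')'.
    public static String extractArgument(String value) {
        int open = value.indexOf('(');
        if (open < 0) {
            return value; // no function wrapper: return the raw value
        }
        int depth = 0;
        for (int i = open; i < value.length(); i++) {
            char c = value.charAt(i);
            if (c == '(') {
                depth++;
            } else if (c == ')' && --depth == 0) {
                return value.substring(open + 1, i);
            }
        }
        throw new IllegalArgumentException("Unbalanced parentheses in: " + value);
    }
}
```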

column metadata name can only be utf8

It is not possible to provide column metadata if the comparatorType is not UTF8Type, as the column metadata name property will always be interpreted as UTF-8 and will not use functions.

From DataLoader.java, line 252:

columnDefinition.setName(ByteBuffer.wrap(columnName.getBytes(Charsets.UTF_8)));

Something like this, for instance, is impossible:

{
    "name" : "keyspace",
    "columnFamilies" : [{
        "name" : "cf",
        "comparatorType" : "TimeUUIDType",
        "columnsMetadata" : [{
            "name" : "00000000-0000-1000-0000-000000000000",
            "validationClass" : "BytesType",
            "indexType" : "KEYS"
        }]
    }]
}

Other types, like LongType or BytesType, might not throw exceptions but will behave in unexpected ways.

Configure cassandra-unit with specific version of Hector and Cassandra.

Hi,

To avoid all the:

  • NoSuchMethodException
  • IncompatibleClassChangeError
  • etc.

due to conflicting dependencies, would it be possible to have a way to configure cassandra-unit with the versions used in client projects?

I guess this would also avoid having to release new versions of cassandra-unit just to upgrade to different versions of Hector/Cassandra.

Thanks & Regards,

Marc.

P.S.: Happy to help implement this, given some guidance.


EDIT:

As a workaround, Maven exclusions work on cassandra-unit 1.1.12, using cassandra-all 1.0.7 and hector-core 1.0-5 in all my modules:

<dependency>
    <groupId>org.cassandraunit</groupId>
    <artifactId>cassandra-unit</artifactId>
    <scope>test</scope>
    <exclusions>
        <exclusion>
            <groupId>org.apache.cassandra</groupId>
            <artifactId>cassandra-all</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.hectorclient</groupId>
            <artifactId>hector-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.googlecode.concurrentlinkedhashmap</groupId>
            <artifactId>concurrentlinkedhashmap-lru</artifactId>
        </exclusion>
    </exclusions>
</dependency>
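
Note that exclusions only remove the transitive versions; the desired versions then need to be declared explicitly (the versions below are the ones this workaround uses, and must match across all modules):

```xml
<dependency>
    <groupId>org.apache.cassandra</groupId>
    <artifactId>cassandra-all</artifactId>
    <version>1.0.7</version>
</dependency>
<dependency>
    <groupId>org.hectorclient</groupId>
    <artifactId>hector-core</artifactId>
    <version>1.0-5</version>
</dependency>
```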
