tenmax / cqlkit
CLI tool to export Cassandra query results in CSV or JSON format.
License: Apache License 2.0
The example cqlshrc shows providing a port number, but the code in https://github.com/tenmax/cqlkit/blob/master/src/main/java/io/tenmax/cqlkit/SessionFactory.java is not using the port. Only the hostname is being pulled from the configuration file.
Both cql2json and cql2csv are throwing this exception, and I'm not able to get results.
./cql2json -q 'select * from films_property'
Exception in thread "main" com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /30.0.3.217:9042 (com.datastax.driver.core.exceptions.InvalidQueryException: unconfigured table schema_keyspaces))
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:227)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:82)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1307)
at com.datastax.driver.core.Cluster.init(Cluster.java:159)
at com.datastax.driver.core.Cluster.connect(Cluster.java:249)
at io.tenmax.cqlkit.SessionFactory.(SessionFactory.java:62)
at io.tenmax.cqlkit.SessionFactory.newInstance(SessionFactory.java:91)
at io.tenmax.cqlkit.AbstractMapper.run(AbstractMapper.java:215)
at io.tenmax.cqlkit.AbstractMapper.start(AbstractMapper.java:117)
at io.tenmax.cqlkit.CQL2JSON.main(CQL2JSON.java:107)
cql2csv <FILE
outputs the HELP info
cql2csv -d <FILE
runs the query inside FILE and outputs the expected CSV.
Is there a way to set the consistency level on the connection for this tool? I need to be able to set QUORUM or ALL to get full export. Without this, I get different row counts on each run.
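For reference, a --consistency flag does appear in an example later in this thread; assuming your cqlkit build includes it (the host and table names below are placeholders), a full export at elevated consistency would look like:

```shell
# Assumes the --consistency flag shown elsewhere in this thread is available
# in your cqlkit build; QUORUM (or ALL) avoids per-run row-count drift on
# clusters where writes haven't fully replicated yet.
cql2csv -c cas-01.company.net -k myKeyspace --consistency QUORUM \
  -q "select * from myTable" > full-export.csv
```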
As a command-line tool, it's not necessary to dump the application stack trace when an error occurs. Just output the error message and set an appropriate exit code (the user shouldn't even need to know that the application is written in Java). If you need to debug the program itself, then accept a --debug option.
C:>cql2json -q "select * from system.schema_columns"
Error: unconfigured table schema_columns
I downloaded the project and imported it into IntelliJ IDEA. When I build it with Gradle and try to execute it, it says "the main class has not been found or loaded". What am I doing wrong? Is there another way to compile it? I'm doing this because I need to implement another custom data type.
Thank you very much.
It's not very user-friendly that running the program without arguments immediately throws an exception.
If no query is specified via any of the three methods, the program should display the help text before even trying to connect to a cluster.
> cql2json
Exception in thread "main" java.lang.IllegalArgumentException: Cannot build a cluster without contact points
at com.datastax.driver.core.Cluster.checkNotEmpty(Cluster.java:118)
at com.datastax.driver.core.Cluster.<init>(Cluster.java:110)
at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:179)
at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1190)
at io.tenmax.cqlkit.SessionFactory.<init>(SessionFactory.java:60)
at io.tenmax.cqlkit.SessionFactory.newInstance(SessionFactory.java:91)
at io.tenmax.cqlkit.AbstractMapper.run(AbstractMapper.java:215)
at io.tenmax.cqlkit.AbstractMapper.start(AbstractMapper.java:117)
at io.tenmax.cqlkit.CQL2JSON.main(CQL2JSON.java:107)
If you have multiple contact points, you can specify them in either of the two ways below.
cql2json -c contactPoint1,contactPoint2 --consistency quorum -k myKeyspace -q "select * from myTable limit 1"
[authentication]
keyspace = system
[connection]
hostname = contactPoint1,contactPoint2
port = 9042
; vim: set ft=dosini :
Use a comma (,) to separate multiple contact points.
When running a basic "SELECT * ... LIMIT 100" I'm getting:
com.datastax.driver.core.exceptions.SyntaxError: line 0:-1 mismatched input '<EOF>' expecting K_FROM
at com.datastax.driver.core.exceptions.SyntaxError.copy(SyntaxError.java:58)
at com.datastax.driver.core.exceptions.SyntaxError.copy(SyntaxError.java:24)
at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:37)
at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:245)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:68)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:43)
at io.tenmax.cqlkit.AbstractMapper.run(AbstractMapper.java:306)
at io.tenmax.cqlkit.AbstractMapper.start(AbstractMapper.java:136)
at io.tenmax.cqlkit.CQL2CSV.main(CQL2CSV.java:107)
Caused by: com.datastax.driver.core.exceptions.SyntaxError: line 0:-1 mismatched input '<EOF>' expecting K_FROM
at com.datastax.driver.core.Responses$Error.asException(Responses.java:143)
at com.datastax.driver.core.DefaultResultSetFuture.onSet(DefaultResultSetFuture.java:179)
at com.datastax.driver.core.RequestHandler.setFinalResult(RequestHandler.java:198)
at com.datastax.driver.core.RequestHandler.access$2600(RequestHandler.java:50)
at com.datastax.driver.core.RequestHandler$SpeculativeExecution.setFinalResult(RequestHandler.java:852)
at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onSet(RequestHandler.java:686)
at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1089)
at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1012)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:750)
Hi,
The tool is awesome!!
The query is timing out against a saturated database in one of our setups.
Can we add support for a --request-timeout option, like in cqlsh?
First of all thank you for creating this handy tool!
Having the following table:
user | order | created | deleted | score | progress
-----------------------+------------------------+---------------------------------+---------+-------+-----------
mM0CJ0N2QBGOUife8Jv3Tg | 0GrxPSpj9tmcqEghLmtufg | 2020-04-02 09:03:16.933000+0000 | null | 0 | null
The exported CSV data for that row using cqlkit will look like this:
mM0CJ0N2QBGOUife8Jv3Tg,0GrxPSpj9tmcqEghLmtufg,2020-04-02 11:03:16.933+0200,,0.0,NULL
but using the COPY TO command in cqlsh the same row will look like this:
mM0CJ0N2QBGOUife8Jv3Tg,0GrxPSpj9tmcqEghLmtufg,2020-04-02 09:03:16.933+0000,,0,
As you can see there are differences in the outputs of cqlsh and cqlkit.
A double value looks like:
cqlsh : 0
cqlkit: 0.0
A null double:
cqlsh : ''
cqlkit: NULL
A timestamp:
cqlsh: 2020-04-02 09:03:16.933+0000
cqlkit: 2020-04-02 11:03:16.933+0200
It would be really handy to have an option in cqlkit to export the data in the same text format as cqlsh. This becomes especially useful when you need to import the data exported by cqlkit using the COPY FROM command in cqlsh.
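Until such an option exists, the cqlkit output can be post-processed field by field. Below is a best-effort Python sketch covering only the three differences listed above (the function name and the exact rules are my own, not part of cqlkit):

```python
from datetime import datetime, timezone

def to_cqlsh_style(value: str) -> str:
    """Best-effort rewrite of one cqlkit CSV field into cqlsh COPY TO style."""
    if value == "NULL":                      # cqlsh writes nulls as empty fields
        return ""
    try:                                     # timestamps: normalize to UTC +0000
        ts = datetime.strptime(value, "%Y-%m-%d %H:%M:%S.%f%z")
        utc = ts.astimezone(timezone.utc)
        return utc.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3] + "+0000"
    except ValueError:
        pass
    try:                                     # doubles: drop the redundant ".0"
        number = float(value)
        if number.is_integer():
            return str(int(number))
    except ValueError:
        pass
    return value                             # anything else is left untouched
```

Applied to each field of the sample row above, this reproduces the cqlsh-style line; it makes no attempt to handle quoting, collections, or other types.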
Hello,
This repo does not appear to come with a license?
Could you advise what license this is released under if any?
Regards,
Tim
When I run:
cql2csv -c cas-01.company.net -q "select * from <key_space>.<table_name>" > table.csv
I only get the header row.
If I run cql2json with the same arguments, I get the entire table as expected.
While cql2json outputs one JSON object per row, it doesn't wrap the rows in a containing array, which makes the command much less useful for piping the result to utilities that expect a single JSON document.
> cql2json -c localhost -k items -q "select id from items"
{"id":"anv79n"}
{"id":"hcp5yn"}
> cql2json -c localhost -k items -q "select id from items" | json_pp
garbage after JSON object, at character offset 17 (before ""id":"hcp5yn"}\n") at /usr/bin/json_pp5.18 line 45.
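Until there's a built-in flag for this, the line-delimited output can be wrapped into a single JSON array downstream. A small Python filter (the function name is mine) would do it:

```python
import json

def wrap_json_lines(lines):
    """Collect one-JSON-object-per-line records into a single JSON array string."""
    return json.dumps([json.loads(line) for line in lines if line.strip()])

# Typical use: pipe cql2json output to a file, then pass the open file
# (an iterable of lines) to wrap_json_lines before handing it to json_pp.
```

If jq is available, `jq -s '.'` (slurp mode) achieves the same wrapping in one step.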