outworkers / phantom

Schema safe, type-safe, reactive Scala driver for Cassandra/Datastax Enterprise

Home Page: http://outworkers.github.io/phantom/

License: Apache License 2.0

Languages: Thrift 0.02%, Scala 99.59%, Shell 0.39%
Topics: scala, phantom, cassandra, datastax-enterprise, reactive, reactive-streams

phantom's Introduction

phantom


Reactive type-safe Scala driver for Apache Cassandra/Datastax Enterprise

To stay up-to-date with our latest releases and news, follow us on Twitter: @outworkers.

If you use phantom, please consider adding your company to our list of adopters. Phantom is and will always be open source, but the more adopters our projects have, the more people from our company will actively work to make them better.


Migrating to phantom 2.14.0 and using execution backends.

Please refer to the new docs on query execution to understand the breaking changes in phantom 2.14.0. They will affect all users of phantom, as we further optimise the internals for better performance and to gently prepare 3.0.

Details here. In short, query generation is no longer coupled with query execution within the framework. That means phantom can natively support different kinds of concurrency frameworks in parallel, using different sub-modules, including Monix, Twitter Util, Scala Futures, and a few others, some of which are only available via phantom-pro.

import com.outworkers.phantom.dsl._ is now required in more places than before. The future method is no longer implemented by query classes, but rather added via implicit augmentation by a QueryContext. The return type of the future method now depends on which QueryContext you use, which is why the import is required: without it the necessary implicits will not be in scope by default, and similarly, in some places new implicits are required to select things specific to an execution backend.
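To illustrate the mechanism, here is a hypothetical, heavily simplified analog of this pattern (names here are illustrative, not phantom's actual API): the future method is not defined on the query class itself, but added through implicit augmentation, so the context you import decides what it returns.

```scala
import scala.concurrent.{ Await, Future }
import scala.concurrent.duration._

// A made-up query type standing in for a generated query class.
final case class SelectQuery(cql: String)

trait QueryContext {
  // The `future` method is added by augmentation, not defined on SelectQuery.
  implicit class QueryOps(q: SelectQuery) {
    // A real backend would execute the query; we just echo the CQL.
    def future(): Future[String] = Future.successful(s"executed: ${q.cql}")
  }
}

// One execution backend, exposing Scala Futures.
object ScalaFutureContext extends QueryContext

object Demo {
  import ScalaFutureContext._ // without this import, .future() fails to resolve

  def run(): String =
    Await.result(SelectQuery("SELECT * FROM users").future(), 5.seconds)
}
```

Swapping the imported context for another backend would change the return type of future() without touching the query definitions.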

Scala 2.13 support

As of phantom 2.50.0, Scala 2.13 support is officially available, while all support for Scala 2.10 has been dropped. To use phantom with Scala 2.10, please use a version earlier than 2.5.0. No support or ongoing maintenance will be offered for 2.10 artifacts, as the codebase has undergone significant change to support newer Scala versions, and the various libraries we depend on no longer support 2.10.

Migrating to phantom 2.x.x series

The new series of phantom introduces several key backwards incompatible changes with previous versions. This was done to obtain massive performance boosts and to thoroughly improve user experience with phantom.

Read the MIGRATION GUIDE for more information on how to upgrade.

Available modules

This is a table of the available modules for the various Scala versions. Not all modules are available for all versions just yet, because certain dependencies have yet to be published for the newer Scala versions.

Phantom OSS

| Module name        | Scala 2.11.x | Scala 2.12.x | Scala 2.13.x |
| ------------------ | ------------ | ------------ | ------------ |
| phantom-connectors | yes          | yes          | yes          |
| phantom-dsl        | yes          | yes          | yes          |
| phantom-jdk8       | yes          | yes          | yes          |
| phantom-sbt        | no           | yes          | no           |
| phantom-example    | yes          | yes          | yes          |
| phantom-thrift     | yes          | yes          | yes          |
| phantom-finagle    | yes          | yes          | yes          |
| phantom-streams    | yes          | yes          | no           |

Phantom Pro subscription edition

Modules marked with "x" are still in beta or pre-publishing mode.

| Module name        | Scala 2.11.x | Scala 2.12.x | Scala 2.13.x | Release date  |
| ------------------ | ------------ | ------------ | ------------ | ------------- |
| phantom-dse        | yes          | yes          | yes          | Released      |
| phantom-udt        | yes          | yes          | yes          | Released      |
| phantom-autotables | yes          | yes          | yes          | Released      |
| phantom-monix      | yes          | yes          | yes          | Released      |
| phantom-docker     | x            | x            | x            | Released      |
| phantom-migrations | yes          | yes          | yes          | Released      |
| phantom-graph      | x            | x            | x            | April 2020    |
| phantom-spark      | x            | x            | x            | July 2020     |
| phantom-solr       | x            | x            | x            | July 2020     |
| phantom-native     | x            | x            | x            | December 2020 |
| phantom-java-dsl   | x            | x            | x            | December 2020 |

Using phantom

Scala 2.11, 2.12 and 2.13 releases

We publish phantom in 2 formats, stable releases and bleeding edge.

  • The stable release is always available on Maven Central and is indicated by the badge at the top of this readme, which points at the latest released version.

  • Intermediary releases are available through our Bintray repo available at Resolver.bintrayRepo("outworkers", "oss-releases") or https://dl.bintray.com/outworkers/oss-releases/. The latest version available on our Bintray repository is indicated by the Bintray badge at the top of this readme.
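A minimal build.sbt sketch for pulling an intermediary release through the Bintray resolver (the module shown and the version placeholder are illustrative):

```scala
// build.sbt
resolvers += Resolver.bintrayRepo("outworkers", "oss-releases")

libraryDependencies += "com.outworkers" %% "phantom-dsl" % "<latest version>"
```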

How phantom compares

To compare phantom to similar tools in the Scala/Cassandra category, you can read more here.

Latest versions

The latest versions are available here. The badges automatically update when a new version is released.

  • Latest stable version: Maven Central
  • Bleeding edge: Bintray (OSS releases on Bintray)

For ease of use and better management of documentation, we have exported this README.md to a compiled documentation page, now available here.

This is a list of resources to help you learn phantom and Cassandra, alongside the tests, which are very useful in highlighting all the possible features in phantom and how to use them.


We love Cassandra to bits and use it in every bit of our stack. phantom makes it super trivial for Scala users to embrace Cassandra.

Cassandra is highly scalable and one of the most powerful database technologies available, open source or otherwise.

Phantom is built on top of the Datastax Java Driver, which handles Cassandra connectivity and raw query execution.

We are very happy to help implement missing features in phantom, answer questions about phantom, and occasionally help you out with Cassandra questions! Please use GitHub for any issues or bug reports.

Adopters

Here are a few of the biggest phantom adopters, though the full list is far more comprehensive.

Microsoft CreditSuisse ING UBS Wincor Nixdorf Paddy Power Strava Equens Pellucid Analytics Anomaly42 ChartBoost Tecsisa Mobli VictorOps Socrata Sphonic

License and copyright

Phantom is distributed under the Apache V2 License.

  • Outworkers, Limited is the copyright holder.

  • You can use phantom in commercial products or otherwise.

  • We strongly appreciate and encourage contributions.

  • All paid for features are published and sold separately as phantom-pro, everything that is currently available for free will remain so forever.

If you would like our help with any new content or initiatives, we'd love to hear about it!


Phantom was developed at outworkers as an in-house project. All Cassandra integration at outworkers goes through phantom, and many Scala/Cassandra users worldwide rely on phantom.


Special thanks to Viktor Taranenko from WhiskLabs, who gave us the original idea, and special thanks to Miles Sabin and the team behind Shapeless, from which we shamelessly stole all the good patterns.

Copyright © 2013 - 2017 outworkers.

Contributing to phantom


Contributions are most welcome! Use GitHub for issues and pull requests and we will happily help out in any way we can!

phantom's People

Contributors

0xroch, alexflav23, arixmkii, benjumanji, bholt, bjankie1, cakper, creyer, davoudrafati, dbakarcic, dcheckoway, emschimmel, erikvanoosten, gitter-badger, j-potts, jaspervz, jheijkoop, levinson, liff, mmatloka, msmygit, nthportal, polymorphic, rafaelvindelamor, reynoldsm88, sksamuel, slouc, sslavic, velvia, vn971


phantom's Issues

Verbose logging on tests

I can't find a setting in phantom-testing to disable the massive console log output

e.g.

9042 available
Starting cassandra
14:48:34.974 [pool-6-thread-1] INFO o.a.c.config.YamlConfigurationLoader - Loading settings

Is there a way to disable this?

Build for Scala 2.11

Hi,
It seems that there is no build for Scala 2.11 in maven. When could it be fixed?

Improve tests for BatchStatements

Add tests for:

  • Statements with a timestamp
  • Updating multiple tables.
  • Inserting into multiple tables.
  • Deleting from multiple tables.

Add support for CounterColumns

Cassandra offers CounterColumn definitions to allow for distributed counting. Implement the feature in phantom with basic add/subtract functionality.
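For context, a CQL sketch of what counter support means at the storage layer (table and column names here are illustrative):

```sql
-- Counter columns live in dedicated tables; all non-key columns must be counters.
CREATE TABLE page_views (
  page_id text PRIMARY KEY,
  views counter
);

-- Counters only support relative updates (add/subtract), never a direct SET.
UPDATE page_views SET views = views + 1 WHERE page_id = 'home';
UPDATE page_views SET views = views - 2 WHERE page_id = 'home';
```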

Add tests for partial selects

The default partial select methods found in com.newzly.phantom.dsl.CassandraTable are missing unit tests.

Add at least one basic unit test for every one of them.

Primary keys order must be exact

In Cassandra, the order in which one specifies the keys in the CREATE TABLE statement is important. At the moment the order generated by phantom is random.

Greedy object initialization is not preserving order.

The current implementation of CassandraTable is required to aggregate the inner column definitions in the order they are written.

Using the Java reflection API returns the methods in a random order with no guarantee that any order will be preserved. This leads to schema variations in PrimaryKey and PartitionKey columns, effectively causing discrepancies in CompositeKeys.
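The underlying JVM behaviour can be demonstrated without phantom at all; a minimal sketch (class and method names are made up):

```scala
// Demonstrates the root cause described above: JVM reflection makes no
// ordering guarantee for declared members, so any schema generation that
// relies on it can emit primary-key columns in a different order per run.
class Keys {
  def partitionKey: String = "pk"
  def primaryOne: String  = "pk1"
  def primaryTwo: String  = "pk2"
}

object ReflectionOrder {
  // Per the Java docs, getDeclaredMethods returns members in no particular
  // order, so only the *set* of names is stable, not the sequence.
  val names: Set[String] =
    classOf[Keys].getDeclaredMethods.map(_.getName).toSet
}
```

This is why column ordering has to be tracked explicitly rather than recovered through reflection.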

Simplify fetchEnumerator and enumerate API

The current API:

def fetchEnumerator()(implicit session: Session, ctx: ExecutionContext): ScalaFuture[PlayEnumerator[R]]

An Enumerator can be considered the analogue of an Iterator in reactive/asynchronous space, so wrapping it in a Future seems redundant. The API wouldn't break reactive principles if the function simply returned an Enumerator:

def fetchEnumerator()(implicit session: Session, ctx: ExecutionContext): PlayEnumerator[R]

Missing dependencies

I am having trouble including this package into my sbt project, in particular due to missing dependencies for sbt-pgp(0.8.1) and sbt-git(0.6.4). SBT cannot find these with the resolvers that are given in the build files for this repository. I am using scala(2.10.4) with sbt(0.13.5). I got it to work by using github snapshots of the dependencies, changing the sbt versions and then using publish-local. Is there a repository/resolver that I can use instead?

Please release a Scala 2.11 version

Phantom looks like exactly what we need to upgrade the way we are interacting with Cassandra from Akka actors... unfortunately there is no Scala 2.11 release. Could you please provide one?

If you need any help turning your build into a cross-versioned one, please let me know and I'll submit a PR for it.

Support CompositeColumns or SuperColumns

Hi, I'm trying to store a composite object in Cassandra, something like

case class UserAccount(id: UUID, msisdn: Option[String], email: Option[String], personalInfo: UserInfo, settings: AccountSettings)

and I would like to store the UserInfo and AccountSettings into a composite column or in a super column under the same row of the UserAccount CF.

At the moment I'm flattening my structure inside the UserAccount record, but it would be nice to be able to define a more hierarchical structure
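For context, Cassandra's user-defined types (available since Cassandra 2.1) address exactly this kind of nesting; a CQL sketch, with illustrative names, independent of any phantom DSL support:

```sql
-- A UDT capturing the nested structure
CREATE TYPE user_info (
  first_name text,
  last_name  text
);

-- Nested values are stored as frozen UDT columns under the same row
CREATE TABLE user_accounts (
  id            uuid PRIMARY KEY,
  msisdn        text,
  email         text,
  personal_info frozen<user_info>
);
```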

Spark Integration

What is the state of Spark integration? I can't find any branch related to Spark, so I assume it hasn't been started yet.

I'm playing with Spark, the Datastax Spark connector and phantom in the same project, and I feel I understand what I really need. If there is no branch to pick up, I'll try to submit a basic proof-of-concept this week.

Test with the maximum amount of records per Table

To make sure we pushed the last milestone, we need to test a table with 2 billion records.

It should work with the default Table.fetchEnumerator method, allowing us to extract desired ranges of items from a 2 billion element queue.

Please advise on how we can best achieve this.

Prepared statements support

Are there any plans to support prepared statements?
For now I am using my own abstraction. Besides simple prepared statements, it allows choosing the execution context for a Future.

import java.util.concurrent.Executor

import scala.concurrent.{ ExecutionContext, Future, Promise }

import com.datastax.driver.core.{ BoundStatement, ResultSet, Session, Statement }
import com.google.common.util.concurrent.{ FutureCallback, Futures }

abstract class ExecutablePreparedStatement(
  implicit val session: Session,
  context: ExecutionContext with Executor
) {
  val query: String

  private lazy val statement = session.prepare(query)

  def execute(values: AnyRef*): Future[ResultSet] = {
    val bs = new BoundStatement(statement).bind(values: _*)
    statementToFuture(bs)
  }

  private def statementToFuture(s: Statement): Future[ResultSet] = {
    val promise = Promise[ResultSet]()

    val future = session.executeAsync(s)

    val callback = new FutureCallback[ResultSet] {
      def onSuccess(result: ResultSet): Unit = promise.success(result)

      def onFailure(err: Throwable): Unit = promise.failure(err)
    }

    // Run the callback on the provided executor
    Futures.addCallback(future, callback, context)

    promise.future
  }
}

Write examples for using Iterators.

In the new phantom-example module, please write concise examples of how to use iterators, including how and when to provide your custom implementations.

Please target this at both the open source audience and our own team.

Logging is missing where needed

Some things should be logged in debug mode, like the exact query sent to the Cassandra cluster (very helpful when debugging).

Feature: Publish to multiple Maven repositories

The current single repository publishing mechanism does not allow for efficient distribution of new releases internally.

Using sbt-multi-release or custom SBT commands, set up a mechanism that allows publishing to a specific target.

phantom-test sample

Hello,

I'm new to phantom. I would like to implement test methods using phantom-test, but I don't know how to use it. Can someone help me, please?

Tks

Mouhcine

Remove the default parameter for the implicit executor.

There is no point in having an implicit parameter with a default value. It's enough to import the default executor using import com.newzly.phantom.Implicits.

Remove the default; overrides are done by providing an implicit executor in the current scope, which takes precedence over the default one.

OR Query Support

I've looked into OR query support, but I only found the constant, since QueryBuilder does not support it.
Is there any way someone could build a SelectWhere.or which just wraps query parts and chains them with OR?
I would also be happy with something like

CassandraTable[T, R] {
    or(list: List[SelectWhere[T, R]]): ExecutableQuery
}

I don't want to step into the magic too hard :)

Thank you

Error extracting value from compound key

I'm using the published dev release of 1.2.7 for Scala 2.11.

I get the following error when extracting the row values:
Error: can't extract required value for column 'param_id'

The first column, decoder_key, is extracted fine, but the second column fails with that error. A toString() on the row reveals all the column data is there and intact.

CREATE TABLE answers_clustered (
decoder_key text,
param_id bigint,
answers set < int >,
first_answered timestamp,
last_answered timestamp,
expires timestamp,
PRIMARY KEY ( decoder_key, param_id )
);

https://gist.github.com/lab3/893f5bea365ff36c67aa

Race condition in CassandraTable’s initialization

While working on a servlet web app I discovered a bug which I identified as a race condition in the CassandraTable initialization code. It effectively prevents parallel instantiation of classes inherited from CassandraTable, and as a consequence makes parallel class loading and static initialization of Scala object singletons inherited from it impossible.

I managed to scrap up a minimal example without all the web app details:

Every run on my machine gives me exceptions from the scala-reflect internals like:

  • java.lang.RuntimeException: error reading Scala signature of com.newzly.phantom.CassandraTable: value Predef is not a package
  • java.lang.RuntimeException: error reading Scala signature of org.scalatest.package: error reading Scala signature of scala.Predef: assertion failed: type Manifest
  • java.lang.RuntimeException: error reading Scala signature of org.scalatest.package: malformed Scala signature of Predef at 2219; bad type tag: 1
  • java.lang.RuntimeException: error reading Scala signature of org.scalatest.package: error reading Scala signature of scala.deprecated: scalf2a.annotationpackage

In the sources of CassandraTable I have found a possible cause for the error:
it relies on the singleton scala.reflect.runtime.currentMirror in its initialization logic, and the scala-reflection API is known not to be thread-safe (at least in Scala 2.10.x):
CassandraTable.scala

import scala.reflect.runtime.{ currentMirror => cm, universe => ru }
  ...
  private[this] val instanceMirror = cm.reflect(this)
  private[this] val selfType = instanceMirror.symbol.toType
  ...

blob data type support

I can't post this to Google Groups as it says I don't have permission, so please consider this issue a feature request:
I could not find any way to use Cassandra's blob data type. Does phantom support it? Or will it?

MapColumn question

Hi there!
Is there a way to evaluate elements of a map in a WHERE clause? I need to get entries based on a MapColumn's content.

Thanks,
Fran.

missing '=' at 'EXISTS' for tests

I get this exception while running tests with CassandraFlatSpec. Seems like a Cassandra version issue. How can I fix this?

Exception encountered when invoking run on a nested suite - line 1:23 missing '=' at 'EXISTS'
com.datastax.driver.core.exceptions.SyntaxError: line 1:23 missing '=' at 'EXISTS'
at com.datastax.driver.core.exceptions.SyntaxError.copy(SyntaxError.java:35)
at com.datastax.driver.core.DefaultResultSetFuture.extractCauseFromExecutionException(DefaultResultSetFuture.java:258)
at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:174)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:52)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:36)
at com.websudos.phantom.zookeeper.DefaultCassandraManager$$anonfun$initIfNotInited$1.apply(SimpleCassandraConnector.scala:72)
at com.websudos.phantom.zookeeper.DefaultCassandraManager$$anonfun$initIfNotInited$1.apply(SimpleCassandraConnector.scala:70)
at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.package$.blocking(package.scala:50)
at com.websudos.phantom.zookeeper.DefaultCassandraManager$.initIfNotInited(SimpleCassandraConnector.scala:70)
at com.websudos.phantom.testing.SimpleCassandraTest$class.beforeAll(SimpleCassandraConnector.scala:38)

Missing play-iteratees

Hey,
Was getting unresolved dependencies for play-iteratees_2.10;2.2.0 & noticed in the project's build.scala that you you guys are referencing 2.2.0 whereas the mvn central repo has only 2.4.0-M1.
Where are you guys getting the 2.2.0?

introspect in CassandraTable doesn't work as expected

Reflection 'getMethods' is used to find all defined columns in CassandraTable. However according to Java documentation: " The elements in the array returned are not sorted and are not in any particular order."

In case of composite primary key: (PartitionKey, PK1, PK2, PK3) it may end up with CREATE TABLE query where primary keys will be defined in wrong order: PRIMARY KEY(PartitionKey, PK3, PK1, PK2) and it breaks all queries.

Re-add Maven central publishing.

Phantom is already enabled as a project on Maven Central. To share our project with the world without crashing our internal Maven repositories, re-add the settings.

  • Re-add the credentials.
  • Re-enable PGP encryption.
  • Open a ticket with Sonatype support.

Count query

As a developer I need to execute a count query like:
SELECT COUNT(*) FROM table_name WHERE condition ALLOW FILTERING
The result of this query should be a Long.

secondary index name

A developer needs to be able to set the secondary index name, as this is unique per keyspace, and the default value is mapped to the name of the column, which is unique only at the table level.
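In CQL terms, the request is for the generated statement to carry an explicit index name rather than defaulting to the column name; a sketch, using the table from an earlier issue and an illustrative index name:

```sql
-- The index name is scoped to the keyspace, so it must be unique there,
-- whereas a column name is only unique within its table.
CREATE INDEX answers_by_param ON answers_clustered (param_id);
```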

phantom-dsl: List $prependAll operator is reversing records.

To replicate, use com.newzly.phantom.tables.Recipes available in the phantom-test module.

val items = List("sugar", "spice")

// prepend a list of items
Recipes.where(_.url eqs someUrl).modify(_.ingredients prependAll items).one()

// check the output
val item = Recipes.select(_.ingredients).where(_.url eqs someUrl).one().sync().get
// the initial set of items is reversed to "spice", "sugar" ...
