dwhjames / datomisca

Datomisca: a Scala API for Datomic

Home Page: https://dwhjames.github.io/datomisca/

License: Apache License 2.0

Languages: Scala 100.00%

datomisca's Issues

Use () for defs with side effects

For example, the txReportQueue, removeTxReportQueue, and requestIndex methods in Connection should probably all be declared with ().
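
A minimal sketch of the convention (the return types below are illustrative guesses, not the actual signatures):

trait ConnectionOps {
  // side effect: registers a live queue of transaction reports with the peer
  def txReportQueue(): TxReportQueue
  // side effect: de-registers the queue from the peer
  def removeTxReportQueue(): Unit
  // side effect: asks the transactor to schedule an indexing job
  def requestIndex(): Boolean
}

With empty parens, a call site such as conn.requestIndex() reads as an effectful operation rather than property access.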

Version tagging

Maybe we should create a 0.1 tag from the commit on the date of the initial Datomisca release. Then, on current master, bump the version to 0.2-SNAPSHOT, and once we have reached a stable scope, tag it as 0.2...

What do you think?

implicits for DatomicDataToArgs should be in implicit scope

To execute a query you need an appropriate implicit DatomicDataToArgs instance. Currently this means you need both of these imports:

import datomisca._
import Datomic._

It would be far preferable if the DatomicDataToArgs implicits were in implicit scope, so that import datomisca._ alone would suffice.
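
A minimal sketch of the idea, with illustrative instance names: implicits declared in the companion object of the type class are part of its implicit scope, so the compiler can find them without a wildcard import of Datomic._.

trait DatomicDataToArgs[T]

object DatomicDataToArgs {
  // found automatically whenever a DatomicDataToArgs[(DatomicData, DatomicData)] is required
  implicit def tuple2: DatomicDataToArgs[(DatomicData, DatomicData)] = ???
}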

NullPointerException where mapped class references same class

Hi!

I'm pretty sure I found a bug, but I would love to be wrong about this :). It seems to stem from having a case class that references another instance of the same class. Take the following example:

  case class Price(uuid: UUID, amount: Double, previousPrice: Option[Price])

Notice how previousPrice is an option of type Price. Now if I map everything the normal way and add a newPrice method to create new Price entities....

object Price {

  object Schema {

    object ns { val price = Namespace("price") }

    val uuid = Attribute(ns.price / "uuid", SchemaType.uuid, Cardinality.one).withUnique(Unique.identity)
    val amount = Attribute(ns.price / "amount", SchemaType.double, Cardinality.one)
    val previousPrice = Attribute(ns.price / "previousPrice", SchemaType.ref, Cardinality.one)

    val all = List(uuid, amount, previousPrice)
  }

  implicit val reader: EntityReader[Price] = (
      Schema.uuid.read[UUID] and
      Schema.amount.read[Double] and
      Schema.previousPrice.readOpt[Price]
    )(Price.apply _)


  def newPrice(amount: Double, previousPrice: Option[Price])(implicit conn: Connection): Future[Price] = {

    val tempId =  DId(Partition.USER)
    val addTxn: AddEntity = (
      SchemaEntity.newBuilder
        += (Schema.uuid -> Datomic.squuid())
        += (Schema.amount -> amount)
        +?= (Schema.previousPrice -> previousPrice.map(pp => LookupRef(Schema.uuid, pp.uuid)))
      ) withId tempId

    Datomic.transact(addTxn) map { r =>

      val entityId = r.resolve(tempId)
      val entity = r.dbAfter.entity(entityId)
      DatomicMapping.fromEntity[Price](entity)
    }
  }
}

So far, so good, but if I run the following unit test....

  "A price" should {

    "be created once" in new WithDB {

      (for{
        _ <- Datomic.transact(Price.Schema.all)
        price <- Price.newPrice(123.00, None)

      } yield {

        price.amount must beEqualTo(123.00)
      }).await
    }

    "be created twice" in new WithDB {

      (for{
        _ <- Datomic.transact(Price.Schema.all)
        price1 <- Price.newPrice(123.00, None)
        price2 <- Price.newPrice(456.00, Some(price1))
      } yield {

        (price1.amount must beEqualTo(123.00)) and (price2.amount must beEqualTo(456.00))
      }).await
    }
  }

The first test (which does not tie in a previous price) passes, but the second test fails with the dreaded NullPointerException. This is the stack trace I get:

[error]    NullPointerException:   (attribute2EntityReader.scala:253)
[error] datomisca.Attribute2EntityReaderCast$$anon$25$$anon$11.read(attribute2EntityReader.scala:253)
[error] datomisca.package$RichAttribute$$anonfun$readOpt$extension$1.apply(package.scala:148)
[error] datomisca.package$RichAttribute$$anonfun$readOpt$extension$1.apply(package.scala:146)
[error] datomisca.EntityReader$$anon$1.read(entityMapper.scala:71)
[error] datomisca.EntityReader$EntityReaderMonad$$anonfun$bind$1.apply(entityMapper.scala:77)
[error] datomisca.EntityReader$EntityReaderMonad$$anonfun$bind$1.apply(entityMapper.scala:77)
[error] datomisca.EntityReader$$anon$1.read(entityMapper.scala:71)
[error] datomisca.EntityReader$EntityReaderMonad$$anonfun$bind$1.apply(entityMapper.scala:77)
[error] datomisca.EntityReader$EntityReaderMonad$$anonfun$bind$1.apply(entityMapper.scala:77)
[error] datomisca.EntityReader$$anon$1.read(entityMapper.scala:71)
[error] datomisca.EntityReader$EntityReaderFunctor$$anonfun$fmap$1.apply(entityMapper.scala:81)
[error] datomisca.EntityReader$EntityReaderFunctor$$anonfun$fmap$1.apply(entityMapper.scala:81)
[error] datomisca.EntityReader$$anon$1.read(entityMapper.scala:71)
[error] datomisca.DatomicMapping$.fromEntity(DatomicMapping.scala:26)

Printing out the entity produced by the line val entity = r.dbAfter.entity(entityId) seems to give me exactly the entity I would expect, but then it crashes on the next line.

BTW, for the record, I'm aware of how silly this example is in a database that keeps history. But trust me when I say that my real example needs to be like this :).

Add support for Keywords as a Datomic data type

We need to support Keywords as a value type. This also means we need to revisit our use of DRef to return idents as the value for references. From a query, the value of a reference attribute will be either Long or Keyword. From the entity graph, the value will be either Entity or Keyword.

Consistency bug in resolveEntity

There is a potential for inconsistent reads in the implementation of resolveEntity.

In DatomicFacilities:

def resolveEntity(tx: TxReport, id: DId)(implicit db: DDatabase): DEntity = {
  tx.resolveOpt(id) match {
    case None    => throw new TempidNotResolved(id)
    case Some(e) => db.entity(e)
  }
}

In TxReport:

def resolveOpt(id: DId): Option[Long] =
  Option {
    datomic.Peer.resolveTempid(dbAfter.underlying, tempids, id.toNative)
  } map { id =>
    id.asInstanceOf[Long]
  }

The current implementation of resolveEntity could result in the id being resolved against one state of the db and the entity being read from another.

I think resolveEntity should be moved into TxReport, and the dbAfter database value provided in the tx report should be used both for resolving the id and for reading the entity.
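
A minimal sketch of that change (not the current API), with resolveEntity defined on TxReport so that both steps use the same dbAfter value:

def resolveEntity(id: DId): DEntity =
  resolveOpt(id) match {
    case None           => throw new TempidNotResolved(id)
    case Some(entityId) => dbAfter.entity(entityId)
  }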

Extractor for DRef

An extractor for DRef to help match against keywords would be helpful.

case DRef(KW(":ns/attr")) =>

However, DRef is already a case class, and the macros rely on this.
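
One possible work-around is a separately named extractor object, sketched below with a hypothetical name and an unimplemented body, leaving the synthesized unapply of the DRef case class untouched for the macros:

object DRefKW {
  // would return the ident keyword if this ref wraps one, None otherwise
  def unapply(ref: DRef): Option[Keyword] = ???
}

// usage at a match site:
// case DRefKW(kw) if kw == KW(":ns/attr") => ...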

Problem with SchemaType.instant and Cardinality.many

I'm getting an error when I create an implicit reader for an entity that has an attribute of type instant and cardinality many, which I'm pretty sure is a bug. Here's an example of a class with a single Date that works fine:

case class AppointmentRequest(title: String, proposedTime: Date)

object AppointmentRequest {

  object Schema {

    object ns {
      val appointmentRequest = new Namespace("appointmentRequest")
    }
    val title = Attribute(ns.appointmentRequest / "title", 
                          SchemaType.string, 
                          Cardinality.one)
    val proposedTime = Attribute(ns.appointmentRequest / "proposedTime", 
                                 SchemaType.instant, 
                                 Cardinality.one)
  }

  implicit val reader = (
    Schema.title.read[String] and
    Schema.proposedTime.read[java.util.Date]
  )(AppointmentRequest.apply _)
}

But if I change proposedTime to Cardinality.many...

case class AppointmentRequest(title: String, proposedTime: Set[Date])

object AppointmentRequest {

  object Schema {

    object ns {
      val appointmentRequest = new Namespace("appointmentRequest")
    }
    val title = Attribute(ns.appointmentRequest / "title", 
                          SchemaType.string, 
                          Cardinality.one)
    val proposedTime = Attribute(ns.appointmentRequest / "proposedTime",
                                 SchemaType.instant, 
                                 Cardinality.many)
  }

  implicit val reader = (
    Schema.title.read[String] and
    Schema.proposedTime.read[Date]
  )(AppointmentRequest.apply _)
}

It gives the following error:

[error] /.../AppointmentRequest.scala:29: There is no type-casting reader for type java.util.Date given an attribute with Datomic type java.util.Date and cardinality datomisca.Cardinality.many.type to type java.util.Date
[error]     Schema.proposedTime.read[Date]
[error]                             ^

implicit Datomic.database is potentially dangerous magic

I’m concerned about

implicit def database(implicit conn: Connection) = conn.database

in trait DatomicPeer. I don't think this should be an implicit def, just a regular def. The implicit makes it far too easy to silently lose control over which database value is being used, which risks silently breaking read consistency across multiple reads.

The only major part of Datomisca with (implicit db: DDatabase) params is the query methods, and this param is only required to ensure that a database value is supplied as an input to a query when no other inputs are given. I think it would be better to strip this out of the query implementation and simply require that query invocations always provide the appropriate inputs.
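
A minimal sketch of the proposed change (not the current code): drop the implicit modifier so callers have to capture a database value explicitly and reuse it for a consistent group of reads.

// in trait DatomicPeer
def database(implicit conn: Connection): DDatabase = conn.database

// at a call site, pin one value for the whole unit of work
val db = Datomic.database
// ... run every query and entity lookup for this unit of work against db ...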

Parser is broken for parsing ids in assertions and retractions

The parser should only accept final ids for retractions… instead it only accepts temporary ids:

[:db/retract #db/id[:db.part/user] :db/ident :region/n]

For assertions it should accept both temporary and final ids, but it only accepts temporary ids.
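
For example, a retraction that names a resolved entity id (the number below is just an illustrative id) should parse:

[:db/retract 17592186045418 :db/ident :region/n]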

DSet doesn't preserve data sequence order in datasources

DSet is the only collection type in DatomicData, but when it is used in datasources for queries it may lose ordering, since it is treated as a Set and not a Seq.
We might introduce something to manage this:

  • DSeq, though this type shouldn't be used when mapping query result data
  • A new type, Datasource, which can be built from sequences of values...

Thoughts?
