Dropwizard Scala

Scala support for Dropwizard.

This project is no longer actively maintained. If you wish to continue using it, you're welcome to fork it and continue using it in line with the Apache 2.0 license terms.

Usage

Just add a dependency on dropwizard-scala-core and, optionally, dropwizard-scala-jdbi to your project:

SBT

libraryDependencies += "com.datasift.dropwizard.scala" %% "dropwizard-scala-core" % "1.0.0-1"

Maven

Include the dropwizard-scala-core artifact in your POM:

<dependency>
    <groupId>com.datasift.dropwizard.scala</groupId>
    <artifactId>dropwizard-scala-core_2.10</artifactId>
    <version>1.0.0-1</version>
</dependency>

It's good practice to keep your Scala binary version (the _2.10 suffix on cross-built artifact IDs) as a global property that you can reference elsewhere to ensure consistency across your POM:

<properties>
    <scala.binary.version>2.10</scala.binary.version>
    <dropwizard.version>1.0.0</dropwizard.version>
    <dropwizard.scala.version>${dropwizard.version}-1</dropwizard.scala.version>
</properties>

<dependencies>
    <dependency>
        <groupId>com.datasift.dropwizard.scala</groupId>
        <artifactId>dropwizard-scala-core_${scala.binary.version}</artifactId>
        <version>${dropwizard.scala.version}</version>
    </dependency>
</dependencies>

Core

  • A base ScalaApplication trait for applications to be defined as a singleton object:
import io.dropwizard.Configuration
import io.dropwizard.setup.{Bootstrap, Environment}
import com.datasift.dropwizard.scala.ScalaApplication
import com.datasift.dropwizard.scala.validation.constraints._

class MyConfiguration extends Configuration {
  @NotEmpty val greeting: String = "Hello, %s!"
  @NotNull val greeters: List[String] = Nil
}

object MyApplication extends ScalaApplication[MyConfiguration] {
  def init(bootstrap: Bootstrap[MyConfiguration]): Unit = {
    // add bundles, commands, etc. here
  }

  def run(conf: MyConfiguration, env: Environment): Unit = {
    // register resources, health checks, etc. here
  }
}

When you build an application like this, the ScalaBundle is automatically added, providing everything else described here.

  • Jackson support for Scala collections, Option and case classes, enabling (de)serialization of Scala collections and case classes in configurations and within Jersey request/response entities (a short sketch follows this list).

  • log4s is provided automatically, via a transitive dependency. To use it, simply import org.log4s._. See http://github.com/log4s/log4s for more details.
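
To illustrate the Jackson support above, here's a minimal sketch that round-trips a hypothetical case class through the environment's ObjectMapper, which ScalaBundle has already configured for Scala types (the Greeting type and the exact JSON rendering of None are assumptions):

import io.dropwizard.setup.Environment

case class Greeting(message: String, recipient: Option[String])

object JacksonExample {
  def roundTrip(env: Environment): Unit = {
    val mapper = env.getObjectMapper
    // Serializes the case class, e.g. {"message":"Hello","recipient":null}
    val json = mapper.writeValueAsString(Greeting("Hello", None))
    // Deserializes straight back into the case class, with None restored
    val parsed = mapper.readValue(json, classOf[Greeting])
    assert(parsed == Greeting("Hello", None))
  }
}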

Metrics

  • A more idiomatic API for metrics is provided by com.datasift.dropwizard.scala.metrics._.
import com.codahale.metrics._
import com.datasift.dropwizard.scala.metrics._
import io.dropwizard.setup.Environment

object MyApplication extends ScalaApplication[MyConfiguration] {
  def run(conf: MyConfiguration, env: Environment): Unit = {
    env.metrics.gauge("things.current_time") {
      System.currentTimeMillis()
    }

    env.metrics.timer("things.some_timer") {
      // do something and time the execution
    }
  }
}

Jersey

  • Support for Option in resource method parameters and for request/response entities.

  • Support for Either[L, R] in resource method parameters, where L and R are both types Jersey supports for parameters. By convention, it will attempt to decode the parameter first into the right side as an R and, if that fails, into the left side as an L.

  • Support for Seq[A], List[A], Vector[A], IndexedSeq[A] and Set[A] in resource method parameters, where A is any non-collection type that Jersey supports for parameters. This is the same limitation imposed on Java collections.

  • Support for BigInt and BigDecimal in resource method parameters and request/response entities.

  • Support for Scala's native Boolean, Int and Long types in resource method parameters via the BooleanParam, IntParam and LongParam wrapper types.
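
A hypothetical resource sketching the parameter support above (the path, parameter names and response body are illustrative only, not part of the library):

import javax.ws.rs.{GET, Path, Produces, QueryParam}
import javax.ws.rs.core.MediaType

@Path("/search")
@Produces(Array(MediaType.APPLICATION_JSON))
class SearchResource {

  // Option[String] is None when the parameter is absent; Seq[String]
  // collects repeated parameters; Either[String, Int] tries the right
  // side (Int) first and falls back to the left side (String).
  @GET
  def search(@QueryParam("q") query: Option[String],
             @QueryParam("tag") tags: Seq[String],
             @QueryParam("idOrName") idOrName: Either[String, Int]): Seq[String] =
    query.toSeq ++ tags :+ idOrName.fold(identity, _.toString)
}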

JDBI

  • Scala collections and Option as the return type for a result set (i.e. multiple rows of results).

    Note: when returning a single row as an Option, you must use the @SingleValueResult annotation:

    @SqlQuery("select i from tbl limit 1")
    @SingleValueResult
    def headOption: Option[Int]
  • Support for the BigDecimal and Option types as parameters and result column types.

  • Support for returning a row as a case class or tuple, with the following constraints (a combined DAO sketch follows this list):

    • selected columns must match up with constructor parameters positionally.
    • only the first defined public constructor will be used if multiple constructors are defined.
    • parameter types must be directly mappable from their SQL types, without the use of a mapper. The only exceptions to this rule are Option and scala.BigDecimal, which are natively supported.
  • case classes and tuples as parameters using the BindProduct annotation:

    @SqlUpdate("insert into tbl (a, b, c, d) values (:x.a, :x.b, :y._1, :y._2)")
    def insert(@BindProduct("x") x: Thing, @BindProduct("y") y: (Int, String))

    Note: BindProduct will bind to any no-args method or field (prioritizing no-arg methods).

  • A more idiomatic JDBI API:

    import com.datasift.dropwizard.scala.jdbi._
    
    val db = JDBI(dataSource)
    val dao = db.onDemand[MyDAO]
    val result: Int = db.inTransaction {
      handle: Handle => handle.attach[MyDAO].myQuery(123)
    }
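
Putting the JDBI pieces above together, a hypothetical DAO might look like this (the User type, the table, and the import path of BindProduct are assumptions):

import org.skife.jdbi.v2.sqlobject.{Bind, SqlQuery, SqlUpdate}
import org.skife.jdbi.v2.sqlobject.customizers.SingleValueResult
import com.datasift.dropwizard.scala.jdbi.tweak.BindProduct // path assumed

// Columns must line up with the constructor parameters positionally.
case class User(id: Int, name: String, balance: Option[BigDecimal])

trait UserDao {
  // Multiple rows map to a Scala collection of case classes.
  @SqlQuery("select id, name, balance from users")
  def all: Seq[User]

  // A single optional row requires @SingleValueResult.
  @SqlQuery("select id, name, balance from users where id = :id limit 1")
  @SingleValueResult
  def find(@Bind("id") id: Int): Option[User]

  // Case classes bind as parameters via @BindProduct.
  @SqlUpdate("insert into users (id, name, balance) values (:u.id, :u.name, :u.balance)")
  def insert(@BindProduct("u") user: User): Int
}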

To enable Scala integration for JDBI, you will need to add an extra dependency:

SBT

libraryDependencies += "com.datasift.dropwizard.scala" %% "dropwizard-scala-jdbi" % "1.0.0-1"

Maven

<dependency>
    <groupId>com.datasift.dropwizard.scala</groupId>
    <artifactId>dropwizard-scala-jdbi_${scala.binary.version}</artifactId>
    <version>${dropwizard.scala.version}</version>
</dependency>

Validation

  • Support for all JSR-303 and Hibernate Validator constraints on Scala types. In particular, support is added for @NotEmpty and @Size on Scala collections. All other constraint annotations work on Scala types out of the box.

  • Validation of Scala case class properties using JSR-303 and Hibernate Validator constraints. To validate a case class, you will need to use the wrapper constraints defined in com.datasift.dropwizard.scala.validation.constraints:

import com.datasift.dropwizard.scala.validation.constraints._

class MyConfiguration extends Configuration {
  @NotEmpty val names: List[String] = Nil
  @Min(0) val age: Int = 20
}

Limitations

In order to cascade validation using @Valid on collection types, Hibernate requires that the collection provide a Java Iterator. Since Scala collections don't provide this, they cannot cascade validation.

In the following example, only MyConfiguration is validated. Person values held in the people collection are not validated, though the size of people is.

case class MyConfiguration(@Valid @NotEmpty people: List[Person]) 
  extends Configuration

case class Person(@NotEmpty name: String, @Min(0) age: Int)

Test

This module provides some utilities for aiding testing with ScalaTest. Note: this module is by far the least mature, and the API of its components is subject to change. Comments, ideas and suggestions welcome.

See core/src/test/**/ScalaApplicationSpecIT for examples of all of these components in action.

  • BeforeAndAfterMulti - a utility trait that allows multiple functions to be registered to run before and after tests, executing the after functions in the reverse order to their associated before functions. This behaves similarly to Dropwizard's lifecycle management, except it's managing the lifecycle of test dependencies.

    All of the *Test utilities below require that your test class extend this trait; a composed example follows this list.

  • ApplicationTest - runs tests in the context of a running Dropwizard Application:

    val app =
      ApplicationTest(this, configFilePath) {
        MyApplication
      }

    The returned object contains the following utility methods to work with the application:

    • configuration: Try[C] - the application's configuration.
    • application: Try[A] - the application object itself.
    • environment: Try[Environment] - the application's Environment.
    • server: Try[Server] - the application's Jetty Server.
    • newClient(name: String): Try[Client] - a helper to construct a Jersey Client that connects to the application.
  • MySQLTest - runs tests in the context of a running MySQL server:

    val mysql = MySQLTest(this, dataSourceFactory.getUrl) {
      dataSourceFactory.build(new MetricRegistry, "test")
    }

    The returned object contains the following utility methods to work with the MySQL server:

    • dataSource: Try[ManagedDataSource] - the DataSource used to create the database instance.
    • baseDir: Try[File] - the base directory for the MySQL server's data.

    Note: to use this object, you will need to add a dependency on mysql:mysql-connector-mxj:5.0.12.

  • LiquibaseTest - runs tests in the context of a database migration:

    val migrations = LiquibaseTest(
      this, LiquibaseTest.Config(migrationsFilePath)) {
        dataSourceFactory.build(new MetricRegistry, "migrations")
      }

    The returned object contains the following utility methods to work with the Liquibase context:

    • dataSource: Try[ManagedDataSource] - the DataSource used to connect to the database instance.
    • liquibase: Try[CloseableLiquibase] - the Liquibase context that ran the migrations.
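
As a sketch of how these utilities compose: a minimal ScalaTest spec, assuming the import paths shown (they are guesses; see ScalaApplicationSpecIT for canonical usage) and the MyApplication object defined earlier:

import org.scalatest.FlatSpec
import com.datasift.dropwizard.scala.test.{ApplicationTest, BeforeAndAfterMulti} // paths assumed

class MyApplicationSpec extends FlatSpec with BeforeAndAfterMulti {

  // Started before the tests run and torn down afterwards, in reverse
  // registration order.
  val app = ApplicationTest(this, "src/test/resources/config.yml") {
    MyApplication
  }

  "MyApplication" should "expose a working environment" in {
    assert(app.environment.isSuccess)
  }
}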

Issues

Defaults for Options don't appear to be supported

I'd be happy to be wrong, but when I declare something like:

@QueryParam("limit") limit: Option[Integer]

I would expect limit to be None when the code is run. Instead, it's null. On the off chance that it's supposed to be null when no default is actually supplied, I also tried:

@QueryParam("limit") limit: Option[Integer] = None

To the same effect.

We were previously using the massrelevance version of this library; this one looks much better and seems to be maintained. However, we seem to have lost our optional defaults.

With some direction, I'll happily try to fix this myself, but I'm pretty new to jersey's internals.

add LICENSE

Add the Apache 2.0 license file to the repository.

ProductResultSetMapper doesn't properly handle null fields

Deserializing non-Option types can erroneously result in a null value if a previous column has a null value.

case class DBFoo(bar: Option[Int], baz: String)
trait FooDao {
  @SqlQuery("SELECT bar, baz from foo limit 1")
  @SingleValueResult
  def getFoo(): Option[DBFoo]
}

"Jdbi-scala" should "properly deserialize non-options" in {
    handle.execute("CREATE TABLE `foo` (`bar` INT, `baz` varchar(128) NOT NULL) ENGINE=InnoDB DEFAULT CHARSET=utf8;")
    handle.execute("INSERT INTO foo (baz) VALUES (?)", "this isn't null")
    handle.commit()
    val dao = dbi.onDemand(classOf[FooDao])
    val foo: Option[DBFoo] = dao.getFoo()
    foo.get.baz should equal("this isn't null")
  }

This results in a failure:
null did not equal "this isn't null"

I believe this is due to the order in which ProductResultSetMapper is identifying null values:

override def map(index: Int,
                 rs: ResultSet,
                 ctx: StatementContext): A = {
  ...
  t match {
    // todo: do we need to explicitly match on the java variations of these types too?
    case _ if t.isAssignableFrom(classOf[Option[_]]) =>
      Option(rs.getObject(i))
    case _ if !t.isPrimitive && rs.wasNull =>
      null
    ...
    case _ if t.isAssignableFrom(classOf[String]) =>
      rs.getString(i)
    ...
  }
}

The case _ if !t.isPrimitive && rs.wasNull => line uses rs.wasNull, which refers to the last retrieved field value (not the current field). So if a previous column was null (and deserialized to an Option), the current field will be mapped to null, even if its own value is not null.

From the ResultSet docs:

/**
 * Reports whether the last column read had a value of SQL <code>NULL</code>.
 * Note that you must first call one of the getter methods on a column to try
 * to read its value and then call the method <code>wasNull</code> to see if
 * the value read was SQL <code>NULL</code>.
 */
boolean wasNull() throws SQLException;

One possible workaround would be to call rs.getObject(i) just prior to the call to rs.wasNull, but this results in retrieving every field twice. Another option might be to disallow null values that are not mapped to an Option.
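
A minimal sketch of the first workaround, reading the column value before consulting rs.wasNull so that it reflects the current column (at the cost of retrieving non-primitive values twice):

case _ if t.isAssignableFrom(classOf[Option[_]]) =>
  Option(rs.getObject(i))
case _ if !t.isPrimitive && { rs.getObject(i); rs.wasNull } =>
  // getObject forces a read, so wasNull now refers to column i,
  // not whichever column happened to be read last.
  null
case _ if t.isAssignableFrom(classOf[String]) =>
  rs.getString(i)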

ProductResultSetMapper cannot handle List[Int] as the return type of a DAO method

When returning a List[Int] from a JDBI DAO, I see the following exception. Please help.

Here is the relevant code:

@UseStringTemplate3StatementLocator
abstract class RelationalDao() {

  @SqlQuery("SELECT * FROM <resourceName>;")
  def getList(@Define("resourceName") resourceName: String): List[Int]
}

This results in the exception below:

ERROR [2017-02-08 21:52:46,803] io.dropwizard.jersey.errors.LoggingExceptionMapper: Error handling a request: 7fc7ab8663f6e79d
! java.lang.InstantiationException: null
! at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
! at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
! at com.datasift.dropwizard.scala.jdbi.tweak.ProductResultSetMapper.map(ProductResultSetMapperFactory.scala:79)
! ... 74 common frames omitted
! Causing: java.lang.IllegalArgumentException: Cannot create instances of abstract class: List
! at com.datasift.dropwizard.scala.jdbi.tweak.ProductResultSetMapper.map(ProductResultSetMapperFactory.scala:85)
! at com.datasift.dropwizard.scala.jdbi.tweak.ProductResultSetMapper.map(ProductResultSetMapperFactory.scala:22)
! at org.skife.jdbi.v2.RegisteredMapper.map(RegisteredMapper.java:35)

Error in reading default configuration in Scala application only

When running a ScalaApplication I get the following error:

default configuration has an error:

  • Unrecognized field at: server.getApplicationConnectors.[1].getPort
    Did you mean?:
    • port
    • bindHost
    • idleTimeout
    • reuseAddress
    • soLingerTime
      [14 more]

I built two almost identical applications, one in Scala and one in Java, to illustrate the problem:
https://github.com/oferron/dropwizard-scala-minimal. The Java application runs successfully.

As far as I can tell, the ScalaAnnotationIntrospector assigns the wrong name to methods, e.g. 'getApplicationConnectors' instead of 'applicationConnectors', so the method isn't recognised as a getter, and this causes the error.

Maven artefacts

I could not locate the artefacts you mention in README.md. What I was able to find is this:

com.massrelevance:dropwizard-scala_2.10:0.7.0

Is this the version I should use?

Disambiguate inTransaction(TransactionIsolationLevel)

Hi,

I'm trying to use inTransaction with a TransactionIsolationLevel:

dbi.inTransaction(isolation = SERIALIZABLE) { f: Handle =>
  // Whatever
}

This fails to compile with:

ambiguous reference to overloaded definition, both method inTransaction in class JDBIWrapper of type (isolation: org.skife.jdbi.v2.TransactionIsolationLevel)(f: org.skife.jdbi.v2.Handle => T)T and method inTransaction in class JDBIWrapper of type (isolation: org.skife.jdbi.v2.TransactionIsolationLevel)(f: (org.skife.jdbi.v2.Handle, org.skife.jdbi.v2.TransactionStatus) => T)T match argument types (org.skife.jdbi.v2.TransactionIsolationLevel)

Those methods are not callable because only the first argument group is used to resolve the overload (AFAIK). Would it be possible to disambiguate them?

I've created PR #27 for this.

QueryParam of Option of scala primitive results in error

I've been scratching my head at this one for a bit now, after upgrading from the old massrelevance version to the datasift-maintained version.

Specifically, the error occurs whenever I try to add something like @QueryParam("foo") foo: Option[Int] to a resource.

I've done some digging, and similar combinations such as @QueryParam("foo") foo: Option[java.lang.Integer] and @QueryParam("foo") foo: Int work as expected.

This can be replicated by adding the following test/resource to ScalaApplicationSpecIT.scala (in fact, just adding a resource path with an Option[Int] parameter fails validation of the resource on test-suite startup):

--- a/core/src/test/scala/com/datasift/dropwizard/scala/ScalaApplicationSpecIT.scala
+++ b/core/src/test/scala/com/datasift/dropwizard/scala/ScalaApplicationSpecIT.scala
@@ -37,6 +37,15 @@ case class ScalaTestConfiguration(
   def greetOrNotFound(@QueryParam("name") name: Option[String]): Option[List[String]] =
     name.map(greeting.format(_)).map(List(_))

+
+  @GET @Path("/opt-int-param")
+  def greetWithOptionInt(@QueryParam("i") i: Option[Int]): Option[Int] =
+    i
+
   @GET @Path("/option")
   def greetWithOption(@QueryParam("name") name: Option[String]): List[String] =
     name.map(greeting.format(_)).toList
@@ -313,6 +322,28 @@ class ScalaApplicationSpecIT extends FlatSpec with BeforeAndAfterAllMulti {
     assert(result === Success(expected))
   }

+
+  "GET /opt-int-param" should "yield results" in {
+    val fixture = 2
+    val expected = Option(2)
+    val result = request("/opt-int-param").map {
+      _.queryParam("i", fixture.toString)
+        .request(MediaType.APPLICATION_JSON)
+        .get(classOf[Option[Int]])
+    }
+    assert(result === Success(expected))
+  }
+
   "GET /complex_scala" should "yield results" in {
     val fixture: Set[BigDecimal] = Set(BigDecimal(1), BigDecimal(2))
     val expected = 2

I think this has something to do with type resolution or erasure: introspecting ScalaParamConvertersProvider at test time shows getConverter being called with the parameter type listed as type=class scala.Option, while the generic type at runtime is scala.Option[java.lang.Object].

The actual error the container emits:

Caused by: org.glassfish.jersey.server.model.ModelValidationException: Validation of the application resource model has failed during application initialization.
[[FATAL] No injection source found for a parameter of type public scala.Option com.datasift.dropwizard.scala.ScalaTestResource.greetWithOptionInt(scala.Option) at index 0.; source='ResourceMethod{httpMethod=GET, consumedTypes=[application/json], producedTypes=[application/json], suspended=false, suspendTimeout=0, suspendTimeoutUnit=MILLISECONDS, invocable=Invocable{handler=ClassBasedMethodHandler{handlerClass=class com.datasift.dropwizard.scala.ScalaTestResource, handlerConstructors=[org.glassfish.jersey.server.model.HandlerConstructor@6097f225]}, definitionMethod=public scala.Option com.datasift.dropwizard.scala.ScalaTestResource.greetWithOptionInt(scala.Option), parameters=[Parameter [type=class scala.Option, source=i, defaultValue=null]], responseType=scala.Option<java.lang.Object>}, nameBindings=[]}']

Question on how to return query results as Seq

We use the @UseStringTemplate3StatementLocator annotation and locate our SQL elsewhere, but I don't know if it matters.

We've been using the other dropwizard scala library to this point and thought that with this one we could change this:

  @SqlQuery
  def all: java.util.List[User]

To this:

  @SqlQuery
  def all: Seq[User]

Which would save us a lot of .asScala littered throughout our code.

But all I get is: No mapper registered for scala.collection.Seq. Do I have my wires crossed? Is this something that's actually supported, or am I making this up?

I have added dbi.registerContainerFactory(new IterableContainerFactory[scala.collection.Seq]), for what it's worth. I thought that would do it.
