
Yet another JSON library for Scala

Home Page: https://circe.github.io/circe/

License: Apache License 2.0

Languages: Scala 99.92%, Nix 0.08%
Topics: json, generic-derivation, scala

circe's Introduction

circe


circe is a JSON library for Scala (and Scala.js).

Please see the guide for more information about why circe exists and how to use it.

Community

Adopters

Are you using circe? Please consider opening a pull request to list your organization here:

Other circe organization projects

Please get in touch on Gitter if you have a circe-related project that you'd like to discuss hosting under the circe organization on GitHub.

  • circe-benchmarks: Benchmarks for comparing the performance of circe and other JSON libraries for the JVM.
  • circe-config: A library for translating between HOCON, Java properties, and JSON documents.
  • circe-derivation: Experimental generic derivation with improved compile times.
  • circe-fs2: A library that provides streaming JSON parsing and decoding built on fs2 and Jawn.
  • circe-iteratee: A library that provides streaming JSON parsing and decoding built on iteratee.io and Jawn.
  • circe-jackson: A library that provides Jackson-supported parsing and printing for circe.
  • circe-spray: A library that provides JSON marshallers and unmarshallers for Spray using circe.
  • circe-yaml: A library that uses SnakeYAML to support parsing YAML 1.1 into circe's Json.

Related projects

The following open source projects are either built on circe or provide circe support:

  • Actor Messenger: A platform for instant messaging.
  • akka-http-json: A library that supports using circe for JSON marshalling and unmarshalling in Akka HTTP.
  • akka-stream-json: A library that provides JSON support for stream based applications using Jawn as a parser with a convenience example for circe.
  • Argus: Generates models and circe encoders and decoders from JSON schemas.
  • Blackdoor JOSE: circe JSON support for blackdoor JOSE and JWT.
  • borer: Allows circe encoders/decoders to be reused for CBOR (de)serialization.
  • circe-debezium: Circe codecs for Debezium payload types.
  • circe-geojson: Circe support for GeoJSON (RFC 7946).
  • circe-kafka: Implicit conversion of Encoder and Decoder into Kafka Serializer/Deserializer/Serde.
  • cornichon: A DSL for JSON API testing.
  • Cosmos: An API for DCOS services that uses circe.
  • crjdt: A conflict-free replicated JSON datatype in Scala.
  • diffson: A Scala diff / patch library for JSON.
  • elastic4s: A Scala client for Elasticsearch with circe support.
  • Enumeratum: Enumerations for Scala with circe integration.
  • Featherbed: A REST client library with circe support.
  • Finch: A library for building web services with circe support.
  • fintrospect: HTTP contracts for Finagle with circe support.
  • fluflu: A Fluentd logger.
  • Github4s: A GitHub API wrapper written in Scala.
  • content-api-models: The Guardian's Content API Thrift models.
  • http4s: A purely functional HTTP library for client and server applications.
  • IdeaLingua: Staged Interface Definition and Data Modeling Language & RPC system currently targeting Scala, Go, C# and TypeScript. Scala codegen generates models and JSON codecs using circe.
  • Iglu Schema Repository: A JSON Schema repository with circe support.
  • jsactor: An actor library for Scala.js with circe support.
  • jsoniter-scala-circe: A booster for faster parsing/printing to/from circe AST and decoding/encoding of java.time._ and BigInt types.
  • jwt-circe: A JSON Web Token implementation with circe support.
  • kadai-log: A logging library with circe support.
  • msgpack4z-circe: A MessagePack implementation with circe support.
  • ohNoMyCirce: Friendly compile error messages for shapeless's Generic, circe's Encoder & Decoder and slick's case class mapping.
  • play-circe: circe support for Play!.
  • pulsar4s: A Scala client for Apache-Pulsar with circe support.
  • Rapture: Support for using circe's parsing and AST in Rapture JSON.
  • roc: A PostgreSQL client built on Finagle.
  • sangria-circe: circe marshalling for Sangria, a GraphQL implementation.
  • scalist: A Todoist API client.
  • scala-jsonapi: Scala support library for integrating the JSON API spec with Spray, Play! or Circe.
  • scala-json-rpc: JSON-RPC 2.0 library for Scala and Scala.js.
  • scalatest-json-circe: Scalatest matchers for Json with appropriate equality and descriptive error messages.
  • Scio: A Scala API for Apache Beam and Google Cloud Dataflow; uses circe for JSON IO.
  • seals: Tools for schema evolution and language-integrated schemata (derives circe encoders and decoders).
  • shaclex: RDF validation using SHACL or ShEx.
  • Slick-pg: Slick extensions for PostgreSQL.
  • sttp: Scala HTTP client.
  • Synapses: A lightweight Neural Network library, for js, jvm and .net.
  • telepooz: A Scala wrapper for the Telegram Bot API built on circe.
  • Zenith: Functional HTTP library built on circe.

Examples

The following projects provide examples, templates, or benchmarks that include circe:

Contributors and participation

circe is a fork of Argonaut, and if you find it at all useful, you should thank Mark Hibberd, Tony Morris, Kenji Yoshida, and the rest of the Argonaut contributors.

circe is currently maintained by Darren Gibson and Erlend Hamnaberg. After the 1.0 release, all pull requests will require two maintainer sign-offs to be merged.

The circe project is a Typelevel affiliate project and follows the Typelevel Code of Conduct.

Please see the contributors' guide for details on how to submit a pull request.

License

circe is licensed under the Apache License, Version 2.0 (the "License"); you may not use this software except in compliance with the License.

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

circe's People

Contributors

alexarchambault, aparo, armanbilge, cb372, cjsmith-0141, diesalbla, dwijnand, fthomas, hamnis, howyp, isomarcte, jonas, jorokr21, julien-truffaut, keisunagawa, koterpillar, liff, mattkohl, n4to4, nilsmahlstaedt, pfcoperez, pjan, plokhotnyuk, scala-steward, travisbrown, travisbrown-stripe, vkostyukov, xuwei-k, zarthross, zmccoy


circe's Issues

Provide a macro-annotation to reduce defining Encoder/Decoder pairs

From your blog post https://meta.plasm.us/posts/2016/01/14/configuring-generic-derivation/

I was wondering if perhaps

import io.circe.generic.semiauto._

case class Foo(thisIsSomeNumber: Int = 0)
object Foo {
  implicit val decodeFoo: Decoder[Foo] = deriveConfiguredDecoder[Foo, SnakeCaseKeys]
  implicit val encodeFoo: Encoder[Foo] = deriveConfiguredEncoder[Foo, SnakeCaseKeys]
}

could be further boilerplate-reduced to

import io.circe.macros.MkCodec // or some other name
import io.circe.generic.semiauto._

@MkCodec case class Foo(thisIsSomeNumber: Int = 0)

I've read your "The public API should not contain unnecessary types" point in DESIGN.md and #133, and I think this doesn't go against those wishes.

Support for default parameter values in generic

This appears to be mostly shapeless related, but @travisbrown asked me to file this for posterity...

case class Foo(s: String, i: Int = 10)
val json = """{"s": "Hello"}"""

// this obviously works
decode[Int => Foo](json).map(_(10))

// but what about cases where you have many default parameters of the same type?
case class Bar(i1: Int, i2: Int = 2, i3: Int = 3, i4: Int = 4)
val json2 = """{"i1": 1}"""

// i would like the following behavior...
"""{"i1": 1, "i4": 44}""".as[Bar] // => Bar(1, 2, 3, 44)

// this would kind of work, but it overrides the default values
decode[(Int, Int, Int) => Bar](json2).map(_(2, 3, 44))

// alternatively, I could do this to ignore the potentially incoming values:
val dec = Decoder.instance[Bar](c =>
  for {
    i1 <- c.downField("i1").as[Int]
  } yield Bar(i1)
)

// or change my datatype completely:
case class Bar2(i1: Int, i2: Option[Int], i3: Option[Int], i4: Option[Int])
val dec2 = Decoder.instance[Bar2](c =>
  for {
    i1 <- c.downField("i1").as[Int]
    i2 <- c.downField("i2").as[Option[Int]]
    i3 <- c.downField("i3").as[Option[Int]]
    i4 <- c.downField("i4").as[Option[Int]]
  } yield Bar2(i1, i2, i3, i4)
)

Unwanted object encoding instances for tuples, lists, etc.

I've introduced an ObjectEncoder[A] type class that extends Encoder[A] but additionally guarantees that the generated JSON will be a JSON object. Argonaut doesn't provide this, but it's useful in many situations where you as the developer know that the instance for a type has this property (meaning that asObject casts would just be noise and extra code to maintain).

ObjectEncoder is a way to share that knowledge with the compiler, while still having something that will work when people just want an Encoder.

The issue is that there are ADT / case class types like tuples and lists where the usual Encoder instances are not ObjectEncoders. This means that if our generic derivation mechanism for ADTs and case classes provides ObjectEncoders, we'll get those if we ask for them for e.g. tuples:

scala> import io.circe._, io.circe.generic.auto._
import io.circe._
import io.circe.generic.auto._

scala> Encoder[(Int, String)].apply((1, "foo")).noSpaces
res3: String = [1,"foo"]

scala> ObjectEncoder[(Int, String)].apply((1, "foo")).noSpaces
res4: String = {"_2":"foo","_1":1}

scala> ObjectEncoder[List[Int]].apply(List(1, 2, 3)).noSpaces
res5: String = {"::":{"tl$1":[2,3],"head":1}}

I'm currently side-stepping this problem by having my generic derivation mechanism return plain old Encoder instances, but it'd be nice to have the more informative instances for case classes and ADTs.
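The guarantee ObjectEncoder provides can be modeled without circe: the subtype refines the result type, so an asObject cast is never needed. A minimal stdlib-only sketch (the tiny JSON ADT and all names here are illustrative, not circe's actual definitions):

```scala
object ObjectEncoderSketch {
  sealed trait Json
  final case class JNum(value: Int) extends Json
  final case class JObj(fields: List[(String, Json)]) extends Json

  trait Encoder[A] { def apply(a: A): Json }

  // Refines the result type: callers asking for a plain Encoder still get
  // one, but holders of an ObjectEncoder statically know the output is an object.
  trait ObjectEncoder[A] extends Encoder[A] {
    def encodeObject(a: A): JObj
    final def apply(a: A): Json = encodeObject(a)
  }

  // A derived-style instance for pairs, using field names _1/_2.
  val pairEncoder: ObjectEncoder[(Int, Int)] = new ObjectEncoder[(Int, Int)] {
    def encodeObject(p: (Int, Int)): JObj =
      JObj(List("_1" -> JNum(p._1), "_2" -> JNum(p._2)))
  }

  // No cast: the type already says the result is an object.
  val fieldNames: List[String] = pairEncoder.encodeObject((1, 2)).fields.map(_._1)
}
```

This is exactly the tension described above: the refinement is valuable for case classes and ADTs, but tuples and lists have natural non-object encodings, so deriving ObjectEncoder for them produces the surprising results shown in the REPL session.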

Add non-compilation tests

In #55 @tylerprete points out that generic.auto will find an Encoder instance for Map[String, Map[String, Object]], which is definitely not a good thing.

We should add tests to confirm that circe won't find encoders or decoders for types like Object, Product, etc. (we might need to wait for a fix for this Shapeless bug before we can merge).

Odd incorrect encoding of heterogenous map

With circe 0.1.1, I'm running the following program:

import io.circe._
import io.circe.generic.auto._
import io.circe.jawn._
import io.circe.syntax._

object Main {
  def main(args: Array[String]) {
    val funnyMap1 = Map("a" -> Map("str" -> "b", "map" -> Map("d" -> 1)))
    val weirdJs1 = funnyMap1.asJson.noSpaces 

    println(weirdJs1)
  }
}

With the result:

{"a":{"str":{},"map":{}}}
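The root cause is likely type inference rather than circe's printer: the heterogeneous map literal forces scalac to widen the value type to a least upper bound, and the encoder is then resolved for that widened type. A stdlib-only sketch of the inference behavior (no circe involved):

```scala
object LubDemo {
  // Mixing a String value and a Map value in one literal forces scalac to
  // pick a single value type: the least upper bound of String and
  // Map[String, Int] (java.io.Serializable on the Scala 2.11 of this report).
  val mixed = Map("str" -> "b", "map" -> Map("d" -> 1))

  // The values are intact at runtime; only the static type has been widened,
  // so an Encoder resolved for the widened type can't see the real shapes.
  val strValue = mixed("str")
}
```

Under auto derivation an instance apparently gets materialized for the widened type and emits empty objects; making the map homogeneous, or encoding each branch explicitly, sidesteps the problem.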

Automatic Decoder derivation hangs when used across files

Took me an hour to hunt this down in a work project. The gist of it is that I have a case class for which I automatically derive Decoder instances using io.circe.generic.auto. Within this case class there is a nested field (it's a couple of levels down) that contains a field typed to an option of another case class. This last case class got turned into a sealed trait during a refactoring. At this point I could no longer compile my project: scalac simply ran forever and failed to terminate. "Forever" in this case can be defined as more than 10 minutes for a ~1,200 SLOC project.

The situation is best illustrated with a live example: https://gist.github.com/maxaf/0cf0398d16c219099ae0

Run sbt compile and watch it spin. I've left it for more than 10 minutes, and it never finishes. What's funny is that dumping the contents of model.scala and main.scala into a single file & compiling that works fine. Also, :paste-ing the whole pile of goo into the REPL works as well!

The only conclusion I can draw is that I've probably run into a scalac bug that manifests when circe is trying to automatically derive a Decoder for a type hierarchy housed in another file. I'm posting this here to get a second pair of eyes on the problem & make sure I'm not doing something obviously stupid.

PS: thanks for circe! It's incredibly useful and saves me a ton of time. Except when scalac fails to terminate, of course. :)

Compilation Failure with heterogenous map

As an addition to my other issue, I get the following compilation error.

[error] /Users/tyler/programming/scala/circe-error/src/main/scala/Main.scala:14: could not find implicit value for parameter e: io.circe.Encoder[scala.collection.immutable.Map[String,scala.collection.immutable.Map[String,java.io.Serializable]]]
[error]     val weirdJs2 = funnyMap2.asJson.noSpaces
[error]                              ^
[error] one error found
[error] (compile:compile) Compilation failed
import io.circe._
import io.circe.generic.auto._
import io.circe.jawn._
import io.circe.syntax._

object Main {
  def main(args: Array[String]) {

    val funnyMap2 = Map("a" -> Map("str" -> "b", "map" -> List(Map("d" -> 1))))
    val weirdJs2 = funnyMap2.asJson.noSpaces

    println(weirdJs2)
  }
}

Ugh implicit value classes

If we've got the following set-up:

import io.circe._, io.circe.optics.JsonPath.root, io.circe.syntax._

val json = Map("a" -> Map("b" -> Map("c" -> 1))).asJson
val path = root.a.b.c.as[Int]

And then we try to use the Optional, we'll get this:

scala> path.getOption(json)
res0: Option[Int] = None

scala> path.modify(_ + 1)(json)
res1: io.circe.Json =
{
  "a" : {
    "b" : {
      "c" : 1
    }
  }
}

But…

scala> root.x.y.z.as[Int].getOption(Map("x" -> Map("y" -> Map("z" -> 1))).asJson)
res2: Option[Int] = Some(1)

This made absolutely no sense to me for about ten minutes this morning, until I remembered that the io.circe.syntax.EncoderOps class looks like this:

implicit class EncoderOps[A](val a: A) extends AnyVal {
  def asJson(implicit e: Encoder[A]): Json = e(a)
}

Which means that anything with an Encoder instance gets an "a" method that's just the identity. For some reason I never realized that implicit value classes have this horrible property. This happens in the standard library, too. For example, in a fresh REPL:

scala> 13.self
res0: Int = 13

…thanks to RichInt. I hate implicits.
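The hazard is easy to reproduce with any implicit value class, since the wrapped value must be a public val and therefore leaks as an extension method. A stdlib-only demonstration (ShowOps and self0 are made-up names):

```scala
object ValueClassHazard {
  // The wrapped value of a value class must be a public val,
  // so `self0` becomes an identity-like method on every type.
  implicit class ShowOps[A](val self0: A) extends AnyVal {
    def show: String = self0.toString
  }

  val intended: String = 13.show // the extension method we meant to add
  val leaked: Int = 13.self0     // the accidental identity method
}
```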

At the moment I can imagine three solutions:

  1. Make EncoderOps not a value class.
  2. Give the value member a more obfuscated name that isn't likely to result in collisions.
  3. Introduce a dummy class with a method name that intentionally collides.

I'm leaning toward the first option.

Add refined module

@fthomas's refined allows restricting, at the type level, the values that some fields can take, like

case class SimpleT(
  i: Int @@ Positive,
  s: String @@ NonEmpty,
  l: List[Int] @@ Size[Greater[_3]]
)

preventing the corresponding fields from holding invalid values in the first place.

I added a module with codecs for this in argonaut-shapeless, and it works nicely too.

Just wanted to gather opinions before possibly sending a PR here with a module for that.

Pretty-printing of decoding failure history

From gitter: https://gitter.im/travisbrown/circe?at=56b1d0786b6468374a0b2b7c

Consider the following example:

scala> val json = parse("""[1, 2, "hello", 4, "world", 6]""").getOrElse(Json.empty)
scala> Decoder[List[Int]].apply(json.hcursor)
res5: io.circe.Decoder.Result[List[Int]] = Left(io.circe.DecodingFailure: Int: El(MoveRight,true),El(MoveRight,true),El(DownArray,true))

This part might be hard to parse for a human eye:

io.circe.DecodingFailure: Int: El(MoveRight,true),El(MoveRight,true),El(DownArray,true)

I guess we could improve the error message by reversing the history, making it a bit more visual, and maybe adding the element we failed to parse. Something along these lines:

[↴ array]
  →|→ × Failed to decode hello as Int

I'm not sure about arrow symbols though (on github they are very tiny), so maybe just printing

[down array]
  [right], [right] × Failed to decode hello as Int
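A sketch of the proposed rendering, using a toy operation type in place of circe's actual cursor-history representation (Op, render, and the bracketed labels are all hypothetical):

```scala
object HistoryRendering {
  sealed trait Op
  case object DownArray extends Op
  case object MoveRight extends Op

  // The failure history is stored most-recent-first, so reverse it and
  // print one step per operation, ending with the failure message.
  def render(history: List[Op], message: String): String = {
    val steps = history.reverse.map {
      case DownArray => "[down array]"
      case MoveRight => "[right]"
    }
    (steps :+ s"x $message").mkString("\n  ")
  }
}
```

For the example above, render(List(MoveRight, MoveRight, DownArray), "Failed to decode hello as Int") would print the DownArray step first and end with the failure line, matching the suggested layout.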

Rename toJson to asJson

@travisbrown what do you think about renaming toJson to asJson? I'm a huge fan of symmetric names for methods that do symmetric things. Now you can go from A to Json with toJson, but the opposite direction (Json to A) is as[A].

If it sounds good to you I can put together a PR.

Add codecs for singleton types

It would be nice to have codecs for singleton types (Witness.`"value"`.T and the like), to use e.g. like

case class CC(
  i: Int,
  message: Witness.`"test"`.T
)

allowing some fields to be forced to a given value.

I did that in argonaut-shapeless, and it's working nicely.

Just wanted to gather opinions before possibly sending a PR. The generic module seems the most likely home for this, as it depends on shapeless, although that's not fully satisfying: codecs for singleton types are just extra codecs, independent of automatic derivation.

Add tuple instances to core?

circe-core currently does not provide Encoder and Decoder instances for tuples, which are instead provided by the generic module. The mechanism is a little more complicated than I'd like—we want tuples to be treated differently from all other case classes, so we have to use implicit prioritization tricks to make sure they end up earlier in line.

This could be simplified a little with #30 if we provided tuple instances in core. I think tuple instances in core are well-motivated anyway, but I originally wanted to see how far I could get without any arity-related boilerplate there.

Does this sound reasonable? I don't have a strong opinion about which boilerplate generation system to use—I guess the starting points are sbt-boilerplate, Argonaut's Boilerplate.scala, or Shapeless's.

(I think this question is pretty much orthogonal to #36, but if there's a way to address both at the same time that would be pretty cool.)

Remove cats from core?

@alexarchambault has just submitted a pull request (#28) that removes cats from core and adds a new cats module that includes all of the original type class instances and operations that depended on cats.

This isn't terribly disruptive. Not having a right-biased disjunction is kind of annoying, but in my opinion not annoying enough to justify a dependency on its own. The M versions of withFocus, etc. are nice to have, but aren't used in the core (or in many common use cases). I think I'd be okay with moving all of the cats type class instances to a subproject.

I like the idea of a dependency-free core that just provides the encoding and decoding type classes, an Argonaut-flavored AST, and the zippers, and I'm tempted to go this route (especially now that @alexarchambault has done all the hard work). Is anyone strongly opposed?

New module for scala.js specific helpers

As discussed on Gitter:

We can definitely add a JS-specific module that would provide convertJson, etc.—it's just not been a priority for me yet

What should the new module be called?

My picks: sjs or sjs-helpers (just off the top of my head).

Option to remove quotes for keys in printer

Currently the printer always quotes object keys:

case class Sample(duration: Int)
Sample(123).asJson.noSpaces // {"duration":123}

// with the proposed option the printer could print {duration:123}

Use cases: some formats (for example GraphQL) don't use quoted keys.

Fix:
Looks like an easy one:
https://github.com/travisbrown/circe/blob/master/core/shared/src/main/scala/io/circe/Printer.scala#L166

if (keyWithQuotes) encloseJsonString(key) else appendJsonString(key)

// a new val in Printer
val noSpacesAndKeysWithoutQuotes = Printer(
  preserveOrder = true,
  dropNullKeys = false,
  indent = "",
  keyWithQuotes = false
)

Please review the current proposal and let me know your feedback; I can send a PR. :)

Compile error when encoding a case class

I can't find the cause of the bug, so I'm providing everything I know.

[enuma] $ console
[info] Starting scala interpreter...
[info] 
import scalaz._
import Scalaz._
import io.circe._
import io.circe.generic.auto._
import io.circe.jawn._
import io.circe.syntax._
Welcome to Scala version 2.11.7 (OpenJDK 64-Bit Server VM, Java 1.8.0_51).
Type in expressions to have them evaluated.
Type :help for more information.

scala> :paste
// Entering paste mode (ctrl-D to finish)

case class Aaaaa(
  id: Option[Long],
  trade111: Option[Long],
  trade222: Option[Long],
  trade333: Option[Long],
  trade444: Option[Long],
  enterprise: Option[Long]
)
case class Ccccc(name: String)
case class Bbbbb(energy: Aaaaa, street1: Ccccc, street2: Ccccc, street3: Ccccc)

//implicit val aabbcc = Encoder[Aaaaa]
val kk: Bbbbb = ???
kk.asJson

// Exiting paste mode, now interpreting.
//Tips: Here complete failed
<console>:43: error: diverging implicit expansion for type io.circe.Encoder[Bbbbb]
starting with method encodeCaseClass in trait GenericInstances
       kk.asJson
          ^

scala> :paste
// Entering paste mode (ctrl-D to finish)

case class Aaaaa(
  id: Option[Long],
  trade111: Option[Long],
  trade222: Option[Long],
  trade333: Option[Long],
  trade444: Option[Long],
  enterprise: Option[Long]
)
case class Ccccc(name: String)
case class Bbbbb(energy: Aaaaa, street1: Ccccc, street2: Ccccc, street3: Ccccc)

implicit val aabbcc = Encoder[Aaaaa]
val kk: Bbbbb = ???
kk.asJson

// Exiting paste mode, now interpreting.
//Tips: Here complete successfully
scala.NotImplementedError: an implementation is missing
  at scala.Predef$.$qmark$qmark$qmark(Predef.scala:225)
  ... 47 elided

scala> :paste
// Entering paste mode (ctrl-D to finish)

case class Aaaaa(
  id: Option[Long],
  trade111: Option[Long],
  enterprise: Option[Long]
)
case class Ccccc(name: String)
case class Bbbbb(energy: Aaaaa, street1: Ccccc, street2: Ccccc, street3: Ccccc)

//implicit val aabbcc = Encoder[Aaaaa]
val kk: Bbbbb = ???
kk.asJson

// Exiting paste mode, now interpreting.
//Tips: Here complete successfully
scala.NotImplementedError: an implementation is missing
  at scala.Predef$.$qmark$qmark$qmark(Predef.scala:225)
  ... 47 elided

scala> :paste
// Entering paste mode (ctrl-D to finish)

case class Aaaaa(
  id: Option[Long],
  trade111: Option[Long],
  enterprise: Option[Long]
)
case class Ccccc(name: String)
case class Bbbbb(energy: Aaaaa, street1: Ccccc)

//implicit val aabbcc = Encoder[Aaaaa]
val kk: Bbbbb = ???
kk.asJson

// Exiting paste mode, now interpreting.
//Tips: Here complete failed
<console>:40: error: diverging implicit expansion for type io.circe.Encoder[Bbbbb]
starting with method encodeCaseClass in trait GenericInstances
       kk.asJson
          ^

scala> :paste
// Entering paste mode (ctrl-D to finish)

case class Aaaaa(
  id: Option[Long],
  trade111: Option[Long],
  enterprise: Option[Long]
)
case class Ccccc(name: String)
case class Bbbbb(energy: Aaaaa, street1: Ccccc)

implicit val aabbcc = Encoder[Aaaaa]
val kk: Bbbbb = ???
kk.asJson

// Exiting paste mode, now interpreting.
//Tips: Here complete successfully
scala.NotImplementedError: an implementation is missing
  at scala.Predef$.$qmark$qmark$qmark(Predef.scala:225)
  ... 47 elided

scala> 

It seems that I sometimes have to provide implicit val aabbcc = Encoder[Aaaaa]: compilation always succeeds when I provide it, and sometimes fails when I just write kk.asJson without the implicit.
Any fix or suggestion?

Timing comparison with Play JSON

I ran a few tests to compare the performance of Play JSON and circe.
At first I was shocked by the huge difference between them: 8ms for circe and just 0.1ms for Play.
(These tests were run in a Scala console, which may have an effect.)

Here is a gist of the test.
You can copy-paste it into a console (with the right dependency) and run Test.run.

On the first run Play has a huge advantage, but the second run is less clear-cut, because circe wins.

Every time I paste this code into the console I see these results:

val v = IP(Some("1"),"2")

for(i <- 1 to 3) { Test.time(v.asJson) }
 elapsed time :8.349507ms
 elapsed time :0.037627ms
 elapsed time :0.034833ms

but with Play I get this:

for (i <- 1 to 3 ) { Test.time(play.api.libs.json.Json.toJson(v)) }
 elapsed time :0.104637ms
 elapsed time :0.073606ms
 elapsed time :0.071487ms

Because there is one long call every time the for loop is invoked, I wonder if there is a performance issue.

PS: the time becomes significant if you have a lot of fields:

scala> case class Loonng(un: Int, deux: Int, trois : Int, quatre : Int, cinq: Int, six: Int, sept: Int, huit: Int, neuf: Int, dix :Int, onze : Int, douze: Int, triez :Int, quatroze: Int, quinze: Int, saize : Int, dixsept: Int, dixhuit: Int, dixneuf: Int, vinght: Int, vinghtetun : Int, vinghtdeux: Int, vinghttrois: Int)
defined class Loonng

scala> Loonng(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23)
res11: Loonng = Loonng(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23)

scala> for(i <- 1 to 3) { Test.time(res11.asJson) }
 elapsed time :57.142316ms
 elapsed time :0.311121ms
 elapsed time :0.285936ms
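The Test.time helper lives in the linked gist and isn't reproduced in the issue; a minimal stdlib-only sketch of what such a helper might look like:

```scala
object Test {
  // Times a single evaluation of `body` and prints the elapsed wall time,
  // in the same format as the output above.
  def time[A](body: => A): A = {
    val start = System.nanoTime()
    val result = body
    val elapsedMs = (System.nanoTime() - start) / 1e6
    println(f" elapsed time :$elapsedMs%.6fms")
    result
  }
}
```

The pattern of a slow first call followed by fast ones is consistent with one-time costs (JIT warm-up and materialization of the derived encoder) rather than steady-state throughput, which is why the first measurement dominates.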

Stringified object representations don't preserve field order

Currently, stringified io.circe.Json doesn't appear to preserve field order. This doesn't matter for machines but it would greatly help human readability.

scala> import io.circe._, io.circe.generic.auto._, io.circe.jawn._, io.circe.syntax._
import io.circe._
import io.circe.generic.auto._
import io.circe.jawn._
import io.circe.syntax._

scala> case class MyClass(a: Int, b: String, c: List[String], d: Boolean)
defined class MyClass

scala> val m = MyClass(5, "asdf", List("zxcv", "qwer", "hjkl"), false)
m: MyClass = MyClass(5,asdf,List(zxcv, qwer, hjkl),false)

scala> m.asJson
res0: io.circe.Json =
{
  "d" : false,
  "c" : [
    "zxcv",
    "qwer",
    "hjkl"
  ],
  "b" : "asdf",
  "a" : 5
}

Codecs for maps with non-string keys

@koshelev asked today on gitter about Decoder instances for a Map with non-string keys.

It'd be nice to have a clear story for these cases. This new Argonaut pull request by @xuwei-k is one solution: you have to provide a mapping from keys to strings, and then you get the instance for free. Alternatively uPickle provides Map[NonString, Bar] instances that serialize to JSON arrays of pairs.

I'm leaning toward not providing instances for maps with non-string keys by default (for the reason given here), but making it easy for users to define uPickle-style instances if they want them.
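The "provide a mapping from keys to strings" approach can be sketched in plain Scala (KeyMapper and these helpers are illustrative names, not circe API; circe eventually exposed this idea as KeyEncoder/KeyDecoder):

```scala
import scala.util.Try

object KeyMappingSketch {
  // A total mapping from keys to strings and a partial mapping back.
  trait KeyMapper[K] {
    def encode(k: K): String
    def decode(s: String): Option[K]
  }

  implicit val intKeys: KeyMapper[Int] = new KeyMapper[Int] {
    def encode(k: Int): String = k.toString
    def decode(s: String): Option[Int] = Try(s.toInt).toOption
  }

  // With a mapper in scope, Map[K, V] round-trips through Map[String, V],
  // which already has a natural JSON object representation.
  def encodeKeys[K, V](m: Map[K, V])(implicit km: KeyMapper[K]): Map[String, V] =
    m.map { case (k, v) => km.encode(k) -> v }

  // Fails as a whole if any single key fails to decode.
  def decodeKeys[K, V](m: Map[String, V])(implicit km: KeyMapper[K]): Option[Map[K, V]] = {
    val decoded = m.toList.map { case (k, v) => km.decode(k).map(_ -> v) }
    if (decoded.forall(_.isDefined)) Some(decoded.flatten.toMap) else None
  }
}
```

Whether non-string-key instances should exist by default is a separate question; this just shows that opting in is cheap once a key mapping exists.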

Circe does not support Scrooge-generated Scala classes

Currently Circe (with the latest shapeless snapshot) doesn't support serialization and deserialization of Twitter's Scrooge-generated Scala classes, as those classes are neither sealed traits nor case classes.

Take the following thrift code as an example:

namespace java demo
#@namespace scala demo

enum State {
  A = 1
  B = 2
}

struct StringOpt {
  1: optional string str
}


struct IntOpt {
  1: optional i32 int
}


union Opt {
  1: StringOpt strOpt
  2: IntOpt intOpt
}


struct Foo {
  1: list<string> list
  2: map<State, string> stateMap
  3: map<i32, Opt> optMap
}

Scrooge will generate these classes:

/**
 * Generated by Scrooge
 *   version: 3.14.1
 *   rev: a996c1128a032845c508102d62e65fc0aa7a5f41
 *   built at: 20140501-114733
 */
package demo

import com.twitter.scrooge.ThriftEnum


@javax.annotation.Generated(value = Array("com.twitter.scrooge.Compiler"))
case object State {

  case object A extends demo.State {
    val value = 1
    val name = "A"
  }

  case object B extends demo.State {
    val value = 2
    val name = "B"
  }

  /**
   * Find the enum by its integer value, as defined in the Thrift IDL.
   * @throws NoSuchElementException if the value is not found.
   */
  def apply(value: Int): demo.State = {
    value match {
      case 1 => demo.State.A
      case 2 => demo.State.B
      case _ => throw new NoSuchElementException(value.toString)
    }
  }

  /**
   * Find the enum by its integer value, as defined in the Thrift IDL.
   * Returns None if the value is not found
   */
  def get(value: Int): Option[demo.State] = {
    value match {
      case 1 => scala.Some(demo.State.A)
      case 2 => scala.Some(demo.State.B)
      case _ => scala.None
    }
  }

  def valueOf(name: String): Option[demo.State] = {
    name.toLowerCase match {
      case "a" => scala.Some(demo.State.A)
      case "b" => scala.Some(demo.State.B)
      case _ => scala.None
    }
  }

  lazy val list: List[demo.State] = scala.List[demo.State](
    demo.State.A,
    demo.State.B
  )
}



@javax.annotation.Generated(value = Array("com.twitter.scrooge.Compiler"))
sealed trait State extends ThriftEnum with Serializable
/**
 * Generated by Scrooge
 *   version: 3.14.1
 *   rev: a996c1128a032845c508102d62e65fc0aa7a5f41
 *   built at: 20140501-114733
 */
package demo

import com.twitter.scrooge.{
  TFieldBlob, ThriftException, ThriftStruct, ThriftStructCodec3, ThriftStructFieldInfo, ThriftUtil}
import org.apache.thrift.protocol._
import org.apache.thrift.transport.{TMemoryBuffer, TTransport}
import java.nio.ByteBuffer
import java.util.Arrays
import scala.collection.immutable.{Map => immutable$Map}
import scala.collection.mutable.Builder
import scala.collection.mutable.{
  ArrayBuffer => mutable$ArrayBuffer, Buffer => mutable$Buffer,
  HashMap => mutable$HashMap, HashSet => mutable$HashSet}
import scala.collection.{Map, Set}


object StringOpt extends ThriftStructCodec3[StringOpt] {
  private val NoPassthroughFields = immutable$Map.empty[Short, TFieldBlob]
  val Struct = new TStruct("StringOpt")
  val StrField = new TField("str", TType.STRING, 1)
  val StrFieldManifest = implicitly[Manifest[String]]

  /**
   * Field information in declaration order.
   */
  lazy val fieldInfos: scala.List[ThriftStructFieldInfo] = scala.List[ThriftStructFieldInfo](
    new ThriftStructFieldInfo(
      StrField,
      true,
      StrFieldManifest,
      None,
      None,
      immutable$Map(
      ),
      immutable$Map(
      )
    )
  )

  lazy val structAnnotations: immutable$Map[String, String] =
    immutable$Map[String, String](
    )

  /**
   * Checks that all required fields are non-null.
   */
  def validate(_item: StringOpt) {
  }

  override def encode(_item: StringOpt, _oproto: TProtocol) {
    _item.write(_oproto)
  }

  override def decode(_iprot: TProtocol): StringOpt = {
    var str: Option[String] = None
    var _passthroughFields: Builder[(Short, TFieldBlob), immutable$Map[Short, TFieldBlob]] = null
    var _done = false

    _iprot.readStructBegin()
    while (!_done) {
      val _field = _iprot.readFieldBegin()
      if (_field.`type` == TType.STOP) {
        _done = true
      } else {
        _field.id match {
          case 1 =>
            _field.`type` match {
              case TType.STRING => {
                str = Some(readStrValue(_iprot))
              }
              case _actualType =>
                val _expectedType = TType.STRING

                throw new TProtocolException(
                  "Received wrong type for field 'str' (expected=%s, actual=%s).".format(
                    ttypeToHuman(_expectedType),
                    ttypeToHuman(_actualType)
                  )
                )
            }
          case _ =>
            if (_passthroughFields == null)
              _passthroughFields = immutable$Map.newBuilder[Short, TFieldBlob]
            _passthroughFields += (_field.id -> TFieldBlob.read(_field, _iprot))
        }
        _iprot.readFieldEnd()
      }
    }
    _iprot.readStructEnd()

    new Immutable(
      str,
      if (_passthroughFields == null)
        NoPassthroughFields
      else
        _passthroughFields.result()
    )
  }

  def apply(
    str: Option[String] = None
  ): StringOpt =
    new Immutable(
      str
    )

  def unapply(_item: StringOpt): Option[Option[String]] = Some(_item.str)


  private def readStrValue(_iprot: TProtocol): String = {
    _iprot.readString()
  }

  private def writeStrField(str_item: String, _oprot: TProtocol) {
    _oprot.writeFieldBegin(StrField)
    writeStrValue(str_item, _oprot)
    _oprot.writeFieldEnd()
  }

  private def writeStrValue(str_item: String, _oprot: TProtocol) {
    _oprot.writeString(str_item)
  }



  private def ttypeToHuman(byte: Byte) = {
    // from https://github.com/apache/thrift/blob/master/lib/java/src/org/apache/thrift/protocol/TType.java
    byte match {
      case TType.STOP   => "STOP"
      case TType.VOID   => "VOID"
      case TType.BOOL   => "BOOL"
      case TType.BYTE   => "BYTE"
      case TType.DOUBLE => "DOUBLE"
      case TType.I16    => "I16"
      case TType.I32    => "I32"
      case TType.I64    => "I64"
      case TType.STRING => "STRING"
      case TType.STRUCT => "STRUCT"
      case TType.MAP    => "MAP"
      case TType.SET    => "SET"
      case TType.LIST   => "LIST"
      case TType.ENUM   => "ENUM"
      case _            => "UNKNOWN"
    }
  }

  object Immutable extends ThriftStructCodec3[StringOpt] {
    override def encode(_item: StringOpt, _oproto: TProtocol) { _item.write(_oproto) }
    override def decode(_iprot: TProtocol): StringOpt = StringOpt.decode(_iprot)
  }

  /**
   * The default read-only implementation of StringOpt.  You typically should not need to
   * directly reference this class; instead, use the StringOpt.apply method to construct
   * new instances.
   */
  class Immutable(
    val str: Option[String],
    override val _passthroughFields: immutable$Map[Short, TFieldBlob]
  ) extends StringOpt {
    def this(
      str: Option[String] = None
    ) = this(
      str,
      Map.empty
    )
  }

  /**
   * This Proxy trait allows you to extend the StringOpt trait with additional state or
   * behavior and implement the read-only methods from StringOpt using an underlying
   * instance.
   */
  trait Proxy extends StringOpt {
    protected def _underlying_StringOpt: StringOpt
    override def str: Option[String] = _underlying_StringOpt.str
    override def _passthroughFields = _underlying_StringOpt._passthroughFields
  }
}

trait StringOpt
  extends ThriftStruct
  with scala.Product1[Option[String]]
  with java.io.Serializable
{
  import StringOpt._

  def str: Option[String]

  def _passthroughFields: immutable$Map[Short, TFieldBlob] = immutable$Map.empty

  def _1 = str

  /**
   * Gets a field value encoded as a binary blob using TCompactProtocol.  If the specified field
   * is present in the passthrough map, that value is returned.  Otherwise, if the specified field
   * is known and not optional and set to None, then the field is serialized and returned.
   */
  def getFieldBlob(_fieldId: Short): Option[TFieldBlob] = {
    lazy val _buff = new TMemoryBuffer(32)
    lazy val _oprot = new TCompactProtocol(_buff)
    _passthroughFields.get(_fieldId) orElse {
      val _fieldOpt: Option[TField] =
        _fieldId match {
          case 1 =>
            if (str.isDefined) {
              writeStrValue(str.get, _oprot)
              Some(StringOpt.StrField)
            } else {
              None
            }
          case _ => None
        }
      _fieldOpt match {
        case Some(_field) =>
          val _data = Arrays.copyOfRange(_buff.getArray, 0, _buff.length)
          Some(TFieldBlob(_field, _data))
        case None =>
          None
      }
    }
  }

  /**
   * Collects TCompactProtocol-encoded field values according to `getFieldBlob` into a map.
   */
  def getFieldBlobs(ids: TraversableOnce[Short]): immutable$Map[Short, TFieldBlob] =
    (ids flatMap { id => getFieldBlob(id) map { id -> _ } }).toMap

  /**
   * Sets a field using a TCompactProtocol-encoded binary blob.  If the field is a known
   * field, the blob is decoded and the field is set to the decoded value.  If the field
   * is unknown and passthrough fields are enabled, then the blob will be stored in
   * _passthroughFields.
   */
  def setField(_blob: TFieldBlob): StringOpt = {
    var str: Option[String] = this.str
    var _passthroughFields = this._passthroughFields
    _blob.id match {
      case 1 =>
        str = Some(readStrValue(_blob.read))
      case _ => _passthroughFields += (_blob.id -> _blob)
    }
    new Immutable(
      str,
      _passthroughFields
    )
  }

  /**
   * If the specified field is optional, it is set to None.  Otherwise, if the field is
   * known, it is reverted to its default value; if the field is unknown, it is subtracted
   * from the passthroughFields map, if present.
   */
  def unsetField(_fieldId: Short): StringOpt = {
    var str: Option[String] = this.str

    _fieldId match {
      case 1 =>
        str = None
      case _ =>
    }
    new Immutable(
      str,
      _passthroughFields - _fieldId
    )
  }

  /**
   * If the specified field is optional, it is set to None.  Otherwise, if the field is
   * known, it is reverted to its default value; if the field is unknown, it is subtracted
   * from the passthroughFields map, if present.
   */
  def unsetStr: StringOpt = unsetField(1)


  override def write(_oprot: TProtocol) {
    StringOpt.validate(this)
    _oprot.writeStructBegin(Struct)
    if (str.isDefined) writeStrField(str.get, _oprot)
    _passthroughFields.values foreach { _.write(_oprot) }
    _oprot.writeFieldStop()
    _oprot.writeStructEnd()
  }

  def copy(
    str: Option[String] = this.str,
    _passthroughFields: immutable$Map[Short, TFieldBlob] = this._passthroughFields
  ): StringOpt =
    new Immutable(
      str,
      _passthroughFields
    )

  override def canEqual(other: Any): Boolean = other.isInstanceOf[StringOpt]

  override def equals(other: Any): Boolean =
    _root_.scala.runtime.ScalaRunTime._equals(this, other) &&
      _passthroughFields == other.asInstanceOf[StringOpt]._passthroughFields

  override def hashCode: Int = _root_.scala.runtime.ScalaRunTime._hashCode(this)

  override def toString: String = _root_.scala.runtime.ScalaRunTime._toString(this)


  override def productArity: Int = 1

  override def productElement(n: Int): Any = n match {
    case 0 => this.str
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }

  override def productPrefix: String = "StringOpt"
}
/**
 * Generated by Scrooge
 *   version: 3.14.1
 *   rev: a996c1128a032845c508102d62e65fc0aa7a5f41
 *   built at: 20140501-114733
 */
package demo

import com.twitter.scrooge.{
  TFieldBlob, ThriftException, ThriftStruct, ThriftStructCodec3, ThriftStructFieldInfo, ThriftUtil}
import org.apache.thrift.protocol._
import org.apache.thrift.transport.{TMemoryBuffer, TTransport}
import java.nio.ByteBuffer
import java.util.Arrays
import scala.collection.immutable.{Map => immutable$Map}
import scala.collection.mutable.Builder
import scala.collection.mutable.{
  ArrayBuffer => mutable$ArrayBuffer, Buffer => mutable$Buffer,
  HashMap => mutable$HashMap, HashSet => mutable$HashSet}
import scala.collection.{Map, Set}


object IntOpt extends ThriftStructCodec3[IntOpt] {
  private val NoPassthroughFields = immutable$Map.empty[Short, TFieldBlob]
  val Struct = new TStruct("IntOpt")
  val IntField = new TField("int", TType.I32, 1)
  val IntFieldManifest = implicitly[Manifest[Int]]

  /**
   * Field information in declaration order.
   */
  lazy val fieldInfos: scala.List[ThriftStructFieldInfo] = scala.List[ThriftStructFieldInfo](
    new ThriftStructFieldInfo(
      IntField,
      true,
      IntFieldManifest,
      None,
      None,
      immutable$Map(
      ),
      immutable$Map(
      )
    )
  )

  lazy val structAnnotations: immutable$Map[String, String] =
    immutable$Map[String, String](
    )

  /**
   * Checks that all required fields are non-null.
   */
  def validate(_item: IntOpt) {
  }

  override def encode(_item: IntOpt, _oproto: TProtocol) {
    _item.write(_oproto)
  }

  override def decode(_iprot: TProtocol): IntOpt = {
    var int: Option[Int] = None
    var _passthroughFields: Builder[(Short, TFieldBlob), immutable$Map[Short, TFieldBlob]] = null
    var _done = false

    _iprot.readStructBegin()
    while (!_done) {
      val _field = _iprot.readFieldBegin()
      if (_field.`type` == TType.STOP) {
        _done = true
      } else {
        _field.id match {
          case 1 =>
            _field.`type` match {
              case TType.I32 => {
                int = Some(readIntValue(_iprot))
              }
              case _actualType =>
                val _expectedType = TType.I32

                throw new TProtocolException(
                  "Received wrong type for field 'int' (expected=%s, actual=%s).".format(
                    ttypeToHuman(_expectedType),
                    ttypeToHuman(_actualType)
                  )
                )
            }
          case _ =>
            if (_passthroughFields == null)
              _passthroughFields = immutable$Map.newBuilder[Short, TFieldBlob]
            _passthroughFields += (_field.id -> TFieldBlob.read(_field, _iprot))
        }
        _iprot.readFieldEnd()
      }
    }
    _iprot.readStructEnd()

    new Immutable(
      int,
      if (_passthroughFields == null)
        NoPassthroughFields
      else
        _passthroughFields.result()
    )
  }

  def apply(
    int: Option[Int] = None
  ): IntOpt =
    new Immutable(
      int
    )

  def unapply(_item: IntOpt): Option[Option[Int]] = Some(_item.int)


  private def readIntValue(_iprot: TProtocol): Int = {
    _iprot.readI32()
  }

  private def writeIntField(int_item: Int, _oprot: TProtocol) {
    _oprot.writeFieldBegin(IntField)
    writeIntValue(int_item, _oprot)
    _oprot.writeFieldEnd()
  }

  private def writeIntValue(int_item: Int, _oprot: TProtocol) {
    _oprot.writeI32(int_item)
  }



  private def ttypeToHuman(byte: Byte) = {
    // from https://github.com/apache/thrift/blob/master/lib/java/src/org/apache/thrift/protocol/TType.java
    byte match {
      case TType.STOP   => "STOP"
      case TType.VOID   => "VOID"
      case TType.BOOL   => "BOOL"
      case TType.BYTE   => "BYTE"
      case TType.DOUBLE => "DOUBLE"
      case TType.I16    => "I16"
      case TType.I32    => "I32"
      case TType.I64    => "I64"
      case TType.STRING => "STRING"
      case TType.STRUCT => "STRUCT"
      case TType.MAP    => "MAP"
      case TType.SET    => "SET"
      case TType.LIST   => "LIST"
      case TType.ENUM   => "ENUM"
      case _            => "UNKNOWN"
    }
  }

  object Immutable extends ThriftStructCodec3[IntOpt] {
    override def encode(_item: IntOpt, _oproto: TProtocol) { _item.write(_oproto) }
    override def decode(_iprot: TProtocol): IntOpt = IntOpt.decode(_iprot)
  }

  /**
   * The default read-only implementation of IntOpt.  You typically should not need to
   * directly reference this class; instead, use the IntOpt.apply method to construct
   * new instances.
   */
  class Immutable(
    val int: Option[Int],
    override val _passthroughFields: immutable$Map[Short, TFieldBlob]
  ) extends IntOpt {
    def this(
      int: Option[Int] = None
    ) = this(
      int,
      Map.empty
    )
  }

  /**
   * This Proxy trait allows you to extend the IntOpt trait with additional state or
   * behavior and implement the read-only methods from IntOpt using an underlying
   * instance.
   */
  trait Proxy extends IntOpt {
    protected def _underlying_IntOpt: IntOpt
    override def int: Option[Int] = _underlying_IntOpt.int
    override def _passthroughFields = _underlying_IntOpt._passthroughFields
  }
}

trait IntOpt
  extends ThriftStruct
  with scala.Product1[Option[Int]]
  with java.io.Serializable
{
  import IntOpt._

  def int: Option[Int]

  def _passthroughFields: immutable$Map[Short, TFieldBlob] = immutable$Map.empty

  def _1 = int

  /**
   * Gets a field value encoded as a binary blob using TCompactProtocol.  If the specified field
   * is present in the passthrough map, that value is returned.  Otherwise, if the specified field
   * is known and not optional and set to None, then the field is serialized and returned.
   */
  def getFieldBlob(_fieldId: Short): Option[TFieldBlob] = {
    lazy val _buff = new TMemoryBuffer(32)
    lazy val _oprot = new TCompactProtocol(_buff)
    _passthroughFields.get(_fieldId) orElse {
      val _fieldOpt: Option[TField] =
        _fieldId match {
          case 1 =>
            if (int.isDefined) {
              writeIntValue(int.get, _oprot)
              Some(IntOpt.IntField)
            } else {
              None
            }
          case _ => None
        }
      _fieldOpt match {
        case Some(_field) =>
          val _data = Arrays.copyOfRange(_buff.getArray, 0, _buff.length)
          Some(TFieldBlob(_field, _data))
        case None =>
          None
      }
    }
  }

  /**
   * Collects TCompactProtocol-encoded field values according to `getFieldBlob` into a map.
   */
  def getFieldBlobs(ids: TraversableOnce[Short]): immutable$Map[Short, TFieldBlob] =
    (ids flatMap { id => getFieldBlob(id) map { id -> _ } }).toMap

  /**
   * Sets a field using a TCompactProtocol-encoded binary blob.  If the field is a known
   * field, the blob is decoded and the field is set to the decoded value.  If the field
   * is unknown and passthrough fields are enabled, then the blob will be stored in
   * _passthroughFields.
   */
  def setField(_blob: TFieldBlob): IntOpt = {
    var int: Option[Int] = this.int
    var _passthroughFields = this._passthroughFields
    _blob.id match {
      case 1 =>
        int = Some(readIntValue(_blob.read))
      case _ => _passthroughFields += (_blob.id -> _blob)
    }
    new Immutable(
      int,
      _passthroughFields
    )
  }

  /**
   * If the specified field is optional, it is set to None.  Otherwise, if the field is
   * known, it is reverted to its default value; if the field is unknown, it is subtracted
   * from the passthroughFields map, if present.
   */
  def unsetField(_fieldId: Short): IntOpt = {
    var int: Option[Int] = this.int

    _fieldId match {
      case 1 =>
        int = None
      case _ =>
    }
    new Immutable(
      int,
      _passthroughFields - _fieldId
    )
  }

  /**
   * If the specified field is optional, it is set to None.  Otherwise, if the field is
   * known, it is reverted to its default value; if the field is unknown, it is subtracted
   * from the passthroughFields map, if present.
   */
  def unsetInt: IntOpt = unsetField(1)


  override def write(_oprot: TProtocol) {
    IntOpt.validate(this)
    _oprot.writeStructBegin(Struct)
    if (int.isDefined) writeIntField(int.get, _oprot)
    _passthroughFields.values foreach { _.write(_oprot) }
    _oprot.writeFieldStop()
    _oprot.writeStructEnd()
  }

  def copy(
    int: Option[Int] = this.int,
    _passthroughFields: immutable$Map[Short, TFieldBlob] = this._passthroughFields
  ): IntOpt =
    new Immutable(
      int,
      _passthroughFields
    )

  override def canEqual(other: Any): Boolean = other.isInstanceOf[IntOpt]

  override def equals(other: Any): Boolean =
    _root_.scala.runtime.ScalaRunTime._equals(this, other) &&
      _passthroughFields == other.asInstanceOf[IntOpt]._passthroughFields

  override def hashCode: Int = _root_.scala.runtime.ScalaRunTime._hashCode(this)

  override def toString: String = _root_.scala.runtime.ScalaRunTime._toString(this)


  override def productArity: Int = 1

  override def productElement(n: Int): Any = n match {
    case 0 => this.int
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }

  override def productPrefix: String = "IntOpt"
}
/**
 * Generated by Scrooge
 *   version: 3.14.1
 *   rev: a996c1128a032845c508102d62e65fc0aa7a5f41
 *   built at: 20140501-114733
 */
package demo

import com.twitter.scrooge.{ThriftStruct, ThriftStructCodec3, TFieldBlob}
import org.apache.thrift.protocol._
import java.nio.ByteBuffer
import java.util.Arrays
import scala.collection.mutable.{
  ArrayBuffer => mutable$ArrayBuffer, Buffer => mutable$Buffer,
  HashMap => mutable$HashMap, HashSet => mutable$HashSet}
import scala.collection.{Map, Set}

@javax.annotation.Generated(value = Array("com.twitter.scrooge.Compiler"))
sealed trait Opt extends ThriftStruct

private object OptDecoder {
  def apply(_iprot: TProtocol, newUnknown: TFieldBlob => Opt): Opt = {
    var _result: Opt = null
    _iprot.readStructBegin()
    val _field = _iprot.readFieldBegin()
    _field.id match {
      case 1 => { /* strOpt */
        _field.`type` match {
          case TType.STRUCT => {
            _result = Opt.StrOpt({
              demo.StringOpt.decode(_iprot)
            })
          }
          case _ => TProtocolUtil.skip(_iprot, _field.`type`)
        }
      }
      case 2 => { /* intOpt */
        _field.`type` match {
          case TType.STRUCT => {
            _result = Opt.IntOpt({
              demo.IntOpt.decode(_iprot)
            })
          }
          case _ => TProtocolUtil.skip(_iprot, _field.`type`)
        }
      }
      case _ =>
        if (_field.`type` != TType.STOP) {
          _result = newUnknown(TFieldBlob.read(_field, _iprot))
        } else {
          TProtocolUtil.skip(_iprot, _field.`type`)
        }
    }
    if (_field.`type` != TType.STOP) {
      _iprot.readFieldEnd()
      var _done = false
      var _moreThanOne = false
      while (!_done) {
        val _field = _iprot.readFieldBegin()
        if (_field.`type` == TType.STOP)
          _done = true
        else {
          _moreThanOne = true
          TProtocolUtil.skip(_iprot, _field.`type`)
          _iprot.readFieldEnd()
        }
      }
      if (_moreThanOne) {
        _iprot.readStructEnd()
        throw new TProtocolException("Cannot read a TUnion with more than one set value!")
      }
    }
    _iprot.readStructEnd()
    if (_result == null)
      throw new TProtocolException("Cannot read a TUnion with no set value!")
    _result
  }
}

object OptAliases {
  type StrOptAlias = demo.StringOpt

  type IntOptAlias = demo.IntOpt

}


@javax.annotation.Generated(value = Array("com.twitter.scrooge.Compiler"))
object Opt extends ThriftStructCodec3[Opt] {
  val Union = new TStruct("Opt")
  val StrOptField = new TField("strOpt", TType.STRUCT, 1)
  val IntOptField = new TField("intOpt", TType.STRUCT, 2)

  override def encode(_item: Opt, _oprot: TProtocol) { _item.write(_oprot) }
  override def decode(_iprot: TProtocol): Opt = OptDecoder(_iprot, UnknownUnionField(_))

  def apply(_iprot: TProtocol): Opt = decode(_iprot)

  import OptAliases._

  case class StrOpt(strOpt: StrOptAlias) extends Opt {
    override def write(_oprot: TProtocol) {
      if (strOpt == null)
        throw new TProtocolException("Cannot write a TUnion with no set value!")
      _oprot.writeStructBegin(Union)
      if (strOpt ne null) {
        val strOpt_item = strOpt
        _oprot.writeFieldBegin(StrOptField)
        strOpt_item.write(_oprot)
        _oprot.writeFieldEnd()
      }
      _oprot.writeFieldStop()
      _oprot.writeStructEnd()
    }
  }
  case class IntOpt(intOpt: IntOptAlias) extends Opt {
    override def write(_oprot: TProtocol) {
      if (intOpt == null)
        throw new TProtocolException("Cannot write a TUnion with no set value!")
      _oprot.writeStructBegin(Union)
      if (intOpt ne null) {
        val intOpt_item = intOpt
        _oprot.writeFieldBegin(IntOptField)
        intOpt_item.write(_oprot)
        _oprot.writeFieldEnd()
      }
      _oprot.writeFieldStop()
      _oprot.writeStructEnd()
    }
  }

  case class UnknownUnionField private[Opt](private val field: TFieldBlob) extends Opt {
    override def write(_oprot: TProtocol) {
      _oprot.writeStructBegin(Union)
      field.write(_oprot)
      _oprot.writeFieldStop()
      _oprot.writeStructEnd()
    }
  }
}
/**
 * Generated by Scrooge
 *   version: 3.14.1
 *   rev: a996c1128a032845c508102d62e65fc0aa7a5f41
 *   built at: 20140501-114733
 */
package demo

import com.twitter.scrooge.{
  TFieldBlob, ThriftException, ThriftStruct, ThriftStructCodec3, ThriftStructFieldInfo, ThriftUtil}
import org.apache.thrift.protocol._
import org.apache.thrift.transport.{TMemoryBuffer, TTransport}
import java.nio.ByteBuffer
import java.util.Arrays
import scala.collection.immutable.{Map => immutable$Map}
import scala.collection.mutable.Builder
import scala.collection.mutable.{
  ArrayBuffer => mutable$ArrayBuffer, Buffer => mutable$Buffer,
  HashMap => mutable$HashMap, HashSet => mutable$HashSet}
import scala.collection.{Map, Set}


object Foo extends ThriftStructCodec3[Foo] {
  private val NoPassthroughFields = immutable$Map.empty[Short, TFieldBlob]
  val Struct = new TStruct("Foo")
  val ListField = new TField("list", TType.LIST, 1)
  val ListFieldManifest = implicitly[Manifest[Seq[String]]]
  val StateMapField = new TField("stateMap", TType.MAP, 2)
  val StateMapFieldManifest = implicitly[Manifest[Map[State, String]]]
  val OptMapField = new TField("optMap", TType.MAP, 3)
  val OptMapFieldManifest = implicitly[Manifest[Map[Int, Opt]]]

  /**
   * Field information in declaration order.
   */
  lazy val fieldInfos: scala.List[ThriftStructFieldInfo] = scala.List[ThriftStructFieldInfo](
    new ThriftStructFieldInfo(
      ListField,
      false,
      ListFieldManifest,
      None,
      Some(implicitly[Manifest[String]]),
      immutable$Map(
      ),
      immutable$Map(
      )
    ),
    new ThriftStructFieldInfo(
      StateMapField,
      false,
      StateMapFieldManifest,
      Some(implicitly[Manifest[State]]),
      Some(implicitly[Manifest[String]]),
      immutable$Map(
      ),
      immutable$Map(
      )
    ),
    new ThriftStructFieldInfo(
      OptMapField,
      false,
      OptMapFieldManifest,
      Some(implicitly[Manifest[Int]]),
      Some(implicitly[Manifest[Opt]]),
      immutable$Map(
      ),
      immutable$Map(
      )
    )
  )

  lazy val structAnnotations: immutable$Map[String, String] =
    immutable$Map[String, String](
    )

  /**
   * Checks that all required fields are non-null.
   */
  def validate(_item: Foo) {
  }

  override def encode(_item: Foo, _oproto: TProtocol) {
    _item.write(_oproto)
  }

  override def decode(_iprot: TProtocol): Foo = {
    var list: Seq[String] = Seq[String]()
    var stateMap: Map[State, String] = Map[State, String]()
    var optMap: Map[Int, Opt] = Map[Int, Opt]()
    var _passthroughFields: Builder[(Short, TFieldBlob), immutable$Map[Short, TFieldBlob]] = null
    var _done = false

    _iprot.readStructBegin()
    while (!_done) {
      val _field = _iprot.readFieldBegin()
      if (_field.`type` == TType.STOP) {
        _done = true
      } else {
        _field.id match {
          case 1 =>
            _field.`type` match {
              case TType.LIST => {
                list = readListValue(_iprot)
              }
              case _actualType =>
                val _expectedType = TType.LIST

                throw new TProtocolException(
                  "Received wrong type for field 'list' (expected=%s, actual=%s).".format(
                    ttypeToHuman(_expectedType),
                    ttypeToHuman(_actualType)
                  )
                )
            }
          case 2 =>
            _field.`type` match {
              case TType.MAP => {
                stateMap = readStateMapValue(_iprot)
              }
              case _actualType =>
                val _expectedType = TType.MAP

                throw new TProtocolException(
                  "Received wrong type for field 'stateMap' (expected=%s, actual=%s).".format(
                    ttypeToHuman(_expectedType),
                    ttypeToHuman(_actualType)
                  )
                )
            }
          case 3 =>
            _field.`type` match {
              case TType.MAP => {
                optMap = readOptMapValue(_iprot)
              }
              case _actualType =>
                val _expectedType = TType.MAP

                throw new TProtocolException(
                  "Received wrong type for field 'optMap' (expected=%s, actual=%s).".format(
                    ttypeToHuman(_expectedType),
                    ttypeToHuman(_actualType)
                  )
                )
            }
          case _ =>
            if (_passthroughFields == null)
              _passthroughFields = immutable$Map.newBuilder[Short, TFieldBlob]
            _passthroughFields += (_field.id -> TFieldBlob.read(_field, _iprot))
        }
        _iprot.readFieldEnd()
      }
    }
    _iprot.readStructEnd()

    new Immutable(
      list,
      stateMap,
      optMap,
      if (_passthroughFields == null)
        NoPassthroughFields
      else
        _passthroughFields.result()
    )
  }

  def apply(
    list: Seq[String] = Seq[String](),
    stateMap: Map[State, String] = Map[State, String](),
    optMap: Map[Int, Opt] = Map[Int, Opt]()
  ): Foo =
    new Immutable(
      list,
      stateMap,
      optMap
    )

  def unapply(_item: Foo): Option[scala.Product3[Seq[String], Map[State, String], Map[Int, Opt]]] = Some(_item)


  private def readListValue(_iprot: TProtocol): Seq[String] = {
    val _list = _iprot.readListBegin()
    if (_list.size == 0) {
      _iprot.readListEnd()
      Nil
    } else {
      val _rv = new mutable$ArrayBuffer[String](_list.size)
      var _i = 0
      while (_i < _list.size) {
        _rv += {
            _iprot.readString()

        }
        _i += 1
      }
      _iprot.readListEnd()
      _rv
    }
  }

  private def writeListField(list_item: Seq[String], _oprot: TProtocol) {
    _oprot.writeFieldBegin(ListField)
    writeListValue(list_item, _oprot)
    _oprot.writeFieldEnd()
  }

  private def writeListValue(list_item: Seq[String], _oprot: TProtocol) {
    _oprot.writeListBegin(new TList(TType.STRING, list_item.size))
    list_item.foreach { list_item_element =>
      _oprot.writeString(list_item_element)
    }
    _oprot.writeListEnd()
  }

  private def readStateMapValue(_iprot: TProtocol): Map[State, String] = {
    val _map = _iprot.readMapBegin()
    if (_map.size == 0) {
      _iprot.readMapEnd()
      Map.empty[State, String]
    } else {
      val _rv = new mutable$HashMap[State, String]
      var _i = 0
      while (_i < _map.size) {
        val _key = {
            State(_iprot.readI32())

        }
        val _value = {
            _iprot.readString()

        }
        _rv(_key) = _value
        _i += 1
      }
      _iprot.readMapEnd()
      _rv
    }
  }

  private def writeStateMapField(stateMap_item: Map[State, String], _oprot: TProtocol) {
    _oprot.writeFieldBegin(StateMapField)
    writeStateMapValue(stateMap_item, _oprot)
    _oprot.writeFieldEnd()
  }

  private def writeStateMapValue(stateMap_item: Map[State, String], _oprot: TProtocol) {
    _oprot.writeMapBegin(new TMap(TType.I32, TType.STRING, stateMap_item.size))
    stateMap_item.foreach { _pair =>
      val stateMap_item_key = _pair._1
      val stateMap_item_value = _pair._2
      _oprot.writeI32(stateMap_item_key.value)
      _oprot.writeString(stateMap_item_value)
    }
    _oprot.writeMapEnd()
  }

  private def readOptMapValue(_iprot: TProtocol): Map[Int, Opt] = {
    val _map = _iprot.readMapBegin()
    if (_map.size == 0) {
      _iprot.readMapEnd()
      Map.empty[Int, Opt]
    } else {
      val _rv = new mutable$HashMap[Int, Opt]
      var _i = 0
      while (_i < _map.size) {
        val _key = {
            _iprot.readI32()

        }
        val _value = {
            Opt.decode(_iprot)

        }
        _rv(_key) = _value
        _i += 1
      }
      _iprot.readMapEnd()
      _rv
    }
  }

  private def writeOptMapField(optMap_item: Map[Int, Opt], _oprot: TProtocol) {
    _oprot.writeFieldBegin(OptMapField)
    writeOptMapValue(optMap_item, _oprot)
    _oprot.writeFieldEnd()
  }

  private def writeOptMapValue(optMap_item: Map[Int, Opt], _oprot: TProtocol) {
    _oprot.writeMapBegin(new TMap(TType.I32, TType.STRUCT, optMap_item.size))
    optMap_item.foreach { _pair =>
      val optMap_item_key = _pair._1
      val optMap_item_value = _pair._2
      _oprot.writeI32(optMap_item_key)
      optMap_item_value.write(_oprot)
    }
    _oprot.writeMapEnd()
  }



  private def ttypeToHuman(byte: Byte) = {
    // from https://github.com/apache/thrift/blob/master/lib/java/src/org/apache/thrift/protocol/TType.java
    byte match {
      case TType.STOP   => "STOP"
      case TType.VOID   => "VOID"
      case TType.BOOL   => "BOOL"
      case TType.BYTE   => "BYTE"
      case TType.DOUBLE => "DOUBLE"
      case TType.I16    => "I16"
      case TType.I32    => "I32"
      case TType.I64    => "I64"
      case TType.STRING => "STRING"
      case TType.STRUCT => "STRUCT"
      case TType.MAP    => "MAP"
      case TType.SET    => "SET"
      case TType.LIST   => "LIST"
      case TType.ENUM   => "ENUM"
      case _            => "UNKNOWN"
    }
  }

  object Immutable extends ThriftStructCodec3[Foo] {
    override def encode(_item: Foo, _oproto: TProtocol) { _item.write(_oproto) }
    override def decode(_iprot: TProtocol): Foo = Foo.decode(_iprot)
  }

  /**
   * The default read-only implementation of Foo.  You typically should not need to
   * directly reference this class; instead, use the Foo.apply method to construct
   * new instances.
   */
  class Immutable(
    val list: Seq[String],
    val stateMap: Map[State, String],
    val optMap: Map[Int, Opt],
    override val _passthroughFields: immutable$Map[Short, TFieldBlob]
  ) extends Foo {
    def this(
      list: Seq[String] = Seq[String](),
      stateMap: Map[State, String] = Map[State, String](),
      optMap: Map[Int, Opt] = Map[Int, Opt]()
    ) = this(
      list,
      stateMap,
      optMap,
      Map.empty
    )
  }

  /**
   * This Proxy trait allows you to extend the Foo trait with additional state or
   * behavior and implement the read-only methods from Foo using an underlying
   * instance.
   */
  trait Proxy extends Foo {
    protected def _underlying_Foo: Foo
    override def list: Seq[String] = _underlying_Foo.list
    override def stateMap: Map[State, String] = _underlying_Foo.stateMap
    override def optMap: Map[Int, Opt] = _underlying_Foo.optMap
    override def _passthroughFields = _underlying_Foo._passthroughFields
  }
}

trait Foo
  extends ThriftStruct
  with scala.Product3[Seq[String], Map[State, String], Map[Int, Opt]]
  with java.io.Serializable
{
  import Foo._

  def list: Seq[String]
  def stateMap: Map[State, String]
  def optMap: Map[Int, Opt]

  def _passthroughFields: immutable$Map[Short, TFieldBlob] = immutable$Map.empty

  def _1 = list
  def _2 = stateMap
  def _3 = optMap

  /**
   * Gets a field value encoded as a binary blob using TCompactProtocol.  If the specified field
   * is present in the passthrough map, that value is returned.  Otherwise, if the specified field
   * is known and not optional and set to None, then the field is serialized and returned.
   */
  def getFieldBlob(_fieldId: Short): Option[TFieldBlob] = {
    lazy val _buff = new TMemoryBuffer(32)
    lazy val _oprot = new TCompactProtocol(_buff)
    _passthroughFields.get(_fieldId) orElse {
      val _fieldOpt: Option[TField] =
        _fieldId match {
          case 1 =>
            if (list ne null) {
              writeListValue(list, _oprot)
              Some(Foo.ListField)
            } else {
              None
            }
          case 2 =>
            if (stateMap ne null) {
              writeStateMapValue(stateMap, _oprot)
              Some(Foo.StateMapField)
            } else {
              None
            }
          case 3 =>
            if (optMap ne null) {
              writeOptMapValue(optMap, _oprot)
              Some(Foo.OptMapField)
            } else {
              None
            }
          case _ => None
        }
      _fieldOpt match {
        case Some(_field) =>
          val _data = Arrays.copyOfRange(_buff.getArray, 0, _buff.length)
          Some(TFieldBlob(_field, _data))
        case None =>
          None
      }
    }
  }

  /**
   * Collects TCompactProtocol-encoded field values according to `getFieldBlob` into a map.
   */
  def getFieldBlobs(ids: TraversableOnce[Short]): immutable$Map[Short, TFieldBlob] =
    (ids flatMap { id => getFieldBlob(id) map { id -> _ } }).toMap

  /**
   * Sets a field using a TCompactProtocol-encoded binary blob.  If the field is a known
   * field, the blob is decoded and the field is set to the decoded value.  If the field
   * is unknown and passthrough fields are enabled, then the blob will be stored in
   * _passthroughFields.
   */
  def setField(_blob: TFieldBlob): Foo = {
    var list: Seq[String] = this.list
    var stateMap: Map[State, String] = this.stateMap
    var optMap: Map[Int, Opt] = this.optMap
    var _passthroughFields = this._passthroughFields
    _blob.id match {
      case 1 =>
        list = readListValue(_blob.read)
      case 2 =>
        stateMap = readStateMapValue(_blob.read)
      case 3 =>
        optMap = readOptMapValue(_blob.read)
      case _ => _passthroughFields += (_blob.id -> _blob)
    }
    new Immutable(
      list,
      stateMap,
      optMap,
      _passthroughFields
    )
  }

  /**
   * If the specified field is optional, it is set to None.  Otherwise, if the field is
   * known, it is reverted to its default value; if the field is unknown, it is removed
   * from the passthroughFields map, if present.
   */
  def unsetField(_fieldId: Short): Foo = {
    var list: Seq[String] = this.list
    var stateMap: Map[State, String] = this.stateMap
    var optMap: Map[Int, Opt] = this.optMap

    _fieldId match {
      case 1 =>
        list = Seq[String]()
      case 2 =>
        stateMap = Map[State, String]()
      case 3 =>
        optMap = Map[Int, Opt]()
      case _ =>
    }
    new Immutable(
      list,
      stateMap,
      optMap,
      _passthroughFields - _fieldId
    )
  }

  /**
   * If the specified field is optional, it is set to None.  Otherwise, if the field is
   * known, it is reverted to its default value; if the field is unknown, it is removed
   * from the passthroughFields map, if present.
   */
  def unsetList: Foo = unsetField(1)

  def unsetStateMap: Foo = unsetField(2)

  def unsetOptMap: Foo = unsetField(3)


  override def write(_oprot: TProtocol) {
    Foo.validate(this)
    _oprot.writeStructBegin(Struct)
    if (list ne null) writeListField(list, _oprot)
    if (stateMap ne null) writeStateMapField(stateMap, _oprot)
    if (optMap ne null) writeOptMapField(optMap, _oprot)
    _passthroughFields.values foreach { _.write(_oprot) }
    _oprot.writeFieldStop()
    _oprot.writeStructEnd()
  }

  def copy(
    list: Seq[String] = this.list,
    stateMap: Map[State, String] = this.stateMap,
    optMap: Map[Int, Opt] = this.optMap,
    _passthroughFields: immutable$Map[Short, TFieldBlob] = this._passthroughFields
  ): Foo =
    new Immutable(
      list,
      stateMap,
      optMap,
      _passthroughFields
    )

  override def canEqual(other: Any): Boolean = other.isInstanceOf[Foo]

  override def equals(other: Any): Boolean =
    _root_.scala.runtime.ScalaRunTime._equals(this, other) &&
      _passthroughFields == other.asInstanceOf[Foo]._passthroughFields

  override def hashCode: Int = _root_.scala.runtime.ScalaRunTime._hashCode(this)

  override def toString: String = _root_.scala.runtime.ScalaRunTime._toString(this)


  override def productArity: Int = 3

  override def productElement(n: Int): Any = n match {
    case 0 => this.list
    case 1 => this.stateMap
    case 2 => this.optMap
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }

  override def productPrefix: String = "Foo"
}

LabelledGeneric can be summoned as LabelledGeneric[Foo.Immutable] but I have no idea how it can be used.

Try to convert string value in Double/Float decoders

From the Gitter conversation.

The Decoders for Int, Long and other integral types try to convert a string into their number type.

import io.circe._, parse._, cats.data.Xor
scala> val Xor.Right(i) = decode[Int]("\"42\"")
i: Int = 42

The decoders for Double and Float don't attempt this conversion, because of this:

scala> "Infinity".toDouble
res1: Double = Infinity

However, we could use JsonNumber.fromString and _.toDouble to try to safely convert a String into a Double, for example:

scala> val doubleWithFromString = Decoder.instance(c => c.as[String].flatMap(s => Xor.fromOption(JsonNumber.fromString(s).map(_.toDouble), DecodingFailure("Double", c.history))))
doubleWithFromString: io.circe.Decoder[Double] = io.circe.Decoder$$anon$8@47016c8b

scala> val Xor.Right(d) = decode("\"13.37\"")(doubleWithFromString)
d: Double = 13.37

scala> val Xor.Left(e) = decode("\"Infinity\"")(doubleWithFromString)
e: io.circe.Error = io.circe.DecodingFailure: Double

scala> val Xor.Left(e) = decode("\"NaN\"")(doubleWithFromString)
e: io.circe.Error = io.circe.DecodingFailure: Double

If "Infinity" and "NaN" should be parseable as Double, one can fall back to a Decoder[String].map(_.toDouble) approach.
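The rejection logic is small enough to sketch with the standard library alone. Here a regex stands in for JsonNumber.fromString's grammar check; the pattern and the helper's name are this sketch's assumptions, not circe API:

```scala
// Accept only strings matching the JSON number grammar, so "Infinity" and
// "NaN" are rejected up front (this regex stands in for JsonNumber.fromString).
val JsonNumberPattern = """-?(?:0|[1-9]\d*)(?:\.\d+)?(?:[eE][+-]?\d+)?""".r

def parseFiniteDouble(s: String): Option[Double] = s match {
  case JsonNumberPattern() =>
    // Also guard against overflow to infinity, e.g. "1e999".
    Some(s.toDouble).filter(d => !d.isInfinite && !d.isNaN)
  case _ => None
}
```

Wrapping this in `Decoder.instance` then gives the `doubleWithFromString` behavior shown above.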

Ugh signed zeros

Let me start by saying that I spent my entire life up until about an hour ago without realizing what a shitshow signed zeros are, and I wish I could go back to a state of innocence, so you might want to close the tab and move on.

First for the basics:

scala> 0.0 == -0.0
res0: Boolean = true

scala> Double.box(0.0) == Double.box(-0.0)
res1: Boolean = false

scala> java.lang.Double.compare(0.0, -0.0) == 0
res2: Boolean = false

Hurray, that makes no sense (but at least you might have seen it before).

Cats and Scalaz also disagree:

scala> import scalaz.Equal, scalaz.std.anyVal._
import scalaz.Equal
import scalaz.std.anyVal._

scala> Equal[Double].equal(0.0, -0.0)
res3: Boolean = true

scala> import cats.Eq, cats.std.double._
import cats.Eq
import cats.std.double._

scala> Eq[Double].eqv(0.0, -0.0)
res4: Boolean = false

Apparently IEEE 754 says something like "positive and negative zeros are distinct but equal", and that's why Java's primitive floating point types make them equal. Cats (and Algebra, although each defines its own instance) seems to be taking a more principled stand than Scalaz here on "distinct but equal" not making any sense (I don't know what the actual motivations for the choices here are, but I'm curious).

Argonaut says all zero JSON numbers are equal:

scala> import argonaut._, Argonaut._, scalaz._, Scalaz._
import argonaut._
import Argonaut._
import scalaz._
import Scalaz._

scala> Parse.parse("0.0") === Parse.parse("-0.0")
res0: Boolean = true

scala> Parse.parse("0") === Parse.parse("-0")
res1: Boolean = true

But it's perfectly happy to round-trip negative zeros if they have a fractional part:

scala> Parse.parse("-0").map(_.nospaces)
res2: scalaz.\/[String,String] = \/-(0)

scala> Parse.parse("-0.0").map(_.nospaces)
res3: scalaz.\/[String,String] = \/-(-0.0)

This seems really wrong to me—if two things serialize differently, the scalaz.Equal instance shouldn't say they're the same.

circe currently round-trips both cases (at least with Jawn as the parser—other parsers may behave in such a way that this isn't possible), but also says they're all equal:

scala> import io.circe._, io.circe.jawn._, cats.syntax.eq._
import io.circe._
import io.circe.jawn._
import cats.syntax.eq._

scala> parse("-0.0").map(_.noSpaces)
res0: cats.data.Xor[io.circe.ParsingFailure,String] = Right(-0.0)

scala> parse("-0").map(_.noSpaces)
res1: cats.data.Xor[io.circe.ParsingFailure,String] = Right(-0)

scala> parse("0.0") === parse("-0.0")
res2: Boolean = true

scala> parse("0") === parse("-0")
res3: Boolean = true

I think I'd like to keep the current round-tripping behavior (I want circe to be able to distinguish distinct JSON values in as many cases as is reasonably possible), but to make positive and negative values not equal whether or not there's a fractional part.

Any objections?

Add a deep merge operation

Requested by @mpilquist on Gitter, where there's some discussion of the issues involved. For now I think it makes sense to go with something like the simpler version in Argonaut (not the HCursor => HCursor idea) but with the argument determining the field order for consistency.
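The simpler Argonaut-style recursion could look like the sketch below. A toy JSON model stands in for circe's Json here; JValue, JStr, and JObj are hypothetical names, not circe types:

```scala
// Toy JSON model standing in for circe's Json.
sealed trait JValue
final case class JStr(value: String) extends JValue
final case class JObj(fields: Vector[(String, JValue)]) extends JValue

// Right-biased deep merge: objects are merged key-by-key, anything else is
// replaced by the argument. The argument's fields come first, so it
// determines the resulting field order.
def deepMerge(base: JValue, that: JValue): JValue = (base, that) match {
  case (JObj(l), JObj(r)) =>
    val lMap = l.toMap
    val merged = r.map { case (k, v) => k -> lMap.get(k).fold(v)(deepMerge(_, v)) }
    val baseOnly = l.filterNot { case (k, _) => r.exists(_._1 == k) }
    JObj(merged ++ baseOnly)
  case (_, other) => other
}
```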

Get serializability right

I hate dealing with java.lang.Serializable, but I'd like to support Spark usage and it's not really that much overhead.

I'd guess "getting it right" would mean something like cats's SerializableLaws.

Readme parsing example won't compile

Thanks a lot for this great library! 😄

The first example won't compile, because of ambiguous imports for parse:

import io.circe._, io.circe.generic.auto._, io.circe.jawn._, io.circe.syntax._
import cats.data.Xor

val json: String = ...

val doc: Json = parse(json).getOrElse(Json.empty)

I think jawn.parse(json).getOrElse(Json.empty), on the other hand, does the job just fine, or have I missed something?

Provide parsing and printing for Scala.js

We now have Scala.js support, but it's not terribly useful yet, since we don't have parsers or printers that work for Scala.js.

For parsing I'm thinking we'll create a new circe-js module with something like upickle's parser.

For printing the only issue is that I'm using Java's CopyOnWriteArrayList for memoization. This was an attempt to optimize that was at least kind of successful according to my initial benchmarks. It wouldn't be too hard to factor out the memoization part and provide different implementations for JS and JVM.
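Factoring the memoization out could look like this sketch. The names are hypothetical, and only the single-threaded (Scala.js-friendly) variant is shown; a JVM implementation would substitute a thread-safe structure behind the same trait:

```scala
import scala.collection.mutable.ArrayBuffer

// Platform-independent interface for the indentation memo.
trait IndentMemo {
  def indent(depth: Int): String
}

// Single-threaded implementation suitable for Scala.js: grow a plain buffer
// of precomputed indentation strings on demand.
final class SimpleIndentMemo(step: String) extends IndentMemo {
  private val cache = ArrayBuffer.empty[String]
  def indent(depth: Int): String = {
    while (cache.length <= depth) cache += step * cache.length
    cache(depth)
  }
}
```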

Generic derivation chokes on recursive types in certain situations

The generic derivation mechanism can handle recursive type definitions, but it will choose the wrong codecs when there's an intervening non-derived type. For example (the error messages also need some work, but I'll save that for another issue):

scala> import io.circe._, io.circe.jawn.decode, io.circe.generic.auto._
import io.circe._
import io.circe.jawn.decode
import io.circe.generic.auto._

scala> case class Foo(o: Option[Foo])
defined class Foo

scala> decode[Foo]("{}")
res0: cats.data.Xor[io.circe.Error,Foo] = Left(io.circe.DecodingFailure: Attempt to decode value on failed cursorEl(DownField(o),false))

scala> decode[Foo]("""{ "o": null }""")
res1: cats.data.Xor[io.circe.Error,Foo] = Left(io.circe.DecodingFailure: CNilEl(DownField(o),true))

But this works:

scala> decode[Foo]("""{ "o": { "None": {} } }""")
res2: cats.data.Xor[io.circe.Error,Foo] = Right(Foo(None))

And similarly:

scala> Foo(None).asJson
res3: io.circe.Json =
{
  "o" : {
    "None" : {

    }
  }
}

This makes it clear that circe is using the derived instance for Option[A: Decoder] instead of the usual one. This is bad, and should be fixed.

Thanks to Caballero on Stack Overflow for noticing this.

Reconsider integral decoders

circe currently follows Argonaut in truncating, etc. when decoding integral types:

scala> io.circe.parse.decode[Short]("0.1")
res0: cats.data.Xor[io.circe.Error,Short] = Right(0)

scala> io.circe.parse.decode[Short]("123456")
res1: cats.data.Xor[io.circe.Error,Short] = Right(32767)

I don't really like this, and would prefer both of the above to result in failures. Any objections?
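The strict alternative is easy to express with exact narrowing. This sketch uses a plain BigDecimal rather than circe's JsonNumber, and the helper name is an assumption:

```scala
// Exact narrowing: fail (None) instead of truncating when the number has a
// fractional part or is out of range for the target type.
def exactShort(n: BigDecimal): Option[Short] =
  if (n.isValidShort) Some(n.toShortExact) else None
```

Under this scheme both `decode[Short]("0.1")` and `decode[Short]("123456")` would become decoding failures.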

Use export-hook in core

I'm creating this issue to follow up on a comment by @dwijnand on gitter:

The only thought I had about #29 is the need for the imports, and that made me think that maybe core should use export-hook

I've played around with export-hook in core, but I really want to push on the experiment of keeping generic derivation modular for as long as possible. I don't love my current approach of getting priorities right by balancing a tower of instance-providing traits and using a Secondary wrapper, but it seems to work reasonably well, and is still (I think) not completely unintelligible.

What about allocations?

It would be absolutely terrific to re-run JMH with -prof gc and add one more line for each performance run: gc.alloc.norm (or something like it), i.e. bytes per iteration.

Add a type class combining encoding and decoding

Today on Gitter @julienrf asked the following question:

@travisbrown can you remind me why you decided to remove CodecJson?

And my response:

@julienrf Primarily because I think the role it plays in Argonaut is confusing. It can be handy for definitions, but you generally don't want to use it as a context bound anywhere except tests, since there aren't CodecJson instances for lots of things that have DecodeJson and EncodeJson instances…
@julienrf …and that's the case because you can't define e.g. CodecJson[String] in the CodecJson companion object, since then it wouldn't be found when you ask for DecodeJson[String].
@julienrf So you could either put all your CodecJson instances for these basic types in some object that you expect users to import (ugh), or have some kind of implicit that automatically combines DecodeJson and EncodeJson into a CodecJson (which was removed in Argonaut for reasons I don't exactly remember, although I can imagine how that might get messy), or do what Argonaut does and just provide CodecJson for convenient definitions and let users figure out why it's pretty much useless as a requirement.
None of those options are very nice, so at the beginning I decided to leave it out of circe entirely.
It's possible that it's possible to do it right (or at least dramatically better) with export-hook, and I'd definitely be open to that possibility.

My wishlist for a Codec type class in circe would look something like this:

  1. A Codec[A] should be available (without any imports) for any A that has a Decoder and Encoder—i.e. if io.circe.Decoder[A] and io.circe.Encoder[A] compile, then io.circe.Codec[A] must compile as well.
  2. All tests should pass as currently written.
  3. If a type has a Codec instance, then asking for a Decoder (or Encoder) for that type shouldn't require additional allocations.
  4. We shouldn't have to change the name of the apply methods on Decoder or Encoder.
  5. It should be possible to define Codec instances for standard library and circe types in the Codec companion object and have Encoder and Decoder instances available with no imports.

Only the first two are hard requirements, and the third and fourth are probably incompatible.
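Requirement 1 on its own is straightforward to satisfy with a combining instance. The sketch below uses simplified stand-ins for circe's type classes (Dec and Enc are hypothetical names), and it also illustrates why requirement 3 is hard: the combining instance allocates a wrapper on every summoning.

```scala
// Simplified stand-ins for circe's Decoder and Encoder.
trait Dec[A] { def decode(s: String): Either[String, A] }
trait Enc[A] { def encode(a: A): String }

trait Codec[A] extends Dec[A] with Enc[A]

object Codec {
  // Requirement 1: a Codec[A] exists whenever both instances do. Note the
  // wrapper allocation here, which is exactly what requirement 3 rules out.
  implicit def fromDecoderAndEncoder[A](implicit d: Dec[A], e: Enc[A]): Codec[A] =
    new Codec[A] {
      def decode(s: String): Either[String, A] = d.decode(s)
      def encode(a: A): String = e.encode(a)
    }
}
```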

Useless compile error while lacking encoder/decoder

If there is no encoder for Date, the compiler complains:

 [error] /home/jilen/workspace/jfc-test/src/main/scala/Foo.scala:26: diverging implicit expansion for type io.circe.Encoder.Secondary[this.Out]
[error] starting with method encodeCaseClass in trait GenericInstances

Is there any chance of improving the error reporting?

import io.circe._
import io.circe.generic.auto._
import io.circe.syntax._

import java.util.Date
case class Foo(
  date: Date,
  number: Int,
  str: String)

object App {

  def encodesDate(fmt: String): Encoder[Date] = new Encoder[Date] {
    def apply(a: Date) = {
      val sdf = new java.text.SimpleDateFormat(fmt)
      Json.string(sdf.format(a))
    }
  }



  def main(args: Array[String])  {
    //implicit val DateEncodes = encodesDate("yyyy-MM-dd HH:mm:ss")
    val f = Foo(new Date, 1, "fff")
    println(f.asJson.noSpaces)
  }
}

Change the name?

jfc was my working title when I started the project a couple of weeks ago. Arguments against it include the following:

  1. It's hard to search for.
  2. The acronym happens to coincide with a profane bit of Internet slang.
  3. Nobody ever, ever wants to be confused with the Java Foundation Classes.

I still like it. I prefer package names to be reverse domain names owned by the project developers (I get annoyed every time I have to write e.g. import _root_.argonaut._ because of an argonaut subpackage somewhere). io.jfc is unambiguous in that respect and short enough that writing out the fully-qualified name isn't too much of a burden.

@non's "circe" is the best alternative candidate I've heard so far, but I'm happy to consider others.

Add support for convenient non-generic codec definition

circe currently doesn't include anything like Argonaut's jdecodeNL, casecodecN, etc.—you can define instances entirely by hand, semi-automatically with io.circe.generic.semiauto._, or fully automatically with io.circe.generic.auto._, but that's it.

There's currently no arity-level boilerplate in the core project, and I'm not sure how I want to handle this, but I want to provide something comparable to casecodecN in 0.3.0.
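One possible shape for such a helper, sketched at arity 2 against a simplified object model. A Map stands in for a JSON object, and the name decodeProduct2 is hypothetical, not a circe API:

```scala
// Hypothetical arity-2 helper in the spirit of Argonaut's casecodec2,
// with Map[String, String] standing in for a JSON object.
def decodeProduct2[A, B, T](keyA: String, keyB: String)(construct: (A, B) => T)(
  readA: String => Option[A],
  readB: String => Option[B]
): Map[String, String] => Option[T] =
  obj =>
    for {
      a <- obj.get(keyA).flatMap(readA)
      b <- obj.get(keyB).flatMap(readB)
    } yield construct(a, b)
```

The real version would generate these helpers for each arity, reading from a cursor instead of a Map.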

Conversion from JsonNumber to BigDecimal can throw exceptions

JSON allows numbers with exponents larger than Int.MaxValue, and JsonNumber.fromString will happily accept them:

val Some(okay) = JsonNumber.fromString(s"1E${ Int.MaxValue.toLong }")
val Some(tooBig) = JsonNumber.fromString(s"1E${ Int.MaxValue.toLong + 1L }")

Unfortunately the toBigDecimal method just crashes on tooBig here:

scala> okay.toBigDecimal
res47: BigDecimal = 1E+2147483647

scala> tooBig.toBigDecimal
java.lang.NumberFormatException
  at java.math.BigDecimal.<init>(BigDecimal.java:491)
  at java.math.BigDecimal.<init>(BigDecimal.java:824)
  at scala.math.BigDecimal$.apply(BigDecimal.scala:289)
  at io.circe.JsonDecimal.toBigDecimal$lzycompute(JsonNumber.scala:187)
  at io.circe.JsonDecimal.toBigDecimal(JsonNumber.scala:187)
  ... 43 elided

toDouble also crashes, as does toLong, etc. (even though toLong returns an Option). The same thing happens with these methods in Argonaut.

This is bad, and needs to be fixed. Here's my proposal:

  1. toBigDecimal should return an Option that's empty if the value can't be parsed as a BigDecimal.
  2. We add a new approximateBigDecimal that uses one of the various pow(BigDecimal, BigDecimal) implementations floating around (e.g. a copy-paste-and-port job from here or some other implementation with a friendly license).
  3. The current contract of toDouble says that values outside the range of Double will be rounded to positive or negative infinity, so in the case that toBigDecimal is None, we use approximateBigDecimal and truncate.
  4. The behavior of toLong, toInt, etc. is unchanged, except that they no longer throw exceptions.
  5. All the truncateTo methods use approximateBigDecimal if necessary.
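Point 1 amounts to checking the exponent before constructing the BigDecimal. A sketch, assuming the number is held as an unscaled BigInt plus a Long scale (roughly how JsonDecimal stores it; the helper name is an assumption):

```scala
// Construct a BigDecimal only when the scale fits in an Int; a scale outside
// that range is what makes java.math.BigDecimal parsing throw.
def safeToBigDecimal(unscaled: BigInt, scale: Long): Option[BigDecimal] =
  if (scale >= Int.MinValue && scale <= Int.MaxValue)
    Some(BigDecimal(new java.math.BigDecimal(unscaled.bigInteger, scale.toInt)))
  else None
```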

Request for Case Class Example for De(serialization)

In my experience with the Play and spray JSON libraries, there's usually a way to serialize and de-serialize using case classes

Example from spray-json:

import spray.json._

case class Boy(name: String, hobbies: List[String]) extends Parent

A Formatter provides de-serialization and serialization:

object Boy {
    import spray.json.DefaultJsonProtocol._

    implicit val format: RootJsonFormat[Boy] = 
        jsonFormat2(Boy.apply)
}

Does such capability exist in circe? If so, could you please show me an example? And, if there is such a way to do this, perhaps it'd be worthwhile to add to the README?
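For reference, circe's semi-automatic derivation fills the role of spray-json's jsonFormat2. This is shown against a recent circe API (in the 0.x era of this issue, decode returned Xor rather than Either):

```scala
import io.circe.{ Decoder, Encoder }
import io.circe.generic.semiauto.{ deriveDecoder, deriveEncoder }
import io.circe.jawn.decode
import io.circe.syntax._

case class Boy(name: String, hobbies: List[String])

object Boy {
  // Semi-automatic derivation: these two lines play the role of
  // spray-json's jsonFormat2(Boy.apply).
  implicit val decoder: Decoder[Boy] = deriveDecoder[Boy]
  implicit val encoder: Encoder[Boy] = deriveEncoder[Boy]
}
```

With io.circe.generic.auto._ even these two lines are unnecessary, since the instances are derived at the use site.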
