scala / pickling

Fast, customizable, boilerplate-free pickling support for Scala

Home Page: lampwww.epfl.ch/~hmiller/pickling

License: BSD 3-Clause "New" or "Revised" License

Scala 99.83% Java 0.17%

pickling's Introduction

This is Scala 2! Welcome!

This is the home of the Scala 2 standard library, compiler, and language spec.

For Scala 3, visit scala/scala3.

How to contribute

Issues and bug reports for Scala 2 are located in scala/bug. That tracker is also where new contributors may find issues to work on: good first issues, help wanted.

For coordinating broader efforts, we also use the scala/scala-dev tracker.

To contribute here, please open a pull request from your fork of this repository.

Be aware that we can't accept additions to the standard library, only modifications to existing code. Binary compatibility forbids adding new public classes or public methods. Additions are made to scala-library-next instead.
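As a rough illustration of what binary compatibility constrains (hypothetical class, not from the standard library):

```scala
// Hypothetical example class, for illustration only.
class Widget {
  // Changing or removing an existing public method breaks binary
  // compatibility, and adding a NEW public method is also an addition
  // (it would go to scala-library-next instead).
  def size: Int = 0

  // Private members are not part of the checked binary interface,
  // so they can be added or changed freely.
  private def helper: Int = 1
}
```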

We require that you sign the Scala CLA before we can merge any of your work, to protect Scala's future as open source software.

The general workflow is as follows.

  1. Find/file an issue in scala/bug (or submit a well-documented PR right away!).
  2. Fork the scala/scala repo.
  3. Push your changes to a branch in your forked repo. For coding guidelines, go here.
  4. Submit a pull request to scala/scala from your forked repo.

For more information on building and developing the core of Scala, read the rest of this README, especially the sections on setting up your machine.

Get in touch!

In order to get in touch with other Scala contributors, join the #scala-contributors channel on the Scala Discord chat, or post on contributors.scala-lang.org (Discourse).

If you need some help with your PR at any time, please feel free to @-mention anyone from the list below, and we will do our best to help you out:

username talk to me about...
@lrytz back end, optimizer, named & default arguments, reporters
@retronym 2.12.x branch, compiler performance, weird compiler bugs, lambdas
@SethTisue getting started, build, CI, community build, Jenkins, docs, library, REPL
@dwijnand pattern matcher, MiMa, partest
@som-snytt warnings/lints/errors, REPL, compiler options, compiler internals, partest
@Ichoran collections library, performance
@viktorklang concurrency, futures
@sjrd interactions with Scala.js
@NthPortal library, concurrency, scala.math, LazyList, Using, warnings
@bishabosha TASTy reader
@joroKr21 higher-kinded types, implicits, variance

P.S.: If you have some spare time to help out around here, we would be delighted to add your name to this list!

Branches

Target the oldest branch you would like your changes to end up in. We periodically merge forward from older release branches (e.g., 2.12.x) to new ones (e.g. 2.13.x).

If your change is difficult to merge forward, you may be asked to also submit a separate PR targeting the newer branch.

If your change is version-specific and shouldn't be merged forward, put [nomerge] in the PR name.

If your change is a backport from a newer branch and thus doesn't need to be merged forward, put [backport] in the PR name.

Choosing a branch

Most changes should target 2.13.x. We are increasingly reluctant to target 2.12.x unless there is a special reason (e.g. if an especially bad bug is found, or if there is commercial sponsorship).

The 2.11.x branch is now inactive and no further 2.11.x releases are planned (unless unusual, unforeseeable circumstances arise). You should not target 2.11.x without asking maintainers first.

Repository structure

Most importantly:

scala/
+--build.sbt                 The main sbt build definition
+--project/                  The rest of the sbt build
+--src/                      All sources
   +---/library              Scala Standard Library
   +---/reflect              Scala Reflection
   +---/compiler             Scala Compiler
+--test/                     The Scala test suite
   +---/files                Partest tests
   +---/junit                JUnit tests
   +---/scalacheck           ScalaCheck tests
+--spec/                     The Scala language specification

but also:

scala/
   +---/library-aux          Scala Auxiliary Library, for bootstrapping and documentation purposes
   +---/interactive          Scala Interactive Compiler, for clients such as an IDE (aka Presentation Compiler)
   +---/intellij             IntelliJ project templates
   +---/manual               "man" (manual) pages for Scala's runner scripts
   +---/partest              Scala's internal parallel testing framework
   +---/partest-javaagent    Partest's helper java agent
   +---/repl                 Scala REPL core
   +---/repl-frontend        Scala REPL frontend
   +---/scaladoc             Scala's documentation tool
   +---/scalap               Scala's class file decompiler
   +---/testkit              Scala's unit-testing kit
+--admin/                    Scripts for the CI jobs and releasing
+--doc/                      Additional licenses and copyrights
+--scripts/                  Scripts for the CI jobs and releasing
+--tools/                    Scripts useful for local development
+--build/                    Build products
+--dist/                     Build products
+--target/                   Build products

Get ready to contribute

Requirements

You need the following tools:

  • Java SDK. The baseline version is 8 for both 2.12.x and 2.13.x. It is almost always fine to use a later SDK such as 11 or 15 for local development. CI will verify against the baseline version.
  • sbt

macOS and Linux work. Windows may work if you use Cygwin. Community help with keeping the build working on Windows, and documenting any needed setup, is appreciated.

Tools we use

We are grateful for the OSS tools and licenses this project builds on.

Build setup

Basics

During ordinary development, a new Scala build is built by the previously released version, known as the "reference compiler" or, slangily, as "STARR" (stable reference release). Building with STARR is sufficient for most kinds of changes.

However, a full build of Scala is bootstrapped. Bootstrapping has two steps: first, build with STARR; then, build again using the freshly built compiler, leaving STARR behind. This guarantees that every Scala version can build itself.

If you change the code generation part of the Scala compiler, your changes will only show up in the bytecode of the library and compiler after a bootstrap. Our CI does a bootstrapped build.

Bootstrapping locally: To perform a bootstrap, run restarrFull within an sbt session. This builds and publishes the Scala distribution to your local artifact repository, then switches sbt to use that version as its new scalaVersion. You can revert with reload. Note that restarrFull also writes the STARR version to buildcharacter.properties, so you can switch back to it with restarr without republishing. This switches the sbt session to use the build-restarr and target-restarr directories instead of build and target, which avoids wiping out classfiles and incremental metadata. IntelliJ remains configured to compile and run tests using the STARR version in versions.properties.

For history on how the current scheme was arrived at, see https://groups.google.com/d/topic/scala-internals/gp5JsM1E0Fo/discussion.

Building with fatal warnings: To make warnings in the project fatal (i.e. turn them into errors), run set Global / fatalWarnings := true in sbt (replace Global with the name of a module—such as reflect—to only make warnings fatal for that module). To disable fatal warnings again, either reload sbt, or run set Global / fatalWarnings := false (again, replace Global with the name of a module if you only enabled fatal warnings for that module). CI always has fatal warnings enabled.

Using the sbt build

Once you've started an sbt session you can run one of the core commands:

  • compile compiles all sub-projects (library, reflect, compiler, scaladoc, etc)
  • scala / scalac run the REPL / compiler directly from sbt (both accept options and arguments)
  • enableOptimizer reloads the build with the Scala optimizer enabled. Our releases are built this way. Enable this when working on compiler performance improvements. When the optimizer is enabled the build will be slower and incremental builds can be incorrect.
  • setupPublishCore runs enableOptimizer and configures a version number based on the current Git SHA. Often used as part of bootstrapping: sbt setupPublishCore publishLocal && sbt -Dstarr.version=<VERSION> testAll
  • dist/mkBin generates runner scripts (scala, scalac, etc) in build/quick/bin
  • dist/mkPack creates a build in the Scala distribution format in build/pack
  • junit/test runs the JUnit tests; junit/testOnly *Foo runs a subset
  • scalacheck/test runs ScalaCheck tests; use testOnly to run a subset
  • partest runs partest tests (accepts options, try partest --help)
  • publishLocal publishes a distribution locally (can be used as scalaVersion in other sbt projects)
    • Optionally set baseVersionSuffix := "bin-abcd123-SNAPSHOT" where abcd123 is the git hash of the revision being published. You can also use something custom like "bin-mypatch". This changes the version number from 2.13.2-SNAPSHOT to something more stable (2.13.2-bin-abcd123-SNAPSHOT).
    • Note that the -bin string marks the version binary compatible. Using it in sbt will cause the scalaBinaryVersion to be 2.13. If the version is not binary compatible, we recommend using -pre, e.g., 2.14.0-pre-abcd123-SNAPSHOT.
    • Optionally set ThisBuild / Compile / packageDoc / publishArtifact := false to skip generating / publishing API docs (speeds up the process).
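Assuming you published with a custom suffix as described above, a downstream sbt project can consume the locally published build with a build.sbt fragment like the following (the version number is hypothetical):

```scala
// build.sbt of a separate test project (hypothetical version string).
// publishLocal publishes to the local Ivy repository, which sbt
// resolves by default, so no extra resolver is normally needed.
scalaVersion := "2.13.2-bin-mypatch-SNAPSHOT"
```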

If a command results in an error message like a module is not authorized to depend on itself, it may be that a global sbt plugin is causing a cyclical dependency. Try disabling global sbt plugins (perhaps by temporarily commenting them out in ~/.sbt/1.0/plugins/plugins.sbt).

Sandbox

We recommend keeping local test files in the sandbox directory which is listed in the .gitignore of the Scala repo.

Incremental compilation

Note that sbt's incremental compilation is often too coarse for the Scala compiler codebase and re-compiles too many files, resulting in long build times (check sbt#1104 for progress on that front). In the meantime you can:

  • Use IntelliJ IDEA for incremental compiles (see IDE Setup below) - its incremental compiler is a bit less conservative, but usually correct.

IDE setup

We suggest using IntelliJ IDEA (see src/intellij/README.md).

Metals may also work, but we don't yet have instructions or sample configuration for that. A pull request in this area would be exceedingly welcome. In the meantime, we are collecting guidance at scala/scala-dev#668.

In order to use IntelliJ's incremental compiler:

  • run dist/mkBin in sbt to get a build and the runner scripts in build/quick/bin
  • run "Build" - "Make Project" in IntelliJ

Now you can edit and build in IntelliJ and use the scripts (compiler, REPL) to directly test your changes. You can also run the scala, scalac and partest commands in sbt. Enable "Ant mode" (explained above) to prevent sbt's incremental compiler from re-compiling (too many) files before each partest invocation.

Coding guidelines

Our guidelines for contributing are explained in CONTRIBUTING.md. It contains useful information on our coding standards, testing, documentation, how we use git and GitHub and how to get your code reviewed.

You may also want to check out the following resources:

Scala CI

Build Status

Once you submit a PR your commits will be automatically tested by the Scala CI.

Our CI setup is always evolving. See scala/scala-dev#751 for more details on how things currently work and how we expect they might change.

If you see a spurious failure on Jenkins, you can post /rebuild as a PR comment. The scabot README lists all available commands.

If you'd like to test your patch before having everything polished for review, you can have Travis CI build your branch (make sure you have a fork and have Travis CI enabled for branch builds on it first, and then push your branch). Also feel free to submit a draft PR. In case your draft branch contains a large number of commits (that you didn't clean up / squash yet for review), consider adding [ci: last-only] to the PR title. That way only the last commit will be tested, saving some energy and CI-resources. Note that inactive draft PRs will be closed eventually, which does not mean the change is being rejected.

CI performs a compiler bootstrap. The first task, validatePublishCore, publishes a build of your commit to the temporary repository https://scala-ci.typesafe.com/artifactory/scala-pr-validation-snapshots. Note that this build is not yet bootstrapped, its bytecode is built using the current STARR. The version number is 2.13.2-bin-abcd123-SNAPSHOT where abcd123 is the commit hash. For binary incompatible builds, the version number is 2.14.0-pre-abcd123-SNAPSHOT.

You can use Scala builds in the validation repository locally by adding a resolver and specifying the corresponding scalaVersion:

$ sbt
> set resolvers += "pr" at "https://scala-ci.typesafe.com/artifactory/scala-pr-validation-snapshots/"
> set scalaVersion := "2.12.2-bin-abcd123-SNAPSHOT"
> console

"Nightly" builds

The Scala CI builds nightly download releases and publishes them to https://scala-ci.typesafe.com/artifactory/scala-integration/ .

Using a nightly build in sbt is explained in this Stack Overflow answer.

Although we casually refer to these as "nightly" builds, they aren't actually built nightly, but "mergely". That is to say, a build is published for every merged PR.

Scala CI internals

The Scala CI runs as a Jenkins instance on scala-ci.typesafe.com, configured by a chef cookbook at scala/scala-jenkins-infra.

The build bot that watches PRs, triggers testing builds and applies the "reviewed" label after an LGTM comment is in the scala/scabot repo.

Community build

The Scala community build is an important method for testing Scala releases. A community build can be launched for any Scala commit, even before the commit's PR has been merged. That commit is then used to build a large number of open-source projects from source and run their test suites.

To request a community build run on your PR, just ask in a comment on the PR and a Scala team member (probably @SethTisue) will take care of it. (details)

Community builds run on the Scala Jenkins instance. The jobs are named ..-integrate-community-build. See the scala/community-builds repo.

pickling's People

Contributors

ahnfelt, dzufferey, eed3si9n, emchristiansen, gitter-badger, guersam, havocp, heathermiller, jcracknell, jsuereth, jvican, lbliss, phaller, theblackdragon, xeno-by, zaneli


pickling's Issues

List as type parameter error

Hello,

I have this function:

def p[A: SPickler: FastTypeTag](a: A) = a.pickle

When I call it with a Seq[Int], as in p(Seq(1, 2, 3)), it returns

scala.pickling.json.JSONPickle = 
JSONPickle({
  "tpe": "scala.collection.Seq[scala.Int]",
  "elems": [
    1,
    2,
    3
  ]
})

but when I call it with a List[Int], as in p(List(1, 2, 3)), it throws an exception:

scala.NotImplementedError: an implementation is missing
    at scala.Predef$.$qmark$qmark$qmark(Predef.scala:252)
    at scala.pickling.PicklerUnpicklerNotFound.pickle(Custom.scala:19)
    at .p(<console>:17)
    at .<init>(<console>:19)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:983)
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:604)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:568)
    at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:745)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:790)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:702)
    at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:566)
    at scala.tools.nsc.interpreter.ILoop.innerLoop$1(ILoop.scala:573)
    at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:576)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:867)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:889)
    at xsbt.ConsoleInterface.run(ConsoleInterface.scala:57)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:73)
    at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:64)
    at sbt.Console.console0$1(Console.scala:23)
    at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:24)
    at sbt.TrapExit$.executeMain$1(TrapExit.scala:33)
    at sbt.TrapExit$$anon$1.run(TrapExit.scala:42)

Thanks for your help,
Milan

BigDecimal unpickling doesn't work

I'll let the console history speak for itself:

scala> import scala.pickling._
import scala.pickling._

scala> import binary._
import binary._

scala> (2.3: BigDecimal).pickle.unpickle[BigDecimal]
<console>:29: error: Cannot generate an unpickler for BigDecimal. Recompile with -Xlog-implicits for details
              (2.3: BigDecimal).pickle.unpickle[BigDecimal]

CustomPicklers cannot be used in generic functions.

object Test extends App {
  implicit def genCustomPersonPickler[T <: Person](implicit format: PickleFormat) =
    new CustomPersonPickler
  def fn[T <: Person](x: T) = x.pickle
}

case class Person(name: String, age: Int, salary: Int)

class CustomPersonPickler(implicit val format: PickleFormat) extends SPickler[Person] {
  def pickle(picklee: Person, builder: PBuilder): Unit = {}
}

The above code will fail to compile even though a normal implicit conversion can be used in fn. The error is:

[error] exception during macro expansion: 
[error] scala.ScalaReflectionException: type T is not a class
[error]         at scala.reflect.api.Symbols$SymbolApi$class.asClass(Symbols.scala:323)
[error]         at scala.reflect.internal.Symbols$SymbolContextApiImpl.asClass(Symbols.scala:73)
[error]         at scala.pickling.PickleMacros$class.pickleInto(Macros.scala:276)
[error]         at scala.pickling.Compat$$anon$5.pickleInto(Compat.scala:30)
[error]         at scala.pickling.Compat$.PickleMacros_pickleInto(Compat.scala:31)
[error] one error found

This was inspired by issue #3.

Edit: my original code was slightly inaccurate, so I fixed it. A regular implicit with a bound like [T <: Person] can be used in fn, but resolving it as a CustomPersonPickler does not work, though it should.
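For contrast, here is a minimal self-contained sketch of the pattern the reporter expected to work: an implicit instance with an upper bound, resolved inside a generic method. The typeclass here is a hypothetical toy stand-in, not the scala.pickling API; plain implicit resolution handles this case, so the failure above is specific to pickling's macro expansion.

```scala
// Toy typeclass standing in for SPickler; hypothetical, illustration only.
trait ToyPickler[A] { def pickle(a: A): String }

class Person(val name: String)

object ToyPickler {
  // An implicit instance with an upper bound, analogous to
  // genCustomPersonPickler in the issue above.
  implicit def personPickler[T <: Person]: ToyPickler[T] =
    new ToyPickler[T] { def pickle(t: T): String = s"Person(${t.name})" }
}

// A generic method that requires the instance implicitly, like fn above.
def fn[T <: Person](x: T)(implicit p: ToyPickler[T]): String = p.pickle(x)
```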

Pickling Java classes yields empty objects

Hi,
It seems that Java classes aren't currently handled well, whether pickled directly or embedded in fields of pure Scala classes.
The examples below compile successfully, and generate pickled representations.

However in each case:
a) unpickling to the specific type successfully returns an object, but it is incorrectly initialised (i.e. its fields don't match those of the original object).

b) unpickling to [Any] fails at runtime.

This is rather unfortunate, since case (a) in particular could lead to hard-to-detect errors.
It would be great if you could take a look.

package picklingtests
import scala.pickling._
import scala.pickling.json._

import org.joda.time.LocalDate

import org.junit._
import org.junit.Assert._

case class X(date: org.joda.time.LocalDate)

class PickleTest {

  @Test def testPickleCaseClassJavaField_Specific = {
    val dt = LocalDate.now().plusDays(3)
    val x = X(dt)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[X]

    assertEquals(x, restored)
  }

    @Test def testPickleCaseClassJavaField_Any = {
    val dt = LocalDate.now().plusDays(3)
    val x = X(dt)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[Any]

    assertEquals(x, restored)
  }


  @Test def testPickleJavaClass_Any = {
    val x = LocalDate.now().plusDays(3)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[Any]

    assertEquals(x, restored)
  }

  @Test def testPickleJavaClass_Specific = {
    val x = LocalDate.now().plusDays(3)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[LocalDate]

    assertEquals(x, restored)
  }

  @Test def testPickleJavaClass2_Any = {
    val x = new java.awt.Rectangle(1, 2, 3, 4)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[Any]

    assertEquals(x, restored)
  }

  @Test def testPickleJavaClass2_Specific = {
    val x = new java.awt.Rectangle(1, 2, 3, 4)
    val pickled = x.pickle
    println(pickled)
    val restored = pickled.unpickle[java.awt.Rectangle]

    assertEquals(x, restored)
  }

}

Output:

JSONPickle({
  "tpe": "picklingtests.X",
  "date": {

  }
})
JSONPickle({
  "tpe": "picklingtests.X",
  "date": {

  }
})
JSONPickle({
  "tpe": "java.awt.Rectangle"
})
JSONPickle({
  "tpe": "org.joda.time.LocalDate"
})
JSONPickle({
  "tpe": "org.joda.time.LocalDate"
})
JSONPickle({
  "tpe": "java.awt.Rectangle"
})

picklingtests.PickleTest

testPickleCaseClassJavaField_Specific(picklingtests.PickleTest)
java.lang.AssertionError: expected:<X(2013-10-24)> but was:<X(2013-10-21)>


testPickleCaseClassJavaField_Any(picklingtests.PickleTest)
java.util.NoSuchElementException: key not found: x$1
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:235)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:159)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:174)



testPickleJavaClass2_Specific(picklingtests.PickleTest)
java.lang.AssertionError: expected:<java.awt.Rectangle[x=1,y=2,width=3,height=4]> but was:<java.awt.Rectangle[x=0,y=0,width=0,height=0]>


testPickleJavaClass_Any(picklingtests.PickleTest)
java.util.NoSuchElementException: key not found: x$1
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:235)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:159)

testPickleJavaClass_Specific(picklingtests.PickleTest)
java.lang.AssertionError: expected:<2013-10-24> but was:<2013-10-21>


testPickleJavaClass2_Any(picklingtests.PickleTest)
java.util.NoSuchElementException: key not found: x$1
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:235)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:159)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:174)



scala.NotImplementedError when using custom function to unpickle case class to its parent trait type

Hello, here is the problem:

scala> import scala.pickling._
import scala.pickling._

scala> import json._
import json._

scala> trait Fruit
defined trait Fruit

scala> case class Apple() extends Fruit
defined class Apple

// This works just fine: 
scala> Apple().pickle.unpickle[Fruit]
res0: Fruit = Apple()

but when I define a function to unpickle any instance of type T, or any instance derived from trait T, it fails:

scala> def unpickle[T : Unpickler : FastTypeTag](p : JSONPickle) = p.unpickle[T]
unpickle: [T](p: scala.pickling.json.JSONPickle)(implicit evidence$1: scala.pickling.Unpickler[T], implicit evidence$2: scala.pickling.FastTypeTag[T])T

scala> unpickle[Fruit](Apple().pickle)
scala.NotImplementedError: an implementation is missing
    at scala.Predef$.$qmark$qmark$qmark(Predef.scala:252)
    at scala.pickling.PicklerUnpicklerNotFound.unpickle(Custom.scala:20)
    at .unpickle(<console>:13)
    at .<init>(<console>:18)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:983)
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:604)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:568)
    at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:745)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:790)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:702)
    at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:566)
    at scala.tools.nsc.interpreter.ILoop.innerLoop$1(ILoop.scala:573)
    at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:576)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:867)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:889)
    at xsbt.ConsoleInterface.run(ConsoleInterface.scala:69)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
    at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:77)
    at sbt.Console.sbt$Console$$console0$1(Console.scala:23)
    at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:24)
    at sbt.TrapExit$.sbt$TrapExit$$executeMain$1(TrapExit.scala:33)
    at sbt.TrapExit$$anon$1.run(TrapExit.scala:42)

The function works fine with a concrete type:
scala> unpickle[Apple](Apple().pickle)
res1: Apple = Apple()

Thanks a lot,
Milan

generic function with pickling

Is it possible to write a generic function that pickles a given argument of a yet-unknown type?
Something like:

object Barrel {
  def put[A](objx: A) = objx.pickle.value
  def get[A](valx: String) = valx.unpickle[A]
}
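Such a generic function can only compile if the compiler can supply the pickler and unpickler at each call site, which means the type parameter needs context bounds. A self-contained sketch with a toy typeclass (hypothetical, not the actual scala.pickling signatures):

```scala
// Toy typeclass standing in for SPickler; hypothetical, illustration only.
trait Enc[A] { def put(a: A): String }

object Enc {
  implicit val intEnc: Enc[Int] =
    new Enc[Int] { def put(a: Int): String = a.toString }
  implicit def listEnc[A](implicit e: Enc[A]): Enc[List[A]] =
    new Enc[List[A]] {
      def put(as: List[A]): String = as.map(e.put).mkString("[", ",", "]")
    }
}

object Barrel {
  // The context bound [A: Enc] makes the instance available at call sites.
  def put[A: Enc](objx: A): String = implicitly[Enc[A]].put(objx)
}
```

A get[A] method would need the corresponding unpickling typeclass as a context bound in the same way.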

Design of PickleFormat trait

Hi!

I've created a library using the power of scala.pickling.

While working on that library, I struggled with PickleFormat.

The problem is that a single type covers both the pickle and unpickle operations. When serializing into a database, reading and writing use different kinds of objects: a Statement class takes write parameters, while ResultSet/Row classes are used for reading.
For example, PickleFormat requires you to define a parameterless createBuilder method that creates an instance of an empty output type. This is problematic with a database, where you have a session/connection object and the output (Statement) is always bound to a connection. My proposal is to define separate traits for the input and output types:

trait PickleInputFormat {
  def createReader(pickle: PickleType, mirror: Mirror): PReader
}

trait PickleOutputFormat {
  type OutputType
  def createBuilder(): PBuilder
  def createBuilder(out: OutputType): PBuilder
}

This would allow a pickle method with implicit parameters that accepts the database's Statement/Session where it's available. The corresponding approach applies on the reading side with PickleInputFormat.
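A self-contained sketch of how the proposed split could look in use, with toy stand-ins for Statement and PBuilder (all names here are hypothetical, not the scala.pickling or Cassandra driver types):

```scala
import scala.collection.mutable

// Toy stand-ins for illustration only.
trait PBuilder { def putField(name: String, value: String): Unit }

final class Statement {
  // Simulates parameters bound to a database statement.
  val bound: mutable.Map[String, String] = mutable.Map.empty
}

trait PickleOutputFormat {
  type OutputType
  // The builder is created FROM an existing output object, so it can be
  // bound to a live statement/connection rather than created from nothing.
  def createBuilder(out: OutputType): PBuilder
}

object StatementFormat extends PickleOutputFormat {
  type OutputType = Statement
  def createBuilder(out: Statement): PBuilder = new PBuilder {
    def putField(name: String, value: String): Unit = out.bound(name) = value
  }
}
```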

If you're interested in the library please see it here: https://github.com/InnovaCo/capickling
The library targets Cassandra and supports binding (which is actually serialization) to a database statement and mapping (deserialization) from database results.

Examples do not show pickling to and from a file

I'd like to pickle an object whose in-memory size is over 8 GB. I can't pickle it to a string. Is there an interface to pickle directly to a file? If so, could we show it in the README?
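I don't know of a streaming interface in pickling itself. When the pickled output does fit in memory (unlike the multi-gigabyte case in this issue), plain java.nio can write out the byte array that a binary pickle exposes. This is a generic sketch, not a pickling API, and it does not solve the streaming problem, since the whole array must still be materialized:

```scala
import java.nio.file.Files

// Hypothetical: 'bytes' stands in for the Array[Byte] of a binary pickle.
val bytes: Array[Byte] = Array[Byte](1, 2, 3)

// Write the pickle to a file and read it back.
val path = Files.createTempFile("pickle-demo", ".bin")
Files.write(path, bytes)
val readBack: Array[Byte] = Files.readAllBytes(path)
```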

Unable to deserialize Option[X] member of case class.

This is the minimal test necessary to reproduce. I have a more complicated case class in my code base, but the Option[X] member appears to be the problem. The October 13th snapshot update seems to have introduced the bug. I get the same error in 2.10.3-RC2 as well as 2.10.2.

import scala.pickling._
import scala.pickling.binary._


object Test {
  case class TestA (x: Option[Int])
  TestA(Some(1)).pickle.unpickle[TestA]

  case class TestB (x: Option[String])
  TestB(Some("hello")).pickle.unpickle[TestB]
}

I turned on -Xlog-implicits to get the following error on compile

...
[info] /Users/ashton/Projects/event_api/src/main/scala/Pickling.scala:7: pickling.this.Unpickler.genUnpickler is not a valid implicit value for scala.pickling.Unpickler[Test.TestA] because:
[info] hasMatchingSymbol reported error: value forcefulSet is not a member of reflect.runtime.universe.FieldMirror
[info]   TestA(Some(1)).pickle.unpickle[TestA]
[info]                                 ^
[error] /Users/ashton/Projects/event_api/src/main/scala/Pickling.scala:7: Cannot generate an unpickler for Test.TestA. Recompile with -Xlog-implicits for details
[error]   TestA(Some(1)).pickle.unpickle[TestA]
[error]
...

I was able to find the FieldMirror class in the nightly docs
http://www.scala-lang.org/files/archive/nightly/docs-2.10.2/library/index.html#scala.reflect.api.Mirrors$FieldMirror

forcefulSet does not appear to be a member there, but it does appear to be added by the RichFieldMirror implicit in core/src/main/scala/pickling/internal/package.scala.

The following does work

scala> val x:Option[Int] = Some(1)
x: Option[Int] = Some(1)

scala> val y = x.pickle
y: scala.pickling.binary.BinaryPickle = BinaryPickle([0,0,0,21,115,99,97,108,97,46,83,111,109,101,91,115,99,97,108,97,46,73,110,116,93,0,0,0,1])

scala> y.unpickle[Option[Int]]
res18: Option[Int] = Some(1)

Running into problems when pickling Map and HashMap

steps (0.10.0)

scala-pickling> console
[info] Starting scala interpreter...
[info] 
Welcome to Scala version 2.11.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51).
Type in expressions to have them evaluated.
Type :help for more information.

scala> import scala.collection.immutable
import scala.collection.immutable

scala> import scala.pickling._, Defaults._, json._
import scala.pickling._
import Defaults._
import json._

scala> immutable.HashMap(( for (j <- 0 until 20) yield j -> "Test" ): _* ).pickle.value

problem (0.10.0)

scala> println(immutable.HashMap(( for (j <- 0 until 20) yield j -> "Test" ): _* ).pickle.value)
{
  "$type": "scala.collection.immutable.HashMap.HashTrieMap",
  "bitmap": 2044127639,
  "elems": {
    "elems": [
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    },
      {
      "$type": "scala.collection.immutable.HashMap"
    }
    ]
  },
  "size0": 20
}

original report

My data structure is stored as a Map, and when I try to pickle it I run into some weird issues.

When pickling an immutable.HashMap using the following example code,

import scala.collection.immutable
import scala.pickling._
import json._

object Test {
  def main(args : Array[String]) {
    println(immutable.HashMap(( for (j <- 0 until 20) yield j -> "Test" ): _* ).pickle.toString)
  }
}

as output I get

JSONPickle({
  "tpe": "scala.collection.immutable.HashMap.HashTrieMap",
  "bitmap": 2044127639,
  "elems": {

  },
  "size0": 20
})

If I use instead the following snippet of code

import scala.collection.Map
import scala.pickling._
import json._

object Test {
  def main(args : Array[String]) {
    println(Map(( for (j <- 0 until 20) yield j -> "Test" ): _* ).pickle.toString)
  }
}

I also end up with

JSONPickle({
  "tpe": "scala.collection.immutable.HashMap.HashTrieMap",
  "bitmap": 2044127639,
  "elems": {

  },
  "size0": 20
})

Only when I don't explicitly import Map do I end up with the correct output.

Support for OpenJDK 6

The API of sun.misc.Unsafe in OpenJDK 6 is slightly smaller than in Oracle JDK 6. It would be good to also support OpenJDK 6.

Pickling - possible race condition?

Hey guys,

I have the following problem:

I am trying to pickle this case class:

trait GuiMessage
case class SkillMatrix(operators: Seq[String], skills: Seq[String], enabledSkills: Seq[(String,String)]) extends GuiMessage

The problem is that when I try to pickle this class from two different threads at the same time, I get the following:

{x: Any => val pickled = x.asInstanceOf[SkillMatrix].pickle.value; println(s"pickling $x into $pickled"); pickled} // this function is called from two different threads
pickling SkillMatrix(List(1234, 4312),List(skill1, skill2),List((1234,skill1), (1234,skill2), (4312,skill1))) into {
  "tpe": "com.spinoco.horus.message.GuiMessage.SkillMessage.SkillMatrix",
  "operators": {
    "tpe": "scala.collection.Seq[java.lang.String]",
    "elems": [
      "1234",
      "4312"
    ]
  },
  "skills": {
    "tpe": "scala.collection.Seq[java.lang.String]",
    "elems": [
      "skill1",
      "skill2"
    ]
  },
  "enabledSkills": {
    "tpe": "scala.collection.Seq[scala.Tuple2[java.lang.String,java.lang.String]]",
    "elems": [
      { "$ref": 2 },
      {
      "tpe": "scala.Tuple2[java.lang.String,java.lang.String]",
      "a": "1234",
      "b": "skill2"
    },
      { "$ref": 4 }
    ]
  }
}
pickling SkillMatrix(List(1234, 4312),List(skill1, skill2),List((1234,skill1),(1234,skill2),(4312,skill1))) into {
  "tpe": "com.spinoco.horus.message.GuiMessage.SkillMessage.SkillMatrix",
  "operators": {
    "tpe": "scala.collection.Seq[java.lang.String]",
    "elems": [
      "1234",
      "4312"
    ]
  },
  "skills": {
    "tpe": "scala.collection.Seq[java.lang.String]",
    "elems": [
      "skill1",
      "skill2"
    ]
  },
  "enabledSkills": {
    "tpe": "scala.collection.Seq[scala.Tuple2[java.lang.String,java.lang.String]]",
    "elems": [
      {
      "tpe": "scala.Tuple2[java.lang.String,java.lang.String]",
      "a": "1234",
      "b": "skill1"
    },
      { "$ref": 3 },
      {
      "tpe": "scala.Tuple2[java.lang.String,java.lang.String]",
      "a": "4312",
      "b": "skill1"
    }
    ]
  }
}

As you can see, there are some really odd "$ref"s that I didn't ask for. Am I doing something wrong, or is this an error on your end? To me, it looks like some kind of race condition when pickling the same object in two places at the same time.

Thanks

Akka Support

Hey, thanks a lot for the great work! This would solve a big problem for me if I could use it with Akka.

Viktor mentions @ https://groups.google.com/forum/#!searchin/akka-user/pickling/akka-user/CHeusozMtuQ/lfMtqHu0FvoJ that there might be some issues with Akka support because Pickling generates code at compile-time.

Are there any plans for an Akka extension? Can you elaborate in more detail what potential problem Viktor is referring to and what approaches might solve it?

Thank you very much and kind regards,
Philip

Please explain how to deal with PicklerUnpicklerNotFound

I'm currently getting a runtime NotImplementedError from PicklerUnpicklerNotFound, and I'd like to know what the general strategies should be for working around it.
As this is the most common bug I've found while using pickling, maybe it deserves a section in the README.

I'm also wondering why it even exists; it seems to push detectable compile-time errors (no pickler found) into a runtime error.

Pickle a class where one field is an enum

Hi, enums are not serialized nicely.
The most specific type of the enum is not in the pickled output, and the information about which value of the enumeration was pickled is completely missing.

 "source": {
   "tpe": "Enumeration.this.Val"
 }

It would be nice if you'd look at it.

Support for common collections traits

First of all, thanks for doing this work -- very cool and practical idea.

Is there a plan in place to handle automatic pickler dispatch for e.g.

Seq(1, 2, 3).pickle

Unfortunately, it's one of the first examples I tried. I think I understand the concept; custom pickler plug-ins could potentially provide a DPickler to handle this reflectively. However, would there be a way to relieve every plug-in author of having to do this, perhaps by importing some optional implicits?

I started work on custom picklers for Apache Avro in topology-io/pickling-avro. This is currently on hold at least until the stable release, mod spare time. I have another project which uses runtime reflection to serialize/deserialize arbitrary types to Avro in GenslerAppsPod/scalavro and am hoping to improve upon that implementation by taking advantage of the macros provided in scala-pickling.

Salat

Is it possible to see whether there is a way to interop with Salat and avoid duplication and framework fragmentation?

Get rid of EncodedOutput

Use binary.Util directly for the encoding/decoding details, have a simple Output, and get rid of EncodedOutput.

Apparent regression

I think a bug was added within the last 11 days to 0.8.0-SNAPSHOT.
I haven't been able to reproduce it as a test case in the scala-pickling project, but I do have a fairly simple external project in which the bug shows up.

The project is https://github.com/emchristiansen/PersistentMap.
It passed its builds on 3 October, but failed when it was rebuilt today.
Nothing of significance changed in that time (the logs show the Scala version changed, but I have also tested with the previous version).

The build error comes from this file: https://github.com/emchristiansen/PersistentMap/blob/master/src/test/scala/st/sparse/persistentmap/testPersistentMap.scala

The error is

[error] /media/psf/Home/Dropbox/t/2013_q4/persistentmap/src/test/scala/st/sparse/persistentmap/testPersistentMap.scala:52: Cannot generate an unpickler for st.sparse.persistentmap.MyValue. Recompile with -Xlog-implicits for details
[error]     val map = PersistentMap.create[MyKey, MyValue]("test", database)

Enabling -Xlog-implicits, we have

[info] /media/psf/Home/Dropbox/t/2013_q4/persistentmap/src/test/scala/st/sparse/persistentmap/testPersistentMap.scala:93: pickling.this.Unpickler.genUnpickler is not a valid implicit value for scala.pickling.Unpickler[st.sparse.persistentmap.MyValue] because:
[info] hasMatchingSymbol reported error: value forcefulSet is not a member of reflect.runtime.universe.FieldMirror
[info]     val map2 = PersistentMap.connect[MyKey, MyValue]("test", database).get

I have also distilled the error into a test case here: https://github.com/emchristiansen/pickling/blob/picklerGenerationTest/core/src/test/scala/pickling/generic-spickler.scala.
See the test labeled "possible regression".

Scala pickle creates needless copies of objects

In this code

object Test extends App {
        val x = new X
        val p = x.pickle
        val y = p.unpickle[X]
        x.a(0) = 4
        y.a(0) = 4
        println(x.b.mkString) //prints 423
        println(y.b.mkString) //prints 123
}

class X {
        var a = Array(1,2,3)
        var b = a
}

X will pickle into:

JSONPickle({
  "tpe": "X",
  "a": [
    1,
    2,
    3
  ],
"b": [
  1,
  2,
  3
]
})

which will unpickle into an X with two copies of the same array. In the above example, y is an unpickled x. Instead of b pointing to the same array object as a, it now points to a new, different array.

This is related to issue #1
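The general fix is identity-based sharing: track already-pickled objects by reference and emit a back-reference instead of a second copy. A conceptual sketch of the tracking side (my own illustration, not the library's implementation):

```scala
import java.util.IdentityHashMap

// Conceptual sketch of share-by-reference tracking; not scala.pickling's code.
final class RefTracker {
  private val seen = new IdentityHashMap[AnyRef, Integer]()
  private var next = 0

  /** Left(existingId) if obj was pickled before, else Right(freshId). */
  def idFor(obj: AnyRef): Either[Int, Int] =
    Option(seen.get(obj)) match {
      case Some(id) => Left(id) // emit a "$ref" to id instead of re-pickling
      case None =>
        val id = next; next += 1
        seen.put(obj, id)
        Right(id)
    }
}
```

On unpickle, the same id-to-object table is rebuilt so that both fields end up pointing at the one restored array.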

Benchmark of usefulness of endCollection having the collection size

Currently the correct collection size is only guaranteed at endCollection; beginCollection receives the real size only for indexed sequences:

if (coll.isInstanceOf[IndexedSeq[_]]) builder.beginCollection(coll.size)
else builder.beginCollection(0)

What would be the impact of

builder.beginCollection(coll.size)

and getting rid of the length argument of endCollection?

Benefits:

  • Incremental
  • Simpler API
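The proposed shape, sketched with a stand-in trait (not the real PBuilder). The trade-off to benchmark is that coll.size is O(n) for a List, which is presumably why the current code only passes the real size for IndexedSeq:

```scala
// Stand-in builder interface to illustrate the proposal; not scala.pickling's PBuilder.
trait SimpleBuilder {
  def beginCollection(size: Int): Unit // size always supplied up front
  def putElement(elem: Any): Unit
  def endCollection(): Unit            // no length argument any more
}

def pickleColl(coll: Traversable[Any], builder: SimpleBuilder): Unit = {
  builder.beginCollection(coll.size) // O(n) for a List: one extra traversal
  coll.foreach(builder.putElement)
  builder.endCollection()
}
```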

Scala pickle cannot handle Map

For this code

import scala.pickling._
import json._


object Test extends App {
  val elems = Map("elem1" -> 1,
                  "elem2" -> 2,
                  "elem3" -> 3)

  val pickle = elems.pickle
  println(pickle)
}

elems will pickle into:

JSONPickle({
  "tpe": "scala.collection.immutable.Map.Map3"
})

which is wrong.

It would be nice to have something like the following (note: the "tpe" below doesn't correspond to a real type; it's just an example):

JSONPickle({
  "tpe": "scala.collection.immutable.Map.Map3[scala.Predef.String, scala.Int]",
  "elems" : {
    "elem1": 1,
    "elem2": 2,
    "elem3": 3
  }
})

Runtime crash when pickling a FastTypeTag

Maybe it's a bit weird, but I want to serialize a FastTypeTag so I can do type verification for a type-safe key-value store I hacked together.
Unfortunately, while pickling seems to work, unpickling causes a runtime error.

steps

This code:

import scala.pickling._, Defaults._
import scala.pickling.binary._

val ftt = implicitly[FastTypeTag[Int]]
val pickled = ftt.pickle
// Crashes in this line.
val unpickled = pickled.unpickle[FastTypeTag[Int]]

problem

fails with this runtime error:

[info]   scala.reflect.internal.MissingRequirementError: class $anon$1 not found.
[info]   at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[info]   at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[info]   at scala.reflect.internal.Mirrors$RootsBase.ensureClassSymbol(Mirrors.scala:90)
[info]   at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:119)
[info]   at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:21)
[info]   at scala.pickling.package$.typeFromString(package.scala:65)
[info]   at scala.pickling.FastTypeTag$.apply(FastTags.scala:56)
[info]   at scalatestextra.TestPersistentMap$$anonfun$1.apply$mcV$sp(testPersistentMap.scala:29)
[info]   at scalatestextra.TestPersistentMap$$anonfun$1.apply(testPersistentMap.scala:26)
[info]   at scalatestextra.TestPersistentMap$$anonfun$1.apply(testPersistentMap.scala:26)

What happens if my case class has a field named `tpe`?

Not sure if this is the right place to ask, or if it has been brought up elsewhere before, but when I have a case class such as

case class Ticket(id: String, tpe: String)

The pickler will give me back a JSON string with two "tpe"s:

scala> Ticket(java.util.UUID.randomUUID.toString, "A1").pickle
res10: scala.pickling.json.JSONPickle = 
JSONPickle({
  "tpe": "Ticket",
  "id": "e98e47db-255b-4594-82e3-63aeec3df093",
  "tpe": "A1"
})

which would in turn fail to unpickle:

scala> Ticket(java.util.UUID.randomUUID.toString, "A1").pickle.unpickle
scala.reflect.internal.MissingRequirementError: class A1 not found.
...

Is there a way I can avoid the above behavior, e.g. by telling the library to leave out the auto-generated tpe field, other than rolling my own pickler/unpickler? I could always provide the necessary class info when I intend to unpickle, so, for me at least, the full class name saved in tpe seems pretty much redundant.

Thanks.

BTW I'm using 0.8.0-SNAPSHOT with Scala 2.10.3.
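Until the metadata key is namespaced, one workaround (entirely my own sketch, not a library feature) is to keep a field literally named tpe out of the pickled representation by converting at the boundary:

```scala
case class Ticket(id: String, tpe: String)

// Wire-level twin with a non-clashing field name; pickle this instead of Ticket.
case class TicketWire(id: String, ticketType: String)

def toWire(t: Ticket): TicketWire   = TicketWire(t.id, t.tpe)
def fromWire(w: TicketWire): Ticket = Ticket(w.id, w.ticketType)

// usage (with the usual pickling imports in scope):
// toWire(ticket).pickle
// fromWire(pickled.unpickle[TicketWire])
```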

Schema evolution handling

Hi,

Scala Pickling looks highly promising so that I want to migrate an existing Eventsourced application which uses default Java serialization for event storage.

Currently the biggest concern other than #27 is schema versioning. Application requirements change as time goes by, so eventually I'll need to add/rename/remove some fields from event messages, and then compatibility with previous versions of a message will become an issue.

Is there any plan or recommendation about this?
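I'm not aware of built-in support; one pattern that works with any serializer (a sketch with illustrative names, not a scala.pickling feature) is to version the payload explicitly and keep an upgrade path from each old version:

```scala
// Illustrative versioning envelope; not a scala.pickling feature.
case class Envelope(version: Int, payload: Array[Byte])

case class UserCreatedV1(name: String)
case class UserCreatedV2(name: String, email: Option[String])

def upgrade(v1: UserCreatedV1): UserCreatedV2 =
  UserCreatedV2(v1.name, email = None) // default for the added field

def decode(e: Envelope): UserCreatedV2 = e.version match {
  case 1 => upgrade(decodeV1(e.payload))
  case 2 => decodeV2(e.payload)
  case n => sys.error(s"unknown event version $n")
}

// These would unpickle the bytes as the corresponding versioned class:
def decodeV1(bytes: Array[Byte]): UserCreatedV1 = ??? // bytes.unpickle[UserCreatedV1]
def decodeV2(bytes: Array[Byte]): UserCreatedV2 = ??? // bytes.unpickle[UserCreatedV2]
```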

Weird behavior when pickling org.joda.time.DateTime

When pickling and then unpickling a DateTime, the original value of the DateTime is lost, replaced with the current time.

Here's a console session:

import scala.pickling._
import binary._
import org.joda.time.DateTime

val date = DateTime.now
// date: org.joda.time.DateTime = 2013-09-16T13:04:39.214+02:00

// The unpickled date is different.
date.pickle.unpickle[DateTime]
// org.joda.time.DateTime = 2013-09-16T13:05:51.761+02:00

// Calling the same line again produces yet another result.
date.pickle.unpickle[DateTime]
// org.joda.time.DateTime = 2013-09-16T13:06:08.124+02:00
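A workaround until there's a proper joda-time pickler: pickle the epoch millis (a plain Long, which round-trips fine) and rebuild the DateTime by hand. getMillis and the DateTime(long) constructor are standard joda-time API; note that this drops the original chronology/time zone:

```scala
import scala.pickling._
import binary._
import org.joda.time.DateTime

val date = DateTime.now
// Pickle the instant, not the DateTime object:
val pickled = date.getMillis.pickle
// Rebuild on the way out; the instant is preserved:
val restored = new DateTime(pickled.unpickle[Long])
assert(restored.getMillis == date.getMillis)
```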

"invalid index 1 in unpicklee cache of length 1" during unpickling where object ref is repeated

steps (0.10.0)

scala-pickling> console
[info] Starting scala interpreter...
[info] 
Welcome to Scala version 2.11.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51).
Type in expressions to have them evaluated.
Type :help for more information.

scala> import scala.pickling._, Defaults._, json._
import scala.pickling._
import Defaults._
import json._

scala> case class StringProp(prop: String) {}
defined class StringProp

scala> class PropTest(val prop1: StringProp, val prop2: StringProp) {}
defined class PropTest

scala> val obj = StringProp("test")
obj: StringProp = StringProp(test)

scala> val pickle = new PropTest(obj, obj).pickle.unpickle[PropTest]

problems

scala> val pickle = new PropTest(obj, obj).pickle.unpickle[PropTest]
scala.pickling.PicklingException: error in unpickle of primitive unpickler 'scala.pickling.refs.Ref':
tag in unpickle: 'scala.pickling.refs.Ref'
message:
fatal error: invalid index 1 in unpicklee cache of length 1
  at scala.pickling.pickler.PrimitivePickler.unpickle(Primitives.scala:31)
  at $line12$readiwiwiwiwPropTestUnpickler$macro$4$2$$line11$readiwiwiwiwStringPropUnpickler$macro$12$2$.unpickle(<console>:20)
  at $line12$readiwiwiwiwPropTestUnpickler$macro$4$2$.unpickle(<console>:20)
  at scala.pickling.Unpickler$class.unpickleEntry(Pickler.scala:79)
  at $line12$readiwiwiwiwPropTestUnpickler$macro$4$2$.unpickleEntry(<console>:20)
  at scala.pickling.functions$.unpickle(functions.scala:11)
  at scala.pickling.UnpickleOps.unpickle(Ops.scala:23)
  ... 43 elided

original report

Here is a sample use case to produce an Invalid Index exception when unpickling:

import scala.pickling._
import json._
object JsonTest extends App {
  val obj = StringProp("test")
  val pickle = new PropTest(obj, obj).pickle.unpickle[PropTest]
}
case class StringProp(prop: String) {}
class PropTest(val prop1: StringProp, val prop2: StringProp) {}

As reported on StackOverflow:
http://stackoverflow.com/questions/19436070/invalid-index-unpickling-where-object-ref-is-repeated

Pickling case classes with objects and supertypes

Hi,

I am having some problems with pickling. It seems to do something odd when pickling case classes that contain objects. Consider this REPL session:

scala> import scala.pickling._; import json._
import scala.pickling._
import json._

scala> trait A
defined trait A

scala> case class B(x: Option[Int]) extends A
defined class B

scala> B(Some(1)).pickle.unpickle[B] == B(Some(1))
res1: Boolean = true 

scala> B(None).pickle.unpickle[B] == B(None)
res2: Boolean = true

scala> B(None).pickle.unpickle[A]
res4: A = B(None)

!!!!!!!!!
scala> B(None).pickle.unpickle[A] == B(None)
res3: Boolean = false 
!!!!!!!!!

scala> (B(None): A) == B(None)
res6: Boolean = true

As you can see, there is something odd going on with the == method in the highlighted example above. Am I doing something wrong? This happens only if we unpickle B using the unpickle[A] method (as far as I know).
Thanks,
Tomas

SPickler does not support traits

I tried to define a dummy SPickler this way:

class CustomPickler[T](implicit val format: PickleFormat) extends SPickler[T]{
  def pickle(t: T, builder: PBuilder): Unit = {
    println("In my pickle")
    builder.beginEntry(t)
    builder.endEntry
  }
}

then I defined an implicit this way:

  implicit def cPickler = new CustomPickler[X]
  println((new Y(0)).pickle)

where Y extends the trait X:

trait X {
  def id: Int
}

class Y(val id: Int) extends X

but it does not apply the pickle method I defined in the CustomPickler class, whereas it does if I replace X with Y in the definition of the implicit. Conclusion: SPickler does not support traits (if I coded things correctly), which is a big limitation.

Can't generate unpickler in class hierarchy when passing object params to parent

From StackOverflow:

The example below pickles fine, but I get a compile error stating that no unpickler can be generated. Here is a simple test case to reproduce this:

    import scala.pickling._
    import json._
    object JsonTest extends App {
      val simplePickle = new Simple(new SimpleProp("TestProp")).pickle
      val simpleUnpickle = simplePickle.unpickle[Simple]
    }
    abstract class SimpleAbstract(val stringWrapper: SimpleProp) {}
    class Simple(stringWrapper: SimpleProp) extends SimpleAbstract(stringWrapper) {}
    case class SimpleProp(prop: String) {}

Please let me know if you need any additional information on this.

Here is the stacktrace I get with the -Xlog-implicits flag on:

pickling.this.Unpickler.genUnpickler is not a valid implicit value for scala.pickling.Unpickler[com.ft.flexui.modules.pm.Simple] because:
exception during macro expansion: 
java.util.NoSuchElementException: None.get
    at scala.None$.get(Option.scala:313)
    at scala.None$.get(Option.scala:311)
    at scala.pickling.Macro.reflectively(Tools.scala:380)
    at     scala.pickling.UnpicklerMacros$$anonfun$impl$2$$anonfun$16.apply(Macros.scala:248)
    at     scala.pickling.UnpicklerMacros$$anonfun$impl$2$$anonfun$16.apply(Macros.scala:245)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
at scala.pickling.UnpicklerMacros$$anonfun$impl$2.unpickleObject$1(Macros.scala:245)
at scala.pickling.UnpicklerMacros$$anonfun$impl$2.unpickleLogic$1(Macros.scala:273)
at scala.pickling.UnpicklerMacros$$anonfun$impl$2.apply(Macros.scala:291)
at scala.pickling.UnpicklerMacros$$anonfun$impl$2.apply(Macros.scala:184)
at scala.pickling.Macro.preferringAlternativeImplicits(Tools.scala:357)
at scala.pickling.UnpicklerMacros$class.impl(Macros.scala:184)
at scala.pickling.Compat$$anon$3.impl(Compat.scala:24)
at scala.pickling.Compat$.UnpicklerMacros_impl(Compat.scala:25)
at sun.reflect.GeneratedMethodAccessor270.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at scala.tools.nsc.typechecker.Macros$$anonfun$scala$tools$nsc$typechecker$Macros$$macroRuntime$3$$anonfun$apply$8.apply(Macros.scala:544)
at scala.tools.nsc.typechecker.Macros$$anonfun$scala$tools$nsc$typechecker$Macros$$macroRuntime$3$$anonfun$apply$8.apply(Macros.scala:544)
at scala.tools.nsc.typechecker.Macros$class.scala$tools$nsc$typechecker$Macros$$macroExpandWithRuntime(Macros.scala:830)
at scala.tools.nsc.typechecker.Macros$$anonfun$scala$tools$nsc$typechecker$Macros$$macroExpand1$1.apply(Macros.scala:796)
at scala.tools.nsc.typechecker.Macros$$anonfun$scala$tools$nsc$typechecker$Macros$$macroExpand1$1.apply(Macros.scala:787)
at scala.tools.nsc.Global.withInfoLevel(Global.scala:190)
at scala.tools.nsc.typechecker.Macros$class.scala$tools$nsc$typechecker$Macros$$macroExpand1(Macros.scala:787)
val simpleUnpickle = simplePickle.unpickle[Simple]
                                        ^`

Cannot generate a pickler for a class named 'Unit' when not namespaced.

steps

To reproduce, run from the REPL:

scala> import scala.pickling._, Defaults._
import scala.pickling._
import Defaults._

scala> import json._
import json._

scala> object Container { case class Unit(x: Int) }
defined module Container

scala> import Container._
import Container._

scala> case class C(x: Unit)
defined class C

scala> val c = C(Unit(3))
c: C = C(Unit(3))

scala> c.pickle

problem (0.10.0)

scala> c.pickle
<console>:24: error: Cannot generate a pickler for C. Recompile with -Xlog-implicits for details
              c.pickle
                ^

problem from the original report

fails with:

<console>:20: pickling.this.SPickler.genPickler is not a valid implicit value for scala.pickling.SPickler[C.type] because:
hasMatchingSymbol reported error: type mismatch;
 found   : scala.Unit
 required: Container.Unit
              C.pickle

expectations

Note that redefining C with a fully qualified path, without importing Container._:

case class C(x: Container.Unit)
val c = C(Container.Unit(3))
c.pickle

evaluates to

JSONPickle({
  "tpe": "C",
  "x": {
    "x": 3
  }
})

which is correct.

publish roadmap

It would be handy to have - of course, as estimation - a roadmap to the first point which can be named more or less stable. Thanks!

POM file doesn't validate

I was trying to download the snapshot version (0.8.0-SNAPSHOT) through a proxy repository (Artifactory); however, that failed because the XML in the POM doesn't validate against the XSD schema for POMs.

The problem is that there are two organization elements, while the XSD for the POM only allows one.

In Build.scala an extra organization element is defined in pomExtra. Just removing the organization element from pomExtra would fix this problem. Sbt already has a setting key that can be used to set the organization name.
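In sbt terms the fix would look like this (organization, organizationName and organizationHomepage are real sbt setting keys; the concrete values below are illustrative):

```scala
// In Build.scala: use sbt's own keys for the Maven <organization> element
// instead of a second <organization> block inside pomExtra.
organization := "org.scala-lang.modules",                 // illustrative value
organizationName := "LAMP/EPFL",                          // illustrative value
organizationHomepage := Some(url("http://lamp.epfl.ch")), // illustrative value
// ...and drop the <organization> element from pomExtra entirely.
```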

Out of memory while compiling

I'm getting java.lang.OutOfMemoryError: Java heap space while compiling my project with pickling since a recent update. It used to compile fine last week, and it still works fine as long as I comment out simulation.pickle in class Parser.

I'm running sbt compile with default parameters:
/usr/bin/java -Xmx512M -jar /usr/local/Cellar/sbt/0.13.0/libexec/sbt-launch.jar compile
but I have tried to increase my heap up to 4GB, only to get: java.lang.OutOfMemoryError: GC overhead limit exceeded

The process uses 100% of one cpu until it reaches the heap limit, then it uses every cpu as it is forced to GC.

Most recent error log
Another error log

I'm currently trying to reproduce this with a smaller part of the project, but I only manage to get java.io.IOException: File name too long, as in #10, instead of the intended OutOfMemoryError.

Case class with List/Array/Seq does not unpickle to its parent type

steps

Hello, I have an issue with unpickling derived case classes.
Example (an additional import Defaults._ is needed for 0.10.x):

scala> import scala.pickling._, Defaults._
import scala.pickling._

scala> import json._
import json._

scala> trait A
defined trait A

scala> case class B(a: List[Int]) extends A
defined class B

scala> B(List(1,2,3,4)).pickle.unpickle[B]
res1: B = B(List(1, 2, 3, 4))

scala> B(List(1,2,3,4)).pickle.unpickle[A]

problem

scala> B(List(1,2,3,4)).pickle.unpickle[A]
java.util.NoSuchElementException: key not found: hd
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:234)
    at scala.pickling.json.JSONPickleReader.readField(JSONPickleFormat.scala:158)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:172)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:171)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2.fieldVals$1(Runtime.scala:171)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2.unpickle(Runtime.scala:202)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:188)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2$$anonfun$fieldVals$1$1.apply(Runtime.scala:171)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2.fieldVals$1(Runtime.scala:171)
    at scala.pickling.InterpretedUnpicklerRuntime$$anon$2.unpickle(Runtime.scala:202)
    at .<init>(<console>:17)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:983)
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:604)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:568)
    at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:745)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:790)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:702)
    at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:566)
    at scala.tools.nsc.interpreter.ILoop.innerLoop$1(ILoop.scala:573)
    at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:576)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:867)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:822)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:822)
    at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:889)
    at xsbt.ConsoleInterface.run(ConsoleInterface.scala:57)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:73)
    at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:64)
    at sbt.Console.console0$1(Console.scala:23)
    at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:24)
    at sbt.TrapExit$.executeMain$1(TrapExit.scala:33)
    at sbt.TrapExit$$anon$1.run(TrapExit.scala:42)

note

this works fine:

scala> trait A
defined trait A

scala> case class B(a: Int) extends A
defined class B

scala> B(2).pickle.unpickle[A]
res3: A = B(2)
