
virtuslab / akka-serialization-helper

Serialization toolbox for Akka messages, events and persistent state that helps achieve a compile-time guarantee of serializability. No more errors at runtime!

License: MIT License

Languages: Scala 97.15%, Shell 1.77%, sed 1.08%
Topics: akka, compiler-plugin, sbt-plugin, scala

akka-serialization-helper's People

Contributors

aluscent, dependabot[bot], hubertbalcerzak, lukaszkontowski, marconzet, miloradvojnovic, pawellipski, scala-steward


akka-serialization-helper's Issues

Research the options for event migration

Please research https://github.com/scalapenos/stamina.

  1. Does this library automatically detect accidental changes to events introduced by the programmer? That's what I'm most afraid of in Hydra, TBH (or in any event-based software, for that matter)
  2. It seems that Stamina runs on top of spray-json... can this be replaced by Borer or Kryo (#27)?
  3. Provide some short example of event migration in code (see the sketch after this list)... does the API feel mature enough? Did you need to hack anything or check the library's source code to fix unexpected problems?
  4. Are there any alternatives out there on GitHub that claim to do the same thing?
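
As a starting point for item 3: based on my recollection of Stamina's README (unverified - please double-check against the actual docs), a versioned persister with a migration looks roughly like this; ItemAdded and the discount field are made up for illustration:

import stamina._
import stamina.json._
import spray.json.lenses.JsonLenses._

// Suppose V2 of ItemAdded added a `discount` field that old events lack;
// the migration patches old JSON on read, so stored events never need rewriting.
val itemAddedPersister = persister[ItemAdded, V2]("item-added",
  from[V1].to[V2](_.update('discount ! set[Int](0))))

Note that migrations like this only help with changes you know about - they won't detect accidental ones (question 1).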

Consider changing the format of event dumps for better readability of the diffs

Hmm, when I compared the diffs (OpenAPI YAML diff vs event dump diff) for Hydra PR #1101, it looks to me that the diffs for event dumps are significantly less readable. Not sure if it's a matter of YAML vs compact JSON or just a difference in the structure of the two... but still, I'd check whether switching to YAML or non-compact JSON would help with reading the diffs of event dumps.
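
To illustrate with spray-json (a minimal sketch; the actual dump format may differ): with compact printing the whole event is a single line, so any change produces one unreadable mega-hunk in the diff, while pretty-printing puts one field per line, so the diff pinpoints exactly what changed:

import spray.json._

val event = """{"id":42,"name":"foo","tags":["a","b"]}""".parseJson

println(event.compactPrint)
// {"id":42,"name":"foo","tags":["a","b"]}

println(event.prettyPrint)
// {
//   "id": 42,
//   "name": "foo",
//   "tags": ["a", "b"]
// }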

Consider switching the underlying serializer to akka-kryo (reflection-based :/)

I've just checked https://github.com/altoo-ag/akka-kryo-serialization (shame I didn't know about it before!).

Pros:

  • It seems more hassle-free than Borer: it doesn't require ANY codec to be declared...
  • Tested on Hydra; it just works (all tests pass, took me 15 minutes to switch)
  • An Akka serializer is built in (unlike Borer, where we needed to add it ourselves)
  • Codecs for typed ActorRefs are built in too; not sure how it worked for SourceRefs, but it did 😯

Cons:

  • It's based on reflection (like Jackson...) rather than compile-time derivation (like Borer)

I'd say: let's keep Borer for now (given that the work is already done), but without further investment into tasks like #12 or #22 for now.

The current primary task (#5) is unaffected by which serializer is used; it's still a valid concern even if we use Kryo.
I've checked that if you forget to extend KryoSerializable and Java serialization is turned off, things still crash at runtime.

Let's make the final decision when we move on to event migrations (Stamina-based or otherwise)... I'm pretty sure that at that stage it's going to matter whether the serializer is compile-time or runtime (not sure which one will work better, though).
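
For reference, switching akka-kryo on amounts to roughly this standard application.conf wiring (serializer class name as documented in akka-kryo's README; hydra.KryoSerializable is a hypothetical stand-in for our own marker trait):

akka.actor {
  serializers {
    kryo = "io.altoo.akka.serialization.kryo.KryoSerializer"
  }
  serialization-bindings {
    # every message/event/state class extends this marker trait
    "hydra.KryoSerializable" = kryo
  }
}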

Request a logo from the graphics team

Probably the Borer logo, but in Akka colors? Generally, some (minimalistic!) graphic portmanteau of the Akka and Borer logos.

Note that the Borer logo is itself a portmanteau of the Akka logo and a drill ;)

Also, we should contact the Borer author first if we are to make a derivative work of his logo ;)

Set up scalafix for consistent import order

project/plugins.sbt:

addSbtPlugin("ch.epfl.scala" % "sbt-scalafix" % <whatever is the latest>)

build.sbt:

ThisBuild / semanticdbEnabled := true
ThisBuild / semanticdbVersion := scalafixSemanticdb.revision
ThisBuild / scalafixDependencies += "com.github.liancheng" %% "organize-imports" % <latest>

.scalafix.conf:

rules = [
  OrganizeImports
]

OrganizeImports.expandRelative = true
OrganizeImports.removeUnused = true
OrganizeImports.groupedImports = Explode
OrganizeImports.groups = [
  "java.",
  "scala.",
  "*",
  "<the top-level package of this library>"
]

CI pipeline:

sbt "scalafixAll --check"

Research if there are already serializers satisfying the compile-time safety requirement

Jackson (https://doc.akka.io/docs/akka/current/serialization-jackson.html), which is the de facto standard, does NOT guarantee safety at compile time... it's still possible that serialization crashes at runtime (e.g. if a field of a serialized object is of an abstract type which is NOT annotated with @com.fasterxml.jackson.annotation.JsonTypeInfo) :/

Would be nice to see what Protobuf (mentioned at https://doc.akka.io/docs/akka/current/serialization.html) offers... but it will most likely require the developer to maintain separate .proto files with the message definitions, which is a lot of overhead.

The Borer-based solution I thought we could build upon is described here: https://medium.com/@saloniv/how-to-write-your-own-borer-based-akka-serializer-34354a5f5681 ... this sounds pretty reasonable, but does NOT seem to have been extracted into a standalone, reusable library. Also, it looks like it still requires some manual work (declaring register[...] for the serialized classes)... ideally, this should be automated with a macro or whatnot.

Possibly there's something else (Borer-based or otherwise) that I missed; it would be good to spend some time checking that before we start any development.
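
To make the register[...] approach from the article concrete, here's a rough, untested sketch of what such a serializer could look like (this is NOT the article's exact code; MyEvent is a made-up class whose Borer codec is assumed to be in implicit scope):

import akka.serialization.SerializerWithStringManifest
import io.bullet.borer.{Cbor, Decoder, Encoder}

import scala.collection.mutable
import scala.reflect.ClassTag

class BorerAkkaSerializer extends SerializerWithStringManifest {
  override val identifier: Int = 9001 // must be unique; IDs 0-40 are reserved by Akka

  private val encoders = mutable.Map.empty[String, Encoder[AnyRef]]
  private val decoders = mutable.Map.empty[String, Decoder[AnyRef]]

  // the manual step the article requires: one register[...] call per serialized class
  private def register[T <: AnyRef: Encoder: Decoder: ClassTag](): Unit = {
    val key = implicitly[ClassTag[T]].runtimeClass.getName
    encoders(key) = implicitly[Encoder[T]].asInstanceOf[Encoder[AnyRef]]
    decoders(key) = implicitly[Decoder[T]].asInstanceOf[Decoder[AnyRef]]
  }

  register[MyEvent]() // hypothetical; repeated for every message/event/state class

  override def manifest(o: AnyRef): String = o.getClass.getName

  override def toBinary(o: AnyRef): Array[Byte] = {
    val encoder = encoders.getOrElse(manifest(o), throw new RuntimeException(s"No codec registered for [$o]"))
    Cbor.encode(o)(encoder).toByteArray
  }

  override def fromBinary(bytes: Array[Byte], manifest: String): AnyRef = {
    val decoder = decoders.getOrElse(manifest, throw new RuntimeException(s"No codec registered for manifest [$manifest]"))
    Cbor.decode(bytes).to[AnyRef](decoder).value
  }
}

The macro mentioned above would essentially generate those register calls.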

Add dependency on borer-compat-akka

Even if we don't end up strictly needing this as a dependency ourselves, it'd still be super-useful for our library's consumers to have borer-compat-akka on the classpath.

Debug compilation failure when generics are involved

I've got the following set of classes (simplified repro):

  sealed trait LimitPeriodType
  object LimitPeriodType {
    case object Day extends LimitPeriodType
    case object Week extends LimitPeriodType
    case object Month extends LimitPeriodType
  }

  sealed trait Limit[V, +LT <: LimitPeriodType]

  object Limit {
    final case class Daily[V](value: Option[V]) extends Limit[V, LimitPeriodType.Day.type]
    final case class Weekly[V](value: Option[V]) extends Limit[V, LimitPeriodType.Week.type]
    final case class Monthly[V](value: Option[V]) extends Limit[V, LimitPeriodType.Month.type]
  }

  final case class Limits[V](
      daily: Limit[V, LimitPeriodType.Day.type],
      weekly: Limit[V, LimitPeriodType.Week.type],
      monthly: Limit[V, LimitPeriodType.Month.type])

  implicit val limitPeriodTypeCodec: Codec[LimitPeriodType] = deriveAllCodecs
  implicit def limitCodec[V: Encoder: Decoder, LT <: LimitPeriodType]: Codec[Limit[V, LT]] = deriveAllCodecs
  implicit def limitsCodec[V: Encoder: Decoder]: Codec[Limits[V]] = deriveCodec

And I'm getting a somewhat exotic error:

[error] MyBorerAkkaSerializer.scala:113:5: type mismatch;
[error]  found   : Product with Serializable with MyBorerAkkaSerializer.this.Limit[V,Product with Serializable with MyBorerAkkaSerializer.this.LimitPeriodType]
[error]  required: MyBorerAkkaSerializer.this.Limit[V,LT]
[error]  Note: implicit value fresh$macro$9 is not applicable here because it comes after the application point and it lacks an explicit result type
[error]     deriveAllCodecs
[error]     ^

Please take a look if you have any idea how to circumvent this; alternatively, we can later post it as an issue directly in Borer.
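
One untested idea to try before escalating: skip the fully generic limitCodec and derive a codec per concrete period type, so the macro never has to infer LT (names mirror the repro above):

  implicit def dailyLimitCodec[V: Encoder: Decoder]: Codec[Limit[V, LimitPeriodType.Day.type]] =
    deriveAllCodecs
  implicit def weeklyLimitCodec[V: Encoder: Decoder]: Codec[Limit[V, LimitPeriodType.Week.type]] =
    deriveAllCodecs
  implicit def monthlyLimitCodec[V: Encoder: Decoder]: Codec[Limit[V, LimitPeriodType.Month.type]] =
    deriveAllCodecs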

Provide a mechanism for dumping schemas of all events (and transitively included classes)

Related to #28.

This would be priceless for detecting (esp. inadvertent) changes to events in a given PR in the CI pipeline.

Possibly one of the serializers (Kryo? Borer?) already provides this functionality - please research.

If not, then let's consider a Scala compiler plugin again (it could even just extract the Scala source code of events and transitively included classes from the codebase).

`spray.json.DeserializationException: Object is missing required member 'typeSymbol'` under Scala 2.12

[error] ## Exception when compiling 365 sources to /home/plipski/hydra-backend/services/target/scala-2.12/classes
[error] spray.json.DeserializationException: Object is missing required member 'typeSymbol'
[error] spray.json.package$.deserializationError(package.scala:23)
[error] spray.json.ProductFormats.fromField(ProductFormats.scala:61)
[error] spray.json.ProductFormats.fromField$(ProductFormats.scala:51)
[error] org.virtuslab.ash.writer.EventSchemaWriter.fromField(EventSchemaWriter.scala:10)
[error] spray.json.ProductFormatsInstances$$anon$5.read(ProductFormatsInstances.scala:133)
[error] spray.json.ProductFormatsInstances$$anon$5.read(ProductFormatsInstances.scala:121)
[error] spray.json.JsValue.convertTo(JsValue.scala:33)
[error] org.virtuslab.ash.writer.EventSchemaWriter.$anonfun$lastDump$2(EventSchemaWriter.scala:20)
[error] scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
[error] scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
[error] scala.collection.Iterator.foreach(Iterator.scala:943)
[error] scala.collection.Iterator.foreach$(Iterator.scala:943)
[error] scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
[error] scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
[error] scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
[error] scala.collection.immutable.Map$MapBuilderImpl.$plus$plus$eq(Map.scala:583)
[error] scala.collection.immutable.Map$MapBuilderImpl.$plus$plus$eq(Map.scala:533)
[error] scala.collection.TraversableOnce.toMap(TraversableOnce.scala:354)
[error] scala.collection.TraversableOnce.toMap$(TraversableOnce.scala:352)
[error] scala.collection.AbstractIterator.toMap(Iterator.scala:1431)
[error] org.virtuslab.ash.writer.EventSchemaWriter.lastDump$lzycompute(EventSchemaWriter.scala:22)
[error] org.virtuslab.ash.writer.EventSchemaWriter.lastDump(EventSchemaWriter.scala:16)
[error] org.virtuslab.ash.DumpEventSchemaCompilerPluginComponent$$anon$1$$anonfun$2.isDefinedAt(DumpEventSchemaCompilerPluginComponent.scala:46)
[error] org.virtuslab.ash.DumpEventSchemaCompilerPluginComponent$$anon$1$$anonfun$2.isDefinedAt(DumpEventSchemaCompilerPluginComponent.scala:45)
[error] scala.reflect.internal.Trees$CollectTreeTraverser.traverse(Trees.scala:1715)
[error] scala.reflect.internal.Trees$CollectTreeTraverser.traverse(Trees.scala:1712)
[error] scala.reflect.api.Trees$Traverser.$anonfun$traverseStats$2(Trees.scala:2506)
[error] scala.reflect.api.Trees$Traverser.atOwner(Trees.scala:2515)
[error] scala.reflect.api.Trees$Traverser.$anonfun$traverseStats$1(Trees.scala:2506)
[error] scala.reflect.api.Trees$Traverser.traverseStats(Trees.scala:2505)
[error] scala.reflect.internal.Trees.itraverse(Trees.scala:1390)
[error] scala.reflect.internal.Trees.itraverse$(Trees.scala:1264)
[error] scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:28)
[error] scala.reflect.internal.SymbolTable.itraverse(SymbolTable.scala:28)
[error] scala.reflect.api.Trees$Traverser.traverse(Trees.scala:2483)
[error] scala.reflect.internal.Trees$CollectTreeTraverser.traverse(Trees.scala:1716)
[error] scala.reflect.internal.Trees$TreeContextApiImpl.collect(Trees.scala:124)
[error] org.virtuslab.ash.DumpEventSchemaCompilerPluginComponent$$anon$1.apply(DumpEventSchemaCompilerPluginComponent.scala:45)
[error] scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:454)
[error] scala.tools.nsc.Global$GlobalPhase.run(Global.scala:402)
[error] scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1511)
[error] scala.tools.nsc.Global$Run.compileUnits(Global.scala:1495)
[error] scala.tools.nsc.Global$Run.compileSources(Global.scala:1488)
[error] scala.tools.nsc.Global$Run.compileFiles(Global.scala:1596)
[error] xsbt.CachedCompiler0.run(CompilerBridge.scala:163)
[error] xsbt.CachedCompiler0.run(CompilerBridge.scala:134)
[error] xsbt.CompilerBridge.run(CompilerBridge.scala:39)
[error] sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:92)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$7(MixedAnalyzingCompiler.scala:186)
[error] scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error] sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:241)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4(MixedAnalyzingCompiler.scala:176)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4$adapted(MixedAnalyzingCompiler.scala:157)
[error] sbt.internal.inc.JarUtils$.withPreviousJar(JarUtils.scala:239)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:157)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:204)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:573)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:573)
[error] sbt.internal.inc.Incremental$.$anonfun$apply$5(Incremental.scala:173)
[error] sbt.internal.inc.Incremental$.$anonfun$apply$5$adapted(Incremental.scala:171)
[error] sbt.internal.inc.Incremental$$anon$2.run(Incremental.scala:458)
[error] sbt.internal.inc.IncrementalCommon$CycleState.next(IncrementalCommon.scala:116)
[error] sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:56)
[error] sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:52)
[error] sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:261)
[error] sbt.internal.inc.Incremental$.$anonfun$incrementalCompile$8(Incremental.scala:413)
[error] sbt.internal.inc.Incremental$.withClassfileManager(Incremental.scala:498)
[error] sbt.internal.inc.Incremental$.incrementalCompile(Incremental.scala:400)
[error] sbt.internal.inc.Incremental$.apply(Incremental.scala:165)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:573)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:491)
[error] sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:332)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:420)
[error] sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:137)
[error] sbt.Defaults$.compileIncrementalTaskImpl(Defaults.scala:2176)
[error] sbt.Defaults$.$anonfun$compileIncrementalTask$2(Defaults.scala:2133)
[error] sbt.internal.io.Retry$.apply(Retry.scala:40)
[error] sbt.internal.io.Retry$.apply(Retry.scala:23)
[error] sbt.internal.server.BspCompileTask$.compute(BspCompileTask.scala:31)
[error] sbt.Defaults$.$anonfun$compileIncrementalTask$1(Defaults.scala:2129)
[error] scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] sbt.std.Transform$$anon$4.work(Transform.scala:68)
[error] sbt.Execute.$anonfun$submit$2(Execute.scala:282)
[error] sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error] sbt.Execute.work(Execute.scala:291)
[error] sbt.Execute.$anonfun$submit$1(Execute.scala:282)
[error] sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error] sbt.CompletionService$$anon$2.call(CompletionService.scala:64)
[error] java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
[error] java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[error] java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[error] java.base/java.lang.Thread.run(Thread.java:834)
[error]            
[error] stack trace is suppressed; run last hydra-backend / Compile / compileIncremental for the full output
[error] (hydra-backend / Compile / compileIncremental) spray.json.DeserializationException: Object is missing required member 'typeSymbol'
[error] Total time: 111 s (01:51), completed 21 Jun 2021, 13:22:15

Write a README

  • CI + License + Maven badge (shields.io, see https://github.com/VirtusLab/git-machete-intellij-plugin/ for a sample)
  • Logo (#25)
  • Build setup (compiler plugin + dependency)
  • Rationale: why Jackson sucks for Akka and how it's done better here
  • Remember to mention Borer explicitly! So that we don't create the impression of attributing to ourselves what's actually done by Borer
  • Document the sbt settings for the dump-event-schema plugin and the options for the compiler plugin (maybe not in the README but in a separate file? TBD)
  • Document the Circe serializer with some Scala snippets

Ensure in compile time that the specified serializer is used for every message, event and persistent state

Methods like ask/? and tell/! do not accept any implicit codec param; instead, they resolve the (de)serializer for messages at runtime :/

Thus it's still perfectly possible that even if the Borer serializer is present in the config, it doesn't cover all the classes used as messages/events/state. For some of them, either the Java serializer may be used as a fallback or, if that's disabled, an exception is simply thrown.

Currently, using ArchUnit, we're checking whether the type params of all Behaviors, ReplyEffects etc. present in method return types extend CborSerializable... but that's pretty awkward. A proper solution should check this at compile time rather than in tests using reflection :/

`dumpEventSchemaOutputFilename` does NOT allow for specifying absolute path

The output file is always assumed to live under <subproject-directory>/target :/

E.g. when I passed the full path (/home/plipski/hydra-backend/target/events-current.json), I got:

(hydra-backend / dumpEventSchema) java.nio.file.NoSuchFileException: /home/plipski/hydra-backend/services/target/home/plipski/hydra-backend/target/events-current.json

:/
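
The fix on the plugin side is presumably to resolve the setting against target only when it's relative; a minimal sbt-side sketch (setting and task names as above; exact plugin internals unverified):

import java.nio.file.{Path, Paths}

// inside the dumpEventSchema task implementation (hypothetical placement):
val raw: Path = Paths.get(dumpEventSchemaOutputFilename.value)
val resolved: Path =
  if (raw.isAbsolute) raw                  // honor absolute paths as-is
  else target.value.toPath.resolve(raw)    // keep the current behavior for relative ones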

[Bug in Borer?] Field of a type that is an alias for a simple type crashes the derivation

  type Foo = String
  case class Bar(bar: Foo)
  implicit val barCodec: Codec[Bar] = deriveCodec

crashes with:

[error] MyBorerAkkaSerializer.scala:54:39: value writeMyBorerAkkaSerializer.this.Foo is not a member of io.bullet.borer.Writer
[error]   implicit val barCodec: Codec[Bar] = deriveCodec

Apparently the derivation looks for a "write" + <typename> method on Writer... which makes sense for String/Double etc., but not for type aliases.

Please check if that's still a problem in Borer 1.7.0; if so, please open this issue in Borer and look for a workaround until it's fixed there.
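
In the meantime, a workaround worth trying is to declare the alias's codec explicitly, delegating to the underlying String codec (assuming Borer's predefined Encoder.forString/Decoder.forString; untested):

  implicit val fooCodec: Codec[Foo] =
    Codec(Encoder.forString, Decoder.forString) // Foo = String, so this typechecks directly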

Reproduce the problems with Jackson serializer

The purpose of this task is mostly for you to realize how poor Jackson serialization is in Akka, and which of Jackson's problems this project is supposed to alleviate.

Below is a test suite from my current project that strives to capture all the problems discovered with the Jackson serializer so far.

package myproject.archunit

import java.lang.reflect.ParameterizedType
import java.lang.reflect.Type

import scala.jdk.CollectionConverters.asScalaSetConverter

import com.fasterxml.jackson.annotation.JsonTypeInfo
import com.fasterxml.jackson.databind.annotation.JsonDeserialize
import com.fasterxml.jackson.databind.annotation.JsonSerialize
import com.tngtech.archunit.base.DescribedPredicate
import com.tngtech.archunit.core.domain._
import com.tngtech.archunit.lang.ArchCondition
import com.tngtech.archunit.lang.ConditionEvents
import com.tngtech.archunit.lang.SimpleConditionEvent
import com.tngtech.archunit.lang.syntax.ArchRuleDefinition.classes
import com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses
import org.scalatest.wordspec.AnyWordSpecLike

import myproject.CborSerializable
import myproject.archunit.BaseArchUnitSpec.importedProductionClasses
import myproject.core.ScalaObjectUtils._
import myproject.core.serialization.EnumEntryDeserializer
import myproject.core.serialization.EnumEntrySerializer

// TODO (GMV3-965): maybe we should extract ArchUnit as separate service with one method for each of those cases
//  and plug in to this test suite to verify whole code base,
//  but additionally add a Spec that for each method will provide some problematic classes
//  and thus we will check the checker service itself
class JacksonSerializationArchUnitSpec extends AnyWordSpecLike with BaseArchUnitSpec {

  implicit class JavaClassJacksonOps(self: JavaClass) {
    def hasExplicitJacksonAnnotations: Boolean = {
      self.hasAnnotation[JsonSerialize] && self.hasAnnotation[JsonDeserialize] ||
      self.hasAnnotation[JsonTypeInfo]
    }
  }

  // The below tests (roughly) check that the classes used as messages/events/state in Akka
  // are always marked as CborSerializable, to ensure that Jackson CBOR and not legacy Java serialization
  // is used for their serialization.
  // For Akka to ensure that condition statically, a major redesign would be necessary -
  // all methods like `ask`, `tell`, `Effect.persist` etc. would need to require an implicit `Codec` (?) parameter.

  // The below tests do NOT ensure that the Jackson serialization of messages/events/state will actually succeed in the runtime.

  "Messages, events and entity state classes" should {
    "implement CborSerializable" in {
      classes
        .should(new ArchCondition[JavaClass]("only use CborSerializable message/event/state types") {
          override def check(clazz: JavaClass, events: ConditionEvents): Unit = {

            clazz.getAllMethods.asScala.foreach { method =>
              def checkType(tpe: Type, category: String, failsWhen: String): Unit = {
                tpe match {
                  case clazz: Class[_] if clazz.getPackageName.startsWith("akka") =>
                  // OK, acceptable

                  case clazz: Class[_] if clazz == classOf[scala.Nothing] =>
                  // OK, acceptable

                  case clazz: Class[_] if !classOf[CborSerializable].isAssignableFrom(clazz) =>
                    val message =
                      s"Type ${clazz.getName} is used as Akka $category (as observed in the return type of method ${method.getFullName}), " +
                      s"but does NOT extend CborSerializable; this will fail in the runtime $failsWhen"
                    events.add(SimpleConditionEvent.violated(clazz, message))

                  case _ =>
                }
              }

              val returnType = method.getRawReturnType
              val genericReturnType = method.reflect.getGenericReturnType

              if (returnType.isEquivalentTo(classOf[akka.persistence.typed.scaladsl.ReplyEffect[_, _]])) {
                genericReturnType match {
                  case parameterizedType: ParameterizedType =>
                    val Array(eventType, stateType) = parameterizedType.getActualTypeArguments
                    checkType(eventType, "event", "when saving to the journal")
                    checkType(stateType, "persistent state", "when doing a snapshot")
                  case _ =>
                }
              } else if (returnType.isEquivalentTo(classOf[akka.projection.eventsourced.EventEnvelope[_]])) {
                genericReturnType match {
                  case parameterizedType: ParameterizedType =>
                    val Array(eventType) = parameterizedType.getActualTypeArguments
                    checkType(eventType, "event", "when saving to the journal")
                  case _ =>
                }
              } else if (returnType.isEquivalentTo(classOf[akka.actor.typed.Behavior[_]])) {
                genericReturnType match {
                  case parameterizedType: ParameterizedType =>
                    val Array(messageType) = parameterizedType.getActualTypeArguments
                    checkType(messageType, "message", "when sending a message outside of the current JVM")
                  case _ =>
                }
              }
            }
          }
        })
        .check(importedProductionClasses)
    }
  }

  // The below tests (roughly) check that Jackson serialization of classes marked as CborSerializable
  // will actually succeed in the runtime.
  // TODO (GMV3-763): migrate serialization checks from ArchUnit to compile-time verification (Borer-based?)

  // The below tests do NOT ensure that the classes used as messages/events/state in Akka
  // are always marked as CborSerializable.

  "Classes that implement CborSerializable" should {
    "never contain a non-serializable field" in {
      // These have been semi-manually verified using SerializationTestKit.
      val safeGenericNonCborSerializableTypes: Seq[Class[_]] = Seq(
        classOf[akka.actor.typed.ActorRef[_]],
        classOf[akka.stream.SourceRef[_]],
        classOf[cats.data.NonEmptyList[_]],
        classOf[scala.Option[_]],
        classOf[scala.collection.immutable.List[_]],
        classOf[scala.collection.immutable.Map[_, _]],
        classOf[scala.collection.immutable.Seq[_]],
        classOf[scala.collection.immutable.Set[_]],
        classOf[scala.collection.immutable.SortedSet[_]],
        classOf[scala.collection.Seq[_]])

      val mapGenericTypes: Seq[Class[_]] = Seq(classOf[scala.collection.Map[_, _]])

      val sortedGenericTypes: Seq[Class[_]] = Seq(
        // Explicit upcast required due to a glitch in Scala 2.12
        // (apparently has trouble finding a common supertype for existential types?
        // although that's not a problem in safeGenericNonCborSerializableTypes above... weird)
        classOf[scala.collection.immutable.SortedMap[_, _]]: Class[_],
        classOf[scala.collection.immutable.SortedSet[_]]: Class[_])

      // These have been semi-manually verified using SerializationTestKit.
      val safeNonCborSerializableTypes: Seq[Class[_]] = Seq(
        java.lang.Boolean.TYPE,
        java.lang.Byte.TYPE,
        java.lang.Double.TYPE,
        java.lang.Integer.TYPE,
        java.lang.Long.TYPE,
        classOf[java.lang.String],
        classOf[java.lang.Throwable],
        classOf[java.time.Instant],
        classOf[java.time.OffsetDateTime],
        classOf[scala.concurrent.duration.FiniteDuration],
        classOf[scala.math.BigDecimal])

      classes.that
        .areAssignableTo(classOf[myproject.CborSerializable])
        .should(new ArchCondition[JavaClass]("only contain serializable fields") {

          private def checkParametrizedType(
              parameterizedType: ParameterizedType,
              field: JavaField,
              events: ConditionEvents): Unit = {

            val fieldType = field.getRawType
            val fieldCaption = s"Field ${field.getName} in class ${field.getOwner.getFullName} "

            parameterizedType.getActualTypeArguments.foreach {
              case typeArgumentAsParametrizedType: ParameterizedType =>
                typeArgumentAsParametrizedType.getRawType match {
                  case rawType: Class[_] if classOf[CborSerializable].isAssignableFrom(rawType) =>
                    checkParametrizedType(typeArgumentAsParametrizedType, field, events)
                  case rawType: Class[_] if safeGenericNonCborSerializableTypes.contains(rawType) =>
                    checkParametrizedType(typeArgumentAsParametrizedType, field, events)
                  case rawType =>
                    val message = fieldCaption +
                      s"is of a generic type $parameterizedType, " +
                      s"whose one of type arguments is $rawType (itself a generic type), " +
                      s"which does NOT extend CborSerializable and has NOT been verified by the team to be serializable yet " +
                      s"(tip: mark it as CborSerializable or use Akka's SerializationTestKit#verify(...) " +
                      s"on a case class that has a field of ${rawType.getTypeName} type)"
                    events.add(SimpleConditionEvent.violated(fieldType, message))
                }

              case typeArgumentAsClass: Class[_] if typeArgumentAsClass == classOf[java.lang.Object] =>
              // For some reason, we observed that when type parameter is Boolean in Scala code,
              // then the generic type recorded in classfile is java.lang.Object rather than java.lang.Boolean.
              // java.lang.Object isn't likely to appear for any other type (unless someone uses Any or AnyRef as a type param),
              // so we're giving it a free pass.

              case typeArgumentAsClass: Class[_] if safeNonCborSerializableTypes.contains(typeArgumentAsClass) =>
              // OK, expected

              case typeArgumentAsClass: Class[_] if classOf[CborSerializable].isAssignableFrom(typeArgumentAsClass) =>
              // OK, expected

              case typeArgument =>
                val message = fieldCaption +
                  s"is of a generic type $parameterizedType, " +
                  s"whose one of type arguments is $typeArgument, " +
                  s"which does NOT extend CborSerializable and has NOT been verified by the team to be serializable yet " +
                  s"(tip: mark it as CborSerializable or use Akka's SerializationTestKit#verify(...) " +
                  s"on a case class that has a field of ${typeArgument.getTypeName} type)"
                events.add(SimpleConditionEvent.violated(fieldType, message))
            }
          }

          override def check(clazz: JavaClass, events: ConditionEvents): Unit = {

            clazz.getFields.forEach { field =>
              val fieldType = field.getRawType
              val fieldCaption = s"Field ${field.getName} in class ${field.getOwner.getFullName} "

              if (mapGenericTypes.exists(fieldType.isEquivalentTo)) {
                val fieldGenericType = field.reflect.getGenericType
                val Array(keyType, _) = fieldGenericType.asInstanceOf[ParameterizedType].getActualTypeArguments
                if (keyType != classOf[String]) {
                  val ctor = clazz.getConstructors.asScala.head.reflect
                  val ctorsParamForField = ctor.getParameters.find(_.getName == field.getName).get
                  val annotationsOfCtorsParamForField = ctorsParamForField.getAnnotations.toSeq
                  val jsonSerializeOpt =
                    annotationsOfCtorsParamForField.find(_.annotationType == classOf[JsonSerialize])
                  val jsonDeserializeOpt =
                    annotationsOfCtorsParamForField.find(_.annotationType == classOf[JsonDeserialize])
                  (jsonSerializeOpt, jsonDeserializeOpt) match {
                    // "None" corresponds to com.fasterxml.jackson.databind.JsonSerializer.None, the default class for `keyUsing`
                    // This can't be easily done using `classOf`
                    // since accessing Java's static nested classes is cumbersome from Scala
                    case (Some(jsonSerialize: JsonSerialize), Some(jsonDeserialize: JsonDeserialize))
                        if jsonSerialize.keyUsing.getSimpleName != "None" &&
                        jsonDeserialize.keyUsing.getSimpleName != "None" =>
                    // OK, expected

                    case _ =>
                      val message = fieldCaption +
                        s"is of a map type $fieldGenericType, " +
                        s"whose key type argument is $keyType, " +
                        s"which is NOT annotated with both @JsonSerialize(keyUsing = ...) and @JsonDeserialize(keyUsing = ...)"
                      events.add(SimpleConditionEvent.violated(fieldType, message))
                  }
                }
              }

              if (sortedGenericTypes.exists(fieldType.isEquivalentTo)) {
                val fieldGenericType = field.reflect.getGenericType
                val Array(keyType, _*) = fieldGenericType.asInstanceOf[ParameterizedType].getActualTypeArguments

                def isComparable(clazz: Class[_]): Boolean = classOf[java.lang.Comparable[_]].isAssignableFrom(clazz)

                keyType match {
                  case keyClazz: Class[_] if isComparable(keyClazz) =>
                  // OK, expected

                  case keyClazz: ParameterizedType if isComparable(keyClazz.getRawType.asInstanceOf[Class[_]]) =>
                  // OK, expected

                  case _ =>
                    val message = fieldCaption +
                      s"is of a sorted type $fieldGenericType, " +
                      s"whose key type argument is $keyType, " +
                      s"which does NOT extend java.lang.Comparable; this is bound to fail on deserialization in the runtime"
                    events.add(SimpleConditionEvent.violated(fieldType, message))
                }
              }

              if (safeGenericNonCborSerializableTypes.exists(fieldType.isEquivalentTo)) {
                checkParametrizedType(field.reflect.getGenericType.asInstanceOf[ParameterizedType], field, events)

              } else if (!fieldType.isAssignableTo(classOf[CborSerializable])) {
                if (!safeNonCborSerializableTypes.exists(fieldType.isEquivalentTo)) {

                  val message = fieldCaption +
                    s"is of a non-${classOf[CborSerializable].getSimpleName} type ${fieldType.getFullName}, " +
                    s"which does NOT extend CborSerializable and has NOT been verified by the team to be serializable yet " +
                    s"(tip: mark it as CborSerializable or use Akka's SerializationTestKit#verify(...)" +
                    s" on a case class that has a field of ${fieldType.getFullName} type)"
                  events.add(SimpleConditionEvent.violated(fieldType, message))
                }
              } else if (!fieldType.isConcrete && !fieldType.hasExplicitJacksonAnnotations) {
                val interfaces = clazz.getAllInterfaces.asScala
                if (!interfaces.exists(_.hasExplicitJacksonAnnotations)) {

                  val message = fieldCaption +
                    s"is of a non-concrete type ${fieldType.getFullName}, " +
                    "which is NOT annotated with either (@JsonSerialize + @JsonDeserialize) OR @JsonTypeInfo," +
                    s"and neither is any of its implemented interfaces (${interfaces.map(_.getFullName).mkString(", ")})"
                  events.add(SimpleConditionEvent.violated(fieldType, message))
                }
              }
            }
          }
        })
        .check(importedProductionClasses)
    }

    "not be of unannotated abstract type" in {
      classes.that
        .areAssignableTo(classOf[myproject.CborSerializable])
        .and
        .areNotAssignableFrom(classOf[myproject.CborSerializable])
        .should(new ArchCondition[JavaClass]("not be of unannotated abstract type") {
          override def check(clazz: JavaClass, events: ConditionEvents): Unit = {

            if (!clazz.isConcrete && !clazz.hasExplicitJacksonAnnotations) {
              val interfaces = clazz.getAllInterfaces.asScala
              if (!interfaces.exists(_.hasExplicitJacksonAnnotations)) {

                val message = s"${clazz.getFullName} of a non-concrete type ${clazz.getFullName}, " +
                  "which is NOT annotated with either (@JsonSerialize + @JsonDeserialize) OR @JsonTypeInfo, " +
                  s"and neither is any of its implemented interfaces (${interfaces.map(_.getFullName).mkString(", ")})"
                events.add(SimpleConditionEvent.violated(clazz, message))
              }
            }
          }
        })
        .check(importedProductionClasses)
    }

    "not be Scala objects" in {
      noClasses.that
        .areAssignableTo(classOf[myproject.CborSerializable])
        .and
        // Enums get a free pass since they have a dedicated deserializer.
        .areNotAssignableTo(classOf[enumeratum.EnumEntry])
        .and
        // This one is scheduled for removal in favor of myproject.core.currency.MoneyAmount anyway.
        .areNotAssignableTo(myproject.core.currency.DefaultCurrency.getClass)
        .should(new ArchCondition[JavaClass]("be Scala objects") {
          override def check(clazz: JavaClass, events: ConditionEvents): Unit = {
            if (clazz.reflect().isScalaObject) {
              val message = s"${clazz.getFullName} is a Scala object"
              events.add(SimpleConditionEvent.satisfied(clazz, message))
            }
          }
        })
        .because("Jackson deserializer is NOT aware that Scala objects are supposed to be singletons, " +
        "and creates new instances of the object's class instead, which makes pattern matching on objects completely unreliable; " +
        "use a nullary case class instead of an object")
        .check(importedProductionClasses)
    }
  }

  "Classes that implement both CborSerializable and EnumEntry" should {
    "be serialized with EnumEntry(De)Serializer" in {
      classes.that.areInterfaces.and
        .areAssignableTo(classOf[enumeratum.EnumEntry])
        .and
        .areAssignableTo(classOf[myproject.CborSerializable])
        .should
        .beAnnotatedWith(new DescribedPredicate[JavaAnnotation[_]](
          s"JsonSerialize(classOf[_ <: ${classOf[EnumEntrySerializer[_]].getSimpleName}])") {
          override def apply(input: JavaAnnotation[_]): Boolean =
            if (input.getRawType.isAssignableTo(classOf[JsonSerialize])) {
              input.getProperties.getOrDefault("using", null) match { // scalastyle:ignore null
                case u: JavaClass if u.isAssignableTo(classOf[EnumEntrySerializer[_]]) => true
                case _                                                                 => false
              }
            } else {
              false
            }
        })
        .andShould
        .beAnnotatedWith(new DescribedPredicate[JavaAnnotation[_]](
          s"JsonDeserialize(classOf[_ <: ${classOf[EnumEntryDeserializer[_]].getSimpleName}])") {
          override def apply(input: JavaAnnotation[_]): Boolean =
            if (input.getRawType.isAssignableTo(classOf[JsonDeserialize])) {
              input.getProperties.getOrDefault("using", null) match { // scalastyle:ignore null
                case u: JavaClass if u.isAssignableTo(classOf[EnumEntryDeserializer[_]]) => true
                case _                                                                   => false
              }
            } else {
              false
            }
        })
        .because("this will fail in the runtime otherwise")
        .check(importedProductionClasses)
    }
  }
}

Publish to Maven Central

  • Add a badge from shields.io to README
  • Add code to the CI pipeline that would do a release on master builds, and preferably publish a snapshot build on develop builds as well (see the sketch below)
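
A minimal sketch of the CI part using sbt-ci-release (plugin coordinates, version and secret names are from memory of its README - double-check before use):

project/plugins.sbt:

addSbtPlugin("com.geirsson" % "sbt-ci-release" % "<latest>")

GitHub Actions release step:

- run: sbt ci-release
  env:
    PGP_PASSPHRASE: ${{ secrets.PGP_PASSPHRASE }}
    PGP_SECRET: ${{ secrets.PGP_SECRET }}
    SONATYPE_USERNAME: ${{ secrets.SONATYPE_USERNAME }}
    SONATYPE_PASSWORD: ${{ secrets.SONATYPE_PASSWORD }}

IIRC, sbt-ci-release publishes a snapshot for non-tag builds and a full release for tagged ones, so the master/develop split would need a small tweak on top.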

Research compatibility of the format of serialized events between the versions of the given serialization library

https://github.com/altoo-ag/akka-kryo-serialization#features -> for Kryo, the approach is reasonable/semantic: compatibility of the binary format is only guaranteed between patch/minor updates, not across major updates.

The end result of this task should be another row in the table in README.md, comparing the documented compatibility approaches across the libraries.

If any library does NOT mention anything about its compatibility guarantees, we should NOT check it experimentally. In fact, that's a warning sign: if they don't document such guarantees, do they even care about them in the first place?

Provide a non-Java-serialization-based Borer codec for OffsetDateTime

There doesn't seem to be anything (?) like that in the borer-core library, at least... I'm getting Could not find implicit Encoder[java.time.OffsetDateTime] for parameter receivedAtUtc of case class ...

Maybe there's already another library (Borer-official or otherwise) that provides such codecs OOTB? Please investigate.
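
If nothing turns up, rolling our own seems easy enough; a minimal sketch encoding as an ISO-8601 string (assuming Borer's predefined string codecs; untested):

import java.time.OffsetDateTime
import io.bullet.borer.{Codec, Decoder, Encoder}

implicit val offsetDateTimeCodec: Codec[OffsetDateTime] =
  Codec(
    Encoder.forString.contramap[OffsetDateTime](_.toString), // OffsetDateTime.toString is ISO-8601
    Decoder.forString.map(OffsetDateTime.parse))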

Provide a reusable serializer based on Circe (rather than Borer)

import akka.serialization.Serializer
import io.circe.jawn.JawnParser
import io.circe.{ Decoder, Encoder, Printer }

import java.nio.charset.StandardCharsets.UTF_8

class CirceAkkaSerializer extends Serializer {
  import CirceAkkaSerializer._

  override val identifier      = 1234
  override val includeManifest = false

  override def toBinary(o: AnyRef): Array[Byte] = o match {
    case cs: CirceSerializable =>
      import CirceSerializableWrapper._
      CirceSerializableWrapper(cs) match {
        case Right(entity)             => entity.toBytes
        case Left(UnknownSerializable) => throw new RuntimeException(s"Tried to serialize unrecognized serializable: [$cs]")
      }

    case unknownMsg =>
      throw new RuntimeException(s"Non-serializable message passed to serializer: [$unknownMsg]")
  }

  override def fromBinary(bytes: Array[Byte], manifest: Option[Class[_]]): AnyRef = {
    bytes.toEntityUnsafe[CirceSerializableWrapper].entity
  }
}

object CirceAkkaSerializer {
  private val parser  = new JawnParser
  private val printer = Printer.noSpaces

  implicit private[codec] class ByteArrayOps(private val bytes: Array[Byte]) extends AnyVal {
    def toEntityUnsafe[A: Decoder]: A = {
      parser.parseByteArray(bytes).flatMap(_.as[A]).fold(e => throw e, identity)
    }
  }

  implicit private[codec] class EntityOps[A](private val entity: A) extends AnyVal {
    def toBytes(implicit enc: Encoder[A]): Array[Byte] = {
      import io.circe.syntax._

      val buf   = printer.printToByteBuffer(entity.asJson, UTF_8)
      val bytes = new Array[Byte](buf.remaining)
      buf.get(bytes)

      bytes
    }
  }
}
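
For completeness, wiring this into Akka is the standard serializer config (application.conf; package and class names below are placeholders):

akka.actor {
  serializers {
    circe-json = "myproject.serialization.CirceAkkaSerializer"
  }
  serialization-bindings {
    "myproject.CirceSerializable" = circe-json
  }
}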

java.lang.NoClassDefFoundError: org/virtuslab/akkasaferserializer/SerializabilityTrait

I will provide a full repro once the legal stuff about your access to the Hydra repo is finally sorted out...

[info] compiling 73 Scala sources to /home/plipski/hydra-backend/services/target/scala-2.12/classes ...
[info] AkkaSerializabilityCheckerPlugin: Found new annotated trait: hydra.CborSerializable
[error] ## Exception when compiling 337 sources to /home/plipski/hydra-backend/services/target/scala-2.12/classes
[error] java.lang.NoClassDefFoundError: org/virtuslab/akkasaferserializer/SerializabilityTrait
[error] org.virtuslab.akkasaferserializer.AkkaSerializabilityCheckerPluginComponent$$anon$1.$anonfun$apply$2(AkkaSerializabilityCheckerPluginComponent.scala:49)
[error] scala.collection.immutable.List.$anonfun$foldRight$1(List.scala:447)
[error] scala.collection.immutable.List.foldRight(List.scala:91)
[error] org.virtuslab.akkasaferserializer.AkkaSerializabilityCheckerPluginComponent$$anon$1.apply(AkkaSerializabilityCheckerPluginComponent.scala:31)
[error] scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:454)
[error] scala.tools.nsc.Global$GlobalPhase.run(Global.scala:402)
[error] scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1511)
[error] scala.tools.nsc.Global$Run.compileUnits(Global.scala:1495)
[error] scala.tools.nsc.Global$Run.compileSources(Global.scala:1488)
[error] scala.tools.nsc.Global$Run.compileFiles(Global.scala:1596)
[error] xsbt.CachedCompiler0.run(CompilerBridge.scala:163)
[error] xsbt.CachedCompiler0.run(CompilerBridge.scala:134)
[error] xsbt.CompilerBridge.run(CompilerBridge.scala:39)
[error] sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:92)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$7(MixedAnalyzingCompiler.scala:186)
[error] scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error] sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:241)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4(MixedAnalyzingCompiler.scala:176)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4$adapted(MixedAnalyzingCompiler.scala:157)
[error] sbt.internal.inc.JarUtils$.withPreviousJar(JarUtils.scala:239)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:157)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:204)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:573)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:573)
[error] sbt.internal.inc.Incremental$.$anonfun$apply$5(Incremental.scala:173)
[error] sbt.internal.inc.Incremental$.$anonfun$apply$5$adapted(Incremental.scala:171)
[error] sbt.internal.inc.Incremental$$anon$2.run(Incremental.scala:458)
[error] sbt.internal.inc.IncrementalCommon$CycleState.next(IncrementalCommon.scala:116)
[error] sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:56)
[error] sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:52)
[error] sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:261)
[error] sbt.internal.inc.Incremental$.$anonfun$incrementalCompile$8(Incremental.scala:413)
[error] sbt.internal.inc.Incremental$.withClassfileManager(Incremental.scala:498)
[error] sbt.internal.inc.Incremental$.incrementalCompile(Incremental.scala:400)
[error] sbt.internal.inc.Incremental$.apply(Incremental.scala:165)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:573)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:491)
[error] sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:332)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:420)
[error] sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:137)
[error] sbt.Defaults$.compileIncrementalTaskImpl(Defaults.scala:2176)
[error] sbt.Defaults$.$anonfun$compileIncrementalTask$2(Defaults.scala:2133)
[error] sbt.internal.io.Retry$.apply(Retry.scala:40)
[error] sbt.internal.io.Retry$.apply(Retry.scala:23)
[error] sbt.internal.server.BspCompileTask$.compute(BspCompileTask.scala:31)
[error] sbt.Defaults$.$anonfun$compileIncrementalTask$1(Defaults.scala:2129)
[error] scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] sbt.std.Transform$$anon$4.work(Transform.scala:68)
[error] sbt.Execute.$anonfun$submit$2(Execute.scala:282)
[error] sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error] sbt.Execute.work(Execute.scala:291)
[error] sbt.Execute.$anonfun$submit$1(Execute.scala:282)
[error] sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error] sbt.CompletionService$$anon$2.call(CompletionService.scala:64)
[error] java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
[error] java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[error] java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[error] java.base/java.lang.Thread.run(Thread.java:834)
