Pbandk

License: MIT License

Pbandk is a Kotlin code generator and runtime for Protocol Buffers. It is built to work across multiple Kotlin platforms.

NOTE: This is the documentation for the version of pbandk currently in development. Documentation for the latest stable version is available at https://github.com/streem/pbandk/blob/v0.15.0/README.md.

Features

  • Clean data class generation
  • Works for JVM, Android, and JS (both legacy and IR), with experimental support for Native
  • Support for proto2 and proto3 syntaxes
  • JSON serialization/deserialization following the proto3 JSON spec (see #72 for some corner cases and Well-Known Types that are not handled yet)
  • Oneofs are properly handled as sealed classes
  • Specialized support to handle wrappers from the well-known types (e.g. StringValue, BoolValue) as nullable primitives (String?, Boolean?, etc.)
  • JS platform leverages protobuf.js for best performance
  • Support for custom service/gRPC code generator
  • Support for custom options

Experimental

  • Kotlin/Native support

Not Yet Implemented

  • Specialized support for more of the well-known types (e.g. Any)
  • Access to the protobuf descriptor from generated code
  • Code comments on generated code
  • Specialized support for the deprecated option

Read below for more information and see the examples.

Status

This project is currently in beta. It has the core set of protobuf features implemented and is being used in production. But it is still under active development and new versions might introduce backwards-incompatible changes to support new features or to improve the library's usability in Kotlin. Pull requests are welcome for any of the "Not Yet Implemented" features above.

This project follows semantic versioning. After v1.0.0 is released, future versions will preserve backwards compatibility.

The project currently has a single maintainer (@garyp) working on it in his spare time. Contributors who would like to become additional maintainers are highly welcome. Your contributions don't have to be in the form of code and could also be documentation improvements, issue triage, community outreach, etc.

Summary

Generated Code Sample

For the following addressbook.proto file:

syntax = "proto3";
package tutorial;

import "google/protobuf/timestamp.proto";

message Person {
    string name = 1;
    int32 id = 2;
    string email = 3;

    enum PhoneType {
        MOBILE = 0;
        HOME = 1;
        WORK = 2;
    }

    message PhoneNumber {
        string number = 1;
        PhoneType type = 2;
    }

    repeated PhoneNumber phones = 4;

    google.protobuf.Timestamp last_updated = 5;
}

message AddressBook {
    repeated Person people = 1;
}

The following file will be generated at tutorial/addressbook.kt:

@file:OptIn(pbandk.PublicForGeneratedCode::class)

package tutorial

public data class Person(
    val name: String = "",
    val id: Int = 0,
    val email: String = "",
    val phones: List<tutorial.Person.PhoneNumber> = emptyList(),
    val lastUpdated: pbandk.wkt.Timestamp? = null,
    override val unknownFields: Map<Int, pbandk.UnknownField> = emptyMap()
) : pbandk.Message {
    override operator fun plus(other: pbandk.Message?): Person = protoMergeImpl(other)
    override val descriptor: pbandk.MessageDescriptor<Person> get() = Companion.descriptor
    override val protoSize: Int by lazy { super.protoSize }
    public companion object : pbandk.Message.Companion<Person> {
        public val defaultInstance: Person by lazy { Person() }
        override fun decodeWith(u: pbandk.MessageDecoder): Person = Person.decodeWithImpl(u)

        override val descriptor: pbandk.MessageDescriptor<Person> by lazy {
            val fieldsList = ArrayList<pbandk.FieldDescriptor<Person, *>>(5).apply {
                add(
                    pbandk.FieldDescriptor(
                        messageDescriptor = this@Companion::descriptor,
                        name = "name",
                        number = 1,
                        type = pbandk.FieldDescriptor.Type.Primitive.String(),
                        jsonName = "name",
                        value = Person::name
                    )
                )
                add(
                    pbandk.FieldDescriptor(
                        messageDescriptor = this@Companion::descriptor,
                        name = "id",
                        number = 2,
                        type = pbandk.FieldDescriptor.Type.Primitive.Int32(),
                        jsonName = "id",
                        value = Person::id
                    )
                )
                add(
                    pbandk.FieldDescriptor(
                        messageDescriptor = this@Companion::descriptor,
                        name = "email",
                        number = 3,
                        type = pbandk.FieldDescriptor.Type.Primitive.String(),
                        jsonName = "email",
                        value = Person::email
                    )
                )
                add(
                    pbandk.FieldDescriptor(
                        messageDescriptor = this@Companion::descriptor,
                        name = "phones",
                        number = 4,
                        type = pbandk.FieldDescriptor.Type.Repeated<tutorial.Person.PhoneNumber>(valueType = pbandk.FieldDescriptor.Type.Message(messageCompanion = tutorial.Person.PhoneNumber.Companion)),
                        jsonName = "phones",
                        value = Person::phones
                    )
                )
                add(
                    pbandk.FieldDescriptor(
                        messageDescriptor = this@Companion::descriptor,
                        name = "last_updated",
                        number = 5,
                        type = pbandk.FieldDescriptor.Type.Message(messageCompanion = pbandk.wkt.Timestamp.Companion),
                        jsonName = "lastUpdated",
                        value = Person::lastUpdated
                    )
                )
            }
            pbandk.MessageDescriptor(
                fullName = "tutorial.Person",
                messageClass = tutorial.Person::class,
                messageCompanion = this,
                fields = fieldsList
            )
        }
    }

    public sealed class PhoneType(override val value: Int, override val name: String? = null) : pbandk.Message.Enum {
        override fun equals(other: kotlin.Any?): Boolean = other is Person.PhoneType && other.value == value
        override fun hashCode(): Int = value.hashCode()
        override fun toString(): String = "Person.PhoneType.${name ?: "UNRECOGNIZED"}(value=$value)"

        public object MOBILE : PhoneType(0, "MOBILE")
        public object HOME : PhoneType(1, "HOME")
        public object WORK : PhoneType(2, "WORK")
        public class UNRECOGNIZED(value: Int) : Person.PhoneType(value)

        public companion object : pbandk.Message.Enum.Companion<Person.PhoneType> {
            public val values: List<Person.PhoneType> by lazy { listOf(MOBILE, HOME, WORK) }
            override fun fromValue(value: Int): Person.PhoneType = values.firstOrNull { it.value == value } ?: UNRECOGNIZED(value)
            override fun fromName(name: String): Person.PhoneType = values.firstOrNull { it.name == name } ?: throw IllegalArgumentException("No PhoneType with name: $name")
        }
    }

    public data class PhoneNumber(
        val number: String = "",
        val type: tutorial.Person.PhoneType = tutorial.Person.PhoneType.fromValue(0),
        override val unknownFields: Map<Int, pbandk.UnknownField> = emptyMap()
    ) : pbandk.Message {
        override operator fun plus(other: pbandk.Message?): Person.PhoneNumber = protoMergeImpl(other)
        override val descriptor: pbandk.MessageDescriptor<Person.PhoneNumber> get() = Companion.descriptor
        override val protoSize: Int by lazy { super.protoSize }
        public companion object : pbandk.Message.Companion<Person.PhoneNumber> {
            public val defaultInstance: Person.PhoneNumber by lazy { Person.PhoneNumber() }
            override fun decodeWith(u: pbandk.MessageDecoder): Person.PhoneNumber = Person.PhoneNumber.decodeWithImpl(u)

            override val descriptor: pbandk.MessageDescriptor<PhoneNumber> by lazy {
                val fieldsList = ArrayList<pbandk.FieldDescriptor<PhoneNumber, *>>(2).apply {
                    add(
                        pbandk.FieldDescriptor(
                            messageDescriptor = this@Companion::descriptor,
                            name = "number",
                            number = 1,
                            type = pbandk.FieldDescriptor.Type.Primitive.String(),
                            jsonName = "number",
                            value = PhoneNumber::number
                        )
                    )
                    add(
                        pbandk.FieldDescriptor(
                            messageDescriptor = this@Companion::descriptor,
                            name = "type",
                            number = 2,
                            type = pbandk.FieldDescriptor.Type.Enum(enumCompanion = tutorial.Person.PhoneType.Companion),
                            jsonName = "type",
                            value = PhoneNumber::type
                        )
                    )
                }
                pbandk.MessageDescriptor(
                    fullName = "tutorial.Person.PhoneNumber",
                    messageClass = tutorial.Person.PhoneNumber::class,
                    messageCompanion = this,
                    fields = fieldsList
                )
            }
        }
    }
}

public data class AddressBook(
    val people: List<tutorial.Person> = emptyList(),
    override val unknownFields: Map<Int, pbandk.UnknownField> = emptyMap()
) : pbandk.Message {
    override operator fun plus(other: pbandk.Message?): AddressBook = protoMergeImpl(other)
    override val descriptor: pbandk.MessageDescriptor<AddressBook> get() = Companion.descriptor
    override val protoSize: Int by lazy { super.protoSize }
    public companion object : pbandk.Message.Companion<AddressBook> {
        public val defaultInstance: AddressBook by lazy { AddressBook() }
        override fun decodeWith(u: pbandk.MessageDecoder): AddressBook = AddressBook.decodeWithImpl(u)

        override val descriptor: pbandk.MessageDescriptor<AddressBook> by lazy {
            val fieldsList = ArrayList<pbandk.FieldDescriptor<AddressBook, *>>(1).apply {
                add(
                    pbandk.FieldDescriptor(
                        messageDescriptor = this@Companion::descriptor,
                        name = "people",
                        number = 1,
                        type = pbandk.FieldDescriptor.Type.Repeated<tutorial.Person>(valueType = pbandk.FieldDescriptor.Type.Message(messageCompanion = tutorial.Person.Companion)),
                        jsonName = "people",
                        value = AddressBook::people
                    )
                )
            }
            pbandk.MessageDescriptor(
                fullName = "tutorial.AddressBook",
                messageClass = tutorial.AddressBook::class,
                messageCompanion = this,
                fields = fieldsList
            )
        }
    }
}

public fun Person?.orDefault(): Person = this ?: Person.defaultInstance

public fun Person.PhoneNumber?.orDefault(): Person.PhoneNumber = this ?: Person.PhoneNumber.defaultInstance

public fun AddressBook?.orDefault(): AddressBook = this ?: AddressBook.defaultInstance

// Omitted multiple supporting private extension methods

To see a full version of the file, see here. See the "Generated Code" section below under "Usage" for more details.

Usage

Generating Code

Pbandk's code generator leverages protoc and runs as a protoc plugin:

  • Download the latest protoc and make sure protoc is on the PATH.
  • Download the latest released protoc-gen-pbandk self-executing jar file (if you're using a SNAPSHOT build of pbandk, you might want to instead download the latest SNAPSHOT version of protoc-gen-pbandk-jvm-*-jvm8.jar).
  • Rename it to protoc-gen-pbandk, make the file executable (chmod +x protoc-gen-pbandk), and make sure it is on the PATH as well.

Then, to generate code from sample.proto and put the generated code in src/main/kotlin, run:

protoc --pbandk_out=src/main/kotlin sample.proto

The file is generated as sample.kt in the subdirectories specified by the package. Like other X_out arguments, comma-separated options can be added to --pbandk_out before the colon and out dir path:

  • To explicitly set the Kotlin package to my.pkg, use the kotlin_package option like so:

    protoc --pbandk_out=kotlin_package=my.pkg:src/main/kotlin sample.proto
    
  • If you have multiple proto packages, you can map them using kotlin_package_mapping option like so:

    protoc --pbandk_out=kotlin_package_mapping="simple.package->new.package;foo.bar.*->my.foo.bar.*":src/main/kotlin sample.proto
    
  • By default all generated classes have public visibility. To change the visibility to internal, use the visibility option like so:

    protoc --pbandk_out=visibility=internal:src/main/kotlin sample.proto
    
  • To log debug logs during generation, log=debug can be set as well.

Multiple options can be added to a single --pbandk_out argument by separating them with commas.

In addition to running protoc manually, the Protobuf Plugin for Gradle can be used. See this example to see how.
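A Gradle setup along these lines can invoke the plugin automatically (this is only an illustrative sketch: the plugin and artifact versions shown are assumptions, and the exact configuration may differ; consult the linked example project for the real setup):

```kotlin
// build.gradle.kts -- illustrative sketch only; versions are assumptions.
plugins {
    kotlin("jvm") version "1.8.22"            // hypothetical version
    id("com.google.protobuf") version "0.9.4" // hypothetical version
}

protobuf {
    protoc {
        artifact = "com.google.protobuf:protoc:3.21.12" // hypothetical version
    }
    plugins {
        id("pbandk") {
            // jvm8-classified self-executing jar of the code generator
            artifact = "pro.streem.pbandk:protoc-gen-pbandk-jvm:0.16.0-SNAPSHOT:jvm8@jar"
        }
    }
    generateProtoTasks {
        all().forEach { task ->
            // Run the pbandk plugin for every generateProto task
            task.plugins {
                id("pbandk")
            }
        }
    }
}
```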

Windows

The self-executing jar file doesn't work on Windows, and protoc doesn't support finding protoc-gen-pbandk.bat on the PATH, so the plugin has to be specified explicitly. On Windows you will first need to build protoc-gen-pbandk locally:

./gradlew :protoc-gen-pbandk:protoc-gen-pbandk-jvm:installDist

And then provide the full path to protoc:

protoc \
    --pbandk_out=src/main/kotlin \
    --plugin=protoc-gen-pbandk=/path/to/pbandk/protoc-gen-pbandk/jvm/build/install/protoc-gen-pbandk/bin/protoc-gen-pbandk.bat \
    sample.proto

Runtime Library

Pbandk's runtime library provides a Kotlin layer over the preferred Protobuf library for each platform. The libraries are present on Maven Central. Using Gradle:

repositories {
    // This repository is only needed if using a SNAPSHOT version of pbandk
    maven { url "https://s01.oss.sonatype.org/content/repositories/snapshots/" }

    mavenCentral()
}

dependencies {
    // Can be used from the `common` sourceset in a Kotlin Multiplatform project,
    // or from platform-specific JVM, Android, JS, or Native sourcesets/projects.
    implementation("pro.streem.pbandk:pbandk-runtime:0.16.0-SNAPSHOT")
}

Pbandk has a dependency on the preferred Protobuf library on each platform:

  • Android: Google Protobuf Javalite library. The Android artifact supports SDK 21 or higher.
  • JS: protobuf.js.
  • JVM and Native: Pbandk uses its own pure-Kotlin protobuf implementation that is heavily based on the Google Protobuf Java library.

Service Code Generation

Pbandk does not generate gRPC code itself, but offers a pbandk.gen.ServiceGenerator interface in the protoc-gen-pbandk-lib-jvm project with a single method that can be implemented to generate the code.

To do this, first add a dependency on the project. Only a compileOnly dependency is needed, because the generator itself provides the library at runtime:

dependencies {
    compileOnly("pro.streem.pbandk:protoc-gen-pbandk-lib:0.16.0-SNAPSHOT")
}

Then, the kotlin_service_gen option can be given to protoc to use the generator. The option is a path-separated collection of JAR files to put on the classpath. It can end with a pipe (i.e. |) followed by the fully-qualified class name of the ServiceGenerator implementation to use. If the class name is not given, the ServiceLoader mechanism is used to find the first implementation. For example, to generate with my.Generator from gen.jar, it might look like:

protoc --pbandk_out=kotlin_service_gen="gen.jar|my.Generator",kotlin_package=my.pkg:src/main/kotlin some.proto

For more details, see the custom-service-gen example.
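The ServiceGenerator interface itself lives in the protoc-gen-pbandk-lib project; the sketch below is pseudocode illustrating only the overall shape. The method name, parameter types, and result type here are assumptions, not the real signatures — consult the interface source and the custom-service-gen example for the actual API.

```kotlin
// PSEUDOCODE: names and signatures below are illustrative assumptions.
package my

// Hypothetical stand-ins for the real pbandk.gen types:
class ServiceInfo(val name: String, val kotlinPackage: String)
class GeneratedFile(val path: String, val code: String)

class Generator /* : pbandk.gen.ServiceGenerator */ {
    // The real interface exposes a single generation method; this mimics it.
    fun generate(service: ServiceInfo): List<GeneratedFile> {
        val stub = """
            package ${service.kotlinPackage}

            // TODO: emit client/server stubs for ${service.name}
        """.trimIndent()
        return listOf(GeneratedFile("${service.name}Grpc.kt", stub))
    }
}
```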

Generated Code

Package

The Kotlin package of the generated file is taken from, in order of precedence: the kotlin_package plugin option, the java_package protobuf option, or the package declared in the .proto file. If the google.protobuf package is referenced, it is assumed to be a well-known type and is changed to reference pbandk.wkt.

Message

Each Protobuf message extends pbandk.Message and has an encodeToByteArray method to encode the message with the Protobuf binary encoding into a byte array. The companion object of every message has a decodeFromByteArray method: it accepts a byte array and returns an instance of the class. Each platform also provides additional encodeTo* and decodeFrom* methods that are platform-specific. For example, the JVM provides encodeToStream and decodeFromStream methods that operate on Java's OutputStream and InputStream, respectively, and use com.google.protobuf.CodedOutputStream internally.
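A round-trip usage sketch with the generated Person message from the sample above, assuming the pbandk runtime dependency shown earlier is on the classpath (the extension-function import paths are assumptions based on pbandk's package naming):

```kotlin
import pbandk.decodeFromByteArray
import pbandk.encodeToByteArray
import tutorial.Person

fun main() {
    val person = Person(
        name = "Ada",               // example values, not from the docs
        id = 1,
        email = "ada@example.com"
    )

    // Protobuf binary round trip
    val bytes: ByteArray = person.encodeToByteArray()
    val decoded: Person = Person.decodeFromByteArray(bytes)
    check(decoded == person) // data class equality

    // Messages also support protobuf merge semantics via the `plus` operator
    val merged = person + Person(email = "ada@newdomain.example")
    check(merged.name == "Ada")
}
```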

Messages are immutable Kotlin data classes. This means they automatically implement hashCode, equals, and toString. Each class has an unknownFields map which contains information about extra fields the decoder didn't recognize. If there are values in this map, they will be encoded on output. The MessageDecoder instances have a constructor option to discard unknown fields when reading.

For proto3, the only nullable fields are other messages and oneof fields. Other values have defaults. For proto2, optional fields are nullable and defaulted as such. Types are basically the same as they would be in Java. However, bytes fields actually use a pbandk.ByteArr class which is a simple wrapper around a byte array. This was done because Kotlin does not handle array fields in data classes predictably and it wasn't worth overriding equals and hashCode every time.
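The data-class pitfall that motivates pbandk.ByteArr can be demonstrated with plain Kotlin (ByteWrapper below is a simplified stand-in written for this illustration, not the real pbandk.ByteArr):

```kotlin
// A data class with a raw ByteArray property uses the array's reference
// equality in its generated equals()/hashCode(), which is rarely what you want.
data class RawBytes(val data: ByteArray)

// A thin wrapper, analogous in spirit to pbandk.ByteArr, that compares contents.
class ByteWrapper(val array: ByteArray) {
    override fun equals(other: Any?): Boolean =
        other is ByteWrapper && array.contentEquals(other.array)
    override fun hashCode(): Int = array.contentHashCode()
}

fun main() {
    val a = RawBytes(byteArrayOf(1, 2, 3))
    val b = RawBytes(byteArrayOf(1, 2, 3))
    println(a == b) // false: arrays compare by reference

    val c = ByteWrapper(byteArrayOf(1, 2, 3))
    val d = ByteWrapper(byteArrayOf(1, 2, 3))
    println(c == d) // true: contents are compared
}
```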

Regardless of optimize_for options, the generated code is always the same. Each message has a protoSize field that lazily calculates the size of the message when first invoked. Also, each message has the plus operator defined which follows protobuf merge semantics.

Oneof

Oneof fields are generated as nested classes of a common sealed base class. Each oneof inner field is a class that wraps a single value.

The parent message also contains a nullable field for every oneof inner field. This field resolves to the oneof inner field's value when the oneof is set to that inner field. Otherwise it resolves to null.
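The addressbook example above has no oneof, so here is a hypothetical, simplified illustration of the shape just described (the real generated code uses pbandk-specific wrapper types; the proto fields and names below are invented for this example):

```kotlin
// Hypothetical, simplified shape for:
//   oneof contact { string email = 1; string phone = 2; }
data class Sample(
    val contact: Contact? = null
) {
    sealed class Contact {
        data class Email(val value: String) : Contact()
        data class Phone(val value: String) : Contact()
    }

    // Convenience accessors: resolve to the inner value when that case is set,
    // and to null otherwise.
    val email: String?
        get() = (contact as? Contact.Email)?.value
    val phone: String?
        get() = (contact as? Contact.Phone)?.value
}

fun main() {
    val s = Sample(contact = Sample.Contact.Email("a@example.com"))
    println(s.email) // a@example.com
    println(s.phone) // null
}
```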

Enum

Enum fields are generated as sealed classes with a nested object for each known enum value, and an UNRECOGNIZED nested class to hold unknown values. This is preferred over traditional enum classes because enums in protobuf are open-ended and may hold values other than the specific known ones. Traditional enum classes would not be able to capture this state, and using sealed classes this way requires the user to explicitly handle the UNRECOGNIZED case in exhaustive when clauses.

Each enum object contains a value field with the numeric value of that enum, and a name field with the string value of that enum. Developers should use the fromValue and fromName methods present on the companion object of the sealed class to map from a numeric or string value, respectively, to the corresponding enum object.

The values field on the companion object of the sealed class contains a list of all known enum values.
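Using fromValue together with an exhaustive when might look like this. The sealed-class pattern from the generated PhoneType above is reproduced standalone (trimmed of the pbandk interfaces) so the snippet is self-contained:

```kotlin
// Standalone reproduction of the generated sealed-class enum pattern.
sealed class PhoneType(val value: Int, val name: String? = null) {
    object MOBILE : PhoneType(0, "MOBILE")
    object HOME : PhoneType(1, "HOME")
    object WORK : PhoneType(2, "WORK")
    class UNRECOGNIZED(value: Int) : PhoneType(value)

    companion object {
        val values: List<PhoneType> by lazy { listOf(MOBILE, HOME, WORK) }
        fun fromValue(value: Int): PhoneType =
            values.firstOrNull { it.value == value } ?: UNRECOGNIZED(value)
        fun fromName(name: String): PhoneType =
            values.firstOrNull { it.name == name }
                ?: throw IllegalArgumentException("No PhoneType with name: $name")
    }
}

// Exhaustive when: the compiler forces handling of UNRECOGNIZED.
fun describe(type: PhoneType): String = when (type) {
    PhoneType.MOBILE -> "mobile"
    PhoneType.HOME -> "home"
    PhoneType.WORK -> "work"
    is PhoneType.UNRECOGNIZED -> "unknown value ${type.value}"
}

fun main() {
    println(describe(PhoneType.fromValue(1)))  // home
    println(describe(PhoneType.fromValue(42))) // unknown value 42
}
```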

Repeated and Map

Repeated fields are normal lists. Developers should make no assumptions about which list implementation is used. Maps are represented by Kotlin maps. In proto2, due to how map entries are serialized, both the key and the value are considered nullable.

Well-Known Types

Well known types (i.e. those in the google/protobuf imports) are shipped with the runtime under the pbandk.wkt package.

Specialized support is provided to map the types defined in google/protobuf/wrappers.proto into Kotlin nullable primitives (e.g. String? for google.protobuf.StringValue, Int? for google.protobuf.Int32Value, etc.). Specialized support for other well-known types (e.g. using Kotlin Any for google.protobuf.Any) is not yet implemented.
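For example, a message field declared as google.protobuf.StringValue would surface in the generated data class as a plain nullable Kotlin property, roughly like this (the message and field names below are hypothetical):

```kotlin
// Hypothetical generated shape for a field declared as:
//   google.protobuf.StringValue nickname = 1;
data class Profile(
    val nickname: String? = null // an absent wrapper decodes to null
)

fun main() {
    println(Profile().nickname)                 // null
    println(Profile(nickname = "ada").nickname) // ada
}
```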

Services

Services can be handled with a custom service generator. See the "Service Code Generation" section above and the custom-service-gen example.

Building

The project is built with Gradle and has several sub projects. In alphabetical order, they are:

  • conformance/js - Conformance test runner for Kotlin/JS
  • conformance/jvm - Conformance test runner for Kotlin/JVM
  • conformance/native - Conformance test runner for Kotlin/Native
  • conformance/lib - Common multiplatform code for conformance tests
  • protoc-gen-pbandk/jvm - Kotlin/JVM implementation of the code generator (can generate code for any platform, but requires JVM to run)
  • protoc-gen-pbandk/lib - Multiplatform code (only Kotlin/JVM supported at the moment) for the code generator and ServiceGenerator library
  • runtime - Multiplatform library for runtime Protobuf support

Code Generator

To generate the protoc-gen-pbandk distribution, run:

./gradlew :protoc-gen-pbandk:protoc-gen-pbandk-jvm:assembleDist

Testing Changes Locally in External Project

If you want to make changes to pbandk, and immediately test these changes in your separate project, first install the generator locally:

./gradlew :protoc-gen-pbandk:protoc-gen-pbandk-jvm:installDist

This puts the files in the build/install folder. Then you need to tell protoc where to find this plugin file. For example:

protoc \
    --plugin=protoc-gen-pbandk=/path/to/pbandk/protoc-gen-pbandk/jvm/build/install/protoc-gen-pbandk/bin/protoc-gen-pbandk \
    --pbandk_out=src/main/kotlin \
    src/main/proto/*.proto

This will generate kotlin files for the specified *.proto files, without needing to publish first.

Runtime Library

To build the runtime library for both JS and the JVM, run:

./gradlew :pbandk-runtime:assemble

Bundled Types

If any changes are made to the generated code that is output by protoc-gen-pbandk, then the well-known types (and other proto types used by pbandk) need to be re-generated using the updated protoc-gen-pbandk binary:

./gradlew generateProtos

Important: If making changes in both the :protoc-gen-pbandk:protoc-gen-pbandk-lib and :pbandk-runtime projects at the same time, then it's likely the :pbandk-runtime:generateWellKnownTypeProtos task will fail to compile. To work around this, stash the changes in the :pbandk-runtime project, run the generateWellKnownTypeProtos task with only the :protoc-gen-pbandk:protoc-gen-pbandk-lib changes, and then unstash the :pbandk-runtime changes and rerun the generateWellKnownTypeProtos task.

Conformance Tests

To run conformance tests, the conformance-test-runner must be built (does not work on Windows).

curl -sSLO https://github.com/protocolbuffers/protobuf/releases/download/v3.10.1/protobuf-all-3.10.1.tar.gz
tar xzvf protobuf-all-3.10.1.tar.gz
cd protobuf-3.10.1
./configure
make
cd conformance
make

You should now have a conformance-test-runner available in this directory. Test it by running ./conformance-test-runner --help.

Set the CONF_TEST_PATH environment variable (used to run the tests below) with:

export CONF_TEST_PATH="$(pwd)/conformance-test-runner"

Now, back in pbandk, build the JS, JVM, and Native projects via:

./gradlew :conformance:conformance-lib:assemble \
    :conformance:conformance-jvm:installDist \
    :conformance:conformance-native:build

You are now ready to run the conformance tests. Make sure CONF_TEST_PATH environment variable is set to path/to/protobuf/conformance/conformance-test-runner (see above).

Then, from the root directory:

./conformance/test-conformance.sh

Note that by default, the test-conformance.sh script runs the conformance tests for jvm, js, and linux. This will fail on macOS due to the missing Linux binaries, so in that case run the tests for each platform individually:

./conformance/test-conformance.sh jvm
./conformance/test-conformance.sh js
./conformance/test-conformance.sh macos

Releasing

Releases are handled automatically via CI once the git tag is created.

Setup a couple shell variables to simplify the rest of the commands below:

export VERSION="0.9.0"
export NEXT_VERSION="0.9.1"

To create a new release:

  1. Update CHANGELOG.md: add a date for the release version, and update the release version's GitHub compare link with a tag instead of HEAD.
    • Note: if you are releasing a pre-release version (alpha, beta, rc) then you don't need to update CHANGELOG.md
  2. Update the pbandk version number in gradle.properties, README.md, and examples/*/build.gradle.kts to remove the SNAPSHOT suffix. For example, if the current version is 0.9.0-SNAPSHOT, then update it to be 0.9.0.
  3. Comment out the note about the stable version of the documentation that is at the top of README.md and update it to point at the new version.
  4. Commit the change. E.g.: git commit -m "Bump to ${VERSION}" -a.
  5. Tag the new version. E.g.: git tag -a -m "See https://github.com/streem/pbandk/blob/v${VERSION}/CHANGELOG.md" "v${VERSION}".

Then prepare the repository for development of the next version:

  1. Update CHANGELOG.md: add a section for NEXT_VERSION that will follow the released version (e.g. if releasing 0.9.0 then add a section for 0.9.1).
    • Note: if you are releasing a pre-release version (alpha, beta, rc) then you don't need to update CHANGELOG.md
  2. Update the pbandk version number in gradle.properties, README.md, and examples/*/build.gradle.kts to ${NEXT_VERSION}-SNAPSHOT. For example, 0.9.1-SNAPSHOT.
  3. Uncomment the note about the stable version of the documentation that is at the top of README.md.
  4. Commit the change. E.g.: git commit -m "Bump to ${NEXT_VERSION}-SNAPSHOT" -a.

GitHub will build and publish the new release once it sees the new tag:

  1. Push the changes to GitHub: git push origin --follow-tags master.
  2. Wait for CI to notice the new tag, build it, and upload it to Maven Central.
  3. Create a new release on GitHub. Use the contents of the tag description as the release description. E.g.: gh release create "v${VERSION}" -F <(git tag -l --format='%(contents)' "v${VERSION}").

Credits

This repository was originally forked from https://github.com/cretz/pb-and-k. Many thanks to https://github.com/cretz for creating this library and building the initial feature set.

pbandk's People

Contributors

antongrbin, cretz, dogacel, garyp, itegulov, jeroenmols, johnlcaron, kainosk, nbaztec, nickthegroot, niematojaktomasz, rahulreddym, themerski, tinder-aminghadersohi


pbandk's Issues

Annotations for validations

Hello everyone!

I recently ran into the need for validation and would like to be able to embed annotations via this code-generating library. Here is an idea for how it could work.

For example:
proto:

message Test {
    // Validated:min=4,regexp=[a-d_1-5]+,max=145
    string field1 = 1;
}

kotlin:

data class Test (
    @field:Size(min=4, max=145)
    @field:Pattern(regexp="[a-d_1-5]+")
    val field1: String?
) {
....
}

jsonMarshal() fails on older Android versions

pbandk.ser.TimestampSerializer.serialize() is trying to use java.time.format.DateTimeFormatter when serializing a protobuf Timestamp to JSON. But that class is only available on Android starting with API level 26. To support older devices we might need to switch to SimpleDateFormat instead.

Here's the stacktrace on an older Samsung Galaxy S6 device:

2020-05-20 23:54:29.451 22039-22548/pro.streem.now W/System.err: io.reactivex.exceptions.UndeliverableException: The exception could not be delivered to the consumer because it has already canceled/disposed the flow or the exception has nowhere to go to begin with. Further reading: https://github.com/ReactiveX/RxJava/wiki/What's-different-in-2.0#error-handling | java.lang.NoClassDefFoundError: Failed resolution of: Ljava/time/format/DateTimeFormatter;
2020-05-20 23:54:29.452 22039-22548/pro.streem.now W/System.err:     at io.reactivex.plugins.RxJavaPlugins.onError(RxJavaPlugins.java:367)
2020-05-20 23:54:29.452 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.schedulers.ScheduledRunnable.run(ScheduledRunnable.java:69)
2020-05-20 23:54:29.452 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.schedulers.ScheduledRunnable.call(ScheduledRunnable.java:57)
2020-05-20 23:54:29.452 22039-22548/pro.streem.now W/System.err:     at java.util.concurrent.FutureTask.run(FutureTask.java:237)
2020-05-20 23:54:29.452 22039-22548/pro.streem.now W/System.err:     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:272)
2020-05-20 23:54:29.452 22039-22548/pro.streem.now W/System.err:     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
2020-05-20 23:54:29.453 22039-22548/pro.streem.now W/System.err:     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
2020-05-20 23:54:29.453 22039-22548/pro.streem.now W/System.err:     at java.lang.Thread.run(Thread.java:762)
2020-05-20 23:54:29.454 22039-22548/pro.streem.now W/System.err: Caused by: java.lang.NoClassDefFoundError: Failed resolution of: Ljava/time/format/DateTimeFormatter;
2020-05-20 23:54:29.454 22039-22548/pro.streem.now W/System.err:     at pbandk.Util.timestampToString(Util.kt:16)
2020-05-20 23:54:29.454 22039-22548/pro.streem.now W/System.err:     at pbandk.ser.TimestampSerializer.serialize(TimestampSerializer.kt:15)
2020-05-20 23:54:29.454 22039-22548/pro.streem.now W/System.err:     at pbandk.ser.TimestampSerializer.serialize(TimestampSerializer.kt:9)
2020-05-20 23:54:29.454 22039-22548/pro.streem.now W/System.err:     at kotlinx.serialization.json.internal.StreamingJsonOutput.encodeSerializableValue(StreamingJsonOutput.kt:228)
2020-05-20 23:54:29.454 22039-22548/pro.streem.now W/System.err:     at kotlinx.serialization.Encoder$DefaultImpls.encodeNullableSerializableValue(Coders.kt:37)
2020-05-20 23:54:29.454 22039-22548/pro.streem.now W/System.err:     at kotlinx.serialization.json.JsonOutput$DefaultImpls.encodeNullableSerializableValue(JsonOutput.kt)
2020-05-20 23:54:29.455 22039-22548/pro.streem.now W/System.err:     at kotlinx.serialization.json.internal.StreamingJsonOutput.encodeNullableSerializableValue(StreamingJsonOutput.kt:13)
2020-05-20 23:54:29.455 22039-22548/pro.streem.now W/System.err:     at kotlinx.serialization.ElementValueEncoder.encodeNullableSerializableElement(ElementWise.kt:76)
2020-05-20 23:54:29.455 22039-22548/pro.streem.now W/System.err:     at pro.streem.model.proto.WallItem$JsonMapper.write$Self(wall_items.kt:133)
2020-05-20 23:54:29.455 22039-22548/pro.streem.now W/System.err:     at pro.streem.model.proto.WallItem$JsonMapper$$serializer.serialize(wall_items.kt)
2020-05-20 23:54:29.455 22039-22548/pro.streem.now W/System.err:     at pro.streem.model.proto.WallItem$JsonMapper$$serializer.serialize(wall_items.kt:125)
2020-05-20 23:54:29.455 22039-22548/pro.streem.now W/System.err:     at kotlinx.serialization.json.internal.StreamingJsonOutput.encodeSerializableValue(StreamingJsonOutput.kt:228)
2020-05-20 23:54:29.455 22039-22548/pro.streem.now W/System.err:     at kotlinx.serialization.CoreKt.encode(Core.kt:74)
2020-05-20 23:54:29.456 22039-22548/pro.streem.now W/System.err:     at kotlinx.serialization.json.Json.stringify(Json.kt:95)
2020-05-20 23:54:29.456 22039-22548/pro.streem.now W/System.err:     at pro.streem.model.proto.Wall_itemsKt.jsonMarshalImpl(wall_items.kt:1461)
2020-05-20 23:54:29.456 22039-22548/pro.streem.now W/System.err:     at pro.streem.model.proto.Wall_itemsKt.access$jsonMarshalImpl(wall_items.kt:1)
2020-05-20 23:54:29.457 22039-22548/pro.streem.now W/System.err:     at pro.streem.model.proto.WallItem.jsonMarshal(wall_items.kt:116)
2020-05-20 23:54:29.457 22039-22548/pro.streem.now W/System.err:     at pbandk.Message$DefaultImpls.jsonMarshal(Message.kt:11)
2020-05-20 23:54:29.457 22039-22548/pro.streem.now W/System.err:     at pro.streem.model.proto.WallItem.jsonMarshal(wall_items.kt:8)
2020-05-20 23:54:29.458 22039-22548/pro.streem.now W/System.err:     at pro.streem.sdk.internal.upload.UploadManager.createQueuedStreemshot(UploadManager.kt:179)
2020-05-20 23:54:29.458 22039-22548/pro.streem.now W/System.err:     at pro.streem.sdk.internal.upload.UploadManager.access$createQueuedStreemshot(UploadManager.kt:37)
2020-05-20 23:54:29.458 22039-22548/pro.streem.now W/System.err:     at pro.streem.sdk.internal.upload.UploadManager$queueStreemshot$1.apply(UploadManager.kt:135)
2020-05-20 23:54:29.458 22039-22548/pro.streem.now W/System.err:     at pro.streem.sdk.internal.upload.UploadManager$queueStreemshot$1.apply(UploadManager.kt:37)
2020-05-20 23:54:29.458 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleMap$MapSingleObserver.onSuccess(SingleMap.java:57)
2020-05-20 23:54:29.458 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleFromCallable.subscribeActual(SingleFromCallable.java:56)
2020-05-20 23:54:29.459 22039-22548/pro.streem.now W/System.err:     at io.reactivex.Single.subscribe(Single.java:3666)
2020-05-20 23:54:29.459 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleMap.subscribeActual(SingleMap.java:34)
2020-05-20 23:54:29.459 22039-22548/pro.streem.now W/System.err:     at io.reactivex.Single.subscribe(Single.java:3666)
2020-05-20 23:54:29.459 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleMap.subscribeActual(SingleMap.java:34)
2020-05-20 23:54:29.459 22039-22548/pro.streem.now W/System.err:     at io.reactivex.Single.subscribe(Single.java:3666)
2020-05-20 23:54:29.459 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleFlatMap$SingleFlatMapCallback.onSuccess(SingleFlatMap.java:84)
2020-05-20 23:54:29.459 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleZipArray$ZipCoordinator.innerSuccess(SingleZipArray.java:119)
2020-05-20 23:54:29.460 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleZipArray$ZipSingleObserver.onSuccess(SingleZipArray.java:170)
2020-05-20 23:54:29.460 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.observable.ObservableElementAtSingle$ElementAtObserver.onNext(ObservableElementAtSingle.java:89)
2020-05-20 23:54:29.460 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.util.NotificationLite.accept(NotificationLite.java:246)
2020-05-20 23:54:29.460 22039-22548/pro.streem.now W/System.err:     at io.reactivex.subjects.BehaviorSubject$BehaviorDisposable.test(BehaviorSubject.java:569)
2020-05-20 23:54:29.460 22039-22548/pro.streem.now W/System.err:     at io.reactivex.subjects.BehaviorSubject$BehaviorDisposable.emitFirst(BehaviorSubject.java:530)
2020-05-20 23:54:29.460 22039-22548/pro.streem.now W/System.err:     at io.reactivex.subjects.BehaviorSubject.subscribeActual(BehaviorSubject.java:239)
2020-05-20 23:54:29.461 22039-22548/pro.streem.now W/System.err:     at io.reactivex.Observable.subscribe(Observable.java:12267)
2020-05-20 23:54:29.461 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.observable.ObservableElementAtSingle.subscribeActual(ObservableElementAtSingle.java:37)
2020-05-20 23:54:29.461 22039-22548/pro.streem.now W/System.err:     at io.reactivex.Single.subscribe(Single.java:3666)
2020-05-20 23:54:29.461 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleZipArray.subscribeActual(SingleZipArray.java:63)
2020-05-20 23:54:29.461 22039-22548/pro.streem.now W/System.err:     at io.reactivex.Single.subscribe(Single.java:3666)
2020-05-20 23:54:29.461 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleFlatMap.subscribeActual(SingleFlatMap.java:36)
2020-05-20 23:54:29.462 22039-22548/pro.streem.now W/System.err:     at io.reactivex.Single.subscribe(Single.java:3666)
2020-05-20 23:54:29.462 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.operators.single.SingleSubscribeOn$SubscribeOnObserver.run(SingleSubscribeOn.java:89)
2020-05-20 23:54:29.462 22039-22548/pro.streem.now W/System.err:     at io.reactivex.Scheduler$DisposeTask.run(Scheduler.java:578)
2020-05-20 23:54:29.462 22039-22548/pro.streem.now W/System.err:     at io.reactivex.internal.schedulers.ScheduledRunnable.run(ScheduledRunnable.java:66)
2020-05-20 23:54:29.462 22039-22548/pro.streem.now W/System.err:    ... 6 more
2020-05-20 23:54:29.466 22039-22623/pro.streem.now D/Camera2Fragment: image size: Size(width=1280.0, height=720.0, unknownFields={})
2020-05-20 23:54:29.466 22039-22623/pro.streem.now D/gralloc: gralloc_lock_ycbcr success. format : 11, usage: 3, ycbcr.y: 0x7d6a0ae000, .cb: 0x7d6a18f001, .cr: 0x7d6a18f000, .ystride: 1280 , .cstride: 1280, .chroma_step: 2
2020-05-20 23:54:29.469 22039-22548/pro.streem.now W/System.err: Caused by: java.lang.ClassNotFoundException: Didn't find class "java.time.format.DateTimeFormatter" on path: DexPathList[[zip file "/data/app/pro.streem.now-1/base.apk"],nativeLibraryDirectories=[/data/app/pro.streem.now-1/lib/arm64, /data/app/pro.streem.now-1/base.apk!/lib/arm64-v8a, /system/lib64, /vendor/lib64]]
2020-05-20 23:54:29.469 22039-22548/pro.streem.now W/System.err:     at dalvik.system.BaseDexClassLoader.findClass(BaseDexClassLoader.java:56)
2020-05-20 23:54:29.469 22039-22548/pro.streem.now W/System.err:     at java.lang.ClassLoader.loadClass(ClassLoader.java:380)
2020-05-20 23:54:29.469 22039-22548/pro.streem.now W/System.err:     at java.lang.ClassLoader.loadClass(ClassLoader.java:312)

0.8.2 release date

@garyp is there an anticipated date for the next release of the library? I am mainly interested in the json option we added

Error in build: class ... is public, should be declared in a file named

Error:

Http.java:6: error: class HTTP is public, should be declared in a file named HTTP.java
public final class HTTP implements pbandk.Message<com.example.app.protomodels.HTTP> {
             ^
/Users/morteza/AndroidStudioProjects/bb-android/app/build/tmp/kapt3/stubs/developDebug/com/example/app/protomodels/HttpKt.java:11: error: cannot find symbol

The generated stub file in tmp is named with camel case: Http.java
but the generated Kotlin file is named http.kt

public final class HTTP implements pbandk.Message<com.example.app.protomodels.HTTP> {
    @org.jetbrains.annotations.NotNull()
    private final kotlin.Lazy protoSize$delegate = null;
    @org.jetbrains.annotations.NotNull()
    private final java.lang.String version = null;
    @org.jetbrains.annotations.NotNull()
...
...
data class Http(
    val rules: List<com.example.app.protomodels.HttpRule> = emptyList(),
    val fullyDecodeReservedExpansion: Boolean = false,
    val unknownFields: Map<Int, pbandk.UnknownField> = emptyMap()
) : pbandk.Message<Http> 
...
...

Proto file:


syntax = "proto3";

package google.api;

option cc_enable_arenas = true;
option go_package = "google.golang.org/genproto/googleapis/api/annotations;annotations";
option java_multiple_files = true;
option java_outer_classname = "HttpProto";
option java_package = "com.google.api";
option objc_class_prefix = "GAPI";

// Defines the HTTP configuration for an API service. It contains a list of
// [HttpRule][google.api.HttpRule], each specifying the mapping of an RPC method
// to one or more HTTP REST API methods.
message Http {

  repeated HttpRule rules = 1;


  bool fully_decode_reserved_expansion = 2;
}

What is wrong here?
How can I fix this?

New failing tests since 0.9.0-SNAPSHOT

With the new 0.9.0-SNAPSHOT version, new tests are failing:

Required.Proto2.ProtobufInput.RepeatedScalarSelectsLast.STRING.ProtobufOutput
Required.Proto2.ProtobufInput.ValidDataRepeated.STRING.ProtobufOutput
Required.Proto2.ProtobufInput.ValidDataScalar.STRING[6].ProtobufOutput
Required.Proto3.ProtobufInput.RepeatedScalarSelectsLast.STRING.ProtobufOutput
Required.Proto3.ProtobufInput.ValidDataRepeated.STRING.ProtobufOutput
Required.Proto3.ProtobufInput.ValidDataScalar.STRING[6].ProtobufOutput

Name collision with nested messages

You can find a repro proto/Kotlin class here: https://github.com/sebleclerc/pbandk-name-collision

Basically: you have a message named MessageA defined at the root level, and within another root-level message you have another message also named MessageA. This is perfectly fine as far as protobuf is concerned, but the generated Kotlin code can't compile.

From the generated code you have:

override operator fun plus(other: MessageA?) = protoMergeImpl(other)
    override val protoSize by lazy { protoSizeImpl() }
    override fun protoMarshal(m: pbandk.Marshaller) = protoMarshalImpl(m)
    override fun jsonMarshal(json: Json) = jsonMarshalImpl(json)
    fun toJsonMapper() = toJsonMapperImpl()
    companion object : pbandk.Message.Companion<MessageA> {
        val defaultInstance by lazy { MessageA() }
        override fun protoUnmarshal(u: pbandk.Unmarshaller) = MessageA.protoUnmarshalImpl(u)
        override fun jsonUnmarshal(json: Json, data: String) = MessageA.jsonUnmarshalImpl(json, data)
    }

Each time MessageA is referenced, the compiler doesn't know which one to choose: the top-level MessageA or MessageB.MessageA.

I did a quick fix: qualify the reference with the outer class name. For example:
MessageB.MessageA.protoUnmarshalImpl(u)

This works fine! I can change the Generator to make that happen, and I'll open a PR for that too 😄

Thanks!
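A self-contained Kotlin sketch (the classes below are stand-ins, not pbandk's generated code) of the shadowing at play, and of how qualifying with the outer name keeps the reference unambiguous:

```kotlin
// Stand-in for a top-level proto message named MessageA.
data class MessageA(val source: String = "top-level")

// Stand-in for an outer message that also nests a MessageA.
class MessageB {
    data class MessageA(val source: String = "nested")

    companion object {
        // Inside MessageB's scope, a bare `MessageA` resolves to the
        // nested class, shadowing the top-level one...
        fun makeNested() = MessageA()

        // ...so generated code that wants a specific type must say so
        // explicitly, e.g. MessageB.MessageA, to stay unambiguous.
        fun makeQualified() = MessageB.MessageA()
    }
}

fun main() {
    println(MessageB.makeNested().source)    // the nested class wins
    println(MessageB.makeQualified().source)
    println(MessageA().source)               // top-level scope: top-level class
}
```

This is why emitting fully qualified names from the generator, as suggested above, resolves the collision.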

Retrofit Converter

Is there a recommended Converter for Retrofit for the generated Kotlin code?

Windows support

I'm using the pbandk plugin with Google's protobuf-gradle-plugin as per your example.
I'm on Windows, and the only way I can get it to work is by specifying the path to protoc-gen-kotlin.bat in the kotlin plugin block like below. I've cloned the project and built it, so that's where the path comes from.

protobuf {
   ...
    plugins {
        kotlin {
            // This works on windows, but obviously isn't cross platform. 
            path = 'C:\\Users\\me\\pbandk\\protoc-gen-kotlin\\jvm\\build\\install\\protoc-gen-kotlin\\bin\\protoc-gen-kotlin.bat'
              // This fails, see error below.
//            artifact = 'com.github.streem.pbandk:protoc-gen-kotlin-jvm:0.8.0:jvm8@jar'
        }
    }
    ...
}

Using the jar as suggested in the example gives

* What went wrong:
Execution failed for task ':app:generateDebugProto'.
> protoc: stdout: . stderr: --kotlin_out: protoc-gen-kotlin: %1 is not a valid Win32 application.

The full command args from gradle --info are:
[
C:\Users\olly_\.gradle\caches\modules-2\files-2.1\com.google.protobuf\protoc\3.11.1\2f33a303623e20f356c70fce00579ae377dbc46b\protoc-3.11.1-windows-x86_64.exe,
-IC:\Users\olly_\myapp\engine-rust\proto\proto,
-IC:\Users\olly_\myapp\app\build\extracted-protos\main,
-IC:\Users\olly_\myapp\app\build\extracted-protos\debug,
-IC:\Users\olly_\myapp\app\build\extracted-include-protos\debug,
--plugin=protoc-gen-kotlin=C:\Users\olly_\.gradle\caches\modules-2\files-2.1\com.github.streem.pbandk\protoc-gen-kotlin-jvm\0.8.0\921451f497c1bfd53d0b145c0e8753b0c0697569\protoc-gen-kotlin-jvm-0.8.0-jvm8.jar,
--kotlin_out=log=debug:C:\Users\olly_\myapp\app/gen/debug/kotlin, {list of .proto files}
]

Running protoc-gen-kotlin as root fails with an error

Running protoc as root with the --kotlin_out option produces an error:

--kotlin_out: protoc-gen-kotlin: Plugin output is unparseable: \033[0;33mApplication
is running as root (UID 0). This is considered
insecure.\033[0m\nz\344w\n\031pbandk/google/api/http.ktz

This is because pbandk uses Spring Boot's launchScript feature in order to turn protoc-gen-kotlin.jar into a self-executable file. The default launch script used by Spring Boot prints a warning if run as root. This is mainly in place to prevent running Spring Boot jars as services under the root account. Since pbandk doesn't need the service-related functionality, we should probably replace the default Spring Boot launch script with a much-simplified custom version using something like:

tasks.getByName<BootJar>("bootJar") {
    launchScript {
        script = file("src/custom.script")
    }
}

Add the ability to unit test the output of protoc-gen-kotlin

Create some test scaffolding to allow testing that the source code produced by protoc-gen-kotlin is compilable by the Kotlin compiler.

Probably the unit test would take a CodeGeneratorRequest as input, run it through pbandk.gen.runGenerator(), and use a library like https://github.com/tschuchortdev/kotlin-compile-testing to run the Kotlin compiler on the CodeGeneratorResponse output. Even better would be for the unit test to take the source of a .proto file as input and run protoc to generate the CodeGeneratorRequest.

Make json serialization code gen optional

If we could make the JSON marshal/unmarshal methods optional, we could save consumers a lot of code that they might not need. We don't particularly need JSON serialization (everywhere) as we mostly use the binary format.

Is it possible to generate rpc requests as retrofit requests?

Thanks for your great work.
It would be great if you could add support for the Retrofit library by generating a service file, which is a very simple file, to make API calls easier and reduce boilerplate code. This could be an optional setting in the plugin that the user could enable or disable.

Proto2 `required` fields that contain default values are not serialized correctly

My proto is as follows

message Update {
    repeated Prop prop = 1;

    message Prop {
        oneof prop_update_type {
            StandardProp standard_prop = 2;
            RegularProp regular_prop = 5;
        }

        message StandardProp {
            optional string id = 1;
            optional string title = 2;
        }

        message RegularProp {
            required string name = 1;
            required bool is_array = 2;
            optional Value value = 3;
        }
    }
}
message Value {
    oneof value {
        string string_value = 1;
        bool boolean_value = 2;
        int32 integer_value = 3;
        int64 long_value = 4;
        float float_value = 5;
        double double_value = 6;
    }
}

The Update object's protoMarshal() output is sent across Kafka, and on the consumer side (Java platform) I am getting exceptions. The exceptions differ between projects.

Note: I am getting an exception only when Prop contains RegularProp.

Listed below are the exception messages I am getting in the Java projects that deserialize using Update.parseFrom(data):

Project 1: Value cannot cast to Update
Project 2: prop[0].regular_prop.is_array not present

Is this due to the presence of Value in RegularProp?

The same proto is used by other Kafka producers in different projects (JavaScript, Java, etc.) and works fine. So, to repeat: the exception occurs only for Kafka messages sent from Kotlin to other platforms, and only when using RegularProp.
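For context, a hedged sketch of the serialization rule this report suggests is being violated (the enums and helper below are made up for illustration, not pbandk's actual code): proto2 required fields must be written to the wire even when they hold the type's default value, while proto3 singular scalars are skipped at their default.

```kotlin
enum class Syntax { PROTO2, PROTO3 }
enum class Label { REQUIRED, OPTIONAL, REPEATED }

// Hypothetical decision helper: should a scalar field be written to the wire?
fun shouldEmit(syntax: Syntax, label: Label, isDefaultValue: Boolean, wasExplicitlySet: Boolean): Boolean =
    when {
        label == Label.REQUIRED -> true              // proto2 required: always written
        syntax == Syntax.PROTO2 -> wasExplicitlySet  // proto2 optional: written if set, even to the default
        else -> !isDefaultValue                      // proto3 singular: defaults are skipped
    }

fun main() {
    // `required bool is_array = 2;` set to false (the default) must still be
    // emitted; otherwise a strict proto2 consumer reports "is_array not present".
    println(shouldEmit(Syntax.PROTO2, Label.REQUIRED, isDefaultValue = true, wasExplicitlySet = true))
}
```

Applying the proto3 "skip defaults" rule to proto2 required fields would produce exactly the "not present" error reported above.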

Kotlin 1.3.72

Are there any plans to update to kotlin 1.3.72?

const val kotlin = "1.3.61"

I am experiencing issues after upgrading Kotlin to 1.3.72 in my project.

java.lang.NoSuchMethodError: kotlinx.serialization.json.JsonConfiguration.copy$default(Lkotlinx/serialization/json/JsonConfiguration;ZZZZZLjava/lang/String;ZLjava/lang/String;Lkotlinx/serialization/UpdateMode;ILjava/lang/Object;)Lkotlinx/serialization/json/JsonConfiguration;

	at pbandk.Message$DefaultImpls.jsonMarshal(Message.kt:11)
	at com.generated.analytics.model.AppAnalyticsEvent.jsonMarshal(app.kt:35)
	at com.analytics.rapidfire.retrofit.pbandk.ConverterFactoryTest.ResponseBodyConverter - given json content type, should return message(ConverterFactoryTest.kt:185)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)

Failing `RepeatedScalarMessageMerge` conformance test

These two conformance tests fail on both JVM and JS, so most likely it's a bug in the pbandk code:

Required.Proto2.ProtobufInput.RepeatedScalarMessageMerge.ProtobufOutput
Required.Proto3.ProtobufInput.RepeatedScalarMessageMerge.ProtobufOutput

The relevant conformance failure logs are:

ERROR, test=Required.Proto2.ProtobufInput.RepeatedScalarMessageMerge.ProtobufOutput: Output was not equivalent to reference message: deleted: optional_nested_message.corecursive.optional_int64: 1234
modified: optional_nested_message.corecursive.repeated_int32[0]: 1234 -> 4321
deleted: optional_nested_message.corecursive.repeated_int32[1]: 4321
. request=protobuf_payload: "\222\001\014\022\n\010\322\t\020\322\t\370\001\322\t\222\001\014\022\n\010\341!\030\341!\370\001\341!" requested_output_format: PROTOBUF message_type: "protobuf_test_messages.proto2.TestAllTypesProto2" test_category: BINARY_TEST, response=protobuf_payload: "\222\001\014\022\n\010\341!\030\341!\370\001\341!"
ERROR, test=Required.Proto3.ProtobufInput.RepeatedScalarMessageMerge.ProtobufOutput: Output was not equivalent to reference message: deleted: optional_nested_message.corecursive.optional_int64: 1234
modified: optional_nested_message.corecursive.repeated_int32[0]: 1234 -> 4321
deleted: optional_nested_message.corecursive.repeated_int32[1]: 4321
. request=protobuf_payload: "\222\001\014\022\n\010\322\t\020\322\t\370\001\322\t\222\001\014\022\n\010\341!\030\341!\370\001\341!" requested_output_format: PROTOBUF message_type: "protobuf_test_messages.proto3.TestAllTypesProto3" test_category: BINARY_TEST, response=protobuf_payload: "\222\001\r\022\013\010\341!\030\341!\372\001\002\341!"
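For context, these tests exercise protobuf's merge rule: when the same singular message field appears more than once on the wire, the later occurrence must be merged into the earlier one (set scalars win, repeated fields concatenate) rather than replacing it wholesale. A minimal sketch with a hypothetical message type:

```kotlin
// Hypothetical message with one scalar and one repeated field,
// mirroring the fields named in the conformance failure above.
data class Nested(val optionalInt64: Long? = null, val repeatedInt32: List<Int> = emptyList())

// Merge semantics for two wire occurrences of the same singular message
// field: scalars set in `other` win, repeated fields are concatenated.
fun merge(first: Nested, other: Nested) = Nested(
    optionalInt64 = other.optionalInt64 ?: first.optionalInt64,
    repeatedInt32 = first.repeatedInt32 + other.repeatedInt32,
)

fun main() {
    val a = Nested(optionalInt64 = 1234L, repeatedInt32 = listOf(1234))
    val b = Nested(repeatedInt32 = listOf(4321))
    // Merging must keep optionalInt64 = 1234 and yield [1234, 4321];
    // replacing `a` with `b` (the reported behavior) loses both.
    println(merge(a, b))
}
```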

Help with dependencies

Hello, can anybody help me with Gradle dependencies?
I generated contracts and placed all of them in another project.

I wrote this build.gradle:

plugins {
    id 'org.jetbrains.kotlin.plugin.serialization' version '1.3.61'
    id "com.google.protobuf" version "0.8.12"
}

repositories {
    jcenter()
}

dependencies {
    implementation(
            project(":***"),
            project(":***"),
            "org.jetbrains.kotlin:kotlin-stdlib:1.3.61",
            "org.jetbrains.kotlinx:kotlinx-serialization-runtime:0.14.0",
            'com.github.streem.pbandk:pbandk-runtime-jvm:0.8.0',
            'org.jetbrains.kotlinx:kotlinx-coroutines-core:1.1.0'
    )
}

And after syncing the project I got error messages like the ones in the attached screenshots.

Unclear how to use this project with the multiplatform plugin

My project is using the multiplatform plugin (kotlin-multiplatform) to build for jvm and js.

I can't figure out how to use pbandk with it. I've followed the gradle example, added a proto {} block at the top level of my file, and added an apply plugin: 'com.google.protobuf' statement after apply plugin: 'kotlin-multiplatform'. I've put my protos into src/commonMain/proto (I hope this is right).

Everything downloaded, but it doesn't appear to actually be compiling the protos: no output .kt files are generated, and modifying the proto to have syntax errors doesn't break the build.

I tried adding a proto block with srcDir 'src/commonMain/proto' inside kotlin { sourceSets { commonMain { ... } } } to get it to find the sources, but Gradle says 'proto' is unrecognized there.

I can't find any tasks related to proto in ./gradlew tasks | grep -i proto.

Any suggestions? Would it perhaps be possible to add a multiplatform example?

Failing `ValidDataRepeated.*.PackedInput` conformance tests

These conformance tests fail on both JVM and JS, so most likely it's a bug in the pbandk code:

Required.Proto3.ProtobufInput.ValidDataRepeated.BOOL.PackedInput.ProtobufOutput
Required.Proto3.ProtobufInput.ValidDataRepeated.INT32.PackedInput.ProtobufOutput
Required.Proto3.ProtobufInput.ValidDataRepeated.UINT32.PackedInput.ProtobufOutput

The relevant conformance failure logs are:

ERROR, test=Required.Proto3.ProtobufInput.ValidDataRepeated.INT32.PackedInput.ProtobufOutput: Output was not equivalent to reference message: added: repeated_int32[8]: 0
added: repeated_int32[9]: 0
added: repeated_int32[10]: 0
added: repeated_int32[11]: 0
added: repeated_int32[12]: 0
added: repeated_int32[13]: 0
added: repeated_int32[14]: 0
added: repeated_int32[15]: 0
. request=protobuf_payload: "\372\001)\000\271`\271\340\200\000\271\340\200\200\200\200\200\200\000\377\377\377\377\007\200\200\200\200\370\377\377\377\377\001\200\200\200\200 \377\377\377\377\037" requested_output_format: PROTOBUF message_type: "protobuf_test_messages.proto3.TestAllTypesProto3" test_category: BINARY_TEST, response=protobuf_payload: "\372\001)\000\271`\271`\271`\377\377\377\377\007\200\200\200\200\370\377\377\377\377\001\000\377\377\377\377\377\377\377\377\377\001\000\000\000\000\000\000\000\000"
ERROR, test=Required.Proto3.ProtobufInput.ValidDataRepeated.UINT32.PackedInput.ProtobufOutput: Output was not equivalent to reference message: added: repeated_uint32[7]: 0
added: repeated_uint32[8]: 0
added: repeated_uint32[9]: 0
added: repeated_uint32[10]: 0
added: repeated_uint32[11]: 0
added: repeated_uint32[12]: 0
added: repeated_uint32[13]: 0
added: repeated_uint32[14]: 0
added: repeated_uint32[15]: 0
added: repeated_uint32[16]: 0
added: repeated_uint32[17]: 0
added: repeated_uint32[18]: 0
added: repeated_uint32[19]: 0
. request=protobuf_payload: "\212\002\037\000\271`\271\340\200\000\271\340\200\200\200\200\200\200\000\377\377\377\377\017\200\200\200\200 \377\377\377\377\037" requested_output_format: PROTOBUF message_type: "protobuf_test_messages.proto3.TestAllTypesProto3" test_category: BINARY_TEST, response=protobuf_payload: "\212\002\037\000\271`\271`\271`\377\377\377\377\017\000\377\377\377\377\017\000\000\000\000\000\000\000\000\000\000\000\000\000"
ERROR, test=Required.Proto3.ProtobufInput.ValidDataRepeated.BOOL.PackedInput.ProtobufOutput: Output was not equivalent to reference message: added: repeated_bool[3]: false
added: repeated_bool[4]: false
added: repeated_bool[5]: false
. request=protobuf_payload: "\332\002\006\000\001\316\302\361\005" requested_output_format: PROTOBUF message_type: "protobuf_test_messages.proto3.TestAllTypesProto3" test_category: BINARY_TEST, response=protobuf_payload: "\332\002\006\000\001\001\000\000\000"
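For context, a hedged sketch of packed varint decoding (not pbandk's actual decoder): the reader must consume every byte of each varint, including over-long encodings that use extra continuation bytes, before starting the next value. If the cursor only advances past the minimal encoding, the leftover continuation bytes decode as phantom zero values like the ones reported above.

```kotlin
// Decode a packed repeated varint payload: read base-128 varints
// until the length-delimited payload is exhausted.
fun decodePackedVarints(payload: ByteArray): List<Long> {
    val values = mutableListOf<Long>()
    var pos = 0
    while (pos < payload.size) {
        var result = 0L
        var shift = 0
        while (true) {
            val b = payload[pos++].toInt()
            result = result or ((b and 0x7F).toLong() shl shift)
            if (b and 0x80 == 0) break  // high bit clear: last byte of this varint
            shift += 7
        }
        values += result
    }
    return values
}

fun main() {
    // 300 encoded minimally (0xAC 0x02), then 1 encoded over-long
    // (0x81 0x80 0x00): each payload must decode to exactly one value.
    println(decodePackedVarints(byteArrayOf(0xAC.toByte(), 0x02)))
    println(decodePackedVarints(byteArrayOf(0x81.toByte(), 0x80.toByte(), 0x00)))
}
```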

Proto3 Json - missing/null/default values are not omitted in the exported Json

The proto3 JSON spec says that fields set to their default value are to be omitted from the exported JSON. Currently, this is not honored.

See https://developers.google.com/protocol-buffers/docs/proto3#json:

If a value is missing in the JSON-encoded data or if its value is null, it will be interpreted as the appropriate default value when parsed into a protocol buffer. If a field has the default value in the protocol buffer, it will be omitted in the JSON-encoded data by default to save space.

Context:

TestMessage.proto

syntax = "proto3";

package messages.v1alpha1;

message TestMessage {
  string id = 1;
}

TestTestMessage.kt

println(TestMessage().jsonMarshal());
println(TestMessage(id="").jsonMarshal());

Expected:

{}
{}

Actual:

{"id":null}
{"id":null}
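A minimal hand-written sketch of the expected emission rule for this TestMessage (not pbandk's actual serializer; the function below is made up for illustration): a field holding its default value (the empty string for `id`) is omitted from the JSON object entirely, not emitted as null.

```kotlin
// Hand-rolled proto3 JSON emission for `message TestMessage { string id = 1; }`.
fun testMessageToJson(id: String): String {
    val fields = mutableListOf<String>()
    if (id != "") fields += "\"id\":\"$id\""  // the default "" is omitted
    return "{${fields.joinToString(",")}}"
}

fun main() {
    println(testMessageToJson(""))     // default value: empty object
    println(testMessageToJson("abc"))  // non-default value: field emitted
}
```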

Strip enum prefix from enums (as inspired by swift-protobuf)

The protobuf style guide suggests prefixing the names of all enum values with the enum's name (see https://developers.google.com/protocol-buffers/docs/style#enums and https://github.com/uber/prototool/tree/dev/style#enum-value-names) because .proto files use C++ scoping rules for enum values. The same scoping rules do not apply in Kotlin (where the enum value's name only has to be unique within the enum).

To make protobuf enums friendlier to use from Kotlin, we can automatically strip the enum name from the enum value names the same way that swift-protobuf does it: https://github.com/apple/swift-protobuf/blob/master/Documentation/API.md#enum-and-enum-case-naming.

Thus this protobuf enum:

enum Foo {
    FOO_INVALID = 0;
    FOO_BAR = 1;
    FOO_BAZ = 2;
}

would be used from Kotlin as Foo.BAR instead of Foo.FOO_BAR.
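A hedged sketch of the stripping rule (function names are made up, and swift-protobuf's real algorithm handles more edge cases): normalize the enum's name to UPPER_SNAKE_CASE, strip it plus the separating underscore from each value name, and fall back to the original name when the remainder would not start with a letter.

```kotlin
// Convert a CamelCase type name to UPPER_SNAKE_CASE, e.g. HttpRule -> HTTP_RULE.
fun toUpperSnake(name: String): String =
    name.replace(Regex("(?<=[a-z0-9])(?=[A-Z])"), "_").uppercase()

fun stripEnumPrefix(enumName: String, valueName: String): String {
    val prefix = toUpperSnake(enumName) + "_"
    if (!valueName.startsWith(prefix)) return valueName
    val stripped = valueName.removePrefix(prefix)
    // Keep the original name if stripping would leave an invalid
    // identifier start, e.g. FOO_2 -> "2".
    return if (stripped.firstOrNull()?.isLetter() == true) stripped else valueName
}

fun main() {
    println(stripEnumPrefix("Foo", "FOO_BAR"))      // BAR
    println(stripEnumPrefix("Foo", "FOO_INVALID"))  // INVALID
}
```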

0.8.1 install issue

Trying to switch to 0.8.1 I get this error:

  • What went wrong:
    Execution failed for task ':model-generation:analytics-kotlin:extractIncludeProto'.

Could not resolve all files for configuration ':model-generation:analytics-kotlin:compileProtoPath'.
Could not resolve com.github.streem.pbandk:pbandk-runtime-jvm:0.8.1.
Required by:
project :model-generation:analytics-kotlin
> The consumer was configured to find a component, preferably only the resources files. However we cannot choose between the following variants of com.github.streem.pbandk:pbandk-runtime-jvm:0.8.1:
- jvm-api
- jvm-runtime
- metadata-api
All of them match the consumer attributes:
- Variant 'jvm-api' capability com.github.streem.pbandk:pbandk-runtime-jvm:0.8.1 declares a component, packaged as a jar:
- Unmatched attributes:
- Provides release status but the consumer didn't ask for it
- Provides an API but the consumer didn't ask for it
- Provides attribute 'org.jetbrains.kotlin.platform.type' with value 'jvm' but the consumer didn't ask for it
- Variant 'jvm-runtime' capability com.github.streem.pbandk:pbandk-runtime-jvm:0.8.1 declares a component, packaged as a jar:
- Unmatched attributes:
- Provides release status but the consumer didn't ask for it
- Provides a runtime but the consumer didn't ask for it
- Provides attribute 'org.jetbrains.kotlin.platform.type' with value 'jvm' but the consumer didn't ask for it
- Variant 'metadata-api' capability com.github.streem.pbandk:pbandk-runtime-jvm:0.8.1:
- Unmatched attributes:
- Doesn't say anything about its elements (required them preferably only the resources files)
- Provides release status but the consumer didn't ask for it
- Provides a usage of 'kotlin-api' but the consumer didn't ask for it
- Provides attribute 'org.jetbrains.kotlin.platform.type' with value 'common' but the consumer didn't ask for it

Generate KDoc comments for message/field/enum comments in proto files

Comments on messages, fields, enums, and enum values in .proto files should be output to the generated Kotlin code as KDoc comments.

pbandk.gen.FileBuilder will need to be modified to store the proto comments on the correct instance of pbandk.gen.File.Type.Message, pbandk.gen.File.Type.Enum, pbandk.gen.File.Type.Enum.Value and pbandk.gen.File.Field.Numbered when constructing those objects. Then pbandk.gen.CodeGenerator can use those stored comments to output KDoc comments when generating Kotlin code.

See the copious documentation on the protobuf SourceCodeInfo type for details on how to associate proto comments with the correct message/field definition: https://github.com/protocolbuffers/protobuf/blob/master/src/google/protobuf/descriptor.proto#L756.

This thread from the protobuf mailing list might also be helpful: https://groups.google.com/g/protobuf/c/AyOQvhtwvYc.
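A minimal sketch of the association (the data structures below are stand-ins, not pbandk's types): each SourceCodeInfo location is keyed by a path of descriptor.proto field numbers and indices, so a generator can look comments up while walking message and field indices. Field 4 is FileDescriptorProto.message_type and field 2 is DescriptorProto.field.

```kotlin
// Stand-in for SourceCodeInfo: leading comments keyed by location path.
// [4, 0]       -> the first message in the file
// [4, 0, 2, 1] -> the second field of the first message
val leadingComments: Map<List<Int>, String> = mapOf(
    listOf(4, 0) to "Comment on the first message in the file.",
    listOf(4, 0, 2, 1) to "Comment on the second field of the first message.",
)

// A generator walking message index `m` and field index `f` would look up:
fun commentForField(m: Int, f: Int): String? = leadingComments[listOf(4, m, 2, f)]

fun main() {
    println(commentForField(0, 1))
    println(commentForField(0, 0))  // no comment recorded for this field
}
```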

Why 'unknownFields' and 'protoSize$delegate' sent on request message from kotlin app?

We are the development team of a new project with the following tech stack.

  • server-side application: golang + echo
  • android application: kotlin + okhttp
  • ios application: swift + alamofire
  • messaging protocol between clients and server: Protocol Buffers or JSON over HTTP/1.1

I'm the server-side engineer on this project and don't have any knowledge of Java or Kotlin.

Today, our Kotlin engineer reported to me a strange behavior in the code generated via pbandk-0.8.0/protoc-gen-kotlin.
According to her report, each request made with the generated Kotlin code contains "unknownFields" and "protoSize$delegate". These fields should not be included.

For debugging, we tried some requests with JSON. Then we found the following error in the server-side application's log.

failed to unmarshal json body: &errors.errorString{s:"unknown field \"protoSize$delegate\" in myapp_proto_v1.RegisterUserAccountRequest"}

I want to know how to remove protoSize$delegate and unknownFields from the request payload.

A snippet from the proto file follows.

### myapp_proto_v1.proto

service MyappService {
  rpc RegisterUserAccount(RegisterUserAccountRequest) returns (RegisterUserAccountResponse) {
    option (google.api.http) = {
      post: "/v1/user/"
    };
  };
}

// RegisterUserAccountRequest request message for register user account
message RegisterUserAccountRequest {
    string email = 1;
    string password = 2;
}

// RegisterUserAccountResponse response message for register user account
message RegisterUserAccountResponse {
    int32 status = 1;
    string message = 2;
    string user_message = 3;
    bool is_success = 4;
    string bearer_token = 5;
}

Best regards,
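One likely cause, worth confirming: serializing the generated data class with a reflection-based JSON library (instead of calling the generated jsonMarshal() method) picks up Kotlin's synthetic backing fields. A `by lazy` property compiles to a field named `protoSize$delegate`, which reflection-based serializers will happily include. A self-contained demonstration with a stand-in class (not the actual generated one):

```kotlin
// Stand-in mimicking the shape of a pbandk-generated message class.
class RegisterUserAccountRequest(val email: String, val password: String) {
    // A delegated property creates a synthetic `protoSize$delegate` field.
    val protoSize: Int by lazy { email.length + password.length }
}

fun main() {
    val fieldNames = RegisterUserAccountRequest::class.java.declaredFields.map { it.name }
    println(fieldNames)  // includes the synthetic "protoSize$delegate" field
}
```

If the Kotlin client serializes requests this way, switching to the library's own serialization entry points should remove the extra fields.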

Provide a Kotlin DSL configuration example

I'm trying to use pb-and-k in my project that I've configured using the Kotlin DSL (.kts) and I can't figure out how to make it work. Please provide an example configuration to generate Kotlin classes from a .proto file using the Gradle Kotlin DSL.

Fully implement the proto3 JSON mapping spec

From https://developers.google.com/protocol-buffers/docs/proto3#json.

Well-known types

  • Any (see also #63)
  • Duration
  • Empty
  • FieldMask
  • Struct / Value / NullValue / ListValue
  • Timestamp:
    • "9999-12-31T23:59:59.999999999Z" fails on Kotlin/JS
    • lowercase 't' and lowercase 'z' should fail on Kotlin/JVM and Kotlin/JS
    • values outside of the allowed range ("0001-01-01T00:00:00Z" to "9999-12-31T23:59:59.999999999Z") should fail on Kotlin/JVM and Kotlin/JS
  • Wrapper types

Conformance test edge cases

  • more than one value for a oneof field in a message should be rejected (currently the last value in the JSON input wins)

    Failing conformance test:

    Required.Proto3.JsonInput.OneofFieldDuplicate
    
  • base64-url support for bytes fields on Kotlin/JVM and Kotlin/JS

    Failing conformance tests:

    Recommended.Proto3.JsonInput.BytesFieldBase64Url.JsonOutput
    Recommended.Proto3.JsonInput.BytesFieldBase64Url.ProtobufOutput
    
  • Invalid Unicode surrogate pairs in string fields should be rejected (currently they're accepted and the invalid characters are replaced with ?)

    Failing conformance tests:

    Recommended.Proto3.JsonInput.StringFieldSurrogateInWrongOrder
    Recommended.Proto3.JsonInput.StringFieldUnpairedHighSurrogate
    Recommended.Proto3.JsonInput.StringFieldUnpairedLowSurrogate
    
  • exponent notation and floating-point numbers for unmarshalling numeric types: 1e5, 100000.000, 2.147483647e9, -2.147483648e9, 4.294967295e9

    Failing conformance tests:

    Required.Proto3.JsonInput.Int32FieldExponentialFormat.JsonOutput
    Required.Proto3.JsonInput.Int32FieldExponentialFormat.ProtobufOutput
    Required.Proto3.JsonInput.Int32FieldFloatTrailingZero.JsonOutput
    Required.Proto3.JsonInput.Int32FieldFloatTrailingZero.ProtobufOutput
    Required.Proto3.JsonInput.Int32FieldMaxFloatValue.JsonOutput
    Required.Proto3.JsonInput.Int32FieldMaxFloatValue.ProtobufOutput
    Required.Proto3.JsonInput.Int32FieldMinFloatValue.JsonOutput
    Required.Proto3.JsonInput.Int32FieldMinFloatValue.ProtobufOutput
    Required.Proto3.JsonInput.Uint32FieldMaxFloatValue.JsonOutput
    Required.Proto3.JsonInput.Uint32FieldMaxFloatValue.ProtobufOutput
    
  • uint32 max value (4294967295)

  • uint64 max value (18446744073709549568, "18446744073709549568")

  • floating-point numbers outside of the min/max range for float/double should be rejected (currently they're being read in as -Inf/Inf)

    Failing conformance tests:

    Required.Proto3.JsonInput.DoubleFieldTooLarge
    Required.Proto3.JsonInput.DoubleFieldTooSmall
    Required.Proto3.JsonInput.FloatFieldTooLarge
    Required.Proto3.JsonInput.FloatFieldTooSmall
    
  • unquoted NaN, Infinity, -Infinity values for float/double fields in JSON input should be rejected (currently they're accepted)

    Failing conformance tests:

    Recommended.Proto3.JsonInput.DoubleFieldInfinityNotQuoted
    Recommended.Proto3.JsonInput.DoubleFieldNanNotQuoted
    Recommended.Proto3.JsonInput.DoubleFieldNegativeInfinityNotQuoted
    Recommended.Proto3.JsonInput.FloatFieldInfinityNotQuoted
    Recommended.Proto3.JsonInput.FloatFieldNanNotQuoted
    Recommended.Proto3.JsonInput.FloatFieldNegativeInfinityNotQuoted
    
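The exponent-notation cases above boil down to accepting any JSON number whose mathematical value is an exact int32, per the proto3 JSON spec. A minimal sketch of such a parser (parseJsonInt32 is a hypothetical helper name, not pbandk API) using BigDecimal:

```kotlin
import java.math.BigDecimal

// Hypothetical helper: parse a JSON numeric token into an int32,
// accepting exponent notation and trailing fractional zeros
// ("1e5", "100000.000", "2.147483647e9") while rejecting
// non-integral or out-of-range values.
fun parseJsonInt32(token: String): Int {
    val value = BigDecimal(token).stripTrailingZeros()
    // A positive scale after stripping zeros means a genuine fraction.
    require(value.scale() <= 0) { "not an integer: $token" }
    // Throws ArithmeticException if the value is outside int32 range,
    // e.g. 4.294967295e9.
    return value.intValueExact()
}
```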

Compile error when proto contains a oneof field with same name as the enclosing message

From #18 (comment):

A proto definition such as:

message Value {
  oneof value {
    string string_value = 1;
    bool boolean_value = 2;
    int32 integer_value = 3;
    int64 long_value = 4;
    float float_value = 5;
    double double_value = 6;
    int64 date_value = 7;
  }
}

Generates code like:

data class Value(
    val value: Value<*>? = null
) {
    sealed class Value<V>(value: V) : pbandk.Message.OneOf<V>(value) {
        class StringValue(stringValue: String = "") : Value<String>(stringValue)
        ...
    }
    ...
}

Which fails to compile because there are two types named Value and the type of value in val value: Value<*>? = null can't be resolved correctly.

Compile protoc-gen-kotlin into jar

After some consideration I've decided to use your library as my protoc plugin generator. But your guide doesn't explain how to use protoc-gen-kotlin without Gradle, or how to plug in my own overridden version of the service generator.

I imagine it this way: I take your jar and run something like: java -jar protoc-gen-kotlin.jar --kotlin_out = ....

Maybe you have a more elegant way to do this.

Kotlin Native support

Hi there!
I'm really looking forward to using pbandk for my Kotlin Multiplatform project, but it falls short on Kotlin/Native support. I'm willing to dig into this, but first I wanted to know whether any of you contributors have tips or advice about what will be required to implement it, or anything else that could be helpful to me?

Thanks!

Convert `google.protobuf.*Value` to nullable primitives in generated code

The *Value types such as google.protobuf.Int32Value are meant to represent primitive types when you need to be able to differentiate between a protobuf field not being set vs a field being set to the default value. For a protobuf message like:

message Example {
    int32 foo = 1;
    Int32Value bar = 2;
}

Currently this generates:

data class Example(
    val foo: Int = 0,
    val bar: Int32Value? = null
)
data class Int32Value(val value: Int = 0)

It'd be more natural if instead the generated code was:

data class Example(
    val foo: Int = 0,
    val bar: Int? = null
)

Implement the new proto3 field presence feature

See protocolbuffers/protobuf#1606 (comment) and https://github.com/protocolbuffers/protobuf/blob/master/docs/implementing_proto3_presence.md for details.

When this feature is in use, pbandk should generate code for primitive fields using a nullable type that defaults to null, rather than a non-null type that defaults to empty string, zero, false, etc.

E.g., current code for this protobuf definition:

syntax = "proto3";

message Foo {
  int32 foo = 1;
}

will look like:

data class Foo(val foo: Int = 0)

Using the new optional presence feature, this protobuf definition:

syntax = "proto3";

message Foo {
  optional int32 foo = 1;
}

should look like:

data class Foo(val foo: Int? = null)

java.lang.NoClassDefFoundError: Failed resolution of: Ljava/util/Base64;

Exception when calling jsonMarshal on pbandk.Message object

java.lang.NoClassDefFoundError: Failed resolution of: Ljava/util/Base64;
    at pbandk.Util.bytesToBase64(Util.kt:13)
    at pbandk.ByteArr$Companion.serialize(ByteArr.kt:16)
    at pbandk.ByteArr$Companion.serialize(ByteArr.kt:12)
    at kotlinx.serialization.json.internal.StreamingJsonOutput.encodeSerializableValue(StreamingJsonOutput.kt:227)
    at kotlinx.serialization.Encoder$DefaultImpls.encodeNullableSerializableValue(Encoding.kt:263)
    at kotlinx.serialization.json.JsonOutput$DefaultImpls.encodeNullableSerializableValue(JsonOutput.kt)
    at kotlinx.serialization.json.internal.StreamingJsonOutput.encodeNullableSerializableValue(StreamingJsonOutput.kt:14)
    at kotlinx.serialization.builtins.AbstractEncoder.encodeNullableSerializableElement(AbstractEncoder.kt:76)
    at example.com.protomodels.UUID$JsonMapper.write$Self(base.kt:94)
    at example.com.protomodels.UUID$JsonMapper$$serializer.serialize(base.kt)
    at example.com.protomodels.UUID$JsonMapper$$serializer.serialize(base.kt:92)
    at kotlinx.serialization.json.internal.StreamingJsonOutput.encodeSerializableValue(StreamingJsonOutput.kt:227)
    at kotlinx.serialization.Encoder$DefaultImpls.encodeNullableSerializableValue(Encoding.kt:263)
    at kotlinx.serialization.json.JsonOutput$DefaultImpls.encodeNullableSerializableValue(JsonOutput.kt)
    at kotlinx.serialization.json.internal.StreamingJsonOutput.encodeNullableSerializableValue(StreamingJsonOutput.kt:14)
    at kotlinx.serialization.builtins.AbstractEncoder.encodeNullableSerializableElement(AbstractEncoder.kt:76)
    at example.com.protomodels.OTPRequest$JsonMapper.write$Self(service.kt:97)
    at example.com.protomodels.OTPRequest$JsonMapper$$serializer.serialize(service.kt)
    at example.com.protomodels.OTPRequest$JsonMapper$$serializer.serialize(service.kt:95)
    at kotlinx.serialization.json.internal.StreamingJsonOutput.encodeSerializableValue(StreamingJsonOutput.kt:227)
    at kotlinx.serialization.EncodingKt.encode(Encoding.kt:402)
    at kotlinx.serialization.json.Json.stringify(Json.kt:100)
    at example.com.protomodels.ServiceKt.jsonMarshalImpl(service.kt:300)
    at example.com.protomodels.ServiceKt.access$jsonMarshalImpl(service.kt:1)
    at example.com.protomodels.OTPRequest.jsonMarshal(service.kt:86)
    at pbandk.Message$DefaultImpls.jsonMarshal(Message.kt:11)
    at example.com.protomodels.OTPRequest.jsonMarshal(service.kt:77)
    at example.com.data.mappers.proto.ValidatePhoneKt.toDomain(ValidatePhone.kt:31)
    at example.com.data.datasourceimpl.RemoteUserDataSource$validatePhone$2.invokeSuspend(RemoteUserDataSource.kt:44)
    at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
    at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:56)
    at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:738)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:665)

Add support for protobuf custom options

See https://developers.google.com/protocol-buffers/docs/proto#customoptions. This probably requires #62 to be implemented first.

Remaining items to complete

  • Document how to use custom options in the README
  • Add @ExperimentalProtoReflection annotation to parts of the public API
  • Wrap the property access with a helper method
  • Add support for Message options (e.g. (validate.rules).disabled)
  • Add support for Enum and EnumValue options
  • Add support for Oneof options
  • Add support for Service and Method options
  • Add support for nested extensions (currently only top level extensions are implemented)
  • Hide the use of pbandk.AtomicReference for the extension field map behind a wrapper class of some sort (maybe ExtensionFieldSet?)

See #103 for additional information.

Incorrect instructions

Under Usage, it says:

download the latest protoc-gen-kotlin and make sure protoc-gen-kotlin is on the PATH

The download link takes you to the release page, which only contains source bundles.

I think the release process needs to be updated to include the distribution. In the short term, however, perhaps update the instructions to mention running ./gradlew installDist and pointing the PATH at protoc-gen-kotlin/protoc-gen-kotlin-jvm/build/install/protoc-gen-kotlin/bin/.
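Until the release process publishes a distribution, building locally works; roughly (a sketch, using the paths described above from a checkout of the pbandk repo):

```shell
# Build the code generator locally and put it on the PATH.
./gradlew installDist
export PATH="$PWD/protoc-gen-kotlin/protoc-gen-kotlin-jvm/build/install/protoc-gen-kotlin/bin:$PATH"

# Then protoc can find the plugin via the PATH.
protoc --kotlin_out=build/generated --proto_path=. myapp.proto
```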
