ts-proto

ts-proto transforms your .proto files into strongly-typed, idiomatic TypeScript files!


Overview

ts-proto generates TypeScript types from protobuf schemas.

For example, given a person.proto schema like:

message Person {
  string name = 1;
}

ts-proto will generate a person.ts file like:

interface Person {
  name: string;
}

const Person = {
  encode(person): Writer { ... },
  decode(reader): Person { ... },
  toJSON(person): unknown { ... },
  fromJSON(data): Person { ... },
};

It also knows about services and will generate types for them, e.g.:

export interface PingService {
  ping(request: PingRequest): Promise<PingResponse>;
}

It will also generate client implementations of PingService; currently Twirp, grpc-web, grpc-js and nestjs are supported.

QuickStart

  • npm install ts-proto
  • protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto --ts_proto_out=. ./simple.proto
    • (Note that the output parameter name, ts_proto_out, is named based on the suffix of the plugin's name, i.e. "ts_proto" suffix in the --plugin=./node_modules/.bin/protoc-gen-ts_proto parameter becomes the _out prefix, per protoc's CLI conventions.)
    • On Windows, use protoc --plugin=protoc-gen-ts_proto=".\\node_modules\\.bin\\protoc-gen-ts_proto.cmd" --ts_proto_out=. ./simple.proto (see #93)
    • Ensure you're using a modern protoc (see the installation instructions for your platform); e.g. protoc v3.0.0 doesn't support the _opt flag.

This will generate *.ts source files for the given *.proto types.

If you want to package these source files into an npm package to distribute to clients, just run tsc on them as usual to generate the .js/.d.ts files, and deploy the output as a regular npm package.
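
For instance, a minimal sketch (file names illustrative):

npx tsc ./person.ts --declaration --outDir dist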

Buf

If you're using Buf, pass strategy: all in your buf.gen.yaml file (docs).

version: v1
plugins:
  - name: ts
    out: ../gen/ts
    strategy: all
    path: ../node_modules/ts-proto/protoc-gen-ts_proto

To prevent buf push from reading irrelevant .proto files, configure buf.yaml like so:

build:
  excludes: [node_modules]

You can also use the official plugin published to the Buf Registry.

version: v1
plugins:
  - plugin: buf.build/community/stephenh-ts-proto
    out: ../gen/ts
    opt:
      - outputServices=...
      - useExactTypes=...

ESM

If you're using a modern TS setup with esModuleInterop, or running in an ESM environment, you'll need to pass ts_proto_opts of:

  • esModuleInterop=true if using esModuleInterop in your tsconfig.json, and
  • importSuffix=.js if executing the generated ts-proto code in an ESM environment
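
For example, a combined invocation might look like this (a sketch; adjust paths for your setup):

protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto \
  --ts_proto_opt=esModuleInterop=true,importSuffix=.js \
  --ts_proto_out=. ./simple.proto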

Goals

In terms of the code that ts-proto generates, the general goals are:

  • Idiomatic TypeScript/ES6 types
    • ts-proto is a clean break from either the built-in Google/Java-esque JS code of protoc or the "make .d.ts files from the *.js comments" approach of protobufjs
    • (Technically the protobufjs/minimal package is used for actually reading/writing bytes.)
  • TypeScript-first output
  • Interfaces over classes
    • As much as possible, types are just interfaces, so you can work with messages just like regular hashes/data structures.
  • Only supports codegen *.proto-to-*.ts workflow, currently no runtime reflection/loading of dynamic .proto files

Non-Goals

Note that ts-proto is not an out-of-the-box RPC framework; instead it's more of a swiss-army knife (as witnessed by its many config options), that lets you build exactly the RPC framework you'd like on top of it (i.e. that best integrates with your company's protobuf ecosystem; for better or worse, protobuf RPC is still a somewhat fragmented ecosystem).

If you'd like an out-of-the-box RPC framework built on top of ts-proto, there are a few examples:

(Note for potential contributors, if you develop other frameworks/mini-frameworks, or even blog posts/tutorials, on using ts-proto, we're happy to link to them.)

We also don't support clients for google.api.http-based Google Cloud APIs, see #948 if you'd like to submit a PR.

Example Types

The generated types are "just data", i.e.:

export interface Simple {
  name: string;
  age: number;
  createdAt: Date | undefined;
  child: Child | undefined;
  state: StateEnum;
  grandChildren: Child[];
  coins: number[];
}

Along with encode/decode factory methods:

export const Simple = {
  create(baseObject?: DeepPartial<Simple>): Simple {
    ...
  },

  encode(message: Simple, writer: Writer = Writer.create()): Writer {
    ...
  },

  decode(reader: Reader, length?: number): Simple {
    ...
  },

  fromJSON(object: any): Simple {
    ...
  },

  fromPartial(object: DeepPartial<Simple>): Simple {
    ...
  },

  toJSON(message: Simple): unknown {
    ...
  },
};

This allows idiomatic TS/JS usage like:

const bytes = Simple.encode({ name: ..., age: ..., ... }).finish();
const simple = Simple.decode(Reader.create(bytes));
const { name, age } = simple;

Which can dramatically ease integration when converting to/from other layers without creating a class and calling the right getters/setters.

Highlights

  • A poor man's attempt at "please give us back optional types"

    The canonical protobuf wrapper types, i.e. google.protobuf.StringValue, are mapped as optional values, i.e. string | undefined, which means for primitives we can kind of pretend the protobuf type system has optional types.

    (Update: ts-proto now also supports the proto3 optional keyword.)

  • Timestamps are mapped as Date

    (Configurable with the useDate parameter.)

  • fromJSON/toJSON use the proto3 canonical JSON encoding format (e.g. timestamps are ISO strings), unlike protobufjs.

  • ObjectIds can be mapped as mongodb.ObjectId

    (Configurable with the useMongoObjectId parameter.)

Auto-Batching / N+1 Prevention

(Note: this is currently only supported by the Twirp clients.)

If you're using ts-proto's clients to call backend micro-services, then, similar to the N+1 problem in SQL applications, it is easy for micro-service clients (when serving an individual request) to inadvertently trigger multiple separate RPC calls for "get book 1", "get book 2", "get book 3", which should really be batched into a single "get books [1, 2, 3]" (assuming the backend supports a batch-oriented RPC method).

ts-proto can help with this, and essentially auto-batch your individual "get book" calls into batched "get books" calls.

For ts-proto to do this, you need to implement your service's RPC methods with the batching convention of:

  • A method name of Batch<OperationName>
  • The Batch<OperationName> input type has a single repeated field (i.e. repeated string ids = 1)
  • The Batch<OperationName> output type has either:
    • A single repeated field (i.e. repeated Foo foos = 1) where the output order is the same as the input ids order (as in the sketch below), or
    • A map of the input ids to outputs (i.e. map<string, Entity> entities = 1;)
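
For example, a hypothetical schema following this convention (service and message names are illustrative):

service BookService {
  rpc BatchGetBooks(BatchGetBooksRequest) returns (BatchGetBooksResponse);
}

message BatchGetBooksRequest {
  repeated string ids = 1;
}

message BatchGetBooksResponse {
  // output order matches the input ids order
  repeated Book books = 1;
}

message Book {
  string id = 1;
  string title = 2;
}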

When ts-proto recognizes methods of this pattern, it will automatically create a "non-batch" version of <OperationName> for the client, i.e. client.Get<OperationName>, that takes a single id and returns a single result.

This provides the client code with the illusion that it can make individual Get<OperationName> calls (which is generally preferable/easier when implementing the client's business logic), but the actual implementation that ts-proto provides will end up making Batch<OperationName> calls to the backend service.

You also need to enable the useContext=true build-time parameter, which gives all client methods a Go-style ctx parameter, with a getDataLoaders method that lets ts-proto cache/resolve request-scoped DataLoaders, which provide the fundamental auto-batch detection/flushing behavior.

See the batching.proto file and related tests for examples/more details.

But the net effect is that ts-proto can provide SQL-/ORM-style N+1 prevention for client calls, which can be especially critical in high-volume / highly-parallel implementations like GraphQL front-end gateways calling backend micro-services.

Usage

ts-proto is a protoc plugin, so you run it (either directly in your project, or more likely in your mono-repo schema pipeline, e.g. like Ibotta or Namely) by:

  • Add ts-proto to your package.json
  • Run npm install to download it
  • Invoke protoc with a plugin parameter like:
protoc --plugin=node_modules/ts-proto/protoc-gen-ts_proto --ts_proto_out=. -I. ./batching.proto

ts-proto can also be invoked with Gradle using the protobuf-gradle-plugin:

protobuf {
    plugins {
        // `ts` can be replaced by any unused plugin name, e.g. `tsproto`
        ts {
            path = 'path/to/plugin'
        }
    }

    // This section only needed if you provide plugin options
    generateProtoTasks {
        all().each { task ->
            task.plugins {
                // Must match plugin ID declared above
                ts {
                    option 'foo=bar'
                }
            }
        }
    }
}

Generated code will be placed in the Gradle build directory.

Supported options

  • With --ts_proto_opt=globalThisPolyfill=true, ts-proto will include a polyfill for globalThis.

    Defaults to false, i.e. we assume globalThis is available.

  • With --ts_proto_opt=context=true, the services will have a Go-style ctx parameter, which is useful for tracing/logging/etc. if you're not using node's async_hooks api due to performance reasons.

  • With --ts_proto_opt=forceLong=long, all 64-bit numbers will be parsed as instances of Long (using the long library).

    With --ts_proto_opt=forceLong=string, all 64-bit numbers will be output as strings.

    With --ts_proto_opt=forceLong=bigint, all 64-bit numbers will be output as BigInts. This option still uses the long library to encode/decode internally within protobuf.js, but then converts to/from BigInts in the ts-proto-generated code.

    The default behavior is forceLong=number, which will internally still use the long library to encode/decode values on the wire (so you will still see a util.Long = Long line in your output), but will convert the long values to number automatically for you. Note that a runtime error is thrown if, while doing this conversion, a 64-bit value is larger than can be correctly stored as a number.
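
    As a rough sketch, here is how a single int64 count = 1 field would be typed under each setting (the interface names are illustrative; the shapes follow the descriptions above):

    import Long from "long";

    // forceLong=number (default): may throw at runtime for unsafe values
    interface CounterAsNumber { count: number }
    // forceLong=long
    interface CounterAsLong { count: Long }
    // forceLong=string
    interface CounterAsString { count: string }
    // forceLong=bigint
    interface CounterAsBigint { count: bigint }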

  • With --ts_proto_opt=esModuleInterop=true, the output changes to be esModuleInterop compliant.

    Specifically the Long imports will be generated as import Long from 'long' instead of import * as Long from 'long'.

  • With --ts_proto_opt=env=node or browser or both, ts-proto will make environment-specific assumptions in your output. This defaults to both, which makes no environment-specific assumptions.

    Using node changes the types of bytes from Uint8Array to Buffer for easier integration with the node ecosystem which generally uses Buffer.

    Currently browser doesn't have any specific behavior other than being "not node". It probably will soon/at some point.

  • With --ts_proto_opt=useOptionals=messages (for message fields) or --ts_proto_opt=useOptionals=all (for message and scalar fields), fields are declared as optional keys, e.g. field?: Message instead of the default field: Message | undefined.

    ts-proto defaults to useOptionals=none because it:

    1. Prevents typos when initializing messages,
    2. Provides the most consistent API to readers, and
    3. Ensures production messages are properly initialized with all fields.

    For typo prevention, optional fields make it easy for extra fields to slip into a message (until we get Exact Types), i.e.:

    interface SomeMessage {
      firstName: string;
      lastName: string;
    }
    // Declared with a typo
    const data = { firstName: "a", lastTypo: "b" };
    // With useOptionals=none, this correctly fails to compile; if `lastName` was optional, it would not
    const message: SomeMessage = { ...data };

    For a consistent API, if SomeMessage.lastName is optional lastName?, then readers have to check two empty conditions: a) is lastName undefined (b/c it was created in-memory and left unset), or b) is lastName empty string (b/c we read SomeMessage off the wire and, per the proto3 spec, initialized lastName to empty string)?

    For ensuring proper initialization, if later SomeMessage.middleInitial is added, but it's marked as optional middleInitial?, you may have many call sites in production code that should now be passing middleInitial to create a valid SomeMessage, but are not.

    So, between typo-prevention, reader inconsistency, and proper initialization, ts-proto recommends using useOptionals=none as the "most safe" option.

    All that said, this approach does require writers/creators to set every field (although fromPartial and create are meant to address this), so if you still want to have optional keys, you can set useOptionals=messages or useOptionals=all.

    (See this issue and this issue for discussions on useOptional.)

  • With --ts_proto_opt=exportCommonSymbols=false, utility types like DeepPartial and protobufPackage won't be exported.

    This should make it possible to create barrel imports of the generated output, i.e. import * from ./foo and import * from ./bar.

    Note that if you have the same message name used in multiple *.proto files, you will still get import conflicts.

  • With --ts_proto_opt=oneof=unions, oneof fields will be generated as ADTs.

    See the "OneOf Handling" section.

  • With --ts_proto_opt=unrecognizedEnumName=<NAME> enums will contain a key <NAME> with value of the unrecognizedEnumValue option.

    Defaults to UNRECOGNIZED.

  • With --ts_proto_opt=unrecognizedEnumValue=<NUMBER> enums will contain a key provided by the unrecognizedEnumName option with value of <NUMBER>.

    Defaults to -1.

  • With --ts_proto_opt=unrecognizedEnum=false enums will not contain an unrecognized enum key and value as provided by the unrecognizedEnumName and unrecognizedEnumValue options.

  • With --ts_proto_opt=removeEnumPrefix=true generated enums will have the enum name removed from members.

    FooBar.FOO_BAR_BAZ = "FOO_BAR_BAZ" will generate FooBar.BAZ = "FOO_BAR_BAZ"

  • With --ts_proto_opt=lowerCaseServiceMethods=true, the method names of service methods will be lowered/camel-case, i.e. service.findFoo instead of service.FindFoo.

  • With --ts_proto_opt=snakeToCamel=false, fields will be kept snake case in both the message keys and the toJSON / fromJSON methods.

    snakeToCamel can also be set as a _-delimited list of strings (comma is reserved as the flag delimiter), i.e. --ts_proto_opt=snakeToCamel=keys_json, where including keys will make message keys be camel case and including json will make JSON keys be camel case.

    Empty string, i.e. snakeToCamel=, will keep both message keys and JSON keys as snake case (it is the same as snakeToCamel=false).

    Note that to use the json_name attribute, you'll have to include json in the list.

    The default behavior is keys_json, i.e. both will be camel cased, and json_name will be used if set.
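
    As a rough illustration for a hypothetical first_name field (output shapes assumed from the description above):

    // proto: string first_name = 1;
    //
    // snakeToCamel=keys_json (default): { firstName } in TS, "firstName" in JSON
    // snakeToCamel=keys:                { firstName } in TS, "first_name" in JSON
    // snakeToCamel=json:                { first_name } in TS, "firstName" in JSON
    // snakeToCamel=false (or empty):    { first_name } in TS, "first_name" in JSON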

  • With --ts_proto_opt=outputEncodeMethods=false, the Message.encode and Message.decode methods for working with protobuf-encoded/binary data will not be output.

    This is useful if you want "only types".

  • With --ts_proto_opt=outputJsonMethods=false, the Message.fromJSON and Message.toJSON methods for working with JSON-coded data will not be output.

    This is also useful if you want "only types".

  • With --ts_proto_opt=outputJsonMethods=to-only or --ts_proto_opt=outputJsonMethods=from-only, only one of the Message.toJSON and Message.fromJSON methods will be output.

    This is useful if you're using ts-proto just to encode or just to decode, not both.

  • With --ts_proto_opt=outputPartialMethods=false, the Message.fromPartial and Message.create methods for accepting partially-formed objects/object literals will not be output.

  • With --ts_proto_opt=stringEnums=true, the generated enum types will be string-based instead of int-based.

    This is useful if you want "only types" and are using a gRPC REST Gateway configured to serialize enums as strings.

    (Requires outputEncodeMethods=false.)

  • With --ts_proto_opt=outputClientImpl=false, the client implementations, i.e. FooServiceClientImpl, that implement the client-side (in Twirp, see next option for grpc-web) RPC interfaces will not be output.

  • With --ts_proto_opt=outputClientImpl=grpc-web, the client implementations, i.e. FooServiceClientImpl, will use the @improbable-eng/grpc-web library at runtime to send grpc messages to a grpc-web backend.

    (Note that this only uses the grpc-web runtime, you don't need to use any of their generated code, i.e. the ts-proto output replaces their ts-protoc-gen output.)

    You'll need to add the @improbable-eng/grpc-web and a transport to your project's package.json; see the integration/grpc-web directory for a working example. Also see #504 for integrating with grpc-web-devtools.

  • With --ts_proto_opt=returnObservable=true, the return type of service methods will be Observable<T> instead of Promise<T>.

  • With --ts_proto_opt=addGrpcMetadata=true, the last argument of service methods will accept the grpc Metadata type, which contains additional information with the call (i.e. access tokens/etc.).

    (Requires nestJs=true.)

  • With --ts_proto_opt=addNestjsRestParameter=true, the last argument of service methods will be a rest parameter with type any. This way you can use custom decorators that you would normally use in nestjs.

    (Requires nestJs=true.)

  • With --ts_proto_opt=nestJs=true, the defaults will change to generate NestJS protobuf friendly types & service interfaces that can be used in both the client-side and server-side of NestJS protobuf implementations. See the nestjs readme for more information and implementation examples.

    Specifically outputEncodeMethods, outputJsonMethods, and outputClientImpl will all be false, lowerCaseServiceMethods will be true and outputServices will be ignored.

    Note that addGrpcMetadata, addNestjsRestParameter and returnObservable will still be false.

  • With --ts_proto_opt=useDate=false, fields of type google.protobuf.Timestamp will not be mapped to type Date in the generated types. See Timestamp for more details.

  • With --ts_proto_opt=useMongoObjectId=true, fields of a message type called ObjectId, constructed to have one field called value that is a string, will be mapped to type mongodb.ObjectId in the generated types. This will require your project to install the mongodb npm package. See ObjectId for more details.

  • With --ts_proto_opt=outputSchema=true, meta typings will be generated that can later be used in other code generators.

  • With --ts_proto_opt=outputTypeAnnotations=true, each message will be given a $type field containing its fully-qualified name. You can use --ts_proto_opt=outputTypeAnnotations=static-only to omit it from the interface declaration, or --ts_proto_opt=outputTypeAnnotations=optional to make it an optional property on the interface definition. The latter option may be useful if you want to use the $type field for runtime type checking on responses from a server.

  • With --ts_proto_opt=outputTypeRegistry=true, the type registry will be generated that can be used to resolve message types by fully-qualified name. Also, each message will be given a $type field containing its fully-qualified name.

  • With --ts_proto_opt=outputServices=grpc-js, ts-proto will output service definitions and server / client stubs in grpc-js format.

  • With --ts_proto_opt=outputServices=generic-definitions, ts-proto will output generic (framework-agnostic) service definitions. These definitions contain descriptors for each method with links to request and response types, which allows generating server and client stubs at runtime, and also generating strong types for them at compile time. An example of a library that uses this approach is nice-grpc.

  • With --ts_proto_opt=outputServices=nice-grpc, ts-proto will output server and client stubs for nice-grpc. This should be used together with generic definitions, i.e. you should specify two options: outputServices=nice-grpc,outputServices=generic-definitions.

  • With --ts_proto_opt=metadataType=Foo@./some-file, ts-proto will add a generic (framework-agnostic) metadata field to the generic service definition.

  • With --ts_proto_opt=outputServices=generic-definitions,outputServices=default, ts-proto will output both generic definitions and interfaces. This is useful if you want to rely on the interfaces, but also have some reflection capabilities at runtime.

  • With --ts_proto_opt=outputServices=false, or =none, ts-proto will output NO service definitions.

  • With --ts_proto_opt=rpcBeforeRequest=true, ts-proto will add a function definition to the Rpc interface definition with the signature: beforeRequest(service: string, message: string, request: <RequestType>). It will also automatically set outputServices=default. Each of the service's methods will call beforeRequest before performing its request.

  • With --ts_proto_opt=rpcAfterResponse=true, ts-proto will add a function definition to the Rpc interface definition with the signature: afterResponse(service: string, message: string, response: <ResponseType>). It will also automatically set outputServices=default. Each of the service's methods will call afterResponse before returning the response.

  • With --ts_proto_opt=rpcErrorHandler=true, ts-proto will add a function definition to the Rpc interface definition with the signature: handleError(service: string, message: string, error: Error). It will also automatically set outputServices=default.
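
    Taken together, a hedged sketch of the Rpc interface with all three hooks enabled (signatures follow the descriptions above; marking the hooks optional is an assumption):

    interface Rpc {
      request(service: string, method: string, data: Uint8Array): Promise<Uint8Array>;
      beforeRequest?(service: string, message: string, request: unknown): void; // rpcBeforeRequest=true
      afterResponse?(service: string, message: string, response: unknown): void; // rpcAfterResponse=true
      handleError?(service: string, message: string, error: Error): void; // rpcErrorHandler=true
    }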

  • With --ts_proto_opt=useAbortSignal=true, the generated services will accept an AbortSignal to cancel RPC calls.

  • With --ts_proto_opt=useAsyncIterable=true, the generated services will use AsyncIterable instead of Observable.

  • With --ts_proto_opt=emitImportedFiles=false, ts-proto will not emit google/protobuf/* files unless you explicitly add files to protoc, like this: protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto my_message.proto google/protobuf/duration.proto

  • With --ts_proto_opt=fileSuffix=<SUFFIX>, ts-proto will emit generated files using the specified suffix. A helloworld.proto file with fileSuffix=.pb would be generated as helloworld.pb.ts. This is common behavior in other protoc plugins and provides a way to quickly glob all the generated files.

  • With --ts_proto_opt=importSuffix=<SUFFIX>, ts-proto will emit file imports using the specified suffix. An import of helloworld.ts with importSuffix=.js would generate import "helloworld.js". The default is to import without a file extension. Supported by TypeScript 4.7.x and up.

  • With --ts_proto_opt=enumsAsLiterals=true, the generated enum types will be enum-ish objects declared with as const.
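
    A rough sketch of what that might look like (the enum name and numeric values are illustrative; the exact output shape is an assumption):

    export const StateEnum = {
      STATE_ON: 0,
      STATE_OFF: 1,
      UNRECOGNIZED: -1,
    } as const;

    export type StateEnum = typeof StateEnum[keyof typeof StateEnum];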

  • With --ts_proto_opt=useExactTypes=false, the generated fromPartial and create methods will not use Exact types.

    The default behavior is useExactTypes=true, which makes fromPartial and create use Exact type for its argument to make TypeScript reject any unknown properties.

  • With --ts_proto_opt=unknownFields=true, all unknown fields will be parsed and output as arrays of buffers.

  • With --ts_proto_opt=onlyTypes=true, only types will be emitted, and imports for long and protobufjs/minimal will be excluded.

    This is the same as setting outputJsonMethods=false,outputEncodeMethods=false,outputClientImpl=false,nestJs=false

  • With --ts_proto_opt=usePrototypeForDefaults=true, the generated code will wrap new objects with Object.create.

    This allows code to do hazzer checks to detect when default values have been applied, which due to proto3's behavior of not putting default values on the wire, is typically only useful for interacting with proto2 messages.

    When enabled, default values are inherited from a prototype, and so code can use Object.keys().includes("someField") to detect if someField was actually decoded or not.

    Note that, as indicated, this means Object.keys will not include set-by-default fields, so if you have code that iterates over message keys in a generic fashion, it will have to also iterate over keys inherited from the prototype.

  • With --ts_proto_opt=useJsonName=true, json_name defined in proto files will be used instead of message field names.

  • With --ts_proto_opt=useJsonWireFormat=true, the generated code will reflect the JSON representation of Protobuf messages.

    Requires onlyTypes=true. Implies useDate=string and stringEnums=true. This option is to generate types that can be directly used with marshalling/unmarshalling Protobuf messages serialized as JSON. You may also want to set useOptionals=all, as gRPC gateways are not required to send default value for scalar values.

  • With --ts_proto_opt=useNumericEnumForJson=true, the JSON converter (toJSON) will encode enum values as int, rather than a string literal.

  • With --ts_proto_opt=initializeFieldsAsUndefined=false, all optional field initializers will be omitted from the generated base instances.

  • With --ts_proto_opt=disableProto2Optionals=true, all optional fields on proto2 files will not be set to be optional. Please note that this flag is primarily for preserving ts-proto's legacy handling of proto2 files, to avoid breaking changes, and as a result, it is not intended to be used moving forward.

  • With --ts_proto_opt=disableProto2DefaultValues=true, all fields in proto2 files that specify a default value will not actually use that default value. Please note that this flag is primarily for preserving ts-proto's legacy handling of proto2 files, to avoid breaking changes, and as a result, it is not intended to be used moving forward.

  • With --ts_proto_opt=Mgoogle/protobuf/empty.proto=./google3/protobuf/empty, ('M' means 'importMapping', similar to protoc-gen-go), the generated code import path for ./google/protobuf/empty.ts will reflect the overridden value:

    • Mfoo/bar.proto=@myorg/some-lib will map foo/bar.proto imports into import ... from '@myorg/some-lib'.
    • Mfoo/bar.proto=./some/local/lib will map foo/bar.proto imports into import ... from './some/local/lib'.
    • Mfoo/bar.proto=some-modules/some-lib will map foo/bar.proto imports into import ... from 'some-modules/some-lib'.
    • Note: Uses are accumulated, so multiple values are expected in the form of --ts_proto_opt=M... --ts_proto_opt=M... (one ts_proto_opt per mapping).
    • Note: Proto files that match mapped imports will not be generated.
  • With --ts_proto_opt=useMapType=true, the generated code for protobuf map<key_type, value_type> will become Map<key_type, value_type>, which uses the JavaScript Map type.

    The default behavior is useMapType=false, which generates the code for protobuf map<key_type, value_type> as a key-value object like {[key: key_type]: value_type}.

  • With --ts_proto_opt=useReadonlyTypes=true, the generated types will be declared as immutable using typescript's readonly modifier.

  • With --ts_proto_opt=useSnakeTypeName=false, snake casing will be removed from nested type names.

    Example Protobuf

    message Box {
      message Element {
        message Image {
          enum Alignment {
            LEFT = 1;
            CENTER = 2;
            RIGHT = 3;
          }
        }
      }
    }

    By default this option is enabled, which generates a type named Box_Element_Image_Alignment. Disabling it generates BoxElementImageAlignment instead.

  • With --ts_proto_opt=outputExtensions=true, the generated code will include proto2 extensions.

    Extension encode/decode methods are compliant with the outputEncodeMethods option, and if unknownFields=true, the setExtension and getExtension methods will be created for extendable messages, also compliant with outputEncodeMethods (setExtension = encode, getExtension = decode).

  • With --ts_proto_opt=outputIndex=true, index files will be generated based on the proto package namespaces.

    This will disable exportCommonSymbols to avoid name collisions on the common symbols.

  • With --ts_proto_opt=emitDefaultValues=json-methods, the generated toJSON method will emit scalars like 0 and "" as json fields.

  • With --ts_proto_opt=comments=false, comments won't be copied from the proto files to the generated code.

  • With --ts_proto_opt=useNullAsOptional=true, undefined values will be converted to null, and if you use the optional label in your .proto file, the field will also allow undefined. For example:

message ProfileInfo {
    int32 id = 1;
    string bio = 2;
    string phone = 3;
}

message Department {
    int32 id = 1;
    string name = 2;
}

message User {
    int32 id = 1;
    string username = 2;
    /*
     ProfileInfo will be optional in TypeScript; the type will be ProfileInfo | null | undefined.
     This is needed in cases where you don't want to provide any value for the profile.
    */
    optional ProfileInfo profile = 3;

    /*
      Department only accepts a Department type or null, so you have to pass null if there is no value available.
    */
    Department department = 4;
}

the generated interfaces will be:

export interface ProfileInfo {
  id: number;
  bio: string;
  phone: string;
}

export interface Department {
  id: number;
  name: string;
}

export interface User {
  id: number;
  username: string;
  profile?: ProfileInfo | null | undefined; // check this one
  department: Department | null; // check this one
}

NestJS Support

ts-proto works great together with NestJS: it generates interfaces and decorators for your controllers and clients. For more information see the nestjs readme.

Watch Mode

If you want to run ts-proto on every change of a proto file, you'll need to use a tool like chokidar-cli and use it as a script in package.json:

"proto:generate": "protoc --ts_proto_out=. ./<proto_path>/<proto_name>.proto --ts_proto_opt=esModuleInterop=true",
"proto:watch": "chokidar \"**/*.proto\" -c \"npm run proto:generate\""

Basic gRPC implementation

ts-proto is RPC framework agnostic - how you transmit your data to and from your data source is up to you. The generated client implementations all expect an rpc parameter, whose type is defined like this:

interface Rpc {
  request(service: string, method: string, data: Uint8Array): Promise<Uint8Array>;
}

If you're working with gRPC, a simple implementation could look like this:

type RpcImpl = (service: string, method: string, data: Uint8Array) => Promise<Uint8Array>;

const sendRequest: RpcImpl = (service, method, data) => {
  // Conventionally in gRPC, the request path looks like
  //   "package.names.ServiceName/MethodName",
  // we therefore construct such a string
  const path = `/${service}/${method}`;

  return new Promise((resolve, reject) => {
    // makeUnaryRequest transmits the result (and error) with a callback
    // transform this into a promise!
    const resultCallback: UnaryCallback<any> = (err, res) => {
      if (err) {
        return reject(err);
      }
      resolve(res);
    };

    function passThrough(argument: any) {
      return argument;
    }
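
    // Note: `conn` is assumed to be a grpc-js Client created elsewhere, e.g.
    // new grpc.Client("localhost:50051", grpc.credentials.createInsecure());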

    // Using passThrough as the serialize and deserialize functions
    conn.makeUnaryRequest(path, passThrough, passThrough, data, resultCallback);
  });
};

const rpc: Rpc = { request: sendRequest };
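
As a usage sketch, the rpc object can then be handed to a generated client; PingServiceClientImpl is the assumed generated name for the PingService example above:

const client = new PingServiceClientImpl(rpc);
const response = await client.ping({ /* request fields */ });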

Sponsors

Kudos to our sponsors:

  • ngrok funded ts-proto's initial grpc-web support.

If you need ts-proto customizations or priority support for your company, you can ping me via email.

Development

This section describes how to contribute directly to ts-proto, i.e. it's not required for running ts-proto in protoc or using the generated TypeScript.

Requirements

  • Docker
  • yarn (npm install -g yarn)

Setup

The commands below assume you have Docker installed. If you are using OS X, install coreutils (brew install coreutils).

  • Check out the repository for the latest code.
  • Run yarn install to install the dependencies.
  • Run yarn build:test to generate the test files.

    This runs the following commands:

    • proto2bin — Converts integration test .proto files to .bin.
    • bin2ts — Runs ts-proto on the .bin files to generate .ts files.
    • proto2pbjs — Generates a reference implementation using pbjs for testing compatibility.
  • Run yarn test

Workflow

  • Add/update an integration test for your use case
    • Either find an existing integration/* test that is close enough to your use case, e.g. has a parameters.txt that matches the ts_proto_opt params necessary to reproduce your use case
    • If creating a new integration test:
      • Make a new integration/your-new-test/parameters.txt with the necessary ts_proto_opt params
      • Create a minimal integration/your-new-test/your-new-test.proto schema to reproduce your use case
    • After any changes to your-new-test.proto, or an existing integration/*.proto file, run yarn proto2bin
      • You can also leave yarn watch running, and it should "just do the right thing"
    • Add/update an integration/your-new-test/some-test.ts unit test, even if it's as trivial as just making sure the generated code compiles
  • Modify the ts-proto code generation logic:
    • Most important logic is found in src/main.ts.
    • After any changes to src/*.ts files, run yarn bin2ts to re-codegen all integration tests
      • Or yarn bin2ts your-new-test to re-codegen a specific test
      • Again leaving yarn watch running should "just do the right thing"
  • Run yarn test to verify your changes pass all existing tests
  • Commit and submit a PR
    • Run yarn format to format the typescript files.
    • Make sure to git add all of the *.proto, *.bin, and *.ts files in integration/your-new-test
      • Sometimes checking in generated code is frowned upon, but given ts-proto's main job is to generate code, seeing the codegen diffs in PRs is helpful

Testing in your projects

You can test your local ts-proto changes in your own projects by running yarn add ts-proto@./path/to/ts-proto, as long as you run yarn build manually.

Dockerized Protoc

The repository includes a dockerized version of protoc, which is configured in docker-compose.yml.

It can be useful in case you want to manually invoke the plugin with a known version of protoc.

Usage:

# Include the protoc alias in your shell.
. aliases.sh

# Run protoc as usual. The ts-proto directory is available in /ts-proto.
protoc --plugin=/ts-proto/protoc-gen-ts_proto --ts_proto_out=./output -I=./protos ./protoc/*.proto

# Or use the ts-protoc alias which specifies the plugin path for you.
ts-protoc --ts_proto_out=./output -I=./protos ./protoc/*.proto

  • All paths must be relative paths within the current working directory of the host; ../ is not allowed.
  • Within the docker container, the absolute path to the project root is /ts-proto.
  • The container mounts the current working directory at /host, and sets it as its working directory.
  • Once aliases.sh is sourced, you can use the protoc command in any folder.

Assumptions

  • TS/ES6 module name is the proto package

Todo

  • Support the string-based encoding of duration in fromJSON/toJSON
  • Make oneof=unions the default behavior in 2.0
  • Probably change forceLong default in 2.0, should default to forceLong=long
  • Make esModuleInterop=true the default in 2.0

OneOf Handling

By default, ts-proto models oneof fields "flatly" in the message, e.g. a message like:

message Foo {
  oneof either_field { string field_a = 1; string field_b = 2; }
}

Will generate a Foo type with two fields: field_a: string | undefined; and field_b: string | undefined.

With this output, you'll have to check both if object.field_a and if object.field_b, and if you set one, you'll have to remember to unset the other.

Instead, we recommend using the oneof=unions option, which will change the output to be an algebraic data type (ADT) like:

interface YourMessage {
  eitherField?: { $case: "field_a"; field_a: string } | { $case: "field_b"; field_b: string };
}

This automatically enforces that only one of field_a or field_b is "set" at a time, because the values are stored in the eitherField field, which can only hold a single value.

(Note that eitherField is optional b/c oneof in Protobuf means "at most one field" is set, and does not mean one of the fields must be set.)

In ts-proto's currently-unscheduled 2.x release, oneof=unions will become the default behavior.

OneOf Type Helpers

The following helper types may make it easier to work with the types generated from oneof=unions:

/** Extracts all the case names from a oneOf field. */
type OneOfCases<T> = T extends { $case: infer U extends string } ? U : never;

/** Extracts a union of all the value types from a oneOf field */
type OneOfValues<T> = T extends { $case: infer U extends string; [key: string]: unknown }
  ? T[U]
  : never;

/** Extracts the specific type of a oneOf case based on its field name */
type OneOfCase<T, K extends OneOfCases<T>> = T extends {
  $case: infer U extends K;
  [key: string]: unknown;
}
  ? T[U]
  : never;
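
For example, applied to the YourMessage type from the previous section (a sketch; NonNullable strips the optional undefined):

type EitherField = NonNullable<YourMessage["eitherField"]>;

type Cases = OneOfCases<EitherField>; // "field_a" | "field_b"
type Values = OneOfValues<EitherField>; // string
type FieldA = OneOfCase<EitherField, "field_a">; // string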

Default values and unset fields

In core Protobuf (and so also ts-proto), values that are unset or equal to the default value are not sent over the wire.

For example, the default value of a message is undefined. Primitive types take their natural default value, e.g. string is '', number is 0, etc.

Protobuf chose/enforces this behavior because it enables forward compatibility, as primitive fields will always have a value, even when omitted by outdated agents.

This is good, but it also means default and unset values cannot be distinguished in ts-proto fields; it's just fundamentally how Protobuf works.

If you need primitive fields where you can detect set/unset, see Wrapper Types.

Encode / Decode

ts-proto follows the Protobuf rules, and always returns default values for unset fields when decoding, while omitting them from the output when serialized in binary format.

syntax = "proto3";
message Foo {
  string bar = 1;
}
protobufBytes; // assume this is an empty Foo object, in protobuf binary format
Foo.decode(protobufBytes); // => { bar: '' }
Foo.encode({ bar: "" }); // => { }, writes an empty Foo object, in protobuf binary format

fromJSON / toJSON

Reading JSON will also initialize the default values. Since senders may either omit unset fields, or set them to the default value, use fromJSON to normalize the input.

Foo.fromJSON({}); // => { bar: '' }
Foo.fromJSON({ bar: "" }); // => { bar: '' }
Foo.fromJSON({ bar: "baz" }); // => { bar: 'baz' }

When writing JSON, ts-proto normalizes messages by omitting unset fields and fields set to their default values.

Foo.toJSON({}); // => { }
Foo.toJSON({ bar: undefined }); // => { }
Foo.toJSON({ bar: "" }); // => { } - note: omitting the default value, as expected
Foo.toJSON({ bar: "baz" }); // => { bar: 'baz' }

Well-Known Types

Protobuf comes with several predefined message definitions, called "Well-Known Types". Their interpretation is defined by the Protobuf specification, and libraries are expected to convert these messages to corresponding native types in the target language.

ts-proto currently automatically converts these messages to their corresponding native types.

Wrapper Types

Wrapper Types are messages containing a single primitive field, and can be imported in .proto files with import "google/protobuf/wrappers.proto".

Since these are messages, their default value is undefined, allowing you to distinguish unset primitives from their default values, when using Wrapper Types. ts-proto generates these fields as <primitive> | undefined.

For example:

// Protobuf
syntax = "proto3";

import "google/protobuf/wrappers.proto";

message ExampleMessage {
  google.protobuf.StringValue name = 1;
}
// TypeScript
interface ExampleMessage {
  name: string | undefined;
}

When encoding a message, the primitive value is converted back to its corresponding wrapper type:

ExampleMessage.encode({ name: "foo" }); // => { name: { value: 'foo' } }, in binary

When calling toJSON, the value is not converted, because wrapper types are idiomatic in JSON.

ExampleMessage.toJSON({ name: "foo" }); // => { name: 'foo' }

JSON Types (Struct Types)

Protobuf's language and types are not sufficient to represent all possible JSON values, since JSON may contain values whose type is unknown in advance. For this reason, Protobuf offers several additional types to represent arbitrary JSON values.

These are called Struct Types, and can be imported in .proto files with import "google/protobuf/struct.proto".

ts-proto automatically converts back and forth between these Struct Types and their corresponding JSON types.

Example:

// Protobuf
syntax = "proto3";

import "google/protobuf/struct.proto";

message ExampleMessage {
  google.protobuf.Value anything = 1;
}
// TypeScript
interface ExampleMessage {
  anything: any | undefined;
}

Encoding a JSON value embedded in a message converts it to a Struct Type:

ExampleMessage.encode({ anything: { name: "hello" } });
/* Outputs the following structure, encoded in protobuf binary format:
{
  anything: Value {
    structValue = Struct {
      fields = [
        MapEntry {
          key = "name",
          value = Value {
            stringValue = "hello"
          }
        }
      ]
    }
  }
}*/

ExampleMessage.encode({ anything: true });
/* Outputs the following structure encoded in protobuf binary format:
{
  anything: Value {
    boolValue = true
  }
}*/

Timestamp

The representation of google.protobuf.Timestamp is configurable by the useDate flag. The useJsonTimestamp flag controls precision when useDate is false.

Protobuf well-known type  | Default/useDate=true | useDate=false                      | useDate=string | useDate=string-nano
google.protobuf.Timestamp | Date                 | { seconds: number, nanos: number } | string         | string

When using useDate=false and useJsonTimestamp=raw, the timestamp is represented as { seconds: number, nanos: number }, but with nanosecond precision.

When using useDate=string-nano, the timestamp is represented as an ISO string with nanosecond precision, e.g. 1970-01-01T14:27:59.987654321Z, and relies on the nano-date library for conversion. You'll need to install it in your project.

Number Types

Numbers are by default assumed to be plain JavaScript numbers.

This is fine for Protobuf types like int32 and float, but 64-bit types like int64 can't be 100% represented by JavaScript's number type, because int64 can have larger/smaller values than number.

ts-proto's default configuration (which is forceLong=number) is to still use number for 64-bit fields, and then throw an error if a value (at runtime) is larger than Number.MAX_SAFE_INTEGER.

If you expect to use 64-bit / higher-than-MAX_SAFE_INTEGER values, then you can use the ts-proto forceLong option, which uses the long npm package to support the entire range of 64-bit values.

The protobuf number types map to JavaScript types based on the forceLong config option:

Protobuf number types | Default/forceLong=number | forceLong=long | forceLong=string
double                | number                   | number         | number
float                 | number                   | number         | number
int32                 | number                   | number         | number
int64                 | number*                  | Long           | string
uint32                | number                   | number         | number
uint64                | number*                  | Unsigned Long  | string
sint32                | number                   | number         | number
sint64                | number*                  | Long           | string
fixed32               | number                   | number         | number
fixed64               | number*                  | Unsigned Long  | string
sfixed32              | number                   | number         | number
sfixed64              | number*                  | Long           | string

Where (*) indicates they might throw an error at runtime.
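
For reference, a sketch of the guard the generated code uses for the (*) cases under forceLong=number (adapted from the generated longToNumber helper; exact details may vary):

import Long from "long";

function longToNumber(long: Long): number {
  if (long.gt(Number.MAX_SAFE_INTEGER)) {
    throw new globalThis.Error("Value is larger than Number.MAX_SAFE_INTEGER");
  }
  return long.toNumber();
}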

Current Status of Optional Values

  • Required primitives: use as-is, i.e. string name = 1.
  • Optional primitives: use wrapper types, i.e. StringValue name = 1.
  • Required messages: not available
  • Optional messages: use as-is, i.e. SubMessage message = 1.


ts-proto's Issues

Add Support for Empty Message

It's a trivial use case but for consistency we use an empty message for some endpoints. This works as is but produces a few lint errors.

message EmptyResponse{}

app/utils/pb.ts:2146:12 - error TS6133: 'object' is declared but its value is never read.

2146   fromJSON(object: any): EmptyResponse {

Compilation errors for Message(s) with Timestamp(s)

I have a proto message with a timestamp field, for which I want to generate an interface.

syntax = "proto3";
import "google/protobuf/timestamp.proto";

message Registration {
    string event_name = 1;
    google.protobuf.Timestamp date = 2;
}

I pass the options to protoc that the README suggests:

--ts_proto_opt=outputEncodeMethods=false,outputJsonMethods=false,outputClientImpl=false

and obtain the typescript interface.

/* eslint-disable */
import { Timestamp } from './google/protobuf/timestamp';

export interface Registration {
  eventName: string;
  date: Date | undefined;
}

function toTimestamp(date: Date): Timestamp {
  const seconds = date.getTime() / 1_000;
  const nanos = (date.getTime() % 1_000) * 1_000_000;
  return { seconds, nanos };
}

function fromTimestamp(t: Timestamp): Date {
  let millis = t.seconds * 1_000;
  millis += t.nanos / 1_000_000;
  return new Date(millis);
}

function fromJsonTimestamp(o: any): Date {
  if (o instanceof Date) {
    return o;
  } else if (typeof o === "string") {
    return new Date(o);
  } else {
    return fromTimestamp(Timestamp.fromJSON(o));
  }
}

When I try to compile that code I get:

src/registration.ts:28:26 - error TS2693: 'Timestamp' only refers to a type, but is being used as a value here.

28     return fromTimestamp(Timestamp.fromJSON(o));
                            ~~~~~~~~~

Found 1 error.

What am I missing?

Supporting NestJS with observables and also metadata, pull request added.

Hi,

A while ago I started a thread with regards to returning only the interface for the generated code. Due to lack of time I wasn't able to complete it, but somebody else provided the functionality.

I have now further extended it as "opt in" to return an observable in place of a promise and also accept a Metadata parameter.

This is really helpful for the NestJS community which I am part of.

#47

If anybody cares to comment, I am open to suggestions.

It doesn't break compatibility and as mentioned, is OPT-IN only.

I have updated the readme.

Thanks in advance.

Enums typed too broadly for assertNever

Example:

export declare const Platform: {
    UNKNOWN_PLATFORM: PaidFeature;
    IOS: PaidFeature;
    ANDROID: PaidFeature;
    UNRECOGNIZED: PaidFeature;
    fromJSON(object: any): PaidFeature;
    toJSON(object: PaidFeature): string;
};

These should be:

    UNKNOWN_PLATFORM: 0 as const,
    IOS: 1 as const,
    ANDROID: 2 as const,
    UNRECOGNIZED: -1 as const,

Compare:

type enumType = 0 | 1 | 2 | -1;

const x = {
  UNKNOWN_PLATFORM: 0 as enumType,
  IOS: 1 as enumType,
  ANDROID: 2 as enumType,
  UNRECOGNIZED: -1 as enumType,
};

function doThing(input: typeof x.ANDROID): void {}

// This should not compile, but it does
doThing(x.IOS);

and

type enumType = 0 | 1 | 2 | -1;

const x = {
  UNKNOWN_PLATFORM: 0 as const,
  IOS: 1 as const,
  ANDROID: 2 as const,
  UNRECOGNIZED: -1 as const,
};

function doThing(input: typeof x.ANDROID): void {}

// `Argument of type '1' is not assignable to parameter of type '2'.`
doThing(x.IOS);

Support for generating TSDoc content from the message and field level comments in the Protobuf definition

Hello,

Has there been any thought around using the comments for the protobuf message and field definitions to generate TSDoc comments for the generated typescript?

We'd love to be able to have the field comments from the proto files available for devs within VS Code and other editors. Many IDEs and documentation generators support comments using the TSDoc format.

Many of our proto definitions follow both comment guidelines for Uber prototool and Google Documentation AIP. It would be awesome for those comments to come along for the ride with the TS that's generated, similar to the comments that are generated with the Google GAPIC typescript generator.

Thanks!
David

Support interface-only output

Hi,

I forked it but I noticed there is a missing directory under build called pbjs. I figured this was pbjs from npm so I replaced it, but it is missing some things - is this a custom build? Can you include it in the repo?

I stumbled onto this package by accident and it's almost what I need. I just want simple TS files.

I was hoping to be able to extend the

export type Options = {
  useContext: boolean;
};

To contain useObservables: boolean - as I notice you are returning a promise - but I would love to have an observable returned here - I thought adding an option would be great. I am using NestJS - and my services return observables. (There is a link here, if you are not aware of it: https://docs.nestjs.com/microservices/grpc)

Also another option like

minimalGeneration: boolean

or something similar, basically meaning that it just creates the interfaces and nothing else - let me show you an example...

I will show you my proto file and what I prefer (just a minimal output using only interfaces etc) and the generated code that ts-proto produces :-

syntax = "proto3";

package translation;

service TranslationService {
  rpc FindOne (HeroById) returns (Hero) {}
}

message HeroById {
  int32 id = 1;
}

message Hero {
  int32 id = 1;
  string name = 2;
}

Here is what I would like. Notice it's minimal, I return an observable rather than a promise, and I don't have any ctx etc. Just the interfaces for the types and the interface for the service:

export interface HeroById {
  id: number;
}

export interface Hero {
  id: number;
  name: string;
}

export interface TranslationService {

  FindOne(request: HeroById): Observable<Hero>;

}

Here is what ts-proto produces

import { Reader, Writer } from 'protobufjs/minimal';
import * as Long from 'long';


export interface HeroById {
  id: number;
}

export interface Hero {
  id: number;
  name: string;
}

const baseHeroById: object = {
  id: 0,
};

const baseHero: object = {
  id: 0,
  name: "",
};

export interface TranslationService<Context extends DataLoaders> {

  FindOne(ctx: Context, request: HeroById): Promise<Hero>;

}

export class TranslationServiceClientImpl<Context extends DataLoaders> implements TranslationService<Context> {

  private readonly rpc: Rpc<Context>;

  constructor(rpc: Rpc<Context>) {
    this.rpc = rpc;
  }

  FindOne(ctx: Context, request: HeroById): Promise<Hero> {
    const data = HeroById.encode(request).finish();
    const promise = this.rpc.request(ctx, "translation.TranslationService", "FindOne", data);
    return promise.then(data => Hero.decode(new Reader(data)));
  }

}

interface Rpc<Context> {

  request(ctx: Context, service: string, method: string, data: Uint8Array): Promise<Uint8Array>;

}

interface DataLoaders {

  getDataLoader<T>(identifier: string, cstrFn: () => T): T;

}

function longToNumber(long: Long) {
  if (long.gt(Number.MAX_SAFE_INTEGER)) {
    throw new global.Error("Value is larger than Number.MAX_SAFE_INTEGER");
  }
  return long.toNumber();
}

export const HeroById = {
  encode(message: HeroById, writer: Writer = Writer.create()): Writer {
    writer.uint32(8).int32(message.id);
    return writer;
  },
  decode(reader: Reader, length?: number): HeroById {
    let end = length === undefined ? reader.len : reader.pos + length;
    const message = Object.create(baseHeroById) as HeroById;
    while (reader.pos < end) {
      const tag = reader.uint32();
      switch (tag >>> 3) {
        case 1:
          message.id = reader.int32();
          break;
        default:
          reader.skipType(tag & 7);
          break;
      }
    }
    return message;
  },
  fromJSON(object: any): HeroById {
    const message = Object.create(baseHeroById) as HeroById;
    if (object.id) {
      message.id = Number(object.id);
    }
    return message;
  },
  fromPartial(object: DeepPartial<HeroById>): HeroById {
    const message = Object.create(baseHeroById) as HeroById;
    if (object.id) {
      message.id = object.id;
    }
    return message;
  },
  toJSON(message: HeroById): unknown {
    const obj: any = {};
    obj.id = message.id || 0;
    return obj;
  },
};

export const Hero = {
  encode(message: Hero, writer: Writer = Writer.create()): Writer {
    writer.uint32(8).int32(message.id);
    writer.uint32(18).string(message.name);
    return writer;
  },
  decode(reader: Reader, length?: number): Hero {
    let end = length === undefined ? reader.len : reader.pos + length;
    const message = Object.create(baseHero) as Hero;
    while (reader.pos < end) {
      const tag = reader.uint32();
      switch (tag >>> 3) {
        case 1:
          message.id = reader.int32();
          break;
        case 2:
          message.name = reader.string();
          break;
        default:
          reader.skipType(tag & 7);
          break;
      }
    }
    return message;
  },
  fromJSON(object: any): Hero {
    const message = Object.create(baseHero) as Hero;
    if (object.id) {
      message.id = Number(object.id);
    }
    if (object.name) {
      message.name = String(object.name);
    }
    return message;
  },
  fromPartial(object: DeepPartial<Hero>): Hero {
    const message = Object.create(baseHero) as Hero;
    if (object.id) {
      message.id = object.id;
    }
    if (object.name) {
      message.name = object.name;
    }
    return message;
  },
  toJSON(message: Hero): unknown {
    const obj: any = {};
    obj.id = message.id || 0;
    obj.name = message.name || "";
    return obj;
  },
};

type DeepPartial<T> = {
  [P in keyof T]?: T[P] extends Array<infer U>
  ? Array<DeepPartial<U>>
  : T[P] extends ReadonlyArray<infer U>
  ? ReadonlyArray<DeepPartial<U>>
  : T[P] extends Date | Function | Uint8Array | undefined
  ? T[P]
  : T[P] extends infer U | undefined
  ? DeepPartial<U>
  : T[P] extends object
  ? DeepPartial<T[P]>
  : T[P]
};

bug: stream should force observable

Hey there, I'm quite new to gRPC and protobuffers. Great library btw! I'm using the nestjs feature and it's working great so far!

However, I think when defining a stream parameter or return type it should be forced to be of type Observable, right? Because a Promise can't support a stream of data.

The proto file I have:

syntax = "proto3";

import "google/protobuf/empty.proto";

package statistic;

service StatisticService {
  rpc FindOne (StatisticById) returns (Statistic);
  rpc FindMany (stream StatisticById) returns (stream Statistic);
  rpc CreateEmptyRow (EmptyStatisticRow) returns (google.protobuf.Empty);
}

message EmptyStatisticRow {
  int32 id = 1;
  string field = 2;
}

message StatisticById {
  int32 id = 1;
  string field = 2;
}

message Statistic {
  int32 entityId = 1;
  string sales = 2;
  int32 revenue = 3;
  int32 revenueCash = 4;
  int32 revenueDigital = 5;
}

As you can see the FindMany call takes a stream as input and returns a stream.

Right now ts-proto outputs this function as follows:

  findMany(request: StatisticById, metadata?: Metadata): Promise<Statistic>;

You can force the return type by providing the option --ts_proto_opt=returnObservable=true, but this only changes the return type, not the parameter type. Also, if this option is set to false and the rpc call returns a stream, shouldn't it still be forced to the Observable type?

I'm currently looking at the nestjs documentation: https://docs.nestjs.com/microservices/grpc#subject-strategy

As you can see, they wrap both the input and the return type in an Observable.
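
For illustration, the generated signature with both sides wrapped would look something like this (a sketch of the proposed behavior, not the current output):

  findMany(request: Observable<StatisticById>, metadata?: Metadata): Observable<Statistic>;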

Fix

I already implemented a fix and can create a pull request for it. My main question is whether we should only do this when the nestJs option is true.

debugged@dd5a808

Import declaration conflicts with local declaration

file1.proto

syntax = "proto2";

package common.a;
message Foo {}

file2.proto

syntax = "proto2";
import "file1.proto";

package common.b;
message Foo {}

message Bar {
  optional common.a.Foo foo = 1;
}

Generated Typescript (file2.ts) (error):

import { Foo } from 'file1';
export interface Foo {
}
export interface Bar {
  foo: Foo | undefined;
}

I think this error is caused by the loss of package information.
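
One way the generator could avoid the collision would be to alias the import; a hypothetical fixed output might look like this (a sketch, not what ts-proto emits today):

import { Foo as Foo1 } from 'file1';
export interface Foo {
}
export interface Bar {
  foo: Foo1 | undefined;
}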

Proto files with dots in their package name generate an error when using the option nestJS=true

Example:

$  protoc --plugin=node_modules/.bin/protoc-gen-ts_proto --js_out=import_style=commonjs,binary:./src/generated --ts_proto_opt=returnObservable=true,nestJS=true --ts_proto_out=./src/generated proto/service/UserImpl.proto

UserImpl.proto

syntax = "proto3";
package com.app.user.impl;
option java_multiple_files = true;
service UserImpl {
    rpc create(UserCreateRequest) returns (CreateUserResponse);
    rpc getUsers(GetCommand) returns (stream GetUserResponse);
    rpc updateUser(UserUpdateRequest) returns (UpdateUserResponse);
    rpc deleteUser(Id) returns (DeleteResponse);
    rpc getUserById(Id) returns (GetUserResponse);
    rpc getUserByUsername(Username) returns (GetUserResponse);
}

Generated UserImpl.ts

. . .

export const COM.APP.USER.IMPL_PACKAGE_NAME = 'com.app.user.impl' // <- Invalid const name

. . .

COM.APP.USER.IMPL_PACKAGE_NAME is an invalid variable name

Proposed Result

export const COM_APP_USER_IMPL_PACKAGE_NAME = 'com.app.user.impl' // <- valid const name
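
A minimal sketch of the sanitizing the generator could do (packageToConstName is a hypothetical helper, not part of ts-proto):

// Turn a proto package name into a valid SCREAMING_SNAKE_CASE const name.
function packageToConstName(pkg: string): string {
  return pkg.replace(/\./g, '_').toUpperCase() + '_PACKAGE_NAME';
}

packageToConstName('com.app.user.impl'); // -> 'COM_APP_USER_IMPL_PACKAGE_NAME'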

Protobufs won't decode because no pos on BufferWriter

When trying to decode, I never make it into the while (reader.pos < end) loop, because reader.pos is undefined.

It seems like the encode function is working. This is the encode call:

const message = MyProtbuf.encode({
  id: '123',
  url: 'test.com',
  params: { userId: '567' },
});

This is the writer that is being created:

BufferWriter {
  len: 22,
  head: Op {
    fn: [Function: noop],
    len: 0,
    next: Op { len: 1, next: [Op], val: 10 },
    val: 0
  },
  tail: Op {
    fn: [Function: writeStringBuffer],
    len: 3,
    next: undefined,
    val: '567'
  },
  states: null
}

Is there anything else I need to do to encode, so that the BufferWriter has a pos prop?
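
For what it's worth, the generated clients shown elsewhere in this document call .finish() on the writer to get the encoded bytes, then wrap them in a Reader before decoding. A sketch of that round trip, assuming the same MyProtbuf message:

import { Reader } from 'protobufjs/minimal';

// encode() returns a protobufjs Writer; finish() yields the encoded Uint8Array.
const bytes = MyProtbuf.encode({
  id: '123',
  url: 'test.com',
  params: { userId: '567' },
}).finish();

// decode() expects a Reader, which carries the pos/len that the while loop checks.
const decoded = MyProtbuf.decode(new Reader(bytes));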

Incompatible Wrapper Types with Nest.js (protobufjs)

When using Google's Well-Known Types with Nest (or just plain protobufjs), the types generated by ts-proto fail to be parsed.

Given this protobuf

syntax = "proto2"
import "google/protobuf/wrappers.proto";

message MonitorHelloRequest {
    optional google.protobuf.StringValue name_prefix = 2;
    optional google.protobuf.Int64Value limit = 3;
}

we get a typescript interface of

export interface MonitorHelloRequest {
  namePrefix: string | undefined;
  limit: number | undefined;
}

which seems fine. But when attempting to use that as a message parsed by a server using protobufjs (Nest in this case), you get the error:

TypeError: MonitorHelloRequest.namePrefix: object expected

This is because protobufjs is actually expecting that interface to be

export interface MonitorHelloRequest {
  namePrefix: { value: string } | undefined
  limit: { value: number } | undefined;
}

It is unclear to me who is right here, as I can see the argument for either. This was referenced in protobufjs/protobuf.js#1042 and several other issues, but there has been no update from protobufjs on the matter in a very long time.

Can't get it working

I'm trying to see if this project will generate TypeScript definitions for my Protobuf types (sounds like it will). However, I can't seem to get it working. I tried a command similar to the one in the readme, which gave me:

c:\apps\protobuf\bin\protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto --ts_proto_out=./src/generated --proto_path=../../Proto ../../Proto/dnstools.proto

--ts_proto_out: protoc-gen-ts_proto: %1 is not a valid Win32 application.

After reading protocolbuffers/protobuf#3923, I tried protoc-gen-ts_proto.cmd as the plugin path, which gave me a different error:

c:\apps\protobuf\bin\protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto.cmd --ts_proto_out=./src/generated --proto_path=../../Proto ../../Proto/dnstools.proto
'protoc-gen-ts_proto' is not recognized as an internal or external command,
operable program or batch file.
--ts_proto_out: protoc-gen-ts_proto: Plugin failed with status code 1.

Any ideas?

Upgrading to 1.4.1 broke Get* method generation for batch definitions

I'm unsure of the cause, as I haven't dug through the code, but we noticed that when we upgraded the ts-proto lib from 1.4.0 to 1.4.1, the Get* methods were no longer automatically generated for batch service definitions. We rolled back to 1.4.0 and they were generated again.

TS2304: Cannot find name 'global'

I generated a TypeScript file and tried to import it, but I get the error "TS2304: Cannot find name 'global'" on the line throw new global.Error("Value is larger than Number.MAX_SAFE_INTEGER").

Please help.

Angular: Build error "Cannot find name 'Buffer'" when using a bytes field.

First, thank you for this library, @stephenh. It has been a while since I looked around for a TypeScript protobuf library and I was very happy to discover ts-proto last week.

However, I did run into an issue with Angular. The plugin generates the following code when a bytes field is present:

interface WindowBase64 {
  atob(b64: string): string;
  btoa(bin: string): string;
}

const windowBase64 = (globalThis as unknown as WindowBase64);
const atob = windowBase64.atob || ((b64: string) => Buffer.from(b64, 'base64').toString('binary'));
const btoa = windowBase64.btoa || ((bin: string) => Buffer.from(bin, 'binary').toString('base64'));

The Angular compiler fails the build with the following error:

error TS2591: Cannot find name 'Buffer'. Do you need to install type definitions for node? Try `npm i @types/node` and then add `node` to the types field in your tsconfig.
const atob = windowBase64.atob || ((b64: string) => Buffer.from(b64, 'base64').toString('binary'));
                                                    ~~~~~~

I don't think I should follow the compiler's suggestion to install node's type definitions, because that would pollute the application with types from node that it will never be able to use.

The following replacement fixes the issue:

interface WindowBase64 {
  atob(b64: string): string;
  btoa(bin: string): string;
}
declare class Buffer {
  static from(str: string, encoding?: 'base64'|'binary'): Buffer;
  toString(encoding:'base64'|'binary'):string;
}
const windowBase64 = (typeof globalThis !== 'undefined' ? globalThis : this) as unknown as WindowBase64;
const atob = windowBase64.atob || ((b64: string) => Buffer.from(b64, 'base64').toString('binary'));
const btoa = windowBase64.btoa || ((bin: string) => Buffer.from(bin, 'binary').toString('base64'));

Jest tests pass and the Angular compiler is happy. As an added bonus, this adds support for Edge 18 and iOS 12 (which don't support globalThis).

A more compact version, maybe better suited for generated code:

declare const global: any;
const globalObj = typeof globalThis !== 'undefined' ? globalThis : (typeof global !== 'undefined' ? global : this) as any;
const atob = globalObj.atob || ((b64: string) => globalObj.Buffer.from(b64, 'base64').toString('binary'));
const btoa = globalObj.btoa || ((bin: string) => globalObj.Buffer.from(bin, 'binary').toString('base64'));

What do you think, are you interested in a PR?

Windows build didn't work out of the box

Got this error:
--ts_proto_out: protoc-gen-ts_proto: %1 is not a valid Win32 application.

I found this answer in a similar question, so I added .cmd to the --plugin directive and it worked. For future users, please update the readme or fix this. Thanks.

Imitate protobuf.js for oneof and non-scalar behavior re: field optionality

I don't find ts-proto's generated types particularly helpful, because they force me to always provide all fields (even if I set them to undefined), which is especially annoying when you have huge nested oneofs.

As an example, consider the following proto:

message Person {
  string name = 1;
  uint32 age = 2;
  oneof legal_document {
    Passport passport = 3;
    DriversLicense dl = 4;
    GovtIssuedId govt_id  = 5;
  }
}

ts-proto translates this to

export interface Person {
  name: string;
  age: number;
  passport: Passport | undefined;
  dl: DriversLicense | undefined;
  govtId: GovtIssuedId | undefined;
}

If I wanted to encode one of these messages, I'd have to specify all fields:

Person.encode({
  name: "Bob",
  age: 37,
  passport: undefined,
  dl: undefined,
  govtId: undefined,
})

That is tedious, and you can imagine that with large messages, especially nested oneofs, it becomes entirely impractical. I'm also debating whether we should need to specify age and name at this stage. proto3 says that all fields are essentially optional, though google.protobuf.*Value is the de-facto way to explicitly declare an optional value.

(Also, I personally would also consider code that explicitly sets a value to undefined to be bad form. IMHO, undefined is a fallback value that the JS/TS language provides when the thing you're trying to access cannot be found or was not defined. Code that explicitly wants to provide a null value should use null. That's my interpretation of JS's built-in nullish types.)

I think pbts actually gets this right. It would translate the above to

interface IPerson {
  name?: (string|null);
  age?: (number|null);
  passport?: (Passport|null);
  dl?: (DriversLicense|null);
  govtId?: (GovtIssuedId|null);
}

which is the interface that is consumed by Person.encode(). This means I can write

Person.encode({name: "Bob"})

and it's perfectly valid. We can debate whether name and age should be optional and nullable (probably not), but all the fields in the oneof definitely should be! For clarity, I would also argue that any value that's not a basic type should be optional, to save tedious typing and avoid having to set fields to null (or, worse, undefined, which, as said above, is not a value I'd ever expect application code to have to set; it's something to check for).

I'm happy to help with the implementation, and I'm also happy to hide any changes behind a compiler flag if that's preferable.

Imports in imported proto files fail to resolve

#3 has the proto file that breaks things

Fails with:

    build/integration/import_dir/thing.ts:2:27 - error TS2307: Cannot find module './google/protobuf/timestamp'.

    2 import { Timestamp } from './google/protobuf/timestamp';

Convert Bash Scripts to Nodejs to work cross-platform

More a note to myself.

bash on macOS is still on v3 for most people, unless you install a newer version via brew.

To get the codegen and update-bins scripts to work, you need to comment out these lines:

#!/usr/bin/env zsh
shopt -s globstar

The easiest solution would probably be to rewrite these two scripts in Node.js.

Generated output only does relative imports

If there's a .proto file that is not at a root path, and it imports another .proto file, then the resulting TypeScript file will contain a relative import, causing TypeScript to throw errors. For example, given the following file structure:

- foo/
  - bar.proto
  - baz.proto

with bar.proto being

syntax = "proto3";

package foo;

import "google/protobuf/wrappers.proto";
import "foo/baz.proto";
...

The generated output of ts-proto would be

- google/
  - protobuf/
    - wrappers.ts
- foo/
  - bar.ts
  - baz.ts

With bar.ts being

import * from "./google/protobuf/wrappers";
import * from "./foo/baz.ts";

...

Since a relative path is specified for the imports, TypeScript throws errors that it can't find the modules. A proper fix would be to either compute the relative paths between files when generating the output, or just emit them as "top level" packages (e.g. import * from 'google/protobuf/wrappers') so that TypeScript can be configured to include the packages in module resolution.
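
For the "top level" approach, TypeScript's module resolution could then be pointed at the output root. A sketch of the tsconfig.json, assuming the generated files live under ./generated:

{
  "compilerOptions": {
    "baseUrl": "./generated"
  }
}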

Help Request: Garbled Output

Edit: If I get this figured out, I can add the steps to the README.

Maybe I'm misunderstanding the point of this repo, but I'm having an issue generating a TS interface from a provided .proto file with definitions.

(I have the same issue with my own protobuf usage, but figured this would be an example you all could see.)

As an example, I'll take the simple.proto file defined within this project. So I cloned this repo, and then from its root I tried to compile simple.proto to its TS interfaces.

The first error I get is:

❯ protoc --plugin=protoc-gen-ts_proto simple.proto -I. 
Missing output directives

OK. So that's fine, I'll try specifying the output file manually:

❯ protoc --plugin=protoc-gen-ts_proto simple.proto -I. -o simple.d.ts 

The problem then is that it produces pretty much a garbled mess:

simple.protosimplegoogle/protobuf/wrappers.protogoogle/protobuf/timestamp.protoimport_dir/thing.proto"
Simple
name (  Rname
age (Rage9

created_at       (
                  2.google.protobuf.TimestampR  createdAt#
child (
.simple.ChildRchild'
state (2.simple.StateEnumRstate4
grand_children (
grandChildrenR  2
coins (Rcoins
snacks (        Rsnacks/
        oldState (2.simple.StateEnumR   oldStates+
thing
 (

Can anyone in the know explain how I would compile this simple.proto file in this repo to its TS interface, as an example?

Support grpc-web services

Currently we only support twirp RPC services.

(This has since been implemented for grpc-js and grpc-web.)

Error: No type found for .Point

This proto file:

syntax = "proto3";

message Point {
    double lat = 1;
    double lng = 2;
}

message Area {
    Point nw = 1;
    Point se = 2;
}

with this command:

protoc --plugin=../../node_modules/ts-proto/protoc-gen-ts_proto --ts_proto_out=. -I../../../protos ../../../protos/api.proto

results in this error:

FAILED!No type found for .PointError: No type found for .Point
    at Object.fail (/Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/utils.js:21:11)
    at toModuleAndType (/Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/types.js:256:59)
    at messageToTypeName (/Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/types.js:250:28)
    at basicTypeName (/Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/types.js:81:20)
    at Object.toTypeName (/Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/types.js:260:16)
    at generateInterfaceDeclaration (/Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/main.js:178:107)
    at /Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/main.js:31:34
    at visit (/Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/main.js:201:9)
    at Object.generateFile (/Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/main.js:30:5)
    at /Users/alec/Library/Caches/realestate/node_modules/ts-proto/build/plugin.js:18:29--ts_proto_out: protoc-gen-ts_proto: Plugin failed with status code 1.

Use of 'DeepPartial<T>' results in type errors with types generated from 'repeated bytes'

The following protobuf message

message Message {
  repeated bytes blobs = 1;
}

Will produce the generated interface

export interface Message {
  blobs: Uint8Array[];
}

Which in turn results in type errors like

Argument of type 'DeepPartial<Uint8Array>' is not assignable to parameter of type 'Uint8Array'.
  Type 'DeepPartial<Uint8Array>' is missing the following properties from type 'Uint8Array': [Symbol.iterator], [Symbol.toStringTag]

I assume this is the same for repeated timestamps that get converted to Date.

Changing the DeepPartial<T> type like below solves the problem for me (might need adjustments for the Long type if used):

type DeepPartial<T> = T extends Date | Uint8Array ? T : {
  [P in keyof T]?:
      T[P] extends Array<infer U> ? Array<DeepPartial<U>> :
      T[P] extends ReadonlyArray<infer U> ? ReadonlyArray<DeepPartial<U>> :
      DeepPartial<T[P]>
};

I don't quite understand why the DeepPartial<T> type is currently more complex than that.

Support Long instead of throwing error

Hello,
I'm converting a project to use ts-proto instead of protobufjs. In our protobuf definitions we use uint64 a fair bit. Since Long isn't supported, but the numbers are larger than MAX_SAFE_INTEGER, the error in longToNumber is thrown at runtime.

I saw the note in the README about better support for Long, and I would be interested in hearing your thoughts on it. From my perspective, changing to always return a Long would be a good solution, but that might break other projects. Taking a note from protobufjs and adding a protoc plugin option mimicking --force-number and --force-long could also be an option.
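
For reference, such a plugin option would be passed like any other ts_proto_opt flag; a sketch assuming the forceLong=long value referenced in the long.js discussion later in this document:

protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto \
       --ts_proto_opt=forceLong=long \
       --ts_proto_out=. ./simple.proto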

Bug: DoubleValue, FloatValue not Converted to Simple Types

DoubleValue and FloatValue are not being converted into number | undefined when protoc is run.

Reproduce
Modify simple.proto's SimpleWithWrappers message:

message SimpleWithWrappers {
  google.protobuf.StringValue name = 1;
  google.protobuf.Int32Value age = 2;
  google.protobuf.BoolValue enabled = 3;
  repeated google.protobuf.Int32Value coins = 6;
  repeated google.protobuf.StringValue snacks = 7;
  google.protobuf.FloatValue weight = 8;
  repeated google.protobuf.FloatValue weights = 9;
}

Compile proto: protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto --ts_proto_out=. ./simple.proto

The FloatValue/DoubleValue fields are preserved rather than unwrapped:

export interface SimpleWithWrappers {
  name: string | undefined;
  age: number | undefined;
  enabled: boolean | undefined;
  coins: Array<number | undefined>;
  snacks: Array<string | undefined>;
  weight: FloatValue | undefined;
  weights: FloatValue[];
}
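
For comparison, the expected output would presumably unwrap the float wrappers like the others (a sketch based on the behavior described above, not actual generated code):

export interface SimpleWithWrappers {
  name: string | undefined;
  age: number | undefined;
  enabled: boolean | undefined;
  coins: Array<number | undefined>;
  snacks: Array<string | undefined>;
  weight: number | undefined;
  weights: Array<number | undefined>;
}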

Tests failing after clone

Hello. I am interested in contributing base64 support for the bytes type in JSON, as this is important to my use case.

However, I am having trouble running tests after cloning. I run ./pbjs.sh as the README instructs, but I still get this when trying to run tests:

TypeScript diagnostics (customize using `[jest-config].globals.ts-jest.diagnostics` option):
build/integration/simple.ts:1:31 - error TS2307: Cannot find module './import_dir/thing'.

1 import { ImportedThing } from './import_dir/thing';
                                ~~~~~~~~~~~~~~~~~~~~
build/integration/simple.ts:4:27 - error TS2307: Cannot find module './google/protobuf/timestamp'.

4 import { Timestamp } from './google/protobuf/timestamp';
                            ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
build/integration/simple.ts:5:52 - error TS2307: Cannot find module './google/protobuf/wrappers'.

5 import { StringValue, Int32Value, BoolValue } from './google/protobuf/wrappers';
                                                     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~

My exact procedure is:

  • Clone
  • yarn install (this fails due to pbjs not running)
  • ./pbjs.sh
  • yarn install (gets much farther but fails in tests)

Publish/import proto dependencies as npm packages

I'm experimenting with ts-proto and it looks really good! I had a question about the generated files. It looks like ts-proto generates all files supplied by the CodeGeneratorRequest's assembled list of topologically sorted dependencies (proto_file).

This works great for standalone projects, but if I want to generate and publish artifacts, ideally they wouldn't each include all of their dependencies. That way I can publish two artifacts, where one uses the other, with only one copy of the generated dependencies. For example, StringValue is currently generated everywhere, but it should only be generated once: in the project that defines it.

Would it make sense to include an option that generates files only from files_to_generate? Ideally I think this would be the default behavior, but it would break backwards compatibility.

camelCase to snake_case

Is there a way to keep names as they are defined in the proto file? I want snake_case methods and variables instead of camelCase.

Generate a JSON client

Correct me if I'm wrong, but it seems like the auto-generated client impl is hard-coded to use protobufs over the wire:

  geoIP(request: GeoIPReq): Promise<GeoIPResp> {
    const data = GeoIPReq.encode(request).finish();
    const promise = this.rpc.request("service.Service", "geoIP", data);
    return promise.then(data => GeoIPResp.decode(new Reader(data)));
  }

Would it be possible to generate a parallel client impl that uses JSON over the wire instead?
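
Something like the following could work, reusing the generated toJSON/fromJSON methods. This is only a sketch; it assumes this.rpc.request can carry a JSON string instead of bytes:

  geoIP(request: GeoIPReq): Promise<GeoIPResp> {
    // Serialize via the generated JSON methods instead of encode/decode.
    const data = JSON.stringify(GeoIPReq.toJSON(request));
    const promise = this.rpc.request("service.Service", "geoIP", data);
    return promise.then(data => GeoIPResp.fromJSON(JSON.parse(data)));
  }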

No type found for

Hi, I am seeing an error message when compiling this proto.
Proto def (baz.proto)

syntax = "proto3";

message Baz {
  oneof type {
    Foo_Bar foo = 1;
  }
}

message Foo_Bar {

}

Error

FAILED!No type found for .Foo_Bar
Error: No type found for .Foo_Bar
    at Object.fail (node_modules/ts-proto/build/utils.js:21:11)
    at toModuleAndType (node_modules/ts-proto/build/types.js:272:46)
    at messageToTypeName (node_modules/ts-proto/build/types.js:266:28)
    at basicTypeName (node_modules/ts-proto/build/types.js:82:20)
    at Object.toTypeName (node_modules/ts-proto/build/types.js:276:16)
    at generateInterfaceDeclaration (node_modules/ts-proto/build/main.js:220:102)
    at visit (node_modules/ts-proto/build/main.js:33:34)
    at visit (node_modules/ts-proto/build/main.js:253:9)
    at Object.generateFile (node_modules/ts-proto/build/main.js:32:5)
    at request.protoFile.map.file (node_modules/ts-proto/build/plugin.js:18:29)--ts_proto_out: protoc-gen-ts_proto: Plugin failed with status code 1.

ts-proto version 1.12.0
Thanks in advance for any advice. I think this has to do with ts-proto internally using underscores as separators for nesting.

ts-proto expects Reader.int64() to always return Long, but protobufjs may return number

When protobufjs is used with a bundler, its Reader.int64() returns an unsafe number instead of Long.

Can we add a "Gotcha: Module bundlers and long.js" section to the readme?

The following workaround forces protobufjs to use long.js:

import * as protobuf from 'protobufjs/minimal';
import * as Long from 'long';
protobuf.util.Long = Long;
protobuf.configure();

It should be included in the app's bootstrap code (main.ts for Angular).

This is required for every forceLong option. forceLong=string may not throw errors, but it will probably silently overflow the value.

Errors compiling generated TypeScript code

Hello,

I generated the typescript code running:

protoc --plugin=./node_modules/.bin/protoc-gen-ts_proto \
       --ts_proto_opt=outputEncodeMethods=false,outputJsonMethods=true,outputClientImpl=false \
       --ts_proto_out=./src ./proto/*.proto \
       -I=./proto \
       -I=./extracted_proto

When I try to compile, I get:

tsc --project tsconfig.json                                                                                                                                                                                                                                      
src/google/protobuf/descriptor.ts:1892:7 - error TS2322: Type '0' is not assignable to type 'FieldDescriptorProto_Label'.

1892       message.label = 0;
           ~~~~~~~~~~~~~

src/google/protobuf/descriptor.ts:1897:7 - error TS2322: Type '0' is not assignable to type 'FieldDescriptorProto_Type'.

1897       message.type = 0;
           ~~~~~~~~~~~~

src/google/protobuf/descriptor.ts:1946:7 - error TS2322: Type '0' is not assignable to type 'FieldDescriptorProto_Label'.

1946       message.label = 0;
           ~~~~~~~~~~~~~

src/google/protobuf/descriptor.ts:1951:7 - error TS2322: Type '0' is not assignable to type 'FieldDescriptorProto_Type'.

1951       message.type = 0;
           ~~~~~~~~~~~~

src/google/protobuf/descriptor.ts:2378:7 - error TS2322: Type '0' is not assignable to type 'FieldDescriptorProto_Label'.

2378       message.optimizeFor = 0;
           ~~~~~~~~~~~~~~~~~~~

src/google/protobuf/descriptor.ts:2488:7 - error TS2322: Type '0' is not assignable to type 'FieldDescriptorProto_Label'.

2488       message.optimizeFor = 0;
           ~~~~~~~~~~~~~~~~~~~


Found 6 errors.

I'm using PGV in my proto files and it includes google/protobuf/descriptor.proto.

Have you encountered this problem before?

decode() and fromPartial() skip falsy map entries

Map entries with 0, false, or "" values are skipped by decode() and fromPartial().

msg.proto:

message Msg {
  map<string, int32> map_field = 1;
}

Reproduction:

const a: Msg = {
  mapField: {
    "key" : 0
  }
}

const b = Msg.decode(Msg.encode(a).finish());
b.mapField.key; // -> is undefined, but should be 0

const c = Msg.fromPartial({ mapField: { key: 0 } });
c.mapField.key; // -> is undefined, but should be 0

The reason lies in the generated decode() function:

const entry1 = Msg_MapFieldEntry.decode(reader, reader.uint32());
if (entry1.value) { // this skips 0 too
  message.mapField[entry1.key] = entry1.value;
}

I think the condition should be changed to entry1.value !== undefined, or should maybe even throw an error (in decode, not in fromPartial). All tests pass with this change.
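
Applied to the generated snippet above, the suggested change would read:

const entry1 = Msg_MapFieldEntry.decode(reader, reader.uint32());
if (entry1.value !== undefined) { // no longer skips 0, false, or ""
  message.mapField[entry1.key] = entry1.value;
}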

Let me know what you think, @stephenh. Can send a PR with test coverage.

StateEnum Namespace and Enum Variable Names Collide

I have a proto file with an enum defined. When I generate a ts file from this proto file, the enum variable name and the namespace variable name are the same, so when I build this code, Babel throws this error:

ERROR in ./src/proto/submodules/proto/shepherd/twirp/main.ts
Module build failed (from ./node_modules/babel-loader/lib/index.js):
SyntaxError: /Users/diegobacigalupo/WebstormProjects/shepherd-front-temp/src/proto/submodules/proto/shepherd/twirp/main.ts: Namespace not marked type-only declare. Non-declarative namespaces are only supported experimentally in Babel. To enable and review caveats see: https://babeljs.io/docs/en/babel-plugin-transform-typescript
  620 | }
  621 | 
> 622 | export namespace StateTypeNum {
      |                  ^^^^^^^^^^^^
  623 |     export function fromJSON(object: any): StateTypeNum {
  624 |         switch (object) {
  625 |             case 0:

// Enum
export enum StateTypeNum {
   Unknown = 99
}

How do you avoid this error when using enums in proto files?

NESTJS - `Error: 12 UNIMPLEMENTED: The server does not implement this method`

I am using the new nestjs codegen, and I implemented the interfaces on the server and client; however, I am getting Error: 12 UNIMPLEMENTED: The server does not implement this method.

The following code shows my client, my server, and the TS generated from my proto file.

import { Inject, Injectable, Logger, OnModuleInit } from '@nestjs/common';
import { ClientGrpc } from '@nestjs/microservices';
import { Observable } from 'rxjs';
import {
  US_CITIES_PACKAGE_NAME,
  US_CITIES_SERVICE_PROTO_SERVICE_NAME,
  USCitiesResponse,
  USCitiesServiceProtoClient,
  USCityResponse,
} from '../interfaces/us-cities';

@Injectable()
export class USCitiesService implements OnModuleInit {
  private readonly logger = new Logger(USCitiesService.name);
  private uSCitiesServiceProtoClient: USCitiesServiceProtoClient;

  constructor(@Inject(US_CITIES_PACKAGE_NAME) private client: ClientGrpc) {}

  onModuleInit() {
    this.uSCitiesServiceProtoClient = this.client.getService<
      USCitiesServiceProtoClient
    >(US_CITIES_SERVICE_PROTO_SERVICE_NAME);
  }

  getUSCity(id: number): Observable<USCityResponse> {
    return this.uSCitiesServiceProtoClient.getUSCity({ id });
  }

  getUSCities(): Observable<USCitiesResponse> {
    return this.uSCitiesServiceProtoClient.getUSCities({});
  }
}

import { Controller, Logger } from '@nestjs/common';
import { USCitiesService } from './us-cities.service';
import {
  USCitiesResponse, USCitiesServiceProtoController, USCitiesServiceProtoControllerMethods,
  USCityRequest,
  USCityResponse,
} from '../interfaces/us-cities';

@Controller()
@USCitiesServiceProtoControllerMethods()
export class USCitiesController implements USCitiesServiceProtoController {
  private readonly logger = new Logger(USCitiesController.name);
  constructor(private uSCitiesService: USCitiesService) {}

  async getUSCity(data: USCityRequest): Promise<USCityResponse> {
    this.logger.debug('accessing GetUSCity');
    const result = await this.uSCitiesService.getUSCity(data.id);
    return { uSCity: result };
  }

  async getUSCities(): Promise<USCitiesResponse> {
    this.logger.debug('accessing GetUSCities');
    const results = await this.uSCitiesService.getUSCities();
    return {
      uSCities: results,
    };
  }
}

/* eslint-disable */
import { Observable } from 'rxjs';
import { Empty } from './google/protobuf/empty';
import { GrpcMethod, GrpcStreamMethod } from '@nestjs/microservices';

export interface USCityRequest {
  id: number;
}

export interface USCitiesResponse {
  uSCities: USCity[];
}

export interface USCityResponse {
  uSCity: USCity | undefined;
}

export interface USCity {
  id: number;
  idState: number;
  city: string;
  county: string;
  latitude: number;
  longitude: number;
}

export interface USCitiesServiceProtoController {
  getUSCity(
    request: USCityRequest,
  ): Promise<USCityResponse> | Observable<USCityResponse> | USCityResponse;

  getUSCities(
    request: Empty,
  ):
    | Promise<USCitiesResponse>
    | Observable<USCitiesResponse>
    | USCitiesResponse;
}

export interface USCitiesServiceProtoClient {
  getUSCity(request: USCityRequest): Observable<USCityResponse>;

  getUSCities(request: Empty): Observable<USCitiesResponse>;
}

export function USCitiesServiceProtoControllerMethods() {
  return function(constructor: Function) {
    const grpcMethods: string[] = ['getUSCity', 'getUSCities'];
    for (const method of grpcMethods) {
      const descriptor: any = Reflect.getOwnPropertyDescriptor(
        constructor.prototype,
        method,
      );
      GrpcMethod('USCitiesServiceProto', method)(
        constructor.prototype[method],
        method,
        descriptor,
      );
    }
    const grpcStreamMethods: string[] = [];
    for (const method of grpcStreamMethods) {
      const descriptor: any = Reflect.getOwnPropertyDescriptor(
        constructor.prototype,
        method,
      );
      GrpcStreamMethod('USCitiesServiceProto', method)(
        constructor.prototype[method],
        method,
        descriptor,
      );
    }
  };
}

export const US_CITIES_PACKAGE_NAME = 'us.cities';
export const US_CITIES_SERVICE_PROTO_SERVICE_NAME = 'USCitiesServiceProto';
