connectrpc / connect-es


The TypeScript implementation of Connect: Protobuf RPC that works.

Home Page: https://connectrpc.com/

License: Apache License 2.0

Languages: TypeScript 95.59%, JavaScript 2.83%, Makefile 1.43%, CSS 0.09%, HTML 0.06%
Topics: grpc-web, protobuf, typescript, grpc, rpc, nodejs, javascript, protoc-plugin, schema, fastify-plugin

connect-es's Introduction

Connect for ECMAScript


Connect is a family of libraries for building type-safe APIs with different languages and platforms. @connectrpc/connect brings them to TypeScript, the web browser, and to Node.js.

With Connect, you define your schema first:

service ElizaService {
  rpc Say(SayRequest) returns (SayResponse) {}
}

And with the magic of code generation, this schema produces servers and clients:

const answer = await eliza.say({sentence: "I feel happy."});
console.log(answer);
// {sentence: 'When you feel happy, what do you do?'}

Unlike REST, these remote procedure calls are type-safe, but they are regular HTTP under the hood. You can see all requests in the network inspector, and you can curl them if you want:

curl \
    --header 'Content-Type: application/json' \
    --data '{"sentence": "I feel happy."}' \
    https://demo.connectrpc.com/connectrpc.eliza.v1.ElizaService/Say

Connect uses Protobuf-ES, the only fully-compliant Protobuf JavaScript library.

Connect implements three RPC protocols: the widely available gRPC and gRPC-Web protocols, and Connect's own protocol, optimized for the web. This gives you unparalleled interoperability across many platforms and languages, with end-to-end type safety.

Get started on the web

Follow our 10 minute tutorial where we use Vite and React to create a web interface for ELIZA.

React, Svelte, Vue, Next.js and Angular are supported (see examples), and we have an expansion pack for TanStack Query. We support all modern web browsers that implement the widely available fetch API and the Encoding API.

Get started on Node.js

Follow our 10 minute tutorial to spin up a service in Node.js, and call it from the web, and from a gRPC client in your terminal.

You can serve your Connect RPCs with vanilla Node.js, or use our server plugins for Fastify, Next.js, and Express. We support Node.js v16 and later with the builtin http and http2 modules.

Other platforms

Would you like to use Connect on other platforms like Bun, Deno, Vercel’s Edge Runtime, or Cloudflare Workers? We’d love to learn about your use cases and what you’d like to do with Connect. You can reach us either through the Buf Slack or by filing a GitHub issue and we’d be more than happy to chat!

Packages

The libraries and the generated code are compatible with ES2017 and TypeScript 4.1.

Ecosystem

Status: Stable

All packages are stable and have reached a major version release.

Legal

Offered under the Apache 2 license.

connect-es's People

Contributors

akosyakov, bufdev, buildbreaker, chrispine, cyinma, dependabot[bot], dimitropoulos, fubhy, gilwong00, haines, jchadwick-buf, jrschumacher, minimal1, mkusaka, mustard-mh, nguyenyou, paul-sachs, pkwarren, rubensf, sjbarag, smallsamantha, smaye81, srikrsna-buf, tcarnes, timostamm, vipero07, ybbkrishna, ymmt2005


connect-es's Issues

Missing response headers

Describe the bug

I have an API that returns a header along with the response that I need. The docs say I have to use the onHeader callback to get it. I did that, but only 4 values are available; the actual list of response headers is much longer.

To Reproduce

This is what my code calling the API looks like:

let res: SignInResponse = await api.account.signIn(
  req,
  {
    onHeader: (headers: Headers) => {
      let t = headers.get("Token")

      console.log(headers)

      if (t !== null) {
        token.value = t
      }
    },
    onTrailer: (trailers) => console.log(trailers)
  }
)

The console.log output is the following:

Headers(4) { "accept-encoding" → "gzip", "content-encoding" → "gzip", "content-length" → "98", "content-type" → "application/json" }

But the list of returned HTTP response headers is actually much longer and includes the header (Token) that I need.

HTTP/1.1 200 OK
Accept-Encoding: gzip
Access-Control-Allow-Origin: *
Access-Control-Expose-Headers: Accept, Accept-Encoding, Connect-Accept-Encoding, Connect-Content-Encoding, Content-Encoding, Connect-Protocol-Version, Grpc-Accept-Encoding, Grpc-Encoding, Grpc-Message, Grpc-Status, Grpc-Status-Details-Bin
Content-Encoding: gzip
Content-Type: application/json
Token: <TOKEN>
Vary: Origin
Date: Mon, 16 Jan 2023 17:27:16 GMT
Content-Length: 98
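
For what it's worth, this looks like the classic CORS symptom: a browser only exposes a response header to JavaScript if it is CORS-safelisted or listed in Access-Control-Expose-Headers, and the list above does not contain Token. A minimal Node.js sketch of the likely server-side fix (the server and header values here are illustrative):

```typescript
import * as http from "node:http";

// Sketch of the likely fix: expose the custom header via CORS so that
// browsers make it visible to fetch/XHR code such as onHeader callbacks.
const server = http.createServer((req, res) => {
  res.setHeader("Access-Control-Allow-Origin", "*");
  // Without "Token" in this list, headers.get("Token") returns null in browsers.
  res.setHeader("Access-Control-Expose-Headers", "Token");
  res.setHeader("Token", "example-token");
  res.setHeader("Content-Type", "application/json");
  res.end("{}");
});
```

Note that the filtering happens in the browser; curl or Node.js always sees the full header list.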

Environment (please complete the following information):

  • @bufbuild/connect-web version: 0.6.0
  • Framework and version: [email protected]
  • Browser and version: Mozilla Firefox 108.0.2

Improve documentation on handling Errors

Is your feature request related to a problem? Please describe.
In the current documentation or examples, errors from calls are mostly unhandled.

Describe the solution you'd like
It would be useful to extend the documentation on how to handle Errors.

In particular, showing how to:

  • Check the Error Code client-side
  • Extract the code & message from the response
  • For typescript, assert that the error is a ConnectError to receive hints on fields to access
  • Extract error details

That aside, really loving connect-web, and the ecosystem you've built on top of gRPC.

`unhandledRejection` when aborting a server stream using `connect-node`

Hello!

I have the impression that with connect-node, aborting a server-stream call via an AbortSignal systematically leads to an unhandledRejection.

Is it a bug, or am I using the library incorrectly?

Thank you very much in advance!

To Reproduce

import {
  createGrpcTransport,
  createPromiseClient,
} from "@bufbuild/connect-node";

import { ElizaService } from "@buf/bufbuild_eliza.bufbuild_connect-es/buf/connect/demo/eliza/v1/eliza_connect.js";

const transport = createGrpcTransport({
  baseUrl: "https://demo.connect.build",
  httpVersion: "2",
})

const client = createPromiseClient(ElizaService, transport);

try {
  const abort = new AbortController();

  for await (const response of client.introduce({name: "Johyn"}, { signal: abort.signal })) {
    console.log('response', response);
    abort.abort();
  }
} catch (e) {
  console.log('error', e);
}
response IntroduceResponse { sentence: "Hi Johyn. I'm Eliza." }
error ConnectError: [canceled] The operation was aborted
    at connectErrorFromReason (file:///home/johynpapin/Projets/gabie/clienttest/node_modules/@bufbuild/connect-core/dist/esm/connect-error.js:99:20)
    at connectErrorFromNodeReason (file:///home/johynpapin/Projets/gabie/clienttest/node_modules/@bufbuild/connect-node/dist/esm/private/node-error.js:48:12)
    at Promise.reject (file:///home/johynpapin/Projets/gabie/clienttest/node_modules/@bufbuild/connect-node/dist/esm/private/node-universal-client.js:285:63)
    at ClientHttp2Stream.h2StreamError (file:///home/johynpapin/Projets/gabie/clienttest/node_modules/@bufbuild/connect-node/dist/esm/private/node-universal-client.js:225:22)
    at ClientHttp2Stream.emit (node:events:524:35)
    at emitErrorNT (node:internal/streams/destroy:151:8)
    at emitErrorCloseNT (node:internal/streams/destroy:116:3)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  rawMessage: 'The operation was aborted',
  code: 1,
  metadata: HeadersList {
    cookies: null,
    [Symbol(headers map)]: Map(0) {},
    [Symbol(headers map sorted)]: null
  },
  details: [],
  cause: undefined
}
node:internal/process/promises:289
            triggerUncaughtException(err, true /* fromPromise */);
            ^

ConnectError: [canceled] The operation was aborted
    at connectErrorFromReason (file:///home/johynpapin/Projets/gabie/clienttest/node_modules/@bufbuild/connect-core/dist/esm/connect-error.js:99:20)
    at connectErrorFromNodeReason (file:///home/johynpapin/Projets/gabie/clienttest/node_modules/@bufbuild/connect-node/dist/esm/private/node-error.js:48:12)
    at Promise.reject (file:///home/johynpapin/Projets/gabie/clienttest/node_modules/@bufbuild/connect-node/dist/esm/private/node-universal-client.js:285:63)
    at ClientHttp2Stream.h2StreamError (file:///home/johynpapin/Projets/gabie/clienttest/node_modules/@bufbuild/connect-node/dist/esm/private/node-universal-client.js:225:22)
    at ClientHttp2Stream.emit (node:events:524:35)
    at emitErrorNT (node:internal/streams/destroy:151:8)
    at emitErrorCloseNT (node:internal/streams/destroy:116:3)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  rawMessage: 'The operation was aborted',
  code: 1,
  metadata: Headers {
    [Symbol(headers list)]: HeadersList {
      cookies: null,
      [Symbol(headers map)]: Map(0) {},
      [Symbol(headers map sorted)]: null
    },
    [Symbol(guard)]: 'none'
  },
  details: [],
  cause: undefined
}

Node.js v19.6.0

Environment

  • @bufbuild/connect-node version: 0.7.0
  • Node.js version: 19.6.0

Additional context

This prevents me from properly testing an API using Jest.

Can't resolve `./objectives_pb.js`

Describe the bug

buf.gen.yaml

version: v1
plugins:
  - name: go
    out: proto
    opt: paths=source_relative
  - name: connect-go
    out: proto
    opt: paths=source_relative
  - name: es
    path: ui/node_modules/.bin/protoc-gen-es
    out: ui/src/proto
    opt: target=ts
  - name: connect-web
    path: ui/node_modules/.bin/protoc-gen-connect-web
    out: ui/src/proto
    opt: target=ts

Produces the correct TypeScript files. It's just that the generated ui/src/proto/objectives/v1alpha1/objectives_connectweb.ts links to the file next to it, ui/src/proto/objectives/v1alpha1/objectives_pb.ts, with the following line:

import {GetObjectiveStatusRequest, GetObjectiveStatusResponse, ListObjectivesRequest, ListObjectivesResponse} from "./objectives_pb.js";

This breaks with:

Failed to compile.

Module not found: Error: Can't resolve './objectives_pb.js' in '/home/metalmatze/src/github.com/pyrra-dev/pyrra/ui/src/proto/objectives/v1alpha1'
ERROR in ./src/proto/objectives/v1alpha1/objectives_connectweb.ts 9:0-138
Module not found: Error: Can't resolve './objectives_pb.js' in '/home/metalmatze/src/github.com/pyrra-dev/pyrra/ui/src/proto/objectives/v1alpha1'

webpack compiled with 1 error

Right now it's easy to fix by deleting the .js suffix from the import in objectives_connectweb.ts.
However, this has to be done after every single buf generate...
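
As an aside, newer releases of the protoc-gen-es plugin accept an import_extension option that controls the extension used on generated import paths; if your plugin version supports it (check the plugin's documentation), the buf.gen.yaml entry could look like this:

```yaml
  - name: es
    path: ui/node_modules/.bin/protoc-gen-es
    out: ui/src/proto
    opt:
      - target=ts
      - import_extension=none
```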

To Reproduce

If you encountered an error message, please copy and paste it verbatim.
If the bug is specific to an RPC or payload, please provide a reduced
example.

Environment (please complete the following information):

  • @bufbuild/connect-web version: 0.1.0
  • Framework and version: [email protected]
  • Browser and version: Brave Version 1.42.97

I'll post the PR to my open-source project, once it's up.

Docs and generated code issue

Describe the bug

In the docs at https://connect.build/docs/web/generating-code:

Output
Let's take a peek at what was generated. There are two new files:

gen/buf/connect/demo/eliza/v1/eliza_connect.ts
gen/buf/connect/demo/eliza/v1/eliza_pb.ts
The first file was generated by protoc-gen-connect-es and contains the service:

import { SayRequest, SayResponse } from "./eliza_pb.js";
import { MethodKind } from "@bufbuild/protobuf";

import { SayRequest, SayResponse } from "./eliza_pb.js";

should be

import { SayRequest, SayResponse } from "./eliza_pb";

or

import { SayRequest, SayResponse } from "./eliza_pb.ts";

Environment (please complete the following information):

  • @bufbuild/connect-web 0.7.0 / 0.8.0
  • @bufbuild/connect 0.7.0 / 0.8.0

Deprecation of `TypeRegistry` breaks the current releases

Describe the bug

Installing & building the current version (0.1.0) of @bufbuild/connect-web fails.
TypeRegistry has been deprecated, but the current npm version still references it.
The issue was fixed by #225, but that fix is unreleased.

To Reproduce

yarn add @bufbuild/connect-web
yarn build
'TypeRegistry' is not exported by node_modules/@bufbuild/protobuf/dist/esm/index.js, imported by node_modules/@bufbuild/connect-web/dist/esm/connect-error.js
file: [...]/node_modules/@bufbuild/connect-web/dist/esm/connect-error.js:15:35
13: // limitations under the License.
14: import { Code, codeFromString, codeToString } from "./code.js";
15: import { Any, proto3, protoBase64, TypeRegistry } from "@bufbuild/protobuf";

Question: Using the connect protocol for requests to a gRPC-Go backend

Hello,

At Clarifai, we're already using gRPC to expose our API over the native binary protocol. As I understand it, our backend team uses the Go gRPC library from https://pkg.go.dev/google.golang.org/grpc to expose the service.

I'm evaluating the connect-web toolkit for building a typescript browser client for my team's front end react app, and I have to say so far I'm really loving the fact that the code it generates is MUCH more idiomatic than what grpc-web produces.

I'd like to know whether a connect-web TS client, using the Connect transport, can send requests to such a backend, or whether we would have to change our backend to accept requests from a Connect transport. I was looking at your homepage at https://connect.build and noticed the following, which suggests the Connect protocol can interoperate with native gRPC servers, but maybe I'm misreading it?

In addition to its own protocol, Connect servers and clients also support gRPC — including streaming!
They interoperate seamlessly with Envoy, grpcurl, gRPC Gateway, and every other gRPC implementation.

Support for finer grained tree shaking

Is your feature request related to a problem? Please describe.
When using Vite to produce the production build of a React app, it was found that tree-shaking only occurs per file instead of per function within each file.

In my repository, I have two different proto files, each with its own service. When using the generator for connect-web, it produces a TS file for each proto file. Currently, I have not imported any code from one of the proto files, and only one method from the other. After building the production build with Vite and inspecting the bundle, there were no methods from the unused proto definition (as expected). However, all methods from the proto definition of which I'm only using one method were included in the bundle.

In my case, I only use one method of the service in the frontend; the other methods are used for inter-service communication in the backend. Those other methods are not accessible over the internet, but it is still a concern that all their details are shipped in the JS bundle.

To get around this, I am breaking off the service methods that are for the frontend into their own proto definition file where the production build will only include these methods in the final bundle.

Describe the solution you'd like
Have the generated connect-web TS code be better accessible for fine-grained tree shaking at a per function/method level instead of entire files.

Describe alternatives you've considered
As mentioned earlier, I am breaking the methods I want to use in the frontend into their own service and proto definition file where they will be built into their own TS files in the package and thus everything will be tree shaken out.

[Question] How to handle errors in server streaming?

When handling server-streaming responses via the async Iterable returned by the respective method call on a PromiseClient, how do you handle errors, especially, loss of connection to the server?

It would appear that this does not abort the loop or trigger an exception; the for await (const msg of …) loop just gets stuck. If this can be detected somehow, how do I get the loop to abort?

Edit: Nevermind, I figured out this only happens when the server is terminated in specific ways. If I terminate mine (a Go application) when run manually, I get exceptions, but when I run it via modd, restarts due to changes or terminating modd don't trigger them. Not sure about the difference (modd interrupting the application triggers a context cancellation and a shutdown delay during which HTTP connections should be terminated cleanly), maybe the TCP connection is left half-open.

buf generate fails on windows with 'node_modules' is not recognized...

Describe the bug

I'm trying to follow the instructions on connect.build to build my company's protos into a TS grpc client for use in browser (not nodeJS).

When I run buf generate I get the following error message:

> buf generate
'node_modules' is not recognized as an internal or external command,
operable program or batch file.
[the two lines above are repeated 16 times in total]
Failure: plugin es: exit status 1; exit status 1; exit status 1; exit status 1; exit status 1; exit status 1; exit status 1; exit status 1

To Reproduce

buf.yaml

version: v1
deps:
  - buf.build/googleapis/googleapis
breaking:
  use:
    - FILE
lint:
  use:
    - DEFAULT

buf.gen.yaml

version: v1
plugins:
  - name: es
    path: node_modules/.bin/protoc-gen-es
    out: dist
    opt: target=ts
  - name: connect-web
    path: node_modules/.bin/protoc-gen-connect-web
    out: dist
    opt: target=ts

buf.lock

# Generated by buf. DO NOT EDIT.
version: v1
deps:
  - remote: buf.build
    owner: googleapis
    repository: googleapis
    commit: 80720a488c9a414bb8d4a9f811084989

With the above setup, I open Git Bash (I tried PowerShell as well, with the same results), run buf generate, and get the error message shown at the top of the ticket.
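
One commonly suggested workaround for this class of failure (unverified here) is to point buf at the .cmd shims that npm creates on Windows, since buf executes the configured path directly rather than through a shell:

```yaml
version: v1
plugins:
  - name: es
    path: node_modules/.bin/protoc-gen-es.cmd
    out: dist
    opt: target=ts
  - name: connect-web
    path: node_modules/.bin/protoc-gen-connect-web.cmd
    out: dist
    opt: target=ts
```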

Environment (please complete the following information):

    "@bufbuild/connect-web": "^0.1.0",
    "@bufbuild/protobuf": "^0.1.0"
    "@bufbuild/protoc-gen-connect-web": "^0.1.0",
    "@bufbuild/protoc-gen-es": "^0.1.0",

node version v16.13.1
yarn version 1.22.18
npm version 8.1.2

Additional context
Can confirm I have the expected executables in my node_modules dir


Why is the codegen importing a js file instead of a ts file?

Require stack:
  proto/gen/devinternal_connect.ts
  src/connect/connect.ts
  src/connect/server.test.ts

  4 | // @ts-nocheck
  5 |
> 6 | import { GetDeltasRequest, GetDeltasResponse, GetSnapshotResponse, Path, UpsertRequest } from "./devinternal_pb.js";
    | ^
  7 | import { Empty, MethodKind } from "@bufbuild/protobuf";
  8 |
  9 | /**

Boilerplate to catch an explicit stream cancellation feels unnecessary

It's a good deal of ceremony to set up an abortable stream, you have to make an abort controller, and pass its signal,

then when you trigger the abort you need to handle an error at point of iteration, which feels like it should be unnecessary given that it's requested behavior.

In brief, I'd like to not have to write the catch block in:

(async () => {
    for await (const resp of stream) {
      this.processResponse(resp);
    }
  })().catch((e) => {
    // silently catch cancellations which shouldn't be errors
    if (!(e instanceof ConnectError && e.code === Code.Canceled)) {
      throw e;
    }
})

Note that this error-handling code is very far from the abort invocation, which lives in a different file (the one that spawns the request), so it's a lot of context to hold in one's head and feels pretty disconnected.

Thanks again for the library, I can't wait to see how it develops!

Allow customizing fetch

Is your feature request related to a problem? Please describe.
I'd like to make it easier to integrate connect-web into SvelteKit, but the way SvelteKit provides a custom fetch is not compatible with connect-web.

Describe the solution you'd like
I think it would be sufficient to add an optional param when creating a client to take a custom fetch and this fetch will be used instead of the native fetch API. Or possibly on each request, which is probably more ergonomic.

Describe alternatives you've considered
There may be a way to extract the headers used and just build a custom interceptor but I couldn't find anything in the docs. Perhaps just accessing cookies would be enough but also not very ergonomic and it still wouldn't solve the hydration problem for SSR.

Stream connection hanging forever

Describe the bug

I have a stream service on a server behind a proxy that has a 10-minute timeout, after which it closes the connection. When this happens, I expect the async iterator to throw an exception (so I can reconnect), but instead it just hangs forever.

I found a related issue in the grpc-js library which looks the same, but there I was able to work around it by enabling keepalive pings with the 'grpc.keepalive_time_ms' option. Can a similar option be added to the connect-node library?

Detect closed socket in server stream?

I am sending a stream of messages from the server to the client. I would like to measure how long the connection was open (i.e., know when the socket closes). I am currently doing it like this:

// Ping client every 2 seconds to ensure socket still open
t := time.NewTicker(time.Second * 2)

for {
    <-t.C
    if err := stream.Send(PING_MSG); err != nil {
        // Client has disconnected
    }
}

The problem with this, though, is that err is non-nil only on the 3rd .Send(). I expect it to error immediately after the socket is closed. Am I missing something?

Edit: Seems it errors after the 4th .Send(), not the 3rd [if it makes any difference :)]

[Question] How to access the endpoints defined in gRPC-Gateway google.api.http with the generated TypeScript code.

I'm a newbie who has recently started using bufbuild/connect-web.
I think it's a great tool and we want to generate TypeScript client code with it.

We want to access the HTTP endpoints defined in google.api.http.
But endpoints defined with the gRPC-Gateway google.api.http option are not included in the generated TypeScript client code.
Is there any way to access endpoints defined in gRPC-Gateway with the generated clients?

# our proto file.

service DeviceService {
  rpc GetDevices(GetDevicesRequest) returns (GetDevicesResponse) {

    // this endpoint defined by google.api.http won't be included in the generated TypeScript client codes.
    // We want to access the endpoint below with generated TypeScript client codes.

    option (google.api.http) = {
      get: "/api/v5/devices"
    };
  }
}

# buf.gen.ts.yaml

version: v1
plugins:
  - name: es
    path: node_modules/.bin/protoc-gen-es
    out: connect_web_gen
    opt: target=ts
  - name: connect-web
    path: node_modules/.bin/protoc-gen-connect-web
    out: connect_web_gen
    opt: target=ts
# generated TypeScript codes.

/**
 * @generated from service service.app.device.v5.DeviceService
 */
export const DeviceService = {
  typeName: "service.app.device.v5.DeviceService",
  methods: {
    /**
     * @generated from rpc service.app.device.v5.DeviceService.GetDevices
     */
    getDevices: {
      name: "GetDevices",
      I: GetDevicesRequest,
      O: GetDevicesResponse,
      kind: MethodKind.Unary,
    },
  }
} as const;

http://localhost:4010/service.app.device.v5.DeviceService/GetDevices
But we want to access http://localhost:4010/api/v5/devices with the code below

...

const client = createPromiseClient(
  DeviceService,
  createConnectTransport({
    baseUrl: 'http://localhost:4010/',
  }),
)
...

Could you tell me how to access the HTTP endpoints defined in google.api.http with the generated TypeScript code?
If you need additional information, please let me know.

Thank you in advance.

Re-export bufbuild_es message types

Is your feature request related to a problem? Please describe.
What's the problem? For example, "I'm always frustrated when..."

Right now it's annoying to keep a top-level @buf/<repo>.bufbuild_connect-web package's version of its bufbuild_es packages in sync with another top-level @buf/<repo>.bufbuild_es package's version of the same types. We want to use the protobuf types as interfaces for redux state, etc.

When using mainline protos it's not a huge issue, since they have real versions with commit hashes to match up, but right now we're working with draft protos where the package.json versions for them are

    "@buf/repo.bufbuild_connect-web": "trunk",
    "@buf/repo.bufbuild_es": "trunk",

Describe the solution you'd like

It would make all this package nonsense way easier if you could just re-export the contents of the @buf/<repo>.bufbuild_es package from a subdirectory or something.

Describe alternatives you've considered

Alternatively, you guys could have some yarn recipe for keeping them in sync.

Additional context
We're using draft protos to follow our staging API's protos (mainline trunk), and when a feature branch is cut we publish the mainline protos to the BSR.

Using protoc-gen-es with protoc

Describe the bug
I'm attempting to use the code generation with protoc (the docs claim the plugin should work just as it does with buf), but I can't get protoc to recognize the --target=ts flag (Unknown flag: --target).

To Reproduce

  1. yarn add --dev @bufbuild/protoc-gen-connect-web @bufbuild/protoc-gen-es
  2. yarn add @bufbuild/connect-web @bufbuild/protobuf
  3. add the executable to PATH (.zshrc):
#Environment Settings
export NODEMODULES_PATH=$HOME/path/to/node_modules/.bin
export PATH=$PATH:$GOROOT/bin:$GOPATH/bin:$NODEMODULES_PATH
  4. check the binary has been added to PATH successfully:
$cmd: protoc-gen-es --version
    protoc-gen-es v0.0.10
  5. execute protoc:
protoc --proto_path=protos \
--go_out=. --go_opt=module=github.com/repo/project \
--go-grpc_out=. --go-grpc_opt=module=github.com/repo/project \
--grpc-gateway_out=. --grpc-gateway_opt=module=github.com/repo/project \
--target=ts \
protos/*.proto

result: Unknown flag: --target
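
For context, protoc itself has no --target flag: options for a plugin named protoc-gen-es are passed with --es_opt, and its output directory with --es_out (paths below are illustrative):

```
protoc --proto_path=protos \
  --plugin=protoc-gen-es=./node_modules/.bin/protoc-gen-es \
  --es_out=gen \
  --es_opt=target=ts \
  protos/*.proto
```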

Environment (please complete the following information):

  • @bufbuild/protoc-gen-connect-web: ^0.1.0
  • @bufbuild/protoc-gen-es: ^0.0.10
  • @bufbuild/connect-web version: ^0.1.0
  • @bufbuild/protobuf version: ^0.0.10

how to generate code for dependent protos ? e.g., google types: date/datetime

Is there a way to generate TS code when I use import "google/type/datetime.proto"; dependency?

Repo for reproduction: https://github.com/xmlking/entity-resolution


Proto File :https://github.com/xmlking/entity-resolution/blob/main/proto/entityapis/er/schema/entity/v1/entity.proto
Generated TS file: https://github.com/xmlking/entity-resolution/blob/main/gen/ts/er/schema/entity/v1/entity_pb.ts#L8-L10

Other TS plugins have options generate_dependencies to generate code for google types

  - remote: buf.build/timostamm/plugins/protobuf-ts:v2.4.0-1
    out: gen/ts
    opt:
      - generate_dependencies
      - ts_nocheck
      - eslint_disable


Swagger/OpenAPI definitions

We are quite happily using grpc-gateway, which generates us swagger, which we use for generating TS. Today I found this project and I am amazed! This is a true gamechanger for us, as we would throw away three or four components and make our code pipeline a lot easier :) so in the next few months, we are going to replace our solution with connect.build.

But using gRPC-Gateway with OpenAPI definitions has one advantage: we can show the structure of our API to 3rd parties with no work at all :) just by publishing the Swagger file.

Is it possible to have such a swagger file created? I see that connect.build doesn't take care of OpenAPI definitions in proto files.

Unit testing of nodejs client

Is your feature request related to a problem? Please describe.

Most of today I have been trying to get some kind of Jest-based testing setup up and running for the Node.js server you recently released. As far as integration with a web server goes, I went the route of Fastify and was trying their approach of testing the server without running it (inject or whatnot).

Describe the solution you'd like

Create a template for a working unit-testing solution in the context of a Node.js server, TypeScript, and Yarn PnP.

Allow `redirect:"follow"` for fetch requests

Is your feature request related to a problem? Please describe.
The fetch request used by the grpc-web transport has redirect hardcoded to "error". We have Cloudflare set up to redirect when the token is expired, using the refresh token.

Describe the solution you'd like
Can you make redirect an optional parameter on options, accepting "error" | "follow" | "manual", or just RequestRedirect?

[question] message validation

Hey team Buf, we are loving what you are building, thank you sooo much!!!!

It was a breeze switching from @improbable-eng/grpc-web to connect-web.
We are looking to transition to connect-go in the future (still using @improbable-eng's grpc-web).

The question here is regarding message validation.
We are wondering if you are at all considering implementing message validation as a part of the connect ecosystem?

(Just for the background, we are early in the development process, and are currently evaluating envoyproxy/protoc-gen-validate for the backend and looking into colinhacks/zod with fabien0102/ts-to-zod for our frontend.)

connect-web peer dependency on protobuf is too strict

Describe the bug

@bufbuild/connect-web has a peer dependency on @bufbuild/protobuf, pinned to an exact version (0.2.0). The latest @bufbuild/protobuf release is 0.2.1.

Attempting to install the latest versions (or update to the latest versions) of these two dependencies fails. Installing without specifying the versions picks up the older version of @bufbuild/protobuf, which has a bug.

I think the peer dependency should be specified as ^0.2.0 to allow for bugfixes.
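For context, a caret range on a 0.x version treats the minor version as the breaking component: `^0.2.0` accepts `>=0.2.0 <0.3.0`, so patch releases like 0.2.1 resolve without conflicts. A minimal check for the 0.x case only (not a full semver implementation):

```typescript
// For 0.x versions, ^0.2.0 means >=0.2.0 <0.3.0: the minor version must
// match, and the patch version may be equal or newer.
function satisfiesCaret0x(version: string, range: string): boolean {
  const [vMaj, vMin, vPat] = version.split(".").map(Number);
  const [rMaj, rMin, rPat] = range.split(".").map(Number);
  if (vMaj !== 0 || rMaj !== 0) throw new Error("0.x versions only");
  if (vMin !== rMin) return false; // 0.x: a minor bump may be breaking
  return vPat >= rPat;
}
```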

To Reproduce

$ npm init
$ npm install @bufbuild/protobuf@0.2.1 @bufbuild/connect-web@0.3.0
npm ERR! code ERESOLVE
npm ERR! ERESOLVE unable to resolve dependency tree
npm ERR!
npm ERR! While resolving: undefined@undefined
npm ERR! Found: @bufbuild/protobuf@0.2.1
npm ERR! node_modules/@bufbuild/protobuf
npm ERR!   @bufbuild/protobuf@"0.2.1" from the root project
npm ERR!
npm ERR! Could not resolve dependency:
npm ERR! peer @bufbuild/protobuf@"0.2.0" from @bufbuild/connect-web@0.3.0
npm ERR! node_modules/@bufbuild/connect-web
npm ERR!   @bufbuild/connect-web@"0.3.0" from the root project
npm ERR!
npm ERR! Fix the upstream dependency conflict, or retry
npm ERR! this command with --force, or --legacy-peer-deps
npm ERR! to accept an incorrect (and potentially broken) dependency resolution.
npm ERR!
npm ERR! See ~/.npm/eresolve-report.txt for a full report.

Environment (please complete the following information):

  • @bufbuild/connect-web version: 0.3.0
  • npm version: 8.19.2

Gzip - Grpc-Accept-Encoding

Is your feature request related to a problem? Please describe.
Using createGrpcWebTransport, I can't get gzipped responses. It would be nice if compression were implemented.

Describe the solution you'd like
Gzip-compressed responses.

Describe alternatives you've considered
Regular responses without gzip. Currently, my gRPC service returns an 800 KB message uncompressed; gzipped, the message becomes 90 KB (it has a lot of repeated information and is highly compressible).

Additional context
When using the following interceptor with a grpc server that supports gzip:

const interceptor: Interceptor = (next) => async (req) => {
  req.header.set('Grpc-Accept-Encoding', 'gzip');
  return await next(req);
};

I get the following message:
[unknown] illegal tag: field no 3 wire type 7

My server is a dotnet implementation with the following setting:

services.AddGrpc(o =>
{
    o.ResponseCompressionLevel = System.IO.Compression.CompressionLevel.SmallestSize;
    o.ResponseCompressionAlgorithm = "gzip";
});
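The error message is consistent with the response body not being decompressed on the client before protobuf parsing: a gzip stream starts with the magic bytes 0x1f 0x8b, and reading 0x1f as a protobuf field tag yields exactly "field no 3 wire type 7".

```typescript
// A protobuf tag byte encodes (fieldNumber << 3) | wireType. Feeding the
// first gzip magic byte (0x1f) to the parser therefore produces field 3
// with wire type 7 — an invalid wire type (valid values are 0..5).
const firstGzipByte = 0x1f;
const fieldNo = firstGzipByte >> 3;   // 3
const wireType = firstGzipByte & 0x7; // 7
```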

JSON support for gRPC-Web transport

Is your feature request related to a problem? Please describe.

Using the gRPC-Web transport locks me into using binary Protobuf serialization, which makes the network inspector fully useless.

Describe the solution you'd like

I'd like the gRPC-Web transport to support JSON serialization, like the Connect transport. This would make the network inspector only partially useless :)

Describe alternatives you've considered

I could switch to the Connect protocol, but this is a more invasive change. Why not support the same switch in both protocols?

Additional context

Originally brought up by @tannerlinsley in Slack.

Headers in fetch causing an issue using Node

I was testing remote generation and came across an issue when using Node v18.1.0. As I understand it, the latest versions of Node support browser-compatible APIs, notably fetch.

So I was expecting this to work.

import { ElizaService } from "@buf/local_connect-web_bufbuild_eliza/buf/connect/demo/eliza/v1/eliza_connectweb.js";
import { createPromiseClient, createConnectTransport } from "@bufbuild/connect-web";

const client = createPromiseClient(
  ElizaService,
  createConnectTransport({ baseUrl: "https://demo.connect.build" })
);
const { sentence } = await client.say({ sentence: "hello" });
console.log(sentence);

But I was getting back the following error; notice the HTTP 415 Unsupported Media Type.

file:///Users/mfridman/debug-generation/connect-web/old-remote/node_modules/@bufbuild/connect-web/dist/esm/connect-transport.js:57
                        throw new ConnectError(`HTTP ${response.status} ${response.statusText}`, codeFromConnectHttpStatus(response.status));
                              ^

ConnectError: [internal] HTTP 415 Unsupported Media Type
    at file:///Users/mfridman/debug-generation/connect-web/old-remote/node_modules/@bufbuild/connect-web/dist/esm/connect-transport.js:57:31
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Object.unary (file:///Users/mfridman/debug-generation/connect-web/old-remote/node_modules/@bufbuild/connect-web/dist/esm/connect-transport.js:28:24)
    at async Object.say (file:///Users/mfridman/debug-generation/connect-web/old-remote/node_modules/@bufbuild/connect-web/dist/esm/promise-client.js:35:26)
    at async file:///Users/mfridman/debug-generation/connect-web/old-remote/hello.js:26:22 {
  rawMessage: 'HTTP 415 Unsupported Media Type',
  code: 13,

Diving into the code a bit more, it turns out there might be compatibility issues between how Node's fetch handles Headers and how something like node-fetch does.

Two things resolved the issue for me:

  1. Converting the Headers to a plain serializable JavaScript object (modifying the generated code):

const requestHeaders = Object.fromEntries(unaryRequest.header.entries())
const response = await fetch(
  unaryRequest.url,
  Object.assign(Object.assign({}, unaryRequest.init), { headers: requestHeaders ...

  2. Importing node-fetch in connect-transport.js (modifying the generated code).

So, is there a bug in how Node's fetch implementation is handling the Headers we pass it, are we doing something wrong, or is it something in my setup?
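Workaround 1 above can be demonstrated as a standalone snippet, assuming a runtime where Headers is a global (Node 18+ or a browser). Note that Headers normalizes keys to lower case.

```typescript
// Convert a Headers object into a plain, JSON-serializable record.
// The cast keeps this compilable whether or not DOM iterable typings
// are available; Headers is iterable as [name, value] pairs.
const headers = new Headers({ "Content-Type": "application/json" });
const plain = Object.fromEntries(headers as unknown as Iterable<[string, string]>);
// plain is { "content-type": "application/json" } — keys are lower-cased
```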

Add bidirectional stream support

Is your feature request related to a problem? Please describe.

Using bidirectional stream.

Describe the solution you'd like

I think you could use WebSockets to support this feature.

Describe alternatives you've considered

See: #416

Headers polyfill should print a warning instead of throwing.

Describe the bug
Some platforms already polyfill Headers, breaking @bufbuild/connect-node due to https://github.com/bufbuild/connect-es/blob/a717308d2268f5337e8983d5dfb5636cb18d69e6/packages/connect-node/src/node-headers-polyfill.ts#LL28C5-L28C80. This should log a warning instead.

To Reproduce

Polyfill headers and then import @bufbuild/connect-node.

Environment (please complete the following information):

  • @bufbuild/connect-node version: 8.1.0
  • Node.js version: 16.14.2


Way to recover from errors caused by non-compliant response body

Is your feature request related to a problem? Please describe.

When the connect-web client receives a non-200 response with a non-compliant response body, it raises an Internal or Unknown error, though I'd rather have the error code corresponding to the HTTP status code.

In my setup, there's an authentication proxy in front of the connect-web server, and it returns an HTTP 401 response with an empty body when it fails to authenticate a request.

I suppose it's common to have various reverse proxies in front of a server, and they are usually not compatible with the Connect protocol.

Describe the solution you'd like

An option to enable a fallback that determines the error code from the HTTP status code when parsing the response body fails.

Just like how the error code is already determined when the content-type is not application/json.
https://github.com/bufbuild/connect-es/blob/d6598c5f9fe67f9165e3d06f84cf1a4c8d81f989/packages/connect-web/src/connect-transport.ts#L256-L259
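The requested fallback could loosely follow the common gRPC HTTP-to-code mapping. The function below is a hedged sketch; the exact status-to-code values such an option should use are an assumption here, not the library's API.

```typescript
// Hypothetical fallback: derive a Connect error code from the HTTP status
// when the response body cannot be parsed (e.g. an empty 401 from a proxy).
function fallbackCodeFromHttpStatus(status: number): string {
  switch (status) {
    case 401: return "unauthenticated";
    case 403: return "permission_denied";
    case 404: return "unimplemented";
    case 429:
    case 502:
    case 503:
    case 504: return "unavailable";
    default:  return "unknown";
  }
}
```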

Streaming support in React Native

Currently, Connect Web cannot support server-streaming via the Connect protocol in React Native. This is due to various limitations in React Native's implementation of the Fetch API. This issue is to document the state of things as well as potentially track any effort on our side that would be needed if/when the below shortcomings are resolved.

In a nutshell, React Native uses Github's fetch polyfill to provide a subset of the Fetch API. However, this polyfill is built upon XMLHttpRequest, which doesn't offer full streaming natively (doc) so the polyfill does not support streaming requests. Additionally, there are no plans to fix this. So the current state of React Native as-is isn't a viable option. For additional context, see the following issues:

Another polyfill exists as a fork of this called react-native-fetch-api, which is built on top of React Native's networking API instead of XMLHttpRequest and aims to 'fill in some gaps of the WHATWG specification for fetch, namely the support for text streaming.'

But, React Native's Networking API also does not support full streaming natively. The fork provides text streaming only (see the README for an explanation why). Connect streaming RPCs are binary, though. Even with the JSON format, the enveloping is binary. So, this approach is not viable either. The fork's README mentions that if you wish to consume binary data, 'either blob or base64 response types have to be used', but Connect does not implement the gRPC-Web-Text (base64 encoded) format, so again, not an option.

The natural thought is 'ok, can't you just write your own polyfill that fixes all these problems so that Connect streaming works', but that isn't feasible either. Perhaps the best explanation is from this comment on the Github fetch polyfill's streaming issue linked above. Basically:

If you want actual streaming support in React Native, then React Native would need to implement it itself.

Taking all this into account, there is no real option to implement full streaming in React Native for Connect at the moment. Hopefully this changes in the future, but for right now, there is unfortunately nothing we can easily do to remedy it.

Allow routes to be registered in multiple files for `connectNodeAdapter`

Is your feature request related to a problem? Please describe.
When using connect-node, the current implementation of the routes option for connectNodeAdapter only accepts a function, which is limiting when trying to register services and RPCs that live in multiple files.

Describe the solution you'd like
Have an object that can be imported into multiple files, register the service/RPC in each, and then be passed to the adapter.

Describe alternatives you've considered
Have everything in a single file, which is not ideal.

Failed to construct 'Headers': No matching constructor signature.

Describe the bug
Failed to construct 'Headers': No matching constructor signature, on Chrome 56m. This is uniquely an old-browser issue: Headers cannot be constructed when its single parameter is undefined. An easy fix is to replace any new Headers(possibleUndefined) with new Headers(possibleUndefined ?? {}).
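The easy fix described above, wrapped in a helper (assuming a runtime where Headers is a global, e.g. Node 18+ or a browser):

```typescript
// Defensive construction: old runtimes whose Headers constructor rejects
// `undefined` get an empty init instead.
function safeHeaders(init?: HeadersInit): Headers {
  return new Headers(init ?? {});
}
```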

An alternative solution is to add headers to the GrpcWebTransportOptions allowing someone to override the default behavior.

same issue from a diff repo

To Reproduce
Failed to construct 'Headers': No matching constructor signature.
This issue is specific to Chrome 56m, and may even be specific to a Samsung TV.

createGrpcWebTransport({
      baseUrl: import.meta.env.VITE_GRPC_ENDPOINT,
      binaryOptions: {readUnknownFields: false, writeUnknownFields: false},
    })

Environment (please complete the following information):


AsyncIterator is always one message behind when using a ServerStreaming method.

Hey, just started using connect-web this week and have been finding it fantastic so far. However, we have encountered an issue when using the server-streaming promise client.

Describe the bug
When using a long-lived streaming connection - the stream appears to always be one message behind. We're trying to keep a streaming connection alive to receive log messages as they come in. To do this we're iterating over the AsyncIterator returned by our getLogs method. This mostly works - however the stream stays one message behind until the connection is closed, and then the final message comes through.

To Reproduce
Loop over the AsyncIterator returned by a ServerStreaming method like so:

const logClient = createPromiseClient(LogService, transport);
for await (const logEvent of logClient.getLiveLogs(new GetLiveLogsRequest(), callOptions)) {
    console.log(logEvent);
}

You may need your server to send messages slowly to notice the issue. The initial message will not be displayed until a second one comes through.

Environment:

  • @bufbuild/connect-web version: 0.1.0
  • Framework and version: [email protected]
  • Browser and version: Google Chrome 104.0.5112.81

Possible solution
It looks like the issue occurs in the createEnvelopeReadableStream function in envelope.ts: https://github.com/bufbuild/connect-web/blob/aa0a5645077ea689b50b040ad8e6298a6b5612b6/packages/connect-web/src/envelope.ts#L57-L73

The readable stream reads the header from the buffer, then immediately awaits the next read, rather than enqueuing the message when the buffer already contains enough bytes to complete it.

This causes the ReadableStream to keep the message buffered until the next message comes through (and reader.read() returns); it will stay one message behind until the stream closes.

I modified the loop to check the buffer immediately after reading the header, like so:

for (;;) {
    if (header === void 0 && buffer.byteLength >= 5) {
        let length = 0;
        for (let i = 1; i < 5; i++) {
            length = (length << 8) + buffer[i];
        }
        header = { flags: buffer[0], length };
    }
    if (header !== void 0 && buffer.byteLength >= header.length + 5) {
        break;
    }
    const result = await reader.read();
    if (result.done) {
        break;
    }
    append(result.value);
}

This works for us as expected.
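The corrected behavior can be demonstrated in isolation. The sketch below assumes the 5-byte envelope layout used above (one flags byte, then a 4-byte big-endian length, then the payload) and drains every complete message from the buffer as soon as it arrives, instead of waiting for the next read:

```typescript
// Standalone sketch of the fix, using an in-memory chunk list in place of
// a ReadableStream reader.
function parseEnvelopes(chunks: Uint8Array[]): { flags: number; data: Uint8Array }[] {
  const messages: { flags: number; data: Uint8Array }[] = [];
  let buffer = new Uint8Array(0);
  for (const chunk of chunks) {
    // Append the chunk to the buffer.
    const next = new Uint8Array(buffer.length + chunk.length);
    next.set(buffer);
    next.set(chunk, buffer.length);
    buffer = next;
    // Key fix: emit every complete envelope already in the buffer instead
    // of awaiting another read first.
    for (;;) {
      if (buffer.byteLength < 5) break;
      let length = 0;
      for (let i = 1; i < 5; i++) {
        length = (length << 8) + buffer[i];
      }
      if (buffer.byteLength < 5 + length) break;
      messages.push({ flags: buffer[0], data: buffer.slice(5, 5 + length) });
      buffer = buffer.slice(5 + length);
    }
  }
  return messages;
}
```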

typescript-language-server shows errors about missing properties on createPromiseClient()

Describe the bug

The TypeScript Language Server shows errors when calling createPromiseClient(). The generated service is not assignable.

Diagnostics:
 1. Type 'PromiseClient<ServiceType>' is missing the following properties from type 'PromiseClient<{ readonly typeName: "workplus.account.v1alpha1.AccountService"; readonly methods: { readonly createAccount: { readonly name: "CreateAccount"; readonly I: typeof CreateAccountRequest; readonly O: typeof CreateAccountResponse; readonly kind: any; }; readonly generateTemporaryPassword: { ...; }; readonly...': createAccount, generateTemporaryPassword, signIn
 2. Argument of type '{ readonly typeName: "workplus.account.v1alpha1.AccountService"; readonly methods: { readonly createAccount: { readonly name: "CreateAccount"; readonly I: typeof CreateAccountRequest; readonly O: typeof CreateAccountResponse; readonly kind: any; }; readonly generateTemporaryPassword: { ...; }; readonly signIn: { ......' is not assignable to parameter of type 'ServiceType'.

To Reproduce

import { AccountService } from "pb/workplus/account/v1alpha1/service_connectweb"
import {
  createPromiseClient,
  createConnectTransport
} from "@bufbuild/connect-web"

import type { PromiseClient } from '@bufbuild/connect-web'

interface API {
  account: PromiseClient<typeof AccountService>
}

const transport = createConnectTransport({
  baseUrl: "localhost:50100"
})

export const api: API = {
  account: createPromiseClient(AccountService, transport)
}

Environment (please complete the following information):

  • @bufbuild/connect-web version: 0.5.0
  • Framework and version: [email protected]
  • TypeScript: 4.9.4

More info on using the generated definitions?

Been using connect-web for some time, it's sooo good compared to the past JS/TS generators I've used. Thanks so much for the work!

I was wondering if the documentation could add a few words on how best to use the generated definitions.

For example, what are the intended uses for PartialMessage<T> and PlainMessage<T>? What's the best practice for class conversions and initialization? I ask because, for example, the default constructor's PartialMessage parameter has insufficient IDE support and shows no auto-complete when initializing. The FieldList helps with type-checking, but it's not clear whether it's intended for users or for internal use.

The documentation already covered how to convert to/from JSON and binary; a few more documentation examples on which interface/class/functions to use for type-checking during conversions to/from other user types will be very much appreciated!
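As a point of reference for the PartialMessage<T> question: it expresses "every field optional", so constructors and factories can accept partial initializers and fill in zero values. A simplified, non-recursive sketch of the idea (the real protobuf-es type also recurses into nested messages; all names here are illustrative):

```typescript
// Simplified stand-in for protobuf-es's PartialMessage<T>.
type Partialish<T> = { [K in keyof T]?: T[K] };

interface SayRequestShape { sentence: string }

// Hypothetical factory showing the intended use: missing fields fall back
// to their zero values, as message constructors do.
function makeSayRequest(init: Partialish<SayRequestShape>): SayRequestShape {
  return { sentence: "", ...init };
}
```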

Missing dependency declaration for @bufbuild/protobuf

Describe the bug

Running this plugin in the context of Yarn v3 with the PnP mechanism enabled produces the following error:

 Generate files 
 ============== 
Error: Command failed: /xxx/.yarn/unplugged/node-protoc-npm-1.0.3-639f7ab0b0/node_modules/node-protoc/dist/protoc/bin/protoc --es_out=/xxx/packages/toolbox/gen --es_opt=target=ts,import_extension=none --connect-web_out=/xxx/packages/toolbox/gen --connect-web_opt=target=ts,import_extension=none --proto_path=/xxx/packages/toolbox/ext/main/proto /xxx/packages/toolbox/ext/main/proto/google/rpc/error_details.proto /xxx/packages/toolbox/ext/main/proto/google/rpc/status.proto /xxx/packages/toolbox/ext/main/proto/core/news.proto /xxx/packages/toolbox/ext/main/proto/core/net.proto /xxx/packages/toolbox/ext/main/proto/core/monitoring.proto /xxx/packages/toolbox/ext/main/proto/core/error.proto /xxx/packages/toolbox/ext/main/proto/core/money.proto /xxx/packages/toolbox/ext/main/proto/core/fido.proto /xxx/packages/toolbox/ext/main/proto/core/rewards.proto /xxx/packages/toolbox/ext/main/proto/core/payouts.proto /xxx/packages/toolbox/ext/main/proto/core/mailing.proto /xxx/packages/toolbox/ext/main/proto/core/activities.proto /xxx/packages/toolbox/ext/main/proto/core/client_errors.proto /xxx/packages/toolbox/ext/main/proto/core/fingerprint.proto /xxx/packages/toolbox/ext/main/proto/core/communication.proto /xxx/packages/toolbox/ext/main/proto/core/mining.proto /xxx/packages/toolbox/ext/main/proto/core/captcha.proto /xxx/packages/toolbox/ext/main/proto/core/otp.proto /xxx/packages/toolbox/ext/main/proto/core/client.proto /xxx/packages/toolbox/ext/main/proto/toolbox/service.proto
google/rpc/error_details.proto:19:1: warning: Import google/protobuf/duration.proto is unused.
/xxx/.pnp.cjs:40517
      Error.captureStackTrace(firstError);
            ^

Error: @bufbuild/protoc-gen-connect-web tried to access @bufbuild/protobuf, but it isn't declared in its dependencies; this makes the require call ambiguous and unsound.

Required package: @bufbuild/protobuf
Required by: @bufbuild/protoc-gen-connect-web@virtual:a9df22bd2675136058ccb33218a730656dcf048baefeca725877b87d0548fc929e41a7472e015620fd882f1dd55b43b92022f5949aa7e54a8098679bdfca0251#npm:0.6.0 (via /xxx/.yarn/unplugged/@bufbuild-protoc-gen-connect-web-virtual-ff7bfa0b23/node_modules/@bufbuild/protoc-gen-connect-web/dist/cjs/src/)

Require stack:
- /xxx/.yarn/unplugged/@bufbuild-protoc-gen-connect-web-virtual-ff7bfa0b23/node_modules/@bufbuild/protoc-gen-connect-web/dist/cjs/src/typescript.js
- /xxx/.yarn/unplugged/@bufbuild-protoc-gen-connect-web-virtual-ff7bfa0b23/node_modules/@bufbuild/protoc-gen-connect-web/dist/cjs/src/protoc-gen-connect-web-plugin.js
- /xxx/.yarn/unplugged/@bufbuild-protoc-gen-connect-web-virtual-ff7bfa0b23/node_modules/@bufbuild/protoc-gen-connect-web/bin/protoc-gen-connect-web
    at require$$0.Module._resolveFilename (/xxx/.pnp.cjs:40517:13)
    at require$$0.Module._load (/xxx/.pnp.cjs:40368:42)
    at Module.require (node:internal/modules/cjs/loader:1105:19)
    at require (node:internal/modules/cjs/helpers:103:18)
    at Object.<anonymous> (/xxx/.yarn/unplugged/@bufbuild-protoc-gen-connect-web-virtual-ff7bfa0b23/node_modules/@bufbuild/protoc-gen-connect-web/dist/cjs/src/typescript.js:17:20)
    at Module._compile (node:internal/modules/cjs/loader:1218:14)
    at Module._extensions..js (node:internal/modules/cjs/loader:1272:10)
    at require$$0.Module._extensions..js (/xxx/.pnp.cjs:40561:33)
    at Module.load (node:internal/modules/cjs/loader:1081:32)
    at require$$0.Module._load (/xxx/.pnp.cjs:40399:14)

Node.js v19.3.0
--connect-web_out: protoc-gen-connect-web: Plugin failed with status code 1.

    at ChildProcess.exithandler (node:child_process:419:12)
    at ChildProcess.emit (node:events:513:28)
    at maybeClose (node:internal/child_process:1098:16)
    at ChildProcess._handle.onexit (node:internal/child_process:304:5) {
  code: 1,
  killed: false,
  signal: null,
  cmd: '/xxx/.yarn/unplugged/node-protoc-npm-1.0.3-639f7ab0b0/node_modules/node-protoc/dist/protoc/bin/protoc --es_out=/xxx/packages/toolbox/gen --es_opt=target=ts,import_extension=none --connect-web_out=/xxx/packages/toolbox/gen --connect-web_opt=target=ts,import_extension=none --proto_path=/xxx/packages/toolbox/ext/main/proto /xxx/packages/toolbox/ext/main/proto/google/rpc/error_details.proto /xxx/packages/toolbox/ext/main/proto/google/rpc/status.proto /xxx/packages/toolbox/ext/main/proto/core/news.proto /xxx/packages/toolbox/ext/main/proto/core/net.proto /xxx/packages/toolbox/ext/main/proto/core/monitoring.proto /xxx/packages/toolbox/ext/main/proto/core/error.proto /xxx/packages/toolbox/ext/main/proto/core/money.proto /xxx/packages/toolbox/ext/main/proto/core/fido.proto /xxx/packages/toolbox/ext/main/proto/core/rewards.proto /xxx/packages/toolbox/ext/main/proto/core/payouts.proto /xxx/packages/toolbox/ext/main/proto/core/mailing.proto /xxx/packages/toolbox/ext/main/proto/core/activities.proto /xxx/packages/toolbox/ext/main/proto/core/client_errors.proto /xxx/packages/toolbox/ext/main/proto/core/fingerprint.proto /xxx/packages/toolbox/ext/main/proto/core/communication.proto /xxx/packages/toolbox/ext/main/proto/core/mining.proto /xxx/packages/toolbox/ext/main/proto/core/captcha.proto /xxx/packages/toolbox/ext/main/proto/core/otp.proto /xxx/packages/toolbox/ext/main/proto/core/client.proto /xxx/packages/toolbox/ext/main/proto/toolbox/service.proto'
}

Adding the following section to .yarnrc.yml resolves the issue:

packageExtensions:
  "@bufbuild/protoc-gen-connect-web@*":
    peerDependencies:
      "@bufbuild/protobuf": "*"

Environment (please complete the following information):

  • @bufbuild/protoc-gen-connect-web: 0.6.0

Additional context

$ yarn --version
4.0.0-rc.34.git.20221220.hash-3246d10

$ node --version
v19.3.0

Add gRPC Server Reflection support

Is your feature request related to a problem? Please describe.
gRPC and connect-go support a server-side reflection API that serves protobuf schema information to clients. As of now, connect-node does not support this feature.

This feature would be useful as various gRPC debugging utilities can use it to grab schema information on-demand.

Describe the solution you'd like
Ideally, connect-node would have an optional implementation of server-side reflection, including the necessary code generation in protobuf-es to support it (as needed).

Additional context
gRPC Server Reflection Protocol
connect-go grpcreflect implementation

Exceptions thrown from ASP.NET gRPC Service getting converted to "premature eof" error.

Describe the bug

I am using gRPC from an ASP.NET backend server, talking to a VueJS frontend client, running in Edge and Chrome (chromium).

When throwing an exception from within a gRPC service call, the exception appears to be mishandled when received in the response headers on the client.

// C# ASP.NET 

public override Task<Foo> Create(Foo request, ServerCallContext context)
{
    if (alreadyExists(request))
    {
        throw new RpcException(new Status(StatusCode.AlreadyExists, "Foo already exists"));
    }

    // Create Foo
    return Task.FromResult(request);
}

On the client side, this results in the callback/promise rejection containing a "premature eof" error.

The source of the error is within grpc-web-transport.

I believe the expectation is that the headers will contain the status code, but upon testing, the only thing present in the headers at that time is a content-length of 0.

Interestingly, after the operation is complete, upon inspecting the network traffic in developer tools, both grpc-status and grpc-message are present and with the values from the exception thrown from C#.

Environment (please complete the following information):

  • @bufbuild/connect-web version: 0.1.0
  • Framework and version: [email protected], [email protected]
  • Browser and version: Microsoft Edge 104.0.1293.63, Google Chrome 104.0.5112.102

Additional context
This may not be a bug, but rather some misconfiguration on the C# side causing the response to be processed incorrectly; at this stage, I am running out of ideas.

gRPC response from GRPCUI (screenshot)

Response header after the request has completed (screenshot)

Output during breakpoint debugging, at the point where grpc-web-transport looks for the grpc-status header (screenshot)

Client contains methods that can't be executed (never)

const client = createPromiseClient(ElizaService, transport);

The above produces a client that has both say and converse methods. say is fine since it's a unary method, but converse has the type never because it's a bidi streaming method. We should omit these methods from the type signature.
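One way the type signature could omit the never-typed methods is a mapped type that filters out keys whose value type is `never`. The names below are illustrative, not the library's API:

```typescript
// Drop every property whose type is `never` via a key-remapping mapped type.
type OmitNever<T> = { [K in keyof T as T[K] extends never ? never : K]: T[K] };

interface RawClient {
  say: (req: { sentence: string }) => Promise<{ sentence: string }>;
  converse: never; // bidi streaming: not callable, typed as never today
}

type UsableClient = OmitNever<RawClient>; // only `say` remains

const client: UsableClient = {
  say: async (req) => ({ sentence: req.sentence }),
};
```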

Message serialization and classes vs. plain javascript objects with React Server Components

Is your feature request related to a problem? Please describe.
Today, @bufbuild deserializes gRPC responses into classes generated from Protobuf. This makes using React Server Components somewhat cumbersome because the objects are not serializable with JSON.stringify. Additionally, the available serialization methods are relatively untyped: there is no method to produce a PlainMessage<MessageType>, which might (?) work with JSON.stringify, and toJson produces a JsonValue instead of a type related to the original generated type (e.g. a JsonValue<MessageType>, similar to PlainMessage<MessageType>).

Here is a concrete example of what we're currently doing today to work around this:

page.tsx (using Next@13)

export default async function Page() {
  const answer = await eliza.say({sentence: "I feel happy."});

  return <AnswerViewWrapper answer={answer.toJson()} />;
}

AnswerViewWrapper.tsx

export interface AnswerViewWrapperProps {
  readonly answer: JsonValue;
}

export function AnswerViewWrapper({ answer }: AnswerViewWrapperProps) {
  return <AnswerView answer={SayResponse.fromJson(answer)} />;
}

AnswerView.tsx

export interface AnswerViewProps {
  readonly answer: SayResponse;
}

export function AnswerView({ answer }: AnswerViewProps) {
  return <span>{answer.sentence}</span>;
}

Describe the solution you'd like
Eliminate the need to break type safety, and ideally eliminate the extra serialize/deserialize step. Some options:

  1. Move away from, or have an option for, generating plain javascript objects instead of classes. I'm sure there was a strong rationale for using classes, but generally, I believe that a more functional approach will lead to better compatibility across the stack. Use plain interfaces/data objects and provide functions for manipulating them rather than methods.
  2. Provide better type-safe serialization/deserialization. E.g. toJson(): JsonValue<SayResponse> or toPlainMessage(): PlainMessage<SayResponse>
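Option 2 could be sketched as follows. All names here are hypothetical stand-ins (PlainLike for protobuf-es's PlainMessage, a demo class for a generated message); this is an illustration of the requested API shape, not an existing one:

```typescript
// A plain-object view of T: methods are stripped from the type, so the
// result is JSON.stringify-safe by construction.
type PlainLike<T> = { [K in keyof T as T[K] extends Function ? never : K]: T[K] };

class SayResponseDemo {
  constructor(public sentence = "") {}
  // Hypothetical typed conversion: returns a prototype-free object that
  // can cross the React Server Component boundary without a cast.
  toPlain(): PlainLike<SayResponseDemo> {
    return { sentence: this.sentence };
  }
}
```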

Describe alternatives you've considered

  • Use grpc-js or some other library. IIRC it has the same issues with class based objects, and overall poorer support.


Option to control file extensions in generated code

This is closely related to #236.

Much like the above issue, I'm on create-react-app and struggling to figure out how to get something working with CRA/TypeScript/Jest (which has dubious support for ES modules).

It would be nice if I could stay on the happy path of not thinking too much about my webpack config.

I know it's in a weird, sort-of-deprecated state, but since Create React App is the blessed solution for bootstrapping a React app, it'd be nice if this worked more or less out of the box with it. An option to add file extensions to the generated code seems easier than coordinating between Facebook and Microsoft to get all of their compilers to work together.

Thanks for the library, it's great so far!

[Question] Do you consider supporting long-lasting server side streaming?

I would like to keep a connection between the web client and the server open for quite some time, so that I can push data to the client as it emerges during the lifetime of the connection.

I would like to know whether you have considered mechanisms like WebSockets or server-sent events for pushing data to a client that is connected to a streaming endpoint over a long time. If you have considered it, please share whether you plan to implement it in the future, and if not, why. Thank you very much!

HTTP 400 Bad Request from grpc-web server

Hi there! I'm testing a connect-web client sending requests to a gRPC-Web server (Tonic + tonic-web).

The proto:

syntax = "proto3";


package carrel_scaffold.v1;

message ScaffoldNewProjectRequest {
    string project_name = 1;
    string project_parent_dir = 2;
}

message ScaffoldNewProjectResponse {
    string project_dir = 1;
}

service ScaffoldNewProjectService {
    rpc ScaffoldNewProject(ScaffoldNewProjectRequest) returns (ScaffoldNewProjectResponse);
}

I can send gRPC-Web requests to the server with Postman and BloomRPC and receive results without a problem.

But the generated client throws [invalid_argument] HTTP 400 Bad Request.

The generated client:

// @generated by protoc-gen-connect-web v0.2.1 with parameter "target=ts"
// @generated from file carrel_scaffold/v1/scaffold.proto (package carrel_scaffold.v1, syntax proto3)
/* eslint-disable */
/* @ts-nocheck */

import {ScaffoldNewProjectRequest, ScaffoldNewProjectResponse} from "./scaffold_pb.js";
import {MethodKind} from "@bufbuild/protobuf";

/**
 * @generated from service carrel_scaffold.v1.ScaffoldNewProjectService
 */
export const ScaffoldNewProjectService = {
  typeName: "carrel_scaffold.v1.ScaffoldNewProjectService",
  methods: {
    /**
     * @generated from rpc carrel_scaffold.v1.ScaffoldNewProjectService.ScaffoldNewProject
     */
    scaffoldNewProject: {
      name: "ScaffoldNewProject",
      I: ScaffoldNewProjectRequest,
      O: ScaffoldNewProjectResponse,
      kind: MethodKind.Unary,
    },
  }
} as const;

My usage:

const transport = createConnectTransport({
    baseUrl: "http://127.0.0.1:8081",
});

const client = createPromiseClient(ScaffoldNewProjectService, transport);
const load = async () => {
   client.scaffoldNewProject({
        projectName: "test",
        projectParentDir: "tmp",
    }).then((response) => {
        console.log(response);
    });
}

Tracing on the server side shows differences from the working requests from Postman etc.:

2022-09-23T17:09:10.800049Z DEBUG hyper::proto::h1::conn: read eof
2022-09-23T17:09:10.800823Z DEBUG hyper::proto::h1::io: parsed 13 headers
2022-09-23T17:09:10.800845Z DEBUG hyper::proto::h1::conn: incoming body is content-length (28 bytes)
2022-09-23T17:09:10.800946Z DEBUG hyper::proto::h1::conn: incoming body completed
project name: test1
2022-09-23T17:09:10.801213Z DEBUG hyper::proto::h1::role: response with HTTP2 version coerced to HTTP/1.1
2022-09-23T17:09:10.801359Z DEBUG hyper::proto::h1::io: flushed 410 bytes

And the request from the connect-web client:

2022-09-23T17:09:37.118372Z DEBUG hyper::proto::h1::io: parsed 13 headers
2022-09-23T17:09:37.118404Z DEBUG hyper::proto::h1::conn: incoming body is empty
2022-09-23T17:09:37.118556Z DEBUG hyper::proto::h1::io: flushed 269 bytes
2022-09-23T17:09:37.119303Z DEBUG hyper::proto::h1::io: parsed 13 headers
2022-09-23T17:09:37.119318Z DEBUG hyper::proto::h1::conn: incoming body is empty
2022-09-23T17:09:37.119414Z DEBUG hyper::proto::h1::io: flushed 269 bytes
2022-09-23T17:09:37.120424Z DEBUG hyper::proto::h1::io: parsed 16 headers
2022-09-23T17:09:37.120438Z DEBUG hyper::proto::h1::conn: incoming body is content-length (47 bytes)
2022-09-23T17:09:37.120504Z DEBUG hyper::proto::h1::conn: incoming body completed
2022-09-23T17:09:37.120538Z DEBUG tonic_web::service: kind="other h1" content_type=Some("application/json")
2022-09-23T17:09:37.120630Z DEBUG hyper::proto::h1::io: flushed 205 bytes
2022-09-23T17:09:37.120778Z DEBUG hyper::proto::h1::io: parsed 16 headers
2022-09-23T17:09:37.120789Z DEBUG hyper::proto::h1::conn: incoming body is content-length (47 bytes)
2022-09-23T17:09:37.120835Z DEBUG hyper::proto::h1::conn: incoming body completed
2022-09-23T17:09:37.120863Z DEBUG tonic_web::service: kind="other h1" content_type=Some("application/json")
2022-09-23T17:09:37.120963Z DEBUG hyper::proto::h1::io: flushed 205 bytes

Add support for @improbable-eng/grpc-web

Is your feature request related to a problem? Please describe.

I would like to use @improbable-eng/grpc-web as a gRPC client to be able to use bidirectional streams

Describe the solution you'd like

Something similar to this but for @improbable-eng/grpc-web

import * as grpc from "@grpc/grpc-js";

const client = createGrpcClient(ElizaService, {
  address: "localhost",
  channelCredentials: grpc.ChannelCredentials.createInsecure(),
});

client.say({ sentence: "Hello" }, (err: grpc.ServiceError | null, value?: SayResponse) => {
  // ...
});

Allow easier access to common fetch params

Some common options on the fetch call (like credentials) must currently be set via interceptors. While that works, we should expose these common options in a more ergonomic way:

Current:

const transport = createConnectTransport({
  baseUrl: host,
  interceptors: [
    (_service, _method, _options, request, response) => {
      return [
        {
          ...request,
          init: {
            ...request.init,
            credentials: "include",
          },
        },
        response,
      ];
    },
  ],
});

Proposal:

const transport = createConnectTransport({
  baseUrl: host,
  credentials: "include",
});

DOM library declaration is required in tsconfig.json for a server-side application

Describe the bug

If the DOM library declaration is not included in tsconfig.json, compilation fails in connect-core with the error:

node_modules/@bufbuild/connect-core/dist/types/call-options.d.ts (13:15)
13     headers?: HeadersInit;
                 ~~~~~~~~~~~

I would assume this type is only used for browser-side code, so it would be great if adding DOM as a library could be avoided for a server-side application.

To Reproduce

  • Set up a TypeScript project which uses connect-node, with the lib option set to e.g. ["ESNext"] in tsconfig.json.
  • Try to compile it.
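Until the published declaration files stop referencing DOM-only types like HeadersInit, two tsconfig.json workarounds seem possible: enable skipLibCheck so type errors inside node_modules declaration files are ignored, or add DOM to lib. A minimal sketch of the first:

```json
{
  "compilerOptions": {
    "lib": ["ESNext"],
    "skipLibCheck": true
  }
}
```

skipLibCheck avoids pulling browser globals into a server-side project, at the cost of skipping type checking for all .d.ts files.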

Environment:

  • @bufbuild/connect-web version: 0.7.0
  • @bufbuild/connect-node version: 0.7.0
  • Frontend framework and version: N/A
  • Node.js version: 18.14.0
  • Browser and version: N/A
