mattiasbuelens / web-streams-polyfill

Web Streams, based on the WHATWG spec reference implementation

License: MIT License

JavaScript 10.71% TypeScript 89.29%
stream streams streams-standard javascript polyfill ponyfill

web-streams-polyfill's Introduction

web-streams-polyfill

Web Streams, based on the WHATWG spec reference implementation.


Usage

This library comes in multiple variants:

  • web-streams-polyfill: a ponyfill that provides the stream implementations without replacing any globals, targeting ES2015+ environments. Recommended for use in Node 6+ applications, or in web libraries supporting modern browsers.
  • web-streams-polyfill/es5: a ponyfill targeting ES5+ environments. Recommended for use in legacy Node applications, or in web libraries supporting older browsers.
  • web-streams-polyfill/polyfill: a polyfill that replaces the native stream implementations, targeting ES2015+ environments. Recommended for use in web apps supporting modern browsers through a <script> tag.
  • web-streams-polyfill/polyfill/es5: a polyfill targeting ES5+ environments. Recommended for use in web apps supporting older browsers through a <script> tag.

Each variant also includes TypeScript type definitions, compatible with the DOM type definitions for streams included in TypeScript. These type definitions require TypeScript version 4.7 or higher.

In version 4, the list of variants was reworked to have more modern defaults and to reduce the download size of the package. See the migration guide for more information.

Usage as a polyfill:

<!-- option 1: hosted by unpkg CDN -->
<script src="https://unpkg.com/web-streams-polyfill/dist/polyfill.js"></script>
<!-- option 2: self hosted -->
<script src="/path/to/web-streams-polyfill/dist/polyfill.js"></script>
<script>
var readable = new ReadableStream();
</script>

Usage as a Node module:

var streams = require("web-streams-polyfill");
var readable = new streams.ReadableStream();

Usage as a ponyfill from within an ES2015 module:

import { ReadableStream } from "web-streams-polyfill";
const readable = new ReadableStream();

Usage as a polyfill from within an ES2015 module:

import "web-streams-polyfill/polyfill";
const readable = new ReadableStream();

Compatibility

The polyfill and ponyfill variants work in any ES2015-compatible environment.

The polyfill/es5 and ponyfill/es5 variants work in any ES5-compatible environment that has a global Promise. If you need to support older browsers or Node versions that do not have a native Promise implementation (check the support table), you must first include a Promise polyfill (e.g. promise-polyfill).

Async iterable support for ReadableStream is available in all variants, but requires an ES2018-compatible environment or a polyfill for Symbol.asyncIterator.
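For instance, consuming a stream with `for await` (a minimal sketch; it uses the global `ReadableStream` here, but the ponyfill import works the same way):

```javascript
// Collect all chunks of a ReadableStream via its async iterator.
// Requires Symbol.asyncIterator support (ES2018+ or a polyfill).
async function collect(stream) {
  const chunks = [];
  for await (const chunk of stream) {
    chunks.push(chunk);
  }
  return chunks;
}

const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("a");
    controller.enqueue("b");
    controller.close();
  }
});

collect(stream).then(chunks => console.log(chunks)); // ["a", "b"]
```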

WritableStreamDefaultController.signal is available in all variants, but requires a global AbortController constructor. If necessary, consider using a polyfill such as abortcontroller-polyfill.
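As a sketch of what `controller.signal` enables inside a sink (a hypothetical slow-work example, not taken from the library's docs):

```javascript
// A sink whose write() reacts to the stream being aborted via controller.signal.
const ws = new WritableStream({
  write(chunk, controller) {
    return new Promise((resolve, reject) => {
      const timer = setTimeout(resolve, 1000); // stand-in for slow async work
      controller.signal.addEventListener("abort", () => {
        clearTimeout(timer); // cancel the slow work instead of letting it finish
        reject(controller.signal.reason);
      });
    });
  }
});
```

Aborting the stream (`writer.abort(reason)`) then settles the in-flight write promptly instead of waiting for the slow work to finish.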

Reading with a BYOB reader is available in all variants, but requires ArrayBuffer.prototype.transfer() or structuredClone() to exist in order to correctly transfer the given view's buffer. If not available, then the buffer won't be transferred during the read.
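For example, a BYOB read looks like this (a minimal sketch with a simple in-memory byte source):

```javascript
// Byte stream + BYOB reader: the caller supplies the destination buffer.
const byteStream = new ReadableStream({
  type: "bytes",
  start(controller) {
    controller.enqueue(new Uint8Array([1, 2, 3, 4]));
    controller.close();
  }
});

async function readInto(stream, byteLength) {
  const reader = stream.getReader({ mode: "byob" });
  // Without ArrayBuffer.prototype.transfer()/structuredClone(), the polyfill
  // can't detach the view's buffer, but the read itself still works.
  const { value, done } = await reader.read(new Uint8Array(byteLength));
  reader.releaseLock();
  return { value, done };
}
```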

Tooling compatibility

This package uses subpath exports for its variants. As such, you need Node 12 or higher in order to import or require() such a variant.

When using TypeScript, make sure your moduleResolution is set to "node16", "nodenext" or "bundler".
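A minimal tsconfig.json sketch with a compatible setting (the other compiler options are up to your project):

```json
{
  "compilerOptions": {
    "module": "node16",
    "moduleResolution": "node16"
  }
}
```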

Compliance

The polyfill implements version 4dc123a (13 Nov 2023) of the streams specification.

The polyfill is tested against the same web platform tests that are used by browsers to test their native implementations. The polyfill aims to pass all tests, although it allows some exceptions for practical reasons:

  • The default (ES2015) variant passes all of the tests, except for the test for the prototype of ReadableStream's async iterator. Retrieving the correct %AsyncIteratorPrototype% requires using an async generator (async function* () {}), which is invalid syntax before ES2018. Instead, the polyfill creates its own version which is functionally equivalent to the real prototype.
  • The ES5 variant passes the same tests as the ES2015 variant, except for various tests about specific characteristics of the constructors, properties and methods. These test failures do not affect the run-time behavior of the polyfill. For example:
    • The name property of down-leveled constructors is incorrect.
    • The length property of down-leveled constructors and methods with optional arguments is incorrect.
    • Not all properties and methods are correctly marked as non-enumerable.
    • Down-leveled class methods are not correctly marked as non-constructable.

Contributors

Thanks to these people for their work on the original polyfill:

web-streams-polyfill's People

Contributors

ariutta · creatorrr · dependabot-preview[bot] · dependabot[bot] · frank-dspeed · mattiasbuelens · twiss


web-streams-polyfill's Issues

Could you please decrease the size of the package?

Hello

The current size of the installed package is almost 9 MB, which seems like a lot for an average package. Is it really necessary to include all files (including *.map) for the package to function correctly? It seems like it would be enough to publish only the minified files from the list below.

$ du -sh node_modules/web-streams-polyfill/dist/* | sort -hr

436K	node_modules/web-streams-polyfill/dist/ponyfill.mjs.map
436K	node_modules/web-streams-polyfill/dist/ponyfill.js.map
436K	node_modules/web-streams-polyfill/dist/polyfill.mjs.map
436K	node_modules/web-streams-polyfill/dist/polyfill.js.map
424K	node_modules/web-streams-polyfill/dist/ponyfill.es6.js.map
424K	node_modules/web-streams-polyfill/dist/polyfill.es6.mjs.map
424K	node_modules/web-streams-polyfill/dist/polyfill.es6.js.map
420K	node_modules/web-streams-polyfill/dist/ponyfill.es6.mjs.map
400K	node_modules/web-streams-polyfill/dist/ponyfill.es2018.mjs.map
400K	node_modules/web-streams-polyfill/dist/ponyfill.es2018.js.map
400K	node_modules/web-streams-polyfill/dist/polyfill.es2018.mjs.map
400K	node_modules/web-streams-polyfill/dist/polyfill.es2018.js.map
372K	node_modules/web-streams-polyfill/dist/polyfill.min.js.map
364K	node_modules/web-streams-polyfill/dist/polyfill.es6.min.js.map
344K	node_modules/web-streams-polyfill/dist/polyfill.es2018.min.js.map
228K	node_modules/web-streams-polyfill/dist/ponyfill.js
228K	node_modules/web-streams-polyfill/dist/polyfill.js
216K	node_modules/web-streams-polyfill/dist/polyfill.es6.js
212K	node_modules/web-streams-polyfill/dist/ponyfill.es6.js
212K	node_modules/web-streams-polyfill/dist/polyfill.es2018.js
208K	node_modules/web-streams-polyfill/dist/ponyfill.mjs
208K	node_modules/web-streams-polyfill/dist/ponyfill.es2018.js
208K	node_modules/web-streams-polyfill/dist/polyfill.mjs
196K	node_modules/web-streams-polyfill/dist/ponyfill.es6.mjs
196K	node_modules/web-streams-polyfill/dist/polyfill.es6.mjs
192K	node_modules/web-streams-polyfill/dist/ponyfill.es2018.mjs
192K	node_modules/web-streams-polyfill/dist/polyfill.es2018.mjs
 84K	node_modules/web-streams-polyfill/dist/types
 72K	node_modules/web-streams-polyfill/dist/polyfill.min.js
 68K	node_modules/web-streams-polyfill/dist/polyfill.es6.min.js
 64K	node_modules/web-streams-polyfill/dist/polyfill.es2018.min.js
...
total: 8.7M

`Cannot abort a stream that already has a writer`

Hi, thanks for this great polyfill first.

I want to close the WritableStream when the user cancels the download process, but it throws the error in the title. Is there any solution?

I'm currently using this hacky workaround:

if (evt.data.aborted) {
  if (stream._writer) {
    stream._writer.abort()
    stream._writer = undefined
  }
  stream.abort()
}
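For context, `WritableStream.abort()` throws that error because the stream is locked to a writer; aborting through the writer itself avoids poking at the private `_writer` field (a sketch, assuming you kept a reference to the writer from `stream.getWriter()`):

```javascript
// Abort a WritableStream even while a writer holds its lock:
// writer.abort() is allowed, while stream.abort() on a locked stream throws.
async function cancelDownload(stream, writer) {
  if (writer) {
    await writer.abort(new Error("download cancelled"));
  } else if (!stream.locked) {
    await stream.abort(new Error("download cancelled"));
  }
}
```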

welp, looking for that readableStream iterator polyfill

I remember somebody asking how to take a ReadableStream and turn it into an async iterator. I came up with a suggestion and then you added some improvements to it.

I don't remember where it was... I think that person posted it somewhere in one of your repos...
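For anyone landing here, a minimal userland sketch of such an adapter (not the polyfill's actual implementation, which handles more edge cases) looks roughly like:

```javascript
// Wrap a ReadableStream's default reader in an async iterator.
// Simplified sketch: real implementations also handle preventCancel etc.
function streamAsyncIterator(stream) {
  const reader = stream.getReader();
  return {
    async next() {
      const { done, value } = await reader.read();
      if (done) reader.releaseLock();
      return { done, value };
    },
    async return(value) {
      // Early exit (e.g. `break` in a for-await loop): cancel the stream.
      await reader.cancel();
      return { done: true, value };
    },
    [Symbol.asyncIterator]() {
      return this;
    }
  };
}
```

Usage: `for await (const chunk of streamAsyncIterator(stream)) { … }`.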

ReadableStream and File/Blob

Leaving this tip to hopefully save time for others who encountered this problem.

I am processing files client-side using File/Blob objects that come from an input element. My application uses a sequence of transformations to go from the original file, possibly uncompressing it (with pako for portability), converting it to text, and breaking it into lines. The TransformStream API makes this quite clean and nice, but there is an incompatibility, since the object returned by File.stream() is a native ReadableStream, not a polyfilled one. After a bit of head scratching, I made the following function, which is basically cribbed from the API pages (https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream).

Effectively it is almost a no-op, but it helps bridge between the native API and the polyfill.

    function blobToReadableStream(blob) {
      let reader = blob.stream().getReader();
      return new ReadableStream({
        start(controller) {
          function push() {
            reader.read().then(({done, value}) => {
              if (done) {
                controller.close();
                return;
              }
              controller.enqueue(value);
              push();
            })
          }
          push();
        }
      });
    }

Thanks,
Tom.
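A pull-based variant of the same bridge, plus the kind of pipeline described above, can be sketched like this (`identity` is a placeholder transform standing in for the decompress/decode/split steps):

```javascript
// Same bridge as above, written with pull() so backpressure falls out naturally.
function blobToReadableStream(blob) {
  const reader = blob.stream().getReader();
  return new ReadableStream({
    async pull(controller) {
      const { done, value } = await reader.read();
      if (done) controller.close();
      else controller.enqueue(value);
    },
    cancel(reason) {
      return reader.cancel(reason);
    }
  });
}

// Placeholder transform; a real pipeline would decompress/decode/split here.
const identity = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk);
  }
});

// Usage: blobToReadableStream(file).pipeThrough(identity).pipeTo(destination);
```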

ReadableStream typing does not match async iterator spec

The globally declared Symbol.asyncIterator property here should be a method, not a property.

https://github.com/MattiasBuelens/web-streams-polyfill/blob/master/dist/types/ts3.6/polyfill.d.ts#L26

vs.

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/AsyncIterator/@@asyncIterator

This causes subtle downstream typing bugs like this:

node_modules/@langchain/core/dist/utils/stream.d.ts:1:18 - error TS2320: Interface 'IterableReadableStreamInterface<T>' cannot simultaneously extend types 'ReadableStream<T>' and 'AsyncGenerator<T, any, unknown>'.
  Named property '[Symbol.asyncIterator]' of types 'ReadableStream<T>' and 'AsyncGenerator<T, any, unknown>' are not identical.

See:

openai/openai-node#613
langchain-ai/langchainjs#3793

closing a stream followed by respond(0) can error if close does not synchronously put the stream in the closed state

Hi,

If calling controller.close() does not always put the stream into the closed state (e.g. when there are pending pull-into requests), is there a way to identify how/when to respond with 0 bytes written to byobRequests that might still be active?

In whatwg/streams#1170 there is a discussion around needing to .respond(0) immediately following the close call, but since close does not necessarily synchronously move the stream into the closed state, the respond(0) can error.

I found this issue when working with polyfill v3.1.1, while doing some things that aren't recommended, like enqueueing chunks before using the view that was provided in a byobRequest. I upgraded to 3.2, which made those enqueues invalidate the byobRequest; that seems to resolve the specific issue I was running into (and I stopped preempting the BYOB view with enqueued chunks).

However, while investigating this issue I can still get into a state where a byobRequest exists after calling close, and I'd like to know if there is a way to assert that we're in the closed state before trying to finalize the stream with a last controller.byobRequest.respond(0).

signal with pre aborted

In one of my tests, this hangs indefinitely when using an abort signal that has already been aborted together with the polyfill (this doesn't happen when using native streams):

  const abortController = new AbortController()
  const signal = abortController.signal
  abortController.abort()
  const promise = rs.pipeTo(ws, { signal })
  err = await promise.catch(e=>e)
  // never gets here

Using version 3.0.3 (edit: and 3.0.1, apparently).

Node version bump from 8 to 18 breaks Remix installation

Hi,

it seems like version 3.3.0 of web-streams-polyfill bumps the engines requirement from Node 8 to Node 18 (#130), which is higher than Remix's minimum Node version (Node 14). This in turn causes our (Sentry JavaScript SDKs) integration tests to fail, because Remix depends on web-streams-polyfill.


IMHO, a minimum Node version bump should not be made in a minor release. We'll probably be able to work around this issue by overriding the dependency version, but I wanted to let you know anyway! (I assume more Remix users are going to run into this.)

cheers!

Issue with dynamic import

Hello,

I use native-file-system-adapter, which has a dependency on web-streams-polyfill; that library tries to dynamically import web-streams-polyfill. When I try to build my app (webpack 5 / TypeScript 4.4) I get this error:

external "https://cdn.jsdelivr.net/npm/web-streams-polyfill@3/dist/ponyfill.es2018.mjs"
The target environment doesn't support dynamic import() syntax so it's not possible to use external type 'module' within a script

Could you help me? Thanks!

Investigate using native streams

Inspired by this tweet from @surma:

@MattiasBuelens Is it possible to offer up the polyfills for Readable, Writable and Transform individually? Most browsers have Readable, so ideally I'd only load Writable and Transform.

I've thought about this previously. Back then, I decided that it was not feasible because readable byte streams are not supported by any browser. A full polyfill would always need to provide its own ReadableStream implementation that supports byte streams. By extension, it would also need to provide its own implementations for WritableStream (that works with its ReadableStream.pipeTo()) and TransformStream (that uses its readable and writable streams).

Looking at this again, I think we can do better. If you don't need readable byte streams, then the native ReadableStream should be good enough as a starting point for the polyfill. From there, the polyfill could add any missing methods (pipeTo, pipeThrough, getIterator,...) and implement them using the native reader from getReader().

This approach can never be fully spec-compliant though, since the spec explicitly forbids these methods to use the public API. For example, pipeTo() must use AcquireReadableStreamDefaultReader() instead of ReadableStream.getReader(), so it cannot be affected by user-land JavaScript code making modifications to ReadableStream.prototype. I don't think that has to be a problem though: we are already a user-land polyfill written in JavaScript that modifies those prototypes, it would be silly for the polyfill to try and guard itself against other JavaScript code making similar modifications.

Steps in the spec that require inspecting the internal state of the stream or call into internal methods will need to be replaced by something that emulates the behavior using solely the public API.

  • Often, this will be easy: e.g. ReadableStreamDefaultControllerEnqueue() becomes controller.enqueue().

  • Sometimes, we have to be a bit more lenient. ReadableStreamPipeTo()'s error propagation says:

    if source.[[state]] is or becomes "errored"

    We can check if it becomes errored by waiting for the source.closed promise to become rejected. However, we can't synchronously check if it is already errored.

  • In rare cases, this may turn out to be impossible. TransformStreamDefaultSinkWriteAlgorithm specifies:

    If state is "erroring", throw writable.[[storedError]].

    Usually, the writable stream starts erroring because the writable controller has errored, which the transform stream's implementation controls. However, it could also be triggered by WritableStream.abort(), which is out of the control of the transform stream implementation. In this case, the controller is only made aware of it after the writable stream finishes erroring (state becomes "errored") through its abort() algorithm, which is already too late.
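To make the "becomes errored" point concrete, here is a hypothetical pipe loop restricted to the public API (an illustrative sketch, not the polyfill's actual pipeTo):

```javascript
// A pipe loop using only the public API: it cannot synchronously ask
// whether the source *is* errored, but the reader's `closed` promise
// tells it when the source *becomes* errored.
async function pipeLike(source, dest) {
  const reader = source.getReader();
  const writer = dest.getWriter();
  let sourceError;
  reader.closed.catch(e => { sourceError = e; }); // fires once source errors
  try {
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      await writer.write(value);
    }
    await writer.close();
  } catch (e) {
    await writer.abort(sourceError ?? e).catch(() => {});
    throw sourceError ?? e;
  } finally {
    reader.releaseLock();
  }
}
```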

Of course, we can't just flat-out remove byte stream support from the polyfill, just for the sake of using native streams more. The default should still be a full polyfill, but we might want to give users the option to select which features they want polyfilled (as @surma suggested in another tweet).

Anyway, I still want to give this a try. It might fail catastrophically, but then at least I'll have a better answer on why we use so little from the native streams implementation. 😅

Ready to boost your popularity to like 22 mil download / week?

Do you think you can reduce the bundle by a magnitude?
node-fetch/node-fetch#1217

We have always been hesitant to include this package because of its size... We also never thought Node.js would ship WHATWG streams, or that they would become useful. But it looks like we are in an era where people demand more spec'ed, cross-compatible code across Deno, Node.js and browsers instead.

There is an ambition to get WHATWG streams into the node-fetch body as well, because Node.js started to support WHATWG streams (planned for v4).

User permission denied for WriteStream function execution in iOS Safari 12.0

Hello, I was testing this polyfill to implement video capture in iOS Safari 12.0, but I hit this error:

Unhandled Promise Rejection: NotAllowedError: The request is not allowed by the user agent or the platform in the current context, possibly because the user denied permission.

Do I need to enable something in the settings, or is this not a supported browser?

Upgrading to Typescript 4.9 results in error TS2322 when using web-streams-polyfill

Hello!! We have a type error when doing this:

window.ReadableStream = ReadableStream;

Error:

Type 'typeof ReadableStream' is not assignable to type '{ new (underlyingSource: UnderlyingByteSource, strategy?: { highWaterMark?: number | undefined; } | undefined): ReadableStream<Uint8Array>; new <R = any>(underlyingSource: UnderlyingDefaultSource<...>, strategy?: QueuingStrategy<...> | undefined): ReadableStream<...>; new <R = any>(underlyingSource?: UnderlyingSourc...'.
  Types of parameters 'underlyingSource' and 'underlyingSource' are incompatible.
    Type 'UnderlyingSource<any> | undefined' is not assignable to type 'UnderlyingByteSource'.
      Type 'undefined' is not assignable to type 'UnderlyingByteSource'.

Do you plan to update the type? Thanks a lot!!

breaking change around abortsignal?

When I changed from v2.1.0 to v2.1.1, a promise that should have rejected never did.

const abortController = new AbortController()
abortController.abort()
const promise = readable.pipeTo(writable, { signal: abortController.signal })
await promise.catch(console.log)

The result was that it just hangs and waits. (Aborting something synchronously is abnormal; this is just a mocha test.) I tried creating a minimal reproducible example, but was unable to do it with purely WritableStream + ReadableStream; I'm extending/waiting/writing stuff.

Convert from/to native streams?

Right now, this project is just a polyfill: it implements the streams specification, and that's it. In reality though, you may need to inter-operate with non-polyfilled streams, e.g. when using web APIs that consume or produce native streams. In that case, you need a way to convert between native streams and polyfilled streams.

web-streams-adapter provides a low-level API to do such conversions between any stream implementation. For this polyfill, it might be interesting to provide an easier way to convert from/to a polyfilled stream.

These could be provided as static utility methods:

ReadableStream.from(stream: ReadableStreamLike): ReadableStream;
ReadableStream.toNative(stream: ReadableStream): typeof window.ReadableStream;

or exported separately:

export function fromReadableStream(stream: ReadableStreamLike): ReadableStream;
export function toNativeReadableStream(stream: ReadableStream): typeof window.ReadableStream;

Any thoughts on this API?

Enforce privacy for stream internals?

At the moment the polyfill follows the reference implementation in keeping private instance data in fields with leading underscores:

export class ReadableStream<R = any> {
  /** @internal */
  _state!: ReadableStreamState;
  /** @internal */
  _reader: ReadableStreamReader<R> | undefined;
  /** @internal */
  _storedError: any;
  /** @internal */
  _disturbed!: boolean;
  /** @internal */
  _readableStreamController!: ReadableStreamDefaultController<R> | ReadableByteStreamController;

  // ...
}

For some use cases it would be nice to have the internals hidden in a more enforced way. An obvious approach would be to use ES private fields, but polyfilling them accurately requires WeakMaps, so it would be necessary to have some kind of additional processing using something like the babel loose transform mode to support pure ES5 targets (without actually enforcing privacy for them, which seems reasonable). I don't think TypeScript offers this itself at the moment.
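For illustration, the WeakMap-based shape such a loose transform typically emits looks like this (a generic sketch, not code from the polyfill):

```javascript
// "Private" per-instance state held in a module-scoped WeakMap, reachable
// only by code in this module -- workable down to ES5 environments that
// have (or polyfill) WeakMap.
const internals = new WeakMap();

function Counter() {
  internals.set(this, { count: 0 });
}

Counter.prototype.increment = function () {
  return ++internals.get(this).count;
};

const c = new Counter();
c.increment(); // 1
c.increment(); // 2
```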

Would you be interested in accepting a patch along those lines? Obviously adopting this for the whole polyfill would be a bit of an invasive change.

UnhandledPromiseRejectionWarning: TypeError: reader._closedPromise_resolve is not a function

Hi, I'm getting an unhandled error even though it seems to be working otherwise.

```
(node:102333) UnhandledPromiseRejectionWarning: TypeError: reader._closedPromise_resolve is not a function
    at defaultReaderClosedPromiseResolve (/home/mauve/programming/make-fetch/node_modules/web-streams-polyfill/dist/ponyfill.es6.js:261:16)
    at ReadableStreamClose (/home/mauve/programming/make-fetch/node_modules/web-streams-polyfill/dist/ponyfill.es6.js:3092:9)
    at ReadableStreamDefaultControllerClose (/home/mauve/programming/make-fetch/node_modules/web-streams-polyfill/dist/ponyfill.es6.js:2616:13)
    at /home/mauve/programming/make-fetch/node_modules/web-streams-polyfill/dist/ponyfill.es6.js:3527:13
    at processTicksAndRejections (internal/process/task_queues.js:97:5)
(node:102333) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:102333) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
```

This is happening with version 3.0.0 installed from NPM on Node v12.16.2

Abilities to support web streams under IE8-?

I have found some breaking points for the shim under IE8 and below:

  1. return is a reserved keyword, which breaks ReadableStreamAsyncIteratorImpl.prototype.return.
  2. Several places use getters, which cannot be implemented under IE8 and below.

So the question is: is it possible to shim under IE8 and below?

`pipeTo` doesn't use `signal.reason` as error

Repro:

import { ReadableStream, WritableStream } from 'web-streams-polyfill';

const abortController = new AbortController();
new ReadableStream().pipeTo(new WritableStream(), { signal: abortController.signal }).catch(e => console.log(e));
abortController.abort('Hello');

Expected (Microsoft Edge Version 102.0.1235.1 native implementation):

Hello

Actual (web-streams-polyfill@…):

e [AbortError]: Aborted

WPT in https://github.com/web-platform-tests/wpt/blob/73220c738c29bba700e608b235122613d5225b57/streams/piping/abort.any.js also fails:

piping/abort.any.html

  √ a signal argument 'null' should cause pipeTo() to reject
  √ a signal argument 'AbortSignal' should cause pipeTo() to reject
  √ a signal argument 'true' should cause pipeTo() to reject
  √ a signal argument '-1' should cause pipeTo() to reject
  √ a signal argument '[object AbortSignal]' should cause pipeTo() to reject
  √ an aborted signal should cause the writable stream to reject with an AbortError
  × (reason: 'null') all the error objects should be the same object (UNEXPECTED FAILURE)

    promise_rejects_exactly: pipeTo rejects with abort reason function "function() { throw e }" threw object "AbortError: Aborted" but we expected it to throw null
        at runMicrotasks (<anonymous>)
        at processTicksAndRejections (node:internal/process/task_queues:96:5)
        at async Test.<anonymous> (http://127.0.0.1:64298/streams/piping/abort.any.js:65:7)
  × (reason: 'undefined') all the error objects should be the same object (UNEXPECTED FAILURE)

    assert_equals: signal.reason should be error expected object "AbortError: Aborted" but got object "AbortError: The operation was aborted."
        at Test.<anonymous> (http://127.0.0.1:64298/streams/piping/abort.any.js:72:5)
        at runMicrotasks (<anonymous>)
        at processTicksAndRejections (node:internal/process/task_queues:96:5)
  × (reason: 'error1: error1') all the error objects should be the same object (UNEXPECTED FAILURE)

    promise_rejects_exactly: pipeTo rejects with abort reason function "function() { throw e }" threw object "AbortError: Aborted" but we expected it to throw object "error1: error1"
        at runMicrotasks (<anonymous>)
        at processTicksAndRejections (node:internal/process/task_queues:96:5)
        at async Test.<anonymous> (http://127.0.0.1:64298/streams/piping/abort.any.js:65:7)
  √ preventCancel should prevent canceling the readable
  √ preventAbort should prevent aborting the readable
  √ preventCancel and preventAbort should prevent canceling the readable and aborting the readable
  × (reason: 'null') abort should prevent further reads (UNEXPECTED FAILURE)

    promise_rejects_exactly: pipeTo rejects with abort reason function "function() { throw e }" threw object "AbortError: Aborted" but we expected it to throw null
        at runMicrotasks (<anonymous>)
        at processTicksAndRejections (node:internal/process/task_queues:96:5)
        at async Test.<anonymous> (http://127.0.0.1:64298/streams/piping/abort.any.js:135:7)
  × (reason: 'undefined') abort should prevent further reads (UNEXPECTED FAILURE)

    assert_equals: signal.reason should be error expected object "AbortError: Aborted" but got object "AbortError: The operation was aborted."
        at Test.<anonymous> (http://127.0.0.1:64298/streams/piping/abort.any.js:140:5)
        at runMicrotasks (<anonymous>)
        at processTicksAndRejections (node:internal/process/task_queues:96:5)
  × (reason: 'error1: error1') abort should prevent further reads (UNEXPECTED FAILURE)

    promise_rejects_exactly: pipeTo rejects with abort reason function "function() { throw e }" threw object "AbortError: Aborted" but we expected it to throw object "error1: error1"
        at runMicrotasks (<anonymous>)
        at processTicksAndRejections (node:internal/process/task_queues:96:5)
        at async Test.<anonymous> (http://127.0.0.1:64298/streams/piping/abort.any.js:135:7)
  × (reason: 'null') all pending writes should complete on abort (UNEXPECTED FAILURE)

    promise_rejects_exactly: pipeTo rejects with abort reason function "function() { throw e }" threw object "AbortError: Aborted" but we expected it to throw null
        at async Test.<anonymous> (http://127.0.0.1:64298/streams/piping/abort.any.js:174:7)
  × (reason: 'undefined') all pending writes should complete on abort (UNEXPECTED FAILURE)

    assert_equals: signal.reason should be error expected object "AbortError: Aborted" but got object "AbortError: The operation was aborted."
        at Test.<anonymous> (http://127.0.0.1:64298/streams/piping/abort.any.js:179:5)
        at runNextTicks (node:internal/process/task_queues:61:5)
        at processTimers (node:internal/timers:497:9)
  × (reason: 'error1: error1') all pending writes should complete on abort (UNEXPECTED FAILURE)

    promise_rejects_exactly: pipeTo rejects with abort reason function "function() { throw e }" threw object "AbortError: Aborted" but we expected it to throw object "error1: error1"
        at async Test.<anonymous> (http://127.0.0.1:64298/streams/piping/abort.any.js:174:7)
  √ a rejection from underlyingSource.cancel() should be returned by pipeTo()
  √ a rejection from underlyingSink.abort() should be returned by pipeTo()
  √ a rejection from underlyingSink.abort() should be preferred to one from underlyingSource.cancel()
  √ abort signal takes priority over closed readable
  √ abort signal takes priority over errored readable
  √ abort signal takes priority over closed writable
  √ abort signal takes priority over errored writable
  √ abort should do nothing after the readable is closed
  √ abort should do nothing after the readable is errored
  √ abort should do nothing after the readable is errored, even with pending writes
  √ abort should do nothing after the writable is errored

Question: how to detach a TypedArray.

Sorry for a totally unrelated question, but I know you have worked on byte/BYOB streams before, and that data can be detached/transferred.

I was wondering if you have a trick up your sleeve for detaching a buffer, hopefully synchronously. I don't necessarily want to transfer the data to another thread, but the only possible way I could think of was to use MessageChannel:

function transfer (uint8, cb) {
  const mc = new MessageChannel()
  mc.port2.onmessage = evt => cb(evt.data)
  mc.port1.postMessage(uint8, [ uint8.buffer ])
}
uint8 = new Uint8Array([97])
transfer(uint8, console.log)

Do you happen to know of any other way? How do you do it in your polyfill?
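(For the record: newer environments can now detach a buffer synchronously without MessageChannel. A sketch, assuming structuredClone (Node 17+, modern browsers) or ArrayBuffer.prototype.transfer() (ES2024) is available:)

```javascript
// Detach an ArrayBuffer synchronously by transferring it.
const u8 = new Uint8Array([97]);

// structuredClone with a transfer list detaches the source buffer:
const moved = structuredClone(u8, { transfer: [u8.buffer] });
console.log(u8.byteLength);    // 0 -- the original view is now detached
console.log(moved.byteLength); // 1 -- the data lives on in the clone

// Where ES2024 is available, ArrayBuffer.prototype.transfer() is even simpler:
// const detached = someView.buffer.transfer(); // someView's buffer is now detached
```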

Protecting ArrayBuffers from the whatwg reference implementation

The whatwg ReadableByteStreamController reference implementation attempts to mimic the ArrayBuffer.transfer semantics by copying the underlying ArrayBuffer of any TypedArray that passes through it, and redefining the original's byteLength to be 0:

// Not implemented correctly
exports.TransferArrayBuffer = O => {
  assert(!exports.IsDetachedBuffer(O));
  const transferredIshVersion = O.slice();

  // This is specifically to fool tests that test "is transferred" by taking a non-zero-length
  // ArrayBuffer and checking if its byteLength starts returning 0.
  Object.defineProperty(O, 'byteLength', {
    get() {
      return 0;
    }
  });
  O[isFakeDetached] = true;

  return transferredIshVersion;
};

To be fair, they state that the reference implementation shouldn't be used as a polyfill. But seeing as how it is being used as a polyfill, this behavior is problematic.

Obviously copying at every step is unacceptable for a production polyfill, but redefining byteLength also makes it impossible to use in node (without wrapping every stream to pre-copy each chunk) for interop or testing DOM-streams logic. Node often uses internal ArrayBuffer pools (like here in fs.ReadStream), so the Object.defineProperty call is modifying the ArrayBuffer pool, breaking unrelated operations dependent on that pool.

After thoroughly reading/stepping through the code paths in the reference implementation at each place TransferArrayBuffer is called to verify the returned buffer isn't modified further, and after verifying the behavior with about a thousand (auto-generated) tests and one production use, I've come up with a workaround that solves both of these issues:

const kIsFakeBuffer = Symbol.for('isFakeBuffer');
function protectArrayBufferFromWhatwgRefImpl(value: Uint8Array) {
    const real = value.buffer;
    if (!real[kIsFakeBuffer]) {
        const fake = Object.create(real);
        Object.defineProperty(fake, kIsFakeBuffer, { value: true });
        Object.defineProperty(fake, 'slice', { value: () => real });
        Object.defineProperty(value, 'buffer', { value: fake });
    }
    return value;
}

Before calling readableByteStreamController.enqueue(), we can redefine the buffer property on the TypedArray to be an instance created via Object.create(), whose "slice" function is overridden to return the original, unmodified buffer. This ensures the value assigned to transferredIshVersion = O.slice(); is the "real" underlying ArrayBuffer, and that it won't be sliced.

I could keep this hack in userland, but maybe adding something like this to the polyfill would be worthwhile for the following reasons:

  1. Every userland call to controller.enqueue() has to be guarded by a call to protectArrayBufferFromWhatwgRefImpl(), which makes writing a true zero-copy polyfill for FF that much harder.
  2. Keeping this fix in userland means that we have to call protectArrayBufferFromWhatwgRefImpl() even in environments where the web-streams-polyfill isn't being used, like Chrome and Safari. Even though in my testing this wrapping doesn't affect the native stream implementations, if for some reason a user did want to call chunk.buffer.slice(), they'd be getting the real ArrayBuffer out, not a slice of it.
  3. My fulltime job is optimizing a binary streaming engine across node + browsers, and it took me a while to track this down -- what chance does a regular dev just trying to stream some data cross-browser stand of discovering and fixing this issue on their own?

Use in a streaming library for nodejs and browsers

I have a project that has a library of streaming classes that are intended to be used both in NodeJS runtimes and in browsers. I'd like to use web-streams-polyfill in the browser.

The main problem is that web-streams-polyfill requires "web-streams-polyfill/es2018" (or another variant) as the import specifier, whereas NodeJS provides streams under the "stream" specifier. This means that the classes in my library are not compilable in a browser context (the bundler complains that the "stream" dependency is not found).

I've got a few hacky workaround ideas, but wondered if there's an immediately obvious solution that I'm not seeing.
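One possible approach (a sketch, not a drop-in fix; the file names are hypothetical): isolate the stream imports in a single adapter module, and use the package.json "browser" field so bundlers substitute a web-streams version of that module in browser builds:

```json
{
  "browser": {
    "./src/streams-node.js": "./src/streams-browser.js"
  }
}
```

Here ./src/streams-node.js would re-export what you need from Node's "stream", and ./src/streams-browser.js from "web-streams-polyfill/es2018". webpack, browserify, and Rollup (with the node-resolve plugin) all honor this field, so the "stream" specifier never reaches the browser bundle.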

AutoAllocateChunkSize working?

Hey, we've been having trouble using the autoAllocateChunkSize feature. I'm not sure if it's a bug or just a gap in our understanding of the inner workings of ReadableStreams.

Please see the following example:

const source: UnderlyingByteSource = {
  autoAllocateChunkSize: DEFAULT_BUF_SIZE, // 8192
  type: "bytes",

  async pull(controller) {
    if (!controller.byobRequest?.view) {
      throw new TypeError("Missing BYOB request although auto-allocating");
    }

    const buf = controller.byobRequest.view;
    const nRead = await read(path, buf);

    if (nRead > 0) {
      controller.byobRequest.respond(nRead);
    } else {
      controller.close();
    }
  },
  async cancel() {
    await close(path);
  },
};

const stream = new ReadableStream(source, { highWaterMark: 1 });

As far as I understand, this should never throw the above type error, because if the stream is used from a default reader, the runtime will allocate the required buffers automatically. We did, however, see that TypeError being thrown when working with the stream. What could the reason be?
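For what it's worth, my reading of the spec matches yours: with type: "bytes" and autoAllocateChunkSize, a pull triggered by a default reader should see an auto-allocated byobRequest. A minimal self-contained check (a sketch, independent of the read(path, buf) code above; assumes a runtime with byte stream support such as Node 18+):

```javascript
function makeAutoAllocatingStream() {
  return new ReadableStream({
    type: 'bytes',
    autoAllocateChunkSize: 16,
    pull(controller) {
      // With a default reader, the runtime auto-allocates the buffer,
      // so byobRequest (and its view) should be present here.
      const view = controller.byobRequest.view;
      view[0] = 42;
      controller.byobRequest.respond(1);
      controller.close();
    },
  });
}

(async () => {
  const { value } = await makeAutoAllocatingStream().getReader().read();
  console.log(value[0]); // 42
})();
```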

Source map warnings

Using this library I'm getting warnings like these in browser console and webpack dev server output. I'm using typescript in my project. Do you know if there's something I could do to prevent these warnings?

./node_modules/web-streams-polyfill/dist/polyfill.min.js
Module Warning (from ./node_modules/source-map-loader/dist/cjs.js):
Failed to parse source map from 'C:\Users\vinca\Documents\Code\proton-drive\node_modules\web-streams-polyfill\src\lib\abort-signal.ts' file: Error: ENOENT: no such file or directory, open 'C:\Users\vinca\Documents\Code\proton-drive\node_modules\web-streams-polyfill\src\lib\abort-signal.ts'

[Question] TransformStream and Response as specified and implemented

Consider this code

transformstream-response-test.js

(async () => {
  try {
    let { readable, writable } = new TransformStream();
    let writer = writable.getWriter();
    // new Response(readable).text().then(console.log);
    let encoder = new TextEncoder();
    console.log(readable, writable, writer);
    for (const data of "abcdef") {
      console.log({ data }); // {"data": "a"}
      await writer.ready;
      await writer.write(encoder.encode(data));
    }
    await writer.close();
    await writer.closed;
    console.log(
      "We never get here... if we comment line 4 and uncomment line 15",
    );
    console.log(await new Response(readable).text());
  } catch (e) {
    console.error(e);
  }
})();

I was expecting console.log(await new Response(readable).text()); to log "abcdef" on Chromium 122.

That's not what happens.

Firefox Nightly 123 gives us a hint

(async () => {
  try {
    let { readable, writable } = new TransformStream();
    let writer = writable.getWriter();
    // new Response(readable).text().then(console.log);…
ReadableStream { locked: false }
 
WritableStream { locked: true }
 
WritableStreamDefaultWriter { closed: Promise { "pending" }, desiredSize: 1, ready: Promise { "fulfilled" } }
[debugger eval code:7:13](chrome://devtools/content/webconsole/debugger%20eval%20code)
Object { data: "a" }
[debugger eval code:9:15](chrome://devtools/content/webconsole/debugger%20eval%20code)
Promise { <state>: "pending" }

node 22 nightly tells us nothing, just exits after logging { data: 'a' }

node --experimental-default-type=module transformstream-response-test.js
ReadableStream { locked: false, state: 'readable', supportsBYOB: false } WritableStream { locked: true, state: 'writable' } WritableStreamDefaultWriter {
  stream: WritableStream { locked: true, state: 'writable' },
  close: Promise { <pending> },
  ready: Promise { undefined },
  desiredSize: 1
}
{ data: 'a' }

bun 1.0.23 exits, too

bun run transformstream-response-test.js
ReadableStream {
  locked: [Getter],
  cancel: [Function: cancel],
  getReader: [Function: getReader],
  pipeTo: [Function: pipeTo],
  pipeThrough: [Function: pipeThrough],
  tee: [Function: tee],
  values: [Function: values],
  [Symbol(Symbol.asyncIterator)]: [Function: lazyAsyncIterator],
} WritableStream {
  locked: true,
  abort: [Function: abort],
  close: [Function: close],
  getWriter: [Function: getWriter],
} WritableStreamDefaultWriter {
  closed: [Getter],
  desiredSize: [Getter],
  ready: [Getter],
  abort: [Function: abort],
  close: [Function: close],
  releaseLock: [Function: releaseLock],
  write: [Function: write],
}
{
  data: "a",
}

so does txiki.js

 tjs run transformstream-response-test.js
{ _state: 'readable',
  _reader: undefined,
  _storedError: undefined,
  _disturbed: false,
  _readableStreamController: 
   { _controlledReadableStream: [Circular],
     _queue: { _cursor: 0, _size: 0, _front: [Object], _back: [Object] },
     _queueTotalSize: 0,
     _started: false,
     _closeRequested: false,
     _pullAgain: false,
     _pulling: false,
     _strategySizeAlgorithm: [Function],
     _strategyHWM: 0,
     _pullAlgorithm: [Function: H],
     _cancelAlgorithm: [Function: G] } } { _state: 'writable',
  _storedError: undefined,
  _writer: 
   { _ownerWritableStream: [Circular],
     _readyPromise_resolve: undefined,
     _readyPromise_reject: undefined,
     _readyPromise: {},
     _readyPromiseState: 'fulfilled',
     _closedPromise_resolve: [Function],
     _closedPromise_reject: [Function],
     _closedPromiseState: 'pending',
     _closedPromise: {} },
  _writableStreamController: 
   { _controlledWritableStream: [Circular],
     _queue: { _cursor: 0, _size: 0, _front: [Object], _back: [Object] },
     _queueTotalSize: 0,
     _abortReason: undefined,
     _abortController: {},
     _started: false,
     _strategySizeAlgorithm: [Function],
     _strategyHWM: 1,
     _writeAlgorithm: [Function: A],
     _closeAlgorithm: [Function: M],
     _abortAlgorithm: [Function: L] },
  _writeRequests: 
   { _cursor: 0,
     _size: 0,
     _front: { _elements: [], _next: undefined },
     _back: { _elements: [], _next: undefined } },
  _inFlightWriteRequest: undefined,
  _closeRequest: undefined,
  _inFlightCloseRequest: undefined,
  _pendingAbortRequest: undefined,
  _backpressure: false } { _ownerWritableStream: 
   { _state: 'writable',
     _storedError: undefined,
     _writer: [Circular],
     _writableStreamController: 
      { _controlledWritableStream: [Circular],
        _queue: [Object],
        _queueTotalSize: 0,
        _abortReason: undefined,
        _abortController: {},
        _started: false,
        _strategySizeAlgorithm: [Function],
        _strategyHWM: 1,
        _writeAlgorithm: [Function: A],
        _closeAlgorithm: [Function: M],
        _abortAlgorithm: [Function: L] },
     _writeRequests: { _cursor: 0, _size: 0, _front: [Object], _back: [Object] },
     _inFlightWriteRequest: undefined,
     _closeRequest: undefined,
     _inFlightCloseRequest: undefined,
     _pendingAbortRequest: undefined,
     _backpressure: false },
  _readyPromise_resolve: undefined,
  _readyPromise_reject: undefined,
  _readyPromise: {},
  _readyPromiseState: 'fulfilled',
  _closedPromise_resolve: [Function],
  _closedPromise_reject: [Function],
  _closedPromiseState: 'pending',
  _closedPromise: {} }
{ data: 'a' }

deno 1.39.4 states explicitly what Firefox Nightly only logs as Promise { <state>: "pending" }, once we comment out the (async () => {})() wrapper (which we only used because the Firefox DevTools console doesn't implement top-level await):

deno run -A transformstream-response-test.js
ReadableStream { locked: false } WritableStream { locked: true } WritableStreamDefaultWriter {
  closed: Promise { <pending> },
  desiredSize: 1,
  ready: Promise { undefined }
}
{ data: "a" }
error: Top-level await promise never resolved
      await writer.write(encoder.encode(data));
      ^
    at <anonymous> (file:///home/user/bin/transformstream-response-test.js:11:7)

Deno points out that await writer.write(encoder.encode(data)); never resolves after the first call to write().

If we comment out line 15 (line 16 with the IIAF; line 15 without the (async () => {})() wrapper) before we use getWriter(), "abcdef" is eventually logged to the console, except for tjs, which still just logs

{ data: 'a' } [object ReadableStream]

which is clearly a bug that I'll file over there.

What's going on here?
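My guess at an explanation (hedged, not authoritative): a default TransformStream gives its writable side a high-water mark of 1 and its readable side 0, so after the first chunk the queue is full and await writer.ready cannot resolve until something consumes the readable side, which is exactly what the commented-out line 4 would do. A sketch that starts the consumer before writing (assumes Response and TransformStream globals, e.g. Node 18+):

```javascript
async function run() {
  const { readable, writable } = new TransformStream();
  const writer = writable.getWriter();
  // Start consuming *before* writing, so backpressure can drain.
  const consumed = new Response(readable).text();
  const encoder = new TextEncoder();
  for (const data of "abcdef") {
    await writer.ready; // resolves because the Response keeps reading
    await writer.write(encoder.encode(data));
  }
  await writer.close();
  return consumed;
}

run().then(console.log); // "abcdef"
```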

`pipeTo` with `preventCancel: true` never settles if `readable` doesn't produce new chunks

Repro

const readable = new ReadableStream();
const abortController = new AbortController();

readable.pipeTo(new WritableStream(), {
    preventCancel: true,
    signal: abortController.signal,
}).then(() => {
    console.log('resolve');
}, () => {
    console.log('rejected');
});

setTimeout(() => {
    console.log('abort');
    abortController.abort();
}, 1000);

setTimeout(() => {
    console.log('exit');
}, 10000);

Expected

Microsoft Edge canary 103.0.1255.0, with native ReadableStream and WritableStream

abort
rejected
exit

Node.js v16.13.1, with stream/web

(node:14660) ExperimentalWarning: stream/web is an experimental feature. This feature could change at any time
(Use `node --trace-warnings ...` to show where the warning was created)
abort
rejected
exit

Actual

web-streams-polyfill: 4.0.0-beta.2

abort
exit

Example how to use with node-fetch?

hi there, does anyone have an example of how to use this polyfill combined with node-fetch?

the idea would be to hook up these libraries through a quick mapping:

import * as fetch from 'node-fetch'
import { ReadableStream } from 'web-streams-polyfill/ponyfill/es2018'

fetch.Body.prototype.getReader = function() {
  return new ReadableStream()
}

unfortunately fetch.Body is not exported but perhaps the node-fetch authors would accept a PR to export it so it could be polyfilled..

or any other suggestions?

dom-exception.ts

I thought: instead of using its own custom DOMException class, how about using this npm package instead?

Running WPT

Hi @MattiasBuelens!

You asked in #testing on IRC:

I'm working on a polyfill for web streams, and I'd like to run WPT against it in multiple browsers. Is there a way to make "wpt run" automatically load an extra JavaScript file at the start of each test, so I can inject my polyfill?

Did you figure out how to do this?

[question/help] tee:ing an async iterator

I'm trying to mimic basically what ReadableStream.prototype.tee does, but using only pure "async generators".

The idea is to be isomorphic across Node and WHATWG streams: not having to care what type of stream it is, or having to import one or the other stream implementation into an environment that doesn't have it. All that matters is that it's async-iterable.

/**
 * @param {ReadableStream | Readable} iterable
 */
function tee (iterable) {
  const source = iterable[Symbol.asyncIterator]()
  let chain = source.next()
  // ...

  return [0, 1].map(async function * () {
    // read from `source|chain`, update the chain
  })
}

I was wondering if you have any ideas/pointers on how to build such a function. I was thinking of using something like a promise chain, or a mutable object pointing to the next value, instead of an array that would hold all values like a sink does, to avoid leakage (and also to make the other iterable more GC-friendly), but I haven't fully solved it.

I found a similar (sync) solution on SO https://stackoverflow.com/a/46416353/1008999,
but it still keeps references to both tee'd destinations (the buffers variable) in the next function, so it's not so GC-friendly...?
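For what it's worth, here is one way the promise-chain idea could look (a sketch with my own naming): each link of a lazily built chain memoizes one source.next() call, so both branches share pulls, and each branch only retains the link it is currently at, letting already-consumed values become collectable:

```javascript
function teeAsyncIterable(iterable) {
  const source = iterable[Symbol.asyncIterator]();
  // makeLink returns a memoized thunk: the source is pulled at most
  // once per position, and only when some branch actually asks.
  function makeLink() {
    let promise = null;
    return () => {
      if (!promise) {
        promise = source.next().then(({ value, done }) => ({
          value,
          done,
          next: done ? null : makeLink(),
        }));
      }
      return promise;
    };
  }
  let start = makeLink();
  const branches = [0, 1].map(() => {
    let link = start; // each branch advances its own pointer
    return (async function* () {
      while (link) {
        const { value, done, next } = await link();
        if (done) return;
        yield value;
        link = next;
      }
    })();
  });
  start = null; // drop the shared head so consumed links can be collected
  return branches;
}
```

If one branch reads far ahead, the slower branch's pointer still keeps the intermediate links (and their values) alive, which is the same buffering trade-off ReadableStream.prototype.tee has.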

Building with rollup ends with circular dependency warnings

Hi,
On building this package with rollup, I got the warnings below:

+ rollup -c

src/polyfill.ts → dist/polyfill.js, dist/polyfill.mjs...
(!) Circular dependencies
src/lib/readable-stream.ts -> src/lib/readable-stream/async-iterator.ts -> src/lib/readable-stream/default-reader.ts -> src/lib/readable-stream/generic-reader.ts -> src/lib/readable-stream.ts
src/lib/readable-stream.ts -> src/lib/readable-stream/async-iterator.ts -> src/lib/readable-stream/default-reader.ts -> src/lib/readable-stream.ts
src/lib/readable-stream.ts -> src/lib/readable-stream/async-iterator.ts -> src/lib/readable-stream/default-reader.ts -> src/lib/validators/readable-stream.ts -> src/lib/readable-stream.ts
...and 7 more
created dist/polyfill.js, dist/polyfill.mjs in 9.9s

The same circular-dependency warnings are then repeated verbatim for every remaining build target:

created dist/polyfill.min.js in 11.1s
created dist/polyfill.es6.js, dist/polyfill.es6.mjs in 7.6s
created dist/polyfill.es6.min.js in 10.6s
created dist/polyfill.es2018.js, dist/polyfill.es2018.mjs in 5.3s
created dist/polyfill.es2018.min.js in 6.9s
created dist/ponyfill.js, dist/ponyfill.mjs in 4.2s
created dist/ponyfill.es6.js, dist/ponyfill.es6.mjs in 3.8s
created dist/ponyfill.es2018.js, dist/ponyfill.es2018.mjs in 3.5s

Any fixes?

The stream is not in a state that permits enqueue; new Response(readable).blob().text() resolves to [object ReadableStream], not the underlying source

Consider this code:

Promise.allSettled([
  new Promise(async (resolve, reject) => {
    let controller;
    const rs = new ReadableStream(
      {
        start(_) {
          return (controller = _);
        },
      },
      { highWaterMark: 1 }
    );
    for (let chunk of Array.from({ length: 100 }, (_, i) => i)) {
      try {
        controller.enqueue(new Uint8Array([chunk]));
      } catch (err) {
        console.warn(
          `new Response().arrayBuffer() outside event handler: ${err.message}`
        );
        reject(err);
      }
    }
    controller.close();
    const buffer = await new Response(rs).arrayBuffer();
    resolve(buffer);
  }),
  new Promise(async (resolve, reject) => {
    let controller;
    const rs = new ReadableStream(
      {
        start(_) {
          return (controller = _);
        },
      },
      { highWaterMark: 1 }
    );
    const ac = new AudioContext();
    if (ac.state === 'suspended') {
      await ac.resume();
    }
    const msd = new MediaStreamAudioDestinationNode(ac);
    const osc = new OscillatorNode(ac);
    osc.connect(msd);
    osc.onended = () => {
      recorder.requestData();
      recorder.stop();
    };

    const recorder = new MediaRecorder(msd.stream, {
      audioBitrateMode: 'constant',
    });
    recorder.onstop = async (e) => {
      // console.log(e.type);
      const buffer = await new Response(rs).arrayBuffer();
      resolve(buffer);
    };
    recorder.ondataavailable = async (e) => {
      // console.log(e.type, e.data.size);
      if (e.data.size > 0) {
        try {
          controller.enqueue(new Uint8Array(await e.data.arrayBuffer()));
        } catch (err) {
          console.warn(`Response.arrayBuffer(): ${err.message}`);
          reject(err);
        }
      } else {
        controller.close();
        await ac.close();
      }
    };
    osc.start(ac.currentTime);
    recorder.start(1);
    osc.stop(ac.currentTime + 1.15);
  }),
  new Promise(async (resolve, reject) => {
    let controller;
    const rs = new ReadableStream(
      {
        start(_) {
          return (controller = _);
        },
      },
      { highWaterMark: 1 }
    );
    const ac = new AudioContext();
    if (ac.state === 'suspended') {
      await ac.resume();
    }
    const msd = new MediaStreamAudioDestinationNode(ac);
    const osc = new OscillatorNode(ac);
    osc.connect(msd);
    osc.onended = () => {
      recorder.requestData();
      recorder.stop();
    };

    const recorder = new MediaRecorder(msd.stream, {
      audioBitrateMode: 'constant',
    });
    recorder.onstop = async (e) => {
      // console.log(e.type);
      const buffer = await new Response(rs).blob();
      resolve(buffer);
    };
    recorder.ondataavailable = async (e) => {
      // console.log(e.type, e.data.size);
      if (e.data.size > 0) {
        try {
          controller.enqueue(new Uint8Array(await e.data.arrayBuffer()));
        } catch (err) {
          console.warn(`Response.blob(): ${err.message}`);
          reject(err);
        }
      } else {
        controller.close();
        await ac.close();
      }
    };
    osc.start(ac.currentTime);
    recorder.start(1);
    osc.stop(ac.currentTime + 1.15);
  }),
])
  .then(async (data) => {
    console.table(data);
    console.log(await data[2].value.text());
  })
  .catch((err) => console.error(err));

Using the Streams API shipped with Chromium 96.0.4651.0 (Developer Build) (64-bit),
Revision aabda36a688c0883019e7762faa00db6342a7e37-refs/heads/main@{#924134}, this error is thrown:

TypeError: Failed to execute 'enqueue' on 'ReadableStreamDefaultController': Cannot enqueue a chunk into a readable stream that is closed or has been requested to be closed

This polyfill, included as

  <script src="https://unpkg.com/web-streams-polyfill/dist/polyfill.min.js"></script>

at https://plnkr.co/edit/XwtpbXt4aqTQTIhb?preview throws

The stream is not in a state that permits enqueue

Observe that the first element of the array passed to Promise.allSettled() uses a basic loop in which ReadableStreamDefaultController.enqueue() is called with a Uint8Array as the value, then close() is called, then the ReadableStream is passed to new Response() and read to completion using arrayBuffer().

The second element of the array follows the same pattern as the first, except that close() is called in the MediaRecorder dataavailable event handler, and new Response(readable).arrayBuffer() is called in the onstop event handler. The TypeError is still thrown and logged at the console, yet not caught in try..catch.

The third element of the array follows the same pattern as the first and second, though it uses Response.blob(), which always throws the TypeError.

I do not see anywhere in the code where enqueue() is called after close().

Using the polyfill, I did observe the third element fulfill rather than reject, while the very next run of the same blob() code threw an error.

The Streams API shipped with Chromium always throws https://bugs.chromium.org/p/chromium/issues/detail?id=1253143.

With the polyfill, new Response(readable).blob() does not resolve to the underlying enqueued data; rather, it appears to resolve to the string

[object ReadableStream]

There should not be any error thrown. The sequence is close(), then new Response(readable) with arrayBuffer() or blob() chained.

This behaviour of both Chromium Streams API and this polyfill makes no sense to me. (This cannot be working as intended.) Kindly illuminate.

"TypeError: Failed to execute 'pipeThrough' on 'ReadableStream'"

Hello.

I've not been able to use the polyfill due to encountering this error as soon as pipeThrough() is executed on the response body of a fetch().

TypeError: Failed to execute 'pipeThrough' on 'ReadableStream': Failed to read the 'readable' property from 'ReadableWritablePair': Failed to convert value to 'ReadableStream'

I reduced my script down to this reproduction case. As far as I can see it really is the point where .pipeThrough() is called that the exception is thrown.

<body>
  <div id="root"></div>
  <!-- Remove this include of polyfill.min.js to see native behaviour. N.b. of Sep 2021 TransformStream is supported in Chrome, but not in Firefox. -->
  <script src="https://unpkg.com/web-streams-polyfill/dist/polyfill.min.js"></script>
  <script>
    function DummyTransformStream() {
      return new TransformStream({
        transform(chunk, controller) {
          controller.enqueue(chunk); //just re-enqueue the chunk as it is
        },
      });
    }

    const url = 'https://jsonplaceholder.typicode.com/photos';

    fetch(url)
      .then(async res => {
        if (res.status !== 200) {
          console.log(`http request to ${url} failed, returned HTTP status ${res.status}. Please try a different URL.`);
        }

        const testReader = res.body
          .pipeThrough(DummyTransformStream())
          .getReader();

        while (true) {
          let { value: val, done: rdrDone } = await testReader.read();
          if (rdrDone) {
            break;
          }
          //console.log("Chunk of " + val.length + " bytes read");
          let msgDiv = document.createElement('div');
          msgDiv.append("Chunk of " + val.length + " bytes read");
          document.getElementById('root').append(msgDiv);
        }
      });
  </script>
</body>

Is there something about the syntax that I've made a mistake with, etc.?
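In case it helps while you debug: this kind of message is typically what brand checks produce when a native stream (here res.body from the native fetch) meets a polyfilled TransformStream, or vice versa. A sketch of a workaround (the helper name is mine): re-wrap the body so both sides of the pipe come from the same implementation:

```javascript
// Hypothetical helper: copy chunks from a "foreign" ReadableStream into
// one constructed by the implementation you intend to pipe with.
function rewrapReadable(foreignReadable) {
  const reader = foreignReadable.getReader();
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await reader.read();
      if (done) controller.close();
      else controller.enqueue(value);
    },
    cancel(reason) {
      return reader.cancel(reason);
    },
  });
}
```

With the polyfill loaded, rewrapReadable(res.body).pipeThrough(DummyTransformStream()) should then avoid the conversion error, at the cost of one extra hop per chunk. This is only a guess at the cause, though.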

Republish as web-streams-polyfill

This has been a long time coming (see #2), but I think this fork is finally mature enough to be re-published as web-streams-polyfill on npm.

I will release this as a major version bump for web-streams-polyfill: version 2.0.0. There shouldn't be any breaking API changes for using the polyfill, since the streams API hasn't had any breaking changes either. However, I did change some fields in package.json and moved a few files around, which hypothetically some 1.x users might depend on (either directly or due to their choice of module loader or bundler). To avoid users having to deal with such potential problems on a regular npm update, I'll just do a major bump instead. Better safe than sorry! 😄

Plan:

  • Change package name to web-streams-polyfill, update changelog and documentation.
  • Publish as web-streams-polyfill@2.0.0 under the next tag.
  • Deprecate all versions of @mattiasbuelens/web-streams-polyfill, notifying users to switch to web-streams-polyfill@2.0.0
  • Wait until the (inevitable) bug reports about regressions start flowing in, and fix them... 😅
  • Publish as web-streams-polyfill@2.0.0 under the latest tag.

TS2540: Cannot assign to 'WritableStream' because it is a read-only property

We want to use the web-streams-polyfill/ponyfill in our React application with TypeScript, as follows:

import * as streamSaver from "streamsaver";
import { WritableStream, TransformStream } from "web-streams-polyfill/ponyfill";

function handleResponse(res: Response) {
   if (!streamSaver.WritableStream) {
        // Link pony fill
        streamSaver.WritableStream = WritableStream;
        streamSaver.TransformStream = TransformStream;
   }

   // Etcetera
}

But this yields a TS2540 error. streamSaver.WritableStream and streamSaver.TransformStream are read-only.

Are we doing something wrong? In this case (React & TypeScript), what is the recommended way to apply the ponyfill?

  • Typescript version: 3.5.3
  • React version: 16.8

Can't load the plugin with typescript 3.2.2

When I try to load the polyfill into an Angular 7 project I get an error:
ERROR in node_modules/web-streams-polyfill/dist/types/polyfill.d.ts(1,21): error TS2727: Cannot find lib definition for 'es2018.asynciterable'. Did you mean 'esnext.asynciterable'?

I did some investigation and it seems the cause is the TypeScript version I have: 3.2.2.

NOTE: it works as expected with TypeScript 3.5.3, which I have in an Angular 8 app.

I can't upgrade the TypeScript version since this one is required by Angular 7. The only way to make your polyfill work would be to upgrade to Angular 8.

It seems that in typescript 3.2 asynciterable is under esnext.asynciterable instead of es2018.asynciterable.

Please advise. Thank you in advance.

Queue size performance issue

Hi,

thanks for maintaining this repo, great work!

Are you aware of the performance issues mentioned here:

I created a CodePen using the code from the bug report:

I'd like to use this polyfill in production, but the performance drop, even if unlikely to happen, is a very bad experience for the user.

Do you think it's possible to include a fix (e.g. something like stardazed/sd-streams@1128f0d) into this repository? Or is this not possible because it would be non-standard?

Looking forward to hearing your opinion on this.

Unhandled rejections in pipe chains

Erroring a pipe chain causes unhandled rejections. I believe I've handled all externally observable rejections in the example below but it still raises one.

const { TransformStream } = require('web-streams-polyfill')

const stream1 = new TransformStream()
const stream2 = new TransformStream()
stream1.readable.pipeTo(stream2.writable)
const reader = stream2.readable.getReader()
reader.closed.catch(() => {})
reader.read().catch(() => {})
stream1.writable.abort().catch(() => {})

Output:

(node:20114) UnhandledPromiseRejectionWarning: undefined
(node:20114) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 3)
(node:20114) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

Is that the expected behavior? Is there a recommended way to get errors to propagate through a chain so you can handle them at the end? I have a long chain, and with multiple instances running I end up with something like 100 of these internal rejections.
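One likely source of the extra rejection is the promise returned by pipeTo() itself, which rejects when the chain errors and is dropped on the floor in the snippet above. Attaching a handler to it silences the warning. A sketch of the same repro with that handler added (using the native stream globals available in Node 18+; with the polyfill, require it as in the original snippet):

```javascript
const stream1 = new TransformStream()
const stream2 = new TransformStream()

// pipeTo() returns a promise that rejects when the chain errors;
// handling that promise is what prevents the unhandled rejection warning.
stream1.readable.pipeTo(stream2.writable).catch(() => {})

const reader = stream2.readable.getReader()
reader.closed.catch(() => {})
reader.read().catch(() => {})
stream1.writable.abort()
  .catch(() => {})
  .then(() => console.log('done, no unhandled rejections'))
```

Whether this accounts for every internal rejection in the polyfill is uncertain; it only covers the externally observable one from pipeTo().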

Q: how best to upgrade to a byte stream?

Some web APIs are supposed to return a byte stream, but don't.

For instance Response.body or blob.stream():
in some environments they do, but in others they don't.

Say I would like to normalize this to always return a byte stream; how could I best accomplish that?

// Here is just a dummy simple readable stream that yields Uint8Array's
// to get started...
var rs = new ReadableStream({
  start(ctrl) {
    var i = 10
    while(i--) ctrl.enqueue(new Uint8Array([97, 98, 99])) // abc
    ctrl.close()
  }
})

function toByteStream (rs) {
  try {
    // Check if it's already a byte stream; if so, return the original stream.
    const reader = rs.getReader({ mode: 'byob' })
    reader.releaseLock()
    return rs
  } catch (err) {
    // If not a byte stream, convert it.
    const reader = rs.getReader()

    return new ReadableStream({
      type: 'bytes',
      async pull(ctrl) {
        const { done, value } = await reader.read()
        if (done) {
          // Close, then resolve any outstanding BYOB request with 0 bytes
          ctrl.close()
          if (ctrl.byobRequest) ctrl.byobRequest.respond(0)
          return
        }

        if (ctrl.byobRequest) {
          const view = /** @type {Uint8Array} */ (ctrl.byobRequest.view)
          if (view.byteLength >= value.length) {
            view.set(value)
            ctrl.byobRequest.respond(value.length)
          } else {
            // If the BYOB request's buffer is too small, split the chunk
            const firstPart = value.subarray(0, view.byteLength)
            view.set(firstPart)
            ctrl.byobRequest.respond(firstPart.length)
  
            // Enqueue the remaining part
            const remainingPart = value.subarray(view.byteLength)
            ctrl.enqueue(remainingPart)
          }
        } else {
          ctrl.enqueue(value)
        }
      }
    })
  }
}

toByteStream(rs).getReader({ mode: 'byob' })
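As a sanity check, a stream created with type: 'bytes' should hand out a BYOB reader without throwing, which is what the try block above relies on. A minimal self-contained example (assuming Node 18+ or browser globals):

```javascript
const byteStream = new ReadableStream({
  type: 'bytes',
  start (ctrl) {
    ctrl.enqueue(new Uint8Array([97, 98, 99])) // "abc"
    ctrl.close()
  }
})

// A BYOB read copies the enqueued bytes into the caller-supplied buffer.
const reader = byteStream.getReader({ mode: 'byob' })
reader.read(new Uint8Array(3)).then(({ value }) => {
  console.log(new TextDecoder().decode(value)) // "abc"
})
```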

You are so good with the Web Streams specification that I'd rather come to you with questions than post them on Stack Overflow.
You should consider opening up Discussions, where we can post questions and answers :)

Unhandled promise rejection with TypeError when using async iterator

Both repros below error out the stream after some time.

In test 1, reader.read() throws as expected.

In test 2, the output is as follows:

Possible Unhandled Promise Rejection (id: 0):
TypeError: reader._closedPromise_reject is not a function. (In 'reader._closedPromise_reject(reason)', 'reader._closedPromise_reject' is undefined)

If I wrap controller.error() in a try/catch block, the error above is caught, and the try/catch around for await...of behaves exactly the same as in test 1.

Is this the intended behavior, or am I doing something wrong? 🤔

NOTE: I'm using the polyfill in React Native.

Test case 1

const readableStreamTest1 = async () => {
  const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
  let controller;

  const rs = new ReadableStream({
    async pull(c) {
      controller = c;
      await delay(250);
      c.enqueue('readable');
      await delay(250);
      c.enqueue('stream');
      await delay(250);
      c.enqueue('polyfill');
      c.close();
    },
  });

  const reader = rs.getReader();

  const read = () => {
    return reader
      .read()
      .then(({done, value}) => {
        if (done) {
          console.log('readableStreamTest1 done');
          return;
        }

        console.log('readableStreamTest1 read', {value});

        return read();
      })
      .catch((error) => console.error('readableStreamTest1 read', {error}));
  };

  read();
  await delay(500);
  controller.error(new Error('error'));
};

Test case 2

const readableStreamTest2 = async () => {
  const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
  let controller;

  const rs = new ReadableStream({
    async pull(c) {
      controller = c;
      await delay(250);
      c.enqueue('readable');
      await delay(250);
      c.enqueue('stream');
      await delay(250);
      c.enqueue('polyfill');
      c.close();
    },
  });

  const read = async () => {
    try {
      for await (const chunk of rs) {
        console.log('readableStreamTest2 read', {chunk});
      }
    } catch (error) {
      console.error('readableStreamTest2 read', {error});
    }
  };

  read();
  await delay(500);
  try {
    controller.error(new Error('error'));
  } catch (error) {
    console.error('readableStreamTest2 controller.error', {error});
  }
};

ERR_UNSUPPORTED_DIR_IMPORT

Reproduce

$ yarn add web-streams-polyfill
// test.mjs
import { ReadableStream as PonyReadableStream } from  'web-streams-polyfill/ponyfill'
console.log(PonyReadableStream)

Expect

No error

Actual

$ node test.mjs
internal/process/esm_loader.js:74
    internalBinding('errors').triggerUncaughtException(
                              ^

Error [ERR_UNSUPPORTED_DIR_IMPORT]: Directory import 'XXX' is not supported resolving ES modules imported from XXX/test.mjs
Did you mean to import web-streams-polyfill/dist/ponyfill.js
  code: 'ERR_UNSUPPORTED_DIR_IMPORT',
  url: 'file:///XXX/web-streams-polyfill/ponyfill'
}

Workaround

import { ReadableStream as PonyReadableStream } from  'web-streams-polyfill/dist/ponyfill.js'
console.log(PonyReadableStream)
$ node test.mjs 
[Function: ReadableStream]
