bvaughn / suspense

Utilities for working with React Suspense

Home Page: https://suspense.vercel.app/

License: MIT License

Languages: CSS 5.25%, HTML 0.21%, TypeScript 94.23%, JavaScript 0.31%
Topics: async caching data fetching loading react suspense

suspense's Introduction

suspense

APIs to simplify data loading and caching. Primarily intended for use with React Suspense.

⚠️ Considerations

  1. Suspense is an experimental, pre-release feature; these APIs will change along with React.
  2. This package depends on react@experimental and react-dom@experimental versions.

Example

import { createCache } from "suspense";

const userProfileCache = createCache({
  load: async ([userId]) => {
    const response = await fetch(`/api/user?id=${userId}`);
    return await response.json();
  },
});

function UserProfile({ userId }) {
  const userProfile = userProfileCache.read(userId);

  // ...
}
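Because read suspends while the value loads, the component is typically rendered inside a Suspense boundary. A minimal sketch, assuming the UserProfile component above (the wrapper component name and fallback are illustrative):

import { Suspense } from "react";

function UserProfilePage({ userId }: { userId: string }) {
  // While userProfileCache.read() suspends, React renders the fallback.
  return (
    <Suspense fallback={<p>Loading profile…</p>}>
      <UserProfile userId={userId} />
    </Suspense>
  );
}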

More examples at suspense.vercel.app.

If you like this project, buy me a coffee.

suspense's People

Contributors

andarist, bvaughn, cevr, hbenl, monster898, tometo-dev


suspense's Issues

Automatic cache eviction to prevent memory leaks

Just reading through the source and docs, am I right in saying that by default the cache will grow essentially forever? And it's up to the user to call evict(key) to remedy this?

It's not clear to me when I would actually call evict, especially since this won't currently trigger a re-render and re-suspension if the value is still in use.

getCacheForType is not a function

Hey there. Interesting project, thanks for putting it together. I installed v0.0.50, ran the code below, and received the following error.

import { createCache } from 'suspense';

const cache = createCache({
  load: () => new Promise<string>((resolve) => setTimeout(() => resolve('done'), 2000))
});
TypeError: getCacheForType is not a function
    at getOrCreateRecord (node_modules/suspense/dist/suspense.cjs.js:317:38)
    at Object.read (node_modules/suspense/dist/suspense.cjs.js:405:20)

Interval cache

Many data-loading scenarios can be handled by the createCache API, but just as there is a special set of cases better handled by createStreamingCache, there are scenarios that would also benefit from a ranged (interval) cache.

For example, @replayio uses suspense to incrementally fetch data for a session (things like console logs and mouse/keyboard events). For larger sessions this data can't be fetched eagerly in its entirety, so @replayio fetches it incrementally as the user interacts with the session and merges the results to avoid re-fetching the same range.

We use a specific type of cache optimized for this use case. This issue tracks the work of generalizing that cache.
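To make the idea concrete, here is a minimal sketch of the range-merging bookkeeping such a cache would need; the IntervalTracker name and the numeric [start, end] tuples are assumptions for illustration, not the eventual API:

type Interval = [start: number, end: number];

// Tracks which numeric ranges have already been loaded, so that a request
// for an overlapping range only fetches the missing gaps.
class IntervalTracker {
  private loaded: Interval[] = []; // kept sorted by start, non-overlapping

  // Returns the sub-ranges of [start, end] that have not been loaded yet.
  missing(start: number, end: number): Interval[] {
    const gaps: Interval[] = [];
    let cursor = start;
    for (const [s, e] of this.loaded) {
      if (e < cursor || s > end) continue;
      if (s > cursor) gaps.push([cursor, Math.min(s, end)]);
      cursor = Math.max(cursor, e);
      if (cursor >= end) break;
    }
    if (cursor < end) gaps.push([cursor, end]);
    return gaps;
  }

  // Records a newly loaded range, merging it with any overlapping intervals.
  add(start: number, end: number): void {
    let [mergedStart, mergedEnd] = [start, end];
    const rest: Interval[] = [];
    for (const [s, e] of this.loaded) {
      if (e < mergedStart || s > mergedEnd) {
        rest.push([s, e]);
      } else {
        mergedStart = Math.min(mergedStart, s);
        mergedEnd = Math.max(mergedEnd, e);
      }
    }
    rest.push([mergedStart, mergedEnd]);
    rest.sort((a, b) => a[0] - b[0]);
    this.loaded = rest;
  }
}

A caller would ask missing() which sub-ranges still need fetching, fetch only those, and then add() them so future requests for overlapping ranges skip that data.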

Add tests for useImperativeCacheValue and useMutation

It's possible that these two don't properly work together.

Edit: Based on some initial testing, they mostly work together, except perhaps for the pending render during an async mutation, which returns no value (a significant difference from the Suspense-based API). Need to think that through a bit.

Guard against non-unique cache keys

The value returned by the getLabel method must be unique. If an object is accidentally passed, the resulting string may not be unique, and this is easy to do without realizing it.

Chromium, Gecko, and WebKit browsers all cast objects to the string "[object Object]". This library should be able to check for that string (in development mode) and log an error/warning.
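A minimal sketch of what that development-mode check could look like (the function name is illustrative):

// Warns (in development only) when a cache key contains the default string
// form of an object, which would make unrelated keys collide.
function warnIfKeyLooksLikeAnObject(key: string): void {
  if (process.env.NODE_ENV !== "production" && key.includes("[object Object]")) {
    console.warn(
      `Cache key "${key}" contains "[object Object]"; ` +
        "an object was probably passed where a unique string was expected."
    );
  }
}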

Discussions for suspense

It'd be nice if we could discuss suspense on GitHub, with ideas, feature requests, or usage questions.

Would you please open the Discussions section of this repo?

Awesome library, thanks. 👍

Cheers

Mutation

Currently thinking about supporting mutation with a hook like:

export function useCacheMutation<Params extends Array<any>, Value>(
  cache: Cache<Params, Value>,
  ...params: Params
): [isPending: boolean, mutate: (callback: MutationCallback) => Promise<void>] {
  // ...
}

Performing a mutation might look something like this then:

const [isPending, mutate] = useCacheMutation(cache, apiClient);

const save = (...params) => {
  mutate(async () => {
    await apiClient.save(...params);
  });
};

I think that, in order for this API to correctly schedule updates with React after a cache mutation, this package would need to move its dependencies to the experimental release channel (for unstable_getCacheForType and unstable_useCacheRefresh). Unfortunately, I think that would come with the significant downside of breaking the imperative API (e.g. getValue, fetchAsync), since those are called outside of React's render cycle.

On the other hand, we could move to a context-based API and store the cache maps in state for invalidation, but that would require moving the fetch calls into hooks as well (so components could subscribe/unsubscribe from updates). I'm not sure which of these is really a good path forward.

I think it might be possible for createCache to create its own record Map (in scope) that both the suspense and imperative methods could use, while still calling getRecordMap from inside fetchSuspense (to let React know that the component is "subscribed" to the cache). This feels a bit hacky, but it may avoid both of the above downsides.


I might need to walk back the previous thoughts re: using getCacheForType only for subscriptions. I think that won't work well enough.

Maybe there's a way that I could support React updates/transitions without breaking the imperative API, by storing data in two places:

  1. A static map, created once when the cache is configured, that holds values as they are loaded or cached via the imperative cache method.
  2. A React-managed map (via getCacheForType) that holds pending requests for missing values.

This is kind of how the React DevTools suspense cache works.

  • inspectedElementCache uses getCacheForType to get a reference to its record map (and useCacheRefresh to clear it when a record gets invalidated). It fetches data from a second cache (below).
  • inspectElementMutableSource is the second cache. It actually loads data from the React DevTools backend and stores loaded values in an LRU. When the backend reports that a value has changed, the LRU gets the newest value.

inspectedElementCache fetches the initial data on render, using Suspense. Then it polls inspectElementMutableSource for updates (using an interval, in an effect). If an update comes in, it schedules an update with React (using useCacheRefresh) and pre-seeds the new cache with the updated value.

  1. Pre-seeding the cache prevents the component from visibly suspending again (and is kind of analogous to what the proposed useCacheMutation hook would enable by passing a cache callback to the mutate method).
  2. The transition update causes a re-render, which causes the component to pull in the new cache data.

A tricky thing about the way cache invalidation works in React (useCacheRefresh), which doesn't apply to React DevTools but would apply here, is that you can't evict a single cache; calling useCacheRefresh clears all caches. However, that's where I think the second, static map could come in handy: unless a value was explicitly evicted, re-rendering could just read from that map directly and avoid suspending.
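A rough sketch of that two-layer idea, assuming the experimental unstable_getCacheForType API; the names and shapes here are illustrative, not how the package is implemented:

import { unstable_getCacheForType as getCacheForType } from "react";

// Layer 1: a static map created once per cache. It holds resolved values and
// can also be read from imperative (non-render) code paths.
const resolvedValues = new Map<string, unknown>();

// Layer 2: a React-managed map of in-flight requests. useCacheRefresh clears
// it, which is what schedules the re-render after a mutation or eviction.
function createPendingMap(): Map<string, Promise<unknown>> {
  return new Map();
}

function readForSuspense<Value>(key: string, load: () => Promise<Value>): Value {
  // Unless the value was explicitly evicted, render reads it directly here
  // and never suspends.
  if (resolvedValues.has(key)) {
    return resolvedValues.get(key) as Value;
  }
  const pending = getCacheForType(createPendingMap);
  let promise = pending.get(key) as Promise<Value> | undefined;
  if (promise === undefined) {
    promise = load().then((value) => {
      resolvedValues.set(key, value);
      return value;
    });
    pending.set(key, promise);
  }
  throw promise; // suspend until the value resolves
}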

cache.prefetch() throws

If you call cache.prefetch() and an error has already been cached for that request, it will throw (because cache.readAsync() then throws instead of returning a rejected Promise, and cache.prefetch() doesn't handle that).
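A minimal sketch of a guard for this, assuming readAsync can throw synchronously when an error record is already cached (the safePrefetch name is illustrative):

import { Cache } from "suspense";

// Prefetches a value but ignores a previously cached error instead of throwing.
function safePrefetch<Params extends any[], Value>(
  cache: Cache<Params, Value>,
  ...params: Params
): void {
  try {
    // readAsync throws (rather than returning a rejected Promise) when an
    // error has already been cached for these params.
    Promise.resolve(cache.readAsync(...params)).catch(() => {
      // Swallow rejections; prefetching is fire-and-forget.
    });
  } catch {
    // An error was already cached; there is nothing left to prefetch.
  }
}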

Feedback

Sorry if this isn't the best place to put this; feel free to move it to a discussion!

Here's a little app I made for demonstrative purposes: https://stackblitz.com/edit/vitejs-vite-v6umf3?file=src/App.tsx

After using this library for a couple days, I've gathered some feedback:

  • Possible to add an invalidate function to the cache?
    The implementation I use is:
function invalidate<TParams extends any[], TValue>(
  cache: Cache<TParams, TValue>,
  ...args: TParams
) {
  cache.evict(...args);
  cache.prefetch(...args);
}
  • Possible to notify subscribers when an eviction happens? Currently no notification is given, unless it's an evictAll, I believe.
  • Similarly, evicting and forcing an update afterwards does not cause the suspense fallback to show. Is that intended? You can see it in the app above by going to the posts and clicking "evict current page".
  • The current eviction policy seems unpredictable and aggressive, sometimes causing surprising evictions. In the app above, if you go to the posts, you'll see that pressing the next button sometimes causes a complete cache blowout, forcing a suspense fallback. You'll also notice it in the users demo (loading a new user will sometimes show that all other previously loaded users have been evicted).

I was thinking a possible strategy is to allow the user to provide their own implementation of the internal cache, leaving the eviction policy to them. I would personally want to use an LRU cache, but others might prefer another type of cache. Providing that inversion of control could satisfy all use cases. What do you think?
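A hypothetical sketch of the "bring your own store" contract this could take; none of these names exist in the current API:

// Hypothetical contract: createCache would accept any Map-like store for its
// records, leaving the eviction policy (LRU, TTL, unbounded, ...) to the caller.
interface RecordStore<Key, Value> {
  get(key: Key): Value | undefined;
  set(key: Key, value: Value): void;
  delete(key: Key): void;
  clear(): void;
}

// The default could remain an unbounded Map...
const defaultStore: RecordStore<string, unknown> = new Map();

// ...while a caller wanting bounded memory could plug in an LRU implementation
// (for example, one backed by the lru-cache package) satisfying the same interface.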

  • I commonly want to know if the current cache item is revalidating (without showing a suspense fallback). What does an API like this sound like for read:
const [value, revalidating] = Cache.read()

My current implementation for this uses this helper function:

function useRead<TParams extends any[], TValue>(
  cache: Cache<TParams, TValue>,
  ...args: TParams
) {
  const value = cache.read(...args);
  const status = useCacheStatus(cache, ...args);
  return [value, status === 'pending'] as const;
}
  • Provide built-in hooks? This one is not really a big deal, and philosophically I can understand it not being part of the cache. But something like useCacheStatus(cache, ...) becoming Cache.useStatus(...), etc.
  • The current getValueIfCached throws an error if the record was rejected, which makes it cumbersome in some cases. Thoughts on providing a cache.peek function that returns the raw record or undefined?
  • In my implementation I paired it with a function called cache.evictIf(predicate, ...args), which was useful in error boundaries to evict a rejected record before resetting the boundary (sketches of both helpers follow below).
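Minimal sketches of those two helpers, built on the existing getValueIfCached and evict methods; peek and evictIf are the suggestions above, not part of the package:

import { Cache } from "suspense";

// peek: like getValueIfCached, but returns undefined instead of throwing when
// the cached record is a rejection.
function peek<TParams extends any[], TValue>(
  cache: Cache<TParams, TValue>,
  ...args: TParams
): TValue | undefined {
  try {
    return cache.getValueIfCached(...args);
  } catch {
    return undefined; // an error was cached for these params
  }
}

// evictIf: evict only when a predicate over the cached value (or the cached
// error) returns true; handy in an error boundary's reset handler.
function evictIf<TParams extends any[], TValue>(
  cache: Cache<TParams, TValue>,
  predicate: (valueOrError: unknown, didThrow: boolean) => boolean,
  ...args: TParams
): void {
  try {
    const value = cache.getValueIfCached(...args);
    if (predicate(value, false)) cache.evict(...args);
  } catch (error) {
    if (predicate(error, true)) cache.evict(...args);
  }
}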

Optional params can break caches

Consider the following cache:

import { Cache, createCache } from "suspense";

export const objectCache: Cache<[foo: string, bar?: boolean], Object> =
  createCache({
    load: async (foo, bar = false) => {
      // ...
    },
  });

This code assumes the array of ...params will always contain both parameters:

const valueOrPromiseLike = load(...params, abortController);

If not, the actual load function will end up being called with the AbortController in place of the boolean bar parameter.
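Concretely, if the caller omits the optional bar, the spread collapses as follows (readAsync is just an example entry point):

// Caller omits the optional `bar` param:
objectCache.readAsync("some-key");

// Internally the cache then calls:
//   load(...["some-key"], abortController)
// which is equivalent to:
//   load("some-key", abortController)
// so `bar` is bound to the AbortController instead of defaulting to false.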

The API should be changed such that params are not spread:

const valueOrPromiseLike = load(params, abortController);

Alternately "normal" caches could be changed to mirror streaming caches and pass the additional parameters first:

const valueOrPromiseLike = load(abortController, ...params);
