microsoft / fluidframework

Library for building distributed, real-time collaborative web applications

Home Page: https://fluidframework.com

License: MIT License

Shell 0.03% Dockerfile 0.05% TypeScript 83.74% JavaScript 15.04% HTML 1.11% Mustache 0.01% PowerShell 0.02% Batchfile 0.01% CSS 0.01%
fluid fluid-framework microsoft distributed datastructure realtime collaboration crdt

fluidframework's Introduction

Fluid

The Fluid Framework is a library for building distributed, real-time collaborative web applications using JavaScript or TypeScript.

Getting started using the Fluid Framework

You may be here because you want to...

  • Learn more about the Fluid Framework
  • Build a Fluid object

Documentation and guides can be found at https://fluidframework.com/.

Hello World repo can be found at https://github.com/microsoft/FluidHelloWorld.

Core Examples repo can be found at https://github.com/microsoft/FluidExamples.

Have questions? Engage with other Fluid Framework users and developers in the Discussions section of our GitHub repo.

Using Fluid Framework libraries

When taking a dependency on a Fluid Framework library, we recommend using a ^ (caret) version range, such as ^1.3.4. While Fluid Framework libraries may use different ranges for their interdependencies on other Fluid Framework libraries, library consumers should always prefer ^.
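For example, a consuming app's package.json might look like this (an illustrative sketch; the package names and versions are placeholders, not a prescribed set):

"dependencies": {
  "fluid-framework": "^1.3.4",
  "@fluidframework/tinylicious-client": "^1.3.4"
}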

Code structure

The core code for both the Fluid client packages and the reference ordering service is contained within this repo.

The repo structure is somewhat unique because it contains several pnpm workspaces: some for individual packages and some for larger collections which we call "release groups". The workspaces are versioned separately from one another, but internally all packages in a workspace are versioned together.

These workspaces do not align with package namespaces, and also don't always correspond to a single directory of this repo.

Here's the list of release group workspaces:

Here's a list of other sets of packages (each package within these groups is versioned independently, forming its own release group):

  • "Common" Packages: miscellaneous packages in the ./common directory and published under the @fluidframework/ namespace. Most of these (but not all) have "common" in their package name. Packages which are used by multiple other groups of packages (such as built tools, linter configs and protocol definitions) live here.
  • "Tools" Packages: miscellaneous packages in the ./tools directory and published under a variety of namespaces. Logically about the same as "Common", but most of the names include "tools" instead of "common".
  • Auxiliary Microservice Packages (supporting Routerlicious)
    • ./server excluding routerlicious, gitrest and historian (Published in the @fluidframework/ namespace)
  • ./docs: The code and content for https://fluidframework.com.

Dependencies between packages in various layers of the system are enforced via a build step called layer-check. You can view the full list of packages and layers in PACKAGES.md.

  • Note: to update the contents of PACKAGES.md for local package changes, run pnpm layer-check --md . (note that the final . is part of the command).

Setup and Building

Install the required tools (Git, Node.js, and pnpm; see the Node.js Installation section below):

Clone a copy of the repo and change to the repo root directory:

git clone https://github.com/microsoft/FluidFramework.git
cd FluidFramework

Enable Node.js's corepack:

corepack enable

Run the following to build the client packages:

pnpm install
npm run build

You can use the experimental worker mode to get faster build times as well: npm run build:fast

See also: Contributing

Build in VSCode

To build Fluid Framework within VSCode, open the Fluid Framework repo folder as a workspace and use Ctrl-Shift-B to activate the build task. It is the same as running npm run build on the command line.

Node.js Installation

We recommend using nvm (for Windows or MacOS/Linux) or fnm to install Node.js. This ensures you stay on the correct version while allowing other uses of Node.js to use the (possibly different) versions they need side by side.
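For example, with nvm on MacOS/Linux (a minimal sketch; substitute whichever Node.js version this repo currently requires):

nvm install --lts
nvm use --lts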

Because of a transitive dependency on a native addon module, you'll also need to ensure that you have the prerequisites for node-gyp. Depending on your operating system, you'll have slightly different installation requirements (these are largely copied from node-gyp's documentation):

On Windows

The Node.js installer should ask if you want to install "Tools for Native Modules." If you check that box, nothing further should be needed. Otherwise, you can follow the steps listed in node-gyp's documentation.

On Unix

  1. Python v3.7, v3.8, v3.9, or v3.10
  2. make
  3. A C/C++ toolchain (like GCC)
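On Debian/Ubuntu, for example, these prerequisites can be installed with (package names vary by distribution):

sudo apt-get install python3 make g++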

On MacOS

If you've upgraded your Mac to Catalina or higher, you may need to follow these instructions.

  1. Python v3.7, v3.8, v3.9, or v3.10
  2. XCode Command Line Tools, which will install make, clang, and clang++
    • You can install these by running xcode-select --install from a command line.

Other Build Requirements

  • Building server/Routerlicious
    • Refer to that package's README for additional requirements.
    • Note that these requirements do not affect all workflows (e.g. the one noted above), but will affect workflows that include the packages under server (e.g. fluid-build --symlink:full).

On Windows

  • Ensure that you have enabled running PowerShell scripts by setting your environment's Execution Policy.

Other Build Commands

Building our docs

There are a few different areas in which we generate documentation content as a part of our overall build.

  1. fluidframework.com
    • We build the contents of our public website from the docs directory under the root of this repo. See its README for more details.
  2. Generated README contents
    • We leverage a local tool (markdown-magic) to generate / embed contents in our various package-level READMEs. This is done as a part of a full build, but it can also be executed in isolation by running npm run build:readme from the repo root.
  3. API reports
    • We leverage API-Extractor to generate summaries of our package APIs. This is done as a part of a full build, but it can also be executed in isolation by running npm run build:api from the repo root.

Testing

You can run all of our tests from the root of the repo, or you can run a scoped set of tests by running the test command from the package you're interested in.

Note: Some of the tests depend on test collateral that lives in a submodule here: https://github.com/microsoft/FluidFrameworkTestData. You may choose to fetch that collateral into your local repository, which is required to run all the tests - otherwise some will be skipped.

First, ensure you have installed Git LFS. Then, from the repo root:

git lfs install
git submodule init
git submodule update

Run the tests

npm run test

Include code coverage

npm run test:coverage

Mimic the official CI build

Our CI pipelines run on Linux machines, and the npm scripts all have the ci prefix. To replicate the test steps from the CI pipeline locally, run the following commands in the packages or pnpm workspaces you want to test:

Run        Non-Windows                 Windows
PR         npm run ci:test             npm run test:report && npm run test:copyresults
Official   npm run ci:test:coverage    npm run test:coverage && npm run test:copyresults

Run tests from within VS Code

We've checked in VS Code configuration enabling F5 from a spec.ts file to run those tests if you set the debug configuration to "Debug Current Test".

Run it locally

Single browser window, two panes

This will use an in-memory implementation of the Fluid server to sync between the two panes in the browser window.

  • Choose an example under /examples
  • Navigate to the example's directory, e.g. /examples/data-objects/clicker
  • npm run start
  • Browse to http://localhost:8080 to interact with two copies of the example side-by-side

Multiple browser instances on the same device

This will run the local Fluid server implementation we call "Tinylicious", so you can sync between multiple browser instances.

First, start Tinylicious by running these commands from /server/routerlicious/:

pnpm install

Then these commands from /server/routerlicious/packages/tinylicious:

pnpm run build
pnpm run start

Then:

  • Navigate to the example of your choice (same as above)
  • npm run start:tinylicious
  • Browse to http://localhost:8080, copy the full URL you're redirected to, and open in a second window to collaborate

Tools

Prettier

This repository uses prettier as its code formatter. Right now, this is implemented on a per-package basis, with a shared base configuration.

  • To run prettier on your code, run npm run format from the appropriate package or release group, or run npm run format:changed from the root of the repo to format only files changed since the main branch. If your change is for the next branch instead, you can run npm run format:changed:next.
  • To run prettier with fluid-build, you can specify "format" via the script argument: fluid-build -t format or npm run build:fast -- -t format

To ensure our formatting remains consistent, we run a formatting check as a part of each package's lint script.

VSCode Options

Our workspace configuration specifies prettier as the default formatter. Please do not change this.

It is not configured to do any formatting automatically, however. This is intentional, to ensure that each developer can work formatting into their workflow as they see fit. If you wish to configure your setup to format on save/paste/etc., please feel free to update your user preferences to do so.

Notable setting options:

  • format on save
  • format on paste

User editor formatting setting options

Git Configuration

Run the following command in each of your repositories to ignore formatting changes in git blame commands: git config --local blame.ignoreRevsFile .git-blame-ignore-revs

Developer notes

Root dependencies

The root package.json in the repo includes devDependencies on the mocha and jest testing tools. This is to enable easier test running and debugging using VSCode. However, this makes it possible for projects to have a 'phantom dependency' on these tools. That is, because mocha/jest is always available in the root, projects in the repo will be able to find mocha/jest even if they don't express a dependency on those packages in their package.json. We have lint rules in place to prevent phantom dependencies from being introduced but they're not foolproof.
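For example, a package that actually runs mocha tests should declare it in its own package.json rather than relying on the root devDependencies (the version below is a placeholder):

"devDependencies": {
  "mocha": "^10.0.0"
}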

Contributing

There are many ways to contribute to Fluid.

Detailed instructions for working in the repo can be found in the Wiki.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

This project may contain Microsoft trademarks or logos for Microsoft projects, products, or services. Use of these trademarks or logos must follow Microsoft’s Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship.

fluidframework's Issues

Reduce excess summarization

Currently the summarizer cannot listen for summary acks or nacks to make better decisions about when to summarize. Sometimes the summarizer will send two consecutive summary ops with the same parent, which causes the second one to be (correctly) rejected by the server.

The summarizer currently has 3 methods for summarizing:

  1. On receiving an op, after a certain number of ops
  2. On receiving an op, after a certain amount of time
  3. On a timer set for a certain amount of time after the last op was received (idle timer)

This logic should be cleaned up to better align the above conditions with each other (when any of the 3 conditions is met, it should reset all 3 if they haven't been already). Further, we should have another time threshold to wait for an ack/nack before trying to summarize again.
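An illustrative sketch of the intended shape of the heuristics (names and thresholds are hypothetical, not the actual summarizer code):

interface SummarizeTriggers {
    opsSinceLastAttempt: number;  // condition 1: op count since the last attempt
    msSinceLastAttempt: number;   // condition 2: elapsed time while ops keep arriving
    idleMs: number;               // condition 3: time since the last op (idle timer)
}

function shouldSummarize(triggers: SummarizeTriggers, waitingForAck: boolean): boolean {
    // Hypothetical thresholds for illustration only.
    const maxOps = 1000;
    const maxTimeMs = 60_000;
    const idleTimeMs = 5_000;

    // Don't send a second summary op with the same parent: wait for the ack/nack
    // (or a timeout) from the previous attempt first.
    if (waitingForAck) {
        return false;
    }
    return triggers.opsSinceLastAttempt >= maxOps
        || triggers.msSinceLastAttempt >= maxTimeMs
        || triggers.idleMs >= idleTimeMs;
}

// When shouldSummarize() returns true and a summary is attempted, all three
// counters/timers should be reset together.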

Reassess usage of blockUpdateMarkers in the code

We should either clear any references to this flag from the code base, or allow individual cases to use it but add a bunch of asserts into mergetree.ts validating that if blockUpdateMarkers === false, then there are no markers in that instance of the merge tree.

Aqueduct components can only be created by other Aqueduct components...

If you attempt to create an instance of an Aqueduct component via IComponentContext.createComponent(), it will never be initialized.

This is due to code in SharedComponent.initialize() that skips initialization on the assumption that forge() will be called later.

            // If the runtime is existing we will execute the internal initialize. Otherwise the initialize
            // happens during the forge
            if (this.runtime.existing) {
                this.initializeP = this.initializeInternal();
            } else {
                this.initializeP = Promise.resolve();
            }

Dedup url resolvers.

Dedup url resolver code for odsp/routerlicious. Currently it is present in multiple places like gateway, hosts, praguedump, and odsp utils.

Blank screen on Edge as of release 0.9, dev server 3.9.1

My React-based Fluid component has not been rendering in Edge since updating to release 0.9.
In the latest server version, the "Fluid" clippy banner appears at the top of the page, but everything else is blank. On the previous version, the screen was simply blank.
In the Edge debug tools I can see that the page is loaded, but the "content" div is blank.

The issue reproduces with a new default sample React project (click counter) generated with "yo fluid".

Chrome renders as expected.
Dev server version is 3.9.1

I'm running Edge version 44.18362.267.0 and Chrome version 76.0.3809.132

Decouple ISegment from BaseSegment

ISegment is very coupled to BaseSegment; in fact, an ISegment that doesn't inherit from BaseSegment probably won't be valid. This seems like a bad situation, and I'd like to trim down ISegment so it is less complex, and host it off a new leaf node type that handles much of what is in BaseSegment.

This would also open up the possibility of allowing user defined Segment types, as potentially sequence could run in a mode where the segment data is never deserialized, it's basically just json hanging off the leaf. The snapshotter could then run in this mode, and not care what kind of segments are in the sequence.

One caveat here would be the ability of the snapshotter to run Zamboni, but I think this could be resolved to some extent; it just may not be able to Zamboni as deeply, as the Zamboni constraints would need to be derived from invariants or accessed from the segments as simple json properties. Currently the only segment-type Zamboni constraints are:

  • Segments that can never be appended: marker, external
  • If types match, and length is less than some max: text, and subsequence

All other constraints come from the BaseSegment properties that would move to the new leaf node type.

SparseMatrix: Relax type constraint

From Tyler's feedback:

it seems like you can only store an UnboxedOper in a SparseMatrix, and UnboxedOper = undefined | boolean | number | string;

I think this constraint is arbitrary (due to historical ties to calc) and can be trivially relaxed to any JSON serializable type (see also #136)

Use prettier for code formatting

Many popular JS OSS projects (including React) are using prettier to automatically format their code. It would be great if this repo could use prettier to enforce consistent code style.

Support signals in the local test server

Local test server doesn't support signals. In Scriptor-Pad (Local Fluid Document), when a signal is submitted, only the editor who sends the signal can receive it.
Scriptor will start to use signals for presence, and we need the local test server for testing.

Sanity check creation of summarizer client around disconnected states.

Any chance you can see if we deal with reconnections properly in summarize?

  1. Do we destroy summarizing container on lost connection?
    Note - createSummarizer() has "fluid-reconnect": false, which seems correct, assuming the rest works
  2. Do we always recreate summarizer on same client on reconnect if there were (and there are) no other clients connected?

Any chance that we can add telemetry (error) when there is no summarizer in a document and a given client is connected?

Thanks!

Webpack-Component-Loader routes *.wasm files to static html

Attempting to compile *.wasm files loaded via the webpack dev server fails. Compilation fails because Webpack-Component-Loader routes all requests not ending in '.js(.map)' to a static HTML page:

app.get(/(.*(?<!\.js(\.map)?))$/i, (req, res) => fluid(req, res, baseDir, options));

Locally modifying the above regex to also exclude *.wasm files works around the issue.
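For reference, the local modification described above looks roughly like this (a sketch of the workaround, not a vetted fix):

app.get(/(.*(?<!\.js(\.map)?|\.wasm))$/i, (req, res) => fluid(req, res, baseDir, options));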

For context, I hit this while using an OSS parser framework/generator that produces *.wasm files to syntax highlight WebFlow documents.

Properties are not applied correctly in merge tree

A property update overrides an existing property on a segment (with the only exception of local properties).
Basically, the merge tree implements "last one wins" merge resolution for annotates.

This is problematic for snapshots, because the "header" (state of the sequence at MSN) may be generated before the last annotate op, but it will contain properties of that future annotate op.
I'm not 100% sure, but that feels like it can cause trouble.
A couple of scenarios to verify:

  1. What happens with local ops that are not acked before we generate snapshot?
  2. What happens when we mix and match different rules about applying properties? I see "rewrite" mode in extend<>() in properties.ts.

Maybe it's not an issue, but I want to make sure we validate it's not an issue.

Linting should be autofixed on save

I find it very frustrating to spend cycles on linting and formatting issues when a computer can do these automatically on save or pre-commit. For other repos, like office-ui-fabric-react, I never have to worry about nit-picky lint configurations like trailing commas and newlines, since they have configured the tooling to handle this automatically for us.

I'd consider this a blocking issue for going open source. The JS community expects this to work, and I think this will be an unnecessary barrier to entry for open source contributions.

Support auto fixing of tslint issues and automatic formatting

Many other open source repositories, like office-ui-fabric-react, support auto-fixing of TSLint issues in VS Code, and all code formatting is handled by a tool like prettier.

This saves a lot of cycles for folks that work in multiple repos, each with different conventions.

Enable spo route for r11s gateway.

Currently, due to a change in the join session response from SPO, it no longer supplies the storage and push tokens in join session. Because of that, the SPO route is broken.

Freeze DDS value object

Our DDSes allow you to modify objects that are set in the DDS. These modifications do not trigger new DDS value sets and cause bugs, because the developer's local state differs from that of remote clients.

Example

const myObject = {
  id: 1,
};

sharedMap.set("my-object-id", myObject);

myObject.id = 2;

In the above example, myObject would locally have an id of 2, but the DDS and all connected clients would have an id of 1. This is because setting the id on the object does not trigger an update to the SharedMap. These are difficult bugs for developers to find.

Solution

To prevent this, we can freeze any object set in the DDS using JavaScript's Object.freeze(object).
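For comparison, a developer can already opt into this behavior manually today (illustrative sketch):

const myObject = Object.freeze({ id: 1 });
sharedMap.set("my-object-id", myObject);

// myObject.id = 2; // now throws in strict mode (and is silently ignored otherwise),
//                  // so local and remote state can no longer drift apart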

Work

DDSes that need this change

  • SharedCell
  • SharedMap
  • SharedDirectory
  • SharedString - property bag

NPM install should not fail if you don't have docker installed globally

We have an npm post-install task that does some docker-related tasks. There is a subset of developers that don't work on the server pieces and have no need to set up docker on their machines, so it is pretty frustrating to see this error message after every npm install:

'docker' is not recognized as an internal or external command,
operable program or batch file.

Migrate from TSLint to ESLint

Tasks

  • Add a shared ESLint config
  • Add eslint config files to all lerna-managed packages
  • Add eslint and necessary plugins to each project
  • Configure tslint-to-eslint plugin
  • Configure the missing imports plugin
  • Configure the React plugin
  • Remove tslint-to-eslint plugin and tslint for each project
  • Switch CI to run eslint tasks instead of tslint
  • Remove tslint NPM tasks

Type safety for JSON serializable things

From Tyler's Sudoku feedback:

Mistake #1: Trying to store objects in a DDS that are not safely JSON round-trippable

This is approximately the solution:

export type JsonPrimitive = undefined | boolean | number | string;
export type JsonObject = { [index: string]: JsonPrimitive | JsonArray | JsonObject };
export type JsonArray = (JsonPrimitive | JsonObject)[];
export type Json = JsonPrimitive | JsonArray | JsonObject;

As I've mentioned before (#3145), we should also call Object.freeze(..) on these objects.

Decouple Merge Tree From Segment Implementations

Currently the merge tree and merge tree client expose a number of methods that are bound to specific segment types. These methods make less sense now that we are trying to generalize merge tree beyond text.

  • remove initial text from merge tree and client constructors
  • remove text specific methods from merge tree and client
  • remove Item specific methods from merge tree and client

Open question: are marker segments available in all tree types? Or should they be abstracted away as well?

Consolidate no-op acks from client

Currently, the no-op acks from the client that are intended to be broadcast immediately (indicated by having some non-null contents) are sent immediately. An improvement would be to consolidate them into a single one sent after a tick, and only if an outbound op wasn't already sent during that tick.

I have an initial implementation of this in a branch, but have not tested it yet.
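A rough sketch of the batching idea (names and op contents are illustrative, not the branch implementation):

// Coalesce immediate no-op acks into at most one per tick, and skip it entirely
// if a real outbound op already went out during that tick.
let noopAckScheduled = false;
let opSentThisTick = false;

function recordOutboundOp(): void {
    opSentThisTick = true;
}

function scheduleNoopAck(send: (contents: string) => void): void {
    if (noopAckScheduled) {
        return;
    }
    noopAckScheduled = true;
    setImmediate(() => {
        if (!opSentThisTick) {
            send("ack"); // hypothetical non-null contents to force immediate broadcast
        }
        noopAckScheduled = false;
        opSentThisTick = false;
    });
}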

MergeTree: Need ability to create LocalReferences to end of sequence

Many MergeTree operations refer to the position at the end of the sequence. For example, consider appending to the end of a sequence:

"0123"
     ^-- Append = insert @4

In order to use LocalReferences to track caret positions, selection ranges, etc. we need the ability to create LocalReference to the position after the last segment in the MergeTree (@4 above).

Today, the LocalReference can only reference positions 0..3 in the above example. This limitation has impacted every user of LocalReferences that I'm aware of:

  • FlowView does not use LocalReferences. Instead, it observes ops to update the caret/selection offsets manually.
  • SharedStringInterval users can only include the last position in a sequence if they treat 'end' as inclusive instead of exclusive (#1761). This has been the source of numerous off-by-one errors in Table-Slice / Tablero. (I found another one yesterday.)
  • WebFlow used to attempt to preserve an extra marker at the end of the document. If this is accidentally deleted, the last position in the document can no longer be edited.
  • WebFlow now works around this by crafting a LocalReference that points to a sentinel segment that is never inserted into the MergeTree. (See Flow-Document addLocalRef, etc.)

Need to provide proper web socket token in fluid-fetcher

Here are the related lines of code.
Note that the hack in OdspDocumentService.ts works for now, as join session provides us tokens. That is going away, and the app is responsible for fetching tokens. Fluid-fetcher is already doing it for storage / vroom, but not for push.
BTW, can we clean up the driver to use the same terminology? In some cases we use "vroom token" terminology, in others "storage token"; there might be other terms used.

packages\tools\fluid-fetch\src\fluidFetchInit.ts
const getWebsocketTokenStub = () => Promise.resolve("");

packages\drivers\odsp-socket-storage\src\OdspDocumentService.ts

        // This is workaround for fluid-fetcher. Need to have better long term solution
        webSocketToken ? webSocketToken : websocketEndpoint.socketToken,

Consider using jest for unit tests

The unit tests in the repo all use mocha, which isn't in line with what the big app teams at Microsoft use. For reference OWA, ODSP, Office Online, SDX, Office UI Fabric React, and many others are using jest for unit tests. There are also big open source projects like Webpack and React that are using jest.

npm install shows docker registry password

On a Windows 10 machine, I've cloned the repo and ran npm install without having Docker Desktop running.

I was greeted by error messages like these, which included a password:

> root@ postinstall:r11s D:\FluidFramework
> cd server/routerlicious && npm run registryAuth


> root@ registryAuth D:\FluidFramework\server\routerlicious
> docker login -u prague -p A_PASSWORD_WAS_HERE prague.azurecr.io

WARNING! Using --password via the CLI is insecure. Use --password-stdin.
error during connect: Post http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.40/auth: open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.
npm ERR! code ELIFECYCLE
npm ERR! errno 1                                                                                                        
npm ERR! root@ registryAuth: `docker login -u prague -p A_PASSWORD_WAS_HERE prague.azurecr.io`             
npm ERR! Exit status 1                                                                                                  

After starting Docker, install succeeds.
It still shows the password in the log - do we care about it?

Optimize empty blobs

Looking through bug bash files, one had a lot of blobs, and specifically a lot of empty blobs.
There were a bunch of tables inserted (and deleted), and many had empty "tardis" & "header" blobs (for things like "cols", "rows", "matrix", etc.).

It would be great to optimize this; the driver level is probably the right place to do so.
That might also help with de-duplicating .attributes (specifically for maps), which are around 120 bytes today. Same for directory's attributes, which I saw in big numbers in the same document (around 50).

DeltaManagerProxy causes trouble when emitting "error" events.

This stack is problematic:
    at DeltaManagerProxy.emit (events.js:132)
    at DeltaManager.<anonymous> (deltaManagerProxy.ts:23)
    at DeltaManager.emit (events.js:151)
    at DeltaQueue.<anonymous> (deltaManager.ts:185)
    at DeltaQueue.emit (events.js:151)
    at callback (deltaQueue.ts:137)
    at DeltaQueue.worker (deltaManager.ts:176)
    at DeltaQueue.processDeltas (deltaQueue.ts:151)

Note EventEmitter implementation - this screws up error processing - we should not throw from here (but we do because of DeltaManagerProxy):

// If there is no 'error' event listener then throw.
if (doError) {
  var er;
  if (args.length > 0)
    er = args[0];
  if (er instanceof Error) {
    // Note: The comments on the throw lines are intentional, they show
    // up in Node's output if this results in an unhandled exception.
    throw er; // Unhandled 'error' event
  }

Version parameter to loader.resolve is not respected when you have headers

We are trying to load a container with a null version as a query parameter to skip snapshots since they won't exist in this scenario. This request also sets some request headers.

The end result is that this line of code
https://github.com/microsoft/FluidFramework/blob/master/packages/loader/container-loader/src/loader.ts#L263
sets the version back to undefined.

Perhaps we should only have one way of setting the version?

Make options strongly typed with interface augmentation

node_modules\@prague\container-loader\src\loader.ts takes options: any. We should add an ILoaderOptions interface here, and then use interface augmentation in sub-classes to add optional properties to the options, so we get strong typing for options.
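A minimal sketch of the idea (module path and option name are hypothetical):

// In the loader package: the base options interface, intentionally left open.
export interface ILoaderOptions {
    // intentionally empty here; packages augment it with the options they understand
}

// In a consuming package: augment the interface to add strongly typed options.
declare module "@prague/container-loader" {
    interface ILoaderOptions {
        cache?: boolean; // hypothetical option
    }
}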

Provide access to sequence after partial load

Previously the sequence would become active after only loading the header.

But a recent change waited on body load prior to giving access https://github.com/microsoft/Prague/blob/12c12a3bea16181fd478ea22dd8f04a01b327338/packages/runtime/sequence/src/sequence.ts#L612.

For documents that can't delay load this is not problematic. But for those that can, like flow-view, it's a big perf hit (seconds).

This was simpler when prepare existed because the runtime would await the sequence being fully loaded before inbounding new ops (it would still allow outbound). In the updated world the sequence will need to be able to handle ops for segments it has not yet loaded.

Merge Tree: Rationalize registers

Related microsoft/Prague#1381

I'm not sure we want to keep registers. They are currently not used heavily, are not snapshotted, never cleaned up, and not tested well.

If we keep them, the merge tree should add a new op for copying text to a register, and we should remove the register property from the remove op; cut then becomes a group op of copy and remove.
It's probably OK to keep the register on insert for paste.

Update merge-tree snapshot to include segments between MSN and snapshot sequence number

The current merge tree snapshot includes all zamboni'd segments, and then plays tardis'd messages between the min sequence number and the snapshot sequence number. This requires us to create, load, and replay these tardis'd operations. Load can be simplified and storage decreased by including segments that haven't been zamboni'd in the snapshot.

Proper handling for 429 (both web socket and storage)

Example of the error (from logs):

0: "connect_document_error"
1: "activityLimitReached (Client): Request Id: bdb1049f-605f-0000-c05f-b1b7de6d6333. Status Code: 429. Response Body: {"error":{"innerError":{"code":"throttledRequest"},"code":"activityLimitReached","message":"The request has been throttled","retryAfterSeconds":120}}"

Need to respect retryAfterSeconds
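A minimal sketch of the kind of handling needed (not the driver's actual API):

// Illustrative retry wrapper: on a 429, wait retryAfterSeconds before retrying.
// Production code would also cap the number of retries.
async function withThrottlingRetry<T>(action: () => Promise<T>): Promise<T> {
    for (;;) {
        try {
            return await action();
        } catch (error: any) {
            const retryAfterSeconds = error?.retryAfterSeconds;
            if (error?.statusCode !== 429 || retryAfterSeconds === undefined) {
                throw error;
            }
            await new Promise((resolve) => setTimeout(resolve, retryAfterSeconds * 1000));
        }
    }
}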

Summarizer errors end up in console, not in telemetry

This looks bad - we should put all the errors into telemetry.
Can you please follow up?
I'd do a code review across all files related to summaries for these kinds of issues.
Thanks!

        doneP.then(
            () => {
                // In the future we will respawn the summarizer - for now we simply stop
                // this.computeSummarizer(this.connected)
                debug("summary generation complete");
            },
            (error) => {
                debug("summary generation error", error);
            });

    Data-Driven Documents codes.