microsoft / mixed-reality-extension-sdk

The Mixed Reality Extension SDK enables developers to build 3D world extensions for AltspaceVR, using Node.js.

License: MIT License

TypeScript 98.42% PowerShell 0.19% Dockerfile 0.06% HTML 0.96% JavaScript 0.38%
vr sdk mixed reality altspacevr gltf node typescript

mixed-reality-extension-sdk's Issues

Mac server support

Currently MREs can be deployed to Macs just fine and the logic executes, but static files are not served.

Repro steps

  1. Deploy the Hello World MRE to a Mac
  2. Instantiate it in AltspaceVR
    Observed: The Hello World text shows up and spins, but the cube does not.
    Expected: The Hello World text and cube both show up and spin.

Feature Request: TeleportTo/InterpolateTo

Ability to instantly teleport (without colliding with anything between the old and new positions) and to interpolate (moving through space over time, colliding with any collision geometry on the way).
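
A hypothetical sketch of the distinction, independent of the SDK (teleportTo and interpolateTo are illustrative names, not proposed signatures): a teleport writes the target position immediately, while an interpolation steps toward it over time and therefore sweeps through any colliders on the way.

interface Vector3 { x: number; y: number; z: number; }

// Jump straight to the target; nothing between the old and new positions is hit.
function teleportTo(actorPosition: Vector3, target: Vector3): void {
    actorPosition.x = target.x;
    actorPosition.y = target.y;
    actorPosition.z = target.z;
}

// Step toward the target over durationMs; each intermediate position can collide.
function interpolateTo(actorPosition: Vector3, target: Vector3, durationMs: number,
                       onStep: (p: Vector3) => void): void {
    const start = { ...actorPosition };
    const t0 = Date.now();
    const timer = setInterval(() => {
        const t = Math.min((Date.now() - t0) / durationMs, 1);
        actorPosition.x = start.x + (target.x - start.x) * t;
        actorPosition.y = start.y + (target.y - start.y) * t;
        actorPosition.z = start.z + (target.z - start.z) * t;
        onStep(actorPosition);
        if (t >= 1) { clearInterval(timer); }
    }, 16);
}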

CreateFromGLTF's collision volumes aren't enabled until the actor has been updated

When box collision is requested as a parameter to CreateFromGLTF, for some scenes (Suzanne)
the rigid body is created, but it doesn't appear to be added properly to the Unity physics scene. It isn't visible in the Unity physics debugger view (and therefore doesn't collide) until the owning GameObject (or a parent GameObject) has moved, or the collision component has been modified.
This seems like a potential Unity physics bug.
There are no reliable repro steps for this yet.

Feature Request: Exact Timing/Synchronization API

Currently all APIs say "do it now", and due to unpredictable latency between the server and the host app, it's not possible to make reliably timed API calls. We should implement a way to schedule calls relative to a shared clock (ideally a timer that restarts, for example, every minute, or alternatively time-since-start). The changes needed are:

  1. The server should add a timestamp to each packet it sends out.
  2. The client should continually estimate its latency and delay incoming server packets so they play out smoothly against the server timestamps (this adds latency in exchange for smoother packet delivery).
  3. The client should keep a queue of commands to execute X frames in the future, rather than applying them instantly.

We'd need to be very careful that all clients execute instant and delayed commands in exactly the same order.
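
A minimal sketch of the client-side smoothing described above, assuming each incoming packet carries a server timestamp; DelayedCommandQueue and its fields are illustrative, not part of the SDK.

// Commands are executed at serverTimestamp + fixedDelayMs instead of on arrival,
// trading a constant amount of added latency for evenly spaced execution.
interface TimedCommand {
    serverTimestamp: number;   // stamped by the server when the packet was sent
    apply: () => void;         // the actual "do it now" operation
}

class DelayedCommandQueue {
    private queue: TimedCommand[] = [];
    constructor(
        private fixedDelayMs = 100,   // added latency budget
        private clockOffsetMs = 0     // estimated (client clock - server clock)
    ) { }

    public enqueue(cmd: TimedCommand) {
        this.queue.push(cmd);
        this.queue.sort((a, b) => a.serverTimestamp - b.serverTimestamp);
    }

    // Call once per frame; runs every command whose scheduled time has passed.
    public update(nowMs: number) {
        const serverNow = nowMs - this.clockOffsetMs;
        while (this.queue.length &&
               this.queue[0].serverTimestamp + this.fixedDelayMs <= serverNow) {
            this.queue.shift()!.apply();
        }
    }
}

Draining a single time-sorted queue also keeps delayed and instant commands in the same order on every client, which addresses the ordering concern above.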

Add functional test for scene hierarchy

We should add something that clearly shows scene graph parent/child relationships. For example:
spawn 5 primitives in a row, parent them to each other, start spinning each of them slowly, and watch the children spin around each other; after a few seconds unparent them all and watch them keep spinning at the point where they were detached.
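
For the "keep spinning at the point where they were detached" part, a minimal scene-graph sketch (independent of the SDK, modeling only XZ translation and yaw to stay short) shows what unparenting has to preserve: the world transform stays fixed, so the local transform must be recomputed.

// Minimal scene-graph node: position on the XZ plane plus yaw (rotation about Y),
// enough to illustrate parent/child composition and world-preserving unparenting.
class SceneNode {
    public parent: SceneNode | null = null;
    constructor(public x = 0, public z = 0, public yaw = 0) { }

    public worldTransform(): { x: number; z: number; yaw: number } {
        if (!this.parent) { return { x: this.x, z: this.z, yaw: this.yaw }; }
        const p = this.parent.worldTransform();
        const cos = Math.cos(p.yaw), sin = Math.sin(p.yaw);
        return {
            x: p.x + cos * this.x + sin * this.z,
            z: p.z - sin * this.x + cos * this.z,
            yaw: p.yaw + this.yaw
        };
    }

    // Detach while keeping the same world transform, so the node keeps spinning
    // at the point where it was detached.
    public unparent(): void {
        const w = this.worldTransform();
        this.parent = null;
        this.x = w.x; this.z = w.z; this.yaw = w.yaw;
    }
}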

Report glTF load network errors

If your baseUrl is bad, or for some other reason clients cannot download the model, there is no indication anywhere (even in the Altspace log) that the load failed. The create promise just never resolves.
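
Until the failure is surfaced properly, a generic way to at least notice it on the app side is to race the create promise against a timeout; withTimeout below is an illustrative helper, not part of the SDK.

// Wrap any promise (e.g. the one returned when creating an actor from glTF) so
// that a load that never completes surfaces as an error instead of hanging forever.
function withTimeout<T>(promise: Promise<T>, ms: number, label: string): Promise<T> {
    const timeout = new Promise<never>((_, reject) =>
        setTimeout(() => reject(new Error(
            `${label} did not resolve within ${ms} ms - check the baseUrl ` +
            `and that clients can reach the model`)), ms));
    return Promise.race([promise, timeout]);
}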

Non-looping keyframe animations don't always stop at the end frame

In AltspaceVR, it seems keyframe animations sometimes don't stop exactly at the last frame if they are in 'once' mode.
Repro steps:

  1. In the Hello World sample, in generateSpinKeyframes, remove the last frame.
  2. During the DoAFlip animation creation, change duration from 1.0 to 0.5, so the animation executes quicker
  3. Build, Launch, and instantiate the MRE
  4. Click on the cube
    Observed: Sometimes the cube stops spinning a little early (or a little late?), and it doesn't always stop pointing straight up.
    Expected: After the rapid spin, the cube should always stop with the top pointing straight up

Rigid body transforms do not synchronize across multiple users

Rigid body simulations work fine with a single user. However, once a second user enters, the objects teleport away.

Repro steps:

  1. Instantiate an MRE that uses rigid body physics, for example the functional test named rigid-body-test
  2. Have a second user enter the same space.
  3. Observe rigid bodies

Rigid body patches are in scene coordinates, not in MRE coordinates

Rigid body patch data is all in global space, but should be relative to the MRE root.

The code shouldn't assume that the MRE root has the same global offset in scene 1 as in scene 2; it is possible, for example, to share a session ID across multiple different AltspaceVR scenes.
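
A sketch of the conversion the patches need, assuming the MRE root's world position, rotation (a unit quaternion), and uniform scale are known; none of these helpers are SDK functions.

interface Vec3 { x: number; y: number; z: number; }
interface Quat { w: number; x: number; y: number; z: number; }

function cross(a: Vec3, b: Vec3): Vec3 {
    return { x: a.y * b.z - a.z * b.y, y: a.z * b.x - a.x * b.z, z: a.x * b.y - a.y * b.x };
}

// Rotate a vector by a unit quaternion: v' = v + 2w(u x v) + 2u x (u x v).
function rotate(q: Quat, v: Vec3): Vec3 {
    const u = { x: q.x, y: q.y, z: q.z };
    const t = cross(u, v);
    const t2 = { x: 2 * t.x, y: 2 * t.y, z: 2 * t.z };
    const c = cross(u, t2);
    return { x: v.x + q.w * t2.x + c.x, y: v.y + q.w * t2.y + c.y, z: v.z + q.w * t2.z + c.z };
}

// Convert a world-space position (as found in today's rigid body patches) into the
// MRE root's space: undo the root's translation, then its rotation, then its scale.
function worldToMreSpace(worldPos: Vec3, rootPos: Vec3, rootRot: Quat, rootScale: number): Vec3 {
    const offset = { x: worldPos.x - rootPos.x, y: worldPos.y - rootPos.y, z: worldPos.z - rootPos.z };
    const invRot: Quat = { w: rootRot.w, x: -rootRot.x, y: -rootRot.y, z: -rootRot.z }; // conjugate = inverse for unit quaternions
    const local = rotate(invRot, offset);
    return { x: local.x / rootScale, y: local.y / rootScale, z: local.z / rootScale };
}

Rotations in a patch would similarly be pre-multiplied by the inverse of the root rotation.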

Input Behaviors are not received if multiple MREs have executed in the same AltspaceVR session

Input Behaviors don't always get piped through to the MRE. It seems related to having had multiple instances of the same MRE running within a single AltspaceVR session (though not necessarily in the same scene).

Repro steps:

  1. Instantiate the Hello World MRE in your AltspaceVR homespace. Observe that the cube reacts when you hover, click, and unhover.
  2. Enter another world in AltspaceVR, and instantiate the Hello World MRE.
    Observed: The cube does not react to hover, click, or unhover.
    Expected: The cube should react

Note that when going back to the homespace, the first MRE instance launches and interacts properly. That leads me to guess this is an AltspaceVR integration or client library issue, not a server-side issue.

Lookat math is in worldspace, not in MRE space

Repro steps

  1. Instantiate the lookat functional test.
  2. Move and rotate the MRE root so it is no longer at identity
    Observed: The Monkey Head dreamily stares off into space
    Expected: The Monkey Head stares you straight in the eyes

Add procedural glTF handlers to WebHost

Though gltf-gen exists, there's currently no way to use it in an MRE. I want to add a method to WebHost that registers a binary blob under a URL for clients to load. This would let you generate a buffer from a GLTFFactory, then load it via Actor.CreateFromGLTF.
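
A rough sketch of the idea using only Node's built-in http module; registerBuffer, the URL scheme, and the port are hypothetical, not the proposed WebHost API.

import * as http from 'http';

// In-memory registry: buffers generated at runtime (e.g. from a GLTFFactory)
// are exposed under stable URLs that clients can download.
const buffers = new Map<string, Buffer>();

function registerBuffer(name: string, data: Buffer): string {
    const url = `/buffers/${name}`;
    buffers.set(url, data);
    return url;   // hand this URL to Actor.CreateFromGLTF
}

http.createServer((req, res) => {
    const blob = req.url ? buffers.get(req.url) : undefined;
    if (!blob) { res.writeHead(404); res.end(); return; }
    res.writeHead(200, { 'Content-Type': 'model/gltf-binary', 'Content-Length': blob.length });
    res.end(blob);
}).listen(3901);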

Reconnect when the URL changes

If an app is playing and you change the URL, it remains connected to the old app until you toggle the Is Playing off and back on.

Physics forces are in worldspace, not in MRE-space

Physics forces are in worldspace (absolute orientation, unscaled), but should be MRE-relative (or alternatively local) in both orientation and scale.

Repro Steps

  1. Load the rigid-body-test functional test scene. If placed at identity orientation/scale it works as expected, but if the MRE root has been rotated or scaled, it is off. This can be simulated using the "change global root" function of the testbed.
    Observed: The projectile flies in a different direction and misses the target if the MRE root is rotated.
    Expected: Regardless of rotation/scale, the projectile should hit the target.

Send MREAPI.Logger.LogXXXXX() text through to Node.js

When an MRE is instantiated and the client hits errors, the errors should be sent back to the MRE so they can be piped through to where developers expect them. By default I would expect them to show up in the Visual Studio debug console and in Node.js' log files.

Examples where I would expect errors to show up:

  • If the client tries to play an animation that doesn't exist on an actor
  • If the MRE fails to load a glTF
  • If the MRE tries to access an actor that doesn't exist

Expose user's device performance characteristics

Exposing a user's basic performance characteristics would allow apps to serve assets that match the user's device capabilities. I'm imagining a tiered system, with lower values for weaker hardware and higher values for more powerful devices: initially, 1 for mobile, 2 for desktops with integrated graphics, and 3 for desktops with discrete GPUs.
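
A sketch of what those tiers might look like to an app; the enum and the way a user's tier would be read are hypothetical.

// Hypothetical performance tiers, matching the proposal above.
enum PerformanceTier {
    Mobile = 1,             // standalone/mobile hardware
    IntegratedDesktop = 2,  // desktop with integrated graphics
    DiscreteDesktop = 3     // desktop with a discrete GPU
}

// Example of how an app could pick assets per user (file names are made up).
function modelUrlFor(tier: PerformanceTier): string {
    return tier >= PerformanceTier.DiscreteDesktop
        ? 'models/monkey-high.glb'
        : 'models/monkey-low.glb';
}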

Attach avatar to object - allow avatar position to be synced to object position.

I realize this might seem like it's not a good idea, but without it people will find other, much worse ways to achieve the same thing, like putting avatars inside inverted boxes or moving them with planes.

  • Attach the avatar to a rigid body/object to move with it.
  • Allow the avatar to move about within a limited space, e.g. on a large vessel.
  • Allow multiple users to sync to the same object, maybe via attachment slots.

This allows users to be put inside things: cars, boats, bikes, elevators, etc.

Users should be able to detach at any time. The attachment should only control position, but could possibly allow the initial rotation to be set as an offset.

Event listeners don't work in other world spaces

When an actor listens for events (like onHover and onClick), the events only work if the item is created in the homespace, not if the item is created in a new world space.

Reproducible with the hello-world example: it works okay in the homespace (grow/shrink on hover, side flip on click), but there is no reaction to user interaction when it is created in a world space (it just spins).

(UPDATE) The error is more general: when an item is used in more than one world space, the first instance I encountered works correctly, but the second does not. "Re-enter Space" and "Reset Space" don't affect the bug; one has to completely exit and restart the AltspaceVR client.

Figure out why functional test deploy to openode requires tslib dependency

The functional tests seem to require tslib when deployed to openode.io. For some reason this does not show up as a missing dependency in local builds.
The package exists locally because it is used as a dev dependency by tslint/typescript, so it may be because of that, combined with Lerna?

Either way, we should figure out why it's needed and why our local builds don't complain about it.
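
If the root cause is simply that tslib is pulled in transitively during local builds but never declared, one candidate fix (an assumption, not verified against openode; the version shown is only an example) is to declare it explicitly in the functional tests' package.json:

{
    "dependencies": {
        "tslib": "^1.9.0"
    }
}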

Feature Request: Access AltspaceVR Space Editor placed entities

If I place a number of space components in a world, I'd want to be able to access and modify them from within my MRE.

Note that it needs to be able to restore space components when a user re-enters the space or stops/restarts the MRE.
One possible implementation:
In the space editor, assets are associated with the MRE (or tagged as Used By MRE).
While the MRE is in the isPlaying state, the original space components are hidden from the scene; when the MRE stops playing or is removed, the original assets are unhidden.
The MRE itself would then call CreateFromLibrary with a custom parameter, which would clone all of the associated hidden space components.

Material backface culling/no culling

Be able to dynamically create and apply materials to geometry, and to set/update glTF textures or colors (albedo/metal-roughness/normal/occlusion/emissive).

gltf-gen does not build or publish documentation

The code is already documented for typedoc, but we need to automate its generation, and we need to decide where to publish it. I see two possibilities:

  1. Build docs to a markdown file and host it directly from the GitHub repo.
  2. Build docs as HTML to /docs/gltf-gen, and link from the developer portal.

AltspaceVR Android doesn't run MREs

The MRE connection fails on Android versions of AltspaceVR (Gear VR, etc.).

Repro steps:

  1. Place an MRE in any space in AltspaceVR and enter a valid URL, for example ws://mre-hello-world.azurewebsites.net.
  2. On Android platforms, the MRE does not show up. Entering the same space on Windows, the MRE shows up fine.

Feature Request: Player Masking

We need the ability to enable behaviors for one user, all users, or possibly user groups/teams.
We need the ability to enable actors for one user, all users, all users except one, or possibly user groups/teams.

This must be dynamic (as people join and leave teams).
It must support recording and playback.

Some use cases:

  • First-person shooter: a player sees their own first-person rig while everyone else sees a full 3D model; when players die, perhaps they can see everyone?
  • Card games: hide cards from everyone but the player; the moment the player plays a card, it becomes visible to everyone.
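
A generic sketch of per-user visibility masking, independent of the SDK; the UserMask class and its methods are hypothetical.

// An actor is visible either to everyone or to an explicit set of user ids,
// with optional exclusions, and the mask can change at runtime (joins, deaths, teams).
class UserMask {
    private included = new Set<string>();   // empty set means "all users"
    private excluded = new Set<string>();

    public includeOnly(...userIds: string[]) { this.included = new Set(userIds); }
    public exclude(userId: string) { this.excluded.add(userId); }
    public reset() { this.included.clear(); this.excluded.clear(); }

    public isVisibleTo(userId: string): boolean {
        if (this.excluded.has(userId)) { return false; }
        return this.included.size === 0 || this.included.has(userId);
    }
}

For the card-game case: call includeOnly(ownerId) while the card is in the player's hand, then reset() the moment it is played so everyone can see it.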

Programmatically create functional tests

Right now we hardcode each functional test into the functional test level. This isn't scalable. We should instead read each test configuration from a JSON file and dynamically instantiate each test station. The JSON file might look like:

{
    "host": "mre-hello-world.azurewebsites.net",
    "tests": {
        "rigid-body-test": {
            "name": "Rigid Body Test"
        },
        "text-test": {
            ...
        }
    }
}
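
A sketch of the dynamic instantiation side, assuming a registry that maps each test key to a factory; readFileSync and the config shape above are the only concrete pieces, the registry and factory types are illustrative.

import { readFileSync } from 'fs';

interface TestConfig { name?: string; }
interface TestManifest { host: string; tests: { [key: string]: TestConfig }; }

// Hypothetical registry: each functional test registers a factory under its key.
const factories = new Map<string, (cfg: TestConfig) => void>();

function loadTests(path: string) {
    const manifest: TestManifest = JSON.parse(readFileSync(path, 'utf8'));
    for (const key of Object.keys(manifest.tests)) {
        const create = factories.get(key);
        if (!create) {
            console.warn(`No factory registered for test "${key}", skipping`);
            continue;
        }
        create(manifest.tests[key]);   // instantiate one test station per entry
    }
}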

Feature Request: Audio

Key features

  • Play one-shot sounds at a 3D position
  • Preload sounds
  • Play audio streams at a 3D position
  • Set/modify pitch and volume per sound instance
  • Automatically synchronize audio streams when additional users join

Feature Request: portals/teleport

In-space teleporting and movement:

  • The ability to move a user to a desired location.
  • Setting a base velocity vector (allows for activities like downhill skiing, where the user is set in motion).
  • Reparenting the user to a given actor (allows the user to ride a vehicle such as a mine cart).

World teleporting:

  • Portals and teleporters

Feature request: Attach object to user avatars

We need the ability to attach MRE actors to an individual user's avatar, either to the root node or relative to different body parts.
It must work with any avatar model that exists now or in the future, so the system can't make assumptions about specific models.

This should probably be done with pre-defined attach points on the avatars (hat, backpack, etc.). That would allow easy customization of avatars, and we would update all avatar rigs with these attach points.
