Augmented Unreality is a plugin for Unreal Engine 4 that enables the creation of augmented reality applications by displaying a camera's video stream inside the game and tracking the camera's position using fiducial markers.
It was created by Krzysztof Lis (Adynathos) as part of a project for ETH Zürich.
- Video stream displayed in-game
- Multiple video sources: cameras, video files, network streams; the source can be switched using the in-game UI
- Camera position tracked using fiducial markers; multiple independent sets of markers can be tracked at once
- Editable spatial configurations of markers
- Camera calibration
- Augmented Unreality Plugin 1.1.0 (UE 4.13.1) - the plugin files only,
- Augmented Unreality Example Project 1.1.0 (UE 4.13.1) - an example project using the plugin
- Download the example project
- Decompress the archive and move AugmentedUnrealityPr to the location where you store your Unreal projects.
- Launch Unreal Engine and open AugmentedUnrealityPr/AugmentedUnrealityPr.uproject.
- Navigate to AugmentedUnrealityPr/Saved/AugmentedUnreality/Markers and print the board images: AURBoard_SquareA_C_0/AURBoard_SquareA_C_0.png, AURBoard_SquareB_C_0/AURBoard_SquareB_C_0.png.
- Connect a camera and launch the game.
- If the virtual objects are not well aligned with the markers, perform camera calibration.
- Download the plugin
- Decompress the archive and move the AugmentedUnreality directory to YourProject/Plugins
- Reopen your project
Video acquisition is achieved using OpenCV's VideoCapture.
Video capture and processing is performed by an AURDriver object, such as the ExampleDriver class in the example project. You can adjust the video settings by creating a child blueprint of AURDriver_Default and editing its properties. Once you have your AURDriver blueprint, use it as the value of your PlayerController's CameraDriverClass (if your PlayerController inherits from the example project's) or pass it as DriverClass when spawning an AURCameraActor.
The key property of AURDriver is DefaultVideoSources - a list of video source classes that will be automatically created and available to switch through the UI.
The plugin can obtain video from various sources. To use video from a given source, create a blueprint for it and add it to your AURDriver's DefaultVideoSources. Your video source blueprint should extend one of these superclasses:
- AURVideoSourceCamera - video from a camera directly connected to the computer. Properties:
- CameraIndex - 0-based number of the camera. If you have only one camera, the index should be 0.
- DesiredResolution - the driver will attempt to set the camera to this resolution, but the camera is not guaranteed to accept it. Generally, a lower resolution means lower quality and accuracy but a higher refresh rate.
- AURVideoSourceFile - video from a file. The VideoFile property should be the path to the file, relative to FPaths::GameDir().
- AURVideoSourceStream - video streamed through network. Set only one of the following:
- ConnectionString - a GStreamer pipeline ending with appsink.
- StreamFile - path to a .sdp file relative to FPaths::GameDir().
Properties shared by all video sources:
- SourceName - name to be displayed in the graphical list of video sources
- CalibrationFileName - location of the file storing calibration for this video source, relative to FPaths::GameSavedDir()/AugmentedUnreality/Calibration. If two sources use the same camera, they should have the same calibration file.
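For example, an AURVideoSourceStream could receive an RTP/H.264 stream. The exact pipeline depends on how your stream is produced; a hypothetical ConnectionString (note it ends with appsink, as required) might look like:

```
udpsrc port=5000 caps="application/x-rtp, media=video, encoding-name=H264, payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! appsink
```

Alternatively, a hypothetical .sdp file describing the same stream (the port, address, and codec are placeholders to adapt to your setup):

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Example AR stream
c=IN IP4 127.0.0.1
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000
```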
The best quality is obtained when the camera is calibrated. In particular, it is important to determine the camera's field of view: if it differs from the rendering engine's field of view, the virtual objects will not be properly aligned with the real world. If you notice that the virtual objects appear to move relative to the real world when you move the camera, the camera is not correctly calibrated.
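To illustrate why calibration matters: under a pinhole camera model, the horizontal field of view follows directly from the calibrated focal length, so an incorrect focal length yields an incorrect FOV for rendering. A minimal sketch with hypothetical numbers (1280 px wide image, focal length of 1000 px):

```python
import math

def horizontal_fov_deg(image_width_px: float, focal_length_px: float) -> float:
    """Horizontal field of view of a pinhole camera, in degrees."""
    return math.degrees(2.0 * math.atan2(image_width_px / 2.0, focal_length_px))

# Hypothetical values: a 1280 px wide image with fx = 1000 px
fov = horizontal_fov_deg(1280, 1000)  # roughly 65 degrees
```

If the rendering engine were left at a default FOV (often 90 degrees) while the physical camera has a FOV like the one above, virtual objects would visibly drift against the video when the camera moves.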
Each VideoSource can have different camera parameters, therefore each has its own calibration file, located at FPaths::GameSavedDir()/AugmentedUnreality/VideoSource.CalibrationFilePath. The driver will attempt to load this file and indicate in the UI whether the camera is calibrated.
To perform calibration of your camera:
- Print or display on an additional screen the calibration pattern found in AugmentedUnreality/Content/Calibration/calibration_pattern_asymmetric_circles.png
- Open the example project and start the game
- In the menu in the top-right corner of the screen, choose the right video source and click Calibrate
- Point the camera at the calibration pattern from different directions - the pattern is detected when a colorful overlay is drawn over it
- Wait until the progress bar is full
- The camera properties are now saved to the calibration file and will be loaded whenever you use this video source again
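The exact layout of the saved calibration file is determined by the plugin's implementation, but since the calibration is computed with OpenCV, such data is typically serialized with cv::FileStorage. A hypothetical file (all numbers are placeholders) might look like:

```
%YAML:1.0
---
camera_matrix: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [ 1000., 0., 640., 0., 1000., 360., 0., 0., 1. ]
distortion_coefficients: !!opencv-matrix
   rows: 5
   cols: 1
   dt: d
   data: [ 0.1, -0.25, 0., 0., 0.1 ]
```

Here camera_matrix holds the focal lengths (fx, fy) and the principal point (cx, cy); the distortion coefficients model lens distortion.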
This plugin uses ArUco boards for camera pose estimation, specifically the implementation of ArUco in OpenCV contrib.
Boards are used for two purposes:
- Positioning the camera in the game world - this aligns the real and virtual worlds. The board's position in the real world corresponds to the point (0, 0, 0) in the game world. Boards used for camera positioning are set in the PlayerController's MarkerBoardDefinitions property (if you are extending the example player controller) or in the AURCameraActor's BoardDefinitions if you are spawning the camera actor directly.
- Positioning independent actors - to bind an actor's pose to an AR board, add an AURTrackingComponent to the actor and set its ChildActorClass to one of the board blueprints
An ArUco board is a set of square markers, together with their positions and orientations in space. When a board is visible in the video, its pose relative to the camera can be calculated. In Augmented Unreality, we use boards for finding the pose of the camera in game world and for positioning independent actors with their own markers.
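The pose recovery mentioned above can be sketched as follows: marker detection yields the board's pose in the camera frame as a rotation R and translation t, and the camera's pose in the board's (world) frame is the inverse of that rigid transform. A small numpy illustration (the names and the synthetic example are mine, not the plugin's API):

```python
import numpy as np

def camera_pose_in_board_frame(R_board_to_cam: np.ndarray, t_board_to_cam: np.ndarray):
    """Invert the board->camera rigid transform to obtain the camera's
    pose expressed in the board's (world) coordinate frame."""
    R_cam = R_board_to_cam.T                      # inverse of a rotation is its transpose
    t_cam = -R_board_to_cam.T @ t_board_to_cam    # inverse translation
    return R_cam, t_cam

# Synthetic example: board rotated 90 degrees about Z, 1 unit in front of the camera
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([0.0, 0.0, 1.0])
R_cam, t_cam = camera_pose_in_board_frame(R, t)
```

Composing the two transforms (board->camera, then camera->board) yields the identity, which is a quick sanity check for this kind of code.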
Augmented Unreality allows the user to create their own custom spatial configurations of markers in Unreal Editor. Please see the example boards in AugmentedUnreality/Content/Markers and AugmentedUnrealityPr/Content/AugmentedUnrealityExample/Markers.
To design a new board, create a child blueprint of AURBoardDefinition and edit it by adding AURMarkerComponents inside it. Each AURMarkerComponent represents one square on the board.
- Location, Rotation - the pose of the square in space. You can use SceneComponents to organize the board hierarchically.
- Id - identifier of the pattern shown in this square. Each square should have a different Id.
- BoardSizeCm - the length of the square's side. This will automatically set the scale. When printing the boards, please ensure the squares match this size.
- MarginCm - margin inside the square, does not affect the total size.
After you create or edit the board blueprint, launch the game to generate the marker images. Then open the directory YourProject/Saved/AugmentedUnreality/Markers/YourBoardName, print the images, and arrange them in space to match your designed configuration. The IDs of the markers in the editor need to match the numbers present in the images.
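To make the geometry of a marker component concrete, the following sketch computes where a square's corners end up given a center location, a rotation about the vertical axis, and a side length (cf. BoardSizeCm). This is only an illustration of how the properties combine, not plugin code, and it considers a marker lying flat in the XY plane:

```python
import math

def marker_corners(center, yaw_deg, size_cm):
    """Corners of a square marker in the XY plane, given its center (x, y),
    a rotation about Z (yaw, degrees), and its side length in cm."""
    h = size_cm / 2.0
    yaw = math.radians(yaw_deg)
    c, s = math.cos(yaw), math.sin(yaw)
    corners = []
    for dx, dy in [(-h, -h), (h, -h), (h, h), (-h, h)]:
        # rotate the corner offset, then translate by the center
        corners.append((center[0] + c * dx - s * dy,
                        center[1] + s * dx + c * dy))
    return corners

# Hypothetical marker: centered at (10, 0) cm, rotated 90 degrees, 8 cm wide
corners = marker_corners((10.0, 0.0), 90.0, 8.0)
```

When arranging the printed squares, each physical marker should occupy the region its component describes in the blueprint, at the same scale.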
- Windows - fully functional, pre-built packages available.
- Linux - the plugin compiles and the editor can open the project, but launching the game causes a crash, potentially related to differences in memory management between Unreal Engine and OpenCV. I would be grateful for help from someone experienced with memory management across shared libraries.
- Android - there is an Android version of OpenCV so it should be possible to port the plugin to that platform. This is outside the scope of my project but I am willing to help if someone wants to try this.
The following problems have been solved in this plugin; if you want to learn about these topics, please see:
- Including external libraries in UE4
- Multi-threading in UE4 (1) (2) (3) (4)
- Performing OpenCV camera calibration: OpenCV tutorial, adaptation for the plugin
- Drawing on dynamic textures: UE tutorial (a bit old), my adaptation
- Conversion between OpenCV's and Unreal's coordinate systems
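As an illustration of the last point: OpenCV uses a right-handed camera frame (X right, Y down, Z forward), while Unreal uses a left-handed frame (X forward, Y right, Z up) with centimeter units. One possible axis mapping is sketched below; the plugin's actual conversion may differ, and the scale factor depends on the units the poses are expressed in:

```python
def opencv_to_unreal(p_cv, cv_units_to_cm=1.0):
    """Map a point from OpenCV's camera frame (X right, Y down, Z forward)
    to an Unreal-style frame (X forward, Y right, Z up).
    The scale factor accounts for Unreal's centimeter units."""
    x_cv, y_cv, z_cv = p_cv
    return (z_cv * cv_units_to_cm,    # OpenCV forward -> Unreal X (forward)
            x_cv * cv_units_to_cm,    # OpenCV right   -> Unreal Y (right)
            -y_cv * cv_units_to_cm)   # OpenCV down    -> Unreal Z (up), negated

p = opencv_to_unreal((1.0, 2.0, 3.0))  # -> (3.0, 1.0, -2.0)
```

Note that the mapping has determinant -1, which is expected: converting between a right-handed and a left-handed coordinate system necessarily flips handedness.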