AR-Sandbox

AR-Sandbox is a visually augmented sandbox. It is inspired by SARndbox and was developed by a team of three students for our augmented reality lecture. The project is available on GitHub.

[Video: Project Setup]

[Video: Project in Action]

Project Setup

We used a Microsoft Kinect to track the sandbox and a regular projector to project the color image back onto the sand. Additionally, we used a big mirror on the wall in our setup.

[Image: Project Setup]

Our goal was to implement an augmented sandbox application for Windows. We decided to use the Unity3D engine as a base so we could use its physics and animation capabilities for additional simulations.

To get the Kinect working in Unity3D we used the Unity3D Kinect plugin by Carnegie Mellon University. For image processing we used the AForge.NET library.
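As a rough idea of what the AForge.NET part looks like, here is a minimal crop-and-blur sketch; the crop rectangle and Gaussian parameters are made-up example values, not the project's real ones:

```csharp
using System.Drawing;
using AForge.Imaging.Filters;

// Sketch of the image-processing step: crop the raw depth image to the
// sandbox area, then smooth it. The rectangle and the Gaussian parameters
// are illustrative placeholders, not the values used in the project.
static class DepthImageProcessing
{
    public static Bitmap Process(Bitmap rawDepth)
    {
        Bitmap cropped = new Crop(new Rectangle(80, 60, 480, 360)).Apply(rawDepth);
        return new GaussianBlur(4.0, 11).Apply(cropped); // sigma = 4, 11x11 kernel
    }
}
```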

The Kinect depth image is cropped and smoothed with a Gaussian filter. Afterwards a mesh is created and updated in Unity3D to represent the sandbox surface. This mesh was intended to be used for physics simulation, but no physics made it into the project. For the coloring we use a self-written shader that blends different layer textures.
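A stripped-down sketch of how such a depth-driven mesh can be built in Unity3D. The grid size, height scale, and class name here are illustrative; the project's actual implementation lives in DepthMesh.cs:

```csharp
using UnityEngine;

// Minimal heightfield mesh driven by a depth grid. Grid size and scale
// factors are illustrative, not the project's real values.
[RequireComponent(typeof(MeshFilter))]
public class DepthMeshSketch : MonoBehaviour
{
    const int W = 64, H = 48;                      // downsampled depth resolution
    Mesh mesh;
    Vector3[] vertices = new Vector3[W * H];

    void Start()
    {
        mesh = new Mesh();

        // Flat grid of vertices; heights come in later, once per Kinect frame.
        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++)
                vertices[y * W + x] = new Vector3(x, 0f, y);

        // Two triangles per grid cell; the topology never changes.
        int[] tris = new int[(W - 1) * (H - 1) * 6];
        int t = 0;
        for (int y = 0; y < H - 1; y++)
            for (int x = 0; x < W - 1; x++)
            {
                int i = y * W + x;
                tris[t++] = i;     tris[t++] = i + W; tris[t++] = i + 1;
                tris[t++] = i + 1; tris[t++] = i + W; tris[t++] = i + W + 1;
            }

        mesh.vertices = vertices;
        mesh.triangles = tris;
        GetComponent<MeshFilter>().mesh = mesh;
    }

    // Call once per Kinect frame with W*H normalized depth values in [0, 1].
    public void UpdateHeights(float[] depthGrid)
    {
        for (int i = 0; i < vertices.Length; i++)
            vertices[i].y = depthGrid[i] * 5f;     // 5f = arbitrary height scale
        mesh.vertices = vertices;
        mesh.RecalculateNormals();
    }
}
```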

[Image: Textured 3D Wireframe]

Problems

During development we encountered a few smaller problems.

Edges of the Box

The semitransparent edges of our sandbox weren't captured correctly by the Kinect and caused value spikes in the depth image. On top of that, the Gaussian filter wasn't able to handle those spikes and produced big squares in the filtered image. As a solution we simply taped over the edges of the box so they were captured correctly.

Calibration

We originally planned an automatic calibration via QR markers. Unfortunately the Kinect's RGB camera had problems with the mirror: the RGB image was far too distorted, so our QR tracking wasn't able to detect any markers. The source code for QR detection is still in the project, but the automatic calibration is not implemented. Instead we calibrated manually, through mouse clicks and by moving the camera around in the Unity3D scene, as sketched below.

[Image: Distorted RGB Kinect Image]

9 Replies to “AR-Sandbox”

  1. Julian! Thank you so much for the GitHub page, videos, and clear explanation. At Nova Labs, a maker space in Reston, Virginia (nova-labs.org), we are starting to build an AR sandbox like the UC Davis one. However, they use Linux, and I was hoping we could find Windows software to do the same thing. But YOU are the only person I found who created an AR sandbox with Windows and Unity. Is that true?

    1. At the time of the project we did some research and indeed didn't come up with any other Windows sandbox. As the project is now almost 2.5-3 years old, I don't know if that still holds true.
      Anyway, feel free to use the AR-Sandbox as it is or as a base for your own project. If you have any further questions, I'd be glad to help out.

  2. How far away should the Kinect be from the sandbox? Sometimes it does not show blue; the mindepth and maxdepth values seem to be improper.

  3. Oh, it's great work you've done. I have some questions: is it possible to add rain, water flow, and volcano eruptions to this? As far as I've seen, SARndbox can do some of these and not others. I've read several articles on implementing water flow simulation, but I still don't know how I can implement it here. Can you throw some light on this?

    1. Hi, thanks for your comment 😀
      The project is using Unity3D, which is a full-fledged 3D game engine. It does have a physics system, although it doesn't have native fluid-flow simulation, AFAIK.
      So you would need to implement such things yourself or look for other projects that already implement it for Unity3D. Unity3D has a huge and very active community, so you'll likely find something. A very naive starting point is sketched below.
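      For illustration only (this is not part of the AR-Sandbox code, and all names here are made up): one step of a naive heightfield "water" simulation, where each cell pushes some of its water toward the neighbor with the lowest combined surface.

```csharp
using UnityEngine;

// Illustrative only, not part of the AR-Sandbox project. One step of a very
// naive heightfield water simulation over the sand surface: each cell moves
// water toward the neighbor with the lowest surface (sand + water).
public static class WaterFlowSketch
{
    public static void FlowStep(float[] sand, float[] water, int w, int h, float rate)
    {
        int[] dx = { 1, -1, 0, 0 };
        int[] dy = { 0, 0, 1, -1 };

        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
            {
                int i = y * w + x;
                float here = sand[i] + water[i];
                int lowest = -1;
                float lowestSurface = here;

                // Find the neighbor with the lowest combined surface height.
                for (int n = 0; n < 4; n++)
                {
                    int nx = x + dx[n], ny = y + dy[n];
                    if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                    int j = ny * w + nx;
                    if (sand[j] + water[j] < lowestSurface)
                    {
                        lowestSurface = sand[j] + water[j];
                        lowest = j;
                    }
                }

                // Move at most half the height difference, scaled by rate,
                // and never more water than the cell actually holds.
                if (lowest >= 0)
                {
                    float amount = Mathf.Min(water[i], (here - lowestSurface) * 0.5f) * rate;
                    water[i] -= amount;
                    water[lowest] += amount;
                }
            }
    }
}
```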

      1. Thanks for your instant reply. If I mount the projector and Kinect upside down and remove the mirror, what kind of changes do I have to make in the code, or will it work fine without changes?

        1. It may somehow “work”. But when you remove the mirror, my guess is that the projector output and/or Kinect input will be flipped. I can't remember whether we had a boolean switch for that, but I don't think so.

          If you run into mirrored/flipped issues, see DepthMesh.cs; that's the script that fetches the depth image from the Kinect and generates the vertices for the mesh. Otherwise check the scale properties of the mesh GameObject in the scene; maybe we had a negative value set up to flip it.

          Another issue is going to be the fine-tuning values for the depth mesh. We cropped the depth value range according to our lowest and highest sand points. So you need to adjust those values depending on how far away the Kinect is from the sandbox and how much sand you have (how deep you can dig or how high you can pile the sand). A sketch of what that could look like is below.
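          As a hypothetical sketch (all numbers and names are made up, not the project's real values):

```csharp
using UnityEngine;

// Hypothetical sketch: mirror the generated surface and re-tune the depth
// crop when running without a mirror. All values are illustrative.
public class NoMirrorSetup : MonoBehaviour
{
    // Depth values (mm) that map to the highest and lowest sand points;
    // they must be re-measured for the new Kinect distance.
    public int minDepth = 950;
    public int maxDepth = 1250;

    void Start()
    {
        // A negative X scale mirrors the mesh. Note that this also flips
        // the triangle winding, so backface culling may hide the surface
        // unless the triangles are reversed or a two-sided shader is used.
        transform.localScale = new Vector3(-1f, 1f, 1f);
    }
}
```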
