Using Spatial Augmented Reality for Collaborative Data Visualization
in Visible Disparity
Bakhsh, N. K., Sagarminaga, P., & Leber, S. (2019)
California State University, East Bay
I spent two years in the Multimedia Master’s program at Cal State East Bay researching emerging technologies that had piqued my interest. One of them was “spatial augmented reality,” which at the time had almost no commercial applications. Most of my early research explored the kind of software and hardware needed to develop this technology. I searched GitHub for existing open source code, looking at anything tagged “augmented reality,” and found “PapARt: Paper Augmented Reality Toolkit – Interactive Projection for Processing.” Its creator, Jeremy Laviole, has worked on it for over ten years now and refers to the technology by a few names, such as “tabletop AR” and “projection-based AR.” Check out the repository below!
“PapARt is a software development kit (SDK) that enables the creation of interactive projection mapping…The strength of this library is the creation of interactive projection (also called spatial augmented reality in research)”
Jeremy Laviole – https://github.com/natar-io/PapARt
After reading through the step-by-step documentation on how to set up the hardware and configure the software, I decided that this open source library would serve as the main source code for programming a spatial augmented reality application. The IDE used was Processing 3, a software sketchbook and programming language based on Java. The hardware would simply be an HD camera and a projector connected to a computer. My thesis teammates and I then brainstormed the subject matter we would display through the spatial augmented reality.
We looked at topics that were in the news at that time (this was in 2018), and one that stood out to us was the number of homeless encampments starting to crop up around Lake Merritt and other neighborhoods in Oakland. News channels were also covering the increasing housing disparity, income inequality, and poverty in Oakland neighborhoods. We researched the many factors behind these issues and began looking for data sets we could use. We found that it would be very helpful if we could visualize these data sets graphically on a map projected onto a large physical surface so we could easily discuss them amongst each other. For the map, we used another open source library for Processing called “Unfolding” and integrated it with PapARt.
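Unfolding handles the map math internally, but the core idea behind plotting a latitude/longitude data point on a tiled map is the Web Mercator projection. The following is a minimal, self-contained Java sketch of that conversion; it is an illustration of the underlying idea only, not code or data from the thesis (the coordinates and zoom level are placeholders).

```java
// Minimal sketch of the Web Mercator projection used by tile-based maps
// (the same projection Unfolding relies on internally). Illustration only.
public class WebMercator {
    static final int TILE_SIZE = 256;

    // Convert longitude (degrees) to a pixel x-coordinate at the given zoom level.
    static double lonToX(double lon, int zoom) {
        double scale = TILE_SIZE * Math.pow(2, zoom);
        return (lon + 180.0) / 360.0 * scale;
    }

    // Convert latitude (degrees) to a pixel y-coordinate at the given zoom level.
    static double latToY(double lat, int zoom) {
        double scale = TILE_SIZE * Math.pow(2, zoom);
        double latRad = Math.toRadians(lat);
        return (1.0 - Math.log(Math.tan(latRad) + 1.0 / Math.cos(latRad)) / Math.PI) / 2.0 * scale;
    }

    public static void main(String[] args) {
        // Placeholder coordinates near Lake Merritt, Oakland, at zoom level 12.
        double x = lonToX(-122.2585, 12);
        double y = latToY(37.8013, 12);
        System.out.printf("pixel: (%.0f, %.0f)%n", x, y);
    }
}
```

At zoom level z the whole world map is 256·2^z pixels on a side; pixel coordinates like these are what a map library ultimately hands to the renderer, and in a projected setup they become the points drawn onto the physical surface.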
Testing out the spatial augmented reality
We argued that by taking data sets that visually represented these neighborhood disparities, plotting them on a map through spatial augmented reality, and viewing them together projected onto a large surface such as a dining table, we would be able to find correlations in the data much more easily than on a small computer screen or even a large television. Furthermore, visualizing data on a physical surface through spatial augmented reality enables use cases a digital screen cannot support, such as projecting onto a painted whiteboard surface that can be written on collaboratively.
In conclusion, I learned a lot about compiling, running, and debugging a program built on an open source library (dependencies, dependencies, dependencies), integrating code from two different libraries, and prototyping a sleek case for the spatial augmented reality hardware; we ended up using an IKEA lamp to house the projector and camera. We were happy that it all came together. To learn more about our research on this technology and the data sets that we used, please read our thesis.
Final Hardware in IKEA Lamp Case
Thesis Team from left to right:
Pierre Sagarminaga, Stephen Leber, Niloo Bakhshi
Presentation of the final Spatial Augmented Reality setup in the Art Gallery at CSU, East Bay
- Key Responsibilities:
- Served as primary technology lead (hardware and software)
- Researched and selected the open source library for Spatial Augmented Reality
- Initiated the rapid building of hardware prototypes using readily available materials
- Created the bill of materials for the final prototype
- Purchased the tech hardware (projector and camera) for the final prototype
- Configured the computer (i7 quad-core Mac mini) that the spatial augmented reality software would run on
- Led the effort in learning, analyzing, and compiling the source code of both libraries before adding additional functionality
- Created and managed all social media accounts (Instagram, Twitter, Google), website (www.visibledisparity.com) and email (visibledisparity@gmail.com)
- Documented all of the weekly builds of the software on our thesis website
- Created documentation for our code additions to the open source code
- Reached out to the creators of the open source software for support and sharing of our code (PapARt and Unfolding)
Check out our final presentation video below for our research and a demo of how the spatial augmented reality setup worked!
- Visible Disparity – IXDIA Thesis Repository
- Published: December 19, 2019
- DOI: 10.21428/e90b493f.6622d608
- Contributors:
- Pierre Sagarminaga
- Niloofar Khoda Bakhsh
- Stephen Leber
- Cite as:
- Bakhsh, N. K., Sagarminaga, P., & Leber, S. (2019). Using Spatial Augmented Reality for Collaborative Data Visualization in Visible Disparity, 2019. MA in Interaction Design and Interactive Art. https://doi.org/10.21428/e90b493f.6622d608