Where is Design
How this project works
Step 1: Scan a piece of paper with text*
Step 2: The text is projected into mixed reality
Step 3: Pick up one of the three tokens (bold/italic/condense); the type design of the text changes accordingly (a minimal sketch of this mapping follows below).
*This function is not fully developed
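As a rough illustration of step 3, here is a minimal Unity/C# sketch of how each token could map to a type treatment. The TokenType enum, the component names, and the narrowed scale standing in for a condensed face are assumptions for this sketch, not the project's actual code.

```csharp
using UnityEngine;

// Hypothetical token types for this sketch; the project used three
// physical tokens: bold, italic, and condense.
public enum TokenType { Bold, Italic, Condense }

public class TokenStyler : MonoBehaviour
{
    public TextMesh projectedText;   // the text projected into mixed reality

    // Apply the type treatment that the physical token stands for.
    public void ApplyToken(TokenType token)
    {
        switch (token)
        {
            case TokenType.Bold:
                projectedText.fontStyle = FontStyle.Bold;
                break;
            case TokenType.Italic:
                projectedText.fontStyle = FontStyle.Italic;
                break;
            case TokenType.Condense:
                // Unity's FontStyle has no condensed variant, so narrowing
                // the glyphs horizontally stands in for a condensed face.
                projectedText.transform.localScale = new Vector3(0.7f, 1f, 1f);
                break;
        }
    }
}
```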
Type: Interaction Design
Duration: 6 weeks
Skills: Unity/C#, Arduino, Microsoft HoloLens, Laser cutting
Instructor: Austin S. Lee
This was a lab course in which I learned many digital tools for prototyping interaction design.
For the final project, there was only one requirement: use whatever we had learned to build a project with kinetic type.
I decided to work with Unity on HoloLens to experiment with typefaces in mixed reality. Given HoloLens's limited gesture support, I wondered whether I could create a tangible input device to enhance the mixed reality experience.
After I discussed the idea with my professor, he suggested I look at the Reality Editor project from the MIT Media Lab.
That project lets users link and control physical objects through a user interface shown in AR. It is a very inspiring project, and my idea is exactly the opposite: using physical objects to control things in AR.
Future world with Mixed Reality
Although we already live in an information-overloaded world, it seems things will only get worse with mixed reality.
With so much text in our field of view, how do we know which parts are important? Are traditional cues, such as bolding text, still applicable in mixed reality? I wanted to experiment with kinetic type on HoloLens to see how typefaces work out in mixed reality.
Iterations of the Tangible Input
The input device went through several iterations. Ideally, AR would reveal the function of each input; without AR, the buttons are identical. However, because of the viewing angle, looking at a button to read its AR label means you can no longer see the text in front of you. As a result, I used the shape of each object as a metaphor to imply its function.
Constructing the System
I chose the Particle Photon, an Arduino-compatible board with a built-in web service, for my tangible input. When an event is triggered, the Photon publishes it to the Particle cloud server. All the HoloLens app has to do is keep listening to that server for triggered events.
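A minimal sketch of that listening loop follows, reusing the TokenStyler sketch from above. It assumes the firmware publishes each button press with Particle.publish("token", "bold") (or "italic"/"condense"); the event name "token" and the access token are placeholders. The Particle cloud exposes published events as a Server-Sent Events stream, so a background thread reads it line by line and hands payloads to Unity's main thread, since Unity objects are not thread-safe. On the UWP/HoloLens build target, the same idea may need UnityWebRequest instead of HttpWebRequest.

```csharp
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Threading;
using UnityEngine;

// Sketch: listen to the Particle cloud's Server-Sent Events stream on a
// background thread and apply token events on Unity's main thread.
public class TokenEventListener : MonoBehaviour
{
    // Placeholder event name and access token.
    const string StreamUrl =
        "https://api.particle.io/v1/events/token?access_token=ACCESS_TOKEN";

    public TokenStyler styler;   // applies the type treatment (see sketch above)

    readonly Queue<string> pending = new Queue<string>();
    Thread listener;

    void Start()
    {
        // Background thread dies with the app, so no explicit shutdown needed.
        listener = new Thread(Listen) { IsBackground = true };
        listener.Start();
    }

    // Background thread: the stream stays open; each published event
    // arrives as a line beginning with "data:".
    void Listen()
    {
        var request = (HttpWebRequest)WebRequest.Create(StreamUrl);
        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                if (line.StartsWith("data:"))
                {
                    lock (pending) pending.Enqueue(line.Substring(5).Trim());
                }
            }
        }
    }

    // Main thread: Unity objects may only be touched here.
    void Update()
    {
        lock (pending)
        {
            while (pending.Count > 0)
            {
                string payload = pending.Dequeue();
                if (payload.Contains("bold")) styler.ApplyToken(TokenType.Bold);
                else if (payload.Contains("italic")) styler.ApplyToken(TokenType.Italic);
                else if (payload.Contains("condense")) styler.ApplyToken(TokenType.Condense);
            }
        }
    }
}
```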
A short presentation and demo were held in the last class, and most people found the project interesting. Gesture control on HoloLens is intuitive but not yet very handy; we are still physical beings and prefer something touchable to control a system.
WHAT I HAVE LEARNED
The design of typography always depends on its medium. Given MR's quite different context, current digital typefaces do not seem to suit the MR world very well. I think it would be very interesting to test other typefaces in this project.
As for the tangible input, considering the field of view of human eyes, the device in this project is still not ideal. A better approach might be a token-sized object that can be held in one hand, with HoloLens recognizing the device and providing further indication in mixed reality.