The Coming Era of Mixed Reality

Why should we pay attention to mixed reality?

First of all, we need to explain the differences among AR, VR, and MR. According to Milgram's 1994 definition, experiences involving only physical reality and those involving only virtual reality are the two extremes of a continuous spectrum called the reality–virtuality continuum. Along this continuum, an experience mixes virtual and physical reality in varying proportions, and these proportions define different states of reality. AR, short for Augmented Reality, starts from physical reality and adds virtual elements to it. VR, short for Virtual Reality, is entirely a virtual world. MR, known as Mixed Reality, is the notion that lies between VR and AR.

In 1994, Milgram considered only vision as the sense that shapes human perception of reality. In 2009, however, Jeon and Choi extended the framework to include touch. That covers two of the major human senses in mixed reality, but it is far from enough: researchers are pursuing multi-sense immersive experiences along various dimensions. There are gloves that give pressure feedback and garments equipped with temperature sensors. Still, there is no denying that vision will continue to play the most important part in mixed reality.

When people watch TV, read books, or listen to the radio, they normally experience temporary immersion. This kind of immersion is created by the brain itself, but it cannot last very long: people must return to the real world and go on with their daily lives. Once machines can truly take control of human senses, however, we believe mixed reality experiences may intrude on, or even take the lead in, people's daily lives. Information from both the physical world and the virtual world will be blended by computers and delivered back to our brains, with massive invisible computation performed along the way to convince our brains of the illusion. As a result, mixed reality will become a technology that shapes everyday life, and we need to be well prepared for it.


How far away is the future?

We can predict that within the next five years, display technology will make a profound leap, leading to the wide application of mixed reality in daily life. People will be able to switch between virtual and physical reality via their portable devices.

Scenes that once appeared only in movies will become reality, and all kinds of virtual information will be layered onto our environment. Social elements will appear in mixed reality, enabling a real breakthrough of time and space in human social activity. Everyone will get their own visualized virtual avatar. Furthermore, the newly released iPhone X, with its latest facial-detail tracking technology, will push forward the possibility of expressing emotion through a virtual avatar.


What will the future be like?

With the arrival of ARKit on iOS and ARCore on Android, our everyday mobile devices will be able to visualize their surroundings in mixed reality. When this technology matures, the way we interact with digital information will be profoundly transformed. We could keep using it through the current application model (downloading apps, each solving a specific need), or we could turn it into an invisible intelligent assistant. The interface of mixed reality will be thoroughly merged into natural interaction and into our vision. The following are a few features it will have:

Applications will get rid of interfaces and become invisible.

With the spread of mixed reality components, today's 2D interfaces will be entirely replaced by a world built on 3D graphics rendering. Voice Q&A with AI will become the main medium for interacting with information, and people's body language and physical interaction will be set free. The notion of an application will ultimately be forgotten, because physical behavior will replace physical controllers such as the mouse and keyboard and become the new way to interact with our surroundings. Behavior is interaction. Interaction is information.

Mixed reality will influence how people perceive their environment.

Mixed reality can add a new layer of information to our surroundings, and this ability opens new prospects for how people perceive reality. An object gains new functions or significance when information is layered onto it, and every ordinary object can produce its own interaction once it is recognized by the computer, enlarging the entry point of interaction to the entire environment. Human perception and cognition are not precise or objective enough to fully reflect what the physical world is really like, so computers and software designers can exploit this gap. This is also the foundation of the mixed-reality redirected-walking technique we are researching.
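To make the perceptual-gap argument concrete, here is a minimal sketch of the core idea behind redirected walking: each frame, the system injects a tiny rotation "gain" into the virtual heading, kept below the rate a user can notice, so the virtual path stays straight while the physical path curves. This is an illustrative toy, not the authors' implementation; the detection-threshold value and function names are assumptions for the example.

```python
# Assumed (not measured) rate of rotation a user cannot consciously detect.
DETECTION_THRESHOLD_DEG_PER_S = 1.5

def redirect_heading(virtual_heading_deg, desired_offset_deg, dt):
    """Advance the virtual heading toward the desired total offset,
    but never by more than the (assumed) perceptual threshold per frame."""
    max_step = DETECTION_THRESHOLD_DEG_PER_S * dt
    # Clamp this frame's injected rotation to the imperceptible range.
    step = max(-max_step, min(max_step, desired_offset_deg))
    return virtual_heading_deg + step, desired_offset_deg - step

# Simulate ~16.7 seconds at 60 fps: the gain accumulates frame by frame,
# steering the user along a curved physical path that feels straight.
heading, remaining = 0.0, 30.0  # goal: inject 30 degrees in total
for _ in range(1000):
    heading, remaining = redirect_heading(heading, remaining, 1.0 / 60.0)
```

After 1000 frames the system has injected about 25 of the 30 desired degrees without any single frame exceeding the threshold, which is exactly the kind of invisible computation the paragraph above describes.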

Socialization is important for the mixed reality experience.

The socialization we mean here is not just interacting with other users in mixed reality. It is also about synchronizing other users with what the current user sees while experiencing their own mixed content.

There is a common problem at most VR/AR exhibitions: only one person can experience the content at a time, while everyone else waits for the next round, watching a 2D real-time feed on, say, a TV. This makes it much harder to promote a VR product effectively. How to support multiple users in mixed reality, and how to synchronize the views of different users or even let them interact with one another, are all possible directions for future research.

The ideal mixed reality experience should let us conveniently share what we see and what we feel, with free control over how private or open our content is to others.

The original blog post was published in Chinese; this is a translated version by Liquan Liu.