If you’re interested in technology or futurism, you’ve probably heard of simulation theory by now, popularised by Elon Musk over the last few years. Whilst we will be running an article delving into that subject, for now watch Elon explain the idea (and a conference attendee completely miss the point) if you’re new to it.
The fact is we are edging ever closer to creating our own simulated realities, at an exponential pace. Photo-realistic video games and advancements in AR are arriving rapidly, and games like EVE Online already prove the scale we can manage. Now, in 2018, haptic technology is gaining the same kind of momentum.
If you own a recent iPhone you’ll already be used to haptic feedback: the phone ‘pushing back’ when you execute certain actions. Reach the bottom of an Instagram feed and you’ll feel the phone vibrate, as if you’ve touched a physical edge.
We’ve seen rudimentary haptic gloves, control surfaces and peripherals. But in 2018 it’s clear that, as a species, we’re starting to create the next level of haptic technology.
Teslasuit
Teslasuit is one of the most advanced examples of the technology and the theory behind it. Whilst the suit has been in development for a few years now, at CES 2018 it was present again with an upgraded and improved model, boasting haptic feedback, climate control, and motion-capture positioning, to name a few features.
The climate control technology is exactly the advancement I’m talking about. The ability to feel hot and cold at different extremities puts us into a whole new world of ‘real’, whether that be a sunrise, a campfire, or a cold winter’s night.
To give you an idea of just what I’m talking about here, watch the video below from a development test in 2015.
ContactCi Haptic Glove
ContactCi’s prototype haptic glove is taking touch to a whole new level. Our ability to interact with objects, to pick up and move inanimate items, all comes down to touch. With haptic feedback and tracking technology that even includes synthesised tendons, the ContactCi glove is advancing us even further. Again, take a look at a development test below:
Ultrahaptics
Unsold on the wearables element? It’s quite possible that the concept behind Ultrahaptics will blow your mind. The company is producing technology that projects haptic feedback onto your hand, with no need for any wearable at all. Imagine touching an AR-projected screen and feeling it exactly where that screen is mapped in real space.
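The ‘projected haptics’ idea is easier to grasp with a sketch. Arrays like the ones Ultrahaptics builds use many small ultrasonic transducers and time their emissions so the sound waves all arrive in phase at one point in mid-air, creating a pressure spot you can feel. Here’s a minimal illustration of the focusing maths; the array layout, frequency and function names are illustrative assumptions, not Ultrahaptics’ actual hardware or API:

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second, in air at room temperature
FREQUENCY = 40_000.0    # Hz; a common choice for airborne ultrasound

def phase_delays(transducers, focal_point):
    """Phase offset (radians) for each transducer so that all emitted
    waves arrive in phase at focal_point, forming a pressure spot."""
    wavelength = SPEED_OF_SOUND / FREQUENCY
    distances = [math.dist(t, focal_point) for t in transducers]
    farthest = max(distances)
    # Closer transducers fire 'late' so their waves line up with
    # those from the farthest transducer.
    return [(2 * math.pi * (farthest - d) / wavelength) % (2 * math.pi)
            for d in distances]

# A 2x2 array of transducers in the z=0 plane (2 cm spacing),
# focusing on a point 20 cm above the corner transducer.
array = [(0.0, 0.0, 0.0), (0.02, 0.0, 0.0),
         (0.0, 0.02, 0.0), (0.02, 0.02, 0.0)]
delays = phase_delays(array, (0.0, 0.0, 0.2))
```

Real devices use hundreds of transducers and steer the focal point by recomputing the delays on the fly; the ultrasound is also typically modulated at a low frequency so the skin can actually perceive the spot.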
This technology could change everything. I’ve already predicted that screens are on their way out, to be replaced with projected AR; this would be the solution for controlling them. We could get to the point of having real touch-gesture interfaces anywhere, anytime.
We’ve proved that within 40 years we can go from primitive pixel blocks to full photo-realistic visuals, using the power contained within what is essentially a ‘set-top box’. If we’re already at the stage of projected haptics as an early-adopter SDK and prototype, it’s almost ignorant to imagine that in 100 years we wouldn’t have virtual worlds that are indistinguishable from our own reality.