🔔 The Morph is discontinued. Click to read the announcement.
Taking Back The Interface

[by Stephanie Chedíd]

You’ve probably all heard the term “user interface” before, but just in case you haven’t:

  1. User Interface: (noun) the means by which the user and a computer system interact, in particular the use of input devices and software.

Most people will read this definition and have no issue with it.  But here at Sensel, we aren’t completely satisfied.  In “user interface”, the word we take issue with is “user”.  The tech world has grown accustomed to viewing the humans who interact with computers simply as users rather than as actual humans with their own depth of knowledge, creativity, and physical language, and to the detriment of anyone who uses computers, user interfaces have evolved accordingly.  Tech developers have forgotten to consider the humans behind the “user”: the details of their interaction, and the imperfection, precision, and expression they’re capable of communicating with their hands.

At Sensel, we like to think we’re taking back the interface, evolving it from a “user interface” to a “human interface” by enabling people to communicate with technology that truly captures their expression: not just the shapes but the lines, pressure, and force their hands use to express themselves, whether they’re playing video games or pumping up a crowd with some new beats.

Let’s take a brief moment to rewind:

Once upon a time, the word “computers” referred to actual people.  That’s right, people.  Before World War II (sometime after the glory days of the abacus and before the magic of Texas Instruments), many people were hired, essentially, as human calculators; these people were called “computers”.  Later on, a different breed of “computers”, electronic calculating machines (ENIAC is one such example), was born out of military research during WWII.  IBM later saw an opportunity and began to commercialize these kinds of calculating computers.

I know what you’re thinking: cool factoids, but what’s your point?  Computers started with humans, as humans, even.  After a long time away, we think it’s time to bring the “human” back into our interactions with computers.

Remember when the Sega Genesis came out? No? Nintendo? Xbox? Now I feel old.  The point is, I bet you can recall playing with each (OK, at least one) of these consoles, and how you used to press extra hard on the controller buttons when you wanted to go faster, stop sooner, or jump higher.  Why did you do that?  You knew it wouldn’t make a difference.  A button was a button, and it was going to perform the same action, with the same (lack of) precision, at the same (static) intensity, no matter how hard you pressed.  As humans we use our hands to communicate, just as we use our words and our facial expressions, so this behavior, though lacking in logic, was quite natural.

Isn’t it a shame, though, that you can recall the same frustration with the limited expressiveness of these input devices from the Sega consoles of the ’80s all the way to the Xbox decades later?  It’s strange that the games have evolved, but the ways we interact with them have not.  Or maybe you’ve never questioned it until now.  Well, here at Sensel we’ve been questioning this for years (or rather, since the first time we broke a game controller button from pressing too hard, and wondered why we did it).

Technological evolution is good, but evolving in the wrong direction can sometimes be worse than not evolving at all.  Sensel is here to steer the evolution of the interface back in the right direction: from a limited “user interface” to a limitless “human interface”.  We believe that when you press your controller harder, Yoshi should jump higher.  Case closed.
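As a toy illustration of that closing idea (this is not Sensel’s actual API; the function, names, and numbers here are hypothetical), a pressure-aware game input could map how hard you press to how high the character jumps, instead of treating the button as a binary on/off switch:

```python
def jump_velocity(pressure: float, v_min: float = 5.0, v_max: float = 12.0) -> float:
    """Map a normalized press force (0.0 to 1.0) to a jump velocity.

    Hypothetical sketch: a light tap yields a small hop near v_min,
    a firm press yields a full jump near v_max, interpolated linearly.
    """
    # Clamp the reading so out-of-range sensor values stay safe.
    p = max(0.0, min(1.0, pressure))
    return v_min + p * (v_max - v_min)

# A gentle tap versus a hard press produce different jumps.
print(jump_velocity(0.1))   # small hop
print(jump_velocity(1.0))   # maximum jump
```

The point of the sketch is simply that once the hardware reports a continuous pressure value, the mapping from force to action is a one-line design decision rather than a fixed button.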