How Emotions Shape Your Digital Experience

“… people will forget what you said, people will forget what you did, but people will never forget how you made them feel.” – Maya Angelou

“It was scary at the beginning, then it was funny and really cool,” my 5-year-old answered when I asked what she thought of the movie we’d just seen. Not a word about the plot, the setting or the characters. Her memory of the movie experience was purely emotional.

There’s a psychological reason for this. A 5-year-old’s brain has finite cognitive resources. Details take time and energy to process and store. Memories of feelings, on the other hand, are a shortcut. They provide the necessary information with minimal brain workload.

Adult brains work in almost the same fashion. We are hardwired to construct memories of experiences based on the emotions they evoked. And what defines these emotions? True to Maya Angelou's words, it's usually not just words. Rather, it's the non-verbal cues: gestures, facial expressions, eye contact, touch and body language that add meaning to verbal cues and ultimately mold our memories.

Consequently, we understand and retain experiences in the physical realm based on a series of verbal and non-verbal behaviors. And guess what? The digital world is no different.

Digital Body Language: Beyond a Buzzword

In the digital realm, as in the physical world, experience is composed of a series of behaviors, some verbal (or textual) and some non-verbal. To derive meaning from a visitor's interaction on a website, we need to take into account both what he or she says, i.e. expresses overtly via text or voice, and what he or she does.

What a visitor does during a website visit — the subtle nonverbal cues so crucial to understanding experience — is referred to as digital body language. Just as the mechanism used by an adult to interpret non-verbal cues is more sophisticated than that of a 5-year-old, understanding the subtleties of digital body language is a complex science.

Why is it so complex? First off, because, as any robotics or AI expert will tell you, emulating anything the human brain does is really hard. Secondly, because to measure digital body language, it’s necessary to examine a multifaceted matrix of online behavior made up of behavioral attributes, page elements and visitor actions, over time. Just like the human brain needs a complete picture of words and actions to distill meaning and create memories, technology that effectively interprets digital body language must take all these factors into account.

Rethinking Metrics

To truly and accurately read digital body language, we need to rethink the way we approach page-level and site-level metrics.

By way of example, the holy grail of traditional online metrics is conversion. Yet while conversions are absolutely crucial to business, they are relevant only to the roughly 3 percent of visitors who convert. For the other 97 percent, it's common to try to quantify individual actions like clicks and scrolls, in the hope of understanding why they did not convert.

Yet, just like a smile in the physical world is only a true expression of pleasure in context (think of a polite smile in the face of a bad joke), clicking, scrolling, and other commonly-tracked on-screen behaviors can’t be understood in a vacuum.

In other words, narrow quantification does not equal understanding when it comes to digital body language.

Digital body language, like its offline counterpart, can only be understood in context. Clicking and scrolling are not metrics with intrinsic digital body language value. Slow scrolling is different from fast scrolling and different types of mouse movements reflect different intentions. To effectively interpret digital body language, we need to start with a sophisticated set of metrics like hover time over a call to action, active versus inactive time, clicks on elements, scrolling speed and reach, number and speed of mouse moves, direction of mouse movement (horizontal versus vertical), and much more. 
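To make this concrete, here is a minimal illustrative sketch, in Python, of how a few of these metrics might be computed from a timestamped event log. The event names, fields and idle threshold are hypothetical assumptions for illustration, not a real analytics API:

```python
from dataclasses import dataclass

@dataclass
class Event:
    """Hypothetical interaction event; names and fields are illustrative."""
    t: float          # seconds since page load
    kind: str         # e.g. "hover_cta_start", "hover_cta_end", "scroll"
    y: float = 0.0    # scroll position in pixels (used by "scroll" events)

def hover_time_over_cta(events):
    """Total seconds the cursor spent hovering over the call to action."""
    total, start = 0.0, None
    for e in events:
        if e.kind == "hover_cta_start":
            start = e.t
        elif e.kind == "hover_cta_end" and start is not None:
            total += e.t - start
            start = None
    return total

def active_time(events, idle_gap=5.0):
    """Active vs. inactive time: gaps longer than idle_gap (an assumed
    threshold) between consecutive events count as inactive."""
    total = 0.0
    for prev, cur in zip(events, events[1:]):
        gap = cur.t - prev.t
        if gap <= idle_gap:
            total += gap
    return total

def mean_scroll_speed(events):
    """Average scroll speed (pixels/second) across all scroll events."""
    scrolls = [e for e in events if e.kind == "scroll"]
    if len(scrolls) < 2:
        return 0.0
    dist = sum(abs(b.y - a.y) for a, b in zip(scrolls, scrolls[1:]))
    span = scrolls[-1].t - scrolls[0].t
    return dist / span if span > 0 else 0.0
```

The point of the sketch is the contrast with raw counts: the same five events yield three different signals (dwell on the call to action, engaged time, scroll velocity) once context and timing are taken into account.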

Only by watching these online equivalents of non-verbal behavior closely can we evaluate how on-screen actions correlate with different states of mind, reading each visitor's digital body language.

Reading the Behavioral Cues 

Interpreting digital body language is an emerging science that has already given birth to a first generation of technological solutions. Using these tools, we can adopt a more holistic approach to online behavior and learn far more about our visitors than ever before.

We can understand, in a very real and quantifiable way, how they are feeling as they browse, purchase or leave our site. We can identify their mindset and adapt content on-the-fly to best suit them. We can take steps to ensure that whatever they do, their overall experience is, in my daughter’s words, “really cool.”


Originally published at Psychology Today