Regardless of how clever today’s e-commerce storefronts are, their digital interfaces lack one particular quality: the ability to mimic human expression. And research indicates that adding it would amount to more than just a makeover.
E-commerce and the “digital storefront” have existed for a little more than two decades. In that time, online shopping has come a long way in terms of user experience and personalization.
And yet, even with the plethora of advanced user interfaces available to e-vendors, a recent IBM study reveals that only 14 percent of online consumers are satisfied with their online shopping experience. After all, shoppers are still forced to speak the language of the computer — i.e., clicking, dragging and scrolling — to carry out tasks.
The advent of large language models (LLMs) such as ChatGPT, Bard and Llama has the potential to help e-commerce businesses align with users’ intuitive expectations. Savvy online retailers have begun integrating their own tailored LLMs so that shoppers can submit any prompt they want (e.g., “do you have these shoes in size 12?”) and receive “thought out” text responses, almost as if an actual store clerk were replying.
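To make that concrete, here is a minimal sketch of what such an integration might look like. It assumes the OpenAI Python SDK and a hypothetical check_inventory() helper standing in for the retailer’s own product database; it illustrates the pattern, not any particular retailer’s implementation.

```python
# A minimal sketch of an LLM-backed storefront assistant (not any retailer's
# actual implementation). Assumes the OpenAI Python SDK (v1+) with an
# OPENAI_API_KEY in the environment; check_inventory() is a hypothetical
# stand-in for the store's own product database.
from openai import OpenAI

client = OpenAI()

def check_inventory(product: str, size: str) -> bool:
    """Hypothetical placeholder for a real inventory lookup."""
    return True

def answer_shopper(question: str) -> str:
    # Ground the model in store data so it answers like an informed clerk
    # rather than guessing about stock levels.
    in_stock = check_inventory(product="running shoe", size="12")
    note = f"Inventory note: size 12 is {'in stock' if in_stock else 'out of stock'}."
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; use whichever model you license
        messages=[
            {"role": "system", "content": "You are a friendly store clerk. " + note},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer_shopper("Do you have these shoes in size 12?"))
```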
But simply integrating an LLM into one’s online storefront isn’t enough. Text isn't the most natural or usable way to get information in a shopping context. Users often want to enjoy their online shopping experience on a level that only face-to-face interactions can provide. But without a face, LLMs remain disembodied entities, often holding merchants back from making genuine connections with customers.
The next stage of digital storefront interfaces should therefore encompass both verbal and graphic elements, as humans are inherently wired to communicate not just through text, but through speech and emotive expression as well. With that combination, online retail interactions can be just as meaningful, helpful and productive as a genuine in-person conversation.
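As a rough sketch of how those pieces might fit together, the pipeline below extends the earlier example: the LLM’s text reply is converted to speech and then lip-synced onto a digital face. Both synthesize_speech() and animate_avatar() are hypothetical placeholders for whichever text-to-speech and avatar-rendering services a merchant chooses; no specific vendor API is implied.

```python
# A hedged pipeline sketch: LLM text reply -> speech audio -> talking avatar.
# synthesize_speech() and animate_avatar() are hypothetical placeholders, not
# real library calls; wire them to whichever TTS and avatar services you use.

def synthesize_speech(text: str) -> bytes:
    """Hypothetical: return speech audio for the given reply text."""
    raise NotImplementedError("Connect this to your text-to-speech provider.")

def animate_avatar(audio: bytes, face_image_url: str) -> str:
    """Hypothetical: return a URL to a lip-synced avatar video."""
    raise NotImplementedError("Connect this to your avatar-rendering provider.")

def respond_with_a_face(question: str) -> str:
    reply_text = answer_shopper(question)   # LLM reply from the earlier sketch
    audio = synthesize_speech(reply_text)   # give the reply a voice
    return animate_avatar(audio, face_image_url="https://example.com/clerk.png")
```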
Digital Faces, Real Reactions
Advances in neuroimaging techniques have enabled scientists to study and map neural connections in the human brain, providing a tangible means of observing the brain’s response to stimuli in the areas linked to processing emotion and interpersonal experiences.
This helped researchers from Maastricht University in the Netherlands conclude that emotional expressions from human-like avatars can elicit natural empathy from human users as if they were interacting with a real person, accompanied by corresponding behavioral and neural activity.
A separate study by The Australian National University found that computer-generated faces elicit the N170 event-related potential (ERP), the same neural waveform the brain produces after seeing a real person’s face.
This makes sense; human minds have evolved over millennia to react emotionally to facial cues. So even though we can recognize human-looking avatars as artificial constructs, we're still capable of instinctively connecting with them.
Looks Matter
The more real, or anthropomorphic, avatar faces look, the more credibility they’re perceived to have, and the more seriously they’re taken.
This is because the presence of the human form triggers individuals’ intrinsic social scripts, such as politeness and reciprocity. When applied to digital avatars, this can elicit cognitive and affective responses from users and make them more accepting of such services.
One study assessed the impact "automated virtual humans" have on alleviating the stigma service members felt when reporting combat-induced conditions like PTSD. Soldiers were found to be more forthcoming when conversing with a virtual human interviewer than they were filling out official Post-Deployment Health Assessment forms.
That same logic applies to shopping experiences. For example, certain items can feel embarrassing to have rung up by a cashier. In these situations, shoppers may feel more comfortable interacting with a virtual e-commerce agent than with a human attendant.
Let’s Face It
The evidence suggests that digital human avatars can evoke the same kinds of emotional and neural responses as interactions with real people.
This underscores the value of incorporating realistic, human-like avatars into the interfaces e-commerce merchants are adopting, where they could play a crucial role in facilitating more effective, empathetic communication with online shoppers.
Providing sound advice and judgment-free assistance is just the tip of the iceberg for e-commerce retailers. As research continues to uncover new applications for humanized AI, it also underscores the technology’s potential to elevate user engagement, foster emotional connections and enhance overall satisfaction.
We're on the brink of a new era when digital humans may indeed prove to be the missing element in the online shopping experience.
Matthew Kershaw is vice president of strategy at D-ID, the go-to provider of AI-powered face-to-face digital assistants. Matthew has a 20-year background in digital media and content strategy, and has developed commercial programmes and services for clients such as LG Mobile, Barclays, Shell, Audi and Samsung. At MTV, as Head of Digital, he oversaw and launched numerous services, including the network’s first streaming on-demand video platform and its mobile and social channels. With certificates from MIT and Codecademy in AI and data science, he helps businesses identify and exploit the gains created by AI, as well as take AI-based businesses into new markets. Matthew has a master’s degree in Philosophy & Experimental Psychology from Wadham College, Oxford University.