An AI has started to “taste” colors and shapes—something more human than it seems.
The human brain blurs the lines between the senses. We don't just see, hear, or taste separately; these sensations constantly interact. Marketers have long taken advantage of this by designing food packaging that "tastes" better through color, shape, or sound. It turns out, AI does something similar.
What’s the flavor of a pink sphere? The sound of a Sauvignon Blanc? These questions aren’t as strange as they seem. Research shows that humans naturally connect sensory experiences. Colors and shapes can change how something tastes. A wine can taste sweeter or muskier depending on the color of the glass or the music playing in the background.
This cross-sensory interaction is automatic, happening without conscious thought. In rare cases, it becomes extreme, where words trigger tastes or music creates colors—a phenomenon called synesthesia.
And now, AI is doing it too.
Generative AI systems, trained on human data, mimic these sensory links. When asked the same questions as humans in studies—like which colors taste sweetest—they produce similar responses. It’s not magic; it’s a reflection of patterns in the data they consume. But it raises an interesting question: If AI thinks like this, maybe these sensory associations are more hardwired in us than we realize.
We Eat With Our Eyes
The brain builds a map of the world using all senses at once. This blending isn’t random. Red and pink are associated with sweetness. Yellow and green signal sourness. Black and brown mean bitter. These patterns make sense if you think about nature: ripe fruits turn red, unripe ones stay green. Our brains learn to link colors with survival.
Shapes follow similar rules. Round shapes feel sweet and safe—think ripe berries. Sharp, spiky shapes feel sour or bitter—think thorns or poisons. These associations likely evolved to protect us, but they now influence how we experience everything, from food to art.
AI Picks Up The Patterns
Inspired by these findings, researchers asked generative AI models like ChatGPT to answer questions about sensory connections. For example, “Which colors match sweetness?” or “What shapes feel bitter?” The AI’s responses mirrored human judgments surprisingly closely.
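The exact protocol behind those studies isn’t detailed here, but as a rough illustration, here is a minimal sketch of how one might pose that kind of forced-choice question to a language model using the OpenAI Python SDK. The model name, prompt wording, and answer options are assumptions chosen for the example, not the researchers’ actual materials.

```python
# Minimal sketch (not the researchers' protocol): ask a chat model which color
# best matches each basic taste, mirroring the forced-choice questions used in
# human crossmodal studies. Requires the `openai` package and an API key in
# the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

TASTES = ["sweet", "sour", "bitter", "salty"]
COLORS = ["red", "pink", "yellow", "green", "black", "brown", "white", "blue"]

def ask_color_taste(taste: str) -> str:
    """Ask the model which color best matches a given taste and return its answer."""
    prompt = (
        f"Which of these colors best matches the taste '{taste}'? "
        f"Options: {', '.join(COLORS)}. Answer with one color only."
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name; any chat-capable model would do here
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # deterministic answers make comparisons across runs easier
    )
    return response.choices[0].message.content.strip().lower()

if __name__ == "__main__":
    for taste in TASTES:
        print(f"{taste:>6} -> {ask_color_taste(taste)}")
```

Repeating questions like this many times and comparing the model’s answers with published human response patterns is, roughly, the kind of comparison the studies describe.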
More advanced models, such as GPT-4, reflected these patterns even more closely than earlier ones, likely because they were trained on richer, more diverse data.
But there’s a twist. AI might just be regurgitating what it’s read in scientific studies, not creating original thoughts. That’s both the strength and limitation of AI: it reflects us. It’s like a mirror, showing us patterns we didn’t even know were there.
Practical Applications
What can we do with this? Sensory marketing is one obvious application. Companies can use AI to identify new ways to design products that “taste” better, look more appealing, or feel more intuitive. Think of a candy wrapper designed to make the chocolate inside taste sweeter—just because of its color or shape.
But AI isn’t perfect. Sometimes it hallucinates—producing results that don’t make sense. And even when it gets things right, it lacks the subtlety and creativity of the human mind. AI can show us what works, but humans still need to decide how to use it.
The Bigger Picture
This blending of senses—both human and AI—raises deeper questions about how we perceive the world. Are our experiences really separate, or are they always interconnected? And if AI can replicate these associations, does it mean that our sensory world is predictable, even programmable?
Perhaps AI isn’t just reflecting our data; it’s reflecting us. And in doing so, it shows us how much of our “human” experience is shared, universal, and deeply wired into the way we see, taste, and feel the world.
We’re not just eating with our mouths—we’re eating with our eyes, our ears, and our minds.