January 05, 2026
Late at night, in a dimly lit room, two friends hunch over a digital bomb-defusing game. “Cut the red wire!” one shouts. “Which red wire? There are two!” the other gasps, sweat on his forehead.
The game Keep Talking and Nobody Explodes demands exactly what its title suggests—keep talking, or it all blows up. One player sees a ticking bomb with cryptic modules but has no manual; the other holds the instructions but can’t see the bomb. Their only chance is to describe, listen actively, and think together. Misunderstandings or silence mean disaster.
This nerve-wracking exercise is a perfect metaphor for today’s IT projects: in complex development teams, each member holds a piece of the puzzle, but no one sees the full picture from the start. Persistent dialogue and coordinated listening become the lifeline that keeps projects alive when deadlines loom and risks flash red. Like in the game, teams avoid “explosions” by constantly adjusting course through conversation and feedback—and trust grows from this interplay. When someone says, “I need help, what does the manual say?” that moment of open communication transforms stress into progress.
The impact of dialogue extends far beyond the gaming world. Last spring, we helped a municipality in northern Sweden introduce an AI-based conversational partner in elder care. The goal was to see if a digital “companion” could improve safety for seniors and ease the burden on staff.
The results surprised everyone: the platform became a double value engine. For the elderly, AI created a sense of presence and companionship—a warm voice available even when staff couldn’t be there, reducing loneliness and increasing everyday security. For caregivers, it became a tool to quickly gauge each person’s well-being and history; if a door didn’t open in the morning, they knew something might be wrong, and the AI could even provide clues about what happened overnight.
In both roles, it wasn’t technical perfection that mattered—it was the ongoing dialogue that built trust, calm, and real utility. Empathy was embedded in the technology: the digital colleague “knew” personal details (asking about the cat, reminding about medication) and could alert staff if something seemed off. By mimicking human conversation, AI built bridges of trust. True progress depended on our ability to engage in genuine dialogue—with both people and machines—and to navigate uncertainty together.
An early morning in northern Sweden. A home-care worker steps into a silent stairwell. Two apartment doors remain closed; no sounds, no movement. The light is dim, the air still. It’s a space between worlds—between night’s solitude and day’s care, between worry and reassurance. Here, in this empty stairwell, a liminal space emerges.
Liminal spaces are transitional zones—physical or mental—where the old has been left behind but the new has yet to take shape. The word “liminal” comes from the Latin limen, meaning threshold. We stand on the edge: no longer in the past, not yet anchored in the future. And in this strange in-between, something magical can happen.
In home care, such moments are charged with empathy, attention, and presence. These thresholds invite creativity and transformation. The gaming world has long explored liminality. In the classic Myst, players awaken among misty islands and deserted libraries—no clear instructions, no immediate threats, just an enigmatic silence. This ambiguity sparks curiosity: without fixed rules, you’re free to experiment, solve puzzles in unexpected ways, and think beyond the obvious. Liminality becomes a creative catalyst—stepping outside your comfort zone opens new mental pathways.
Workplaces experience similar phases during transitions. Imagine a development team leaving behind an old method but not yet mastering the new. It feels messy: “Are we doing this right? Who decides now?” These in-between times can feel frustrating—like waiting in an airport lounge between flights—but they also offer fertile ground for innovation.
Research and experience show that uncertainty breeds creativity, if it is managed well. When no one claims to have all the answers, bold ideas surface. When routines dissolve, experimentation becomes the norm. Organizational psychology describes liminality, at its best, as a “fertile chaos zone.” It takes courage and guidance to harness it. Wise leaders mark transitions (celebrating the end of an old system before launching the new) and encourage an exploratory mindset.
In today’s development world, a silent struggle is underway: the clash between data and intuition. In game development, data-driven methods have exploded—everything is measured, tested, and optimized. The analyst’s voice is now as influential as the creative director’s. This brings superpowers: spotting flaws, testing design alternatives, and choosing the objectively best option.
But there’s a flip side. Many creators report a new kind of frustration: “Are the numbers taking over our vision?” One designer described it as the soul “bleeding” when a beloved idea gets rejected because it doesn’t fit the metrics. Meanwhile, analysts grow exasperated with gut-driven colleagues who ignore clear user data.
This tension is just as visible in system and business development. KPIs and dashboards steer decisions. But what happens to craftsmanship and creativity? A developer might know a feature will delight users—yet the numbers say otherwise. Do we dare stand our ground?
Managing this requires emotional resilience. Teams must take data seriously—without taking it personally. Like an author enduring an editor’s critique, developers must accept when data exposes flaws. But we can’t become data fundamentalists. Passion and meaning can never be fully quantified.
The best game teams seat analysts and creatives side by side. Numbers become decision support, not decision makers. Behind every data point are real people with subjective experiences. When data contradicts an idea, address the frustration openly: “What do you feel we’re missing if we only follow the numbers?” And the creative can ask: “Can we measure my intuition somehow?”
Once again, dialogue becomes the bridge—connecting feeling and logic.
We stand at the threshold of a paradigm shift: AI as a creative collaborator. Generative models appear as idea generators, coding assistants, and conversational partners. Integrating AI feels like hiring an alien colleague—brilliant in some ways, baffling in others.
Organizations now live in a permanent liminal phase—oscillating between human and machine. The boundary of what is “our” work and what is “theirs” (AI’s) is constantly moving. It’s exciting—but mentally demanding.
Questions pile up, and none of them have fixed answers yet. We’re in terra incognita. AI doesn’t change what makes collaboration successful; it amplifies the need for it. Clear goals, strong communication, and trust matter more than ever.
AI acts like a magnifying glass—intensifying team dynamics. Good communication smooths integration and helps navigate friction. Poor communication breeds confusion and resistance. High trust and psychological safety give courage to challenge AI; low trust sparks frustration.
Think of AI integration as a vast liminal experiment. We stand between eras—and must shape collaboration together.
Ultimately, success doesn’t hinge on the perfect algorithm—it depends on our human ability to create mutual understanding, with both people and machines.
Empathy is our most vital compass. In development teams, it means sensing colleagues’ concerns, noticing when morale dips, and understanding user frustration. Empathetic communication turns stress into solidarity—and strengthens both bonds and performance.
In human–AI collaboration, empathy takes new forms. We need empathy for the user—AI executes tasks, but we design experiences. We also need empathy toward AI: understanding its limitations and “thinking style.” Treat AI like a new intern—talented but inexperienced. Give clear instructions, be patient, take responsibility.
Empathy brings calm and direction when the world feels uncertain. When pace is high and change relentless, pausing to check in with each other is priceless.
The most innovative organizations of the future will unite high technology with high humanity. They’ll recognize that every AI, every data strategy, every new method ultimately exists to serve people—and must be shaped by human values.
Innovation doesn’t ignite spontaneously—just as fire doesn’t light itself. It needs fuel, warmth, and air.
The spark? It often strikes when two minds meet from entirely different starting points—and choose to collaborate.
So: Keep talking. Keep listening, sensing, and sharing. In a world where AI and algorithms join our teams, embracing dialogue and in-between states is critical. Fill uncertainty with curious questions and shared insights. Then we’ll be ready to defuse complex problems, ignite new ideas, and guide each other through every foggy corridor.
Conversation and empathy are the bridges that carry us across the threshold into tomorrow’s world.
Senior Test Expert & SogetiLabs Member