Technology

AI that talks to itself, rethinks, acts

A new study published in the journal Neural Computation reveals a different approach to artificial intelligence (AI).

January 29, 2026

Artificial intelligence is best known for using algorithms to analyze massive datasets, identifying patterns and structures in order to make predictions, reach decisions, and generate new content. New research, however, points to something different and unusual about how AI can learn.

A team of researchers from the Okinawa Institute of Science and Technology (OIST) found that AI systems perform better across many tasks when they are trained to use inner speech alongside short-term memory.

Researchers showed that internal “mumbling,” combined with short-term memory, helps AI adapt to new tasks, switch goals, and handle complex challenges more easily.

They propose that giving AI an inner voice helps it think more flexibly and learn new tasks faster while relying on less data: self-talk, paired with working memory, can let machines generalize better.

Teaching AI to talk to itself could be the key to smarter, more adaptable machines.

This approach boosts learning efficiency while using far less training data, paving the way for more flexible, human-like AI systems.

Talking to yourself may feel uniquely human, but it turns out this habit can also help machines learn.

Internal dialogue helps people organize ideas, weigh choices, and make sense of emotions.

New research shows that a similar process can improve how artificial intelligence learns and adapts.

As first author Dr. Jeffrey Queißer, staff scientist in OIST's Cognitive Neurorobotics Research Unit, explains, "This study highlights the importance of self-interactions in how we learn."

"By structuring training data in a way that teaches our system to talk to itself, we show that learning is shaped not only by the architecture of our AI systems but also by the interaction dynamics embedded within our training procedures."

To test this idea, the researchers combined self-directed internal speech, described as quiet "mumbling," with a specialized working memory system.

This approach allowed their AI models to learn more efficiently, adjust to unfamiliar situations, and handle multiple tasks at once.
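To make the idea more concrete, here is a minimal sketch of what "structuring training data to teach a system to talk to itself" could look like. This is an assumption, not the authors' actual pipeline: the make_training_example function, the <think> markup, and the sequence-reversal framing are all illustrative inventions.

```python
# A minimal sketch (not the study's code) of structuring training data
# so a sequence model learns to "mumble" before acting: each target
# interleaves self-directed speech tokens with the final answer, so the
# model is trained to produce, and condition on, its own inner speech.

def make_training_example(instruction, items):
    """Build one (input, target) pair with inner speech in the target.

    The <think>...</think> segments are a hypothetical markup
    convention, used here only for illustration.
    """
    inner_speech = " ".join(f"<think>hold {item}</think>" for item in items)
    answer = " ".join(reversed(items))  # example task: reverse the sequence
    return instruction + " " + " ".join(items), inner_speech + " " + answer


src, tgt = make_training_example("reverse:", ["A", "B", "C"])
print("input :", src)   # reverse: A B C
print("target:", tgt)   # <think>hold A</think> ... <think>hold C</think> C B A
```

Trained on pairs like these, the model's own verbalized intermediate steps become part of the input it conditions on, which is the "interaction dynamics embedded within the training procedure" that the quote above describes.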

The researchers began by examining memory design in AI models, focusing on working memory and its role in generalization.

Working memory is the short-term ability to hold and use information, whether that means following instructions or doing quick mental calculations.

By testing tasks with different levels of difficulty, the team compared various memory structures.

They found that models with multiple working memory slots (temporary containers for pieces of information) performed better on challenging problems, such as reversing sequences or recreating patterns.

These tasks require holding several pieces of information at once and manipulating them in the correct order.
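As a rough illustration of why slot count matters (a toy under stated assumptions, not the study's architecture), the snippet below contrasts a single-slot memory with a multi-slot one on the sequence-reversal task: with one slot, each new item overwrites the last, so reversal is impossible, while enough slots preserve every item.

```python
# Illustrative toy: memory with several independent slots versus a
# single slot, on the sequence-reversal task mentioned above.

def reverse_with_slots(sequence, n_slots):
    slots = [None] * n_slots          # temporary containers for items
    for i, item in enumerate(sequence):
        slots[i % n_slots] = item     # later items overwrite old slots
    stored = [s for s in slots if s is not None]
    return list(reversed(stored))


seq = ["A", "B", "C", "D"]
print(reverse_with_slots(seq, n_slots=1))  # ['D'] -- one slot loses items
print(reverse_with_slots(seq, n_slots=4))  # ['D', 'C', 'B', 'A'] -- enough slots
```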

The results showed clear gains in flexibility and overall performance compared with systems that used simpler memory designs.

The findings suggest that learning is shaped not only by the structure of an AI system but also by how it interacts with itself during training.

The study was published in the journal Neural Computation.