Rapid News Roundup

Research leaders funding physical-world AI alternatives to LLMs

Research Leaders and Investors Shift Focus Toward Physical-World AI and Embodied Intelligence

The landscape of artificial intelligence research is undergoing a significant transformation. While large language models (LLMs) like GPT have dominated headlines and funding rounds in recent years, leading scientists and investors are increasingly channeling resources into physical-world AI, robotics, and embodied intelligence: systems that perceive, interact with, and learn from the real environment, rather than relying primarily on scale and language processing.

Major Research Leaders Drive the Shift with Bold Initiatives

Among the most prominent figures championing this new direction are Yann LeCun and Yoshua Bengio, two pioneers of the deep learning revolution. Their recent moves mark a strategic pivot away from scaling LLMs and toward embodied AI:

  • Yann LeCun, a Turing Award laureate and deep learning pioneer, has announced the launch of a startup called AMI, dedicated to developing AI systems capable of operating within the physical environment. LeCun has publicly declared that the pursuit of ever-larger language models is “nonsense,” emphasizing instead the importance of embodied intelligence—AI that can perceive, interact with, and learn from the real world through sensors and actuators.

    • Funding and Vision: LeCun has raised $1 billion for AMI, reflecting strong investor confidence in this approach. The investment aims to develop AI agents capable of physical interaction, sensory perception, and autonomous learning, traits vital for real-world applications like robotics, autonomous vehicles, and perception systems.
  • Yoshua Bengio, another leading AI scientist, is re-engaging with ventures exploring what comes after LLMs. He has teamed up with Xie Saining and garnered backing from tech giants like NVIDIA to support a new company focused on alternative AI paradigms.

    • Goals and Focus: This initiative aims to develop AI systems that surpass the limitations of current large-scale language models, likely emphasizing perception, motor control, and sensorimotor learning. The focus is on creating AI that seamlessly integrates perception and action, enabling more autonomous and adaptable systems.

Supporting Developments: Real-World Data and Embodied Applications

The push toward physical-world AI is bolstered by practical developments and data sources that demonstrate the viability and momentum of embodied approaches:

  • Unintentional Data Gathering via Consumer Activities: One striking example is how "Pokémon Go" players unknowingly helped train delivery robots through their in-game actions, contributing to a dataset of roughly 30 billion images. This vast, organically collected dataset is invaluable for training perception and navigation systems in robotics and autonomous agents, highlighting how everyday human activity can accelerate embodied AI development.

  • Real-World Interaction and Data: Such examples underscore that AI systems can benefit from sensory data collected in natural settings, reducing reliance on curated datasets and advancing the development of perception-driven robots and agents.

Significance and Future Outlook

The concerted efforts by top researchers and large-scale investments indicate a potential paradigm shift in AI research. Instead of solely scaling models for language understanding, the focus is expanding toward perception, motor control, and embodied cognition—areas essential for creating AI that can operate autonomously in complex physical environments.

This shift could lead to:

  • More versatile AI agents capable of physical manipulation, navigation, and interaction
  • Enhanced autonomous systems for robotics, self-driving cars, and assistive devices
  • A broader understanding of intelligence as perception-action coupling, moving beyond language-centric paradigms
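The "perception-action coupling" mentioned above reduces to a closed control loop: sense the environment, choose an action, let that action change the environment, and repeat. A minimal sketch of that loop follows; all class and variable names here are illustrative inventions, not APIs from AMI, Niantic, or any real robotics framework.

```python
# Toy perception-action loop in a one-dimensional world.
# The agent senses its position and moves toward a goal.

class SimpleEnv:
    """A 1-D world in which the agent must reach a goal position."""
    def __init__(self, start=0, goal=5):
        self.position = start
        self.goal = goal

    def observe(self):
        # Perception: return the current state as sensed by the agent.
        return self.position

    def step(self, action):
        # The action (-1, 0, or +1) changes the world itself.
        self.position += action

class Agent:
    """A policy that couples perception directly to action."""
    def act(self, observation, goal):
        if observation < goal:
            return 1
        if observation > goal:
            return -1
        return 0

env = SimpleEnv(start=0, goal=5)
agent = Agent()
trajectory = []
for _ in range(10):
    obs = env.observe()          # perceive
    trajectory.append(obs)
    env.step(agent.act(obs, env.goal))  # act, altering the environment

print(trajectory)  # -> [0, 1, 2, 3, 4, 5, 5, 5, 5, 5]
```

The point of the sketch is structural: unlike an LLM, which maps a static input to a static output, an embodied agent's outputs feed back into its next inputs, so learning must account for the consequences of its own actions.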

As these initiatives progress, they may redefine the core objectives of AI research, emphasizing embodied, perceptual, and motor capabilities alongside or instead of linguistic proficiency.

Current Status and Implications

With Yann LeCun’s $1 billion raise for AMI, Bengio’s new venture backed by industry giants like NVIDIA, and practical demonstrations such as data collection through everyday activities, the momentum behind physical-world AI is unmistakable. This strategic realignment suggests that the future of AI may be less about scaling up language models and more about building systems that understand and act within the physical world, a move that could unlock new levels of intelligence, autonomy, and applicability across industries.

In summary, the AI community is witnessing a decisive pivot toward embodied intelligence, driven by visionary leaders and substantial funding, signaling a new era where physical interaction and perception take center stage in artificial intelligence research and development.

Updated Mar 16, 2026