
In an exciting leap forward for AI accessibility, Multiverse Computing—one of Europe’s most innovative AI startups—has introduced two remarkably compact yet powerful AI models, playfully named after animal brains. Dubbed SuperFly (inspired by a fly’s brain) and ChickBrain (modeled after a chicken’s brain), these models push the boundaries of efficiency, proving that big performance can come in small packages.
The World’s Smallest High-Performing AI Models
Multiverse Computing’s breakthrough lies in its ability to compress AI models to unprecedented sizes without sacrificing their capabilities. These tiny but mighty models are designed to operate seamlessly on Internet of Things (IoT) devices, smartphones, tablets, and even wearables like the Apple Watch—all without requiring an internet connection.
As co-founder and quantum physics expert Román Orús explained to TechCrunch, “We can compress models so much that they fit directly on your devices. Imagine running advanced AI locally on your iPhone or smartwatch—no cloud dependency, just instant, on-device intelligence.”
The Secret Sauce: CompactifAI
The magic behind this innovation is Multiverse’s proprietary CompactifAI technology—a quantum-inspired compression algorithm that drastically shrinks AI models while maintaining (and sometimes even enhancing) their performance. Unlike traditional compression methods, CompactifAI leverages principles from quantum physics, offering a more refined and efficient approach.
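Multiverse has not published CompactifAI's internals, but quantum-inspired compression methods generally build on tensor-network factorizations, of which a truncated SVD is the simplest case. The sketch below is purely illustrative, not Multiverse's actual algorithm: it replaces one synthetic weight matrix with two much smaller low-rank factors and measures how little accuracy is lost.

```python
import numpy as np

# Illustrative sketch only: CompactifAI's algorithm is proprietary.
# Tensor-network compression generalizes the low-rank idea shown here,
# replacing a layer's weight matrix with a product of smaller factors.

def compress_layer(W: np.ndarray, rank: int):
    """Return rank-k factors (A, B) approximating W, plus the relative error."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]          # shape (m, k)
    B = Vt[:rank, :]                    # shape (k, n)
    rel_err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
    return A, B, rel_err

rng = np.random.default_rng(0)
# A synthetic "weight matrix": mostly rank-20 structure plus a little noise.
W = rng.standard_normal((512, 20)) @ rng.standard_normal((20, 512))
W += 0.01 * rng.standard_normal((512, 512))

A, B, err = compress_layer(W, rank=20)
original = W.size            # 512 * 512 = 262,144 parameters
compressed = A.size + B.size # 2 * (512 * 20) = 20,480 parameters
print(f"{compressed / original:.1%} of original size, relative error {err:.3f}")
```

When a layer's weights have hidden low-rank structure, as in this toy example, the factorized form stores under a tenth of the original parameters while reproducing the matrix almost exactly; real model compression combines ideas like this with quantization and retraining.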
This cutting-edge tech has already been applied to popular open-source models, including compressed versions of Meta's Llama and OpenAI's recently released open-weight models. But Multiverse didn't stop there—it set out to create the smallest, most efficient models possible, leading to the birth of its Model Zoo (a playful nod to the animal brain sizes that inspired the naming).
SuperFly & ChickBrain: Tiny Brains, Big Potential
- SuperFly (94M parameters) – A lightweight AI based on Hugging Face's SmolLM2-135M, optimized for ultra-low-power devices like home appliances. Picture asking your washing machine, "Start quick wash," and having it respond instantly—no cloud processing needed.
- ChickBrain (3.2B parameters) – A more advanced model, compressed from Meta's Llama 3.1 8B, capable of reasoning and outperforming its original version on benchmarks like MMLU-Pro (multitask language understanding), GSM8K (grade-school math), and GPQA Diamond (graduate-level science questions). It runs locally on a MacBook, no internet required.
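A quick back-of-envelope calculation shows why the compression matters for local use. Assuming 2 bytes per parameter for fp16 weights (and, as a further assumption not from the article, a common 4-bit quantized deployment), the weight memory alone works out to:

```python
def model_memory_gb(params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB, ignoring activations and KV cache."""
    return params * bytes_per_param / 1024**3

llama_8b_fp16 = model_memory_gb(8e9, 2)      # ~14.9 GiB: tight even on a 16 GB laptop
chickbrain_fp16 = model_memory_gb(3.2e9, 2)  # ~6.0 GiB: fits comfortably
chickbrain_q4 = model_memory_gb(3.2e9, 0.5)  # ~1.5 GiB if 4-bit quantized (assumption)
```

Cutting 8B parameters to 3.2B more than halves the memory footprint before any quantization, which is what moves a model from "needs a GPU server" to "runs on a MacBook."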
Why This Matters
While these models won’t dethrone GPT-4 or Gemini in raw power, their real value lies in their efficiency. Multiverse isn’t just making AI smaller—it’s making it ubiquitous. From smart home gadgets to industrial IoT, these models open doors for AI in places previously deemed impossible due to hardware limitations.
Industry Adoption on the Horizon
Multiverse is already in discussions with tech giants like Apple, Samsung, Sony, and HP (which recently invested in its latest €189M funding round). Beyond consumer electronics, the startup’s compression tech is being used by major players like Bosch, BASF, and Moody’s for applications ranging from image recognition to predictive analytics.
For developers, Multiverse offers an AWS-hosted API with competitive token pricing, making it easier to integrate these compact models into apps and devices.
The Future of On-Device AI
As AI continues to evolve, the demand for efficient, localized processing will only grow. Multiverse Computing’s tiny but powerful models represent a major step toward a future where AI isn’t just in the cloud—it’s in your pocket, your home, and even your coffee maker.