A California startup just showed off something that feels like it shouldn't fit in a carry-on: a fully functional AI supercomputer. ODINN's OMNIA system packs the computing muscle of a sprawling data center into a unit small enough to wheel through an airport.
This matters because building a traditional data center is slow and expensive. You need months of construction, specialized cooling infrastructure, raised floors, dedicated power lines. Most organizations that need serious AI capability either wait it out or rent computing time from cloud providers — which means their data leaves the building. OMNIA changes the equation.
The system comes with the same processors, graphics cards, memory, and storage you'd find in enterprise data centers, but it runs on standard office power and networking. It has its own closed-loop cooling, so it stays quiet enough to sit in a hospital lab, research facility, or secure government building without needing a separate server room. Deploy it in minutes instead of months.
Of course, a single carry-on-sized unit has limits. One OMNIA can only handle so much data processing. ODINN anticipated this and built the Infinity Cube — multiple OMNIA systems stacked together in a glass enclosure that functions as a modular data center. Each unit manages its own cooling and computing independently, so you don't need external cooling plants or infrastructure. You just plug it in and scale.
The real shift here is about data sovereignty and speed. Organizations with strict privacy rules — hospitals, government agencies, financial institutions — can now keep their AI work in-house instead of sending sensitive data to cloud providers. There's no latency penalty from sending queries across the internet. And because ODINN built NeuroEdge, a software layer that works with NVIDIA's ecosystem and other AI frameworks, institutions can focus on actually using the AI instead of spending months tuning systems for optimal performance.
At CES 2026, ODINN isn't positioning itself as a data center company or a cloud service provider. It's an AI infrastructure company that solves a specific problem: how do you get high-performance AI capability where you need it, when you need it, without building a facility? For research labs, hospitals, and institutions handling sensitive data, that's a meaningful shift.