Remember when companies just used AI? That was quaint. Now, they're not just using it; they're owning it. And by "it," we mean the entire data ecosystem that feeds those hungry algorithms. The goal? To customize AI so precisely it practically reads their minds — or at least, their spreadsheets.
The trick, apparently, is to balance this newfound data ownership with making sure the good stuff (high-quality data) flows safely and reliably. Because, as any AI will tell you, garbage in, garbage out. And nobody wants a sentient spreadsheet spitting out nonsense.

This delightful paradox was the talk of the town at MIT Technology Review’s EmTech AI conference, where the buzzword on everyone's lips was "AI factories." Think of it: a dedicated, data-churning facility designed for scale, sustainability, and, most importantly, control. Because apparently, data sovereignty is the new black, not just for businesses, but for entire governments.
Who's Building These Things?
One of the architects of this brave new world is Chris Davidson, VP of HPC & AI Customer Solutions at HPE. He's the guy leading HPE's global charge into "AI Factory solutions" and "Sovereign AI." Basically, he's helping governments and big businesses build their own secure, scalable AI systems. Systems so robust, they could probably run a small country. Or at least its tax department.
Davidson also steers the ship for product management and performance engineering for HPE’s high-performance computing (HPC) and AI products. This includes the platforms that train those ridiculously large AI models you keep hearing about, and the Cray exascale systems that make your laptop look like an abacus. His teams are setting the strategy, ensuring HPE stays firmly planted at the cutting edge of "computers that think really, really fast."

Then there's Arjun Shankar, Division Director for the National Center for Computational Science at Oak Ridge National Laboratory. His gig involves making sure computer science plays nice with large-scale scientific discoveries. Discoveries that, you guessed it, rely on scalable computing and data science. He’s also teaching the next generation of data wranglers at the University of Tennessee’s Bredesen Center. Because someone has to.
The takeaway? We’re moving beyond just using AI. We’re building custom, secure, data-powered intelligence hubs. Which, if you think about it, is both impressive and slightly terrifying.