The Aimable team spent a day in the forest at my cofounder Arjé's place, splitting wood and enjoying the quiet. Being out there gave me a bit more perspective on the scale of what we are building in the AI industry and the energy it consumes. It is easy to forget that every query, every workflow, every model call has a real footprint somewhere in the world.
That is why this new Stanford study on local model efficiency caught my attention. The data shows how fast things are changing. Local models can now handle close to 89 percent of single-turn tasks, and intelligence per watt has improved more than fivefold in just two years. With smart routing, most queries can be answered locally while cutting energy use by 60 to 80 percent.
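To make "smart routing" concrete, here is a minimal sketch of a local-first router. Everything in it is an assumption for illustration: the `Query` fields, the difficulty score (imagined to come from a cheap upstream classifier), and the threshold are hypothetical, not taken from the study or from any Aimable product.

```python
from dataclasses import dataclass

@dataclass
class Query:
    text: str
    turns: int              # 1 for single-turn requests
    est_difficulty: float   # 0.0-1.0, assumed output of a cheap classifier

# Assumed tuning point; in practice this would be calibrated empirically.
LOCAL_DIFFICULTY_CAP = 0.7

def route(q: Query) -> str:
    """Send simple single-turn queries to a small on-device model,
    and everything else to a larger cloud model."""
    if q.turns == 1 and q.est_difficulty <= LOCAL_DIFFICULTY_CAP:
        return "local"
    return "cloud"
```

The design choice is deliberately conservative: the router only keeps a query local when both conditions hold, so harder or multi-turn work still reaches the capable model, and the energy savings come from the large fraction of traffic that never needs it.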
I find this encouraging. It is a reminder that responsible AI adoption is not only about safety and control but also about making real progress on efficiency. When we started Aimable, the focus was on helping organizations steer and govern AI inside their walls. Seeing energy savings emerge as a natural consequence of better orchestration is a welcome bonus.
The photo I am adding has nothing to do with AI on the surface, but the contrast makes the point for me. If we want powerful systems, we also need to build them in a way that respects the world that powers them.