Every six weeks, our engineering team at Lune wraps up with a “cooldown” week. It’s a chance to slow down from our usual pace, tackle maintenance tasks, clean up tech debt, and explore ideas that have been quietly waiting in the backlog or floating in our heads.
A few cycles ago, we decided to try something new: dedicate the cooldown week to all things AI. We called it, simply, AI Week.
It turned out to be one of the most energising cooldowns we’ve had.
AI is everywhere right now. From code assistants to search engines to image generators, it feels like everything is “powered by AI.” But as engineers building emissions intelligence, we wanted to go beyond the hype and get a better sense of what this tech can actually do for us, and whether it makes sense to adopt it.
So we set a few goals for ourselves:
Throughout the week, we had short presentations and casual knowledge-sharing sessions from people across the team. The topics ranged from:
Everyone picked something they were curious about, explored it, and shared back with the team. The result? A week full of fun rabbit holes, buggy prototypes, and a lot of learning.
For my part, I played around with transformers.js, a JavaScript library that brings some of Hugging Face’s AI models (pre-trained machine learning models for tasks like translation or text analysis) directly into the browser. This means you can run AI features without sending data to a server, which can be faster and more private.
The goal: see if we could translate part of our product interface into the user's browser language, entirely client-side. In other words, if a customer in Spain opened Lune, they'd see their menus, labels, and instructions in Spanish, without waiting for a server to handle the translation. This could make the product feel more personalised and responsive for international users.
The setup was simple: use transformers.js to load a translation model (the AI brain that converts text from one language to another), detect the user’s preferred language, and translate a chunk of text.
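As a rough sketch of what that looks like (not our production code; the model, language mapping, and UI string below are illustrative):

```js
// Minimal sketch: translate a UI string into the browser language, client-side.
// The NLLB model is one of the translation models published for transformers.js;
// its language codes look like "spa_Latn" for Spanish.
import { pipeline } from '@xenova/transformers';

// Map the browser language tag to an NLLB code (only a few illustrative entries).
const NLLB = { en: 'eng_Latn', es: 'spa_Latn', fr: 'fra_Latn', de: 'deu_Latn' };
const target = NLLB[navigator.language.split('-')[0]] ?? 'eng_Latn';

// Downloads and caches the model in the browser on first use.
const translate = await pipeline('translation', 'Xenova/nllb-200-distilled-600M');

const [output] = await translate('Download your emissions report', {
  src_lang: 'eng_Latn',
  tgt_lang: target,
});

console.log(output.translation_text); // the translated label
```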
Since these models can be heavy, I used web workers, which run scripts on a background thread separate from the main thread that controls the page's interface. This keeps the UI smooth and responsive while the translation happens.
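In practice that means moving the pipeline into a worker script and talking to it with messages. A hedged sketch, with file names, message shapes, and the element id all invented for illustration:

```js
// translation.worker.js — runs the model off the main thread.
import { pipeline } from '@xenova/transformers';

let translatorPromise = null;

self.onmessage = async ({ data }) => {
  const { text, srcLang, tgtLang } = data;
  // Create the pipeline once and reuse it for every message.
  translatorPromise ??= pipeline('translation', 'Xenova/nllb-200-distilled-600M');
  const translator = await translatorPromise;
  const [output] = await translator(text, { src_lang: srcLang, tgt_lang: tgtLang });
  self.postMessage(output.translation_text);
};
```

The main thread then just posts a string and updates the UI when the worker replies:

```js
// Main thread: hand the work to the worker, keep the page responsive meanwhile.
const worker = new Worker(new URL('./translation.worker.js', import.meta.url), { type: 'module' });

worker.onmessage = ({ data }) => {
  document.querySelector('#download-label').textContent = data;
};

worker.postMessage({
  text: 'Download your emissions report',
  srcLang: 'eng_Latn',
  tgtLang: 'spa_Latn',
});
```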
Sure, it wasn’t production-ready, but it was a great proof of concept, and a fun way to test whether “on-device AI” could be useful for certain cases in our product.
By the end of the week, we all came away with different learnings, but there were a few common threads:
AI Week wasn’t about shipping code or chasing trends; it was about learning together. It was a space and time to be curious, to ask “how does this even work?” and “can we use this at Lune?” And maybe just as importantly, it helped us build a shared language around AI that we’ll carry into future work.
We’ve still got more questions than answers, but that’s kind of the point.
Until next cooldown!
To keep up with the latest applications of AI in emissions intelligence, subscribe to Lune’s newsletter.