Question from our Community
Source: Facebook Post
"Serious question for the CEO: since this has AI elements, has the company reviewed the resource usage of this technology? I read about the freshwater and electricity draw, and have tried to consciously reject AI usage wherever I am aware of it. I did not think about how much AI this system relies on for its games, and now am wondering how much our household footprint is."
Answer from David Lee, CEO of Nex
When people talk about the environmental footprint of AI, there are generally two very different sources of energy usage:
(1) running an AI model, and
(2) training an AI model.
(1) Running the AI model (what happens in your home)
Nex Playground’s motion-tracking AI runs entirely locally on the device. Nothing is streamed to the cloud for gameplay, and no video data is sent to servers.
The system uses a 12 nm ARM-based SoC and a 15 V / 3 A power supply (a 45 W maximum). That rating is a ceiling, not what the device continuously draws. In practice, only a portion of the system's power goes to motion tracking, and rendering graphics on the GPU generally consumes more power than the AI inference itself.
Overall, real-world power usage is in the same general range as a Nintendo Switch during gameplay. For context, a cloud-based AI request, such as a text query to a large language model in a data center, typically consumes more energy than local inference on a consumer device like Nex Playground, because it involves server-side compute, cooling, and networking.
Because everything runs locally, using Nex Playground does not create ongoing server load, data-center computation, or associated cooling and freshwater usage during play.
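To make the household side concrete, here is a rough back-of-envelope sketch. The average wattage and play time below are illustrative assumptions, not measured Nex Playground figures; the point is only to show how the arithmetic works out.

```python
# Back-of-envelope estimate of the household energy footprint of local play.
# Every number here is an illustrative assumption, not a measured Nex figure.

assumed_avg_power_w = 15.0   # assumed average draw during play, well under the 45 W supply ceiling
play_hours_per_day = 1.5     # assumed household play time
days_per_year = 365

annual_kwh = assumed_avg_power_w * play_hours_per_day * days_per_year / 1000
print(f"Estimated annual energy from gameplay: {annual_kwh:.1f} kWh")
# -> roughly 8 kWh per year under these assumptions
```

Under assumptions like these, a year of play ends up in the single-digit kilowatt-hour range, comparable to other home entertainment devices.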
(2) Training the AI model (what happens on our side)
We do train and improve our motion-tracking models, and that does require server-side compute. Improved models are delivered to Nex Playground devices via system updates a few times a year. However, this is a computer-vision detection model, not a generative AI model like those behind ChatGPT, Gemini, or Grok.
The scale difference is very large. Our training datasets and model sizes are several orders of magnitude smaller than those used for large generative models. To give a rough sense of scale, the total cost of training our models is in the hundreds of thousands of dollars over time. Large generative AI models are widely reported to cost tens or hundreds of millions of dollars to train. That gap reflects an equally large difference in compute usage, electricity consumption, and associated environmental impact.
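As a rough illustration of what that gap means numerically, here is a sketch using the cost figures above as a loose proxy for compute; both numbers are illustrative round values within the ranges mentioned, not exact accounting.

```python
import math

# Illustrative round numbers within the ranges mentioned above; cost is used as a loose proxy for compute.
our_training_cost_usd = 5e5      # "hundreds of thousands of dollars" over time
large_genai_cost_usd = 1e8       # "tens or hundreds of millions of dollars"

ratio = large_genai_cost_usd / our_training_cost_usd
print(f"Roughly {ratio:.0f}x, about {math.log10(ratio):.1f} orders of magnitude")
# -> roughly 200x, about 2.3 orders of magnitude under these assumptions
```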
In short, Nex Playground’s AI is designed to be lightweight, local, and efficient. From a household footprint perspective, it behaves much more like a traditional game console than like cloud-based AI services. That distinction is important, and I appreciate you being thoughtful about it.
If you ever want us to go deeper on how we think about sustainability, privacy, or system design tradeoffs, I’m always happy to share more.
