na.util.us
Navigational Utility Underwater System, or Na.util.us, is a conceptual augmented reality interface displayed on a scuba diver’s mask. The system connects to the diver’s gear to provide essential data on equipment and surroundings, enhancing situational awareness and safety. It also recognizes key signals exchanged between divers, identifies marine life, and surfaces geo-location data to create a more intelligent and connected dive experience.
Designed during the era of Google Glass (before the rise of AI), Na.util.us explored how wearable computing and machine vision could redefine exploration beneath the surface. As a self-initiated project, I developed the concept, branding, interface design, and motion graphics to visualize how early augmented reality could evolve into a system that not only minimizes risk but transforms diving into an immersive, interactive experience.
Client
na.util.us
Project Date
2013
Category
3D · Augmented Reality · Motion Experience Design · Product Design

01
Concept Exploration
Na.util.us began as a speculative concept that imagined how divers could use augmented interfaces to enhance underwater safety, communication, and discovery. It explored how wearable computing and data visualization could extend human capability beneath the surface.

02
Interface Design
The visual system was built to deliver real-time environmental feedback, including depth, temperature, oxygen levels, and compass orientation. Its goal was to balance clarity and immersion, displaying vital information without overwhelming the diver’s view of the ocean environment.
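
A minimal TypeScript sketch of how such a heads-up display might model and prioritize these readings. The field names, units, and thresholds here are illustrative assumptions, not part of the original design spec.

// Hypothetical telemetry model for the HUD concept.
interface DiveTelemetry {
  depthMeters: number;        // current depth
  waterTempCelsius: number;   // ambient water temperature
  oxygenPercent: number;      // remaining oxygen level
  headingDegrees: number;     // compass orientation, 0–359
}

type HudPriority = "peripheral" | "highlighted" | "alert";

// Keep routine readings at the edge of view; escalate only when a
// value crosses an assumed safety threshold, so the ocean stays in focus.
function prioritize(t: DiveTelemetry): Record<keyof DiveTelemetry, HudPriority> {
  return {
    depthMeters: t.depthMeters > 30 ? "highlighted" : "peripheral",
    waterTempCelsius: "peripheral",
    oxygenPercent: t.oxygenPercent < 25 ? "alert" : "peripheral",
    headingDegrees: "peripheral",
  };
}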

03
Intelligent Recognition
Using early ideas around computer vision, the system conceptually identified marine life and key underwater objects. These overlays added contextual intelligence to the diving experience, helping divers recognize species and navigate safely through complex underwater terrain.
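
As a purely illustrative sketch of the recognition idea, the TypeScript below maps a hypothetical detection result to an on-mask overlay; the classifier output, species list, and confidence cutoff are all assumptions made for this example.

// Shape of a detection a vision model might return (assumed fields).
interface DetectionResult {
  label: string;       // e.g. "green sea turtle"
  confidence: number;  // 0–1 score
  bounds: { x: number; y: number; w: number; h: number };
}

interface Overlay {
  title: string;
  caution: boolean;    // flag hazardous species or obstacles
}

const HAZARDOUS = new Set(["lionfish", "fire coral", "box jellyfish"]);

// Annotate only confident detections so the view stays uncluttered.
function toOverlay(d: DetectionResult): Overlay | null {
  if (d.confidence < 0.6) return null;
  return { title: d.label, caution: HAZARDOUS.has(d.label.toLowerCase()) };
}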


04
Motion & Visualization
To bring the concept to life, I developed motion sequences simulating the diver’s point of view. These visualizations integrated real-time telemetry and AR data overlays, demonstrating how dynamic feedback could create a more interactive and intuitive underwater experience.
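
To suggest how telemetry could read as dynamic without jittering in the animated point-of-view sequences, here is a small TypeScript sketch of frame-by-frame smoothing; the easing factor and sample values are assumptions for illustration.

// Exponential moving average: blend each new sample into the displayed value.
function smoothReading(previous: number, incoming: number, alpha = 0.15): number {
  return previous + alpha * (incoming - previous);
}

// Example: easing a depth readout across frames of the simulated POV.
let displayedDepth = 12.0;
const samples = [12.4, 12.9, 13.1, 13.6];
for (const s of samples) {
  displayedDepth = smoothReading(displayedDepth, s);
}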


