
Play NeMO-Net on iOS & Mac, Android, and Windows!


What is NeMO-Net?

NeMO-Net is a single-player iPad game in which players help NASA classify coral reefs by painting 3D and 2D images of coral. Players can rate the classifications of other players and level up in the food chain as they explore and classify coral reefs and other shallow marine environments and creatures from locations all over the world!



New Enabling Technologies

Data from the NeMO-Net game is fed to NASA NeMO-Net, the first neural multi-modal observation and training network for global coral reef assessment. NeMO-Net is an open-source deep convolutional neural network (CNN) that leverages NASA’s Pleiades supercomputer to use game data to classify and assess the health of coral reefs around the world.

NeMO-Net exploits active learning and data fusion of mm-scale, remotely sensed 3D images of coral reefs captured using fluid lensing with the NASA FluidCam instrument, presently the highest-resolution remote sensing benthic imaging technology capable of removing ocean wave distortion. These data are used to train models on lower-resolution data from NASA’s Earth Observing System, including hyperspectral airborne remote sensing and satellite data, to determine coral reef ecosystem makeup globally at unprecedented spatial and temporal scales.
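The active-learning loop described above, in which uncertain classifications are routed back to human players for labeling, can be sketched as follows. This is a minimal illustration, not NeMO-Net's actual implementation: the class names, confidence threshold, and the `train`/`predict` stubs (which stand in for CNN training and inference) are all assumptions made for the example.

```python
# Hypothetical coral-morphology classes (illustrative, not NeMO-Net's label set).
CLASSES = ["branching", "mounding", "rock", "sand"]

def train(labeled):
    """Stand-in for CNN training: estimate class probabilities from labels."""
    counts = {c: 1 for c in CLASSES}  # Laplace smoothing so no class is zero
    for _, label in labeled:
        counts[label] += 1
    total = sum(counts.values())
    return {c: counts[c] / total for c in counts}

def predict(model, patch):
    """Stand-in for CNN inference: return (best_class, confidence)."""
    best = max(model, key=model.get)
    return best, model[best]

def active_learning_round(model, unlabeled, ask_player, threshold=0.5):
    """Keep confident model labels; send low-confidence patches to players."""
    newly_labeled = []
    for patch in unlabeled:
        cls, conf = predict(model, patch)
        if conf < threshold:
            # Model is unsure: a player paints/labels this patch instead.
            newly_labeled.append((patch, ask_player(patch)))
        else:
            newly_labeled.append((patch, cls))
    return newly_labeled
```

Each round, patches labeled by players are appended to the training set and the model is retrained, so human effort concentrates on exactly the imagery the network finds hardest.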

The Challenge

Changing Marine Environments

Aquatic ecosystems, particularly coral reefs, remain quantitatively misrepresented by low-resolution remote sensing as a result of refractive distortion from ocean waves, optical attenuation, and remoteness. Machine learning classification of coral reefs using NASA FluidCam mm-scale 3D data shows that present satellite and airborne remote sensing techniques poorly characterize coral reef percent living cover, morphology type, and species breakdown at the mm, cm, and meter scales. Indeed, current global assessments of coral reef cover and morphology classification based on km-scale satellite data alone can suffer from segmentation errors greater than 40%, and are capable of change detection only on yearly temporal scales and decameter spatial scales. This significantly hinders our understanding of patterns and processes in marine biodiversity at a time when these ecosystems are experiencing unprecedented anthropogenic pressures, ocean acidification, and sea surface temperature rise.


The Search for Biogenic Markers

NeMO-Net’s neural network is able to discern complex living organisms such as coral reefs from remotely sensed imagery with high accuracy. This idea may soon be extended to the search for life elsewhere in the universe. Life may have flourished, for example, in Mars’ ancient salty seas. Earth’s early history is dominated by stromatolites, which left fossils akin to modern coral reefs and which still exist on Earth today. This led to a fluid lensing field campaign to Shark Bay, Western Australia, home to the largest collection of extant marine stromatolites. Might it be possible to map these stromatolites in sufficient detail to enable an algorithm like NeMO-Net to learn what biogenic carbonate structures and patterns look like? If so, NeMO-Net may help identify targets of interest on Mars that could be fossilized forms of ancient Martian life, and distinguish them from formations abiotic in origin.


With More Forthcoming

Learn more about the science and technologies behind NeMO-Net in these new publications.

Next-Generation Optical Sensing Technologies for Exploring Ocean Worlds - NASA FluidCam, MiDAR, and NeMO-Net.

NeMO-Net – Gamifying 3D Labeling of Multi-Modal Reference Datasets to Support Automated Marine Habitat Mapping.

NASA NeMO-Net’s Convolutional Neural Network: Mapping Marine Habitats with Spectrally Heterogeneous Remote Sensing Imagery.

Cloud Detection Algorithm for Multi-Modal Satellite Imagery using Convolutional Neural-Networks (CNN).

Fluid Lensing and Machine Learning for Automated Centimeter-Resolution Airborne Assessment of Coral Reefs in American Samoa without Ocean Wave Distortion.

Drones that See through Waves – Preliminary Results from Airborne Fluid Lensing for Centimetre-Scale Aquatic Conservation.

Coral reef video game will help create global database.

Drone takes to the skies to image offshore reefs.

Learning Instrument Invariant Characteristics for Generating High-resolution Global Coral Reef Maps.

System and method for imaging underwater environments using fluid lensing.

System and method for active multispectral imaging and optical communications.

Chord Shore 🎹🌊 · NeMO-Net (Original Game Soundtrack)

The Team