Pan

Overview

A real-world Pokedex for plants: point your camera, Core ML identifies the species on-device, and you collect botanical illustrations in a gamified field journal.

Pan turns nature walks into collection quests. Point your camera at a plant, Core ML identifies the species on-device, and you collect stylized botanical illustrations in a personal field journal. Named after the Greek god of the wild, the app styles itself as a worn leather-bound naturalist's journal with parchment textures and hand-drawn illustrations. The ML pipeline runs locally via the Vision framework, with a fallback to Pl@ntNet's API for uncertain results. Gamification drives engagement: XP for discoveries, rarity multipliers, daily streaks, quests, and level progression.

Category

AI

Stack

Swift, SwiftUI, Core ML, Vision, SwiftData

Features

Core ML Identification

On-device inference via the Vision framework. An iNaturalist-trained model classifies ~500 taxa at 70%+ confidence without a network connection. Falls back to the Pl@ntNet API for uncertain results.
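The on-device step with its confidence gate might look like the following sketch. `PlantClassifier` stands in for the app's generated Core ML model class (a hypothetical name), and the 0.7 threshold mirrors the 70%+ confidence target before deferring to the cloud fallback.

```swift
import Vision
import CoreML

// Result of one identification attempt: either a confident on-device
// match, or a signal to fall back to the Pl@ntNet API.
enum IdentificationResult {
    case identified(species: String, confidence: Float)
    case needsCloudFallback(topGuess: String?)
}

func identify(image: CGImage, completion: @escaping (IdentificationResult) -> Void) {
    do {
        // `PlantClassifier` is an illustrative name for the bundled model class.
        let model = try VNCoreMLModel(for: PlantClassifier(configuration: MLModelConfiguration()).model)
        let request = VNCoreMLRequest(model: model) { request, _ in
            guard let top = (request.results as? [VNClassificationObservation])?.first else {
                completion(.needsCloudFallback(topGuess: nil))
                return
            }
            if top.confidence >= 0.7 {
                completion(.identified(species: top.identifier, confidence: top.confidence))
            } else {
                completion(.needsCloudFallback(topGuess: top.identifier))
            }
        }
        request.imageCropAndScaleOption = .centerCrop
        try VNImageRequestHandler(cgImage: image).perform([request])
    } catch {
        completion(.needsCloudFallback(topGuess: nil))
    }
}
```

Letting Vision own the crop-and-scale step (`imageCropAndScaleOption`) keeps the preprocessing consistent with whatever input size the model declares.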

Field Journal

Pokedex-style grid with discovered and undiscovered states. Each entry shows a stylized botanical illustration, sighting history, fun facts, and location data.
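A minimal sketch of how the journal's data could be modeled in SwiftData; the type and property names here are illustrative, not the app's actual schema.

```swift
import SwiftData
import Foundation

// One species page in the journal. `discovered` drives the
// Pokedex-style silhouette vs. illustrated state in the grid.
@Model
final class SpeciesEntry {
    @Attribute(.unique) var scientificName: String
    var commonName: String
    var rarity: String                 // e.g. "common" | "rare" | "legendary"
    var discovered: Bool
    var funFact: String
    @Relationship(deleteRule: .cascade) var sightings: [Sighting]

    init(scientificName: String, commonName: String, rarity: String,
         discovered: Bool = false, funFact: String = "", sightings: [Sighting] = []) {
        self.scientificName = scientificName
        self.commonName = commonName
        self.rarity = rarity
        self.discovered = discovered
        self.funFact = funFact
        self.sightings = sightings
    }
}

// A single sighting: when, where, and which photo file on disk.
@Model
final class Sighting {
    var date: Date
    var latitude: Double
    var longitude: Double
    var photoFilename: String          // HEIC file referenced by name

    init(date: Date, latitude: Double, longitude: Double, photoFilename: String) {
        self.date = date
        self.latitude = latitude
        self.longitude = longitude
        self.photoFilename = photoFilename
    }
}
```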

Gamification

XP rewards for discoveries (50 for new species, 2x for rare, 5x for legendary). Level progression from Seedling to Pan's Chosen. Daily streaks, quests, and badges.
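The XP rules above reduce to a small pure function; this sketch encodes the 2x and 5x multipliers as rarity raw values (the zero-XP repeat-sighting rule is an assumption).

```swift
// Rarity multipliers applied to the 50 XP base discovery reward.
enum Rarity: Int {
    case common = 1, rare = 2, legendary = 5
}

// XP for a sighting: 50 base XP for a new species, scaled by rarity.
// Assumes repeat sightings earn no discovery XP (not stated in the spec).
func xpForDiscovery(isNewSpecies: Bool, rarity: Rarity) -> Int {
    guard isNewSpecies else { return 0 }
    return 50 * rarity.rawValue
}
```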

Offline-First

SwiftData persists all sightings and species data locally. Core ML runs without network. Photos stored as HEIC (~150KB each). Full functionality without WiFi.
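The HEIC storage step could be done via ImageIO, as in this sketch; the compression quality value is a guess tuned toward the ~150KB target, and the filename scheme is illustrative.

```swift
import UIKit
import ImageIO
import UniformTypeIdentifiers

// Writes a sighting photo to the Documents directory as HEIC.
// Quality 0.7 is an assumed value, not the app's actual setting.
func saveHEIC(_ image: UIImage, filename: String) throws -> URL {
    let url = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent(filename)
        .appendingPathExtension("heic")
    guard let cgImage = image.cgImage,
          let dest = CGImageDestinationCreateWithURL(
              url as CFURL, UTType.heic.identifier as CFString, 1, nil)
    else { throw CocoaError(.fileWriteUnknown) }
    CGImageDestinationAddImage(
        dest, cgImage,
        [kCGImageDestinationLossyCompressionQuality: 0.7] as CFDictionary)
    guard CGImageDestinationFinalize(dest) else { throw CocoaError(.fileWriteUnknown) }
    return url
}
```

Storing photos as files and keeping only filenames in SwiftData keeps the database small and the journal fast to scroll.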

Camera Pipeline

AVFoundation captures frames, preprocesses to 299x299, feeds Core ML. Result sheet shows species illustration, common and scientific names, and XP animation.
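The preprocessing step from a captured frame to the model's 299×299 input could be sketched with Core Image; the scaling approach here is illustrative (a non-uniform scale, where a real pipeline might center-crop first to preserve aspect ratio).

```swift
import CoreImage
import CoreVideo

// Scales a captured camera frame (CVPixelBuffer) down to the
// classifier's 299x299 input and renders it to a CGImage.
func preprocess(_ pixelBuffer: CVPixelBuffer,
                context: CIContext = CIContext()) -> CGImage? {
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    let scaleX = 299 / image.extent.width
    let scaleY = 299 / image.extent.height
    let scaled = image.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))
    return context.createCGImage(
        scaled, from: CGRect(x: 0, y: 0, width: 299, height: 299))
}
```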

Location Tracking

CoreLocation tags each sighting with GPS coordinates. Species distribution maps show where you found each plant. Location boosts cloud ID accuracy.
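A one-shot geotagging helper for sightings might look like this sketch; the class name is illustrative, and a production version would also react to authorization changes.

```swift
import CoreLocation

// Requests a single location fix to tag the current sighting.
// Returns nil if permission is denied or the fix fails.
final class SightingTagger: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var onFix: ((CLLocationCoordinate2D?) -> Void)?

    func tagCurrentSighting(_ completion: @escaping (CLLocationCoordinate2D?) -> Void) {
        onFix = completion
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyNearestTenMeters
        manager.requestWhenInUseAuthorization()
        manager.requestLocation()   // one fix is enough to tag a sighting
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        onFix?(locations.last?.coordinate)
        onFix = nil
    }

    func locationManager(_ manager: CLLocationManager,
                         didFailWithError error: Error) {
        onFix?(nil)
        onFix = nil
    }
}
```

`requestLocation()` (rather than continuous updates) keeps battery cost low, which matters on long nature walks.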

Identification Pipeline

1. Camera frame
2. Preprocess to 299×299
3. Core ML inference
4. Confidence check
5. Journal entry