An AR kite-flying game for Snap Spectacles that turns the sky into your playground.
SpectaKite combines the words Spectacles and Kite, representing both the platform and the spirit of play behind the project. It is my solo AR experience developed for Snap Spectacles using Lens Studio 5.9.1. The project transforms kite flying into an immersive, gesture-controlled AR game where players steer a virtual kite, collect coins, and race against time.
Built entirely from the ground up, SpectaKite explores how movement, timing, and precision can create a sense of flow and engagement in spatial computing. By blending interactive physics, spatial audio, and real-time feedback, the experience demonstrates how wearable AR can inspire active play, encourage presence, and turn simple motion into something creative and meaningful.
PROBLEM
Most Spectacles experiences focus on passive viewing or surface-level interaction. I wanted to explore how embodied movement, specifically hand motion and timing, could become a core mechanic for AR gameplay. I was especially interested in creating an experience that encourages users to look upward toward the sky, something rarely explored within the Spectacles community.
GAMEPLAY FLOW
The experience begins with an on-screen "Got It" prompt, which players confirm with a pinch gesture to signal they are ready to start. After a 10-second countdown, players enter the gameplay phase, where they control the kite using real hand motions tracked by Spectacles.
Players can steer, collect coins, and maintain flight stability, earning points throughout the timed session. When the timer ends, the game transitions to a results screen with options to restart or replay.
This interaction flow was designed to balance responsiveness, motion clarity, and accessibility, creating a light physical challenge that feels satisfying and intuitive.
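To make the flow concrete, here is a minimal, engine-agnostic TypeScript sketch of the state transitions described above. The names (GamePhase, KiteGameFlow) and the 60-second session length are illustrative assumptions rather than the project's actual code; in the Lens itself, this logic is handled by KiteGameManager, described in the next section.

```typescript
// Sketch of the session flow: Intro -> Countdown -> Playing -> Results.
// Names and durations are illustrative, not the project's actual identifiers.

enum GamePhase { Intro, Countdown, Playing, Results }

class KiteGameFlow {
  phase: GamePhase = GamePhase.Intro;
  private timeLeft = 0;

  constructor(
    private countdownSeconds = 10,  // countdown shown after the "Got It" pinch
    private sessionSeconds = 60     // length of the timed session (assumed value)
  ) {}

  // Called when the "Got It" pinch gesture is confirmed.
  onGotItPinch(): void {
    if (this.phase !== GamePhase.Intro) return;
    this.phase = GamePhase.Countdown;
    this.timeLeft = this.countdownSeconds;
  }

  // Called every frame with the elapsed time in seconds.
  update(dt: number): void {
    if (this.phase !== GamePhase.Countdown && this.phase !== GamePhase.Playing) return;
    this.timeLeft -= dt;
    if (this.timeLeft > 0) return;
    if (this.phase === GamePhase.Countdown) {
      this.phase = GamePhase.Playing;      // countdown done: start the timed session
      this.timeLeft = this.sessionSeconds;
    } else {
      this.phase = GamePhase.Results;      // timer expired: show the results screen
    }
  }

  // Called from the results screen to start a fresh round.
  restart(): void {
    this.phase = GamePhase.Intro;
    this.timeLeft = 0;
  }
}
```

Driving everything from a single per-frame update like this keeps the countdown UI, the session timer, and the results transition in one place.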
TECHNICAL IMPLEMENTATION
All gameplay logic and state management were handled in Lens Studio 5.9.1 using custom JavaScript and TypeScript scripts:
• KiteGameManager.js – Manages scoring, timers, music transitions, and game resets.
• KiteStringRope.ts – Renders a real-time 3D rope mesh that dynamically connects the kite and hand positions.
• RandomPointsInsideBox.js – Spawns exactly 15 coins in randomized positions each round, keeping the spawn count fixed for consistent performance (a simplified spawning sketch follows this list).
• YesButtonPinch.ts – Handles gesture-based UI confirmation for the “Got It” button and game restarts.
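The coin placement is the easiest of these to illustrate in isolation. The sketch below shows the general idea in plain TypeScript: a fixed number of random points inside an axis-aligned box. The Vec3 interface, function names, and spawn-volume values are placeholders, not the Lens Studio API or the project's actual numbers.

```typescript
// Place a fixed number of coins at random positions inside a box each round.

interface Vec3 { x: number; y: number; z: number; }

function randomBetween(min: number, max: number): number {
  return min + Math.random() * (max - min);
}

// Returns `count` random points inside a box centered at `center` with size `size`.
function randomPointsInsideBox(center: Vec3, size: Vec3, count: number): Vec3[] {
  const points: Vec3[] = [];
  for (let i = 0; i < count; i++) {
    points.push({
      x: center.x + randomBetween(-size.x / 2, size.x / 2),
      y: center.y + randomBetween(-size.y / 2, size.y / 2),
      z: center.z + randomBetween(-size.z / 2, size.z / 2),
    });
  }
  return points;
}

// Each round spawns exactly 15 coins, so the per-round scene load stays predictable.
const coinPositions = randomPointsInsideBox(
  { x: 0, y: 3, z: -5 },  // spawn volume above and in front of the player (assumed values)
  { x: 6, y: 4, z: 6 },
  15
);
```

Keeping the count fixed at 15 means the number of objects, colliders, and draw calls per round never varies, which is what keeps performance consistent on device.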
Additional Features:
• Integrated a two-track audio system for seamless transitions between intro and gameplay music (a simplified crossfade sketch follows this list).
• Utilized Spectacles hand tracking and motion sensors to detect pinch gestures and natural hand movements.
• Designed with future compatibility for the Spectacles Motion Controller API, which enables mobile devices to function as 6DoF motion controllers for enhanced spatial precision and haptic feedback.
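The two-track transition mentioned above can be summarized as a simple crossfade. The sketch below is a hypothetical, framework-agnostic version: the AudioTrack interface, MusicManager name, and fade length stand in for the actual Lens Studio audio components and values used in the project.

```typescript
// Crossfade the intro track out while the gameplay track fades in.

interface AudioTrack {
  volume: number;  // 0..1
  play(): void;
  stop(): void;
}

class MusicManager {
  private t = 0;
  private fading = false;

  constructor(
    private intro: AudioTrack,
    private gameplay: AudioTrack,
    private fadeSeconds = 1.5  // crossfade length (assumed value)
  ) {}

  startIntro(): void {
    this.intro.volume = 1;
    this.intro.play();
  }

  // Call when gameplay begins to start the crossfade.
  startGameplay(): void {
    this.gameplay.volume = 0;
    this.gameplay.play();
    this.t = 0;
    this.fading = true;
  }

  // Call every frame with elapsed time in seconds.
  update(dt: number): void {
    if (!this.fading) return;
    this.t += dt;
    const k = Math.min(this.t / this.fadeSeconds, 1);
    this.intro.volume = 1 - k;
    this.gameplay.volume = k;
    if (k >= 1) {
      this.intro.stop();   // intro is fully faded out; free it up
      this.fading = false;
    }
  }
}
```

A short linear crossfade like this avoids a hard cut when the countdown ends and the gameplay music takes over.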
IMPROVEMENT
As my first solo Spectacles project, SpectaKite taught me a lot about balancing creativity, technical problem-solving, and optimization in Lens Studio. Through development and testing, I identified several areas for improvement in future iterations:
1. Consistent Scripting Framework:
I mixed JavaScript and TypeScript within the same project, which made debugging and variable management more complicated. In future builds, I plan to stick with TypeScript for better structure, type safety, and code readability.
2. Shader and Asset Optimization:
I initially attempted to use a pre-made rope asset from the Lens Studio Asset Library, but it failed to render correctly during Spectacles recording. The issue came down to shader compatibility with the device. This experience taught me the importance of testing assets on Spectacles hardware early and understanding shader limitations when building for wearable AR.
Despite these challenges, the project strengthened my technical workflow and gave me a clearer direction for creating more optimized, visually consistent Spectacles experiences in the future.