A precision spatial-awareness platform that connects visually impaired users with their caregivers in real time — powered by Edge AI, IoT sensors, and a design philosophy rooted in human dignity.
Four ultrasonic sensors and Time-of-Flight modules map the user's surroundings in every direction, delivering obstacle alerts with millimetre-level precision.
An onboard ESP32-CAM runs a real-time AI pipeline with no cloud dependency — instant, localised object recognition that works even without an internet connection.
Directional vibration pulses encode both the proximity and bearing of obstacles — giving the user clear spatial information through touch alone, without breaking stride.
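The directional encoding described above can be sketched in a few lines. This is a minimal illustration, not the production firmware: the pulse-count scheme, thresholds, and the `haptic_pattern` helper are all assumptions made for the example, assuming four directional sensors and a single vibration motor driven in timed bursts.

```python
# Sketch: encode obstacle proximity and bearing as haptic pulses.
# Hypothetical scheme: the bearing selects a pulse count, and closer
# obstacles produce stronger, more rapid bursts.

BEARINGS = {"front": 1, "left": 2, "right": 3, "back": 4}  # pulses per burst

def haptic_pattern(distance_cm: float, bearing: str,
                   max_range_cm: float = 200.0) -> dict:
    """Map one sensor reading to a vibration-burst description."""
    if bearing not in BEARINGS:
        raise ValueError(f"unknown bearing: {bearing}")
    # Clamp into [0, max_range]; nearer obstacles -> higher intensity.
    d = max(0.0, min(distance_cm, max_range_cm))
    intensity = 1.0 - d / max_range_cm               # 0.0 (far) .. 1.0 (touching)
    interval_ms = 100 + int(900 * d / max_range_cm)  # pulses speed up when close
    return {
        "pulses": BEARINGS[bearing],
        "intensity": round(intensity, 2),
        "interval_ms": interval_ms,
    }

print(haptic_pattern(50, "front"))
# -> {'pulses': 1, 'intensity': 0.75, 'interval_ms': 325}
```

In a real firmware loop, the returned intensity would set a PWM duty cycle on the motor driver and the interval would gate the burst timer.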
Enter your credentials to access the dashboard.
Step 1 of 2 — Your Details
Register your device to start monitoring.
Step 2 of 2 — Verify Your Identity
Enter the 6-digit OTP sent to your mobile number and email address.
DEMO MODE — OTP shown for testing
Didn't receive the code? Edit details
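The verification step above follows a standard 6-digit OTP pattern. The sketch below shows one minimal, hedged way such a flow could work server-side; the `OtpStore` class, its field names, and the 5-minute validity window are illustrative assumptions, not the platform's actual implementation.

```python
import hmac
import secrets
import time

def generate_otp() -> str:
    """Return a random 6-digit one-time password, zero-padded."""
    return f"{secrets.randbelow(1_000_000):06d}"

class OtpStore:
    """Minimal in-memory OTP store with a validity window (sketch only)."""

    def __init__(self, ttl_seconds: int = 300):
        self.ttl = ttl_seconds
        self._entries = {}  # user_id -> (otp, issued_at)

    def issue(self, user_id: str) -> str:
        otp = generate_otp()
        self._entries[user_id] = (otp, time.time())
        return otp

    def verify(self, user_id: str, submitted: str) -> bool:
        entry = self._entries.pop(user_id, None)   # single use: pop on attempt
        if entry is None:
            return False
        otp, issued_at = entry
        if time.time() - issued_at > self.ttl:
            return False
        return hmac.compare_digest(otp, submitted)  # constant-time compare

store = OtpStore()
code = store.issue("caregiver@example.com")
print(len(code), store.verify("caregiver@example.com", code))
# -> 6 True
```

A production system would persist entries, rate-limit attempts, and deliver the code over SMS and email gateways rather than returning it, as the demo-mode note above implies.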
Real-time physical mirror. Highlights indicate obstacle proximity based on active sensor data.
A wearable AI-powered cap that restores spatial awareness for visually impaired individuals — enabling independence, confident movement, and safety-focused caregiving, every single day.
Maarg Darshan Care is a next-generation assistive technology platform built to restore independence for individuals living with visual impairments. At its core is a wearable smart cap embedded with ultrasonic transducers, Time-of-Flight modules, and an ESP32-CAM — all working in harmony to build a continuous, real-time 360° picture of the space around the user.
The platform connects the wearer directly to their caregivers through a live web dashboard displaying sensor telemetry, a camera feed, GPS coordinates, and system alerts — giving families and support workers the visibility and reassurance they need, from wherever they are.
Four ultrasonic sensors sweep all directions at once, building a continuous spatial picture with colour-coded severity levels — updated in real time.
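The colour-coded severity levels mentioned above amount to banding each proximity reading. A minimal sketch follows; the band names and the specific centimetre thresholds are illustrative assumptions, not the product's calibrated values.

```python
def severity(distance_cm: float) -> str:
    """Map a proximity reading to a colour-coded severity band.
    Thresholds are illustrative, not calibrated product values."""
    if distance_cm < 30:
        return "red"      # imminent obstacle
    if distance_cm < 80:
        return "orange"   # caution
    if distance_cm < 150:
        return "yellow"   # awareness
    return "green"        # path clear

# One snapshot from four directional sensors:
readings = {"front": 25, "left": 90, "right": 160, "back": 70}
print({direction: severity(cm) for direction, cm in readings.items()})
# -> {'front': 'red', 'left': 'yellow', 'right': 'green', 'back': 'orange'}
```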
The onboard ESP32-CAM streams video directly to the caregiver dashboard with zero cloud dependency.
A precision vibration motor translates obstacle data into directional pulses so users understand their surroundings through touch alone.
Real-time coordinates and a visual path trace appear on the caregiver dashboard the moment the user starts moving.
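Drawing a path trace from GPS fixes reduces to summing great-circle distances between consecutive coordinates. The haversine formula below is the standard way to do that; the helper names are this example's own, and the dashboard's actual mapping library would normally handle this internally.

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two GPS fixes."""
    R = 6_371_000  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def path_length_m(trace: list) -> float:
    """Total length of a path trace given as a list of (lat, lon) fixes."""
    return sum(haversine_m(*a, *b) for a, b in zip(trace, trace[1:]))
```

One degree of latitude is roughly 111 km, so `haversine_m(0, 0, 1, 0)` returns about 111,195 metres; summing segment lengths like this also gives the caregiver a distance-walked figure for free.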
Spoken alerts are delivered through the caregiver's device using the Web Speech API — clear, natural Indian English.
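Before the Web Speech API can speak an alert, the dashboard has to compose the utterance text from the sensor data. The phrasing below is an illustrative guess at what such a string might look like; in the browser, the result would be handed to a `SpeechSynthesisUtterance` (a real Web Speech API interface) with an Indian English voice selected.

```python
def alert_phrase(bearing: str, distance_cm: float) -> str:
    """Compose the text a caregiver's browser would speak.
    The wording is illustrative, not the product's actual script."""
    metres = distance_cm / 100
    return f"Obstacle {bearing}, about {metres:.1f} metres away."

print(alert_phrase("ahead", 120))
# -> Obstacle ahead, about 1.2 metres away.
```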
A single button press sends an immediate alert to registered contacts and emergency services with the user's live GPS coordinates.
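An SOS dispatch like the one described is essentially a small structured message carrying identity, time, and location. The sketch below shows one plausible payload shape; every field name, the sample coordinates, and the contact placeholders are assumptions for illustration, not the platform's real schema.

```python
import json
import time

def build_sos_payload(user_id: str, lat: float, lon: float,
                      battery_pct: int, contacts: list) -> dict:
    """Assemble an SOS alert for dispatch to registered contacts.
    Field names are illustrative; the real schema may differ."""
    return {
        "type": "SOS",
        "user_id": user_id,
        "timestamp": int(time.time()),                    # Unix epoch seconds
        "location": {"lat": lat, "lon": lon},
        "maps_link": f"https://maps.google.com/?q={lat},{lon}",
        "battery_pct": battery_pct,
        "notify": contacts,
    }

payload = build_sos_payload("cap-042", 22.9574, 88.4342, 67,
                            ["caregiver@example.com"])
print(json.dumps(payload, indent=2))
```

The maps link lets a contact open the wearer's live position in one tap; the same payload could be fanned out over SMS, email, and the dashboard's alert feed.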
We believe that visual impairment should never be a barrier to independence, safety, or living a confident and fulfilling life.
Every step taken outside should be a step taken with confidence. Our primary mission is to replace fear with freedom. By providing a reliable, 360-degree awareness of their surroundings, we empower visually impaired individuals to step out, explore, and navigate their daily lives without constantly relying on physical guidance from others.
We harness the power of modern innovation to solve real-world challenges. By integrating ultrasonic sensors, advanced Time-of-Flight lasers, and Edge AI processing through the ESP32-CAM, we transform a simple wearable cap into a powerful "digital eye."
True safety involves a support system. Our live dashboard provides caretakers with real-time GPS tracking, a live camera feed, and instant emergency SOS alerts. We bridge the distance between users and their loved ones, providing unparalleled peace of mind.
Advanced assistive technology is often treated as a luxury, priced far beyond the reach of those who need it most. We are on a mission to democratize mobility by providing enterprise-grade safety at a fraction of the traditional cost.
Maarg Darshan Care is constantly evolving. Through ongoing research and integration of smart technology, we aim to build the most advanced mobility aids for the visually impaired.
Developing intelligent pathfinding algorithms that adapt to dynamic environments in real-time.
Enhancing machine vision to accurately classify complex obstacles using edge computing.
Designing ergonomic, discreet wearables that seamlessly integrate into the user's daily wardrobe.
Creating low-latency telemetry pipelines to keep caregivers connected without interruption.
Identifying core mobility challenges and brainstorming smart, accessible solutions.
Drafting hardware schematics and designing intuitive user interfaces for maximum accessibility.
Building functional hardware models to test sensor fusion and edge processing capabilities.
Conducting extensive real-world trials to ensure safety, reliability, and precision.
Rolling out the finalized product and actively gathering user feedback for continuous improvement.
Manages low-level sensor timing and rapid component integration seamlessly.
Delivers powerful edge AI processing and live video streaming over Wi-Fi.
Provides reliable 360° proximity data by measuring sound wave reflections.
Time-of-Flight lasers offer millimetre-precise distance measurements.
Tracks live coordinates to ensure users and caregivers always stay connected.
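The ultrasonic ranging described above reduces to one formula: the echo pulse travels to the obstacle and back, so distance is echo time times the speed of sound, halved. The sketch below assumes an HC-SR04-style sensor reporting echo time in microseconds; the helper name is this example's own.

```python
SPEED_OF_SOUND_M_S = 343.0   # dry air at roughly 20 degrees C

def ultrasonic_distance_cm(echo_time_us: float) -> float:
    """Distance from an HC-SR04-style echo time in microseconds.
    The pulse travels out and back, so halve the round trip."""
    distance_m = (echo_time_us * 1e-6) * SPEED_OF_SOUND_M_S / 2
    return distance_m * 100  # metres -> centimetres

print(round(ultrasonic_distance_cm(5830), 1))
# -> 100.0  (a ~5.83 ms round trip corresponds to about one metre)
```

Time-of-Flight laser modules apply the same round-trip principle with light instead of sound, which is why their readings resolve to millimetres.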
Refining sensor accuracy, optimizing battery life, and expanding our real-world testing pool.
Integrating advanced machine learning models to precisely classify objects such as stairs or vehicles.
Building a global ecosystem of smart wearables that makes universal accessibility an affordable standard.
A dedicated group of engineers, designers, and innovators united by a single vision: making independent mobility safe, affordable, and accessible for everyone.

Hardware Designer / Software Developer
JIS College of Engineering
Leads hardware design, embedded systems logic, testing, AI development, and overall system architecture.

Backend Web Developer
JIS College of Engineering
Works on software integration, backend development, and cloud APIs.

AI & App Developer
JIS College of Engineering
Handles machine learning models and optimizes ultrasonic proximity algorithms.

UI/UX & Management
JIS College of Engineering
Conducts user research, develops the UI/UX dashboard, and designs haptic feedback patterns.
Constantly exploring new edge-computing and AI methodologies to improve device latency and accuracy.
Designing with deep empathy for the end-user, ensuring technology solves real, human problems seamlessly.
Leveraging modern stacks from embedded C to React to deliver enterprise-grade stability on accessible hardware.