CookAR

A head-mounted AR system with real-time object affordance augmentations to support safe and efficient interactions with kitchen tools

AR/VR

Accessibility

UIST 2024 BEST PAPER AWARD 🏆

CookAR explores how real-time affordance detection and AR can make kitchen tools safer for people with low vision.

The project combines computer vision and AR to highlight safe-to-touch areas in green and dangerous areas in red, providing intuitive visual guidance during cooking tasks.
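To make this concrete, here is a minimal sketch of the overlay step, assuming an instance segmentation model (for example, an RTMDet-style detector) has already produced per-part masks with affordance labels. The category names, colors, and blending factor below are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
import cv2

# Hypothetical affordance categories -> overlay colors (BGR).
# "graspable" parts (e.g., a knife handle) are tinted green;
# "hazardous" parts (e.g., a blade or hot surface) are tinted red.
AFFORDANCE_COLORS = {
    "graspable": (0, 255, 0),
    "hazardous": (0, 0, 255),
}

def overlay_affordances(frame, masks, labels, alpha=0.45):
    """Blend per-part affordance masks onto a camera frame.

    frame:  HxWx3 uint8 BGR image from the head-mounted camera.
    masks:  list of HxW boolean arrays, one per detected tool part.
    labels: list of affordance category names aligned with `masks`.
    """
    out = frame.copy()
    for mask, label in zip(masks, labels):
        color = AFFORDANCE_COLORS.get(label)
        if color is None:
            continue  # skip categories we don't visualize
        tint = np.zeros_like(frame)
        tint[mask] = color
        # Alpha-blend the tint only where this part's mask is active.
        blended = cv2.addWeighted(out, 1 - alpha, tint, alpha, 0)
        out = np.where(mask[..., None], blended, out)
    return out
```

In a live system, something like this would run on every camera frame before the composited view is shown in the headset, so the colored regions track the tool parts in real time.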

You can read the full paper here.

Overview

Role

UX Designer and Researcher

Timeline

2023 - 2024

Tools

PyTorch, RTMDet, ZED Mini AR, Oculus Quest 2, Roboflow

Skills

Literature Review, Qualitative Interviews, Rapid Ideation, Rapid Prototyping, Evaluative Research

Contribution

Concept Development

Data Annotation

User Study Facilitation

The Problem

Navigating Kitchen Safety: Challenges for People with Low Vision

Cooking can be daunting for individuals with low vision due to difficulties in identifying safe-to-touch areas and navigating potentially hazardous tools like sharp knives or hot pans. These challenges often compromise safety and limit independence in the kitchen.

The Solution

Impact

Transforming Kitchen Accessibility Through AR

CookAR empowers people with low vision to navigate kitchen tasks safely and confidently through real-time, affordance-based visual guidance. Its innovative approach demonstrates potential far beyond the kitchen.

Participants preferred affordance-based visualizations.

Low system latency was achieved while maintaining accurate affordance detection and real-time assistance.

New potential applications were identified by users, demonstrating the broader impact potential beyond kitchen safety.

Key Takeaways & Growth
