
Optimizing the Visual Experiences of the Visually Impaired on Social Media

Recent advances in machine learning and computer vision have opened up new applications in accessibility design. To put this technology to use, we worked with the Laboratory for Image and Video Engineering (LIVE) at the University of Texas at Austin to build a React Native-based mobile application that helps visually impaired users take better photos for social media, which has become a central part of daily life for much of the world. The camera application employs a no-reference (NR) visual quality perception model that estimates the presence of various distortions in an image. On top of this model, we designed an accessible feedback loop that guides users in adjusting how they take a photo in a constructive and intuitive way. By integrating the needs of visually impaired users directly into the camera application, we aim to make digital media, an ever-growing part of our lives, more accessible to everyone.
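As a rough sketch of how such a feedback loop might be wired up in React Native: the TypeScript below is illustrative only. The model interface `scoreFrame`, the distortion categories, and the thresholds are assumptions rather than the project's actual API; `AccessibilityInfo.announceForAccessibility` is a real React Native call.

```tsx
import { AccessibilityInfo } from 'react-native';

// Assumed output shape of the NR quality model:
// per-distortion scores in [0, 1]. The real model's
// categories are not specified in the project summary.
type DistortionScores = {
  blur: number;
  underExposure: number;
  overExposure: number;
};

// Placeholder for on-device NR model inference; the actual
// model and its interface are assumptions for illustration.
declare function scoreFrame(frame: Uint8Array): Promise<DistortionScores>;

// Convert model output into a short spoken hint, read aloud
// through the platform screen reader (VoiceOver / TalkBack).
async function announceFeedback(frame: Uint8Array): Promise<void> {
  const scores = await scoreFrame(frame);
  const hints: string[] = [];
  if (scores.blur > 0.5) hints.push('Hold the phone steady');
  if (scores.underExposure > 0.5) hints.push('Find more light');
  if (scores.overExposure > 0.5) hints.push('Move away from bright light');
  AccessibilityInfo.announceForAccessibility(
    hints.length > 0 ? hints.join('. ') : 'Photo looks good',
  );
}
```

Routing the hints through the screen reader, rather than a custom audio layer, would let the feedback work with the assistive tools visually impaired users already rely on.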

Team Members

Anqing Chen, Nicholas Chu, Jianchen Gu, Nikhil Krish, Pierce Phillips, Ammar Sheikh, Ricky Tiet, Carlos Villapudua

Semester