Alex Olwal, Ph.D.
Engineering Leader / Product Innovation, R&D and User Experience

Human-AI Interaction · Augmented Reality · Wearables · Human-Computer Interaction · Accessibility · Interaction Technology · Ubiquitous Computing

Founded and led teams at Google, including the Interaction Lab, in Tech & Society (2024-2025), Augmented Reality (2020-2024), Research (2018-2020), ATAP (2017-2018), Wearables (2015-2017), and [x] (2014-2015). Academic research conducted at MIT, Columbia University, University of California, KTH (Royal Institute of Technology), and Microsoft Research, with much-appreciated collaborators and contributors at these and other institutions. Thank you also to Stanford University, Rhode Island School of Design, and KTH for the opportunity to teach.


Wearable Subtitles
Wearable Subtitles facilitates communication for deaf and hard-of-hearing individuals by displaying real-time speech-to-text in the user's line of sight on a proof-of-concept eyewear display. The hybrid, low-power wireless architecture is designed for all-day use, with up to 15 hours of continuous operation.

Google AI Blog: Robust speech recognition in AR through infinite virtual rooms with acoustic modeling ->
Wearable Subtitles: Augmenting Spoken Communication with Lightweight Eyewear for All-day Captioning
Olwal, A., Balke, K., Votintcev, D., Starner, T., Conn, P., Chinh, B., and Corda, B.
Proceedings of UIST 2020 (ACM Symposium on User Interface Software and Technology), Virtual Event, Oct 20-23, 2020, pp. 1108-1120. Best Demo Honorable Mention Award.

UIST 2020 - Best Demo Honorable Mention Award
PDF [16MB]
Quantifying The Effect of Simulator-Based Data Augmentation for Speech Recognition on Augmented Reality Glasses
Arakawa, R., Parvaix, M., Lai, C., Erdogan, H., and Olwal, A.
Proceedings of ICASSP 2024 (IEEE International Conference on Acoustics, Speech and Signal Processing), Seoul, South Korea, Apr 14-19, 2024, pp. 726-730.

ICASSP 2024
PDF [2.4MB]