Abstract
Emerging platforms such as Augmented Reality (AR), Virtual Reality (VR), and autonomous machines all interact intimately with humans. They must be built from the ground up with principled consideration of human perception. This talk will discuss some of our recent work exploiting the symbiosis between Computer Science and Vision Science.
We will discuss how to jointly optimize imaging, computing, and human perception to obtain unprecedented efficiency. The overarching theme is that a computing problem that seems challenging may become significantly easier once one considers how computing interacts with imaging and human perception in an end-to-end system. In particular, we will discuss two projects: real-time eye tracking for AR/VR and power optimization for VR displays. If time permits, I will also briefly discuss our ongoing work on using computational techniques to help dichromats regain trichromatic vision.
https://www.cs.rochester.edu/people/faculty/zhu_yuhao/index.html