Authors
Listed:
- Jae-Hyun Jung
- Doron Aloni
- Yitzhak Yitzhaky
- Eli Peli
Abstract
There are encouraging advances in prosthetic vision for the blind, including retinal and cortical implants and other "sensory substitution devices" that use tactile or electrical stimulation. However, they all have low resolution, a limited visual field, and can display only a few gray levels (limited dynamic range), severely restricting their utility. To overcome these limitations, image processing or the imaging system could emphasize objects of interest and suppress background clutter. We propose an active confocal imaging system based on light-field technology that will enable a blind user of any visual prosthesis to efficiently scan, focus on, and "see" only an object of interest while suppressing interference from background clutter. The system captures three-dimensional scene information using a light-field sensor and displays only the in-focus plane and the objects in it. After capturing a confocal image, a de-cluttering process removes the remaining clutter based on blur differences. In preliminary experiments, we verified the positive impact of confocal-based background clutter removal on object recognition in low-resolution, limited-dynamic-range simulated phosphene images. Using a custom-made multiple-camera system, we confirmed that the concept of a confocal de-cluttered image can be realized effectively using light-field imaging.
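For illustration only, the blur-difference de-cluttering idea in the abstract could be sketched as a sharpness-based mask on a refocused image: in-focus objects have high local gradient energy (here approximated with a 4-neighbor Laplacian), while defocused background is smooth and gets suppressed. This is a minimal NumPy sketch under assumed inputs (a grayscale refocused image as a 2-D array) with arbitrary parameters (`r`, `rel_thresh`), not the authors' actual method:

```python
import numpy as np

def box_mean(a, r):
    """Mean over a (2r+1)x(2r+1) window, computed with an integral image."""
    w = 2 * r + 1
    p = np.pad(np.asarray(a, dtype=float), r, mode="edge")
    s = np.zeros((p.shape[0] + 1, p.shape[1] + 1))
    s[1:, 1:] = p.cumsum(axis=0).cumsum(axis=1)
    # Window sums via the standard four-corner integral-image formula.
    return (s[w:, w:] - s[:-w, w:] - s[w:, :-w] + s[:-w, :-w]) / (w * w)

def declutter(img, r=3, rel_thresh=0.25):
    """Keep pixels whose local sharpness exceeds a relative threshold.

    img        : 2-D grayscale refocused image (assumed input).
    r          : radius of the window used to pool the sharpness measure.
    rel_thresh : fraction of the peak sharpness used as the cutoff (arbitrary).
    Returns (decluttered image, boolean keep-mask).
    """
    img = np.asarray(img, dtype=float)
    lap = np.zeros_like(img)
    # 4-neighbor Laplacian as a crude focus measure; borders left at zero.
    lap[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                       img[1:-1, :-2] + img[1:-1, 2:] -
                       4.0 * img[1:-1, 1:-1])
    sharp = box_mean(np.abs(lap), r)          # pooled local sharpness
    mask = sharp > rel_thresh * sharp.max()   # in-focus regions
    return np.where(mask, img, 0.0), mask

# Toy demo: a smooth gradient background (zero Laplacian, i.e. "defocused")
# with a sharp checkerboard patch standing in for the in-focus object.
scene = np.tile(np.linspace(0.0, 1.0, 40), (40, 1))
scene[10:20, 10:20] = (np.indices((10, 10)).sum(axis=0) % 2).astype(float)
decluttered, keep = declutter(scene, r=2)
```

In the demo, the checkerboard patch survives while the smooth background is zeroed out. A real system would refocus the light-field data to the object plane first; the threshold and window size here are placeholders that would need tuning.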
Suggested Citation
Jae-Hyun Jung & Doron Aloni & Yitzhak Yitzhaky & Eli Peli, "undated". "Active Confocal Imaging for Visual Prostheses," Working Paper 169356, Harvard University OpenScholar.
Handle: RePEc:qsh:wpaper:169356
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:qsh:wpaper:169356. See general information about how to correct material in RePEc.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Richard Brandon (email available below). General contact details of provider: https://edirc.repec.org/data/cbrssus.html .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.