Apparatus and methods for determining the three-dimensional shape of an object using active illumination and relative blurring in two images due to defocus
- Summary
- Lead Inventors: Shree Nayar, Minori Noguchi, Masahiro Watanabe
- Problem or Unmet Need: Assessing the depth of an object from a 2D image is a central problem in computer vision. Active illumination combined with focus analysis has several advantages over other techniques, as it allows multiple images of an object to be taken from the same viewpoint. However, determining depth from defocus is complicated by the need to separate the data into multiple narrow frequency bands and by its inability to resolve the depth of textureless objects. In addition, the relationship between magnification and defocus has yet to be adequately addressed.
- Details of the Invention: This invention describes an apparatus for mapping a three-dimensional object from two-dimensional images with increased accuracy, using the depth-from-defocus method. The scene is illuminated with a preselected illumination pattern, and at least two images of the scene are sensed with different imaging parameters. The relative blur between corresponding elemental portions of the sensed images is measured, thereby determining the relative depth of the corresponding elemental portions of the object. The technology resolves the problem of textureless objects by using active illumination to impose texture, and it uses constant-magnification defocusing.
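The core measurement above, comparing the relative blur of corresponding patches in two differently focused images, can be sketched in a few lines of numpy. This is an illustrative reconstruction under simplifying assumptions (box-filter blur as a stand-in for the defocus point-spread function, Laplacian energy as the focus measure), not the patented apparatus or its filter design:

```python
import numpy as np

def box_filter(img, k):
    """Separable box filter; used here both as a toy defocus blur
    and for local averaging of the focus measure."""
    kern = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kern, mode="same"), 0, out)

def hf_energy(img, win=9):
    """High-frequency energy per pixel: squared discrete Laplacian,
    averaged over a local window."""
    lap = (-4.0 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return box_filter(lap ** 2, win)

def relative_blur(img_a, img_b, win=9, eps=1e-9):
    """Normalized relative-blur measure in [-1, 1]:
    positive where img_a is sharper (closer to focus) than img_b."""
    e_a, e_b = hf_energy(img_a, win), hf_energy(img_b, win)
    return (e_a - e_b) / (e_a + e_b + eps)
```

A depth-from-defocus sensor would map this normalized measure through a calibration curve to metric depth; with active illumination, the projected pattern guarantees the local high-frequency energy is nonzero even on textureless surfaces.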
- Technology Benefits
- The approach produces precise, high-resolution depth maps at frame rate, using inexpensive off-the-shelf imaging and processing hardware.
- Technology Application
- Security systems
- Military and police surveillance
- Navigation systems
- Detailed Technology Description
- *Inquiry
- Calvin Chu Columbia Technology Ventures Tel: (212) 854-8444 Email: TechTransfer@columbia.edu
- *IR
- MS95/01/03
- *Publications
- Videos of the invention are available on the Inventor's website.
- Rational Filters for Passive Depth from Defocus. M. Watanabe and S.K. Nayar, International Journal on Computer Vision, Vol. 27, No. 3, pp. 203-225, May 1998.
- Telecentric Optics for Focus Analysis. M. Watanabe and S.K. Nayar, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 12, pp. 1360-1365, Dec. 1997.
- Are Textureless Scenes Recoverable? H. Sundaram and S.K. Nayar, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 814-820, Jun. 1997.
- Real-Time Focus Range Sensor. S.K. Nayar, M. Watanabe and M. Noguchi, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 18, No. 12, pp. 1186-1198, Dec. 1996.
- Minimal Operator Set for Passive Depth from Defocus. M. Watanabe and S.K. Nayar, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 431-438, Jun. 1996.
- Real-Time Computation of Depth from Defocus. M. Watanabe, S.K. Nayar and M. Noguchi, Proceedings of The International Society for Optical Engineering (SPIE), Vol. 2599, pp. 14-25, Jan. 1996.
- *Web Links
- Patent numbers: WO9641304; US 6,229,913
- VIDEO PROFILE: SHREE NAYAR
- Country/Region
- USA
