Selected Publications

This paper presents the advantages of a single-camera stereo omnidirectional system (SOS) in estimating egomotion in real-world environments. The challenge of achieving omnidirectional stereo vision with a single camera is what separates our work from others. In practice, dynamic environments, deficient illumination, and poorly textured surfaces result in a lack of features to track in the observable scene. As a consequence, the pose estimation of visual odometry (VO) systems suffers, regardless of their field of view. We compare the tracking accuracy and stability of the single-camera SOS against an RGB-D device under various real circumstances. Our quantitative evaluation is performed with respect to 3D ground-truth data obtained from a motion capture system. The datasets and experimental results we provide are unique due to the nature of our catadioptric omnistereo rig and the situations in which we captured these motion sequences. We have implemented a tracking system with simple rules applicable to both synthetic and real scenes. Our implementation does not make any motion-model assumptions, and it maintains a fixed configuration among the compared sensors. Our experimental outcomes confirm the robustness in 3D metric visual odometry estimation that the single-camera SOS can achieve under normal and special conditions in which narrow field-of-view perspective systems such as RGB-D cameras would fail.
In MVAP, 2019.

We explore low-cost solutions for efficiently improving the 3D pose estimation problem of a single camera moving in an unfamiliar environment. The visual odometry (VO) task – as it is called when using computer vision to estimate egomotion – is of particular interest to mobile robots as well as humans with visual impairments. The payload capacity of small robots like micro-aerial vehicles (drones) requires the use of portable perception equipment, which is constrained by size, weight, energy consumption, and processing power. Using a single camera as the passive sensor for the VO task satisfies these requirements and motivates the proposed solutions presented in this thesis. To achieve this portability goal with a single off-the-shelf camera, we have taken two approaches: the first one, and the most extensively studied here, revolves around an unorthodox camera-mirrors configuration (catadioptrics) achieving a stereo omnidirectional system (SOS). The second approach relies on expanding the visual features from the scene into higher dimensionalities to track the pose of a conventional camera in a photogrammetric fashion. The first approach has many interdependent challenges, which we address as part of this thesis: SOS design, projection model, adequate calibration procedure, and application to VO. We show several practical advantages of the single-camera SOS, owing to its complete 360-degree stereo views, over conventional 3D sensors with limited fields of view. Since our omnidirectional stereo (omnistereo) views are captured by a single camera, a truly instantaneous pair of panoramic images is possible for 3D perception tasks. Finally, we address the VO problem as a direct multichannel tracking approach, which increases the pose estimation accuracy of the baseline method (i.e., using only grayscale or color information) under the photometric error minimization at the heart of the 'direct' tracking algorithm.
Currently, this solution has been tested on standard monocular cameras, but it could also be applied to an SOS. We believe the challenges that we attempted to solve have not previously been considered with the level of detail needed to successfully perform VO with a single camera, the ultimate goal, in both real-life and simulated scenes.
Thesis, 2018.
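The photometric error minimization at the heart of the 'direct' tracking approach can be illustrated with a minimal sketch. The toy example below is not the thesis implementation: it aligns two 1D intensity signals under a pure translation with Gauss-Newton, whereas the actual system estimates a full 6-DoF pose over image pixels (and, in the multichannel case, stacks residuals from several feature channels). Names and parameters here are purely illustrative.

```python
import numpy as np

def photometric_align_1d(ref, cur, iters=20):
    """Estimate a horizontal shift t aligning cur to ref by minimizing
    the photometric error sum((cur(x + t) - ref(x))^2) via Gauss-Newton.
    A 1D stand-in for direct (photometric) pose tracking."""
    x = np.arange(len(ref), dtype=float)
    t = 0.0
    for _ in range(iters):
        warped = np.interp(x + t, x, cur)   # cur resampled at shifted coordinates
        grad = np.gradient(warped)           # intensity gradient = Jacobian of the warp
        r = warped - ref                      # photometric residual
        denom = np.dot(grad, grad)            # J^T J (a scalar here)
        if denom < 1e-12:
            break
        t -= np.dot(grad, r) / denom          # Gauss-Newton update: t -= (J^T J)^-1 J^T r
    return t

# Example: a Gaussian bump shifted right by 3 pixels is recovered by alignment.
x = np.arange(100, dtype=float)
ref = np.exp(-0.5 * ((x - 50) / 5.0) ** 2)
cur = np.exp(-0.5 * ((x - 53) / 5.0) ** 2)   # cur(x) = ref(x - 3)
t_est = photometric_align_1d(ref, cur)
```

In the multichannel setting, the same update simply concatenates the residuals and Jacobians of every channel before solving for the pose increment.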

We present direct multichannel tracking, an algorithm for tracking the pose of a monocular camera (visual odometry) using high-dimensional features in a direct image alignment framework.
In 3DV, 2017.

GUMS is a complete projection model for omnidirectional stereo vision systems. It builds on the existing generalized unified model (GUM), which we extend to fixed-baseline sensors.
In IROS, 2016.
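For intuition, the classical unified model that GUM generalizes projects a 3D point onto a unit sphere and then perspectively from a center offset along the optical axis. The sketch below follows that textbook formulation (Geyer-Daniilidis style, with mirror parameter xi); it is an illustrative single-view approximation, not the GUMS model itself, and the parameter values are made up.

```python
import numpy as np

def unified_project(P, xi, fx, fy, cx, cy):
    """Unified (sphere) camera model: project a 3D point P onto the
    unit sphere, then perspectively from a center shifted by xi along z.
    xi = 0 reduces to an ordinary pinhole projection."""
    P = np.asarray(P, dtype=float)
    Ps = P / np.linalg.norm(P)      # step 1: point on the unit sphere
    denom = Ps[2] + xi               # step 2: shifted perspective projection
    u = fx * Ps[0] / denom + cx
    v = fy * Ps[1] / denom + cy
    return np.array([u, v])
```

A stereo omnidirectional model like GUMS additionally ties two such projections together through the rig's fixed baseline, so both mirror views share one consistent calibration.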

We describe the design and 3D sensing performance of an omnidirectional stereo (omnistereo) vision system applied to Micro Aerial Vehicles (MAVs).
In Sensors, 2016.

All Publications

Visual Odometry with a Single-Camera Stereo Omnidirectional System. In MVAP, 2019.

Details Video Code Dataset Project

Enhancing 3D Visual Odometry with Single-Camera Stereo Omnidirectional Systems. Thesis, 2018.

Details PDF Project

Direct Multichannel Tracking. In 3DV, 2017.

Details PDF Video Project Supplementary Materials

GUMS: A Generalized Unified Model for Stereo Omnidirectional Vision. In IROS, 2016.

Details PDF Slides Video Project

Design and Analysis of a Single-Camera Omnistereo Sensor for Quadrotor Micro Aerial Vehicles (MAVs). In Sensors, 2016.

Details PDF Code

Autonomous Quadrotor Flight Using Onboard RGB-D Visual Odometry. In ICRA, 2014.

Details PDF Video Code

6-DoF Pose Localization in 3D Point-Cloud Dense Maps Using a Monocular Camera. In ROBIO, 2013.

Details PDF Video

A Single-Camera Omni-Stereo Vision System for 3D Perception of Micro Aerial Vehicles (MAVs). In ICIEA, 2013.

Details PDF

Incremental Registration of RGB-D Images. In ICRA, 2012.

Details PDF

Generating Near-Spherical Range Panoramas by Fusing Optical Flow and Stereo from a Single-Camera Folded Catadioptric Rig. In MVAP, 2011.

Details PDF

Fusing Optical Flow and Stereo in a Spherical Depth Panorama Using a Single-Camera Folded Catadioptric Rig. In ICRA, 2011.

Details PDF


Visual Odometry with a Single-Camera Stereo Omnidirectional System

A 3D metric visual odometry system for a single-camera catadioptric omnistereo rig, evaluated against an RGB-D sensor with motion-capture ground truth.

Direct Multichannel Tracking

An extension of semi-dense visual odometry (camera pose tracking), originally applied to single grayscale images by the state-of-the-art Large-Scale Direct (LSD) SLAM, to high-dimensional feature channels.

GUMS: Generalized Unified Model for Stereo Omnidirectional Vision

An extension of the generalized unified model (GUM) originally applied to a single omnidirectional view.

Single-Camera Omnistereo Sensor for Quadrotor Micro Aerial Vehicles (MAVs)

A novel omnidirectional stereo sensor using a pair of hyperbolic mirrors and a single camera.


In 2011, our intelligent ground vehicle CATA (City Autonomous Transportation Agent) was rebuilt on the ROS framework to participate in the IGVC.


In 2010, our intelligent ground vehicle CityALIEN competed in and won the IGVC Design Challenge.


I have quite a bit of experience teaching both kids and adults.

I was an adjunct lecturer at CUNY Lehman College for the Mathematics and Computer Science Dept., where I taught the following courses:

  • CIS 212 Microcomputer Architecture (Spring 2014-Spring 2016): This required course provides a broad study of the architecture of microcomputer systems, with emphasis on CPU functionality; system bus and memory design and performance; secondary storage technologies and management; input/output peripherals (display and printer technologies); and network technologies. The course follows the Systems Architecture textbook by Stephen D. Burd.

  • CMP 230 Programming Methods I (Fall 2013): Introduced freshman students to structured computer programming using Python, a modern high-level programming language. Topics included programming constructs such as console I/O, data types, variables, control structures, iteration, data structures, function definitions and calls, parameter passing, functional decomposition, object-oriented programming, and debugging and documentation techniques.

I have also taught STEM summer courses, such as:

  • STEM Robotics (Summer 2015), sponsored by the CUNY City College STEM Institute: An intensive program for selected high school students, who learned the fundamentals of mobile robotics using the Raspberry Pi computer and the Python programming language to actuate motors and poll sensor data (e.g., ultrasonic, infrared) from various electronic components. Ultimately, participants built robots to compete in an autonomous robot sumo tournament.

I have also taught middle school in NYC:

In the summer of 2013, I participated in a two-week NSF-sponsored CUNY Science Now Professional Development Workshop.


  • omnistereo AT gmail
  • Grove School of Engineering, Room T539, City College of New York, New York, NY 10031, USA