Visual Tracking (visual + tracking)
Selected Abstracts

Visual Tracking and LIDAR Relative Positioning for Automated Launch and Recovery of an Unmanned Rotorcraft from Ships at Sea
NAVAL ENGINEERS JOURNAL, Issue 2 2009
MATT GARRATT
Sensors and systems for a fully autonomous unmanned helicopter have been developed with the aim of completely automating the landing and launch of a small unmanned helicopter from the deck of a ship. For our scheme, we have combined a laser rangefinder (LRF) system with a visual tracking sensor to construct a low-cost guidance system. Our novel LRF system determines both the distance to and the orientation of the deck in one cycle. We have constructed an optical sensor to complement the laser system, comprising a digital camera interfaced to a Field Programmable Gate Array (FPGA), which enables the entire target-tracking computation to be achieved in a very small, self-contained form factor. A narrowband light source on the deck is detected by the digital camera and tracked by an algorithm implemented on the FPGA to provide a relative bearing to the deck from the helicopter. By combining the optical sensor bearing with the information from the laser system, an accurate estimate of the helicopter position relative to the deck can be found. [source]

Visual Tracking For References Generated By A Stochastic Model
ASIAN JOURNAL OF CONTROL, Issue 3 2003
T. Kamiya
ABSTRACT This paper describes a visual tracking system for an unknown reference signal. A time-varying reference signal is realized as a random process generated by an auto-regressive (AR) model, which is identified by a recursive algorithm. Based on the obtained AR model, the future value of the reference signal is predicted. We propose a new visual tracking system using generalized minimum variance control (GMVC) and illustrate its properties through experiments. [source]

Targeted driving using visual tracking on Mars: From research to flight
JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 3 2009
Won S. Kim
This paper presents the development, validation, and deployment of the visual target tracking capability onto the Mars Exploration Rover (MER) mission. Visual target tracking enables targeted driving, in which the rover approaches a designated target in a closed visual feedback loop, increasing the target position accuracy by an order of magnitude and resulting in fewer ground-in-the-loop cycles. As a result of an extensive validation, we developed a reliable normalized cross-correlation visual tracker. To enable tracking with the limited computational resources of a planetary rover, the tracker uses the vehicle motion estimate to scale and roll the template image, compensating for large image changes between rover steps. The validation showed that a designated target can be reliably tracked within several pixels, or a few centimeters, of accuracy over a 10-m traverse using a rover step size of 10% of the target distance in any direction. It also showed that the target is not required to have conspicuous features and can be selected anywhere on natural rock surfaces, excluding rock boundaries and shadowed regions. The tracker was successfully executed on the Opportunity rover near Victoria Crater on four distinct runs, including a single-sol instrument placement. We present flight experiment data on the tracking performance and execution time. © 2009 Wiley Periodicals, Inc. [source]
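The MER targeted-driving abstract above is built around normalized cross-correlation (NCC) template tracking with a motion-compensated template. The sketch below is not the flight code; it only illustrates the general technique using OpenCV, and the function names, parameters, and the idea of approximating motion compensation with a simple scale-and-roll warp are assumptions for illustration.

```python
# Minimal sketch of NCC template tracking with a motion-compensated template
# (illustrative only; not the MER implementation).
import cv2


def warp_template(template, scale, roll_deg):
    """Rescale and roll the template to compensate for estimated vehicle
    motion between rover steps (scale/roll values are assumed inputs)."""
    h, w = template.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), roll_deg, scale)
    return cv2.warpAffine(template, M, (w, h))


def ncc_track(image, template):
    """Return the (x, y) of the best match and its normalized correlation score."""
    # TM_CCOEFF_NORMED is OpenCV's normalized cross-correlation variant.
    response = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(response)
    return max_loc, max_val
```

A typical use would warp the previous step's template with the estimated scale change and roll, then call `ncc_track` on the new image and accept the match only if the correlation score clears a threshold.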
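Similarly, the shipboard launch-and-recovery abstract at the top of this list combines a camera bearing to a deck beacon with laser range and deck orientation to estimate relative position. The following sketch shows one plausible way to fuse those measurements; the axis conventions, frame names, and function signatures are assumptions, not details from the paper.

```python
# Illustrative bearing-plus-range fusion for deck-relative positioning
# (a sketch under assumed conventions, not the authors' implementation).
import numpy as np


def bearing_to_unit_vector(azimuth, elevation):
    """Unit line-of-sight vector in an assumed camera frame
    (x forward, y right, z down); angles in radians."""
    return np.array([
        np.cos(elevation) * np.cos(azimuth),
        np.cos(elevation) * np.sin(azimuth),
        -np.sin(elevation),
    ])


def relative_position(range_m, azimuth, elevation, R_deck_from_cam):
    """Helicopter-to-deck vector expressed in the deck frame.

    range_m would come from the laser rangefinder, (azimuth, elevation)
    from the optical tracker, and R_deck_from_cam from the deck orientation
    that the LRF measures.
    """
    los_cam = range_m * bearing_to_unit_vector(azimuth, elevation)
    return R_deck_from_cam @ los_cam
```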
Application of visual tracking for robot-assisted laparoscopic surgery
JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 7 2002
Xiaoli Zhang
With the increasing popularity of laparoscopic surgery, the demand for better modes of laparoscopic surgery also increases. The current mode of laparoscopic surgery requires an assistant to hold and manipulate the endoscope according to commands from the surgeon. During lengthy procedures, however, accurate and timely adjustment of the camera cannot be guaranteed due to fatigue and hand trembling of the camera assistant. This article proposes a practical visual tracking method to achieve automated instrument localization and endoscope maneuvering in robot-assisted laparoscopic surgery. Solutions concerning this approach, such as endoscope calibration, marker design, distortion correction, and endoscope manipulator design, are described in detail. Experimental results are presented to show the feasibility of the proposed method. © 2002 Wiley Periodicals, Inc. [source]
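The laparoscopic abstract above mentions endoscope calibration, marker design, and distortion correction as the building blocks of instrument localization. The sketch below shows one generic way such a pipeline can look with OpenCV: undistort the frame, threshold a colored marker, and take its centroid as the tracked image point. The HSV color range, camera parameters, and function names are placeholders, not values or code from the paper.

```python
# Minimal sketch of marker-based instrument localization
# (illustrative only; parameters are assumed placeholders).
import cv2
import numpy as np


def locate_marker(frame_bgr, camera_matrix, dist_coeffs,
                  hsv_low=(35, 80, 80), hsv_high=(85, 255, 255)):
    """Return the pixel coordinates of the marker centroid in the
    distortion-corrected image, or None if no marker pixels are found."""
    # Correct lens distortion using previously calibrated intrinsics.
    undistorted = cv2.undistort(frame_bgr, camera_matrix, dist_coeffs)
    # Segment the colored marker in HSV space (assumed color range).
    hsv = cv2.cvtColor(undistorted, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    # Centroid of the binary mask via image moments.
    m = cv2.moments(mask, True)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```

In a full system, the resulting image point would feed a control loop that steers the endoscope manipulator to keep the instrument centered in the view.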