This paper proposes underwater simultaneous localization and mapping (SLAM) with 3D reconstruction by applying YOLOv7 to acoustic images. In underwater exploration, acoustic cameras, regarded as the next generation of ultrasonic sensors, are gradually being adopted, and underwater SLAM technologies based on 3D reconstruction with acoustic cameras have been proposed. However, the accuracy of the resulting maps remains limited. In this study, we propose a novel approach that improves SLAM accuracy by applying YOLOv7 detection results in acoustic images to the 3D reconstruction. We utilized the objects detected by YOLOv7 as feature information for iterative closest point (ICP)-based SLAM.
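As a minimal illustration of the idea only (not the authors' implementation), the sketch below treats the centroids of YOLOv7-detected objects in two consecutive acoustic frames as sparse features and aligns them with an ICP-style loop; the detection centroids and the 2D range-azimuth geometry are toy assumptions.

```python
# Minimal sketch: ICP-style alignment of YOLOv7 detection centroids
# between two acoustic frames. Detection loading is left hypothetical.
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares R, t such that R @ src_i + t ~ dst_i (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp_on_detections(src, dst, iters=20):
    """Align detection centroids of frame k (src) to frame k+1 (dst)."""
    R_total, t_total = np.eye(src.shape[1]), np.zeros(src.shape[1])
    cur = src.copy()
    for _ in range(iters):
        # nearest-neighbour correspondences between detection centroids
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Toy example with hypothetical detection centroids (metres, sonar plane).
prev = np.array([[1.0, 0.2], [2.5, -0.4], [3.1, 0.9]])
theta = np.deg2rad(5.0)
R_gt = np.array([[np.cos(theta), -np.sin(theta)],
                 [np.sin(theta),  np.cos(theta)]])
curr = prev @ R_gt.T + np.array([0.10, -0.05])
R_est, t_est = icp_on_detections(prev, curr)
print(R_est, t_est)   # recovers the 5 degree rotation and small translation
```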
KEYWORDS: Thermography, RGB color model, Visualization, Education and training, 3D image processing, Cameras, Temperature distribution, Super resolution, Gallium nitride, Point clouds
In this study, we propose a new framework to perform visual simultaneous localization and mapping (SLAM) with RGB images artificially generated from thermal images in low-light environments where an optical camera cannot be applied. We applied contrastive unpaired translation (CUT) and the enhanced super-resolution generative adversarial network (ESRGAN), image translation and super-resolution methods, to generate clear, realistic RGB images from thermal images. Oriented FAST and rotated BRIEF (ORB)-SLAM was performed on the super-resolved fake RGB images to generate a 3D point cloud. Experimental results showed that our thermography-based visual SLAM could generate a 3D temperature distribution map in a low-light environment.
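A minimal sketch of the front end only, assuming a CUT + ESRGAN stage has already produced a super-resolved fake RGB frame (replaced here by a random placeholder image): ORB keypoints are extracted from it in the way ORB-SLAM's tracking stage would consume them.

```python
# Minimal sketch: ORB feature extraction on a translated "fake RGB" frame.
import cv2
import numpy as np

# Placeholder for the translated frame; in the described pipeline this
# would come from CUT followed by ESRGAN upscaling of a thermal image.
fake_rgb = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)

orb = cv2.ORB_create(nfeatures=1000)
gray = cv2.cvtColor(fake_rgb, cv2.COLOR_BGR2GRAY)
keypoints, descriptors = orb.detectAndCompute(gray, None)
print(f"{len(keypoints)} ORB keypoints extracted from the fake RGB frame")
```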
This paper proposes an efficient imaging sonar simulation method based on 3D modeling. In underwater scenarios, a forward-looking sonar, also known as an acoustic camera, outperforms other sensors, including popular optical cameras, because it is resistant to the turbidity and weak illumination typical of underwater environments and can thus provide accurate information about the environment. For underwater tasks that are highly automated with artificial intelligence and computer vision, an acoustic image simulator can provide support by reproducing the environment and generating synthetic acoustic images. It can also help researchers tackle the scarcity of real underwater data in theoretical studies. In this paper, we use 3D modeling to simulate underwater scenarios and to provide flexible, automated control of the acoustic camera and the objects in those scenarios. The simulation results and a comparison with real acoustic images demonstrate that the proposed simulator can generate accurate synthetic acoustic images efficiently and flexibly.
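The sketch below is an assumption-heavy toy version of such a simulator, not the paper's method: it rasterizes a 3D point cloud of a scene into a range-azimuth image, collapsing elevation as a forward-looking sonar does; the scene geometry and sensor parameters are illustrative.

```python
# Minimal sketch: rasterise scene points into a range-azimuth acoustic image.
import numpy as np

def simulate_acoustic_image(points, max_range=10.0, fov_deg=60.0,
                            n_range=256, n_azimuth=128):
    """points: (N, 3) array in the sonar frame, x forward, y right, z up."""
    x, y, z = points.T
    rng = np.sqrt(x**2 + y**2 + z**2)
    azi = np.arctan2(y, x)
    half_fov = np.deg2rad(fov_deg) / 2
    keep = (rng < max_range) & (np.abs(azi) < half_fov) & (x > 0)
    r_bin = (rng[keep] / max_range * (n_range - 1)).astype(int)
    a_bin = ((azi[keep] + half_fov) / (2 * half_fov) * (n_azimuth - 1)).astype(int)
    img = np.zeros((n_range, n_azimuth))
    # Accumulate echoes; nearer surfaces return stronger intensity (1/r^2 falloff).
    np.add.at(img, (r_bin, a_bin), 1.0 / np.maximum(rng[keep], 0.1) ** 2)
    return img / img.max() if img.max() > 0 else img

# Hypothetical scene: a flat seabed patch in front of the sensor.
xx, yy = np.meshgrid(np.linspace(1, 8, 200), np.linspace(-3, 3, 200))
seabed = np.stack([xx.ravel(), yy.ravel(), np.full(xx.size, -2.0)], axis=1)
image = simulate_acoustic_image(seabed)
print(image.shape)   # (256, 128) range-azimuth intensity image
```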
This paper presents very compact range image sensors for short-distance measurement, which are suitable for robot hands and similar applications. Robot manipulation tasks such as grasping require a range image sensor to obtain three-dimensional (3D) information about the target object. For such applications, it is necessary to avoid occlusion by the robot manipulator or hand during measurement, and attaching the sensor to the robot hand is an effective way to do so. To this end, a range sensor that is small enough and can measure at short distances is required. Two sensors are constructed in this paper: one uses a multi-slit laser projector and the other uses a multi-spot laser projector. A small laser projector and a small camera are combined, and range images are obtained in real time using the principle of active stereo. Appropriate methods to obtain range images are proposed for both sensors; in particular, for the sensor with the multi-slit laser projector, a method that uses both the disparity and the intensity of the laser light image is presented. The effectiveness of the proposed sensors is verified through short-range object measurement experiments.
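As a reminder of the active-stereo principle these sensors rely on, the sketch below converts observed disparity to depth via Z = fB/d for a rectified projector-camera pair; the focal length and baseline values are illustrative, not the paper's sensor parameters.

```python
# Minimal sketch: depth from disparity for a projector-camera active-stereo pair.
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Z = f * B / d, valid for a rectified projector-camera pair."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / np.maximum(disparity_px, 1e-6)

# e.g. a 500 px focal length and a 30 mm baseline (illustrative values)
disparities = np.array([60.0, 30.0, 15.0])              # pixels
print(depth_from_disparity(disparities, 500.0, 0.03))   # ~[0.25, 0.5, 1.0] m
```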
In this paper, a novel system that operates home appliances from arbitrary positions in a room is proposed, based on a "command space" associated with the operation of each home appliance. In the proposed system, a hand-waving gesture is used to operate the home appliances. First, three-dimensional (3D) positions are extracted from hand-waving gestures at two different positions, and a command space is set up based on the extracted 3D positions. Installing this user-definable command space thus makes it possible to operate the home appliances freely from an arbitrary place chosen by the user. Next, detailed operations are performed by hand waving within the command space. Experiments were conducted to confirm that detailed operations, such as TV channel switching, can be executed from different places using the proposed system.
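A toy sketch of how such a user-defined command space might be represented (the three-command layout and command names are made-up assumptions, not the authors' system): two hand positions define an axis-aligned box, and later gestures are classified by the slice of the box they fall into.

```python
# Minimal sketch: a user-defined command space as an axis-aligned box.
import numpy as np

def make_command_space(p1, p2):
    """Two hand-waving positions define the opposite corners of the box."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    return np.minimum(p1, p2), np.maximum(p1, p2)

def classify_gesture(point, lo, hi,
                     commands=("channel down", "mute", "channel up")):
    """Map a hand position inside the box to a command (hypothetical layout)."""
    point = np.asarray(point, float)
    if np.any(point < lo) or np.any(point > hi):
        return None                      # outside the command space
    # Split the box into equal slices along its horizontal (x) axis.
    frac = (point[0] - lo[0]) / max(hi[0] - lo[0], 1e-9)
    return commands[min(int(frac * len(commands)), len(commands) - 1)]

lo, hi = make_command_space([0.2, 1.0, 1.2], [0.8, 1.4, 1.8])  # metres
print(classify_gesture([0.75, 1.2, 1.5], lo, hi))              # -> "channel up"
```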
This paper proposes a novel approach that performs extrinsic parameter estimation of a camera installed in a man-made environment using a single image. The extrinsic parameter calibration problem is identical to the six-degrees-of-freedom (6DoF) localization problem of the camera. We take advantage of line information that is usually present in man-made environments such as the inside of a building. Our approach only requires a flat-surface map as the 3D environment model, which can easily be obtained from the blueprint of the man-made environment (e.g., CAD data). To manage the complicated 6DoF search problem, we propose a novel image descriptor defined in a quantized Hough space to perform 3D-2D matching between line features from the 3D flat-surface model and the 2D single image. The proposed method can robustly estimate the complete extrinsic parameters of the camera, as we demonstrate experimentally.
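The sketch below illustrates one plausible reading of a quantized-Hough-space descriptor (the binning and scoring specifics are assumptions, not the paper's formulation): detected lines are binned into a (rho, theta) histogram, so that a descriptor rendered from the 3D flat-surface model could be compared against the descriptor of the single query image, e.g. by an L1 distance.

```python
# Minimal sketch: a quantised (rho, theta) histogram over detected lines.
import cv2
import numpy as np

def hough_descriptor(gray, rho_bins=32, theta_bins=18, diag=None):
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=80)
    hist = np.zeros((rho_bins, theta_bins))
    if lines is None:
        return hist
    diag = diag or np.hypot(*gray.shape)
    for rho, theta in lines[:, 0]:
        # Quantise rho in [-diag, diag] and theta in [0, pi) into histogram bins.
        r = int(np.clip((rho + diag) / (2 * diag) * (rho_bins - 1), 0, rho_bins - 1))
        t = int(np.clip(theta / np.pi * (theta_bins - 1), 0, theta_bins - 1))
        hist[r, t] += 1
    n = hist.sum()
    return hist / n if n > 0 else hist

# Hypothetical query image with two synthetic line segments.
img = np.zeros((240, 320), np.uint8)
cv2.line(img, (20, 200), (300, 40), 255, 2)
cv2.line(img, (40, 20), (280, 220), 255, 2)
print(hough_descriptor(img).sum())   # -> 1.0 when any lines are detected
```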