KudanSLAM: 3D Recognition and Position Tracking Software Is Now Market-Ready for Autonomous Cars, Drones, and Robotics

-High tracking accuracy (1 mm-1 cm) / High speed / Low CPU consumption (below 5%) / Robustness-

TOKYO — (BUSINESS WIRE) — August 14, 2017 — Kudan, Inc. has developed "KudanSLAM*1", real-time camera-based 3D mapping and position tracking software, and has begun offering the technology to markets such as autonomous cars, ADAS*2, drones, and industrial and personal robots, in addition to the existing AR/VR industries.

SLAM is a software technology capable of 3D mapping and position tracking. It gives computers "computer vision": the ability to acquire, process, analyse and understand digital images, to map the surrounding 3D environment and its objects, and to determine the device's own location within that map. This computer-vision technology can be applied in industries such as autonomous driving and robotics.
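The mapping-and-localization loop described above can be illustrated with a toy sketch. This is a hypothetical 2D example for intuition only, not Kudan's algorithm: real visual SLAM systems track image features and refine poses with techniques such as bundle adjustment, whereas here a robot simply anchors landmarks in a map on first sight and averages its pose estimate toward them on re-sight.

```python
# Toy 2D SLAM loop: dead-reckon with biased odometry, build a landmark map,
# and correct the pose whenever a mapped landmark is re-observed.
# Purely illustrative; not how KudanSLAM, ORB-SLAM, or PTAM actually work.

LANDMARKS = [(2.0, 1.0), (4.0, -1.0), (6.0, 2.0)]   # true positions, unknown to the robot

def relative(pose, landmark):
    """Sensor model: relative offset landmark - pose (noise-free for clarity)."""
    return (landmark[0] - pose[0], landmark[1] - pose[1])

def run(correct=True):
    true_pose = (0.0, 0.0)
    est_pose = (0.0, 0.0)
    mapped = {}                          # landmark id -> estimated world position
    for _ in range(8):
        # Move right by 1 m; odometry consistently over-reports (drift source).
        true_pose = (true_pose[0] + 1.0, true_pose[1])
        est_pose = (est_pose[0] + 1.1, est_pose[1] + 0.02)
        for i, lm in enumerate(LANDMARKS):
            dx, dy = relative(true_pose, lm)
            if dx * dx + dy * dy > 9.0:  # only landmarks within 3 m are visible
                continue
            if i not in mapped:
                # First sighting: anchor the landmark using the current pose estimate.
                mapped[i] = (est_pose[0] + dx, est_pose[1] + dy)
            elif correct:
                # Re-sighting: the mapped position implies where the robot must be.
                implied = (mapped[i][0] - dx, mapped[i][1] - dy)
                est_pose = ((est_pose[0] + implied[0]) / 2,
                            (est_pose[1] + implied[1]) / 2)
    # Final position error of the estimate.
    return ((est_pose[0] - true_pose[0]) ** 2 +
            (est_pose[1] - true_pose[1]) ** 2) ** 0.5

print(f"drift without map correction: {run(correct=False):.2f} m")
print(f"drift with map correction:    {run(correct=True):.2f} m")
```

Re-observing mapped landmarks bounds the drift that raw odometry would otherwise accumulate without limit; that bounded-error property is what makes SLAM usable for the applications listed below.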

Kudan has been developing space- and object-tracking technology through its AR work. Building on this, Kudan has developed a practical, next-generation algorithm intended to replace existing SLAM implementations such as ORB-SLAM and PTAM*3, and has made the technology ready for the market.

As a leading SLAM company, Kudan aims to see KudanSLAM embedded in every camera-equipped, image-related device, in fields such as autonomous cars, ADAS, drones and robotics, in addition to the existing AR/VR area.

Key features of KudanSLAM

  • Hardware friendly: flexible camera setups, including monocular and rolling-shutter cameras, plus other sensors; ready to be embedded on a range of processors and architectures
  • High speed / low consumption: less than 5% of a mobile CPU
  • High tracking accuracy: 1 mm-1 cm*4
  • Robustness: works under severe lighting conditions and with unpredictable movement

*1 SLAM: Simultaneous Localization and Mapping; enables real-time 3D mapping and position tracking
*2 ADAS: Advanced Driver Assistance Systems
*3 ORB-SLAM, PTAM: existing open-source SLAM algorithms
*4 Kudan research: accuracy scales with the distance between the camera and the recognition target; 1 mm to 1 cm accuracy at 1 m distance with a smartphone-class camera.
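Taking footnote *4 at face value, the stated linear relationship between distance and error can be extrapolated with simple arithmetic. The function below is a hypothetical back-of-envelope helper, not part of any Kudan API:

```python
# Footnote *4 states error is proportional to camera-to-target distance,
# with 1 mm - 10 mm (1 cm) error at 1 m. Linear extrapolation of that claim:

def expected_error_mm(distance_m, error_at_1m_mm):
    """Scale the stated error-at-1-m linearly with distance (per the footnote)."""
    return error_at_1m_mm * distance_m

for d in (0.5, 1.0, 5.0):
    lo = expected_error_mm(d, 1.0)    # best case: 1 mm at 1 m
    hi = expected_error_mm(d, 10.0)   # worst case: 1 cm at 1 m
    print(f"at {d:>4} m: {lo:.1f} mm - {hi:.1f} mm expected error")
```

So at 5 m the same camera would be expected to track within roughly 5 mm to 5 cm, which is consistent with the centimeter-level parking-assistance use case described below.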

=== KudanSLAM Use Cases ===

1) Autonomous cars / ADAS

-KudanSLAM can also be combined with inertial sensors and LiDAR, enabling further robustness and more precise position tracking.
-It is useful for monitoring both the front and rear of the vehicle without being affected by environmental noise, and for parking assistance, which requires position tracking accurate to within a few centimeters.
(demo) KudanSLAM building a 3D map using a car camera
https://www.youtube.com/watch?v=EE-QvVTMTdY
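The release does not describe how Kudan fuses camera SLAM with inertial sensors, but one common textbook approach is a complementary filter: trust the smooth, high-rate inertial prediction short-term and the drift-free absolute camera pose long-term. The 1-D sketch below is an assumption-laden illustration of that generic idea, not Kudan's method:

```python
# Minimal 1-D complementary filter: fuse a drift-prone integrated signal
# (stand-in for an IMU) with a noisy but drift-free one (stand-in for a
# camera-SLAM pose). Illustrative only; not Kudan's actual fusion scheme.
import random

def fuse(alpha=0.9, steps=200):
    random.seed(0)                   # deterministic noise for reproducibility
    true_x = fused = imu_only = 0.0
    imu_bias = 0.05                  # constant bias -> unbounded drift if integrated
    for _ in range(steps):
        true_x += 0.1                # true motion per step
        imu_step = 0.1 + imu_bias    # biased velocity measurement
        imu_only += imu_step         # pure dead reckoning drifts away
        slam_x = true_x + random.gauss(0.0, 0.2)  # noisy absolute pose fix
        # Blend: mostly the inertial prediction, nudged toward the absolute pose.
        fused = alpha * (fused + imu_step) + (1 - alpha) * slam_x
    return abs(imu_only - true_x), abs(fused - true_x)

imu_err, fused_err = fuse()
print(f"IMU-only drift after 200 steps: {imu_err:.2f} m")
print(f"fused estimate error:           {fused_err:.2f} m")
```

The fused estimate stays near the truth while the integrated inertial signal drifts without bound, which is the basic reason camera-based SLAM and inertial sensing complement each other.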

2) Drones

-Even with a drone's low-end camera, it can recognize objects and track position precisely, with 1 mm to 1 cm accuracy.
-It remains robust under severe lighting conditions, occlusion and unpredictable movement.
(demo) KudanSLAM building a 3D map using a drone camera
https://www.youtube.com/watch?v=GDJ6aFsPWN4

3) Robotics

-Even without external sensors, KudanSLAM enables a robot to operate independently, so it can work freely without purpose-built facilities or environments.

4) VR/AR/MR

-Even without markers, KudanSLAM provides absolute position tracking, so a specific AR object can be displayed at a specific place and the same AR image shared with other people.
-Navigation is available without GPS, for example indoors or inside a factory.
-KudanSLAM can track a headset wearer's position and gaze, data that can be analysed to improve operational efficiency.

 

Comparison chart of SLAM specification

 

Technical Strength: Practicality

KudanSLAM delivers high speed, low CPU consumption, high accuracy and high robustness in any required setup.

     
Performance comparison with open-source algorithms*1

                                            KudanSLAM   PTAM   ORB
  Processing time / computing footprint
  (ms per single tracking frame)                    1     15    30
  Tracking accuracy (mm at 1 m distance)            1     30    10
  Mapping accuracy (mm at 1 m distance)             1     50    20
  Occlusion (minimum % of field of view
  required for workable tracking)                  10     50    20
 

*1: Original evaluation using a 2014 MacBook Pro and a Duo3D stereo camera
( https://duo3d.com/product/duo-minilx-lv1 )

 
