Robotics SDK


 

SENTIBOTICS
READY-TO-USE ROBOTICS DEVELOPMENT KIT
The SentiBotics development kit is designed to provide a starting point for researchers and developers who would like to focus on robot development. SentiBotics can also be used as an educational platform at universities.
The kit includes a mobile autonomous robot with a manipulator, a ROS-based SDK and the full source code of all algorithms used in the robot software. The robot can navigate autonomously in a typical environment, recognize 3D objects, and grasp and manipulate them.


 

FEATURES AND CAPABILITIES

  • Robot hardware is included, with assembly and setup instructions.
  • Original proprietary algorithms for autonomous navigation, 3D object recognition and manipulation.
  • Development kit is based on the open and versatile ROS (Robot Operating System) framework.
  • Source code for the robotic algorithms is included.
  • Robot hardware is based on the components available on the market and can be modified by customers.
  • Robotic simulator can be used for algorithm evaluation and software development without real robotic hardware.



Neurotechnology began research and development in the autonomous robotics field in 2004. Ten years later, in 2014, Neurotechnology released the SentiBotics development kit, which includes easy-to-setup robotic hardware and original proprietary algorithms for autonomous navigation, 3D object recognition and grasping:

  • Simple assembly and setup of the robot hardware. The SentiBotics robot hardware is shipped as a set of several components (tracked platform, robotic arm, cameras, etc.) that need to be assembled and connected. All necessary instructions are included. The on-board computer comes with Ubuntu Linux and the SentiBotics software pre-installed. See the robotic platform specifications for more information.
  • Software is based on the ROS (Robot Operating System) framework. The software for robot navigation, object recognition and object manipulation using the robotic arm is based on the widely used Robot Operating System. Researchers and developers may apply their experience with ROS and their existing ROS-based software when working with the SentiBotics development kit. The SentiBotics kit includes a ROS-based infrastructure that allows integration of third-party hardware components or robotics algorithms.
  • SLAM and navigation. SentiBotics uses an original navigation algorithm based on recognizing certain elements of an environment. The robot first needs to explore the environment and build a map of it. Users may run the mapping process manually by controlling the robot via the included control pad, or by writing a simple set of movement instructions. After the environment map is built, the robot is able to move, navigate and operate in the environment completely autonomously.
  • Object recognition and manipulation. SentiBotics includes a set of original, computer-vision-based algorithms for object learning and recognition. Users may teach the robot to recognize a previously unknown object by placing it in front of the robot's 3D camera and assigning an identifier to it. The robot will then be able to recognize the learned object in the environment. Users may also specify which objects should be grasped with the robot's arm; once the robot sees a specified object within grasping range, it will try to grasp it and place it into the attached container.
  • Source code for the algorithms is included. The SentiBotics kit includes the full source code for the algorithms used in the robot, together with working software samples for autonomous navigation, object recognition and manipulation. The SentiBotics algorithms are written in C++ and designed to run on the specified robotic hardware, but they can be ported to other robotic platforms that include the specified or a better computer.
  • Robot hardware is based on components available on the market. Users may purchase additional components to upgrade the robot, change its functionality or build other robots that will run the SentiBotics software.
  • Robotics simulator. SentiBotics software can be run in Gazebo robotics simulator for algorithm evaluation and software development without using real robotic hardware. See the robotic simulator section for more information. A 30-day trial version of SentiBotics development kit is available for download.
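A scripted mapping run of the kind mentioned above ("a simple set of movement instructions") can be sketched in plain Python. The instruction format and the dead-reckoning helper below are illustrative assumptions, not the SentiBotics API:

```python
import math

# Hypothetical instruction format for a manual mapping run:
# ("forward", meters) or ("turn", degrees). The real SentiBotics command
# names are not documented here; this only illustrates the idea.
instructions = [
    ("forward", 2.0),
    ("turn", 90.0),
    ("forward", 1.5),
    ("turn", 90.0),
]

def dead_reckon(instructions):
    """Track the robot's pose (x, y, heading in degrees) while replaying
    a scripted mapping run, starting from the origin facing along +x."""
    x, y, heading = 0.0, 0.0, 0.0
    poses = [(x, y, heading)]
    for op, value in instructions:
        if op == "forward":
            x += value * math.cos(math.radians(heading))
            y += value * math.sin(math.radians(heading))
        elif op == "turn":
            heading = (heading + value) % 360.0
        poses.append((round(x, 3), round(y, 3), heading))
    return poses

print(dead_reckon(instructions)[-1])  # final pose after the scripted run
```

Replaying such a sequence while the robot records sensor data is one way a user could drive the map-building phase without the control pad.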


 

SENTIBOTICS NAVIGATION SDK
IMITATION LEARNING-BASED AUTONOMOUS ROBOT NAVIGATION KIT
The software development kit is designed for researchers and engineers working on autonomous robot navigation projects. The kit may also be used for educational purposes at universities and other educational institutions.
It is available either as a complete, ready-to-run robotics system that includes Neurotechnology's mobile reference platform prototype, or as a software-only option for integration into existing robotics hardware.


 

FEATURES AND CAPABILITIES

  • Proprietary deep-neural-network-based algorithm for imitation-learning-based autonomous robot navigation.
  • Autonomous navigation over long distances.
  • Object learning and recognition engine included.
  • Software Development Kit is based on TensorFlow and the ROS (Robot Operating System) middleware framework.
  • Gazebo simulator is fully integrated with the kit, and can be used for software development and algorithm evaluation without the need for robotic hardware.
  • Complete source code for the robotics algorithms is included.
  • Detailed specifications for the robotics hardware are included.
  • Ready-to-run mobile robot prototype is optionally available.


 

SentiBotics Navigation SDK provides the following functionality:

  • Imitation learning based autonomous robot navigation – via the control interface, the user first drives the robot several times along the desired closed trajectory. During this process, training data pairs (images and control pad commands) are captured, and a deep-neural-network, imitation-learning-based motion controller is trained off-line using the TensorFlow framework and the provided controller-training infrastructure. Once the controller is fully trained, the robot may be placed at any point along the learned trajectory and will function autonomously within that environment.
  • Autonomous navigation over long distances – the trained controller allows the robot to navigate over distances of several hundred meters or more.
  • Object learning and recognition – users may enroll objects of interest into the included object recognition engine to further enhance the controller-learned space. The object recognition capability allows the robot to perform certain actions once a particular object is detected, as well as to recharge automatically by searching for docking stations along the trajectory.
  • Easy integration with other ROS-based robots – SentiBotics Navigation SDK uses the TensorFlow infrastructure for training and executing DNN-based robot navigation controllers. It also relies on ROS middleware (the Kinetic Kame and Melodic Morenia versions are supported). The algorithm needs input data only from a single webcam and two low-cost ultrasonic range finders, and thus does not require any advanced hardware for scanning the environment.
  • Source code for the robotics algorithms – the SDK includes the Python and C++ code for all robotics algorithm implementations (trajectory learning and execution, object learning and recognition, high-level robot control interface), URDF description of robot models, and software for integration of SentiBotics Navigation SDK with the Gazebo simulator.
  • Usage scenarios – the algorithms run either in simulator mode or in real-world mode (with robot hardware). Training data collection and controller training are the same in all scenarios:
    • Gazebo robotic simulator – SentiBotics Navigation SDK includes fully integrated Gazebo simulator software and allows navigation controller training to be performed in a virtual environment. A comprehensive 3D test model, complete with an office layout and a simulated SentiBotics robot, is included. This scenario has the lowest hardware requirements, as it requires only a PC that meets the system requirements.
    • Remote robot – in this scenario the robot transmits camera images and sensor data to a PC or server that runs the SentiBotics Navigation software to compute motion commands. The generated commands are then sent back to the robot. This scenario has low hardware requirements for the robot but requires a robust wireless network connection.
    • Stand-alone robot – the trained controller is converted to Movidius graph format, uploaded to the robot's computer, and executed onboard. This scenario requires a robot with dedicated hardware for DNN computations, like Movidius NCS dongle or a GPU. In this scenario the robot can be used in areas without wireless network coverage.
  • Robotic hardware – documentation with detailed specifications for the SentiBotics reference mobile platform hardware can be downloaded for free. SentiBotics Navigation SDK customers can also purchase a ready-to-run mobile robot prototype.
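The imitation-learning step described above (training a controller on captured image/command pairs) can be illustrated in miniature. The SDK trains a deep network with TensorFlow; as a hedged stand-in, the sketch below fits a one-parameter linear controller to synthetic "expert" demonstrations, where a scalar lane-offset feature replaces the camera image:

```python
import random

random.seed(0)

# Synthetic demonstrations: the "expert" steers proportionally to the
# offset it observes. Real training pairs are (image, control command);
# a scalar offset feature stands in for the image here.
EXPERT_GAIN = -0.8
offsets = [random.uniform(-1, 1) for _ in range(200)]
data = [(x, EXPERT_GAIN * x) for x in offsets]

# Behavior cloning in miniature: fit the controller parameter by gradient
# descent on the mean squared error between its output and the expert's.
w, lr = 0.0, 0.1
for _ in range(200):
    grad = sum(2 * (w * x - cmd) * x for x, cmd in data) / len(data)
    w -= lr * grad

# The learned controller now reproduces the expert's steering policy.
print(round(w, 3))  # close to the expert gain of -0.8
```

A real controller replaces the single weight with a deep network and the scalar feature with camera frames, but the training loop has the same shape: minimize the mismatch between predicted and demonstrated commands.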

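The remote-robot scenario described above amounts to a request/response loop: the robot sends sensor readings, the server computes and returns a motion command. The message fields and the obstacle rule below are illustrative assumptions, not the SDK's actual wire format:

```python
import json

def server_compute_command(sensor_msg):
    """Stand-in for the SentiBotics Navigation software on the PC/server:
    turn away when an ultrasonic reading is close, otherwise drive forward.
    Field names ("ultrasonic_m", "linear", "angular") are hypothetical."""
    reading = json.loads(sensor_msg)
    if min(reading["ultrasonic_m"]) < 0.3:
        return json.dumps({"linear": 0.0, "angular": 0.5})  # obstacle: rotate
    return json.dumps({"linear": 0.2, "angular": 0.0})      # clear: go forward

# Robot side: serialize sensor data and "transmit" it (a function call
# stands in for the wireless link), then apply the returned command.
msg = json.dumps({"ultrasonic_m": [1.2, 0.9], "image": "<jpeg bytes>"})
cmd = json.loads(server_compute_command(msg))
print(cmd)  # {'linear': 0.2, 'angular': 0.0}
```

The stand-alone scenario removes this round trip by running the (Movidius- or GPU-accelerated) controller on the robot itself, which is why it tolerates areas without wireless coverage.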