YAHBOOM DOFBOT AI Vision Robotic Arm User Guide
- June 12, 2024
- YAHBOOM
Table of Contents
- YAHBOOM DOFBOT AI Vision Robotic Arm
- Product Information
- Product Usage Instructions
- About DOFBOT
- PRODUCT FEATURES
- Based on JETSON NANO's powerful AI computing capabilities
- Cross-platform interconnection control
- Color recognition interaction
- Model training / Garbage sorting
- Human feature recognition and interaction
- ROBOT MOTION CONTROL
- Powerful hardware configuration
- Function on expansion board
- Guided quick start
- MULTIPLE REMOTE CONTROL
- Massive learning materials
- PRODUCT SPECIFICATION
- PACKAGE LIST
- Features
YAHBOOM DOFBOT AI Vision Robotic Arm
Product Information
- The Yahboom DOFBOT AI Vision Robotic Arm is a versatile and convenient robotic arm designed for first-time users. It comes pre-assembled and includes a TF card with a factory image file, allowing for a plug-and-play experience without the need for complex operations. The robot arm features a 6 DOF (Degree of Freedom) design and is integrated with a camera, making it a 2-in-1 device.
- The structural design of the DOFBOT is excellent, with an all-aluminum alloy bracket that has a thickness of 2mm. The chassis is equipped with suction cups, ensuring stability and allowing it to be placed securely in any experimental environment. The robot arm is capable of performing various functions and comes with detailed tutorials and codes.
- In terms of hardware, the DOFBOT is equipped with a multifunctional expansion board that is compatible with Jetson NANO, Raspberry Pi, Arduino, and Micro:bit boards. It features 5 × 15KG bus servos and 1 × 6KG bus servo, offering precise and reliable movement. The product also includes a PS2 handle receiver, a WiFi/Bluetooth module interface, and an I2C port for user customization.
- One of the standout features of the DOFBOT is its fantastic AI capability. It supports remote control from the Android/iOS APP, the PC host computer, a game handle, and Jupyter Lab web-based online programming. Users can teach and store custom action groups and enjoy various AI vision gameplays such as gesture recognition, color interaction, visual positioning, garbage sorting, catch game, face tracking, and blocks stack.
Product Usage Instructions
- Unbox the Yahboom DOFBOT AI Vision Robotic Arm package.
- Ensure that all components are included in the package list.
- Connect the DOFBOT to a power source.
- Download the mobile app associated with the DOFBOT from either the Android or iOS store.
- Use the DOFBOT’s camera to scan the QR code shown in the mobile app to quickly configure the network and start up the DOFBOT.
- Refer to the tutorials and codes provided for each function of the DOFBOT for detailed instructions.
- If using a Jetson NANO, Raspberry Pi, Arduino, or Micro:bit board, ensure compatibility with the DOFBOT’s multifunctional expansion board.
- For remote control, install the host computer software on your PC, or use the game handle or the Jupyter Lab web page for online programming.
- Explore the various AI vision game plays available, such as gesture recognition, color interaction, visual positioning, garbage sorting, catch game, face tracking, and blocks stack.
About DOFBOT
DOFBOT is an AI vision robotic arm with Jetson Nano as the main controller, OpenCV as the image processing library, the mainstream Jupyter Lab as the development tool, and Python3 as the main programming language. The camera is integrated with the robotic arm, and the “2 in 1” design makes the visual recognition of the robotic arm more flexible. It can not only realize color recognition, tracking and grabbing, but also recognize and interact with human body features, and even train a garbage classification model for garbage sorting. Through the ROS robot control system, the complex motion control of the 6-DOF serial bus servos is simplified. An official guided quick-start tutorial is provided to help you easily control the DOFBOT AI vision robotic arm.
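As a minimal illustration of this Jupyter Lab / Python3 workflow, the sketch below moves one joint and reads its angle back. It assumes the Arm_Lib driver package that Yahboom ships on the factory image; the module, class and method names shown here (Arm_Device, Arm_serial_servo_write, Arm_serial_servo_read) should be verified against the tutorials for your image version.

```python
# Minimal sketch of driving one DOFBOT joint from a Jupyter Lab cell.
# Assumption: the factory image ships Yahboom's Arm_Lib Python driver with an
# Arm_Device class; verify the exact names against the official tutorials.
import time
from Arm_Lib import Arm_Device

arm = Arm_Device()
time.sleep(0.1)

# Move servo 1 (the base joint) to 90 degrees over 1000 ms.
arm.Arm_serial_servo_write(1, 90, 1000)
time.sleep(1.2)

# Read the angle back from the bus servo to confirm the move.
angle = arm.Arm_serial_servo_read(1)
print("Servo 1 angle:", angle)
```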
PRODUCT FEATURES
Based on JETSON NANO's powerful AI computing capabilities
NVIDIA Jetson Nano is a small but powerful computer that can run multiple neural networks, object detection, segmentation and speech processing applications at the same time. Equipped with a quad-core Cortex-A57 processor, a 128-core Maxwell GPU and 4GB of LPDDR4 memory, Jetson Nano has sufficient AI computing power, provides 472 GFLOPS of computing power, and supports a series of popular AI frameworks and algorithms, such as TensorFlow, PyTorch, Caffe/Caffe2, Keras, MXNet, etc.
Artificial Intelligence Vision Robotic Arm
The first-person perspective controls the robotic arm, making visual recognition more flexible.
- Artificial intelligence / Camera and robot arm 2 in 1
- Camera on the robot arm: 3D simulation modeling, complex and powerful algorithm; FPV vision, stronger experience.
- Separate camera: converts the 3D scene to a 2D plane; simple algorithm, suitable for getting started.
- FPV HD video control
Cross-platform interconnection control
- Multi-function mobile phone APP (iOS/Android)
A variety of AI gameplays are built in, and the robot arm's view can be watched in real time; it supports gesture interaction, tracking game, garbage sorting, etc.
- PC host computer control
In addition to the video feed from the FPV camera, the host computer adds a 3D simulation model of the robotic arm that rotates synchronously with the real arm, combining robotic arm control theory and practice.
- USB handle remote control
In addition to the mobile phone APP and computer control, a USB handle comes standard, so you can experience the fun of combining multiple degrees of freedom and multiple keys.
Color recognition interaction
- Color recognition tracking
In addition to standard colors, custom colors can also be extracted for color recognition and tracking, so that the camera automatically follows the selected color (see the detection sketch after this list).
- Catch Game
- Color capture
Place a wooden block in front of the camera; the robotic arm recognizes the color of the block and clamps the block from the corresponding color area to the middle area.
- Color interaction
Choose DOFBOT's favorite color, and it will follow the target color back and forth.
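The color recognition functions are built on OpenCV. A minimal sketch of the underlying idea is shown below: threshold one camera frame in HSV space, find the largest blob of the chosen color, and report its center, which a tracking loop would keep in the middle of the image by moving the servos. The HSV range here is only an example for a red-ish target; the APP's color calibration replaces such hard-coded values.

```python
# Minimal OpenCV sketch of the color-recognition idea: threshold a frame in HSV,
# find the largest blob of the chosen color and report its center.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                    # FPV camera (index may differ)
ret, frame = cap.read()
cap.release()
if ret:
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)
        m = cv2.moments(c)
        if m["m00"] > 0:
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            print("Target color center:", cx, cy)  # feed into servo tracking
```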
Model training / Garbage sorting
- BASIC FUNCTION: The DOFBOT kit comes standard with multiple wooden blocks printed with trash patterns. By loading the garbage classification model, the camera can identify the type of garbage on a wooden block and return the garbage name and classification data to the APP. The robotic arm then sorts the “garbage” to the corresponding classification area on the map.
- DEVELOPER FUNCTION: You can also calibrate the map to define the identification area, place multiple “garbage” blocks in that area for real-time recognition, and continuously sort them to the corresponding classification areas on the map.
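How the classification model is loaded and queried depends on the framework used for training; the factory tutorials cover the exact format. Purely as an illustration, the sketch below assumes a PyTorch image classifier with a hypothetical file name and class list, and returns a label for one cropped block image.

```python
# Minimal sketch of model-based garbage classification. Assumptions: a PyTorch
# image classifier saved as a whole model; the file name, class list and input
# size below are hypothetical, not Yahboom's actual format.
import torch
import torchvision.transforms as T
from PIL import Image

CLASSES = ["recyclable", "kitchen", "hazardous", "other"]   # hypothetical labels

transform = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = torch.load("garbage_classifier.pth", map_location="cpu")  # hypothetical path
model.eval()

img = Image.open("block_crop.jpg").convert("RGB")   # cropped image of one block
with torch.no_grad():
    logits = model(transform(img).unsqueeze(0))
    label = CLASSES[int(logits.argmax(dim=1))]
print("Detected garbage type:", label)
```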
Human feature recognition and interaction
DOFBOT can recognize human features such as gestures and faces through the camera screen, and complete interactive actions such as grabbing, recognition and tracking.
- Gesture interaction
DOFBOT can recognize multiple gestures and perform corresponding actions for interaction.
- Gesture grab
Recognize number gestures, stack the corresponding number of layers, and finally push down the stacked wooden blocks on the “fist” gesture.
- Face recognition and tracking
Detect whether there is a face in the current picture, mark the face once it is recognized, and follow its movement (see the sketch below).
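A minimal sketch of the face-tracking idea, using OpenCV's stock Haar cascade detector: find a face in one frame and compute how far its center is from the image center, which is the error a tracking loop drives toward zero by moving the servos. This illustrates the principle only, not the factory code.

```python
# Minimal OpenCV sketch of face detection plus the tracking error it produces.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)                    # FPV camera (index may differ)
ret, frame = cap.read()
cap.release()
if ret:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    for (x, y, w, h) in faces:
        err_x = (x + w // 2) - frame.shape[1] // 2   # horizontal offset from center
        err_y = (y + h // 2) - frame.shape[0] // 2   # vertical offset from center
        print("Face found, tracking error:", err_x, err_y)
```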
ROBOT MOTION CONTROL
- Robotic arm custom learning action group
Thanks to the intelligent serial bus servos, DOFBOT can read and record the angle of each joint after entering action-learning mode, then learn and replay the recorded action group (see the record-and-replay sketch after this list).
- Fun fixed action group
The APP provides 8 fixed action groups; click a serial number to preview one, and click Run to start execution.
- Synchronous teaching of robotic arm
This gameplay requires two DOFBOT robots for teaching control. The joint angles of the servos on the master machine are read and transmitted to the slave machine in real time, so the slave machine rotates synchronously to match the posture of the master machine.
- 6-degree-of-freedom inverse kinematics control
Decompose the motion control of the 6-DOF robot arm into individual servos: from the input target coordinates, calculate the theoretical angle of each servo, then drive every DOFBOT servo simultaneously through the servo control protocol (a simplified planar example is sketched after this list).
- Operating system
The ROS robot operating system is a collection of tools, software libraries and protocols designed to simplify building complex and powerful robot behavior on the DOFBOT platform.
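A minimal sketch of the learning action group idea: record the six joint angles at a few key poses, then send the recorded poses back to the servos to replay the motion. It assumes the Arm_Lib driver (Arm_Device, Arm_serial_servo_read, Arm_serial_servo_write6) from the factory image; verify the names and signatures against the official tutorials before relying on them.

```python
# Minimal sketch of "learning action group": record joint angles, then replay them.
# Assumption: Yahboom's Arm_Lib driver as shipped on the factory image.
import time
from Arm_Lib import Arm_Device

arm = Arm_Device()
poses = []

# --- record phase: capture a pose every second for 5 seconds ---
for _ in range(5):
    pose = [arm.Arm_serial_servo_read(i) for i in range(1, 7)]  # joints 1..6
    poses.append(pose)
    time.sleep(1.0)

# --- replay phase: send each recorded pose back to all six servos ---
for pose in poses:
    arm.Arm_serial_servo_write6(*pose, 800)   # move all joints in 800 ms
    time.sleep(1.0)
```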
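The inverse kinematics idea can be illustrated on a much simpler planar two-link arm. The sketch below is not the DOFBOT's actual 6-DOF solver, only the law-of-cosines step that such a solver generalizes; the link lengths and target point are arbitrary example values.

```python
# Simplified planar 2-link inverse kinematics (law of cosines), illustrating
# "input target coordinates -> compute joint angles". Example values only.
import math

L1, L2 = 0.10, 0.12          # link lengths in metres (example values)
x, y = 0.15, 0.08            # target point in the arm's plane

# Elbow angle from the law of cosines.
c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
if abs(c2) > 1:
    raise ValueError("Target is out of reach")
theta2 = math.acos(c2)

# Shoulder angle: direction to the target minus the offset caused by link 2.
theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                       L1 + L2 * math.cos(theta2))

print("theta1 = %.1f deg, theta2 = %.1f deg"
      % (math.degrees(theta1), math.degrees(theta2)))
```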
Powerful hardware configuration
Function on expansion board
- Micro USB interface
- I2C interface
- Cooling fan interface
- Buttons: K1 + K2 + RESET
- Status Indicator
- RGB light
- Ultrasonic interface
- Serial port function selection
- PWM servo interface
- Buzzer
- Bus servo interface
- PS2 handle receiver base
- T-type power supply interface
- Switch
- Serial port
Guided quick start
Guided teaching: follow the APP in three steps.
- Scan and configure network
DOFBOT connects to WiFi and pairs by scanning the QR code.
- Servo calibration
The robotic arm is calibrated by reading the angle of each servo.
- Color calibration
For beginners and developers, basic and professional color calibration functions are provided to prevent environmental factors from affecting the robotic arm.
MULTIPLE REMOTE CONTROL
Massive learning materials
Website: http://www.yahboom.net/study/Dofbot-Jetson_nano
PRODUCT SPECIFICATION
PRODUCT SIZE
PRODUCT PARAMETER
- Microprocessor: Quad-core ARM Cortex-A57 + 128-core NVIDIA Maxwell GPU
- AI computing power: 472 GFLOPS
- Operating system: Ubuntu 18.04 LTS + ROS Melodic
- Programming language: Python
- Servo scheme: 5 × 15kg + 1 × 6kg serial bus smart servos
- Input: Wide-angle camera / Emergency stop button / Network configuration button / Reset button / 2 × I2C interface / 5V fan interface
- Output: RGB indicator / OLED display / Buzzer / 6-channel bus servo interface / 6-channel PWM servo interface
- Power scheme: 12V 5A power adapter
- Power connector: T-plug interface
- Remote control method: Mobile APP / PC upper computer / PS2 handle (PC port)
- Communication method: WiFi network
- Safety protection: Overcurrent protection / Reverse connection protection
- Mechanical arm material: Anodized aluminum
- Assembled size: 303 × 135 × 473mm
- Weight after assembly: 1256g
- Degree of Freedom of Robot Arm (DOF): 5 degrees of freedom + gripper
- Payload: 200g (weight liftable with the arm fully extended)
- Load: 500g (clamping/handling weight)
- Arm span: 350mm
- Diameter of grabbed object: 1-6cm
- Effective grasp range: radius ≤ 30cm, a semicircular area centered on the arm's central axis
- Repeatability: ±0.5mm
- Voltage: 100-240V, 50/60Hz
- Camera: 300,000-pixel, 110° wide-angle camera
- Resolution: 480p (640 × 480)
- Frame rate: 30fps
- Focus method: Focal length can be adjusted manually
PACKAGE LIST
Features
- Reasonable and convenient first trial
- Assembled before shipping; users do not need to assemble it.
- TF card with factory image file, plug and play without complex operation.
- Scan the QR code in the mobile APP with the camera to quickly configure the network and start up DOFBOT.
- Each function comes with detailed tutorials and code.
- Excellent structural design
- All aluminum alloy bracket with 2mm thickness
- The chassis with suction cups is more stable and can be securely placed in any experimental environment.
- Camera and robot arm 2 in 1.
- Flexible 6 DOF vision robotic arm.
- Top hardware Configuration
- Multifunctional expansion board, compatible with Jetson NANO, Raspberry Pi, Arduino, Micro:bit board.
- 5 × 15KG bus servos + 1 × 6KG bus servo.
- PS2 handle receiver, WiFi/Bluetooth module interface, and I2C port are reserved for users.
- Fantastic AI function
- Supports remote control via Android/iOS APP, PC host computer, game handle, and Jupyter Lab web-based online programming.
- Can learn and store custom fixed action groups.
- Simultaneous movement of dual robotic arms.
- Gesture recognition, color interaction, visual positioning, garbage sorting, catch game, face tracking, blocks stack, and other AI vision gameplays.