Research

Robot Learning from Demonstrations

1. Learning from Demonstrations for Smart Manufacturing

We develop a new learning from demonstrations (LfD) model that enables a collaborative robot to learn from human demonstrations and assist its human partner in shared working situations. The human can program the robot with natural language instructions according to their personal working preferences, and the robot leverages its learned knowledge to actively assist the human in collaborative assembly tasks.

Human Factors in Human-Robot Interaction

1. Human Comfort Characterization and Modeling

Human comfort plays a significant role in human-robot collaboration because it influences task efficiency and quality. In this project, we propose a computational Human Comfort Model (HuCoM) to model and quantify human comfort during human-robot collaborative manufacturing. Based on defined primitive comfort rewards and combined comfort rewards, HuCoM incorporates a static comfort model and a dynamic comfort model. Using human EEG signals, robot actions can be customized to the human's comfort level in real time.
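
As an illustration of how such comfort rewards could be combined, the minimal Python sketch below mixes a static posture-based reward with a dynamic motion-based reward into one comfort score; the reward terms, weights, and poses are illustrative assumptions rather than the actual HuCoM formulation.

```python
# Minimal sketch of combining primitive comfort rewards into an overall
# comfort score. The reward terms, weights, and thresholds are illustrative
# assumptions, not the published HuCoM parameters.
import numpy as np

def static_comfort(joint_angles, neutral_pose):
    """Static comfort: penalize deviation of the human posture from a neutral pose."""
    return float(np.exp(-np.linalg.norm(np.asarray(joint_angles) - np.asarray(neutral_pose))))

def dynamic_comfort(robot_speed, max_speed=1.0):
    """Dynamic comfort: slower robot motion near the human is assumed more comfortable."""
    return float(1.0 - min(robot_speed / max_speed, 1.0))

def combined_comfort(joint_angles, neutral_pose, robot_speed, w_static=0.6, w_dynamic=0.4):
    """Weighted combination of the primitive comfort rewards."""
    return w_static * static_comfort(joint_angles, neutral_pose) + w_dynamic * dynamic_comfort(robot_speed)

# Example: a posture close to neutral and a slow robot yields a high comfort score.
score = combined_comfort(joint_angles=[0.1, -0.05, 0.2], neutral_pose=[0.0, 0.0, 0.0], robot_speed=0.2)
print(f"combined comfort: {score:.2f}")
```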

Distributed Multi-Robot Collaboration

1. Multi-Robot Collaboration for Co-Assembly Tasks 

Adopting a multi-robot team in a collaborative workspace can significantly boost the throughput and efficiency of manufacturing tasks. In this project, a merge-sort-based task planning approach is proposed for multi-robot collaboration in shared tasks. We develop a multi-robot simulation platform in which three six-degree-of-freedom (DOF) collaborative robots (scalable to more) share a common workspace, and the merge sort algorithm is used by the multi-robot team to plan collaborative assembly tasks.
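
As a rough illustration of the idea, the sketch below orders assembly subtasks with a merge sort over a priority key and assigns them round-robin to a three-robot team; the task fields, priorities, and assignment rule are illustrative assumptions and not the planner's exact formulation.

```python
# Minimal sketch of merge-sort-based task ordering and assignment for a
# multi-robot team. Task fields, priorities, and the round-robin assignment
# are illustrative assumptions rather than the exact planner described above.
def merge_sort(tasks, key):
    if len(tasks) <= 1:
        return list(tasks)
    mid = len(tasks) // 2
    left = merge_sort(tasks[:mid], key)
    right = merge_sort(tasks[mid:], key)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if key(left[i]) <= key(right[j]):
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged

tasks = [{"name": "insert_bolt", "priority": 2},
         {"name": "place_bracket", "priority": 1},
         {"name": "fasten_cover", "priority": 3}]

ordered = merge_sort(tasks, key=lambda t: t["priority"])
robots = ["robot_1", "robot_2", "robot_3"]
# Assign ordered subtasks to the three robots in a round-robin fashion.
plan = {robot: [] for robot in robots}
for idx, task in enumerate(ordered):
    plan[robots[idx % len(robots)]].append(task["name"])
print(plan)
```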

2. Human-CAD Co-Guided Collaborative Robotic Vacuum

On automotive assembly lines, many associates are employed to clean dust from inside the vehicle body with handheld industrial vacuum cleaners. In this project, to reduce cleaning time and improve ergonomics, a Collaborative Robotic Vacuum (CoRV) system is developed to assist humans in automotive body cleaning tasks. A human-CAD co-guided robot motion planning approach based on a cost function is proposed for the CoRV to automatically generate optimized cleaning waypoints and collision-free trajectories.
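
To illustrate cost-based waypoint planning in a simplified form, the sketch below greedily orders CAD-derived candidate waypoints by a Euclidean travel cost; the waypoints, the cost term, and the greedy ordering are illustrative assumptions rather than the CoRV's actual cost function or collision-avoidance logic.

```python
# Minimal sketch of cost-based waypoint ordering for a cleaning trajectory.
# The candidate waypoints, the Euclidean travel cost, and the greedy ordering
# are illustrative assumptions, not the project's actual cost function.
import math

def travel_cost(a, b):
    """Cost of moving between two CAD-derived waypoints (Euclidean distance here)."""
    return math.dist(a, b)

def order_waypoints(start, waypoints):
    """Greedily visit the cheapest next waypoint until all are covered."""
    remaining = list(waypoints)
    path, current = [start], start
    while remaining:
        nxt = min(remaining, key=lambda wp: travel_cost(current, wp))
        remaining.remove(nxt)
        path.append(nxt)
        current = nxt
    return path

# Example: candidate cleaning waypoints inside a vehicle body (meters).
candidates = [(0.5, 0.2, 0.1), (0.1, 0.4, 0.1), (0.3, 0.1, 0.2)]
print(order_waypoints(start=(0.0, 0.0, 0.0), waypoints=candidates))
```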

Robot-Enabled Smart and Precision Farming/Agriculture

1. Robot-Assisted Crop Harvest System

Robotics and artificial intelligence (AI) have increasingly enabled smart and precision solutions for the agricultural industry, especially for time-consuming and costly farming tasks. We developed a robot-assisted crop maturity recognition and harvest system that accurately detects the stages of crop ripeness. The proposed approach integrates collaborative robotics, computer vision, image processing, and AI. A transfer learning-based model enables the robot to recognize crop maturity stages and locate the crop during real-time detection.
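
As a hedged example of the transfer-learning idea, the sketch below fine-tunes only a new classification head on an ImageNet-pretrained backbone for a few assumed maturity classes, using PyTorch/torchvision; the backbone, class labels, and training details are illustrative and not the project's exact model.

```python
# Minimal sketch of a transfer-learning classifier for crop maturity stages,
# assuming a PyTorch/torchvision setup; the backbone, class count, and
# training details are illustrative and not the project's exact model.
import torch
import torch.nn as nn
from torchvision import models

NUM_STAGES = 3  # e.g., unripe, ripening, ripe (assumed labels)

# Start from an ImageNet-pretrained backbone and replace the classifier head.
model = models.resnet18(weights="IMAGENET1K_V1")
for param in model.parameters():
    param.requires_grad = False          # freeze pretrained features
model.fc = nn.Linear(model.fc.in_features, NUM_STAGES)  # new trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of camera images.
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, NUM_STAGES, (4,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.3f}")
```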

Human Intention Understanding

1. Human Manipulation Intention Understanding

We develop a human manipulation intention understanding approach using a wearable sensory system, which has a natural, simple configuration and can be used easily by humans. The approach enables the robot to recognize human hand-over intentions and lets the human control the hand-over process naturally, flexibly, and conveniently. In addition, the approach recognizes the attribute classes of objects in the human's hands from the wearable sensing data, enabling the robot to actively make decisions that ensure only graspable objects are handed over from the human to the robot.
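
For illustration only, the sketch below classifies hand-over intention from a few assumed wearable-sensor features with a nearest-neighbor classifier; the feature layout, labels, and classifier choice are assumptions, not the actual wearable sensing pipeline.

```python
# Minimal sketch of classifying hand-over intention from wearable sensor
# features (e.g., wrist IMU and finger-bend readings). The feature layout,
# labels, and classifier choice are illustrative assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each row: [mean wrist acceleration, wrist rotation rate, finger-bend level]
features = np.array([[0.1, 0.2, 0.9],   # holding object, reaching out
                     [0.0, 0.1, 0.1],   # hand relaxed
                     [0.3, 0.4, 0.8]])  # holding object, reaching out
labels = ["hand_over", "no_intention", "hand_over"]

classifier = KNeighborsClassifier(n_neighbors=1).fit(features, labels)
print(classifier.predict([[0.2, 0.3, 0.85]]))  # -> ['hand_over']
```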

Human-Robot Collaboration Quality

1. Collaborative Task Cost Evaluation

We develop a task cost evaluation model using online optimization in human-robot collaboration. First, we extract the task model via graphical representations and design collaboration cost functions that incorporate time consumption and human effort. We then develop robot action planning algorithms that plan robot actions online by evaluating and optimizing the collaborative assembly cost. In addition, the proposed algorithms can plan appropriate robot actions to ensure the assembly process is completed even when the human performs incorrect actions during the collaboration.
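
As a minimal sketch of this kind of online evaluation, the example below scores a few hypothetical candidate robot actions with a weighted sum of time consumption and human effort and picks the cheapest one; the actions, cost terms, and weights are illustrative assumptions rather than the published cost model.

```python
# Minimal sketch of evaluating candidate robot actions against a collaboration
# cost that combines time consumption and human effort. The candidate actions,
# cost terms, and weights are illustrative assumptions, not the published model.
def collaboration_cost(action, w_time=0.7, w_effort=0.3):
    """Weighted sum of estimated task time and required human effort."""
    return w_time * action["time_s"] + w_effort * action["human_effort"]

candidates = [
    {"name": "fetch_part_A", "time_s": 12.0, "human_effort": 0.2},
    {"name": "hold_fixture", "time_s": 8.0, "human_effort": 0.6},
    {"name": "wait",         "time_s": 15.0, "human_effort": 0.0},
]

# Online planning step: pick the lowest-cost action for the current task state.
best = min(candidates, key=collaboration_cost)
print(best["name"], round(collaboration_cost(best), 2))
```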

Smart Driving

1. Deep Learning-based Autonomous Vehicle Parking

We propose a FAP method for autonomous vehicles based on computer vision. Using input images from a rear camera on the vehicle, a convolutional neural network (CNN) built on the Caffe deep learning framework is trained to automatically output steering and velocity commands for vehicle control.
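
The sketch below shows the general structure of such a network in PyTorch-style code (the original work used Caffe): a small convolutional feature extractor followed by a two-output head for steering and velocity. The layer sizes and input resolution are illustrative assumptions.

```python
# Minimal PyTorch-style sketch of a CNN that maps a rear-camera image to
# steering and velocity commands. The original work used the Caffe framework;
# the layer sizes and two-output head here are illustrative assumptions.
import torch
import torch.nn as nn

class ParkingNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, 2),  # outputs: [steering, velocity]
        )

    def forward(self, image):
        return self.head(self.features(image))

# Example inference on a dummy rear-camera frame.
net = ParkingNet()
steering, velocity = net(torch.randn(1, 3, 120, 160))[0]
print(f"steering={steering.item():.3f}, velocity={velocity.item():.3f}")
```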

Distributed Control Systems

1. High Availability Control for Large-Scale Scientific Facility

In this high-energy, large-scale scientific facility, the controllers and devices operate on three electric potential platforms: ground, 350 kV, and 400 kV. We develop a high-speed redundant optical fiber ring network with high availability for the monitoring and control levels. The specialized intranet completes network reconfiguration within a few milliseconds when a communication failure occurs, without affecting the progress of logic control or the data terminals.

Great appreciation to all our sponsors!