This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Biomedical Engineering, is properly cited. The complete bibliographic information, a link to the original publication on https://biomedeng.jmir.org/, as well as this copyright and license information must be included.
Applications of robotics are becoming essential in daily life, creating new possibilities across fields, particularly in collaborative environments. Collaborative robots have tremendous potential because they can work in the same workspace as humans. A framework that applies state-of-the-art technology to collaborative robots will therefore be worthwhile for further research.
This study aims to present the development of a novel framework for the collaborative robot using mixed reality.
The framework uses Unity and Unity Hub (a cross-platform game engine and its project management tool) to design the mixed reality interface and the digital twin. It also uses the Windows Mixed Reality platform to render digital content on a holographic display and the Azure mixed reality services to capture and expose digital information. Finally, it uses a holographic device (HoloLens 2) to execute the mixed reality–based collaborative system.
A thorough experiment was conducted to validate the novel framework for mixed reality–based control of a collaborative robot. The framework was successfully applied to implement a collaborative system using a 5–degree of freedom robot (xArm-5) in a mixed reality environment, and it remained stable and worked smoothly throughout the collaborative session. Despite the distributed nature of the cloud application, the latency between issuing a command and its execution by the physical collaborative robot was negligible.
Opportunities for collaborative robots in telerehabilitation and teleoperation are as vital as in any other field. The proposed framework was successfully applied in a collaborative session, and it can also be applied to other similar applications for robust and more promising performance.
Robots are becoming companions to humans, impacting and contributing to our daily life in many ways. They can carry out complex or repetitive activities for us, from household to industrial work, medicine, security, and so on [
Collaborative robots, also known as cobots, are making the situation easier and more productive as they are designed to work in the same workspace as humans [
To support the needs of collaborative robots in various fields, researchers have proposed frameworks for different collaborative robot-based applications. Some researchers followed an agent-based system [
In this study, a novel framework is proposed for the collaborative robot using mixed reality. Unity, Unity Hub, the Windows Mixed Reality (WMR) platform, and Azure mixed reality services are adopted to design the framework, and a holographic device (HoloLens 2) is used to execute it. The framework can be used in various collaborative applications such as telerehabilitation or teleoperation.
The rest of the manuscript is organized as follows: first, recent advancements in this research area are briefly discussed; then, the development of the framework is presented, along with the system architecture and the control of a collaborative robot with mixed reality; subsequently, an experiment using the framework and its results are illustrated; and finally, the conclusions of the study are drawn.
The collaborative robot is one of the promising fields in robotics, and researchers are currently working on its control systems. A few researchers followed a unified approach by merging an impedance model with a dynamic model with friction and optimizing the assembly [
Rehabilitation robots and assistive robots are two potential applications for collaborative robots. Researchers are working on different approaches using skin surface electromyogram–based signals [
One study suggested controlling the momentum of a robot by considering the maximum speed acceptable to secure the safety of human coworkers [
Researchers are also exploring framework-based approaches to construct generalized systems for different applications. In one study, researchers proposed an open-source framework for a humanoid robot using a cross-platform architecture with low computational cost [
Mixed reality–based approaches are becoming increasingly popular among researchers for different applications because of their broad potential. Researchers have used mixed reality–based methods to design an interface for human-robot interaction to teleoperate a robot [
To develop a mixed reality–based system from scratch for controlling an assistive robot, some prerequisites and background knowledge are required. A Windows-based computer and the Windows SDK (software development kit) with Visual Studio are needed to build the system. The simplest approach to creating mixed reality apps is to install either the Unity or Unreal game engine [
The xArm-5 robot is an end effector robot with 5 DoF, developed by UFactory [
During the kinematic analysis, the modified Denavit-Hartenberg (DH) parameters are adopted to specify the xArm configuration of links and joints [
The link frame allocation (according to modified DH convention) of the xArm-5 robot is shown in
Coordinate frame placement using the modified Denavit-Hartenberg parameters.
Modified Denavit-Hartenberg parameters of xArm-5 robot.
i | a_{i−1}, mm | α_{i−1}, rad | d_i, mm | θ_i offset, rad |
1 | 0 | –π/2 | 267 | 0 |
2 | 289.4886 | 0 | 0 | –1.3849 |
3 | 351.1587 | 0 | 0 | 2.7331 |
4 | 76 | –π/2 | 0 | –1.3482 |
5 | 0 | 0 | 97 | 0 |
Here, the corresponding physical dimensions and joint-angle offsets of the xArm-5 are listed below.
Dimensional parameters of xArm-5 robots.
267 mm | 284.5 mm | 77.5 mm | 342.5 mm | 76 mm | –1.3849 rad | 2.7331 rad | –1.3482 rad |
The general form of the homogeneous transformation matrix that relates two successive coordinate frames is represented by Equation (1):

$$
{}^{i-1}_{\;i}T =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i & 0 & a_{i-1} \\
\sin\theta_i \cos\alpha_{i-1} & \cos\theta_i \cos\alpha_{i-1} & -\sin\alpha_{i-1} & -d_i \sin\alpha_{i-1} \\
\sin\theta_i \sin\alpha_{i-1} & \cos\theta_i \sin\alpha_{i-1} & \cos\alpha_{i-1} & d_i \cos\alpha_{i-1} \\
0 & 0 & 0 & 1
\end{bmatrix}
\quad (1)
$$

where $a_{i-1}$, $\alpha_{i-1}$, $d_i$, and $\theta_i$ are the modified DH parameters of link $i$. Moreover, multiplying the five successive link transformations yields the single transformation of Equation (2):

$$
{}^{0}_{\;5}T = {}^{0}_{\;1}T\,{}^{1}_{\;2}T\,{}^{2}_{\;3}T\,{}^{3}_{\;4}T\,{}^{4}_{\;5}T
\quad (2)
$$
The single transformation matrix found from Equation (2) represents the reference frame’s positions and orientations attached to the end effector with respect to the base reference frame {0}.
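As a concrete illustration, Equation (2) can be evaluated numerically. The following is a minimal sketch in Python (NumPy assumed); the column ordering of the DH table is inferred from context, and the source does not provide reference code:

```python
import numpy as np

# Modified DH parameters of the xArm-5: (a_{i-1} [mm], alpha_{i-1} [rad],
# d_i [mm], joint-angle offset [rad]), taken from the DH parameter table.
DH_PARAMS = [
    (0.0,      -np.pi / 2, 267.0,  0.0),
    (289.4886,  0.0,         0.0, -1.3849),
    (351.1587,  0.0,         0.0,  2.7331),
    (76.0,     -np.pi / 2,   0.0, -1.3482),
    (0.0,       0.0,        97.0,  0.0),
]

def link_transform(a, alpha, d, theta):
    """Homogeneous transform between successive frames, as in Equation (1)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct,      -st,      0.0,  a],
        [st * ca,  ct * ca, -sa, -d * sa],
        [st * sa,  ct * sa,  ca,  d * ca],
        [0.0,      0.0,     0.0,  1.0],
    ])

def forward_kinematics(joint_angles):
    """Chain the five link transforms as in Equation (2)."""
    T = np.eye(4)
    for (a, alpha, d, offset), q in zip(DH_PARAMS, joint_angles):
        T = T @ link_transform(a, alpha, d, q + offset)
    return T

# End-effector position (mm) in the base frame at the zero joint configuration.
T05 = forward_kinematics([0.0] * 5)
print(T05[:3, 3])
```

The resulting matrix gives the end-effector pose with respect to the base frame {0}, matching the role of Equation (2) in the text.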
The dynamic equation of the xArm-5 robot derived from the Newton-Euler formulation can be written in the following form:

$$
\tau = M(q)\ddot{q} + C(q,\dot{q})\dot{q} + G(q)
$$

where $q$, $\dot{q}$, and $\ddot{q}$ are the vectors of joint positions, velocities, and accelerations; $M(q)$ is the inertia matrix; $C(q,\dot{q})\dot{q}$ represents the Coriolis and centrifugal terms; $G(q)$ is the gravity vector; and $\tau$ is the vector of joint torques.
Inertial parameters for each link of xArm-5 robot.
Link | Mass (kg) | Center of mass (mm) |
Link 1 | 2.177 | [0.15, 27.24, –13.57] |
Link 2 | 2.011 | [36.7, –220.9, 33.56] |
Link 3 | 2.01 | [68.34, 223.66, 1.1] |
Link 4 | 1.206 | [63.87, 29.3, 3.5] |
Link 5 | 0.17 | [0, –6.77, –10.98] |
Range of motion.
Joint | Working range (deg) |
Joint 1 | ±360 |
Joint 2 | –118 to 120 |
Joint 3 | –225 to 11 |
Joint 4 | ±360 |
Joint 5 | –97 to 180 |
Maximum speed | 180 deg/s |
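The working ranges above can be enforced in software before a command reaches the physical robot. A minimal sketch follows; the limit values come from the range-of-motion table, while the function and constant names are illustrative and not from the source:

```python
# Working ranges (deg) per joint, from the range-of-motion table.
JOINT_LIMITS_DEG = {
    1: (-360.0, 360.0),
    2: (-118.0, 120.0),
    3: (-225.0, 11.0),
    4: (-360.0, 360.0),
    5: (-97.0, 180.0),
}
MAX_SPEED_DEG_S = 180.0  # maximum joint speed from the table

def clamp_joint_target(joint, angle_deg):
    """Clamp a commanded joint angle into that joint's working range."""
    lo, hi = JOINT_LIMITS_DEG[joint]
    return min(max(angle_deg, lo), hi)

def clamp_speed(speed_deg_s):
    """Limit a commanded speed to the robot's maximum."""
    return min(abs(speed_deg_s), MAX_SPEED_DEG_S)
```

Clamping at the controller keeps out-of-range commands from a remote collaboration session from ever reaching the hardware.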
Control architecture of the system. If: filtered current; Iref: reference current; PI: proportional integral; Qf: filtered joint angle; Qref: reference joint angle; Vref: reference voltage.
This section discusses the life cycle of a collaborative session involving a collaborative robot in a mixed reality environment. The user launches the application from their mixed reality headset (HoloLens 2), which is the application’s access point. The provider can construct a collaborative session room with the appropriate digital twin and user. The user, as well as any authorized observers, can then join the collaborative session. Although they are in separate places, all users in the session, including the host user (provider), now share a digital collaborative session and interact with the identical digital twin. Depending on the needs, the provider can control the digital twin in several ways.
Data flow of the connected system.
Due to the distributed nature of cloud applications, the system can send a command to the digital twin, which the collaborative robot physically executes. All commands pass through a controller running on the robot’s local network. WebSocket, a computer communications protocol that provides full-duplex communication channels over a single Transmission Control Protocol connection, manages all device connections and interactions [
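A command sent over this WebSocket channel must be serialized into a wire message. The sketch below shows one plausible encoding in Python; the message schema (field names and units) is an assumption for illustration, as the source does not specify the wire format:

```python
import json
import time

def make_joint_command(joint_angles_deg, speed_deg_s=30.0):
    """Build a JSON command message for the robot-side controller.

    NOTE: the schema here ("type", "angles_deg", "speed_deg_s") is a
    hypothetical example, not the protocol used by the authors.
    """
    return json.dumps({
        "type": "set_joint_angles",
        "angles_deg": list(joint_angles_deg),
        "speed_deg_s": speed_deg_s,
        "timestamp": time.time(),
    })
```

The controller on the robot's local network would decode such a message and forward the motion request to the xArm-5.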
Detailed system architecture.
At first, in the event sequence of the mixed reality environment, the user launches the app on the client device and connects with Azure App Service via WebSocket. The user is then authenticated via Azure Active Directory. Subsequently, the user can select a digital twin model they wish to interact with. The App Service retrieves the assets corresponding to the selected digital twin, including 3D models and data, and provides the user with the requested data. While the system is running, the digital twin pushes incoming data to the Event Hub, which fires a serverless function on the Azure cloud server; the serverless function updates database values. As the final step of the data flow, machine learning models will be deployed that use historical user data and real-time collaboration metrics for future automated result analysis.
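The serverless step of this pipeline can be sketched as a small event handler. The event shape and the in-memory store below are simplified stand-ins for illustration, not the Azure Event Hub or database APIs:

```python
# Minimal sketch of the serverless step: an event carrying the digital
# twin's latest joint state updates a per-session database record.
# The event fields ("session_id", "joint_angles") are assumed names.

def handle_twin_event(event, db):
    """Apply one digital-twin telemetry event to the session store."""
    session = db.setdefault(event["session_id"], {"history": []})
    session["joint_angles"] = event["joint_angles"]   # latest state
    session["history"].append(event["joint_angles"])  # retained for analysis
    return session
```

Keeping a history of joint states is what would later feed the machine learning models mentioned above for automated result analysis.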
The Azure cloud platform enables multiple users to join a shared collaboration session. In this collaboration session, users can visualize and interact with one digital twin. The digital twin will move in tandem with the physical robot and mirror its movements. Users in the collaboration session can additionally send high-level commands to control the digital twin.
Mixed reality interface.
The researchers involved in the project took part in the experiments to demonstrate the proof of concept of teleoperation using the proposed framework. Because the project did not involve intervention development or intervention efficacy testing and no participants were recruited, ethics approval was not required for this study.
To validate the proposed framework, an end effector–type 5-DoF assistive robot was mounted on a table (
The system provides an overall cross-platform rehabilitation experience for its users. A clinician can remotely assist a patient in both active and passive rehabilitation therapy via a shared mixed reality–based collaboration session. The rehabilitation robot that exists locally for the patient has a digital twin that lives on the Azure cloud platform. A therapist or clinician can interact with this virtual digital twin and use it to assist with rehabilitation therapy.
Experimental setup for the proposed collaborative robot framework: (a) xArm-5 Robot as the collaborative robot; (b) user with HoloLens 2 headset; and (c) collaborative session via the mixed reality–based framework (
Visualizing the key metrics in the interactive mixed reality environment.
A framework for collaborative robots has many potential applications in industry and telehealth, such as teleoperation and telerehabilitation. In particular, the COVID-19 pandemic affected all aspects of health care and underscored the need for telehealth, which can help health care workers and patients protect themselves from the risk of disease transmission. Telehealth can also offer personalized rehabilitation programs, real-time control, continuous interaction with doctors, negligible waiting time, and so on. Using the proposed framework, it will be straightforward to implement different systems for teleoperation. While the proposed mixed reality–based framework promises a stable control system for collaborative robots, there are some limitations to it [
Collaborative robots and their applications have a significant impact in the rehabilitation and teleoperation fields, and a framework for collaborative robots is needed to meet their demands. The proposed mixed reality–based framework can be used for different telehealth applications such as teleoperation and telerehabilitation and can guide other researchers in conducting further studies. Several state-of-the-art technologies were used while developing the framework, including Unity, the WMR platform, Azure mixed reality services, and HoloLens 2. The framework was validated by conducting a comprehensive collaborative experiment using a 5-DoF collaborative robot (xArm-5) in a mixed reality environment. In the future, the study will continue with mixed reality–based applications for collaborative robots, and further research will be conducted on telerehabilitation and teleoperation to design a more robust and stable framework.
Individual homogeneous transfer matrix.
Experiment of collaborative session via mixed reality–based framework.
DH: Denavit-Hartenberg
DoF: degrees of freedom
MR: mixed reality
SDK: software development kit
WMR: Windows Mixed Reality
This research is partially funded by the Connected System Institute, University of Wisconsin-Milwaukee (Grant AAH9481).
None declared.