Introduction
This is the English version of day 21 of the Node-RED Advent Calendar 2024.
Japanese Version:
Last year, I wrote about creating nodes for the Node-RED MCU.
This year, I was assigned to a lab and started using Unreal Engine 5 (UE5) and ROS for the first time. I am also using Node-RED to connect them and make them work together.
In this article, I give an overview of connecting UE5, ROS, and Node-RED. The technical details are written up on my website if you are interested; I link to the related articles in each section.
About Software
This section gives a brief overview of the software I use and how I use each piece.
Node-RED
Node-RED is a flow-based programming tool. It supports communication over HTTP, WebSocket, TCP, UDP, MQTT, and more, and it works well for IoT.
In my setup, the ROS side uses WebSocket communication, while the UE5 side uses HTTP. The methods are different, but Node-RED relays between them seamlessly.
ROS
ROS is middleware for controlling robots. UE5 can also compute the kinematics of robot arms, but I chose ROS to keep the calculations consistent with those of the real robot.
Unreal Engine 5 (UE5)
UE5 is a game engine. I am especially interested in taking advantage of its GPU-accelerated graphics processing. It can also import 3D-scanned data from the real world.
Like Node-RED, it allows flow-based programming with Blueprints.
Related Article: Trying Out Unreal Engine 5, Part 10 (Scaniverse, Importing Point Cloud Data)
Testing with Open Manipulator
Overall Configuration
The overall configuration is shown in the figure below. Since the Open Manipulator's software packages are so well prepared, it can be operated at a nearly no-code level.
ROS calculates the angles of the Open Manipulator's joints and sends the values to the real manipulator and UE5 at the same time.
Conversely, I have also confirmed that UE5 input can be sent to ROS to control the real manipulator.
At first, I distributed the load by running UE5 and ROS on different PCs, a Windows 10 laptop and a Windows 11 desktop communicating through Node-RED.
Recently, I have been using a Windows 11 gaming laptop with an RTX 3050 GPU and can do all the processing on a single PC. The GPU seems to be the key to running UE5.
Connecting Node-RED and ROS
ROS Side
I installed Ubuntu 20.04 on WSL2 and built a ROS Noetic environment.
Related Article: Trying Out WSL2, Part 3 (Ubuntu 20.04, ROS Noetic, Open Manipulator)
I confirmed in advance that the manipulator can be driven by sending parameters with a rosservice call, and Node-RED sends its messages in the same format.
The ros nodes in Node-RED assume a ROS Bridge connection, so rosbridge must be running on the ROS side (for example via rosbridge_server's WebSocket launch file, which listens on port 9090 by default).
Related Article: WebSocket Communication with ROS on WSL2 (ROS Bridge, Node-RED, Gazebo)
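For reference, the same service call can be made from Python through rosbridge's WebSocket interface, which is the same path the Node-RED ros nodes use. This is a minimal sketch assuming the service name and message type from the standard Open Manipulator packages; the joint angles are placeholders.

```python
# Minimal sketch: calling the Open Manipulator joint-space service
# through rosbridge, over the same WebSocket path Node-RED uses.
# Assumes rosbridge is running on its default port 9090.
import roslibpy

ros = roslibpy.Ros(host='localhost', port=9090)
ros.run()  # connect to rosbridge

service = roslibpy.Service(ros, '/goal_joint_space_path',
                           'open_manipulator_msgs/SetJointPosition')
request = roslibpy.ServiceRequest({
    'planning_group': '',
    'joint_position': {
        'joint_name': ['joint1', 'joint2', 'joint3', 'joint4'],
        'position': [0.0, -1.0, 0.3, 0.7],  # target angles in radians
    },
    'path_time': 2.0,  # seconds to reach the target pose
})
result = service.call(request)
print('planned:', result['is_planned'])

ros.terminate()
```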
Node-RED Side
I had confirmed in advance that the turtlesim turtle could be controlled from Node-RED.
Related Article: Trying Out ROS1, Part 1 (Turtlesim, Node-RED)
The overall flow is as follows. I use the ros callSrv node and made the joints controllable with sliders.
Related Article: Trying Out ROS1, Part 3 (Controlling the Open Manipulator, Node-RED)
The template node is filled in with the same payload that was sent in the rosservice call earlier, and flow context variables hold the slider values.
I created the following page with the dashboard node.
How it works
I configured the joint to rotate based on the slider's value.
Connecting Node-RED and UE5
UE5 Side
UE5 handles HTTP communication with its HTTP Request node; WebSocket communication is reportedly supported as well.
There are other options such as the Remote Control API, but the HTTP Request node was the easiest to use with Node-RED, and it can also edit data in JSON format.
Related Article: Trying Out Unreal Engine 5, Part 8 (HTTP Request, Remote Control Web Interface, Node-RED)
I also created a model of the Open Manipulator.
Related Article: Trying Out Unreal Engine 5, Part 7 (Open Manipulator)
Node-RED Side
Node-RED communicates using the http in/out and http request nodes.
Related Article: Trying Out Unreal Engine 5, Part 9 (Remote Control API, Node-RED)
This example uses the Remote Control API; it can also be operated with a slider, just like the ROS connection. The data is formatted to match the API's specification.
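For reference, here is a minimal Python sketch of the kind of request involved, assuming the Remote Control API's default port 30010; the objectPath and property name below are hypothetical placeholders, not the actual paths in my level.

```python
# Minimal sketch: setting a property on a UE5 actor through the
# Remote Control API (HTTP server on port 30010 by default).
# The objectPath and property name are hypothetical placeholders;
# use the paths for the actor in your own level.
import requests

payload = {
    'objectPath': '/Game/Main.Main:PersistentLevel.OpenManipulator_1',
    'propertyName': 'Joint1Angle',
    'propertyValue': {'Joint1Angle': 30.0},  # degrees, e.g. from the slider
    'access': 'WRITE_ACCESS',
}
resp = requests.put('http://localhost:30010/remote/object/property',
                    json=payload)
print(resp.status_code, resp.text)
```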
How it works
I used UE5's Remote Control API and was able to manipulate the joint angles.
Connecting UE5 and ROS
Since Node-RED could communicate with both ROS and UE5, I was able to relay the communication between UE5 and ROS through Node-RED.
In the following video, the UE5 side moves slowly because its processing was limited, but it moves almost simultaneously with the real manipulator.
The following video shows the ROS simulator and UE5 screens side by side.
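As an illustration of what the relay does, here is a minimal Python sketch under the same assumptions as above: joint states arrive from rosbridge over WebSocket, and each update is forwarded to UE5 over HTTP. In practice this wiring is done entirely with Node-RED nodes; the topic name and UE5 paths below are placeholders.

```python
# Minimal sketch of the relay Node-RED performs: joint states come in
# from rosbridge over WebSocket and are pushed to UE5 over HTTP.
# The topic, objectPath, and property name are placeholders.
import math
import requests
import roslibpy

ros = roslibpy.Ros(host='localhost', port=9090)

def forward_to_ue5(message):
    # sensor_msgs/JointState carries parallel 'name' and 'position'
    # arrays; positions are in radians, UE5 rotations in degrees.
    joint1_deg = math.degrees(message['position'][0])
    requests.put('http://localhost:30010/remote/object/property', json={
        'objectPath': '/Game/Main.Main:PersistentLevel.OpenManipulator_1',
        'propertyName': 'Joint1Angle',
        'propertyValue': {'Joint1Angle': joint1_deg},
        'access': 'WRITE_ACCESS',
    })

joints = roslibpy.Topic(ros, '/joint_states', 'sensor_msgs/JointState')
ros.on_ready(lambda: joints.subscribe(forward_to_ue5))
ros.run_forever()
```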
Note that the Open Manipulator has only 4 axes plus an end effector and limited flexibility, but I was also able to create a MoveIt package that calculates the kinematics of a 6-axis model and run it in UE5. It seems that even a robot I design myself can be moved this way.
For the Robot's Digital Twin
A ROS simulator is enough for pure simulation, but I plan to simulate the robot while taking advantage of UE5's graphics and combining it with image-processing software.
For example, when the position of the light source changes, the appearance changes as shown in the image below. It looks like image processing that takes shadows into account should be possible.
Using UE5's Set View Target with Blend and Take High Res Screenshot nodes, you can capture screenshots like the one shown below. The scene uses a realistic model from Quixel Megascans.
Related Article: Trying Out Unreal Engine 5, Part 12 (Screenshot)
For object detection on the robot, I use software called YOLO to detect objects in real time.
With UE5 as well, I could run Python from Node-RED, detect objects with YOLO, and combine the results with other nodes.
Related Article: Object Detection with YOLO, Part 2 (Python, Node-RED)
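As a sketch of the detection step, assuming the ultralytics package and a pretrained YOLOv8 model (the exact model and invocation in my flow may differ):

```python
# Minimal sketch: run YOLO on a frame (e.g. a UE5 screenshot) from a
# Python script invoked by Node-RED, and print the detections as JSON
# so a downstream Node-RED node can parse them.
import json
import sys

from ultralytics import YOLO

model = YOLO('yolov8n.pt')
results = model(sys.argv[1])  # image path passed in from Node-RED

detections = [
    {
        'label': model.names[int(box.cls)],
        'confidence': float(box.conf),
        'xyxy': [float(v) for v in box.xyxy[0]],  # bounding box corners
    }
    for box in results[0].boxes
]
print(json.dumps(detections))
```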
Related Article: Trying Out Unreal Engine 5, Part 11 (Pixel Streaming)
The 3D-scanned tomato seedling data in the video is rough, but I have recently been able to import it with other plugins so that it displays as scanned by Scaniverse. The following image is my haniwa (a Japanese clay figurine).
Using YOLO's pose estimation model, I could also use two PCs to get keypoint coordinates from each camera image, convert them into 3D coordinates, and move an object in UE5 accordingly. Simple whole-body tracking seems feasible this way.
This tracking was inspired by an article by @mshioji, whom I often work with at events. There, a robot is controlled by visual feedback of the workpiece position from two cameras.
Trying Out Visual Feedback Technology for Industrial Robots with Chitose Robotics' "Crewbo" (2)
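As a sketch of how the two camera views can be lifted into 3D, here is a minimal example assuming the ultralytics pose model and OpenCV triangulation with pre-calibrated projection matrices; the calibration values and image paths below are placeholders, not necessarily what the referenced article used.

```python
# Minimal sketch: lift 2D pose keypoints from two calibrated cameras
# into 3D with OpenCV triangulation.
import cv2
import numpy as np
from ultralytics import YOLO

model = YOLO('yolov8n-pose.pt')

# Placeholder 3x4 projection matrices; in practice these come from
# camera calibration (e.g. cv2.calibrateCamera / cv2.stereoCalibrate).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

def keypoints(image_path):
    """Return the (17, 2) COCO keypoints of the first detected person."""
    result = model(image_path)[0]
    return result.keypoints.xy[0].cpu().numpy()

kp1 = keypoints('camera1.jpg')  # frame from the first camera/PC
kp2 = keypoints('camera2.jpg')  # frame from the second camera/PC

# cv2.triangulatePoints takes 2xN image points and returns 4xN
# homogeneous coordinates; divide by w to get 3D points.
points_4d = cv2.triangulatePoints(P1, P2, kp1.T, kp2.T)
points_3d = (points_4d[:3] / points_4d[3]).T  # (17, 3) world coordinates
```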
I also confirmed that changing the color of the light source changes YOLO's detection results. This lighting situation probably does not occur in reality, and since the tomato model is detected as an apple, it is a false detection anyway.
Conclusion
I have been using Unreal Engine and ROS for less than a year, but thanks to my familiarity with Node-RED, I was able to connect them. The implementation also stays at a low-code level.
I would like to continue simulating robots while taking advantage of each piece of software's strengths.