Robie Becomes a Telepresence Robot



From Toy To Telepresence

A Telepresence Robot is not as complicated as some would lead you to believe. It starts with an object-avoidance robot, adds remote control over the Internet, and then includes a person-to-person communications method to display audio/video of the remote person. That is basically it!

This paper will describe the implementation of a Telepresence Robot by the Dallas Personal Robotics Group (DPRG) as one of our group projects.

This version of the project calls for the Robot to be implemented in stages:

1 - Get the motors working so the robot can move about.

2 - Add object sensors and have the robot use them to avoid collisions.

3 - Add remote control via the Internet.

4 - Add Skype device to provide telepresence.

At the time of this writing, the robot has completed stage four, which makes it functional as a Telepresence Robot. Work continues on additional stages, which will be followed by refactoring the code to make it perform better and be easier to maintain.

Background

The DPRG Telepresence Robot was the brainchild of then club president Carl Ott. He challenged DPRG members to create a robot that would permit our remote users to continue to interact with club events after the time of Covid had passed.

Most of the members of the club joined the design team and were assigned sections. John Kuhlenschmidt found an Instructables.com example, https://www.instructables.com/Telepresence-Robot-Basic-Platform-Part-1/, which I volunteered to implement. Although I took on the task of implementing the functional prototype, it would not have been possible to complete without the work of all of the other members of the team.

Our remote member from Connecticut, Chris Netter, discovered an example of a remote-control method using Amazon Web Services which could be operated at very little expense. Our friend from the Seattle Robotics Society, Donna Smith, provided significant research on remote-control methods used by other projects. I discovered the method of remote control with VNC, which I eventually chose for my version of the Telepresence Robot, mostly because it was simple and free.

There were many other members of the original team whom I know I have missed mentioning here and who I know provided valuable input, but my notes were insufficient to detail their accomplishments along with those listed. My apologies for that, and kudos to the entire membership of the DPRG for making this Telepresence Robot possible.

Architecture
   Basic Telepresence Architecture

My prototype uses both a Raspberry Pi and an Arduino to provide the “smarts” for the robot. Object avoidance is provided by front- and rear-facing SR04 ultrasonic sensors (probably overkill), side-facing IR distance sensors, and contact switches on the front bumper. A web camera serves as the driving camera, giving the operator a view of the environment, and an iPhone/iPad running Skype serves as the person-to-person communication device, providing the remote audio/video feed. Power is supplied by an 18V Ryobi power tool battery for the robot and a cell-phone battery power station that extends the battery life of the iPhone. An optional connection may be made via video meeting software such as Zoom to allow multiple people to view and interact via the personal communications device feed.

The Raspberry Pi uses its built-in VNC server so that it can be controlled remotely. A GUI, created in Tkinter, provides buttons for the remote operator to direct the robot's movement and displays video from the driving camera.
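The sketch below shows, in rough form, what such a Tkinter control panel could look like. It is not the DPRG code: the button labels and layout are simplified, only the MC=F* command is taken from the API example later in this article (the other direction codes are hypothetical), and send_command() is a stub standing in for the serial link to the Arduino.

# A minimal sketch of a Tkinter drive-control GUI like the one described above.
# send_command() is a placeholder; the other command codes are hypothetical.
import tkinter as tk

def send_command(message):
    """Placeholder for the serial write that forwards a command to the Arduino."""
    print("would send:", message)

root = tk.Tk()
root.title("Telepresence Robot Driver")

# One button per movement directive; only MC=F* comes from the API example
# in this article, the other codes are illustrative assumptions.
buttons = [
    ("Forward", "MC=F*"),
    ("Left",    "MC=L*"),
    ("Stop",    "MC=S*"),
    ("Right",   "MC=R*"),
    ("Reverse", "MC=B*"),
]
for column, (label, message) in enumerate(buttons):
    tk.Button(root, text=label, width=8,
              command=lambda m=message: send_command(m)).grid(row=0, column=column)

# Label reserved for obstacle distances reported back by the Arduino.
distance_var = tk.StringVar(value="no sensor data yet")
tk.Label(root, textvariable=distance_var).grid(row=1, column=0, columnspan=len(buttons))

root.mainloop()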

The Raspberry Pi and Arduino are connected via a USB serial link and communicate using the DPRG Remote Robot Control API. This API permits the Raspberry Pi to provide movement directives to the Arduino, which controls the motors. The Arduino also manages the object avoidance sensors, which may override the movement directive of the operator if an object is detected in the direction of movement. In addition, the Arduino returns distance information about detected objects in range back to the Raspberry Pi for display in the GUI.
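As a rough illustration of the Raspberry Pi side of this link, the sketch below uses the pyserial library to send one movement command and then poll for sensor reports. The port name, the use of pyserial, and the handle_report() function are assumptions made for the sketch, not details of the DPRG implementation.

# Sketch of the Raspberry Pi side of the serial link, using pyserial.
import serial

PORT = "/dev/ttyUSB0"   # assumed device name; USB adapters often appear as ttyUSB0 or ttyACM0
BAUD = 115200           # must match the rate configured on the Arduino

def handle_report(line):
    """React to a sensor report such as SULF=13* coming back from the Arduino."""
    print("Arduino says:", line)

with serial.Serial(PORT, BAUD, timeout=0.1) as link:
    # Ask the Arduino to drive forward until told otherwise.
    link.write(b"MC=F*")

    # Command-loop style polling: read any sensor reports that have arrived.
    while True:
        line = link.readline().decode("ascii", "ignore").strip()
        if line:
            handle_report(line)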

Power

The robot is powered by a single 18V Ryobi power tool battery. These batteries come in various capacities; the larger the capacity, the longer the operating time. A separate cell-phone battery pack ensures the phone or tablet has sufficient power to complete a session.

The positive (hot) side of the primary battery is run through a 6-amp fuse to an on/off switch and then to a power distribution board.

The distribution board contains step-down voltage regulators that reduce the 18V supply to both 9V and 5V for use by various parts of the robot. The 9V power is split, with one feed to the L298N motor controller and one feed to the Arduino. There are actually two 5V regulators, to isolate the 5V powering the Raspberry Pi from the 5V powering the sensors.

Screw terminals on the distribution board are used to provide multiple connection points for the ground, 18V input, and 5V sensor power.
   Diagram of the DPRG Telepresence Robot Power Distribution Board
   A view of the Power Distribution Board looking into Robie's body with the head removed.

Raspberry Pi and Arduino Communication

A serial cable connects a USB port of the Raspberry Pi to the USB serial port of the Arduino, and both are configured to communicate at 115200 baud. On each iteration of its command loop, each device can send and receive serial data. In addition, the command buttons on the GUI send commands to the Arduino immediately.

The Raspberry Pi GUI can send movement direction and speed commands to the Arduino. The Arduino will send distance values of detected obstacles as the distance changes, which will cause the GUI to update its display.

All of this communication is accomplished with formatted messages, detailed in the DPRG Remote Robot API.

Example movement command: MC=F* This is defined as a motor command to move forward until told to stop.

Example distance report: SULF=13* SULF identifies the ultrasonic (U) sensor on the left-front of the robot, and the value (13) indicates that the obstruction is 13 cm away. The U portion of the name, which identifies the sensor type, will be deprecated in the future, since the type of sensor is irrelevant. Another planned improvement is to make the units configurable between centimeters and inches.
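These two message shapes are simple enough to build and decode with a few lines of Python. The helpers below are a hedged sketch based only on the MC=F* and SULF=13* examples above; the function names are made up for illustration and are not part of the DPRG Remote Robot API.

def format_motor_command(direction):
    """Build a motor command message, e.g. format_motor_command('F') -> 'MC=F*'."""
    return f"MC={direction}*"

def parse_sensor_report(message):
    """Split a report such as 'SULF=13*' into its sensor name and distance in cm."""
    body = message.rstrip("*")          # drop the message terminator
    name, value = body.split("=", 1)    # e.g. 'SULF' and '13'
    return name, int(value)

if __name__ == "__main__":
    print(format_motor_command("F"))        # MC=F*
    print(parse_sensor_report("SULF=13*"))  # ('SULF', 13)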

Arduino Sensor/Motor Control

The Arduino will receive movement commands from the Raspberry Pi via serial communications (and may implement IR control later). The Arduino will also monitor its obstacle avoidance sensors.

The Arduino will implement a subsumption architecture to determine what motor commands will be sent to the motors. For example, if the user sends a MOVE FORWARD command from the GUI through the serial interface to the Arduino, but the sensors report that an obstacle is detected a short distance ahead, the subsumption controller may move the robot back a step and turn to avoid the obstacle before responding to the user command.
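The actual controller runs on the Arduino, but the priority idea behind subsumption can be shown in a short Python sketch: behaviors are consulted from highest priority to lowest, and the first one that expresses an opinion wins control of the motors. The distance threshold, behavior names, and action names below are illustrative assumptions, not the robot's real values.

OBSTACLE_THRESHOLD_CM = 20   # assumed "too close" distance

def avoid_obstacle(front_distance_cm, user_command):
    """Highest priority: back up and turn if something is too close ahead."""
    if user_command == "FORWARD" and front_distance_cm < OBSTACLE_THRESHOLD_CM:
        return ["BACKWARD", "TURN_LEFT"]
    return None   # no opinion, let a lower-priority behavior decide

def follow_user(front_distance_cm, user_command):
    """Lowest priority: do whatever the remote operator asked for."""
    return [user_command]

BEHAVIORS = [avoid_obstacle, follow_user]   # ordered highest priority first

def arbitrate(front_distance_cm, user_command):
    """Return the motor actions chosen by the highest-priority active behavior."""
    for behavior in BEHAVIORS:
        actions = behavior(front_distance_cm, user_command)
        if actions is not None:
            return actions
    return ["STOP"]

if __name__ == "__main__":
    print(arbitrate(100, "FORWARD"))   # clear path: ['FORWARD']
    print(arbitrate(10, "FORWARD"))    # obstacle ahead: ['BACKWARD', 'TURN_LEFT']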

Basic Usage

The participant who wants to control the robot will contact the club representative and will be directed to ‘call’ the Skype account on the attached personal communications device via its email address. Once a connection is made, the robot will be powered up and the VNC password set by the club representative, who will then pass the password along to the participant via Skype only. The participant will then connect to the Telepresence Robot using VNC with the supplied credentials.

When the event is over, the club representative will coordinate shutting the system down and will use the switch to turn off power to the robot.

This entry will be edited to add pictures and video.