
The ISRoboNet@Home Testbed

Introduction

ISRoboNet@Home is an infrastructure that has been under continuous development and update since 2006, when, under the European FP6 URUS project, a networked robot system was deployed at the ISR facilities, composed of 10 IP cameras connected to each other by cable and wirelessly networked with several mobile robots.

Currently, the infrastructure has grown to comprise a camera network, ready to accommodate additional IP cameras at locations on the three floors of ISR, and a domestic robot testbed designed and installed in 2014 during the FP7 RoCKIn project, which also includes several home automation devices.

The main purpose of ISRoboNet@Home is to benchmark domestic robot functionalities and tasks, using RoCKIn's definitions.

Apartment

The apartment part of the infrastructure is based on the general design and specifications of the RoCKIn@Home testbed. See the RoCKIn@Home rulebook for details.

Environment Structure and Properties

  • An ensemble of five spatial areas accessible to the robots and three others that are inaccessible.
  • Rooms and spatial areas (accessible to the robot): living room, dining room, kitchen, inside hallway, bedroom.
  • Spatial areas (inaccessible to the robot): outside hallway, bathroom, patio.
  • Flat with no stairs.
  • Open-plan architecture for the living room, dining room and kitchen; the bedroom is separated by walls.
  • Floor well-leveled and uniform throughout the testbed, but including carpets.
  • Walls: Final version not yet in place – will be reported in version 3 of this document.

Rooms Properties and Furniture

  • Bedroom: one window, a double bed, two side tables, two table lamps and one large wardrobe with a mirror.
  • Living room: windows, a couch, two armchairs, one coffee table, one TV table and one large floor lamp.
  • Dining room: one glass-top dining table and two dining chairs.
  • Kitchen: one kitchen table with two chairs, a kitchen cabinet with multiple drawers and a wash sink, and two wall-mounted kitchen shelves.
  • Hallway: one coat rack.

Networked Devices

The @Home Testbed at IST is equipped with network devices capable of opening/closing the blinds and turning on/off the lamps.

  • Server: A computer used to manage the network.
  • Switch: An Ethernet switch used to connect all the devices.
  • AP: An access point to which the mobile robots connect wirelessly, acting as a bridge between the WLAN and the LAN. The access point (Cisco AIR-AP1042N-E-K9) operates in dual-band standalone 802.11a/g/n mode.
  • Ethernet Camera: one perspective camera facing the outside hallway. The AXIS P1344 camera parameters (frame rate, resolution, color gains) can be changed over Ethernet (see the sketch after this list).
  • Home Automation Devices:
      • 1 motor to control the window blinds,
      • 3 controlled power plugs,
      • 1 light dimmer,
      • 1 doorbell button.
  • SMARTIF IO: This module controls the different devices/sensors existing in the house and can accommodate additional devices if needed.
  • SMARTIF Server: Device responsible for the communication between the SMARTIF IO and the network.
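
As a concrete illustration of the Ethernet camera item above, here is a minimal Python sketch of changing AXIS camera parameters over the network through the VAPIX HTTP parameter interface. The camera address, the credentials and the exact parameter names (e.g. the frame-rate parameter used at the end) are assumptions and should be checked against the P1344 documentation.

    # Minimal sketch: adjusting AXIS camera parameters over Ethernet via the
    # VAPIX HTTP parameter interface. IP address, credentials and parameter
    # names below are assumptions, not values taken from the testbed.
    import requests
    from requests.auth import HTTPDigestAuth

    CAMERA_IP = "192.168.1.90"                  # hypothetical address on the testbed LAN
    AUTH = HTTPDigestAuth("root", "password")   # placeholder credentials

    def list_parameters(group="Image"):
        """List the current values of a VAPIX parameter group."""
        url = f"http://{CAMERA_IP}/axis-cgi/param.cgi"
        resp = requests.get(url, params={"action": "list", "group": group}, auth=AUTH)
        resp.raise_for_status()
        return resp.text

    def set_parameter(name, value):
        """Update a single parameter, e.g. frame rate or resolution."""
        url = f"http://{CAMERA_IP}/axis-cgi/param.cgi"
        resp = requests.get(url, params={"action": "update", name: value}, auth=AUTH)
        resp.raise_for_status()
        return resp.text

    if __name__ == "__main__":
        print(list_parameters("Image"))
        # Assumed parameter name: cap the stream at 15 frames per second.
        set_parameter("Image.I0.Stream.FPS", "15")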

The @Home testbed IP camera is networked with the rest of the ISRoboNet@Home camera network. Currently, 15 cameras of the following models are installed:

  • 1 AXIS 215 PTZ
  • 3 VIVOTEK SF8174
  • 3 AXIS 211
  • 2 AXIS 211A
  • 3 AXIS P1344
  • 3 VIVOTEK FE8171V

Servers:

  • The camera network is connected to four servers (Intel Xeon, 2.4 GHz, 8 GB of RAM), which are responsible for the data processing and for streaming the videos;
  • Another server (Intel Core i7, 8 GB of RAM) is dedicated to the @Home testbed and is used to run the Referee Box software (see below);
  • Yet another server (Intel Core i5, 4 GB of RAM) is used to access the motion capture system (Benchmark Box).

Referee Scoring and Benchmarking Box:

This is the Referee Box for RoCKIn@Home competitions. It interacts with the mobile robots wirelessly and also controls the testbed's home automation devices. It is described in detail in the RoCKIn@Home rulebook, and its code can be downloaded from

https://github.com/rockin-robot-challenge/at_home_rsbb
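
The repository above contains the actual robot communication libraries. Purely as an illustration of the interaction pattern, here is a minimal Python/ROS sketch of a robot-side client; the topic names and message types are hypothetical placeholders, not the real RSBB interface.

    # Minimal sketch of a robot-side client for the Referee, Scoring and
    # Benchmarking Box (RSBB). Topic names and message types are hypothetical
    # placeholders; the real interface is provided by the at_home_rsbb code.
    import rospy
    from std_msgs.msg import String

    def on_benchmark_state(msg):
        # React to benchmark phase changes announced by the referee box.
        rospy.loginfo("RSBB benchmark state: %s", msg.data)

    def main():
        rospy.init_node("rsbb_client_sketch")
        # Hypothetical topic on which the RSBB announces the benchmark state.
        rospy.Subscriber("/rsbb/benchmark_state", String, on_benchmark_state)
        # Hypothetical topic on which the robot reports task events back.
        event_pub = rospy.Publisher("/rsbb/robot_events", String, queue_size=10)
        rate = rospy.Rate(1)  # 1 Hz heartbeat
        while not rospy.is_shutdown():
            event_pub.publish(String(data="heartbeat"))
            rate.sleep()

    if __name__ == "__main__":
        main()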

Objects

Task-relevant objects (see the sketch after this list):

  • Navigation-Relevant Objects: all objects which have extent in physical space and do (or may) intersect (in 3D) with the robot’s navigation space. All such objects must be avoided during navigation, i.e. whenever the robot moves, it may not bump into these objects or touch them, unless otherwise specified by a task.
  • Manipulation-Relevant Objects: all objects that the robot may have manipulative interactions with, which may include touching (a switch), grasping (a glass), lifting (a book), holding (a cup), placing (a parcel), dropping (waste), carrying (a glass), pushing (a drawer), pulling (a drawer), turning (a book), filling (a glass), pouring (from a cup), etc.
  • Perception-Relevant Objects: objects that the robot must "only" be able to perceive. By "perceive" we mean that the robot should be able to recognize if such an object is in its view, that it should be able to identify the object if it is unique or to classify it if not (e.g. an instance of a cup, if several non-unique instances exist), and that it should be able to localize the object.
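
Below is a minimal sketch of how these three relevance categories could be encoded when annotating testbed objects; the class and field names are illustrative and are not taken from the RoCKIn rulebook.

    # Minimal sketch: tagging testbed objects with the relevance categories
    # defined above. Names are illustrative, not part of the rulebook.
    from dataclasses import dataclass
    from enum import Flag, auto

    class Relevance(Flag):
        NAVIGATION = auto()    # must be avoided while the robot moves
        MANIPULATION = auto()  # may be grasped, pushed, carried, ...
        PERCEPTION = auto()    # must be recognized, identified/classified, localized

    @dataclass
    class TestbedObject:
        name: str
        relevance: Relevance

    # A cup typically falls into all three categories, a wall only the first.
    cup = TestbedObject("cup", Relevance.NAVIGATION | Relevance.MANIPULATION | Relevance.PERCEPTION)
    wall = TestbedObject("wall", Relevance.NAVIGATION)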

Non-Task-relevant objects:

Many of the objects in the apartment were selected so as to set a long-term agenda concerning perception, navigation and manipulation issues. There are pillows in different solid colors as well as patterned ones, transparent and opaque cups, wooden and glass tabletops, a mirror on the wardrobe front, etc.

Robots

  • 3 ADEPT Pioneer 3-AT with SICK laser scanner
  • 4 MOnarCH MBOTs, extended with a RODE microphone and an ASUS RGB-D camera
  • 2 Robai Cyton Gamma 1500 arms with 7 DoF

Motion Capture System and Datasets

The apartment testbed includes a Motion Capture System (MCS) based on 12 OptiTrack PRIME13 cameras (1.3 MP, 240 FPS). This system provides real-time tracking data (position and attitude) of rigid bodies at a very high rate.
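
As an illustration of how this tracking data is typically consumed, here is a minimal Python sketch of a rigid-body pose record and a per-sample callback; the client-registration call at the end is a hypothetical placeholder for whichever OptiTrack/NatNet client the Benchmark Box server exposes.

    # Minimal sketch: consuming rigid-body poses from the motion capture
    # system. The client registration at the bottom is a hypothetical
    # placeholder, not the actual OptiTrack/NatNet API.
    from dataclasses import dataclass

    @dataclass
    class RigidBodyPose:
        body_id: int
        timestamp: float        # seconds
        position: tuple         # (x, y, z) in metres
        orientation: tuple      # quaternion (qx, qy, qz, qw)

    def on_rigid_body(sample: RigidBodyPose):
        """Called at up to 240 Hz for every tracked rigid body (robot or object)."""
        print(f"body {sample.body_id} at {sample.position} (t={sample.timestamp:.3f} s)")

    # Hypothetical registration call; a real setup would hook the callback into
    # the streaming client provided with the motion capture software.
    # mocap_client.register_rigid_body_callback(on_rigid_body)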

In the future, we plan to make available datasets acquired from the testbed, including ground truth on the location of the robots and all relevant objects.

For now, a large collection of datasets acquired in a similar testbed is available from the RoCKIn Dataset Wiki.