Episode 03 – Delivering coffee shop orders
General Description. In this episode the robot assists the staff of a coffee shop in taking care of customers: taking orders and bringing objects to and from customers’ tables.
The main functionality that is evaluated in this episode is people perception. Additional side functionalities are navigation, speech synthesis and recognition.
Platforms allowed. A mobile robot equipped with a tray, a planar surface, or any other means of transporting small items (e.g., a cup of coffee, possibly empty). Sensors for people perception and for navigation in the environment are also needed.
Setting. A coffee shop with a few tables and typical objects (e.g., coffee cups, napkins, …). Several people are sitting at the tables; some tables are clean, others have objects on them.
Procedure. The robot is placed in a starting location in front of the tables. Some customers will try to get the robot’s attention (e.g., by calling and waving at it). The robot will reach the table and assess its status (either clean or with objects on it). If the table is clean, the robot will accept an order from the customers, report the order to the kitchen, remember it, and deliver the ordered objects to that table. If the table has objects on it, the robot will ask the customers whether they want to place some of them on its tray so it can bring them back to the kitchen. If the robot detects that people are paying attention to it, it will tell the customers about special events occurring in MK in the following days.
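The procedure above can be sketched as a simple finite-state control loop. This is a minimal illustration, not a required architecture; all state names and the transition function are hypothetical:

```python
from enum import Enum, auto

class State(Enum):
    WAIT_FOR_CUSTOMER = auto()   # watch for calling/waving customers
    GO_TO_TABLE = auto()         # navigate to the customer's table
    CHECK_TABLE = auto()         # assess the table status
    TAKE_ORDER = auto()          # accept the order, report it to the kitchen
    DELIVER_ORDER = auto()       # bring the ordered items to the table
    CLEAR_TABLE = auto()         # offer to carry items back to the kitchen

def next_state(state, table_is_clean=None):
    """Hypothetical transition function mirroring the episode procedure."""
    if state is State.WAIT_FOR_CUSTOMER:
        return State.GO_TO_TABLE        # a customer got the robot's attention
    if state is State.GO_TO_TABLE:
        return State.CHECK_TABLE
    if state is State.CHECK_TABLE:
        # Clean table -> take an order; table with objects -> offer to clear it
        return State.TAKE_ORDER if table_is_clean else State.CLEAR_TABLE
    if state is State.TAKE_ORDER:
        return State.DELIVER_ORDER
    # After delivering or clearing, return to waiting for the next customer
    return State.WAIT_FOR_CUSTOMER
```

For example, `next_state(State.CHECK_TABLE, table_is_clean=False)` yields `State.CLEAR_TABLE`, matching the branch where the robot offers to carry items back to the kitchen.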
Notes: 1) no manipulation actions are required; the objects are placed on and gathered from the robot’s tray by people; 2) the objects to deliver are small enough to fit in the robot’s carrying space.
DH interaction. Type: data consumption. Orders are placed through an application connected to the MK:DataHub, which keeps track of all the orders. Additionally, the robot can gather information from the MK:DataHub about upcoming events to relay to customers.
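Data consumption from the hub might look like the sketch below. The endpoint paths and JSON fields are assumptions for illustration (the real MK:DataHub API is not specified here), and the fetch callable is injected so the sketch runs without a network connection:

```python
import json

class DataHubClient:
    """Minimal sketch of a data-consumption client for the MK:DataHub.

    `fetch` is any callable taking a path and returning a JSON string;
    in a real system it would perform an HTTP GET against the hub.
    """

    def __init__(self, fetch):
        self.fetch = fetch

    def pending_orders(self):
        # Hypothetical endpoint listing orders placed through the app
        return json.loads(self.fetch("/orders?status=pending"))

    def upcoming_events(self):
        # Hypothetical endpoint with the next events in MK, which the
        # robot can relay to customers paying attention to it
        return json.loads(self.fetch("/events/upcoming"))

# Stand-in for the real hub, for illustration only
def fake_fetch(path):
    data = {
        "/orders?status=pending": [{"table": 3, "items": ["coffee"]}],
        "/events/upcoming": [{"title": "Robot demo day"}],
    }
    return json.dumps(data[path])

hub = DataHubClient(fake_fetch)
```

With the stand-in data, `hub.pending_orders()[0]["table"]` returns `3`, i.e., the table the order should be delivered to.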
Main functionality(ies). The main functionality tested is Person and Object Perception.
Auxiliary functionalities. In addition, Navigation, Speech Synthesis, and Spoken Language Understanding are needed to interact with customers.
Achievements.
- Correctly identifying people calling for the robot.
- Reaching the table.
- Evaluating the status of the table and performing the correct action.
- Taking and delivering the order.
- Taking items to the kitchen.
- Understanding people giving attention to the robot.
- Telling customers about events in MK.
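The achievement of evaluating the status of the table reduces to a binary decision. A minimal sketch, assuming a perception module already reports object detections on the table surface (the detection format and confidence threshold are assumptions):

```python
def table_status(detections, min_confidence=0.5):
    """Classify a table as 'clean' or 'occupied' from object detections.

    `detections` is assumed to be a list of (label, confidence) pairs
    reported by a perception module for the table surface; low-confidence
    detections are discarded before deciding.
    """
    objects = [label for label, conf in detections if conf >= min_confidence]
    return "occupied" if objects else "clean"
```

A 'clean' result triggers order taking, while 'occupied' triggers the offer to carry items back to the kitchen.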
Penalized behaviors.
- Hitting or damaging the table.
- Wrongly evaluating the status of the table.
- Spilling or dropping any of the carried items.
- Misinterpreting customer requests.