
Table 2 Sensory information from smart device

From: A modular cognitive model of socially embedded robot partners for information support

| Input information | Parameters (sampling interval: every 0.5 s) | Output value | Sensor label |
| --- | --- | --- | --- |
| Display touch | Tap, long press, swipe (up, down, left, right) | 0 or 1 | \(C_{in,1}\) |
| Touched finger radius | 20–100 | 0.0–1.0 | \(C_{in,2}\) |
| Human detection | Human or nobody | 0 or 1 | \(C_{in,3}\) |
| Human distance | 1.5–0.1 m | 0.0–1.0 | \(C_{in,4}\) |
| Human gesture | Hand up/down, left/right, circling | 0 or 1 | \(C_{in,5}, E_{in,1}\) |
| Object color | Skin, red, blue | 0 or 1 | \(E_{in,2}\) |
| Smile | Smiling or not | 0 or 1 | \(E_{in,3}\) |
| Gender | Female or male | 0 or 1 | \(E_{in,4}\) |
| Race | Asian or non-Asian | 0 or 1 | \(E_{in,5}\) |
| Proximity sensor | Nothing or covering | 0 or 1 | \(B_{in,1}\) |
| Input sound magnitude | Volume level: −120 (min) to 0 (max) | 0.0–1.0 | \(B_{in,2}\) |
| Body shake | Shaking or not | 0 or 1 | \(B_{in,3}\) |
| Compass direction | North to south | 0.0–1.0 | \(B_{in,4}\) |
| Battery status | Battery level | 0.0–1.0 | \(B_{in,5}\) |
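The continuous sensor channels above are reported on a normalized 0.0–1.0 scale. A minimal sketch of how such a mapping could work, assuming simple linear normalization with clamping (the `normalize` helper and the specific sample readings are illustrative, not from the paper):

```python
def normalize(value, lo, hi):
    """Clamp and linearly map a raw reading from [lo, hi] to [0.0, 1.0].

    Also handles reversed ranges such as human distance, where the table
    lists 1.5-0.1 m: 1.5 m maps to 0.0 and 0.1 m maps to 1.0.
    """
    t = (value - lo) / (hi - lo)
    return max(0.0, min(1.0, t))

# Hypothetical readings using the parameter ranges from the table:
sound = normalize(-60.0, -120.0, 0.0)   # input sound magnitude (dB-like volume level)
distance = normalize(0.1, 1.5, 0.1)     # human distance at 0.1 m (closest)
radius = normalize(60, 20, 100)         # touched finger radius
```

Binary channels (human detection, smile, proximity sensor, and so on) bypass this step and are emitted directly as 0 or 1.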